WO2010091464A1 - Method and system for monitoring an operator of machinery - Google Patents

Method and system for monitoring an operator of machinery

Info

Publication number
WO2010091464A1
WO2010091464A1 (application PCT/AU2010/000142, AU2010000142W)
Authority
WO
WIPO (PCT)
Prior art keywords
machinery
acceleration
head
operator
response characteristic
Prior art date
Application number
PCT/AU2010/000142
Other languages
English (en)
Inventor
Nicholas John Langdale-Smith
Timothy James Henry Edwards
Original Assignee
Seeing Machines Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2009900535A external-priority patent/AU2009900535A0/en
Application filed by Seeing Machines Limited filed Critical Seeing Machines Limited
Publication of WO2010091464A1


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K28/00Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
    • B60K28/02Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
    • B60K28/06Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
    • B60K28/066Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver actuating a signalling device
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114Tracking parts of the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/163Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/18Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/6803Head-worn items, e.g. helmets, masks, headphones or goggles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00Evaluating a particular growth phase or type of persons or animals
    • A61B2503/20Workers
    • A61B2503/22Motor vehicles operators, e.g. drivers, pilots, captains
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6893Cars
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2520/00Input parameters relating to overall vehicle dynamics
    • B60W2520/10Longitudinal speed
    • B60W2520/105Longitudinal acceleration

Definitions

  • the present invention relates to a method and system for monitoring an operator of machinery.
  • the invention relates to a method and system for monitoring the driver of a vehicle, and it will be described with specific reference to this application. It will be clear to the skilled person that the invention has broader application. For example, the invention may be used to monitor other individuals (not necessarily the driver, pilot, etc) within a vehicle.
  • systems are commonly known for monitoring the location and/or the level of eyelid closure of the eyes of a driver.
  • the monitoring of the location of the eyes and/or the level of eyelid closure can be difficult in view of the relatively large area in which the eyes may reasonably be located in comparison with their size.
  • reflections of light from the surface of the eyes themselves or from glasses, sunglasses or contact lenses may hinder accurate measurement of the desired characteristics.
  • Known methods to address these particular problems have included increasing camera resolution and incorporating strong infra-red lighting beacons to illuminate the features of interest.
  • Such solutions increase the cost of the system.
  • the discrimination of subtle eye movements incurs a large image-processing cost, requiring increased processing power if a system is to be operable in real time. This too can significantly increase the cost of the system.
  • Prior art methods include those of Kisacanin (US 2007/0159344) and Malawey et al. (US 2008/0266552). Disadvantageously, these methods are limited in that they do not track the position and orientation of the operator's head in 3D.
  • Victor et al. (US 2007/0008151) discloses a method for detecting drowsiness from head motion tracked in 3D (6 degrees of freedom). A measured density pattern of head movement components is compared to a previously measured reference density pattern. Band-pass filtering is optionally used to attenuate noise and remove long term postural changes from the head movement signal. Disadvantageously, the method disclosed therein does not classify head movements into reflexive movements (for head stabilisation) and conscious movements.
  • a method of estimating the attentiveness state of an operator of machinery, including the steps of: monitoring the accelerations of the environment around the user; determining a corresponding estimated adaptation model of the operator, estimating the operator's likely response to the accelerations whilst attentive to the operation of the machinery; determining a measure of the degree of divergence between the estimated adaptation model and the current user's adaptation to the accelerations experienced when subjected to the accelerations in the environment around the user; and outputting an indicator of said measure of the degree of divergence.
  • the indicator indicates if the measure of the degree of divergence exceeds a predetermined level.
  • a method for monitoring an operator of machinery including the steps of: detecting acceleration of the machinery; detecting the movement of the head of the operator; and comparing the acceleration of the machinery and the movement of the head to derive an acceleration response characteristic.
  • the step of detecting acceleration of the machinery includes receiving output from one or more inertial sensors mounted, preferably rigidly mounted, to the machinery (for example to the driver's seat).
  • the step of detecting the movement of the head includes detecting the movement of the head relative to the machinery.
  • the step of detecting the movement of the head includes monitoring the position and/or orientation of the head relative to the machinery.
  • the step of detecting the movement of the head includes monitoring the position and/or orientation of the head relative to the machinery using video-based head-pose tracking apparatus having a camera rigidly mounted to the machinery.
  • the method further includes generating a calibration model of expected acceleration response characteristics correlating input data indicative of machinery acceleration conditions and output data indicative of expected head movement conditions in response to the machinery acceleration conditions.
  • the input data of the calibration model includes data indicative of at least one of the following input conditions: duration of acceleration of the machinery, rate of change of acceleration of the machinery, and magnitude and direction of acceleration of the machinery.
  • the input data further includes data indicative of at least one of: the position and/or orientation of the head of the operator, the position of a seat in which the operator is seated.
  • the output data of the calibration model includes data indicative of at least one of the following response conditions: response time, position, orientation, duration of acceleration of the head, rate of change of acceleration of the head, magnitude and direction of acceleration of the head.
  • the method further includes the step of comparing the derived acceleration response characteristic with an expected acceleration response characteristic determined with reference to the calibration model.
  • the method further includes the step of generating an alert condition in the event that the derived acceleration response characteristic satisfies one or more predetermined alert criteria.
  • the alert condition is generated if the derived acceleration response characteristic deviates from the expected acceleration response characteristic.
  • the alert condition is generated if the derived acceleration response characteristic deviates from the expected acceleration response characteristic by greater than a predetermined threshold value.
  • the alert condition is generated if the derived acceleration response characteristic deviates from the expected acceleration response characteristic by greater than a predetermined ratio value.
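The method steps above can be sketched in code. The following Python sketch is illustrative only, not the patent's implementation: the acceleration response characteristic is reduced to a gain (RMS amplitude ratio of head to machinery acceleration) and a lag (the sample offset of strongest correlation), and the alert test uses an absolute deviation threshold. The function names, the correlation measure and the threshold value are all assumptions.

```python
import math

def response_characteristic(vehicle_acc, head_acc, max_lag=10):
    """Derive a simple acceleration response characteristic: the gain
    (RMS amplitude ratio of head to vehicle acceleration) and the lag
    (in samples) at which the two signals correlate most strongly.
    All names and measures here are illustrative, not from the patent."""
    best_lag, best_corr = 0, -math.inf
    for lag in range(max_lag + 1):
        # correlate vehicle acceleration against head movement shifted by `lag`
        corr = sum(v * h for v, h in zip(vehicle_acc, head_acc[lag:]))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    v_rms = math.sqrt(sum(v * v for v in vehicle_acc) / len(vehicle_acc))
    h_rms = math.sqrt(sum(h * h for h in head_acc) / len(head_acc))
    gain = h_rms / v_rms if v_rms else 0.0
    return gain, best_lag

def alert(derived_gain, expected_gain, threshold=0.5):
    """Generate an alert condition if the derived characteristic deviates
    from the expected one by more than a predetermined threshold."""
    return abs(derived_gain - expected_gain) > threshold
```

A head trace that lags the vehicle by two samples and is attenuated to 80% of the input amplitude would yield a lag of 2 and a gain near 0.8.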
  • a further aspect provides a system for monitoring an operator of machinery, including: a first sensor for detecting acceleration of the machinery; a second sensor for detecting movement of the head of the operator; and a processor for comparing the acceleration of the machinery and the acceleration of the head to derive an acceleration response characteristic.
  • the first sensor includes one or more inertial sensors rigidly mounted to the machinery, or a communications port for receiving a signal from an inertial sensor mounted, preferably rigidly mounted, to the machinery (for example to the driver's seat).
  • the second sensor is adapted for detecting the movement of the head of the operator relative to the machinery.
  • the second sensor includes a head position monitoring device for monitoring the position of the head relative to the machinery.
  • the head position monitoring device includes a head pose tracking apparatus having a camera, preferably mounted rigidly to the machinery.
  • the processor is adapted for generating a calibration model of expected acceleration response characteristics correlating input data indicative of machinery acceleration conditions and output data indicative of expected head movement conditions in response to the machinery acceleration conditions.
  • the processor is adapted to include in the input data of the calibration model data indicative of at least one of the following input conditions: duration of acceleration of the machinery, rate of change of acceleration of the machinery, and magnitude and direction of acceleration of the machinery.
  • the processor is further adapted to include in the input data of the calibration model data indicative of at least one of: the position and/or orientation of the head of the operator, the position of a seat in which the operator is seated.
  • the processor is adapted to include in the output data of the calibration model data indicative of at least one of the following response conditions: response time, position, orientation, duration of acceleration of the head, rate of change of acceleration of the head, magnitude and direction of acceleration of the head.
  • the processor is adapted to compare the derived acceleration response characteristic with an expected acceleration response characteristic determined with reference to the calibration model.
  • the processor is adapted to generate an alert condition in the event that the derived acceleration response characteristic satisfies one or more predetermined alert criteria.
  • the processor is adapted to generate the alert condition if the derived acceleration response characteristic deviates from the expected acceleration response characteristic.
  • the processor is adapted to generate the alert condition if the derived acceleration response characteristic deviates from the expected acceleration response characteristic by greater than a predetermined threshold value.
  • the processor is adapted to generate the alert condition if the derived acceleration response characteristic deviates from the expected acceleration response characteristic by greater than a predetermined ratio value.
  • the system preferably further includes an alert condition communicating device.
  • the alert condition communicating device includes an audio and/or visual device for communicating the alert condition to the operator.
  • the alert condition communication device includes an audio and/or visual communicating device for communicating the alert condition to a third party.
  • the alert condition communicating device includes an output port for data communications with a computer.
  • Figure 1 is a schematic representation of a system for monitoring an operator of machinery.
  • Figure 2 is a flowchart illustrating a method for monitoring an operator of machinery.

Detailed Description
  • the basis of the preferred embodiments of the present invention relies on the fact that for a human to observe detailed information in any scene, the eye must be held reasonably stable with respect to the scene, so that the scene forms a stable image on the retina.
  • the head and eyes are held stable by a series of reflexes to facilitate gaze stabilisation, including:
  • VOR: Vestibulo-Ocular Reflex
  • OCR: Opto-Collic Reflex
  • VCR: Vestibulo-Collic Reflex
  • CCR: Cervico-Collic Reflex
  • the reflexes all function together when the head is free, and light is available, in order to stabilize gaze.
  • the vestibular system combines vestibular information with visual input from the optokinetic system, and somatosensory input from neck receptors, to provide the brain with an estimate of self motion.
  • the preferred embodiments of the present invention take advantage of the fact that all moving vehicles impart forces on the operator, and that these forces, and the operator's reaction to them, can be used as a measurement to determine whether or not the operator is frequently fixating on details of the road scene.
  • the present invention uses the fact that the total combined reflex movement of the head is more responsive (faster and more intense) when a person is fixating on or pursuing (tracking) an object in the scene due to visual input.
  • Conversely, when the operator is not fixating on or pursuing an object in the scene, the reflex response of the head is measurably dampened. In this way the current state of the machinery operator can be monitored.
  • Figure 1 illustrates a system for monitoring an operator of machinery.
  • the operator is a driver 1 of a vehicle 2.
  • the system includes a first sensor 3 for detecting acceleration of the vehicle 2.
  • the sensor 3 is a biaxial inertial sensor oriented to sense acceleration in the left-right and forward-rearward directions with respect to the driver 1.
  • the first sensor 3 is a triaxial inertial sensor.
  • the first sensor 3 is rigidly mounted to the vehicle 2.
  • the inertial sensor forms part of the system.
  • the system includes a communications port for connection to an existing inertial sensor in a vehicle.
  • the system further includes a second sensor 4 for tracking movement of the head 5 of the driver, preferably in 6 degrees of freedom (translations Xh, Yh, Zh, and rotations θXh, θYh, θZh).
  • a processor 6 is provided for comparing the acceleration of the vehicle and the movement of the head 5 to derive an acceleration response characteristic.
  • the system categorizes head movements as being reflexes when measurement of the movement (position, and/or orientation, and/or velocity, and/or acceleration) of the head indicates that the head stabilisation control system, including the neck and torso, is acting to oppose the input perturbation forces due to acceleration of the machinery, and thereby to help stabilise the position of the eye relative to the scene upon which the operator's attention is directed.
  • Reflexes are isolated from conscious head movements in this way, as well as in the frequency domain, since reflexes have characteristic frequency responses and occur more quickly than conscious movements can be made.
  • Operator impairment is then detected as a statistical reduction in reflex gain and/or variation in response time to perturbation forces.
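As a rough illustration of separating reflex-band movement from slow conscious or postural movement in the frequency domain, the Python sketch below uses a moving-average low-pass filter. The patent does not specify a filter; the window length, function names and RMS gain measure are illustrative assumptions.

```python
import math

def split_head_signal(signal, window=5):
    """Crude frequency-domain split of a head-movement trace (illustrative
    only; the patent does not specify a filter). A moving average keeps the
    slow, conscious/postural component; the residual keeps the fast,
    reflex-band component."""
    half = window // 2
    slow = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        slow.append(sum(signal[lo:hi]) / (hi - lo))
    fast = [s - m for s, m in zip(signal, slow)]
    return slow, fast

def reflex_gain(fast_component, perturbation):
    """Reflex gain as the RMS ratio of the fast (reflex-band) head component
    to the perturbation; a statistical reduction in this gain over time is
    the impairment cue described above."""
    rms = lambda xs: math.sqrt(sum(x * x for x in xs) / len(xs))
    p = rms(perturbation)
    return rms(fast_component) / p if p else 0.0
```

A purely static head trace yields a zero fast component (and zero reflex gain), while rapid oscillation against a perturbation yields a high gain.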
  • model-based filtering of the head movements can be used to classify head movements into reflex and other movements (such as conscious and long term postural) and may include the incorporation of supplementary driver behaviour information (such as steering wheel movement, head pose, eye closure, eye gaze, facial expressions).
  • the driver's head can be modelled as an inverted pendulum subject to measurable external perturbation forces that is primarily dampened by stabilisation reflexes (provided by the vestibular, optokinetic and somatosensory systems).
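The inverted-pendulum picture can be illustrated with a linearised, damped model. This is a hedged sketch only: the stiffness and damping values are arbitrary and not taken from the patent. Stronger reflex damping, corresponding to an attentive operator, yields a smaller peak head excursion for the same perturbation.

```python
def head_excursion(damping, stiffness=4.0, impulse=1.0, dt=0.01, steps=400):
    """Integrate a linearised damped pendulum,
    theta'' = -stiffness * theta - damping * theta',
    with an initial velocity impulse standing in for a vehicle perturbation,
    and return the peak angular excursion. Parameter values are arbitrary,
    illustrative choices, not from the patent."""
    theta, omega = 0.0, impulse
    peak = 0.0
    for _ in range(steps):
        # semi-implicit Euler step
        alpha = -stiffness * theta - damping * omega
        omega += alpha * dt
        theta += omega * dt
        peak = max(peak, abs(theta))
    return peak
```

With these assumed parameters, a strongly damped (attentive) response peaks well below a weakly damped (impaired) one for the same impulse.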
  • the second sensor 4 includes video-based head-pose tracking apparatus, such as FaceLAB, Driver State Sensor, or faceAPI available from Seeing Machines (http://www.seeingmachines.com), for monitoring the position of the head 5 of the operator 1 relative to the vehicle 2.
  • Tracking methods and systems can be as described in United States Patents US 7043056 and US 7460693, and PCT Patent Application Publication Numbers WO 2004/003849, WO 2007/062478, and WO 2008/106725, by the same applicant, which are hereby incorporated by cross-reference.
  • a video camera 7 is rigidly mounted to the vehicle 2, and directed towards the face of the driver. The position of the camera may be adjusted to allow for the capture of video of drivers of different heights.
  • the head-pose tracking apparatus further includes a processor 8 for manipulating video data received from the camera 7.
  • the system could incorporate other indicators obtained from driver behaviour monitoring, vehicle monitoring, and/or vehicle environment monitoring.
  • Monitoring of driver behaviour could include eye and gaze tracking, facial feature tracking (lips, eye brows etc), facial temperature, steering wheel hand grip pressure, and so on.
  • Vehicle monitoring could include vehicle speed, cabin temperature, suspension movements, steering wheel movements, and so on.
  • Vehicle environment monitoring could include GPS, collision detection, lane departure, traffic conditions, time of day, and so on.
  • a calibration phase is performed in which the driver 1 of the vehicle 2 is monitored. The calibration phase is undertaken when the driver is known to be operating in an optimal condition, i.e. when the driver is alert and unimpaired.
  • the system is operated to gather information regarding the driver's unimpaired response to vehicle acceleration conditions.
  • the processor 6 receives information from the first and second sensors 3 and 4, and uses the received information to build a model of the driver's physiological system for stabilising the head.
  • the model correlates input data indicative of vehicle acceleration conditions and output data indicative of expected head movement conditions in response to the vehicle acceleration conditions.
  • the model correlates the input and output data by means of transfer functions for calculating an expected head movement condition in dependence upon received input data.
  • the model is stored in memory 9 associated with the processor 6.
  • the input data of the calibration model includes at least one of the following input conditions: duration of acceleration of the vehicle, rate of change of acceleration of the vehicle, and magnitude and direction of acceleration of the vehicle.
  • the input data further includes at least one of the position and/or orientation of the head of the operator, and the position of a seat 10 in which the operator is seated.
  • the system includes a seat angle sensor 11 for detecting the angle at which the back of the driver's seat is set.
  • the calibration model is configured to be able to calculate at least one of the following expected response conditions in response to the input data: response time, position, orientation, duration of acceleration of the head, rate of change of acceleration of the head, magnitude and direction of acceleration of the head.
  • the calibration model is configured to provide a statistically-expected response.
  • the calibration model provides as its output the mean expected response and the standard deviation from that mean.
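A minimal sketch of such a statistically-expected response in Python, assuming responses are bucketed by acceleration magnitude alone. The patent's model conditions on several input variables; the bucketing scheme, class name and interface here are illustrative assumptions.

```python
import statistics

class CalibrationModel:
    """Illustrative sketch of the statistically-expected response: for each
    bucket of vehicle-acceleration magnitude, store the head responses
    observed while the driver is known to be unimpaired, and report their
    mean and standard deviation."""

    def __init__(self, bucket_width=0.5):
        self.bucket_width = bucket_width
        self.samples = {}          # bucket index -> observed responses

    def _bucket(self, accel):
        return round(accel / self.bucket_width)

    def observe(self, accel, response):
        """Record one (vehicle acceleration, head response) calibration pair."""
        self.samples.setdefault(self._bucket(accel), []).append(response)

    def expected(self, accel):
        """Return (mean, standard deviation) of the expected response."""
        xs = self.samples[self._bucket(accel)]
        mean = statistics.fmean(xs)
        std = statistics.stdev(xs) if len(xs) > 1 else 0.0
        return mean, std
```

Queries for accelerations falling in the same bucket as the calibration data return the mean response and its spread.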
  • the system may be used in an operation phase.
  • the processor 6 receives data indicative of the movement of the driver's head from the head-pose tracking system 4. The processor 6 uses this data to determine an actual response characteristic having the same nature as the expected response characteristic which will be output by the calibration model.
  • for example, in embodiments in which the calibration model outputs an expected response time, the processor determines an actual response time; in embodiments in which the calibration model is configured to output an expected response characteristic which includes a combination of some or all of the possible head acceleration conditions, the processor determines an actual response characteristic including the same combination of conditions.
  • the processor compares the actual response characteristic with the expected response characteristic determined using the calibration model. As a result of the comparison, the processor generates an alert condition in the event that the comparison satisfies one or more predetermined alert criteria. For example, in various embodiments, the processor generates the alert condition if the derived acceleration response characteristic deviates from the expected acceleration response characteristic by greater than a predetermined threshold value, or by greater than a predetermined ratio or gain value.
  • the predetermined alert criteria include the statistical deviation of the actual response characteristic from the expected response characteristic. For example, in one embodiment, the alert condition is raised in the event that an actual response characteristic deviates from a mean expected response characteristic by greater than a predetermined number of standard deviations.
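That standard-deviation criterion can be expressed as a simple z-score style test. This is an illustrative sketch; the choice of three standard deviations is an assumption, not taken from the patent.

```python
def alert_condition(actual, mean_expected, std_expected, n_sigma=3.0):
    """Raise the alert when the actual response characteristic deviates
    from the mean expected response by more than a predetermined number
    of standard deviations (n_sigma is an illustrative choice)."""
    if std_expected <= 0:
        return False               # no variance information: stay silent
    return abs(actual - mean_expected) > n_sigma * std_expected
```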
  • the system of the embodiment shown in Figure 1 includes an audio-visual driver alert device in the form of a warning lamp 12 and a sounder 13.
  • the system also includes a communications device 14 for communicating the existence of an alert condition with a third party.
  • the communications device 14 includes a radio transmitter for transmitting data indicative of the existence of the alert condition to the third party.
  • the data is transmitted to a mobile communications device held by the third party and/or a computer device for providing a third party with an audio-visual indication of the existence of an alert condition.
  • the system also includes a vehicle control management communications device, in the form of a communications port, for providing an input into the control system 15 of the vehicle 2 to effect a particular response to an alert condition.
  • the particular response includes at least one of: disablement of the vehicle, disablement of ancillary equipment attached to the vehicle, restriction of the maximum power of the vehicle or the speed of the vehicle, and so on.
  • a calibration model 8 is generated at step S2, as described above.
  • At steps S3 and S4, acceleration of the vehicle and movement of the head of the driver are detected substantially simultaneously, using respectively the inertial sensor 3 and the head-tracking apparatus of Figure 1. Also substantially simultaneously, the expected response characteristic is derived at step S5 with reference to the detected vehicle acceleration and the calibration model 8, and the actual response characteristic is derived at step S6 with reference to the detected head movement.
  • At step S7, the expected response characteristic and the actual response characteristic are compared with reference to predetermined criteria as indicated above.
  • In the event that the response characteristics do not correspond, that is to say if the predetermined criteria are not satisfied because the actual response characteristic deviates from the expected response characteristic in a statistically significant manner, an alert condition is generated at step S8.
  • an alert condition is communicated to the driver and/or to a third party and/or to a vehicle management system of the vehicle 2 itself.
  • although the steps of detecting acceleration and movement (S3, S4), deriving expected and actual response characteristics (S5, S6), comparing the characteristics (S7) and generating the alert condition (S8) have been described as occurring sequentially, it will be clear to the skilled person that comparison of the actual response with the expected response preferably occurs on an ongoing and continuous basis.
  • the head-pose tracking system 4 and the inertial sensor 3 have respective coordinate frames.
  • the inertial sensor 3 is mounted and/or the software of the head-pose tracking system 4 is configured such that these coordinate frames are parallel.
  • the coordinate frame of the calibration model, that is to say the coordinate frame in which movement of the head is considered to occur, is parallel to the coordinate frame of the head-pose tracking system 4 and/or the inertial sensor 3.
  • the coordinate frame is selected for simplicity of the model. For example, if movement in one direction is not possible, or is not considered significant, the coordinate frame may be orientated to have this as one of the axial directions.
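Aligning the sensor and model coordinate frames can be illustrated with a planar rotation. This is a hedged sketch: the patent simply prefers parallel mounting, and the function below is an illustrative assumption.

```python
import math

def to_model_frame(ax, ay, yaw_offset_rad):
    """Rotate a planar acceleration reading from the sensor's coordinate
    frame into the calibration model's frame. When the frames are mounted
    parallel, as the embodiment prefers, yaw_offset_rad is zero and the
    reading passes through unchanged."""
    c, s = math.cos(yaw_offset_rad), math.sin(yaw_offset_rad)
    return (c * ax - s * ay, s * ax + c * ay)
```

With a zero offset the reading is unchanged; with a quarter-turn offset, a purely longitudinal reading maps onto the lateral axis.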
  • the head-pose tracking apparatus has a dedicated processor 8 in addition to the main system processor 6. In alternative embodiments, a single processor is provided.
  • the invention has been disclosed with reference to monitoring an operator of machinery in the form of a vehicle.
  • machinery should be read broadly to include any moveable machinery having an onboard human operator, such as: personal, recreational, commercial or military transportation vehicles; industrial machinery such as cranes and earthmoving equipment; and test machinery for the monitoring of individuals under particular conditions for aeronautics research, and so on. This list of examples is not to be considered exhaustive.
  • the invention has been described with reference to the driver of a vehicle.
  • operator should also be read broadly to include any individual whose active participation in the control of the machinery is desirable. Examples include navigators and other crew members, and operators of ancillary equipment mounted on or controlled from the machinery, such as cranes, drills, earthmoving apparatus, and so on. This list of examples is not to be considered exhaustive.
  • any one of the terms comprising, comprised of or which comprises is an open term that means including at least the elements/features that follow, but not excluding others.
  • the term comprising, when used in the claims should not be interpreted as being limitative to the means or elements or steps listed thereafter.
  • the scope of the expression a device comprising A and B should not be limited to devices consisting only of elements A and B.
  • Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.
  • Coupled, when used in the claims, should not be interpreted as being limitative to direct connections only.
  • the terms “coupled” and “connected,” along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other.
  • the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means.
  • Coupled may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Psychology (AREA)
  • Psychiatry (AREA)
  • Hospice & Palliative Care (AREA)
  • Educational Technology (AREA)
  • Social Psychology (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention relates to a method and system for estimating the attention state of an operator of machinery, the operator being subject to accelerations while operating the machinery, the accelerations being a consequence of movements of the machinery in an external environment, comprising the steps of: (a) monitoring the accelerations of the environment around the user; (b) determining a corresponding estimated adaptation model of the operator, estimating the operator's likely response to the accelerations while attentive to the operation of the machinery; (c) determining a measure of the degree of divergence from the user's current adaptation to the accelerations experienced by the user when subjected to the accelerations in the environment around the user; and (d) outputting an indicator of said measure of the degree of divergence.
PCT/AU2010/000142 2009-02-11 2010-02-11 Method and system for monitoring an operator of machinery WO2010091464A1 (fr)
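As a rough illustration of steps (a)-(d) in the abstract, the sketch below compares measured head motion against a predicted "attentive" response to cabin accelerations and reports the divergence. The gain-plus-lag response model, all function names, and the threshold are assumptions made here for illustration only; the publication does not disclose a concrete algorithm.

```python
# Hypothetical sketch of steps (a)-(d): predict the head response of an
# attentive operator to monitored cabin accelerations, then measure how far
# the observed head motion diverges from that prediction.

def predicted_head_response(cabin_accel, gain=0.8, lag=1):
    """Step (b): estimated attentive response, modelled (as an assumption)
    as attenuated, delayed cabin motion."""
    return [0.0] * lag + [gain * a for a in cabin_accel[:len(cabin_accel) - lag]]

def divergence(measured_head, predicted_head):
    """Step (c): mean absolute deviation between measured and predicted response."""
    return sum(abs(m - p) for m, p in zip(measured_head, predicted_head)) / len(measured_head)

def attention_indicator(cabin_accel, measured_head, threshold=0.5):
    """Step (d): report the divergence measure and flag possible inattention
    when it exceeds an (assumed) threshold."""
    score = divergence(measured_head, predicted_head_response(cabin_accel))
    return {"divergence": score, "inattentive": score > threshold}

# Step (a) would supply cabin_accel from an accelerometer; here it is stubbed.
cabin = [0.0, 1.0, -1.0, 2.0, 0.5]
attentive_head = [0.0, 0.0, 0.8, -0.8, 1.6]  # tracks the model closely
result = attention_indicator(cabin, attentive_head)
```

In this toy run the operator's head motion matches the assumed attentive model, so the divergence is low and no inattention is flagged; a lolling or rigid head would score higher.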

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2009900535A AU2009900535A0 (en) 2009-02-11 Method and System for Monitoring an Operator of Machinery
AU2009900535 2009-02-11

Publications (1)

Publication Number Publication Date
WO2010091464A1 true WO2010091464A1 (fr) 2010-08-19

Family

ID=42561322

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2010/000142 WO2010091464A1 (fr) 2009-02-11 2010-02-11 Method and system for monitoring an operator of machinery

Country Status (1)

Country Link
WO (1) WO2010091464A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6661345B1 (en) * 1999-10-22 2003-12-09 The Johns Hopkins University Alertness monitoring system
US6756903B2 (en) * 2001-05-04 2004-06-29 Sphericon Ltd. Driver alertness monitoring system
WO2005052878A1 (fr) * 2003-11-30 2005-06-09 Volvo Technology Corporation Method and system for recognizing driver impairment
US6974414B2 (en) * 2002-02-19 2005-12-13 Volvo Technology Corporation System and method for monitoring and managing driver attention loads
US20060042851A1 (en) * 2004-09-02 2006-03-02 Thomas Herrmann Passenger-protection device in a vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Intelligent Vehicles Symposium, 2003. Proceedings. IEEE, 9 June 2003 - 11 June 2003", 2003, article POPIEUL ET AL.: "Using Driver's Head Movements Evolution as a Drowsiness Indicator", pages: 616 - 621 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014120053A1 (fr) * 2013-01-30 2014-08-07 Telefonaktiebolaget L M Ericsson (Publ) Behavior detection description for safe driving and automobile control based on the detection result
GB2541937A (en) * 2015-09-07 2017-03-08 Bae Systems Plc A monitoring system for a military vehicle
GB2541936A (en) * 2015-09-07 2017-03-08 Bae Systems Plc A monitoring system for a military vehicle
EP3239011A1 (fr) * 2016-04-28 2017-11-01 Toyota Jidosha Kabushiki Kaisha Driving consciousness estimation device
US10640122B2 (en) 2016-04-28 2020-05-05 Toyota Jidosha Kabushiki Kaisha Driving consciousness estimation device
WO2019176492A1 (fr) * 2018-03-15 2019-09-19 Omron Corporation Calculation system, information processing device, driving assistance system, index calculation method, computer program, and storage medium
WO2023020848A1 (fr) * 2021-08-20 2023-02-23 Cariad Se Method for determining the fitness to drive of a driver of a motor vehicle
WO2023242842A1 (fr) * 2022-06-15 2023-12-21 Ze Corractions Ltd. Method and system for monitoring the cognitive state of a subject

Similar Documents

Publication Publication Date Title
US10569650B1 (en) System and method to monitor and alert vehicle operator of impairment
US20210009149A1 (en) Distractedness sensing system
JP6911841B2 (ja) Image processing device, image processing method, and moving body
JP6699831B2 (ja) Driving consciousness estimation device
US9101313B2 (en) System and method for improving a performance estimation of an operator of a vehicle
US9763614B2 (en) Wearable device and system for monitoring physical behavior of a vehicle operator
WO2010091464A1 (fr) Method and system for monitoring an operator of machinery
JP2021113046A (ja) Method and system for controlling vehicle body motion and occupant experience
US9694680B2 (en) System and method for determining drowsy state of driver
JP7099037B2 (ja) Data processing device, monitoring system, awakening system, data processing method, and data processing program
EP2743117A1 (fr) Système et procédé pour surveiller et réduire la déficience d'un opérateur de véhicule
US20150158494A1 (en) Method and apparatus for determining carelessness of driver
JP2008079737A (ja) Concentration level evaluation device and vehicle display device using the same
US20210290134A1 (en) Systems and methods for detecting drowsiness in a driver of a vehicle
JP2023093454A (ja) Abnormality detection device
US10684695B2 (en) Data processing device, monitoring system, awakening system, data processing method, and data processing program
WO2019155914A1 (fr) Data processing device, monitoring system, alertness system, data processing method, data processing program, and storage medium
Chatterjee et al. Driving fitness detection: A holistic approach for prevention of drowsy and drunk driving using computer vision techniques
Vergnano et al. A methodology for out of position occupant identification from pressure sensors embedded in a vehicle seat
KR20210158525A (ko) Motion sickness reduction system for autonomous vehicle driving
Bhavya et al. Intel-Eye: An Innovative System for Accident Detection, Warning and Prevention Using Image Processing (A Two-Way Approach in Eye Gaze Analysis)
Zajic et al. Video-based assistance for autonomous driving
Dubey et al. Micro-Sleep Accident Prevention by SMART Vehicle Using AI and Image Processing
Karanam et al. Driver drowsiness estimation using iot and image processing

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 10740839

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 10740839

Country of ref document: EP

Kind code of ref document: A1