WO2019155914A1 - Data processing device, monitoring system, awakening system, data processing method, data processing program, and storage medium - Google Patents

Data processing device, monitoring system, awakening system, data processing method, data processing program, and storage medium

Info

Publication number
WO2019155914A1
WO2019155914A1 (application PCT/JP2019/002466)
Authority
WO
WIPO (PCT)
Prior art keywords
data
movement
data processing
unit
person
Prior art date
Application number
PCT/JP2019/002466
Other languages
English (en)
Japanese (ja)
Inventor
成典 長江
航一 木下
あゆみ 竹本
向井 仁志
倭 竹内
Original Assignee
OMRON Corporation (オムロン株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OMRON Corporation
Publication of WO2019155914A1

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/18 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/16 Anti-collision systems

Definitions

  • the present invention relates to a data processing device, a monitoring system, an awakening system, a data processing method, a data processing program, and a storage medium.
  • Patent Document 1 discloses a drowsiness sign detection device that uses the vestibulo-ocular reflex induced by head movement to detect a sign of drowsiness before a vehicle driver or the like becomes aware of it.
  • The drowsiness sign detection device described in Patent Document 1 includes: head movement detection means for detecting head movement; eye movement detection means for detecting eye movement; ideal eyeball rotation angular velocity calculation means for calculating an ideal eyeball rotation angular velocity based on the head movement data detected by the head movement detection means; eyeball rotation angular velocity calculation means for calculating an eyeball rotation angular velocity based on the eye movement data detected by the eye movement detection means; and drowsiness sign determination means for detecting the vestibulo-ocular reflex (VOR) from the ideal eyeball rotation angular velocity and the eyeball rotation angular velocity and determining a sign of drowsiness based on the vestibulo-ocular reflex.
  • Patent Document 1 discloses the results of a test conducted using an experimental system that simulates vehicle driving, that is, a driving simulator system, in which the subject is given a test task such as fixing the gaze on the upper part of the number plate of a forward vehicle projected on a screen.
  • However, the pseudo experimental environment using the driving simulator system is greatly different from the actual driving environment of a vehicle, and the present inventors have found that it is extremely difficult to accurately acquire the vestibulo-ocular reflex movement in an actual vehicle environment.
  • Eye movement includes saccade movement (also referred to as impulsive eye movement), vergence movement, and the like in addition to the vestibulo-ocular reflex movement.
  • In the pseudo experimental environment, a predetermined gaze point can be kept fixed. In the actual driving environment, however, the road surface condition, the behavior of the vehicle, and the movement of the driver's head and eyes are not constant, so many eye movements other than the vestibulo-ocular reflex movement occur.
  • The vestibulo-ocular reflex movement is induced by head movement. In the driving simulator system, the driver's seat can be vibrated to induce head movement, but in a real environment a state in which the head vibrates does not always occur conveniently. Therefore, there has been a problem that it is difficult to accurately determine whether an eye movement is the vestibulo-ocular reflex movement.
  • Likewise, it is difficult to accurately determine whether an eye movement is the vestibulo-ocular reflex movement in various other real environments, such as equipment operation environments and work environments.
  • The present invention has been made in view of the above problems, and its purpose is to provide a data processing device, a monitoring system, an awakening system, a data processing method, a data processing program, and a storage medium capable of improving the monitoring accuracy of the vestibulo-ocular reflex movement in a real environment.
  • A data processing device (1) according to the present disclosure is a data processing device that performs data processing for monitoring a person, and includes: a state determination unit that determines whether the state of the person or of an object operated by the person is a predetermined state suitable for calculating the vestibulo-ocular reflex movement of the person; a measurement unit that measures the pupil movement and head movement of the person; and an adding unit that adds, to data related to the head movement and pupil movement of the person measured by the measurement unit, identification information indicating the determination result of the state determination unit.
  • According to the data processing device (1), the identification information added by the adding unit makes it possible to appropriately identify whether the data measured by the measurement unit was measured while in the predetermined state suitable for calculating the vestibulo-ocular reflex movement. Therefore, by using the data measured in the predetermined state on the basis of the identification information, the monitoring accuracy of the vestibulo-ocular reflex movement in a real environment can be improved.
  • The data may be the data on the person's head movement and pupil movement itself, or may be values (calculated values) calculated from the data on the person's head movement and pupil movement.
  • According to this configuration, the person or the object can be prompted into a state suitable for calculating the person's vestibulo-ocular reflex movement, in other words, a state in which the signal-to-noise (SN) ratio of the data used for calculating the vestibulo-ocular reflex movement is increased. Therefore, even when it is determined in the actual environment that the predetermined state is not established, the monitoring accuracy of the vestibulo-ocular reflex movement can be improved. Further, since unnecessary intervention control is not performed, power saving can be realized.
  • Regarding the signal-to-noise ratio, the signal refers to eye movement due to the vestibulo-ocular reflex movement, and the noise refers to eye movement other than the vestibulo-ocular reflex movement (for example, saccade movement, vergence movement, and the like).
  • A data processing device (3) according to the present disclosure is characterized in that, in the data processing device (1) or (2), it includes a calculation unit that calculates the vestibulo-ocular reflex movement using the data and the identification information.
  • According to the data processing device (3), since the calculation unit calculates the vestibulo-ocular reflex movement using the data and the identification information, the data measured while in the predetermined state can be identified using the identification information, and by using that data the vestibulo-ocular reflex movement can be accurately monitored even in a real environment.
  • A data processing device (4) according to the present disclosure is characterized in that, in the data processing device (3), the adding unit gives first identification information to the data measured while in the predetermined state and gives second identification information to the data measured while not in the predetermined state, and the calculation unit calculates the vestibulo-ocular reflex movement using, among the data, the data to which the first identification information has been given.
  • According to the data processing device (4), the first identification information is given to the data measured while in the predetermined state, and the second identification information is given to the data measured while not in the predetermined state. Since the vestibulo-ocular reflex movement is calculated using the data to which the first identification information has been given, the monitoring accuracy of the vestibulo-ocular reflex movement can be reliably improved on the basis of the determination by the state determination unit.
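  • For illustration only, this labeling and selection scheme can be sketched as follows in Python (the names MeasurementSample, tag_samples, and select_for_vor, and the use of integer labels, are assumptions and not part of the disclosure):

        from dataclasses import dataclass
        from typing import List

        # Hypothetical labels standing in for the first / second identification information.
        LABEL_SUITABLE = 1      # measured while in the predetermined state
        LABEL_UNSUITABLE = 2    # measured while not in the predetermined state

        @dataclass
        class MeasurementSample:
            timestamp: float
            head_motion: float      # e.g. head rotation per frame [deg]
            pupil_motion: float     # e.g. pupil displacement per frame [deg]
            label: int = LABEL_UNSUITABLE

        def tag_samples(samples: List[MeasurementSample], in_state: List[bool]) -> None:
            """Adding-unit analogue: attach identification information to each sample."""
            for sample, ok in zip(samples, in_state):
                sample.label = LABEL_SUITABLE if ok else LABEL_UNSUITABLE

        def select_for_vor(samples: List[MeasurementSample]) -> List[MeasurementSample]:
            """Calculation-unit analogue: keep only data carrying the first identification information."""
            return [s for s in samples if s.label == LABEL_SUITABLE]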
  • A data processing device (5) according to the present disclosure is characterized in that, in the data processing device (2), it includes a calculation unit that calculates the vestibulo-ocular reflex movement using the data and the identification information, and a data determination unit that determines whether the data measured by the measurement unit after the intervention control by the intervention control unit is data suitable for calculating the vestibulo-ocular reflex movement; the adding unit attaches first identification information to the data determined by the data determination unit to be suitable data among the data measured by the measurement unit, and attaches second identification information to the data determined not to be suitable data; and the calculation unit calculates the vestibulo-ocular reflex movement using, among the data, the data to which the first identification information has been given.
  • According to the data processing device (5), among the data measured after the intervention control is performed, the first identification information is given to the data determined to be suitable data, and the second identification information is given to the data determined not to be suitable data.
  • A data processing device (6) according to the present disclosure is characterized in that, in any one of the data processing devices (3) to (5), it includes a drowsiness determination unit that determines the drowsiness level of the person using the vestibulo-ocular reflex movement calculated by the calculation unit.
  • According to the data processing device (6), the drowsiness determination unit can accurately determine the drowsiness level in a real environment.
  • A data processing device (7) according to the present disclosure is characterized in that, in the data processing device (6), it includes an awakening control unit that performs control for awakening the person based on the drowsiness level determined by the drowsiness determination unit.
  • According to the data processing device (7), the awakening control unit can perform control to appropriately awaken the person according to the drowsiness level.
  • A data processing device (8) according to the present disclosure is characterized in that the object is a vehicle and the person is a driver who drives the vehicle.
  • According to the data processing device (8), the driver's vestibulo-ocular reflex movement can be accurately calculated in an actual vehicle environment.
  • A data processing device (9) according to the present disclosure is characterized in that, in the data processing device (8), the state determination unit determines whether the predetermined state is established based on at least one of the noise included in the data, the driver's line-of-sight direction, the driving state of the vehicle, and the detection state of an object existing in the traveling direction of the vehicle.
  • According to the data processing device (9), whether the predetermined state is established can be determined based on various states of the vehicle or the driver, so the monitoring accuracy can be improved.
  • The noise included in the data refers to eyeball and head movement components that hinder the calculation of the vestibulo-ocular reflex movement, for example, eye movement components other than the vestibulo-ocular reflex movement.
  • A data processing device (10) according to the present disclosure is characterized in that, in the data processing device (8), it includes an acquisition unit that acquires the acceleration of the vehicle, and the state determination unit determines that the predetermined state is established when the vehicle acceleration acquired by the acquisition unit and the driver's head movement or pupil movement measured by the measurement unit are in a predetermined relationship.
  • According to the data processing device (10), it can be determined that the predetermined state is established when the vehicle acceleration and the driver's head movement or pupil movement are in the predetermined relationship, so the vestibulo-ocular reflex movement can be calculated with high accuracy.
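  • A minimal Python sketch of one way such a "predetermined relationship" between vehicle acceleration and head movement might be tested; the correlation-based criterion and both threshold values are assumptions for illustration, not the method defined in the disclosure:

        import numpy as np

        def in_predetermined_state(vehicle_acc: np.ndarray,
                                   head_disp: np.ndarray,
                                   min_acc_std: float = 0.3,   # assumed [m/s^2]
                                   min_corr: float = 0.6) -> bool:
            """Treats the state as established when the acceleration is large enough to
            shake the head and the two signals are strongly correlated over the window."""
            if vehicle_acc.std() < min_acc_std:
                return False                  # too little excitation to induce head movement
            corr = np.corrcoef(vehicle_acc, head_disp)[0, 1]
            return abs(corr) >= min_corr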
  • A monitoring system (1) according to the present disclosure includes any one of the data processing devices (1) to (10) and a camera that captures an image including the person, and the measurement unit of the data processing device measures the pupil movement and head movement of the person using the image acquired from the camera.
  • Since the monitoring system (1) is configured with any one of the data processing devices (1) to (10) and the camera, it provides the effects of whichever of the data processing devices (1) to (10) it includes.
  • An awakening system according to the present disclosure includes the data processing device (7) and an awakening device controlled by the awakening control unit of the data processing device (7).
  • According to the awakening system, since the awakening device is controlled by the awakening control unit, the person can be awakened by the awakening device.
  • A data processing method according to the present disclosure is a data processing method for monitoring a person, and includes: a measurement step of measuring the pupil movement and head movement of the person; a state determination step of determining whether the state of the person or of an object operated by the person is a predetermined state suitable for calculating the vestibulo-ocular reflex movement of the person; and an adding step of adding, to data related to the head movement and pupil movement of the person measured in the measurement step, identification information indicating the result determined in the state determination step.
  • According to the data processing method, the identification information added in the adding step makes it possible to appropriately identify whether the data measured in the measurement step was measured while in the predetermined state suitable for calculating the vestibulo-ocular reflex movement. Therefore, by using the data measured in the predetermined state on the basis of the identification information, the monitoring accuracy of the vestibulo-ocular reflex movement in a real environment can be improved.
  • A data processing program according to the present disclosure is a data processing program for causing at least one computer to execute data processing for monitoring a person, the program causing the at least one computer to execute: a measurement step of measuring the pupil movement and head movement of the person; a state determination step of determining whether the state of the person or of an object operated by the person is a predetermined state suitable for calculating the vestibulo-ocular reflex movement of the person; and an adding step of adding, to data related to the head movement and pupil movement of the person measured in the measurement step, identification information indicating the result determined in the state determination step.
  • According to the data processing program, the identification information added in the adding step makes it possible to appropriately identify whether the data measured in the measurement step was measured while in the predetermined state suitable for calculating the vestibulo-ocular reflex movement. Therefore, by using the data measured in the predetermined state on the basis of the identification information, data processing capable of improving the monitoring accuracy of the vestibulo-ocular reflex movement in a real environment can be realized.
  • A computer-readable storage medium according to the present disclosure stores a data processing program for causing at least one computer to execute data processing for monitoring a person, the program causing the at least one computer to execute: a measurement step of measuring the pupil movement and head movement of the person; a state determination step of determining whether the state of the person or of an object operated by the person is a predetermined state suitable for calculating the vestibulo-ocular reflex movement of the person; and an adding step of adding, to data related to the head movement and pupil movement of the person measured in the measurement step, identification information indicating the result determined in the state determination step.
  • According to the storage medium, by causing the at least one computer to read the program and execute the steps, the identification information added in the adding step makes it possible to appropriately identify whether the data measured in the measurement step was measured while in the predetermined state suitable for calculating the vestibulo-ocular reflex movement. Therefore, by using the data measured in the predetermined state on the basis of the identification information, a data processing device capable of improving the monitoring accuracy of the vestibulo-ocular reflex movement in a real environment can be realized.
  • FIG. 1 is a schematic diagram illustrating an example in which a data processing apparatus according to an embodiment is applied to a monitoring system.
  • The monitoring system 1 includes a data processing device 10 that is mounted on a vehicle 2 and performs data processing for monitoring the driver 3 of the vehicle 2, and a camera 20 that captures an image including the face of the driver 3.
  • In this example the vehicle 2 is an automobile, but it may be another type of vehicle such as a motorcycle, and the type of the vehicle 2 is not particularly limited. The vehicle 2 may also be an autonomous driving vehicle, at any of the automated driving levels defined by SAE International (the Society of Automotive Engineers): level 1 (driver assistance), level 2 (partial automation), level 3 (conditional automation), level 4 (high automation), or level 5 (full automation).
  • the data processing apparatus 10 includes at least one control unit and a storage unit.
  • the control unit includes one or more hardware processors such as a central processing unit (CPU) and a graphics processing unit (GPU).
  • The storage unit is composed of one or more storage devices capable of storing data, such as a random access memory (RAM), a read-only memory (ROM), a hard disk drive (HDD), a solid state drive (SSD), a flash memory, or other non-volatile or volatile memory or semiconductor elements.
  • The data processing device 10 is configured to be connectable to various devices mounted on the vehicle 2, such as an in-vehicle sensor 30, a start switch 40, a navigation device 50, an electronic control unit (ECU) 60, and a vibration device 70.
  • The data processing device 10 can acquire various detection data detected by the in-vehicle sensor 30 and the ON/OFF signal of the start switch 40, and can output predetermined control signals and the like to the navigation device 50, the electronic control unit 60, the vibration device 70, and the like.
  • In each of these devices, a predetermined control operation is executed based on the predetermined control signal acquired from the data processing device 10.
  • the electronic control unit 60 includes one or more electronic control units that control each part of the vehicle 2 such as a driving unit, a braking unit, a steering unit, or a suspension unit.
  • the data processing device 10 may be configured to be able to communicate with the navigation device 50, the electronic control unit 60, or the vibration device 70 via an in-vehicle network such as CAN (Controller Area Network).
  • One or more vibration devices 70 are disposed, for example, in the lower portion, back portion, or headrest of the seat of the driver 3, and are devices that can continuously or intermittently vibrate the upper body or head of the driver 3 at a predetermined period in the up-down, front-rear, or left-right direction, or in the yaw or pitch direction.
  • data indicating the driver's pupil movement and head movement includes many movement components different from the vestibulo-ocular reflex movement, for example, components (noise components) such as saccade movement and convergence movement.
  • As a result, the signal-to-noise (SN) ratio of the data used for detecting the vestibulo-ocular reflex movement is likely to deteriorate, and it is difficult to accurately detect the driver's vestibulo-ocular reflex movement.
  • The vestibulo-ocular reflex movement is an involuntary eye movement that suppresses blurring of the retinal image by moving the eyeball in the direction opposite to the head movement.
  • Therefore, so that data with a good SN ratio can be identified, the data processing device 10 determines whether the driver 3 or the vehicle 2 is in a predetermined state suitable for calculating the vestibulo-ocular reflex movement of the driver 3, analyzes the image captured by the camera 20 to measure the pupil movement and head movement of the driver 3, and performs a process of adding identification information indicating the result of the determination to the measured data related to the pupil movement and head movement of the driver 3.
  • The data related to the pupil movement and head movement of the driver 3 to which the identification information is given may be the head movement and pupil movement data of the driver 3 itself, or may be values (calculated values) calculated from the head movement and pupil movement data of the driver 3.
  • For example, the identification information may be added to a calculated value such as a coefficient indicating the correlation between the head movement and the pupil movement.
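  • As an illustration of such a calculated value, a minimal Python sketch follows; treating the head/eye velocity correlation as the coefficient, and the dictionary key names, are assumptions rather than details from the disclosure:

        import numpy as np

        def head_pupil_correlation(head_vel: np.ndarray, eye_vel: np.ndarray) -> float:
            """Illustrative calculated value: during the vestibulo-ocular reflex the eye
            rotates opposite to the head, so the velocity correlation is strongly negative."""
            return float(np.corrcoef(head_vel, eye_vel)[0, 1])

        # Attaching identification information to the calculated value (labels are hypothetical):
        record = {
            "vor_correlation": head_pupil_correlation(np.array([1.0, -0.5, 0.8]),
                                                      np.array([-0.9, 0.4, -0.7])),
            "identification": "measured_in_predetermined_state",
        }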
  • The predetermined state includes a state in which the head of the driver 3 is likely to vibrate, in other words, a state in which the signal component of the vestibulo-ocular reflex movement, in particular its displacement amount, increases. More specifically, it includes a state in which the head of the driver 3 is likely to be displaced or vibrated in the up-down, left-right, or front-rear direction, or in the yaw or pitch direction.
  • The predetermined state may also include a state in which eye movements other than the vestibulo-ocular reflex movement, for example saccade movement or vergence movement, are less likely to occur, in other words, a state in which the noise component relative to the vestibulo-ocular reflex movement is reduced. More specifically, a state in which the vehicle 2 is traveling on a straight road or a state in which the driver 3 is gazing at a specific location may be included.
  • The data processing device 10 uses the identification information to distinguish the data measured while in the predetermined state from the data measured while not in the predetermined state, and calculates the vestibulo-ocular reflex movement. Through these processes, the monitoring accuracy of the vestibulo-ocular reflex movement in an actual vehicle environment is improved.
  • FIG. 2 is a block diagram illustrating an example of a hardware configuration of the monitoring system 1 according to the embodiment.
  • the data processing apparatus 10 includes a data acquisition unit 11, a control unit 12, and a storage unit 13.
  • The data acquisition unit 11 is connected to the in-vehicle sensor 30, the start switch 40, the navigation device 50, the electronic control unit 60, the vibration device 70, and the like in addition to the camera 20, and includes an interface circuit, connection connectors, and the like for exchanging various signals and data with these devices.
  • the control unit 12 includes a state determination unit 12a, a measurement unit 12b, a grant unit 12c, an intervention control unit 12d, a data determination unit 12e, a calculation unit 12f, a drowsiness determination unit 12g, and a wakefulness control unit 12h.
  • the storage unit 13 includes an image storage unit 13a, an acquisition data storage unit 13b, a measurement data storage unit 13c, and a program storage unit 13d.
  • the image data of the driver acquired from the camera 20 is stored in the image storage unit 13a.
  • Data acquired from the in-vehicle sensor 30, the navigation device 50, the electronic control unit 60, or the like is stored in the acquired data storage unit 13b.
  • In the measurement data storage unit 13c, data indicating the driver's pupil movement and head movement measured by the measurement unit 12b is stored in association with each image stored in the image storage unit 13a.
  • the program storage unit 13d stores a data processing program executed by each unit of the control unit 12, data necessary for execution and determination processing of the program, and the like.
  • the control unit 12 performs processing for storing image data, measurement data, and the like in the storage unit 13, reads out programs and data stored in the storage unit 13, and executes these programs.
  • By executing these programs in cooperation with the storage unit 13, the control unit 12 realizes the operations of the state determination unit 12a, the measurement unit 12b, the adding unit 12c, the intervention control unit 12d, the data determination unit 12e, the calculation unit 12f, the drowsiness determination unit 12g, and the awakening control unit 12h.
  • the camera 20 is a device that captures an image including the face of the driver 3 and includes, for example, a lens unit, an image sensor unit, a light irradiation unit, and a control unit that controls these units (not shown).
  • the image sensor section includes, for example, an image sensor such as a CCD (Charge-Coupled Device) and a CMOS (Complementary Metal-Oxide Semiconductor), a filter, a microlens, and the like.
  • The image sensor section may include, in addition to elements that form a captured image by receiving light in the visible region, an infrared-capable element such as a CCD, a CMOS, or a photodiode that can form a captured image by receiving ultraviolet or infrared light.
  • the light irradiation unit includes a light emitting element such as an LED (Light Emitting Diode), and an infrared LED or the like may be used so that the state of the driver can be imaged regardless of day or night.
  • the control unit includes, for example, a CPU, a memory, an image processing circuit, and the like.
  • the control unit controls the imaging element unit and the light irradiation unit, emits light (for example, near infrared rays) from the light irradiation unit, and controls the imaging element unit to capture the reflected light.
  • the camera 20 captures an image at a predetermined frame rate (for example, 30 to 60 frames per second), and the image data captured by the camera 20 is output to the data processing device 10.
  • the camera 20 is composed of one unit, but may be composed of two or more units.
  • the camera 20 may be configured separately from the data processing device 10 (separate housing), or may be configured integrally with the data processing device 10 (same housing).
  • the camera 20 may be a monocular camera or a stereo camera.
  • the installation position of the camera 20 in the passenger compartment is not particularly limited as long as it can capture at least the field of view including the face of the driver 3.
  • it may be installed in a steering part, a steering column part, a meter panel part, a position in the vicinity of a rearview mirror, an A pillar part, or a navigation device 50.
  • Information including the specifications of the camera 20 (angle of view, number of pixels (vertical × horizontal), etc.) and its position and orientation (mounting angle, distance from a predetermined origin such as the center position of the steering wheel, etc.) may be stored in the data processing device 10.
  • the in-vehicle sensor 30 includes an out-of-vehicle sensor 31, an acceleration sensor 32, a gyro sensor 33, a steering sensor 34, and the like, but may include other sensors.
  • the vehicle exterior sensor 31 is a sensor that detects an object existing around the vehicle 2.
  • the object may include road markings such as white lines, guardrails, median strips, and other structures that affect the traveling of the vehicle 2.
  • The vehicle exterior sensor 31 includes at least one of a front monitoring camera, a rear monitoring camera, a radar, a lidar (Light Detection and Ranging, or Laser Imaging Detection and Ranging: LIDAR), and an ultrasonic sensor.
  • the detection data of the object detected by the outside sensor 31 may be output to the electronic control unit 60 in addition to being output to the data processing device 10.
  • the radar detects a position, a direction, a distance, and the like of an object by transmitting radio waves such as millimeter waves around the vehicle and receiving radio waves reflected by the object existing around the vehicle.
  • The lidar detects the position, direction, distance, and the like of an object by transmitting laser light around the vehicle and receiving the light reflected by objects present around the vehicle.
  • the acceleration sensor 32 is a sensor that detects the acceleration of the vehicle 2, and a triaxial acceleration sensor that detects acceleration in three directions of the XYZ axes may be used, or a biaxial, uniaxial acceleration sensor may be used.
  • As the triaxial acceleration sensor, a semiconductor acceleration sensor such as a piezoresistive type may be used in addition to the capacitance type.
  • the acceleration data detected by the acceleration sensor 32 may be output to the navigation device 50 or the electronic control unit 60 in addition to being output to the data processing device 10.
  • the gyro sensor 33 is an angular velocity sensor that detects the rotational angular velocity (for example, yaw rate) of the vehicle 2.
  • the rotational angular velocity signal detected by the gyro sensor 33 may be output to the data processing device 10 or may be output to the navigation device 50 or the electronic control unit 60.
  • the steering sensor 34 is a sensor that detects a steering amount with respect to the steering of the vehicle 2, and is provided, for example, on a steering shaft of the vehicle 2, and detects a steering torque or steering angle given to the steering by the driver 3.
  • a signal detected by the steering sensor 34 and corresponding to the steering operation of the driver 3 may be output to the electronic control unit 60 in addition to being output to the data processing device 10.
  • the navigation device 50 includes a control unit, a display unit, a voice output unit, an operation unit, a map data storage unit, a GPS reception unit, etc. (not shown).
  • The navigation device 50 determines the road or lane on which the vehicle 2 is traveling based on the position information of the vehicle 2 measured by the GPS receiver and the map information in the map data storage unit, calculates the route from the current position of the vehicle 2 to the destination, displays the route on the display unit (not shown), and outputs voice guidance such as route guidance from the voice output unit (not shown).
  • the configuration may be such that the position information of the vehicle 2, the information on the traveling road, the information on the planned traveling route, and the like obtained by the navigation device 50 are output to the data processing device 10.
  • the awakening device 80 is a device controlled by the awakening control unit 12h of the data processing device 10, and executes an operation for waking up the driver 3 based on a control signal from the awakening control unit 12h.
  • The awakening device 80 can be configured with, for example, an alarm device that warns the driver 3 by sound or light, an air conditioner that blows cold air, warm air, or a gas containing a fragrance or odor component toward the driver 3, or a vibration device that vibrates the steering wheel, the seat belt, or the like.
  • the image data of the camera 20 acquired by the data acquisition unit 11 is stored in the image storage unit 13a.
  • the data of the vehicle-mounted sensor 30, the navigation device 50, or the electronic control unit 60 acquired by the data acquisition unit 11 is stored in the acquisition data storage unit 13b.
  • The state determination unit 12a reads the data used for state determination from the image storage unit 13a or the acquired data storage unit 13b, and using the read data, performs a process of determining whether the state of the driver 3 or the vehicle 2 is the predetermined state suitable for calculating the vestibulo-ocular reflex movement.
  • The predetermined state includes a state in which the head of the driver 3 is likely to vibrate, in other words, a state in which the signal component of the vestibulo-ocular reflex movement, in particular its displacement amount, increases. More specifically, it includes a state in which the head of the driver 3 is likely to be displaced or vibrated in the up-down, left-right, or front-rear direction, or in the yaw or pitch direction.
  • The predetermined state also includes a state in which eye movements other than the vestibulo-ocular reflex movement (for example, saccade movement or vergence movement) are unlikely to occur, in other words, a state in which the noise component relative to the vestibulo-ocular reflex movement is reduced. More specifically, it includes a state in which the vehicle 2 is traveling on a straight road or a state in which the driver 3 is gazing at a specific location.
  • When it is determined that the predetermined state is established, the state determination unit 12a outputs a first determination signal indicating that the predetermined state is established to the measurement unit 12b.
  • When it is determined that the predetermined state is not established, the state determination unit 12a outputs a second determination signal indicating that the predetermined state is not established to the intervention control unit 12d.
  • When the first determination signal is acquired from the state determination unit 12a, the measurement unit 12b performs a process of measuring the pupil movement and head movement of the driver 3 from the image captured by the camera 20. For example, the measurement process is performed for each frame of the image, but may be performed at predetermined frame intervals. Thereafter, the measurement unit 12b outputs, to the adding unit 12c, a signal (assignment instruction signal) instructing it to add identification information.
  • the measurement unit 12b performs processing for measuring the pupil movement and head movement of the driver 3 even when an intervention execution signal is acquired from the intervention control unit 12d.
  • Thereafter, the measurement unit 12b outputs, to the data determination unit 12e, a signal (determination instruction signal) instructing it to determine whether the measurement data is data suitable for calculating the vestibulo-ocular reflex movement.
  • the measurement unit 12b detects the face (face area) of the driver 3 from an image captured by the camera 20 by template matching.
  • the face area may be detected using a face template image prepared in advance.
  • the measurement unit 12b detects the position of the pupil from the face region by template matching for the face region of the driver 3 detected from the image.
  • the pupil position may be detected using a pupil template image prepared in advance.
  • the measuring unit 12b detects the position of the pupil for each frame of the image, and measures the pupil movement from the position change (movement amount) of the pupil for each frame.
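  • For illustration, the per-frame template-matching measurement of pupil position described above might look like the following Python/OpenCV sketch; the function name and the two-stage face-then-pupil search are assumptions, and only cv2.matchTemplate and cv2.minMaxLoc are standard OpenCV calls:

        import cv2
        import numpy as np

        def track_pupil(frames, face_template, pupil_template):
            """Per-frame pupil localisation by template matching (simplified sketch of the
            measurement unit 12b; template images are assumed to be prepared in advance)."""
            positions = []
            for frame in frames:
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                # 1) locate the face region in the frame
                res = cv2.matchTemplate(gray, face_template, cv2.TM_CCOEFF_NORMED)
                _, _, _, (fx, fy) = cv2.minMaxLoc(res)
                fh, fw = face_template.shape[:2]
                face_roi = gray[fy:fy + fh, fx:fx + fw]
                # 2) locate the pupil inside the face region
                res = cv2.matchTemplate(face_roi, pupil_template, cv2.TM_CCOEFF_NORMED)
                _, _, _, (px, py) = cv2.minMaxLoc(res)
                positions.append((fx + px, fy + py))
            # pupil movement = frame-to-frame displacement of the detected position
            return np.diff(np.array(positions, dtype=float), axis=0)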
  • the measurement unit 12b extracts the face (face area) of the driver 3 from an image captured by the camera 20 by template matching.
  • the face area may be detected using a face template image prepared in advance.
  • the measurement unit 12b detects the position of the eye from the face region by template matching for the face region of the driver 3 detected from the image.
  • the eye position may be detected by using an eye template image prepared in advance. Coordinates indicating the positions of the corners of the eyes and the eyes are linked in advance to the eye template image. From the coordinates of the corners of the eyes and the eyes, the positions of the eyes and the eyes of the driver 3 in the image can be detected. Since the positions of the corners of the eyes and the eyes do not move depending on the opening / closing operation of the eyes such as blinking, it can be estimated that the position changes of the corners of the eyes and the eyes are moved by the head movement. Then, the measurement unit 12b detects the positions of the corners of the eyes and the eyes of the driver 3 for each frame of the image, and measures the head movement from the positional changes (movement amounts) of the corners of the eyes and the eyes for each frame.
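  • A minimal sketch of how head movement could be derived from the per-frame eye-corner positions described above; the array layout and the use of the corner midpoint as a head-position proxy are assumptions for illustration:

        import numpy as np

        def head_motion_from_eye_corners(corner_positions: np.ndarray) -> np.ndarray:
            """corner_positions: shape (n_frames, 2, 2) holding the (x, y) positions of the
            outer and inner eye corners per frame. Since these landmarks do not move with
            blinking, their frame-to-frame displacement serves as a proxy for head movement."""
            centers = corner_positions.mean(axis=1)   # midpoint of the two corners per frame
            return np.diff(centers, axis=0)           # per-frame head displacement [px]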
  • the monitoring system 1 may be equipped with a three-dimensional image measurement unit.
  • the three-dimensional image measurement unit acquires a three-dimensional image (distance image) in which each pixel of a captured image has a value of distance to the target (information on depth).
  • the three-dimensional image measurement unit may be a passive measurement unit such as a stereo method, or may be an active measurement unit that projects light such as an optical radar or pattern light.
  • With the three-dimensional image measurement unit, whether a change in the positions of the driver's eye corners and inner eye corners is due to parallel movement of the head (movement in the vertical or horizontal direction) or due to rotational movement (movement in the yaw or pitch direction) can be accurately detected.
  • the measurement process of the pupil movement and the head movement of the driver 3 is not limited to the above-described method, and various known methods can be adopted.
  • For example, feature points of facial organs may be detected for each frame of the image, the face orientation may be obtained from the positions of those feature points, and the head movement may be measured from the change (movement amount) in face orientation between frames.
  • The adding unit 12c performs a process of giving, to the data related to the head movement and pupil movement of the driver 3 measured by the measurement unit 12b, identification information indicating the result of the determination by the state determination unit 12a.
  • For example, a label (first label) for identifying the determination result indicating that the predetermined state (suitable for calculation) is established is given. The adding unit 12c performs a process of storing, in the measurement data storage unit 13c, the data related to the head movement and pupil movement of the driver 3 and the identification information (first label) in association with each other, and then outputs a signal (calculation instruction signal) instructing the calculation of the vestibulo-ocular reflex movement to the calculation unit 12f.
  • The identification information in this case is information given so that the determination result by the state determination unit 12a, for example whether or not the state is the predetermined state suitable for calculating the vestibulo-ocular reflex movement, can be identified on the computer, and its form is not particularly limited.
  • the identification information may be information such as a weighting coefficient weighted according to the predetermined state.
  • According to such a configuration, the degree of the predetermined state can be finely identified by the weighting. Then, by calculating the vestibulo-ocular reflex movement in consideration of the weighting, not only a part of the data measured by the measurement unit 12b (only the data measured in the predetermined state) but all of the data can be used, making it possible to monitor the vestibulo-ocular reflex movement in a manner corresponding to the actual vehicle environment, for example as in the sketch below.
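  • A minimal Python sketch of a weighted calculation of this kind; the use of a weighted least-squares VOR gain (eye velocity roughly equal to minus the gain times head velocity) is an illustrative assumption, not the calculation defined in the disclosure:

        import numpy as np

        def weighted_vor_gain(head_vel: np.ndarray, eye_vel: np.ndarray,
                              weights: np.ndarray) -> float:
            """Weighted least-squares estimate of the VOR gain. 'weights' plays the role of
            the weighting coefficients stored as identification information, so samples
            measured far from the predetermined state contribute less."""
            w = np.asarray(weights, dtype=float)
            num = np.sum(w * head_vel * (-eye_vel))
            den = np.sum(w * head_vel * head_vel)
            return num / den if den > 0 else float("nan")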
  • The adding unit 12c also performs a process of giving, to the head movement and pupil movement data of the driver 3 measured by the measurement unit 12b, information indicating the determination result by the data determination unit 12e, for example, a label for identifying that determination result.
  • Specifically, the adding unit 12c performs a process of giving, to the head movement and pupil movement data determined by the data determination unit 12e to be suitable, a first label indicating that the data is suitable for calculating the vestibulo-ocular reflex movement.
  • The adding unit 12c performs a process of storing the data related to the head movement and pupil movement of the driver 3 and the first label in association with each other in the measurement data storage unit 13c, after which a signal (calculation instruction signal) instructing the calculation of the vestibulo-ocular reflex movement is output to the calculation unit 12f.
  • On the other hand, the adding unit 12c performs a process of giving, to the head movement and pupil movement data determined by the data determination unit 12e not to be suitable, a second label indicating that the data is unsuitable for calculating the vestibulo-ocular reflex movement.
  • The adding unit 12c then performs a process of storing the data related to the head movement and pupil movement of the driver 3 in the measurement data storage unit 13c in association with the second label.
  • The identification information in this case is given so that the determination result by the data determination unit 12e, for example whether or not the data is suitable for calculating the vestibulo-ocular reflex movement, can be identified on the computer, and its form is not particularly limited.
  • The identification information may also be information that weights the degree to which the data is suitable for calculating the vestibulo-ocular reflex movement. According to such a configuration, the suitability of the data can be finely identified by the weighting. Then, by calculating the vestibulo-ocular reflex movement in consideration of the weighting, not only a part of the data measured by the measurement unit 12b (only the data determined to be suitable) but all of the data can be used, making it possible to monitor the vestibulo-ocular reflex movement in a manner corresponding to the actual vehicle environment.
  • When the intervention control unit 12d acquires the second determination signal from the state determination unit 12a, it performs intervention control for bringing the state of the driver 3 or the vehicle 2 into the predetermined state suitable for calculating the vestibulo-ocular reflex movement of the driver 3, and outputs an intervention execution signal indicating that the intervention control has been executed to the measurement unit 12b.
  • (1) The intervention control unit 12d performs intervention control to output, to the navigation device 50, a route change signal instructing it to change the route so that the vehicle travels on a straight road.
  • While traveling on a straight road, the driver 3 usually faces forward, so the line of sight tends to settle in one direction. Therefore, with this intervention control, the data processing device 10 can measure head movement and pupil movement in a state in which eye movements other than the vestibulo-ocular reflex movement hardly occur.
  • the route change signal may be, for example, a control signal for changing the current location of the vehicle 2 and a planned travel location ahead of a predetermined distance (for example, 500 m) from the current location to a route on a straight road.
  • Alternatively, the route change signal may be, for example, a control signal for changing the route to a road in which the angle formed between the straight line from the current location of the vehicle 2 to a first planned travel point a predetermined distance ahead (for example, 500 m) and the straight line from the first planned travel point to a second planned travel point a further predetermined distance ahead (for example, 500 m) is smaller than a predetermined angle.
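  • For illustration only, this straightness criterion might be checked as in the following Python sketch; the function name, the coordinate representation, and the 10-degree threshold are assumptions, not values taken from the disclosure:

        import math

        def is_nearly_straight(p0, p1, p2, max_angle_deg: float = 10.0) -> bool:
            """Checks whether the angle between the segment current location -> first planned
            point (p0 -> p1) and the segment first -> second planned point (p1 -> p2) is below
            a threshold, i.e. whether the planned route ahead can be treated as a straight road."""
            v1 = (p1[0] - p0[0], p1[1] - p0[1])
            v2 = (p2[0] - p1[0], p2[1] - p1[1])
            dot = v1[0] * v2[0] + v1[1] * v2[1]
            n1 = math.hypot(*v1)
            n2 = math.hypot(*v2)
            if n1 == 0 or n2 == 0:
                return False
            angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))
            return angle < max_angle_deg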
  • the navigation device 50 acquires the route change signal from the data processing device 10, it determines whether or not the traveling road of the vehicle 2 is a straight road that satisfies the condition specified by the route change signal.
  • When the set route of the navigation device 50 does not satisfy the condition specified by the route change signal, the navigation device 50, for example, performs a route search at each intersection and repeats the process of changing to a route that satisfies the condition specified by the route change signal.
  • With this intervention control, the state of the vehicle 2 can be brought into a state suitable for calculating the vestibulo-ocular reflex movement.
  • (2) The intervention control unit 12d may also perform control to output, to the navigation device 50, a notification signal prompting the driver 3 to concentrate the gaze on a predetermined location.
  • When the navigation device 50 receives the notification signal from the data processing device 10, it executes control to, for example, output a sound such as "Please look closely at the front" or display the information on the screen.
  • the intervention control makes it possible to guide the eye of the driver 3 to a state suitable for calculating the vestibulo-oculomotor reflex movement, that is, a state in which the front is watched.
  • Confirmation of the surrounding safety means, for example, that no preceding or following vehicle is detected within 100 m ahead of or behind the vehicle 2 and that no other obstacle is detected within 5 m around the vehicle.
  • the method of confirming the surrounding safety is not limited to this.
  • (3) The intervention control unit 12d may also perform control to output, to the navigation device 50, a notification signal prompting the driver 3 to move the head.
  • When the navigation device 50 receives the notification signal from the data processing device 10, it executes control to, for example, output a sound such as "Please move your head up and down or left and right while looking forward" or display the message on the screen. With this intervention control, the head of the driver 3 can be guided to a state suitable for calculating the vestibulo-ocular reflex movement, that is, a state in which the head is vibrated.
  • (4) The intervention control unit 12d may also perform control to output, to the vibration device 70, a control signal for vibrating the seat of the driver 3 at a predetermined period in the up-down, left-right, front-rear, or pitch direction.
  • the vibration device 70 executes, for example, control for vibrating the driver's 3 seat up and down, left and right, front and rear, or in the pitch direction at a predetermined vibration cycle.
  • With this intervention control, the head of the driver 3 can be brought into a state suitable for calculating the vestibulo-ocular reflex movement, that is, a state in which the head is vibrated.
  • (5) The intervention control unit 12d may also perform control to output, to the suspension control unit 61, a control signal for controlling the vehicle 2 so that it vibrates more easily.
  • When the suspension control unit 61 receives the control signal from the data processing device 10, it executes control to, for example, adjust the damping force of the active suspension mechanism for a certain period so that the vehicle body swings easily in the vertical or lateral direction.
  • With this intervention control, the head of the driver 3 can be brought into a state suitable for calculating the vestibulo-ocular reflex movement, that is, a state in which the head is vibrated.
  • The intervention control unit 12d may be configured to perform any one of the above intervention controls (1) to (5), or may be configured to perform two or more of the intervention controls (1) to (5) in appropriate combination depending on the state of the vehicle 2 or the driver 3.
  • When the data determination unit 12e acquires the determination instruction signal from the measurement unit 12b, it determines whether the data measured by the measurement unit 12b after the intervention control by the intervention control unit 12d is data suitable for calculating the vestibulo-ocular reflex movement.
  • When the data determination unit 12e determines that the data is suitable for calculating the vestibulo-ocular reflex movement, it outputs, to the adding unit 12c, a signal (first assignment instruction signal) instructing it to give the first label.
  • When the data determination unit 12e determines that the data is not suitable for calculating the vestibulo-ocular reflex movement, it outputs, to the intervention control unit 12d, a signal (intervention instruction signal) instructing it to perform the intervention control, and outputs, to the adding unit 12c, a signal (second assignment instruction signal) instructing it to give the second label.
  • For example, the data determination unit 12e determines whether pupil movement and head movement data could be acquired by the measurement unit 12b; if the pupil movement and head movement data cannot be acquired in the first place, the vestibulo-ocular reflex movement cannot be calculated.
  • For example, the similarity between the face region extracted from the image and the face template image, or between the eye region extracted from the image and the eye template image, is determined. If each similarity is lower than a predetermined threshold, it may be determined that the position of the head (that is, the eye corners and inner eye corners) or the position of the pupil has not been properly acquired from the image, in other words, that data suitable for calculating the vestibulo-ocular reflex movement has not been acquired.
  • The data determination unit 12e may also determine whether the pupil movement data contains many noise components, that is, eye movements other than the vestibulo-ocular reflex movement such as saccade movement. For example, when the movement amount of the pupil is larger than the movement amount of the head, when the pupil moves or rotates following the movement or rotation direction of the head, or when an eye movement quantity such as the rotation speed or rotation angle of the eyeball is larger than a predetermined threshold, it may be determined that the pupil movement data contains many noise components such as saccade movement and is not suitable for calculating the vestibulo-ocular reflex movement.
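  • A minimal Python sketch of this noise check; the threshold values, the frame rate, and the use of summed motion magnitudes are assumptions for illustration, not values specified in the disclosure:

        import numpy as np

        def pupil_data_is_noisy(pupil_motion: np.ndarray,
                                head_motion: np.ndarray,
                                max_eye_speed_deg_s: float = 300.0,
                                frame_rate: float = 30.0) -> bool:
            """Treats the data as saccade-dominated when the pupil moves more than the head,
            when the pupil follows the head direction (the reflex would move it the opposite
            way), or when the eyeball rotation speed exceeds a threshold."""
            if np.abs(pupil_motion).sum() > np.abs(head_motion).sum():
                return True
            if np.sum(pupil_motion * head_motion) > 0:      # same direction on average
                return True
            eye_speed = np.abs(pupil_motion) * frame_rate   # [deg/frame] -> [deg/s]
            return bool(np.max(eye_speed) > max_eye_speed_deg_s)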
  • Also, when the face moves greatly, the driver 3 is not concentrating on looking in a certain direction. Therefore, when a head movement quantity such as the rotation speed or rotation angle of the face exceeds a predetermined threshold, it may be determined that the head movement data contains many noise components and is not head movement data suitable for calculating the vestibulo-ocular reflex movement.
  • The data determination unit 12e may also acquire vehicle speed data at the time when the measurement unit 12b measures the pupil movement and head movement, and determine whether the data measured by the measurement unit 12b is data suitable for calculating the vestibulo-ocular reflex movement by determining whether the vehicle speed is lower than a predetermined speed or higher than a predetermined speed.
  • For example, when the vehicle speed is high, the driver 3 tends to concentrate the gaze on a narrow range ahead, whereas when the vehicle speed is low, the driver 3 tends to voluntarily look over a wide range in order to ensure the surrounding safety. Therefore, when the vehicle speed data is smaller than a predetermined speed, the data determination unit 12e may determine that the data is not suitable for calculating the vestibulo-ocular reflex movement, and when the vehicle speed data is larger than the predetermined speed, it may determine that the data is suitable for calculating the vestibulo-ocular reflex movement.
  • Instead of making a binary determination with a predetermined speed as a threshold, the data determination unit 12e may weight the eye movement and head movement data according to the vehicle speed. For example, a weighting coefficient of 0.2 for a vehicle speed of 0 to 20 km/h, 0.5 for 20 to 40 km/h, 0.8 for 40 to 60 km/h, and 1.0 for higher speeds may be stored in association with the eye movement and head movement data, and the vestibulo-ocular reflex movement may be calculated in consideration of the weighting coefficients, as in the sketch below.
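  • A minimal Python sketch of this speed-dependent weighting; assigning the coefficient 1.0 to speeds of 60 km/h and above is an assumption, since the text only pairs it with the highest bracket:

        def speed_weight(speed_kmh: float) -> float:
            """Weighting coefficient per the example above: 0.2 for 0-20 km/h,
            0.5 for 20-40 km/h, 0.8 for 40-60 km/h, and 1.0 for higher speeds (assumed)."""
            if speed_kmh < 20:
                return 0.2
            if speed_kmh < 40:
                return 0.5
            if speed_kmh < 60:
                return 0.8
            return 1.0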
  • The data determination unit 12e may also acquire steering data at the time when the measurement unit 12b measures the pupil movement and head movement, and determine whether the data measured by the measurement unit 12b is data suitable for calculating the vestibulo-ocular reflex movement by determining whether the steering data indicates a steering angle larger than a predetermined steering angle.
  • For example, the data determination unit 12e may determine that the data is not suitable for calculating the vestibulo-ocular reflex movement when the steering data indicates a steering angle larger than the predetermined steering angle.
  • The data determination unit 12e may also acquire the position data or traveling road data of the vehicle 2 at the time when the measurement unit 12b measures the pupil movement and head movement, and determine whether the data measured by the measurement unit 12b is data suitable for calculating the vestibulo-ocular reflex movement by determining whether the vehicle 2 is traveling on a straight road. For example, when the vehicle 2 is not traveling on a straight road, the data determination unit 12e may determine that the data is not suitable for calculating the vestibulo-ocular reflex movement.
  • The data determination unit 12e may also acquire the surroundings monitoring data acquired by the vehicle exterior sensor 31 at the time when the measurement unit 12b measures the pupil movement and head movement, and determine whether the data measured by the measurement unit 12b is data suitable for calculating the vestibulo-ocular reflex movement by determining whether an obstacle, a preceding vehicle, or the like exists around the vehicle 2.
  • When a preceding vehicle or an obstacle that moves relative to the vehicle 2 is present, the driver 3 tends to track it with the eyes, that is, the eyes move actively. A state in which the eyes move actively is not a state suitable for calculating the vestibulo-ocular reflex movement. Therefore, the data determination unit 12e may determine that the data is not suitable for calculating the vestibulo-ocular reflex movement when a preceding vehicle or an obstacle moving relative to the vehicle 2 is detected.
  • The data determination unit 12e may acquire the direction of the line of sight of the driver 3 while the measurement unit 12b is measuring pupil movement and head movement, and may determine, based on the direction of the line of sight of the driver 3, whether the data measured by the measurement unit 12b is suitable for calculating the vestibulo-ocular reflex movement.
  • A known gaze detection method may be employed to detect the direction of the line of sight of the driver 3 from an image capturing the face of the driver 3.
  • When the driver 3 is looking at a distant point ahead, such as toward the horizon, there is a high possibility that the driver 3 is gazing intently at the area in front. Therefore, when the direction of the line of sight is within a predetermined angle of the vehicle front (reference direction), for example within ±5 degrees vertically and ±5 degrees horizontally, it may be determined that the driver 3 is gazing intently at the area in front.
  • When the driver 3 is looking at an operation unit or display unit in the vehicle, such as the navigation device 50, there is a high possibility that the driver 3 is gazing at a narrow range. Therefore, for example, when the direction of the line of sight of the driver 3 is toward the installation direction of the navigation device 50 or the like, it may be determined that the data is suitable for calculating the vestibulo-ocular reflex movement. However, from the viewpoint of safety, calculating the vestibulo-ocular reflex movement from data measured while the driver is gazing at equipment inside the vehicle is preferably applied to an autonomous vehicle having an automated driving level of SAE Level 3 or higher.
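  • A minimal sketch of the gaze-direction test described above is shown below; the ±5 degree tolerance is the example value from the text, while the function name, the angle convention (yaw and pitch measured in degrees from the vehicle-front reference direction), and the assumed direction of the navigation device are illustrative assumptions.

      def gaze_state(yaw_deg: float, pitch_deg: float,
                     navi_yaw_deg: float = -30.0, navi_pitch_deg: float = -15.0,
                     tol_deg: float = 5.0) -> str:
          """Classify the driver's gaze direction relative to the vehicle front.

          Returns "front" when the gaze is within +/-5 degrees of straight ahead,
          "in_vehicle_display" when it points toward the assumed direction of the
          navigation device, and "other" otherwise.
          """
          if abs(yaw_deg) <= tol_deg and abs(pitch_deg) <= tol_deg:
              return "front"
          if (abs(yaw_deg - navi_yaw_deg) <= tol_deg
                  and abs(pitch_deg - navi_pitch_deg) <= tol_deg):
              return "in_vehicle_display"
          return "other"

      print(gaze_state(1.5, -2.0))     # "front": concentrating on the road ahead
      print(gaze_state(-29.0, -14.0))  # "in_vehicle_display": looking toward the display
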
  • The data determination unit 12e may determine whether the data is suitable for calculating the vestibulo-ocular reflex movement based on the amount of head movement (translation or rotation) measured by the measurement unit 12b.
  • The vestibulo-ocular reflex is an eye movement that does not occur unless the head of the driver 3 moves. Therefore, the data determination unit 12e may determine that the data is not suitable for calculating the vestibulo-ocular reflex movement when the amount of head movement (translation or rotation) measured by the measurement unit 12b is smaller than a predetermined amount.
  • The data determination unit 12e may acquire acceleration data of the vehicle 2 while the measurement unit 12b is measuring pupil movement and head movement, and may determine, based on the acceleration data of the vehicle 2, whether the data measured by the measurement unit 12b is suitable for calculating the vestibulo-ocular reflex movement.
  • When an acceleration above a certain level occurs in the vertical, lateral, or longitudinal direction of the vehicle 2, the head of the driver 3 is easily moved in the vertical, lateral, or pitch direction.
  • Therefore, the data determination unit 12e may determine that the data is suitable for calculating the vestibulo-ocular reflex movement when the acceleration data of the vehicle 2 is larger than a threshold at which head movement of the driver 3 is likely to occur. In addition, the data determination unit 12e may determine that the data is suitable for calculating the vestibulo-ocular reflex movement when the vibration of the vehicle 2 obtained from its acceleration data and the head movement are in a certain relationship, for example when they vibrate in the same direction at the same frequency.
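  • One conceivable way to check the "same frequency, same direction" relationship mentioned above is to compare the dominant frequencies of the vehicle's vertical acceleration and the driver's head movement over a short window; the following numpy sketch is an assumption about how such a check could be implemented and is not the method prescribed by the embodiment.

      import numpy as np

      def dominant_frequency(signal: np.ndarray, fs: float) -> float:
          """Return the frequency [Hz] of the largest non-DC spectral peak."""
          spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
          freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
          return float(freqs[np.argmax(spectrum[1:]) + 1])  # skip the DC bin

      def vibration_matches_head(accel_z: np.ndarray, head_pitch_vel: np.ndarray,
                                 fs: float, tol_hz: float = 0.3) -> bool:
          """True if vehicle vertical vibration and head movement share a dominant frequency."""
          return abs(dominant_frequency(accel_z, fs)
                     - dominant_frequency(head_pitch_vel, fs)) <= tol_hz

      # Example: 3 seconds at 60 Hz with a shared ~2 Hz component.
      fs = 60.0
      t = np.arange(0.0, 3.0, 1.0 / fs)
      accel_z = 0.8 * np.sin(2 * np.pi * 2.0 * t) + 0.05 * np.random.randn(t.size)
      head = 0.3 * np.sin(2 * np.pi * 2.0 * t + 0.4) + 0.02 * np.random.randn(t.size)
      print(vibration_matches_head(accel_z, head, fs))  # expected: True
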
  • As the acceleration data of the vehicle 2, data from the acceleration sensor 32 mounted on the vehicle 2 may be used, or acceleration data derived from the speed of the vehicle 2 obtained from the time-series change in the distance to an object recognized by the vehicle exterior sensor 31 may be used.
  • The data determination unit 12e may be configured to perform any one of the determinations (1) to (9) described above, or to perform two or more of the determinations (1) to (9) in a suitable combination depending on the state of the vehicle 2 or the driver 3.
  • When the calculation unit 12f acquires the calculation instruction signal from the assigning unit 12c, it reads out from the measurement data storage unit 13c the pupil movement and head movement data of the driver 3 to which the first label has been assigned, and calculates the vestibulo-ocular reflex movement using these data. After performing the calculation, the calculation unit 12f outputs the calculated data (parameters) related to the vestibulo-ocular reflex movement to the drowsiness determination unit 12g.
  • The parameters relating to the vestibulo-ocular reflex movement calculated by the calculation unit 12f include, for example, at least one of the VOR gain, the residual standard deviation, and the phase difference.
  • In principle, the VOR gain represents the degree of response of the pupil movement (rotational angular velocity) to the head movement (rotational angular velocity), and can be expressed as pupil movement (rotational angular velocity) / head movement (rotational angular velocity).
  • The VOR gain can be expressed as the coefficient G of a regression model in which the objective variable is the eyeball rotation angular velocity e(t) and the explanatory variables are the ideal eyeball angular velocity h(t) and a constant term dc, that is, e(t) = G · h(t − τ) + dc + ε(t), where ε(t) is the residual of the regression model and τ is the delay time of the eye movement with respect to the ideal eye movement.
  • The eyeball rotation angular velocity e(t) can be obtained by calculating an eyeball movement angle from the pupil movement data measured by the measurement unit 12b and differentiating it with respect to time. The ideal eyeball angular velocity h(t) can be obtained by calculating a head movement angle from the head movement data measured by the measurement unit 12b and differentiating it with respect to time.
  • the VOR gain may be calculated in at least one direction of the driver 3 in the front-rear direction, up-down direction, left-right direction, yaw direction, and pitch direction.
  • The residual standard deviation (SDres) is the standard deviation of the residual ε(t) of the regression model, and indicates how far the measured eye movement deviates from the eye movement predicted by the model.
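  • To make the regression model above concrete, the sketch below estimates G, dc, and the delay τ by ordinary least squares combined with a grid search over candidate delays, and also returns the residual standard deviation SDres. It assumes uniformly sampled angular-velocity series in which h(t) has already been converted to the ideal eyeball angular velocity (typically the sign-inverted head angular velocity); the delay-search approach and the sampling assumptions are illustrative, not the formulation disclosed in the patent drawings.

      import numpy as np

      def fit_vor(eye_vel: np.ndarray, ideal_eye_vel: np.ndarray, fs: float,
                  max_delay_s: float = 0.2):
          """Fit e(t) = G * h(t - tau) + dc + eps(t) by least squares.

          eye_vel       : eyeball rotation angular velocity e(t)
          ideal_eye_vel : ideal eyeball angular velocity h(t)
          fs            : sampling frequency [Hz]
          Returns (G, dc, tau, SDres), where SDres is the standard deviation
          of the regression residual eps(t).
          """
          best = None
          for shift in range(int(max_delay_s * fs) + 1):
              n = len(ideal_eye_vel) - shift
              h = ideal_eye_vel[:n]                  # h(t - tau)
              e = eye_vel[shift:]                    # e(t)
              X = np.column_stack([h, np.ones(n)])   # explanatory: h(t - tau) and dc
              coef, *_ = np.linalg.lstsq(X, e, rcond=None)
              resid = e - X @ coef
              sdres = float(np.std(resid, ddof=2))   # two fitted parameters
              if best is None or sdres < best[3]:
                  best = (float(coef[0]), float(coef[1]), shift / fs, sdres)
          return best

      # Example with synthetic data: true gain 0.9, delay ~50 ms, dc 0.1.
      fs = 60.0
      t = np.arange(0.0, 30.0, 1.0 / fs)
      h = np.sin(2 * np.pi * 1.5 * t)
      e = 0.9 * np.roll(h, int(0.05 * fs)) + 0.1 + 0.05 * np.random.randn(t.size)
      print(fit_vor(e, h, fs))  # roughly (0.9, 0.1, 0.05, 0.05)
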
  • The VOR gain and the residual standard deviation may be calculated for each segment, where one segment consists of data of a first time length (for example, several tens of seconds) so that sufficient estimation accuracy is obtained, successive segments overlap by a second time length shorter than the first, and the value for each segment is calculated every third time length shorter than the second. In general, when the driver 3 becomes drowsy, the VOR gain tends to decrease and the residual standard deviation tends to increase. Therefore, in order to accurately detect a sign of drowsiness, a rate of change, such as the rate of decrease of the VOR gain or the rate of increase of the residual standard deviation, may also be calculated.
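  • The segment-wise computation and the change-rate idea can be sketched as follows, reusing the fit_vor helper from the previous sketch; the 30 s segment length, the 5 s step, and the choice of the first segment as the reference for the change rate are illustrative assumptions.

      import numpy as np

      def segmented_vor(eye_vel, ideal_eye_vel, fs, segment_s=30.0, step_s=5.0):
          """Compute (VOR gain, SDres) for each overlapping segment.

          segment_s : segment length (the "first time", e.g. several tens of seconds)
          step_s    : interval between segment start times; successive segments
                      overlap by segment_s - step_s.
          """
          seg, step = int(segment_s * fs), int(step_s * fs)
          results = []
          for start in range(0, len(eye_vel) - seg + 1, step):
              g, dc, tau, sd = fit_vor(eye_vel[start:start + seg],
                                       ideal_eye_vel[start:start + seg], fs)
              results.append((g, sd))
          return results

      def change_rates(results):
          """Rate of change of VOR gain and SDres relative to the first segment."""
          g0, sd0 = results[0]
          gains = np.array([g for g, _ in results])
          sds = np.array([sd for _, sd in results])
          return (gains - g0) / g0, (sds - sd0) / sd0
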
  • The drowsiness determination unit 12g determines the drowsiness of the driver 3 using the acquired calculation data related to the vestibulo-ocular reflex movement. For example, the drowsiness level of the driver 3 may be determined by comparing at least one of the VOR gain, the residual standard deviation, and the phase difference with a predetermined threshold value, and other parameters may also be taken into account. After determining the drowsiness of the driver 3, the drowsiness determination unit 12g outputs the determination result (for example, the drowsiness level) to the awakening control unit 12h.
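  • The threshold comparison described above could, for instance, look like the following sketch; the use of change rates rather than absolute values, the specific threshold numbers, and the three discrete levels are assumptions made for illustration and are not values disclosed in the embodiment.

      def drowsiness_level(gain_change: float, sdres_change: float,
                           gain_drop_thresh: float = -0.2,
                           sdres_rise_thresh: float = 0.3) -> int:
          """Map VOR-based change rates to a coarse drowsiness level (0-2).

          gain_change  : relative change of the VOR gain (negative = decreasing)
          sdres_change : relative change of the residual standard deviation
          """
          signs = 0
          if gain_change <= gain_drop_thresh:
              signs += 1   # the VOR gain has dropped noticeably
          if sdres_change >= sdres_rise_thresh:
              signs += 1   # the residual scatter has grown noticeably
          return signs     # 0: alert, 1: possible sign of drowsiness, 2: strong sign

      print(drowsiness_level(-0.25, 0.35))  # 2
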
  • The awakening control unit 12h performs a process of outputting a control signal for awakening the driver 3 to the awakening device 80, based on the drowsiness level acquired from the drowsiness determination unit 12g.
  • When the awakening device 80 is configured as an alarm device that warns the driver 3 with sound or light, for example, the awakening control unit 12h outputs to the alarm device a control signal for operating the alarm device for a predetermined period.
  • When the awakening device 80 is configured as an air conditioner that blows cold air, warm air, or a gas containing an aroma component or odor component toward the driver 3, the awakening control unit 12h outputs to the air conditioner a control signal for operating the air conditioner for a predetermined period.
  • When the awakening device 80 is configured as a vibration device that vibrates the steering wheel, the seat belt, or the like, the awakening control unit 12h outputs to the vibration device a control signal for operating the vibration device for a predetermined period.
  • The awakening control unit 12h may also perform a process of outputting a control signal for awakening the driver 3 to the navigation device 50 or the vibration device 70. Examples of such control signals include a control signal that causes the navigation device 50 to output a warning sound or a warning display for awakening the driver 3, and a control signal that causes the vibration device 70 to vibrate the seat for a predetermined period.
  • FIG. 3 is a flowchart showing processing operations performed by the control unit 12 in the data processing apparatus 10 according to the embodiment. This processing operation is executed after the start switch 40 of the vehicle 2 is turned on, for example.
  • In step S1, the control unit 12 performs processing for starting the camera 20. The camera 20 captures a predetermined number of frames per second; the control unit 12 takes in these captured images in time series and executes this processing for every frame, or for frames at a predetermined interval.
  • In step S2, the data acquisition unit 11 performs a process of acquiring data from the camera 20, the on-board sensor 30, or the navigation device 50. For example, surroundings-monitoring data and the like may be acquired from the on-board sensor 30, and road data including the shape of the traveling road (straight line, curve, etc.) may be acquired from the navigation device 50.
  • In step S3, the state determination unit 12a uses the data acquired in step S2 to determine whether the state of the driver 3 or the vehicle 2 is in a predetermined state suitable for calculating the vestibulo-ocular reflex movement (VOR). The predetermined state is as described above. If it is determined that the state is not the predetermined state, the control unit 12 executes step S4; if it is determined that the state is the predetermined state, the control unit 12 executes step S9.
  • In step S4, the intervention control unit 12d performs intervention control to bring the driver 3 or the vehicle 2 into the predetermined state, and the control unit 12 then executes step S5.
  • In step S4, the intervention control described in any one of (1) to (5) above may be performed, or two or more of the intervention controls (1) to (5) may be performed in a suitable combination depending on the state of the vehicle 2 or the driver 3.
  • In step S5, the measurement unit 12b performs a process of measuring the pupil movement of the driver 3, and in the next step S6, the measurement unit 12b performs a process of measuring the head movement of the driver 3; the control unit 12 then executes step S7. Note that the order of steps S5 and S6 may be reversed.
  • In step S7, the data determination unit 12e determines whether the pupil movement and head movement data of the driver 3 measured in steps S5 and S6 can be applied to the calculation of the vestibulo-ocular reflex movement.
  • In step S7, any one of the determinations (1) to (9) described above may be performed, or two or more of the determinations (1) to (9) may be performed in a suitable combination depending on the state of the vehicle 2 or the driver 3.
  • If the data determination unit 12e determines in step S7 that the data cannot be applied to the calculation of the vestibulo-ocular reflex movement, the control unit 12 executes step S8.
  • In step S8, the assigning unit 12c performs a process of assigning a second label indicating that the data measured in steps S5 and S6 is not suitable for calculating the vestibulo-ocular reflex movement (unsuitable for calculation), after which the control unit 12 returns to step S4 and repeats the processing.
  • If the data determination unit 12e determines in step S7 that the data can be applied to the calculation, the control unit 12 executes step S11, in which the assigning unit 12c performs a process of assigning a first label indicating that the data measured in steps S5 and S6 is suitable for calculating the vestibulo-ocular reflex movement (suitable for calculation); the control unit 12 then executes step S12.
  • In step S9, the measurement unit 12b performs a process of measuring the pupil movement of the driver 3, and the control unit 12 then executes step S10. The pupil movement measurement process is the same as that in step S5.
  • In step S10, the measurement unit 12b performs a process of measuring the head movement of the driver 3, and the control unit 12 then executes step S11. The head movement measurement process is the same as that in step S6. Note that the order of steps S9 and S10 may be reversed.
  • In step S11, the assigning unit 12c performs a process of assigning a first label indicating that the data measured in steps S9 and S10 is suitable for calculating the vestibulo-ocular reflex movement (suitable for calculation), and the control unit 12 then executes step S12.
  • In step S12, the calculation unit 12f calculates the vestibulo-ocular reflex movement using the data to which the first label has been assigned, and the control unit 12 then executes step S13. The calculated data of the vestibulo-ocular reflex movement includes at least one of the VOR gain, the residual standard deviation, and the phase difference.
  • In step S13, the drowsiness determination unit 12g calculates the drowsiness level of the driver 3, and in the next step S14, the drowsiness determination unit 12g determines whether the drowsiness level is smaller than a predetermined threshold. If the drowsiness determination unit 12g determines in step S14 that the drowsiness level is equal to or greater than the predetermined threshold, the control unit 12 executes step S15; if the drowsiness level is smaller than the predetermined threshold, the control unit 12 executes step S16. In step S15, the awakening control unit 12h performs a process of outputting a predetermined control signal for awakening the driver 3 to the navigation device 50 or the vibration device 70, and the control unit 12 then executes step S16.
  • In step S16, it is determined whether the start switch 40 has been turned off. If it is determined that the start switch 40 has not been turned off, the control unit 12 returns to step S2. On the other hand, if it is determined in step S16 that the start switch 40 has been turned off, the control unit 12 executes step S17, in which it stops driving the camera 20 and ends the processing.
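  • Put together, the flow of FIG. 3 can be summarized in the following control-loop sketch; every callable in the io dictionary (camera, sensors, measurement, calculation, awakening device) is a hypothetical stand-in for the corresponding unit of the control unit 12, and the branch back to step S4 after a second label is simplified here to restarting from step S2.

      import itertools

      def monitoring_loop(io):
          """Simplified rendering of the processing flow of FIG. 3 (steps S1-S17)."""
          io["start_camera"]()                                      # S1
          while not io["switch_off"]():                             # S16
              data = io["acquire"]()                                # S2
              if io["state_ok"](data):                              # S3: predetermined state?
                  eye, head = io["measure"](data)                   # S9, S10
                  label = "first"                                   # S11
              else:
                  io["intervene"]()                                 # S4: intervention control
                  eye, head = io["measure"](data)                   # S5, S6
                  label = "first" if io["data_ok"](eye, head) else "second"  # S7 -> S11 / S8
              if label != "first":
                  continue                                          # S8: measure again
              gain, sdres, phase = io["calc_vor"](eye, head)        # S12
              level = io["drowsiness"](gain, sdres, phase)          # S13
              if level >= io["threshold"]:                          # S14
                  io["wake"]()                                      # S15: navigation device 50 / vibration device 70
          io["stop_camera"]()                                       # S17

      # Tiny usage example with fixed stub behaviour and two loop iterations.
      ticks = itertools.count()
      io = {
          "start_camera": lambda: print("camera on"),
          "stop_camera": lambda: print("camera off"),
          "switch_off": lambda: next(ticks) >= 2,
          "acquire": lambda: {},
          "state_ok": lambda data: True,
          "measure": lambda data: ([0.1, 0.2], [-0.1, -0.2]),
          "data_ok": lambda eye, head: True,
          "intervene": lambda: None,
          "calc_vor": lambda eye, head: (0.9, 0.05, 4.0),
          "drowsiness": lambda g, s, p: 0,
          "threshold": 1,
          "wake": lambda: print("wake the driver"),
      }
      monitoring_loop(io)
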
  • According to the data processing device 10, based on the identification information (the first label and the second label) assigned by the assigning unit 12c, it is possible to appropriately identify whether the data measured by the measurement unit 12b was measured in the predetermined state suitable for calculating the vestibulo-ocular reflex movement. Therefore, by using the data with the first label, measured in the predetermined state, the monitoring accuracy of the vestibulo-ocular reflex movement can be improved even in an actual vehicle environment.
  • In addition, the intervention control by the intervention control unit 12d can prompt the driver 3 or the vehicle 2 into a state suitable for calculating the vestibulo-ocular reflex movement of the driver 3, in other words, a state in which the signal-to-noise ratio for the calculation can be increased.
  • The data determined by the data determination unit 12e to be suitable is given the first label by the assigning unit 12c, and the data determined not to be suitable is given the second label.
  • Since the vestibulo-ocular reflex movement is calculated using the data to which the first label has been assigned by the assigning unit 12c, the monitoring accuracy of the vestibulo-ocular reflex movement after the intervention control is performed can be further improved.
  • Further, according to the data processing device 10, when it is determined that the predetermined state is present, in other words, in a state where the S/N ratio is assumed to be high, the first label is assigned to the pupil movement and head movement data of the driver 3 measured by the measurement unit 12b and the vestibulo-ocular reflex movement can be calculated without performing intervention control by the intervention control unit 12d, so the burden on the driver 3 caused by the intervention control can be reduced.
  • Since the drowsiness determination unit 12g is provided, the drowsiness level of the driver 3 in the actual vehicle environment can be accurately determined, and since the awakening control unit 12h is provided, control for appropriately awakening the driver 3 in accordance with the drowsiness level can be performed.
  • According to the monitoring system 1 provided with the data processing device 10 and the camera 20, a monitoring system that can easily be introduced into an actual vehicle environment can be provided. Moreover, a system that can awaken the driver 3 in accordance with the drowsiness level can be provided.
  • the control unit 12 of the data processing apparatus 10 does not have to include all the units illustrated in FIG. 2.
  • For example, the control unit 12 may be realized by a first configuration including at least the state determination unit 12a, the measurement unit 12b, and the assigning unit 12c; a second configuration further including the intervention control unit 12d in addition to the first configuration; a third configuration further including the data determination unit 12e in addition to the second configuration; a fourth configuration further including the calculation unit 12f in addition to any one of the first to third configurations; a fifth configuration further including the drowsiness determination unit 12g in addition to the fourth configuration; or a sixth configuration further including the awakening control unit 12h in addition to the fifth configuration.
  • In the processing operations performed by the control unit 12 described with reference to FIG. 3, the pupil movement measurement process of step S9 and the head movement measurement process of step S10 may be executed between step S2 and step S3, and the determination process of step S3 may be performed after the head movement measurement. If it is determined in step S3 that the predetermined state is present, the process may proceed to step S11, the first label indicating suitability for calculation may be assigned, and the processing from step S12 onward may be performed. In this case, the data determination process of step S7 may be executed instead of the state determination process of step S3.
  • In step S7, whether the pupil movement and head movement data of the driver 3 can be applied to the calculation of the vestibulo-ocular reflex movement is determined by a binary determination, but a weighting determination according to the degree of applicability may be performed instead. In that case, a label corresponding to the weight may be assigned in step S8, the process may then proceed to step S12, and the vestibulo-ocular reflex movement may be calculated in step S12 using the pupil movement and head movement data to which the weighted label has been assigned. In this way, the vestibulo-ocular reflex movement corresponding to the actual vehicle environment can be monitored using not just part of the data but all of it.
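  • If a weighted label is used instead of a binary one, the regression in the fit_vor sketch above can be replaced by a weighted least-squares fit in which each sample contributes according to its label weight (for example, the speed-dependent factors mentioned earlier); the following formulation is an illustrative assumption and omits the delay search for brevity.

      import numpy as np

      def fit_vor_weighted(eye_vel, ideal_eye_vel, weights):
          """Weighted least-squares estimate of e(t) = G * h(t) + dc + eps(t).

          weights : per-sample weighting factors (e.g. 0.2 ... 1.0 from the label);
                    low-weight samples influence the fit less.
          Returns (G, dc, weighted SDres).
          """
          e = np.asarray(eye_vel, dtype=float)
          h = np.asarray(ideal_eye_vel, dtype=float)
          w = np.asarray(weights, dtype=float)
          X = np.column_stack([h, np.ones(len(h))])
          sw = np.sqrt(w)
          coef, *_ = np.linalg.lstsq(X * sw[:, None], e * sw, rcond=None)
          resid = e - X @ coef
          sdres = float(np.sqrt(np.average(resid ** 2, weights=w)))
          return float(coef[0]), float(coef[1]), sdres

      # Example: samples measured at low speed (weight 0.2) count less than
      # samples measured at cruising speed (weight 1.0).
      h = np.sin(np.linspace(0.0, 2 * np.pi, 200))
      e = 0.85 * h + 0.05 * np.random.randn(200)
      w = np.r_[np.full(100, 0.2), np.full(100, 1.0)]
      print(fit_vor_weighted(e, h, w))  # gain close to 0.85
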
  • the monitoring system 1 and the data processing apparatus 10 are not limited to vehicle-mounted use.
  • For example, the monitoring system 1 and the data processing device 10 may be installed in a factory or an office, and can be widely applied to systems for monitoring the drowsiness of a person who operates equipment installed in the factory, a person who performs predetermined work at a desk, and so on. Things operated by a person in a factory include, for example, production apparatuses, and things operated by a person in an office include, for example, office equipment such as a personal computer.
  • A data processing device for performing data processing for monitoring a person, comprising: a state determination unit (12a) for determining whether the state of the person or an object operated by the person is in a predetermined state suitable for calculating the vestibulo-ocular reflex movement of the person; a measurement unit (12b) for measuring the pupil movement and head movement of the person; and an assigning unit (12c) for assigning identification information indicating the result of the determination by the state determination unit (12a) to the pupil movement and head movement data of the person measured by the measurement unit (12b).
  • A data processing method for monitoring a person, comprising: a measuring step (steps S5 and S6, or steps S9 and S10) of measuring the pupil movement and head movement of the person; a state determination step (step S3 or step S7) of determining whether the state of the person or an object operated by the person is in a predetermined state suitable for calculating the vestibulo-ocular reflex movement of the person; and an assigning step (step S8 or step S11) of assigning identification information indicating the result determined in the state determination step to the data indicating the pupil movement and head movement of the person measured in the measuring step.
  • A data processing program for causing at least one computer to execute data processing for monitoring a person, the program causing the at least one computer (12) to execute: a measuring step (steps S5 and S6, or steps S9 and S10) of measuring the pupil movement and head movement of the person; a state determination step (step S3 or step S7) of determining whether the state of the person or an object operated by the person is in a predetermined state suitable for calculating the vestibulo-ocular reflex movement of the person; and an assigning step (step S8 or step S11) of assigning identification information indicating the result determined in the state determination step to the data indicating the pupil movement and head movement of the person measured in the measuring step.
  • The present invention can be widely used in various industrial fields, for example in systems that use the vestibulo-ocular reflex movement of a person to monitor that person in a real environment, such as systems for monitoring the drowsiness of a vehicle driver, a person operating equipment installed in a factory, or a person working in an office.

Abstract

The purpose of the present invention is to provide a data processing device capable of improving the monitoring accuracy of vestibulo-ocular reflex movements by making it possible to identify data suitable for acquiring those movements. This data processing device performs data processing for monitoring a person and comprises: a state determination unit that determines whether the person or an object operated by the person is in a predetermined state suitable for calculating the person's vestibulo-ocular reflex movements; a measurement unit that measures the person's pupil movements and head movements; and a provision unit that provides, to the data relating to the person's pupil movements and head movements measured by the measurement unit, identification information indicating the determination result from the state determination unit.
PCT/JP2019/002466 2018-02-07 2019-01-25 Dispositif de traitement de données, système de surveillance, système de vigilance, procédé de traitement de données, programme de traitement de données et support de stockage WO2019155914A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018019761A JP6418342B1 (ja) 2018-02-07 2018-02-07 データ処理装置、モニタリングシステム、覚醒システム、データ処理方法、及びデータ処理プログラム
JP2018-019761 2018-02-07

Publications (1)

Publication Number Publication Date
WO2019155914A1 true WO2019155914A1 (fr) 2019-08-15

Family

ID=64098790

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/002466 WO2019155914A1 (fr) 2018-02-07 2019-01-25 Dispositif de traitement de données, système de surveillance, système de vigilance, procédé de traitement de données, programme de traitement de données et support de stockage

Country Status (2)

Country Link
JP (1) JP6418342B1 (fr)
WO (1) WO2019155914A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113271852A (zh) * 2019-01-21 2021-08-17 三菱电机株式会社 注意力判定装置、注意力判定系统、注意力判定方法和程序
JP7376996B2 (ja) 2019-03-18 2023-11-09 株式会社Subaru 車両の危険状況判別装置、車両の危険状況判別方法、及びプログラム
JP2021015496A (ja) * 2019-07-12 2021-02-12 株式会社東海理化電機製作所 判定装置およびプログラム

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008091759A1 (fr) * 2007-01-26 2008-07-31 University Of Florida Research Foundation, Inc. Appareil et procédé pour l'évaluation des réflexes oculo-vestibulaires
WO2010032424A1 (fr) * 2008-09-18 2010-03-25 学校法人中部大学 Détecteur de signaux de somnolence
JP2015095162A (ja) * 2013-11-13 2015-05-18 株式会社デンソー ドライバ監視装置


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HIRATA, YUTAKA: "Eye movements for understanding neural mechanisms of motor learning, controling robot, and monitoring car driver's physiological state", THE JAPANESE JOURNAL OF PSYCHONOMIC SCIENCE, vol. 33, no. 1, 2014, pages 81 - 85, ISSN: 0287-7651 *

Also Published As

Publication number Publication date
JP6418342B1 (ja) 2018-11-07
JP2019136166A (ja) 2019-08-22

Similar Documents

Publication Publication Date Title
JP7099037B2 (ja) データ処理装置、モニタリングシステム、覚醒システム、データ処理方法、及びデータ処理プログラム
CN107336710B (zh) 驾驶意识推定装置
CN109791739B (zh) 晕车估计装置、晕车防止装置和晕车估计方法
CN108621923B (zh) 车辆的显示系统及车辆的显示系统的控制方法
JP6950346B2 (ja) 運転者状態把握装置、運転者状態把握システム、及び運転者状態把握方法
JP6369487B2 (ja) 表示装置
US10647201B2 (en) Drive assist device and drive assist method
JP6350145B2 (ja) 顔向き検出装置及び車両用警告システム
JP6693489B2 (ja) 情報処理装置、運転者モニタリングシステム、情報処理方法、及び情報処理プログラム
JP2018180594A (ja) 走行支援装置
WO2019155914A1 (fr) Dispositif de traitement de données, système de surveillance, système de vigilance, procédé de traitement de données, programme de traitement de données et support de stockage
JP6683185B2 (ja) 情報処理装置、運転者モニタリングシステム、情報処理方法、及び情報処理プログラム
CN108621940A (zh) 车辆的显示系统及车辆的显示系统的控制方法
JP2008079737A (ja) 集中度評価装置及びこれを用いた車両用表示装置
JPH07117593A (ja) 車両用警報装置
WO2019155913A1 (fr) Dispositif de traitement de données, système de surveillance, système d'alerte, méthode de traitement de données, programme de traitement de données et support de stockage
JP7099036B2 (ja) データ処理装置、モニタリングシステム、覚醒システム、データ処理方法、及びデータ処理プログラム
WO2019176492A1 (fr) Système de calcul, dispositif de traitement d'informations, système d'aide à la conduite, procédé de calcul d'indice, programme informatique et support de stockage
JP2019064407A (ja) 運転支援装置及び運転支援方法
JP2018139070A (ja) 車両用表示制御装置
JP7331728B2 (ja) 運転者状態推定装置
JP7331729B2 (ja) 運転者状態推定装置
WO2020188629A1 (fr) Dispositif de détermination d'état de vigilance, système d'éveil, et procédé de détermination d'état de vigilance
JP7298351B2 (ja) 状態判定装置、車載機、運転評価システム、状態判定方法、及びプログラム
JP7298510B2 (ja) 状態推定装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19751753

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19751753

Country of ref document: EP

Kind code of ref document: A1