WO2010016244A1 - Driving attention amount determination apparatus, method, and program - Google Patents
Driving attention amount determination apparatus, method, and program Download PDF Info
- Publication number
- WO2010016244A1 (PCT/JP2009/003724)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- driver
- visual field
- peripheral
- attention amount
- stimulus
- Prior art date
Links
Images
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/18—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
- A61B5/377—Electroencephalography [EEG] using evoked responses
- A61B5/378—Visual stimuli
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/02—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
- B60K28/06—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
- B60K28/066—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver actuating a signalling device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/20—Workers
- A61B2503/22—Motor vehicles operators, e.g. drivers, pilots, captains
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/163—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
Definitions
- the present invention relates to a technique for determining a driver's condition using an electroencephalogram and providing safe driving support.
- Peripheral vision generally refers to the region extending roughly 130 degrees vertically and 180 degrees horizontally outside the range of about 20 degrees centered on the line of sight (central vision). In peripheral vision it is difficult to recognize the shape and color of an object in detail, but peripheral vision is known to react sensitively to moving objects and to objects that change over time, such as blinking lights. The driver needs to pay attention to the peripheral visual field region, and to the door mirrors and other objects within it, in preparation for a pedestrian jumping out or a motorcycle passing alongside. Therefore, when the driver's attention amount to the peripheral visual field is low, measures such as issuing a warning to the driver are required.
- As a method of determining the driver's attention state, there is a method of detecting the driver's line of sight and face movement with a camera directed at the driver and determining the driver's attention-distribution state. Patent Document 1 discloses a technique that determines the driver's attention-allocation state by comparing the optimal gaze position that the driver should be aware of, derived from the surroundings of the vehicle, with the gaze point detected from the driver's line of sight and face movement.
- Patent Document 2 discloses a technique that determines a driver's concentration level from the brake response time to sudden deceleration of a preceding vehicle, and decides whether or not to output an alarm to the driver.
- Event-related potential refers to a transient potential fluctuation in the brain that occurs temporally in relation to an external or internal event.
- A positive component appearing around 300 milliseconds after the occurrence of an external visual stimulus or the like is called the P300 component, and it reflects recognition of and attention to the stimulus.
- Non-Patent Document 1 discloses a study on measurement of driving attention using event-related potentials.
- In the reported experiment, the driver presses the brake pedal of the vehicle when the brake lamp of the vehicle ahead lights up. It has been reported that the amplitude of the P300 component of the event-related potential increases under the high-attention condition.
- Because the technique described in Patent Document 1 is based on the idea that attention is not directed where the line of sight is not directed, it cannot accurately determine the driver's attention to the peripheral visual field region. For example, in an actual driving situation, the driver detects the movement of parallel-running vehicles and pedestrians with peripheral vision while monitoring the vehicle ahead with central vision, and directs the line of sight according to the situation in front and to the sides. Therefore, it is difficult for the conventional technology to handle the case where the line of sight is directed forward while attention is paid to the peripheral visual field region.
- Non-Patent Document 1 likewise uses the event-related potential (ERP) elicited by lighting of the brake lamp of the vehicle ahead, as described above. The driving attention amount being measured is therefore limited to the driver's central visual field region, and the attention amount for the peripheral visual field region cannot be measured.
- The present invention has been made in view of the above problems. Its purpose is to determine the amount of attention to the driver's peripheral visual field region even when the driver does not direct his or her line of sight to a surrounding object, and to provide safe driving support according to the determination result.
- A driving attention amount determination apparatus according to the present invention includes: an electroencephalogram measurement unit that measures a driver's electroencephalogram signal; an attention amount determination unit that determines the driver's attention amount with respect to the peripheral visual field region from the electroencephalogram signal measured starting from the occurrence of a visual stimulus generated in the driver's peripheral visual field region; and an output unit that alerts the driver by outputting a signal based on the determination result.
- The attention amount determination unit may determine the attention amount according to the magnitude of the event-related potential amplitude of the electroencephalogram signal measured starting from the occurrence of the visual stimulus.
- The attention amount determination unit may determine that the attention amount is low when the amplitude of the P300 event-related potential, a positive component in the interval from 300 milliseconds to 600 milliseconds after the occurrence of the visual stimulus, is smaller than a predetermined threshold value.
- the output unit may output the signal to the driver.
- The attention amount determination unit may determine that the attention amount is large when the amplitude of the P300 event-related potential, a positive component in the interval from 300 milliseconds to 600 milliseconds after the occurrence of the visual stimulus, is larger than the predetermined threshold value; when the attention amount is determined to be large, the output unit need not output the signal to the driver.
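The threshold rule described above can be sketched as follows. This is a minimal illustrative implementation, not the patented method itself; the sampling rate, epoch start, and threshold value are assumptions chosen for the example.

```python
# Sketch of the P300-threshold rule: the attention amount is judged "low" when
# the peak positive amplitude of the averaged event-related potential in the
# 300-600 ms post-stimulus window falls below a threshold.
# SAMPLE_RATE_HZ and THRESHOLD_UV are illustrative assumptions.

SAMPLE_RATE_HZ = 200   # assumed EEG sampling rate
THRESHOLD_UV = 5.0     # assumed P300 amplitude threshold in microvolts

def p300_amplitude(erp_uv, epoch_start_ms=-100):
    """Peak positive amplitude in the 300-600 ms post-stimulus window.

    erp_uv: averaged ERP samples (microvolts); the epoch starts at epoch_start_ms.
    """
    def idx(t_ms):
        return int((t_ms - epoch_start_ms) * SAMPLE_RATE_HZ / 1000)
    window = erp_uv[idx(300):idx(600)]
    return max(window)

def attention_is_low(erp_uv):
    """True when the P300 peak is below the (assumed) threshold."""
    return p300_amplitude(erp_uv) < THRESHOLD_UV
```

In practice the threshold would be calibrated per driver, since P300 amplitudes vary between individuals.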
- the attention amount determination unit may determine the attention amount according to a correlation coefficient between the electroencephalogram signal measured starting from the occurrence time of the visual stimulus and a template stored in advance.
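The template-matching variant above can be illustrated with a short sketch: correlate the measured post-stimulus waveform with a pre-stored ERP template and map the Pearson correlation coefficient to an attention judgment. The template values and the 0.3 cutoff below are illustrative assumptions, not values from the patent.

```python
# Hypothetical template-correlation judgment: a low correlation between the
# measured waveform and a stored "attentive" ERP template suggests low attention.
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sample sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def attention_from_correlation(measured, template, low_threshold=0.3):
    """Judge attention from the correlation with a pre-stored template."""
    r = pearson(measured, template)
    return "low" if r < low_threshold else "normal"
```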
- The output unit may output at least one of a video signal that presents characters or symbols on an information display screen and an audio signal that is output from a speaker.
- the driving attention amount determination apparatus may further include a peripheral stimulus generation unit that generates the visual stimulus in the driver's peripheral visual field region.
- The driving attention amount determination device may further include an imaging unit that captures video of the area in front of the vehicle driven by the driver, and a peripheral stimulus detection unit that detects, from the captured video, the occurrence of the visual stimulus in the peripheral visual field region; the attention amount determination unit may receive information specifying the detected occurrence time of the visual stimulus from the peripheral stimulus detection unit.
- The driving attention amount determination device may further include a line-of-sight measurement unit that measures the driver's line of sight, and the peripheral stimulus detection unit may detect whether the visual stimulus has occurred in the peripheral visual field region from the line of sight measured by the line-of-sight measurement unit and the captured video.
- The driving attention amount determination device may include a situation detection unit that detects the speed of the vehicle or whether the headlamps are lit, and the peripheral stimulus detection unit may detect whether the visual stimulus is in the peripheral visual field region according to the detection result of the situation detection unit.
- The attention amount determination unit may exclude the event-related potential data of the electroencephalogram signal for the visual stimulus detected in the peripheral visual field region from the analysis target.
- The peripheral stimulus generation unit may generate the visual stimulus in the driver's peripheral visual field at a timing that differs by at least a predetermined value from the timing of a visual stimulus generated in the driver's central visual field.
- A method for determining a driving attention amount according to the present invention includes: a step of measuring a driver's electroencephalogram signal; a step of determining the driver's attention amount to the peripheral visual field region from the electroencephalogram signal measured starting from the occurrence of a visual stimulus generated in the driver's peripheral visual field region; and a step of alerting the driver by outputting a signal based on the determination result.
- A computer program for determining a driving attention amount according to the present invention, when executed by a computer, causes the computer to receive the driver's electroencephalogram signal, determine the driver's attention amount to the peripheral visual field region from the electroencephalogram signal measured starting from the occurrence of a visual stimulus generated in the driver's peripheral visual field region, and output a signal based on the determination result.
- According to the present invention, the attention amount in the driver's peripheral visual field region is determined from the electroencephalogram signal measured starting from the occurrence of the visual stimulus generated in the driver's peripheral visual field region.
- The electroencephalogram signal can thus be used to accurately determine the amount of attention to events that may occur in the driver's peripheral visual field, such as a vehicle suddenly cutting in or a pedestrian jumping out, and based on the determination result the driver can be prompted toward an appropriate state change, such as increased arousal.
- FIG. 3 is a functional block diagram of the driving attention amount determination apparatus 1 according to the first embodiment. A diagram showing an example of the peripheral visual field region.
- FIG. 5 is a flowchart illustrating the processing procedure of the peripheral visual field attention amount determination unit 13. A diagram showing a processing example of the peripheral visual field attention amount determination unit. A diagram showing an example of alerting by the output unit. A diagram showing the presentation screen of the experiment conducted by the present inventors. A diagram showing the averaged waveform for each visual field region.
- FIG. 6 is a functional block diagram of the driving attention amount determination apparatus 1 according to the second embodiment.
- FIG. 4 is a flowchart showing the processing procedure of the peripheral stimulus detection unit 16. A functional block diagram of the driving attention amount determination apparatus 1 when the situation detection unit 17 is provided in the second embodiment. A diagram showing an example of the central visual field region.
- FIG. 10 is a functional block diagram of the driving attention amount determination apparatus 1 according to the third embodiment.
- FIG. 3 is a functional block diagram of the line-of-sight measurement unit 18. (a) A diagram showing the data structure of the calibration information of the line-of-sight measurement unit 18 in the third embodiment; (b) a diagram showing an example of the driver.
- FIG. 3 is a functional block diagram of the driving attention amount determination device 1a in which a line-of-sight measurement unit 18 is added to the configuration of the first embodiment. A functional block diagram of the driving attention amount determination apparatus 2b in which a situation detection unit 18 is added to the configuration of the second embodiment.
- FIG. 1 is a block diagram showing a main configuration of a driving attention amount determination apparatus 100 according to the present invention.
- the driving attention amount determination apparatus 100 includes an electroencephalogram measurement unit 11, an attention amount determination unit 13, and an output unit 14.
- the electroencephalogram measurement unit 11 measures an electroencephalogram signal of the driver 10.
- the attention amount determination unit 13 determines the attention amount with respect to the peripheral visual field region of the driver 10 from the electroencephalogram signal measured from the generation time point of the visual stimulus generated in the peripheral visual field region of the driver 10.
- the “peripheral visual field region” refers to a region other than a certain visual field region (central visual field region) determined from the human visual line direction in the human visual field region.
- The central visual field region can be defined as the region enclosed by a cone whose axis is the direction of the human line of sight, where the angle between the side surface of the cone and the line-of-sight direction is fixed. In the following embodiments, this fixed angle is taken to be about 20 degrees.
- the output unit 14 alerts the driver 10 by outputting a signal based on the determination result of the attention amount determination unit 13. Thereby, it becomes possible to improve the driver's attention and to support safe driving.
- the above-described attention amount determination unit 13 specifies the generation time point of the visual stimulus generated in the peripheral visual field region.
- The visual stimulus may be given by providing a light emitting device in the driving attention amount determination device 100 and causing it to emit light, or may be given by the external environment (for example, a lamp lit by another vehicle).
- FIG. 2 is a block diagram of the driving attention amount determination apparatus 1 according to the present embodiment.
- The driving attention amount determination device 1 is a device that determines the attention amount for driving using the electroencephalogram signal of the driver 10 and provides support according to the determination result. For example, brain waves can be used to determine the amount of attention to events that can occur in the driver's peripheral visual field region, such as a vehicle suddenly cutting in or a pedestrian jumping out, and the driver can be alerted according to the determination result.
- the driving attention amount determination apparatus 1 includes an electroencephalogram measurement unit 11, a peripheral stimulus generation unit 12, an attention amount determination unit 13, and an output unit 14.
- The block for the driver 10 is shown for convenience of explanation.
- the electroencephalogram measurement unit 11 is an electroencephalograph, for example, and measures the electroencephalogram of the driver 10.
- the peripheral stimulus generator 12 is constituted by, for example, an LED light source and its control circuit, and generates a visual stimulus in the peripheral visual field region of the driver 10.
- the peripheral stimulus generation unit 12 transmits a visual stimulus toward the driver 10 and transmits information indicating a stimulus generation timing toward the attention amount determination unit 13.
- The attention amount determination unit 13 is, for example, a microcomputer. It extracts the electroencephalogram signal starting from the stimulus occurrence time specified by the information indicating the stimulus generation timing, and determines from that signal the attention amount with respect to the peripheral visual field region of the driver 10.
- the output unit 14 is a device that can output at least one of an image and a sound.
- the image is output using a display device such as a liquid crystal display device or a so-called organic EL display.
- the sound is output using a speaker.
- the output unit 14 prompts the driver 10 to call attention based on the determination result.
- the electroencephalogram measurement unit 11 detects an electroencephalogram signal by measuring a potential change at an electrode mounted on the head of the driver 10.
- the inventors of the present application envision a wearable electroencephalograph in the future. Therefore, the electroencephalograph may be a head-mounted electroencephalograph. It is assumed that the driver 10 is wearing an electroencephalograph in advance.
- Electrodes are arranged in the electroencephalogram measurement unit 11 so as to come into contact with a predetermined position of the head when worn on the head of the driver 10.
- The electrodes are arranged, for example, at Pz (midline parietal), A1 (earlobe), and the nasion, as defined by the International 10-20 system.
- the P300 component of the event-related potential is said to reach a maximum amplitude at Pz (midline parietal).
- The P300 component can also be measured at Cz (vertex) and Oz (midline occipital) near Pz, and electrodes may be placed at those positions. The electrode position is chosen based on the reliability of signal measurement and ease of mounting.
- the electroencephalogram measurement unit 11 can measure the electroencephalogram of the driver 10.
- The measured electroencephalogram is sampled so that it can be processed by a computer and sent to the attention amount determination unit 13.
- When focusing on event-related potentials, the electroencephalogram measured by the electroencephalogram measurement unit 11 is subjected in advance to, for example, a 15 Hz low-pass filter.
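As an illustration of the pre-filtering mentioned above, a single-pole IIR low-pass filter with a roughly 15 Hz cutoff can be sketched as follows. This is an assumption-laden stand-in: a real EEG system would typically use a properly designed FIR or Butterworth filter, and the 200 Hz sampling rate is an example value.

```python
# Hypothetical single-pole IIR low-pass filter (cutoff ~15 Hz) as a stand-in
# for the pre-filtering step; sampling rate is an assumed example value.
import math

def lowpass(samples, cutoff_hz=15.0, sr_hz=200.0):
    """Smooth a sample sequence with a first-order RC-style low-pass filter."""
    dt = 1.0 / sr_hz
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)           # smoothing factor derived from the cutoff
    out = []
    prev = samples[0]
    for x in samples:
        prev = prev + alpha * (x - prev)
        out.append(prev)
    return out
```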
- the peripheral stimulus generator 12 generates a visual stimulus in the peripheral visual field region of the driver.
- the definition of the peripheral visual field region will be described using an example.
- Visual stimuli can be presented by blinking a light source 23, such as an LED, placed at the edge of the display 22.
- the peripheral stimulus generator 12 includes a light source 23 and a control circuit (not shown) that controls the blinking timing while supplying power to the light source 23.
- The number of blinks per unit time serving as the visual stimulus is determined from the determination accuracy and determination interval of the attention amount determination unit 13 described later. For example, when the change in attention amount is determined every 3 minutes and 30 electroencephalogram epochs (the number of additions) are needed for each determination, the blink rate is 10 blinks per minute.
- the required number of blinks can be further reduced by combining various noise countermeasures and high-precision analysis methods currently used in EEG event-related potential (ERP) research.
- the blinking position may be determined randomly, or may be blinked sequentially in a predetermined order.
- The location where the light source 23 is arranged is defined as the peripheral visual field region as seen from the driver.
- The light source 23, such as an LED, may instead be disposed on the edge of the windshield or on a door mirror; in that case, the place where the light source 23 is disposed is likewise treated as the peripheral visual field region as seen from the driver.
- the peripheral stimulus generator 12 needs to generate a visual stimulus in the peripheral visual field with a timing shifted from the visual stimulus generated in the driver's central visual field. Specific examples will be described below.
- FIG. 4 shows an example of the central visual field region 31.
- In FIG. 4, the area containing the lane in which the host vehicle travels and the front panel (not shown) is defined as the central visual field region 31. The region other than the central visual field region 31 is set as the peripheral visual field region 32.
- For example, the blinking timing of the blinker display on the front panel, which lies in the driver's central visual field 31, must be intentionally shifted from the blinking timing of the light source 23 disposed on the windshield edge, the door mirror, or other locations in the peripheral visual field 32.
- The attention amount determination unit 13 determines the attention amount using the event-related potential of the electroencephalogram starting from the occurrence of the stimulus, in particular the event-related potential from 300 ms to 600 ms after the stimulus. If visual stimuli occur simultaneously in both the central and peripheral visual fields, it cannot be determined whether the attention amount determined by the attention amount determination unit 13 relates to the central visual field region 31 or to the peripheral visual field region 32. Therefore, visual stimuli must be generated in the peripheral visual field region 32 with a predetermined time difference from the visual stimuli generated in the central visual field region 31, so that the analysis time intervals for the respective stimuli do not overlap.
- In order to identify which visual stimulus triggered the event-related potential of interest (300 milliseconds to 600 milliseconds), the stimulus must be generated with a time difference of at least 300 milliseconds from other visual stimuli. For example, if the front-panel blinker blinks every 600 milliseconds, the light source 23 in the peripheral visual field region may be blinked every 600 milliseconds at a timing shifted by 300 milliseconds from the front-panel stimulus.
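The timing rule above can be sketched as a small scheduling check: peripheral stimuli on the same 600 ms period as the front-panel blinker, offset by 300 ms, so the 300-600 ms analysis windows of the two stimulus streams never overlap. The helper names are illustrative, not from the patent.

```python
# Verify that a 300 ms offset on a 600 ms period keeps the post-stimulus
# analysis windows of central and peripheral stimuli disjoint.
PERIOD_MS = 600
OFFSET_MS = 300
WINDOW = (300, 600)  # post-stimulus analysis window in ms

def stimulus_times(n, offset_ms=0, period_ms=PERIOD_MS):
    """Onset times of n periodic stimuli, optionally phase-shifted."""
    return [offset_ms + i * period_ms for i in range(n)]

def analysis_windows(times):
    """Map each stimulus onset to its 300-600 ms analysis interval."""
    return [(t + WINDOW[0], t + WINDOW[1]) for t in times]

def windows_overlap(w1, w2):
    return w1[0] < w2[1] and w2[0] < w1[1]

central = analysis_windows(stimulus_times(5))
peripheral = analysis_windows(stimulus_times(5, OFFSET_MS))
assert not any(windows_overlap(c, p) for c in central for p in peripheral)
```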
- When a visual stimulus is not generated with the timing described above, the attention amount for the peripheral visual field region 32 cannot be measured correctly. Therefore, the event-related potential data for such a stimulus generated in the peripheral visual field region 32 is excluded from the analysis target of the attention amount determination unit 13. This may be realized by the attention amount determination unit 13 discarding the data without using it, or by the electroencephalogram measurement unit 11 stopping output of the electroencephalogram signal at that timing.
- As described above, the "peripheral visual field region" refers to the region extending roughly 130 degrees vertically and 180 degrees horizontally outside the range of about 20 degrees (central visual field) around the line of sight (gaze point). Therefore, when a gaze measurement unit that measures the driver's line of sight is provided, as shown in FIG. 5, the area within a viewing angle of 20 degrees from the measured gaze point 41 is set as the central visual field region 42, and the remaining area as the peripheral visual field region 43. The peripheral visual field region 43 can be set to between 20 and 130 degrees from the line of sight in each vertical direction, and between 20 and 180 degrees from the line of sight in each horizontal direction.
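Given the 20-degree cone definition above, classifying a stimulus as central or peripheral relative to a measured gaze point can be sketched as follows. The small-angle approximation and the function names are illustrative assumptions.

```python
# Illustrative classifier: a stimulus direction within 20 degrees of the gaze
# point is central; anything farther out is peripheral. Gaze and stimulus are
# given as (horizontal, vertical) angles in degrees; small-angle approximation.
import math

CENTRAL_FIELD_DEG = 20.0

def angular_offset(gaze, stimulus):
    """Approximate angular distance between gaze and stimulus directions."""
    dh = stimulus[0] - gaze[0]
    dv = stimulus[1] - gaze[1]
    return math.hypot(dh, dv)

def field_region(gaze, stimulus):
    """Classify the stimulus as 'central' or 'peripheral' for this gaze."""
    if angular_offset(gaze, stimulus) < CENTRAL_FIELD_DEG:
        return "central"
    return "peripheral"
```

A check like this is what would let the device skip ERP epochs for stimuli that happen to fall in the central field when the gaze moves.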
- In the present embodiment, the central visual field region and the peripheral visual field region are set as shown in FIGS. 4 and 5 described above. It is assumed that the driver is basically looking at the front center, and that the central and peripheral visual field regions are fixed.
- the driver's line of sight may fluctuate during actual driving.
- an example in which the driver's line of sight is measured will be described.
- an example in which an external visual stimulus is used will be described in another embodiment described later.
- the peripheral stimulus generation unit 12 transmits information indicating the time or generation timing (trigger) when the above stimulus is generated to the attention amount determination unit 13.
- the attention amount determination unit 13 determines the attention amount with respect to the peripheral visual field region of the driver 10 based on the information received from the peripheral stimulus generation unit 12 by analyzing the measured electroencephalogram signal starting from the generation point of the stimulus.
- the procedure of the attention amount determination unit 13 will be described with reference to FIGS.
- FIG. 6 is a flowchart showing a processing procedure of the attention amount determination unit 13.
- FIG. 7 is a waveform example relating to the processing of the attention amount determination unit 13.
- In step S51, the attention amount determination unit 13 receives the measured electroencephalogram data from the electroencephalogram measurement unit 11.
- FIG. 7 shows the received electroencephalogram data 61.
- In step S52, the attention amount determination unit 13 receives information on the stimulus occurrence time from the peripheral stimulus generation unit 12.
- FIG. 7 shows the stimulus occurrence time (trigger) 62.
- In step S53, the attention amount determination unit 13 cuts out, from the electroencephalogram data received in step S51, the segment from −100 milliseconds to 600 milliseconds relative to the occurrence time acquired in step S52.
- FIG. 7 shows an example of the cut out electroencephalogram data (event-related potential) 63. Note that the above-described time width for cutting out electroencephalogram data is determined as a range that necessarily includes the P300 component of the event-related potential. If the P300 component is included, the electroencephalogram data may be cut out with a time width different from this time width.
- In step S54, the attention amount determination unit 13 performs baseline correction on the cut-out electroencephalogram data.
- The baseline correction uses the average potential from −100 milliseconds to 0 milliseconds relative to the stimulus occurrence time.
- In step S55, the attention amount determination unit 13 temporarily accumulates the baseline-corrected electroencephalogram data.
- In step S56, the attention amount determination unit 13 determines whether the number of electroencephalogram data items accumulated in step S55 has reached the preset required number of additions. If not, the process returns to step S51; if so, it proceeds to step S57.
- In general, in the study of event-related potentials, analysis is performed after averaging the electroencephalogram data. As a result, random brain potentials unrelated to the event of interest cancel out, and event-related potentials that have a constant latency (the time from stimulus occurrence to the potential change) and polarity (for example, the P300 component) can be detected.
- The number of additions is, for example, 20 to 30. Increasing the number improves the S/N ratio. However, this number of additions is only an example, and the present invention is not limited to it.
- the attention amount may be determined from the non-additional electroencephalogram (one electroencephalogram data).
- step S57 the attention amount determination unit 13 performs an averaging process on the electroencephalogram data for the required number of times accumulated in step S55.
- FIG. 7 shows a waveform 64 and an amplitude 65 after the averaging.
- The amplitude of the event-related potential from 300 milliseconds to 600 milliseconds is analyzed in the averaged electroencephalogram data, and the attention amount is determined based on the magnitude of the amplitude.
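- The epoch-cutting, baseline-correction, averaging, and amplitude-analysis steps (S53 to S57) can be sketched as follows. This is a minimal illustration in Python/NumPy, not the patented implementation; the 200 Hz sampling rate matches the experiment described later, and the function names are hypothetical.

```python
import numpy as np

FS = 200  # sampling rate in Hz (assumed, matching the experiment in the text)

def extract_epoch(eeg, stim_index, fs=FS):
    """Cut out EEG from -100 ms to 600 ms around a stimulus sample index
    (step S53) and baseline-correct with the -100..0 ms mean (step S54)."""
    pre = int(0.1 * fs)   # 100 ms before the stimulus
    post = int(0.6 * fs)  # 600 ms after the stimulus
    epoch = eeg[stim_index - pre: stim_index + post]
    baseline = epoch[:pre].mean()        # mean potential over -100..0 ms
    return epoch - baseline              # baseline-corrected epoch

def p300_max_amplitude(epochs, fs=FS):
    """Average the accumulated epochs (step S57) and return the maximum
    amplitude in the 300-600 ms window, where the P300 is expected."""
    erp = np.mean(epochs, axis=0)        # averaging cancels unrelated activity
    start = int((0.1 + 0.3) * fs)        # 300 ms after stimulus (epoch starts at -100 ms)
    stop = int((0.1 + 0.6) * fs)         # 600 ms after stimulus
    return erp[start:stop].max()
```

The returned maximum amplitude would then be compared against a threshold, as described in the determination process below.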
- the attention amount for the peripheral visual field region is determined based on the ERP characteristic specific to the peripheral visual field region specified by the inventors of the present application. Details of the determination process will be described later with reference to the experimental results shown in FIGS.
- Next, the relationship between the range of electroencephalogram data that is added together and the range for which the attention amount is determined will be described. When the data for stimuli from all of the peripheral light sources are added together, the attention amount for the entire peripheral visual field is determined; when the data are added separately for each light source, the attention amount for each light source position is determined.
- the attention amount determination unit 13 transmits the determination result to the output unit 14.
- The output unit 14 presents the result determined by the attention amount determination unit 13 as an image or sound. Alternatively, based on the determination result, the output unit 14 outputs a signal for alerting the driver when the attention amount is low. The driver can thereby be alerted.
- the signal output for the output unit 14 to alert the driver may be, for example, one of a video signal and an audio signal, or both.
- For example, the driver may be addressed by an audio signal, an operation sound or warning sound may be emitted, or text or an image may be presented on a car navigation system or head-up display by a video signal. Attention can thus be called in order to raise the driver's attention level.
- the signal that the output unit 14 outputs to call attention includes a control signal that causes an action to call attention to the driver.
- Examples include control signals for direct information presentation using AR (Augmented Reality) technology, which superimposes an image on the object to which attention should be drawn, and control signals for indirect alerting, such as applying vibration to the steering wheel or adjusting scent or airflow. Any of these examples, including the preceding ones, can be said to call attention by applying an external action to the driver.
- FIG. 8 shows an example of alerting by the output unit 14.
- In this example, as a result of adding the electroencephalogram data and determining the attention amount for each light source position, the attention amount determination unit 13 has determined that the driver's attention amount on the left side is reduced.
- a left arrow image signal 152 is output (presented) on the head-up display (HUD) 151 in order to prompt the driver to call attention to the left side. This image signal functions as information for calling attention.
- test subjects were a total of 4 people, 1 male and 3 female, with an average age of 21 ⁇ 1.5 years. The contents of the experiment will be described with reference to FIG.
- the inventors of the present application conducted an experiment by a double task method in which the subject performed two tasks in parallel.
- The first task is a central task 71 of counting the number of times the symbols presented at the center of the screen switch.
- the second task is a peripheral task 72 in which lamps around the screen flash in a random order, and the subject presses the button at hand when he notices the flashing.
- the subjects were instructed to keep their eyes on the center of the screen. In this way, by performing the two tasks at the center and the periphery at the same time, it is possible to examine how much attention is directed to the periphery of the screen while paying attention to the center of the screen.
- The test subjects wore an electroencephalograph (Polymate AP-1124, manufactured by TEAC). Electrodes were placed according to the international 10-20 system, with the recording electrode at Pz (midline parietal), the reference electrode at A1 (right earlobe), and the ground electrode on the forehead.
- EEG was measured at a sampling frequency of 200 Hz with a time constant of 3 seconds and band-pass filtered from 1 to 6 Hz. Electroencephalogram data from −100 milliseconds to 600 milliseconds relative to the time at which a peripheral lamp blinked were extracted, and baseline correction was performed with the average potential from −100 milliseconds to 0 milliseconds.
- FIG. 10 shows the addition average waveform of all subjects of the electroencephalogram data for each of the first and second conditions after performing the above-described processing.
- the first condition is a classification condition performed on the visual field region.
- The viewing angle is the angle between the line connecting the subject's eye position to the gaze point at the center of the screen and the line connecting the subject's eye position to the blinking lamp. The visual field was classified as:
- region 1 from 0 degrees to less than 10 degrees
- region 2 from 10 degrees to less than 20 degrees
- region 3 as 20 degrees or more.
- the second condition is a classification condition related to the button press reaction time of the subject.
- The reaction time until the button was pressed was used as the experimental criterion for classifying the attention amount, because the reaction time is considered to reflect the amount of attention.
- Patent Document 2 also calculates the concentration of attention to driving using the brake reaction time.
- It can be seen that in region 3 (20 degrees or more, generally regarded as the peripheral visual field), the amplitude of the P300 component is significantly reduced when the attention amount is small (panel (f) in FIG. 10).
- the maximum amplitudes (81 (d) to (f)) of the P300 component in (d) to (f) of FIG. 10 are 13.6 ⁇ V, 13.2 ⁇ V, and 2.5 ⁇ V, respectively.
- FIG. 11 shows the maximum amplitude of the P300 component under each condition of FIG.
- In FIG. 11, the horizontal axis shows the visual field region (region 1 / region 2 / region 3), and the vertical axis shows the potential in μV. The solid line represents the case where the attention amount is large, and the dotted line the case where it is small.
- the amplitude differences 91 (a) to 91 (c) in each visual field region when the attention amount is large and small are 6.7 ⁇ V, 6.4 ⁇ V, and 18.4 ⁇ V, respectively.
- FIG. 11 also shows that there is a significant amplitude difference in the region 3 (peripheral visual field region) where the visual angle is 20 degrees or more, depending on the amount of attention.
- By using the ERP characteristics in the peripheral visual field region described above to evaluate the magnitude of the event-related potential amplitude, the amount of attention to events that may occur in the driver's peripheral visual field, such as a sudden cut-in by another vehicle or a pedestrian running out, can be determined with high accuracy from the electroencephalogram.
- FIG. 12 shows a probability distribution with respect to the maximum amplitude of the P300 component at the time of non-additive brain waves for each visual field region.
- Panel (a) shows the probability distribution for region 1, panel (b) for region 2, and panel (c) for region 3 (the peripheral visual field region). The vertical axis shows the potential in μV, and the horizontal axis shows the occurrence probability for each attention amount in %.
- Table 1 shows the discrimination rate when the amount of attention is discriminated in each visual field region.
- For each visual field region, a threshold on the maximum ERP amplitude that maximizes the discrimination rate is set, and the attention amount is discriminated according to whether the ERP amplitude of each non-averaged (single-trial) electroencephalogram is at or above the threshold.
- the threshold value that maximizes the discrimination rate is a threshold value that maximizes the average of the correct answer rate when the attention amount is large and the correct answer rate when the attention amount is small.
- The resulting thresholds are 7.5 μV, 22.5 μV, and 32.5 μV, respectively, and are indicated by dash-dot lines in FIGS. 12(a) to 12(c).
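- The threshold selection described above (maximizing the average of the two per-class correct rates) can be sketched as follows; this is an illustrative reconstruction, and the function name and candidate grid are assumptions.

```python
import numpy as np

def best_threshold(amps_high, amps_low, candidates):
    """Return the ERP-amplitude threshold that maximizes the mean of the two
    per-class correct rates: 'large attention' trials should lie at or above
    the threshold, 'small attention' trials below it."""
    def score(th):
        correct_high = np.mean(np.asarray(amps_high) >= th)  # correct when large
        correct_low = np.mean(np.asarray(amps_low) < th)     # correct when small
        return (correct_high + correct_low) / 2.0
    return max(candidates, key=score)
```

A single-trial amplitude would then be classified as "large attention" if it is at or above the region's threshold, and "small attention" otherwise.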
- In region 3 (the peripheral visual field region), the probability distribution when the attention amount is large and the distribution when it is small are separated to some extent, and the attention amount discrimination rate is 73.1%, a very high value for discrimination from non-averaged (single-trial) brain waves. Therefore, according to the attention amount determination in the peripheral visual field of this embodiment, a high discrimination rate can be maintained with a single-trial electroencephalogram, without performing tens to hundreds of additions. In other words, it is possible to determine the driver's attention amount at that very moment, rather than the attention state over a time span of several minutes.
- the attention amount in the peripheral visual field region may be determined based on the value of the correlation coefficient with the template stored in advance, instead of the threshold processing for the amplitude of the event-related potential described above.
- Here, the templates are the averaged waveform data of the electroencephalogram signal in FIG. 10(c), for the case where the attention amount in region 3 (peripheral visual field) is large, and of the signal in FIG. 10(f), for the case where the attention amount is small.
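- The template-correlation alternative can be sketched as follows: the measured single-trial ERP is compared against both stored templates, and the class with the larger correlation coefficient wins. This is a minimal illustration under the assumption of a Pearson correlation; the function name is hypothetical.

```python
import numpy as np

def classify_by_template(erp, template_high, template_low):
    """Classify a measured ERP as 'high' or 'low' attention by comparing its
    Pearson correlation with the two stored averaged-waveform templates."""
    r_high = np.corrcoef(erp, template_high)[0, 1]
    r_low = np.corrcoef(erp, template_low)[0, 1]
    return "high" if r_high >= r_low else "low"
```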
- As described above, in this embodiment, a visual stimulus is generated in the driver's peripheral visual field region, and the driver's attention amount with respect to the peripheral visual field region is determined from the event-related potential of the electroencephalogram signal, taking the stimulus occurrence time as the starting point. As a result, the attention amount can be determined even for events that may occur in the driver's peripheral visual field, such as a sudden cut-in by another vehicle or a pedestrian running out, for which a behavioral index such as braking cannot be obtained. Based on the determination result, support that prompts the driver to pay attention can then be provided.
- In this embodiment, an imaging unit that captures the area in front of the host vehicle is provided. This driving attention amount determination device detects, from the video captured by the imaging unit, the occurrence of a visual stimulus that serves as the starting point for analyzing the event-related potential of the electroencephalogram, and distinguishes the central visual field region from the peripheral visual field region based on the position in the captured video where the stimulus occurred. The attention amount in the peripheral visual field region is then determined.
- Thus, without intentionally presenting a visual stimulus as in Embodiment 1, the driving attention amount determination device can determine the attention amount in the peripheral visual field region by using, from the front-view video, the natural visual stimuli that occur in front of the driver during driving.
- FIG. 13 is a block diagram of the driving attention amount determination apparatus 2 according to this embodiment.
- the difference between the driving attention amount determination device 2 and the driving attention amount determination device 1 (FIG. 2) according to the first embodiment is that the driving attention amount determination device 2 includes a photographing unit 15 in addition to the driving attention amount determination device 1. Further, the peripheral stimulus generation unit 12 of the driving attention amount determination device 1 is replaced with a peripheral stimulus detection unit 16. The different components will be described in detail below.
- the photographing unit 15 is, for example, a camera that can shoot moving images.
- the imaging unit 15 is installed in front of the vehicle (on the dashboard, behind the rearview mirror, etc.) and, for example, images the front of the vehicle at 30 frames per second at an angle of view of 105 degrees in the vertical direction and 135 degrees in the horizontal direction.
- the photographing unit 15 can photograph the image shown in FIG. 4, for example.
- The peripheral stimulus detection unit 16 detects, from the video captured by the imaging unit 15, the occurrence time of a visual stimulus that serves as the starting point for analyzing the event-related potential, and at the same time specifies the region in the captured video where the stimulus occurred. Here, a visual stimulus is defined as an event in which the amount of luminance change in the video exceeds a predetermined threshold. The amount of change is only one example; the rate of luminance change can also be used, in which case a visual stimulus may be deemed to have occurred when the rate of change is 50% or more.
- The brake lamps of a preceding vehicle, the blinker of a vehicle in a parallel lane, the headlights of an oncoming vehicle, a signal change, and the like correspond to such stimuli. The time at which the change occurred is detected as the visual stimulus occurrence time.
- The peripheral stimulus detection unit 16 detects the occurrence time of the visual stimulus defined above, and determines whether the stimulus position, i.e., the luminance change position, lies in the central visual field region or the peripheral visual field region. As shown in FIG. 4, when the stimulus lies in the lane region of the host vehicle in the captured video, it is judged to be in the central visual field region 31; when it lies elsewhere, it is judged to be in the peripheral visual field region 32. If the stimulus is judged to be in the peripheral visual field region 32, its occurrence time is transmitted to the attention amount determination unit 13.
- FIG. 14 is a flowchart illustrating a processing procedure of the peripheral stimulus detection unit 16 according to the present embodiment.
- Here, the case of using the amount of luminance change is described as an example.
- step S161 the peripheral stimulus detection unit 16 calculates the difference between the luminance images of each frame with respect to the vehicle front image captured by the imaging unit 15.
- step S162 the peripheral stimulus detection unit 16 determines whether or not there has been a luminance change equal to or greater than a predetermined threshold value Th1 from the above difference. If there is a change in luminance, the process proceeds to step S163. If not, the process returns to step S161, and the next inter-frame luminance difference is calculated.
- step S163 the peripheral stimulus detection unit 16 stores the luminance change time point and the position where the luminance change has occurred in the image.
- In step S164, the peripheral stimulus detection unit 16 detects white lines from the inter-frame luminance difference calculated in step S161. More specifically, in road images captured from a moving vehicle, the asphalt road surface, roadside structures, and vegetation appear to move between frames, whereas the white lane markings, painted at regular intervals, appear nearly stationary in the image. The peripheral stimulus detection unit 16 therefore detects regions whose inter-frame luminance difference is at or below a predetermined threshold Th2 as non-moving white line regions.
- step S165 the peripheral stimulus detection unit 16 uses the detected white line, and extracts a region where the distance between both the white lines is a certain width or more as a lane region.
- the lane region is shown as the central visual field region 31.
- In step S166, the peripheral stimulus detection unit 16 determines whether the luminance change position stored in step S163 is outside the lane region extracted in step S165. If it is outside the lane region, the luminance change is judged to have occurred in the peripheral visual field region 32 (FIG. 4), and the process proceeds to step S167. Otherwise, the change is judged to have occurred in the central visual field region 31 (FIG. 4), and the process returns to step S161 to calculate the next inter-frame luminance difference.
- step S167 the peripheral stimulus detection unit 16 transmits the luminance change time point determined as the luminance change in the peripheral visual field region 32 (FIG. 4) to the attention amount determination unit 13.
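- The core of steps S161 to S166 (frame differencing, thresholding against Th1, and the inside/outside-lane test) can be sketched as follows. The threshold value and function names are illustrative assumptions; the white-line/lane extraction is abstracted into a precomputed boolean lane mask.

```python
import numpy as np

TH1 = 40.0  # luminance-change threshold Th1 (illustrative value)

def luminance_change_positions(prev_frame, frame, th1=TH1):
    """Steps S161-S163: difference two luminance images and return the pixel
    positions whose change is at or above Th1 (candidate visual stimuli)."""
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    rows, cols = np.nonzero(diff >= th1)
    return list(zip(rows.tolist(), cols.tolist()))

def is_peripheral_change(pos, lane_mask):
    """Step S166: a change outside the host-vehicle lane region (the central
    visual field 31) is judged to lie in the peripheral visual field 32."""
    r, c = pos
    return not bool(lane_mask[r, c])
```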
- In the above description, the central visual field region 31 and the peripheral visual field region 32 were treated as roughly fixed, on the assumption that they do not change greatly.
- In reality, however, the driver's central and peripheral visual field regions are not fixed and are considered to change with the driving situation (the speed of the host vehicle or the brightness around it). For example, when the host vehicle is traveling at 100 km/h or more on an expressway, the driver's visual field is narrower than when stationary. Likewise, when the surroundings of the vehicle are dark, such as at night, the visual field is narrower than in the daytime. If the driver's visual field narrows, detection of a dangerous object is delayed even closer to the center of the visual field, increasing the likelihood of a crossing or run-out accident.
- FIG. 15 shows a configuration of the driving attention amount determination device 2a provided with the situation detection unit 17.
- The situation detection unit 17 is connected to the vehicle's speedometer, to a sensor provided for the automatic light function that turns on the headlamps when it is dark, and/or to the headlamp switch, and detects the driving situation (vehicle speed, ambient brightness, and/or whether the headlamps of the host vehicle are lit). When the vehicle speed is high or the headlamps are lit, the central visual field region can be defined smaller than when the vehicle is stationary or during the daytime, and the attention amount can be determined using the region outside it as the peripheral visual field region.
- FIGS. 16 and 17 show the reduced central visual field regions 171 and 182, respectively. This makes it possible to set visual field regions that track the change in the driver's visual field caused by changes in the external situation. The attention amount for the peripheral visual field region can therefore be determined according to the speed of the host vehicle and whether its headlamps are lit, reducing the risk of crossing and run-out accidents.
- the peripheral stimulus detection unit 16 changes the definition of the central and peripheral visual field regions according to the speed of the host vehicle detected by the situation detection unit 17 and the presence / absence of lighting of the headlamp of the host vehicle.
- FIG. 16 shows a reduced central viewing area 171.
- Table 2 shows an example of the relationship between the speed of the host vehicle and the area ratio of the central visual field region relative to when the vehicle is stationary. When the vehicle is stationary the area ratio is 1, and at high speed (for example, 100 km/h or more) the area ratio is set to 0.6. The area ratio of the central visual field region 171 shown in FIG. 16 to the stationary central visual field region is 0.8.
- Table 3 shows an example of the relationship between the presence or absence of lighting of the headlamp of the host vehicle and the area ratio of the central visual field area to daytime.
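- As a sketch, the two lookup tables can be combined into a single scaling function. The 0.6 and 0.8 speed ratios follow the examples quoted from Table 2, while the 50 km/h breakpoint and the night-time factor are assumptions for illustration only.

```python
def central_field_area_ratio(speed_kmh, headlamps_on):
    """Area of the central visual field region relative to the stationary,
    daytime baseline (ratio 1.0)."""
    if speed_kmh >= 100:        # e.g. expressway driving (text's example)
        ratio = 0.6
    elif speed_kmh >= 50:       # assumed intermediate breakpoint
        ratio = 0.8
    else:
        ratio = 1.0
    if headlamps_on:            # assumed night-time reduction (Table 3 analog)
        ratio *= 0.9
    return ratio
```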
- As described above, in this embodiment the occurrence of a visual stimulus is detected from video captured of the area in front of the host vehicle, the central and peripheral visual field regions are distinguished from the position of the stimulus in the captured video, and the attention amount in the peripheral visual field region is determined. The attention amount in the peripheral visual field region can thus be determined from the front-view video using the natural visual stimuli occurring in front of the driver during driving, without the device intentionally presenting a visual stimulus.
- In Embodiment 2, the stimulus occurrence region was determined basically on the premise that the driver looks at the front center while driving. However, when a visual stimulus occurs, the driver's line of sight is not always directed to the front center, and the peripheral visual field region therefore changes continually.
- In this embodiment, a line-of-sight measurement unit that measures the driver's line of sight is therefore provided in the driving attention amount determination device, which determines the region in which the visual stimulus occurred according to the position of the driver's gaze point.
- FIG. 18 is a block diagram of the driving attention amount determination apparatus 3 according to this embodiment.
- the driving attention amount determination device 3 is configured by adding a line-of-sight measurement unit 18 to the driving attention amount determination device 2 (FIG. 13).
- FIG. 19 shows a configuration example of the line-of-sight measurement unit 18.
- the line-of-sight measurement unit 18 measures the driver's gazing point 137 on the two-dimensional plane 136 that projects the scenery in front of the vehicle (that is, the vehicle front image captured by the imaging unit 15).
- the near-infrared light source 131 irradiates the eyeball with a near-infrared point light source, and the image of the eyeball is captured by the CCD camera 132.
- the reflected image position detection unit 133 detects the position of the corneal reflection image of the light source on the pupil and the corneal surface.
- the calibration information storage unit 135 stores in advance the relationship between the position of the corneal reflection image and the gazing point coordinates in the vehicle front image captured by the imaging unit 15. Based on the calibration information, the conversion unit 134 measures the driver's gazing point on the vehicle front image from the position of the cornea reflection image.
- FIG. 20 (a) shows an example of calibration information
- FIG. 20 (b) shows an example of coordinates of the gaze position on the vehicle front image.
- the calibration information includes a corneal reflection image position and a gaze position coordinate.
- Based on the corneal reflection image position (Pxn, Pyn) detected by the reflected image position detection unit 133, the conversion unit 134 obtains the driver's gaze position coordinates (Xn, Yn) on the vehicle front image.
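- One way to realize such a conversion is to fit a mapping from reflection positions to gaze coordinates using the stored calibration pairs. The sketch below assumes an affine model fitted by least squares; the patent does not specify the model, so this is illustrative only.

```python
import numpy as np

def fit_calibration(reflection_pts, gaze_pts):
    """Fit an affine map from corneal-reflection image positions (Pxn, Pyn)
    to gaze coordinates (Xn, Yn) on the front image, by least squares."""
    P = np.hstack([np.asarray(reflection_pts, float),
                   np.ones((len(reflection_pts), 1))])
    A, *_ = np.linalg.lstsq(P, np.asarray(gaze_pts, float), rcond=None)
    return A  # 3x2 matrix: [x, y, 1] @ A -> (X, Y)

def to_gaze(A, reflection_xy):
    """Convert one measured reflection position to gaze coordinates."""
    x, y = reflection_xy
    return np.array([x, y, 1.0]) @ A
```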
- the line-of-sight measuring unit 18 may be a head-mounted measuring instrument worn by the driver in advance, or an in-vehicle measuring instrument installed near the rear view mirror of the vehicle.
- the peripheral stimulus detection unit 16 detects the generation time point of the visual stimulus and determines whether the position of the stimulus is the central visual field region or the peripheral visual field region.
- the stimulus generation region is determined based on the position of the gazing point 41 (FIG. 5) measured by the line-of-sight measurement unit 18.
- The peripheral visual field generally refers to the region extending up to about 130 degrees vertically and 180 degrees horizontally outside the range of about 20 degrees around the line of sight (the central visual field). Therefore, as shown in FIG. 5, when the stimulus lies within 20 degrees of viewing angle from the measured gaze point 41, it is judged to be in the central visual field region 42; when it lies outside that region, it is judged to be in the peripheral visual field region 43. When the stimulus is judged to be in the peripheral visual field region 43, its occurrence time is transmitted to the attention amount determination unit 13.
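- The 20-degree central/peripheral test can be sketched as a viewing-angle computation between the eye-to-gaze-point line and the eye-to-stimulus line; the function names and 3-D coordinate representation are assumptions for illustration.

```python
import math

def viewing_angle_deg(eye, gaze_point, stimulus):
    """Angle (degrees) between the eye->gaze-point line and the eye->stimulus
    line, for 3-D positions given as (x, y, z) tuples."""
    v1 = [g - e for g, e in zip(gaze_point, eye)]
    v2 = [s - e for s, e in zip(stimulus, eye)]
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    # clamp to guard against floating-point overshoot before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def is_in_peripheral_field(eye, gaze_point, stimulus, central_deg=20.0):
    """Stimuli more than ~20 degrees away from the line of sight are treated
    as lying in the peripheral visual field."""
    return viewing_angle_deg(eye, gaze_point, stimulus) > central_deg
```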
- By measuring the driver's line of sight and determining the peripheral visual field region according to the gaze point position, as in the configuration and processing procedure of this embodiment, it can be accurately determined whether a stimulus lies in the peripheral visual field region even when the driver is not looking at the front center at the moment the visual stimulus occurs. As a result, the attention amount in the peripheral visual field region can be determined with higher accuracy.
- When the driving attention amount determination device is configured as a head-mounted display type device worn by the user, the attention amount with respect to the peripheral visual field can be determined not only for safety support while driving a car but also while riding a bicycle or walking. Since the attention amount in the user's peripheral visual field is determined from the event-related potential of the electroencephalogram, appropriate alerts to objects in the periphery can be issued.
- the configuration of the above-described line-of-sight measurement unit 18 can be provided in the driving attention amount determination apparatus 1 (FIG. 2) of the first embodiment and the driving attention amount determination apparatus 2a (FIG. 15) of the second embodiment.
- FIG. 21 shows a block diagram of a driving attention amount determination apparatus 1a according to a modification of the first embodiment.
- The driving attention amount determination device 1a is the driving attention amount determination device 1 with a line-of-sight measurement unit 18 added. As the line-of-sight measurement unit 18, the configuration shown in FIG. 19 can be used.
- functions and operations of the driving attention amount determination device 1a different from the driving attention amount determination device 1 (FIG. 2) will be described.
- the driving attention amount determination apparatus 1a can dynamically specify the driver's central visual field region and peripheral visual field region that change every moment. Thereby, the peripheral stimulus generation part 12 can selectively blink the light source located in the driver's peripheral visual field region.
- For example, the peripheral stimulus generation unit 12 can present a visual light stimulus by blinking the light source 23 located on the right side of the head-mounted display's frames for both eyes. In the illustrated configuration, no light source is provided on the side of each frame close to the nose; when the line-of-sight measurement unit 18 is provided, however, light sources may be placed on all sides of the frames. In that case, care must be taken that light from a light source arranged on the left/right eye frame does not enter the opposite (right/left) eye.
- By thus providing the line-of-sight measurement unit 18 and controlling the presentation of the visual stimulus, the visual stimulus can be reliably presented to the driver's peripheral visual field region. This makes it possible to determine with higher accuracy whether attention is directed to the peripheral visual field region.
- FIG. 22 shows a block configuration diagram of the driving attention amount determination device 2b having the situation detection unit 17 and the line-of-sight measurement unit 18.
- the driving attention amount determination device 2b is configured by providing a line-of-sight measurement unit 18 with respect to the driving attention amount determination device 2a (FIG. 15) in the second embodiment.
- The peripheral stimulus detection unit 16 changes the definition of the central and peripheral visual field regions according to the vehicle speed and the headlamp lighting state detected by the situation detection unit 17.
- FIG. 17 shows an example of the central visual field region 182 reduced based on the detection result of the situation detection unit 17.
- The central visual field region 182 is smaller than the usual 20-degree viewing angle and is defined, for example, within a range of about 16 degrees centered on the position of the gaze point 181.
- the driving attention amount determination device 2b determines the attention amount with respect to the peripheral visual field region according to the speed of the host vehicle and the presence / absence of lighting of the headlamp of the host vehicle.
- the processing described using the flowcharts can be realized as a program executed by a computer.
- a computer program is recorded on a recording medium such as a CD-ROM and distributed as a product to the market, or transmitted through an electric communication line such as the Internet.
- All or part of the components constituting the driving attention amount determination device may be realized as a general-purpose processor (semiconductor circuit) that executes a computer program.
- a processor that executes a computer program receives a driver's brain wave signal measured by the brain wave measuring unit 11. Then, the amount of attention to the driver's peripheral visual field region is determined from the electroencephalogram signal measured from the generation time point of the visual stimulus generated in the driver's peripheral visual field region, and a signal is output based on the determination result. Thereby, it becomes possible to call attention to the driver.
- The processor may control the operations of the peripheral stimulus generation unit 12, the imaging unit 15, the peripheral stimulus detection unit 16, the situation detection unit 17, the line-of-sight measurement unit 18, and the like, or may itself function as those components.
- the driving attention amount determination apparatus is useful for preventing accidents with respect to events that may occur in the peripheral visual field region of the driver, such as a sudden interruption of a vehicle or a jump of a pedestrian.
- When configured as a head-mounted display type device, it can also be applied to safety support while riding a bicycle or walking.
Abstract
Description
FIG. 2 is a block diagram of the driving attention amount determination apparatus 1 according to this embodiment.
The driving attention amount determination apparatus according to this embodiment includes an imaging unit that captures video ahead of the host vehicle. From the captured video, the apparatus detects the occurrence of a visual stimulus that serves as the starting point for analyzing the event-related potential of the brain waves, distinguishes the central and peripheral visual field regions based on the position in the captured video at which the stimulus occurred, and determines the amount of attention to the peripheral visual field region.
In Embodiment 2, the region in which a stimulus occurred was determined on the premise that the driver basically looks at the front center while driving. In practice, however, the driver does not always direct his or her gaze toward the front center when a visual stimulus occurs, so the peripheral visual field region is constantly shifting as well.
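Distinguishing the central from the peripheral visual field relative to the measured gaze point could be done by converting the pixel offset between gaze and stimulus into a visual angle. This is a sketch only: the 10° boundary and the pixel-based geometry are assumptions, since the patent does not fix a numeric eccentricity.

```python
import math

# Illustrative boundary; the patent does not specify an angle, so
# 10 degrees of visual angle is an assumption for this sketch.
CENTRAL_FIELD_DEG = 10.0

def stimulus_region(gaze_xy, stimulus_xy, viewing_distance_px):
    """Classify a stimulus in the forward-camera frame as 'central'
    or 'peripheral' relative to the driver's measured gaze point.

    Positions are pixel coordinates in the captured image; the viewing
    distance (in the same pixel units) converts the pixel offset into
    an eccentricity in degrees of visual angle.
    """
    dx = stimulus_xy[0] - gaze_xy[0]
    dy = stimulus_xy[1] - gaze_xy[1]
    eccentricity_deg = math.degrees(
        math.atan2(math.hypot(dx, dy), viewing_distance_px))
    return "central" if eccentricity_deg <= CENTRAL_FIELD_DEG else "peripheral"
```

For example, with gaze at (640, 360) and a viewing distance of 1500 px, a stimulus at (650, 365) falls well inside the central field, while one at (1200, 100) is peripheral.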
11 brain wave measurement unit
12 peripheral stimulus generation unit
13 attention amount determination unit
14 output unit
15 imaging unit
16 peripheral stimulus detection unit
17 situation detection unit
18 line-of-sight measurement unit
Claims (15)
- a brain wave measurement unit that measures a driver's brain wave signal;
an attention amount determination unit that determines the driver's amount of attention to the driver's peripheral visual field region from the brain wave signal measured starting at the time a visual stimulus occurred in the peripheral visual field region; and
an output unit that alerts the driver by outputting a signal based on the determination result;
a driving attention amount determination apparatus comprising the above. - The driving attention amount determination apparatus of claim 1, wherein the attention amount determination unit determines the attention amount according to the amplitude of an event-related potential of the brain wave signal measured starting at the time the visual stimulus occurred.
- The driving attention amount determination apparatus of claim 2, wherein the attention amount determination unit determines that the attention amount is low when the amplitude of the P300 event-related potential, a positive component in the interval from 300 to 600 milliseconds after the occurrence of the visual stimulus, is smaller than a predetermined threshold.
- The driving attention amount determination apparatus of claim 3, wherein the output unit outputs the signal to the driver when the attention amount determination unit determines that the attention amount is low.
- The driving attention amount determination apparatus of claim 3, wherein the attention amount determination unit determines that the attention amount is high when the amplitude of the P300 component event-related potential, a positive component in the interval from 300 to 600 milliseconds after the occurrence of the visual stimulus, is larger than the predetermined threshold,
and the output unit does not output the signal to the driver when the attention amount is determined to be high. - The driving attention amount determination apparatus of claim 1, wherein the attention amount determination unit determines the attention amount according to a correlation coefficient between the brain wave signal measured starting at the time the visual stimulus occurred and a template held in advance.
- The driving attention amount determination apparatus of claim 1, wherein the output unit outputs at least one of a video signal for presenting characters or symbols on an information presentation screen and an audio signal for output from a speaker.
- The driving attention amount determination apparatus of claim 2, further comprising a peripheral stimulus generation unit that generates the visual stimulus within the driver's peripheral visual field region.
- The driving attention amount determination apparatus of claim 1, further comprising:
an imaging unit that captures video ahead of the vehicle driven by the driver; and
a peripheral stimulus detection unit that detects, from the captured video, the time at which the visual stimulus occurred within the peripheral visual field region,
wherein the attention amount determination unit receives information specifying the detected occurrence time of the visual stimulus from the peripheral stimulus detection unit. - The driving attention amount determination apparatus of claim 9, further comprising a line-of-sight measurement unit that measures the driver's line of sight,
wherein the peripheral stimulus detection unit detects whether the visual stimulus occurred within the peripheral visual field region according to the driver's line of sight at the time the visual stimulus occurred, as measured by the line-of-sight measurement unit, and the captured video. - The driving attention amount determination apparatus of claim 9, further comprising a situation detection unit that detects the speed of the vehicle or whether its headlamps are lit,
wherein the peripheral stimulus detection unit detects whether the visual stimulus is in the peripheral visual field region according to the detection result of the situation detection unit. - The driving attention amount determination apparatus of claim 9, wherein, when the time difference between the occurrence timings of visual stimuli detected in the driver's peripheral visual field region and central visual field region is equal to or less than a predetermined value,
the attention amount determination unit excludes from analysis the event-related potential data of the brain wave signal for the visual stimulus detected in the peripheral visual field region. - The driving attention amount determination apparatus of claim 8, wherein the peripheral stimulus generation unit generates the visual stimulus within the driver's peripheral visual field region at an occurrence timing that differs by a predetermined value or more from the occurrence timing of a visual stimulus generated in the driver's central visual field region.
- measuring a driver's brain wave signal;
determining the driver's amount of attention to the driver's peripheral visual field region from the brain wave signal measured starting at the time a visual stimulus occurred in the peripheral visual field region; and
alerting the driver by outputting a signal based on the determination result;
a method of determining a driving attention amount comprising the above steps. - A computer program executed by a computer,
the computer program causing the computer to execute:
a step of receiving a driver's brain wave signal;
a step of determining the driver's amount of attention to the driver's peripheral visual field region from the brain wave signal measured starting at the time a visual stimulus occurred in the peripheral visual field region; and
a step of outputting a signal based on the determination result,
thereby alerting the driver; a computer program for determining a driving attention amount.
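The template-correlation variant of the claims (determining attention from the correlation coefficient between the stimulus-locked brain wave signal and a pre-stored template) can be sketched as follows. The 0.5 decision threshold and the idealized Gaussian-shaped template are assumptions for illustration; the claims fix neither.

```python
import numpy as np

def attention_from_template(epoch: np.ndarray,
                            template: np.ndarray,
                            corr_threshold: float = 0.5) -> str:
    """Judge attention by correlating the stimulus-locked ERP epoch
    with a template waveform held in advance.

    The 0.5 threshold is an assumption; the claim only says the
    attention amount is determined "according to the correlation
    coefficient".
    """
    r = np.corrcoef(epoch, template)[0, 1]  # Pearson correlation
    return "high" if r >= corr_threshold else "low"

t = np.linspace(0.0, 1.0, 200)
template = np.exp(-((t - 0.4) ** 2) / 0.005)  # idealized P300-like bump
attentive = template * 6.0                     # scaled copy, perfectly correlated
```

A measured epoch that is a scaled copy of the template correlates perfectly and is judged "high"; an inverted or unrelated waveform falls below the threshold and is judged "low".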
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010510585A JP4625544B2 (ja) | 2008-08-05 | 2009-08-04 | 運転注意量判定装置、方法およびプログラム |
CN2009801193390A CN102047304B (zh) | 2008-08-05 | 2009-08-04 | 驾驶注意力程度判定装置、方法 |
EP09804731.9A EP2312551A4 (en) | 2008-08-05 | 2009-08-04 | DEVICE, METHOD AND PROGRAM FOR EVALUATING A DRIVER'S AWARENESS |
US12/718,326 US20100156617A1 (en) | 2008-08-05 | 2010-03-05 | Apparatus, method, and program of driving attention amount determination |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-201520 | 2008-08-05 | ||
JP2008201520 | 2008-08-05 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/718,326 Continuation US20100156617A1 (en) | 2008-08-05 | 2010-03-05 | Apparatus, method, and program of driving attention amount determination |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010016244A1 true WO2010016244A1 (ja) | 2010-02-11 |
Family
ID=41663472
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/003724 WO2010016244A1 (ja) | 2008-08-05 | 2009-08-04 | 運転注意量判定装置、方法およびプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20100156617A1 (ja) |
EP (1) | EP2312551A4 (ja) |
JP (1) | JP4625544B2 (ja) |
CN (1) | CN102047304B (ja) |
WO (1) | WO2010016244A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013115241A1 (ja) * | 2012-01-31 | 2013-08-08 | 株式会社デンソー | 車両の運転手の注意を喚起する装置及びその方法 |
JP2014191474A (ja) * | 2013-03-26 | 2014-10-06 | Fujitsu Ltd | 集中度判定プログラム、集中度判定装置、および集中度判定方法 |
KR101524526B1 (ko) * | 2013-11-29 | 2015-06-01 | 국립대학법인 울산과학기술대학교 산학협력단 | 네비게이션 정보 기반 차량 충돌 방지 시스템 및 방법 |
CN106571030A (zh) * | 2016-10-20 | 2017-04-19 | 西南交通大学 | 多源交通信息环境下排队长度预测方法 |
JP2017134826A (ja) * | 2016-01-28 | 2017-08-03 | ハーマン ベッカー オートモーティブ システムズ ゲーエムベーハー | 車両の外部音合成のためのシステム及び方法 |
JP2018016120A (ja) * | 2016-07-26 | 2018-02-01 | マツダ株式会社 | 視界制御装置 |
CN111511269A (zh) * | 2017-09-08 | 2020-08-07 | 国家科学研究中心 | 从脑电图信号解码个人的视觉注意 |
JPWO2020138012A1 (ja) * | 2018-12-27 | 2021-10-07 | 株式会社村田製作所 | 認知能力検出装置、および、認知能力検出システム |
WO2022210038A1 (ja) * | 2021-04-02 | 2022-10-06 | 株式会社Jvcケンウッド | 運転支援装置、運転支援方法、及び運転支援プログラム |
Families Citing this family (63)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8376595B2 (en) * | 2009-05-15 | 2013-02-19 | Magna Electronics, Inc. | Automatic headlamp control |
EP2401963B1 (en) * | 2009-10-15 | 2016-09-14 | Panasonic Intellectual Property Management Co., Ltd. | Driving attention amount determination device, method, and computer program |
US8552850B2 (en) * | 2010-02-17 | 2013-10-08 | Honeywell International Inc. | Near-to-eye tracking for adaptive operation |
CZ303192B6 (cs) * | 2010-07-12 | 2012-05-23 | Univerzita Karlova v Praze, Lékarská fakulta v Hradci Králové | Zrakový stimulátor |
WO2012063423A1 (ja) * | 2010-11-12 | 2012-05-18 | パナソニック株式会社 | 音圧評価システム、その方法およびそのプログラム |
US20120176235A1 (en) | 2011-01-11 | 2012-07-12 | International Business Machines Corporation | Mobile computing device emergency warning system and method |
US20120176232A1 (en) * | 2011-01-11 | 2012-07-12 | International Business Machines Corporation | Prevention of texting while operating a motor vehicle |
US8773251B2 (en) | 2011-02-10 | 2014-07-08 | Sitting Man, Llc | Methods, systems, and computer program products for managing operation of an automotive vehicle |
US8902054B2 (en) | 2011-02-10 | 2014-12-02 | Sitting Man, Llc | Methods, systems, and computer program products for managing operation of a portable electronic device |
US8666603B2 (en) | 2011-02-11 | 2014-03-04 | Sitting Man, Llc | Methods, systems, and computer program products for providing steering-control feedback to an operator of an automotive vehicle |
CN102119857B (zh) * | 2011-02-15 | 2012-09-19 | 陕西师范大学 | 基于匹配追踪算法的疲劳驾驶脑电检测系统及检测方法 |
US11145215B1 (en) | 2011-03-11 | 2021-10-12 | Sitting Man, Llc | Methods, systems, and computer program products for providing feedback to a user of a portable electronic in motion |
TWI474173B (zh) * | 2012-02-21 | 2015-02-21 | Hon Hai Prec Ind Co Ltd | 行走輔助系統及行走輔助方法 |
US20130246967A1 (en) * | 2012-03-15 | 2013-09-19 | Google Inc. | Head-Tracked User Interaction with Graphical Interface |
US9096920B1 (en) | 2012-03-22 | 2015-08-04 | Google Inc. | User interface method |
BR112014026072A2 (pt) * | 2012-04-24 | 2017-06-27 | Institucio Catalana De Recerca I Estudis Avancats | método para medir atenção |
US9251704B2 (en) * | 2012-05-29 | 2016-02-02 | GM Global Technology Operations LLC | Reducing driver distraction in spoken dialogue |
DE102012215397A1 (de) * | 2012-08-30 | 2014-03-06 | Robert Bosch Gmbh | Interaktive Aufmerksamkeitssteigerung |
ITTV20130025A1 (it) * | 2013-02-27 | 2014-08-28 | Giorgio Marcon | Sistema di sicurezza elettronico per molteplici funzioni. |
US9064420B2 (en) | 2013-03-14 | 2015-06-23 | Honda Motor Co., Ltd. | Augmented reality heads up display (HUD) for yield to pedestrian safety cues |
EP2977975B1 (en) * | 2013-03-22 | 2017-08-16 | Toyota Jidosha Kabushiki Kaisha | Driving assistance device |
EP2862741B1 (en) * | 2013-10-15 | 2017-06-28 | Volvo Car Corporation | Vehicle driver assist arrangement |
KR102113767B1 (ko) * | 2013-11-28 | 2020-05-21 | 현대모비스 주식회사 | 운전자 상태 감지 장치 및 그 방법 |
US9817474B2 (en) * | 2014-01-24 | 2017-11-14 | Tobii Ab | Gaze driven interaction for a vehicle |
JP6191573B2 (ja) * | 2014-09-29 | 2017-09-06 | マツダ株式会社 | 車両の視界調整装置 |
EP3009280B1 (en) | 2014-10-13 | 2017-04-19 | MY E.G. Services Berhad | Method and system for improving road safety |
US9747812B2 (en) | 2014-10-22 | 2017-08-29 | Honda Motor Co., Ltd. | Saliency based awareness modeling |
EP4173550A1 (en) * | 2015-03-16 | 2023-05-03 | Magic Leap, Inc. | Diagnosing and treating health ailments |
CN104757954A (zh) * | 2015-05-05 | 2015-07-08 | 奇瑞汽车股份有限公司 | 一种车用健康监测与舒适性调节系统及其监测、调节方法 |
DE102015219465A1 (de) * | 2015-10-08 | 2017-04-13 | Volkswagen Aktiengesellschaft | Verfahren und Vorrichtung zur Ermittlung der adaptiven Reaktionszeit des Fahrers eines Kraftfahrzeugs |
US9712736B2 (en) * | 2015-12-15 | 2017-07-18 | Intel Coprporation | Electroencephalography (EEG) camera control |
US9841813B2 (en) * | 2015-12-22 | 2017-12-12 | Delphi Technologies, Inc. | Automated vehicle human-machine interface system based on glance-direction |
CN105708480A (zh) * | 2016-01-26 | 2016-06-29 | 北京航空航天大学 | 基于检测反应任务的驾驶员注意力测试装置 |
US20170351330A1 (en) * | 2016-06-06 | 2017-12-07 | John C. Gordon | Communicating Information Via A Computer-Implemented Agent |
KR101816415B1 (ko) | 2016-06-21 | 2018-02-21 | 현대자동차주식회사 | 시선 추적을 이용한 운전자 집중도 감시 장치 및 방법 |
CN109416884B (zh) * | 2016-07-05 | 2021-02-19 | 三菱电机株式会社 | 识别区域推定装置、识别区域推定方法及识别区域推定程序 |
DE102016117440A1 (de) * | 2016-09-16 | 2018-03-22 | Dr. Ing. H.C. F. Porsche Aktiengesellschaft | Verfahren zur Korrektur eines Ladestands einer Ladestandsanzeige |
FR3065305B1 (fr) * | 2017-04-12 | 2020-08-28 | Valeo Vision | Systeme d'aide a la conduite comportemental |
US10279793B2 (en) | 2017-05-11 | 2019-05-07 | Honda Motor Co., Ltd. | Understanding driver awareness through brake behavior analysis |
CN107174262B (zh) * | 2017-05-27 | 2021-02-02 | 西南交通大学 | 注意力评测方法和系统 |
WO2019021471A1 (ja) * | 2017-07-28 | 2019-01-31 | 日産自動車株式会社 | 表示制御方法及び表示制御装置 |
CN110785334B (zh) * | 2017-08-02 | 2023-01-10 | 本田技研工业株式会社 | 车辆控制装置 |
DE102017213679A1 (de) * | 2017-08-07 | 2019-02-07 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren und Vorrichtung zur Fahrerzustandsbewertung sowie Fahrzeug |
CN107458382B (zh) * | 2017-08-22 | 2019-09-10 | 京东方科技集团股份有限公司 | 车辆控制装置、控制方法和平视显示装置 |
CN107944415A (zh) * | 2017-12-06 | 2018-04-20 | 董伟 | 一种基于深度学习算法的人眼注意力检测方法 |
US10952680B2 (en) * | 2017-12-27 | 2021-03-23 | X Development Llc | Electroencephalogram bioamplifier |
WO2019127079A1 (en) * | 2017-12-27 | 2019-07-04 | Bayerische Motoren Werke Aktiengesellschaft | Vehicle lane change prediction |
US11017249B2 (en) | 2018-01-29 | 2021-05-25 | Futurewei Technologies, Inc. | Primary preview region and gaze based driver distraction detection |
FR3077900B1 (fr) * | 2018-02-12 | 2020-01-17 | Thales | Vision peripherique dans une interface homme-machine |
CN108498094B (zh) * | 2018-03-29 | 2021-06-01 | Oppo广东移动通信有限公司 | 脑电波信息传输控制方法及相关产品 |
GB201817061D0 (en) | 2018-10-19 | 2018-12-05 | Sintef Tto As | Manufacturing assistance system |
CN110584657B (zh) * | 2019-03-15 | 2022-09-23 | 华为技术有限公司 | 一种注意力检测方法及系统 |
KR102685114B1 (ko) * | 2019-06-26 | 2024-07-12 | 현대자동차주식회사 | 오류 모니터링을 이용한 모빌리티 제어 방법 및 장치 |
CN112406727B (zh) * | 2019-08-23 | 2022-06-10 | 比亚迪股份有限公司 | 车辆及多屏系统的控制方法、装置 |
CN110910611A (zh) * | 2019-12-13 | 2020-03-24 | 上海擎感智能科技有限公司 | 提醒方法、系统、终端及车辆 |
CN111319634A (zh) * | 2020-03-12 | 2020-06-23 | 厦门中云创电子科技有限公司 | 一种汽车控制方法及系统 |
US20210315508A1 (en) * | 2020-04-14 | 2021-10-14 | Neurotype Inc. | Assessing Motivated Attention with Cue Reactivity |
JP6990274B1 (ja) * | 2020-06-29 | 2022-01-12 | 本田技研工業株式会社 | 注意喚起装置、移動体、注意喚起装置の制御方法 |
JP7359112B2 (ja) * | 2020-09-11 | 2023-10-11 | トヨタ自動車株式会社 | 注意能力検査装置および注意能力検査方法 |
US11535253B2 (en) * | 2020-09-18 | 2022-12-27 | GM Global Technology Operations LLC | Lane change maneuver intention detection systems and methods |
CN114043992A (zh) * | 2021-11-12 | 2022-02-15 | 东风柳州汽车有限公司 | 车辆控制方法、装置、设备及存储介质 |
US20240278788A1 (en) * | 2023-02-22 | 2024-08-22 | Woven By Toyota, Inc. | Systems and methods for triggering lights remotely to measure operator vigilance |
CN117137498B (zh) * | 2023-09-15 | 2024-06-21 | 北京理工大学 | 基于注意力定向和运动意图脑电的紧急状况检测方法 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09309358A (ja) * | 1996-05-23 | 1997-12-02 | Suzuki Motor Corp | 車間距離警報装置 |
JP2002127780A (ja) | 2000-08-15 | 2002-05-08 | Nissan Motor Co Ltd | 車両用警報装置 |
JP2004178367A (ja) | 2002-11-28 | 2004-06-24 | Toyota Central Res & Dev Lab Inc | 注意配分制御装置 |
JP2007038772A (ja) * | 2005-08-02 | 2007-02-15 | Matsushita Electric Ind Co Ltd | 速度制御装置 |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5813993A (en) * | 1996-04-05 | 1998-09-29 | Consolidated Research Of Richmond, Inc. | Alertness and drowsiness detection and tracking system |
US6167298A (en) * | 1998-01-08 | 2000-12-26 | Levin; Richard B. | Devices and methods for maintaining an alert state of consciousness through brain wave monitoring |
AU767533B2 (en) * | 1999-01-27 | 2003-11-13 | Compumedics Limited | Vigilance monitoring system |
US7546158B2 (en) * | 2003-06-05 | 2009-06-09 | The Regents Of The University Of California | Communication methods based on brain computer interfaces |
US20060258930A1 (en) * | 2004-05-18 | 2006-11-16 | Jianping Wu | Device for use in sleep stage determination using frontal electrodes |
JP4497305B2 (ja) * | 2004-12-08 | 2010-07-07 | 株式会社デンソー | 運転者状態判定装置 |
JP4887980B2 (ja) * | 2005-11-09 | 2012-02-29 | 日産自動車株式会社 | 車両用運転操作補助装置および車両用運転操作補助装置を備えた車両 |
JP4064446B2 (ja) * | 2005-12-09 | 2008-03-19 | 松下電器産業株式会社 | 情報処理システム、情報処理装置および方法 |
JP5171629B2 (ja) * | 2006-09-04 | 2013-03-27 | パナソニック株式会社 | 走行情報提供装置 |
US7710248B2 (en) * | 2007-06-12 | 2010-05-04 | Palo Alto Research Center Incorporated | Human-machine-interface (HMI) customization based on collision assessments |
JP4480755B2 (ja) * | 2007-12-04 | 2010-06-16 | カルソニックカンセイ株式会社 | 車両用ヘッドアップディスプレイ装置 |
TWI446297B (zh) * | 2007-12-28 | 2014-07-21 | 私立中原大學 | 睡意辨識系統 |
JP5127576B2 (ja) * | 2008-06-11 | 2013-01-23 | ヤマハ発動機株式会社 | 精神作業負荷検出装置及びそれを備えた自動二輪車 |
-
2009
- 2009-08-04 JP JP2010510585A patent/JP4625544B2/ja not_active Expired - Fee Related
- 2009-08-04 EP EP09804731.9A patent/EP2312551A4/en not_active Withdrawn
- 2009-08-04 CN CN2009801193390A patent/CN102047304B/zh not_active Expired - Fee Related
- 2009-08-04 WO PCT/JP2009/003724 patent/WO2010016244A1/ja active Application Filing
-
2010
- 2010-03-05 US US12/718,326 patent/US20100156617A1/en not_active Abandoned
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09309358A (ja) * | 1996-05-23 | 1997-12-02 | Suzuki Motor Corp | 車間距離警報装置 |
JP2002127780A (ja) | 2000-08-15 | 2002-05-08 | Nissan Motor Co Ltd | 車両用警報装置 |
JP2004178367A (ja) | 2002-11-28 | 2004-06-24 | Toyota Central Res & Dev Lab Inc | 注意配分制御装置 |
JP2007038772A (ja) * | 2005-08-02 | 2007-02-15 | Matsushita Electric Ind Co Ltd | 速度制御装置 |
Non-Patent Citations (4)
Title |
---|
EBE ET AL.: "Technique for Measuring Driver's Attention Level by Using Event-Related Potentials", AUTOMOTIVE TECHNOLOGIES, vol. 58, no. 7, 2004, pages 91 - 96 |
See also references of EP2312551A4 |
YO MIYATA ET AL.: "New Physiopsychology", 1998, KITAOJI SHOBO, pages: 110 |
YO MIYATA ET AL.: "New Physiopsychology", 1998, KITAOJI SHOBO, pages: 119 |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013115241A1 (ja) * | 2012-01-31 | 2013-08-08 | 株式会社デンソー | 車両の運転手の注意を喚起する装置及びその方法 |
JP2014191474A (ja) * | 2013-03-26 | 2014-10-06 | Fujitsu Ltd | 集中度判定プログラム、集中度判定装置、および集中度判定方法 |
KR101524526B1 (ko) * | 2013-11-29 | 2015-06-01 | 국립대학법인 울산과학기술대학교 산학협력단 | 네비게이션 정보 기반 차량 충돌 방지 시스템 및 방법 |
JP7066318B2 (ja) | 2016-01-28 | 2022-05-13 | ハーマン ベッカー オートモーティブ システムズ ゲーエムベーハー | 車両の外部音合成のためのシステム及び方法 |
JP2017134826A (ja) * | 2016-01-28 | 2017-08-03 | ハーマン ベッカー オートモーティブ システムズ ゲーエムベーハー | 車両の外部音合成のためのシステム及び方法 |
JP2018016120A (ja) * | 2016-07-26 | 2018-02-01 | マツダ株式会社 | 視界制御装置 |
CN106571030A (zh) * | 2016-10-20 | 2017-04-19 | 西南交通大学 | 多源交通信息环境下排队长度预测方法 |
CN106571030B (zh) * | 2016-10-20 | 2020-06-02 | 西南交通大学 | 多源交通信息环境下排队长度预测方法 |
CN111511269A (zh) * | 2017-09-08 | 2020-08-07 | 国家科学研究中心 | 从脑电图信号解码个人的视觉注意 |
CN111511269B (zh) * | 2017-09-08 | 2023-06-06 | 国家科学研究中心 | 从脑电图信号解码个人的视觉注意 |
US11717204B2 (en) | 2017-09-08 | 2023-08-08 | Nextmind Sas | Decoding the visual attention of an individual from electroencephalographic signals |
JPWO2020138012A1 (ja) * | 2018-12-27 | 2021-10-07 | 株式会社村田製作所 | 認知能力検出装置、および、認知能力検出システム |
JP7276354B2 (ja) | 2018-12-27 | 2023-05-18 | 株式会社村田製作所 | 認知能力検出装置、および、認知能力検出システム |
WO2022210038A1 (ja) * | 2021-04-02 | 2022-10-06 | 株式会社Jvcケンウッド | 運転支援装置、運転支援方法、及び運転支援プログラム |
Also Published As
Publication number | Publication date |
---|---|
JPWO2010016244A1 (ja) | 2012-01-19 |
CN102047304A (zh) | 2011-05-04 |
JP4625544B2 (ja) | 2011-02-02 |
EP2312551A1 (en) | 2011-04-20 |
EP2312551A4 (en) | 2014-10-15 |
US20100156617A1 (en) | 2010-06-24 |
CN102047304B (zh) | 2013-04-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4625544B2 (ja) | 運転注意量判定装置、方法およびプログラム | |
JP4733242B2 (ja) | 運転注意量判別装置、方法、および、コンピュータプログラム | |
JP4772935B2 (ja) | 注意状態判定装置、方法およびプログラム | |
JP4353162B2 (ja) | 車輌周囲情報表示装置 | |
US9460601B2 (en) | Driver distraction and drowsiness warning and sleepiness reduction for accident avoidance | |
JP2011180873A (ja) | 運転支援装置、及び運転支援方法 | |
JP4500369B2 (ja) | 注意散漫検出装置、注意散漫検出方法およびコンピュータプログラム | |
JP5923180B2 (ja) | 生体情報計測装置及びそれを用いた入力装置 | |
JP5570386B2 (ja) | 注意状態判別システム、方法、コンピュータプログラムおよび注意状態判別装置 | |
EP1723901A1 (en) | Vehicle operator monitoring system and method | |
WO2015019542A1 (ja) | 視野算出装置および視野算出方法 | |
JP2012173803A (ja) | 安全運転支援装置及び安全運転支援方法 | |
KR101999211B1 (ko) | 뇌파를 이용한 운전자 상태 검출 장치 및 그 방법 | |
JP2012085746A (ja) | 注意状態判別システム、方法、コンピュータプログラムおよび注意状態判別装置 | |
US20210282687A1 (en) | Cognitive ability detection apparatus and cognitive ability detection system | |
JP2018013811A (ja) | ドライバ状態判定装置、及びドライバ状態判定プログラム | |
WO2008020458A2 (en) | A method and system to detect drowsy state of driver | |
JP2011086125A (ja) | 視認検出装置 | |
JP2011206072A (ja) | 有効視野測定システムおよび有効視野測定方法 | |
US20200027235A1 (en) | Device for monitoring the viewing direction of a person | |
KR20190133533A (ko) | 졸음 인식 장치를 이용한 졸음 인식 방법 | |
Mohan et al. | Eye Gaze Estimation Invisible and IR Spectrum for Driver Monitoring System | |
JP2023131010A (ja) | 注意喚起装置 | |
KR20200006145A (ko) | 졸음 인식 장치 및 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980119339.0 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2010510585 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09804731 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009804731 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |