WO2019117165A1 - Pupil detection device and detection system - Google Patents
Pupil detection device and detection system
- Publication number
- WO2019117165A1 (PCT/JP2018/045555; JP2018045555W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pupil
- movement
- imaging
- control unit
- detection device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/18—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/04—Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/215—Motion-based segmentation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/262—Analysis of motion using transform domain methods, e.g. Fourier domain methods
Definitions
- the present disclosure relates to a pupil detection device that detects movement of a pupil, and a detection system including the pupil detection device.
- Patent Document 1 discloses an eye movement inspection apparatus that estimates the eye movement and the head movement of a subject from captured images.
- In this apparatus, the subject wears a predetermined contact lens.
- A retroreflective material formed on the contact lens provides a predetermined reflection pattern.
- Light of a predetermined wavelength is emitted toward the subject by a light source, and imaging is performed by a camera.
- Based on the captured images, a computer estimates the subject's head movement along each axis of the world coordinate system and estimates the subject's eye movement along each axis of the head coordinate system.
- Patent Document 2 discloses an eye area detection program that uses image processing technology to extract the eye area of a subject at high speed from image data obtained by photographing the subject.
- The eye area detection program differentiates the density values of the original image containing the subject's face in the vertical direction of the image, integrates the differential values in the horizontal direction to generate a histogram, and extracts the eye region from the original image using the maximum or local maximum values of the histogram.
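- The histogram-based extraction described for Patent Document 2 can be sketched as follows. This is only an illustrative reading of the above description, not the patented implementation, and all function and variable names are ours.

```python
import numpy as np

def eye_row_candidates(gray: np.ndarray, top_k: int = 3) -> np.ndarray:
    """Illustrative sketch of the histogram approach described above.

    gray: 2-D array of pixel density (intensity) values of the face image.
    Returns the row indices whose vertical-gradient energy is largest,
    i.e. candidate rows for the eye region.
    """
    # Differentiate the density values in the vertical direction of the image.
    vertical_diff = np.abs(np.diff(gray.astype(np.float64), axis=0))
    # Integrate (sum) the differential values in the horizontal direction
    # to obtain a 1-D histogram over image rows.
    histogram = vertical_diff.sum(axis=1)
    # The maximum (or local maxima) of the histogram indicate rows with
    # strong horizontal edges, such as the eyes.
    return np.argsort(histogram)[::-1][:top_k]
```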
- Patent Documents 3 and 4 also disclose an image processing technique for detecting an eye part, a pupil part and the like from an image captured by a camera.
- Patent Document 5 discloses a state estimation device that estimates a state of a monitoring target person based on eye movement, using a driver of a vehicle as a monitoring target person.
- Patent Document 6 discloses a system for estimating the attentiveness of a person to be monitored in a state estimation device.
- Patent Document 7 proposes using measurement of involuntary eye movement during fixation when estimating the position within the user's field of view to which the user is paying attention.
- Patent documents: Japanese Patent No. 5276381; Japanese Patent No. 596458; Japanese Patent No. 5995217; Japanese Unexamined Patent Publication No. 2017-23519; JP 2015-207163 A; JP 2013-240469 A
- An object of the present disclosure is to provide a pupil detection device and a detection system that can facilitate detection of specific pupil movement.
- the pupil detection device detects the movement of the pupil based on the imaging result of the imaging device that captures an image including the pupil.
- the pupil detection device includes an acquisition unit and a control unit.
- the acquisition unit acquires imaging data indicating an imaging result from the imaging device.
- the control unit executes data processing based on the imaging data acquired by the acquisition unit.
- the control unit extracts information indicating the movement of the pupil based on the imaging data, performs time-frequency analysis on the extracted information, and detects the movement of the pupil.
- Time-frequency analysis is a numerical analysis that decomposes the waveform to be analyzed into its frequency components along the time direction, and is performed by, for example, a Fourier transform.
- a detection device includes an imaging device and a pupil detection device.
- the imaging device captures an image including a pupil and generates imaging data.
- the pupil detection device detects the movement of the pupil based on the imaging data.
- the pupil detection device and detection system of the present disclosure can facilitate detection of particular pupil movements.
- FIG. 6 illustrates image processing in the microsaccade detection process of the first embodiment.
- A diagram for explaining an imaging device according to the second embodiment.
- A schematic view of various images in the microsaccade detection process of the second embodiment.
- A flowchart exemplifying the microsaccade detection process in the second embodiment.
- A scatter plot illustrating the distribution of events in the luminance change data when a microsaccade occurs.
- FIG. 1 is a diagram for describing an application example of a detection system 1 according to the present disclosure.
- FIG. 1 shows an application example of the detection system 1 to in-vehicle applications.
- The detection system 1 detects, for example, an awakening degree indicating the awakening state of the driver 3 driving the vehicle 2, and issues a notification for preventing drowsy driving when the awakening degree decreases.
- the detection system 1 may perform various controls for driving assistance of the vehicle 2 based on the detection results of the awakening degree and the like.
- the detection system 1 includes an imaging device 4 and a pupil detection device 5 as shown in FIG.
- the imaging device 4 is disposed, for example, in the vehicle 2 and captures an image in the vicinity of the eye 30 of the driver 3.
- the driver 3 is an example of a subject to be detected by the detection system 1.
- the pupil detection device 5 is a device that detects the movement of the pupil according to the eye movement of the subject based on the imaging result of the imaging device 4.
- the pupil detection device 5 is connected to, for example, a notification unit 21 such as an audio output device provided in the vehicle 2.
- the pupil detection device 5 may be connected to various ECUs (electronic control units) or the like of the vehicle 2.
- Microsaccades are high-speed eye movements that occur instantaneously among the fine eye movements classified as involuntary (fixational) eye movement.
- Head movement and eye movement can occur simultaneously; for example, a microsaccade may occur while the driver 3 is moving his or her head.
- Conventionally, head movement and eye movement were each measured separately, and complicated processing of analyzing the correlation between them was performed in order to detect the target eye movement.
- In contrast, the detection system 1 does not need to measure head movement separately even in such a situation, and the pupil detection device 5 can detect the target eye movement such as a microsaccade.
- To this end, information indicating the movement of the pupil is extracted from the imaging result, and frequency analysis is performed in the time direction.
- In the present embodiment, the imaging device 4 is a high-speed camera having a high frame rate, such as 500 fps.
- The frame rate of the imaging device 4 is set to at least twice the peak frequency of the eye movement to be detected, taking into account a detection target such as a microsaccade.
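- As an illustrative check of this sampling condition (the 250 Hz value is an assumption taken from the upper end of the specific frequency band described later for step S14):

$$ f_{\mathrm{frame}} \ge 2\, f_{\mathrm{peak}}, \qquad f_{\mathrm{peak}} \approx 250\ \mathrm{Hz} \;\Rightarrow\; f_{\mathrm{frame}} \ge 500\ \mathrm{fps}, $$

- which is consistent with the 500 fps example given above.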
- the imaging device 4 of the present embodiment generates image data indicating a frame image, and outputs the generated image data to the pupil detection device 5.
- the image data generated by the imaging device 4 is an example of imaging data in the present embodiment.
- the imaging device 4 may include an illumination light source for imaging.
- the illumination light source emits, for example, infrared light as illumination light.
- the illumination light source is arranged to irradiate the subject such as the driver 3 with illumination light.
- the illumination light source may be configured separately from the imaging device 4 and can be incorporated into the detection system 1 as appropriate.
- FIG. 2 is a diagram illustrating the configuration of the pupil detection device 5.
- the pupil detection device 5 includes an acquisition unit 51 and a control unit 52 as shown in FIG.
- the pupil detection device 5 also includes, for example, a storage unit 53 and an output unit 54.
- the acquisition unit 51 is an input interface circuit that inputs data from the outside of the pupil detection device 5.
- the acquisition unit 51 is connected to the imaging device 4 in accordance with a predetermined communication standard such as USB, for example.
- the acquisition unit 51 acquires imaging data and the like indicating an imaging result from the imaging device 4.
- the control unit 52 includes a CPU, a RAM, a ROM and the like, and controls each component according to information processing. For example, based on the data acquired by the acquisition unit 51, the control unit 52 executes data processing according to a predetermined program.
- the predetermined program is a program for realizing various operations of the pupil detection device 5 to be described later.
- the control unit 52 may hold various data such as imaging data from the imaging device 4 in an internal memory such as a RAM.
- the storage unit 53 is, for example, an auxiliary storage device such as a hard disk drive or a solid state drive.
- the storage unit 53 stores programs executed by the control unit 52, various data, and the like.
- the pupil detection device 5 may obtain the above-described program and the like from a portable storage medium.
- The storage medium is a medium that accumulates information such as a program by an electric, magnetic, optical, mechanical, or chemical action so that the information can be read by a computer or other device or machine.
- the output unit 54 is an output interface circuit that outputs data from the pupil detection device 5 to the outside.
- the output unit 54 is connected to, for example, the notification unit 21 of the vehicle 2 (FIG. 1) or various ECUs.
- The output unit 54 and the acquisition unit 51 may be configured integrally as a single input/output interface.
- FIG. 3 is a flowchart illustrating the operation of the pupil detection device 5 in the detection system 1.
- Each process of the flowchart shown in FIG. 3 is executed by the control unit 52 of the pupil detection device 5.
- This flowchart is started in a state where the imaging device 4 captures an image in the vicinity of the eye 30 of the driver 3 while driving the vehicle 2 shown in FIG. 1, for example.
- the control unit 52 of the pupil detection device 5 executes microsaccade detection processing (S1).
- the microsaccade detection process is an example of data processing for detecting a microsaccade in the movement of the pupil 6 based on the imaging result of the imaging device 4. Details of the micro saccade detection process will be described later.
- Next, the control unit 52 obtains the occurrence frequency of microsaccades from the detection result of the microsaccade detection process and determines whether the current occurrence frequency corresponds to a condition indicating a decrease in the awakening degree (S2).
- The decrease condition used in step S2 is set appropriately in accordance with the assumed correlation between the occurrence frequency of microsaccades and the awakening degree. For example, the control unit 52 determines that the decrease condition is met when the current occurrence frequency of microsaccades is higher than a predetermined reference value (S2).
- When the control unit 52 determines that the current microsaccade occurrence frequency does not correspond to the awakening degree decrease condition (NO in S2), it executes the process of step S1 again at a predetermined cycle.
- the cycle of repeating step S1 can be set as appropriate, and is, for example, one second.
- When the control unit 52 determines that the current microsaccade occurrence frequency corresponds to the awakening degree decrease condition (YES in S2), it generates notification information indicating that the awakening degree has decreased and outputs the notification information to the notification unit 21 and the like via the output unit 54 (FIG. 2) (S3).
- the control unit 52 outputs the notification information (S3), and ends the process according to the flowchart of FIG.
- As described above, the pupil detection device 5 can detect the occurrence frequency of microsaccades of a subject such as the driver 3 and thereby detect a decrease in the awakening degree.
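- A minimal sketch of this monitoring loop (FIG. 3) is given below; the reference rate, window length, and all function names are assumptions, not values from the description.

```python
AROUSAL_REFERENCE_RATE = 1.5   # assumed reference value in microsaccades per second (not from the source)
CHECK_PERIOD_S = 1.0           # cycle of step S1; the description gives one second as an example

def monitor_awakening(detect_microsaccade, notify, occurrence_window=30):
    """Hedged sketch of the flow of FIG. 3 (S1-S3); all names here are illustrative.

    detect_microsaccade(): assumed to run the microsaccade detection process (S1)
        over one CHECK_PERIOD_S of imaging data and return True if a microsaccade occurred.
    notify(message): assumed to pass notification information to the notification unit 21 (S3).
    occurrence_window: number of recent periods used to estimate the occurrence frequency.
    """
    history = []
    while True:
        occurred = detect_microsaccade()                        # S1: microsaccade detection process
        history = (history + [occurred])[-occurrence_window:]
        rate = sum(history) / (len(history) * CHECK_PERIOD_S)   # current occurrence frequency (per second)
        # S2: decrease condition - occurrence frequency higher than the reference value
        if len(history) == occurrence_window and rate > AROUSAL_REFERENCE_RATE:
            notify("awakening degree decreased")                # S3: output notification information
            return
```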
- FIG. 4 is a flowchart illustrating the microsaccade detection process (S1 in FIG. 3) in the present embodiment.
- FIG. 5 is a diagram illustrating image processing (S12) in the microsaccade detection process of the present embodiment.
- FIG. 6 is a diagram for explaining time frequency analysis (S13, S14) in the microsaccade detection process.
- the control unit 52 of the pupil detection device 5 acquires image data for a predetermined period from the imaging device 4 via the acquisition unit 51 (S11).
- the predetermined period of step S11 can be appropriately set in consideration of the number of frames required for frequency analysis. For example, the predetermined period is set to one second.
- Next, the control unit 52 performs image processing on the image of each frame in the acquired image data and extracts the temporal change (i.e., time evolution) of the pupil position (S12).
- FIG. 5A illustrates an image 60 of one frame in image data from the imaging device 4.
- the pupil 61 is included in the image 60.
- FIG. 5 (b) illustrates the binarized image 62 of the image 60 of FIG. 5 (a).
- Hereinafter, the horizontal direction on the imaging surface of the imaging device 4 is referred to as the X direction, and the vertical direction as the Y direction.
- In step S12, the control unit 52 performs, for example, binarization processing on the luminance of each pixel in the image 60 of one frame, as shown in FIG. 5(a).
- Specifically, the control unit 52 assigns a white gradation to pixels whose luminance is less than a predetermined threshold in the original image 60 (FIG. 5(a)) and a black gradation to pixels whose luminance is equal to or higher than the threshold, thereby generating the binarized image 62 (FIG. 5(b)).
- the threshold value is set to a value larger than the luminance assumed for the pupil 61 in the image 60.
- the area 63 corresponding to the pupil 61 can be easily determined.
- the control unit 52 calculates the central position 64 of the area 63 determined by the binarization processing as the pupil position of the pupil 61 at the time of the corresponding frame.
- the control unit 52 performs the above processing on the image of each frame to generate time-series data of pupil positions. Thereby, the temporal change of the pupil position is extracted.
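- A minimal sketch of this per-frame processing (S12), assuming 8-bit grayscale frames and an illustrative threshold value:

```python
import numpy as np

PUPIL_LUMINANCE_THRESHOLD = 60   # assumed 8-bit threshold, set larger than the pupil luminance

def pupil_position(frame: np.ndarray) -> tuple[float, float]:
    """Sketch of step S12 for one frame: binarize luminance and take the
    centre of the pupil region as the pupil position (X, Y)."""
    # Pixels darker than the threshold (the pupil) become 1, all others become 0,
    # corresponding to the binarized image 62.
    binary = (frame < PUPIL_LUMINANCE_THRESHOLD).astype(np.uint8)
    ys, xs = np.nonzero(binary)
    if xs.size == 0:
        raise ValueError("no pupil region found in this frame")
    # Central position 64 of the region 63, used as the pupil position for this frame.
    return float(xs.mean()), float(ys.mean())

def pupil_position_series(frames: list[np.ndarray]) -> np.ndarray:
    """Time-series data of pupil positions over the acquired frames."""
    return np.array([pupil_position(f) for f in frames])
```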
- FIG. 6A illustrates temporal changes in pupil position. In the graph illustrated in FIG. 6A, the horizontal axis indicates time (ms), and the vertical axis indicates the X coordinate of the pupil position.
- the control unit 52 performs frequency decomposition in the time direction on the temporal change of the extracted pupil position (S13). For example, the control unit 52 performs FFT (Fast Fourier Transform) on the generated time-series data to calculate a frequency spectrum of the time-series data.
- FIG. 6 (b) is a graph showing the frequency spectrum for one second in FIG. 6 (a). In the graph of FIG. 6 (b), the horizontal axis indicates frequency (Hz), and the vertical axis indicates intensity of the frequency component.
- Next, the control unit 52 analyzes the frequency spectrum obtained by the frequency decomposition, as shown in FIG. 6(b), to determine whether pupil movement has occurred in a specific frequency band corresponding to the microsaccade (S14).
- The specific frequency band is set, for example, to a range that includes the peak frequency of the microsaccade, such as 220 to 250 Hz, and excludes frequencies expected to occur significantly due to head movement.
- In step S14, for example, the control unit 52 determines whether the frequency component included in the specific frequency band of the frequency spectrum is equal to or higher than a predetermined threshold.
- The threshold is set in consideration of the strength of the frequency component expected when a microsaccade occurs. If the maximum component within the specific frequency band is equal to or higher than the threshold, the control unit 52 determines that pupil movement in the specific frequency band, that is, a microsaccade, has occurred (YES in S14).
- When the control unit 52 determines that pupil movement in the specific frequency band has occurred (YES in S14), it records the microsaccade occurrence time in the storage unit 53 (S15). For example, the control unit 52 records the time at which the image data was acquired in step S11 as the microsaccade occurrence time.
- When the control unit 52 determines that pupil movement in the specific frequency band has not occurred (NO in S14), the microsaccade detection process (S1 in FIG. 3) ends without performing step S15, and the process proceeds to step S2 in FIG. 3.
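- The frequency decomposition and band check of steps S13-S14 might look like the following sketch; the frame rate and frequency band are the example values from this description, while the spectrum threshold is an assumption.

```python
import numpy as np

FRAME_RATE_HZ = 500.0          # example frame rate of the imaging device 4 given above
BAND_HZ = (220.0, 250.0)       # example specific frequency band for microsaccades given above
COMPONENT_THRESHOLD = 5.0      # assumed spectrum-intensity threshold, not from the source

def microsaccade_in_band(pupil_x: np.ndarray) -> bool:
    """Sketch of steps S13-S14: frequency-decompose a one-second pupil-position
    trace and check whether the component in the specific band exceeds a threshold."""
    x = pupil_x - pupil_x.mean()                      # remove the DC offset before the FFT
    spectrum = np.abs(np.fft.rfft(x))                 # S13: frequency decomposition (FFT)
    freqs = np.fft.rfftfreq(x.size, d=1.0 / FRAME_RATE_HZ)
    in_band = (freqs >= BAND_HZ[0]) & (freqs <= BAND_HZ[1])
    # S14: YES if the maximum component inside the specific frequency band
    # is equal to or higher than the predetermined threshold.
    return bool(in_band.any() and spectrum[in_band].max() >= COMPONENT_THRESHOLD)
```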
- FIG. 6A shows an example in which a microsaccade occurs at the timing indicated by the arrow A1 while the X coordinate of the position of the pupil gradually changes due to head movement.
- the pupil detection device 5 of this embodiment executes the microsaccade detection process every predetermined period (S11) to detect whether or not the microsaccade has occurred during the period to be processed.
- In the example of FIG. 6(a), the control unit 52 of the pupil detection device 5 extracts the temporal change of the pupil position over one second (S12) and calculates the frequency spectrum of that period as shown in FIG. 6(b) (S13).
- the frequency spectrum of the example of FIG. 6B includes a first peak value P1 corresponding to head movement and a second peak value P2 corresponding to microsaccade.
- The control unit 52 of the pupil detection device 5 detects that a microsaccade has occurred in the processed one-second period based on, for example, the second peak value P2, which is the maximum within the specific frequency band B1 (S14, S15). In this way, whether or not a microsaccade has occurred can be detected without interference from head movement, even under conditions where head movement occurs at the same time.
- the pupil detection device 5 detects the movement of the pupil based on the imaging result of the imaging device 4 for imaging an image including the pupil.
- the pupil detection device 5 includes an acquisition unit 51 and a control unit 52.
- the acquisition unit 51 acquires image data as imaging data indicating an imaging result from the imaging device 4.
- the control unit 52 executes data processing based on the image data acquired by the acquisition unit 51.
- the control unit 52 extracts a time change of the pupil position as information indicating the movement of the pupil based on the image data (S12).
- the control unit 52 performs time-frequency analysis on the extracted information to detect the movement of the pupil (S13, S14).
- According to the pupil detection device 5, by extracting information indicating the movement of the pupil and performing time-frequency analysis, specific pupil movement corresponding to eye movement in a frequency band different from that of head movement can be detected easily.
- the imaging data of the imaging device 4 is image data indicating the image 60 for each frame.
- the control unit 52 extracts a temporal change of the position of the pupil 61 in the image 60 for each frame (S12).
- In the present embodiment, the control unit 52 performs binarization processing on luminance on the image of each frame to determine the area 63 corresponding to the pupil 61 (see FIG. 5).
- The control unit 52 extracts the temporal change of the position 64 of the pupil 61 based on the area 63 determined for each frame (S12). Since the luminance of the pupil 61 in the image is considered to be remarkably low, the position of the pupil 61 can be obtained accurately by the binarization processing.
- the detection system 1 includes an imaging device 4 and a pupil detection device 5.
- the imaging device 4 captures an image including a pupil and generates imaging data.
- the pupil detection device 5 detects the movement of the pupil based on the imaging data.
- The detection system 1 can easily detect specific pupil movement corresponding to a microsaccade or the like by means of the pupil detection device 5, and can detect the awakening degree or the like of the subject based on the detection result.
- Second Embodiment: In the second embodiment, a configuration example of the detection system 1 using a differential-detection type imaging device 4 that detects changes in luminance will be described.
- FIG. 7 is a diagram for explaining the imaging device 4 according to the present embodiment.
- FIG. 7 schematically shows an imaging surface 4 a of the imaging device 4 of the present embodiment and output data of each pixel 40 on the imaging surface 4 a.
- the detection system 1 of the present embodiment employs, as the imaging device 4, a differential detection type camera device that detects a temporal change rate of luminance for each pixel 40 in the imaging surface 4 a.
- As the configuration of the imaging device 4 of the present embodiment, a known configuration such as a dynamic vision sensor (DVS) can be applied (see, for example, Non-Patent Document 1).
- The imaging device 4 includes an imaging element having an imaging surface 4a on which the pixels 40 are arranged in a two-dimensional array, an optical lens for condensing light onto the imaging surface 4a, and a data output circuit that outputs the imaging data (luminance change data D1).
- Each pixel 40 of the imaging device includes, for example, a photodiode, a differentiator, a comparator, and the like.
- Each pixel 40 can be identified by XY coordinates on the imaging surface 4a.
- An image is formed on the imaging surface 4a of the imaging element by the light from the optical lens.
- the imaging device 4 of the present embodiment performs threshold determination of the temporal change rate of luminance for each pixel 40 on the image on the imaging surface 4 a, and detects an event in which the luminance for each pixel 40 has changed.
- the event to be detected is an event in which a change in brightness occurs at the position of the pixel 40, and is specified by whether the change in brightness increases or decreases, the timing when the brightness increases or decreases, and the XY coordinate of the pixel 40.
- the imaging device 4 generates luminance change data D1 indicating the detection result of each event.
- the luminance change data D1 illustrated in FIG. 7 includes output data D11 when the luminance is increased and output data D12 when the luminance is reduced.
- the open squares represent output data when the luminance is increased.
- the black squares represent output data at the time of decrease in luminance.
- the luminance change data D1 is an example of imaging data of an imaging result by the imaging device 4 of the present embodiment.
- each pixel 40 detects an increase in luminance when the temporal change rate of luminance exceeds the upper threshold, and detects a decrease in luminance when the temporal change rate falls below the lower threshold.
- each pixel 40 detects an increase or a decrease in luminance, it immediately outputs a detection result at the detected timing.
- the pixel 40 whose luminance has not changed and the pixel 40 whose temporal change rate of luminance is between the upper threshold and the lower threshold do not particularly output the detection result.
- the imaging device 4 generates, for example, the luminance change data D1 in association with the XY coordinates of the pixel 40 which has output the detection result, the timing of the detection result, and information indicating increase or decrease of the luminance.
- the imaging device 4 externally outputs the luminance change data D1 as, for example, time-series data in which the time width is set to a predetermined value.
- the imaging device 4 of the present embodiment can detect a change in luminance in a time width much shorter than the frame period.
- The detectable time width is determined, for example, by the reaction period in which the pixel 40 responds to a change in luminance, and is, for example, 100 microseconds.
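- For illustration only, the luminance change data D1 from such a differential-detection sensor could be represented as a stream of event records like the following; the field names and sample values are assumptions, not the sensor's actual output format.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LuminanceChangeEvent:
    """One event of the luminance change data D1 (field names are illustrative)."""
    t_us: int        # timing of the detection result, in microseconds
    x: int           # X coordinate of the pixel 40 on the imaging surface 4a
    y: int           # Y coordinate of the pixel 40
    increase: bool   # True for a luminance increase (open square), False for a decrease (black square)

# A short, fabricated-for-illustration slice of event data: two nearby pixels
# report an increase and a decrease about 100 microseconds apart.
example_d1: List[LuminanceChangeEvent] = [
    LuminanceChangeEvent(t_us=1000, x=152, y=118, increase=True),
    LuminanceChangeEvent(t_us=1100, x=149, y=120, increase=False),
]
```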
- the pupil detection device 5 executes the microsaccade detection process (S1 in FIG. 3) using the luminance change data D1 by the imaging device 4.
- the outline of the microsaccade detection process according to the present embodiment will be described with reference to FIG.
- FIG. 8 is a schematic view of various images in the microsaccade detection process of the present embodiment.
- FIGS. 8 (a) and 8 (b) illustrate an image including the pupil 6 before and after the occurrence of the microsaccade.
- FIG. 8C illustrates the luminance change image 7 corresponding to the imaging result between FIGS. 8A and 8B by the imaging device 4 (FIG. 1).
- The images of FIGS. 8(a) and 8(b) are images formed on the imaging surface 4a (FIG. 7) of the imaging device 4 before and after the occurrence of the microsaccade, respectively.
- The pupil 6 in these images moves in the moving direction d1 due to the microsaccade, from its position at the time of FIG. 8(a) to its position at the time of FIG. 8(b).
- the imaging device 4 outputs a change in luminance for each pixel as luminance change data D1 as needed between FIGS. 8A and 8B (FIG. 7).
- FIG. 8(c) illustrates the luminance change image 7 corresponding to the luminance change data D1 obtained between FIGS. 8(a) and 8(b). Since the luminance of the area where the pupil 6 is located is significantly lower than that of the surrounding area, a movement of the pupil 6 such as a microsaccade is considered to cause a characteristic change in luminance in the luminance change image 7.
- In the luminance change image 7, the luminance changes in the area 71 on the starting side and in the area 72 on the ending side of the moving direction d1 of the pupil 6.
- In the area 71, the pupil 6 was located at the time of FIG. 8(a) but is no longer located there at the time of FIG. 8(b), so the luminance increases.
- In the area 72, the pupil 6 was not located at the time of FIG. 8(a) but comes to be located there at the time of FIG. 8(b), so the luminance decreases.
- The microsaccade detection process of the present embodiment realizes automatic detection of microsaccades, which may occur at any time, by the simple process of counting such changes in luminance.
- the micro saccade detection process according to the present embodiment will be described in detail.
- FIG. 9 is a flowchart illustrating the micro-saccade detection process according to the second embodiment.
- the pupil detection device 5 of the present embodiment executes the microsaccade detection process according to the flowchart of FIG. 9 instead of the flowchart of FIG.
- In the microsaccade detection process of the first embodiment, image data for a predetermined period including a plurality of frames is acquired (S11 in FIG. 4), and the temporal change of the pupil position is extracted (S12).
- In the present embodiment, the control unit 52 of the pupil detection device 5 instead acquires luminance change data D1 for a predetermined period from the imaging device 4 via the acquisition unit 51 (S11A).
- Based on the acquired luminance change data D1, the control unit 52 extracts the temporal change of the number of luminance changes instead of the temporal change of the pupil position (S12A).
- The number of luminance changes indicates the number of events in the luminance change data D1 in which the luminance at a given coordinate changed temporally, that is, the event count.
- the control unit 52 divides the above-mentioned predetermined period into predetermined sample periods, and counts the number of luminance changes in each sample period in the acquired luminance change data D1 (S12A).
- The sample period is appropriately set to half or less of the reciprocal of the frequency of the pupil movement to be detected; for example, pupil movement at 100 Hz requires a sample period of 5 milliseconds or shorter.
- As in the first embodiment, the control unit 52 performs frequency decomposition in the time direction on the temporal change of the extracted number of luminance changes (S13), and determines, based on the result of the frequency decomposition, whether pupil movement in a specific frequency band has occurred (S14).
- The specific frequency band is set, for example, near the peak frequency of the microsaccade velocity.
- Thereby, the microsaccade occurrence time can be detected (S15).
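- A minimal sketch of steps S12A-S14, reusing the illustrative event record from the earlier sketch; the frequency band and threshold here are assumptions (the band is placed around the roughly 100 Hz component seen in the experiment described below).

```python
import numpy as np

SAMPLE_PERIOD_US = 500          # 0.5 ms sample period (example value from the experiment described below)
EVENT_BAND_HZ = (80.0, 120.0)   # assumed band around a ~100 Hz microsaccade-like component
EVENT_THRESHOLD = 50.0          # assumed spectrum-intensity threshold, not from the source

def luminance_change_counts(events, period_us: int) -> np.ndarray:
    """S12A: count the number of luminance-change events in each sample period.
    `events` is an iterable of LuminanceChangeEvent (see the sketch above)."""
    times = np.array([e.t_us for e in events])
    n_bins = int(times.max() // period_us) + 1
    counts, _ = np.histogram(times, bins=n_bins, range=(0, n_bins * period_us))
    return counts

def microsaccade_from_events(events) -> bool:
    """S13-S14 applied to the temporal change of the number of luminance changes."""
    counts = luminance_change_counts(events, SAMPLE_PERIOD_US).astype(np.float64)
    counts -= counts.mean()                              # remove the DC offset
    spectrum = np.abs(np.fft.rfft(counts))               # S13: frequency decomposition
    freqs = np.fft.rfftfreq(counts.size, d=SAMPLE_PERIOD_US * 1e-6)
    in_band = (freqs >= EVENT_BAND_HZ[0]) & (freqs <= EVENT_BAND_HZ[1])
    # S14: pupil movement in the specific frequency band
    return bool(in_band.any() and spectrum[in_band].max() >= EVENT_THRESHOLD)
```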
- Further, the control unit 52 may search for a region corresponding to the pupil in the luminance change data D1 and count, as the number of luminance changes, only the events whose coordinates fall within the searched region.
- A method of searching for the area of the pupil in the luminance change data D1 will be described with reference to FIGS. 10 and 11.
- FIG. 10 is a scatter diagram illustrating the distribution of events in the luminance change data D1 at the occurrence of microsaccades.
- FIG. 10 shows a spatial distribution of an increase event E11 in which the brightness is increased and a decrease event E12 in which the brightness is decreased in the brightness change data D1 acquired in step S11A.
- The control unit 52 generates the luminance change image 7 by counting the events E11 and E12 for each XY coordinate (each pixel) in the acquired luminance change data D1.
- the luminance change image 7 may have two types (for example, two colors) of pixel values corresponding to the increase event E11 and the decrease event E12 in each pixel.
- the distribution of FIG. 10 includes an elliptical region R1 corresponding to the pupil 6 moved by the micro saccade.
- In the region R1, the luminance increase events E11 and the luminance decrease events E12 are distributed in an elliptical shape. Therefore, the control unit 52 searches the generated luminance change image 7 for the elliptical region R1 caused by the movement of the pupil 6.
- FIG. 11 is a diagram illustrating a method of searching for the elliptical region R1 in the luminance change image 7.
- the control unit 52 scans the luminance change image 7 using the scanning region R10.
- the scan area R10 has a predetermined elliptical shape, and is set at various scan positions on the luminance change image 7.
- the elliptical shape of the scanning region R10 is set to the size and shape assumed as the elliptical region R1 due to the movement of the pupil 6.
- The control unit 52 sequentially computes a predetermined evaluation value for the image located inside the scanning region R10 at each scan position.
- The evaluation value is, for example, the number of pixels inside the scanning region R10 that have at least one event E11 or E12.
- The evaluation value may instead be the number of luminance changes within the scanning region R10.
- The control unit 52 detects, as the elliptical region R1, the scanning region R10 at the scan position where the evaluation value takes its maximum or a local maximum.
- The search for the elliptical region R1 is not limited to scanning with the scanning region R10; for example, pattern recognition may be applied, or a Hough transform using the equation of an ellipse may be applied. Further, data corresponding to the moving pupil 6 may be extracted not only by image processing of the luminance change image 7 but also by event processing of the luminance change data D1.
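- A brute-force sketch of the scanning-region search described above; the ellipse size is an assumption. In practice the same evaluation could be computed for all scan positions at once with a 2-D convolution, or replaced by the pattern-recognition or Hough-transform alternatives mentioned above.

```python
import numpy as np

ELLIPSE_HALF_AXES = (20, 14)    # assumed half-axes (pixels) of the scanning region R10

def elliptical_mask(a: int, b: int) -> np.ndarray:
    """Boolean mask of an ellipse with half-axes a (X direction) and b (Y direction)."""
    ys, xs = np.mgrid[-b:b + 1, -a:a + 1]
    return (xs / a) ** 2 + (ys / b) ** 2 <= 1.0

def find_pupil_region(change_image: np.ndarray) -> tuple[int, int]:
    """Slide the elliptical scanning region R10 over the luminance change image 7
    and keep the scan position whose evaluation value (count of changed pixels
    inside R10) is largest.

    change_image: 2-D array where nonzero pixels had at least one event E11/E12.
    Returns the (x, y) centre of the detected region R1.
    """
    a, b = ELLIPSE_HALF_AXES
    mask = elliptical_mask(a, b)
    changed = (change_image != 0).astype(np.float64)
    best_value, best_pos = -1.0, (a, b)
    for cy in range(b, changed.shape[0] - b):
        for cx in range(a, changed.shape[1] - a):
            window = changed[cy - b:cy + b + 1, cx - a:cx + a + 1]
            value = float((window * mask).sum())   # evaluation value for this scan position
            if value > best_value:
                best_value, best_pos = value, (cx, cy)
    return best_pos
```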
- FIG. 12 is a graph showing experimental results of a pupil model test on the pupil detection device 5 of the second embodiment.
- In FIG. 12, the horizontal axis shows time (μs) and the vertical axis shows the number of luminance changes.
- For FIG. 12, a model experiment was conducted to simulate a state in which eye movement and head movement overlap.
- a black round member with a diameter of 3 mm was used as a model of the pupil.
- The model was moved at 100 m/s, corresponding to head movement, while being vibrated in a manner corresponding to eye movement.
- the vibration of the model was set at a frequency of 100 Hz and an amplitude of 200 ⁇ m so as to correspond to a microsaccade, and a maximum velocity of 20 mm / s was obtained.
- the motion of the model of the pupil as described above was imaged by the imaging device 4, and the number of changes in luminance was counted by the pupil detection device 5 (S15).
- the imaging condition of the imaging device 4 was an optical magnification of 1.0 and a resolution of 304 ⁇ 240 pixels.
- the sample period (S12A) for counting the number of changes in luminance was 0.5 milliseconds.
- The counts of the luminance increase events E11 and the luminance decrease events E12 were, for example, 500 or more.
- the number of changes in luminance increases as the object such as the pupil moves faster, and is considered to be correlated with the velocity of the object. For this reason, it is considered that, for example, when the pupil moves by head movement unrelated to eye movement, the number of brightness changes increases.
- the pupil detection device 5 of the present embodiment analyzes the high-frequency component not including the head movement by frequency-resolving the temporal change of the number of luminance changes (S13).
- FIG. 13 is a graph showing the frequency spectrum of the experimental result of FIG. 12. In FIG. 13, the horizontal axis shows frequency (Hz) and the vertical axis shows the intensity of the frequency component.
- Frequency analysis was performed on 1024 samples of the number of luminance changes, corresponding to 512 milliseconds.
- In the resulting spectrum, a first peak P11 and a second peak P12, separated from each other, were confirmed.
- the first peak P11 is located near 0 Hz, and is considered to correspond to simulated movement of head movement.
- The second peak P12 is located near 100 Hz and is considered to correspond to the simulated oscillatory eye movement. It was thus confirmed that, even when an actual microsaccade or the like occurs overlapping head movement, frequency decomposition of the temporal change in the number of luminance changes can detect pupil movement in the specific frequency band corresponding to the microsaccade.
- the imaging data acquired from the imaging device 4 is luminance change data D1 indicating the timing of an event in which the luminance has changed for each pixel.
- Based on the luminance change data D1, the control unit 52 extracts the temporal change of the number of events in which the luminance changed, that is, the number of luminance changes (S12A).
- the number of events has a correlation with the speed of movement of the pupil 6, and the temporal change in the number of events is an example of information indicating the movement of the pupil 6.
- information indicating the movement of the pupil 6 can be easily extracted by counting the number of events.
- The control unit 52 performs frequency decomposition on the temporal change of the extracted number of events (S13) and detects the movement of the pupil in a specific frequency band (S14).
- the specific frequency band includes, for example, a frequency corresponding to the micro saccade of eye movement.
- In Embodiments 1 and 2 described above, an example in which the detection target of the pupil detection device 5 is a microsaccade has been described.
- the object to be detected by the pupil detection device 5 is not limited to the microsaccade, and may be another fine eye movement, a saccade, a blink or the like. In this case, a specific frequency band corresponding to the movement of the pupil to be detected can be set as appropriate.
- In the above embodiments, the detection system 1 detects the awakening degree of the subject.
- However, the detection target of the detection system 1 is not limited to the awakening degree, and may be any of various indicators that can be detected from the movement of the pupil.
- For example, the detection system 1 may detect the degree of concentration or the degree of fatigue of the subject.
- The detection system 1 according to the present disclosure is not limited to in-vehicle use.
- the detection system 1 according to the present disclosure is applicable to various environments capable of imaging the eye of a subject, and can be applied, for example, to monitor workers in a factory.
- a first aspect according to the present disclosure is a pupil detection device (5) for detecting the movement of the pupil based on an imaging result of an imaging device (4) for imaging an image including a pupil.
- the pupil detection device includes an acquisition unit (51) and a control unit (52).
- An acquisition unit acquires imaging data indicating the imaging result from the imaging device.
- the control unit executes data processing based on the imaging data acquired by the acquisition unit.
- the control unit extracts information indicating the movement of the pupil based on the imaging data (S12, S12A).
- the control unit performs time-frequency analysis on the extracted information to detect the movement of the pupil (S13, S14).
- the imaging data indicates an image (60) for each frame.
- the information indicating the movement of the pupil indicates a temporal change of the position of the pupil.
- the control unit extracts a temporal change of the position of the pupil (61) in the image for each frame.
- the control unit performs binarization processing on luminance on the image for each frame to determine a region (63) corresponding to the pupil.
- the control unit extracts a temporal change of the position of the pupil based on the area determined for each frame (S12).
- the imaging data (D1) indicates a timing at which the luminance changes for each pixel.
- the information indicating the movement of the pupil indicates a temporal change in the number of luminance changes, which is the number of timings at which the luminance has changed.
- the control unit counts the number of changes in luminance based on the imaging data, and extracts temporal changes in the number of changes in luminance (S12A).
- the control unit performs frequency decomposition on the temporal change of the extracted number of luminance changes (S13) to detect the movement of the pupil in a specific frequency band (S14).
- the specific frequency band includes a frequency corresponding to a microsaccade of eye movement.
- a seventh aspect is a detection system (1) including an imaging device and the pupil detection device according to any of the first to sixth aspects.
- the imaging device captures an image including a pupil and generates imaging data.
- the pupil detection device detects the movement of the pupil based on the imaging data.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Mathematical Physics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Molecular Biology (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- Multimedia (AREA)
- Biophysics (AREA)
- Surgery (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Educational Technology (AREA)
- Pathology (AREA)
- Social Psychology (AREA)
- Psychology (AREA)
- Psychiatry (AREA)
- Hospice & Palliative Care (AREA)
- Developmental Disabilities (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Child & Adolescent Psychology (AREA)
- Human Computer Interaction (AREA)
- Ophthalmology & Optometry (AREA)
- Eye Examination Apparatus (AREA)
- Image Analysis (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
- Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017-239692 | 2017-12-14 | ||
| JP2017239692A JP6930410B2 (ja) | 2017-12-14 | 2017-12-14 | 瞳孔検出装置および検出システム |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019117165A1 true WO2019117165A1 (ja) | 2019-06-20 |
Family
ID=66819356
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2018/045555 Ceased WO2019117165A1 (ja) | 2017-12-14 | 2018-12-11 | Pupil detection device and detection system |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP6930410B2 (ja) |
| WO (1) | WO2019117165A1 (ja) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2023087378A (ja) * | 2021-12-13 | 2023-06-23 | Canon Inc. | Electronic device, method for controlling electronic device, and program |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2003230552A (ja) * | 2001-12-03 | 2003-08-19 | Denso Corp | Biological state estimation method and biological stimulation method |
| JP2016220801A (ja) * | 2015-05-28 | 2016-12-28 | Hamamatsu Photonics K.K. | Binocular measurement device, binocular measurement method, and binocular measurement program |
- 2017-12-14 JP JP2017239692A patent/JP6930410B2/ja active Active
- 2018-12-11 WO PCT/JP2018/045555 patent/WO2019117165A1/ja not_active Ceased
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2003230552A (ja) * | 2001-12-03 | 2003-08-19 | Denso Corp | Biological state estimation method and biological stimulation method |
| JP2016220801A (ja) * | 2015-05-28 | 2016-12-28 | Hamamatsu Photonics K.K. | Binocular measurement device, binocular measurement method, and binocular measurement program |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2019103745A (ja) | 2019-06-27 |
| JP6930410B2 (ja) | 2021-09-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2019117247A1 (ja) | Pupil detection device and detection system | |
| US20210044744A1 (en) | Method and apparatus of processing a signal from an event-based sensor | |
| JP6547160B2 (ja) | Pulse wave detection device and pulse wave detection program | |
| CN108882882B (zh) | Speckle measurement device and speckle measurement method | |
| GB2610734A (en) | Barcode readers with 3D camera(s) | |
| EP2976990B1 (en) | Pupil detection device and pupil detection method | |
| WO2014002534A1 (ja) | Object recognition device | |
| JP3116638B2 (ja) | Awakening state detection device | |
| JP2020087312A (ja) | Action recognition device, action recognition method, and program | |
| WO2022190622A1 (ja) | Information processing device, information processing method, and program | |
| JP2012185684A (ja) | Object detection device and object detection method | |
| JP6338445B2 (ja) | Blink detection system and method | |
| CN107533765A (zh) | Apparatus, method, and system for tracking an optical object | |
| WO2015186395A1 (ja) | Vehicle periphery monitoring device | |
| WO2019117165A1 (ja) | Pupil detection device and detection system | |
| WO2008107832A1 (en) | Stress estimation | |
| JP3148187B2 (ja) | Particle monitoring system, particle detection method, and recording medium storing a particle detection program | |
| KR20140045834A (ko) | Video surveillance apparatus and method for estimating the size of a single object | |
| JP5825588B2 (ja) | Blink measurement device and blink measurement method | |
| JP2012073891A (ja) | Method for identifying frequency components for smoke detection, and smoke detection device | |
| JPWO2007086222A1 (ja) | System and method for estimating an attention area | |
| JP4765113B2 (ja) | Vehicle periphery monitoring device, vehicle, vehicle periphery monitoring program, and vehicle periphery monitoring method | |
| KR20150033047A (ko) | Preprocessing apparatus and method for detecting an object | |
| KR20230161443A (ко) | Method and system for extracting heart rate from RGB images | |
| KR101830331B1 (ко) | Apparatus and method for detecting abnormal machine operation | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18887698; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18887698; Country of ref document: EP; Kind code of ref document: A1 |