EP3151166B1 - Line-of-sight detection method and device - Google Patents

Line-of-sight detection method and device

Info

Publication number
EP3151166B1
Authority
EP
European Patent Office
Prior art keywords
edge
pupil
processing
region
iris
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP16187589.3A
Other languages
English (en)
French (fr)
Other versions
EP3151166A1 (de)
Inventor
Daisuke Ishii
Satoshi Nakashima
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Publication of EP3151166A1
Application granted
Publication of EP3151166B1
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/193 Preprocessing; Feature extraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/42 Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
    • G06V10/421 Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation by analysing segments intersecting the pattern
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/19 Sensors therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/197 Matching; Classification

Definitions

  • The embodiments disclosed herein are related to line-of-sight detection techniques.
  • Line-of-sight detection techniques for detecting the direction of the line of sight and the position of the gaze of a user are known.
  • An example of a line-of-sight detection method is a method in which a corneal reflection of a light source and a pupil are detected from an image obtained by capturing an image of an eye of a user and the line of sight of the user is detected based on the positional relationship between the corneal reflection and the center of the pupil.
  • For this method, it is important that the position of the corneal reflection and the position of the center of the pupil be accurately detected.
  • When the distance between the illuminating light source and the camera is small, light from the illuminating light source is reflected by the retina and this reflected light reaches the camera via the pupil. Consequently, a bright pupil state is known to occur in which the entire pupil is bright in the image captured by the camera. In the case where a bright pupil state exists, the outline of the corneal reflection may become indistinct and it may be difficult to detect the corneal reflection.
  • Examples of a method of detecting the center of the pupil include a method in which the center of the pupil is detected from the outline of the pupil and a method in which the center of the pupil is obtained from the outline of the iris. Of these, the former method can detect the center of the pupil with higher accuracy. This is because, in a normal state, the difference in brightness between the pupil and the iris is larger than the difference in brightness between the iris and the sclera, and because the upper or lower part of the iris is easily hidden by the eyelid; the outline of the pupil can therefore be detected more clearly than the outline of the iris.
  • However, the pupil is bright in the above-mentioned bright pupil state. In this case, the difference in brightness between the pupil and the iris is reduced, and it becomes more difficult to detect the outline of the pupil than in the normal state, that is, a state that is not the bright pupil state.
  • In addition, a state called a "semi-bright pupil" state, in which the pupil is slightly brighter than normal, may also exist. In this case, the difference in brightness between the pupil and the iris becomes almost non-existent and it is difficult to detect the outline of the pupil.
  • The techniques disclosed in the embodiments discussed herein aim to improve the accuracy with which the position of the center of a pupil is detected.
  • The present invention provides a line-of-sight detection method according to claim 1, a line-of-sight detection program according to claim 6, and a line-of-sight detection device according to claim 7.
  • With these techniques, the accuracy with which the position of the center of a pupil is detected may be improved.
  • FIG. 1 illustrates an example of the configuration and an example of processing of a line-of-sight detection system according to a background example.
  • A line-of-sight detection system 1 according to the background example includes a boundary detection unit 1a, a center position detection unit 1b, and a line-of-sight-detecting unit 1c.
  • The processing performed by each of the boundary detection unit 1a, the center position detection unit 1b, and the line-of-sight-detecting unit 1c is implemented by a processor executing a prescribed program, for example.
  • At least some of the processing functions of the boundary detection unit 1a, the center position detection unit 1b, and the line-of-sight-detecting unit 1c may be implemented in a different device from the other processing functions.
  • The boundary detection unit 1a detects a boundary between a pupil 11 and an iris 12, from an eye region 10 of a user in a captured image, by performing edge detection based on brightness.
  • For example, the boundary detection unit 1a sets a line-shaped or band-shaped detection region 13 inside the eye region 10 and performs edge detection in a longitudinal direction of the detection region 13.
  • The boundary detection unit 1a detects the positions of boundaries between the pupil 11 and the iris 12 based on the positional symmetry, within the detection region 13, of a first edge at which brightness decreases and a second edge at which brightness increases, among the detected edges. In the example illustrated in FIG. 1, two boundaries 11a and 11b between the pupil 11 and the iris 12 are detected.
  • The boundaries 11a and 11b are examples of a first edge and a second edge, respectively.
  • In some cases, the boundary detection unit 1a is able to detect the boundaries between the pupil 11 and the iris 12 by performing edge detection.
  • In other cases, the boundary detection unit 1a may not be able to detect the boundaries between the pupil 11 and the iris 12 even by performing edge detection.
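  • The symmetry-based pairing described above can be illustrated with a short sketch. The following Python snippet is not taken from the patent; it is a minimal illustration that assumes a one-dimensional brightness profile sampled along the detection region 13, a hypothetical gradient threshold, and a hypothetical tolerance for deciding that a falling edge and a rising edge have "substantially the same" brightness step.

```python
import numpy as np

def find_edge_pairs(profile, grad_thresh=20, step_tol=15):
    """Pair a falling edge (first edge) with a rising edge (second edge)
    of similar step magnitude, as a rough stand-in for the symmetry check
    performed by the boundary detection unit 1a."""
    diff = np.diff(profile.astype(np.int32))   # brightness change between neighbours
    falling = [i for i, d in enumerate(diff) if d <= -grad_thresh]  # brightness decreases
    rising = [i for i, d in enumerate(diff) if d >= grad_thresh]    # brightness increases

    pairs = []
    # Assume nested boundaries (iris outside, pupil inside): the k-th falling
    # edge from the left pairs with the k-th rising edge from the right.
    for k in range(min(len(falling), len(rising))):
        f, r = falling[k], rising[-(k + 1)]
        if r > f and abs(abs(diff[f]) - abs(diff[r])) <= step_tol:
            pairs.append((f, r))
    return pairs

# Toy profile: sclera (200) -> iris (120) -> pupil (40) -> iris (120) -> sclera (200).
profile = np.array([200]*5 + [120]*5 + [40]*6 + [120]*5 + [200]*5)
print(find_edge_pairs(profile))  # [(4, 20), (9, 15)]: outer pair ~ iris, inner pair ~ pupil
```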
  • The center position detection unit 1b switches between and executes first processing and second processing in accordance with whether the boundaries between the pupil 11 and the iris 12 are detected by the boundary detection unit 1a.
  • The first processing and the second processing both detect the position of the center of the pupil 11, but have different processing procedures.
  • For example, in the first processing, the outline of the pupil 11 is detected and the position of the center of the pupil 11 is detected based on the result of that detection.
  • In the second processing, the outline of the iris 12 is detected and the position of the center of the pupil 11 is detected based on the result of that detection.
  • Alternatively, in the first processing, the position of the center of the pupil 11 is detected by placing importance on the detection result of the outline of the pupil 11.
  • In the second processing, the position of the center of the pupil 11 is detected by placing importance on the detection result of the outline of the iris 12.
  • The first processing is suitable for the case of a dark pupil state and the second processing is suitable for the case of a semi-bright pupil state.
  • The center position detection unit 1b executes the first processing when the boundaries between the pupil 11 and the iris 12 are detected by the boundary detection unit 1a.
  • On the other hand, the center position detection unit 1b executes the second processing when the boundaries between the pupil 11 and the iris 12 are not detected by the boundary detection unit 1a.
  • The line-of-sight-detecting unit 1c detects the direction of the line of sight or the position of the gaze of a user based on the detection result of the position of the center of the pupil 11 obtained by the center position detection unit 1b. For example, the line-of-sight-detecting unit 1c detects the direction of the line of sight or the position of the gaze of the user based on a detected position of a corneal reflection and the detection result of the position of the center of the pupil 11 in the eye region 10.
  • As described above, the boundary detection unit 1a detects the boundaries between the pupil 11 and the iris 12 by performing edge detection based on brightness.
  • Thus, the center position detection unit 1b may determine whether the semi-bright pupil state exists based on whether a boundary between the pupil 11 and the iris 12 is detected by the boundary detection unit 1a.
  • The center position detection unit 1b may then select and execute the appropriate processing for detecting the center of the pupil 11 based on whether the semi-bright pupil state exists.
  • As a result, the center of the pupil 11 may be accurately detected even in the case where the semi-bright pupil state exists. Therefore, the accuracy with which the center of the pupil 11 is detected may be improved.
  • FIG. 2 illustrates an example of the hardware configuration of a line-of-sight detection device according to an embodiment of the invention.
  • A line-of-sight detection device 100 according to the embodiment may be implemented as a computer as illustrated in FIG. 2, for example.
  • The entirety of the line-of-sight detection device 100 is controlled by a processor 101.
  • The processor 101 may be formed of multiple processors.
  • The processor 101 is, for example, a central processing unit (CPU), a micro-processing unit (MPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), or a programmable logic device (PLD).
  • The processor 101 may be a combination of two or more elements from among a CPU, an MPU, a DSP, an ASIC, and a PLD.
  • A random access memory (RAM) 102 and a plurality of peripheral devices are connected to the processor 101 via a bus 109.
  • The RAM 102 is used as a main storage device of the line-of-sight detection device 100. At least part of an operating system (OS) program and application programs, to be executed by the processor 101, are temporarily stored in the RAM 102. In addition, various data that is used in the processing performed by the processor 101 is stored in the RAM 102.
  • Examples of the peripheral devices connected to the bus 109 include a hard disk drive (HDD) 103, a graphic processing device 104, an input interface 105, a reading device 106, a communication interface 107, and a network interface 108.
  • The HDD 103 is used as an auxiliary storage device of the line-of-sight detection device 100.
  • An OS program, application programs, and various data are stored in the HDD 103.
  • Another type of non-volatile storage device such as a solid state drive (SSD) may also be used as the auxiliary storage device.
  • A display device 104a is connected to the graphic processing device 104.
  • The graphic processing device 104 displays images on the display device 104a in accordance with commands from the processor 101.
  • Examples of the display device include a liquid crystal display and an organic electroluminescence (EL) display.
  • An input device 105a is connected to the input interface 105.
  • The input interface 105 transmits a signal output from the input device 105a to the processor 101.
  • Examples of the input device 105a include a keyboard and a pointing device.
  • Examples of a pointing device include a mouse, a touch panel, a tablet, a touch pad, and a trackball.
  • A portable recording medium 106a is detachably attached to the reading device 106.
  • The reading device 106 reads out data that is recorded on the portable recording medium 106a and transmits the read out data to the processor 101.
  • Examples of the portable recording medium 106a include an optical disc, a magneto-optical disk, and a semiconductor memory.
  • The communication interface 107 transmits data to and receives data from external devices.
  • In this example, an infrared light 107a and an infrared camera 107b are connected as external devices.
  • The infrared light 107a radiates infrared light onto the face of a user who is the target of line-of-sight detection.
  • The infrared camera 107b detects reflected light out of the radiated infrared light.
  • The processor 101 detects the line of sight of a user by analyzing an image captured by the infrared camera 107b.
  • The infrared light 107a and the infrared camera 107b are integrally formed as a sensor unit 107c, for example.
  • The network interface 108 transmits data to and receives data from other devices via a network 108a.
  • The processing functions of the line-of-sight detection device 100 may be implemented by the above-described hardware configuration.
  • The above-described line-of-sight detection device 100 detects the line of sight of a user based on an image of the region of an eye of the user captured by the infrared camera 107b.
  • In this embodiment, a "corneal reflection method" is used, in which the line-of-sight direction is detected from the positional relationship between the position of a corneal reflection and the position of the center of the pupil.
  • FIGs. 3A, 3B, and 3C illustrate states of a pupil.
  • FIG. 3A illustrates a dark pupil state (normal state), FIG. 3B illustrates a bright pupil state, and FIG. 3C illustrates a semi-bright pupil state.
  • An "eyeball region", where the eyeball is exposed, is defined as a region that is enclosed by a lower edge 201 of the upper eyelid and an upper edge 202 of the lower eyelid.
  • Inside the eyeball region, the region of a pupil 203 and the region of an iris 204 exist in the form of concentric circles, and the region of the sclera (the white of the eye) 205 exists outside the iris 204.
  • A corneal reflection 206, which is reflected light of the infrared light radiated by the infrared light 107a, appears in the eyeball region when line-of-sight detection is performed.
  • Normally, the "dark pupil state" exists, in which the pupil 203 is sufficiently darker than the iris 204.
  • However, when the distance between the infrared light 107a and the infrared camera 107b is small, light from the infrared light 107a is reflected by the retina and the reflected light reaches the infrared camera 107b via the pupil 203.
  • As a result, the "bright pupil state", in which the pupil 203 is brighter than the iris 204 as illustrated in FIG. 3B, may occur.
  • In addition, the "semi-bright pupil state", in which the pupil 203 and the iris 204 have substantially the same brightness as illustrated in FIG. 3C, may occur.
  • The line-of-sight detection device 100 detects the position of the corneal reflection 206 and the position of the center of the pupil 203 from an image in which the eyeball region is captured, as described above.
  • Examples of a method for detecting the position of the center of the pupil 203 include a method in which a detection result of the outline of the pupil 203 is used and a method in which a detection result of the outline of the iris 204 is used.
  • In the dark pupil state, the method in which the detection result of the outline of the pupil 203 is used has higher accuracy for detecting the position of the center of the pupil 203. This is because, in the dark pupil state, the difference in brightness between the pupil 203 and the iris 204 is generally larger than the difference in brightness between the iris 204 and the sclera 205, and the outline of the pupil 203 is more distinct. In addition, in the state where the eyelids are open, the entirety of the outline of the pupil 203 is exposed, whereas part of the outline of the iris 204 is often covered by the eyelid. Consequently, the outline of the pupil 203 may be detected more easily and with higher accuracy in the case where the pupil 203 and the iris 204 are detected with circle detection.
  • In the semi-bright pupil state, however, the difference in brightness between the pupil 203 and the iris 204 is smaller than in the dark pupil state, and therefore the accuracy with which the outline of the pupil 203 is detected decreases.
  • On the other hand, the accuracy with which the outline of the iris 204 is detected does not change between the dark pupil state, the bright pupil state, and the semi-bright pupil state. Therefore, in the semi-bright pupil state, the method in which the detection result of the outline of the iris 204 is used has higher accuracy for detecting the position of the center of the pupil 203.
  • The semi-bright pupil state occurs in the case where the distance between the infrared light 107a and the infrared camera 107b is short. Therefore, in particular, when the infrared light 107a and the infrared camera 107b are integrated as the sensor unit 107c as illustrated in FIG. 2 and the sensor unit 107c is reduced in size, it is difficult to avoid occurrence of the semi-bright pupil state.
  • For this reason, the line-of-sight detection device 100 performs edge detection using brightness information inside the eyeball region and, in a background example, switches the processing to be used for detecting the position of the center of the pupil 203 in accordance with whether the edges of the pupil 203 can be detected.
  • As a result, the position of the center of the pupil 203 may be stably detected with high accuracy, and consequently the accuracy of line-of-sight detection is improved.
  • FIG. 4 is a block diagram illustrating an example of the configuration of processing functions of a line-of-sight detection device.
  • The line-of-sight detection device 100 includes an image-obtaining unit 111, an eyeball-region-detecting unit 112, a corneal-reflection-detecting unit 113, an edge-detecting unit 114, a pupil-center-detecting unit 115, and a line-of-sight-detecting unit 116. These processing functions are implemented by the processor 101 executing a prescribed program, for example.
  • Some of these processing functions may be implemented in a different device from the other processing functions.
  • For example, the image-obtaining unit 111, the eyeball-region-detecting unit 112, the corneal-reflection-detecting unit 113, the edge-detecting unit 114, and the pupil-center-detecting unit 115 may be implemented in the sensor unit 107c, and the line-of-sight-detecting unit 116 may be implemented in the line-of-sight detection device 100.
  • The image-obtaining unit 111 obtains an image in which the face of the user is captured by the infrared camera 107b.
  • The data of the obtained image is temporarily stored in a storage device (for example, the RAM 102) of the line-of-sight detection device 100.
  • The eyeball-region-detecting unit 112 detects the eyeball region from the image obtained by the image-obtaining unit 111.
  • The eyeball region is detected as a region that is enclosed by the lower edge of the upper eyelid and the upper edge of the lower eyelid, for example.
  • The corneal-reflection-detecting unit 113 detects the position of the corneal reflection from the eyeball region based on a brightness distribution of the detected eyeball region. In the eyeball region, the brightness of the region of the corneal reflection is much higher than that of the rest of the eyeball region. Therefore, the corneal-reflection-detecting unit 113 detects a circular region, out of the eyeball region, for which the brightness is equal to or higher than a prescribed threshold as the corneal reflection. The corneal-reflection-detecting unit 113 may also detect the position of the corneal reflection through corner detection, for example.
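  • A minimal sketch of the bright-spot detection just described, assuming a grayscale eyeball-region image stored as a NumPy array and a hypothetical brightness threshold; a real implementation would additionally check that the bright region is small and roughly circular.

```python
import numpy as np

def detect_corneal_reflection(eye_region, thresh=230):
    """Return the centroid (x, y) of pixels brighter than `thresh`,
    as a crude stand-in for corneal-reflection detection."""
    ys, xs = np.nonzero(eye_region >= thresh)   # bright pixels
    if xs.size == 0:
        return None                             # no reflection found
    return float(xs.mean()), float(ys.mean())   # centroid of the bright blob

# Toy image: dark background with a small bright spot around (x=12, y=7).
eye = np.full((20, 30), 90, dtype=np.uint8)
eye[6:9, 11:14] = 250
print(detect_corneal_reflection(eye))  # approximately (12.0, 7.0)
```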
  • The edge-detecting unit 114 sets a line-shaped or band-shaped edge detection region in a substantially horizontal direction in the eyeball region and detects edges by detecting differences in brightness in the longitudinal direction of the edge detection region.
  • Based on the positions of the detected edges, the edge-detecting unit 114 determines whether both the boundaries between the pupil and the iris (that is, the edges of the pupil) and the boundaries between the iris and the sclera (that is, the edges of the iris) have been detected, whether only one of the two types of boundaries has been detected, or whether neither type of boundary has been detected.
  • The pupil-center-detecting unit 115 detects the position of the center of the pupil based on the brightness of the eyeball region.
  • The pupil-center-detecting unit 115 switches the processing used to detect the position of the center of the pupil based on the detection result obtained by the edge-detecting unit 114. In this switching, processing in which importance is placed upon the detection result of the outline of the pupil and processing in which importance is placed upon the detection result of the outline of the iris are switched between. In the case where both the edges of the pupil and the edges of the iris are detected by the edge-detecting unit 114, the former processing is selected, and in the case where only the edges of the iris are detected, the latter processing is selected.
  • The line-of-sight-detecting unit 116 detects the direction of the line of sight of the user based on the positional relationship between the position of the corneal reflection detected by the corneal-reflection-detecting unit 113 and the position of the center of the pupil detected by the pupil-center-detecting unit 115. In addition, the line-of-sight-detecting unit 116 may detect the position of the gaze of the user based on the detected line-of-sight direction.
  • As described above, the edge-detecting unit 114 sets a line-shaped or band-shaped edge detection region in a substantially horizontal direction in the eyeball region and detects edges by detecting differences in brightness between adjacent pixels in the longitudinal direction of the set edge detection region.
  • Examples of edge detection results for an edge detection region are illustrated in FIGs. 5 to 7.
  • In FIGs. 5 to 7, as an example, it is assumed that a horizontal line is set as the edge detection region in the eyeball region.
  • FIG. 5 illustrates a first example of an edge detection result.
  • The horizontal axis represents coordinates along the longitudinal direction of the edge detection region and the vertical axis represents brightness. The same is true for the graphs illustrated in FIGs. 6 and 7 as well.
  • The edge-detecting unit 114 performs edge detection along the edge detection region and determines whether detected edges are the edges of the pupil (the boundaries between the pupil and the iris) or the edges of the iris (the boundaries between the iris and the sclera) based on the symmetry of the positions of the edges. For example, the edge-detecting unit 114 determines that two edge portions form such a pair of edges when the differences in brightness before and after the edge portions are substantially the same and the signs of their gradients are opposite to each other.
  • The graph of FIG. 5 illustrates an example of detection of brightness in the dark pupil state.
  • In this example, four edge portions 221a to 221d are detected in the eyeball region.
  • The edge portion 221a and the edge portion 221d have substantially the same difference in brightness between before and after the edge portion and have gradients of opposite signs. From this, the edge portions 221a and 221d are each assumed to be an edge of the pupil or an edge of the iris.
  • The edge portion 221b and the edge portion 221c also have substantially the same difference in brightness between before and after the edge portions and have gradients of opposite signs.
  • The edge portions 221b and 221c are located between the edge portion 221a and the edge portion 221d. Therefore, the edge-detecting unit 114 determines that the edge portions 221b and 221c are the edges of the pupil and determines that the edge portions 221a and 221d are the edges of the iris.
  • FIG. 6 illustrates a second example of an edge detection result.
  • In FIG. 6, an example of detection of brightness is illustrated for a case in which a corneal reflection is superposed with the edge detection region in the dark pupil state.
  • Two edge portions 222a and 222c, which correspond to the edges of the iris, are detected, similarly to the example in FIG. 5.
  • One edge portion 222b that corresponds to an edge of the pupil is also detected.
  • However, a corneal reflection exists in a region 223 where the other edge of the pupil is expected to be, and therefore the other edge is not detected.
  • In such a case, the edge-detecting unit 114 regards the region of the corneal reflection as an edge portion or a flat portion. For example, the edge-detecting unit 114 considers that an edge portion is detected from the region of the corneal reflection in the case where the difference in brightness between before and after this region is equal to or higher than a prescribed threshold, and considers that this region is not an edge portion but a flat portion in the case where the difference in brightness is less than the threshold.
  • The edge-detecting unit 114 determines the positions of the edges of the pupil and the edges of the iris based on the symmetry of edge portions detected in this way, and consequently the positions of the edges may be detected even in the case where a corneal reflection is superposed with the edge detection region.
  • In the example of FIG. 6, the region 223 is determined to be an edge portion, and this edge portion and the edge portion 222b are determined to have substantially the same difference in brightness between before and after the edge portions and to have gradients of opposite signs.
  • Consequently, the edge-detecting unit 114 determines that the edge portion 222b and the region 223 are the edges of the pupil and determines that the edge portions 222a and 222c are the edges of the iris.
  • FIG. 7 illustrates a third example of an edge detection result.
  • In FIG. 7, an example of detection of brightness is illustrated for a case in which a corneal reflection is superposed with the edge detection region in the semi-bright pupil state.
  • A corneal reflection is detected in a region 224.
  • The edge-detecting unit 114 regards the region 224 not as an edge portion but rather as a flat portion, from the fact that the difference in brightness between before and after the region 224 is less than the prescribed threshold, for example.
  • In addition, edge portions 225a and 225b are detected.
  • The edge portions 225a and 225b have substantially the same difference in brightness between before and after the edge portions and have gradients of opposite signs.
  • However, a pair of edge portions is not detected in the region between the edge portion 225a and the edge portion 225b.
  • In this case, the edge-detecting unit 114 determines that the edge portions 225a and 225b are the edges of the iris and determines that the edges of the pupil are not able to be detected. In such a case where the edges of the pupil are not able to be detected, it is considered that the semi-bright pupil state exists.
  • The pupil-center-detecting unit 115 then executes detection processing for detecting the position of the center of the pupil in which importance is placed upon detection results for the outline of the iris, based on the determination result obtained by the edge-detecting unit 114.
  • FIGs. 8A to 8D illustrate examples of setting of the edge detection region performed by the edge-detecting unit.
  • The edge-detecting unit 114 sets a line-shaped or band-shaped edge detection region in a substantially horizontal direction in the eyeball region.
  • Various methods may be employed for setting the edge detection region.
  • For example, the edge-detecting unit 114 sets an edge detection region 241 such that the edge detection region 241 connects an eye inner corner 231 and an eye outer corner 232 and passes between the lower edge 201 of the upper eyelid and the upper edge 202 of the lower eyelid.
  • Alternatively, the edge-detecting unit 114 may set a straight-line-shaped region in the eyeball region as the edge detection region. For example, as illustrated in FIG. 8B, the edge-detecting unit 114 sets a straight-line-shaped region, which passes through a corneal reflection 251, in the eyeball region as an edge detection region 242. Alternatively, the edge-detecting unit 114 may set a straight-line-shaped region that is arranged at a position that is spaced upward or downward away from the corneal reflection 251 by a fixed distance as the edge detection region. For example, in the case where the infrared light 107a radiates infrared light from below the face, it is highly probable that the center of the pupil will be located above the corneal reflection 251. Whether it is preferable to set the edge detection region above or below the corneal reflection 251 may thus be determined based on the positional relationship between the infrared camera 107b and the face.
  • Alternatively, the edge-detecting unit 114 may set a bent-line-shaped region that passes through the eye inner corner 231, the corneal reflection 252, and the eye outer corner 232 as an edge detection region 243.
  • In addition, the edge-detecting unit 114 may set, from among a plurality of edge detection region candidates that are parallel to an edge detection region that has been preliminarily set using such a method, the candidate region that passes through the darkest region as the edge detection region.
  • For example, the edge-detecting unit 114 defines a straight-line-shaped region that connects the eye inner corner and the eye outer corner and defines a darkest point detected within a fixed range from the straight-line-shaped region as a new midpoint. Then, the edge-detecting unit 114 sets a straight-line-shaped region that passes through this new midpoint, or a bent-line-shaped region that passes through the new midpoint, the eye inner corner, and the eye outer corner, as the edge detection region.
  • Furthermore, the edge-detecting unit 114 may set a plurality of edge detection regions inside the eyeball region. For example, as illustrated in FIG. 8D, the edge-detecting unit 114 sets straight-line-shaped edge detection regions 244a to 244d so as to be parallel to each other inside the eyeball region. In the case where a plurality of edge detection regions are set in this way, the edge-detecting unit 114, for example, performs the above-described edge detection processing for each of the edge detection regions and adopts the processing result in which the greatest number of pairs of edge portions is detected.
  • The edge-detecting unit 114 may instead set the edge detection region within a fixed region that contains the eye in the case where a method is used in which the positions of the eye inner corner and the eye outer corner are not specified, for example.
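  • As a sketch of the candidate-selection idea mentioned above (choosing the candidate that passes through the darkest region), the snippet below samples several parallel horizontal rows of the eyeball region and keeps the one that contains the lowest brightness; the row spacing and the use of plain rows instead of bent lines are simplifying assumptions, not part of the patent.

```python
import numpy as np

def pick_scan_row(eye_region, num_candidates=4):
    """Among several evenly spaced horizontal candidate rows, return the
    index of the row containing the darkest pixel, on the assumption that
    this row is the most likely to cross the pupil."""
    h = eye_region.shape[0]
    rows = np.linspace(h // 4, 3 * h // 4, num_candidates).astype(int)  # candidate rows
    return min(rows, key=lambda r: int(eye_region[r].min()))

# Toy eyeball region with a dark pupil band around row 10.
eye = np.full((21, 40), 150, dtype=np.uint8)
eye[9:12, 15:25] = 30
print(pick_scan_row(eye))  # one of the candidate rows crossing the dark band
```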
  • The pupil-center-detecting unit 115 switches between and executes detection processing for detecting the position of the center of the pupil in accordance with whether the edges of the pupil are detected by the edge-detecting unit 114. In the case where the edges of the pupil are detected, the pupil-center-detecting unit 115 executes detection processing for detecting the position of the center of the pupil in which importance is placed on the detection result of the outline of the pupil. On the other hand, in the case where the edges of the pupil are not detected, the pupil-center-detecting unit 115 executes detection processing for detecting the position of the center of the pupil in which importance is placed on the detection result of the outline of the iris.
  • FIG. 9 illustrates an example of processing for detecting the position of the center of the pupil.
  • In this processing, a plurality of templates 271a, 271b, 271c,..., which are for detecting the outline of the pupil, and a plurality of templates 272a, 272b, 272c,..., which are for detecting the outline of the iris, are used.
  • The templates 271a, 271b, 271c,... include circles of different radii that are suitable for the sizes of pupils in images.
  • The pupil-center-detecting unit 115 detects the outline of the pupil by detecting a circle in a captured image by using the templates 271a, 271b, 271c,....
  • The templates 272a, 272b, 272c,... include circles of different radii that are suitable for the sizes of irises in images.
  • The largest value of the radii of circles included in the templates 272a, 272b, 272c,... is larger than the largest value of the radii of circles included in the templates 271a, 271b, 271c,....
  • The pupil-center-detecting unit 115 detects the outline of the iris by detecting a circle in a captured image by using the templates 272a, 272b, 272c,....
  • The pupil-center-detecting unit 115 may use the templates after enlarging or shrinking them in accordance with the size of the face in the image and the distance between the eyes.
  • An example of a method of switching the detection processing in accordance with an edge detection result obtained by the edge-detecting unit 114 is to switch between, and execute, either processing in which the outline of the pupil is detected and the position of the center of the pupil is determined from that detection result, or processing in which the outline of the iris is detected and the position of the center of the pupil is determined from that detection result.
  • Below, the former type of processing is called "center detection processing based on the pupil outline" and the latter type of processing is called "center detection processing based on the iris outline".
  • In the case where the edges of the pupil are detected, the pupil-center-detecting unit 115 executes the center detection processing based on the pupil outline.
  • In the case where the edges of the pupil are not detected, the pupil-center-detecting unit 115 executes the center detection processing based on the iris outline.
  • In the latter case, the boundaries between the iris and the sclera are more distinct than the boundaries between the pupil and the iris, and therefore the position of the center of the pupil may be detected with higher accuracy by detecting it based on the outline of the iris.
  • The templates 271a, 271b, 271c,... may be used when detecting the outline of the pupil.
  • For example, the outline of the pupil and the outline of the iris are detected with the following procedure.
  • First, the pupil-center-detecting unit 115 obtains a maximum brightness Lmax and a minimum brightness Lmin in the eyeball region.
  • The pupil-center-detecting unit 115 then subjects the eyeball region to binarization processing while gradually reducing a threshold from the maximum brightness Lmax toward the minimum brightness Lmin.
  • The pupil-center-detecting unit 115 performs matching between the binarized image and the templates 272a, 272b, 272c,..., which are for detecting the outline of the iris, and determines whether a matching evaluation value (degree of similarity) between the binarized image and any of the templates exceeds a prescribed threshold.
  • As the threshold is reduced, a circular region, the outline of which is partially covered by the eyelids, appears in the eyeball region.
  • When the matching evaluation value exceeds the prescribed threshold, the outline of the iris is detected. It is also possible to predict a circular region that corresponds to the outline of the iris from the distance between the eyes, for example.
  • To detect the outline of the pupil, the pupil-center-detecting unit 115 first detects the outline of the iris using the same method as described above. After that, the pupil-center-detecting unit 115 gradually reduces the threshold further and, each time the threshold is reduced, performs matching between the binarized image and the templates 271a, 271b, 271c,..., which are for detecting the outline of the pupil, and determines whether a matching evaluation value (degree of similarity) between the binarized image and any of the templates exceeds a prescribed threshold.
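  • The threshold-sweep procedure above can be sketched as follows. This is an illustrative reconstruction rather than the patented implementation: the "templates" are filled discs at a known center, the matching evaluation value is an intersection-over-union score between the dark region of the binarized image and the disc, and the radii, score threshold, and step size are all assumed values.

```python
import numpy as np

def disc_template(shape, center, radius):
    """Boolean mask of a filled circle, standing in for one template."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    return np.hypot(xx - center[0], yy - center[1]) <= radius

def match_disc(binary, center, radii):
    """Best (score, radius) over the candidate radii, scoring how well the
    dark region of the binarized image overlaps each disc (IoU)."""
    best = (0.0, None)
    for r in radii:
        disc = disc_template(binary.shape, center, r)
        union = np.logical_or(binary, disc).sum()
        score = np.logical_and(binary, disc).sum() / union if union else 0.0
        best = max(best, (float(score), r), key=lambda t: t[0])
    return best

def detect_outline_by_threshold_sweep(eye, center, radii, score_thresh=0.9):
    """Reduce the binarization threshold from Lmax toward Lmin and stop as
    soon as one of the templates matches the dark region well enough."""
    lmax, lmin = int(eye.max()), int(eye.min())
    for thr in range(lmax, lmin, -5):
        binary = eye < thr                      # True where darker than the threshold
        score, radius = match_disc(binary, center, radii)
        if score >= score_thresh:
            return radius, thr                  # detected outline radius and threshold
    return None, None

# Toy eye: sclera 200, iris disc (radius 12) at 120, pupil disc (radius 5) at 40.
eye = np.full((40, 40), 200, dtype=np.uint8)
yy, xx = np.mgrid[0:40, 0:40]
dist = np.hypot(xx - 20, yy - 20)
eye[dist <= 12] = 120
eye[dist <= 5] = 40

print(detect_outline_by_threshold_sweep(eye, (20, 20), radii=[10, 12, 14]))  # iris outline
print(detect_outline_by_threshold_sweep(eye, (20, 20), radii=[4, 5, 6]))     # pupil outline
```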
  • The outline of the iris is partially covered by the eyelids in most cases, and it is rare for the entirety of the outline of the iris to be exposed in the eyeball region. Consequently, even when both the edges of the pupil and the edges of the iris are distinct, the accuracy of detection of the outline of the iris using the templates 272a, 272b, 272c,... will be lower than the accuracy of detection of the outline of the pupil using the templates 271a, 271b, 271c,.... This is the reason why the accuracy with which the position of the center of the pupil is detected is lower when the position of the center of the pupil is detected based on the outline of the iris than when it is detected based on the outline of the pupil.
  • In FIG. 9, an example is illustrated in which center detection processing based on the pupil outline and center detection processing based on the iris outline are switched between in accordance with an edge detection result obtained by the edge-detecting unit 114.
  • In an embodiment of the invention, a method is adopted in which a weight (likelihood) of the detection result of the outline of the pupil and a weight (likelihood) of the detection result of the outline of the iris are changed in accordance with the edge detection result.
  • A method may also be adopted in which control parameters of the center detection processing based on the pupil outline and of the center detection processing based on the iris outline are changed in accordance with the edge detection result.
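  • The weighting variant can be sketched as below. This is only an illustration under assumed weights: when the pupil edges were found, the pupil-outline estimate dominates, and when only the iris edges were found, the iris-outline estimate dominates. The weight values themselves are placeholders and do not come from the patent.

```python
import numpy as np

def fuse_center_estimates(pupil_based_center, iris_based_center, pupil_edges_found):
    """Weighted combination of the two center estimates. The likelihoods
    below are illustrative; a real system would derive them from the edge
    detection result and from the matching evaluation values."""
    if pupil_edges_found:
        w_pupil, w_iris = 0.8, 0.2   # dark pupil state: trust the pupil outline
    else:
        w_pupil, w_iris = 0.2, 0.8   # semi-bright pupil state: trust the iris outline
    p = np.asarray(pupil_based_center, float)
    q = np.asarray(iris_based_center, float)
    return tuple(w_pupil * p + w_iris * q)

print(fuse_center_estimates((100.0, 60.0), (102.0, 61.0), pupil_edges_found=False))
```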
  • FIG. 10 is a flowchart that illustrates an example of the processing procedure of line-of-sight detection.
  • First, the image-obtaining unit 111 obtains an image in which the face of the user is captured by the infrared camera 107b.
  • The eyeball-region-detecting unit 112 detects an eyeball region from the obtained image.
  • The corneal-reflection-detecting unit 113 detects the position of a corneal reflection from the eyeball region based on a brightness distribution of the eyeball region.
  • The edge-detecting unit 114 sets an edge detection region in the eyeball region.
  • The edge-detecting unit 114 detects edge portions by detecting differences in brightness from one end to the other end of the edge detection region.
  • Based on the detected edge portions, the edge-detecting unit 114 detects a pair of pupil edges (edge pair) and a pair of iris edges (edge pair).
  • The processing of step S14 will be described in detail below.
  • Step S15 The edge-detecting unit 114 sets the brightnesses of the pupil, the iris and the sclera in accordance with the number of edge pairs detected in step S14.
  • The brightnesses set in step S15 are used in the processing of step S16.
  • In the case where both the edges of the pupil and the edges of the iris are detected, the edge-detecting unit 114 sets an average value of the brightnesses between the edges of the pupil in the edge detection region as the brightness of the pupil.
  • The edge-detecting unit 114 sets an average value of the brightnesses between the edges of the pupil and the edges of the iris in the edge detection region as the brightness of the iris.
  • The edge-detecting unit 114 sets the brightness of the regions outside the edges of the iris in the edge detection region as the brightness of the sclera.
  • Alternatively, the brightness of the sclera may be set to a preset value.
  • In the case where only the edges of the iris are detected, the edge-detecting unit 114 sets the brightness of the iris and the brightness of the sclera using the same method as described above. In addition, the edge-detecting unit 114 sets the brightness of the pupil to the same value as that set for the brightness of the iris. Alternatively, the edge-detecting unit 114 sets the brightness close to the center of the region between the edges of the iris in the edge detection region as the brightness of the pupil.
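  • A small sketch of the step S15 brightness assignment, assuming the scan-line brightness profile and the detected edge positions are available; the index conventions (the pupil pair nested inside the iris pair, with edge indices marking the sample just before each brightness step) and the toy values are assumptions for illustration.

```python
import numpy as np

def set_region_brightnesses(profile, iris_pair, pupil_pair=None):
    """Return (pupil, iris, sclera) brightness estimates from a scan-line
    profile and the detected edge pairs, in the spirit of step S15."""
    il, ir = iris_pair
    sclera = float(np.r_[profile[:il], profile[ir + 1:]].mean())      # outside the iris edges
    if pupil_pair is not None:
        pl, pr = pupil_pair
        pupil = float(profile[pl + 1:pr + 1].mean())                   # between the pupil edges
        iris = float(np.r_[profile[il + 1:pl + 1], profile[pr + 1:ir + 1]].mean())
    else:
        iris = float(profile[il + 1:ir + 1].mean())                    # between the iris edges
        pupil = iris                                                   # only iris edges found
    return pupil, iris, sclera

profile = np.array([200]*5 + [120]*5 + [40]*6 + [120]*5 + [200]*5)
print(set_region_brightnesses(profile, iris_pair=(4, 20), pupil_pair=(9, 15)))  # (40.0, 120.0, 200.0)
```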
  • Step S16 The pupil-center-detecting unit 115 detects the position of the center of the pupil based on the brightnesses of the eyeball region. At this time, based on the detection result of edge pairs in step S14, the pupil-center-detecting unit 115 switches between and executes processing in which importance is placed upon the detection result of the outline of the pupil and processing in which importance is placed upon the detection result of the outline of the iris. The processing of step S16 will be described in detail below.
  • Finally, the line-of-sight-detecting unit 116 detects the line-of-sight direction of the user and the position of the gaze of the user based on the positional relationship between the position of the corneal reflection detected in step S12 and the position of the center of the pupil detected in step S16. Only one of the line-of-sight direction and the position of the gaze may be detected.
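  • To make the hand-off from step S14 to step S16 concrete, here is a small sketch of the branch in step S16. The EdgePairs holder and the function name are hypothetical, not names from the patent; only the rule (pupil edges found: use the pupil-outline result, otherwise use the iris-outline result) is taken from the description above.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class EdgePairs:
    """Hypothetical holder for the step S14 result."""
    iris_pair: Optional[Tuple[int, int]]
    pupil_pair: Optional[Tuple[int, int]]

def detect_center_of_pupil(edge_pairs: EdgePairs,
                           pupil_based_center: Tuple[float, float],
                           iris_based_center: Tuple[float, float]) -> Tuple[float, float]:
    """Step S16 branch: use the pupil-outline result when the pupil edges
    were found in step S14, otherwise fall back to the iris-outline result."""
    if edge_pairs.pupil_pair is not None:
        return pupil_based_center      # dark pupil state
    return iris_based_center           # semi-bright pupil state suspected

# Semi-bright case: only the iris edge pair was found in step S14.
pairs = EdgePairs(iris_pair=(4, 20), pupil_pair=None)
print(detect_center_of_pupil(pairs, (100.0, 60.0), (101.5, 60.5)))
```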
  • FIGs. 11 and 12 are flowcharts that illustrate a processing example 1-1 of edge pair detection.
  • the processing in FIGs. 11 and 12 corresponds to the processing of step S14 in FIG. 10 .
  • In this processing, a first stack, in which edge information regarding edge portions is stored, and a second stack, in which edge pair information regarding edge pairs is stored, are used.
  • The first stack and the second stack are implemented as storage regions in the RAM 102 of the line-of-sight detection device 100, for example. No information is stored in the first stack or the second stack when the processing of FIG. 11 is initiated.
  • Step S31 The edge-detecting unit 114 executes the processing from step S31 up to the end of the loop in step S40 for the entire edge detection region from one end to the other end of the edge detection region.
  • Step S32 The edge-detecting unit 114 sequentially executes processing of calculating differences in brightness toward the other end of the edge detection region, and upon detecting an edge portion, executes the processing of the subsequent step S33.
  • Step S33 The edge-detecting unit 114 determines whether the position of a detected edge is the position of a corneal reflection. In the case where the position of the detected edge is the position of a corneal reflection, the processing of step S34 is executed and in the case where the position of the detected edge is not the position of a corneal reflection, the processing of step S36 is executed.
  • Step S34 The edge-detecting unit 114 determines whether the difference in brightness between before and after the corneal reflection is equal to or higher than a prescribed threshold. In the case where the difference in brightness is equal to or higher than the threshold, the edge-detecting unit 114 determines that an edge exists in the region of a corneal reflection and the processing of step S35 is executed. On the other hand, in the case where the difference in brightness is less than the threshold, the edge-detecting unit 114 determines that an edge does not exist in the region of a corneal reflection and the processing of step S40 is executed. In the latter case, the region of the corneal reflection is regarded as a flat region in which an edge does not exist and the processing continues.
  • Step S35 The edge-detecting unit 114 regards the region of the corneal reflection as an edge and executes the processing of step S36.
  • Step S36 The edge-detecting unit 114 checks the direction of the change in brightness of the detected edge. In the case where the brightness changes in the direction of an increase, the processing of step S38 is executed, and in the case where the brightness changes in the direction of a decrease, the processing of step S37 is executed.
  • Step S37 In the case where the brightness decreases between before and after the edge, it is assumed that the position of an edge on the advancement-direction side among edges of the pupil has not been reached.
  • In this case, the edge-detecting unit 114 registers edge information regarding the detected edge in the first stack.
  • The edge information includes the position of the detected edge and brightness values before and after the edge.
  • Step S38 In the case where the brightness increases between before and after the edge, it is assumed that the position of the edge on the advancement-direction side among edges of the pupil has been reached or that that position has been passed. In this case, the edge-detecting unit 114 determines whether it is possible to make a pair with the detected edge and an edge out of the edges registered in the first stack.
  • Specifically, the edge-detecting unit 114 extracts, from among the edges registered in the first stack, edges for which the direction of the change in brightness is opposite to that of the detected edge (that is, edges for which the brightness decreases).
  • The edge-detecting unit 114 then compares the edge information of the extracted edges with the edge information of the detected edge and specifies, among the extracted edges, an edge that satisfies a prescribed number (one, two, or more) of the following conditions 1 to 3 as an edge that may form a pair with the detected edge.
  • The edge-detecting unit 114 executes the processing of step S39 in the case where there is an edge that may form a pair.
  • On the other hand, the edge-detecting unit 114 executes the processing of step S37 in the case where there is not an edge that may form a pair.
  • In that case, the edge information regarding the detected edge is registered in the first stack.
  • Step S39 The edge-detecting unit 114 forms an edge pair out of the detected edge and the edge specified in step S38 and registers edge pair information regarding this edge pair in the second stack.
  • The edge pair information includes the edge information of each of the paired edges and edge information regarding edges that exist singly between the paired edges in the edge detection region.
  • The latter edge information is edge information regarding edges that are not registered as part of an edge pair and is extracted from the first stack.
  • Step S40 The edge-detecting unit 114 repeats execution of the processing from step S31 in the case where the search for edges up to the end of the edge detection region is not finished yet. On the other hand, the edge-detecting unit 114 executes the processing of step S41 in FIG. 12 in the case where the search for edges up to the end of the edge detection region is finished.
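  • The loop of steps S31 to S40 can be sketched as follows. This is an illustrative reading of the flowchart, not the claimed implementation: the first stack holds unmatched falling edges, the second stack holds matched pairs, the corneal-reflection handling is reduced to a single brightness-difference test over the reflection region, and conditions 1 to 3 are collapsed into one step-magnitude tolerance combined with last-in-first-out matching.

```python
import numpy as np

def detect_edge_pairs(profile, reflection_span=None,
                      grad_thresh=20, reflection_thresh=30, step_tol=15):
    """Single pass over the scan line (steps S31 to S40): falling edges go
    onto the first stack; a rising edge is paired with a compatible falling
    edge and the pair is pushed onto the second stack."""
    profile = profile.astype(np.int32)
    first_stack, second_stack = [], []          # unmatched edges / edge pairs
    i = 0
    while i < len(profile) - 1:
        if reflection_span and reflection_span[0] <= i < reflection_span[1]:
            # Steps S33/S34: brightness change across the whole reflection region.
            lo, hi = reflection_span
            step = int(profile[min(hi, len(profile) - 1)]) - int(profile[max(lo - 1, 0)])
            i = hi
            if abs(step) < reflection_thresh:
                continue                        # flat portion, not an edge
        else:
            step = int(profile[i + 1]) - int(profile[i])
            i += 1
            if abs(step) < grad_thresh:
                continue                        # no edge at this position (step S32)
        pos = i - 1
        if step < 0:                            # steps S36/S37: brightness decreases
            first_stack.append((pos, -step))
        else:                                   # step S38: brightness increases
            for k in range(len(first_stack) - 1, -1, -1):   # most recent first (LIFO)
                fpos, fstep = first_stack[k]
                if abs(fstep - step) <= step_tol:
                    second_stack.append((fpos, pos))        # step S39
                    del first_stack[k]
                    break
            else:
                first_stack.append((pos, step))             # no partner found
    return second_stack

profile = np.array([200]*5 + [120]*5 + [40]*6 + [120]*5 + [200]*5)
print(detect_edge_pairs(profile))  # [(9, 15), (4, 20)]: inner (pupil) pair, then outer (iris) pair
```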
  • Step S41 The edge-detecting unit 114 determines whether an edge pair is registered in the second stack. In the case where an edge pair is registered, the processing of step S43 is executed, and in the case where an edge pair is not registered, the processing of step S42 is executed.
  • Step S42 The edge-detecting unit 114 notifies the pupil-center-detecting unit 115 that the number of detected edge pairs is 0. In this case, neither the edges of the pupil nor the edges of the iris have been detected by the edge-detecting unit 114 and there is a low probability that the pupil-center-detecting unit 115 will be able to normally detect the outline of the pupil and the outline of the iris. Consequently, the pupil-center-detecting unit 115 outputs the occurrence of a detection error to the display device 104a and so forth and finishes the line-of-sight detection processing.
  • Step S43 The edge-detecting unit 114 determines whether there are two or more edge pairs registered in the second stack and whether both the edges included in one edge pair exist between the edges included in the other edge pair. In the case where this condition is satisfied, the processing of step S45 is executed, and in the case where this condition is not satisfied, the processing of step S44 is executed.
  • Step S44 The edge-detecting unit 114 notifies the pupil-center-detecting unit 115 that the number of detected edge pairs is one. In addition, the edge-detecting unit 114 notifies the pupil-center-detecting unit 115 of the edge pair information of the one detected edge pair as information regarding the edges of the iris. In the case where a plurality of pieces of edge pair information are registered in the second stack, the edge-detecting unit 114 selects and notifies the pupil-center-detecting unit 115 of the edge pair information of the edge pair having the longest distance between the edges.
  • In this case, the pupil-center-detecting unit 115 determines that the edges of the iris have been detected but the edges of the pupil have not been detected by the edge-detecting unit 114. In step S15 of FIG. 10, the pupil-center-detecting unit 115 sets the brightness on the low-brightness side of the edges to the brightness of the iris and sets the brightness on the high-brightness side of the edges to the brightness of the sclera, based on the brightness values for before and after the edges included in the notified edge pair information.
  • Step S45 The edge-detecting unit 114 notifies the pupil-center-detecting unit 115 that the number of detected edge pairs is two. In addition, the edge-detecting unit 114 specifies two edge pairs that satisfy the condition of the determination of step S43. Among these edge pairs, the edge-detecting unit 114 notifies the pupil-center-detecting unit 115 of the edge pair information of the edge pair for which the distance between the edges is longer as information regarding the edges of the iris, and notifies the pupil-center-detecting unit 115 of the edge pair information of the edge pair for which the distance between the edges is shorter as information regarding the edges of the pupil.
  • In this case, the pupil-center-detecting unit 115 determines that both the edges of the pupil and the edges of the iris have been detected by the edge-detecting unit 114. In step S15 of FIG. 10, the pupil-center-detecting unit 115 sets the brightness on the low-brightness sides of the edges as the brightness of the pupil based on brightness values before and after the edges included in the edge pair information notified as the information regarding the edges of the pupil.
  • In addition, the pupil-center-detecting unit 115 sets the brightness on the low-brightness side of the edges to the brightness of the iris and sets the brightness on the high-brightness side of the edges to the brightness of the sclera, based on the brightness values for before and after the edges included in the edge pair information notified as the information regarding the edges of the iris.
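  • The decisions of steps S41 to S45 can be summarized with the sketch below, which takes a pair list such as the one produced by the step S14 search and reports how many usable pairs there are and which pair is treated as the pupil and which as the iris. The nesting test and the preference for the widest pair follow the description above; the dictionary format and everything else are assumptions.

```python
def classify_edge_pairs(pairs):
    """Steps S41 to S45: 0 pairs -> detection error, 1 pair -> iris edges
    only (semi-bright pupil suspected), 2 nested pairs -> inner pair is the
    pupil and outer pair is the iris."""
    if not pairs:
        return {"num_pairs": 0}                           # step S42: detection error
    # Look for two pairs where one lies strictly inside the other (step S43).
    for outer in pairs:
        for inner in pairs:
            if inner is not outer and outer[0] < inner[0] and inner[1] < outer[1]:
                return {"num_pairs": 2,                   # step S45
                        "pupil_pair": inner, "iris_pair": outer}
    # Otherwise report a single pair; prefer the widest one as the iris (step S44).
    widest = max(pairs, key=lambda p: p[1] - p[0])
    return {"num_pairs": 1, "iris_pair": widest}

print(classify_edge_pairs([(9, 15), (4, 20)]))   # both pupil and iris edges found
print(classify_edge_pairs([(4, 20)]))            # only iris edges: semi-bright pupil suspected
print(classify_edge_pairs([]))                   # nothing found: detection error
```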
  • As described above, the line-of-sight detection device 100 is able to determine whether the semi-bright pupil state exists by searching for the edges of the pupil and the edges of the iris through detection of differences in brightness in the edge detection region.
  • With this method, the determination accuracy may be improved compared with a method in which the determination is made based on a result of estimating the distance from the infrared camera 107b to the face of the user. This is because, in addition to the accuracy with which the distance from the infrared camera 107b to the face of the user is estimated being low, the edge-based method makes it possible to determine whether the semi-bright pupil state exists regardless of individual differences in brightness between irises.
  • The reflectance, transmittance, and absorption of infrared light in the iris chiefly depend on the amount of melanin contained in the iris, and there are individual differences in the amount of melanin contained in irises. Therefore, there are individual differences in brightness between irises. Consequently, even if the distance between the infrared light 107a and the infrared camera 107b remains the same, the difference in brightness between the pupil and the iris and the difference in brightness between the iris and the sclera vary from person to person.
  • FIGs. 13 and 14 are flowcharts that illustrate a processing example 1-2 of edge pair detection.
  • the processing in FIGs. 13 and 14 corresponds to the processing of step S14 in FIG. 10 .
  • This processing example 1-2 differs from the processing example 1-1 in that a determination is made as to whether the region of a corneal reflection is to be regarded as an edge based on the positional relationship between the corneal reflection and another detected edge in the case where there is a corneal reflection in the edge detection region.
  • In processing example 1-2, the processing of steps S33 to S35 out of the processing illustrated in FIG. 11 is replaced with the processing of step S33a as illustrated in FIG. 13.
  • Step S33a The edge-detecting unit 114 determines whether the position of a detected edge is the position of a corneal reflection. In the case where the position of the detected edge is the position of a corneal reflection, the processing of step S40 is executed. In this case, determination as to whether the position of the corneal reflection is to be regarded as an edge is performed in the processing of FIG. 14 . On the other hand, in the case where the position of the detected edge is not the position of a corneal reflection, the processing of step S36 is executed.
  • Step S51 The edge-detecting unit 114 determines whether an edge pair is registered in the second stack. In the case where an edge pair is registered, the processing of step S56 is executed, and in the case where an edge pair is not registered, the processing of step S52 is executed.
  • Step S52 When the edge-detecting unit 114 regards the position of the corneal reflection as an edge in the case where there is a corneal reflection in the edge detection region, the edge-detecting unit 114 determines whether there is an another edge that may form a pair with that edge.
  • a suitable brightness range for the brightness values before and after an iris edge and a suitable distance range for the distance between iris edges are stipulated in advance.
  • the edge-detecting unit 114 extracts an edge for which the brightness values before and after the edge fall within the stipulated brightness range from among edges registered in the first stack.
  • the edge-detecting unit 114 determines whether the distance between the extracted edge and the corneal reflection falls within the stipulated distance range. In the case where the distance does fall within the stipulated range, the edge-detecting unit 114 determines that an iris edge exists at the position of the corneal reflection. In this case, when the position of the corneal reflection is regarded as an edge, it is determined that this edge and the extracted edge may form a pair.
  • In the case where it is determined that there is an edge that may form a pair, the processing of step S54 is executed, and in the case where it is determined that there is not an edge that may form a pair, the processing of step S53 is executed. In the case where a corneal reflection does not exist in the edge detection region, the processing of step S53 is executed unconditionally.
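  • The following Python sketch gives a rough illustration of the check made in step S52; the Edge structure, the brightness and distance ranges, and the function name are hypothetical assumptions and are not taken from this description:

```python
# Hypothetical sketch of the check in step S52: can the corneal reflection be
# regarded as an iris edge that forms a pair with an already detected edge?
# The brightness and distance ranges are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Edge:
    x: int          # position along the edge detection region
    before: float   # brightness just before the edge
    after: float    # brightness just after the edge

IRIS_BRIGHTNESS_RANGE = (40.0, 120.0)  # stipulated brightness range around an iris edge
IRIS_DISTANCE_RANGE = (15, 60)         # stipulated range for the distance between iris edges

def pairs_with_corneal_reflection(first_stack, reflection_x):
    """Return an edge from the first stack that could pair with the corneal
    reflection regarded as an iris edge, or None if no such edge exists."""
    lo_b, hi_b = IRIS_BRIGHTNESS_RANGE
    lo_d, hi_d = IRIS_DISTANCE_RANGE
    for edge in first_stack:
        # (a) brightness before and after the edge falls in the stipulated range
        if not (lo_b <= edge.before <= hi_b and lo_b <= edge.after <= hi_b):
            continue
        # (b) distance to the corneal reflection falls in the stipulated range
        if lo_d <= abs(edge.x - reflection_x) <= hi_d:
            return edge
    return None

print(pairs_with_corneal_reflection([Edge(30, 55.0, 95.0)], reflection_x=70))
```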
  • Step S53 The edge-detecting unit 114 notifies the pupil-center-detecting unit 115 that the number of detected edge pairs is zero, as in step S42 of FIG. 12.
  • Step S54 The edge-detecting unit 114 regards the position of the corneal reflection as an edge and registers, in the second stack, edge pair information including information regarding this edge and edge information regarding the edge extracted in step S52 as an edge capable of forming a pair.
  • Step S55 The same processing as in step S44 of FIG. 12 is executed. That is, the edge-detecting unit 114 notifies the pupil-center-detecting unit 115 that the number of detected edge pairs is one. In addition, the edge-detecting unit 114 notifies the pupil-center-detecting unit 115 of the edge pair information of the one detected edge pair as information regarding the edges of the iris.
  • Step S56 The edge-detecting unit 114 determines whether there are two or more edge pairs registered in the second stack and whether both the edges included in one edge pair exist between the edges included in the other edge pair. In the case where this condition is satisfied, the processing of step S60 is executed, and in the case where this condition is not satisfied, the processing of step S57 is executed.
  • Step S57 When the edge-detecting unit 114 regards the position of the corneal reflection as an edge in the case where there is a corneal reflection in the edge detection region, the edge-detecting unit 114 determines whether there is another edge that may form a pair with that edge.
  • the edge-detecting unit 114 executes the following processing.
  • the edge-detecting unit 114 calculates the distance between the one edge that is closer to the inner edge, among the edges included in the edge pair, and the inner edge (first distance).
  • the edge-detecting unit 114 calculates the distance between the other edge included in the edge pair and the corneal reflection (second distance).
  • the edge-detecting unit 114 determines that a pupil edge exists at the position of the corneal reflection in the case where the first distance and the second distance are equal to or less than prescribed thresholds and where, among the brightness values before and after the inner edge and the corneal reflection, the difference between the brightness values on the high-brightness side and the difference between the brightness values on the low-brightness side are each less than a fixed value.
  • When the position of the corneal reflection is regarded as an edge, it is determined that this edge and the inner edge may form a pair.
  • the edge-detecting unit 114 executes the following processing.
  • the edge-detecting unit 114 calculates the distance between the one edge that is closer to the outer edge, among the edges included in the edge pair, and the outer edge (third distance).
  • the edge-detecting unit 114 calculates the distance between the other edge included in the edge pair and the corneal reflection (fourth distance).
  • the edge-detecting unit 114 determines that an iris edge exists at the position of the corneal reflection in the case where the third distance and the fourth distance are equal to or less than prescribed thresholds and where, among the brightness values before and after the outer edge and the corneal reflection, the difference between the brightness values on the high-brightness side and the difference between the brightness values on the low-brightness side are each less than a fixed value.
  • When the position of the corneal reflection is regarded as an edge, it is determined that this edge and the outer edge may form a pair.
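  • A minimal sketch of the distance and brightness tests described for step S57, assuming a simple one-dimensional edge representation and illustrative thresholds, might look as follows; the names and numeric values are not taken from this description:

```python
# Hypothetical sketch of the step S57 tests: 'pair' is the already registered
# edge pair, 'single' is the unpaired inner (or outer) edge, and 'reflection'
# is the corneal reflection treated as a candidate edge.
from collections import namedtuple

Edge = namedtuple("Edge", "x before after")

DIST_THRESHOLD = 10          # prescribed threshold for the paired distances (px)
BRIGHTNESS_TOLERANCE = 20.0  # fixed value for the brightness-difference checks

def reflection_completes_pair(pair, single, reflection,
                              dist_threshold=DIST_THRESHOLD,
                              tol=BRIGHTNESS_TOLERANCE):
    """Return True if an edge may be assumed at the corneal reflection such
    that it forms a pair with 'single' (the pupil-edge case and the iris-edge
    case are symmetric; only the inputs differ)."""
    near, far = sorted(pair, key=lambda e: abs(e.x - single.x))
    first_distance = abs(near.x - single.x)      # first (or third) distance
    second_distance = abs(far.x - reflection.x)  # second (or fourth) distance
    if first_distance > dist_threshold or second_distance > dist_threshold:
        return False
    # brightness on the high side and on the low side must roughly agree
    high_diff = abs(max(single.before, single.after) -
                    max(reflection.before, reflection.after))
    low_diff = abs(min(single.before, single.after) -
                   min(reflection.before, reflection.after))
    return high_diff < tol and low_diff < tol

pair = (Edge(30, 55, 95), Edge(75, 95, 55))
single = Edge(28, 20, 55)        # inner edge close to the left edge of the pair
reflection = Edge(78, 22, 57)    # corneal reflection close to the right edge
print(reflection_completes_pair(pair, single, reflection))   # True
```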
  • In the case where it is determined that there is an edge that may form a pair, the processing of step S58 is executed, and in the case where it is determined that there is not an edge that may form a pair, the processing of step S55 is executed. In the case where a corneal reflection does not exist in the edge detection region, the processing of step S55 is executed unconditionally.
  • Step S58 The edge-detecting unit 114 regards the position of the corneal reflection as an edge and registers, in the second stack, edge pair information including information regarding this edge and edge information regarding the edge extracted in step S57 as an edge capable of forming a pair.
  • Step S59 In this state, two or more edge pairs are registered in the second stack.
  • the edge-detecting unit 114 determines whether both the edges included in one edge pair exist between the edges included in the other edge pair among the edge pairs registered in the second stack. In the case where this condition is satisfied, the processing of step S60 is executed, and in the case where this condition is not satisfied, the processing of step S55 is executed.
  • Step S60 The edge-detecting unit 114 notifies the pupil-center-detecting unit 115 that the number of detected edge pairs is two, as in step S45 of FIG. 12.
  • the edge-detecting unit 114 specifies two edge pairs that satisfy the condition of the determination of step S56 or step S59.
  • the edge-detecting unit 114 notifies the pupil-center-detecting unit 115 of the edge pair information of the edge pair whose edges are farther apart as information regarding the edges of the iris, and notifies the pupil-center-detecting unit 115 of the edge pair information of the edge pair whose edges are closer together as information regarding the edges of the pupil.
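  • The classification performed in step S60 might be sketched as follows; the function name and the tuple representation of an edge pair are assumptions made only for illustration:

```python
# Hypothetical sketch of step S60: of two nested edge pairs, the pair whose
# edges are farther apart is reported as the iris edges and the pair whose
# edges are closer together as the pupil edges.
def classify_edge_pairs(pair_a, pair_b):
    """pair_a and pair_b are (x_left, x_right) positions; returns a dict, or
    None if neither pair lies inside the other (condition of steps S56/S59)."""
    (a1, a2), (b1, b2) = sorted(pair_a), sorted(pair_b)
    if a1 < b1 and b2 < a2:        # pair_b is nested inside pair_a
        outer, inner = (a1, a2), (b1, b2)
    elif b1 < a1 and a2 < b2:      # pair_a is nested inside pair_b
        outer, inner = (b1, b2), (a1, a2)
    else:
        return None                # pairs are not nested: condition not met
    return {"iris": outer, "pupil": inner}

print(classify_edge_pairs((10, 90), (35, 60)))
# {'iris': (10, 90), 'pupil': (35, 60)}
```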
  • the iris edges are detected by utilizing the fact that the sclera is visible on both sides of the iris in the eyeball region.
  • When the eyeball is turned to the side, only one of the sclera regions between which the iris is interposed may be visible in the eyeball region.
  • Consequently, there is a possibility that the detected edge pair are the edges of the pupil when the eyeball is turned to the side.
  • the edge-detecting unit 114 may set the brightnesses of the pupil, iris, and sclera using the following method in step S15 of FIG. 10 in the case where only one edge pair is detected.
  • the edge-detecting unit 114 detects edge lines (outlines), which include edges that form pairs, from the eyeball region.
  • the edge-detecting unit 114 fills the inner region enclosed by the detected edge lines with the brightness found in the vicinity of the edge lines inside that region.
  • the edge-detecting unit 114 may determine that the region is the pupil and that the selected edge is a pupil edge. In this case, the edge-detecting unit 114 sets the brightness on the low-brightness side of the selected edge to the brightness of the pupil and sets the brightness on the high-brightness side of the selected edge to the brightness of the iris. In addition, the edge-detecting unit 114 scans the edge detection region from an edge of the pupil to outside the pupil and may determine a position where the brightness increases by a prescribed value or more to be an iris edge. In this case, the brightness outside the determined iris edge is set to the brightness of the sclera.
  • the edge-detecting unit 114 sets the brightness of the inner region inside the edge line to the brightness of the iris and sets the brightness of the outer region outside the edge line to the brightness of the sclera.
  • the brightness of the pupil is set to the same value as the brightness of the iris.
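  • As a rough one-dimensional illustration of this brightness-setting method, the following sketch assumes a brightness profile along the edge detection region; the prescribed step value, the profile values and the function name are invented for illustration:

```python
# Hypothetical 1-D sketch: the low-brightness side of the detected edge is
# taken as the pupil, the high-brightness side as the iris, and the profile is
# scanned outward until the brightness rises by a prescribed step, which is
# treated as the iris edge; the brightness beyond it is taken as the sclera.
def set_region_brightness(profile, pupil_edge_idx, step=30):
    """profile: brightness values along the edge detection region (scanning to
    the right); pupil_edge_idx: index just after the detected pupil edge."""
    pupil = profile[pupil_edge_idx - 1]   # low-brightness side of the edge
    iris = profile[pupil_edge_idx]        # high-brightness side of the edge
    sclera = iris                         # fallback: no iris edge found
    for i in range(pupil_edge_idx, len(profile) - 1):
        if profile[i + 1] - iris >= step:     # brightness jumps by >= step
            sclera = profile[i + 1]
            break
    return pupil, iris, sclera

profile = [20, 20, 22, 80, 82, 84, 85, 180, 185]   # pupil -> iris -> sclera
print(set_region_brightness(profile, pupil_edge_idx=3))   # (22, 80, 180)
```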
  • FIG. 15 is a flowchart that illustrates a processing example 2-1 of processing of detecting the position of the center of a pupil.
  • the processing in FIG. 15 corresponds to the processing of step S16 in FIG. 10 .
  • the pupil-center-detecting unit 115 calculates a brightness difference Dpi between the pupil and the iris and a brightness difference Dis between the iris and the sclera based on the brightnesses of the pupil, the iris and the sclera notified from the edge-detecting unit 114 in step S15 of FIG. 10 .
  • the pupil-center-detecting unit 115 determines whether a state exists where the outline of the pupil may be accurately detected, that is, whether the dark pupil state exists, based on the brightness differences Dpi and Dis. For example, the pupil-center-detecting unit 115 determines that a state in which the outline of the pupil may be accurately detected does exist when an expression Dpi > Dis - A is satisfied.
  • A is a prescribed bias value and is set to a value greater than 0.
  • In the case where a state exists in which the outline of the pupil may be accurately detected, the processing of step S73 is executed, and otherwise, the processing of step S74 is executed.
  • In step S72, when the edges of the pupil are not detected by the edge-detecting unit 114, the brightness difference Dpi is 0 and it is determined that a state in which the outline of the pupil may be accurately detected does not exist. In addition, even in the case where the edges of the pupil are detected by the edge-detecting unit 114, it is determined that a state in which the outline of the pupil may be accurately detected does not exist when the brightness difference Dpi between the pupil and the iris is not sufficiently larger than the brightness difference Dis between the iris and the sclera.
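  • A minimal sketch of this determination, assuming illustrative brightness values and an illustrative value for the bias A, might look as follows:

```python
# Minimal sketch of the branch decided in step S72 of FIG. 15, using the
# brightness differences Dpi and Dis and the bias A from the text; the
# numeric values are assumptions chosen for illustration only.
A = 10.0   # prescribed bias value (> 0)

def dark_pupil_state(pupil, iris, sclera, bias=A):
    """Return True if a state exists in which the pupil outline may be
    accurately detected (the dark pupil state)."""
    dpi = abs(iris - pupil)    # brightness difference between pupil and iris
    dis = abs(sclera - iris)   # brightness difference between iris and sclera
    return dpi > dis - bias    # expression given in the text for step S72

# True  -> pupil outline is used (step S73); False -> iris outline (step S74)
print(dark_pupil_state(pupil=20, iris=90, sclera=150))   # True
print(dark_pupil_state(pupil=85, iris=90, sclera=170))   # False
```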
  • the pupil-center-detecting unit 115 detects the outline of the pupil from the eyeball region by using pupil outline detection templates.
  • the pupil-center-detecting unit 115 detects the position of the center of the pupil based on the detected outline of the pupil.
  • the pupil-center-detecting unit 115 detects the outline of the iris from the eyeball region by using iris outline detection templates.
  • the pupil-center-detecting unit 115 detects the position of the center of the pupil based on the detected outline of the iris.
  • In this manner, processing in which the position of the center of the pupil is detected based on the outline of the pupil and processing in which the position of the center of the pupil is detected based on the outline of the iris are switched between in accordance with the brightness differences Dpi and Dis, which are based on a detection result of the edge-detecting unit 114.
  • In step S72, for example, the processing of step S73 may be executed in the case where the edges of the pupil are detected by the edge-detecting unit 114, and the processing of step S74 may be executed in the case where the edges of the pupil are not detected by the edge-detecting unit 114.
  • FIG. 16 is a flowchart that illustrates a processing example 2-2 of processing of detecting the position of the center of a pupil.
  • the processing in FIG. 16 corresponds to the processing of step S16 in FIG. 10 .
  • the pupil-center-detecting unit 115 detects the outline of the iris from the eyeball region by using iris outline detection templates.
  • the pupil-center-detecting unit 115 detects the position of the center of the pupil based on the detected outline of the iris.
  • the pupil-center-detecting unit 115 detects the outline of the pupil from the eyeball region by using pupil outline detection templates.
  • the pupil-center-detecting unit 115 detects the position of the center of the pupil based on the detected outline of the pupil.
  • the pupil-center-detecting unit 115 calculates a brightness difference Dpi between the pupil and the iris and a brightness difference Dis between the iris and the sclera based on the brightnesses of the pupil, the iris and the sclera notified from the edge-detecting unit 114 in step S15 of FIG. 10 .
  • the pupil-center-detecting unit 115 calculates a likelihood for each of the detection results of steps S81 and S82. For example, the pupil-center-detecting unit 115 calculates likelihoods for the detection results based on matching evaluation values between the image and the templates when the outlines of the iris and the pupil are detected in steps S81 and S82 and likelihoods given in advance to the templates used when detecting outlines.
  • the matching evaluation values between the image and the templates are, for example, calculated based on the degree of agreement between the image and the templates and the difference between an average brightness inside the detected outlines and a predetermined brightness.
  • the pupil-center-detecting unit 115 weights the likelihoods in accordance with the brightness differences Dpi and Dis. At this time, the pupil-center-detecting unit 115, for example, increases the weight of the likelihood of the detection result obtained in step S82 as the brightness difference Dpi becomes larger relative to the brightness difference Dis.
  • the ratio between the brightness difference Dis and the brightness difference Dpi used in this weighting serves as a continuous evaluation value that expresses whether the edges of the pupil have been detected, and this evaluation value indicates to what degree of accuracy the outline of the pupil may be detected.
  • Step S85 The pupil-center-detecting unit 115 outputs the detection result having the higher likelihood, from among the detection results of steps S81 and S82, as a final detection result.
  • each detection result may be output together with the likelihood.
  • the likelihoods of a detection result of the position of the center of the pupil based on the outline of the pupil and a detection result of the position of the center of the pupil based on the outline of the iris are weighted in accordance with the brightness differences Dpi and Dis based on a detection result of the edge-detecting unit 114.
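  • A compact sketch of this weighting and selection, assuming a simple form for the weight derived from Dpi and Dis, might look as follows; the weight formula and function name are assumptions, not taken from this description:

```python
# Hypothetical sketch of steps S83 to S85: the likelihoods of the two
# detection results are weighted according to how large Dpi is relative to
# Dis, and the result with the higher weighted likelihood is selected.
def select_pupil_center(result_iris, result_pupil, dpi, dis, eps=1e-6):
    """Each result is (center_xy, likelihood); returns the chosen result."""
    w = dpi / (dpi + dis + eps)                 # continuous evaluation value
    weighted_pupil = result_pupil[1] * w        # result of step S82
    weighted_iris = result_iris[1] * (1.0 - w)  # result of step S81
    return result_pupil if weighted_pupil >= weighted_iris else result_iris

# Pupil edges were not detected, so Dpi is small and the iris-based result wins.
print(select_pupil_center(((64, 48), 0.7), ((63, 47), 0.6), dpi=5, dis=70))
# ((64, 48), 0.7)
```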
  • FIG. 17 illustrates an example of the relationship between templates and likelihood. Likelihoods are assigned in advance to the pupil outline detection templates for every radius of circle included in the templates.
  • the curve of a function Rpupil illustrated in graph 281 of FIG. 17 illustrates the relationship between the radii of the circles included in the pupil outline detection templates and likelihood. For example, when the outline of the pupil is detected by determining that the image of the eyeball region and a certain template are similar to each other, the likelihood assigned to the used template represents the reliability of the detection result of this outline.
  • likelihoods are similarly assigned in advance to the iris outline detection templates for every radius of circle included in the templates.
  • the curve of a function Riris illustrated in graph 281 of FIG. 17 illustrates the relationship between the radii of the circles included in the iris outline detection templates and likelihood. Since pupils have smaller radii than irises, the likelihood is set to be higher for the pupil outline detection templates in the range of small circle radii, as illustrated in graph 281.
  • the pupil-center-detecting unit 115 weights the function Rpupil and the function Riris in accordance with a function f(W) calculated from the brightness differences Dpi and Dis based on the detection result of the edge-detecting unit 114, for example.
  • Graph 282 of FIG. 17 illustrates the curves of the function Rpupil and the function Riris after having been weighted with the function f(W).
  • the function f(W), for example, increases the weight of the function Rpupil and decreases the weight of the function Riris as the brightness difference Dpi becomes larger relative to the brightness difference Dis.
  • the graph 282 is an example of a case in which the edges of the pupil are not detected by the edge-detecting unit 114 and the brightness difference Dpi is smaller than the brightness difference Dis; in this case, the largest output of the function Riris is larger than the largest output of the function Rpupil.
  • the detection result of the position of the center of the pupil based on the outline of the iris has a higher likelihood than the detection result of the position of the center of the pupil based on the outline of the pupil.
  • the pupil-center-detecting unit 115 may restrict what templates are used in detection of the outlines. For example, the pupil-center-detecting unit 115 compares the output values of the function Rpupil and the function Riris after weighting and a prescribed threshold Th. Based on the function Rpupil, the pupil-center-detecting unit 115 detects the outline of the pupil by using just the templates having a higher likelihood than the threshold Th, among the pupil outline detection templates. In addition, based on the function Riris, the pupil-center-detecting unit 115 detects the outline of the iris by using just the templates having a higher likelihood than the threshold Th, among the iris outline detection templates.
  • outline detection is performed using relatively more pupil outline detection templates than iris outline detection templates.
  • outline detection is performed using relatively more iris outline detection templates than pupil outline detection templates.
  • in the example of graph 282, the largest output of the function Rpupil is equal to or less than the threshold Th, and therefore, substantially, only detection of the outline of the iris is performed and detection of the outline of the pupil is not performed.
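  • The restriction of templates by the threshold Th might be sketched as follows; the likelihood curves, the form of the weighting factor and the threshold value are illustrative assumptions only:

```python
# Rough sketch of the idea in FIG. 17: a likelihood is assigned to every
# template radius (functions Rpupil and Riris), the likelihoods are weighted
# by a factor derived from Dpi and Dis, and only templates whose weighted
# likelihood exceeds the threshold Th are used for matching.
PUPIL_TEMPLATES = {r: 1.0 - r / 12.0 for r in range(2, 12)}            # Rpupil
IRIS_TEMPLATES = {r: 1.0 - abs(r - 14) / 10.0 for r in range(8, 20)}   # Riris
TH = 0.35

def usable_templates(templates, weight, th=TH):
    """Return the radii whose weighted likelihood is larger than Th."""
    return [r for r, likelihood in templates.items() if likelihood * weight > th]

dpi, dis = 5.0, 70.0               # pupil edges not detected, so Dpi is small
w = dpi / (dpi + dis)              # assumed form of the weighting factor f(W)
print(usable_templates(PUPIL_TEMPLATES, w))         # []  (pupil templates dropped)
print(usable_templates(IRIS_TEMPLATES, 1.0 - w))    # all iris radii in this example remain
```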
  • FIG. 18 is a flowchart that illustrates a processing example 2-3 of processing of detecting the position of the center of the pupil.
  • the processing in FIG. 18 corresponds to the processing of step S16 in FIG. 10 .
  • In FIG. 18, a case is illustrated in which the templates to be used in outline detection are restricted based on the function Rpupil and the function Riris after the functions have been weighted.
  • the pupil-center-detecting unit 115 weights the likelihoods assigned in advance to the templates used to detect the outline of the iris. For example, the pupil-center-detecting unit 115 makes the weight smaller as the brightness difference Dpi becomes larger relative to the brightness difference Dis.
  • the pupil-center-detecting unit 115 determines that templates for which the likelihood after the weighting in step S91 is larger than a threshold Th are templates to be used in outline detection from among the iris outline detection templates.
  • the pupil-center-detecting unit 115 detects the outline of the iris from the eyeball region by using the templates determined in step S92.
  • the pupil-center-detecting unit 115 detects the position of the center of the pupil based on the detected outline of the iris.
  • the pupil-center-detecting unit 115 weights the likelihoods assigned in advance to the templates used to detect the outline of the pupil. For example, the pupil-center-detecting unit 115 makes the weight larger as the brightness difference Dpi becomes larger relative to the brightness difference Dis.
  • the pupil-center-detecting unit 115 determines that templates for which the likelihood after the weighting in step S94 is larger than the threshold Th are templates to be used in outline detection from among the pupil outline detection templates.
  • the pupil-center-detecting unit 115 detects the outline of the pupil from the eyeball region by using the templates determined in step S95.
  • the pupil-center-detecting unit 115 detects the position of the center of the pupil based on the detected outline of the pupil.
  • the pupil-center-detecting unit 115 calculates a likelihood for each of the detection results of steps S93 and S96. For example, the pupil-center-detecting unit 115 calculates likelihoods for the detection results based on matching evaluation values between the image and the templates when the outlines of the iris and the pupil are detected in steps S93 and S96 and likelihoods given in advance to templates used when detecting the outlines. The likelihoods weighted in steps S91 and S94 are used as the likelihoods to be given to the templates.
  • the pupil-center-detecting unit 115 outputs the detection result having the higher likelihood, from among the detection results of steps S93 and S96, as a final detection result.
  • each detection result may be output together with the likelihood.
  • the likelihoods assigned to the templates are adjusted in accordance with the detection result of the edge-detecting unit 114.
  • the templates used when detecting outlines are restricted to the templates having high likelihoods and therefore the number of times matching is performed between the templates and the image is reduced.
  • the detection processing load may be lightened while maintaining outline detection accuracy.
  • an image characteristic amount corresponding to the dark pupil state and an image characteristic amount corresponding to the semi-bright pupil state may be used in the processing of detecting the outline of the pupil and the processing of detecting the outline of the iris.
  • the image characteristic amounts are generated from an image captured in a dark pupil state and an image captured in a semi-bright pupil state through learning, for example.
  • likelihoods are assigned to each image characteristic amount and the likelihoods are weighted in accordance with the edge detection result obtained by the edge-detecting unit 114.
  • the processing functions of the devices described in each of the above embodiments may be implemented by a computer.
  • a program in which the processing contents of the functions to be possessed by the devices are written is supplied, and the program is executed on a computer; as a result, the processing functions are implemented by the computer.
  • the program in which the processing contents are written may be recorded on a recording medium that is readable by the computer. Examples of a recording medium that is readable by a computer include a magnetic storage device, an optical disc, a magneto-optical recording medium, and a semiconductor memory.
  • Examples of a magnetic recording device include a hard disk drive (HDD), a flexible disk (FD), and a magnetic tape.
  • Examples of an optical disc include a digital versatile disc (DVD), a DVD-RAM, a compact disc-read only memory (CD-ROM), and a CD-recordable (CD-R)/rewritable (CD-RW).
  • Examples of a magneto-optical recording medium include a magneto-optical disk (MO).
  • a portable recording medium such as a DVD or a CD-ROM on which the program is recorded is sold, for example.
  • the computer that is to execute the program stores the program, which is recorded on a portable recording medium or has been transferred from a server computer, on its own storage device, for example.
  • the computer reads the program out from its own storage device and executes processing in accordance with the program. It is also possible for the computer to directly read the program out from the portable recording medium and execute processing in accordance with the program. Furthermore, it is also possible for the computer to execute processing in accordance with a successively received program each time the program is transferred from a server computer connected via a network.

Claims (9)

  1. Line-of-sight detection method executed by a computer, the line-of-sight detection method comprising:
    detecting (S11) an eyeball region (10) of a subject from an image;
    determining (S14), based on a brightness change in a detection region (13) that lies in the eyeball region (10) and has a line shape or a band shape, whether a position of a boundary (11a, 11b) between a pupil (11) and an iris (12) is detected in the detection region (13);
    executing (S16), in accordance with a result of the determining, both
    first processing in which a position of a center of the pupil (11) is detected based on the boundary between the pupil (11) and the iris (12), the boundary between the pupil (11) and the iris (12) being detected using a plurality of first templates (271a to 271c), and
    second processing in which the position of the center of the pupil (11) is detected based on the boundary between the iris (12) and the sclera, the boundary between the iris (12) and the sclera being detected using a plurality of second templates (272a to 272c) that differ from the first templates (271a to 271c);
    weighting a first processing result obtained by the first processing more heavily than a second processing result obtained by the second processing when it has been determined (S14) that the position of the boundary (11a, 11b) is detected in the detection region (13);
    weighting the second processing result obtained by the second processing more heavily than the first processing result obtained by the first processing when it has been determined (S14) that the position of the boundary (11a, 11b) is not detected in the detection region (13); and
    detecting (S17) a line of sight of the subject based on the position of the center of the pupil (11) detected by the first processing and the second processing.
  2. Line-of-sight detection method according to claim 1, wherein the detection region (13) has a first direction as a longitudinal direction in the eyeball region (10), the method further comprising:
    detecting a plurality of edges (221a to 221d) based on the brightness changes in the longitudinal direction of the detection region (13), the plurality of edges including a first edge (221a; 221b) and a second edge (221c; 221d); and
    determining that the first edge (221a; 221b) and the second edge (221c; 221d) are edges of the pupil (11) or edges of the iris (12) when a brightness difference before and after the first edge (221a; 221b) and a brightness difference before and after the second edge (221c; 221d) are substantially the same, and
    when brightness decreases from before to after the first edge (221a; 221b) and brightness increases from before to after the second edge (221c; 221d).
  3. Line-of-sight detection method according to claim 2, wherein the first direction corresponds to a horizontal direction of an eye of the subject.
  4. Line-of-sight detection method according to claim 3, further comprising:
    detecting, from the eyeball region (10), a corneal reflection that corresponds to reflection of a light source from a cornea of the subject, by detecting, as the corneal reflection, a region of the eyeball region (10) that has higher brightness than the rest of the eyeball region (10); and
    determining, when the corneal reflection is contained in the detection region (13), whether the corneal reflection overlaps the first edge (221a; 221b), the second edge (221c; 221d) or a portion of the detection region where no edge exists, based on a comparison of a brightness difference between before and after the corneal reflection with a prescribed threshold.
  5. Line-of-sight detection method according to claim 3, further comprising:
    detecting, from the eyeball region (10), a corneal reflection that corresponds to reflection of a light source from a cornea of the subject, by detecting, as the corneal reflection, a region of the eyeball region (10) that has higher brightness than the rest of the eyeball region (10); and
    determining, when the corneal reflection is contained at one side of a region between the first edge (221b) and the second edge (221c), which are each at the boundary between the pupil (11) and the iris (12) in the detection region (13), and when a third edge is detected at the other side of the region, that the corneal reflection overlaps an edge at the boundary between the iris (12) and the sclera of the eye when (i) the distance between the third edge and the one of the first edge and the second edge that is closer to the third edge and (ii) the distance between the corneal reflection and the other of the first edge and the second edge are equal to or less than prescribed thresholds.
  6. Line-of-sight detection program which, when executed on the computer, causes the computer (100) to carry out the line-of-sight detection method according to any one of claims 1 to 5.
  7. Line-of-sight detection device (1; 100), comprising:
    an eyeball-region-detecting unit (112) configured to detect an eyeball region (10) of a subject from an image;
    an edge-detecting unit (114) configured to determine, based on a brightness change in a detection region (13) that lies in the eyeball region (10) and has a line shape or a band shape, whether a position of a boundary (11a, 11b) between a pupil (11) and an iris (12) is detected in the detection region (13);
    a pupil-center-detecting unit (1b; 115) configured to:
    execute, in accordance with a result of the determination, both
    first processing in which a position of a center of the pupil (11) is detected based on the boundary between the pupil (11) and the iris (12), the boundary between the pupil (11) and the iris (12) being detected using a plurality of first templates (271a to 271c), and
    second processing in which the position of the center of the pupil (11) is detected based on the boundary between the iris (12) and the sclera, the boundary between the iris (12) and the sclera being detected using a plurality of second templates (272a to 272c) that differ from the first templates (271a to 271c);
    weight a first processing result obtained by the first processing more heavily than a second processing result obtained by the second processing when the edge-detecting unit (114) has determined that the position of the boundary (11a, 11b) is detected in the detection region (13);
    weight the second processing result obtained by the second processing more heavily than the first processing result obtained by the first processing when the edge-detecting unit (114) has determined that the position of the boundary (11a, 11b) is not detected in the detection region (13); and
    detect a line of sight of the subject based on the position of the center of the pupil (11) detected by the first processing and the second processing.
  8. Line-of-sight detection device (1; 100) according to claim 7, wherein the detection region (13) has a first direction as a longitudinal direction in the eyeball region (10) and the edge-detecting unit (114) is configured to:
    detect a plurality of edges (221a to 221d) based on the brightness changes in the longitudinal direction of the detection region (13), the plurality of edges including a first edge (221a; 221b) and a second edge (221c; 221d); and
    determine that the first edge (221a; 221b) and the second edge (221c; 221d) are edges of the pupil (11) or edges of the iris (12) when a brightness difference before and after the first edge (221a; 221b) and a brightness difference before and after the second edge (221c; 221d) are substantially the same, and
    wherein brightness decreases from before to after the first edge (221a; 221b) and brightness increases from before to after the second edge (221c; 221d).
  9. Line-of-sight detection device (1; 100) according to claim 8, wherein the first direction corresponds to a horizontal direction of an eye of the subject.
EP16187589.3A 2015-09-30 2016-09-07 Sichtlinienerkennungsverfahren und -vorrichtung Active EP3151166B1 (de)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2015193729A JP6536324B2 (ja) 2015-09-30 2015-09-30 視線検出システム、視線検出方法および視線検出プログラム

Publications (2)

Publication Number Publication Date
EP3151166A1 EP3151166A1 (de) 2017-04-05
EP3151166B1 true EP3151166B1 (de) 2020-06-10

Family

ID=57103775

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16187589.3A Active EP3151166B1 (de) 2015-09-30 2016-09-07 Sichtlinienerkennungsverfahren und -vorrichtung

Country Status (3)

Country Link
US (1) US10417494B2 (de)
EP (1) EP3151166B1 (de)
JP (1) JP6536324B2 (de)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101776944B1 (ko) * 2017-01-09 2017-09-08 주식회사 쓰리이 홍채 패턴 코드화 방법
JP6776970B2 (ja) * 2017-03-24 2020-10-28 株式会社Jvcケンウッド 視線検出装置、視線検出方法及び視線検出プログラム
JP6930223B2 (ja) * 2017-05-31 2021-09-01 富士通株式会社 瞳孔検出用コンピュータプログラム、瞳孔検出装置及び瞳孔検出方法
US11080888B2 (en) 2017-06-02 2021-08-03 Sony Corporation Information processing device and information processing method
KR102349565B1 (ko) * 2017-07-03 2022-01-10 휴렛-팩커드 디벨롭먼트 컴퍼니, 엘.피. 눈 움직임에 기초한 회전형 마이크로 led 디스플레이
JP7078386B2 (ja) * 2017-12-07 2022-05-31 矢崎総業株式会社 画像処理装置
CN111936912A (zh) * 2018-02-28 2020-11-13 奇跃公司 使用眼部配准的头部扫描对准
JP6717330B2 (ja) * 2018-03-15 2020-07-01 オムロン株式会社 視線検出装置、該視線検出装置の制御方法、角膜反射像位置の検出方法、及びコンピュータプログラム
US10949969B1 (en) * 2018-03-20 2021-03-16 Welch Allyn, Inc. Pupil edge region removal in digital imaging
CN109002796B (zh) * 2018-07-16 2020-08-04 阿里巴巴集团控股有限公司 一种图像采集方法、装置和系统以及电子设备
CN108922085B (zh) * 2018-07-18 2020-12-18 北京七鑫易维信息技术有限公司 一种监护方法、装置、监护设备及存储介质
JP7311257B2 (ja) * 2018-09-27 2023-07-19 株式会社アイシン 眼球情報検出装置、眼球情報検出方法、および乗員モニタリング装置
US10928904B1 (en) 2019-12-31 2021-02-23 Logitech Europe S.A. User recognition and gaze tracking in a video system
US11163995B2 (en) * 2019-12-31 2021-11-02 Logitech Europe S.A. User recognition and gaze tracking in a video system
CN112989939B (zh) * 2021-02-08 2023-04-07 佛山青藤信息科技有限公司 一种基于视觉的斜视检测系统

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3520618B2 (ja) * 1995-08-16 2004-04-19 日産自動車株式会社 車両用視線方向計測装置
JP4469476B2 (ja) * 2000-08-09 2010-05-26 パナソニック株式会社 眼位置検出方法および眼位置検出装置
JP4008282B2 (ja) 2002-04-24 2007-11-14 沖電気工業株式会社 瞳孔・虹彩円検出装置
JP2004005167A (ja) 2002-05-31 2004-01-08 Matsushita Electric Ind Co Ltd 目位置特定方法および装置
US9250703B2 (en) * 2006-03-06 2016-02-02 Sony Computer Entertainment Inc. Interface with gaze detection and voice input
CN102439627B (zh) * 2010-02-26 2014-10-08 松下电器(美国)知识产权公司 瞳孔检测装置和瞳孔检测方法
JP5529660B2 (ja) * 2010-07-20 2014-06-25 パナソニック株式会社 瞳孔検出装置及び瞳孔検出方法
WO2013057882A1 (ja) * 2011-10-19 2013-04-25 パナソニック株式会社 表示制御装置、集積回路、表示制御方法およびプログラム
JP6056323B2 (ja) 2012-09-24 2017-01-11 富士通株式会社 視線検出装置、視線検出用コンピュータプログラム
JP2014078052A (ja) * 2012-10-09 2014-05-01 Sony Corp 認証装置および方法、並びにプログラム
JP6175945B2 (ja) * 2013-07-05 2017-08-09 ソニー株式会社 視線検出装置及び視線検出方法
US9355315B2 (en) * 2014-07-24 2016-05-31 Microsoft Technology Licensing, Llc Pupil detection
US10016130B2 (en) * 2015-09-04 2018-07-10 University Of Massachusetts Eye tracker system and methods for detecting eye parameters

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
JP2017068615A (ja) 2017-04-06
JP6536324B2 (ja) 2019-07-03
US10417494B2 (en) 2019-09-17
US20170091520A1 (en) 2017-03-30
EP3151166A1 (de) 2017-04-05

Similar Documents

Publication Publication Date Title
EP3151166B1 (de) Sichtlinienerkennungsverfahren und -vorrichtung
JP6930223B2 (ja) 瞳孔検出用コンピュータプログラム、瞳孔検出装置及び瞳孔検出方法
JP6577454B2 (ja) 軸上視線追跡システム及び方法
US20190102597A1 (en) Method and electronic device of performing fingerprint recognition
US8942419B1 (en) Position estimation using predetermined patterns of light sources
US10445574B2 (en) Method and apparatus for iris recognition
EP2881891B1 (de) Bildverarbeitungsvorrichtung und Bildverarbeitungsverfahren
JP7004059B2 (ja) なりすまし検知装置、なりすまし検知方法、及びプログラム
WO2019050543A1 (en) RELIABILITY OF FOLLOW-UP DATA FOR THE LEFT EYE AND THE RIGHT EYE
US11375133B2 (en) Automatic exposure module for an image acquisition system
JP6870474B2 (ja) 視線検出用コンピュータプログラム、視線検出装置及び視線検出方法
US11776311B2 (en) Image processing device, image processing method, and storage medium
US20170116736A1 (en) Line of sight detection system and method
US20210012105A1 (en) Method and system for 3d cornea position estimation
US20170243061A1 (en) Detection system and detection method
CN1567377B (zh) 数字图像的红眼处理方法
EP3671541B1 (de) Klassifikation von zwinkern unter verwendung eines augenverfolgungssystems
CN111033562B (zh) 图像处理系统、图像处理方法和存储介质
CN103577791A (zh) 一种红眼检测方法和系统
US11156831B2 (en) Eye-tracking system and method for pupil detection, associated systems and computer programs
CN112235912A (zh) 一种基于用户图片的照明调节方法及系统
JP2018120299A (ja) 視線検出用コンピュータプログラム、視線検出装置及び視線検出方法
JP5736763B2 (ja) 撮像装置、撮像プログラムおよび信号処理装置
CN116704974A (zh) 一种基于瞳孔直径变化的亮度调节方法及设备

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20170726

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20180718

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602016037804

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G06K0009460000

Ipc: G06K0009000000

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: G06K 9/00 20060101AFI20200108BHEP

Ipc: G06K 9/50 20060101ALI20200108BHEP

Ipc: G06K 9/46 20060101ALI20200108BHEP

INTG Intention to grant announced

Effective date: 20200129

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

Ref country code: AT

Ref legal event code: REF

Ref document number: 1279763

Country of ref document: AT

Kind code of ref document: T

Effective date: 20200615

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602016037804

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200910

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200911

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20200610

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200910

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1279763

Country of ref document: AT

Kind code of ref document: T

Effective date: 20200610

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201012

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201010

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602016037804

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

26N No opposition filed

Effective date: 20210311

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20200930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200907

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200930

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200930

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200930

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200907

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602016037804

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G06K0009000000

Ipc: G06V0010000000

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200610

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20220728

Year of fee payment: 7

Ref country code: DE

Payment date: 20220803

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20220808

Year of fee payment: 7