WO2011105004A1 - Pupil detection device and pupil detection method - Google Patents
- Publication number
- WO2011105004A1 (PCT/JP2011/000359, JP2011000359W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pupil
- image
- unit
- pupil detection
- imaging
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
Definitions
- The present invention relates to a pupil detection device and a pupil detection method.
- Pupil detection is performed in gaze detection or expression detection. If pupil detection is performed under low illumination, the "red-eye phenomenon" may occur. The red-eye phenomenon is caused by imaging of the blood vessels in the retina, and occurs particularly when flash photography is performed with the pupil wide open, such as in a dark place. The luminance of a pupil image in which the red-eye phenomenon occurs is higher, relative to the luminance of the image around the pupil, than that of a normal pupil image in which the red-eye phenomenon does not occur. Therefore, under conditions where the red-eye phenomenon occurs, it is difficult to perform stable pupil detection even if a pupil detection method targeting normal pupils, in which the red-eye phenomenon does not occur, is applied.
- To address this, a pupil detection method for when the red-eye phenomenon occurs has been proposed (see, for example, Patent Document 1).
- In this pupil detection method, a pupil image in which the red-eye phenomenon has occurred is detected based on the saturation and luminance of the eye area.
- However, when a grayscale image captured using near-infrared light is used, saturation information is not available, so the pupil detection accuracy may decrease, or the method may be difficult to apply in the first place.
- Moreover, the pupil detection result may be used for gaze detection as described above. Therefore, when the pupil detection accuracy is lowered, the accuracy of processing that uses the pupil detection result (for example, line-of-sight detection processing) is also lowered.
- An object of the present invention is to provide a pupil detection device and a pupil detection method capable of selecting and outputting a detection result with high accuracy even when a grayscale image using near infrared light is used.
- A pupil detection device according to the present invention includes: a first imaging pair consisting of an imaging means and a light emitting means that emits light at the time of imaging, separated by a predetermined separation distance; a second imaging pair whose separation distance is larger than that of the first imaging pair; a detection unit that detects a first pupil image from a first person image in which a person is imaged by the first imaging pair, and a second pupil image from a second person image in which the person is imaged by the second imaging pair; and an output switching means that selectively outputs the detection result of the first pupil image or the detection result of the second pupil image, based on a calculated value of the red-eye occurrence intensity, which is the relative luminance of the luminance in the first pupil image with respect to the luminance of a peripheral image outside the first pupil image, and on a correlation characteristic between red-eye occurrence intensity and a pupil detection accuracy value.
- A pupil detection method according to the present invention is a pupil detection method in a pupil detection device comprising a first imaging pair consisting of an imaging means and a light emitting means that emits light at the time of imaging, separated by a predetermined separation distance, and a second imaging pair whose separation distance is larger than that of the first imaging pair, the method comprising: a step of detecting a first pupil image from a first person image in which a person is imaged by the first imaging pair; a step of detecting a second pupil image from a second person image in which the person is imaged by the second imaging pair; and a step of selectively outputting the detection result of the first pupil image or the detection result of the second pupil image, based on a calculated value of the red-eye occurrence intensity, which is the relative luminance of the luminance in the first pupil image with respect to the luminance of a peripheral image outside the first pupil image, and on a correlation characteristic between red-eye occurrence intensity and a pupil detection accuracy value.
- According to the present invention, it is possible to provide a pupil detection device and a pupil detection method capable of selecting and outputting a detection result with high accuracy even when a grayscale image using near-infrared light is used.
- A block diagram showing a configuration of a pupil detection device according to Embodiment 1 of the present invention
- A block diagram showing the configuration of the image input unit
- A block diagram showing the configuration of the image input unit
- A diagram showing the configuration of the imaging unit
- A block diagram showing the configuration of the pupil detection unit
- A block diagram showing the configuration of the pupil detection unit
- A block diagram showing the configuration of the switching determination unit
- A flowchart for explaining the operation of the pupil detection device
- A diagram used to explain the correlation characteristic between red-eye occurrence intensity and pupil detection accuracy rate
- A diagram used to explain the fluctuation tendency characteristic
- A block diagram showing a configuration of a pupil detection device according to Embodiment 2 of the present invention
- A block diagram showing the configuration of the switching determination unit
- A flowchart for explaining the operation of the pupil detection device
- A diagram showing the correspondence between illuminance and illuminance coefficient
- A block diagram showing a configuration of a pupil detection device according to Embodiment 3 of the present invention
- A block diagram showing the configuration of the switching determination unit
- A diagram showing a configuration variation of the image input unit
- FIG. 1 is a block diagram showing a configuration of a pupil detection apparatus 100 according to Embodiment 1 of the present invention.
- The pupil detection device 100 is provided, for example, in the cabin of a car and is connected to an alarm device for use.
- The alarm device detects the driver's line-of-sight direction based on the detection result of the pupil detection device 100, and warns the driver when the driver does not face the front for a long time.
- In the following, the case where the pupil detection device 100 is installed in the vehicle cabin of a car will be described as an example.
- pupil detection apparatus 100 includes image input units 101 and 102, pupil detection units 103 and 104, switching determination unit 105, and evaluation value storage unit 106.
- the image input units 101 and 102 emit light to illuminate an imaging target (that is, a person here), and capture an image of the imaging target.
- the target image data is output to pupil detection units 103 and 104, respectively.
- the image input unit 101 includes an imaging unit 111, an irradiation unit 112, and a synchronization unit 113, as shown in FIG.
- the imaging unit 111 captures an imaging target at timing according to the synchronization signal received from the synchronization unit 113, and outputs a target image signal to the pupil detection unit 103.
- the imaging unit 111 includes an image sensor such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS).
- the irradiation unit 112 emits light at timing according to the synchronization signal received from the synchronization unit 113.
- the irradiation unit 112 includes an infrared LED.
- the infrared light emitted from the infrared LED is invisible but is sensed by the imaging unit 111. That is, the imaging unit 111 has infrared sensitivity.
- Note that the irradiation unit 112 is not limited to synchronous pulse drive, and may be always on.
- the synchronization unit 113 controls the light emission timing of the irradiation unit 112 and the exposure timing of the imaging unit 111 by outputting a synchronization signal to the imaging unit 111 and the irradiation unit 112.
- the imaging target can be imaged at the timing when the imaging target is illuminated.
- the image input unit 102 includes an imaging unit 121, an irradiation unit 122, and a synchronization unit 123.
- the imaging unit 121, the irradiation unit 122, and the synchronization unit 123 have the same functions as the imaging unit 111, the irradiation unit 112, and the synchronization unit 113.
- Here, the separation distance d1 between the imaging unit 111 and the irradiation unit 112 (that is, the distance from the optical axis of the imaging unit 111 to the irradiation unit 112) is shorter than the separation distance d2 between the imaging unit 121 and the irradiation unit 122. That is, the red-eye phenomenon is more likely to occur in the image captured by the imaging unit 111 than in the image captured by the imaging unit 121.
- the image input units 101 and 102 having the above configuration constitute an imaging unit 200 as shown in FIG. 4.
- the imaging unit 200 is installed in front of the driver's seat, for example, on a steering wheel of a car or on a dashboard.
- the driver's face is irradiated by the irradiation units 112 and 122, and the driver's face is photographed by the image input units 101 and 102.
- the pupil detection units 103 and 104 detect pupil images from the target image received from the image input unit 101 and the target image received from the image input unit 102.
- the pupil detection unit 103 has a face detection unit 131, a face part detection unit 132, and a pupil information calculation unit 133.
- the face detection unit 131 detects a face image from the target image received from the image input unit 101, and outputs face image data to the face part detection unit 132.
- the face part detection unit 132 detects a face part group (that is, eye corners, eyes, etc.) from the face image received from the face detection unit 131, and outputs position coordinates of each face part to the pupil information calculation unit 133.
- the pupil information calculation unit 133 calculates the center position of the pupil image and the size of the pupil image (that is, the pupil diameter) based on the position coordinates of each face component received from the face component detection unit 132.
- the central position of the pupil image and the size of the pupil image are output to the switching determination unit 105 together with the target image output from the image input unit 101 as a pupil image detection result.
- the central position of the pupil image and the size of the pupil image are calculated for each of the right eye and the left eye.
- the pupil detection unit 104 includes a face detection unit 141, a face part detection unit 142, and a pupil information calculation unit 143.
- the face detection unit 141, the face part detection unit 142, and the pupil information calculation unit 143 have functions similar to those of the face detection unit 131, the face part detection unit 132, and the pupil information calculation unit 133. However, they differ in that the data to be processed is a target image received from the image input unit 102.
- The switching determination unit 105 calculates the "red-eye occurrence intensity" from the target image data received from the image input unit 101, and, based on the calculated red-eye occurrence intensity and the correlation characteristic between red-eye occurrence intensity and pupil detection accuracy rate, selects one of the pupil detection result of the pupil detection unit 103 and the pupil detection result of the pupil detection unit 104 as the signal to be output to the processing unit (for example, a gaze direction calculation unit) in the subsequent stage.
- the switching determination unit 105 includes a red-eye occurrence intensity calculation unit 151, a characteristic calculation unit 152, and an output signal selection unit 153.
- the red eye occurrence intensity calculation unit 151 calculates a red eye occurrence intensity based on the input image and the pupil image detection result.
- the red-eye occurrence intensity is calculated for the target image received from the pupil detection unit 103 and the pupil image detection result.
- This red-eye occurrence intensity means the "relative luminance" of the luminance in the pupil image area with respect to the luminance of the pupil image peripheral area located outside the pupil image area.
- the red eye generation intensity will be described in detail later.
- the characteristic calculation unit 152 has data (hereinafter, may be simply referred to as “correlation characteristic data”) regarding the correlation characteristic between the red-eye occurrence intensity and the pupil detection accuracy rate (that is, pupil detection accuracy value).
- the characteristic calculation unit 152 calculates the pupil detection accuracy rate corresponding to the red-eye occurrence intensity received from the red-eye occurrence intensity calculation unit 151 using the correlation characteristic data.
- the calculated pupil detection accuracy rate is output to the output signal selection unit 153 and the evaluation value storage unit 106. In the following, this calculated pupil detection accuracy rate may be called “pupil detection reliability” in order to clearly distinguish it from the pupil detection accuracy rate in the correlation characteristic data.
- The output signal selection unit 153 selects, based on the "fluctuation tendency characteristic" of the pupil detection reliability received from the characteristic calculation unit 152, one of the pupil detection result of the pupil detection unit 103 and the pupil detection result of the pupil detection unit 104 as the signal to be output to the processing unit (for example, the gaze direction calculation unit) in the subsequent stage.
- Specifically, the output signal selection unit 153 calculates the fluctuation tendency characteristic of the pupil detection reliability based on the pupil detection reliability received this time from the characteristic calculation unit 152 and the history of pupil detection reliability stored in the evaluation value storage unit 106. Then, the output signal selection unit 153 selects one of the pupil detection result of the pupil detection unit 103 and the pupil detection result of the pupil detection unit 104 based on the calculated fluctuation tendency characteristic, and outputs the selected pupil detection result to the processing unit in the subsequent stage.
- the fluctuation tendency characteristic of the pupil detection reliability will be described in detail later.
- the evaluation value storage unit 106 stores the pupil detection reliability received from the switching determination unit 105 in association with the imaging time of the target image used to specify the pupil detection reliability.
- FIG. 8 is a flowchart for explaining the operation of pupil detection apparatus 100.
- the flow chart of FIG. 8 also includes the process flow in the above-described alarm device. That is, FIG. 8 shows a process flow when the pupil detection device 100 is applied to an alarm device.
- the processing flow shown in FIG. 8 starts with the start of the photographing operation.
- the photographing operation may be started by the user's operation or may be started by using some external signal as a trigger.
- In step S201, the image input units 101 and 102 emit light to illuminate an imaging target (that is, a person here) and capture an image of the imaging target. Thereby, a target image is acquired.
- the irradiation unit 112 of the image input unit 101 irradiates the target with invisible near infrared light (for example, light of wavelength 850 nm).
- the imaging unit 111 captures an image of the target.
- the separation distance d1 between the imaging unit 111 and the irradiation unit 112 is sufficiently short (for example, 10 mm). Therefore, when the peripheral illumination decreases, the pupil diameter increases, and a red-eye phenomenon occurs in the target image captured by the imaging unit 111.
- the irradiation unit 122 of the image input unit 102 also irradiates the target with invisible near infrared light (for example, light with a wavelength of 850 nm) at a timing different from that of the irradiation unit 112.
- the imaging unit 121 captures an image of the target at a timing at which the irradiation unit 122 irradiates the target.
- a separation distance d2 between the imaging unit 121 and the irradiation unit 122 is larger than d1.
- Therefore, in the target image captured by the imaging unit 121, the occurrence probability of the red-eye phenomenon is lower than in the target image captured by the imaging unit 111, and even if the red-eye phenomenon occurs, the red-eye occurrence intensity is low.
- Note that when the red-eye occurrence intensity is sufficiently large, the luminance of the lit portion of the pupil reaches the luminance upper limit of the image, so the target image captured by the imaging unit 111 and the target image captured by the imaging unit 121 become equivalent.
- As the imaging unit 111 and the imaging unit 121, for example, a digital camera provided with a CMOS image sensor and a lens is assumed. An image in PPM (Portable Pix Map) format captured by the imaging unit 111 or 121 is temporarily stored in an image storage unit (for example, a memory space of a PC, not shown) included in the image input unit 101 or 102, and is then output in PPM format as it is to the pupil detection units 103 and 104.
- FIG. 9 is a view showing a face image which is a target image.
- FIG. 9A is a view showing an image captured by the imaging unit 111 at the time of occurrence of the red-eye phenomenon
- FIG. 9B is a view showing an image captured by the imaging unit 121 when the red-eye phenomenon does not occur.
- the horizontal direction of the image is the X axis
- the vertical direction of the image is the Y axis
- one pixel is one coordinate point.
- In step S202, the face detection units 131 and 141 detect a face image from the input target image. Specifically, a candidate of an image serving as a feature (that is, a feature image candidate) is extracted from the input image, and the extracted feature image candidate is compared with a feature image representing a face area prepared in advance. Then, a feature image candidate having high similarity is detected.
- the similarity is obtained, for example, by collating the Gabor feature amount of the average face acquired in advance with the Gabor feature amount extracted by scanning the input image, and is obtained as the reciprocal of the absolute value of the difference between the two.
- For example, the face detection unit 131 compares the image of FIG. 9A with a template prepared in advance, and specifies the area having the highest correlation as the face image 301.
- Similarly, the face image 311 is specified in the image of FIG. 9B.
- Note that the face area detection process may be performed by detecting a skin color area in the image (that is, skin color area detection), by detecting an elliptical portion (that is, ellipse detection), or by using a statistical pattern identification method. In addition to the above, any method may be adopted as long as it can perform the face detection.
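The similarity computation described above (collating Gabor features of a pre-acquired average face with features scanned from the input image, with similarity taken as the reciprocal of the absolute difference) can be sketched as follows; this is an illustrative sketch, not the patent's implementation, and the function name and the `eps` guard are assumptions:

```python
def similarity(template_feat, candidate_feat, eps=1e-6):
    """Similarity between a pre-acquired template feature vector (e.g. Gabor
    features of an average face) and a candidate region's feature vector,
    taken as the reciprocal of their absolute difference. `eps` (an added
    assumption) avoids division by zero for a perfect match."""
    diff = sum(abs(a - b) for a, b in zip(template_feat, candidate_feat))
    return 1.0 / (diff + eps)
```

Scanning this over the input image and keeping the position of maximum similarity would yield the face areas 301 and 311.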
- In step S203, the face part detection units 132 and 142 detect a face part group (that is, eye corners, inner eye corners, etc.) from the face images received from the face detection units 131 and 141, and output position information of each face part to the pupil information calculation units 133 and 143.
- the search area of the face part group is the face areas 301 and 311 specified in step S202. Face parts 302 and 312 are shown in FIGS. 9A and 9B, respectively.
- For example, when the face images 301 and 311 are input, the face part detection units 132 and 142 may detect, as a face part, the place with the highest likelihood regarding a correspondence relationship learned in advance.
- the face part detection units 132 and 142 may search for face parts from within the face images 301 and 311 using a standard face part template.
- In step S204, the pupil information calculation units 133 and 143 calculate the central position of the pupil image and the size of the pupil image (that is, the pupil diameter) based on the position coordinates of each facial component received from the facial component detection units 132 and 142.
- Specifically, a circular separability filter is applied to the eye regions 304 and 314, which include the eye corners and inner eye corners obtained in step S203. That is, while the application position of the circular separability filter is moved within the eye regions 304 and 314, the degree of separation is calculated at each position where the average luminance inside the filter circle is higher than the average luminance outside the filter circle.
- Then, the region corresponding to the filter circle at the position where the degree of separation is highest is detected as the pupil region (corresponding to the regions 303 and 313 in FIGS. 9A and 9B).
- The coordinates of the center and the diameter of the circle of the separability filter corresponding to the detected pupil are acquired as the central position of the pupil image and the size of the pupil image.
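The degree of separation used by the circular separability filter can be sketched as follows. This is a minimal illustration assuming a common form of the measure (between-class variance of the inner and outer pixel sets divided by total variance); the patent text does not reproduce its exact formula, and all names are illustrative:

```python
def separability(inner, outer):
    """Degree of separation between the pixel values inside the filter
    circle (`inner`) and outside it (`outer`): between-class variance over
    total variance, in the range [0, 1]. Higher values indicate a clearer
    luminance boundary along the circle."""
    n1, n2 = len(inner), len(outer)
    values = inner + outer
    mean_all = sum(values) / len(values)
    m1, m2 = sum(inner) / n1, sum(outer) / n2
    between = n1 * (m1 - mean_all) ** 2 + n2 * (m2 - mean_all) ** 2
    total = sum((v - mean_all) ** 2 for v in values)
    return between / total if total else 0.0
```

While the filter circle is moved over the eye regions 304 and 314, the position with the highest degree of separation among positions where the inner average luminance exceeds the outer average luminance would then be taken as the pupil region.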
- Next, the switching determination unit 105 calculates the red-eye occurrence intensity from the target image data received from the image input unit 101, and, based on the calculated red-eye occurrence intensity and the correlation characteristic between red-eye occurrence intensity and pupil detection accuracy rate, selects either the pupil detection result of the pupil detection unit 103 or the pupil detection result of the pupil detection unit 104 as the signal to be output to the processing unit (for example, the gaze direction calculation unit) in the subsequent stage.
- In step S205, the red-eye occurrence intensity calculation unit 151 calculates the red-eye occurrence intensity based on the input image and the pupil image detection result (that is, the central position of the pupil image and the size of the pupil image).
- Specifically, the red-eye occurrence intensity V is calculated by the following equation (1); that is, the red-eye occurrence intensity indicates how high the luminance inside the pupil is relative to the luminance around the pupil: V = ((1/N1) Σ_{p∈P1} b(p)) / ((1/N2) Σ_{p∈P2} b(p)) (1)
- Here, P1 is the region 401 in the eye region image 304 (314) shown in FIG. 10, that is, the region inside the pupil.
- P2 is the region 402 in the eye region image 304 (314) shown in FIG. 10, that is, the region outside the pupil region 401.
- b is the luminance of each pixel.
- N1 is the number of pixels present in P1.
- N2 is the number of pixels present in P2.
- Note that the red-eye occurrence intensity is calculated using the target image and the pupil image detection result received from the pupil detection unit 103.
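Based on the definitions of P1, P2, b, N1, and N2 above, the red-eye occurrence intensity V of equation (1) can be sketched as the ratio of the mean luminance inside the pupil circle to the mean luminance of a surrounding ring; the ring width (here, out to twice the pupil radius) and all names are illustrative assumptions:

```python
def red_eye_intensity(image, cx, cy, radius):
    """Sketch of the red-eye occurrence intensity V: mean luminance of the
    pixels inside the pupil circle (P1) divided by the mean luminance of a
    surrounding ring (P2). `image` is a 2-D list of grayscale values;
    (cx, cy) and `radius` come from the pupil image detection result."""
    inner, outer = [], []
    for y, row in enumerate(image):
        for x, b in enumerate(row):
            d2 = (x - cx) ** 2 + (y - cy) ** 2
            if d2 <= radius ** 2:
                inner.append(b)            # pixel belongs to P1 (pupil region)
            elif d2 <= (2 * radius) ** 2:
                outer.append(b)            # pixel belongs to P2 (peripheral ring)
    # V = (sum of b over P1 / N1) / (sum of b over P2 / N2), per equation (1)
    return (sum(inner) / len(inner)) / (sum(outer) / len(outer))
```

A bright pupil against a darker periphery yields V well above 1, matching the notion that V indicates how high the luminance inside the pupil is relative to the luminance around it.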
- In step S206, the characteristic calculation unit 152 calculates the pupil detection accuracy rate corresponding to the red-eye occurrence intensity received from the red-eye occurrence intensity calculation unit 151, using the correlation characteristic data.
- The correlation characteristic data is prepared in advance from experimental data or a characteristic model. This correlation characteristic data represents the correlation characteristic between the red-eye occurrence intensity and the pupil detection accuracy rate; for example, as shown in FIG. 11, it may be a graph plotting experimental data with the red-eye occurrence intensity on the horizontal axis and the pupil detection accuracy rate on the vertical axis.
- a curve 501 represents correlation characteristic data when the red-eye effect actually occurs. That is, the curve 501 is a curve obtained from the red-eye occurrence intensity obtained from the target image obtained by the image input unit 101.
- Curve 502 represents correlation characteristic data when no red-eye effect occurs. That is, the curve 502 is obtained by plotting, against the red-eye occurrence intensity obtained from the target image captured by the image input unit 101 in a certain shooting environment, the accuracy rate of pupil detection performed on the target image captured by the image input unit 102 in the same shooting environment.
- the curve 501 and the curve 502 intersect at the intersection point 503, and the pupil detection accuracy rate corresponding to the intersection point 503 is used as the switching determination reference value 504.
- Although it is optimal for the switching determination reference value 504 to be the pupil detection accuracy rate at the intersection of the two curves, it does not necessarily have to be the intersection, as long as the pupil detection accuracy rate at the switching point is higher than a predetermined allowable accuracy rate. For example, the allowable accuracy rate may be used as the switching determination reference value.
- Alternatively, a pupil detection accuracy rate obtained by adding or subtracting a predetermined value to or from the pupil detection accuracy rate at the intersection of the two curves may be used as the switching determination reference value.
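The determination of the switching determination reference value 504 from the two correlation curves can be sketched as follows, assuming the curves are given as sampled experimental data and using linear interpolation at the crossing (both assumptions, not stated in the text):

```python
def switching_reference(intensities, acc_redeye, acc_normal):
    """Sketch: find where curve 501 (accuracy of the short-baseline pair,
    `acc_redeye`) crosses curve 502 (long-baseline pair, `acc_normal`) as
    functions of red-eye occurrence intensity, and return the pupil
    detection accuracy rate at that crossing (reference value 504)."""
    for i in range(1, len(intensities)):
        d0 = acc_redeye[i - 1] - acc_normal[i - 1]
        d1 = acc_redeye[i] - acc_normal[i]
        if d0 == 0:
            return acc_redeye[i - 1]       # curves touch at a sample point
        if d0 * d1 < 0:                    # sign change: crossing in interval
            t = d0 / (d0 - d1)             # linear-interpolation fraction
            return acc_normal[i - 1] + t * (acc_normal[i] - acc_normal[i - 1])
    return None                            # curves do not cross in range
```

At the crossing, interpolating along either curve gives the same accuracy rate, so only one is used.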
- In step S207, the characteristic calculation unit 152 causes the evaluation value storage unit 106 to store the pupil detection reliability.
- Specifically, the evaluation value storage unit 106 stores the pupil detection reliability received from the switching determination unit 105 in association with the imaging time of the target image used to obtain that pupil detection reliability. However, when the stored imaging times and pupil detection reliabilities exceed the amount necessary for the processing of step S208 described later, the evaluation value storage unit 106 may erase the entries starting from the oldest imaging time, and overwrite the region vacated by the erasure with the newly acquired imaging time and pupil detection reliability.
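The bounded history described above can be sketched with a fixed-capacity buffer; the class name and capacity handling are illustrative assumptions:

```python
from collections import deque

class EvaluationValueStore:
    """Sketch of the evaluation value storage unit 106: keeps the most
    recent (imaging time, pupil detection reliability) pairs. Once more
    entries than step S208 needs have accumulated, the oldest entry is
    discarded and its slot reused for the newest one."""
    def __init__(self, capacity):
        self.history = deque(maxlen=capacity)  # oldest entries drop out first

    def store(self, imaging_time, reliability):
        self.history.append((imaging_time, reliability))
```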
- In step S208, the output signal selection unit 153 selects, based on the "fluctuation tendency characteristic" of the pupil detection reliability received from the characteristic calculation unit 152, one of the pupil detection result of the pupil detection unit 103 and the pupil detection result of the pupil detection unit 104 as the signal to be output to the processing unit (for example, a gaze direction calculation unit) in the subsequent stage.
- Specifically, the output signal selection unit 153 calculates the fluctuation tendency characteristic of the pupil detection reliability.
- FIG. 12 is a diagram showing a graph formed from the pupil detection reliability received from the characteristic calculation unit 152 this time and the history of pupil detection reliability stored in the evaluation value storage unit 106.
- the horizontal axis is imaging time
- the vertical axis is pupil detection reliability.
- Instead of using the value itself of the pupil detection reliability stored in the evaluation value storage unit 106, for example, a time-averaged value may be used, or a time-averaged value computed after removing outliers may be used.
- the fluctuation tendency characteristic of the pupil detection reliability is calculated as follows.
- the output signal selection unit 153 calculates the gradient of the time change of the pupil detection reliability at each imaging time.
- The gradient D1 of the time change of the pupil detection reliability can be calculated by the following equation (2), where E is the pupil detection reliability and t is the imaging time: D1 = (E(t_i) − E(t_{i−1})) / (t_i − t_{i−1}) (2)
- Then, the output signal selection unit 153 selects one of the pupil detection result of the pupil detection unit 103 and the pupil detection result of the pupil detection unit 104 based on the calculated fluctuation tendency characteristic, and outputs the selected pupil detection result to the processing unit in the subsequent stage.
- The selection criteria are as follows. When, over a predetermined time period (period 602 in FIG. 12), the pupil detection reliability remains smaller than the switching determination reference value 504 and the slope D1 is continuously negative (that is, the pupil detection reliability continuously decreases), the pupil detection result of the pupil detection unit 104 is selected. In other words, when the pupil detection reliability curve 603 within the predetermined period is below the switching determination reference value 504 and has a decreasing tendency, the pupil detection result of the pupil detection unit 104 is selected.
- Conversely, when, over a predetermined time period, the pupil detection reliability remains larger than the switching determination reference value 504 and the slope D1 is continuously positive (that is, the pupil detection reliability continuously increases), the pupil detection result of the pupil detection unit 103 is selected.
- In other cases, whichever of the pupil detection result of the pupil detection unit 103 and the pupil detection result of the pupil detection unit 104 was selected previously is also selected this time.
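The selection criteria above can be sketched as follows; the function and detector identifiers (103/104) are illustrative, and the slope D1 is taken as the discrete time difference of the reliability E per equation (2):

```python
def select_detector(history, reference, prev_choice):
    """Sketch of the output signal selection (step S208). `history` is a
    time-ordered list of (imaging time, pupil detection reliability) pairs
    covering the predetermined period; `reference` is the switching
    determination reference value 504. Returns 104 (long-baseline detector),
    103 (short-baseline detector), or the previous choice."""
    slopes = [(history[i][1] - history[i - 1][1]) /
              (history[i][0] - history[i - 1][0])   # D1 = dE/dt, equation (2)
              for i in range(1, len(history))]
    below = all(e < reference for _, e in history)
    above = all(e > reference for _, e in history)
    if below and all(s < 0 for s in slopes):
        return 104   # reliability low and continuously decreasing
    if above and all(s > 0 for s in slopes):
        return 103   # reliability high and continuously increasing
    return prev_choice
```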
- Note that the pupil detection reliability curve 601, which lies outside the predetermined period, is a portion not needed for the processing of the output signal selection unit 153; therefore, the evaluation value storage unit 106 may delete the data corresponding to the curve 601.
- As the red-eye occurrence intensity becomes stronger, the luminance of the pupil becomes higher than the luminance of the iris, so detecting the pupil in the image becomes easier. Therefore, when the red-eye phenomenon occurs, detecting the pupil using the image obtained by the image input unit 101 gives better detection performance than using the image obtained by the image input unit 102. However, when the red-eye occurrence intensity is weak, the difference between the luminance of the pupil and the luminance of the iris becomes small. Consequently, in a situation where the red-eye phenomenon is just starting to occur, the difference between the luminance of the pupil and the luminance of the iris in the image obtained by the image input unit 101 is almost zero, and detection of the pupil becomes difficult.
- In the image obtained by the image input unit 102, on the other hand, the red-eye phenomenon is weaker than in the image obtained by the image input unit 101, so the pupil does not shine and the pupil luminance becomes sufficiently lower than the iris luminance. Therefore, in the image obtained by the image input unit 102, the difference between the luminance of the pupil and the luminance of the iris is larger than in the image obtained by the image input unit 101, and the detection performance is relatively improved. As described above, since the relative accuracy of the pupil detection result of the pupil detection unit 103 and that of the pupil detection unit 104 is reversed depending on the red-eye occurrence intensity, the accuracy of the subsequent processing can be improved by appropriately selecting the more accurate one.
- In step S209, the gaze direction calculation unit (not shown) calculates the gaze direction from the coordinates of the face part group obtained in step S203 and the central coordinates of the pupil image obtained in step S204.
- The gaze direction is calculated, for example, from a face direction vector representing the front direction of the face, calculated from the coordinates of the face part group, and a gaze direction vector relative to the front direction of the face, calculated from the coordinates of the eye corners and the like.
- the face direction vector is calculated, for example, by the following procedure.
- First, the gaze direction calculation unit transforms the three-dimensional coordinates of the driver's face part group, acquired in advance, by rotation and translation. Then, the gaze direction calculation unit projects the transformed three-dimensional coordinates onto the target image used to obtain the pupil detection result selected in step S208. Then, the gaze direction calculation unit calculates the rotation and translation parameters that best match the face part group detected in step S203.
- The face direction vector is the vector obtained by rotating, with the rotation parameter determined above, a vector representing the direction in which the driver's face faces in the pre-acquired three-dimensional coordinates.
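As a sketch of this step, the rotation obtained from the model fit can be applied directly to the face model's frontal vector. The following is a minimal illustration, assuming the rotation parameter is available as a 3×3 matrix and the face model looks along +Z in its own coordinates (both are assumptions; the patent does not prescribe a representation):

```python
import numpy as np

def face_direction_vector(rotation_matrix, face_front=np.array([0.0, 0.0, 1.0])):
    """Rotate the stored face-front unit vector by the fitted rotation parameter.

    `rotation_matrix` is the 3x3 rotation obtained from the model-fitting step;
    `face_front` is the direction the face model looks in its own coordinates.
    Both names are illustrative, not taken from the patent.
    """
    v = rotation_matrix @ face_front
    return v / np.linalg.norm(v)  # keep the result a unit vector
```

For example, a pure 90-degree yaw rotation maps the frontal +Z vector onto the +X axis.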
- the gaze direction vector is calculated, for example, by the following procedure.
- First, the gaze direction calculation unit stores in advance the three-dimensional coordinates of the driver's face part group and of the pupil center when the driver looks in the same direction as the face direction.
- Next, a position moved from the stored pupil center by a predetermined distance in the direction opposite to the sight line direction is calculated as the eyeball center position.
- The predetermined distance is suitably about 12 mm, the radius of a typical adult eyeball, but it is not limited to this value; any value may be used.
- Then, the three-dimensional coordinates of the eyeball center at the time of detection are obtained using the face rotation and translation parameters acquired during the face direction vector calculation.
- Next, since the pupil is assumed to lie on the sphere whose center is the eyeball center and whose radius is the predetermined distance, the position of the detected pupil center on this sphere is searched for. Finally, a vector connecting the eyeball center and the searched point on the sphere is calculated as the gaze direction.
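The last two steps can be sketched as follows. The function below assumes the eyeball center has already been transformed into the detection-time coordinates, and simply normalizes the offset to the detected pupil center; this is a simplification of the search on the sphere, not the patent's exact procedure:

```python
import numpy as np

EYEBALL_RADIUS_MM = 12.0  # typical adult value cited in the text

def gaze_vector(eyeball_center, pupil_center):
    """Vector from the eyeball center through the detected pupil center.

    The pupil is assumed to lie on the sphere of radius EYEBALL_RADIUS_MM
    around `eyeball_center`; normalizing the offset gives the direction
    that the search on the sphere converges to.
    """
    v = np.asarray(pupil_center, float) - np.asarray(eyeball_center, float)
    n = np.linalg.norm(v)
    if n == 0:
        raise ValueError("pupil center coincides with eyeball center")
    return v / n
```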
- In step S210, the alarm device (not shown) determines from the gaze direction obtained in step S209 whether the driver is facing the front, and executes alarm processing based on the determination result. That is, an alarm is issued when the gaze direction points outside a predetermined angle range for a predetermined time or longer.
- This alarm is a display including a warning message, a voice message from a speech synthesis LSI, LED light emission, sound emission from a speaker, or a combination of these.
- Specifically, if it is determined that the driver is not facing the front, the alarm count W is increased by one. Then, when the alarm count W exceeds a predetermined threshold, the alarm device considers that the driver has not looked at the front for a long time and issues an alarm. If it is determined that the driver is facing the front, the alarm device resets the alarm count W to zero. The alarm count W is zero in the initial state.
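A minimal sketch of this counter logic; the threshold of 30 frames is illustrative, since the patent only says "a predetermined threshold":

```python
def update_alarm_count(w, facing_front, threshold=30):
    """One step of the alarm logic in step S210.

    Returns (new_count, alarm_raised). `threshold` is an illustrative value.
    """
    if facing_front:
        return 0, False          # reset W when the driver faces the front
    w += 1                       # otherwise accumulate
    return w, w > threshold      # alarm once the count exceeds the threshold
```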
- In step S211, an end determination is performed; when the end condition is satisfied, the series of processing ends. When the end condition is not satisfied, the processing flow returns to step S201.
- The end determination may be performed by manual input of an end instruction, or may be triggered by some external signal.
- As described above, the switching determination section 105 selectively outputs the detection result of the first pupil image from pupil detection section 103 or the detection result of the second pupil image from pupil detection section 104, based on the calculated value of the red-eye occurrence intensity, which is the luminance in the first pupil image relative to the luminance of the peripheral image outside the first pupil image, and on the correlation characteristic between red-eye occurrence intensity and pupil detection accuracy value.
- The pupil detection unit 103 uses the first person image captured by the image input unit 101, whose imaging unit 111 and irradiation unit 112 are separated from each other by distance d1.
- The pupil detection unit 104 uses the second person image captured by the image input unit 102, whose imaging unit 121 and irradiation unit 122 are separated from each other by distance d2.
- The separation distance d1 is smaller than the separation distance d2.
- Thereby, the detection result of the first pupil image or the detection result of the second pupil image is selected based on the estimated accuracy value of pupil detection using the first pupil image, in which the red-eye phenomenon is relatively prone to occur. The estimate uses the red-eye occurrence intensity, which is the luminance in the first pupil image relative to the luminance of the peripheral image outside the first pupil image. As a result, even when a grayscale image captured with near-infrared light is used, a highly accurate detection result can be selected and output.
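One plausible reading of "relative luminance" here is the ratio of mean luminance inside the detected pupil region to the mean luminance of the peripheral region just outside it; the exact normalization is not fixed by the text, so the following is a hedged sketch only:

```python
import numpy as np

def red_eye_occurrence_intensity(pupil_pixels, surround_pixels):
    """Red-eye occurrence intensity as relative luminance.

    Computed here as mean luminance inside the detected pupil image divided
    by mean luminance of the peripheral image outside it. This ratio form is
    an assumption, not the patent's prescribed formula.
    """
    pupil_mean = float(np.mean(pupil_pixels))
    surround_mean = float(np.mean(surround_pixels))
    return pupil_mean / max(surround_mean, 1e-6)  # guard against division by zero
```

A bright (red-eye) pupil against a darker surround yields an intensity above 1, matching the described behavior of the near-camera image input unit 101.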
- In Embodiment 2, the pupil detection result is selected based on an illuminance coefficient, in addition to the calculated red-eye occurrence intensity and the correlation characteristic between red-eye occurrence intensity and pupil detection accuracy rate.
- FIG. 13 is a block diagram showing a configuration of pupil detection apparatus 700 according to Embodiment 2 of the present invention.
- the pupil detection device 700 includes an illuminance measurement instrument 701, a switching determination unit 702, and an evaluation value storage unit 703.
- the illuminance measuring instrument 701 measures the illuminance at the time of imaging by the image input units 101 and 102, and outputs the measured illuminance to the evaluation value storage unit 703.
- the illuminance measuring instrument 701 is installed near the face of the driver or at a position where the illuminance in the gaze direction of the driver can be measured.
- The switching determination unit 702 calculates the "red-eye occurrence intensity" from each of the target image data received from the image input unit 101 and the target image data received from the image input unit 102. Based on the calculated red-eye occurrence intensity, the correlation characteristic between red-eye occurrence intensity and pupil detection accuracy rate, and the measured illuminance, it selects either the pupil detection result of the pupil detection unit 103 or that of the pupil detection unit 104 as the signal to be output to the subsequent processing unit (for example, the gaze direction calculation unit).
- the switching determination unit 702 has an output signal selection unit 711 as shown in FIG.
- The output signal selection unit 711 corrects the pupil detection reliability received from the characteristic calculation unit 152 based on the illuminance, and, based on the "fluctuation tendency characteristic" of the corrected pupil detection reliability, selects either the pupil detection result of the pupil detection unit 103 or that of the pupil detection unit 104 as the signal to be output to the subsequent processing unit (for example, the gaze direction calculation unit).
- The correction of the pupil detection reliability based on the illuminance will be described in detail later.
- the evaluation value storage unit 703 stores the pupil detection reliability received from the switching determination unit 702 and the illuminance received from the illuminance measurement instrument 701 in association with the imaging time of the target image used to specify the pupil detection reliability.
- FIG. 15 is a flowchart for explaining the operation of pupil detection apparatus 700.
- In step S801, the illuminance measuring instrument 701 measures the illuminance at the time of imaging by the image input units 101 and 102.
- Although the target position of the illuminance measurement is optimally the driver's face area or the driver's gaze direction, it is not limited to these; for example, the illuminance outside the vehicle equipped with the pupil detection device 700 may be measured.
- In step S802, the switching determination unit 702 calculates the "red-eye occurrence intensity" from each of the target image data received from the image input unit 101 and the target image data received from the image input unit 102. Based on the calculated red-eye occurrence intensity, the correlation characteristic between red-eye occurrence intensity and pupil detection accuracy rate, and the measured illuminance, it selects either the pupil detection result of the pupil detection unit 103 or that of the pupil detection unit 104 as the signal to be output to the subsequent processing unit (for example, the gaze direction calculation unit).
- Specifically, the output signal selection unit 711 corrects, based on the illuminance, both the pupil detection reliability currently received from the characteristic calculation unit 152 and the pupil detection reliability stored in the evaluation value storage unit 703. This correction is performed by the following equation (3): the illuminance-reflected pupil detection reliability F is calculated by multiplying the pupil detection reliability E by the illuminance coefficient L.
- FIG. 16 shows the correspondence between the illuminance and the illuminance coefficient.
- The illuminance coefficient L is set to be smaller as the illuminance becomes higher.
- Although FIGS. 16A and 16B show usable correspondence relationships between the illuminance and the illuminance coefficient, the relationship is not limited to these. Any correspondence may be used as long as the gradient of the illuminance coefficient with respect to the illuminance never becomes positive.
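Equation (3) and the non-increasing coefficient of FIG. 16 can be sketched together as follows. The breakpoint values are invented for illustration; only the constraint that L never increase with illuminance comes from the text:

```python
def illuminance_coefficient(lux, breakpoints=((0.0, 1.0), (500.0, 0.5), (2000.0, 0.2))):
    """Piecewise-constant illuminance coefficient L, non-increasing in lux.

    `breakpoints` are illustrative (threshold, coefficient) pairs; the text
    only requires that the gradient of L with respect to illuminance never
    be positive.
    """
    coeff = breakpoints[0][1]
    for threshold, value in breakpoints:
        if lux >= threshold:
            coeff = value
    return coeff

def illuminance_reflected_reliability(e, lux):
    """Equation (3): F = E * L, the illuminance-reflected pupil detection reliability."""
    return e * illuminance_coefficient(lux)
```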
- The output signal selection unit 711 then selects the signal to be output to the subsequent processing unit (for example, the gaze direction calculation unit) based on the fluctuation tendency characteristic of the corrected pupil detection reliability (that is, the illuminance-reflected pupil detection reliability F).
- In calculating the fluctuation tendency characteristic, the gradient D2 of the time variation of the illuminance-reflected pupil detection reliability F is determined by the following equation (4).
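Equation (4) itself is not reproduced in this extract; as a stand-in, a least-squares slope of F against time over the evaluation window is one standard way to obtain such a gradient, and is sketched here under that assumption:

```python
import numpy as np

def reliability_gradient(times, f_values):
    """Gradient D2 of the time variation of F over the evaluation window.

    Implemented as the least-squares slope of F against time; this is a
    stand-in for equation (4), which is not given in this extract.
    """
    t = np.asarray(times, float)
    f = np.asarray(f_values, float)
    slope, _intercept = np.polyfit(t, f, 1)  # degree-1 fit: slope is the gradient
    return slope
```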
- The output signal selection unit 711 selects either the pupil detection result of the pupil detection unit 103 or that of the pupil detection unit 104 based on the calculated fluctuation tendency characteristic and the illuminance, and outputs the selected pupil detection result to the subsequent processing unit.
- Specifically, the output signal selection unit 711 selects the pupil detection result of the pupil detection unit 103 when the illuminance is equal to or greater than a predetermined value throughout the predetermined period 602 described above.
- Otherwise, the output signal selection unit 711 selects, based on the calculated fluctuation tendency characteristic, either the pupil detection result of the pupil detection unit 103 or that of the pupil detection unit 104, and outputs the selected pupil detection result to the subsequent processing unit.
- The selection criteria are as follows. When, throughout the predetermined period (period 602 in FIG. 12), the pupil detection reliability is smaller than the switching determination reference value 504 and the gradient D2 remains negative (that is, the pupil detection reliability decreases continuously), the pupil detection result of the pupil detection unit 104 is selected. In other words, when the portion of the pupil detection reliability curve 603 within the predetermined period is below the switching determination reference value 504 and trending downward, the pupil detection result of the pupil detection unit 104 is selected.
- Conversely, when, throughout the predetermined period, the pupil detection reliability is larger than the switching determination reference value 504 and the gradient D2 remains positive (that is, the pupil detection reliability increases continuously), the pupil detection result of the pupil detection unit 103 is selected.
- In all other cases, whichever of the pupil detection result of the pupil detection unit 103 and the pupil detection result of the pupil detection unit 104 was selected previously is selected again this time.
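The three selection rules above can be sketched as a single function; the detector labels and the shape of the inputs are illustrative, not taken from the patent:

```python
def select_detector(reliabilities, gradients, reference, previous):
    """Selection rule over the predetermined period (period 602 in FIG. 12).

    `reliabilities` and `gradients` cover the period; `reference` is the
    switching determination reference value 504; `previous` is the detector
    chosen last time ('unit103' or 'unit104'). All names are illustrative.
    """
    below = all(r < reference for r in reliabilities)
    above = all(r > reference for r in reliabilities)
    falling = all(g < 0 for g in gradients)
    rising = all(g > 0 for g in gradients)
    if below and falling:
        return "unit104"   # reliability low and continuously decreasing
    if above and rising:
        return "unit103"   # reliability high and continuously increasing
    return previous        # otherwise keep the previous selection
```

Keeping the previous selection in the ambiguous case acts as hysteresis, avoiding rapid switching between the two detectors.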
- As described above, the output signal selection unit 711 corrects the fluctuation tendency characteristic based on the illuminance at the time of imaging, and, based on the corrected fluctuation tendency characteristic, selects the detection result of the first pupil image or the detection result of the second pupil image.
- FIG. 17 is a block diagram showing a configuration of pupil detection apparatus 900 according to Embodiment 3 of the present invention.
- pupil detection apparatus 900 has mode selection section 901 and switching determination section 902.
- the mode selection unit 901 selects one of a plurality of prepared modes.
- The plurality of modes include, for example, a first mode in which the target person is wearing sunglasses and a second mode in which the person is not wearing sunglasses.
- Information on the selected mode is output to the switching determination unit 902.
- The switching determination unit 902 calculates the "red-eye occurrence intensity" from each of the target image data received from the image input unit 101 and the target image data received from the image input unit 102, and, based on the calculated red-eye occurrence intensity and the correlation characteristic between red-eye occurrence intensity and pupil detection accuracy rate, selects either the pupil detection result of the pupil detection unit 103 or that of the pupil detection unit 104 as the signal to be output to the subsequent processing unit (for example, the gaze direction calculation unit).
- the switching determination unit 902 switches the correlation characteristic used to select the pupil detection result according to the mode information received from the mode selection unit 901.
- the switching determination unit 902 has a characteristic calculation unit 911.
- The characteristic calculation unit 911 holds correlation characteristic data corresponding to each of the plurality of modes selectable by the mode selection unit 901. It selects the correlation characteristic data corresponding to the mode information received from the mode selection unit 901, and uses that data to calculate the pupil detection accuracy rate corresponding to the red-eye occurrence intensity received from the red-eye occurrence intensity calculation unit 151.
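A sketch of this per-mode lookup; the sample points below are entirely invented for illustration, since the patent only states that such correlation characteristic data exists for each mode:

```python
import numpy as np

# Illustrative correlation characteristics: pupil detection accuracy rate as a
# function of red-eye occurrence intensity, one table per selectable mode.
CORRELATION_TABLES = {
    "sunglasses":    {"intensity": [0.0, 1.0, 2.0, 3.0], "accuracy": [0.4, 0.5, 0.7, 0.8]},
    "no_sunglasses": {"intensity": [0.0, 1.0, 2.0, 3.0], "accuracy": [0.6, 0.7, 0.9, 0.95]},
}

def pupil_detection_accuracy(mode, red_eye_intensity):
    """Look up the selected mode's correlation characteristic and interpolate
    the accuracy rate for the given red-eye occurrence intensity."""
    table = CORRELATION_TABLES[mode]
    return float(np.interp(red_eye_intensity, table["intensity"], table["accuracy"]))
```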
- Since the pupil detection result is selected using the correlation characteristic optimal for each mode, a highly accurate pupil detection result can be selected reliably.
- For example, the pupil detection device 100 may be configured to include an imaging unit composed of two imaging units and one irradiation unit (see FIG. 19A).
- Conversely, pupil detection apparatus 100 may be configured to include an imaging unit composed of one imaging unit and two irradiation units (see FIG. 19B). The point is that there need only exist two imaging-unit/irradiation-unit pairs that differ in the separation distance between the imaging unit and the irradiation unit.
- Each function block employed in the description of each of the above embodiments may typically be implemented as an LSI, an integrated circuit. These may be individually formed into single chips, or a single chip may include some or all of them. Although the term LSI is used here, it may also be called IC, system LSI, super LSI, or ultra LSI depending on the degree of integration. Further, the method of circuit integration is not limited to LSI, and implementation using dedicated circuitry or general-purpose processors is also possible.
- After LSI manufacture, a field programmable gate array (FPGA) that can be programmed, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may be used.
- the pupil detection device and the pupil detection method of the present invention are useful as a device capable of selecting and outputting a detection result with high accuracy even when a grayscale image using near-infrared light is used.
Abstract
Description
[Configuration of Pupil Detection Device]
FIG. 1 is a block diagram showing the configuration of pupil detection apparatus 100 according to Embodiment 1 of the present invention. Pupil detection apparatus 100 is, for example, installed in the cabin of an automobile and used connected to an alarm device. This alarm device detects the driver's gaze direction based on the detection result of pupil detection apparatus 100, and when the driver has not faced the front for a long time, warns the driver to draw attention. In the following, the case where pupil detection apparatus 100 is installed in the cabin of an automobile is described as an example.
The operation of pupil detection apparatus 100 having the above configuration will be described. FIG. 8 is a flowchart for explaining the operation of pupil detection apparatus 100. The flowchart of FIG. 8 also includes the processing flow of the alarm device described above; that is, FIG. 8 shows the processing flow when pupil detection apparatus 100 is applied to the alarm device.
In step S201, image input units 101 and 102 emit light to irradiate the imaging target (here, a person) and capture images of the target, whereby the target images are acquired.
In step S202, face detection units 131 and 141 detect a face image from the target images received from image input units 101 and 102. FIG. 9 shows a face image as a target image. FIG. 9A shows an image captured by imaging unit 111 when the red-eye phenomenon occurs, and FIG. 9B shows an image captured by imaging unit 121 when the red-eye phenomenon does not occur. In the captured face image, for example, the horizontal direction of the image is the X axis, the vertical direction is the Y axis, and one pixel corresponds to one coordinate point.
In steps S205 to S208, switching determination unit 105 calculates the red-eye occurrence intensity from the target image data received from image input unit 101, and, based on the calculated red-eye occurrence intensity and the correlation characteristic between red-eye occurrence intensity and pupil detection accuracy rate, selects either the pupil detection result of pupil detection unit 103 or that of pupil detection unit 104 as the signal to be output to the subsequent processing unit (for example, the gaze direction calculation unit).
In step S209, the gaze direction calculation unit (not shown) calculates the gaze direction from the coordinates of the face part group obtained in step S203 and the center coordinates of the pupil image obtained in step S204.
In step S210, the alarm device (not shown) determines from the gaze direction obtained in step S209 whether the driver is facing the front, and executes alarm processing based on the determination result. That is, an alarm is issued when the gaze direction points outside a predetermined angle range for a predetermined time or longer. This alarm is a display including a warning message, a voice message from a speech synthesis LSI, LED light emission, sound emission from a speaker, or a combination of these.
In Embodiment 2, the pupil detection result is selected based on an illuminance coefficient, in addition to the calculated red-eye occurrence intensity and the correlation characteristic between red-eye occurrence intensity and pupil detection accuracy rate.
In Embodiment 3, correlation characteristics corresponding to each of a plurality of modes are prepared, and the pupil detection reliability is calculated using the correlation characteristic corresponding to the selected mode.
(1) In each of the above embodiments, the case has been described in which image input unit 101 and image input unit 102 each include one imaging unit and one irradiation unit. However, the present invention is not limited to this. For example, pupil detection apparatus 100 may include an imaging unit composed of two imaging units and one irradiation unit (see FIG. 19A). Conversely, pupil detection apparatus 100 may include an imaging unit composed of one imaging unit and two irradiation units (see FIG. 19B). The point is that there need only exist two imaging-unit/irradiation-unit pairs that differ in the separation distance between the imaging unit and the irradiation unit.
Each function block used in the description of each of the above embodiments is typically realized as an LSI, an integrated circuit. These may be individually formed into single chips, or a single chip may include some or all of them. Although the term LSI is used here, it may also be called IC, system LSI, super LSI, or ultra LSI depending on the degree of integration.
The method of circuit integration is not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor. A field programmable gate array (FPGA) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may be used.
Furthermore, if circuit integration technology that replaces LSI emerges through progress in semiconductor technology or another derivative technology, the function blocks may naturally be integrated using that technology. Application of biotechnology is one possibility.
101, 102 Image input unit
103, 104 Pupil detection unit
105, 702, 902 Switching determination unit
106, 703 Evaluation value storage unit
111, 121 Imaging unit
112, 122 Irradiation unit
113, 123 Synchronization unit
131, 141 Face detection unit
132, 142 Face part detection unit
133, 143 Pupil information calculation unit
151 Red-eye occurrence intensity calculation unit
152, 911 Characteristic calculation unit
153, 711 Output signal selection unit
200 Imaging unit
701 Illuminance measuring instrument
901 Mode selection unit
Claims (6)
- A pupil detection device comprising:
a first imaging pair consisting of an imaging means and a light emitting means that emits light at the time of imaging, separated from each other by a predetermined separation distance;
a second imaging pair whose separation distance is larger than that of the first imaging pair;
a detection means for detecting a first pupil image from a first person image in which a person is captured by the first imaging pair, and detecting a second pupil image from a second person image in which the person is captured by the second imaging pair; and
an output switching means for selectively outputting the detection result of the first pupil image or the detection result of the second pupil image, based on a calculated value of red-eye occurrence intensity, which is the luminance in the first pupil image relative to the luminance of a peripheral image outside the first pupil image, and on a correlation characteristic between red-eye occurrence intensity and pupil detection accuracy value.
- The pupil detection device according to claim 1, wherein the output switching means comprises:
an occurrence intensity calculation means for calculating the red-eye occurrence intensity based on the first person image;
an accuracy value calculation means for calculating the pupil detection accuracy value corresponding to the calculated value of the red-eye occurrence intensity, based on the calculated value and the correlation characteristic between red-eye occurrence intensity and pupil detection accuracy value; and
a selection means for selecting the detection result of the first pupil image or the detection result of the second pupil image, based on a fluctuation tendency characteristic of the calculated pupil detection accuracy values.
- The pupil detection device according to claim 2, wherein the selection means selects the detection result of the first pupil image when all of the calculated pupil detection accuracy values in a predetermined period are less than a predetermined threshold and the time variation tendency of the calculated pupil detection accuracy values is a monotonically decreasing fluctuation tendency characteristic.
- The pupil detection device according to claim 3, wherein the predetermined threshold is the pupil detection accuracy value corresponding to the point at which a first correlation characteristic, between red-eye occurrence intensity and the pupil detection accuracy value of the first pupil image, intersects a second correlation characteristic, between red-eye occurrence intensity and the pupil detection accuracy value of the second pupil image.
- The pupil detection device according to claim 2, wherein the output switching means further comprises a correction means for correcting the fluctuation tendency characteristic based on the illuminance at the time of imaging, and the selection means selects the detection result of the first pupil image or the detection result of the second pupil image based on the corrected fluctuation tendency characteristic.
- A pupil detection method in a pupil detection device comprising a first imaging pair consisting of an imaging means and a light emitting means that emits light at the time of imaging, separated from each other by a predetermined separation distance, and a second imaging pair whose separation distance is larger than that of the first imaging pair, the method comprising:
detecting a first pupil image from a first person image in which a person is captured by the first imaging pair, and detecting a second pupil image from a second person image in which the person is captured by the second imaging pair; and
selectively outputting the detection result of the first pupil image or the detection result of the second pupil image, based on a calculated value of red-eye occurrence intensity, which is the luminance in the first pupil image relative to the luminance of a peripheral image outside the first pupil image, and on a correlation characteristic between red-eye occurrence intensity and pupil detection accuracy value.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011521790A JP5694161B2 (ja) | 2010-02-26 | 2011-01-24 | 瞳孔検出装置及び瞳孔検出方法 |
US13/318,431 US8810642B2 (en) | 2010-02-26 | 2011-01-24 | Pupil detection device and pupil detection method |
EP11746977.5A EP2541493B1 (en) | 2010-02-26 | 2011-01-24 | Pupil detection device and pupil detection method |
CN201180001860.1A CN102439627B (zh) | 2010-02-26 | 2011-01-24 | 瞳孔检测装置和瞳孔检测方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-042455 | 2010-02-26 | ||
JP2010042455 | 2010-02-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011105004A1 true WO2011105004A1 (ja) | 2011-09-01 |
Family
ID=44506439
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/000359 WO2011105004A1 (ja) | 2010-02-26 | 2011-01-24 | 瞳孔検出装置及び瞳孔検出方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8810642B2 (ja) |
EP (1) | EP2541493B1 (ja) |
JP (1) | JP5694161B2 (ja) |
CN (1) | CN102439627B (ja) |
WO (1) | WO2011105004A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016514974A (ja) * | 2013-02-19 | 2016-05-26 | オプトス ピーエルシー | 画像処理における改良または画像処理に関する改良 |
JP2017068615A (ja) * | 2015-09-30 | 2017-04-06 | 富士通株式会社 | 視線検出システム、視線検出方法および視線検出プログラム |
US10147022B2 (en) | 2015-06-23 | 2018-12-04 | Fujitsu Limited | Detection method and system |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104094197B (zh) * | 2012-02-06 | 2018-05-11 | 索尼爱立信移动通讯股份有限公司 | 利用投影仪的注视追踪 |
JP5999013B2 (ja) * | 2013-04-12 | 2016-09-28 | 株式会社デンソー | 開眼状態判定装置 |
JP6260006B2 (ja) * | 2013-07-30 | 2018-01-17 | パナソニックIpマネジメント株式会社 | 撮像装置、並びにそれを用いた撮像システム、電子ミラーシステムおよび測距装置 |
WO2015106202A1 (en) * | 2014-01-10 | 2015-07-16 | Carleton Life Support Systems, Inc. | Reduced cognitive function detection and alleviation system for a pilot |
DE102014013165A1 (de) * | 2014-09-04 | 2016-03-10 | GM Global Technology Operations LLC (n. d. Ges. d. Staates Delaware) | Kraftfahrzeug sowie Verfahren zum Betrieb eines Kraftfahrzeugs |
WO2016099402A1 (en) * | 2014-12-16 | 2016-06-23 | Singapore Health Services Pte Ltd | A method and system for monitoring and/or assessing pupillary responses |
DE102017202659A1 (de) * | 2017-02-20 | 2018-08-23 | Bayerische Motoren Werke Aktiengesellschaft | System und Verfahren zur Erkennung der Müdigkeit eines Fahrzeugführers |
EP3415078A1 (en) | 2017-06-16 | 2018-12-19 | Essilor International | Method and system for determining a pupillary distance of an individual |
US10402644B1 (en) * | 2017-10-24 | 2019-09-03 | Wells Fargo Bank, N.A. | System and apparatus for improved eye tracking using a mobile device |
US10956737B1 (en) * | 2017-10-24 | 2021-03-23 | Wells Fargo Bank, N.A. | System and apparatus for improved eye tracking using a mobile device |
CN108053444B (zh) * | 2018-01-02 | 2021-03-12 | 京东方科技集团股份有限公司 | 瞳孔定位方法及装置、设备和存储介质 |
CN110393504B (zh) * | 2018-04-24 | 2022-02-15 | 高金铎 | 一种智能鉴毒的方法及装置 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0694980A (ja) * | 1992-09-14 | 1994-04-08 | Nikon Corp | 視線検出装置付きカメラ |
JPH07319037A (ja) * | 1994-05-19 | 1995-12-08 | Olympus Optical Co Ltd | 視線検出カメラ及びプリンタ装置 |
JP2008158922A (ja) * | 2006-12-26 | 2008-07-10 | Aisin Seiki Co Ltd | 瞼検出装置、瞼検出方法及びプログラム |
JP2010042455A (ja) | 2008-08-08 | 2010-02-25 | Hitachi Koki Co Ltd | 電動工具 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7280678B2 (en) * | 2003-02-28 | 2007-10-09 | Avago Technologies General Ip Pte Ltd | Apparatus and method for detecting pupils |
WO2004107695A1 (ja) * | 2003-05-27 | 2004-12-09 | Nec Corporation | 適応変調において適切な閾値で変調方式を選択するデータ通信装置 |
CN100350877C (zh) * | 2003-07-04 | 2007-11-28 | 松下电器产业株式会社 | 活体眼睛判定方法及活体眼睛判定装置 |
JP2006338236A (ja) | 2005-06-01 | 2006-12-14 | Matsushita Electric Ind Co Ltd | 眼画像撮影装置およびそれを用いた認証装置 |
JP5089405B2 (ja) | 2008-01-17 | 2012-12-05 | キヤノン株式会社 | 画像処理装置及び画像処理方法並びに撮像装置 |
US8081254B2 (en) * | 2008-08-14 | 2011-12-20 | DigitalOptics Corporation Europe Limited | In-camera based method of detecting defect eye with high accuracy |
-
2011
- 2011-01-24 WO PCT/JP2011/000359 patent/WO2011105004A1/ja active Application Filing
- 2011-01-24 US US13/318,431 patent/US8810642B2/en active Active
- 2011-01-24 JP JP2011521790A patent/JP5694161B2/ja active Active
- 2011-01-24 CN CN201180001860.1A patent/CN102439627B/zh active Active
- 2011-01-24 EP EP11746977.5A patent/EP2541493B1/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0694980A (ja) * | 1992-09-14 | 1994-04-08 | Nikon Corp | 視線検出装置付きカメラ |
JPH07319037A (ja) * | 1994-05-19 | 1995-12-08 | Olympus Optical Co Ltd | 視線検出カメラ及びプリンタ装置 |
JP2008158922A (ja) * | 2006-12-26 | 2008-07-10 | Aisin Seiki Co Ltd | 瞼検出装置、瞼検出方法及びプログラム |
JP2010042455A (ja) | 2008-08-08 | 2010-02-25 | Hitachi Koki Co Ltd | 電動工具 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016514974A (ja) * | 2013-02-19 | 2016-05-26 | オプトス ピーエルシー | 画像処理における改良または画像処理に関する改良 |
US10147022B2 (en) | 2015-06-23 | 2018-12-04 | Fujitsu Limited | Detection method and system |
JP2017068615A (ja) * | 2015-09-30 | 2017-04-06 | 富士通株式会社 | 視線検出システム、視線検出方法および視線検出プログラム |
Also Published As
Publication number | Publication date |
---|---|
EP2541493A1 (en) | 2013-01-02 |
EP2541493A4 (en) | 2017-03-29 |
JPWO2011105004A1 (ja) | 2013-06-17 |
CN102439627A (zh) | 2012-05-02 |
US20120050516A1 (en) | 2012-03-01 |
US8810642B2 (en) | 2014-08-19 |
EP2541493B1 (en) | 2019-10-23 |
JP5694161B2 (ja) | 2015-04-01 |
CN102439627B (zh) | 2014-10-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011105004A1 (ja) | 瞳孔検出装置及び瞳孔検出方法 | |
JP5661043B2 (ja) | 外光映り込み判定装置、視線検出装置及び外光映り込み判定方法 | |
JP5529660B2 (ja) | 瞳孔検出装置及び瞳孔検出方法 | |
US8066375B2 (en) | Eye tracker having an extended span of operating distances | |
JP4078334B2 (ja) | 画像処理装置および画像処理方法 | |
JP5538160B2 (ja) | 瞳孔検出装置及び瞳孔検出方法 | |
JP5466610B2 (ja) | 視線推定装置 | |
JP6601351B2 (ja) | 視線計測装置 | |
JP2004320287A (ja) | デジタルカメラ | |
US10722112B2 (en) | Measuring device and measuring method | |
US11163994B2 (en) | Method and device for determining iris recognition image, terminal apparatus, and storage medium | |
JP2010244156A (ja) | 画像特徴量検出装置及びこれを用いた視線方向検出装置 | |
CN112153363B (zh) | 用于3d角膜位置估计的方法和系统 | |
US8090253B2 (en) | Photographing control method and apparatus using strobe | |
CN111522431B (zh) | 使用眼睛跟踪系统对闪光进行分类 | |
CN112041783B (zh) | 曝光时间控制的方法、系统和计算机存储介质 | |
JP2017162233A (ja) | 視線検出装置および視線検出方法 | |
JPH0761256A (ja) | 車両用前方不注意検知装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180001860.1 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011521790 Country of ref document: JP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11746977 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13318431 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011746977 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |