WO2021112282A1 - Gaze tracking device using pupil detection and eyelid line curvature, and associated method - Google Patents

Info

Publication number
WO2021112282A1
Authority
WO
WIPO (PCT)
Prior art keywords
gaze
eyelid
pupil
eye
detected
Prior art date
Application number
PCT/KR2019/017051
Other languages
English (en)
Korean (ko)
Inventor
신영길
이상화
한상윤
Original Assignee
서울대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 서울대학교산학협력단
Publication of WO2021112282A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/28 Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G06V10/30 Noise filtering
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/19 Sensors therefor
    • G06V40/193 Preprocessing; Feature extraction

Definitions

  • The present invention relates to a gaze tracking apparatus and method using pupil detection and eyelid curvature. More particularly, it detects the pupil and the eye region from a user's eye image captured by a camera, calculates the relative position of the pupil within the detected eye region, and uses that relative position together with the eyelid curvature, which changes according to the height of the pupil, to accurately track in real time the gaze of a user moving in the up, down, left, and right directions.
  • Vision is the most important of the senses through which a person perceives the external world and gathers information, and interest in technology that tracks where a person is looking is growing.
  • A gaze tracking device recognizes, in real time, the direction in which a person is looking, and is applied in various fields such as driver drowsiness detection, research on learning disabilities, pilot training, ergonomics, advertising, and iris recognition.
  • Such a gaze tracking apparatus detects the position of the pupil from an image acquired through a camera and tracks the person's gaze based on the detected position.
  • However, because the conventional gaze tracking apparatus tracks the gaze using only the position of the pupil, without a specific reference point for the pupil, the detected pupil position differs from moment to moment when the body moves even while the person is looking in the same direction, so the gaze direction cannot be tracked accurately.
  • In addition, because only the position of the pupil is detected and used, the conventional gaze tracking apparatus suffers significantly lower accuracy for the vertical gaze (i.e., the height of the gaze) than for the left-right gaze.
  • Accordingly, the present invention detects the pupil from the user's eye image input through the camera, extracts the eye region, including the edge and shape of the eye, based on the detected pupil, estimates the user's gaze direction from the eye region, and accurately recognizes the user's vertical gaze by applying the change in eyelid curvature according to the detected height of the pupil to the estimated gaze direction, thereby proposing a method for accurately tracking the user's overall gaze in real time and providing a reliable gaze tracking result.
  • In addition, because the present invention uses the relative position of the pupil with respect to the eye region and the curvature of the eyelid, it provides a stable and robust gaze tracking result in real time even if the position of the camera or the gaze tracking device changes.
  • Korean Patent Registration No. 2016308 (October 31, 2018) relates to a gaze tracking method using the movement of the pupil and the distance between the eyes and the face edge: a face region is detected from an image acquired through a camera; the movement distance of the pupil and the direction and angle of the gaze are calculated based on the detected face region; the distance between a previously installed reference and the person, and the person's eye height, are calculated using the shape of the person in the image information and markers displayed on the floor and walls; and the gaze position is tracked using the calculated gaze direction, gaze angle, and eye height.
  • That is, the prior art calculates the gaze direction and angle from the difference between the movement distance of the pupil and the distance from the outer end of each eye to the edge of the face in the image information, and tracks the person's gaze based on the person's eye height calculated using the markers.
  • In contrast, the present invention detects the pupil from an eye image, detects the eye region based on the detected pupil, calculates the relative position of the detected pupil within the detected eye region, and accurately tracks the up, down, left, and right movement of the gaze in real time using the calculated relative position of the pupil and the eyelid curvature that changes according to the detected height of the pupil; there is thus a clear difference in configuration between the present invention and the prior art.
  • Korean Patent No. 1706992 (2017.02.09.) relates to a gaze tracking apparatus and method, and a recording medium for performing the same, which detect a person's face, eye, and nose regions from surrounding images taken with a camera, detect the pupil region and its center from the detected eye region, calculate the angle of the face pose from the eye and nose regions, detect the eyeball center according to the calculated angle, and detect the person's gaze direction by connecting the center of the detected pupil region and the center of the eyeball.
  • the prior art describes a method for tracking a person's gaze by detecting the center of the pupil region and the center of the eyeball in the face image of a person.
  • In contrast, the present invention applies the change in eyelid curvature according to the relative position of the pupil with respect to the eye region and the height of the pupil, so that up, down, left, and right gaze movement can be tracked in real time; it is clear that the prior art neither describes nor suggests this feature.
  • The present invention was conceived to solve the above problems. An object of the present invention is to provide a gaze tracking device and method using pupil detection and eyelid curvature that detect the pupil from a captured image of the user, extract the eye region using the detected pupil, calculate the relative position of the detected pupil within the extracted eye region, and recognize and track the user's gaze in real time by combining the calculated relative position of the pupil, the change in eyelid curvature according to the height of the pupil, and a three-dimensional eye model, thereby providing highly reliable gaze tracking results.
  • A further object of the present invention is to provide a gaze tracking apparatus and method using pupil detection and eyelid curvature that, after detecting the pupil, set a window for detecting the eye region based on the detected pupil and extend it to the boundary where the eyelid edge adjacent to the pupil edge continues; fit the contour of the upper eyelid to a quadratic curve using the least-squares method; set a frontal gaze reference point as the internal division point between the vertex of the fitted quadratic curve and the front corner of the eye located at the end point of the left edge of the eyelid; estimate the user's gaze direction by calculating the relative position of the detected pupil with respect to the set reference point; and recognize and track the user's up, down, left, and right gaze movement in real time by reflecting the change in eyelid curvature according to the height of the pupil in the estimated gaze direction.
  • To this end, the gaze tracking apparatus includes an edge detector configured to convert the user's eye image into a binary image and detect edges from it; a pupil detector configured to detect the pupil from the detected edges; an eye region detector configured to detect the eye region by detecting the eyelid contour from the detected pupil and edges; and a gaze tracker configured to track the user's gaze in real time using the detected pupil, eyelid contour, and eye region, wherein the real-time tracking of the user's gaze is performed by recognizing the user's gaze direction using the curvature of the detected eyelid contour, which changes according to the detected height of the pupil located in the detected eye region.
  • In addition, the eye region detector sets a window of a preset size on the converted binary image based on the detected pupil, extends the window in the left and right directions up to the boundary where the eyelid edge adjacent to the pupil edge continues, and detects the eyelid contour, and thereby the eye region, by expanding or reducing the window in the vertical direction up to the upper and lower edge boundaries of the eyelid.
  • In addition, the gaze tracking apparatus further includes a pupil relative position calculator for calculating the relative position of the detected pupil with respect to the detected eye region, and an eyelid quadratic-curve fitting unit for fitting the detected eyelid contour to a quadratic curve.
  • The relative position of the pupil is calculated, within the entire detected eye region, by setting as the frontal gaze reference point the internal division point between the position coordinates of the front corner of the eye (the end point of the left edge of the detected eyelid contour) and the position coordinates of the vertex of the fitted quadratic curve, and then computing the position of the detected pupil using the set frontal gaze reference point as the origin.
  • In addition, the gaze tracking apparatus further includes an eyelid curvature calculator configured to calculate the eyelid curvature as the curvature of the quadratic curve fitted to the detected eyelid contour. The gaze tracker estimates the gaze in the left, right, up, and down directions based on the calculated relative position of the pupil; computes the change in eyelid curvature between the estimated gaze and the central gaze, based on the eyelid curvature calculated for the central gaze according to the set frontal gaze reference point and the currently calculated eyelid curvature; and tracks the user's gaze by correcting the vertical direction of the estimated gaze using the computed change in curvature.
  • In addition, the gaze tracking apparatus further includes an image acquisition unit for acquiring the user's eye image as a visible-light or infrared image, and a gaze tracking result providing unit for providing the gaze tracking result, wherein the edge detector removes noise from the acquired eye image before converting it into the binary image.
  • In addition, the gaze tracking method includes an edge detection step of converting the user's eye image into a binary image and detecting edges from it; a pupil detection step of detecting the pupil from the detected edges; an eye region detection step of detecting the eye region by detecting the eyelid contour from the detected pupil and edges; and a gaze tracking step of tracking the user's gaze in real time using the detected pupil, eyelid contour, and eye region, wherein the real-time tracking of the user's gaze is performed by recognizing the user's gaze direction using the curvature of the detected eyelid contour, which changes according to the detected height of the pupil located in the detected eye region.
  • In the eye region detection step, a window of a preset size is set on the converted binary image based on the detected pupil, the window is extended in the left and right directions up to the boundary where the eyelid edge adjacent to the pupil edge continues, and the eyelid contour, and thereby the eye region, is detected by expanding or reducing the window in the vertical direction up to the upper and lower edge boundaries of the eyelid.
  • In addition, the gaze tracking method further includes a pupil relative position calculation step of calculating the relative position of the detected pupil with respect to the detected eye region, and an eyelid quadratic-curve fitting step of fitting the detected eyelid contour to a quadratic curve.
  • In the method, the relative position of the pupil is likewise calculated, within the entire detected eye region, by setting as the frontal gaze reference point the internal division point between the position coordinates of the front corner of the eye (the end point of the left edge of the detected eyelid contour) and the position coordinates of the vertex of the fitted quadratic curve, and then computing the position of the detected pupil using the set frontal gaze reference point as the origin.
  • In addition, the gaze tracking method further includes an eyelid curvature calculation step of calculating the eyelid curvature as the curvature of the quadratic curve fitted to the detected eyelid contour. In the gaze tracking step, the gaze in the left, right, up, and down directions is estimated based on the calculated relative position of the pupil; the change in eyelid curvature between the estimated gaze and the central gaze is computed based on the eyelid curvature calculated for the central gaze according to the set frontal gaze reference point and the currently calculated eyelid curvature; and the user's gaze is tracked by correcting the vertical direction of the estimated gaze using the computed change in curvature.
  • In addition, the gaze tracking method further includes an image acquisition step of acquiring the user's eye image as a visible-light or infrared image, and a gaze tracking result providing step of providing the gaze tracking result, wherein the edge detection step further includes removing noise from the acquired eye image before converting it into the binary image.
  • As described above, according to the gaze tracking apparatus and method using pupil detection and eyelid curvature of the present invention, the pupil is detected from the input image including the eye region, the user's eye region is detected based on the detected pupil, the relative position of the pupil within the detected eye region is calculated, and the up, down, left, and right movement of the user's gaze is tracked in real time using the calculated relative position of the pupil and the eyelid curvature according to the height of the pupil.
  • In other words, the present invention accurately recognizes not only the left-right direction of the gaze but also its vertical direction, thereby accurately recognizing and providing the user's overall gaze.
  • FIG. 1 is a conceptual diagram illustrating a gaze tracking apparatus and method using pupil detection and eyelid curvature according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a process of detecting a user's eye region according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating the process of fitting the eyelid quadratic curve according to an embodiment of the present invention.
  • FIG. 4 is a view showing the relative position of the pupil according to an embodiment of the present invention.
  • FIG. 5 is a view showing a two-dimensional eye model according to an embodiment of the present invention.
  • FIG. 6 is a diagram illustrating a three-dimensional eye model according to an embodiment of the present invention.
  • FIG. 7 is a diagram illustrating a gaze estimation result using only a three-dimensional eye model according to an embodiment of the present invention.
  • FIG. 8 is a view showing changes in the eyelid curvature according to the vertical height of the pupil according to an embodiment of the present invention.
  • FIG. 9 is a diagram illustrating a gaze tracking result reflecting a curvature of an eyelid according to an embodiment of the present invention.
  • FIG. 10 is a block diagram illustrating a configuration of a gaze tracking apparatus using pupil detection and eyelid curvature according to an embodiment of the present invention.
  • FIG. 11 is a flowchart illustrating a procedure for tracking gaze using pupil detection and eyelid curvature according to an embodiment of the present invention.
  • FIG. 1 is a conceptual diagram illustrating a gaze tracking apparatus and method using pupil detection and eyelid curvature according to an embodiment of the present invention.
  • As shown in FIG. 1, the gaze tracking apparatus 100 (hereinafter, the gaze tracking device) using pupil detection and eyelid curvature according to an embodiment of the present invention extracts the iris and pupil from an eye image obtained by photographing the user, and tracks the user's gaze in real time using the curvature of the eyelid, which changes according to the height of the extracted pupil.
  • The gaze tracking device 100 can be applied to a variety of devices: to a virtual reality (VR) or augmented reality (AR) device, to facilitate interaction between the device and the user through the gaze tracking result; to an iris recognition device, to recognize the iris accurately by tracking the user's gaze; or to a drowsiness detection device in an automobile, to determine whether the driver is drowsy from the change in eyelid curvature and thereby support safe driving. That is, the gaze tracking device 100 can be applied to various devices regardless of whether they are wearable (e.g., VR or AR devices) or non-wearable (e.g., iris recognition or drowsiness detection devices), so that the gaze tracking result can be readily used by such devices.
  • the user's eye image is obtained by photographing the user's eye area through a visible light camera or an infrared camera.
  • the gaze tracking apparatus 100 converts the obtained eye image into a binary image, and detects a plurality of edges from the converted binary image.
  • To detect the edges accurately, the gaze tracking apparatus 100 first performs a noise removal process.
  • Specifically, the gaze tracking apparatus 100 applies a median filter of a preset size (e.g., 3x3), which outputs, for each pixel, the median of the values of its neighboring pixels as the new pixel value; this removes noise, including impulse noise that appears as abrupt value changes in the eye image, and thereby eliminates unnecessary small edges.
  • The noise may also be removed with other filters, such as an averaging filter, instead of the median filter.
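The median-filtering step described above can be sketched with a minimal NumPy illustration (in practice a library routine such as OpenCV's `cv2.medianBlur` would be used; the tiny synthetic patch below is only for demonstration):

```python
import numpy as np

def median_filter_3x3(img):
    """Remove impulse noise by replacing each interior pixel with the
    median of its 3x3 neighborhood (border pixels are left unchanged
    for brevity)."""
    out = img.copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = np.median(img[y - 1:y + 2, x - 1:x + 2])
    return out

# A flat gray patch corrupted by a single impulse ("salt") pixel.
eye = np.full((5, 5), 100, dtype=np.uint8)
eye[2, 2] = 255                 # impulse noise: an abrupt value change
clean = median_filter_3x3(eye)
print(int(clean[2, 2]))         # -> 100: the impulse is replaced by the median
```

Unlike an averaging filter, the median filter removes the impulse entirely instead of smearing it into its neighbors, which is why it is preferred here for suppressing small spurious edges.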
  • The edge detection is performed using the Canny edge detection method, which uses two threshold values: edges are first seeded where the gradient magnitude exceeds the high threshold and are then extended along the edge direction through pixels that exceed the low threshold (hysteresis thresholding).
  • the eye tracking apparatus 100 detects a pupil from the detected edge.
  • The pupil is detected by applying a circular Hough transform to the detected edges and detecting the circular structure among them.
  • the eye tracking apparatus 100 may detect the iris as well as the pupil.
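The circular Hough transform named above can be illustrated with a toy, fixed-radius implementation (a real detector such as OpenCV's `cv2.HoughCircles` sweeps a range of radii and pre-smooths the image; the edge points and radius here are synthetic):

```python
import numpy as np

def hough_circle_fixed_r(edge_pts, shape, r):
    """Vote for circle centres at a single known radius r: each edge
    point votes for every candidate centre lying at distance r from it,
    and the most-voted accumulator cell is taken as the pupil centre."""
    acc = np.zeros(shape, dtype=int)
    thetas = np.linspace(0, 2 * np.pi, 120, endpoint=False)
    for (y, x) in edge_pts:
        cy = np.round(y - r * np.sin(thetas)).astype(int)
        cx = np.round(x - r * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < shape[0]) & (cx >= 0) & (cx < shape[1])
        np.add.at(acc, (cy[ok], cx[ok]), 1)
    best = np.unravel_index(acc.argmax(), shape)
    return int(best[0]), int(best[1])

# Synthetic circular "pupil edge" of radius 8 centred at (20, 25).
t = np.linspace(0, 2 * np.pi, 80, endpoint=False)
pts = [(int(round(20 + 8 * np.sin(a))), int(round(25 + 8 * np.cos(a))))
       for a in t]
print(hough_circle_fixed_r(pts, (40, 50), 8))   # centre close to (20, 25)
```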
  • Next, the gaze tracking apparatus 100 detects the user's eye region by detecting the eyelid contour from the binary image using the detected edges and pupil, and calculates the eyelid curvature based on the detected contour. The detection of the eye region and the calculation of the eyelid curvature are described with reference to FIGS. 2 and 3, respectively.
  • the gaze tracking apparatus 100 calculates the relative position of the detected pupil within the detected eye region, and estimates the direction of the user's gaze based on the calculated relative position of the pupil.
  • However, the user's gaze estimated from the relative position of the pupil alone does not reflect the vertical component of the gaze; to track the user's gaze accurately, the vertical gaze direction must be recognized in addition to the left-right direction.
  • Accordingly, the gaze tracking device 100 calculates the eyelid curvature according to the height of the pupil based on the detected eyelid contour and, by combining the calculated relative position of the pupil, the calculated eyelid curvature, and the three-dimensional eye model of the present invention, accurately recognizes and tracks in real time, for every frame of the eye image, the user's gaze as it moves up, down, left, and right, thereby providing a reliable gaze tracking result.
  • To this end, the gaze tracking device 100 must calculate, through the eyelid quadratic-curve fitting process, the position of the front corner of the eye in the detected eye region and the quadratic curve for the eyelid (that is, the quadratic curve fitted to the eyelid contour). The quadratic-curve fitting process is described in detail with reference to FIG. 3, and the 3D eye model with reference to FIG. 6.
  • In this way, the present invention recognizes not only the user's left-right gaze direction but also the vertical gaze direction using the relative position of the pupil and the eyelid curvature according to the height of the pupil, so that the user's gaze can be accurately tracked and provided in real time and the tracking results can be used by various devices.
  • FIG. 2 is a diagram illustrating a process of detecting a user's eye region according to an embodiment of the present invention.
  • As shown in FIG. 2, the gaze tracking apparatus 100 first detects the eye region by detecting the eyelid contour centered on the pupil detected from the user's eye image as described with reference to FIG. 1.
  • Specifically, the gaze tracking apparatus 100 sets a window of a preset size on the converted binary image based on the detected pupil, centered on the eyelid edge adjacent to the pupil edge.
  • The set window is extended to the left and right up to the outermost boundary where the eyelid edge continues.
  • The gaze tracking apparatus 100 then finally detects the eyelid contour, and thereby the eye region, from the user's eye image by expanding or reducing the extended window to the upper and lower boundaries of the eyelid edge.
  • In this case, when the upper and lower portions of the window lie inside the eye area, the window is reduced down to the boundary of the eyelid edge; if a circular edge corresponding to the pupil is encountered while reducing, the window is expanded again to the upper and lower boundaries of the eyelid edge to detect the eye region.
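A minimal sketch of the window-growing idea, under the simplifying assumption that the window grows while adjacent rows or columns of the binary edge map still contain eyelid-edge pixels (the stopping rules described above are more involved in practice):

```python
import numpy as np

def grow_eye_window(edges, box):
    """Grow a window (top, bottom, left, right) outward from the pupil
    while the adjacent column/row still contains edge pixels, mimicking
    the left/right extension followed by the vertical adjustment."""
    t, b, l, r = box
    h, w = edges.shape
    while l > 0 and edges[t:b + 1, l - 1].any():        # extend left
        l -= 1
    while r < w - 1 and edges[t:b + 1, r + 1].any():    # extend right
        r += 1
    while t > 0 and edges[t - 1, l:r + 1].any():        # extend up
        t -= 1
    while b < h - 1 and edges[b + 1, l:r + 1].any():    # extend down
        b += 1
    return t, b, l, r

# Toy binary edge map: upper and lower eyelid edges at rows 3 and 8,
# spanning columns 2..17.
edges = np.zeros((12, 20), dtype=bool)
edges[3, 2:18] = True   # upper eyelid edge
edges[8, 2:18] = True   # lower eyelid edge
print(grow_eye_window(edges, (3, 8, 9, 10)))   # -> (3, 8, 2, 17)
```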
  • FIG. 3 is a diagram illustrating the process of fitting the eyelid quadratic curve according to an embodiment of the present invention.
  • As shown in FIG. 3, the eyelid quadratic-curve fitting process is the basis for accurately tracking the user's gaze and is performed to calculate the relative position of the pupil and the quadratic curve for the eyelid contour.
  • First, the gaze tracking apparatus 100 simplifies the contour (i.e., edge) of the eye region detected in the converted binary image through a morphological erosion process using a morphology filter.
  • That is, the gaze tracking apparatus 100 slides a morphology filter of a preset size over the converted binary image and sets the value of the pixel at the center of the filter to the smallest pixel value within the filter, thereby simplifying the contour of the detected eye region (i.e., the eyelid contour).
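The erosion step can be sketched in a few lines of NumPy (a 3x3 minimum filter; a real pipeline would use a routine such as `cv2.erode` or `scipy.ndimage.grey_erosion`):

```python
import numpy as np

def erode3x3(img):
    """Morphological erosion with a 3x3 structuring element: each
    interior pixel becomes the minimum of its 3x3 neighborhood,
    thinning thick regions and simplifying their outlines."""
    out = img.copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = img[y - 1:y + 2, x - 1:x + 2].min()
    return out

blob = np.zeros((7, 7), dtype=np.uint8)
blob[1:6, 1:6] = 1                    # a thick 5x5 region
print(int(erode3x3(blob).sum()))      # -> 9: shrinks to its 3x3 core
```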
  • Thereafter, the gaze tracking apparatus 100 skeletonizes the binary image with the simplified edges and fits the detected eyelid contour to a quadratic curve using a RANSAC (random sample consensus) algorithm based on the least-squares method.
  • The RANSAC algorithm estimates a mathematical model fitting the given data on the basis of least squares; using it, the gaze tracking apparatus 100 fits the detected eyelid contour to a quadratic curve and thereby obtains a quadratic-curve graph for the eyelid.
  • The eyelid consists of an upper eyelid, which covers and protects the user's eye from above, and a lower eyelid, which covers and protects it from below; the quadratic-curve fitting is preferably performed on the upper eyelid.
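The least-squares RANSAC fit of the upper-eyelid contour to a quadratic curve can be sketched as follows (a simplified RANSAC over `np.polyfit`; the contour points, iteration count, and inlier threshold are synthetic assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def ransac_parabola(x, y, n_iter=200, tol=1.0):
    """Fit y = a*x^2 + b*x + c robustly: repeatedly fit a parabola to 3
    random points, keep the model with the most inliers, then refit by
    least squares on that inlier set (a simplified RANSAC)."""
    best_inliers = None
    for _ in range(n_iter):
        idx = rng.choice(len(x), size=3, replace=False)
        coeffs = np.polyfit(x[idx], y[idx], 2)
        resid = np.abs(np.polyval(coeffs, x) - y)
        inliers = resid < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return np.polyfit(x[best_inliers], y[best_inliers], 2)

# Synthetic upper-eyelid contour: a downward-opening arc plus two
# outliers (e.g. stray eyelash edges).
x = np.arange(0, 30, dtype=float)
y = -0.05 * (x - 15) ** 2 + 20.0
y[5] += 8.0
y[22] -= 6.0
a, b, c = ransac_parabola(x, y)
print(round(a, 3), round(-b / (2 * a), 1))  # quadratic coeff ~ -0.05, vertex x ~ 15
```

The quadratic-term coefficient `a` returned here is exactly the quantity used later as the eyelid curvature measure, and the vertex of the fitted parabola feeds the reference-point construction below.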
  • Next, the gaze tracking apparatus 100 determines the left edge end point of the detected eyelid in the binary image as the front corner of the user's eye, and sets, as the user's frontal gaze reference point, the internal division point between the position coordinates (x, y) of the determined front corner of the eye and the position coordinates (x, y) of the vertex of the fitted quadratic curve.
  • Thereafter, the gaze tracking apparatus 100 calculates the relative position of the detected pupil within the detected eye region by computing the pupil's position with the set frontal gaze reference point as the origin.
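The reference-point construction just described can be sketched numerically; the division ratio `t` and all coordinates below are illustrative assumptions, since the text does not specify the internal-division ratio:

```python
import numpy as np

def frontal_reference(eye_corner, parabola, t=0.5):
    """Frontal-gaze reference point: the point dividing the segment from
    the front (inner) eye corner to the vertex of the fitted eyelid
    parabola internally in ratio t : (1 - t). t = 0.5 is an assumed
    tuning value."""
    a, b, c = parabola
    vx = -b / (2 * a)                   # vertex of y = a*x^2 + b*x + c
    vy = a * vx ** 2 + b * vx + c
    corner = np.asarray(eye_corner, dtype=float)
    return corner + t * (np.array([vx, vy]) - corner)

def pupil_relative(pupil, ref):
    """(diff_x, diff_y): pupil offset from the frontal-gaze reference."""
    return float(pupil[0] - ref[0]), float(pupil[1] - ref[1])

# Parabola (-0.05, 1.5, 8.75) has its vertex at (15, 20); the eye corner
# and pupil position are made-up example coordinates.
ref = frontal_reference(eye_corner=(2.0, 18.0), parabola=(-0.05, 1.5, 8.75))
print(pupil_relative((16.0, 18.0), ref))   # -> (7.5, -1.0)
```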
  • FIG. 4 is a view showing the relative position of the pupil according to an embodiment of the present invention, and FIG. 5 is a view showing a two-dimensional eye model according to an embodiment of the present invention.
  • As shown in FIG. 4, the relative position of the pupil is determined using the quadratic curve fitted to the eyelid contour and the position of the front corner of the eye when the user gazes straight ahead; it indicates how far the currently detected pupil is from the frontal gaze reference point.
  • diff_x shown in FIG. 4 represents how far the pupil has moved left or right (i.e., in the x-axis direction) from the set frontal gaze reference point, and diff_y represents how far it has moved vertically (i.e., in the y-axis direction).
  • That is, diff_x and diff_y are the relative position coordinates of the pupil with respect to the frontal gaze reference point.
  • the position at which the user gazes on the screen can be derived through the following [Equation 1].
  • diff_x1 shown in FIG. 5 represents the distance by which the calculated relative position of the pupil has moved to the left of the frontal gaze reference point when the user's eye image is captured by the camera, and diff_x2 represents a larger leftward distance than diff_x1; x1 and x2 denote the positions the user actually gazes at on the screen (i.e., distances from the center of the screen).
  • Although diff_x1 and diff_x2 differ by a factor of about 2.2, x1 and x2 differ by a factor of about 3.
  • This discrepancy in the ratios becomes more extreme as diff_x increases, which means that the pupil displacement captured by the camera and the location the user actually gazes at on the screen are not linearly related.
  • the present invention proposes a three-dimensional eye model to solve this problem.
  • FIG. 6 is a view showing a 3D eye model according to an embodiment of the present invention, FIG. 7 is a view showing a gaze estimation result using only the 3D eye model according to an embodiment of the present invention, and FIG. 8 is a view showing changes in the eyelid curvature according to the vertical height of the pupil according to an embodiment of the present invention.
  • The 3D eye model defines the variables relating the user's actual gaze point on the screen, the shooting distance, and the user's eyeball.
  • Through this 3D eye model, [Equation 1] can be extended into the following [Equation 2].
  • x and y denote a gaze point on the screen
  • d denotes a photographing distance
  • r denotes a radius of the user's eyeball.
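[Equation 2] itself is not reproduced in this text, but the variables it names (gaze point x, y; shooting distance d; eyeball radius r) admit a simple geometric reading: the camera observes the projection r*sin(theta) of the eyeball rotation, while the gaze ray meets the screen at d*tan(theta). A sketch under that assumption, which reproduces the super-linear growth described for FIG. 5 (the patent's actual formula may differ):

```python
import math

def gaze_point(diff_x, diff_y, d, r):
    """Map the relative pupil offset (same units as r) to a gaze point
    on the screen. Assumes the camera sees r*sin(theta) while the screen
    intersection is at d*tan(theta); a stand-in for [Equation 2]."""
    x = d * diff_x / math.sqrt(r ** 2 - diff_x ** 2)
    y = d * diff_y / math.sqrt(r ** 2 - diff_y ** 2)
    return x, y

# A 2.2x larger camera offset yields a roughly 3x larger screen offset,
# matching the non-linearity observed between diff_x and x in FIG. 5
# (illustrative values: d = 500, r = 12, in the same pixel units).
x1, _ = gaze_point(4.0, 0.0, 500.0, 12.0)
x2, _ = gaze_point(8.8, 0.0, 500.0, 12.0)
```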
  • As shown in FIG. 8, when the user gazes downward, the eyelid curvature decreases compared to frontal gaze, and when the user gazes upward, the eyelid curvature increases compared to frontal gaze.
  • Accordingly, the present invention reflects in [Equation 2] the eyelid curvature that changes with the height of the pupil, and corrects the y-axis (i.e., vertical) direction of the estimated user's gaze, so as to track the user's gaze precisely.
  • To this end, the gaze tracking apparatus 100 first calculates, from the frontal gaze reference point, the eyelid curvature corresponding to the frontal (central) gaze.
  • The eyelid curvature is preferably calculated for the upper eyelid, which responds more sensitively to vertical movement of the pupil.
  • Specifically, the coefficient of the quadratic term of the quadratic function fitted to the eyelid is calculated as the curvature. If the quadratic coefficient of the eyelid curve for the central (frontal) gaze is denoted a0, and the quadratic coefficient of the eyelid curve for the currently estimated gaze is denoted a, the y direction of the estimated user's gaze is corrected as in the following [Equation 3].
  • w denotes a preset weight value
  • By correcting the estimated vertical direction of the user's gaze using the eyelid curvature, the gaze tracking apparatus 100 can accurately track, in real time, the gaze of a user moving up, down, left, and right.
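[Equation 3] is described only through its inputs (the quadratic coefficients a and a0 and the preset weight w); the sketch below assumes a simple additive correction consistent with that description:

```python
def correct_gaze_y(y_est, a, a0, w):
    """Correct the vertical gaze coordinate using the change in eyelid
    curvature: a is the quadratic coefficient of the eyelid curve for
    the current gaze, a0 the coefficient for frontal gaze, and w a
    preset weight. The additive form is an assumption; the patent's
    exact [Equation 3] is not reproduced in this text."""
    return y_est + w * (a - a0)

# Gazing upward increases the eyelid curvature (a > a0), so the
# estimated y is pushed further in that direction (values illustrative)
y_corr = correct_gaze_y(10.0, 0.05, 0.03, 100.0)  # roughly 12.0 here
```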
  • FIG. 9 is a diagram illustrating a gaze tracking result reflecting a curvature of an eyelid according to an embodiment of the present invention.
  • As shown in FIG. 9, it can be seen that the gaze tracking apparatus 100 of the present invention accurately recognizes not only the user's left-right gaze but also the up-down gaze, and can therefore provide reliable gaze tracking results by tracking the moving user's gaze in real time.
  • FIG. 10 is a block diagram illustrating a configuration of a gaze tracking apparatus using pupil detection and eyelid curvature according to an embodiment of the present invention.
  • The gaze tracking apparatus 100 is configured to include:
  • an image acquisition unit 110 that photographs the user to acquire an eye image of the user;
  • an edge detection unit 120 that detects edges from the acquired eye image;
  • a pupil detection unit 130 that detects the pupil from the detected edges;
  • an eye region detection unit 140 that detects the user's eye region by detecting the eyelid outline from the detected pupil and edges;
  • an eyelid quadratic curve matching unit 150 that matches the detected eyelid outline to a quadratic curve;
  • a pupil relative position calculation unit 160 that calculates the relative position of the detected pupil within the detected eye region;
  • an eyelid curvature calculation unit 170 that calculates the eyelid curvature using the matched quadratic curve;
  • a gaze tracking unit 180 that tracks the user's gaze using the detected pupil, the detected eye region, and the calculated eyelid curvature; and
  • a gaze tracking result providing unit 190 that provides the tracking result.
  • the image acquisition unit 110 performs a function of acquiring an eye image of the user by photographing the user's eye part in real time through a visible light camera or an infrared camera.
  • the edge detector 120 performs a function of detecting an edge from the acquired eye image.
  • the edge detector 120 removes noise from the acquired eye image through a noise removal process, and converts the noise-removed eye image into a binary image to detect the edge.
  • the pupil detection unit 130 performs a function of detecting a pupil from the binary image based on the detected edge. As described above, detecting the pupil is performed through Hough transform.
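As a sketch of the binarize-then-detect idea above, the example below estimates the pupil centre and radius from the dark blob of a synthetic binarized eye image. The patent itself uses a Hough transform for the circle detection (e.g. OpenCV's HoughCircles), so this dependency-free centroid/area stand-in is a simplification:

```python
import numpy as np

def detect_pupil(gray, thresh=50):
    """Binarize a grayscale eye image (the pupil is the darkest region)
    and estimate its centre and radius from the blob's centroid and
    area. A simplified stand-in for the Hough-transform detection."""
    binary = gray < thresh                   # dark pixels -> pupil candidates
    ys, xs = np.nonzero(binary)
    cx, cy = xs.mean(), ys.mean()            # blob centroid ~ pupil centre
    radius = np.sqrt(binary.sum() / np.pi)   # disc area = pi * r^2
    return (cx, cy), radius

# Synthetic image: dark disc of radius 10 at (40, 30) on a bright field
img = np.full((64, 96), 200, dtype=np.uint8)
yy, xx = np.mgrid[:64, :96]
img[(xx - 40) ** 2 + (yy - 30) ** 2 <= 100] = 20
(cx, cy), r = detect_pupil(img)
```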
  • the eye region detection unit 140 detects the eye region of the user by detecting the outline of the eyelid from the detected pupil and edge.
  • Specifically, the eye region detection unit 140 sets a window of a preset size on the converted binary image based on the detected pupil, extends the window sideways to the lateral boundaries of the eyelid edge that runs continuously from the edge adjacent to the pupil, and extends or reduces the window vertically to the upper and lower boundaries of the eyelid edge, thereby detecting the eyelid outline and hence the eye region.
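The window expand/reduce procedure can be sketched on a binary eyelid-edge map as follows; the helper name and the (top, bottom, left, right) box convention are illustrative simplifications:

```python
import numpy as np

def grow_eye_window(edges, box):
    """Grow an initial window around the pupil until no eyelid-edge
    pixels remain just outside each side, approximating the expand/
    reduce procedure described in the patent."""
    top, bottom, left, right = box
    h, w = edges.shape
    while left > 0 and edges[top:bottom, left - 1].any():
        left -= 1                            # extend to the left edge boundary
    while right < w and edges[top:bottom, right].any():
        right += 1                           # extend to the right edge boundary
    while top > 0 and edges[top - 1, left:right].any():
        top -= 1                             # extend upward
    while bottom < h and edges[bottom, left:right].any():
        bottom += 1                          # extend downward
    return top, bottom, left, right

# Edge map with a horizontal eyelid edge spanning columns 2..20 in row 5
edges = np.zeros((12, 24), dtype=bool)
edges[5, 2:21] = True
window = grow_eye_window(edges, (4, 8, 9, 12))  # initial window at the pupil
```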
  • The eyelid quadratic curve matching unit 150 performs a function of matching the detected eyelid outline to a quadratic curve.
  • Specifically, the eyelid quadratic curve matching unit 150 simplifies the detected eyelid outline through a morphological erosion process, skeletonizes the simplified outline, and then approximates the skeletonized outline with a quadratic curve based on the least squares method, thereby matching the eyelid outline to the quadratic curve.
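After erosion and skeletonization the outline is a thin set of points, and the least-squares match is a degree-2 polynomial fit. A sketch of that final step on synthetic contour points (sampled from a known parabola so the fit can be checked):

```python
import numpy as np

# Skeletonized upper-eyelid contour points (synthetic)
xs = np.arange(-10.0, 11.0)
ys = 0.04 * xs ** 2 - 0.5 * xs + 3.0   # "true" eyelid shape

# Least-squares fit of y = a*x^2 + b*x + c to the contour points
a, b, c = np.polyfit(xs, ys, 2)

# The quadratic coefficient a later serves as the eyelid "curvature"
```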
  • The pupil relative position calculation unit 160 performs a function of calculating the pupil relative position, i.e. the position of the detected pupil, within the detected eye region.
  • The pupil relative position is calculated by setting, as the frontal gaze reference point (the pupil position when the user gazes straight ahead), the internal division point of the position coordinates of the inner eye corner formed by the left end point of the detected eyelid outline and the position coordinates of the vertex of the matched eyelid quadratic curve, and then computing the position of the detected pupil with the set frontal gaze reference point as the origin.
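The internal-division construction of the frontal gaze reference point can be sketched as follows; the division ratio and the coordinates are illustrative, since the patent text here does not fix them:

```python
def internal_division(p, q, m, n):
    """Point dividing the segment p -> q internally in the ratio m:n.
    The ratio actually used by the patent is not stated in this text;
    1:1 (the midpoint) is shown purely as an example."""
    return ((n * p[0] + m * q[0]) / (m + n),
            (n * p[1] + m * q[1]) / (m + n))

# Inner eye corner (left end of the eyelid outline) and vertex of the
# fitted eyelid parabola -- coordinates are illustrative only
corner = (80.0, 62.0)
vertex = (110.0, 50.0)
frontal_ref = internal_division(corner, vertex, 1, 1)
```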
  • the eyelid curvature calculator 170 calculates the curvature of the detected eyelid by calculating the curvature of the matched quadratic curve.
  • Calculating the curvature of the eyelid is performed by calculating a coefficient of a quadratic term for the quadratic curve when the contour of the eyelid is matched with a quadratic curve.
  • In addition, the eyelid curvature calculation unit 170 calculates not only the eyelid curvature for the detected relative position of the pupil but also the eyelid curvature for the central (frontal) gaze according to the set frontal gaze reference point.
  • The gaze tracking unit 180 estimates the user's gaze direction from the detected relative position of the pupil, and reflects the calculated eyelid curvature in the estimated gaze direction through [Equation 3], thereby accurately recognizing the direction the user is actually looking in.
  • That is, the gaze tracking unit 180 estimates the user's gaze from the relative pupil position calculated by the pupil relative position calculation unit 160; then, based on the eyelid curvature for the detected pupil's relative position and the eyelid curvature for the frontal gaze according to the set frontal gaze reference point (both calculated by the eyelid curvature calculation unit 170), it computes the change in eyelid curvature of the estimated gaze relative to the frontal gaze, and corrects the vertical direction of the estimated gaze using this change, thereby accurately recognizing and tracking the user's gaze.
  • In this way, the gaze tracking unit 180 uses the eyelid curvature, which changes with the height of the pupil in the detected eye region, to accurately track in real time the gaze of a user moving up, down, left, and right, and thus provides highly reliable tracking results.
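Putting the pieces together, one tracking iteration could read roughly as follows; all three formulas are hedged stand-ins for [Equation 1] to [Equation 3], not the patent's exact equations:

```python
import math

def track_gaze(pupil, ref, a, a0, d, r, w):
    """One gaze-tracking step: relative pupil position -> 3D eye-model
    mapping to the screen -> eyelid-curvature correction of the
    vertical coordinate. Signature and formulas are illustrative."""
    diff_x, diff_y = pupil[0] - ref[0], pupil[1] - ref[1]
    x = d * diff_x / math.sqrt(r ** 2 - diff_x ** 2)   # eye-model mapping
    y = d * diff_y / math.sqrt(r ** 2 - diff_y ** 2)
    return x, y + w * (a - a0)                         # curvature correction

# Frontal gaze with unchanged eyelid curvature maps to the screen centre
gx, gy = track_gaze((100, 60), (100, 60), 0.03, 0.03, 500.0, 12.0, 100.0)
```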
  • the gaze tracking result providing unit 190 outputs and provides the gaze tracking result, so that various devices interworking with the gaze tracking apparatus 100 can use the gaze tracking result.
  • FIG. 11 is a flowchart illustrating a procedure for tracking gaze using pupil detection and eyelid curvature according to an embodiment of the present invention.
  • As shown in FIG. 11, in the procedure for tracking the gaze using pupil detection and eyelid curvature through the gaze tracking apparatus 100, an image acquisition step is first performed in which the gaze tracking apparatus 100 photographs the user to acquire an eye image of that user (S110).
  • The eye image is obtained through a visible light camera or an infrared camera, and is acquired in real time while a device to which the gaze tracking apparatus 100 is applied, such as a virtual reality device, an augmented reality device, an iris recognition device, or a drowsiness detection device, is running.
  • Next, the gaze tracking apparatus 100 performs an edge detection step of converting the acquired eye image into a binary image and detecting edges (S120).
  • In the edge detection step, noise is removed from the acquired eye image so that accurate edges can be detected, and the noise-removed image is converted into a binary image. The edges are then detected through any of various edge detection methods, such as the Canny edge detector; that is, the edge detection method used in the present invention is not limited to a particular one.
  • a pupil detection step of detecting a pupil based on the detected edge is performed (S130).
  • In this step the pupil is detected and, as described above, the iris is detected together with the pupil.
  • Next, the gaze tracking apparatus 100 performs an eye region detection step of detecting the eyelid outline based on the detected pupil and edges, thereby detecting the region of the user's entire eye (S140).
  • In the eye region detection step, a window of a preset size is set on the converted binary image based on the detected pupil, and the window is expanded, reduced, or both, so that the eyelid outline is detected as described above.
  • Next, the gaze tracking apparatus 100 performs an eyelid quadratic curve matching step of matching the detected eyelid outline to a quadratic curve (S150).
  • In this step, the detected eyelid outline is simplified, skeletonized, and then approximated as a quadratic curve based on the least squares method, thereby matching the outline to the quadratic curve.
  • Next, the gaze tracking apparatus 100 performs a pupil relative position calculation step of calculating the relative position of the detected pupil within the detected eye region, using the quadratic curve matched to the eyelid outline (S160).
  • The pupil relative position calculation step sets, as the frontal gaze reference point, the internal division point of the position coordinates of the vertex of the matched quadratic curve and the position coordinates of the inner eye corner, which is the left end point of the detected eyelid outline, and then calculates the pupil position with this reference point as the origin.
  • Next, the gaze tracking apparatus 100 performs a gaze tracking step of tracking the user's gaze using the calculated relative pupil position and the eyelid curvature obtained from the matched quadratic curve, and provides the tracking result (S170).
  • matching the contour of the eyelid with a quadratic curve or calculating the curvature of the eyelid is preferably performed for the upper eyelid.
  • In the gaze tracking step, the user's gaze direction is estimated by applying the detected relative pupil position to the three-dimensional eye model of the present invention shown in FIG. 6, and the estimated gaze is corrected according to [Equation 3], so that the user's gaze is tracked in real time.
  • As described above, when tracking the user's gaze, the present invention uses the eyelid curvature, which changes with the height of the pupil (that is, with the vertical direction of the gaze), and can therefore accurately recognize and track the user's gaze moving up, down, left, and right, providing highly reliable gaze tracking results.


Abstract

The present invention relates to a gaze tracking device using pupil detection and eyelid line curvature, and to an associated method, in which the pupil and the eye region of a user are detected from an image of the user's eye captured via a camera in order to calculate the relative position of the pupil located in the detected eye region, and the user's gaze moving in the up-down and left-right directions can be tracked accurately in real time using the calculated relative position of the pupil and the eyelid line curvature, which varies as a function of the height of the pupil.
PCT/KR2019/017051 2019-12-04 2019-12-05 Gaze tracking device using pupil detection and eyelid line curvature, and associated method WO2021112282A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2019-0159782 2019-12-04
KR1020190159782A KR102294029B1 (ko) 2019-12-04 2019-12-04 Gaze tracking apparatus using pupil detection and eyelid curvature, and method therefor

Publications (1)

Publication Number Publication Date
WO2021112282A1 true WO2021112282A1 (fr) 2021-06-10

Family

ID=76221976

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/017051 WO2021112282A1 (fr) Gaze tracking device using pupil detection and eyelid line curvature, and associated method

Country Status (2)

Country Link
KR (1) KR102294029B1 (fr)
WO (1) WO2021112282A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015141312A (ja) * 2014-01-29 2015-08-03 Toshiba Corporation Display device and gaze estimation device
KR20180044331A (ko) * 2015-08-21 2018-05-02 Magic Leap, Inc. Eyelid shape estimation using eye pose measurement
KR20180044346A (ko) * 2015-08-21 2018-05-02 Magic Leap, Inc. Eyelid shape estimation
WO2019045750A1 (fr) * 2017-09-01 2019-03-07 Magic Leap, Inc. Detailed eye shape model for robust biometric applications
WO2019117350A1 (fr) * 2017-12-14 2019-06-20 Samsung Electronics Co., Ltd. Device and method for determining viewing distance


Also Published As

Publication number Publication date
KR102294029B1 (ko) 2021-08-26
KR20210069933A (ko) 2021-06-14

Similar Documents

Publication Publication Date Title
Shreve et al. Macro-and micro-expression spotting in long videos using spatio-temporal strain
AU2016310452B2 (en) Eyelid shape estimation
Morimoto et al. Pupil detection and tracking using multiple light sources
WO2018048000A1 (fr) Single-camera-based device and method for three-dimensional imaging interpretation, and computer-readable medium on which a program for three-dimensional imaging interpretation is recorded
CN104809424B (zh) Method for realizing gaze tracking based on iris features
WO2019117350A1 (fr) Device and method for determining viewing distance
KR100691348B1 (ko) Method and system for tracking a moving target using a pan/tilt-control-based stereo camera
JP2003015816A (ja) Face and gaze recognition device using a stereo camera
KR101819164B1 (ko) Gaze tracking apparatus and method
JP6855872B2 (ja) Face recognition device
CN112183200B (zh) Eye movement tracking method and system based on video images
WO2022039404A1 (fr) Stereo camera apparatus having a wide field of view, and depth image processing method using same
CN108681403A (zh) Trolley control method employing gaze tracking
JP2000105819A (ja) Face region detection device
JPH08287216A (ja) Method for recognizing parts within a face
KR20170023565A (ko) Method and apparatus for recognizing the number of fingers using image processing
WO2021112282A1 (fr) Gaze tracking device using pupil detection and eyelid line curvature, and associated method
KR20100121817A (ko) Method for tracking the eye region
WO2012133962A1 (fr) Apparatus and method for recognizing three-dimensional motion using a stereo camera
WO2016104842A1 (fr) Object recognition system and method taking camera distortion into consideration
WO2022146109A1 (fr) Infrared-camera-based method and system for estimating hand position through domain transfer learning
WO2015167081A1 (fr) Method and device for detecting a part of the human body
Tsai et al. Gaze direction estimation using only a depth camera
JP3963789B2 (ja) Eye detection device, eye detection program, recording medium recording the program, and eye detection method
WO2005055144A1 (fr) Method for detecting the jaw on a person's face, jaw detection system, and jaw detection program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19954972

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19954972

Country of ref document: EP

Kind code of ref document: A1