KR101734809B1 - Apparatus and method for eye-tracking using web-camera, and apparatus for controlling light using the same - Google Patents

Apparatus and method for eye-tracking using web-camera, and apparatus for controlling light using the same Download PDF

Info

Publication number
KR101734809B1
KR101734809B1 (application KR1020160007867A)
Authority
KR
South Korea
Prior art keywords
user
reference point
image
gaze
pupil
Prior art date
Application number
KR1020160007867A
Other languages
Korean (ko)
Inventor
이찬수
이재익
박신원
Original Assignee
영남대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 영남대학교 산학협력단 filed Critical 영남대학교 산학협력단
Priority to KR1020160007867A priority Critical patent/KR101734809B1/en
Application granted granted Critical
Publication of KR101734809B1 publication Critical patent/KR101734809B1/en

Links

Images

Classifications

    • G06K9/00604
    • G06K9/00281
    • G06K9/00617
    • H05B37/0281

Landscapes

  • Eye Examination Apparatus (AREA)
  • Image Analysis (AREA)

Abstract

Disclosed are a gaze tracking device and a method thereof using a web camera, and a device for controlling light using the same. The gaze tracking device according to an embodiment of the present invention includes: an image obtaining unit for photographing a user and obtaining an image; a reference point extracting unit for extracting a reference point located at one point on the face of the user set from the obtained image; a pupil extracting unit for extracting a pupil of the user from the image; and a gaze tracking unit for tracking the gaze of the user in consideration of at least one of a change in distance between the reference point and the extracted pupil in the image and a change in position of the reference point in the image.

Description

BACKGROUND OF THE INVENTION

Field of the Invention

[0001] The present invention relates to a gaze tracking device, a method of tracking a gaze using a web camera, and a lighting control device using the gaze tracking device.

Embodiments of the invention relate to eye tracking technology.

Eye tracking refers to estimating the direction of a person's gaze from eye movement. Demand for such gaze tracking systems is growing in studies of visual attention, driving, cognition, and human-computer interaction (HCI), where they serve as visual-search evaluation tools. There is also growing demand for gaze tracking devices in child cognition research, for example in studies of object detection. However, conventional gaze tracking systems require an infrared camera to measure the movement of the pupil center, so the equipment is costly to build.

Further, the lighting devices mounted on a vehicle are fixed so as to illuminate only the area directly in front of the vehicle. As a result, driving through a dark, sharply curved section of road without streetlights can be hazardous.

Korean Patent Registration No. 10-0950138 (Mar. 22, 2010)

Embodiments of the present invention are intended to track the user's gaze precisely at low cost.

According to an exemplary embodiment of the present invention, there is provided a gaze tracking device including: an image acquisition unit for acquiring an image by capturing a user; a reference point extracting unit for extracting a reference point located at a set point on the face of the user from the acquired image; a pupil extracting unit for extracting a pupil of the user from the image; and a gaze tracking unit for tracking the gaze of the user in consideration of at least one of a change in the distance between the reference point and the extracted pupil in the image and a change in the position of the reference point in the image.

The reference point may be located at one edge of the eye of the user.

The reference point extracting unit may extract the reference point by detecting the face region of the user from the image and measuring a skin color change on the face region.

The gaze tracking unit may calculate the distance between the reference point and the initial position of the pupil in the image and the distance between the reference point and the current position of the pupil, and track the user's gaze using the difference between the two distances.

The gaze tracking unit may obtain the initial position of the reference point and the current position of the reference point in the image and track the user's gaze using the difference between the initial position and the current position.

According to another exemplary embodiment of the present invention, there is provided a lighting control device including the gaze tracking device described above, and a lighting device for emitting light along the line of sight of the user tracked by the gaze tracking device.

The illumination device may emit light along the user's line of sight when the user's gaze is maintained the same for a set time period.

According to another exemplary embodiment of the present invention, there is provided a gaze tracking method including: acquiring an image by capturing a user in an image acquisition unit; extracting, in a reference point extracting unit, a reference point located at a set point on the face of the user from the acquired image; extracting a pupil of the user from the image in a pupil extracting unit; and tracking, in a gaze tracking unit, the gaze of the user in consideration of at least one of a change in the distance between the reference point and the pupil in the image and a change in the position of the reference point in the image.

The reference point may be located at one edge of the eye of the user.

In the step of extracting the reference point, the reference point may be extracted by detecting the face region of the user from the image and measuring a skin color change on the face region.

In the step of tracking the gaze, the distance between the reference point and the initial position of the pupil in the image and the distance between the reference point and the current position of the pupil may each be calculated, and the user's gaze may be tracked using the difference between the two distances.

In the step of tracking the gaze, the initial position of the reference point and the current position of the reference point in the image may be acquired, and the user's gaze may be tracked using the difference between the initial position and the current position.

According to embodiments of the present invention, the user's eyes can be precisely tracked at low cost by measuring the position of the user's eyes using a web camera.

According to embodiments of the present invention, the orientation of the user's face can be estimated by extracting the edge of the user's eye as a reference point and measuring its movement. Furthermore, by taking the orientation of the user's face into account, the user's gaze can be tracked accurately.

In addition, according to the embodiments of the present invention, it is possible to safely operate the vehicle by tracking the driver's gaze and illuminating the driver's eyes in accordance with the driver's gaze.

FIG. 1 is a block diagram showing a detailed configuration of a lighting control apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram illustrating a configuration of a gaze tracking apparatus according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating an example of a reference point and a pupil in an image according to an embodiment of the present invention.
FIG. 4 is a flowchart for explaining a gaze tracking method according to an embodiment of the present invention.

Hereinafter, specific embodiments of the present invention will be described with reference to the drawings. The following detailed description is provided to provide a comprehensive understanding of the methods, apparatus, and / or systems described herein. However, this is merely an example and the present invention is not limited thereto.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description, well-known functions and constructions are not described in detail, since they would obscure the invention with unnecessary detail. The terms used below are defined in consideration of the functions of the present invention and may vary according to the intention or custom of the user or operator; their definitions should therefore be based on the contents of this specification as a whole. The terms used in the detailed description are intended only to describe embodiments of the invention and should in no way be limiting. Unless expressly stated otherwise, the singular form of a term includes the plural. Expressions such as "comprises" or "comprising" are intended to indicate certain features, numbers, steps, operations, elements, parts, or combinations thereof, and should not be construed to preclude the presence or possibility of other features, numbers, steps, operations, elements, parts, or combinations thereof.

FIG. 1 is a block diagram showing a detailed configuration of a lighting control apparatus 100 according to an embodiment of the present invention. Referring to FIG. 1, the lighting control apparatus 100 includes a gaze tracking device 102 and a lighting device 104. The lighting control apparatus 100 can be mounted on a vehicle to control the operation of its lighting.

The gaze tracking device 102 may acquire an image by capturing a user and extract a reference point located at a set point on the face of the user from the acquired image. Further, the gaze tracking device 102 may extract the user's pupil from the image and track the user's gaze in consideration of at least one of a change in the distance between the reference point and the extracted pupil in the image and a change in the position of the reference point in the image. Specifically, the gaze tracking unit can track the user's gaze based on the reference point extracted by the reference point extracting unit and the pupil extracted by the pupil extracting unit.

The lighting device 104 may emit light along the user's line of sight tracked by the gaze tracking device 102. Specifically, the lighting device 104 can receive the user's gaze information from the gaze tracking device 102. The gaze information may include the user's gaze point, gaze time, and the like. The gaze point may be a single point, or a region having a certain area. The lighting device 104 may include a light emitting module for emitting light. The light emitting module may be, for example, a light-emitting diode, but its type is not particularly limited. The light emitting module may be attached to one surface of the lighting control apparatus 100, but is not limited thereto, and may be provided as a module separate from the lighting control apparatus 100.

The lighting device 104 may emit light along the user's line of sight when the user's line of sight is maintained. In particular, the lighting device 104 may emit light along the user's line of sight only when the line of sight has been maintained for a set period of time (e.g., 3 seconds).
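This dwell-time condition can be sketched as follows. The `DwellGate` class, its 50 px drift radius, and the monotonic-clock sampling are illustrative assumptions, not taken from the patent; only the 3-second hold time comes from the text above.

```python
import time

class DwellGate:
    """Fire the lamp only after the gaze has stayed near one point for a
    set time (a sketch; the drift radius is an assumed parameter)."""

    def __init__(self, dwell=3.0, radius=50):
        self.dwell = dwell      # seconds the gaze must be held
        self.radius = radius    # px the gaze may drift and still count as "the same"
        self.anchor = None      # gaze point where the dwell timer started
        self.since = None       # timestamp of that point

    def update(self, gaze_xy, now=None):
        """Feed one gaze sample; return True when light should be emitted."""
        now = time.monotonic() if now is None else now
        if self.anchor is None or self._moved(gaze_xy):
            # Gaze moved beyond the drift radius: restart the timer here.
            self.anchor, self.since = gaze_xy, now
            return False
        return (now - self.since) >= self.dwell

    def _moved(self, p):
        dx, dy = p[0] - self.anchor[0], p[1] - self.anchor[1]
        return (dx * dx + dy * dy) ** 0.5 > self.radius
```

In use, each gaze sample from the gaze tracking device would be passed to `update`, and the lighting device would emit light toward the current gaze area whenever it returns `True`.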

 The illumination device 104 may further include a device for adjusting the orientation of the light emitting module, and may include a plurality of light emitting modules that emit light in different areas.

According to embodiments of the present invention, it is possible to safely operate the vehicle by tracking the driver's gaze and illuminating the driver's eyes in accordance with the driver's gaze.

FIG. 2 is a block diagram showing a detailed configuration of the gaze tracking device 102 according to an embodiment of the present invention. Referring to FIG. 2, the gaze tracking device 102 includes an image acquisition unit 202, a reference point extracting unit 204, a pupil extracting unit 206, and a gaze tracking unit 208.

The image acquiring unit 202 acquires a user's image by photographing a user. The user's image may be a video image. The user image may also include the user's face and eyes. For example, the user's image may be an image of the face of the user, but the present invention is not limited thereto. The image acquisition unit 202 may include an optical module such as a web camera. At this time, the optical module may be attached to one side of the gaze tracking device 102, but the present invention is not limited thereto, and the optical module may be provided as a separate module from the gaze tracking device 102.

The reference point extracting unit 204 may extract a reference point located at a set point on the user's face from the acquired image. To this end, the reference point extracting unit may first detect the user's face region, that is, the area of the image containing the user's face. For example, the reference point extracting unit 204 may convert every frame of the acquired image to a monochrome image, adjust the contrast through histogram equalization, and then detect the user's face region. The reference point extracting unit 204 can detect the user's eye region in the same manner; the methods used to detect the face region and the eye region are not particularly limited.
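The preprocessing steps named above (monochrome conversion, then histogram equalization) can be sketched in plain NumPy. The BT.601 luma weights are a standard choice assumed here; the patent does not specify the conversion, and the face detector that would follow (e.g. a Haar cascade) is not shown.

```python
import numpy as np

def to_gray(frame_rgb):
    """Monochrome conversion using ITU-R BT.601 luma weights (an assumed,
    conventional choice for the 'monochrome image' step)."""
    w = np.array([0.299, 0.587, 0.114])
    return np.round(frame_rgb.astype(np.float64) @ w).astype(np.uint8)

def equalize_hist(gray):
    """Classic histogram equalization: remap intensities through the
    normalized cumulative histogram so contrast spans the full 0..255 range."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[np.nonzero(cdf)][0]      # first occupied bin
    if cdf[-1] == cdf_min:                 # flat image: nothing to equalize
        return gray.copy()
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255)
    lut = np.clip(lut, 0, 255).astype(np.uint8)
    return lut[gray]                       # apply the lookup table per pixel
```

A face detector would then run on `equalize_hist(to_gray(frame))` for each frame.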

The reference point extracting unit 204 may extract a reference point located at one point on the face of the user. The reference point is a virtual point marked on the user's image that serves as a reference for tracking the user's gaze. According to one embodiment, the reference point may be located at an edge of the user's eye; in other words, it may be one of the adjacent edges of the user's two eyes. The two edges can then serve as references for the left eye and the right eye, respectively.

When the reference point extracting unit 204 cannot extract the user's face region and eye region at a particular point in time, the previously extracted face region and eye region may be used in their place. Specifically, if the reference point extracting unit 204 cannot extract the face region and eye region in the current frame, the face region and eye region from the previous frame can be applied to the current frame.
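The previous-frame fallback described above amounts to a few lines of bookkeeping around the detector. The helper below is a hypothetical sketch; `detect` stands in for any per-frame face/eye detector that returns `None` on failure.

```python
def reuse_last_regions(detect, frames):
    """Run a per-frame detector, reusing the previous frame's regions
    whenever detection fails (returns None), per the fallback above."""
    last = None
    out = []
    for frame in frames:
        found = detect(frame)
        if found is not None:
            last = found          # successful detection: update the cache
        out.append(last)          # on failure, the cached regions carry over
    return out
```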

The reference point extracting unit 204 can extract the reference point by detecting the user's face region from the image and measuring the skin color change on the face region. Specifically, the reference point extracting unit 204 may obtain the reference point by separating the brightness information of the image from the color difference information (for example, using the YCbCr color space). Since the skin color changes abruptly at the edge of the user's eye within the face region, the reference point extracting unit 204 can readily extract that edge as the reference point.
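The brightness/color-difference separation mentioned above can be sketched with the standard full-range BT.601 RGB-to-YCbCr conversion. This particular conversion is an assumption; the patent names only the YCbCr channel split, not the exact coefficients.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Split brightness (Y) from color difference (Cb, Cr) using the
    full-range BT.601 conversion; a sketch of the channel separation
    described above, not the patent's exact transform."""
    r, g, b = (rgb[..., i].astype(np.float64) for i in range(3))
    y  = 0.299 * r + 0.587 * g + 0.114 * b                   # brightness
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b       # blue difference
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b       # red difference
    return np.stack([y, cb, cr], axis=-1)
```

The eye-corner search would then threshold the Cb/Cr channels inside the face region, where the skin-to-eye transition produces a sharp color-difference change largely independent of lighting brightness.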

The pupil extracting unit 206 may extract the user's pupil from the image acquired by the image acquisition unit 202. In one example, the pupil extracting unit 206 can extract the pupil using Equation 1 below. Here, the pupil means the center of the iris; that is, the pupil extracting unit 206 may extract the center of the pupil by applying a filter according to Equation 1 to the captured image of the user.

[Equation 1] (the formula appears only as an image in the source: Figure 112016007258744-pat00001)

(where x is the x-axis coordinate, y is the y-axis coordinate, and c is the pupil size)

The pupil extracting unit 206 can also extract the center of the user's pupil using a Hough transform. That is, the pupil extracting unit 206 may take the region within 5 px of the estimated pupil center as a potential pupil region, then vary the diameter of a candidate circle within that region and select the candidate that most closely fits a circle to determine the pupil center.
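A simplified stand-in for that "closest to a circle" search can be sketched as a grid search over candidate centers within 5 px and a range of radii, scoring each circle by how dark its perimeter is compared with a ring just outside it (the boundary of a dark pupil disk maximizes that contrast). The scoring function and radius range are illustrative assumptions, not the patent's exact Hough formulation.

```python
import numpy as np

def refine_pupil(dark_mask, est_center, radii=range(5, 15)):
    """Search centers within 5 px of an initial estimate and a range of
    radii; return the (cy, cx, r) whose perimeter is dark while the ring
    just outside it is bright, i.e. the circle fitting the pupil boundary."""
    h, w = dark_mask.shape
    theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
    sin_t, cos_t = np.sin(theta), np.cos(theta)

    def ring_mean(cy, cx, r):
        # Mean mask value sampled along the circle of radius r about (cy, cx).
        ys = np.clip(np.round(cy + r * sin_t).astype(int), 0, h - 1)
        xs = np.clip(np.round(cx + r * cos_t).astype(int), 0, w - 1)
        return dark_mask[ys, xs].mean()

    cy0, cx0 = est_center
    best, best_score = None, -np.inf
    for cy in range(cy0 - 5, cy0 + 6):       # the 5 px neighborhood from the text
        for cx in range(cx0 - 5, cx0 + 6):
            for r in radii:                  # vary the candidate diameter
                score = ring_mean(cy, cx, r) - ring_mean(cy, cx, r + 2)
                if score > best_score:
                    best, best_score = (cy, cx, r), score
    return best, best_score
```

`dark_mask` is assumed to be a binarized image in which pupil/iris pixels are 1 and skin/sclera pixels are 0.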

The gaze tracking unit 208 may track the user's gaze in consideration of at least one of a change in the distance between the reference point and the extracted pupil in the image and a change in the position of the reference point in the image. First, the gaze tracking unit 208 may receive the initial positions of the reference point and the pupil, from which it can calculate the initial distance between them. The gaze tracking unit 208 may then measure the current positions of the reference point and the pupil in the image, and from these calculate the current distance between them.

The gaze tracking unit 208 calculates the distance between the reference point and the initial position of the pupil in the image and the distance between the reference point and the current position of the pupil, and can track the user's gaze using the difference between the two distances.

On the other hand, the gaze tracking unit 208 can determine that the user's gaze has moved when a distance change of more than a threshold value occurs based on the normalized distance between the reference point and the pupil.

The gaze tracking unit 208 may also acquire the initial position of the reference point and the current position of the reference point in the image and track the user's gaze using the difference between the two. In one example, when the distance between the reference point and the pupil becomes shorter than the initial distance, the gaze tracking unit 208 can determine that the user's gaze has moved toward the reference point. In another example, when the position of the reference point moves in some first direction, the gaze tracking unit 208 may determine that the user's gaze has moved in that direction. The gaze tracking unit 208 may acquire the positions of the reference point and the pupil at set intervals.

The gaze tracking unit 208 can track the user's gaze from either the change in the distance between the reference point and the pupil or the change in the position of the reference point alone; however, tracking accuracy can be improved by considering both together.

Table 1 shows gaze tracking results according to an embodiment of the present invention. As shown in Table 1, of a total of 1358 frames in which the user's gaze was tracked, the tracked direction matched the actual gaze direction in 1236 frames, an accuracy of 91.02%.

Table 1 (rows: actual gaze direction; columns: tracked gaze direction)

Actual \ Tracked    Right    Center    Left
Right                 249         1      26
Center                 10       718      42
Left                   23        20     269
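The accuracy figure reported above can be reproduced from the confusion matrix in Table 1: correct frames lie on the diagonal, and accuracy is the diagonal sum over the total.

```python
# Table 1: rows are the actual gaze direction, columns the tracked
# direction, in the order right, center, left.
table = [
    [249,   1,  26],   # actual: right
    [ 10, 718,  42],   # actual: center
    [ 23,  20, 269],   # actual: left
]
correct = sum(table[i][i] for i in range(3))    # diagonal: correctly tracked frames
total = sum(sum(row) for row in table)          # all tracked frames
accuracy = 100.0 * correct / total              # percentage accuracy
```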

In one embodiment, the image acquisition unit 202, the reference point extracting unit 204, the pupil extracting unit 206, the gaze tracking unit 208, and the lighting device 104 may be implemented on a computing device that includes one or more processors and a computer-readable recording medium. The computer-readable recording medium may be internal or external to the processor and may be coupled to the processor by any of a variety of well-known means. The processor may cause the computing device to operate in accordance with the exemplary embodiments described herein; for example, the processor may execute instructions stored on the computer-readable recording medium, and those instructions, when executed, may cause the computing device to perform operations in accordance with the exemplary embodiments described herein.

FIG. 3 is an exemplary view illustrating a reference point 310 and a pupil 320 in an image of a user according to an exemplary embodiment of the present invention.

According to one embodiment, as shown in FIG. 3, the reference points 310 may be the two adjacent edges of the eyes. The pupil 320 means the center of the user's pupil, and the procedure for extracting it is as described above.

FIG. 4 is a flowchart illustrating a gaze tracking method 400 according to an embodiment of the present invention. The method shown in FIG. 4 can be performed, for example, by the gaze tracking device 102 described above. In the illustrated flowchart, the method is divided into a plurality of steps, but at least some of the steps may be performed in a different order, combined with other steps, performed together, omitted, or divided into sub-steps, and one or more steps not shown may be added.

The image acquiring unit 202 may acquire the user's image by capturing the user (S402). The image acquisition unit 202 may include an optical module for capturing a user. The user's image may be an image including the user's face.

The reference point extracting unit 204 may extract a reference point located at one point on the face of the user from the image (S404). According to one embodiment, the reference point may be located at an edge of the user's eye. In this step, the reference point may be extracted by detecting the user's face region from the image and measuring the skin color change on the face region.

The pupil extracting unit 206 may extract the pupil of the user from the image (S406).

The gaze tracking unit 208 may track the user's gaze in consideration of at least one of a change in the distance between the reference point and the pupil in the image and a change in the position of the reference point (S408). Specifically, in this step, the distance between the reference point and the initial position of the pupil in the image and the distance between the reference point and the current position of the pupil may each be calculated, and the user's gaze may be tracked using the difference between the two distances. Alternatively, the initial position of the reference point and the current position of the reference point in the image may be acquired, and the user's gaze may be tracked using the difference between the initial position and the current position.

Meanwhile, an embodiment of the present invention may include a program for performing the methods described herein on a computer, and a computer-readable recording medium containing the program. The computer-readable recording medium may include program instructions, local data files, local data structures, or the like, alone or in combination. The medium may be specially designed and constructed for the present invention, or may be one commonly available in the field of computer software. Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of such programs include not only machine language code such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.

While the present invention has been particularly shown and described with reference to exemplary embodiments, those skilled in the art will understand that various modifications may be made to the embodiments described above without departing from the scope of the invention. Therefore, the scope of the present invention should not be limited to the described embodiments, but should be determined by the appended claims and their equivalents.

100: Lighting control device
102: eye tracking device
104: Lighting device
202: Image acquisition unit
204: Reference point extracting unit
206: pupil extracting unit
208: eye tracking unit
310: Reference point
320: Pupil

Claims (12)

A lighting control device mounted on a vehicle,
An image acquisition unit for acquiring an image by photographing a user;
A reference point extracting unit for extracting a reference point located at one point on the face of the user set from the acquired image;
A pupil extracting unit for extracting a pupil of the user from the image; And
A gaze tracking device for tracking the gaze of the user in consideration of at least one of a change in distance between the reference point and the extracted pupil in the image and a change in position of the reference point in the image; And
And a lighting device for emitting light to a gaze area corresponding to the gaze of the user being tracked,
Wherein the illumination device emits light to the gaze area when the gaze of the user remains the same for a set time.
The method according to claim 1,
Wherein the reference point is located at one edge of the eye of the user.
The method according to claim 1,
Wherein the reference point extracting unit extracts the reference point by detecting a face region of the user from the image and measuring a skin color change on the face region.
The method according to claim 1,
Wherein the gaze tracking unit calculates the distance between the reference point and the initial position of the pupil in the image and the distance between the reference point and the current position of the pupil, and tracks the user's gaze using the difference between the two distances.
The method according to claim 1,
Wherein the gaze tracking unit acquires an initial position of the reference point and a current position of the reference point in the image and tracks the user's gaze using a difference between the initial position and the current position.
(Claim 6) Deleted
(Claim 7) Deleted
A lighting control method performed by a lighting control device mounted on a vehicle,
Capturing a user and acquiring an image in an image acquiring unit;
Extracting a reference point located at one point on the face of the user set from the acquired image in a reference point extracting unit;
Extracting a pupil of the user from the image in a pupil extracting unit;
Tracking the user's gaze in consideration of at least one of a change in distance between the reference point and the pupil in the image and a change in position of the reference point in the gaze tracking unit; And
In a lighting device, emitting light to a gaze area corresponding to the gaze of the user being tracked,
Wherein the emitting step emits light to the gaze area when the user's gaze is maintained the same for a set time.
The method of claim 8,
Wherein the reference point is located at one edge of the eye of the user.
The method of claim 8,
Wherein the step of extracting the reference point extracts the reference point by detecting the face region of the user from the image and measuring a skin color change on the face region.
The method of claim 8,
Wherein the step of tracking the gaze calculates the distance between the reference point and the initial position of the pupil in the image and the distance between the reference point and the current position of the pupil, and tracks the user's gaze using the difference between the two distances.
The method of claim 8,
Wherein the step of tracking the gaze acquires the initial position of the reference point and the current position of the reference point in the image, and tracks the user's gaze using the difference between the initial position and the current position.
KR1020160007867A 2016-01-22 2016-01-22 Apparatus and method for eye-tracking using web-camera, and apparatus for controlling light using the same KR101734809B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160007867A KR101734809B1 (en) 2016-01-22 2016-01-22 Apparatus and method for eye-tracking using web-camera, and apparatus for controlling light using the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160007867A KR101734809B1 (en) 2016-01-22 2016-01-22 Apparatus and method for eye-tracking using web-camera, and apparatus for controlling light using the same

Publications (1)

Publication Number Publication Date
KR101734809B1 true KR101734809B1 (en) 2017-05-12

Family

ID=58739921

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160007867A KR101734809B1 (en) 2016-01-22 2016-01-22 Apparatus and method for eye-tracking using web-camera, and apparatus for controlling light using the same

Country Status (1)

Country Link
KR (1) KR101734809B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114302543A (en) * 2022-03-08 2022-04-08 广州市企通信息科技有限公司 Data processing system and method based on Internet of things

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101395819B1 (en) 2013-11-15 2014-05-16 동국대학교 산학협력단 Apparatus and methdo for tracking vision

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101395819B1 (en) 2013-11-15 2014-05-16 동국대학교 산학협력단 Apparatus and methdo for tracking vision

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114302543A (en) * 2022-03-08 2022-04-08 广州市企通信息科技有限公司 Data processing system and method based on Internet of things

Similar Documents

Publication Publication Date Title
US11699293B2 (en) Neural network image processing apparatus
US10692210B2 (en) Recording medium storing computer program for pupil detection, information processing apparatus, and pupil detecting method
US9473696B2 (en) Gaze detection apparatus, gaze detection computer program, and display apparatus
US20170286771A1 (en) Gaze detection apparatus and gaze detection method
JP4811259B2 (en) Gaze direction estimation apparatus and gaze direction estimation method
JP6322986B2 (en) Image processing apparatus, image processing method, and image processing program
JP2019527448A (en) Method and system for monitoring the status of a vehicle driver
KR101769177B1 (en) Apparatus and method for eye tracking
JP4501003B2 (en) Face posture detection system
JP4491604B2 (en) Pupil detection device
CN112102389A (en) Method and system for determining spatial coordinates of a 3D reconstruction of at least a part of a physical object
JP2008210239A (en) Line-of-sight estimation device
KR101914190B1 (en) Device and method for driver's gaze detection
KR101628493B1 (en) Apparatus and method for tracking gaze of glasses wearer
US10496874B2 (en) Facial detection device, facial detection system provided with same, and facial detection method
JP6221292B2 (en) Concentration determination program, concentration determination device, and concentration determination method
KR20150025041A (en) Method and its apparatus for controlling a mouse cursor using eye recognition
KR101734809B1 (en) Apparatus and method for eye-tracking using web-camera, and apparatus for controlling light using the same
Kunka et al. Non-intrusive infrared-free eye tracking method
CN110598635B (en) Method and system for face detection and pupil positioning in continuous video frames
Wiśniewska et al. Robust eye gaze estimation
KR20130137507A (en) Interaction providing apparatus and method for wearable display device
Ahmed et al. Controlling multimedia player with eye gaze using webcam
US11796802B2 (en) Device tracking gaze and method therefor
KR20190056642A (en) Eye Tracking Method Using Movement of Pupil and Gap between Eyes and Edge of Face

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant