WO2021261770A1 - Method for calculating a user's pupil position, and recording medium on which a program for executing the method for calculating a user's pupil position is recorded - Google Patents
- Publication number
- WO2021261770A1 (PCT/KR2021/006198)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- pupil position
- calculating
- present
- pupil
- Prior art date
- 2020-06-26
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
Definitions
- The present invention relates to a method for calculating a user's pupil position and a program for executing the method, and more particularly, to calculating a user's pupil position through a general camera device such as a webcam rather than an infrared camera device.
- Eye tracking technology, a technology that tracks the user's gaze, has been studied for a long time; it is known to have been first applied in the field of psychology, with the expectation that tracking the gaze would make it possible to analyze the user's psychological state.
- The most common eye-tracking technology, represented by Tobii in the United States, is based on a mechanism made possible by the interaction between the user's pupils and infrared rays, and for this purpose special equipment such as an infrared camera is used.
- Because conventional eye-tracking technology relies on infrared camera equipment as its basic tool, it cannot be implemented in an environment without such equipment.
- Accordingly, an object of the present invention is to provide a method for calculating a user's pupil position, and a recording medium on which a program for executing the method is recorded, so that the user's pupil position can be recognized with high precision through a general camera device such as a webcam rather than an infrared camera device.
- A method for calculating a user's pupil position according to the present invention for achieving the above object includes the steps of: (a) extracting, by a computing device, a user's face region from a video image captured by photographing the user; (b) extracting, by the computing device, the user's eye region from the extracted face region; and (c) calculating, by the computing device, the position of the user's pupil based on the extracted eye region.
- The step (c) comprises the step of (c1) extracting a pupil region based on a boundary at which a color value changes in the eye region.
- The step (c) is characterized in that it further comprises the step of (c2) calculating the center coordinates of the pupil region.
- The recording medium according to the present invention is characterized in that a program for executing the above method is recorded thereon.
- According to the present invention, it is possible to recognize the user's pupil position with high precision through a general camera device such as a webcam rather than an infrared camera device.
- FIG. 1 is a configuration diagram of a system for executing a method for calculating a user's pupil position and a user's gaze tracking method using the same according to an embodiment of the present invention.
- FIG. 2 is a flowchart illustrating an execution process of a method for calculating a user's pupil position according to an embodiment of the present invention
- FIG. 3 is a flowchart illustrating an execution process of a method for tracking a user's gaze using a method for calculating a pupil position according to an embodiment of the present invention.
- FIG. 4 is a diagram illustrating an execution principle of a method for calculating a pupil position according to an embodiment of the present invention.
- Hereinafter, referring to FIGS. 1 and 2, a method for calculating a user's pupil position according to an embodiment of the present invention and an execution system therefor will be described.
- The camera device 100, such as a webcam, photographs the face of the user gazing at the target object 20, such as a monitor, newspaper, test paper, or art work, and the captured video image data is transmitted to the computing device 100 in real time.
- A program for executing each step of the method for calculating the user's pupil position and of the user's eye tracking method using the same is installed in the computing device 100 according to an embodiment of the present invention. Such a program may be downloaded through a program distribution server and installed on a computing device 100 having a communication function, such as a PC or smartphone, or may be transferred or rented while recorded on various recording media such as a CD or USB.
- The computing device 100 acquires the user's captured image from the camera device 100 in real time (S210) and extracts the user's face region from the video image through a video analysis process (S230).
- Video images captured by various types of camera devices 100 may have various sizes or resolutions, and thus the computing device 100 preferably extracts the face region through a DCN (Deep Convolutional Network) method capable of handling such variation.
- The computing device 100 can thereby increase the calculation speed and accuracy in each execution procedure of the pupil position calculation method and the gaze tracking method according to the present invention.
- In extracting the face region, a method utilizing various algorithms from the computer vision field may be used, or a method using an artificial neural network, a statistical learning algorithm inspired by biological neural networks, may be used.
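The patent does not tie this step to any particular library or network. Purely as an illustrative sketch, the face-region extraction of step S230 could be realized in Python with OpenCV's DNN module; the res10 SSD model files named below are a publicly available pretrained face detector standing in for whatever DCN an implementation actually uses:

```python
# A minimal sketch of face-region extraction with a pretrained DNN detector.
# The model file names are assumptions (the public OpenCV res10 SSD face
# detector), not components specified by the patent.
import cv2
import numpy as np

net = cv2.dnn.readNetFromCaffe("deploy.prototxt",
                               "res10_300x300_ssd_iter_140000.caffemodel")

def extract_face_regions(frame, conf_threshold=0.5):
    """Return bounding boxes (x1, y1, x2, y2) of detected faces."""
    h, w = frame.shape[:2]
    # The detector consumes a fixed 300x300 mean-subtracted blob regardless
    # of input resolution, which is how varying camera sizes are absorbed.
    blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)), 1.0,
                                 (300, 300), (104.0, 177.0, 123.0))
    net.setInput(blob)
    detections = net.forward()
    boxes = []
    for i in range(detections.shape[2]):
        confidence = detections[0, 0, i, 2]
        if confidence > conf_threshold:
            box = detections[0, 0, i, 3:7] * np.array([w, h, w, h])
            boxes.append(box.astype(int))
    return boxes
```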
- Next, the computing device 100 extracts the user's eye region 30, as shown in FIG. 4, from the extracted face region (S250). Specifically, the computing device 100 extracts the eye region 30 by continuously detecting points along the boundary of the externally exposed region of the user's eye at which a difference arises between the color value of the eyelid's skin tissue and the white color value of the eyeball.
- Next, the computing device 100 extracts the user's pupil region 40 from the extracted eye region 30 (S270). Specifically, the computing device 100 may extract the pupil region 40 by continuously detecting points at the boundary between the white of the eye and the iris in the eye region 30 at which the difference between their color values occurs.
- The computing device 100 calculates the pupil position information by calculating the center coordinates of the pupil region 40 extracted as described above (S290).
- The computing device 100 may calculate the pupil position information by calculating the center coordinates of the circle defined by the pupil region 40 extracted in step S270 described above.
- In addition, the computing device 100 may extract the pupil region 40 having a circular shape by detecting the annular section at the boundary between the white region and the iris in the eye region 30 along which the difference between the color value of the white region and the color value of the iris occurs continuously, and performing interpolation on the remaining sections based on that detected section.
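The patent does not specify how this interpolation and the center calculation are carried out. One minimal sketch, assuming the boundary points have already been sampled as pixel coordinates, is a least-squares (Kasa) circle fit whose fitted center serves as the pupil position of step S290:

```python
# A hedged sketch of steps S270/S290: given boundary points sampled where
# the sclera/iris color values change, fit a circle in the least-squares
# sense so that occluded sections are interpolated implicitly, then take
# the fitted center as the pupil position.
import numpy as np

def fit_circle(points):
    """Least-squares circle fit; points is a float (N, 2) array of (x, y)."""
    x, y = points[:, 0], points[:, 1]
    # Solve  x^2 + y^2 = 2*cx*x + 2*cy*y + c  for cx, cy, c,
    # where c = r^2 - cx^2 - cy^2.
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = np.sqrt(c + cx ** 2 + cy ** 2)
    return (cx, cy), radius  # center coordinates (S290) and radius

# Usage: center, r = fit_circle(boundary_points)
```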
- According to the present invention, it is thus possible to recognize the user's pupil position with high precision even through a general camera device 100 such as a webcam rather than an infrared camera device.
- Meanwhile, the computing device 100 may also calculate the position of the pupil through a computer vision calculation method in which parameter values defining the pupil, such as pupil color value information, pupil shape information, and pupil size information, are set in advance.
- For example, blob detection, an algorithm that compares characteristics such as the brightness or color of a region of a digital image with those of surrounding areas, could also be used, as in the sketch below.
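As an illustrative sketch only, OpenCV's SimpleBlobDetector can play the role of the blob detection algorithm described here; the parameter values below are assumptions for a dark, roughly circular pupil, not values given in the patent:

```python
# A sketch of the blob-detection alternative using OpenCV's
# SimpleBlobDetector. All thresholds here are illustrative assumptions.
import cv2

params = cv2.SimpleBlobDetector_Params()
params.filterByColor = True
params.blobColor = 0            # pupils are darker than their surroundings
params.filterByCircularity = True
params.minCircularity = 0.7     # keep roughly circular regions only
params.filterByArea = True
params.minArea = 30             # discard tiny specks

detector = cv2.SimpleBlobDetector_create(params)

def detect_pupil_candidates(eye_region_gray):
    """Return keypoints whose pt/size describe candidate pupil blobs."""
    return detector.detect(eye_region_gray)
```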
- Alternatively, the computing device 100 may calculate the position of the pupil by sequentially applying basic computer vision techniques such as thresholding and blurring to filter out pixel values other than those at the pupil position.
- In that case, the computing device 100 selects only pupil candidates within a predetermined error range by comparing the diameter values of the candidates with a predetermined pupil diameter reference value, and calculates their position coordinates.
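A minimal sketch of this thresholding/blurring pipeline with the diameter check, assuming an 8-bit grayscale eye crop; the numeric threshold, kernel size, and diameter values are illustrative assumptions:

```python
# A sketch of the threshold/blur alternative: blur, threshold out
# non-pupil pixels, then keep only contours whose diameter falls within
# a tolerance of a reference value (the diameter check described above).
import cv2

def locate_pupil(eye_region_gray, ref_diameter_px=20.0, tolerance_px=6.0):
    blurred = cv2.GaussianBlur(eye_region_gray, (7, 7), 0)
    # Dark pixels (the pupil) survive the inverted threshold.
    _, mask = cv2.threshold(blurred, 50, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        (x, y), radius = cv2.minEnclosingCircle(contour)
        if abs(2 * radius - ref_diameter_px) <= tolerance_px:
            return (x, y)  # candidate within the allowed diameter range
    return None
```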
- FIG. 3 is a flowchart illustrating an execution process of a method for tracking a user's gaze using a method for calculating a pupil position according to an embodiment of the present invention.
- Hereinafter, referring to FIGS. 1 and 3, an execution process of a method for tracking a user's gaze using the method for calculating a pupil position according to an embodiment of the present invention will be described.
- Based on the user's pupil position information calculated as described above, a gaze vector l of the user is generated (S330).
- The computing device 100 calculates the rotation direction and rotation angle of the user's face in three dimensions through image analysis of the user's face region extracted from the video image in step S230 described above.
- the computing device 100 may calculate the rotation direction and rotation angle of the user's face in 3D through a facial landmarks detection technique.
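The patent names only the facial-landmarks technique, not a concrete algorithm. One common way to realize it, sketched here under the assumptions of six detected landmarks, a generic 3D face model, and an uncalibrated pinhole camera, is cv2.solvePnP:

```python
# A sketch of estimating the face's 3D rotation from 2D facial landmarks.
# The generic model points (in mm) and the landmark source are assumptions.
import cv2
import numpy as np

# Rough 3D positions of six canonical landmarks on a generic head.
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0),           # nose tip
    (0.0, -330.0, -65.0),      # chin
    (-225.0, 170.0, -135.0),   # left eye outer corner
    (225.0, 170.0, -135.0),    # right eye outer corner
    (-150.0, -150.0, -125.0),  # left mouth corner
    (150.0, -150.0, -125.0),   # right mouth corner
])

def head_rotation(image_points, frame_size):
    """image_points: (6, 2) detected landmarks matching MODEL_POINTS."""
    image_points = np.asarray(image_points, dtype=np.float64)
    h, w = frame_size
    focal = w  # a common pinhole approximation when uncalibrated
    camera_matrix = np.array([[focal, 0, w / 2],
                              [0, focal, h / 2],
                              [0, 0, 1]], dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS, image_points,
                                  camera_matrix, None)
    rotation_matrix, _ = cv2.Rodrigues(rvec)
    return rotation_matrix  # rotation direction/angle of the face
```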
- Based on the calculated rotation information and the pupil position information, the computing device 100 then generates the user's three-dimensional gaze vector l.
- The magnitude of the gaze vector l may be about 1.5 to 2 times the distance from the user to the gaze target 20.
- The gaze vector l generated in this way may be defined as the equation of a three-dimensional ray that starts at the three-dimensional position coordinate of the user's pupil and extends along the three-dimensional rotation direction of the user's face.
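In conventional notation (the patent states this definition only in words), such a ray can be written as:

```latex
% Gaze ray: pupil 3D coordinate p_0 as origin, unit direction d taken
% from the face's 3D rotation, parameter t >= 0 sweeping along the ray.
\[
  \mathbf{g}(t) = \mathbf{p}_0 + t\,\mathbf{d},
  \qquad t \ge 0, \quad \lVert \mathbf{d} \rVert = 1
\]
```

The magnitude constraint of the previous paragraph then corresponds to limiting t to roughly 1.5 to 2 times the user-to-target distance.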
- In executing the present invention, it is desirable that the shooting distance, which is the separation distance from the camera device 100 to the user, be kept constant at a predetermined reference distance, and the computing device 100 calculates the 3D position coordinates of the pupil based on the reference distance information preset in the computing device 100.
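A sketch of how the 3D pupil coordinates could follow from the preset reference distance under a pinhole camera model; the intrinsics fx, fy, cx, cy are assumed known or approximated, since the patent does not discuss calibration:

```python
# Back-project the pupil's pixel coordinates to 3D (camera frame) under
# the stated assumption that the user sits at a known reference distance.
import numpy as np

def pupil_3d(pupil_px, z_ref, fx, fy, cx, cy):
    u, v = pupil_px
    x = (u - cx) * z_ref / fx
    y = (v - cy) * z_ref / fy
    return np.array([x, y, z_ref])  # 3D pupil coordinates
```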
- Meanwhile, the computing device 100 may calculate, in three dimensions, the coordinates of the gaze target plane, which is the plane of the gaze target 20 at which the user gazes.
- The computing device 100 may calculate the three-dimensional coordinates of the gaze target surface from the video image received from the camera device 100 in step S210 described above.
- However, when the gaze target 20 is not photographed by the camera device 100, such as when the gaze target 20 is the monitor on which the webcam is installed, it may be preferable that the three-dimensional coordinate information of the gaze target surface be stored in the computing device 100 in advance.
- The computing device 100 can calculate the coordinates of the intersection of the user's gaze vector l generated in the above-described step S330 and the target plane (S350).
- The computing device 100 can then take the point where the user's gaze vector l meets the gaze target surface as the area where the user's gaze rests on the gaze target plane (S370).
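Steps S350/S370 reduce to a standard ray-plane intersection. A minimal sketch, assuming the plane is given by a point and a normal in the camera frame (the patent does not fix a plane parameterization):

```python
# Intersect the gaze ray g(t) = p0 + t*d with the gaze-target plane.
import numpy as np

def ray_plane_intersection(p0, d, plane_point, plane_normal, eps=1e-9):
    denom = np.dot(plane_normal, d)
    if abs(denom) < eps:
        return None  # ray parallel to the plane: no gaze intersection
    t = np.dot(plane_normal, plane_point - p0) / denom
    if t < 0:
        return None  # plane lies behind the user: gaze does not reach it
    return p0 + t * d  # 3D point where the user's gaze meets the plane
```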
- In addition, the computing device 100 may calculate the user's degree of attention to the gaze target 20 based on the result of calculating the coordinates of the intersection of the gaze vector l and the gaze target plane in the aforementioned step S370 (S390).
- For example, when the user is watching lecture content output through the monitor and it is determined that no intersection of the user's gaze vector l and the target plane exists, the computing device 100 can determine that the user is not currently watching the lecture content output through the monitor, and on this basis it may be possible to calculate the user's degree of attention to the lecture content.
- In addition, when the gaze target 20 is an electronic device that requires the user's control, such as a tablet PC, control information for the gaze target 20 may be generated based on the user's gaze area information on the gaze target surface.
- For example, when the user's gaze area on the gaze target surface matches the scroll button display area of the tablet PC screen, the computing device 100 generates a scroll control command and transmits it to the gaze target 20, such as the tablet PC, so that a control operation of the electronic device is executed based on the user's gaze information.
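A sketch of this control step; the rectangle coordinates, the scroll-button area, and the send_to_device transport are hypothetical placeholders, since the patent does not define a message format:

```python
# Hit-test the gaze point against a control's bounding box and, on a
# match, emit a command. SCROLL_BUTTON and the payload are hypothetical.
def gaze_hit(gaze_xy, rect):
    """rect = (x, y, width, height) in screen coordinates."""
    x, y = gaze_xy
    rx, ry, rw, rh = rect
    return rx <= x <= rx + rw and ry <= y <= ry + rh

SCROLL_BUTTON = (1180, 300, 60, 200)  # hypothetical tablet scroll area

def maybe_scroll(gaze_xy, send_to_device):
    if gaze_hit(gaze_xy, SCROLL_BUTTON):
        send_to_device({"command": "scroll"})  # illustrative payload
```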
- Meanwhile, a plurality of users may be included in the video image captured by the camera device 100 in step S210 described above. In this case, the computing device 100 executes each of the steps described above independently for each user: extracting the face region (S230), extracting the eye region (S250), extracting the pupil region 40 (S270), calculating the center coordinates of the pupil region 40 (S290), calculating the pupil position information (S310), generating the gaze vector l (S330), calculating the intersection point between the gaze vector l and the target plane (S350), calculating the gaze region (S370), and calculating the degree of attention (S390). By doing so, the gazes of the plurality of users may be tracked simultaneously, as in the loop sketched below.
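A sketch of the per-user loop; extract_face_regions and locate_pupil refer to the sketches above, while crop_eye_region is a hypothetical helper (returning a grayscale eye crop) standing in for the eye-region extraction of step S250:

```python
# Run the single-user pipeline independently for every detected face.
def track_all_users(frame):
    results = []
    for box in extract_face_regions(frame):  # S230, one entry per user
        eye = crop_eye_region(frame, box)    # S250 (hypothetical helper)
        pupil = locate_pupil(eye)            # S270/S290
        if pupil is not None:
            results.append({"face_box": box, "pupil": pupil})
    return results
```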
- To simultaneously recognize faces of various sizes of a plurality of users, the computing device 100 may use a sliding window approach or a convolutional neural network (CNN) approach.
- In the latter case, the computing device 100 outputs values through the CNN layers in the form of a feature map; based on the feature map, specific locations where a face exists with a predetermined probability may be indicated, and an expected distribution value of the eye region 30 at each such location may then be output.
- The present invention has industrial applicability in the field of eye tracking technology.
Abstract
Disclosed are a method for calculating a user's pupil position and a recording medium on which a program for executing the method for calculating a user's pupil position is recorded. The present invention is implemented through steps in which a computing device extracts a user's face region from an image captured by photographing the user, extracts the user's eye region from the extracted face region, and calculates the position of the user's pupil based on the extracted eye region. According to the present invention, a user's pupil position can be recognized with high precision through a general camera device such as a webcam instead of infrared camera equipment.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2020-0078228 | 2020-06-26 | ||
KR1020200078228A (KR102308190B1) | 2020-06-26 | 2020-06-26 | Method for calculating user's pupil position, and recording medium on which a program for executing the method for calculating the user's pupil position is recorded |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021261770A1 (fr) | 2021-12-30 |
Family
ID=78115307
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2021/006198 (WO2021261770A1, fr) | Method for calculating user's pupil position, and recording medium on which a program for executing the method for calculating the user's pupil position is recorded | 2020-06-26 | 2021-05-18 |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR102308190B1 (fr) |
WO (1) | WO2021261770A1 (fr) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20130043366A (ko) * | 2011-10-20 | 2013-04-30 | 경북대학교 산학협력단 | 시선 추적 장치와 이를 이용하는 디스플레이 장치 및 그 방법 |
KR20170036764A (ko) * | 2014-07-24 | 2017-04-03 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | 동공 검출 |
KR20170078054A (ko) * | 2015-12-29 | 2017-07-07 | 주식회사 펀진 | 디지털 홀로그래피 병렬처리 시스템 및 그 방법 |
KR20190100982A (ko) * | 2018-02-05 | 2019-08-30 | 동국대학교 산학협력단 | 딥 러닝 기반의 차량 운전자 시선 추적 장치 및 방법 |
KR20190102651A (ko) * | 2018-02-27 | 2019-09-04 | 주식회사 펀진 | 디지털 홀로그래피 디스플레이 장치 및 그 방법 |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101423916B1 (ko) * | 2007-12-03 | 2014-07-29 | 삼성전자주식회사 | 복수의 얼굴 인식 방법 및 장치 |
2020
- 2020-06-26: KR application KR1020200078228A granted as KR102308190B1 (ko), active IP Right Grant
2021
- 2021-05-18: PCT application PCT/KR2021/006198 filed as WO2021261770A1 (fr), active Application Filing
Also Published As
Publication number | Publication date |
---|---|
KR102308190B1 (ko) | 2021-10-01 |
Similar Documents
Publication | Title |
---|---|
US12053301B2 (en) | Classifying facial expressions using eye-tracking cameras |
Rakhmatulin et al. | Deep neural networks for low-cost eye tracking |
JP7286208B2 (ja) | Live face detection method, live face detection apparatus, electronic device, and computer program |
Ghani et al. | GazePointer: A real time mouse pointer control implementation based on eye gaze tracking |
CN113591562B (zh) | Image processing method and apparatus, electronic device, and computer-readable storage medium |
CN109634407B (zh) | Control method based on synchronous acquisition and fusion of multimodal human-machine sensing information |
WO2019190076A1 (fr) | Eye tracking method and terminal for implementing the same |
Lemley et al. | Eye tracking in augmented spaces: A deep learning approach |
Morris et al. | Facial feature tracking for cursor control |
Utaminingrum et al. | Eye movement as navigator for disabled person |
WO2021261771A1 (fr) | Method for tracking a user's gaze, and recording medium having recorded thereon a program for executing the method for tracking the user's gaze |
Cao et al. | Gaze tracking on any surface with your phone |
Khilari | Iris tracking and blink detection for human-computer interaction using a low resolution webcam |
WO2021261770A1 (fr) | Method for calculating user's pupil position, and recording medium on which a program for executing the method for calculating the user's pupil position is recorded |
Annachhatre et al. | Virtual Mouse Using Hand Gesture Recognition - A Systematic Literature Review |
JP2004157778A (ja) | Nose position extraction method, program for causing a computer to execute the method, and nose position extraction apparatus |
Kondo et al. | Pupil center detection for infrared irradiation eye image using CNN |
Niu et al. | Real-time localization and matching of corneal reflections for eye gaze estimation via a lightweight network |
CN112527103A (zh) | Remote control method, apparatus and device for a display device, and computer-readable storage medium |
Lander et al. | EyeMirror: Mobile calibration-free gaze approximation using corneal imaging |
KR102696396B1 (ko) | Learning guidance apparatus, method for remotely monitoring a learner's learning status, and computer program |
Waggoner et al. | Inclusive Design: Accessibility Settings for People with Cognitive Disabilities |
CN112395922A (zh) | Facial action detection method, apparatus and system |
Bilal et al. | Design a Real-Time Eye Tracker |
WO2018034384A1 (fr) | Smart card control method based on voice and motion recognition, and virtual laser pointer using the same |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21830146; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 21830146; Country of ref document: EP; Kind code of ref document: A1 |