US20180335839A1 - Eye tracking method, electronic device, and non-transitory computer readable storage medium - Google Patents

Eye tracking method, electronic device, and non-transitory computer readable storage medium Download PDF

Info

Publication number
US20180335839A1
US20180335839A1
Authority
US
United States
Prior art keywords
eye
calibration
interest
processing circuit
pupil region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/985,725
Other languages
English (en)
Inventor
Yung-Chen Lin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HTC Corp
Original Assignee
HTC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HTC Corp filed Critical HTC Corp
Priority to US15/985,725 priority Critical patent/US20180335839A1/en
Assigned to HTC CORPORATION reassignment HTC CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LIN, YUNG-CHEN
Publication of US20180335839A1 publication Critical patent/US20180335839A1/en
Abandoned legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • G06K9/00604
    • G06K9/3233
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/98Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
    • G06V10/993Evaluation of the quality of the acquired pattern
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted

Definitions

  • the present disclosure relates to an electronic device and an eye tracking method. More particularly, the present disclosure relates to an electronic device and an eye tracking method for a head-mounted display (HMD).
  • eye tracking methods are used in various applications. For example, eye tracking methods are used in virtual reality (VR) and augmented reality (AR) systems to trace a user's gazing direction in order to provide corresponding reactions and/or control in the VR/AR environment.
  • the eye tracking method includes: obtaining, by a processing circuit, an eye model; capturing, by a camera, an image of an eye; detecting, by the processing circuit, a pupil region of interest in the image of the eye; calculating, by the processing circuit, an estimated pupil region in the image of the eye based on the eye model; and measuring, by the processing circuit, a similarity value between the estimated pupil region and the pupil region of interest to optimize the eye model.
  • the electronic device includes a camera, a processing circuit electrically connected to the camera, a memory electrically connected to the processing circuit, and one or more programs.
  • the one or more programs are stored in the memory and configured to be executed by the processing circuit.
  • the one or more programs include instructions for: obtaining an eye model; controlling the camera to capture an image of an eye; detecting a pupil region of interest in the image of the eye; calculating an estimated pupil region in the image of the eye based on the eye model; and measuring a similarity value between the estimated pupil region and the pupil region of interest to optimize the eye model.
  • the non-transitory computer readable storage medium stores one or more programs including instructions, which, when executed, cause a processing circuit to perform operations including: obtaining an eye model; controlling a camera to capture an image of an eye; detecting a pupil region of interest in the image of the eye; calculating an estimated pupil region in the image of the eye based on the eye model; and measuring a similarity value between the estimated pupil region and the pupil region of interest to optimize the eye model.
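  • read together, the three recitations above describe one loop: obtain an eye model, capture an eye image, detect the pupil region of interest, compute the model's estimated pupil region, and compare the two to decide whether the model needs optimization. The Python sketch below only illustrates that control flow; every injected callable is a hypothetical placeholder, not an API from the disclosure.

```python
from dataclasses import dataclass
from typing import Callable, Dict

import numpy as np

Mask = np.ndarray  # boolean pupil mask in image coordinates

@dataclass
class EyeTrackerSketch:
    """Hypothetical wiring of the claimed operations; all callables are placeholders."""
    capture: Callable[[], np.ndarray]                     # camera: grayscale eye image
    detect_roi: Callable[[np.ndarray], Mask]              # pupil region of interest
    estimate_region: Callable[[np.ndarray, Dict], Mask]   # model-predicted pupil region
    similarity: Callable[[Mask, Mask], float]             # e.g., intersection-over-union
    optimize: Callable[[Dict, Mask], Dict]                # re-fit the model on mismatch

    def step(self, eye_model: Dict, threshold: float = 0.8) -> Dict:
        image = self.capture()
        roi = self.detect_roi(image)
        estimated = self.estimate_region(image, eye_model)
        if self.similarity(estimated, roi) < threshold:   # 0.8 is an assumed cutoff
            eye_model = self.optimize(eye_model, roi)
        return eye_model
```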
  • FIG. 1 is a schematic block diagram illustrating an electronic device in accordance with some embodiments of the present disclosure.
  • FIG. 2 is a flowchart illustrating the eye tracking method in accordance with some embodiments of the present disclosure.
  • FIG. 3 is a flowchart illustrating the detailed operations in accordance with some embodiments of the present disclosure.
  • FIG. 4A and FIG. 4B are diagrams illustrating the operation of the electronic device according to some embodiments of the present disclosure.
  • FIG. 5A and FIG. 5B are diagrams illustrating operations of the electronic device according to some embodiments of the present disclosure.
  • FIG. 6 is a flowchart illustrating detailed operations in accordance with some embodiments of the present disclosure.
  • FIG. 7A is a flowchart illustrating detailed operations in accordance with some embodiments of the present disclosure.
  • FIG. 7B is a diagram illustrating the operation of the electronic device according to some embodiments of the present disclosure.
  • FIG. 8 is a flowchart illustrating the eye tracking method in accordance with some other embodiments of the present disclosure.
  • FIG. 9 is a diagram illustrating operations of the electronic device according to some embodiments of the present disclosure.
  • FIG. 1 is a schematic block diagram illustrating an electronic device 100 in accordance with some embodiments of the present disclosure.
  • the electronic device 100 may be configured to perform eye tracking to detect a gaze direction of a user.
  • the electronic device 100 may be applied in a virtual reality (VR)/mixed reality (MR)/augmented reality (AR) system.
  • the electronic device 100 may be realized by, for example, a standalone head-mounted device (HMD) or a VIVE HMD.
  • the standalone HMD or VIVE HMD may handle operations such as processing location data of position and rotation, graphics processing, or other data calculations.
  • the electronic device 100 includes a processing circuit 110, a memory 120, a camera 130, infrared light-emitting diode (IR LED) units 142, 144, 146, and a display unit 150.
  • One or more programs P1 are stored in the memory 120 and configured to be executed by the processing circuit 110, in order to perform the eye tracking method.
  • the processing circuit 110 is electrically connected to the camera 130.
  • the processing circuit 110 can be realized by, for example, one or more processors, such as central processors and/or microprocessors, but is not limited in this regard.
  • the memory 120 includes one or more memory devices, each of which includes, or a plurality of which collectively include, a computer readable storage medium.
  • the computer readable storage medium may include a read-only memory (ROM), a flash memory, a floppy disk, a hard disk, an optical disc, a flash disk, a flash drive, a tape, a database accessible from a network, and/or any storage medium with the same functionality that can be contemplated by persons of ordinary skill in the art to which this disclosure pertains.
  • the memory 120 is electrically connected to the processing circuit 110 .
  • the camera 130 is configured to capture one or more images of an eye of the user, such that the processing circuit 110 may analyze the one or more images to perform eye tracking.
  • the camera 130 may be realized by an infrared camera device.
  • the one or more IR LED units 142, 144, 146 may be electrically connected to the processing circuit 110 and configured to provide one or more infrared rays, in which the image of the eye is captured by the infrared camera device using the one or more infrared rays. It is noted that the embodiment shown in FIG. 1 is merely an example and is not meant to limit the present disclosure.
  • the display unit 150 is electrically connected to the processing circuit 110 , such that the video and/or audio content displayed by the display unit 150 is controlled by the processing circuit 110 .
  • FIG. 2 is a flowchart illustrating the eye tracking method 900 in accordance with some embodiments of the present disclosure.
  • the eye tracking method 900 can be applied to an electronic device having a structure that is the same as or similar to the structure of the electronic device 100 shown in FIG. 1.
  • the embodiments shown in FIG. 1 will be used as an example to describe the eye tracking method 900 according to some embodiments of the present disclosure.
  • the present disclosure is not limited to application to the embodiments shown in FIG. 1 .
  • the eye tracking method 900 includes operations S1, S2, S3, S4, S5, and S6.
  • the processing circuit 110 is configured to obtain an eye model.
  • FIG. 3 is a flowchart illustrating the detailed operations of operation S1 in accordance with some embodiments of the present disclosure.
  • the operation S1 includes operations S11, S12, S13, and S14.
  • the processing circuit 110 is configured to control the camera 130 to capture multiple calibration images of the eye.
  • FIG. 4A and FIG. 4B are diagrams illustrating the operation of the electronic device 100 according to some embodiments of the present disclosure.
  • the processing circuit 110 may be configured to control the display unit 150 to display calibration gazing points CP1-CP9 sequentially in multiple frames.
  • the calibration gazing points CP1-CP9 correspond to multiple calibration viewing directions respectively.
  • as shown in FIG. 4A, the display unit 150 displays the corresponding calibration gazing point CP1, which corresponds to the calibration viewing direction VD1.
  • as shown in FIG. 4B, the display unit 150 displays the corresponding calibration gazing point CP2, which corresponds to the calibration viewing direction VD2, and so on.
  • the processing circuit 110 may be configured to control the camera 130 to capture the calibration images of the eye in the multiple frames sequentially.
  • the processing circuit 110 may be configured to detect calibration pupil regions of interest in the calibration images respectively.
  • the processing circuit 110 may perform feature extraction on the calibration images in order to find the corresponding calibration pupil regions of interest.
  • image filtering may be performed to modify or enhance the image of the eye to emphasize the pupil features and/or remove other features, and reduce or eliminate the noise of the image.
  • the false targets may also be eliminated using proper image filtering methods.
  • Haar-like features may be used to calculate the features in the image. Then, a mean shift algorithm may be applied for image segmentation, and a center and a contour of the pupil in the image of the eye may be determined accordingly. In some other embodiments, machine learning/deep learning methods related to computer vision may be applied to identify and determine the center and the contour of the pupil in the image of the eye. For example, a Convolutional Neural Network (CNN) may be applied, but the present disclosure is not limited thereto.
  • the processing circuit 110 may be configured to determine the center of the pupil in the image of the eye, and determine the contour of the pupil in the image of the eye. Then, the processing circuit 110 may be configured to fit an ellipse to the pupil according to the center and the contour of the pupil, to obtain the calibration pupil region of interest.
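  • as a concrete illustration of this detect-then-fit step, the following OpenCV sketch thresholds the dark pupil blob in an IR eye image, cleans the mask, and fits an ellipse to the largest contour. The blur size, threshold value, and kernel size are illustrative assumptions, not values taken from the disclosure.

```python
import cv2
import numpy as np

def fit_pupil_ellipse(eye_gray: np.ndarray):
    """Return ((cx, cy), (major, minor), angle) for the pupil, or None."""
    # Smooth and threshold: the pupil is the darkest blob in an IR eye image.
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    _, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)  # 40: illustrative
    # Morphological opening removes small false targets (eyelashes, glints).
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    contour = max(contours, key=cv2.contourArea)   # largest dark blob = pupil
    if len(contour) < 5:                           # cv2.fitEllipse needs >= 5 points
        return None
    return cv2.fitEllipse(contour)                 # center, axes, rotation angle
```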
  • the processing circuit 110 is configured to analyze the calibration pupil regions of interest to obtain calibration vectors correspondingly.
  • FIG. 5A and FIG. 5B are diagrams illustrating operations of the electronic device 100 according to some embodiments of the present disclosure.
  • the calibration pupil regions of interest are fitted as ellipses E1, E2 with different center coordinates, with different rotation angles, and/or with different major axes E1a, E2a and minor axes E1b, E2b.
  • different ellipses E1, E2 are fitted as the calibration pupil regions of interest.
  • parameters of the major axes E1a, E2a and the minor axes E1b, E2b of the ellipses E1, E2 may be calculated and obtained to be the calibration vectors.
  • the calibration vectors can be realized by normal vectors of surfaces which are formed by the major axes E1a, E2a and the minor axes E1b, E2b of the ellipses E1, E2.
  • operation S14 may be performed.
  • the processing circuit 110 is configured to obtain the eye model according to the calibration vectors and calibration viewing directions corresponding to the calibration vectors.
  • the eye model may include a matrix indicating relationship between the viewpoint of the eye and the gaze vector of the pupil region of interest.
  • a polynomial equation may be obtained to indicate the relationship between the ellipse parameters (e.g., major axis and the minor axis) and polar coordinates of the corresponding gazing points, which represent the viewing directions of the eye.
  • the eye model may include the companion matrix of the polynomial equation to indicate the relationship thereof.
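  • one minimal way to realize such a polynomial mapping is sketched below in NumPy, using the nine calibration samples: the angular coordinate is fitted against the major-axis angle, and the radial coordinate against the major/minor axis ratio. Treating the two relations as independent 1-D polynomials, and the degree-2 choice, are simplifying assumptions for illustration only.

```python
import numpy as np

def fit_eye_model(axis_angles, axis_ratios, gaze_thetas, gaze_radii, deg=2):
    """Fit per-coordinate polynomials from ellipse parameters to polar gaze coords.

    axis_angles : rotation angle of each fitted ellipse's major axis
    axis_ratios : major/minor axis ratio of each fitted ellipse
    gaze_thetas, gaze_radii : polar coordinates of the known calibration points
    """
    theta_poly = np.polyfit(axis_angles, gaze_thetas, deg)   # angle -> angular coord
    r_poly = np.polyfit(axis_ratios, gaze_radii, deg)        # ellipticity -> radial coord
    return {"theta": theta_poly, "r": r_poly}

def predict_viewpoint(model, axis_angle, axis_ratio):
    theta = np.polyval(model["theta"], axis_angle)
    r = np.polyval(model["r"], axis_ratio)
    return r * np.cos(theta), r * np.sin(theta)              # polar -> screen x, y
```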
  • eye tracking may be executed in the following operations S2-S6 as the user performs various interactions with the contents displayed in the display unit 150 while wearing the HMD device. Since the calibration of the eye model is performed in operation S1 to fit each user's specific pupil shape, the accuracy of eye tracking is improved.
  • the processing circuit 110 is configured to control the camera 130 to capture an image of an eye.
  • the processing circuit 110 is configured to detect a pupil region of interest in the image of the eye.
  • the operations of detecting the pupil region of interest are similar to the operations of detecting the calibration pupil regions of interest in operation S1.
  • FIG. 6 is a flowchart illustrating detailed operations in operation S3 in accordance with some embodiments of the present disclosure.
  • in operation S31, the processing circuit 110 determines whether a previous pupil region of interest exists. If the previous pupil region of interest exists, which indicates the target is tracked, operation S33 is executed. If the previous pupil region of interest does not exist, which indicates the tracking is lost or in the initial state, operation S32 is executed to perform feature extraction, in which a Haar-like cascade or a CNN algorithm may be applied.
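  • a sketch of this track-or-redetect branch is given below; it reuses the hypothetical `fit_pupil_ellipse` helper sketched earlier, and the 60-pixel local search window is an illustrative assumption.

```python
def detect_or_track(eye_gray, prev_roi):
    """Re-use the previous pupil ROI when available; fall back to full detection."""
    if prev_roi is not None:
        (cx, cy), _, _ = prev_roi
        h, w = eye_gray.shape
        x0, y0 = max(int(cx) - 60, 0), max(int(cy) - 60, 0)   # 60 px: illustrative
        window = eye_gray[y0:min(int(cy) + 60, h), x0:min(int(cx) + 60, w)]
        local = fit_pupil_ellipse(window)          # search near the last position
        if local is not None:
            (lx, ly), axes, angle = local
            return ((lx + x0, ly + y0), axes, angle)
    # Tracking lost or first frame: run full-frame feature extraction.
    return fit_pupil_ellipse(eye_gray)
```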
  • the processing circuit 110 is configured to analyze the pupil region of interest to obtain a gaze vector of the pupil region of interest.
  • the gaze vector includes a major axis and a minor axis indicating a shape of the pupil region of interest.
  • the pupil region of interest, like the calibration pupil regions of interest, may be a corresponding ellipse, in which the major axis and the minor axis of the ellipse may be used to identify the viewing direction.
  • FIG. 7A is a flowchart illustrating detailed operations in operation S5 in accordance with some embodiments of the present disclosure.
  • the processing circuit 110 is configured to calculate an angular coordinate of the viewpoint according to an angle of the major axis based on the eye model.
  • the processing circuit 110 is configured to calculate a radial coordinate of the viewpoint according to a ratio of the major axis and the minor axis based on the eye model.
  • the processing circuit 110 is configured to obtain the viewpoint of the eye according to the angular coordinate and the radial coordinate.
  • FIG. 7B is a diagram illustrating the operation of the electronic device 100 according to some embodiments of the present disclosure.
  • the user gazes at the viewpoint VP on the screen of the display unit 150 with the viewing direction VD 3 .
  • when the viewpoint VP is denoted in the polar coordinate system, the angular coordinate a of the viewpoint is correlated to the angle of the major axis.
  • the radial coordinate r of the viewpoint is correlated to the ratio of the major axis and the minor axis.
  • as the radial coordinate r increases, the ellipticity (or oblateness, flattening) of the pupil in the image increases, resulting in a greater ratio between the major axis and the minor axis. That is, the greater the ratio between the major axis and the minor axis, the greater the radial coordinate r.
  • as the angular coordinate a changes, the rotation angle of the ellipse changes correspondingly and results in changes of the angle of the major axis. Accordingly, by applying the eye model obtained in operation S1, the viewpoint VP and the viewing direction VD3 of the eye may be calculated and obtained.
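  • continuing the illustrative NumPy sketch above, the mapping can be exercised with toy numbers: a nearly circular pupil (axis ratio near 1) lands near the screen center (small r), while a strongly flattened ellipse lands toward the periphery (large r). The calibration values below are synthetic, chosen only to make the example runnable.

```python
# Toy data only: nine synthetic calibration samples, not measurements.
model = fit_eye_model(
    axis_angles=[0.0, 0.4, 0.8, 1.2, 1.6, 2.0, 2.4, 2.8, 3.1],   # radians
    axis_ratios=[1.00, 1.05, 1.10, 1.15, 1.20, 1.25, 1.30, 1.35, 1.40],
    gaze_thetas=[0.0, 0.4, 0.8, 1.2, 1.6, 2.0, 2.4, 2.8, 3.1],
    gaze_radii=[0.00, 0.05, 0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40],
)
x, y = predict_viewpoint(model, axis_angle=0.8, axis_ratio=1.10)  # expect r ~ 0.10
```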
  • the processing circuit 110 may be configured to track a motion of the eye based on the viewpoint calculated using the eye model, and achieve the eye tracking to trace the gaze direction of the user, in order to execute interactions properly.
  • the processing circuit 110 may control the video and/or audio content displayed by the display unit 150 based on the gaze direction of the user, in order to provide a smooth VR experience and/or a friendly user interactive interface.
  • a Pupil Center-Ellipse Ratio (PCER) eye tracking method is thus realized as discussed in the various embodiments mentioned above, in which the eye tracker may operate successfully with a one-time calibration at the beginning.
  • FIG. 8 is a flowchart illustrating the eye tracking method 900 in accordance with some other embodiments of the present disclosure. With respect to the embodiments of FIG. 8, like elements in FIG. 2 are designated with the same reference numbers for ease of understanding.
  • the eye tracking method 900 may further include operations S301 and S302 to perform a head position calibration.
  • the relative position and/or angle of the HMD may shift with the user's movement such as head nodding or shaking.
  • the head position calibration may be performed to optimize the eye model to increase the accuracy of the eye tracking.
  • the processing circuit 110 is configured to calculate an estimated pupil region in the image of the eye based on the eye model.
  • the processing circuit 110 is configured to measure a similarity value between the estimated pupil region and the pupil region of interest to optimize the eye model. Accordingly, in the following operation S5, the viewpoint of the eye may be calculated according to the gaze vector based on the optimized eye model. In the following operation S6, the motion of the eye is tracked based on the viewpoint calculated using the optimized eye model.
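  • the disclosure does not spell out how the similarity value is computed; an intersection-over-union between the two regions is one plausible stand-in, sketched below.

```python
import numpy as np

def region_similarity(estimated_mask: np.ndarray, roi_mask: np.ndarray) -> float:
    """Intersection-over-union between two boolean pupil masks.

    IoU is an assumed stand-in: the disclosure only requires *a* similarity value
    between the estimated pupil region Re and the pupil region of interest Ri.
    """
    intersection = np.logical_and(estimated_mask, roi_mask).sum()
    union = np.logical_or(estimated_mask, roi_mask).sum()
    return float(intersection) / float(union) if union else 0.0
```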
  • FIG. 9 is a diagram illustrating operations of the electronic device 100 according to some embodiments of the present disclosure.
  • the eye 200 indicates the position of an estimated 3D eye model obtained by the processing circuit 110 .
  • the eye 300 indicates the position of the real-time actual eye of the user.
  • the estimated pupil region Re calculated based on the eye model and the pupil region of interest Ri obtained by the image analysis do not perfectly coincide.
  • the accuracy of the eye model may be evaluated, and the eye model may be optimized correspondingly if the original eye model fails due to a change of the position and/or angle of the HMD.
  • the processing circuit 110 may be configured to measure multiple error values between multiple estimated pupil regions Re and multiple pupil regions of interest Ri sequentially at different time points. Then, the processing circuit 110 may be configured to optimize the eye model to minimize a cost function according to the error values.
  • the cost function may be represented by the following equations.
  • A denotes a camera intrinsic matrix
  • R denotes a rotation matrix
  • T denotes a translation matrix
  • M denotes a refraction matrix
  • p_e denotes a point on the pupil contour of the estimated 3D eye model.
  • q̂_i denotes a predicted 2D image point on the pupil contour of the estimated 3D eye model (e.g., an image point of the estimated pupil region Re).
  • q_i denotes a real 2D image point on the pupil contour (e.g., an image point of the pupil region of interest Ri).
  • f(R, T) denotes a cost function of a spatial constraint.
  • the cost function of the spatial constraint indicates limitations of the eye relief and/or eye box, but the present disclosure is not limited thereto.
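  • the equation images did not survive into this text; from the symbol definitions above, a plausible reconstruction (an assumption, not the verbatim claim language) is that each contour point of the estimated 3D eye model is refracted, rotated, translated, and projected to a predicted image point, and R and T are chosen to minimize the reprojection error subject to the spatial constraint:

$$
\hat{q}_i = A\left(R\,M\,p_{e,i} + T\right), \qquad
\min_{R,\,T}\; \sum_i \bigl\lVert q_i - \hat{q}_i \bigr\rVert^2 + f(R,\,T)
$$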
  • the optimized eye model, similarly to the original eye model, includes a matrix indicating the relationship between the viewpoint of the eye and the gaze vector of the pupil region of interest.
  • the movement and the rotation of the head may also be estimated based on the eye model and the obtained image correspondingly. That is, the head facing direction may be determined. Accordingly, on the condition that the position of the display plane and the viewing axis are known, the viewpoint may be calculated as the point where the viewing axis crosses the display plane. Therefore, the rotation and the movement of the head of the user are taken into account during the eye tracking.
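  • the crossing-point computation is a standard ray-plane intersection: with eye center c, viewing axis direction d, and a display plane through point p0 with normal n, the viewpoint is c + t·d with t = n·(p0 − c)/(n·d). A NumPy sketch, with all symbols illustrative:

```python
import numpy as np

def viewpoint_on_display(eye_center, view_dir, plane_point, plane_normal):
    """Intersect the viewing axis with the display plane (all args are 3-vectors)."""
    denom = np.dot(plane_normal, view_dir)
    if abs(denom) < 1e-9:                      # axis parallel to the display plane
        return None
    t = np.dot(plane_normal, plane_point - eye_center) / denom
    return eye_center + t * view_dir           # the point where the axis crosses
```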
  • the head position calibration process discussed above may also be applied to other eye tracking methods, such as a Pupil Center-Corneal Reflection (PCCR) eye tracking method, in which the eye model may be obtained in various ways, and is not limited to the operations S11-S14 in the embodiments of FIG. 3-FIG. 6.
  • an eye tracking method with the head position calibration process is thereby implemented to realize the eye tracker for applications in VR, AR, or MR and increase the accuracy of eye tracking, which brings a smoother experience for the user wearing the HMD to interact with objects in the VR, AR, or MR environment.
  • the eye tracking method 900 may be implemented as a computer program.
  • when the computer program is executed by a computer, an electronic device, or the processing circuit 110 in FIG. 1, the executing device performs the eye tracking method 900.
  • the computer program can be stored in a non-transitory computer readable storage medium such as a ROM (read-only memory), a flash memory, a floppy disk, a hard disk, an optical disc, a flash disk, a flash drive, a tape, a database accessible from a network, or any storage medium with the same functionality that can be contemplated by persons of ordinary skill in the art to which this disclosure pertains.
  • the operations of the eye tracking method 900 may be added, replaced, and/or eliminated as appropriate, in accordance with various embodiments of the present disclosure.
  • the functional blocks will preferably be implemented through circuits (either dedicated circuits, or general purpose circuits, which operate under the control of one or more processors and coded instructions), which will typically include transistors or other circuit elements that are configured in such a way as to control the operation of the circuitry in accordance with the functions and operations described herein.
  • some or all of the functional blocks may be designed with the aid of a compiler, such as a register transfer language (RTL) compiler.
  • RTL compilers operate upon scripts that closely resemble assembly language code, to compile the script into a form that is used for the layout or fabrication of the ultimate circuitry. Indeed, RTL is well known for its role and use in the facilitation of the design process of electronic and digital systems.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • General Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Eye Examination Apparatus (AREA)
  • Processing Or Creating Images (AREA)
  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)
US15/985,725 2017-05-22 2018-05-22 Eye tracking method, electronic device, and non-transitory computer readable storage medium Abandoned US20180335839A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/985,725 US20180335839A1 (en) 2017-05-22 2018-05-22 Eye tracking method, electronic device, and non-transitory computer readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762509196P 2017-05-22 2017-05-22
US15/985,725 US20180335839A1 (en) 2017-05-22 2018-05-22 Eye tracking method, electronic device, and non-transitory computer readable storage medium

Publications (1)

Publication Number Publication Date
US20180335839A1 true US20180335839A1 (en) 2018-11-22

Family

ID=62235835

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/985,725 Abandoned US20180335839A1 (en) 2017-05-22 2018-05-22 Eye tracking method, electronic device, and non-transitory computer readable storage medium
US15/985,726 Active US10572009B2 (en) 2017-05-22 2018-05-22 Eye tracking method, electronic device, and non-transitory computer readable storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/985,726 Active US10572009B2 (en) 2017-05-22 2018-05-22 Eye tracking method, electronic device, and non-transitory computer readable storage medium

Country Status (4)

Country Link
US (2) US20180335839A1 (zh)
EP (2) EP3407255A1 (zh)
CN (2) CN108960045A (zh)
TW (2) TW201901528A (zh)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109725726A (zh) * 2018-12-29 2019-05-07 上海掌门科技有限公司 一种查询方法和装置
CN109725728A (zh) * 2018-12-29 2019-05-07 三星电子(中国)研发中心 一种ar设备的显示修正方法和装置
US20200183489A1 (en) * 2018-12-11 2020-06-11 National Taiwan University Adaptive eye-tracking calibration method
CN111510630A (zh) * 2020-04-24 2020-08-07 Oppo广东移动通信有限公司 图像处理方法、装置及存储介质
CN112381709A (zh) * 2020-11-13 2021-02-19 北京字节跳动网络技术有限公司 图像处理方法、模型训练方法、装置、设备和介质
US11170213B2 (en) * 2016-10-11 2021-11-09 Optos Plc Ocular image capturing device

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3486834A1 (en) * 2017-11-16 2019-05-22 Smart Eye AB Detection of a pose of an eye
TWI674518B (zh) 2018-11-28 2019-10-11 國立臺灣大學 眼球追蹤的校正方法和其裝置
TWI688254B (zh) * 2018-12-11 2020-03-11 宏碁股份有限公司 立體顯示裝置及其參數校正方法
CN109656373B (zh) * 2019-01-02 2020-11-10 京东方科技集团股份有限公司 一种注视点定位方法及定位装置、显示设备和存储介质
CN111399633B (zh) 2019-01-03 2023-03-31 见臻科技股份有限公司 针对眼球追踪应用的校正方法
TWI710250B (zh) * 2019-01-03 2020-11-11 見臻科技股份有限公司 針對眼球追蹤應用之校正方法
CN111506188A (zh) * 2019-01-30 2020-08-07 托比股份公司 用于动态地调整hud的方法和hmd
US10817595B2 (en) * 2019-02-14 2020-10-27 Nanning Fugui Precision Industrial Co., Ltd. Method of device unlocking and device utilizing the same
US11145844B2 (en) * 2019-02-27 2021-10-12 Int Tech Co., Ltd. Method for manufacturing electroluminescent device
TWI711984B (zh) * 2019-03-08 2020-12-01 鴻海精密工業股份有限公司 深度學習加速方法及用戶終端
CN110764613B (zh) * 2019-10-15 2023-07-18 北京航空航天大学青岛研究院 基于头戴式眼动模组的眼动追踪校准方法
CN112748797B (zh) * 2019-10-31 2022-08-09 Oppo广东移动通信有限公司 一种眼球追踪方法及相关设备
FI130748B1 (fi) 2020-02-21 2024-02-26 Seetrue Tech Oy Katseen seuranta
CN111639017B (zh) * 2020-05-29 2024-05-07 京东方科技集团股份有限公司 测量眼球追踪设备延迟量的方法、设备、眼球追踪系统
CN114546102B (zh) * 2020-11-26 2024-02-27 幻蝎科技(武汉)有限公司 眼动追踪滑行输入方法、系统、智能终端及眼动追踪装置

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040174496A1 (en) * 2003-03-06 2004-09-09 Qiang Ji Calibration-free gaze tracking under natural head movement
US20060239670A1 (en) * 2005-04-04 2006-10-26 Dixon Cleveland Explicit raytracing for gimbal-based gazepoint trackers
US20070014552A1 (en) * 2004-02-17 2007-01-18 Yoshinobu Ebisawa Eyeshot detection device using distance image sensor
US20070040908A1 (en) * 2005-03-16 2007-02-22 Dixon Cleveland System and method for perceived image processing in a gaze tracking system
US20080192990A1 (en) * 2007-02-09 2008-08-14 Kabushiki Kaisha Toshiba Gaze detection apparatus and the method of the same
US8878749B1 (en) * 2012-01-06 2014-11-04 Google Inc. Systems and methods for position estimation
US20160011658A1 (en) * 2014-04-11 2016-01-14 Javier San Agustin Lopez Systems and methods of eye tracking calibration
US20160026847A1 (en) * 2014-07-24 2016-01-28 Milan Vugdelija Pupil detection
US20160029883A1 (en) * 2013-03-28 2016-02-04 Eye Tracking Analysts Ltd Eye tracking calibration

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5912721A (en) 1996-03-13 1999-06-15 Kabushiki Kaisha Toshiba Gaze detection apparatus and its method as well as information display apparatus
GB0119859D0 (en) * 2001-08-15 2001-10-10 Qinetiq Ltd Eye tracking system
EP1691670B1 (en) * 2003-11-14 2014-07-16 Queen's University At Kingston Method and apparatus for calibration-free eye tracking
TWI398796B (zh) 2009-03-27 2013-06-11 Utechzone Co Ltd Pupil tracking methods and systems, and correction methods and correction modules for pupil tracking
CN102125422A (zh) 2010-01-12 2011-07-20 北京科技大学 视线追踪系统中基于瞳孔-角膜反射的视线估计方法
US11747895B2 (en) * 2013-03-15 2023-09-05 Intuitive Surgical Operations, Inc. Robotic system providing user selectable actions associated with gaze tracking
CN103838378B (zh) * 2014-03-13 2017-05-31 广东石油化工学院 一种基于瞳孔识别定位的头戴式眼睛操控系统
TWI577327B (zh) 2014-08-14 2017-04-11 由田新技股份有限公司 瞳孔定位方法與裝置及其電腦程式產品
CN104615978B (zh) * 2015-01-23 2017-09-22 清华大学 视线方向跟踪方法及装置
TWI617948B (zh) 2015-07-24 2018-03-11 由田新技股份有限公司 用於眼部追蹤的校正模組及其方法及電腦可讀取紀錄媒體
CN105425967B (zh) * 2015-12-16 2018-08-28 中国科学院西安光学精密机械研究所 视线追踪及人眼感兴趣区域定位系统
CN106339087B (zh) 2016-08-29 2019-01-29 上海青研科技有限公司 一种基于多维坐标的眼球追踪方法及其装置

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040174496A1 (en) * 2003-03-06 2004-09-09 Qiang Ji Calibration-free gaze tracking under natural head movement
US20070014552A1 (en) * 2004-02-17 2007-01-18 Yoshinobu Ebisawa Eyeshot detection device using distance image sensor
US20070040908A1 (en) * 2005-03-16 2007-02-22 Dixon Cleveland System and method for perceived image processing in a gaze tracking system
US20060239670A1 (en) * 2005-04-04 2006-10-26 Dixon Cleveland Explicit raytracing for gimbal-based gazepoint trackers
US20100220288A1 (en) * 2005-04-04 2010-09-02 Dixon Cleveland Explict raytracing for gimbal-based gazepoint trackers
US20080192990A1 (en) * 2007-02-09 2008-08-14 Kabushiki Kaisha Toshiba Gaze detection apparatus and the method of the same
US8878749B1 (en) * 2012-01-06 2014-11-04 Google Inc. Systems and methods for position estimation
US20160029883A1 (en) * 2013-03-28 2016-02-04 Eye Tracking Analysts Ltd Eye tracking calibration
US20160011658A1 (en) * 2014-04-11 2016-01-14 Javier San Agustin Lopez Systems and methods of eye tracking calibration
US20160026847A1 (en) * 2014-07-24 2016-01-28 Milan Vugdelija Pupil detection

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11170213B2 (en) * 2016-10-11 2021-11-09 Optos Plc Ocular image capturing device
US20200183489A1 (en) * 2018-12-11 2020-06-11 National Taiwan University Adaptive eye-tracking calibration method
US10895910B2 (en) * 2018-12-11 2021-01-19 National Taiwan University Adaptive eye-tracking calibration method
CN109725726A (zh) * 2018-12-29 2019-05-07 上海掌门科技有限公司 一种查询方法和装置
CN109725728A (zh) * 2018-12-29 2019-05-07 三星电子(中国)研发中心 一种ar设备的显示修正方法和装置
CN111510630A (zh) * 2020-04-24 2020-08-07 Oppo广东移动通信有限公司 图像处理方法、装置及存储介质
CN112381709A (zh) * 2020-11-13 2021-02-19 北京字节跳动网络技术有限公司 图像处理方法、模型训练方法、装置、设备和介质

Also Published As

Publication number Publication date
EP3407255A1 (en) 2018-11-28
TW201901529A (zh) 2019-01-01
CN108958473B (zh) 2020-08-07
CN108960045A (zh) 2018-12-07
CN108958473A (zh) 2018-12-07
US20180335840A1 (en) 2018-11-22
EP3410347A1 (en) 2018-12-05
US10572009B2 (en) 2020-02-25
TW201901528A (zh) 2019-01-01

Similar Documents

Publication Publication Date Title
US10572009B2 (en) Eye tracking method, electronic device, and non-transitory computer readable storage medium
Memo et al. Head-mounted gesture controlled interface for human-computer interaction
US10990170B2 (en) Eye tracking method, electronic device, and non-transitory computer readable storage medium
US11442537B2 (en) Glint-assisted gaze tracker
US8878906B2 (en) Invariant features for computer vision
JP5243529B2 (ja) 拡張リアリティイメージのためのカメラポーズ推定装置および方法
US11573641B2 (en) Gesture recognition system and method of using same
US9367951B1 (en) Creating realistic three-dimensional effects
US10254831B2 (en) System and method for detecting a gaze of a viewer
US20180300531A1 (en) Computer-implemented 3d model analysis method, electronic device, and non-transitory computer readable storage medium
Borghi et al. Hands on the wheel: a dataset for driver hand detection and tracking
US20220198836A1 (en) Gesture recognition method, electronic device, computer-readable storage medium, and chip
US20170289518A1 (en) Apparatus for replaying content using gaze recognition and method thereof
US10922831B2 (en) Systems and methods for handling multiple simultaneous localization and mapping (SLAM) sources and algorithms in virtual, augmented, and mixed reality (xR) applications
Núnez et al. Real-time human body tracking based on data fusion from multiple RGB-D sensors
Lee et al. Multi-modal user interaction method based on gaze tracking and gesture recognition
US10304258B2 (en) Human feedback in 3D model fitting
CN111277812A (zh) 图像处理方法和设备
Ferhat et al. Eye-tracking with webcam-based setups: Implementation of a real-time system and an analysis of factors affecting performance
Chugh An Eye Tracking System for a Virtual Reality Headset
US20240211038A1 (en) Gesture-Initiated Eye Enrollment
Liang et al. Accurate Annotation of Gaze Point for Wearable Eye Tracking Devices
Kang et al. Head Pose-Aware Regression for Pupil Localization From a-Pillar Cameras
CN116343290A (zh) 基于外观的人眼三维视线方向估计方法、系统、装置
Manjunath CONTROLLING MOUSE CURSOR USING EYEBALL MOVEMENT

Legal Events

Date Code Title Description
AS Assignment

Owner name: HTC CORPORATION, TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LIN, YUNG-CHEN;REEL/FRAME:045899/0853

Effective date: 20180521

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION