WO2018076202A1 - Head-mounted display device capable of eye tracking, and eye tracking method - Google Patents

Head-mounted display device capable of eye tracking, and eye tracking method

Info

Publication number
WO2018076202A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye
head
human eye
image information
eyeball
Prior art date
Application number
PCT/CN2016/103375
Other languages
English (en)
Chinese (zh)
Inventor
李荣茂
臧珊珊
刘燕君
陈昳丽
朱艳春
陈鸣闽
谢耀钦
Original Assignee
中国科学院深圳先进技术研究院
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中国科学院深圳先进技术研究院
Priority to PCT/CN2016/103375
Publication of WO2018076202A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • The present invention relates to the field of computer technologies, in particular to head-mounted visual devices, and more particularly to a head-mounted visual device capable of human eye tracking and a human eye tracking method.
  • A head-mounted display (HMD, also known as a head-mounted visual device) projects a two-dimensional image directly into the viewer's eye: a set of optical systems (primarily precision optical lenses) magnifies the image on an ultra-micro display and projects it onto the retina, presenting a large-screen image to the viewer. The effect is like viewing an object through a magnifying glass, which presents an enlarged virtual image of the object.
  • The image can be generated directly by a light-emitting diode (LED) display, an active-matrix liquid crystal display (AMLCD), an organic light-emitting diode (OLED) display, or liquid crystal on silicon (LCOS), or obtained indirectly, for example through optical fiber conduction.
  • The display system's image is collimated to infinity by a collimating lens and then reflected off a reflecting surface into the human eye. Thanks to their portability and entertainment value, head-mounted visual devices are quietly changing modern life.
  • However, existing head-mounted visual devices cannot actively interact with the user: the wearer must operate the device, while the device cannot actively sense the user's attention or mood. Eye tracking technology has therefore been proposed as a way to actively perceive the user's attention and mood.
  • There is currently no good solution for using eye tracking technology inside a head-mounted visual device to track human eye information in real time and obtain the eye's gaze point in space.
  • In addition, the weight of the head-mounted visual device is a non-negligible factor. Although mature eye tracker products already exist, directly embedding such a tracker in the head-mounted visual device would undoubtedly increase the weight of the virtual reality helmet and degrade the user experience.
  • The technical problem to be solved by the present invention is to provide a head-mounted visual device capable of human eye tracking and a human eye tracking method, solving the problem that existing head-mounted visual devices cannot track the viewing orientation of the human eye.
  • A specific embodiment of the present invention provides a head-mounted visual device capable of performing human eye tracking, comprising: a virtual reality helmet for housing the head-mounted visual device; a light source disposed in the virtual reality helmet for illuminating the eyeball of the human eye; and a micro camera disposed on the virtual reality helmet for collecting eyeball image information of the human eye, so that a server can determine the orientation information of the pupil of the human eye from the eyeball image information.
  • A specific embodiment of the present invention further provides a human eye tracking method for a head-mounted visual device, comprising: illuminating the eyeball with an LED light source; collecting eyeball image information of the human eye with a micro camera; and determining the orientation information of the pupil of the human eye from the eyeball image information using a spatial mapping relationship.
  • The head-mounted visual device capable of performing human eye tracking and the human eye tracking method have at least the following beneficial effects: a miniature camera and an LED light source are embedded in the head-mounted visual device, multiple reference points are set in the virtual scene, and the spatial mapping relationship between the miniature camera, the reference points, and the eyeball is constructed using three-dimensional matrices; the miniature camera then captures eyeball image information, and analyzing the captured image information according to the spatial mapping relationship yields the pupil focus area in real time. This determines the user's viewing orientation without increasing the weight of the head-mounted visual device and without leaking information about the user's surroundings, thereby improving the user experience.
  • FIG. 1A is a schematic structural diagram of a main body of a head-mounted visual device capable of tracking human eyes according to an embodiment of the present invention
  • FIG. 1B is a schematic rear view of a head-mounted visual device capable of tracking human eyes according to an embodiment of the present invention
  • FIG. 2 is a flowchart of Embodiment 1 of a human eye tracking method for a head-mounted visual device according to an embodiment of the present invention;
  • FIG. 3 is a flowchart of Embodiment 2 of a human eye tracking method for a head-mounted visual device according to an embodiment of the present invention;
  • FIG. 4 is a schematic diagram of three-dimensional coordinates of a spatial positional relationship between a miniature camera, a reference point, and a human eyeball according to an embodiment of the present invention
  • FIG. 5 is a diagram of a coordinate conversion relationship provided by a specific embodiment of the present invention.
  • Orientation terms used herein (for example, up, down, left, right, front, or back) refer only to orientations in the drawings. They are used for illustration and are not intended to limit the invention.
  • FIG. 1A is a schematic diagram of the main structure of a head-mounted visual device capable of tracking human eyes according to an embodiment of the present invention, and FIG. 1B is a rear view of the same device. As shown in FIG. 1A and FIG. 1B, a light source and a micro camera are disposed on each of the two sides of the virtual reality helmet lens: one light source and one micro camera correspond to one eye of the user, and the other light source and micro camera correspond to the other eye. The light source illuminates the eyeball of the human eye, and the miniature camera collects eyeball image information of the human eye, so that the server can determine the orientation information of the pupil of the human eye from the eyeball image information.
  • the head mounted visual device comprises a virtual reality helmet 10, a light source 20 and a miniature camera 30, wherein the virtual reality helmet 10 is for accommodating a head mounted visual device;
  • the light source 20 is used to illuminate the eyeball of the human eye;
  • the micro camera 30 is disposed in the virtual reality helmet 10 and is configured to collect eyeball image information of the human eye, so that the server can determine the orientation information of the pupil of the human eye from that image information; the micro camera 30 can be, for example, a miniature video camera or a miniature still camera.
  • the light source 20 can be a micro LED light source; when the micro camera 30 collects the eyeball image information of the human eye, the light source 20 is momentarily switched on and then off.
  • the miniature camera 30 is connected to the server via an HDMI data line.
  • The orientation information of the pupil of the human eye specifically refers to the following: the straight line directly in front of the human eye is taken as a reference line, the viewing target point is connected to the pupil of the human eye, and the angle and positional relationship between this connecting line and the reference line constitute the orientation information of the pupil.
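To make this definition concrete, here is a minimal sketch (not part of the patent; the function name and the choice of the Y axis as the straight-ahead direction are illustrative assumptions) that computes the angle between the pupil-to-target line and the reference line:

```python
import numpy as np

def gaze_angle(pupil_pos, target_pos, forward=np.array([0.0, 1.0, 0.0])):
    """Angle (radians) between the line joining the pupil to the viewing
    target and the straight-ahead reference line, per the definition above."""
    gaze = np.asarray(target_pos, float) - np.asarray(pupil_pos, float)
    gaze /= np.linalg.norm(gaze)
    ref = forward / np.linalg.norm(forward)
    return np.arccos(np.clip(gaze @ ref, -1.0, 1.0))
```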
  • the server calculates the orientation information of the pupil of the human eye according to the spatial positional relationship between the micro camera 30, the reference point, and the eye of the human eye.
  • the number of reference points is at least 4.
  • the light source 20 specifically includes a first LED light source 201 and a second LED light source 202.
  • a first LED light source 201 is disposed at a left lens edge of the virtual reality helmet 10;
  • a second LED light source 202 is disposed at a right lens edge of the virtual reality helmet 10; and
  • the first LED light source 201 is configured to illuminate the eyeball of the left eye; and
  • the second LED light source 202 is configured to illuminate the eyeball of the right eye.
  • The miniature camera 30 specifically includes a first miniature camera 301 and a second miniature camera 302. The first miniature camera 301 is disposed at the left lens edge of the virtual reality helmet 10; the second miniature camera 302 is disposed at the right lens edge of the virtual reality helmet 10; the first miniature camera 301 is configured to capture eyeball image information of the left eye; and the second miniature camera 302 is configured to capture eyeball image information of the right eye.
  • The server obtains a left-eye optical axis vector of the left eye's gaze orientation from the eyeball image information of the left eye, obtains a right-eye optical axis vector of the right eye's gaze orientation from the eyeball image information of the right eye, and then determines the orientation information of the pupil of the human eye from the intersection of the left-eye and right-eye optical axis vectors.
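As an illustration of this step (a sketch under the assumption that each optical axis is given as an origin point plus a direction vector in a common coordinate frame; the patent provides no code), the gaze point can be taken as the midpoint of the closest approach of the two axis lines:

```python
import numpy as np

def gaze_point(origin_l, dir_l, origin_r, dir_r):
    """Midpoint of closest approach between the two optical-axis lines,
    used as the target gaze point when the axes (nearly) intersect."""
    d_l = dir_l / np.linalg.norm(dir_l)
    d_r = dir_r / np.linalg.norm(dir_r)
    w = origin_l - origin_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w, d_r @ w
    denom = a * c - b * b          # ~0 when the axes are parallel
    if abs(denom) < 1e-9:
        return None
    s = (b * e - c * d) / denom    # parameter along the left axis
    t = (a * e - b * d) / denom    # parameter along the right axis
    p_l = origin_l + s * d_l       # closest point on the left axis
    p_r = origin_r + t * d_r       # closest point on the right axis
    return (p_l + p_r) / 2.0
```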
  • In summary, a miniature camera and a light source are disposed in the virtual reality helmet, a plurality of reference points are set in the virtual scene, and the spatial mapping relationship between the miniature camera, the reference points, and the eyeball is constructed using three-dimensional matrices;
  • the micro camera is used to collect eyeball image information, and the acquired image information is analyzed according to the spatial mapping relationship, yielding the pupil focus area in real time and thereby determining the user's viewing orientation without increasing the weight of the head-mounted visual device and without revealing environmental information around the user.
  • power is supplied through an integrated USB interface (not shown) to the electronic components in the virtual reality helmet, such as the light source 20 and the micro camera 30;
  • the head-mounted visual device is connected to the server through an HDMI data line; over this line the server switches the light source 20 and directs the micro camera 30 to collect eyeball image information, and the server completes the processing of the eyeball image information collected by the micro camera 30.
  • Alternatively, a processor may be provided in the virtual reality helmet 10 to perform the processing and control otherwise handled by the server.
  • FIG. 2 is a flowchart of Embodiment 1 of a human eye tracking method for a head-mounted visual device according to an embodiment of the present invention.
  • As shown in FIG. 2, an LED light source is turned on momentarily, a micro camera collects the eyeball image information of the human eye, and the orientation information of the pupil is determined by analyzing the acquired eyeball image information.
  • Step 101: Illuminate the eyeball of the human eye with an LED light source.
  • The LED light source works like a camera flash: it is turned off immediately after being turned on, so it does not affect the user's normal visual experience.
  • Step 102: Collect eyeball image information of the human eye using the micro camera.
  • The micro camera captures the eyeball image information of the human eye while the LED light source is on; the micro camera can be, for example, a miniature video camera or a miniature still camera.
  • Step 103: Determine the orientation information of the pupil of the human eye from the eyeball image information using the spatial mapping relationship.
  • Step 103 includes: collecting eyeball image information of the left eye and of the right eye; obtaining a left-eye optical axis vector of the left eye's gaze orientation from the eyeball image information of the left eye, and a right-eye optical axis vector of the right eye's gaze orientation from the eyeball image information of the right eye; and determining the orientation information of the pupil of the human eye from the left-eye and right-eye optical axis vectors.
  • In summary, the micro camera (or a similar image sensor) collects the eyeball image information of the human eye, the acquired image information is analyzed according to the spatial mapping relationship, and the pupil focus area is obtained in real time, thereby determining the user's viewing orientation without increasing the weight of the head-mounted visual device and without revealing environmental information around the user, which improves the user experience.
  • FIG. 3 is a flowchart of Embodiment 2 of a human eye tracking method for a head-mounted visual device according to an embodiment of the present invention. As shown in FIG. 3, before human eye tracking can be performed, the spatial mapping relationship between the miniature camera, the reference points, and the eyeball of the human eye must be constructed using three-dimensional matrices.
  • the method further includes:
  • Step 100: Construct the spatial mapping relationship between the miniature camera, the reference points, and the eyeball of the human eye using three-dimensional matrices.
  • Specifically, three-dimensional matrices are used to fit the relationship between the coordinate system of the eyeball and the coordinate system of the reference points, as well as the positional relationship between the miniature camera and the eyeball, finally constructing the spatial mapping relationship between the miniature camera, the reference points, and the eyeball. Using this spatial mapping relationship together with the collected eyeball image information, the user's visual gaze point in the virtual space can be calculated in real time.
  • FIG. 4 is a schematic diagram of three-dimensional coordinates of a spatial positional relationship between a miniature camera, a reference point, and a human eyeball according to an embodiment of the present invention.
  • The present invention provides pupil focus area tracking applied to a virtual reality helmet.
  • The solution is mainly to install a miniature camera (for example, a miniature video camera) on each side of the lens of a head-mounted visual device (for example, a virtual reality helmet), install an LED light source at the edge of each miniature camera lens, and set four reference points in the virtual scene in which the virtual reality helmet operates. When the eyeball looks at a reference point, the LED light source is turned on.
  • The miniature camera captures and records real-time image information of the eyeball and pupil; then, using the spatial positional relationships among the coordinate systems of the miniature camera, the reference points, and the eyeball, various functional and matrix forms are fitted to the one-to-one correspondence between the eye reference frame and the reference frame of the reference points, yielding the pupil position and its orientation information, from which the gaze point in space can be calculated.
  • In FIG. 4, E1 and E2 are the origins of the rectangular spatial coordinate systems of the left and right eyeballs; S1 and S2 are the origins of the rectangular spatial coordinate systems of the miniature cameras; O is the origin of the rectangular spatial coordinate system of the target fixation point; X1 and X2 are reference points set in the virtual reality scene, located on the midline of the line segment joining the two eyeballs; X3 is the target fixation point in the virtual reality scene; H1, H2, and Ct are vertical distances between the cameras and the human eyes; L is the distance between the two eyeballs; Cs is the distance between the two miniature cameras; the distance between reference points X1 and X2 equals the distance between reference point X1 and S0, both being ΔX; and the angle ∠E1X1E2 is 2θ.
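For intuition only (the distance parameter d and the example values are hypothetical, not from the patent), the angle 2θ follows from elementary geometry: with the eyeballs separated by L and the reference point X1 on their midline at distance d from the eye baseline, tan θ = (L/2)/d:

```python
import numpy as np

def vergence_half_angle(L, d):
    """Half of the angle ∠E1X1E2 in FIG. 4: the eyeballs E1 and E2 are L
    apart and X1 sits on their midline at distance d from the baseline."""
    return np.arctan2(L / 2.0, d)

theta = vergence_half_angle(L=0.065, d=0.5)  # hypothetical: 65 mm IPD, 0.5 m depth
print(np.degrees(2 * theta))                 # the full angle 2*theta, in degrees
```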
  • By calculating the spatial position and orientation information of the pupil, the vector coordinates of the pupil looking at a certain point can be obtained.
  • Pupil movement in space spans the three dimensions of the X, Y, and Z axes, so the spatial position of the pupil would ordinarily require three unknown parameters; however, since the pupil moves on the fixed plane of the eyeball, the two-dimensional space of the pupil's planar motion contains only two unknown parameters, x₀ and y₀, while the remaining parameter is directly related to x₀ and y₀.
  • The gaze orientation of the pupil is the rotation of the pupil in the three dimensions of the space in which it is located, denoted R; integrating the spatial position and orientation data of the pupil yields the vector coordinate information [R, t] when the pupil looks at a certain point.
  • R is a 3x3 rotation matrix representing the gaze orientation of the pupil
  • t is a 3x1 vector representing the spatial position information of the pupil. Since the rotation R is likewise constrained to the fixed plane of the eyeball, only two rotation angles are unknown parameters: one about the X axis and one about the Z axis; these two angles determine the value of R.
  • The value of R can be determined from equations (1) and (2).
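Equations (1) and (2) are not reproduced in this text. As a hedged sketch of the stated idea, that R is fixed by one rotation angle about the X axis and one about the Z axis, one possible composition (the multiplication order is an assumption) is:

```python
import numpy as np

def rotation_from_two_angles(alpha, beta):
    """Build a 3x3 gaze rotation R from the two unknown angles described
    above: alpha about the X axis, then beta about the Z axis."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0,  ca, -sa],
                   [0.0,  sa,  ca]])
    Rz = np.array([[ cb, -sb, 0.0],
                   [ sb,  cb, 0.0],
                   [0.0, 0.0, 1.0]])
    return Rz @ Rx
```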
  • the coordinate system of the reference points X1 and X2 is recorded as the plane coordinate system O;
  • the coordinate system of the eyeball is recorded as the three-dimensional eye coordinate system E;
  • the coordinate system of the camera is recorded as S; and
  • the coordinate system of the two-dimensional eye-movement image captured by the camera is recorded as B. From the relationships among the camera, the reference points, and the eyeball coordinate systems in the virtual reality eye tracking system, the coordinate conversion relationship diagram shown in FIG. 5 is obtained.
  • T_O←E = T_O←S · T_S←B · T_B←E
  • T_O←E represents the conversion from the eye coordinate system E to the coordinate system O of the reference points, and can be calibrated using the reference points.
  • T_O←S, the camera coordinate system S relative to the reference-point coordinate system O, and T_S←B, the coordinate system B of the captured two-dimensional image relative to the camera coordinate system S, can both be obtained by calibration.
  • T_B←E, the transformation between the current eye coordinate system E and the coordinate system B of the two-dimensional image, contains two unknown parameters (x, y) that are calculated from the reference points.
  • The eye has two unknown quantities relative to the eye socket: constrained by the eye socket and the shape of the eyeball, eye movement occurs only along the X and Y axes, so a single calibration with the reference points yields the two unknowns in T_B←E and hence the conversion relationship T_B←E.
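The chain above can be made concrete with 4x4 homogeneous transforms (a sketch; the patent does not specify the matrix representation, and the helper names are illustrative):

```python
import numpy as np

def homogeneous(R, t):
    """Pack a 3x3 rotation R and a translation t into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(t, float).ravel()
    return T

def T_O_from_E(T_O_S, T_S_B, T_B_E):
    """Compose T_O<-E = T_O<-S @ T_S<-B @ T_B<-E: eye coordinates are mapped
    to the reference-point frame via the image frame B and camera frame S."""
    return T_O_S @ T_S_B @ T_B_E
```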
  • From these relationships, the unknown parameters of the coordinate systems can be calculated, where:
  • R is a 3 ⁇ 3 rotation matrix
  • t is a 3 ⁇ 1 vector
  • C is the camera's internal (intrinsic) parameter matrix.
  • the four external parameters of the pupil determine the position and orientation of the pupil relative to the scene, including two rotation angles, which can uniquely determine R, and the other two parameters constitute t.
  • the point (x₀, y₀) represents the pixel coordinates of the intersection of the optical axis with the image plane (the principal point); and
  • f_x and f_y represent the focal length in the horizontal and vertical directions, respectively.
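These are the parameters of the standard pinhole camera model; as a sketch of how they are conventionally used (assumed here, not quoted from the patent), the intrinsic matrix C maps between pixel coordinates and viewing-ray directions in the camera frame S:

```python
import numpy as np

def intrinsic_matrix(fx, fy, x0, y0):
    """Pinhole intrinsic matrix C built from the parameters above."""
    return np.array([[fx, 0.0, x0],
                     [0.0, fy, y0],
                     [0.0, 0.0, 1.0]])

def pixel_to_ray(C, u, v):
    """Back-project pixel (u, v) to a unit viewing direction in frame S."""
    ray = np.linalg.inv(C) @ np.array([u, v, 1.0])
    return ray / np.linalg.norm(ray)
```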
  • The two-dimensional eyeball image captured by the camera can thus be converted into the optical axis vector coordinates of the eye's gaze orientation, and the intersection of the optical axis vectors of the two eyes gives the target gaze region. There are mainly three situations:
  • The first situation: the optical axes intersect.
  • The computed optical axes of the two eyes intersect directly, yielding the target fixation point.
  • The second situation: the light columns intersect. Based on the eyeball characteristics of each user, a light column of radius r (determined by the characteristics of the user's eyes) centered on the optical axis vector Fo is formed, and the intersection of the left and right light columns is the target attention area.
  • The third situation: the light cones intersect.
  • In the actual line-of-sight geometry, the retina is the apex of a cone of light whose central axis is the line of sight; the cone subtends an angle, so the field of view is an area on the focal plane.
  • The intersection of the two eyes' areas is the focus area, and the geometric center of the focus area is the focus point.
  • In practice, the first two methods yield sufficient approximation accuracy.
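A hedged sketch of the second situation (treating the "light column" intersection as the closest-approach midpoint, and the radius handling, are illustrative assumptions; it reuses the gaze_point helper sketched earlier):

```python
import numpy as np

def column_gaze_region(origin_l, dir_l, origin_r, dir_r, r_l, r_r):
    """Approximate the light-column intersection: if the two optical-axis
    lines pass within r_l + r_r of each other, return the midpoint of their
    closest approach as the center of the target attention area."""
    p = gaze_point(origin_l, dir_l, origin_r, dir_r)  # midpoint, from earlier sketch
    if p is None:
        return None
    d_l = dir_l / np.linalg.norm(dir_l)
    d_r = dir_r / np.linalg.norm(dir_r)
    # project the midpoint back onto each axis to recover the closest points
    q_l = origin_l + np.dot(p - origin_l, d_l) * d_l
    q_r = origin_r + np.dot(p - origin_r, d_r) * d_r
    if np.linalg.norm(q_l - q_r) <= r_l + r_r:
        return p
    return None
```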
  • In summary, reference points are set in the virtual scene to capture eyeball image data of the pupil at different target points; through the spatial positional relationships of the system, the conversions between the different coordinate systems, and the image data, the user's visual gaze point in the virtual space can be calculated in real time.
  • The solution of the invention mainly comprises the following steps: arranging the camera and LED light source at the edge of the virtual reality helmet lens; setting reference points in the virtual reality scene; photographing pupil movement images; segmenting the white of the eye and the pupil from the image information to obtain the positional relationship between the pupil and the eyeball; and calculating the real-time position and focus direction of the pupil from the acquired data.
  • a miniature camera is placed at the edge of the lens of the virtual reality helmet to capture the changes in the user's eye.
  • an LED light source is arranged on the micro camera to emit light, which helps the camera to collect data.
  • the position relationship of the miniature camera is shown in Fig. 4.
  • Setting reference points: before the user uses the virtual reality helmet, four target points are set from near to far as reference points in the default virtual scene.
  • The reference points are set so as to obtain data while the eye focuses on them: when the user's pupils focus on a reference point, the camera captures the image information of the user's eyeball at that moment.
  • the camera photographs the eye movement image: when the user's eyes look at each reference point, the LED light is turned on, and the camera takes a group of images to record the pupil motion information to obtain image data.
  • Analyzing the image information to obtain the positional relationship between the pupil and the eyeball: the different sets of images captured by the camera are transmitted to the server, and the white of the eye and the pupil are segmented through image analysis.
  • The correspondence between the eye reference frame and the reference frame of the reference points is then fitted with various functional and matrix forms.
  • From the resulting mapping relationship, the position of the pupil and its orientation information are obtained, and the user's visual gaze point in the virtual space is calculated in real time.
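The patent does not specify the segmentation algorithm. One hedged possibility is a dark-pupil threshold-and-contour sketch with OpenCV (all parameter values are illustrative assumptions, not from the patent):

```python
import cv2

def segment_pupil(eye_image_gray):
    """Rough dark-pupil segmentation: threshold the darkest region, take the
    largest contour, and return its centroid in pixel coordinates."""
    blurred = cv2.GaussianBlur(eye_image_gray, (7, 7), 0)
    _, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # (x, y) centroid
```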
  • The application environment of the invention is eye tracking inside an immersive virtual reality helmet, using a geometric approximation of the near-field line of sight. Nothing outside the eye space is tracked, so the environment is controllable, which protects the user's personal information.
  • The interaction (which does not reveal the user's surroundings) is convenient and easy to use; because the geometric line-of-sight model is used, no visual optical-path reconstruction parameter model of the user's lens, pupil, cornea, vitreous body, etc. needs to be calculated, so the computational load is small and the implementation is simple.
  • an embodiment of the present invention may be implemented in various hardware, software code or combinations of both.
  • An embodiment of the present invention may also be program code executing the above method on a digital signal processor (DSP).
  • the invention may also relate to various functions performed by a computer processor, digital signal processor, microprocessor or Field Programmable Gate Array (FPGA).
  • the above described processor may be configured to perform specific tasks in accordance with the present invention, which are accomplished by executing machine readable software code or firmware code that defines a particular method disclosed herein.
  • Software code or firmware code can be developed in different programming languages and different formats or forms, and software code can be compiled for different target platforms. However, the different code patterns, types, and languages of software code, and other types of configuration code for performing tasks in accordance with the present invention, do not depart from the spirit and scope of the present invention.

Abstract

Disclosed are a head-mounted display device capable of eye tracking, and an eye tracking method. The head-mounted display comprises: a virtual reality helmet (10) for housing the head-mounted display device; a light source (20) disposed inside the virtual reality helmet (10) to illuminate the eyeballs of the human eyes; and a miniature camera (30) disposed inside the virtual reality helmet (10) to collect eyeball image information about the human eyes, so that a server can determine orientation information about the pupils of the human eyes from the eyeball image information. The user's viewing orientation can be determined in real time without increasing the weight of the head-mounted display device.
PCT/CN2016/103375 2016-10-26 2016-10-26 Head-mounted display device capable of eye tracking, and eye tracking method WO2018076202A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/103375 WO2018076202A1 (fr) 2016-10-26 2016-10-26 Head-mounted display device capable of eye tracking, and eye tracking method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/103375 WO2018076202A1 (fr) 2016-10-26 2016-10-26 Head-mounted display device capable of eye tracking, and eye tracking method

Publications (1)

Publication Number Publication Date
WO2018076202A1 (fr) 2018-05-03

Family

ID=62023004

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/103375 WO2018076202A1 (fr) 2016-10-26 2016-10-26 Head-mounted display device capable of eye tracking, and eye tracking method

Country Status (1)

Country Link
WO (1) WO2018076202A1 (fr)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040150728A1 (en) * 1997-12-03 2004-08-05 Shigeru Ogino Image pick-up apparatus for stereoscope
CN104603673A (zh) * 2012-09-03 2015-05-06 Smi创新传感技术有限公司 Head-mounted system and method for computing and rendering a stream of digital images using a head-mounted system
CN104685541A (zh) * 2012-09-17 2015-06-03 感官运动仪器创新传感器有限公司 Method and device for determining a gaze point on a three-dimensional object
CN105393160A (zh) * 2013-06-28 2016-03-09 微软技术许可有限责任公司 Camera auto-focus based on eye gaze
CN103439794A (zh) * 2013-09-11 2013-12-11 百度在线网络技术(北京)有限公司 Calibration method for a head-mounted device, and head-mounted device
US20150160725A1 (en) * 2013-12-10 2015-06-11 Electronics And Telecommunications Research Institute Method of acquiring gaze information irrespective of whether user wears vision aid and moves

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111240464A (zh) * 2018-11-28 2020-06-05 简韶逸 Eye tracking calibration method and device
CN111665932A (zh) * 2019-03-05 2020-09-15 宏达国际电子股份有限公司 Head-mounted display device and eye tracking device thereof
CN111665932B (zh) * 2019-03-05 2023-03-24 宏达国际电子股份有限公司 Head-mounted display device and eye tracking device thereof
CN110308794A (zh) * 2019-07-04 2019-10-08 郑州大学 Virtual reality helmet with two display modes and method for controlling the display modes
CN110347260A (zh) * 2019-07-11 2019-10-18 歌尔科技有限公司 Augmented reality device, control method therefor, and computer-readable storage medium
CN112540084A (zh) * 2019-09-20 2021-03-23 联策科技股份有限公司 Appearance inspection system and inspection method
CN110633014A (zh) * 2019-10-23 2019-12-31 哈尔滨理工大学 Head-mounted eye tracking device
CN110633014B (zh) * 2019-10-23 2024-04-05 常州工学院 Head-mounted eye tracking device
CN113362676A (zh) * 2020-03-04 2021-09-07 上海承尊器进多媒体科技有限公司 Virtual-reality-based driving system and method
CN111524175A (zh) * 2020-04-16 2020-08-11 东莞市东全智能科技有限公司 Asymmetric multi-camera depth reconstruction and eye tracking method and system
CN112633128A (zh) * 2020-12-18 2021-04-09 上海影创信息科技有限公司 Method and system for pushing information about objects of interest in the peripheral vision area
CN112926521A (zh) * 2021-03-30 2021-06-08 青岛小鸟看看科技有限公司 Eye tracking method and system based on switching light sources on and off
CN112926521B (zh) * 2021-03-30 2023-01-24 青岛小鸟看看科技有限公司 Eye tracking method and system based on switching light sources on and off
US11863875B2 (en) 2021-03-30 2024-01-02 Qingdao Pico Technology Co., Ltd Eyeball tracking method and system based on on-off of light sources
CN113138664A (zh) * 2021-03-30 2021-07-20 青岛小鸟看看科技有限公司 Eye tracking system and method based on light field sensing
CN113242384A (zh) * 2021-05-08 2021-08-10 聚好看科技股份有限公司 Panoramic video display method and display device
CN114209990A (zh) * 2021-12-24 2022-03-22 艾视雅健康科技(苏州)有限公司 Method and device for real-time analysis of the effective optical power entering the eye from a medical device

Similar Documents

Publication Publication Date Title
WO2018076202A1 (fr) Head-mounted display device capable of eye tracking, and eye tracking method
US11290706B2 (en) Display systems and methods for determining registration between a display and a user's eyes
US10917634B2 (en) Display systems and methods for determining registration between a display and a user's eyes
US9728010B2 (en) Virtual representations of real-world objects
CN107991775B (zh) Head-mounted visual device capable of human eye tracking, and human eye tracking method
US9779512B2 (en) Automatic generation of virtual materials from real-world materials
US9727132B2 (en) Multi-visor: managing applications in augmented reality environments
CA2820950C (fr) Optimized focal area for augmented reality displays
JP5908491B2 (ja) Improved automatic focusing for augmented reality displays
US20160131902A1 (en) System for automatic eye tracking calibration of head mounted display device
CN112805659A (zh) Selecting depth planes for a multi-depth-plane display system through user categorization
US20140152558A1 (en) Direct hologram manipulation using imu
CN108139806A (zh) Tracking the wearer's eyes relative to a wearable device
JP2016507805A (ja) Direct interaction system for mixed reality environments
US11422620B2 (en) Display systems and methods for determining vertical alignment between left and right displays and a user's eyes
CN112753037A (zh) Sensor-fusion eye tracking
CN114581514A (zh) Method for determining a binocular gaze point, and electronic device
WO2023195995A1 (fr) Systems and methods for performing a motor skills neurological test using augmented or virtual reality
JP2015013011A (ja) Program for creating visual-field-restricted image data, and visual field restriction device using the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16919734

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16919734

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 02.07.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16919734

Country of ref document: EP

Kind code of ref document: A1