WO2019154012A1 - Method and apparatus for matching light sources with light spots - Google Patents

Method and apparatus for matching light sources with light spots

Info

Publication number
WO2019154012A1
WO2019154012A1 (PCT/CN2019/071326)
Authority
WO
WIPO (PCT)
Prior art keywords
spot
distance
distance distribution
module
light source
Prior art date
Application number
PCT/CN2019/071326
Other languages
English (en)
French (fr)
Inventor
刘伟
宫小虎
聂凤梅
王健
杨孟
Original Assignee
北京七鑫易维信息技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京七鑫易维信息技术有限公司
Publication of WO2019154012A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/193 Preprocessing; Feature extraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/145 Illumination specially adapted for pattern recognition, e.g. using gratings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/19 Sensors therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/197 Matching; Classification

Definitions

  • The present application relates to the field of gaze tracking devices, and in particular to a method and apparatus for matching light sources with light spots.
  • Virtual Reality (VR) is a computer simulation technology with which a virtual world can be created and experienced.
  • With virtual reality technology, a real-time, dynamically displayed three-dimensional simulated environment can be generated on a computer, and in this simulated environment the user can obtain the same perception as in a real environment, for example visual, auditory, and tactile perception.
  • With the rapid development of science and technology, VR has been widely adopted across industries; in particular, the gaze tracking technology used in VR is widely applied in medicine, for example for tracking the line of sight in the medical field.
  • In the prior art, a 3D approximately spherical model of the eyeball is mainly used, and gaze estimation for the eye gaze point is performed by a remote device according to the pupil center coordinates and the corneal reflections.
  • When the VR device uses multiple cameras and multiple light sources, only a single-point calibration is required to complete gaze tracking.
  • However, in actual use the light sources are usually not distinctive and the relative positions of the light sources and the cameras usually differ, so during gaze tracking some cameras cannot capture an image, or the captured image is of poor quality, and the light sources and the spots cannot be matched accurately.
  • the embodiments of the present application provide a method and apparatus for matching a light source and a light spot to at least solve the technical problem that the spot and the light source are inaccurately matched in the prior art.
  • A method for matching light sources with light spots includes: acquiring a first distance distribution between a plurality of spots and the pupil center, wherein the image to be detected contains the plurality of spots and the pupil; acquiring a second distance distribution based on a plurality of light sources; comparing the first distance distribution with the second distance distribution to obtain a comparison result; and determining, according to the comparison result, that a spot satisfying the second distance distribution matches a light source.
  • An apparatus for matching light sources with light spots includes: a first acquiring module configured to acquire a first distance distribution between a plurality of spots and the pupil center, wherein the image to be detected contains the plurality of spots and the pupil; a second acquiring module configured to acquire a second distance distribution based on a plurality of light sources; a comparison module configured to compare the first distance distribution with the second distance distribution to obtain a comparison result; and a determining module configured to determine, according to the comparison result, that a spot satisfying the second distance distribution matches a light source.
  • a storage medium comprising a stored program, wherein the program performs a method of matching a light source with a light spot.
  • a processor configured to execute a program, wherein a method of matching a light source and a spot is performed while the program is running.
  • In the embodiments of the present application, distance distributions are used to match light sources with spots: the first distance distribution between the plurality of spots and the pupil center is acquired, the second distance distribution based on the plurality of light sources is acquired, the first distance distribution is then compared with the second distance distribution to obtain a comparison result, and the spots satisfying the second distance distribution are determined to match the light sources according to the comparison result, the image to be detected containing the plurality of spots and the pupil.
  • This achieves the purpose of accurately filtering out stray spots in the image, thereby realizing the technical effect of accurately matching light sources with spots and solving the technical problem in the prior art that spots and light sources are matched inaccurately.
  • FIG. 1 is a flowchart of a method for matching light sources with light spots according to an embodiment of the present application;
  • FIG. 2 is a schematic diagram of an optional image acquisition arrangement based on multiple light sources according to an embodiment of the present application;
  • FIG. 3 is a schematic diagram of an optional image to be detected according to an embodiment of the present application;
  • FIG. 4 is a schematic diagram of an optional first distance distribution according to an embodiment of the present application;
  • FIG. 5 is a schematic diagram of an optional light source distribution according to an embodiment of the present application;
  • FIG. 6 is a schematic diagram of an optional second distance distribution according to an embodiment of the present application;
  • FIG. 7 is a schematic structural diagram of an apparatus for matching light sources with light spots according to an embodiment of the present application.
  • According to an embodiment of the present application, an embodiment of a method for matching light sources with light spots is provided. It should be noted that the steps illustrated in the flowchart of the figures may be executed in a computer system, for example as a set of computer-executable instructions, and that although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from the one described here.
  • FIG. 1 is a flowchart of a method for matching a light source and a light spot according to an embodiment of the present application. As shown in FIG. 1 , the method includes the following steps:
  • Step S102: acquire a first distance distribution between the plurality of spots and the pupil center, wherein the image to be detected contains the plurality of spots and the pupil.
  • It should be noted that an image capture device can acquire the image to be detected, where the image capture device may include, but is not limited to, a camera, or a mobile phone, tablet, or similar device with an imaging function.
  • A gaze tracking device connected to the image capture device can process the image collected by the image capture device, where the gaze tracking device may include, but is not limited to, virtual reality devices and smart terminals capable of gaze tracking, for example mobile phones, computers, tablets, and wearable devices.
  • When the device connected to the image capture device is a gaze tracking device, the image acquired by the image capture device may be a corneal image of the eye.
  • In an optional embodiment, FIG. 2 shows an optional schematic diagram of image acquisition based on multiple light sources. In FIG. 2:
  • a is the eyeball;
  • b is the corneal surface;
  • c is the corneal curvature center;
  • d is the eyeball rotation center;
  • p is the pupil center;
  • r is the pupil radius;
  • O1 is the camera;
  • I1 and I2 are the two light sources;
  • u21 and u11 are the points at which the light sources are imaged by the camera after being reflected by the cornea; the imaging point that a light source produces on the camera after corneal reflection is the light spot.
  • In an optional embodiment, FIG. 3 shows an optional schematic diagram of the image to be detected.
  • The gaze tracking device can process the image to be detected to obtain the distance between the center position of each spot and the pupil center; for example, the distance between the center position of spot 1 and the pupil center is L1 (not shown in FIG. 3), the distance between the center position of spot 2 and the pupil center is L2 (not shown in FIG. 3), and so on. If there are eight spots, eight distance values are obtained, and the first distance distribution shown in FIG. 4 is built from these eight values.
  • Step S104: acquire a second distance distribution based on the plurality of light sources.
  • It should be noted that if the plurality of light sources are arranged in a circle in space, for example eight light sources evenly spaced on a circle, then after a perspective or affine projection transformation the eight sources form an approximately elliptical pattern.
  • Therefore, the spots formed on the image to be detected by light sources arranged in a circle are distributed approximately as an ellipse.
  • In an optional embodiment, when the plurality of light sources are arranged in a circle, their images follow an elliptical distribution, as shown in FIG. 5.
  • Once the distribution of the light sources is determined, the center of the ellipse can be determined, and the distance between the center position of each light source and the center of the ellipse can then be calculated, yielding the second distance distribution of the light sources, as shown in FIG. 6.
  • Step S106: compare the first distance distribution with the second distance distribution to obtain a comparison result.
  • In an optional embodiment, as can be seen from FIG. 6, if the light sources are arranged in a circle the second distance distribution takes the form of a curve resembling a trigonometric function.
  • Ideally the spots in the image to be detected have the same distribution as the light sources, so in the ideal case the first distance distribution is identical to the second distance distribution; the match between spots and light sources can therefore be determined by comparing the two distributions.
  • Step S108: determine, according to the comparison result, that the spots satisfying the second distance distribution match the light sources.
  • In an optional embodiment, there are eight light sources evenly distributed on a circle. It can therefore be determined that, on the curve of the second distance distribution, the distance change frequency between every two light sources is π/2. The distance change frequency between every two spots on the curve of the first distance distribution is then obtained. If the change frequency between spot A and spot B is π/4, spot B is determined to be a stray spot, and the gaze tracking device filters out spot B. If the change frequency between spot A and spot B is π/3, which lies within an acceptable error range, spot B is determined to match the light source, and the gaze tracking device retains spot B.
  • Since the shape in which the light sources are arranged in space is similar to the shape of their projection on the pupil, the first distance distribution characterizes the distribution of the spots and the second distance distribution characterizes the distribution of the light sources; comparing the two distributions therefore makes it possible to accurately identify the spots that do not match the second distance distribution.
  • The foregoing embodiment can thus achieve the purpose of accurately filtering out the stray spots in the image, thereby realizing the technical effect of accurately matching light sources with spots and solving the technical problem in the prior art that spots and light sources are matched inaccurately.
  • Before the first distance distribution between the plurality of spots and the pupil center is acquired, the center positions of the plurality of spots need to be determined. The specific steps are as follows:
  • Step S10: acquire the image to be detected;
  • Step S12: preprocess the image to be detected;
  • Step S14: perform spot extraction on the preprocessed image to be detected to obtain the center positions of the plurality of spots.
  • Specifically, after obtaining the image to be detected containing the human eye, the gaze tracking device enhances and sharpens the image, extracts the spots from the preprocessed image using a binarization and connected-component method, then determines constraints related to the pupil position together with convex-hull shape constraints and extracts the spots again, and finally obtains the center positions of the extracted spots.
  • the first distance distribution may be determined according to the center position of the plurality of spots and the center of the pupil, and the specific steps are as follows:
  • Step S1020: acquire the distance between the center position of each of the plurality of spots and the pupil center to obtain a first distance set;
  • Step S1022: determine the first distance distribution according to the first distance set.
  • In an optional embodiment, there are nine spots in the image to be detected, and the distances between the nine spots and the pupil center are L1, L2, L3, L4, L5, L6, L7, L8, and L9, respectively.
  • Taking the distance between each spot and the pupil center as the ordinate and the position index of each spot as the abscissa yields the first distance distribution shown in FIG. 4.
  • Step S1040: acquire the center position of the circle formed by the plurality of light sources;
  • Step S1042: acquire the distance between the center position of each of the plurality of light sources and the center of the circle to obtain a second distance set;
  • Step S1044: determine the second distance distribution according to the second distance set.
  • The method for determining the second distance distribution is similar to the method for determining the first distance distribution and is not repeated here.
  • In addition, the plurality of light sources may be uniformly or non-uniformly distributed. After the first distance distribution and the second distance distribution are obtained, they need to be compared to obtain the comparison result so that stray spots can be filtered out; the specific steps are as follows:
  • Step S1060: determine a starting spot and a starting light source, wherein the starting spot corresponds to the starting light source;
  • Step S1062: determine, according to the starting spot, a first transform frequency between every two spots in the first distance distribution;
  • Step S1064: determine, according to the starting light source, a second transform frequency between every two light sources in the second distance distribution;
  • Step S1066: compare the first transform frequency with the second transform frequency to obtain the comparison result.
  • In an optional embodiment, the starting light source can be determined from the relative position of the light sources and the human eye, and the starting spot from the relative position of the spots and the human eye in the image to be detected. For example, taking the inner corner of the eye (that is, the corner of the eye near the bridge of the nose) as the reference, the light source closest to the inner corner is determined to be the starting light source, and the spot closest to the inner corner is determined to be the starting spot.
  • Proceeding clockwise, the first transform frequency between every two spots and the second transform frequency between every two light sources are determined, and it is then checked whether the first transform frequency matches the second transform frequency; for example, if the frequency difference between the first transform frequency and the second transform frequency is less than or equal to a preset frequency, the spot is determined to match the light source; otherwise, the spot is determined not to match the light source.
  • For example, spot A is the starting spot and light source A′ is the starting light source, and the transform frequency between every two spots and between every two light sources is calculated clockwise.
  • If the transform frequency between light source A′ and light source B′ is π/2 and the transform frequency between spot A and spot B is π/12, then, because the frequency difference 5π/12 between π/2 and π/12 is greater than the preset frequency π/12, spot B is determined not to match light source A′.
  • It should be noted that if the plurality of light sources are uniformly distributed, the transform frequency between every two light sources is a fixed frequency, and the transform frequency between every two spots only needs to be compared with this fixed frequency. If the plurality of light sources are non-uniformly distributed, the transform frequency between every two light sources may differ, and the transform frequency of every two spots then needs to be compared with the transform frequency between the corresponding light sources.
  • In an optional embodiment, after the comparison result of the first distance distribution and the second distance distribution is obtained, determining according to the comparison result that the spots satisfying the second distance distribution match the light sources mainly consists in determining that a spot corresponding to a first transform frequency that matches the second transform frequency matches the light source, and filtering out the spots corresponding to first transform frequencies that do not match the second transform frequency.
  • FIG. 7 is a schematic structural diagram of an apparatus for matching light sources with light spots according to an embodiment of the present application. As shown in FIG. 7, the apparatus includes a first acquiring module 701, a second acquiring module 703, a comparison module 705, and a determining module 707.
  • The first acquiring module 701 is configured to acquire a first distance distribution between a plurality of spots and the pupil center, wherein the image to be detected contains the plurality of spots and the pupil; the second acquiring module 703 is configured to acquire a second distance distribution based on the plurality of light sources; the comparison module 705 is configured to compare the first distance distribution with the second distance distribution to obtain a comparison result; and the determining module 707 is configured to determine, according to the comparison result, that a spot satisfying the second distance distribution matches a light source.
  • It should be noted that the first acquiring module 701, the second acquiring module 703, the comparison module 705, and the determining module 707 correspond to steps S102 to S108 in Embodiment 1; the examples and application scenarios implemented by these four modules are the same as those of the corresponding steps, but are not limited to the content disclosed in Embodiment 1 above.
  • The apparatus for matching light sources with light spots further includes a third acquiring module, a processing module, and an extraction module.
  • The third acquiring module is configured to acquire the image to be detected; the processing module is configured to preprocess the image to be detected; and the extraction module is configured to perform spot extraction on the preprocessed image to obtain the center positions of the plurality of spots.
  • the first obtaining module includes: a fourth acquiring module and a first determining module.
  • The fourth acquiring module is configured to acquire the distance between the center position of each of the plurality of spots and the pupil center to obtain a first distance set;
  • the first determining module is configured to determine the first distance distribution according to the first distance set.
  • the first obtaining module includes: a fifth obtaining module, a sixth obtaining module, and a second determining module.
  • The fifth acquiring module is configured to acquire the center position of the circle formed by the plurality of light sources;
  • the sixth acquiring module is configured to acquire the distance between the center position of each of the plurality of light sources and the center of the circle to obtain a second distance set;
  • the second determining module is configured to determine the second distance distribution according to the second distance set.
  • The foregoing fifth acquiring module, sixth acquiring module, and second determining module correspond to steps S1040 to S1044 in Embodiment 1; the examples and application scenarios implemented by the three modules are the same as those of the corresponding steps, but are not limited to the content disclosed in Embodiment 1 above.
  • the comparison module includes: a third determining module, a fourth determining module, a fifth determining module, and a first comparing module.
  • The third determining module is configured to determine a starting spot and a starting light source, wherein the starting spot corresponds to the starting light source; the fourth determining module is configured to determine, according to the starting spot, a first transform frequency between every two spots in the first distance distribution; the fifth determining module is configured to determine, according to the starting light source, a second transform frequency between every two light sources in the second distance distribution; and the first comparison module is configured to compare the first transform frequency with the second transform frequency to obtain the comparison result.
  • The foregoing third determining module, fourth determining module, fifth determining module, and first comparison module correspond to steps S1060 to S1066 in Embodiment 1; the examples and application scenarios implemented by the four modules are the same as those of the corresponding steps, but are not limited to the content disclosed in Embodiment 1 above.
  • the determining module comprises: a sixth determining module and a filtering module.
  • The sixth determining module is configured to determine that a spot corresponding to a first transform frequency that matches the second transform frequency matches the light source, and the filtering module is configured to filter out the spots corresponding to first transform frequencies that do not match the second transform frequency.
  • a storage medium including a stored program, wherein the program performs the method of matching a light source and a spot in Embodiment 1.
  • a processor configured to execute a program, wherein the method of matching the light source and the spot in Embodiment 1 is executed while the program is running.
  • the disclosed technical contents may be implemented in other manners.
  • the device embodiments described above are only schematic.
  • The division into units may be a division by logical function; in actual implementation there may be other ways of dividing, for example multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, unit or module, and may be electrical or otherwise.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer readable storage medium.
  • Based on such understanding, the technical solution of the present application, in essence or in the part that contributes to the prior art, or in whole or in part, may be embodied in the form of a software product stored in a computer-readable storage medium, including a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application.
  • The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
  • The solution provided by the embodiments of the present application can be used to match spots with light sources and can be applied in a gaze tracking device: the first distance distribution between the plurality of spots and the pupil center and the second distance distribution based on the plurality of light sources are acquired, the two distributions are compared to obtain a comparison result, and the spots satisfying the second distance distribution are determined to match the light sources according to the comparison result. Since the shape in which the light sources are arranged in space is similar to the shape of their projection on the pupil, the first distance distribution characterizes the distribution of the spots and the second distance distribution characterizes the distribution of the light sources.
  • Comparing the first distance distribution with the second distance distribution therefore makes it possible to accurately determine the spots that do not match the second distance distribution, achieving the purpose of accurately filtering out stray spots in the image, thereby realizing the technical effect of accurately matching light sources with spots and solving the technical problem in the prior art that spots and light sources are matched inaccurately.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Image Analysis (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

A method and apparatus for matching light sources with light spots. The method includes: acquiring a first distance distribution between a plurality of spots and the pupil center, wherein the image to be detected contains the plurality of spots and the pupil (S102); acquiring a second distance distribution based on a plurality of light sources (S104); comparing the first distance distribution with the second distance distribution to obtain a comparison result (S106); and determining, according to the comparison result, that a spot satisfying the second distance distribution matches a light source (S108). The method solves the technical problem in the prior art that spots and light sources are matched inaccurately.

Description

Method and apparatus for matching light sources with light spots
This application claims priority to Chinese Patent Application No. 201810147220.5, entitled "Method and apparatus for matching light sources with light spots" and filed with the Chinese Patent Office on February 12, 2018, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of gaze tracking devices, and in particular to a method and apparatus for matching light sources with light spots.
Background
Virtual Reality (VR) is a computer simulation technology with which a virtual world can be created and experienced. With virtual reality technology, a real-time, dynamically displayed three-dimensional simulated environment can be generated on a computer, and in this simulated environment the user can obtain the same perception as in a real environment, for example visual, auditory, and tactile perception. With the rapid development of science and technology, VR has been widely adopted across industries; in particular, the gaze tracking technology used in VR is widely applied in medicine, for example for tracking the line of sight in the medical field.
In the prior art, however, a 3D approximately spherical model of the eyeball is mainly used, and gaze estimation for the eye gaze point is performed by a remote device according to the pupil center coordinates and the corneal reflections. Moreover, when a VR device uses multiple cameras and multiple light sources, only a single-point calibration is required to complete gaze tracking.
In actual use, however, the light sources are usually not distinctive and the relative positions of the light sources and the cameras usually differ as well; therefore, during gaze tracking some cameras cannot capture an image, or the captured image is of poor quality, so that the light sources and the spots cannot be matched accurately.
No effective solution has yet been proposed for the above problem in the prior art that spots and light sources are matched inaccurately.
Summary
The embodiments of the present application provide a method and apparatus for matching light sources with light spots, so as to at least solve the technical problem in the prior art that spots and light sources are matched inaccurately.
According to one aspect of the embodiments of the present application, a method for matching light sources with light spots is provided, including: acquiring a first distance distribution between a plurality of spots and the pupil center, wherein the image to be detected contains the plurality of spots and the pupil; acquiring a second distance distribution based on a plurality of light sources; comparing the first distance distribution with the second distance distribution to obtain a comparison result; and determining, according to the comparison result, that a spot satisfying the second distance distribution matches a light source.
According to another aspect of the embodiments of the present application, an apparatus for matching light sources with light spots is further provided, including: a first acquiring module configured to acquire a first distance distribution between a plurality of spots and the pupil center, wherein the image to be detected contains the plurality of spots and the pupil; a second acquiring module configured to acquire a second distance distribution based on a plurality of light sources; a comparison module configured to compare the first distance distribution with the second distance distribution to obtain a comparison result; and a determining module configured to determine, according to the comparison result, that a spot satisfying the second distance distribution matches a light source.
According to another aspect of the embodiments of the present application, a storage medium is further provided, the storage medium including a stored program, wherein the program executes the method for matching light sources with light spots.
According to another aspect of the embodiments of the present application, a processor is further provided, the processor being configured to run a program, wherein the method for matching light sources with light spots is executed when the program runs.
In the embodiments of the present application, distance distributions are used to match light sources with spots: the first distance distribution between the plurality of spots and the pupil center and the second distance distribution based on the plurality of light sources are acquired, the first distance distribution is then compared with the second distance distribution to obtain a comparison result, and the spots satisfying the second distance distribution are determined to match the light sources according to the comparison result, the image to be detected containing the plurality of spots and the pupil. This achieves the purpose of accurately filtering out stray spots in the image, thereby realizing the technical effect of accurately matching light sources with spots and solving the technical problem in the prior art that spots and light sources are matched inaccurately.
Brief Description of the Drawings
The drawings described here are intended to provide a further understanding of the present application and form a part of it; the illustrative embodiments of the present application and their description are used to explain the present application and do not unduly limit it. In the drawings:
FIG. 1 is a flowchart of a method for matching light sources with light spots according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an optional image acquisition arrangement based on multiple light sources according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an optional image to be detected according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an optional first distance distribution according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an optional light source distribution according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an optional second distance distribution according to an embodiment of the present application; and
FIG. 7 is a schematic structural diagram of an apparatus for matching light sources with light spots according to an embodiment of the present application.
Detailed Description
To give those skilled in the art a better understanding of the solution of the present application, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings of the embodiments. Obviously, the described embodiments are only some of the embodiments of the present application rather than all of them. All other embodiments obtained by those of ordinary skill in the art on the basis of the embodiments in the present application without creative effort shall fall within the scope of protection of the present application.
It should be noted that the terms "first", "second", and the like in the description, the claims, and the above drawings of the present application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments of the present application described here can be implemented in orders other than those illustrated or described here. In addition, the terms "comprise" and "have", and any variants thereof, are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that comprises a series of steps or units is not necessarily limited to the steps or units expressly listed, but may include other steps or units that are not expressly listed or that are inherent to such a process, method, product, or device.
Embodiment 1
According to an embodiment of the present application, an embodiment of a method for matching light sources with light spots is provided. It should be noted that the steps illustrated in the flowchart of the figures may be executed in a computer system, for example as a set of computer-executable instructions, and that although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from the one described here.
FIG. 1 is a flowchart of a method for matching light sources with light spots according to an embodiment of the present application. As shown in FIG. 1, the method includes the following steps.
Step S102: acquire a first distance distribution between a plurality of spots and the pupil center, wherein the image to be detected contains the plurality of spots and the pupil.
It should be noted that an image capture device can acquire the image to be detected, where the image capture device may include, but is not limited to, a camera, or a mobile phone, tablet, or similar device with an imaging function. A gaze tracking device connected to the image capture device can process the image collected by the image capture device, where the gaze tracking device may include, but is not limited to, virtual reality devices and smart terminals capable of gaze tracking, for example mobile phones, computers, tablets, and wearable devices. When the device connected to the image capture device is a gaze tracking device, the image acquired by the image capture device may be a corneal image of the eye.
In an optional embodiment, FIG. 2 is an optional schematic diagram of image acquisition based on multiple light sources. In FIG. 2, a is the eyeball, b is the corneal surface, c is the corneal curvature center, d is the eyeball rotation center, p is the pupil center, r is the pupil radius, O1 is the camera, I1 and I2 are the two light sources, and u21 and u11 are the points at which the light sources are imaged by the camera after being reflected by the cornea. The imaging point that a light source produces on the camera after corneal reflection is the light spot mentioned above.
In an optional embodiment, FIG. 3 is an optional schematic diagram of the image to be detected. The gaze tracking device can process the image to be detected to obtain the distance between the center position of each spot and the pupil center; for example, the distance between the center position of spot 1 and the pupil center is L1 (not shown in FIG. 3), the distance between the center position of spot 2 and the pupil center is L2 (not shown in FIG. 3), and so on. If there are eight spots, eight distance values are obtained, and the first distance distribution shown in FIG. 4 is built from these eight values.
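For illustration only (not part of the original disclosure), a minimal Python/NumPy sketch of this step is shown below; the spot and pupil coordinates are hypothetical, and only the Euclidean distance from each spot center to the pupil center reflects the computation described above.

```python
import numpy as np

def first_distance_distribution(spot_centers, pupil_center):
    """Distance from each detected spot center to the pupil center.

    spot_centers : (N, 2) array-like of (x, y) spot centers in image coordinates
    pupil_center : (x, y) pupil center in the same coordinates
    """
    spot_centers = np.asarray(spot_centers, dtype=float)
    pupil_center = np.asarray(pupil_center, dtype=float)
    return np.linalg.norm(spot_centers - pupil_center, axis=1)

# Hypothetical pixel coordinates for eight spots and the pupil center.
spots = [(320, 180), (355, 195), (370, 230), (355, 265),
         (320, 280), (285, 265), (270, 230), (285, 195)]
pupil = (322, 228)
print(first_distance_distribution(spots, pupil))  # eight distances -> the first distance distribution
```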
Step S104: acquire a second distance distribution based on the plurality of light sources.
It should be noted that if the plurality of light sources are arranged in a circle in space, for example eight light sources evenly spaced on a circle, the eight sources form an approximately elliptical pattern after a perspective or affine projection transformation; therefore, the spots formed on the image to be detected by light sources arranged in a circle are distributed approximately as an ellipse.
In an optional embodiment, when the plurality of light sources are arranged in a circle, their images follow an elliptical distribution, as shown in the optional light source distribution of FIG. 5. After the distribution of the light sources is determined, the center of the ellipse can be determined, and the distance between the center position of each light source and the center of the ellipse can then be calculated, yielding the second distance distribution of the light sources, as shown in FIG. 6.
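The following sketch, again purely illustrative, builds a second distance distribution for a hypothetical elliptical projection of eight sources; the ellipse center is approximated here by the centroid of the projected source positions, since the text does not prescribe a particular ellipse-fitting method.

```python
import numpy as np

def second_distance_distribution(source_centers):
    """Distance from each projected light-source center to the center of the
    ellipse-like pattern they form; the center is approximated by the centroid."""
    pts = np.asarray(source_centers, dtype=float)
    center = pts.mean(axis=0)                         # approximate ellipse center
    return np.linalg.norm(pts - center, axis=1), center

# Hypothetical projection: eight sources on a circle viewed obliquely,
# giving an ellipse with semi-axes 40 and 25 pixels centered at (300, 220).
angles = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
projected = np.stack([300 + 40 * np.cos(angles),
                      220 + 25 * np.sin(angles)], axis=1)
dists, center = second_distance_distribution(projected)
print(center)  # ~ (300, 220)
print(dists)   # oscillates between ~25 and ~40, tracing the curve of FIG. 6
```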
Step S106: compare the first distance distribution with the second distance distribution to obtain a comparison result.
In an optional embodiment, as can be seen from FIG. 6, if the light sources are arranged in a circle the second distance distribution takes the form of a curve resembling a trigonometric function. Ideally, the spots in the image to be detected have the same distribution as the light sources, so in the ideal case the first distance distribution is identical to the second distance distribution. The match between the spots and the light sources can therefore be determined by comparing the first distance distribution with the second distance distribution.
Step S108: determine, according to the comparison result, that the spots satisfying the second distance distribution match the light sources.
In an optional embodiment, there are eight light sources evenly distributed on a circle. It can therefore be determined that, on the curve of the second distance distribution, the distance change frequency between every two light sources is π/2. The distance change frequency between every two spots on the curve of the first distance distribution is then obtained. If the change frequency between spot A and spot B is π/4, spot B is determined to be a stray spot, and the gaze tracking device filters out spot B. If the change frequency between spot A and spot B is π/3, which lies within an acceptable error range, spot B is determined to match the light source, and the gaze tracking device retains spot B.
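A minimal sketch of such a filter is given below, assuming a uniform circular layout so that the expected step between adjacent sources is a single constant (π/2 in the example above). How the per-pair change frequency of the spots is measured is not specified in the text, so the observed steps are simply taken as inputs here, and the tolerance value is a hypothetical design parameter.

```python
import numpy as np

def filter_stray_spots(spot_ids, observed_steps, expected_step, tol):
    """Keep the spots whose distance-curve step matches the expected
    source-curve step within `tol`; the others are treated as stray spots.

    spot_ids       : spot labels ordered clockwise from the starting spot
    observed_steps : change frequency between consecutive spots (one per pair)
    expected_step  : change frequency between adjacent light sources
    tol            : acceptable deviation
    """
    kept = [spot_ids[0]]                      # the starting spot is assumed valid
    for next_id, step in zip(spot_ids[1:], observed_steps):
        if abs(step - expected_step) <= tol:
            kept.append(next_id)              # consistent with the source layout
        # otherwise the spot is dropped as a stray reflection
    return kept

# Hypothetical values echoing the example: a step of π/4 is rejected and a step
# of π/3 is accepted against the expected π/2, with a tolerance of π/5.
print(filter_stray_spots(["A", "B", "C"],
                         [np.pi / 4, np.pi / 3],
                         expected_step=np.pi / 2,
                         tol=np.pi / 5))      # -> ['A', 'C']
```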
Based on the solution defined by steps S102 to S108 above, it can be seen that the first distance distribution between the plurality of spots and the pupil center and the second distance distribution based on the plurality of light sources are acquired, the first distance distribution is then compared with the second distance distribution to obtain a comparison result, and the spots satisfying the second distance distribution are determined to match the light sources according to the comparison result, the image to be detected containing the plurality of spots and the pupil.
It is easy to notice that, since the shape in which the light sources are arranged in space is similar to the shape of their projection on the pupil, and the first distance distribution characterizes the distribution of the spots while the second distance distribution characterizes the distribution of the light sources, comparing the first distance distribution with the second distance distribution makes it possible to accurately determine the spots that do not match the second distance distribution, improving the accuracy with which spots and light sources are matched.
It follows from the above that the foregoing embodiment can achieve the purpose of accurately filtering out stray spots in the image, thereby realizing the technical effect of accurately matching light sources with spots and solving the technical problem in the prior art that spots and light sources are matched inaccurately.
In an optional embodiment, before the first distance distribution between the plurality of spots and the pupil center is acquired, the center positions of the plurality of spots need to be determined. The specific steps are as follows:
Step S10: acquire the image to be detected;
Step S12: preprocess the image to be detected;
Step S14: perform spot extraction on the preprocessed image to be detected to obtain the center positions of the plurality of spots.
Specifically, after obtaining the image to be detected containing the human eye, the gaze tracking device enhances and sharpens the image, extracts the spots from the preprocessed image using a binarization and connected-component method, then determines constraints related to the pupil position together with convex-hull shape constraints and extracts the spots again, and finally obtains the center positions of the extracted spots.
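As one possible illustration of the binarization and connected-component step (the original does not name a specific library, and the pupil-position and convex-hull constraints are omitted), an OpenCV-based sketch could look as follows; the threshold and area limits are hypothetical parameters.

```python
import cv2

def extract_spot_centers(eye_image_gray, thresh=200, min_area=3, max_area=200):
    """Rough spot extraction: binarize the bright corneal glints, label the
    connected components, and return the centroids of plausibly sized blobs."""
    _, binary = cv2.threshold(eye_image_gray, thresh, 255, cv2.THRESH_BINARY)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    centers = []
    for i in range(1, n):                          # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        if min_area <= area <= max_area:           # discard noise and large bright regions
            centers.append(tuple(centroids[i]))
    return centers

# Usage sketch (file name is hypothetical):
# gray = cv2.imread("eye_image.png", cv2.IMREAD_GRAYSCALE)
# print(extract_spot_centers(gray))
```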
It should be noted that, after the center positions of the plurality of spots and the pupil center are obtained, the first distance distribution can be determined according to the center positions of the plurality of spots and the pupil center. The specific steps are as follows:
Step S1020: acquire the distance between the center position of each of the plurality of spots and the pupil center to obtain a first distance set;
Step S1022: determine the first distance distribution according to the first distance set.
In an optional embodiment, there are nine spots in the image to be detected, and the distances between the nine spots and the pupil center are L1, L2, L3, L4, L5, L6, L7, L8, and L9, respectively. Taking the distance between each spot and the pupil center as the ordinate and the position index of each spot as the abscissa yields the first distance distribution shown in FIG. 4.
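Such a distribution curve could be visualized, for instance, with matplotlib (a tooling choice not made in the original); the nine distance values below are hypothetical.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical distances L1..L9 between nine spot centers and the pupil center.
distances = np.array([41.0, 37.5, 33.0, 30.5, 31.0, 34.0, 38.5, 42.0, 55.0])
indices = np.arange(1, len(distances) + 1)       # spot position index (abscissa)

plt.plot(indices, distances, marker="o")
plt.xlabel("spot index")
plt.ylabel("distance to pupil center (pixels)")
plt.title("First distance distribution (cf. FIG. 4)")
plt.show()
```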
It should be noted that, after the first distance distribution based on the plurality of spots is obtained, the second distance distribution based on the plurality of light sources also needs to be determined. The specific steps are as follows:
Step S1040: acquire the center position of the circle formed by the plurality of light sources;
Step S1042: acquire the distance between the center position of each of the plurality of light sources and the center of the circle to obtain a second distance set;
Step S1044: determine the second distance distribution according to the second distance set.
It should be noted that the method for determining the second distance distribution is similar to the method for determining the first distance distribution and is not repeated here. In addition, the plurality of light sources may be uniformly or non-uniformly distributed. After the first distance distribution and the second distance distribution are obtained, they need to be compared to obtain a comparison result so that stray spots can be filtered out. The specific steps are as follows:
Step S1060: determine a starting spot and a starting light source, wherein the starting spot corresponds to the starting light source;
Step S1062: determine, according to the starting spot, a first transform frequency between every two spots in the first distance distribution;
Step S1064: determine, according to the starting light source, a second transform frequency between every two light sources in the second distance distribution;
Step S1066: compare the first transform frequency with the second transform frequency to obtain the comparison result.
In an optional embodiment, the starting light source can be determined from the relative position of the light sources and the human eye, and the starting spot from the relative position of the spots and the human eye in the image to be detected. For example, taking the inner corner of the eye (that is, the corner of the eye near the bridge of the nose) as the reference, the light source closest to the inner corner is determined to be the starting light source, the spot closest to the inner corner is determined to be the starting spot, the first transform frequency between every two spots and the second transform frequency between every two light sources are determined clockwise, and it is then determined whether the first transform frequency matches the second transform frequency. For example, if the frequency difference between the first transform frequency and the second transform frequency is less than or equal to a preset frequency, the spot is determined to match the light source; otherwise, the spot is determined not to match the light source. For example, spot A is the starting spot and light source A′ is the starting light source, and the transform frequency between every two spots and between every two light sources is calculated clockwise; if the transform frequency between light source A′ and light source B′ is π/2 and the transform frequency between spot A and spot B is π/12, then, because the frequency difference 5π/12 between π/2 and π/12 is greater than the preset frequency π/12, spot B is determined not to match light source A′.
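The following sketch illustrates this pairwise comparison in a form that also covers non-uniformly distributed sources, where the step between adjacent sources is no longer constant. As before, how the transform frequencies themselves are measured is not specified in the text, so they are passed in as hypothetical values, and π/12 is used as the preset frequency from the example above.

```python
import numpy as np

def match_spots_to_sources(spot_steps, source_steps, preset=np.pi / 12):
    """Compare, clockwise from the starting spot/source, the transform frequency
    of each spot pair with that of the corresponding light-source pair.

    spot_steps   : transform frequency between spot i and spot i+1
    source_steps : transform frequency between source i and source i+1
                   (a repeated constant for a uniform layout, per-pair values
                    for a non-uniform layout)
    preset       : maximum allowed frequency difference
    Returns a list of (pair index, matched?) tuples.
    """
    return [(i, abs(s_spot - s_src) <= preset)
            for i, (s_spot, s_src) in enumerate(zip(spot_steps, source_steps))]

# Hypothetical numbers from the example above: the source step is π/2, the spot
# step is π/12, so the difference 5π/12 exceeds the preset π/12 -> mismatch.
print(match_spots_to_sources([np.pi / 12], [np.pi / 2]))   # -> [(0, False)]
```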
It should be noted that if the plurality of light sources are uniformly distributed, the transform frequency between every two light sources is a fixed frequency, and the transform frequency between every two spots only needs to be compared with this fixed frequency. If the plurality of light sources are non-uniformly distributed, the transform frequency between every two light sources may differ, and the transform frequency of every two spots then needs to be compared with the transform frequency between the corresponding light sources.
In an optional embodiment, after the comparison result of the first distance distribution and the second distance distribution is determined, determining according to the comparison result that the spots satisfying the second distance distribution match the light sources mainly consists in determining that a spot corresponding to a first transform frequency that matches the second transform frequency matches the light source, and filtering out the spots corresponding to first transform frequencies that do not match the second transform frequency.
Embodiment 2
According to an embodiment of the present application, an apparatus embodiment for matching light sources with light spots is provided. FIG. 7 is a schematic structural diagram of the apparatus for matching light sources with light spots according to an embodiment of the present application. As shown in FIG. 7, the apparatus includes a first acquiring module 701, a second acquiring module 703, a comparison module 705, and a determining module 707.
The first acquiring module 701 is configured to acquire a first distance distribution between a plurality of spots and the pupil center, wherein the image to be detected contains the plurality of spots and the pupil; the second acquiring module 703 is configured to acquire a second distance distribution based on a plurality of light sources; the comparison module 705 is configured to compare the first distance distribution with the second distance distribution to obtain a comparison result; and the determining module 707 is configured to determine, according to the comparison result, that a spot satisfying the second distance distribution matches a light source.
It should be noted that the first acquiring module 701, the second acquiring module 703, the comparison module 705, and the determining module 707 correspond to steps S102 to S108 in Embodiment 1; the examples and application scenarios implemented by the four modules are the same as those of the corresponding steps, but are not limited to the content disclosed in Embodiment 1 above.
In an optional embodiment, the apparatus for matching light sources with light spots further includes a third acquiring module, a processing module, and an extraction module. The third acquiring module is configured to acquire the image to be detected; the processing module is configured to preprocess the image to be detected; and the extraction module is configured to perform spot extraction on the preprocessed image to be detected to obtain the center positions of the plurality of spots.
It should be noted that the third acquiring module, the processing module, and the extraction module correspond to steps S10 to S14 in Embodiment 1; the examples and application scenarios implemented by the three modules are the same as those of the corresponding steps, but are not limited to the content disclosed in Embodiment 1 above.
In an optional embodiment, the first acquiring module includes a fourth acquiring module and a first determining module. The fourth acquiring module is configured to acquire the distance between the center position of each of the plurality of spots and the pupil center to obtain a first distance set; the first determining module is configured to determine the first distance distribution according to the first distance set.
It should be noted that the fourth acquiring module and the first determining module correspond to steps S1020 to S1022 in Embodiment 1; the examples and application scenarios implemented by the two modules are the same as those of the corresponding steps, but are not limited to the content disclosed in Embodiment 1 above.
In an optional embodiment, the first acquiring module includes a fifth acquiring module, a sixth acquiring module, and a second determining module. The fifth acquiring module is configured to acquire the center position of the circle formed by the plurality of light sources; the sixth acquiring module is configured to acquire the distance between the center position of each of the plurality of light sources and the center of the circle to obtain a second distance set; and the second determining module is configured to determine the second distance distribution according to the second distance set.
It should be noted that the fifth acquiring module, the sixth acquiring module, and the second determining module correspond to steps S1040 to S1044 in Embodiment 1; the examples and application scenarios implemented by the three modules are the same as those of the corresponding steps, but are not limited to the content disclosed in Embodiment 1 above.
In an optional embodiment, the comparison module includes a third determining module, a fourth determining module, a fifth determining module, and a first comparison module. The third determining module is configured to determine a starting spot and a starting light source, wherein the starting spot corresponds to the starting light source; the fourth determining module is configured to determine, according to the starting spot, a first transform frequency between every two spots in the first distance distribution; the fifth determining module is configured to determine, according to the starting light source, a second transform frequency between every two light sources in the second distance distribution; and the first comparison module is configured to compare the first transform frequency with the second transform frequency to obtain the comparison result.
It should be noted that the third determining module, the fourth determining module, the fifth determining module, and the first comparison module correspond to steps S1060 to S1066 in Embodiment 1; the examples and application scenarios implemented by the four modules are the same as those of the corresponding steps, but are not limited to the content disclosed in Embodiment 1 above.
In an optional embodiment, the determining module includes a sixth determining module and a filtering module. The sixth determining module is configured to determine that a spot corresponding to a first transform frequency that matches the second transform frequency matches the light source; the filtering module is configured to filter out the spots corresponding to first transform frequencies that do not match the second transform frequency.
Embodiment 3
According to another aspect of the embodiments of the present application, a storage medium is further provided, the storage medium including a stored program, wherein the program executes the method for matching light sources with light spots in Embodiment 1.
Embodiment 4
According to another aspect of the embodiments of the present application, a processor is further provided, the processor being configured to run a program, wherein the method for matching light sources with light spots in Embodiment 1 is executed when the program runs.
The serial numbers of the above embodiments of the present application are for description only and do not indicate that one embodiment is better than another.
In the above embodiments of the present application, the description of each embodiment has its own emphasis; for parts that are not described in detail in a given embodiment, reference may be made to the relevant descriptions of the other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other ways. The apparatus embodiments described above are only illustrative; for example, the division into units may be a division by logical function, and in actual implementation there may be other ways of dividing, for example multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part that contributes to the prior art, or in whole or in part, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
The above description is only the preferred implementation of the present application. It should be pointed out that those of ordinary skill in the art can make several improvements and refinements without departing from the principle of the present application, and these improvements and refinements shall also be regarded as falling within the scope of protection of the present application.
Industrial Applicability
The solution provided by the embodiments of the present application can be used to match spots with light sources and can be applied in a gaze tracking device: the first distance distribution between the plurality of spots and the pupil center and the second distance distribution based on the plurality of light sources are acquired, the first distance distribution is then compared with the second distance distribution to obtain a comparison result, and the spots satisfying the second distance distribution are determined to match the light sources according to the comparison result. Since the shape in which the light sources are arranged in space is similar to the shape of their projection on the pupil, the first distance distribution characterizes the distribution of the spots and the second distance distribution characterizes the distribution of the light sources; therefore, comparing the first distance distribution with the second distance distribution makes it possible to accurately determine the spots that do not match the second distance distribution, achieving the purpose of accurately filtering out stray spots in the image, thereby realizing the technical effect of accurately matching light sources with spots and solving the technical problem in the prior art that spots and light sources are matched inaccurately.

Claims (14)

  1. A method for matching light sources with light spots, comprising:
    acquiring a first distance distribution between a plurality of spots and the pupil center, wherein an image to be detected contains the plurality of spots and the pupil;
    acquiring a second distance distribution based on a plurality of light sources;
    comparing the first distance distribution with the second distance distribution to obtain a comparison result; and
    determining, according to the comparison result, that a spot satisfying the second distance distribution matches a light source.
  2. The method according to claim 1, wherein before acquiring the first distance distribution between the plurality of spots and the pupil center, the method further comprises:
    acquiring the image to be detected;
    preprocessing the image to be detected; and
    performing spot extraction on the preprocessed image to be detected to obtain center positions of the plurality of spots.
  3. The method according to claim 1, wherein acquiring the first distance distribution between the plurality of spots and the pupil center comprises:
    acquiring a distance between the center position of each of the plurality of spots and the pupil center to obtain a first distance set; and
    determining the first distance distribution according to the first distance set.
  4. The method according to claim 1, wherein acquiring the second distance distribution based on the plurality of light sources comprises:
    acquiring a center position of a circle formed by the plurality of light sources;
    acquiring distances between the center positions of the plurality of light sources and the center position of the circle to obtain a second distance set; and
    determining the second distance distribution according to the second distance set.
  5. The method according to claim 3 or 4, wherein comparing the first distance distribution with the second distance distribution to obtain the comparison result comprises:
    determining a starting spot and a starting light source, wherein the starting spot corresponds to the starting light source;
    determining, according to the starting spot, a first transform frequency between every two spots in the first distance distribution;
    determining, according to the starting light source, a second transform frequency between every two light sources in the second distance distribution; and
    comparing the first transform frequency with the second transform frequency to obtain the comparison result.
  6. The method according to claim 5, wherein determining, according to the comparison result, that a spot satisfying the second distance distribution matches a light source comprises:
    determining that a spot corresponding to a first transform frequency that matches the second transform frequency matches the light source; and
    filtering out a spot corresponding to a first transform frequency that does not match the second transform frequency.
  7. An apparatus for matching light sources with light spots, comprising:
    a first acquiring module configured to acquire a first distance distribution between a plurality of spots and the pupil center, wherein an image to be detected contains the plurality of spots and the pupil;
    a second acquiring module configured to acquire a second distance distribution based on a plurality of light sources;
    a comparison module configured to compare the first distance distribution with the second distance distribution to obtain a comparison result; and
    a determining module configured to determine, according to the comparison result, that a spot satisfying the second distance distribution matches a light source.
  8. The apparatus according to claim 7, further comprising:
    a third acquiring module configured to acquire the image to be detected;
    a processing module configured to preprocess the image to be detected; and
    an extraction module configured to perform spot extraction on the preprocessed image to be detected to obtain center positions of the plurality of spots.
  9. The apparatus according to claim 7, wherein the first acquiring module comprises:
    a fourth acquiring module configured to acquire a distance between the center position of each of the plurality of spots and the pupil center to obtain a first distance set; and
    a first determining module configured to determine the first distance distribution according to the first distance set.
  10. The apparatus according to claim 7, wherein the first acquiring module comprises:
    a fifth acquiring module configured to acquire a center position of a circle formed by the plurality of light sources;
    a sixth acquiring module configured to acquire distances between the center positions of the plurality of light sources and the center position of the circle to obtain a second distance set; and
    a second determining module configured to determine the second distance distribution according to the second distance set.
  11. The apparatus according to claim 9 or 10, wherein the comparison module comprises:
    a third determining module configured to determine a starting spot and a starting light source, wherein the starting spot corresponds to the starting light source;
    a fourth determining module configured to determine, according to the starting spot, a first transform frequency between every two spots in the first distance distribution;
    a fifth determining module configured to determine, according to the starting light source, a second transform frequency between every two light sources in the second distance distribution; and
    a first comparison module configured to compare the first transform frequency with the second transform frequency to obtain the comparison result.
  12. The apparatus according to claim 11, wherein the determining module comprises:
    a sixth determining module configured to determine that a spot corresponding to a first transform frequency that matches the second transform frequency matches the light source; and
    a filtering module configured to filter out a spot corresponding to a first transform frequency that does not match the second transform frequency.
  13. A storage medium comprising a stored program, wherein the program executes the method for matching light sources with light spots according to any one of claims 1 to 6.
  14. A processor configured to run a program, wherein the method for matching light sources with light spots according to any one of claims 1 to 6 is executed when the program runs.
PCT/CN2019/071326 2018-02-12 2019-01-11 Method and apparatus for matching light sources with light spots WO2019154012A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810147220.5A CN108510542B (zh) 2018-02-12 2018-02-12 Method and apparatus for matching light sources with light spots
CN201810147220.5 2018-02-12

Publications (1)

Publication Number Publication Date
WO2019154012A1 true WO2019154012A1 (zh) 2019-08-15

Family

ID=63375044

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/071326 WO2019154012A1 (zh) 2018-02-12 2019-01-11 Method and apparatus for matching light sources with light spots

Country Status (3)

Country Link
CN (1) CN108510542B (zh)
TW (1) TWI680309B (zh)
WO (1) WO2019154012A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108510542B (zh) 2018-02-12 2020-09-11 北京七鑫易维信息技术有限公司 Method and apparatus for matching light sources with light spots
CN112747906B (zh) 2021-01-28 2022-07-22 歌尔光学科技有限公司 Light source detection method, detection apparatus, and readable storage medium
CN117455927B (zh) 2023-12-21 2024-03-15 万灵帮桥医疗器械(广州)有限责任公司 Method, apparatus, device, and storage medium for spot array segmentation and spot offset calculation

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102715981B (zh) 2012-06-29 2014-08-20 深圳普门科技有限公司 Annular light spot acquisition device for a phototherapy apparatus, and phototherapy apparatus
US20140375541A1 (en) * 2013-06-25 2014-12-25 David Nister Eye tracking via depth camera
EP3065623B1 (en) * 2013-11-09 2020-04-22 Shenzhen Goodix Technology Co., Ltd. Optical eye tracking
US9454699B2 (en) * 2014-04-29 2016-09-27 Microsoft Technology Licensing, Llc Handling glare in eye tracking
EP3035110A1 (en) * 2014-12-18 2016-06-22 Optotune AG Optical system for avoiding speckle patterns
TWI617948B (zh) * 2015-07-24 2018-03-11 由田新技股份有限公司 用於眼部追蹤的校正模組及其方法及電腦可讀取紀錄媒體
JP6845982B2 (ja) 2016-01-13 2021-03-24 フォーブ インコーポレーテッド Facial expression recognition system, facial expression recognition method, and facial expression recognition program
CN205485920U (zh) 2016-01-21 2016-08-17 京东方科技集团股份有限公司 Display control device and display control system
CN107066085B (zh) 2017-01-12 2020-07-10 惠州Tcl移动通信有限公司 Method and apparatus for controlling a terminal based on eye tracking
CN107357429B (zh) 2017-07-10 2020-04-07 京东方科技集团股份有限公司 Method, device, and computer-readable storage medium for determining gaze

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7001377B1 (en) * 2003-12-22 2006-02-21 Alcon Refractivehorizons, Inc. Optical tracking system and associated methods
CN103530618A (zh) 2013-10-23 2014-01-22 哈尔滨工业大学深圳研究生院 Non-contact gaze tracking method based on corneal reflection
CN103761519A (zh) 2013-12-20 2014-04-30 哈尔滨工业大学深圳研究生院 Non-contact gaze tracking method based on adaptive calibration
CN104732191A (zh) 2013-12-23 2015-06-24 北京七鑫易维信息技术有限公司 Apparatus and method for gaze tracking on a virtual display screen using cross-ratio invariance
CN106778641A (zh) 2016-12-23 2017-05-31 北京七鑫易维信息技术有限公司 Gaze estimation method and apparatus
CN108510542A (zh) 2018-02-12 2018-09-07 北京七鑫易维信息技术有限公司 Method and apparatus for matching light sources with light spots

Also Published As

Publication number Publication date
TWI680309B (zh) 2019-12-21
CN108510542B (zh) 2020-09-11
CN108510542A (zh) 2018-09-07
TW201935083A (zh) 2019-09-01

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19751002

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 01.12.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 19751002

Country of ref document: EP

Kind code of ref document: A1