WO2018184245A1 - Eye movement determination method and device - Google Patents

Eye movement determination method and device

Info

Publication number
WO2018184245A1
Authority
WO
WIPO (PCT)
Prior art keywords
vertex
coordinates
pupil
eye movement
center point
Prior art date
Application number
PCT/CN2017/079818
Other languages
English (en)
French (fr)
Inventor
武克易
Original Assignee
闲客智能(深圳)科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 闲客智能(深圳)科技有限公司
Priority to PCT/CN2017/079818
Publication of WO2018184245A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition

Definitions

  • The present invention relates to the field of intelligent hardware technology, and more particularly to an eye movement determination method and apparatus.
  • The technical problem to be solved by the present invention is to provide an eye movement determination method and apparatus that addresses the above drawbacks of the prior art.
  • Step 1: continuously acquire pictures of the user's eye at a preset time interval through a camera;
  • Step 2: analyze each frame in succession, and calculate and store the distances from the pupil center point to the eye socket in the up, down, left and right directions;
  • Step 3: determine the eye movement direction from the changes in the distances from the pupil center point to the eye socket in the up, down, left and right directions; that is, each acquired frame is placed in the same coordinate system, the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve, together with the coordinates of the pupil center point, are all moving points, the distances from the pupil center point coordinates to the other four points are calculated in each successively acquired frame, and the eye movement direction is determined from the changes in these four distances from frame to frame.
  • Step 2 specifically includes:
  • Step 2-1: extract the eye socket edge curve and obtain the coordinates of its upper vertex, lower vertex, left vertex and right vertex;
  • Step 2-2: extract the coordinates of the pupil center point;
  • Step 2-3: calculate the relative distances from the pupil center point coordinates to the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve.
  • The invention also provides an eye movement determination device, which comprises:
  • a camera module, configured to continuously acquire pictures of the user's eye at a preset time interval through a camera;
  • a distance calculation module, configured to analyze each frame in succession and to calculate and store the distances from the pupil center point to the eye socket in the up, down, left and right directions;
  • a determination module, configured to determine the eye movement direction from the changes in the distances from the pupil center point to the eye socket in the up, down, left and right directions; that is, each acquired frame is placed in the same coordinate system, the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve, together with the coordinates of the pupil center point, are all moving points, the distances from the pupil center point coordinates to the other four points are calculated in each successively acquired frame, and the eye movement direction is determined from the changes in these four distances from frame to frame.
  • In the eye movement determination device of the present invention, the distance calculation module comprises:
  • a coordinate calculation unit, configured to extract the eye socket edge curve, obtain the coordinates of its upper vertex, lower vertex, left vertex and right vertex, and extract the coordinates of the pupil center point;
  • a distance calculation unit, configured to calculate the relative distances from the pupil center point coordinates to the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve (a minimal sketch of this computation is given after this list).
  • The beneficial effects of the present invention are as follows: by calculating and storing, for each acquired frame, the distances from the pupil center point to the eye socket in the up, down, left and right directions, and determining the eye movement direction from the changes in these distances, the eye movement direction can be determined quickly. Compared with existing eye movement direction determination methods, this method saves a large amount of computation, increases computation speed and reduces hardware implementation cost.
  • FIG. 1 is a flowchart of an eye movement determination method according to a preferred embodiment of the present invention;
  • FIGS. 2a, 2b, 2c and 2d are schematic diagrams of four successively acquired eye pictures according to a preferred embodiment of the present invention;
  • FIG. 3 is a schematic block diagram of an eye movement determination device according to a preferred embodiment of the present invention.
  • The flow of the eye movement determination method according to the preferred embodiment of the present invention is shown in FIG. 1 and includes the following steps:
  • Step S10: continuously acquire pictures of the user's eye at a preset time interval through a camera;
  • Step S20: analyze each frame in succession, and calculate and store the distances from the pupil center point to the eye socket in the up, down, left and right directions;
  • Step S30: determine the eye movement direction from the changes in the distances from the pupil center point to the eye socket in the up, down, left and right directions.
  • By calculating and storing, for each acquired frame, the distances from the pupil center point to the eye socket in the up, down, left and right directions, and determining the eye movement direction from the changes in these distances, the method of the invention can determine the eye movement direction quickly. Compared with existing eye movement direction determination methods, it saves a large amount of computation, increases computation speed and reduces hardware implementation cost.
  • Step S20 specifically includes:
  • Step S21: extract the eye socket edge curve and obtain the coordinates of its upper vertex, lower vertex, left vertex and right vertex;
  • Step S22: extract the coordinates of the pupil center point;
  • Step S23: calculate the relative distances from the pupil center point coordinates to the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve.
  • Step S30 specifically includes: taking the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve as fixed coordinates, calculating in each successively acquired frame the distances from the pupil center point coordinates to these four points, and determining the eye movement direction from the changes in these four distances from frame to frame.
  • Alternatively, step S30 specifically includes: placing each acquired frame in the same coordinate system, where the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve, together with the coordinates of the pupil center point, are all moving points; calculating in each successively acquired frame the distances from the pupil center point coordinates to the other four points; and determining the eye movement direction from the changes in these four distances from frame to frame.
  • FIGS. 2a, 2b, 2c and 2d are schematic diagrams of four successively acquired eye pictures; from FIG. 2a through FIGS. 2b, 2c and 2d, the pupil moves to the left.
  • The coordinates of the upper vertex c, lower vertex d, left vertex a and right vertex b of the eye socket edge curve are taken as fixed coordinates, and the distances from the coordinates of the pupil center point e to these four points are calculated in each of the four frames; the four distance values are denoted e-c, e-d, e-a and e-b, respectively.
  • The three distance values e-c, e-d and e-b all increase frame by frame while e-a decreases, so it can be determined that the pupil moves toward the left vertex of the eye socket.
  • In another embodiment, the pupil moves toward the upper vertex c and the left vertex a in FIG. 2a, that is, toward the upper left. In this case the above method finds that the distance values e-d and e-b gradually increase, while the distance values e-a and e-c gradually decrease.
  • In a specific embodiment, the two states (increasing or decreasing) of each of the four values e-c, e-d, e-a and e-b can be enumerated as combinations together with the corresponding eye movement direction results; after the four values e-c, e-d, e-a and e-b have been computed, a direct table lookup yields the corresponding eye movement direction result, which further increases the determination speed.
  • In another embodiment of the present invention, an eye movement determination device is further provided.
  • The device includes: a camera module 10, configured to continuously acquire pictures of the user's eye at a preset time interval through a camera;
  • a distance calculation module 20, configured to analyze each frame in succession and to calculate and store the distances from the pupil center point to the eye socket in the up, down, left and right directions;
  • a determination module 30, configured to determine the eye movement direction from the changes in the distances from the pupil center point to the eye socket in the up, down, left and right directions.
  • By calculating and storing, for each acquired frame, the distances from the pupil center point to the eye socket in the up, down, left and right directions, and determining the eye movement direction from the changes in these distances, the invention can determine the eye movement direction quickly. Compared with existing eye movement direction determination methods, it saves a large amount of computation, increases computation speed and reduces hardware implementation cost.
  • The camera module may be a single camera or a camera assembly.
  • The distance calculation module and the determination module may be hardware circuit modules implemented on an FPGA board, with the data computation performed by the programmable logic device.
  • The distance calculation module includes: a coordinate calculation unit, configured to extract the eye socket edge curve, obtain the coordinates of its upper vertex, lower vertex, left vertex and right vertex, and extract the coordinates of the pupil center point;
  • a distance calculation unit, configured to calculate the relative distances from the pupil center point coordinates to the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve.
  • The determination module includes a first determination unit, configured to take the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve as fixed coordinates, calculate in each successively acquired frame the distances from the pupil center point coordinates to these four points, and determine the eye movement direction from the changes in these four distances from frame to frame.
  • The determination module includes a second determination unit, configured to place each acquired frame in the same coordinate system, treat the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve, together with the coordinates of the pupil center point, as moving points, calculate in each successively acquired frame the distances from the pupil center point coordinates to the other four points, and determine the eye movement direction from the changes in these four distances from frame to frame.
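As an informal illustration only (not part of the patent text), here is a minimal Python sketch of the relative-distance computation performed by the distance calculation unit. The five coordinate pairs are assumed to come from an upstream landmark-extraction step; all names and values are hypothetical.

```python
import math

def relative_distances(pupil_center, upper, lower, left, right):
    """Distances from the pupil center point to the four eye socket edge vertices.

    All arguments are (x, y) coordinate pairs; the four vertex coordinates are
    assumed to have been extracted from the eye socket edge curve already.
    Returns the distances in the order (to upper, to lower, to left, to right).
    """
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    return (dist(pupil_center, upper),
            dist(pupil_center, lower),
            dist(pupil_center, left),
            dist(pupil_center, right))

# Example: a pupil slightly left of the eye socket center.
print(relative_distances((45, 50), (50, 20), (50, 80), (10, 50), (90, 50)))
```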

Abstract

The present invention relates to an eye movement determination method and device. The method comprises the following steps: Step 1, continuously acquiring pictures of the user's eye at a preset time interval through a camera; Step 2, analyzing each frame in succession, and calculating and storing the distances from the pupil center point to the eye socket in the up, down, left and right directions; Step 3, determining the eye movement direction from the changes in the distances from the pupil center point to the eye socket in the four directions, namely: placing each acquired frame in the same coordinate system, treating the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve, together with the coordinates of the pupil center point, as moving points, calculating in each successively acquired frame the distances from the pupil center point coordinates to the other four points, and determining the eye movement direction from the changes in these four distances from frame to frame. Compared with existing eye movement direction determination methods, this method saves a large amount of computation, increases computation speed and reduces hardware implementation cost.

Description

Eye Movement Determination Method and Device
Technical Field
The present invention relates to the field of intelligent hardware technology, and more particularly to an eye movement determination method and device.
Background Art
In the field of human-computer interaction, although many emerging interaction modes have already been attempted, such as motion sensing, eye tracking, voice interaction and biometric recognition, most of them see rather low usage, have not yet reached genuinely widespread commercial adoption, and none of them lets a person communicate with a device (machine) freely and without obstacles. Apart from multi-touch interaction, which has gradually become commonplace, most other human-computer interaction modes still have technical and stability hurdles to overcome.
Summary of the Invention
The technical problem to be solved by the present invention is to provide an eye movement determination method and device that addresses the above drawbacks of the prior art.
The technical solution adopted by the present invention to solve this technical problem is as follows:
An eye movement determination method is constructed, comprising the following steps:
Step 1: continuously acquire pictures of the user's eye at a preset time interval through a camera;
Step 2: analyze each frame in succession, and calculate and store the distances from the pupil center point to the eye socket in the up, down, left and right directions;
Step 3: determine the eye movement direction from the changes in the distances from the pupil center point to the eye socket in the up, down, left and right directions; that is, each acquired frame is placed in the same coordinate system, the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve, together with the coordinates of the pupil center point, are all moving points, the distances from the pupil center point coordinates to the other four points are calculated in each successively acquired frame, and the eye movement direction is determined from the changes in these four distances from frame to frame.
In the eye movement determination method of the present invention, step 2 specifically comprises:
Step 2-1: extract the eye socket edge curve and obtain the coordinates of its upper vertex, lower vertex, left vertex and right vertex;
Step 2-2: extract the coordinates of the pupil center point;
Step 2-3: calculate the relative distances from the pupil center point coordinates to the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve.
The present invention further provides an eye movement determination device, comprising:
a camera module, configured to continuously acquire pictures of the user's eye at a preset time interval through a camera;
a distance calculation module, configured to analyze each frame in succession and to calculate and store the distances from the pupil center point to the eye socket in the up, down, left and right directions;
a determination module, configured to determine the eye movement direction from the changes in the distances from the pupil center point to the eye socket in the up, down, left and right directions; that is, each acquired frame is placed in the same coordinate system, the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve, together with the coordinates of the pupil center point, are all moving points, the distances from the pupil center point coordinates to the other four points are calculated in each successively acquired frame, and the eye movement direction is determined from the changes in these four distances from frame to frame.
In the eye movement determination device of the present invention, the distance calculation module comprises:
a coordinate calculation unit, configured to extract the eye socket edge curve, obtain the coordinates of its upper vertex, lower vertex, left vertex and right vertex, and extract the coordinates of the pupil center point;
a distance calculation unit, configured to calculate the relative distances from the pupil center point coordinates to the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve.
The beneficial effects of the present invention are as follows: by calculating and storing, for each acquired frame, the distances from the pupil center point to the eye socket in the up, down, left and right directions, and determining the eye movement direction from the changes in these distances, the eye movement direction can be determined quickly. Compared with existing eye movement direction determination methods, this method saves a large amount of computation, increases computation speed and reduces hardware implementation cost.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the present invention is further described below with reference to the accompanying drawings and embodiments. The drawings described below show only some embodiments of the present invention; a person of ordinary skill in the art may derive other drawings from them without creative effort:
FIG. 1 is a flowchart of an eye movement determination method according to a preferred embodiment of the present invention;
FIGS. 2a, 2b, 2c and 2d are schematic diagrams of four successively acquired eye pictures according to a preferred embodiment of the present invention;
FIG. 3 is a schematic block diagram of an eye movement determination device according to a preferred embodiment of the present invention.
Detailed Description
To make the purpose, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below. The described embodiments are evidently only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
The flow of the eye movement determination method according to a preferred embodiment of the present invention is shown in FIG. 1 and comprises the following steps:
Step S10: continuously acquire pictures of the user's eye at a preset time interval through a camera;
Step S20: analyze each frame in succession, and calculate and store the distances from the pupil center point to the eye socket in the up, down, left and right directions;
Step S30: determine the eye movement direction from the changes in the distances from the pupil center point to the eye socket in the up, down, left and right directions.
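The following Python sketch is illustrative only and is not prescribed by the patent: it shows one way the S10-S30 loop could be arranged, using OpenCV as an assumed capture back end and two placeholder callables, analyze_frame (step S20) and judge_direction (step S30), that are hypothetical names introduced here.

```python
import time
import cv2  # one possible capture back end; the patent does not prescribe any

CAPTURE_INTERVAL_S = 0.1  # the "preset time interval" of step S10 (value chosen arbitrarily)

def run_eye_tracker(analyze_frame, judge_direction, camera_index=0):
    """Skeleton of the S10-S30 loop: capture, per-frame analysis, direction determination.

    `analyze_frame(frame)` is assumed to return the four pupil-to-socket distances
    for one frame, and `judge_direction(history)` to map the stored distance
    history to a direction; both are placeholders for steps S20 and S30.
    """
    cap = cv2.VideoCapture(camera_index)
    history = []                            # stored per-frame distances (step S20)
    try:
        while True:
            ok, frame = cap.read()          # step S10: grab the next eye picture
            if not ok:
                break
            history.append(analyze_frame(frame))
            if len(history) >= 2:
                print(judge_direction(history))   # step S30: direction from distance changes
            time.sleep(CAPTURE_INTERVAL_S)
    finally:
        cap.release()
```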
By calculating and storing, for each acquired frame, the distances from the pupil center point to the eye socket in the up, down, left and right directions, and determining the eye movement direction from the changes in these distances, the method of the present invention can determine the eye movement direction quickly. Compared with existing eye movement direction determination methods, it saves a large amount of computation, increases computation speed and reduces hardware implementation cost.
In the above eye movement determination method, step S20 specifically comprises:
Step S21: extract the eye socket edge curve and obtain the coordinates of its upper vertex, lower vertex, left vertex and right vertex;
Step S22: extract the coordinates of the pupil center point;
Step S23: calculate the relative distances from the pupil center point coordinates to the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve.
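Purely as an illustration of steps S21 and S22 (the patent does not prescribe a particular extraction algorithm), here is a minimal Python sketch that picks the four extreme vertices from an already extracted edge curve and takes the centroid of a binary pupil mask as the pupil center point. The array shapes, the synthetic demo data and the function names are assumptions.

```python
import numpy as np

def socket_vertices(edge_points):
    """Upper, lower, left and right vertices of the eye socket edge curve.

    `edge_points` is assumed to be an (N, 2) array of (x, y) points on the
    already extracted edge curve (step S21). Note: in image coordinates the
    smallest y value is the topmost point.
    """
    pts = np.asarray(edge_points)
    upper = pts[pts[:, 1].argmin()]   # smallest y -> upper vertex
    lower = pts[pts[:, 1].argmax()]   # largest y  -> lower vertex
    left  = pts[pts[:, 0].argmin()]   # smallest x -> left vertex
    right = pts[pts[:, 0].argmax()]   # largest x  -> right vertex
    return upper, lower, left, right

def pupil_center(pupil_mask):
    """Centroid of a binary pupil mask, used as the pupil center point (step S22)."""
    ys, xs = np.nonzero(pupil_mask)
    return np.array([xs.mean(), ys.mean()])

# Tiny demo with a synthetic diamond-shaped "edge curve" and a 2x2 pupil blob.
curve = np.array([[50, 10], [90, 50], [50, 90], [10, 50]])
mask = np.zeros((100, 100), dtype=bool)
mask[48:50, 44:46] = True
print(socket_vertices(curve), pupil_center(mask))
```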
In the above eye movement determination method, step S30 specifically comprises:
taking the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve as fixed coordinates, calculating in each successively acquired frame the distances from the pupil center point coordinates to these four points, and determining the eye movement direction from the changes in these four distances from frame to frame.
Alternatively, in the above eye movement determination method, step S30 specifically comprises:
placing each acquired frame in the same coordinate system, where the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve, together with the coordinates of the pupil center point, are all moving points;
calculating in each successively acquired frame the distances from the pupil center point coordinates to the other four points, and determining the eye movement direction from the changes in these four distances from frame to frame.
FIGS. 2a, 2b, 2c and 2d are schematic diagrams of four successively acquired eye pictures; from FIG. 2a through FIGS. 2b, 2c and 2d, the pupil moves to the left. Taking the coordinates of the upper vertex c, lower vertex d, left vertex a and right vertex b of the eye socket edge curve as fixed coordinates, the distances from the coordinates of the pupil center point e to these four points are calculated for each of the four frames, and the four distance values are denoted e-c, e-d, e-a and e-b, respectively. Tabulating them shows that the three distance values e-c, e-d and e-b all increase frame by frame while e-a decreases, from which it can be determined that the pupil moves toward the left vertex of the eye socket.
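To make the comparison concrete, the following Python sketch (illustrative only; the numeric values are invented and are not taken from FIG. 2) classifies each of the four distances as increasing or decreasing between the first and last stored frame.

```python
def distance_trends(frames):
    """Classify each of the four distances as increasing (+1), decreasing (-1) or flat (0).

    `frames` is a list of (e_c, e_d, e_a, e_b) tuples, one per frame in acquisition
    order, following the notation of FIGS. 2a-2d. Comparing only the first and last
    frame is one simple choice; the patent leaves the exact test open.
    """
    first, last = frames[0], frames[-1]
    def sign(diff):
        return (diff > 0) - (diff < 0)
    return tuple(sign(b - a) for a, b in zip(first, last))

# Invented values mimicking the FIG. 2 case: e-c, e-d and e-b grow while e-a shrinks.
frames = [(30.0, 30.0, 40.0, 40.0),
          (32.0, 32.0, 35.0, 44.0),
          (34.0, 34.0, 30.0, 48.0),
          (36.0, 36.0, 25.0, 52.0)]
print(distance_trends(frames))   # (1, 1, -1, 1): only e-a decreases, so the pupil moves left
```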
In another embodiment, the pupil moves toward the upper vertex c and the left vertex a in FIG. 2a, that is, toward the upper left; in this case the above method finds that the distance values e-d and e-b gradually increase, while the distance values e-a and e-c gradually decrease.
In a specific embodiment, the two states (increasing or decreasing) of each of the four values e-c, e-d, e-a and e-b can be enumerated as combinations together with the corresponding eye movement direction results; after the four values e-c, e-d, e-a and e-b have been computed, the corresponding eye movement direction result is obtained directly by a table lookup, which further increases the determination speed.
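As an illustration of this table-lookup idea, a few Python entries keyed by the trend signs produced by the distance_trends sketch above are shown below. The patent does not list the table itself, so apart from the "left" and "upper left" cases described in the text, the entries and direction labels here are assumptions.

```python
# Keys are (e-c, e-d, e-a, e-b) trend signs: +1 = larger, -1 = smaller.
DIRECTION_TABLE = {
    (+1, +1, -1, +1): "left",        # FIG. 2a-2d: e-c, e-d and e-b grow, e-a shrinks
    (-1, +1, -1, +1): "upper left",  # second example: e-d and e-b grow, e-a and e-c shrink
    (+1, +1, +1, -1): "right",       # mirror of the "left" case (illustrative only)
}

def lookup_direction(trends):
    """Direct table lookup of the eye movement direction from the four distance trends."""
    return DIRECTION_TABLE.get(trends, "unknown")

print(lookup_direction((+1, +1, -1, +1)))   # -> "left"
```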
In another embodiment of the present invention, an eye movement determination device is further provided. As shown in FIG. 3, it comprises: a camera module 10, configured to continuously acquire pictures of the user's eye at a preset time interval through a camera; a distance calculation module 20, configured to analyze each frame in succession and to calculate and store the distances from the pupil center point to the eye socket in the up, down, left and right directions; and a determination module 30, configured to determine the eye movement direction from the changes in the distances from the pupil center point to the eye socket in the up, down, left and right directions. By calculating and storing, for each acquired frame, the distances from the pupil center point to the eye socket in the four directions, and determining the eye movement direction from the changes in these distances, the present invention can determine the eye movement direction quickly. Compared with existing eye movement direction determination methods, it saves a large amount of computation, increases computation speed and reduces hardware implementation cost.
In the above embodiment, the camera module may be a single camera or a camera assembly. The distance calculation module and the determination module may be hardware circuit modules implemented on an FPGA board, with the data computation performed by the programmable logic device.
In the above eye movement determination device, the distance calculation module comprises: a coordinate calculation unit, configured to extract the eye socket edge curve, obtain the coordinates of its upper vertex, lower vertex, left vertex and right vertex, and extract the coordinates of the pupil center point; and a distance calculation unit, configured to calculate the relative distances from the pupil center point coordinates to the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve.
In the above eye movement determination device, the determination module comprises a first determination unit, configured to take the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve as fixed coordinates, calculate in each successively acquired frame the distances from the pupil center point coordinates to these four points, and determine the eye movement direction from the changes in these four distances from frame to frame.
In the above eye movement determination device, the determination module comprises a second determination unit, configured to place each acquired frame in the same coordinate system, treat the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve, together with the coordinates of the pupil center point, as moving points, calculate in each successively acquired frame the distances from the pupil center point coordinates to the other four points, and determine the eye movement direction from the changes in these four distances from frame to frame.
It should be understood that a person of ordinary skill in the art can make improvements or modifications based on the above description, and all such improvements and modifications shall fall within the scope of protection of the appended claims of the present invention.

Claims (4)

  1. An eye movement determination method, characterized by comprising the following steps:
    Step 1: continuously acquiring pictures of the user's eye at a preset time interval through a camera;
    Step 2: analyzing each frame in succession, and calculating and storing the distances from the pupil center point to the eye socket in the up, down, left and right directions;
    Step 3: determining the eye movement direction from the changes in the distances from the pupil center point to the eye socket in the up, down, left and right directions; that is, placing each acquired frame in the same coordinate system, where the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve, together with the coordinates of the pupil center point, are all moving points; calculating in each successively acquired frame the distances from the pupil center point coordinates to the other four points; and determining the eye movement direction from the changes in these four distances from frame to frame.
  2. The eye movement determination method according to claim 1, characterized in that step 2 specifically comprises:
    Step 2-1: extracting the eye socket edge curve, and obtaining the coordinates of its upper vertex, lower vertex, left vertex and right vertex;
    Step 2-2: extracting the coordinates of the pupil center point;
    Step 2-3: calculating the relative distances from the pupil center point coordinates to the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve.
  3. An eye movement determination device, characterized by comprising:
    a camera module, configured to continuously acquire pictures of the user's eye at a preset time interval through a camera;
    a distance calculation module, configured to analyze each frame in succession and to calculate and store the distances from the pupil center point to the eye socket in the up, down, left and right directions;
    a determination module, configured to determine the eye movement direction from the changes in the distances from the pupil center point to the eye socket in the up, down, left and right directions; that is, each acquired frame is placed in the same coordinate system, the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve, together with the coordinates of the pupil center point, are all moving points, the distances from the pupil center point coordinates to the other four points are calculated in each successively acquired frame, and the eye movement direction is determined from the changes in these four distances from frame to frame.
  4. The eye movement determination device according to claim 3, characterized in that the distance calculation module comprises:
    a coordinate calculation unit, configured to extract the eye socket edge curve, obtain the coordinates of its upper vertex, lower vertex, left vertex and right vertex, and extract the coordinates of the pupil center point;
    a distance calculation unit, configured to calculate the relative distances from the pupil center point coordinates to the coordinates of the upper vertex, lower vertex, left vertex and right vertex of the eye socket edge curve.
PCT/CN2017/079818 2017-04-08 2017-04-08 Eye movement determination method and device WO2018184245A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/079818 WO2018184245A1 (zh) 2017-04-08 2017-04-08 Eye movement determination method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/079818 WO2018184245A1 (zh) 2017-04-08 2017-04-08 Eye movement determination method and device

Publications (1)

Publication Number Publication Date
WO2018184245A1 (zh)

Family

ID=63712560

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/079818 WO2018184245A1 (zh) 2017-04-08 2017-04-08 Eye movement determination method and device

Country Status (1)

Country Link
WO (1) WO2018184245A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111563895A (zh) * 2020-05-21 2020-08-21 苏州沃柯雷克智能系统有限公司 Picture clarity determination method, apparatus, device and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102799277A (zh) * 2012-07-26 2012-11-28 深圳先进技术研究院 Human-computer interaction method and system based on blinking action
CN103336576A (zh) * 2013-06-28 2013-10-02 优视科技有限公司 Method and device for performing browser operations based on eye tracking
CN103784112A (zh) * 2013-10-10 2014-05-14 杨松 Eye movement sensing method, flexible contact body, external sensing coil and system
CN104905765A (zh) * 2015-06-08 2015-09-16 四川大学华西医院 FPGA implementation method based on the CamShift algorithm in eye tracking
CN105589551A (zh) * 2014-10-22 2016-05-18 褚秀清 Eye tracking method for human-computer interaction on mobile devices
CN106339087A (zh) * 2016-08-29 2017-01-18 上海青研科技有限公司 Eyeball tracking method based on multi-dimensional coordinates and device thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17904723

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS (EPO FORM 1205A DATED 13.03.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 17904723

Country of ref document: EP

Kind code of ref document: A1