WO2014075418A1 - Human-computer interaction method and apparatus - Google Patents

Human-computer interaction method and apparatus

Info

Publication number
WO2014075418A1
Authority
WO
WIPO (PCT)
Prior art keywords
sight
line
screen
distance
angle
Prior art date
Application number
PCT/CN2013/073786
Other languages
English (en)
French (fr)
Inventor
李颖
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2014075418A1
Priority to US14/583,487 (US9740281B2)

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G06F3/013 Eye tracking input arrangements
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/19 Sensors therefor
    • G06V40/193 Preprocessing; Feature extraction

Definitions

  • The present invention relates to operations on displayed content, and in particular to a human-computer interaction method and apparatus. Background Art
  • Prior-art methods for realizing the above functions are mostly based on the principle of gravity sensing, i.e. the piezoelectric effect: the device measures the force components, along two orthogonal directions, of a small internal weight (most electronic products now use a gravity-sensing chip in which the weight and the piezoelectric element are integrated) to determine orientation and thus the user's motion.
  • The drawback of gravity sensing is that the product must be used within a gravity field; once outside the range of gravity these functions are lost. Summary of the Invention
  • the technical problem solved by the present invention is to provide a method and device for human-computer interaction, which can determine the motion of the user and perform corresponding operations on the screen display content without relying on gravity sensing.
  • the first aspect of the present application provides a human-computer interaction method, including:
  • the capturing a line of sight direction includes:
  • the line of sight feature parameters including a vector of pupil center to corneal reflection
  • a coordinate position of the line of sight direction on the screen is calculated based on a vector of the pupil center to the cornea reflection.
  • The line-of-sight feature parameters further include a line-of-sight feature vector, and determining a change in the angle between the line of sight and the screen and/or a change in the distance between the user and the screen according to the line of sight direction includes: calculating the distance between the user and the screen according to the line-of-sight feature vector; and calculating the angle between the line of sight and the screen according to that distance and the coordinate position.
  • the performing operations on the screen display content according to the change of the included angle includes:
  • the screen display content is moved in the vertical direction, the horizontal direction, or the diagonal direction of the screen according to the change of the included angle.
  • the performing operations on the screen display content according to the change of the included angle includes:
  • The screen display content is switched to the previous page or the next page according to the change of the angle.
  • the performing, according to the change of the distance, performing corresponding operations on the screen display content includes:
  • the size of the screen display content is adjusted according to the change in the distance.
  • the second aspect of the present application provides a human-machine interaction apparatus, including:
  • a line of sight tracking unit for capturing a line of sight direction
  • a processing unit configured to determine, according to the direction of the line of sight, a change in the angle between the line of sight and the screen and/or a change in the distance between the user and the screen;
  • an execution unit configured to perform corresponding operations on the screen display content according to the change of the angle and/or the change of the distance.
  • the line-of-sight tracking unit includes a line-of-sight feature parameter extraction module and a coordinate position calculation module, where:
  • the line of sight feature parameter extraction module is configured to extract a line of sight feature parameter according to a pupil-corneal reflection method, the line of sight feature parameter including a vector of a pupil center to a corneal reflection;
  • the coordinate position calculation module is configured to calculate a coordinate position of the line of sight direction on the screen according to the vector of the pupil center to the cornea reflection.
  • the line of sight feature parameter further includes a line of sight feature vector
  • the processing unit includes a distance calculation module and an angle calculation module, where:
  • the distance calculation module is configured to calculate a distance between the user and the screen according to the line of sight feature vector
  • the included angle calculation module is configured to calculate an angle between the line of sight and a screen according to a distance between the user and the screen and the coordinate position.
  • the execution unit performs corresponding operations on the screen display content according to the change of the angle
  • the execution unit is configured to control the screen display content to move in a vertical direction, a horizontal direction, or a diagonal direction of the screen according to the change of the included angle.
  • the executing unit performs corresponding operations on the screen display content according to the change of the angle
  • the specific way is:
  • The execution unit is configured to control the screen display content to switch to the previous page or the next page according to the change of the included angle.
  • the execution unit performs corresponding operations on the screen display content according to the change of the distance
  • the specific way is: The execution unit is configured to adjust a size of the screen display content according to the change in the distance.
  • The human-computer interaction method and apparatus provided by the present application operate on the screen display content according to changes in the angle between the user's line of sight and the screen and in the distance between the user and the screen; they do not depend on gravity sensing, so the user can still operate the screen conveniently after leaving the range of gravity, and the operation does not need to be controlled by hand.
  • This human-computer interaction mode is novel and especially suitable for people with disabilities to read electronic documents, view pictures, enjoy videos, and so on.
  • FIG. 1 is a flowchart of a human-computer interaction method provided by an embodiment of the present application.
  • Figure 2 is a schematic view of line of sight tracking
  • Figure 3 is a schematic diagram of 25 points of screen division
  • FIG. 4 is a schematic view of the angle between the line of sight and the screen when the line of sight coincides with the screen normal line;
  • FIG. 5 to FIG. 8 are schematic views of the angle between the line of sight and the screen of different types;
  • FIG. 9 is a structural diagram of a human-machine interaction apparatus according to an embodiment of the present application.
  • FIG. 10 is a structural diagram of another human-machine interaction apparatus according to an embodiment of the present application.
  • FIG. 11 is a structural diagram of another human-machine interaction apparatus according to an embodiment of the present application. Detailed Description
  • FIG. 1 is a flowchart of a human-computer interaction method according to an embodiment of the present application, including:
  • The direction of the user's line of sight can be captured by gaze-tracking technology; as shown in Figure 2, the line of sight direction is captured by a camera 203 and an infrared light source transmitter 202 mounted on the display or the body of the device.
  • the gaze tracking technology scheme can adopt the non-intrusive gaze tracking technology based on Video Oculo Graphic (VOG).
  • The basic principle is to use, as references, certain eye structures and features whose relative position remains unchanged while the eyeball rotates: the line of sight/point of regard (LOS/POR) variation parameters are extracted between the position-changing features and these invariant features, and the line of sight direction is then obtained through a geometric model or mapping model.
  • the VOG-based gaze tracking technology is divided into two components: gaze feature parameter extraction and line-of-sight estimation model.
  • Extraction of line-of-sight feature parameters is a necessary step and precondition of gaze tracking; depending on the tracking method, the extracted features may include the vector from the pupil center to the corneal reflection, the corneal reflection matrix, and the elliptical boundary of the iris.
  • VOG-based gaze tracking generally uses the pupil-corneal reflection method: when the infrared source is coaxial with the camera, the pupil appears brighter than the iris (bright pupil); when the infrared source is separated from the camera, the pupil appears darker than the iris (dark pupil).
  • The gaze-tracking method based on the pupil-corneal reflection method detects line-of-sight features by controlling the infrared emission direction of the infrared light source so that bright-pupil and dark-pupil phenomena are produced, and extracts pupil features by image-difference techniques on the bright and dark pupil images; in this way the eye position can be quickly captured in the full face image and the pupil accurately segmented in the eye image.
  • the method of capturing the direction of the line of sight may be: controlling the infrared emission direction of the infrared light source or controlling the alternating light and darkness of the infrared light source, generating a video sequence in which the bright and dark frames alternately appear, and using the adjacent bright and dark images to eliminate the background.
  • The pupil is then detected in the thresholded difference image; the specific steps are as follows:
  • subtract the dark-pupil image from the bright-pupil image to obtain a difference image, and filter the difference image to obtain the pupil region; detect the edge of the pupil region and search for the corneal reflection based on gray level in the vicinity of the eye region; locate the corneal reflection center by the centroid method, and filter the pupil edge to eliminate the influence of the corneal reflection on the pupil edge contour; finally, fit an ellipse to locate the pupil center and extract the line-of-sight feature vector L.
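As a minimal sketch of the bright/dark-pupil differencing step described above (the synthetic frames, threshold value, and centroid-based localization here are illustrative assumptions, not the patent's exact algorithm):

```python
import numpy as np

def locate_pupil(bright, dark, thresh=60):
    """Sketch of the bright/dark pupil differencing step.

    bright, dark: grayscale eye-region frames (uint8 arrays) taken with the
    IR source on-axis and off-axis respectively. Returns the centroid of the
    thresholded difference region as an estimate of the pupil center.
    """
    diff = bright.astype(np.int16) - dark.astype(np.int16)
    mask = diff > thresh                       # pupil is much brighter in the on-axis frame
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())  # centroid approximates the pupil center

# Synthetic frames: a bright-pupil disc at (x=20, y=12) only in the on-axis image.
h, w = 32, 32
yy, xx = np.mgrid[0:h, 0:w]
dark = np.full((h, w), 40, np.uint8)
bright = dark.copy()
bright[(yy - 12) ** 2 + (xx - 20) ** 2 <= 9] = 200
cx, cy = locate_pupil(bright, dark)
print(round(cx), round(cy))  # → 20 12
```

A real implementation would additionally filter the difference image and fit an ellipse to the pupil edge, as the text describes; the centroid shortcut stands in for those steps.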
  • The screen position the user's gaze is directed at, i.e. the coordinate position (Gx, Gy) of the line of sight direction on the screen, is related to the vector (Δx, Δy) from the pupil center to the corneal reflection by a complex nonlinear mapping function.
  • For example, with the user's head position fixed, the nonlinear mapping function can be expressed as:
  • Gx = fx(Δx, Δy) ≈ a1(Δy) + a2(Δy)·Δx, with a1(Δy) ≈ a3 + a4·Δy and a2(Δy) ≈ a5 + a6·Δy; Gy = fy(Δx, Δy) ≈ b1(Δx) + b2·Δy + b3·Δy², with b1(Δx) ≈ b4 + b5·Δx
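The mapping is a low-order polynomial in (Δx, Δy). Below is a sketch of evaluating such a mapping, assuming a second-order form consistent with the eight-coefficient calibration model mentioned in the description; the coefficient values are placeholders that a real system would obtain from calibration:

```python
def gaze_point(dx, dy, a, b):
    """Evaluate a second-order pupil-to-screen mapping of the kind described.

    a = (a3, a4, a5, a6) and b = (b4, b5, b2, b3) are coefficients that a real
    system would obtain from calibration; the values used below are placeholders.
    """
    a3, a4, a5, a6 = a
    b4, b5, b2, b3 = b
    gx = (a3 + a4 * dy) + (a5 + a6 * dy) * dx      # Gx = a1(dy) + a2(dy)*dx
    gy = (b4 + b5 * dx) + b2 * dy + b3 * dy ** 2   # Gy = b1(dx) + b2*dy + b3*dy^2
    return gx, gy

# Placeholder coefficients producing a roughly linear pixel mapping.
gx, gy = gaze_point(0.1, -0.2,
                    a=(400.0, 0.0, 2000.0, 0.0),
                    b=(300.0, 0.0, -1500.0, 0.0))
print(gx, gy)  # → 600.0 600.0
```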
  • the gaze tracking technology works as shown in Fig. 2:
  • The infrared light source transmitter 202 and the camera 203 are mounted on the display 201; the transmitter 202 emits invisible infrared light 204, which illuminates the user's eye 206 and is reflected back as infrared signal 205. The camera collects the infrared signal 205, and the processor, combining the bright-pupil and dark-pupil phenomena with algorithms such as bright/dark image differencing and filtering, accurately tracks the position of the pupil and obtains the line of sight direction 208, whose coordinate position on the screen is (Gx, Gy).
  • S102. Determine, according to the direction of the line of sight, a change in the angle between the line of sight and the screen and/or a change in the distance between the user and the screen.
  • The term "and/or" in this context merely describes an association between objects, indicating that three relationships may exist: for example, A and/or B can mean that A exists alone, that both A and B exist, or that B exists alone.
  • the character "/" in this article generally means that the contextual object is an "or" relationship.
  • the angle between the line of sight and the screen is calculated based on the distance between the user and the screen and the coordinate position.
  • The distance Z between the user and the screen can be estimated from the line-of-sight feature vector L: a set of feature vectors L_i and distances Z_i can be measured at different positions in advance to establish a relationship model between L and Z, after which the value of Z can be computed from an input L.
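A minimal sketch of this distance model. The calibration pairs below are synthetic, and modeling Z as linear in 1/L (the apparent feature size shrinks roughly inversely with distance) is an assumption, since the text does not fix the model's form:

```python
import numpy as np

# Hypothetical calibration data: a scalar summary of the feature vector L
# (e.g. the pupil-corneal-reflection vector length, in pixels) measured at
# known distances Z_i (in cm). A real system would record these during setup.
L_cal = np.array([12.0, 9.0, 8.0, 6.0, 5.0])
Z_cal = np.array([30.0, 40.0, 45.0, 60.0, 72.0])

# Fit Z as a linear function of 1/L (least squares over the calibration set).
coef = np.polyfit(1.0 / L_cal, Z_cal, 1)

def estimate_distance(L):
    """Estimate the user-screen distance Z from an input feature magnitude L."""
    return float(np.polyval(coef, 1.0 / L))

print(round(estimate_distance(6.0)))  # → 60
```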
  • S103 Perform corresponding operations on the screen display content according to the change of the angle and/or the change of the distance.
  • the manner of correspondingly operating the screen display content according to the change of the angle may be: controlling the screen display content to move along the vertical direction, the horizontal direction or the diagonal direction of the screen according to the change of the angle.
  • The angle between the line of sight and the screen is a first, second, third, or fourth type of angle. The first type is the acute angle between the line of sight and the vertical direction of the screen when the line of sight falls above the screen normal line, as shown in Figure 5; the second type is the same acute angle when the line of sight falls below the screen normal line, as shown in Figure 6; the third type is the acute angle between the line of sight and the horizontal direction of the screen when the line of sight falls to the right of the screen normal line, as shown in Figure 7; the fourth type is the same acute angle when the line of sight falls to the left of the screen normal line, as shown in Figure 8.
  • The acute angle 209 between the line of sight 208 and the vertical direction of the screen in FIG. 2 lies above the normal line 207 on the screen and thus belongs to the first type of angle; the screen normal line 207 is the straight line perpendicular to the plane of the screen and passing through the user's eyes.
  • FIG. 4 is a schematic diagram of the angle when the user looks straight at the screen and the line of sight coincides with the normal line: 301 is the screen, 304 is the normal line, 305 is the line of sight, 304 and 305 coincide and form a 90° angle with the screen, 302 is the eye, and 303 is the pupil; the direction the pupil points in is the direction of the line of sight.
  • 306 to 309 in FIG. 5 to FIG. 8 are schematic diagrams of the first to fourth types of angle, respectively.
  • Controlling the screen display content to move along the vertical, horizontal, or diagonal direction of the screen according to the change of the included angle includes: when the first type of angle becomes smaller, the screen display content moves upward along the vertical direction of the screen; for example, when the screen is tilted up or the user's line of sight moves up, the picture or text page in the screen moves up.
  • When the second type of angle becomes smaller, the screen display content moves down along the vertical direction of the screen or switches the page along the vertical direction; for example, when the screen is tilted down or the user's line of sight moves down, the picture or text page in the screen moves down.
  • When the third type of angle becomes smaller, the screen display content moves to the right along the horizontal direction of the screen or switches the page along the horizontal direction; for example, when the screen is tilted to the right or the user's line of sight moves right, the picture or text page in the screen moves right.
  • When the fourth type of angle becomes smaller, the screen display content moves to the left along the horizontal direction of the screen or switches the page along the horizontal direction; for example, when the screen is tilted to the left or the user's line of sight moves left, the picture or text page in the screen moves left.
  • The angle types of this embodiment may further include the acute angle formed by the line of sight and the diagonal direction of the screen.
  • When this acute angle changes, the screen display content moves or switches pages along the diagonal of the screen; that is, when the screen is tilted diagonally or the user's line of sight moves diagonally, the content changes correspondingly along the diagonal.
  • The specific handling follows the same idea as for the first to fourth types of angle and is not repeated here.
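The movement rules for the four angle types can be sketched as a simple dispatch; the names, the sign convention for the angle change, and the returned direction strings are illustrative assumptions, not the patent's API:

```python
# Sketch of the angle-type dispatch described above. A negative `delta`
# means the angle became smaller, which is the condition that triggers
# movement in the text.
FIRST, SECOND, THIRD, FOURTH = "first", "second", "third", "fourth"

MOVES = {
    FIRST: "up",      # first-type angle smaller -> content moves up
    SECOND: "down",   # second-type angle smaller -> content moves down
    THIRD: "right",   # third-type angle smaller -> content moves right
    FOURTH: "left",   # fourth-type angle smaller -> content moves left
}

def scroll_direction(angle_type, delta):
    """Return the scroll direction, or None if the angle did not shrink."""
    return MOVES[angle_type] if delta < 0 else None

print(scroll_direction(FIRST, -2.0))   # → up
print(scroll_direction(THIRD, +1.0))   # → None
```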
  • The manner of performing corresponding operations on the screen display content according to the change of the angle may also be: controlling the screen display content to switch to the previous page or the next page according to the change of the angle. Specifically, on the basis of moving the content along the vertical, horizontal, or diagonal direction of the screen according to the change of the angle, when the displayed content, such as a picture or a text page, has moved to a preset threshold position, the display switches to the next or previous image, or to the previous or next page of text.
  • The speed at which the screen display content moves, or at which pages are switched, may in this embodiment be related to the size of the angle between the line of sight and the screen: for example, the smaller the angle, the faster the content moves or the faster the page is switched.
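A sketch of such an inverse angle-to-speed mapping; the formula and constants are illustrative assumptions, since the text only states the qualitative relation (smaller angle means faster movement):

```python
def scroll_speed(angle_deg, max_speed=1000.0, min_angle=10.0):
    """Map the line-of-sight/screen angle to a scroll speed (pixels per second).

    The inverse relation (smaller angle -> faster movement) follows the text;
    the specific formula and the constants are illustrative assumptions.
    """
    a = max(angle_deg, min_angle)      # clamp to avoid division blow-up
    return max_speed * min_angle / a   # near 90 deg -> slow, near min_angle -> max_speed

print(scroll_speed(90.0) < scroll_speed(45.0) < scroll_speed(15.0))  # → True
```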
  • Operating on the screen display content according to the change of the distance may mean adjusting the size of the displayed image or text. For example, when the user walks toward the screen, the displayed image becomes smaller; when the user moves back from the screen, the displayed image becomes larger, enhancing the user's sensory experience.
  • In this way, corresponding operations are performed on the screen display content without depending on gravity sensing; even after leaving the range of gravity, the user can still operate the screen conveniently, and the operation does not need to be controlled by hand.
  • the user can use the movement of the line of sight to control the screen, and is especially suitable for people with disabilities to read electronic documents, view pictures, enjoy videos, and the like.
  • FIG. 9 is a device for human-computer interaction according to an embodiment of the present application, which includes at least: a 401 line-of-sight tracking unit, a 402 processing unit, and a 403 execution unit, where:
  • a line of sight tracking unit 401 for capturing the direction of the line of sight.
  • the processing unit 402 is configured to determine a change in the angle between the line of sight and the screen and/or a change in the distance between the user and the screen according to the direction of the line of sight.
  • The angle between the line of sight and the screen may be the first, second, third, or fourth type of angle, or the acute angle formed by the line of sight and the diagonal direction of the screen; for the definitions of the various angles, refer to the embodiment of FIG. 1, which is not described again here.
  • the executing unit 403 is configured to perform corresponding operations on the screen display content according to the change of the angle and/or the change of the distance.
  • the speed at which the screen displays the content movement or the speed of switching the page in the embodiment may be related to the size of the angle between the line of sight and the screen. For example, the smaller the angle is, the faster the screen display content moves. , or the faster the page is switched.
  • FIG. 10 is another human-computer interaction apparatus provided by the embodiment of the present application, including at least a line-of-sight tracking unit 401, a processing unit 402, and an execution unit 403; for their functions, refer to the embodiment of Fig. 9. Among them:
  • the line-of-sight tracking unit 401 includes a line-of-sight feature parameter extraction module 4011 and a coordinate position calculation module 4012, where:
  • the 4011 line of sight feature parameter extraction module is configured to extract line of sight feature parameters according to the pupil-corneal reflection method, and the line of sight feature parameters include a vector of pupil center to corneal reflection;
  • the 4012 coordinate position calculation module is configured to calculate the coordinate position of the line of sight direction on the screen according to the vector of the pupil center to the corneal reflection.
  • the line-of-sight feature parameter further includes a line-of-sight feature vector
  • the 402 processing unit includes a 4021 distance calculation module and a 4022 angle calculation module, wherein:
  • the 4021 distance calculation module is configured to calculate a distance between the user and the screen according to the line of sight feature vector
  • the 4022 Angle Calculation Module is used to calculate the angle between the line of sight and the screen based on the distance between the user and the screen and the coordinate position.
  • In this way, corresponding operations are performed on the screen display content without depending on gravity sensing; even after leaving the range of gravity, the user can still operate the screen conveniently, and the operation does not need to be controlled by hand.
  • the user can use the movement of the line of sight to control the screen, and is especially suitable for people with disabilities to read electronic documents, view pictures, enjoy videos, and the like.
  • FIG. 11 is another human-computer interaction apparatus provided by the embodiment of the present application, including at least a sight tracking device 501 and a system device 502, wherein the sight tracking device 501 includes an infrared light source transmitter 5011 and a camera 5012.
  • The system device 502 includes a CPU 5021, a RAM 5022, a ROM 5023, and a disk 5024.
  • The CPU 5021 is used to perform the following steps: capturing a line of sight direction; determining, according to the line of sight direction, a change in the angle between the line of sight and the screen and/or a change in the distance between the user and the screen; and operating on the screen display content according to the change of the angle and/or the change of the distance.
  • The CPU 5021 is also used to perform the following steps:
  • the line of sight feature parameters are extracted according to the pupil-corneal reflection method, and the line of sight feature parameters include a vector from the pupil center to the corneal reflection and a line of sight feature vector;
  • the coordinate position of the line of sight direction on the screen is calculated from the vector of the pupil center to the corneal reflection.
  • The CPU 5021 is also used to perform the following steps: calculating the distance between the user and the screen according to the line-of-sight feature vector;
  • the angle between the line of sight and the screen is calculated based on the distance between the user and the screen and the coordinate position.
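The description derives the angle from the triangle geometry of Fig. 2 as tan α = Z / Gy, i.e. α = arctan(Z / Gy), where Z is the user-screen distance and Gy the vertical gaze coordinate. A minimal sketch of that computation:

```python
import math

def sight_angle(Z, Gy):
    """Angle between the line of sight and the screen, in degrees.

    From the triangle geometry described: tan(alpha) = Z / Gy, where Z is the
    user-screen distance and Gy is the vertical coordinate of the gaze point.
    atan2 is used so Gy = 0 (gaze on the normal line) gives 90 degrees.
    """
    return math.degrees(math.atan2(Z, Gy))

print(round(sight_angle(50.0, 50.0)))  # → 45
```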
  • The screen display content is operated on correspondingly without depending on gravity sensing; the user can conveniently operate the screen after leaving the range of gravity, and the operation does not require manual control.
  • the user can control the screen with the movement of the line of sight. It is especially suitable for people with disabilities to read electronic documents, view pictures, enjoy videos and so on.
  • The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.

Abstract

The embodiments of the present invention disclose a human-computer interaction method and apparatus. The method includes: capturing a line of sight direction; determining, according to the line of sight direction, a change in the angle between the line of sight and the screen and/or a change in the distance between the user and the screen; and performing corresponding operations on the screen display content according to the change of the angle and/or the change of the distance. Correspondingly, the apparatus includes a line-of-sight tracking unit, a processing unit, and an execution unit. The present invention can determine the user's motion and operate on the screen display content without relying on gravity sensing.

Description

Human-computer interaction method and apparatus
This application claims priority to Chinese Patent Application No. 201210452461.3, filed with the Chinese Patent Office on November 13, 2012 and entitled "Human-computer interaction method and apparatus", the entire contents of which are incorporated herein by reference. Technical Field
The present invention relates to operations on displayed content, and in particular to a human-computer interaction method and apparatus. Background Art
With the development of electronic technology and rising user expectations, judging the quality of an electronic product is no longer merely a matter of hardware and technical specifications; attention to user experience is the key for manufacturers to win the market. Many existing electronic products such as mobile phones, tablets, MP3 and MP4 players, and game consoles can respond to the user's motions: automatically rotating the screen when the user flips the device, switching interfaces, songs, or videos when the device is shaken, moving the on-screen image when the device is tilted, steering the car in a racing game by rocking the device left and right, and so on. Compared with earlier products that could only be operated through a keyboard or touch screen, the user experience is greatly improved.
Existing methods for realizing the above functions are mostly based on the principle of gravity sensing, i.e. the piezoelectric effect: the device measures the force components, along two orthogonal directions, of a small internal weight (most electronic products now use a gravity-sensing chip in which the weight and the piezoelectric element are integrated) to determine orientation and thus the user's motion. The drawback of gravity sensing is that the product must be used within a gravity field; outside the range of gravity these functions are lost. Summary of the Invention
The technical problem solved by the present invention is to provide a human-computer interaction method and apparatus that can determine the user's motion and operate on the screen display content without relying on gravity sensing.
A first aspect of the present application provides a human-computer interaction method, including:
capturing a line of sight direction;
determining, according to the line of sight direction, a change in the angle between the line of sight and a screen and/or a change in the distance between the user and the screen; and performing corresponding operations on the screen display content according to the change of the angle and/or the change of the distance.
In a first possible implementation, the capturing a line of sight direction includes:
extracting line-of-sight feature parameters according to the pupil-corneal reflection method, the feature parameters including a vector from the pupil center to the corneal reflection;
calculating a coordinate position of the line of sight direction on the screen according to the vector from the pupil center to the corneal reflection.
With reference to the first possible implementation of the first aspect, in a second possible implementation, the line-of-sight feature parameters further include a line-of-sight feature vector, and the determining, according to the line of sight direction, a change in the angle between the line of sight and the screen and/or a change in the distance between the user and the screen includes:
calculating the distance between the user and the screen according to the line-of-sight feature vector;
calculating the angle between the line of sight and the screen according to the distance between the user and the screen and the coordinate position.
With reference to the first aspect and its first and second possible implementations, in a third possible implementation, the performing corresponding operations on the screen display content according to the change of the angle includes:
controlling the screen display content to move along the vertical, horizontal, or diagonal direction of the screen according to the change of the angle.
With reference to the first aspect and its first and second possible implementations, in a fourth possible implementation, the performing corresponding operations on the screen display content according to the change of the angle includes:
controlling the screen display content to switch to the previous page or the next page according to the change of the angle.
With reference to the first aspect and its first and second possible implementations, in a fifth possible implementation, the performing corresponding operations on the screen display content according to the change of the distance includes:
adjusting the size of the screen display content according to the change of the distance.
A second aspect of the present application provides a human-computer interaction apparatus, including:
a line-of-sight tracking unit, configured to capture a line of sight direction;
a processing unit, configured to determine, according to the line of sight direction, a change in the angle between the line of sight and a screen and/or a change in the distance between the user and the screen;
an execution unit, configured to perform corresponding operations on the screen display content according to the change of the angle and/or the change of the distance.
In a first possible implementation, the line-of-sight tracking unit includes a line-of-sight feature parameter extraction module and a coordinate position calculation module, where:
the line-of-sight feature parameter extraction module is configured to extract line-of-sight feature parameters according to the pupil-corneal reflection method, the feature parameters including a vector from the pupil center to the corneal reflection;
the coordinate position calculation module is configured to calculate a coordinate position of the line of sight direction on the screen according to the vector from the pupil center to the corneal reflection.
With reference to the first possible implementation of the second aspect, in a second possible implementation, the line-of-sight feature parameters further include a line-of-sight feature vector, and the processing unit includes a distance calculation module and an angle calculation module, where:
the distance calculation module is configured to calculate the distance between the user and the screen according to the line-of-sight feature vector;
the angle calculation module is configured to calculate the angle between the line of sight and the screen according to the distance between the user and the screen and the coordinate position.
With reference to the second aspect and its first and second possible implementations, in a third possible implementation, the specific way in which the execution unit performs corresponding operations on the screen display content according to the change of the angle is:
the execution unit is configured to control the screen display content to move along the vertical, horizontal, or diagonal direction of the screen according to the change of the angle.
With reference to the second aspect and its first and second possible implementations, in a fourth possible implementation, the specific way in which the execution unit performs corresponding operations on the screen display content according to the change of the angle is:
the execution unit is configured to control the screen display content to switch to the previous page or the next page according to the change of the angle.
With reference to the second aspect and its first and second possible implementations, in a fifth possible implementation, the specific way in which the execution unit performs corresponding operations on the screen display content according to the change of the distance is: the execution unit is configured to adjust the size of the screen display content according to the change of the distance.
The human-computer interaction method and apparatus provided by the present application operate on the screen display content according to changes in the angle between the user's line of sight and the screen and in the distance between the user and the screen; they do not depend on gravity sensing, so the user can still conveniently operate the screen outside the range of gravity, and the operation needs no hand control. This human-computer interaction mode is novel and especially suitable for people with disabilities to read electronic documents, view pictures, enjoy videos, and so on. Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a human-computer interaction method provided by an embodiment of the present application;
Fig. 2 is a schematic diagram of gaze tracking;
Fig. 3 is a schematic diagram of the screen divided into 25 points;
Fig. 4 is a schematic diagram of the angle between the line of sight and the screen when the line of sight coincides with the normal line; Figs. 5 to 8 are schematic diagrams of the different types of angle between the line of sight and the screen;
Fig. 9 is a structural diagram of a human-computer interaction apparatus provided by an embodiment of the present application;
Fig. 10 is a structural diagram of another human-computer interaction apparatus provided by an embodiment of the present application;
Fig. 11 is a structural diagram of another human-computer interaction apparatus provided by an embodiment of the present application. Detailed Description
The technical solutions in the embodiments of the present invention will now be described clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Referring to Fig. 1, Fig. 1 is a flowchart of a human-computer interaction method provided by an embodiment of the present application, including:
S101. Capture a line of sight direction.
The direction of the user's line of sight can be captured by gaze-tracking technology. As shown in Fig. 2, the line of sight direction is captured by a camera 203 and an infrared light source transmitter 202 mounted on the display or the body of the device. The gaze-tracking scheme may adopt non-intrusive tracking based on Video Oculo Graphic (VOG) analysis, whose basic principle is to use, as references, certain eye structures and features whose relative position remains unchanged while the eyeball rotates, extract the line of sight/point of regard (LOS/POR) variation parameters between the position-changing features and these invariant features, and then obtain the line of sight direction through a geometric model or mapping model. VOG-based gaze tracking consists of two parts: extraction of line-of-sight feature parameters and establishment of a line-of-sight estimation model.
Extraction of line-of-sight feature parameters is a necessary step and precondition of gaze tracking. Depending on the tracking method, the extracted features may include the vector from the pupil center to the corneal reflection, the corneal reflection matrix, the elliptical boundary of the iris, and so on. VOG-based gaze tracking generally uses the pupil-corneal reflection method: when the infrared light source is coaxial with the camera, the pupil appears brighter than the iris (bright pupil); when the source is separated from the camera, the pupil appears darker than the iris (dark pupil). Gaze tracking based on the pupil-corneal reflection method detects line-of-sight features by controlling the infrared emission direction of the light source so that bright-pupil and dark-pupil phenomena are produced, and extracts pupil features by image-difference techniques on the bright and dark pupil images; in this way the eye position can be quickly captured in the full face image and the pupil finely and accurately segmented in the eye image.
The line of sight direction may be captured as follows: control the infrared emission direction of the infrared light source, or alternate the light source between bright and dark, to produce a video sequence in which bright-pupil and dark-pupil frames alternate; use adjacent bright- and dark-pupil images for differencing to eliminate the influence of the background; and detect the pupil in the thresholded difference image. The specific steps are:
subtract the dark-pupil image from the bright-pupil image to obtain a difference image, and filter the difference image to obtain the pupil region; detect the edge of the pupil region and search for the corneal reflection based on gray level near the eye region; locate the corneal reflection center by the centroid method, and filter the pupil edge to eliminate the influence of the corneal reflection on the pupil edge contour;
fit an ellipse to locate the pupil center, obtain the center coordinates, and extract the line-of-sight feature vector L (the expression for L appears as an image in the original), where (Δx, Δy) is the vector from the pupil center to the corneal reflection, (Δx, Δy) = (ip, jp) − (ic, jc), (ip, jp) is the position of the pupil center in the image, (ic, jc) is the position of the corneal reflection in the image, the ratio of a_major to a_minor is the ratio of the major and minor axes of the pupil ellipse, and θ is the angle between the major axis of the pupil ellipse and the vertical direction.
The screen position the user's gaze is directed at, i.e. the coordinate position (Gx, Gy) of the line of sight direction on the screen, is related to the vector (Δx, Δy) from the pupil center to the corneal reflection by a complex nonlinear mapping function. For example, assuming the user's head position is fixed, the nonlinear mapping function can be expressed as:
Gx = fx(Δx, Δy) ≈ a1(Δy) + a2(Δy)·Δx
a1(Δy) ≈ a3 + a4·Δy
a2(Δy) ≈ a5 + a6·Δy
Gy = fy(Δx, Δy) ≈ b1(Δx) + b2·Δy + b3·Δy²
b1(Δx) ≈ b4 + b5·Δx
(Gx, Gy) = f(Δx, Δy) contains 8 unknowns, so their values can be determined from 4 or more calibration points. This regression is performed only once, when the model is built; once the 8 unknowns have been determined, other users can use this gaze mapping function directly without repeating the regression, because individual differences have already been compensated in the previous step. For example, the unknowns may be solved using 5 × 5 = 25 points distributed in a rectangle on the screen, as shown in FIG. 3. The calibration method is to have the subject gaze at these 25 points on the screen in turn, while the system records the gaze feature vector (Δx_i, Δy_i) for each point and the corresponding gaze position coordinates (Gx_i, Gy_i). From the known coordinates of the 25 calibration target points, a system of mapping-function equations between (Gx, Gy) and (Δx, Δy) is established, so that the positions other users look at can be solved through the mapping function.
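As a hedged sketch (not taken from the disclosure itself), the calibration regression described above can be written with NumPy by expanding the mapping into a form that is linear in its 8 unknowns and solving by least squares; the variable names and the exact expanded polynomial are assumptions for illustration:

```python
import numpy as np

def fit_gaze_mapping(dx, dy, gx, gy):
    """Solve the 8 unknowns of the polynomial gaze mapping by least squares.

    Expanding Gx = a1(dy) + a2(dy)*dx with a1(dy) = a3 + a4*dy and
    a2(dy) = a5 + a6*dy gives
        Gx = a3 + a4*dy + a5*dx + a6*dx*dy
        Gy = b4 + b1*dx + b2*dy + b3*dy**2
    so each screen coordinate is linear in its 4 unknowns and can be
    fitted from calibration samples (dx, dy) -> (gx, gy).
    """
    dx, dy = np.asarray(dx, float), np.asarray(dy, float)
    one = np.ones_like(dx)
    Ax = np.column_stack([one, dy, dx, dx * dy])   # columns: a3, a4, a5, a6
    Ay = np.column_stack([one, dx, dy, dy ** 2])   # columns: b4, b1, b2, b3
    a, *_ = np.linalg.lstsq(Ax, np.asarray(gx, float), rcond=None)
    b, *_ = np.linalg.lstsq(Ay, np.asarray(gy, float), rcond=None)
    return a, b

def map_gaze(a, b, dx, dy):
    """Apply the fitted mapping to a new feature vector (dx, dy)."""
    gx = a[0] + a[1] * dy + a[2] * dx + a[3] * dx * dy
    gy = b[0] + b[1] * dx + b[2] * dy + b[3] * dy ** 2
    return gx, gy
```

With the 25-point (5 × 5) calibration grid of the text, the design matrices have full column rank and the 8 coefficients are recovered exactly on noise-free data.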
The working principle of the gaze tracking technology is shown in FIG. 2: an infrared light source emitter 202 and a camera 203 are mounted on a display 201. The infrared emitter 202 emits invisible infrared light 204, which illuminates the user's eye 206 and is reflected back as an infrared signal 205. The camera captures the infrared signal 205, and a processor then tracks the pupil position precisely by combining the bright-pupil and dark-pupil phenomena with algorithms such as bright/dark-pupil image differencing and filtering, obtaining the direction of the line of sight 208, whose coordinate position on the screen is (Gx, Gy).
S102. Determine, according to the sight direction, a change in the angle between the line of sight and the screen and/or a change in the distance between the user and the screen.
The term "and/or" in this document merely describes an association relationship between associated objects, indicating that three relationships may exist. For example, A and/or B may indicate three cases: only A exists, both A and B exist, or only B exists. In addition, the character "/" in this document generally indicates an "or" relationship between the associated objects.
Determining, according to the sight direction, the change in the angle between the line of sight and the screen and/or the change in the distance between the user and the screen includes:
calculating the distance between the user and the screen according to the gaze feature vector; and
calculating the angle between the line of sight and the screen according to the distance between the user and the screen and the coordinate position.
The distance Z between the user and the screen can be estimated from the gaze feature vector L: a set of gaze feature vectors L_i and distances Z_i can be measured in advance at different positions to build a relationship model between L_i and Z_i, after which the value of Z can be inferred from an input L.
The angle between the line of sight and the screen can be calculated from the distance between the user and the screen and the coordinate position as follows: from the triangle geometry in FIG. 2, Z / Gy = tan α, so α = arctan(Z / Gy).
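A minimal sketch of the two S102 calculations, assuming a one-dimensional feature magnitude and a linearly interpolated calibration table (the patent only requires that a relationship model between L and Z be built in advance, and that α = arctan(Z/Gy); the table values and function names below are illustrative):

```python
import math

# Assumed calibration table: gaze-feature magnitudes measured in advance
# at known user-screen distances (mm).  A real system would fit a richer
# relationship model between L and Z.
CALIBRATION = [(8.0, 300.0), (6.0, 450.0), (4.0, 600.0), (2.0, 750.0)]

def estimate_distance(feature_magnitude):
    """Linearly interpolate the user-screen distance Z from the table,
    clamping to the nearest measured distance outside its range."""
    pts = sorted(CALIBRATION)            # ascending by feature magnitude
    if feature_magnitude <= pts[0][0]:
        return pts[0][1]
    if feature_magnitude >= pts[-1][0]:
        return pts[-1][1]
    for (f0, z0), (f1, z1) in zip(pts, pts[1:]):
        if f0 <= feature_magnitude <= f1:
            t = (feature_magnitude - f0) / (f1 - f0)
            return z0 + t * (z1 - z0)

def sight_screen_angle(z, gy):
    """Angle between the line of sight and the screen, in degrees,
    from the triangle relation tan(alpha) = Z / Gy of FIG. 2."""
    return math.degrees(math.atan2(z, gy))
```

For example, when Z equals the vertical gaze coordinate Gy, the sight meets the screen at 45 degrees.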
S103. Perform a corresponding operation on the screen display content according to the change in the angle and/or the change in the distance.
The operation performed on the screen display content according to the change in the angle may be: controlling the screen display content to move in the vertical, horizontal, or diagonal direction of the screen according to the change in the angle.
The angle between the line of sight and the screen is a first-type, second-type, third-type, or fourth-type angle. The first-type angle is the acute angle formed between the line of sight and the vertical direction of the screen when the line of sight is above the normal line of the screen, as shown in FIG. 5; the second-type angle is the acute angle formed between the line of sight and the vertical direction of the screen when the line of sight is below the normal line, as shown in FIG. 6; the third-type angle is the acute angle formed between the line of sight and the horizontal direction of the screen when the line of sight is to the right of the normal line, as shown in FIG. 7; and the fourth-type angle is the acute angle formed between the line of sight and the horizontal direction of the screen when the line of sight is to the left of the normal line, as shown in FIG. 8.
In FIG. 2, the acute angle 209 formed between the line of sight 208 and the vertical direction of the screen lies above the normal line 207 and is therefore a first-type angle; the normal line 207 is the straight line perpendicular to the plane of the screen and passing through the user's eye.
FIG. 4 is a schematic diagram of the angle when the user looks straight at the screen and the line of sight coincides with the normal line: 301 is the screen, 304 is the normal line, 305 is the line of sight, 304 and 305 coincide and form a 90° angle with the screen, 302 is the eye, and 303 is the pupil; the direction the pupil points in is the direction of the line of sight. In FIG. 5 to FIG. 8, 306 to 309 are schematic representations of the first-type to fourth-type angles, respectively.
Controlling the screen display content to move in the vertical, horizontal, or diagonal direction of the screen according to the change in the angle includes: when the first-type angle becomes smaller, the screen display content moves upward along the vertical direction of the screen; for example, when the screen tilts upward or the user's sight moves upward, the picture or text page on the screen moves upward. When the second-type angle becomes smaller, the screen display content moves downward along the vertical direction of the screen or pages are switched along the vertical direction; for example, when the screen tilts downward or the user's sight moves downward, the picture or text page moves downward. When the third-type angle becomes smaller, the screen display content moves rightward along the horizontal direction of the screen or pages are switched along the horizontal direction; for example, when the screen tilts rightward or the user's sight moves rightward, the picture or text page moves rightward. When the fourth-type angle becomes smaller, the screen display content moves leftward along the horizontal direction of the screen or pages are switched along the horizontal direction; for example, when the screen tilts leftward or the user's sight moves leftward, the picture or text page moves leftward.
The angle types of this embodiment may further include the acute angle formed between the line of sight and a diagonal direction of the screen. When this acute angle changes, the screen display content moves along the diagonal direction of the screen or pages are switched; that is, when the screen tilts diagonally or the user's sight moves along a diagonal, the screen display content changes correspondingly along the diagonal. The specific method follows the same idea as the operations for the first-type to fourth-type angle changes and is not repeated here.
The operation performed on the screen display content according to the change in the angle may also be: controlling the screen display content to switch to the previous or the next page according to the change in the angle. Specifically, on the basis of controlling the screen display content to move in the vertical, horizontal, or diagonal direction of the screen according to the change in the angle, when the displayed content, such as a picture or a text page, has moved to a preset threshold position, the display switches to the next or previous image, or to the previous or next page of text.
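The movement and page-switching rules above can be condensed into a small dispatch function; the angle-type numbering follows the text, while the function name and the threshold value are illustrative assumptions:

```python
# Decrease of each angle type maps to a movement direction, per the text:
# type 1 -> up, type 2 -> down, type 3 -> right, type 4 -> left.
MOVE_FOR_ANGLE_TYPE = {1: "up", 2: "down", 3: "right", 4: "left"}

def on_angle_change(angle_type, old_angle, new_angle, content_offset,
                    page_threshold=100):
    """Return (direction, switch_page) for a change in one angle type.

    The content moves in the associated direction when the angle becomes
    smaller; once the content has already moved past `page_threshold`
    (the preset threshold position), the same gesture switches to the
    previous/next page instead of scrolling further.
    """
    if new_angle >= old_angle:
        return None, False          # angle did not shrink: no movement
    direction = MOVE_FOR_ANGLE_TYPE[angle_type]
    switch_page = content_offset >= page_threshold
    return direction, switch_page
```

A shrinking first-type angle scrolls up; a shrinking second-type angle with the page already at the threshold triggers a downward page switch.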
To make the design more user-friendly, in this embodiment the speed at which the screen display content moves, or at which pages are switched, may be associated with the size of the angle between the line of sight and the screen; for example, the smaller the angle, the faster the content moves or pages are switched.
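The association between angle and speed might, for instance, be realized as a linear law; the text only states that a smaller angle means faster movement, so the linear form and the maximum speed below are assumptions:

```python
def scroll_speed(angle_deg, max_speed=1000.0):
    """Scrolling speed (e.g. pixels/s) that grows as the sight-screen
    angle shrinks: 0 at 90 degrees (looking straight at the screen),
    rising linearly to `max_speed` as the angle approaches 0."""
    angle = min(90.0, max(0.0, angle_deg))   # clamp to the valid range
    return max_speed * (90.0 - angle) / 90.0
```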
The operation performed on the screen display content according to the change in the distance may be adjusting the size of the displayed image or text according to the change in the distance. For example, when the user walks toward the screen, the displayed image or text becomes smaller; when the user moves back away from the screen, the displayed image or text becomes larger, improving the user's viewing experience.
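A sketch of this distance-based scaling, assuming a linear law around a reference distance (the exact ratio is a design choice not fixed by the text): content shrinks as the user approaches and grows as the user backs away:

```python
def display_scale(distance_mm, reference_mm=500.0):
    """Scale factor for displayed images/text as a function of the
    user-screen distance: 1.0 at the reference distance, smaller when
    the user is closer, larger when the user is farther away."""
    return distance_mm / reference_mm
```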
In this embodiment, the screen display content is operated on according to changes in the angle between the user's line of sight and the screen and in the distance between the user and the screen. The operation does not depend on gravity sensing, so the user can still conveniently operate the screen outside the range of gravity; moreover, no hand control is needed, and the user can operate the screen with movements of the line of sight, which is especially suitable for disabled users to read electronic documents, view pictures, watch videos, and so on.
Referring to FIG. 9, FIG. 9 shows a human-machine interaction apparatus according to an embodiment of the present application, including at least: a gaze tracking unit 401, a processing unit 402, and an execution unit 403, where:
the gaze tracking unit 401 is configured to capture the direction of the line of sight;
for the manner in which the gaze tracking unit 401 captures the sight direction, reference is made to the embodiment of FIG. 1, and details are not repeated here;
the processing unit 402 is configured to determine, according to the sight direction, a change in the angle between the line of sight and the screen and/or a change in the distance between the user and the screen;
for the specific implementation of determining the change in the angle and/or the change in the distance according to the sight direction, reference is made to the embodiment of FIG. 1, and details are not repeated here;
the angle between the line of sight and the screen may be a first-type, second-type, third-type, or fourth-type angle, or the acute angle formed between the line of sight and a diagonal direction of the screen; for the definitions of these angles, reference is made to the embodiment of FIG. 1, and details are not repeated here;
the execution unit 403 is configured to perform a corresponding operation on the screen display content according to the change in the angle and/or the change in the distance;
for the specific operation method, reference is made to the embodiment of FIG. 1, and details are not repeated here.
To make the design more user-friendly, in this embodiment the speed at which the screen display content moves, or at which pages are switched, may be associated with the size of the angle between the line of sight and the screen; for example, the smaller the angle, the faster the content moves or pages are switched.
Referring to FIG. 10, FIG. 10 shows another human-machine interaction apparatus according to an embodiment of the present application, including at least: a gaze tracking unit 401, a processing unit 402, and an execution unit 403; for the functions of the gaze tracking unit 401, the processing unit 402, and the execution unit 403, reference is made to the embodiment of FIG. 9. Here:
the gaze tracking unit 401 includes a gaze feature parameter extraction module 4011 and a coordinate position calculation module 4012, where:
the gaze feature parameter extraction module 4011 is configured to extract gaze feature parameters according to the pupil-corneal reflection method, the gaze feature parameters including the vector from the pupil centre to the corneal reflection; and
the coordinate position calculation module 4012 is configured to calculate the coordinate position of the sight direction on the screen according to the vector from the pupil centre to the corneal reflection.
The gaze feature parameters further include the gaze feature vector, and the processing unit 402 includes a distance calculation module 4021 and an angle calculation module 4022, where:
the distance calculation module 4021 is configured to calculate the distance between the user and the screen according to the gaze feature vector; and
the angle calculation module 4022 is configured to calculate the angle between the line of sight and the screen according to the distance between the user and the screen and the coordinate position.
In this embodiment, the screen display content is operated on according to changes in the angle between the user's line of sight and the screen and in the distance between the user and the screen. The operation does not depend on gravity sensing, so the user can still conveniently operate the screen outside the range of gravity; moreover, no hand control is needed, and the user can operate the screen with movements of the line of sight, which is especially suitable for disabled users to read electronic documents, view pictures, watch videos, and so on.
Referring to FIG. 11, FIG. 11 shows another human-machine interaction apparatus according to an embodiment of the present application, including at least: a gaze tracking device 501 and a system device 502, where the gaze tracking device 501 includes an infrared light source emitter 5011 and a camera 5012, and the system device 502 includes a CPU 5021, a RAM 5022, a ROM 5023, and a disk 5024.
The CPU 5021 is configured to perform the following steps:
controlling the gaze tracking device to capture the direction of the line of sight;
determining, according to the sight direction, a change in the angle between the line of sight and the screen and/or a change in the distance between the user and the screen; and
performing a corresponding operation on the screen display content according to the change in the angle and/or the change in the distance.
The CPU 5021 is further configured to perform the following steps:
extracting gaze feature parameters according to the pupil-corneal reflection method, the gaze feature parameters including the vector from the pupil centre to the corneal reflection and the gaze feature vector; and
calculating the coordinate position of the sight direction on the screen according to the vector from the pupil centre to the corneal reflection.
The CPU 5021 is further configured to perform the following steps:
calculating the distance between the user and the screen according to the gaze feature vector; and
calculating the angle between the line of sight and the screen according to the distance between the user and the screen and the coordinate position.
In this embodiment, the screen display content is operated on according to changes in the angle between the user's line of sight and the screen and in the distance between the user and the screen. The operation does not depend on gravity sensing, so the user can still conveniently operate the screen outside the range of gravity; moreover, no hand control is needed, and the user can operate the screen with movements of the line of sight, which is especially suitable for disabled users to read electronic documents, view pictures, watch videos, and so on.
A person of ordinary skill in the art may understand that all or part of the processes of the methods in the above embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the processes of the embodiments of the above methods. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
What is disclosed above is merely preferred embodiments of the present invention and certainly cannot be used to limit the scope of the claims of the present invention. Therefore, equivalent variations made in accordance with the claims of the present invention shall still fall within the scope covered by the present invention.

Claims

1. A human-machine interaction method, comprising:
capturing a direction of a line of sight;
determining, according to the direction of the line of sight, a change in an angle between the line of sight and a screen and/or a change in a distance between a user and the screen; and
performing a corresponding operation on screen display content according to the change in the angle and/or the change in the distance.
2. The method according to claim 1, wherein the capturing a direction of a line of sight comprises:
extracting gaze feature parameters according to a pupil-corneal reflection method, wherein the gaze feature parameters comprise a vector from a pupil centre to a corneal reflection; and
calculating, according to the vector from the pupil centre to the corneal reflection, a coordinate position of the direction of the line of sight on the screen.
3. The method according to claim 2, wherein the gaze feature parameters further comprise a gaze feature vector, and the determining, according to the direction of the line of sight, a change in an angle between the line of sight and the screen and/or a change in a distance between the user and the screen comprises:
calculating the distance between the user and the screen according to the gaze feature vector; and
calculating the angle between the line of sight and the screen according to the distance between the user and the screen and the coordinate position.
4. The method according to any one of claims 1 to 3, wherein the performing a corresponding operation on screen display content according to the change in the angle comprises:
controlling, according to the change in the angle, the screen display content to move in a vertical direction, a horizontal direction, or a diagonal direction of the screen.
5. The method according to any one of claims 1 to 3, wherein the performing a corresponding operation on screen display content according to the change in the angle comprises:
controlling, according to the change in the angle, the screen display content to switch to a previous page or a next page.
6. The method according to any one of claims 1 to 3, wherein the performing a corresponding operation on screen display content according to the change in the distance comprises:
adjusting a size of the screen display content according to the change in the distance.
7. A human-machine interaction apparatus, comprising:
a gaze tracking unit, configured to capture a direction of a line of sight;
a processing unit, configured to determine, according to the direction of the line of sight, a change in an angle between the line of sight and a screen and/or a change in a distance between a user and the screen; and
an execution unit, configured to perform a corresponding operation on screen display content according to the change in the angle and/or the change in the distance.
8. The apparatus according to claim 7, wherein the gaze tracking unit comprises a gaze feature parameter extraction module and a coordinate position calculation module, wherein:
the gaze feature parameter extraction module is configured to extract gaze feature parameters according to a pupil-corneal reflection method, the gaze feature parameters comprising a vector from a pupil centre to a corneal reflection; and
the coordinate position calculation module is configured to calculate, according to the vector from the pupil centre to the corneal reflection, a coordinate position of the direction of the line of sight on the screen.
9. The apparatus according to claim 8, wherein the gaze feature parameters further comprise a gaze feature vector, and the processing unit comprises a distance calculation module and an angle calculation module, wherein:
the distance calculation module is configured to calculate the distance between the user and the screen according to the gaze feature vector; and
the angle calculation module is configured to calculate the angle between the line of sight and the screen according to the distance between the user and the screen and the coordinate position.
10. The apparatus according to any one of claims 7 to 9, wherein the execution unit performs a corresponding operation on the screen display content according to the change in the angle specifically in the following manner:
the execution unit is configured to control, according to the change in the angle, the screen display content to move in a vertical direction, a horizontal direction, or a diagonal direction of the screen.
11. The apparatus according to any one of claims 7 to 9, wherein the execution unit performs a corresponding operation on the screen display content according to the change in the angle specifically in the following manner:
the execution unit is configured to control, according to the change in the angle, the screen display content to switch to a previous page or a next page.
12. The apparatus according to any one of claims 7 to 9, wherein the execution unit performs a corresponding operation on the screen display content according to the change in the distance specifically in the following manner:
the execution unit is configured to adjust a size of the screen display content according to the change in the distance.
PCT/CN2013/073786 2012-11-13 2013-04-07 Human-machine interaction method and apparatus WO2014075418A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/583,487 US9740281B2 (en) 2012-11-13 2014-12-26 Human-machine interaction method and apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201210452461.3 2012-11-13
CN201210452461.3A CN103809737A (zh) 2012-11-13 2012-11-13 Human-machine interaction method and apparatus

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/583,487 Continuation US9740281B2 (en) 2012-11-13 2014-12-26 Human-machine interaction method and apparatus

Publications (1)

Publication Number Publication Date
WO2014075418A1 true WO2014075418A1 (zh) 2014-05-22

Family

ID=50706634

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/073786 WO2014075418A1 (zh) 2012-11-13 2013-04-07 一种人机交互方法及装置

Country Status (3)

Country Link
US (1) US9740281B2 (zh)
CN (1) CN103809737A (zh)
WO (1) WO2014075418A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110865761A (zh) * 2018-08-28 2020-03-06 财团法人工业技术研究院 Pointing determination system and pointing determination method

Families Citing this family (27)

Publication number Priority date Publication date Assignee Title
KR102130797B1 (ko) * 2013-09-17 2020-07-03 엘지전자 주식회사 Mobile terminal and control method therefor
DE102014204800A1 (de) * 2014-03-14 2015-09-17 Volkswagen Aktiengesellschaft Method and device for providing a graphical user interface in a vehicle
JP6385210B2 (ja) * 2014-09-05 2018-09-05 キヤノン株式会社 Image diagnosis support apparatus, control method, and program
CN104461002A (zh) * 2014-12-05 2015-03-25 上海斐讯数据通信技术有限公司 Apparatus and method for reducing visual fatigue when viewing a mobile terminal device
CN105988556A (zh) * 2015-01-27 2016-10-05 鸿富锦精密工业(武汉)有限公司 Electronic device and display adjustment method applied to the electronic device
CN105989577B (zh) * 2015-02-17 2020-12-29 中兴通讯股份有限公司 Image correction method and device
US9980703B2 * 2015-03-17 2018-05-29 General Electric Company Methods and systems for a display interface for diagnostic medical imaging
CN106155288B (zh) 2015-04-10 2019-02-12 北京智谷睿拓技术服务有限公司 Information acquisition method, information acquisition apparatus, and user equipment
CN105183169B (zh) * 2015-09-22 2018-09-25 小米科技有限责任公司 Gaze direction recognition method and apparatus
JP2017134558A (ja) * 2016-01-27 2017-08-03 ソニー株式会社 Information processing apparatus, information processing method, and computer-readable recording medium storing a program
CN106328006A (zh) * 2016-10-25 2017-01-11 南京奥拓电子科技有限公司 Interactive advertising program display method
WO2018098772A1 (zh) * 2016-12-01 2018-06-07 深圳前海达闼云端智能科技有限公司 Viewpoint determination method and apparatus, electronic device, and computer program product
CN107122102A (zh) * 2017-04-27 2017-09-01 维沃移动通信有限公司 Page-turning control method and mobile terminal
CN107357429B (zh) * 2017-07-10 2020-04-07 京东方科技集团股份有限公司 Method, device, and computer-readable storage medium for determining line of sight
CN107608504A (zh) * 2017-08-22 2018-01-19 深圳传音控股有限公司 Mobile terminal and intelligent text display method
CN107765851A (zh) * 2017-09-28 2018-03-06 努比亚技术有限公司 Iris-recognition-based application processing method, terminal, and storage medium
CN108875526B (zh) * 2018-01-05 2020-12-25 北京旷视科技有限公司 Gaze detection method, apparatus, system, and computer storage medium
CN110740246A (zh) * 2018-07-18 2020-01-31 阿里健康信息技术有限公司 Image correction method, mobile device, and terminal device
CN109508679B (zh) * 2018-11-19 2023-02-10 广东工业大学 Method, apparatus, device, and storage medium for three-dimensional eye gaze tracking
CN109656373B (zh) * 2019-01-02 2020-11-10 京东方科技集团股份有限公司 Gaze point positioning method and positioning apparatus, display device, and storage medium
CN110969084B (zh) * 2019-10-29 2021-03-05 深圳云天励飞技术有限公司 Attention region detection method and apparatus, readable storage medium, and terminal device
CN111399627B (zh) * 2020-03-09 2021-09-28 宁波视睿迪光电有限公司 Energy-saving method and system for a 3D display device
CN111857461B (zh) * 2020-06-29 2021-12-24 维沃移动通信有限公司 Image display method and apparatus, electronic device, and readable storage medium
CN112183200B (zh) * 2020-08-25 2023-10-17 中电海康集团有限公司 Video-image-based eye tracking method and system
CN113012574B (zh) * 2021-02-19 2023-05-09 深圳创维-Rgb电子有限公司 Screen curvature adjustment method and apparatus, curved display, and storage medium
CN116449965B (zh) * 2023-06-16 2023-10-13 深圳润方创新技术有限公司 Data processing method and apparatus based on a children's electronic drawing board, and electronic device
CN116820246A (zh) * 2023-07-06 2023-09-29 上海仙视电子科技有限公司 Viewing-angle-adaptive screen adjustment control method and apparatus

Citations (4)

Publication number Priority date Publication date Assignee Title
CN101311882A (zh) * 2007-05-23 2008-11-26 华为技术有限公司 Gaze-tracking human-machine interaction method and apparatus
CN101441513A (zh) * 2008-11-26 2009-05-27 北京科技大学 System for contactless human-machine interaction using vision
CN201307266Y (zh) * 2008-06-25 2009-09-09 韩旭 Binocular gaze tracking device
CN101813976A (zh) * 2010-03-09 2010-08-25 华南理工大学 SoC-based gaze-tracking human-machine interaction method and apparatus

Family Cites Families (14)

Publication number Priority date Publication date Assignee Title
US6762794B1 (en) * 1997-12-03 2004-07-13 Canon Kabushiki Kaisha Image pick-up apparatus for stereoscope
JP4604190B2 (ja) * 2004-02-17 2010-12-22 国立大学法人静岡大学 Gaze detection device using a range image sensor
JP5045212B2 (ja) * 2007-04-25 2012-10-10 株式会社デンソー Face image capturing device
CA2685976C (en) * 2007-05-23 2013-02-19 The University Of British Columbia Methods and apparatus for estimating point-of-gaze in three dimensions
CN101788876A (zh) * 2009-01-23 2010-07-28 英华达(上海)电子有限公司 Method and system for automatic zoom adjustment
CN101807110B (zh) * 2009-02-17 2012-07-04 由田新技股份有限公司 Pupil positioning method and system
KR101651430B1 (ko) * 2009-12-18 2016-08-26 삼성전자주식회사 Apparatus and method for adjusting the size of output data in a portable terminal
US8739019B1 * 2011-07-01 2014-05-27 Joel Nevins Computer-implemented methods and computer program products for integrating and synchronizing multimedia content, including content displayed via interactive televisions, smartphones, electronic book readers, holographic imagery projectors, and other computerized devices
CN102081503B (zh) * 2011-01-25 2013-06-26 汉王科技股份有限公司 Electronic reader with gaze-tracking-based automatic page turning and method therefor
US9285883B2 * 2011-03-01 2016-03-15 Qualcomm Incorporated System and method to display content based on viewing orientation
KR101891786B1 (ko) * 2011-11-29 2018-08-27 삼성전자주식회사 Eye-tracking-based user function operation method and terminal supporting the same
JP5945417B2 (ja) * 2012-01-06 2016-07-05 京セラ株式会社 Electronic device
US8937591B2 (en) * 2012-04-06 2015-01-20 Apple Inc. Systems and methods for counteracting a perceptual fading of a movable indicator
US20150358594A1 (en) * 2014-06-06 2015-12-10 Carl S. Marshall Technologies for viewer attention area estimation


Cited By (2)

Publication number Priority date Publication date Assignee Title
CN110865761A (zh) * 2018-08-28 2020-03-06 财团法人工业技术研究院 Pointing determination system and pointing determination method
CN110865761B (zh) * 2018-08-28 2023-07-28 财团法人工业技术研究院 Pointing determination system and pointing determination method

Also Published As

Publication number Publication date
CN103809737A (zh) 2014-05-21
US20150109204A1 (en) 2015-04-23
US9740281B2 (en) 2017-08-22

Similar Documents

Publication Publication Date Title
WO2014075418A1 (zh) Human-machine interaction method and apparatus
US11262840B2 (en) Gaze detection in a 3D mapping environment
US20220164032A1 (en) Enhanced Virtual Touchpad
EP3129849B1 (en) Systems and methods of eye tracking calibration
US9696859B1 (en) Detecting tap-based user input on a mobile device based on motion sensor data
JP5807989B2 (ja) 注視支援コンピュータインターフェース
US20140184494A1 (en) User Centric Interface for Interaction with Visual Display that Recognizes User Intentions
US20190369807A1 (en) Information processing device, information processing method, and program
EP3353635A1 (en) Fingertip identification for gesture control
KR20140014868A (ko) 시선 추적 장치 및 이의 시선 추적 방법
EP2846288A2 (en) Dynamic Image Analyzing System And Operating Method Thereof
US11604517B2 (en) Information processing device, information processing method for a gesture control user interface
AU2015252151B2 (en) Enhanced virtual touchpad and touchscreen
CN110858095A (zh) 可由头部操控的电子装置与其操作方法
EP3059664A1 (en) A method for controlling a device by gestures and a system for controlling a device by gestures

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13854583

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13854583

Country of ref document: EP

Kind code of ref document: A1