WO2016176959A1 - Multi-screen display control method and system based on eye-tracking technology - Google Patents

Multi-screen display control method and system based on eye-tracking technology

Info

Publication number
WO2016176959A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
display screen
current gaze
display
gaze coordinate
Prior art date
Application number
PCT/CN2015/091524
Other languages
English (en)
French (fr)
Inventor
石贞
Original Assignee
惠州Tcl移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 惠州Tcl移动通信有限公司 filed Critical 惠州Tcl移动通信有限公司
Priority to US15/117,891 priority Critical patent/US10802581B2/en
Priority to EP15877369.7A priority patent/EP3293620B1/en
Publication of WO2016176959A1 publication Critical patent/WO2016176959A1/zh

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • The invention relates to the field of eye-tracking technology, and in particular to a multi-screen display control method and system based on eye-tracking technology.
  • Split-screen technology is widely used in all kinds of display terminals, including mobile devices (phones, tablets, etc.), televisions, and PCs.
  • The split-screen mode of a mobile terminal lets two apps run and be displayed on one screen at the same time, and Google has listed it as a required feature of the next generation of the Android system.
  • iOS 8 already implements split-screen display on the iPad, and the two applications running side by side can also interact and share content, for example by dragging text, video, or images from one application to the other. Split-screen display therefore removes the need to switch back and forth between tasks and makes it convenient for the user to handle several tasks at once. However, when the user needs to switch between split screens or maximize the current split screen, this is done by touch or sliding operations, which is inconvenient for the user and fails to meet the user's needs.
  • The present invention aims to provide a multi-screen display control method and system based on eye-tracking technology, so as to solve the defect in the prior art that switching between split screens, or maximizing the current split screen, is achieved only by touch or sliding operations, which is inconvenient for the user and fails to meet the user's needs.
  • A multi-screen display control method based on eye-tracking technology comprises the following steps:
  • When the display screen of the mobile terminal is in multi-screen mode, the current gaze coordinate at which the user's eyes look at the display screen is determined by acquiring images of the user's eyes in real time and applying eye-tracking technology;
  • When the eye motion information corresponds to the eyes-open state, the split screen in which the current gaze coordinate lies is displayed in full screen if the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold.
  • Step B specifically includes:
  • If the current gaze coordinate is in the multi-screen area of the display screen, further determining whether the dwell time within the specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, using eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
  • If the current gaze coordinate is in the split-screen template area of the display screen, further determining whether the dwell time within the specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, entering a preset split-screen template selection control interface.
  • In the multi-screen display control method based on eye-tracking technology, the focus time threshold is 5-10 seconds.
  • In the multi-screen display control method based on eye-tracking technology, in multi-screen mode the display screen of the mobile terminal includes four split screens.
  • A multi-screen display control method based on eye-tracking technology comprises the following steps:
  • When the display screen of the mobile terminal is in multi-screen mode, the current gaze coordinate at which the user's eyes look at the display screen is determined by acquiring images of the user's eyes in real time and applying eye-tracking technology;
  • When the eye motion information corresponds to the eyes-open state, the split screen in which the current gaze coordinate lies is displayed in full screen if the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold.
  • After step C, the method further comprises closing the split screen in which the current gaze coordinate lies when the eye motion information corresponds to the eyes-closed state.
  • Step B specifically includes:
  • If the current gaze coordinate is in the multi-screen area of the display screen, further determining whether the dwell time within the specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, using eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
  • If the current gaze coordinate is in the split-screen template area of the display screen, further determining whether the dwell time within the specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, entering a preset split-screen template selection control interface.
  • Step C further includes: when full-screen display is entered, using eye-tracking technology to determine the current gaze coordinate at which the user's eyes look at the display screen; if the user's current gaze coordinate is determined to be within a designated area of the display screen and the dwell time within the specified range of the current gaze coordinate exceeds the focus time threshold, the full-screen display is exited and the multi-screen mode is restored.
  • In the multi-screen display control method based on eye-tracking technology, the focus time threshold is 5-10 seconds.
  • In the multi-screen display control method based on eye-tracking technology, in multi-screen mode the display screen of the mobile terminal includes four split screens.
  • A multi-screen display control system based on eye-tracking technology comprises:
  • A positioning module, configured to determine, when the display screen of the mobile terminal is in multi-screen mode, the current gaze coordinate at which the user's eyes look at the display screen by acquiring images of the user's eyes in real time and applying eye-tracking technology;
  • A judging and acquiring module, configured to determine whether the dwell time within a specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, to use eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
  • A full-screen control module, configured to determine, when the eye motion information corresponds to the eyes-open state, whether the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold and, if so, to display the split screen in which the current gaze coordinate lies in full screen.
  • The multi-screen display control system based on eye-tracking technology further comprises:
  • A split-screen close control module, configured to close the split screen in which the current gaze coordinate lies when the eye motion information corresponds to the eyes-closed state.
  • In the multi-screen display control system based on eye-tracking technology, the judging and acquiring module specifically includes:
  • An area judging unit, configured to determine whether the current gaze coordinate at which the user's eyes look at the display screen is in the multi-screen area of the display screen or in the split-screen template area of the display screen;
  • A first control unit, configured to, if the current gaze coordinate is in the multi-screen area of the display screen, further determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, again use eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
  • A second control unit, configured to, if the current gaze coordinate is in the split-screen template area of the display screen, further determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, enter a preset split-screen template selection control interface.
  • The full-screen control module is further configured to, when full-screen display is entered, use eye-tracking technology to determine the current gaze coordinate at which the user's eyes look at the display screen and, if the user's current gaze coordinate is determined to be within a designated area of the display screen and the dwell time within the specified range of the current gaze coordinate exceeds the focus time threshold, to exit the full-screen display and restore the multi-screen mode.
  • In the multi-screen display control system based on eye-tracking technology, the focus time threshold is 5-10 seconds.
  • In the multi-screen display control system based on eye-tracking technology, in multi-screen mode the display screen of the mobile terminal includes four split screens.
  • The invention provides a multi-screen display control method and system based on eye-tracking technology. The method comprises: when the display screen of the mobile terminal is in multi-screen mode, determining the current gaze coordinate at which the user's eyes look at the display screen by acquiring images of the user's eyes in real time and applying eye-tracking technology; determining whether the dwell time within a specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, using eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate; and, when the eye motion information corresponds to the eyes-open state, determining whether the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold and, if so, displaying the split screen in which the current gaze coordinate lies in full screen.
  • The invention obtains the current gaze coordinate and the corresponding dwell time through eye-tracking technology, and also obtains the eye motion information, and from these decides whether to make the current split screen full screen or to exit full screen, which is convenient for the user.
  • FIG. 1 is a flowchart of a preferred embodiment of the multi-screen display control method based on eye-tracking technology according to the present invention.
  • FIG. 2 is a flowchart detailing how the region containing the current gaze coordinate at which the user's eyes look at the display screen is determined in the multi-screen display control method based on eye-tracking technology according to the present invention.
  • FIG. 3 is a schematic diagram of the user's eyes selecting one of the split screens of the mobile terminal in multi-screen mode.
  • FIG. 4 is a structural block diagram of a preferred embodiment of the multi-screen display control system based on eye-tracking technology according to the present invention.
  • The present invention provides a multi-screen display control method and system based on eye-tracking technology.
  • To make the objects, technical solutions, and effects of the present invention clearer, the present invention is described in further detail below. It should be understood that the specific embodiments described here are merely illustrative of the invention and are not intended to limit it.
  • FIG. 1 is a flowchart of a preferred embodiment of the multi-screen display control method based on eye-tracking technology according to the present invention. The method includes the following steps:
  • Step S100: when the display screen of the mobile terminal is in multi-screen mode, determine the current gaze coordinate at which the user's eyes look at the display screen by acquiring images of the user's eyes in real time and applying eye-tracking technology.
  • Images of the user's eyes are acquired in real time through the front camera, and eye-tracking technology is used to determine the current gaze coordinate at which the user's eyes look at the display screen, that is, the position on the mobile terminal's display screen where the focal point of the user's line of sight lies.
  • Eye tracking is a somatosensory technology that has emerged in recent years and has already been applied in a variety of mobile terminals.
  • Eye-tracking technology analyzes the state and movement of the user's eyeballs to resolve the screen position the user is currently gazing at, and it is this technique that the embodiments of the invention use to capture the user's current gaze coordinate. Therefore, regardless of the implementation or working principle of the eye-tracking device used to capture the user's current gaze coordinate, as long as the device can learn the user's gaze coordinate by collecting the user's eye movement, it can be considered applicable to the solution provided by the embodiments and falls within the scope of the present invention.
  • Step S200: determine whether the dwell time within a specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, use eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate.
  • The focus time threshold may be set to 5-10 seconds; that is, when the user has stared at one of the split screens of the mobile terminal for more than 5-10 seconds, this indicates that the user wants to select that split screen, so the user does not need to touch the display screen to select it, which is convenient for the user.
  • Eye-tracking technology is then used to determine the user's eye motion information.
  • The eye motion information corresponds to two states: one is the eyes-open state and the other is the eyes-closed state.
  • Step S300: when the eye motion information corresponds to the eyes-open state, determine whether the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold and, if so, display the split screen in which the current gaze coordinate lies in full screen.
  • In step S300, when the eye motion information is determined to correspond to the eyes-open state, this indicates that the focal point of the user's line of sight is still on that split screen and the content of the split screen is to be viewed further.
  • When the dwell time within the specified range of the current gaze coordinate is determined to exceed the preset full-screen time threshold, the split screen is maximized while the applications in the other split screens keep running in the background.
  • When the split screen is made full screen in step S300, the user again does not need to touch the display screen to maximize it, which is convenient for the user.
  • The method further includes:
  • Step S400: when the eye motion information corresponds to the eyes-closed state, close the split screen in which the current gaze coordinate lies.
  • In step S200, the split screen is selected by gazing at one of the split screens for 5-10 seconds; at that point the user can close that split screen by closing the eyes.
  • Here too, the display screen is operated through the motion of the user's eyes, with no manual operation by the user required.
  • The specific procedure in step S200 for determining the region containing the current gaze coordinate at which the user's eyes look at the display screen includes:
  • Step S201: determine whether the current gaze coordinate at which the user's eyes look at the display screen is in the multi-screen area of the display screen or in the split-screen template area of the display screen.
  • Because the display region of each split screen in multi-screen mode is preset in the mobile terminal, the split screen in which the current gaze coordinate lies can be identified.
  • The process of selecting a split screen through the focal point of the user's line of sight is described below by way of a specific embodiment.
  • The display screen of the mobile terminal has entered multi-screen mode and includes four split screens, denoted split screen 1, split screen 2, split screen 3, and split screen 4, as well as a split-screen template area.
  • In this example the current gaze coordinate at which the user's eyes look at the display screen is within split screen 2.
  • Step S202: if the current gaze coordinate is in the multi-screen area of the display screen, further determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, again use eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate.
  • When the user selects one of the split screens in the multi-screen area by gazing at it, it is determined whether the focal point of the user's line of sight (that is, the current gaze coordinate) is still within that split screen; if so, it is determined whether the dwell time exceeds the full-screen time threshold (generally set to 5-10 seconds), and when it does the split screen is selected. At that point, eye-tracking technology is used again to determine the eye motion information within the specified range of the current gaze coordinate.
  • Step S203: if the current gaze coordinate is in the split-screen template area of the display screen, further determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, enter the preset split-screen template selection control interface.
  • The user may then continue, in a manner similar to the selection of a split screen in step S200, to select one of a number of pre-stored split-screen templates.
  • Each split-screen template includes the number of split screens, the arrangement of the split screens, and the coordinate range of each split screen on the display.
  • Step S300 further includes: when full-screen display is entered, eye-tracking technology is used to determine the current gaze coordinate at which the user's eyes look at the display screen; if the user's current gaze coordinate is determined to be within a designated area of the display screen and the dwell time within the specified range of the current gaze coordinate exceeds the focus time threshold, the full-screen display is exited and the multi-screen mode is restored.
  • After the split screen selected by the user has entered full-screen display, the user exits full screen by moving the current gaze coordinate to a designated area of the full-screened split screen (for example a square area in the upper-left corner or a square area in the upper-right corner) and holding it there longer than the focus time threshold (5-10 seconds).
  • The present invention obtains the current gaze coordinate and the corresponding dwell time through eye-tracking technology, and also obtains the eye motion information, and from these decides whether to make the currently selected split screen full screen or to exit full screen, which is convenient for the user.
  • The present invention also provides a multi-screen display control system based on eye-tracking technology.
  • The multi-screen display control system based on eye-tracking technology includes:
  • A positioning module 100, configured to determine, when the display screen of the mobile terminal is in multi-screen mode, the current gaze coordinate at which the user's eyes look at the display screen by acquiring images of the user's eyes in real time and applying eye-tracking technology, as described above;
  • A judging and acquiring module 200, configured to determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, to use eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate, as described above;
  • A full-screen control module 300, configured to determine, when the eye motion information corresponds to the eyes-open state, whether the dwell time within the specified range of the current gaze coordinate exceeds the preset full-screen time threshold and, if so, to display the split screen in which the current gaze coordinate lies in full screen, as described above.
  • The system further includes:
  • A split-screen close control module, configured to close the split screen in which the current gaze coordinate lies when the eye motion information corresponds to the eyes-closed state, as described above.
  • The judging and acquiring module 200 specifically includes:
  • An area judging unit, configured to determine whether the current gaze coordinate at which the user's eyes look at the display screen is in the multi-screen area of the display screen or in the split-screen template area of the display screen, as described above;
  • A first control unit, configured to, if the current gaze coordinate is in the multi-screen area of the display screen, further determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, again use eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate, as described above;
  • A second control unit, configured to, if the current gaze coordinate is in the split-screen template area of the display screen, further determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, enter the preset split-screen template selection control interface, as described above.
  • The full-screen control module 300 is further configured to, when full-screen display is entered, use eye-tracking technology to determine the current gaze coordinate at which the user's eyes look at the display screen and, if the user's current gaze coordinate is determined to be within a designated area of the display screen and the dwell time within the specified range of the current gaze coordinate exceeds the focus time threshold, to exit the full-screen display and restore the multi-screen mode, as described above.
  • The present invention provides a multi-screen display control method and system based on eye-tracking technology. The method includes: when the display screen of the mobile terminal is in multi-screen mode, determining the current gaze coordinate at which the user's eyes look at the display screen by acquiring images of the user's eyes in real time and applying eye-tracking technology; determining whether the dwell time within a specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, using eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate; and, when the eye motion information corresponds to the eyes-open state, determining whether the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold and, if so, displaying the split screen in which the current gaze coordinate lies in full screen.
  • The invention obtains the current gaze coordinate and the corresponding dwell time through eye-tracking technology, and also obtains the eye motion information, and from these decides whether to make the current split screen full screen or to exit full screen, which is convenient for the user.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A multi-screen display control method and system based on eye-tracking technology. The method includes: when the display screen of a mobile terminal is in multi-screen mode, using eye-tracking technology to determine the current gaze coordinate at which the user's eyes look at the display screen (S100); determining whether the dwell time within a specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, using eye-tracking technology to determine the user's eye motion information (S200); and, when the eye motion information corresponds to the eyes-open state, determining whether the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold and, if so, displaying the split screen in which the current gaze coordinate lies in full screen (S300). By obtaining the current gaze coordinate and the corresponding dwell time through eye tracking, together with the eye motion information, the solution decides whether to make the current split screen full screen or to exit full screen, which is convenient for the user.

Description

Multi-screen display control method and system based on eye-tracking technology
[Technical Field]
The present invention relates to the field of eye-tracking technology, and in particular to a multi-screen display control method and system based on eye-tracking technology.
[Background Art]
At present, implementations of the various split-screen techniques are already quite mature, and split-screen technology is widely used in all kinds of display terminals, including mobile devices (phones, tablets, etc.), televisions, and PCs.
The split-screen mode of a mobile terminal lets two apps run and be displayed on one screen at the same time, and Google has listed it as a required feature of the next generation of the Android system. iOS 8 already implements split-screen display on the iPad, and the two applications running at the same time can also interact and share content, for example by dragging text, video, or images from one application to the other. Split-screen display thus removes the need to switch back and forth between tasks and makes it convenient for the user to handle several tasks at once. However, when the user needs to switch between split screens or maximize the current split screen, this is done by touch or sliding operations, which is inconvenient for the user and fails to meet the user's needs.
The prior art therefore still needs to be improved and developed.
[Summary of the Invention]
In view of the above shortcomings of the prior art, the object of the present invention is to provide a multi-screen display control method and system based on eye-tracking technology, so as to solve the defect in the prior art that switching between split screens, or maximizing the current split screen, is achieved only by touch or sliding operations, which is inconvenient for the user and fails to meet the user's needs.
The technical solution of the present invention is as follows:
A multi-screen display control method based on eye-tracking technology, wherein the method comprises the following steps:
A. when the display screen of the mobile terminal is in multi-screen mode, determining, by acquiring images of the user's eyes in real time and applying eye-tracking technology, the current gaze coordinate at which the user's eyes look at the display screen;
B. determining whether the dwell time within a specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, using eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
C. when the eye motion information corresponds to the eyes-open state, determining whether the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold and, if so, displaying the split screen in which the current gaze coordinate lies in full screen; and, when full-screen display is entered, using eye-tracking technology to determine the current gaze coordinate at which the user's eyes look at the display screen and, if the user's current gaze coordinate is determined to be within a designated area of the display screen and the dwell time within the specified range of the current gaze coordinate exceeds the focus time threshold, exiting the full-screen display and restoring the multi-screen mode;
D. when the eye motion information corresponds to the eyes-closed state, closing the split screen in which the current gaze coordinate lies.
In the above multi-screen display control method based on eye-tracking technology, step B specifically comprises:
B1. determining whether the current gaze coordinate at which the user's eyes look at the display screen is in the multi-screen area of the display screen or in the split-screen template area of the display screen;
B2. if the current gaze coordinate is in the multi-screen area of the display screen, further determining whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, using eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
B3. if the current gaze coordinate is in the split-screen template area of the display screen, further determining whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, entering a preset split-screen template selection control interface.
In the above multi-screen display control method based on eye-tracking technology, the focus time threshold is 5-10 seconds.
In the above multi-screen display control method based on eye-tracking technology, in the multi-screen mode the display screen of the mobile terminal includes four split screens.
A multi-screen display control method based on eye-tracking technology, wherein the method comprises the following steps:
A. when the display screen of the mobile terminal is in multi-screen mode, determining, by acquiring images of the user's eyes in real time and applying eye-tracking technology, the current gaze coordinate at which the user's eyes look at the display screen;
B. determining whether the dwell time within a specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, using eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
C. when the eye motion information corresponds to the eyes-open state, determining whether the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold and, if so, displaying the split screen in which the current gaze coordinate lies in full screen.
In the above multi-screen display control method based on eye-tracking technology, the method further comprises, after step C:
D. when the eye motion information corresponds to the eyes-closed state, closing the split screen in which the current gaze coordinate lies.
In the above multi-screen display control method based on eye-tracking technology, step B specifically comprises:
B1. determining whether the current gaze coordinate at which the user's eyes look at the display screen is in the multi-screen area of the display screen or in the split-screen template area of the display screen;
B2. if the current gaze coordinate is in the multi-screen area of the display screen, further determining whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, again using eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
B3. if the current gaze coordinate is in the split-screen template area of the display screen, further determining whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, entering a preset split-screen template selection control interface.
In the above multi-screen display control method based on eye-tracking technology, step C further comprises: when full-screen display is entered, using eye-tracking technology to determine the current gaze coordinate at which the user's eyes look at the display screen and, if the user's current gaze coordinate is determined to be within a designated area of the display screen and the dwell time within the specified range of the current gaze coordinate exceeds the focus time threshold, exiting the full-screen display and restoring the multi-screen mode.
In the above multi-screen display control method based on eye-tracking technology, the focus time threshold is 5-10 seconds.
In the above multi-screen display control method based on eye-tracking technology, in the multi-screen mode the display screen of the mobile terminal includes four split screens.
A multi-screen display control system based on eye-tracking technology, comprising:
a positioning module, configured to determine, when the display screen of the mobile terminal is in multi-screen mode, the current gaze coordinate at which the user's eyes look at the display screen by acquiring images of the user's eyes in real time and applying eye-tracking technology;
a judging and acquiring module, configured to determine whether the dwell time within a specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, to use eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
a full-screen control module, configured to determine, when the eye motion information corresponds to the eyes-open state, whether the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold and, if so, to display the split screen in which the current gaze coordinate lies in full screen.
The above multi-screen display control system based on eye-tracking technology further comprises:
a split-screen close control module, configured to close the split screen in which the current gaze coordinate lies when the eye motion information corresponds to the eyes-closed state.
In the above multi-screen display control system based on eye-tracking technology, the judging and acquiring module specifically comprises:
an area judging unit, configured to determine whether the current gaze coordinate at which the user's eyes look at the display screen is in the multi-screen area of the display screen or in the split-screen template area of the display screen;
a first control unit, configured to, if the current gaze coordinate is in the multi-screen area of the display screen, further determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, again use eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
a second control unit, configured to, if the current gaze coordinate is in the split-screen template area of the display screen, further determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, enter a preset split-screen template selection control interface.
In the above multi-screen display control system based on eye-tracking technology, the full-screen control module is further configured to, when full-screen display is entered, use eye-tracking technology to determine the current gaze coordinate at which the user's eyes look at the display screen and, if the user's current gaze coordinate is determined to be within a designated area of the display screen and the dwell time within the specified range of the current gaze coordinate exceeds the focus time threshold, to exit the full-screen display and restore the multi-screen mode.
In the above multi-screen display control system based on eye-tracking technology, the focus time threshold is 5-10 seconds.
In the above multi-screen display control system based on eye-tracking technology, in the multi-screen mode the display screen of the mobile terminal includes four split screens.
The present invention provides a multi-screen display control method and system based on eye-tracking technology. The method includes: when the display screen of the mobile terminal is in multi-screen mode, determining the current gaze coordinate at which the user's eyes look at the display screen by acquiring images of the user's eyes in real time and applying eye-tracking technology; determining whether the dwell time within a specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, using eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate; and, when the eye motion information corresponds to the eyes-open state, determining whether the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold and, if so, displaying the split screen in which the current gaze coordinate lies in full screen. By obtaining the current gaze coordinate and the corresponding dwell time through eye-tracking technology, together with the eye motion information, the invention decides whether to make the current split screen full screen or to exit full screen, which is convenient for the user.
[Brief Description of the Drawings]
FIG. 1 is a flowchart of a preferred embodiment of the multi-screen display control method based on eye-tracking technology according to the present invention.
FIG. 2 is a flowchart detailing, in the multi-screen display control method based on eye-tracking technology according to the present invention, how the region containing the current gaze coordinate at which the user's eyes look at the display screen is determined.
FIG. 3 is a schematic diagram of the user's eyes selecting one of the split screens of the mobile terminal in multi-screen mode.
FIG. 4 is a structural block diagram of a preferred embodiment of the multi-screen display control system based on eye-tracking technology according to the present invention.
[Detailed Description]
The present invention provides a multi-screen display control method and system based on eye-tracking technology. To make the objects, technical solutions, and effects of the present invention clearer, the present invention is described in further detail below. It should be understood that the specific embodiments described here are merely illustrative of the invention and are not intended to limit it.
As shown in FIG. 1, which is a flowchart of a preferred embodiment of the multi-screen display control method based on eye-tracking technology according to the present invention, the method comprises the following steps:
Step S100: when the display screen of the mobile terminal is in multi-screen mode, determine the current gaze coordinate at which the user's eyes look at the display screen by acquiring images of the user's eyes in real time and applying eye-tracking technology.
In the embodiment of the present invention, when the user opens several applications and split-screen technology displays each application in its own split screen, images of the user's eyes are acquired in real time through the front camera and eye-tracking technology is used to determine the current gaze coordinate at which the user's eyes look at the display screen, that is, the position on the mobile terminal's display screen where the focal point of the user's line of sight currently lies.
As mentioned above, eye tracking is a somatosensory technology that has emerged in recent years and has already been applied in a variety of mobile terminals. Eye-tracking technology captures and tracks the state and movement of the user's eyeballs and from this resolves the screen position the user is currently gazing at, and it is exactly this technique that the embodiments of the present invention use to capture the user's current gaze coordinate. Therefore, regardless of the implementation or working principle of the eye-tracking device used to capture the user's current gaze coordinate, as long as the device can learn the user's gaze coordinate by collecting the user's eye movement, it can be considered applicable to the solution provided by the embodiments of the present invention and falls within the scope of the present invention.
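Because the description deliberately leaves the eye-tracking device unspecified, the control method needs only a very small contract from it. The following Kotlin sketch (Kotlin is used only because the embodiments target mobile terminals; the type and interface names are illustrative assumptions, not anything defined by the patent) shows roughly what that contract could look like:

    // Hypothetical abstraction only: the description requires merely that some eye-tracking
    // component turn the real-time front-camera eye images into on-screen gaze coordinates.
    // Any tracker able to implement this small contract would suit the control method.

    /** One reading from the eye tracker: where on the display the user is looking, plus the eye state. */
    data class GazeReading(val x: Float, val y: Float, val eyesOpen: Boolean, val timestampMs: Long)

    /** Minimal contract assumed by the multi-screen control logic; the names are illustrative. */
    interface GazeTracker {
        /** Latest reading derived from the current eye image, or null if no eyes are detected. */
        fun currentReading(): GazeReading?
    }

    /** A stand-in tracker that replays pre-recorded readings, useful for exercising the control logic in tests. */
    class ReplayGazeTracker(readings: List<GazeReading>) : GazeTracker {
        private val iterator = readings.iterator()
        override fun currentReading(): GazeReading? = if (iterator.hasNext()) iterator.next() else null
    }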
Step S200: determine whether the dwell time within a specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, use eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate.
Clearly, when it is detected that the user has been staring at one of the split screens of the mobile terminal for longer than the focus time threshold, this indicates that the user wants to select that split screen and perform a further operation on it. In a specific implementation, the focus time threshold may be set to 5-10 seconds; that is, when it is detected that the user has stared at one of the split screens for more than 5-10 seconds, the user is taken to have selected that split screen. In this way the user does not need to touch the display screen to select a split screen, which is convenient for the user.
Once it is determined that the user has selected the split screen, eye-tracking technology is used to determine the user's eye motion information. The eye motion information corresponds to two states: one is the eyes-open state and the other is the eyes-closed state.
Step S300: when the eye motion information corresponds to the eyes-open state, determine whether the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold and, if so, display the split screen in which the current gaze coordinate lies in full screen.
In step S300, when the eye motion information is determined to correspond to the eyes-open state, this indicates that the focal point of the user's line of sight is still on that split screen and the user wants to look further at its content. To make this convenient, when the dwell time within the specified range of the current gaze coordinate is determined to exceed the preset full-screen time threshold, that split screen is maximized while the applications in the other split screens keep running normally in the background.
It can be seen that when the split screen is made full screen in step S300, the user again does not need to touch the display screen to maximize it, which is convenient for the user.
Further, after step S300 the method also comprises:
Step S400: when the eye motion information corresponds to the eyes-closed state, close the split screen in which the current gaze coordinate lies.
In step S200, the split screen is selected by gazing at one of the split screens for 5-10 seconds; at that point the user can close that split screen by closing the eyes. Here too, the display screen is operated through the motion of the user's eyes, with no manual operation by the user required.
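For illustration, the gaze-dwell logic of steps S200 to S400 can be modelled as a small state machine. The self-contained Kotlin sketch below is one possible reading of those steps under stated assumptions (an 80-pixel "specified range", 5-second thresholds, and a caller-supplied mapping from coordinates to split screens); none of these names or values are prescribed by the patent:

    import kotlin.math.hypot

    // Illustrative sketch only: models steps S200-S400 as a small dwell/eye-state machine.
    enum class EyeState { OPEN, CLOSED }

    data class GazeSample(val x: Float, val y: Float, val eyeState: EyeState, val timeMs: Long)

    sealed class Action {
        data class Select(val split: Int) : Action()      // step S200: split screen selected by dwell
        data class FullScreen(val split: Int) : Action()  // step S300: continued open-eye dwell
        data class Close(val split: Int) : Action()       // step S400: eyes-closed state
    }

    class SplitScreenController(
        private val splitOf: (Float, Float) -> Int,       // maps a gaze coordinate to a split-screen index
        private val focusThresholdMs: Long = 5_000,       // "focus time threshold" (5-10 s in the embodiment)
        private val fullScreenThresholdMs: Long = 5_000,  // "full-screen time threshold" (assumed value)
        private val rangePx: Float = 80f                  // "specified range" around the current gaze coordinate
    ) {
        private var anchor: GazeSample? = null            // first sample of the current dwell
        private var selected: Int? = null

        fun onGazeSample(s: GazeSample): Action? {
            val a = anchor
            if (a == null || hypot(s.x - a.x, s.y - a.y) > rangePx) {
                anchor = s                                // gaze moved: restart the dwell timer
                selected = null
                return null
            }
            val dwellMs = s.timeMs - a.timeMs
            if (selected == null) {
                if (dwellMs < focusThresholdMs) return null
                selected = splitOf(s.x, s.y)              // dwell exceeded the focus time threshold
                return Action.Select(selected!!)
            }
            val action = when (s.eyeState) {
                EyeState.CLOSED -> Action.Close(selected!!)
                EyeState.OPEN ->
                    if (dwellMs >= focusThresholdMs + fullScreenThresholdMs)
                        Action.FullScreen(selected!!) else null
            }
            if (action != null) { anchor = null; selected = null }
            return action
        }
    }

    fun main() {
        // Four equal quadrants on an assumed 1080x1920 display, as in the four-split embodiment.
        val controller = SplitScreenController(splitOf = { x, y ->
            (if (y < 960f) 1 else 3) + (if (x < 540f) 0 else 1)
        })
        var t = 0L
        repeat(12) {                                      // gaze held steady in split screen 2 with eyes open
            controller.onGazeSample(GazeSample(800f, 300f, EyeState.OPEN, t))?.let { action -> println(action) }
            t += 1_000
        }
    }

Running this prints a Select action once the 5-second focus threshold is reached and a FullScreen action after a further 5 seconds of open-eye dwell; feeding a CLOSED sample after selection would instead produce a Close action.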
Further, as shown in FIG. 2, the specific procedure in step S200 for determining the region containing the current gaze coordinate at which the user's eyes look at the display screen comprises:
Step S201: determine whether the current gaze coordinate at which the user's eyes look at the display screen is in the multi-screen area of the display screen or in the split-screen template area of the display screen.
Because the display region of each split screen in multi-screen mode has been preset in the mobile terminal, once the current gaze coordinate at which the user's eyes look at the display screen has been determined, the split screen in which the current gaze coordinate lies can be identified. To make the process of selecting a split screen through the focal point of the user's line of sight clearer, it is described below through a specific embodiment.
For example, as shown in FIG. 3, the display screen of the mobile terminal has entered multi-screen mode and includes four split screens, denoted split screen 1, split screen 2, split screen 3, and split screen 4, as well as a split-screen template area; in this example the current gaze coordinate at which the user's eyes look at the display screen is clearly within split screen 2.
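A minimal sketch of the region test behind this example is given below; the 1080x1920 geometry, the quadrant layout, and the placement of the split-screen template area along the bottom edge are assumptions made only to reproduce the FIG. 3 situation, in which the gaze coordinate resolves to split screen 2:

    // Illustrative only: a region test matching the FIG. 3 example, with the display divided into
    // four split screens plus a split-screen template area. The geometry is assumed for the example.

    data class Zone(val left: Float, val top: Float, val right: Float, val bottom: Float) {
        fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
    }

    sealed class Region {
        data class Split(val index: Int) : Region()                        // split screens 1-4
        object TemplateArea : Region() { override fun toString() = "TemplateArea" }
        object Outside : Region() { override fun toString() = "Outside" }
    }

    // Example layout: four equal quadrants above a template strip along the bottom edge.
    val splitZones = listOf(
        Zone(0f, 0f, 540f, 860f),       // split screen 1
        Zone(540f, 0f, 1080f, 860f),    // split screen 2
        Zone(0f, 860f, 540f, 1720f),    // split screen 3
        Zone(540f, 860f, 1080f, 1720f)  // split screen 4
    )
    val templateZone = Zone(0f, 1720f, 1080f, 1920f)

    fun regionOf(x: Float, y: Float): Region {
        splitZones.forEachIndexed { i, zone -> if (zone.contains(x, y)) return Region.Split(i + 1) }
        return if (templateZone.contains(x, y)) Region.TemplateArea else Region.Outside
    }

    fun main() {
        println(regionOf(800f, 300f))    // falls in split screen 2, as in the FIG. 3 example
        println(regionOf(500f, 1800f))   // falls in the split-screen template area
    }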
Step S202: if the current gaze coordinate is in the multi-screen area of the display screen, further determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, again use eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate.
When the user selects one of the split screens in the multi-screen area by gazing at it, it is determined whether the focal point of the user's line of sight (that is, the current gaze coordinate) is still within that split screen; if it is, it is determined whether the dwell time exceeds the full-screen time threshold (generally set to 5-10 seconds), and when it does the split screen is selected. At that point, eye-tracking technology is used again to determine the eye motion information within the specified range of the current gaze coordinate.
Step S203: if the current gaze coordinate is in the split-screen template area of the display screen, further determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, enter the preset split-screen template selection control interface.
After the split-screen template selection control interface is entered, the user can continue, in a manner similar to the selection of a split screen in step S200, to select one of a number of pre-stored split-screen templates. Each split-screen template includes information such as the number of split screens, the arrangement of the split screens, and the coordinate range of each split screen on the display.
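The template information listed above maps naturally onto a small data structure. The sketch below is only one possible shape for it; the field names and the two sample templates are assumptions, not definitions from the patent:

    // A sketch of the information the description says each pre-stored split-screen template
    // carries: the number of split screens, their arrangement, and the coordinate range each
    // split occupies on the display. Field names and sample values are assumptions.

    data class SplitRegion(val left: Float, val top: Float, val right: Float, val bottom: Float)

    data class SplitScreenTemplate(
        val splitCount: Int,            // number of split screens
        val arrangement: String,        // arrangement of the splits, e.g. "2x2 grid" or "top/bottom"
        val regions: List<SplitRegion>  // coordinate range of each split on the display
    )

    // Two example templates for an assumed 1080x1920 display.
    val presetTemplates = listOf(
        SplitScreenTemplate(4, "2x2 grid", listOf(
            SplitRegion(0f, 0f, 540f, 960f), SplitRegion(540f, 0f, 1080f, 960f),
            SplitRegion(0f, 960f, 540f, 1920f), SplitRegion(540f, 960f, 1080f, 1920f)
        )),
        SplitScreenTemplate(2, "top/bottom", listOf(
            SplitRegion(0f, 0f, 1080f, 960f), SplitRegion(0f, 960f, 1080f, 1920f)
        ))
    )

    fun main() {
        presetTemplates.forEach { println("${it.arrangement}: ${it.splitCount} splits") }
    }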
Further, step S300 also comprises: when full-screen display is entered, eye-tracking technology is used again to determine the current gaze coordinate at which the user's eyes look at the display screen; if the user's current gaze coordinate is determined to be within a designated area of the display screen and the dwell time within the specified range of the current gaze coordinate exceeds the focus time threshold, the full-screen display is exited and the multi-screen mode is restored.
After the split screen selected by the user has entered full-screen display, when the user wants to exit full screen, the user moves the current gaze coordinate to a designated area of the full-screened split screen (for example a square area in the upper-left corner or a square area in the upper-right corner); when the dwell time within the specified range of the current gaze coordinate exceeds the focus time threshold (5-10 seconds), the full-screen display is exited and the multi-screen mode is restored.
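The exit condition described here amounts to a dwell check restricted to one or more designated corner areas. A hedged Kotlin sketch of such a check follows; the corner size, the sample coordinates, and the 5-second threshold are example values only:

    // Illustrative sketch: while a split screen is shown full screen, exit when the gaze stays
    // inside a designated corner area longer than the focus time threshold.

    data class Corner(val left: Float, val top: Float, val right: Float, val bottom: Float)

    class FullScreenExitDetector(
        private val designatedAreas: List<Corner>,
        private val focusThresholdMs: Long = 5_000      // example threshold within the 5-10 s range
    ) {
        private var enteredAtMs: Long? = null

        /** Returns true once the gaze has stayed inside a designated area longer than the threshold. */
        fun shouldExit(x: Float, y: Float, timeMs: Long): Boolean {
            val inside = designatedAreas.any { x in it.left..it.right && y in it.top..it.bottom }
            if (!inside) { enteredAtMs = null; return false }
            val entered = enteredAtMs ?: timeMs.also { enteredAtMs = it }
            return timeMs - entered >= focusThresholdMs
        }
    }

    fun main() {
        // Assumed 150-pixel squares in the upper-left and upper-right corners of a 1080-wide display.
        val detector = FullScreenExitDetector(
            listOf(Corner(0f, 0f, 150f, 150f), Corner(930f, 0f, 1080f, 150f))
        )
        for (t in 0L..6_000L step 1_000) {
            println("t=${t}ms exit=${detector.shouldExit(50f, 50f, t)}")  // gaze parked in the upper-left corner
        }
    }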
It can be seen that the present invention obtains the current gaze coordinate and the corresponding dwell time through eye-tracking technology, and also obtains the eye motion information, and from these decides whether to make the currently selected split screen full screen or to exit full screen, which is convenient for the user.
Based on the above method embodiments, the present invention also provides a multi-screen display control system based on eye-tracking technology. As shown in FIG. 4, the multi-screen display control system based on eye-tracking technology comprises:
a positioning module 100, configured to determine, when the display screen of the mobile terminal is in multi-screen mode, the current gaze coordinate at which the user's eyes look at the display screen by acquiring images of the user's eyes in real time and applying eye-tracking technology, specifically as described above;
a judging and acquiring module 200, configured to determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, to use eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate, specifically as described above;
a full-screen control module 300, configured to determine, when the eye motion information corresponds to the eyes-open state, whether the dwell time within the specified range of the current gaze coordinate exceeds the preset full-screen time threshold and, if so, to display the split screen in which the current gaze coordinate lies in full screen, specifically as described above.
Further, the multi-screen display control system based on eye-tracking technology also comprises:
a split-screen close control module, configured to close the split screen in which the current gaze coordinate lies when the eye motion information corresponds to the eyes-closed state, specifically as described above.
Further, in the multi-screen display control system based on eye-tracking technology, the judging and acquiring module 200 specifically comprises:
an area judging unit, configured to determine whether the current gaze coordinate at which the user's eyes look at the display screen is in the multi-screen area of the display screen or in the split-screen template area of the display screen, specifically as described above;
a first control unit, configured to, if the current gaze coordinate is in the multi-screen area of the display screen, further determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, again use eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate, specifically as described above;
a second control unit, configured to, if the current gaze coordinate is in the split-screen template area of the display screen, further determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, enter the preset split-screen template selection control interface, specifically as described above.
Further, in the multi-screen display control system based on eye-tracking technology, the full-screen control module 300 is further configured to, when full-screen display is entered, use eye-tracking technology to determine the current gaze coordinate at which the user's eyes look at the display screen and, if the user's current gaze coordinate is determined to be within a designated area of the display screen and the dwell time within the specified range of the current gaze coordinate exceeds the focus time threshold, to exit the full-screen display and restore the multi-screen mode, specifically as described above.
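As a rough architectural sketch, the modules named in this system embodiment could be expressed as the following Kotlin interfaces and a small coordinator; the signatures are assumptions chosen for illustration and do not reflect any actual implementation of the patent:

    // Hypothetical interface names mirroring the modules of the system embodiment
    // (positioning module, judging-and-acquiring module, full-screen control module,
    // split-screen close control module). Signatures are assumptions.

    data class Gaze(val x: Float, val y: Float, val timeMs: Long)
    enum class EyeAction { EYES_OPEN, EYES_CLOSED }

    interface PositioningModule {                 // module 100: gaze coordinate from real-time eye images
        fun currentGaze(): Gaze?
    }
    interface JudgingAndAcquiringModule {         // module 200: dwell check, then eye motion information
        fun eyeActionIfDwellExceeded(gaze: Gaze, focusThresholdMs: Long): EyeAction?
    }
    interface FullScreenControlModule {           // module 300: full-screen the split on continued open-eye dwell
        fun maybeFullScreen(gaze: Gaze, fullScreenThresholdMs: Long)
    }
    interface SplitScreenCloseControlModule {     // closes the split when the eyes-closed state is detected
        fun closeSplitAt(gaze: Gaze)
    }

    class MultiScreenControlSystem(
        private val positioning: PositioningModule,
        private val judging: JudgingAndAcquiringModule,
        private val fullScreen: FullScreenControlModule,
        private val closer: SplitScreenCloseControlModule
    ) {
        fun tick(focusThresholdMs: Long = 5_000, fullScreenThresholdMs: Long = 5_000) {
            val gaze = positioning.currentGaze() ?: return
            when (judging.eyeActionIfDwellExceeded(gaze, focusThresholdMs)) {
                EyeAction.EYES_OPEN -> fullScreen.maybeFullScreen(gaze, fullScreenThresholdMs)
                EyeAction.EYES_CLOSED -> closer.closeSplitAt(gaze)
                null -> Unit                      // dwell threshold not yet exceeded
            }
        }
    }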
In summary, the present invention provides a multi-screen display control method and system based on eye-tracking technology. The method comprises: when the display screen of the mobile terminal is in multi-screen mode, determining the current gaze coordinate at which the user's eyes look at the display screen by acquiring images of the user's eyes in real time and applying eye-tracking technology; determining whether the dwell time within a specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, again using eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate; and, when the eye motion information corresponds to the eyes-open state, determining whether the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold and, if so, displaying the split screen in which the current gaze coordinate lies in full screen. The present invention obtains the current gaze coordinate and the corresponding dwell time through eye-tracking technology, and also obtains the eye motion information, and from these decides whether to make the current split screen full screen or to exit full screen, which is convenient for the user.
It should be understood that the application of the present invention is not limited to the above examples; a person of ordinary skill in the art may make improvements or modifications in light of the above description, and all such improvements and modifications fall within the scope of protection of the claims appended to the present invention.

Claims (16)

  1. A multi-screen display control method based on eye-tracking technology, wherein the method comprises the following steps:
    A. when the display screen of a mobile terminal is in multi-screen mode, determining, by acquiring images of the user's eyes in real time and applying eye-tracking technology, the current gaze coordinate at which the user's eyes look at the display screen;
    B. determining whether the dwell time within a specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, using eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
    C. when the eye motion information corresponds to the eyes-open state, determining whether the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold and, if so, displaying the split screen in which the current gaze coordinate lies in full screen; and, when full-screen display is entered, using eye-tracking technology to determine the current gaze coordinate at which the user's eyes look at the display screen and, if the user's current gaze coordinate is determined to be within a designated area of the display screen and the dwell time within the specified range of the current gaze coordinate exceeds the focus time threshold, exiting the full-screen display and restoring the multi-screen mode;
    D. when the eye motion information corresponds to the eyes-closed state, closing the split screen in which the current gaze coordinate lies.
  2. The multi-screen display control method based on eye-tracking technology according to claim 1, wherein step B specifically comprises:
    B1. determining whether the current gaze coordinate at which the user's eyes look at the display screen is in the multi-screen area of the display screen or in the split-screen template area of the display screen;
    B2. if the current gaze coordinate is in the multi-screen area of the display screen, further determining whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, using eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
    B3. if the current gaze coordinate is in the split-screen template area of the display screen, further determining whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, entering a preset split-screen template selection control interface.
  3. The multi-screen display control method based on eye-tracking technology according to claim 1, wherein the focus time threshold is 5-10 seconds.
  4. The multi-screen display control method based on eye-tracking technology according to claim 1, wherein, in the multi-screen mode, the display screen of the mobile terminal comprises four split screens.
  5. A multi-screen display control method based on eye-tracking technology, wherein the method comprises the following steps:
    A. when the display screen of a mobile terminal is in multi-screen mode, determining, by acquiring images of the user's eyes in real time and applying eye-tracking technology, the current gaze coordinate at which the user's eyes look at the display screen;
    B. determining whether the dwell time within a specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, using eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
    C. when the eye motion information corresponds to the eyes-open state, determining whether the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold and, if so, displaying the split screen in which the current gaze coordinate lies in full screen.
  6. The multi-screen display control method based on eye-tracking technology according to claim 5, wherein the method further comprises, after step C:
    D. when the eye motion information corresponds to the eyes-closed state, closing the split screen in which the current gaze coordinate lies.
  7. The multi-screen display control method based on eye-tracking technology according to claim 6, wherein step B specifically comprises:
    B1. determining whether the current gaze coordinate at which the user's eyes look at the display screen is in the multi-screen area of the display screen or in the split-screen template area of the display screen;
    B2. if the current gaze coordinate is in the multi-screen area of the display screen, further determining whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, using eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
    B3. if the current gaze coordinate is in the split-screen template area of the display screen, further determining whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, entering a preset split-screen template selection control interface.
  8. The multi-screen display control method based on eye-tracking technology according to claim 5, wherein step C further comprises: when full-screen display is entered, using eye-tracking technology to determine the current gaze coordinate at which the user's eyes look at the display screen and, if the user's current gaze coordinate is determined to be within a designated area of the display screen and the dwell time within the specified range of the current gaze coordinate exceeds the focus time threshold, exiting the full-screen display and restoring the multi-screen mode.
  9. The multi-screen display control method based on eye-tracking technology according to claim 5, wherein the focus time threshold is 5-10 seconds.
  10. The multi-screen display control method based on eye-tracking technology according to claim 5, wherein, in the multi-screen mode, the display screen of the mobile terminal comprises four split screens.
  11. A multi-screen display control system based on eye-tracking technology, comprising:
    a positioning module, configured to determine, when the display screen of a mobile terminal is in multi-screen mode, the current gaze coordinate at which the user's eyes look at the display screen by acquiring images of the user's eyes in real time and applying eye-tracking technology;
    a judging and acquiring module, configured to determine whether the dwell time within a specified range of the current gaze coordinate exceeds a preset focus time threshold and, if so, to use eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
    a full-screen control module, configured to determine, when the eye motion information corresponds to the eyes-open state, whether the dwell time within the specified range of the current gaze coordinate exceeds a preset full-screen time threshold and, if so, to display the split screen in which the current gaze coordinate lies in full screen.
  12. The multi-screen display control system based on eye-tracking technology according to claim 11, further comprising:
    a split-screen close control module, configured to close the split screen in which the current gaze coordinate lies when the eye motion information corresponds to the eyes-closed state.
  13. The multi-screen display control system based on eye-tracking technology according to claim 11, wherein the judging and acquiring module specifically comprises:
    an area judging unit, configured to determine whether the current gaze coordinate at which the user's eyes look at the display screen is in the multi-screen area of the display screen or in the split-screen template area of the display screen;
    a first control unit, configured to, if the current gaze coordinate is in the multi-screen area of the display screen, further determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, again use eye-tracking technology to determine the eye motion information within the specified range of the current gaze coordinate;
    a second control unit, configured to, if the current gaze coordinate is in the split-screen template area of the display screen, further determine whether the dwell time within the specified range of the current gaze coordinate exceeds the preset focus time threshold and, if so, enter a preset split-screen template selection control interface.
  14. The multi-screen display control system based on eye-tracking technology according to claim 11, wherein the full-screen control module is further configured to, when full-screen display is entered, use eye-tracking technology to determine the current gaze coordinate at which the user's eyes look at the display screen and, if the user's current gaze coordinate is determined to be within a designated area of the display screen and the dwell time within the specified range of the current gaze coordinate exceeds the focus time threshold, to exit the full-screen display and restore the multi-screen mode.
  15. The multi-screen display control system based on eye-tracking technology according to claim 11, wherein the focus time threshold is 5-10 seconds.
  16. The multi-screen display control system based on eye-tracking technology according to claim 11, wherein, in the multi-screen mode, the display screen of the mobile terminal comprises four split screens.
PCT/CN2015/091524 2015-05-04 2015-10-09 一种基于眼球追踪技术的显示屏多屏控制方法及系统 WO2016176959A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/117,891 US10802581B2 (en) 2015-05-04 2015-10-09 Eye-tracking-based methods and systems of managing multi-screen view on a single display screen
EP15877369.7A EP3293620B1 (en) 2015-05-04 2015-10-09 Multi-screen control method and system for display screen based on eyeball tracing technology

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510219997.4 2015-05-04
CN201510219997.4A CN104834446B (zh) 2015-05-04 2015-05-04 一种基于眼球追踪技术的显示屏多屏控制方法及系统

Publications (1)

Publication Number Publication Date
WO2016176959A1 true WO2016176959A1 (zh) 2016-11-10

Family

ID=53812370

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/091524 WO2016176959A1 (zh) 2015-05-04 2015-10-09 一种基于眼球追踪技术的显示屏多屏控制方法及系统

Country Status (4)

Country Link
US (1) US10802581B2 (zh)
EP (1) EP3293620B1 (zh)
CN (1) CN104834446B (zh)
WO (1) WO2016176959A1 (zh)

Families Citing this family (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104834446B (zh) 2015-05-04 2018-10-26 惠州Tcl移动通信有限公司 一种基于眼球追踪技术的显示屏多屏控制方法及系统
US10007845B2 (en) * 2015-07-06 2018-06-26 Pixart Imaging Inc. Eye state detecting method and eye state detecting system
CN105334961A (zh) * 2015-10-27 2016-02-17 惠州Tcl移动通信有限公司 一种基于眼球追踪的移动终端控制方法及移动终端
CN106791353B (zh) * 2015-12-16 2019-06-14 深圳市汇顶科技股份有限公司 自动对焦的方法、装置和系统
CN106231175A (zh) * 2016-07-09 2016-12-14 东莞市华睿电子科技有限公司 一种终端手势拍照的方法
KR20190049784A (ko) 2016-09-22 2019-05-09 애플 인크. 무주의 조건 동안 그래픽 사용자 인터페이스에 영향을 미치는 정보의 상태 변화 연기
CN106534879B (zh) * 2016-11-08 2020-02-07 天脉聚源(北京)传媒科技有限公司 一种基于关注度的直播切换方法及系统
CN106504722B (zh) 2017-01-12 2019-10-01 京东方科技集团股份有限公司 一种goa分区驱动方法和装置、goa单元
CN108733203A (zh) * 2017-04-20 2018-11-02 上海耕岩智能科技有限公司 一种眼球追踪操作的方法和装置
CN107333177A (zh) * 2017-06-01 2017-11-07 梁小红 一种智能电视退出全屏播放的方法及装置
WO2018231245A1 (en) * 2017-06-16 2018-12-20 Hewlett-Packard Development Company, L.P. Displaying images from multiple devices
CN107297572A (zh) * 2017-06-19 2017-10-27 余姚市德沃斯模具科技有限公司 一种注塑模具缝线纹理的制作工艺及系统
CN107390870A (zh) * 2017-07-18 2017-11-24 福建捷联电子有限公司 基于眼球追踪的显示器背光调整及自动开关机方法
US10948983B2 (en) * 2018-03-21 2021-03-16 Samsung Electronics Co., Ltd. System and method for utilizing gaze tracking and focal point tracking
CN108478184A (zh) * 2018-04-26 2018-09-04 京东方科技集团股份有限公司 基于vr的视力测量方法及装置、vr设备
CN109040427B (zh) * 2018-07-03 2020-08-04 Oppo广东移动通信有限公司 分屏处理方法、装置、存储介质和电子设备
CN108845755A (zh) * 2018-07-03 2018-11-20 Oppo广东移动通信有限公司 分屏处理方法、装置、存储介质及电子设备
CN108958587B (zh) * 2018-07-06 2020-09-08 Oppo广东移动通信有限公司 分屏处理方法、装置、存储介质和电子设备
CN109062409A (zh) * 2018-07-27 2018-12-21 华勤通讯技术有限公司 客户端的控制方法及系统、移动终端
CN109445736A (zh) * 2018-10-26 2019-03-08 维沃移动通信有限公司 一种分屏显示方法及移动终端
CN109508092A (zh) * 2018-11-08 2019-03-22 北京七鑫易维信息技术有限公司 基于眼球追踪控制终端设备的方法、装置和终端
US11012750B2 (en) * 2018-11-14 2021-05-18 Rohde & Schwarz Gmbh & Co. Kg Method for configuring a multiviewer as well as multiviewer
CN109782968B (zh) * 2018-12-18 2020-11-06 维沃移动通信有限公司 一种界面调整方法及终端设备
CN109885167B (zh) * 2019-02-25 2022-04-01 北京七鑫易维信息技术有限公司 数据处理方法、数据传输方法及装置
CN109919065A (zh) * 2019-02-26 2019-06-21 浪潮金融信息技术有限公司 一种使用眼球追踪技术在屏幕上获取关注点的方法
CN110266881B (zh) * 2019-06-18 2021-03-12 Oppo广东移动通信有限公司 应用控制方法及相关产品
CN112540084A (zh) * 2019-09-20 2021-03-23 联策科技股份有限公司 外观检查系统与检查方法
CN110955922B (zh) * 2019-12-17 2023-03-21 联想(北京)有限公司 一种显示方法及显示装置
CN111667265A (zh) * 2020-02-20 2020-09-15 中国银联股份有限公司 基于眼球追踪的信息处理方法及系统、支付处理方法
JP2021140590A (ja) * 2020-03-06 2021-09-16 キヤノン株式会社 電子機器、電子機器の制御方法、プログラム、記憶媒体
CN114257824B (zh) * 2021-11-25 2024-03-19 广州方硅信息技术有限公司 直播显示方法、装置、存储介质及计算机设备

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DK2202609T3 (en) * 2004-06-18 2016-04-25 Tobii Ab Eye control of computer equipment
JP2010004118A (ja) * 2008-06-18 2010-01-07 Olympus Corp デジタルフォトフレーム、情報処理システム、制御方法、プログラム及び情報記憶媒体
CN101866215B (zh) * 2010-04-20 2013-10-16 复旦大学 在视频监控中采用视线跟踪的人机交互装置和方法
US10013976B2 (en) * 2010-09-20 2018-07-03 Kopin Corporation Context sensitive overlays in voice controlled headset computer displays
US20120169582A1 (en) * 2011-01-05 2012-07-05 Visteon Global Technologies System ready switch for eye tracking human machine interaction control system
US9285874B2 (en) * 2011-02-09 2016-03-15 Apple Inc. Gaze detection in a 3D mapping environment
US9081416B2 (en) * 2011-03-24 2015-07-14 Seiko Epson Corporation Device, head mounted display, control method of device and control method of head mounted display
US9766698B2 (en) * 2011-05-05 2017-09-19 Nokia Technologies Oy Methods and apparatuses for defining the active channel in a stereoscopic view by using eye tracking
US10120438B2 (en) * 2011-05-25 2018-11-06 Sony Interactive Entertainment Inc. Eye gaze to alter device behavior
US9383579B2 (en) * 2011-10-12 2016-07-05 Visteon Global Technologies, Inc. Method of controlling a display component of an adaptive display system
JP5936379B2 (ja) * 2012-02-07 2016-06-22 シャープ株式会社 画像表示装置
US9823742B2 (en) * 2012-05-18 2017-11-21 Microsoft Technology Licensing, Llc Interaction and management of devices using gaze detection
US9007301B1 (en) * 2012-10-11 2015-04-14 Google Inc. User interface
US20150234457A1 (en) * 2012-10-15 2015-08-20 Umoove Services Ltd. System and method for content provision using gaze analysis
US9626072B2 (en) * 2012-11-07 2017-04-18 Honda Motor Co., Ltd. Eye gaze control system
JP2014157466A (ja) * 2013-02-15 2014-08-28 Sony Corp 情報処理装置及び記憶媒体
KR20150007910A (ko) * 2013-07-11 2015-01-21 삼성전자주식회사 사용자 인터렉션을 제공하는 사용자 단말 장치 및 그 방법
EP3090322A4 (en) * 2013-12-31 2017-07-19 Eyefluence, Inc. Systems and methods for gaze-based media selection and editing
US9836639B2 (en) * 2014-01-10 2017-12-05 Facebook, Inc. Systems and methods of light modulation in eye tracking devices
CN103853330B (zh) * 2014-03-05 2017-12-01 努比亚技术有限公司 基于眼睛控制显示层切换的方法和移动终端
US9744853B2 (en) * 2014-12-30 2017-08-29 Visteon Global Technologies, Inc. System and method of tracking with associated sensory feedback
US10242379B2 (en) * 2015-01-30 2019-03-26 Adobe Inc. Tracking visual gaze information for controlling content display

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101813976A (zh) * 2010-03-09 2010-08-25 华南理工大学 基于soc的视线跟踪人机交互方法及装置
WO2014019307A1 (zh) * 2012-08-02 2014-02-06 百度在线网络技术(北京)有限公司 具有多屏幕的App的开发方法、切换控制方法及装置
CN103593051A (zh) * 2013-11-11 2014-02-19 百度在线网络技术(北京)有限公司 头戴式显示设备
CN103645806A (zh) * 2013-12-24 2014-03-19 惠州Tcl移动通信有限公司 一种基于眼球追踪的商品浏览方法及系统
CN104571528A (zh) * 2015-01-27 2015-04-29 王露 一种基于眼球追踪实现眼球控制智能终端的设备及方法
CN104834446A (zh) * 2015-05-04 2015-08-12 惠州Tcl移动通信有限公司 一种基于眼球追踪技术的显示屏多屏控制方法及系统

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3293620A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113760083A (zh) * 2020-06-01 2021-12-07 张也弛 操作者目光在终端设备屏幕上落点位置的确定方法及装置
CN113173072A (zh) * 2021-04-29 2021-07-27 恒大新能源汽车投资控股集团有限公司 一种屏幕转向控制方法、装置和系统

Also Published As

Publication number Publication date
EP3293620B1 (en) 2024-01-03
US20170160799A1 (en) 2017-06-08
EP3293620A1 (en) 2018-03-14
CN104834446B (zh) 2018-10-26
CN104834446A (zh) 2015-08-12
US10802581B2 (en) 2020-10-13
EP3293620A4 (en) 2018-05-02

Similar Documents

Publication Publication Date Title
WO2016176959A1 (zh) 一种基于眼球追踪技术的显示屏多屏控制方法及系统
WO2021020667A1 (ko) 원격 재활 훈련 제공 방법 및 프로그램
WO2018062658A1 (en) Display apparatus and controlling method thereof
WO2018128472A1 (en) Virtual reality experience sharing
WO2014133277A1 (en) Apparatus and method for processing an image in device
WO2017075973A1 (zh) 无人机操控界面交互方法、便携式电子设备和存储介质
WO2016052778A1 (ko) 포터블 디바이스 및 그 제어 방법
WO2015161697A1 (zh) 应用于人机交互的运动物体跟踪方法及系统
WO2013129792A1 (en) Method and portable terminal for correcting gaze direction of user in image
WO2014148696A1 (en) Display device detecting gaze location and method for controlling thereof
WO2015120673A1 (zh) 利用眼球跟踪技术控制拍照对焦的方法、系统及拍照设备
WO2014104521A1 (en) Image transformation apparatus and method
WO2015122616A1 (en) Photographing method of an electronic device and the electronic device thereof
WO2015046677A1 (en) Head-mounted display and method of controlling the same
WO2017181686A1 (zh) 一种移动终端视频通讯中画面角度自动修正方法及系统
WO2015030307A1 (en) Head mounted display device and method for controlling the same
WO2017142223A1 (en) Remote image transmission system, display apparatus, and guide displaying method thereof
WO2018040443A1 (zh) 一种拍照方法和装置
WO2018040269A1 (zh) 一种图像处理方法及终端
WO2018088730A1 (en) Display apparatus and control method thereof
EP3047640A1 (en) Portable device and control method thereof
WO2017128853A1 (zh) 摄制视频的处理方法、装置和设备
WO2018006280A1 (zh) 页面切换方法、装置、终端以及存储介质
WO2018004036A1 (ko) 모니터링 장치 및 시스템
WO2019164145A1 (ko) 전자 장치 및 그의 자세 교정 방법

Legal Events

Date Code Title Description
REEP Request for entry into the european phase

Ref document number: 2015877369

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 15117891

Country of ref document: US

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15877369

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2015877369

Country of ref document: EP