WO2019214329A1 - Control method and apparatus for a terminal, and terminal - Google Patents

Control method and apparatus for a terminal, and terminal

Info

Publication number
WO2019214329A1
WO2019214329A1, PCT/CN2019/077165, CN2019077165W
Authority
WO
WIPO (PCT)
Prior art keywords
touch
terminal
gaze point
instruction
point information
Application number
PCT/CN2019/077165
Other languages
English (en)
French (fr)
Inventor
黄通兵
秦林婵
杨川
Original Assignee
北京七鑫易维信息技术有限公司
Application filed by 北京七鑫易维信息技术有限公司
Publication of WO2019214329A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Definitions

  • the present invention relates to the field of computers, and in particular to a method, device, and terminal for controlling a terminal.
  • Multi-channel interactive integration uses new interactive channels, devices and interactive technologies such as line of sight, voice, and gestures, enabling users to use multiple channels to conduct human-machine dialogue in a natural, parallel, and collaborative manner.
  • In multi-channel human-computer interaction, naturalness of interaction, interaction efficiency, and compatibility with traditional interfaces are especially important.
  • Eye tracking technology uses an eye-movement measurement device to capture images of the user's eyes at the millisecond level; by analyzing the relative positions of features such as the pupil contour, the iris contour, the pupil center, the iris center, and the reflection points of external light sources on the cornea, it estimates the gaze direction or the gaze point (where the line of sight falls).
  • eye movement analysis technology is widely used in novel human-computer interaction (typing and playing games with the eyes), medical diagnosis, user experience research, psychological and cognitive research, and educational assistance.
  • Some existing models of Apple, Huawei, and Samsung mobile phones have eye tracking functions, supporting applications such as eye-movement unlocking, sleeping when no eye movement is detected, and eye-movement page turning. There are also communication-aid products that use gaze input to help people with disabilities communicate with the outside world.
  • however, because gaze is not a deterministic instruction, the computer cannot easily tell whether the user's gaze dwells on a target in order to operate it or merely to observe it; this is the Midas touch problem.
  • in addition, the eye tracker has inherent imprecision and cannot guarantee that the calculated gaze point coincides 100% with the actual gaze point; especially in complex usage environments such as mobile phones and PADs, the eye-movement device's estimate of the gaze point is often inaccurate, resulting in a poor user experience.
  • previous methods use a gaze-duration threshold, which is relatively inefficient and prone to erroneous operation.
  • the embodiments of the invention provide a control method and apparatus for a terminal, and a terminal, so as to solve at least the technical problem that a deterministic instruction cannot be accurately generated from gaze point information.
  • a method for controlling a terminal includes: acquiring gaze point information for the terminal; acquiring a first touch instruction generated by performing a touch operation on the touch device; determining a control instruction for the terminal according to the gaze point information and the first touch instruction; and controlling the terminal to perform an operation corresponding to the control instruction.
  • the method further includes: calibrating the gaze point information.
  • calibrating the gaze point information includes: determining the initial position of the gaze point corresponding to a target object; adjusting the position of the indicator so that the indicator coincides with the target object; and using the adjusted position of the indicator as the calibrated gaze point information, where the indicator is set to indicate the position of the gaze point.
  • adjusting the position of the indicator comprises: acquiring a second touch instruction generated by performing a touch operation on the touch device; and adjusting a position of the indicator according to the second touch instruction.
  • a storage medium is provided, including a stored program, where, when the program runs, the device on which the storage medium resides is controlled to perform the control method of the terminal described above.
  • a processor is provided, configured to execute a program, where the program, when executed, performs the control method of the terminal described above.
  • a control device for a terminal is provided, including: a first acquiring unit configured to acquire gaze point information for the terminal; a second acquiring unit configured to acquire a first touch instruction generated by performing a touch operation on the touch device; a determining unit configured to determine a control instruction for the terminal according to the gaze point information and the first touch instruction; and a control unit configured to control the terminal to perform the operation corresponding to the control instruction.
  • the apparatus further includes: a calibration unit configured to calibrate the gaze point information before determining a control instruction to the terminal according to the gaze point information and the touch operation instruction.
  • the calibration unit includes: a determining module configured to determine an initial position of a gaze point corresponding to the target object; and an adjustment module configured to adjust a position of the pointer to cause the indicator to coincide with the target object, And adjusting the position of the indicator as the gaze point information after calibration, wherein the indicator is used to indicate the location of the gaze point.
  • the adjustment module includes: an acquisition module configured to acquire a second touch instruction generated by performing a touch operation on the touch device; and an adjustment submodule configured to adjust a position of the indicator according to the second touch instruction.
  • a terminal is provided, including: an eye tracking device configured to acquire gaze point information for the terminal; a touch device configured to acquire a first touch instruction generated by a touch operation performed on the touch device; and a processor configured to determine a control instruction for the terminal according to the gaze point information and the first touch instruction, and to control the terminal to perform an operation corresponding to the control instruction.
  • the eye tracking device is detachably disposed on the terminal; or the eye tracking device is fixedly mounted on the terminal.
  • the touch device is fixedly mounted on the terminal.
  • the touch device includes a touch screen of the terminal; and/or the touch device includes a touch panel disposed on a back of the terminal.
  • the user's gaze point information for the terminal is acquired, along with the first touch instruction generated by the user's touch operation on the touch device; the control instruction issued by the user to the terminal is then determined according to the gaze point information and the first touch instruction.
  • the terminal can thus execute the operation corresponding to the control instruction, completing the instruction issued by the user and achieving the technical effect of accurately generating a deterministic instruction from the user's gaze point information, thereby solving the technical problem that a deterministic instruction cannot be accurately generated from gaze point information alone.
  • FIG. 1 is a flowchart of a method for controlling a terminal according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of a control device for a terminal according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of a terminal according to an embodiment of the present invention.
  • FIG. 4(a) is a first schematic diagram of a terminal-set fixed eye tracking device according to an embodiment of the invention.
  • FIG. 4(b) is a second schematic diagram of a terminal-set fixed eye tracking device according to an embodiment of the invention.
  • FIG. 5(a) is a first schematic diagram of a terminal detachable eye tracking device according to an embodiment of the invention.
  • FIG. 5(b) is a second schematic diagram of a terminal detachable eye tracking device according to an embodiment of the invention.
  • an embodiment of a method for controlling a terminal is provided. It should be noted that the steps shown in the flowchart of the accompanying drawings may be executed in a computer system as a set of computer-executable instructions; and although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from the one described here.
  • FIG. 1 is a flowchart of a method for controlling a terminal according to an embodiment of the present invention. As shown in FIG. 1, the method includes the following steps:
  • Step S102: acquire gaze point information for the terminal.
  • Step S104: acquire a first touch instruction generated by performing a touch operation on the touch device.
  • Step S106: determine a control instruction for the terminal according to the gaze point information and the first touch instruction.
  • Step S108: control the terminal to perform an operation corresponding to the control instruction.
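The S102-S108 flow above can be sketched in a few lines of Python. The screen regions, instruction names, and mapping below are illustrative assumptions, not values taken from the patent:

```python
# Sketch of steps S102-S108: the gaze point picks the target, a touch
# instruction confirms it, and the pair jointly selects the control
# instruction. Region names, rectangles, and instruction strings are assumed.

GAZE_REGIONS = {
    "icon_mail": (0, 0, 100, 100),      # (x0, y0, x1, y1) screen rectangles
    "icon_book": (100, 0, 200, 100),
}

CONTROL_MAP = {
    ("icon_mail", "tap"): "open_mail",
    ("icon_book", "tap"): "open_book",
    ("icon_book", "slide_left"): "page_back",
}

def locate_region(gaze_point):
    """S102: resolve the gaze point to the on-screen element being looked at."""
    x, y = gaze_point
    for name, (x0, y0, x1, y1) in GAZE_REGIONS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def determine_control_instruction(gaze_point, touch_instruction):
    """S104-S106: the touch instruction confirms the gaze target; the pair
    (gaze region, touch instruction) jointly determines the control
    instruction. Returns None when no mapping applies."""
    region = locate_region(gaze_point)
    return CONTROL_MAP.get((region, touch_instruction))
```

In S108 the terminal would then dispatch the returned instruction; gazing at the book icon and tapping, for instance, yields `open_book` under these assumed tables.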
  • the terminal can execute the operation corresponding to the control instruction, complete the instruction issued by the user, and realize the technical effect of accurately generating the deterministic instruction according to the gaze point information of the user, thereby solving the technical problem that the deterministic instruction cannot be accurately generated according to the gaze point information.
  • the gaze point information may be the position, collected by the eye tracking device, at which the user is gazing while looking at the terminal.
  • the eye tracking device includes an infrared device, a camera, and a processor.
  • the terminal includes: a mobile phone, a tablet computer (PAD), and a notebook computer.
  • the eye tracking device determines the gaze point of the user for the terminal by collecting an image of the user's eyeball.
  • the touch device may be disposed on the terminal, such as a touch screen or a touch panel fixed on the terminal, and the user generates a touch instruction by touching the touch device.
  • the gaze point information at the current moment is determined as the instruction information, and the control instruction is determined according to the first touch instruction together with this instruction information. The control instruction is an instruction used to control the terminal, jointly determined by the gaze point information and the first touch instruction.
  • the mapping relationship among the gaze point information, the first touch instruction, and the control instruction may be stored in a storage device (such as a cache or a database) in advance.
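As one concrete (assumed) realization of storing the mapping in advance, the triples could live in a small database. This sketch uses Python's sqlite3; the table layout and the example entries are invented for illustration:

```python
# Store (gaze target, touch instruction) -> control instruction triples in a
# database, as one possible "storage device" realization. Table layout and
# entries are assumptions, not specified by the patent.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE control_map ("
    " gaze_target TEXT, touch_instruction TEXT, control_instruction TEXT,"
    " PRIMARY KEY (gaze_target, touch_instruction))"
)
conn.executemany(
    "INSERT INTO control_map VALUES (?, ?, ?)",
    [
        ("app_icon", "tap", "open"),          # gaze at icon + tap -> open
        ("page", "slide_left", "page_back"),  # gaze on page + slide -> turn
        ("icon", "press_move", "drag"),       # gaze + press-and-move -> drag
    ],
)

def lookup(gaze_target, touch_instruction):
    """Resolve (gaze target, first touch instruction) to a control instruction."""
    row = conn.execute(
        "SELECT control_instruction FROM control_map"
        " WHERE gaze_target = ? AND touch_instruction = ?",
        (gaze_target, touch_instruction),
    ).fetchone()
    return row[0] if row else None
```

A lookup such as `lookup("app_icon", "tap")` would return `"open"` with this assumed table; an unmapped pair returns `None`.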
  • the first touch instruction may be generated by tapping, pressing, or sliding on the touch device.
  • the terminal may perform an operation corresponding to the control instruction in response to the control instruction.
  • by combining the gaze point information with the first touch instruction, the first touch instruction can serve as the confirmation signal for the gaze point information, so that a deterministic instruction can be accurately generated from the gaze point and the user's operation can be accurately confirmed.
  • in addition, when the first touch instruction generated by the touch device is used as the confirmation signal for the gaze point information, the confirmation action does not cause the terminal to shake, thereby avoiding shaking of the gaze point and yielding accurate gaze point information.
  • the embodiment may further include: calibrating the gaze point information.
  • the gaze point information may be first calibrated to obtain accurate gaze point information.
  • calibrating the gaze point information can be accomplished by calibrating the location of the gaze point.
  • the gaze point of the terminal can be collected by the eye tracking device, and then the calibration of the gaze information can be achieved by calibrating the gaze point position.
  • the calibrating the gaze point information includes: determining an initial position of the gaze point corresponding to the target object; adjusting the position of the pointer, causing the pointer to coincide with the target object, and adjusting the indicator The position is used as the gaze point information after the calibration, wherein the indicator is used to indicate the location of the gaze point.
  • specifically, the terminal displays a target object; when the user gazes at the target object, the terminal computes the resulting gaze point and takes its position as the initial position. The position of the gaze point is marked by the indicator; the position of the indicator is then adjusted until it coincides with the target object, so that the currently indicated position can be used as the gaze point position. This yields the calibrated gaze point information and completes the calibration.
  • the indicator may be a calibration point for calibration, such as a black dot.
  • the gaze point can be displayed on the terminal in the form of an indicator during the calibration process, and in the case other than the calibration, the gaze point can be hidden.
  • the pointer may be a cursor indicating the location of the gaze point.
  • adjusting the position of the pointer includes: acquiring a second touch instruction generated by performing a touch operation on the touch device; and adjusting a position of the pointer according to the second touch instruction.
  • the second touch command generated by the touch device may be acquired, and the pointer is controlled to move according to the second touch command to adjust the position of the pointer.
  • the second touch instruction may be generated by tapping, pressing, or sliding on the touch device.
  • the second touch instruction is a touch instruction generated by the touch device during the calibration of the gaze point information.
  • the touch instruction generated by the touch device is a second touch instruction.
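The pointer adjustment via the second touch instruction can be sketched as applying slide displacements to the indicator; the coordinate convention (screen pixels) and the function names are assumptions for illustration:

```python
# Sketch: second touch instructions (slides on the touch device) displace the
# indicator; the adjusted indicator position becomes the calibrated gaze point.
# Coordinates and displacement units are assumed to be screen pixels.

def adjust_indicator(indicator_pos, touch_displacements):
    """Apply each slide's (dx, dy) displacement to the indicator position."""
    x, y = indicator_pos
    for dx, dy in touch_displacements:
        x, y = x + dx, y + dy
    return (x, y)

def calibrate(initial_gaze_estimate, touch_displacements):
    """Return the calibrated gaze point and the correction offset relative to
    the raw estimate (usable to correct later estimates the same way)."""
    adjusted = adjust_indicator(initial_gaze_estimate, touch_displacements)
    offset = (adjusted[0] - initial_gaze_estimate[0],
              adjusted[1] - initial_gaze_estimate[1])
    return adjusted, offset
```

For example, a raw estimate at (10, 10) corrected by two slides of (10, 0) and (0, 10) yields a calibrated point of (20, 20) and an offset of (10, 10).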
  • a storage medium comprising a stored program, wherein the program executes the method of any of the above.
  • a processor is provided, arranged to run a program, where the program, when run, performs any of the methods described above.
  • a control device of the terminal is also provided according to an embodiment of the present invention. It should be noted that the control device may be configured to perform the control method of the terminal in the embodiment of the present invention; that is, the control method may be executed by the control device of the terminal.
  • the device includes: a first acquiring unit 21 configured to acquire gaze point information for the terminal; and a second acquiring unit 23, Set to acquire a first touch instruction generated by performing a touch operation on the touch device; the determining unit 25 is configured to determine a control instruction for the terminal according to the gaze point information and the first touch instruction; and the control unit 27 is configured to control the terminal to execute and control The operation corresponding to the instruction.
  • first obtaining unit 21 in this embodiment may be configured to perform step S102 in the embodiment of the present application.
  • the second obtaining unit 23 in this embodiment may be configured to perform step S104 in the embodiment of the present application.
  • the determining unit 25 in this embodiment may be configured to perform step S106 in the embodiment of the present application.
  • the control unit 27 in this embodiment may be configured to perform step S108 in the embodiment of the present application.
  • the gaze point information of the user for the terminal is acquired, along with the first touch instruction generated by the user's touch operation on the touch device; the control instruction issued by the user to the terminal is then determined according to the gaze point information and the first touch instruction.
  • this enables the terminal to perform the operation corresponding to the control instruction, completes the instruction issued by the user, and achieves the technical effect of accurately generating a deterministic instruction from the user's gaze point information, thereby solving the problem that a deterministic instruction cannot be accurately generated from gaze point information.
  • the embodiment may further include: a calibration unit configured to calibrate the gaze point information before determining a control instruction to the terminal according to the gaze point information and the touch operation instruction.
  • the calibration unit includes: a determining module configured to determine an initial position of the gaze point corresponding to the target object; and an adjustment module configured to adjust the position of the pointer to cause the indicator to coincide with the target object, The position of the adjusted indicator is used as the gaze point information after the calibration, wherein the pointer is set to indicate the position of the gaze point.
  • the adjustment module includes: an acquisition module configured to acquire a second touch instruction generated by performing a touch operation on the touch device; and an adjustment submodule configured to adjust a position of the pointer according to the second touch instruction.
  • FIG. 3 is a schematic diagram of a terminal according to an embodiment of the present invention.
  • the terminal includes: an eye tracking device 31 configured to acquire gaze point information for the terminal; a touch device 33 configured to acquire a first touch instruction generated by a touch operation performed on the touch device; and a processor 35 configured to determine a control instruction for the terminal according to the gaze point information and the first touch instruction, and to control the terminal to perform an operation corresponding to the control instruction.
  • the gaze point information of the user for the terminal is acquired by the eye tracking device, along with the first touch instruction generated by the user's touch operation on the touch device; the control command issued by the user to the terminal is then determined according to the gaze point information and the first touch instruction.
  • the terminal can thus perform the operation corresponding to the control command, completing the command issued by the user and achieving the technical effect of accurately generating a deterministic command from the user's gaze point information, thereby solving the technical problem that a deterministic instruction cannot be accurately generated from gaze point information.
  • the terminal is a terminal controlled in the above control method of the terminal.
  • the eye tracking device and the screen are located on the same side of the terminal.
  • the eye tracking device can be positioned to the front of the terminal.
  • the eye tracking device is detachably disposed on the terminal; or the eye tracking device is fixedly mounted on the terminal.
  • the touch device is fixedly mounted on the terminal.
  • the touch device includes a touch screen of the terminal; and/or the touch device includes a touch pad disposed on the back of the terminal.
  • the back of the terminal is a face of the terminal different from the touch screen, for example, the face opposite the touch screen.
  • the touch instruction generated by the touch screen may be used as the first touch instruction.
  • the present invention also provides a preferred embodiment that provides a system and method for correcting a fixation point using a touch device.
  • the invention uses a touch on the touch device as the confirmation signal for operating the gaze target, and uses displacement on the touch device as the correction signal for the calculated gaze point, ensuring that gaze data collection and gaze input achieve a good usage effect.
  • correction signal is a second touch signal generated by the touch device during the calibration process.
  • the touch device can be integrated into the target device (terminal), for example, integrated into the mobile phone, the PAD, and the notebook, thereby eliminating the need to separately set the touch panel, simplifying the design complexity, and reducing the possibility of the touch panel being lost.
  • the method of using the touch device to touch does not cause displacement and vibration of the device (terminal), and does not affect the use of the eye tracking device.
  • the invention adjusts and corrects the gaze point by touch displacement; the method is intuitive and further ensures the accuracy of gaze tracking.
  • a mobile phone is used as a terminal for illustration.
  • a touchpad can be added to the back of the phone.
  • an eye tracking device can be added to the front of the phone.
  • FIG. 4(a) is a first schematic diagram of a terminal-set fixed eye tracking device according to an embodiment of the present invention. As shown in FIG. 4(a), a built-in method is used to add the eye movement detecting device, and the eye tracking device is fixed below the phone screen.
  • the eye tracking device includes an infrared light, a motherboard and a camera.
  • the camera of the eye tracking device may be the front camera of the mobile phone.
  • FIG. 4(b) is a second schematic diagram of a terminal-set fixed eye tracking device according to an embodiment of the present invention. As shown in FIG. 4(b), a touchpad is added using a built-in method and is disposed on the back of the mobile phone.
  • FIG. 5(a) is a first schematic diagram of a terminal detachable eye tracking device according to an embodiment of the present invention. As shown in FIG. 5(a), an external method is used to add the eye movement detecting device, and the eye tracking device is placed below the screen of the phone.
  • the eye movement detecting device can be connected to the mobile phone through a charging interface of the mobile phone.
  • FIG. 5(b) is a second schematic diagram of a terminal detachable eye tracking device according to an embodiment of the present invention. As shown in FIG. 5(b), a touchpad is added using a built-in method and is disposed on the back of the mobile phone.
  • the eye tracking mode can be turned on by using a shortcut key or a mobile phone setting interface.
  • the gaze point needs to be calibrated before the eye tracking mode is activated.
  • a number of eye feature images of the user may be acquired, and the user's gaze point is then calculated from these eye feature images.
  • to this end, the user needs to cooperatively gaze at target points displayed on the mobile phone screen; this process is called calibration.
  • the calibration process can be performed by the following steps:
  • 1. While the user gazes at a target on the screen of the mobile phone, a gaze point indicator appears; the indicator can be a cursor.
  • 2. The cursor can be moved onto the target using the touchpad on the back of the mobile phone, so that the two overlap.
  • 3. The gaze point recognition algorithm adjusts its calculation coefficients according to the user's correction.
  • 4. The user can repeat steps 2 and 3 until the calculated gaze point matches the user's actual gaze point with satisfactory accuracy.
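One plausible reading of the "calculation coefficient" adjustment is a per-axis linear correction fitted to the user's corrections, repeated until the residual error is acceptable. The linear model and the synthetic data below are assumptions for illustration, not the patent's actual algorithm:

```python
# Fit a per-axis linear correction actual = a * estimated + b from the
# calibration pairs (raw gaze estimates vs. user-corrected indicator
# positions). The linear model is an illustrative assumption.

def fit_axis(estimated, actual):
    """Least-squares fit of actual = a * estimated + b for one axis."""
    n = len(estimated)
    mean_e = sum(estimated) / n
    mean_a = sum(actual) / n
    var = sum((e - mean_e) ** 2 for e in estimated)
    cov = sum((e - mean_e) * (a - mean_a) for e, a in zip(estimated, actual))
    a = cov / var if var else 1.0
    b = mean_a - a * mean_e
    return a, b

def correct(gaze_estimate, coeffs):
    """Apply the fitted coefficients to a raw gaze estimate."""
    (ax, bx), (ay, by) = coeffs
    x, y = gaze_estimate
    return (ax * x + bx, ay * y + by)

# Synthetic calibration pairs: ((estimated x, y), (corrected x, y)).
pairs = [((10, 12), (12, 10)), ((50, 52), (52, 50)), ((90, 92), (92, 90))]
xs_e = [p[0][0] for p in pairs]; xs_a = [p[1][0] for p in pairs]
ys_e = [p[0][1] for p in pairs]; ys_a = [p[1][1] for p in pairs]
coeffs = (fit_axis(xs_e, xs_a), fit_axis(ys_e, ys_a))
```

Repeating the gaze-correct-refit loop until `correct()` lands within a chosen tolerance of the actual target would correspond to step 4 above.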
  • the user can generate an "open command" by means of gaze.
  • the user can generate a "page turning instruction" by means of gaze.
  • when reading an e-book or webpage on a mobile phone, with a finger pressed on the touch device, the user's gaze starts from the lower right corner of the "book" and moves to the left to complete a backward page turn; the user's gaze starts from the lower left corner of the "book" and moves to the right to complete a forward page turn.
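The page-turn gesture can be sketched as a direction test on the gaze trajectory while the finger is held down. The screen width, corner test, and movement threshold below are assumed values, not parameters from the patent:

```python
# Sketch of the gaze page-turn gesture: the touch press confirms intent
# (avoiding the Midas touch problem), and the horizontal direction of gaze
# movement selects the command. Threshold and screen width are assumptions.

SCREEN_W = 1080  # assumed screen width in pixels

def page_turn_command(finger_pressed, gaze_start, gaze_end, min_move=200):
    """Return 'page_back', 'page_forward', or None from a gaze swipe."""
    if not finger_pressed:
        return None                      # no touch confirmation -> no command
    dx = gaze_end[0] - gaze_start[0]
    if gaze_start[0] > SCREEN_W / 2 and dx < -min_move:
        return "page_back"               # start lower-right, gaze moves left
    if gaze_start[0] < SCREEN_W / 2 and dx > min_move:
        return "page_forward"            # start lower-left, gaze moves right
    return None
```

The same pattern (press to confirm, gaze trajectory to select) would extend naturally to the drag command described below.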
  • the user can generate a "drag command" by means of gaze.
  • for example, the user gazes at an icon on the mobile phone's interactive interface while a finger presses on the touch device and moves toward the target direction; the corresponding icon moves with the finger to the target position.
  • alternatively, the user gazes at an icon on the mobile phone's interactive interface while a finger presses on the touch device, and then moves the gaze (gaze point) toward the target direction; the corresponding icon follows the user's gaze (gaze point) to the target position.
  • the technology provided by the present invention combines the touch device with the eye tracking technology to correct or operate the calculated fixation point, and achieves the effect of accurately using the line of sight for human-computer interaction, thereby realizing the precise operation of the electronic device.
  • the disclosed technical contents may be implemented in other manners.
  • the device embodiments described above are only schematic.
  • the division of the unit may be a logical function division.
  • there may be other division manners; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, unit or module, and may be electrical or otherwise.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer readable storage medium.
  • the technical solution of the present invention which is essential or contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium.
  • a number of instructions are included to cause a computer device (which may be a personal computer, server or network device, etc.) to perform all or part of the steps of the methods described in various embodiments of the present invention.
  • the foregoing storage medium includes various media that can store program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
  • the solution provided by the embodiment of the present invention can be applied to a human-computer interaction process based on eye tracking technology.
  • the technical problem that the deterministic instruction cannot be accurately generated according to the gaze point information is solved by the embodiment of the present invention, and the technical effect of accurately generating the deterministic instruction according to the gaze point information of the user is realized.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention discloses a terminal control method and apparatus, and a terminal. The method includes: acquiring gaze point information directed at a terminal; acquiring a first touch instruction generated by a touch operation on a touch device; determining a control instruction for the terminal according to the gaze point information and the first touch instruction; and controlling the terminal to perform an operation corresponding to the control instruction. The present invention solves the technical problem that a deterministic instruction cannot be generated accurately from gaze point information.

Description

Terminal control method and apparatus, and terminal
This application claims priority to Chinese Patent Application No. 201810428124.8, filed with the Chinese Patent Office on May 7, 2018 and entitled "Terminal Control Method and Apparatus, and Terminal", the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of computers, and in particular to a terminal control method and apparatus, and a terminal.
Background
In recent years, modes of human-computer interaction have gradually diversified. Multi-channel interaction combines new interaction channels, devices, and techniques such as gaze, voice, and gesture, allowing users to converse with machines through multiple channels in a natural, parallel, and cooperative manner; by integrating precise and imprecise inputs from multiple channels, the system captures the user's interaction intent, improving the naturalness and efficiency of human-computer interaction. In multi-channel human-computer interaction, naturalness, efficiency, and compatibility with traditional interfaces are especially important.
Eye tracking technology uses eye-movement measurement devices to capture images of the user's eyes at millisecond rates, and estimates the gaze direction or the gaze landing point by analyzing the relative positions of features such as the pupil contour, iris contour, pupil center, iris center, and the reflection points of external light sources on the cornea. At present, eye-movement analysis is widely applied in novel human-computer interaction (typing and playing games with the eyes), medical diagnosis, user experience research, psychological and cognitive research, educational assistance, and other fields.
Some existing Apple, Huawei, and Samsung phone models have eye tracking functions that support applications such as eye-movement unlocking, sleeping when no eye movement is detected, and eye-movement page turning. There are also communication-assistance products that use gaze input to help people with disabilities communicate with the outside world.
In practice, however, because gazing is not a deterministic instruction, a computer easily fails to distinguish whether the user's lingering gaze is meant to perform an operation or merely to observe a target; this is known as the Midas touch problem.
In addition, eye trackers are inherently imprecise and cannot guarantee that the calculated gaze point coincides 100% with the actual gaze point. Especially in complex usage scenarios such as mobile phones and tablets (PADs), the eye-tracking device's estimate of the gaze point tends to be inaccurate, leading to a poor user experience. Previous methods that rely on a gaze-duration threshold are inefficient and prone to accidental operations.
No effective solution has yet been proposed for the above problem that a deterministic instruction cannot be generated accurately from gaze point information.
Summary of the Invention
Embodiments of the present invention provide a terminal control method and apparatus, and a terminal, to solve at least the technical problem that a deterministic instruction cannot be generated accurately from gaze point information.
According to one aspect of the embodiments of the present invention, a terminal control method is provided, including: acquiring gaze point information directed at a terminal; acquiring a first touch instruction generated by a touch operation on a touch device; determining a control instruction for the terminal according to the gaze point information and the first touch instruction; and controlling the terminal to perform an operation corresponding to the control instruction.
Further, before the control instruction for the terminal is determined according to the gaze point information and the touch operation instruction, the method further includes: calibrating the gaze point information.
Further, calibrating the gaze point information includes: determining an initial position of the gaze point corresponding to a target object; and adjusting the position of an indicator so that the indicator coincides with the target object, and using the adjusted position of the indicator as the calibrated gaze point information, where the indicator is configured to indicate the position of the gaze point.
Further, adjusting the position of the indicator includes: acquiring a second touch instruction generated by a touch operation on the touch device; and adjusting the position of the indicator according to the second touch instruction.
According to another aspect of the embodiments of the present invention, a storage medium is further provided. The storage medium includes a stored program, where when the program runs, a device where the storage medium is located is controlled to perform the terminal control method described above.
According to another aspect of the embodiments of the present invention, a processor is further provided. The processor is configured to run a program, where when the program runs, the terminal control method described above is performed.
According to another aspect of the embodiments of the present invention, a terminal control apparatus is further provided, including: a first acquiring unit, configured to acquire gaze point information directed at a terminal; a second acquiring unit, configured to acquire a first touch instruction generated by a touch operation on a touch device; a determining unit, configured to determine a control instruction for the terminal according to the gaze point information and the first touch instruction; and a control unit, configured to control the terminal to perform an operation corresponding to the control instruction.
Further, the apparatus further includes: a calibration unit, configured to calibrate the gaze point information before the control instruction for the terminal is determined according to the gaze point information and the touch operation instruction.
Further, the calibration unit includes: a determining module, configured to determine an initial position of the gaze point corresponding to a target object; and an adjusting module, configured to adjust the position of an indicator so that the indicator coincides with the target object, and to use the adjusted position of the indicator as the calibrated gaze point information, where the indicator is used to indicate the position of the gaze point.
Further, the adjusting module includes: an acquiring module, configured to acquire a second touch instruction generated by a touch operation on the touch device; and an adjusting submodule, configured to adjust the position of the indicator according to the second touch instruction.
According to another aspect of the embodiments of the present invention, a terminal is further provided, including: an eye tracking device, configured to acquire gaze point information directed at the terminal; a touch device, configured to acquire a first touch instruction generated by a touch operation on the touch device; and a processor, configured to determine a control instruction for the terminal according to the gaze point information and the first touch instruction, and to control the terminal to perform an operation corresponding to the control instruction.
Further, the eye tracking device is detachably provided on the terminal; or the eye tracking device is fixedly mounted on the terminal.
Further, the touch device is fixedly mounted on the terminal.
Further, the touch device includes a touch screen of the terminal; and/or the touch device includes a touchpad provided on the back of the terminal.
In the embodiments of the present invention, gaze point information of a user directed at a terminal and a first touch instruction generated by the user's touch operation on a touch device are acquired; a control instruction issued by the user to the terminal is then determined according to the gaze point information and the first touch instruction, so that the terminal can perform the operation corresponding to the control instruction and complete the instruction issued by the user. This achieves the technical effect of accurately generating a deterministic instruction from the user's gaze point information, thereby solving the technical problem that a deterministic instruction cannot be generated accurately from gaze point information.
Brief Description of the Drawings
The drawings described here are provided for a further understanding of the present invention and constitute a part of this application. The illustrative embodiments of the present invention and their description are used to explain the present invention and do not constitute an improper limitation of it. In the drawings:
Fig. 1 is a flowchart of a terminal control method according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of a terminal control apparatus according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a terminal according to an embodiment of the present invention;
Fig. 4(a) is a first schematic diagram of a terminal provided with a fixed eye tracking device according to an embodiment of the present invention;
Fig. 4(b) is a second schematic diagram of a terminal provided with a fixed eye tracking device according to an embodiment of the present invention;
Fig. 5(a) is a first schematic diagram of a terminal provided with a detachable eye tracking device according to an embodiment of the present invention;
Fig. 5(b) is a second schematic diagram of a terminal provided with a detachable eye tracking device according to an embodiment of the present invention.
Detailed Description
To enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
It should be noted that the terms "first", "second", and the like in the specification, claims, and accompanying drawings of the present invention are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments of the present invention described here can be implemented in orders other than those illustrated or described here. Moreover, the terms "include" and "have" and any variants thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to the steps or units expressly listed, but may include other steps or units not expressly listed or inherent to such a process, method, product, or device.
According to an embodiment of the present invention, an embodiment of a terminal control method is provided. It should be noted that the steps shown in the flowchart of the accompanying drawings may be performed in a computer system such as a set of computer-executable instructions, and although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from the one here.
Fig. 1 is a flowchart of a terminal control method according to an embodiment of the present invention. As shown in Fig. 1, the method includes the following steps:
Step S102: acquiring gaze point information directed at a terminal;
Step S104: acquiring a first touch instruction generated by a touch operation on a touch device;
Step S106: determining a control instruction for the terminal according to the gaze point information and the first touch instruction;
Step S108: controlling the terminal to perform an operation corresponding to the control instruction.
Through the above steps, gaze point information of a user directed at the terminal and a first touch instruction generated by the user's touch operation on the touch device are acquired, and a control instruction issued by the user to the terminal is then determined according to the gaze point information and the first touch instruction, so that the terminal can perform the operation corresponding to the control instruction and complete the instruction issued by the user. This achieves the technical effect of accurately generating a deterministic instruction from the user's gaze point information, thereby solving the technical problem that a deterministic instruction cannot be generated accurately from gaze point information.
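As a rough illustration, steps S102 to S108 can be sketched as a small decision function: the gaze point supplies the *where* and the touch instruction supplies the *what*, and only their combination yields a deterministic control instruction. The gesture names and the mapping table below are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class GazePoint:
    # Calculated gaze coordinates on the terminal screen (step S102).
    x: float
    y: float

def determine_control_instruction(gaze, touch_instruction, mapping):
    """Combine the gaze point (target location) with the first touch
    instruction (confirmation/action) into a control instruction (S106)."""
    action = mapping.get(touch_instruction)
    if action is None:
        return None  # unknown touch gesture: issue no instruction
    return (action, (gaze.x, gaze.y))

# Hypothetical mapping from touch gestures to terminal actions.
MAPPING = {"tap": "open", "swipe_left": "page_forward", "swipe_right": "page_back"}

# A tap while gazing at (120, 340) opens whatever is at that point (S108).
instr = determine_control_instruction(GazePoint(120.0, 340.0), "tap", MAPPING)
```

Note that a bare gaze with no touch instruction never produces an instruction here, which is exactly how the scheme sidesteps the Midas touch problem described in the background.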
In the solution provided in step S102, the gaze point information may be position information about where the user is looking, collected by an eye tracking device while the user gazes at the terminal.
It should be noted that the eye tracking device includes an infrared device, a camera, and a processor.
It should be noted that the terminal includes a mobile phone, a tablet computer (PAD), or a laptop computer.
Optionally, the eye tracking device determines the user's gaze point on the terminal by capturing images of the user's eyes.
In the solution provided in step S104, the touch device may be provided on the terminal, for example a touch screen or a touchpad fixed on the terminal, and the user generates a touch instruction by touching the touch device.
Optionally, when the first touch instruction is acquired, the gaze point information at the current moment is determined as instruction information, and the control instruction is determined according to the first touch instruction and this instruction information.
In the solution provided in step S106, the control instruction is an instruction used to control the terminal, and is determined jointly by the gaze point information and the first touch instruction.
Optionally, a mapping relationship among gaze point information, first touch instructions, and control instructions may be stored in advance in a storage device (such as a cache or a database).
Optionally, the first touch instruction may be generated by tapping the touch device, pressing on the touch device, sliding on the touch device, and the like.
In the solution provided in step S108, after acquiring the control instruction, the terminal can respond to the control instruction and perform the operation corresponding to it.
According to the above embodiment of the present invention, by combining the gaze point information with the first touch instruction, the first touch instruction can serve as a confirmation signal for the gaze point information, so that a deterministic instruction can be generated accurately from the gaze point and the user's intended operation can be confirmed accurately.
In addition, because of the way the touch device generates instructions, using the first touch instruction generated by the touch device as the confirmation signal for the gaze point information avoids shaking the terminal when the confirmation signal is produced, and thus avoids jitter of the gaze point, yielding accurate gaze point information.
As an optional embodiment, before the control instruction for the terminal is determined according to the gaze point information and the touch operation instruction, this embodiment may further include: calibrating the gaze point information.
To obtain accurate gaze point information, the gaze point information may first be calibrated.
Optionally, calibrating the gaze point information can be achieved by calibrating the position of the gaze point.
As an optional example, the gaze point on the terminal may be collected by the eye tracking device, and the gaze information may then be calibrated by calibrating the position of the gaze point.
In an optional embodiment, calibrating the gaze point information includes: determining an initial position of the gaze point corresponding to a target object; and adjusting the position of an indicator so that the indicator coincides with the target object, and using the adjusted position of the indicator as the calibrated gaze point information, where the indicator is used to indicate the position of the gaze point.
According to the above embodiment of the present invention, during calibration of the gaze point information, the terminal displays a target object and, while the user gazes at it, determines the gaze point obtained by calculation, takes the position of that gaze point as the initial position, and indicates that position with an indicator. The position of the indicator is then adjusted so that the indicator coincides with the target object, and the current position of the indicator can be used as the gaze point position, yielding the calibrated gaze point information and completing the calibration.
Optionally, the indicator may be a calibration point used for calibration, such as a black dot.
It should be noted that the gaze point may be displayed on the terminal in the form of an indicator during calibration, and may be hidden outside of calibration.
Optionally, the indicator may be a cursor used to indicate the position of the point.
In an optional embodiment, adjusting the position of the indicator includes: acquiring a second touch instruction generated by a touch operation on the touch device; and adjusting the position of the indicator according to the second touch instruction.
According to the above embodiment of the present invention, in the process of adjusting the indicator, the second touch instruction generated by the touch device can be acquired, and the indicator can be moved according to the second touch instruction to adjust its position.
It should be noted that the second touch instruction may be generated by tapping the touch device, pressing on the touch device, sliding on the touch device, and the like.
Optionally, the second touch instruction is the touch instruction generated by the touch device during the calibration of the gaze point information.
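The adjustment described above, moving the indicator by the displacements reported in second touch instructions until it overlaps the target, can be sketched as follows. The function names and the tuple representation are illustrative assumptions; the patent does not prescribe a data model:

```python
def apply_touch_delta(indicator, delta):
    """Move the gaze indicator by one displacement reported by a
    second touch instruction (a slide on the touch device)."""
    x, y = indicator
    dx, dy = delta
    return (x + dx, y + dy)

def calibrate(initial_gaze, target, touch_deltas):
    """Accumulate touch displacements until the indicator overlaps the
    target object; the final indicator position becomes the calibrated
    gaze point, and the residual shows how far off it still is."""
    indicator = initial_gaze
    for delta in touch_deltas:
        indicator = apply_touch_delta(indicator, delta)
    residual = (target[0] - indicator[0], target[1] - indicator[1])
    return indicator, residual

# The raw gaze lands at (100, 100) while the target sits at (110, 95);
# two slides on the touchpad walk the indicator onto the target.
calibrated, residual = calibrate((100, 100), (110, 95), [(6, -3), (4, -2)])
```

In a real system the residual would be folded back into the gaze estimation so that future raw gaze points are corrected automatically; here it simply verifies that the indicator and target coincide.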
According to yet another embodiment of the present invention, a storage medium is further provided. The storage medium includes a stored program, where when the program runs, any one of the methods described above is performed.
According to yet another embodiment of the present invention, a processor is further provided. The processor is configured to run a program, where when the program runs, any one of the methods described above is performed.
According to an embodiment of the present invention, an embodiment of a terminal control apparatus is further provided. It should be noted that the terminal control apparatus may be configured to perform the terminal control method in the embodiments of the present invention, and the terminal control method in the embodiments of the present invention may be performed in the terminal control apparatus.
Fig. 2 is a schematic diagram of a terminal control apparatus according to an embodiment of the present invention. As shown in Fig. 2, the apparatus includes: a first acquiring unit 21, configured to acquire gaze point information directed at a terminal; a second acquiring unit 23, configured to acquire a first touch instruction generated by a touch operation on a touch device; a determining unit 25, configured to determine a control instruction for the terminal according to the gaze point information and the first touch instruction; and a control unit 27, configured to control the terminal to perform an operation corresponding to the control instruction.
It should be noted that the first acquiring unit 21 in this embodiment may be configured to perform step S102 in the embodiments of this application, the second acquiring unit 23 may be configured to perform step S104, the determining unit 25 may be configured to perform step S106, and the control unit 27 may be configured to perform step S108. The examples and application scenarios implemented by the above modules and the corresponding steps are the same, but are not limited to the content disclosed in the above embodiments.
According to the above embodiment of the present invention, gaze point information of a user directed at the terminal and a first touch instruction generated by the user's touch operation on the touch device are acquired, and a control instruction issued by the user to the terminal is then determined according to the gaze point information and the first touch instruction, so that the terminal can perform the operation corresponding to the control instruction and complete the instruction issued by the user. This achieves the technical effect of accurately generating a deterministic instruction from the user's gaze point information, thereby solving the technical problem that a deterministic instruction cannot be generated accurately from gaze point information.
As an optional embodiment, this embodiment may further include: a calibration unit, configured to calibrate the gaze point information before the control instruction for the terminal is determined according to the gaze point information and the touch operation instruction.
As an optional embodiment, the calibration unit includes: a determining module, configured to determine an initial position of the gaze point corresponding to a target object; and an adjusting module, configured to adjust the position of an indicator so that the indicator coincides with the target object, and to use the adjusted position of the indicator as the calibrated gaze point information, where the indicator is configured to indicate the position of the gaze point.
As an optional embodiment, the adjusting module includes: an acquiring module, configured to acquire a second touch instruction generated by a touch operation on the touch device; and an adjusting submodule, configured to adjust the position of the indicator according to the second touch instruction.
Fig. 3 is a schematic diagram of a terminal according to an embodiment of the present invention. As shown in Fig. 3, the terminal includes: an eye tracking device 31, configured to acquire gaze point information directed at the terminal; a touch device 33, configured to acquire a first touch instruction generated by a touch operation on the touch device; and a processor 35, configured to determine a control instruction for the terminal according to the gaze point information and the first touch instruction, and to control the terminal to perform an operation corresponding to the control instruction.
According to the above embodiment of the present invention, gaze point information of a user directed at the terminal is acquired by the eye tracking device, together with a first touch instruction generated by the user's touch operation on the touch device; a control instruction issued by the user to the terminal is then determined according to the gaze point information and the first touch instruction, so that the terminal can perform the operation corresponding to the control instruction and complete the instruction issued by the user. This achieves the technical effect of accurately generating a deterministic instruction from the user's gaze point information, thereby solving the technical problem that a deterministic instruction cannot be generated accurately from gaze point information.
It should be noted that this terminal is the terminal controlled in the terminal control method described above.
Optionally, the eye tracking device is located on the same side of the terminal as the screen.
Optionally, the eye tracking device may be located on the front of the terminal.
As an optional embodiment, the eye tracking device is detachably provided on the terminal; or the eye tracking device is fixedly mounted on the terminal.
As an optional embodiment, the touch device is fixedly mounted on the terminal.
As an optional embodiment, the touch device includes a touch screen of the terminal; and/or the touch device includes a touchpad provided on the back of the terminal.
It should be noted that the back of the terminal is a face of the terminal different from the touch screen, for example, the face opposite to the touch screen.
Optionally, when the touch device is the terminal's touch screen, the touch instruction generated by the touch screen can be used as the first touch instruction.
The present invention further provides a preferred embodiment, which provides a system and method for correcting the gaze point using a touch device.
The present invention uses a light tap on the touch device as the confirmation signal for operating the gazed target, and uses displacement on the touch device as the correction signal for the calculated gaze point, to ensure that gaze data collection and gaze input achieve good results in use.
It should be noted that the correction signal is the second touch signal generated by operating the touch device during calibration.
In the present invention, the touch device can be integrated into the target device (terminal), for example, into a mobile phone, a PAD, or a laptop, so that no separate touchpad is needed, which simplifies the design and also reduces the possibility of the touchpad being lost.
The light-tap approach of the present invention does not cause displacement or vibration of the device (terminal) and does not affect the use of the eye tracking device.
The present invention adjusts and corrects the calculated gaze point by means of touch displacement, which is intuitive and further ensures the accuracy of gaze tracking.
In the following embodiments, a mobile phone is used as the terminal for illustration.
For example, eye tracking and touch devices can be added to a mobile phone.
Optionally, a touchpad can be added to the back of the phone.
Optionally, an eye tracking apparatus can be added to the front of the phone.
Fig. 4(a) is a first schematic diagram of a terminal provided with a fixed eye tracking device according to an embodiment of the present invention. As shown in Fig. 4(a), an eye-movement detection device is added using a built-in approach, with the eye tracking device fixed below the phone screen.
Optionally, the eye tracking device includes an infrared lamp, a main board, and a camera.
Optionally, the camera of the eye tracking device may be the front camera of the phone.
Fig. 4(b) is a second schematic diagram of a terminal provided with a fixed eye tracking device according to an embodiment of the present invention. As shown in Fig. 4(b), a touchpad is added using a built-in approach and is provided on the back of the phone.
Fig. 5(a) is a first schematic diagram of a terminal provided with a detachable eye tracking device according to an embodiment of the present invention. As shown in Fig. 5(a), an eye-movement detection device is added using an external approach, with the eye tracking device attached below the phone screen.
Optionally, the eye-movement detection device may be connected to the phone through the phone's charging port.
Fig. 5(b) is a second schematic diagram of a terminal provided with a detachable eye tracking device according to an embodiment of the present invention. As shown in Fig. 5(b), a touchpad is added using a built-in approach and is provided on the back of the phone.
Optionally, the eye tracking mode can be enabled through a shortcut key or the phone's settings interface.
Optionally, to ensure the accuracy of the gaze point collected by the phone, the gaze point also needs to be calibrated before the eye tracking mode is started.
It should be noted that before the eye tracking mode is used, a number of eye feature images need to be collected, and the gaze point is then calculated from the eye feature images.
For example, a number of eye feature images of the user may be collected, and the user's gaze point may then be calculated from these eye feature images.
Optionally, before using the eye tracking mode, the user needs to cooperate by looking at some target gaze points on the phone screen or the phone; this process is also called calibration.
Optionally, the calibration process may proceed through the following steps:
1) While the user gazes at a target object, a gaze point indicator appears on the phone screen; the indicator may be a cursor.
2) The user checks the deviation between the gaze point indicator and the gazed target object.
3) If there is no deviation, the user simply continues to use the gaze point as a pointer to operate the phone.
4) If there is a deviation, the phone switches to correction mode.
Optionally, in correction mode, the cursor can be moved to the target object via the touchpad on the back of the phone so that the two overlap.
5) After the cursor has been moved to overlap the target object, the gaze point recognition algorithm adjusts its calculation coefficients according to the user's correction.
6) The user may repeat the process from step 2) until the calculated gaze point matches the user's actual gaze point with satisfactory accuracy.
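Step 5) leaves the exact coefficient update unspecified. One common choice, assumed here purely for illustration and not stated in the patent, is a per-axis linear model: fit a gain and offset so that `gain * raw + offset` approximates the user-corrected positions collected during calibration (one axis shown for brevity):

```python
def fit_linear_correction(raw_points, corrected_points):
    """Least-squares fit of gain and offset for one axis so that
    gain * raw + offset best matches the corrected gaze samples."""
    n = len(raw_points)
    mean_r = sum(raw_points) / n
    mean_c = sum(corrected_points) / n
    var = sum((r - mean_r) ** 2 for r in raw_points)
    cov = sum((r - mean_r) * (c - mean_c)
              for r, c in zip(raw_points, corrected_points))
    gain = cov / var if var else 1.0  # degenerate case: identity gain
    offset = mean_c - gain * mean_r
    return gain, offset

# Three calibration targets: raw x-estimates vs. user-corrected x-positions.
gain, offset = fit_linear_correction([0.0, 50.0, 100.0], [5.0, 60.0, 115.0])
```

Repeating steps 2) to 5) with more targets simply adds samples to the fit, which is why accuracy improves with each pass.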
The following are examples of practical applications of the solution provided by the present invention.
It should be noted that the technical solutions provided by the present invention may include, but are not limited to, the following examples.
Optionally, the user can generate an "open instruction" by gazing.
For example, the user can gaze at the icon of a target application in the phone's interface; after the indicator cursor (which may be hidden) appears on the target icon, the user lightly taps the touch device to open the target application.
Optionally, the user can generate a "page-turn instruction" by gazing.
For example, when reading an e-book on the phone, the user gazes at the bottom-right corner of the "book" and lightly swipes a finger from right to left on the touch device to turn to the next page; the user gazes at the bottom-left corner of the "book" and lightly swipes from left to right on the touch device to turn to the previous page.
As another example, when reading an e-book or a web page on the phone, the user presses a finger on the touch device and moves the gaze from the bottom-right corner of the "book" toward the left to turn to the next page, or moves the gaze from the bottom-left corner toward the right to turn to the previous page.
Optionally, the user can generate a "drag instruction" by gazing.
For example, the user gazes at an icon in the phone's interface while pressing a finger on the touch device and moving it toward the target location; the icon follows the finger to the target location.
As another example, the user gazes at an icon in the phone's interface while pressing a finger on the touch device; the user's gaze (gaze point) then moves toward the target location, and the icon follows the gaze (gaze point) to the target location.
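The page-turn examples above amount to dispatching on a (gaze region, touch gesture) pair: the same swipe means different things depending on where the user is looking. A toy dispatcher, with region and gesture names invented for this sketch, might look like:

```python
def page_turn_command(gaze_region, swipe_direction):
    """Interpret a light swipe on the touch device in the context of
    the current gaze region, as in the e-book example."""
    if gaze_region == "bottom_right" and swipe_direction == "right_to_left":
        return "next_page"
    if gaze_region == "bottom_left" and swipe_direction == "left_to_right":
        return "previous_page"
    return None  # ambiguous combination: do nothing rather than guess

cmd = page_turn_command("bottom_right", "right_to_left")
```

Returning `None` for unmatched combinations mirrors the deterministic-instruction goal: only a gaze-plus-touch pair the system explicitly recognizes triggers an action.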
In the technical solution provided by the present invention, combining a touch device with eye tracking technology to correct or operate on the calculated gaze point achieves precise gaze-based human-computer interaction and precise operation of electronic devices.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis. For a part not detailed in one embodiment, reference may be made to the relevant description of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed technical content may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division into units may be a division by logical function; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. Furthermore, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a standalone product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes a number of instructions causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
The above are merely preferred embodiments of the present invention. It should be pointed out that a person of ordinary skill in the art can make several improvements and refinements without departing from the principles of the present invention, and these improvements and refinements should also be regarded as falling within the protection scope of the present invention.
Industrial Applicability
The solutions provided by the embodiments of the present invention can be applied to human-computer interaction processes based on eye tracking technology. The embodiments of the present invention solve the technical problem that a deterministic instruction cannot be generated accurately from gaze point information, achieving the technical effect of accurately generating deterministic instructions from the user's gaze point information.

Claims (14)

  1. A terminal control method, comprising:
     acquiring gaze point information directed at a terminal;
     acquiring a first touch instruction generated by a touch operation on a touch device;
     determining a control instruction for the terminal according to the gaze point information and the first touch instruction; and
     controlling the terminal to perform an operation corresponding to the control instruction.
  2. The method according to claim 1, wherein before the control instruction for the terminal is determined according to the gaze point information and the touch operation instruction, the method further comprises:
     calibrating the gaze point information.
  3. The method according to claim 2, wherein calibrating the gaze point information comprises:
     determining an initial position of a gaze point corresponding to a target object; and
     adjusting the position of an indicator so that the indicator coincides with the target object, and using the adjusted position of the indicator as the calibrated gaze point information, wherein the indicator is used to indicate the position of the gaze point.
  4. The method according to claim 3, wherein adjusting the position of the indicator comprises:
     acquiring a second touch instruction generated by a touch operation on the touch device; and
     adjusting the position of the indicator according to the second touch instruction.
  5. A terminal control apparatus, comprising:
     a first acquiring unit, configured to acquire gaze point information directed at a terminal;
     a second acquiring unit, configured to acquire a first touch instruction generated by a touch operation on a touch device;
     a determining unit, configured to determine a control instruction for the terminal according to the gaze point information and the first touch instruction; and
     a control unit, configured to control the terminal to perform an operation corresponding to the control instruction.
  6. The apparatus according to claim 5, wherein the apparatus further comprises:
     a calibration unit, configured to calibrate the gaze point information before the control instruction for the terminal is determined according to the gaze point information and the touch operation instruction.
  7. The apparatus according to claim 6, wherein the calibration unit comprises:
     a determining module, configured to determine an initial position of a gaze point corresponding to a target object; and
     an adjusting module, configured to adjust the position of an indicator so that the indicator coincides with the target object, and to use the adjusted position of the indicator as the calibrated gaze point information, wherein the indicator is used to indicate the position of the gaze point.
  8. The apparatus according to claim 7, wherein the adjusting module comprises:
     an acquiring module, configured to acquire a second touch instruction generated by a touch operation on the touch device; and
     an adjusting submodule, configured to adjust the position of the indicator according to the second touch instruction.
  9. A terminal, comprising:
     an eye tracking device, configured to acquire gaze point information directed at the terminal;
     a touch device, configured to acquire a first touch instruction generated by a touch operation on the touch device; and
     a processor, configured to determine a control instruction for the terminal according to the gaze point information and the first touch instruction, and to control the terminal to perform an operation corresponding to the control instruction.
  10. The terminal according to claim 9, wherein
     the eye tracking device is detachably provided on the terminal; or
     the eye tracking device is fixedly mounted on the terminal.
  11. The terminal according to claim 9, wherein the touch device is fixedly mounted on the terminal.
  12. The terminal according to claim 11, wherein
     the touch device comprises a touch screen of the terminal; and/or
     the touch device comprises a touchpad provided on the back of the terminal.
  13. A storage medium, comprising a stored program, wherein the program performs the terminal control method according to any one of claims 1 to 4.
  14. A processor, configured to run a program, wherein when the program runs, the terminal control method according to any one of claims 1 to 4 is performed.
PCT/CN2019/077165 2018-05-07 2019-03-06 终端的控制方法、装置及终端 WO2019214329A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810428124.8 2018-05-07
CN201810428124.8A CN108829239A (zh) 2018-05-07 Terminal control method and apparatus, and terminal

Publications (1)

Publication Number Publication Date
WO2019214329A1 true WO2019214329A1 (zh) 2019-11-14

Family

ID=64148334

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/077165 WO2019214329A1 (zh) 2018-05-07 2019-03-06 终端的控制方法、装置及终端

Country Status (3)

Country Link
CN (1) CN108829239A (zh)
TW (1) TW201947361A (zh)
WO (1) WO2019214329A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108829239A (zh) * 2018-05-07 2018-11-16 北京七鑫易维信息技术有限公司 终端的控制方法、装置及终端
CN109960412B (zh) * 2019-03-22 2022-06-07 北京七鑫易维信息技术有限公司 一种基于触控调整注视区域的方法以及终端设备
CN110174937A (zh) * 2019-04-09 2019-08-27 北京七鑫易维信息技术有限公司 注视信息控制操作的实现方法及装置
CN114115532B (zh) * 2021-11-11 2023-09-29 珊瑚石(上海)视讯科技有限公司 一种基于显示内容的ar标注方法及系统
CN116737051B (zh) * 2023-08-16 2023-11-24 北京航空航天大学 基于触控屏的视触结合交互方法、装置、设备和可读介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104238726A (zh) * 2013-06-17 2014-12-24 腾讯科技(深圳)有限公司 Smart glasses control method and apparatus, and smart glasses
CN105320261A (zh) * 2015-01-07 2016-02-10 维沃移动通信有限公司 Mobile terminal control method and mobile terminal
CN107771051A (zh) * 2014-11-14 2018-03-06 Smi创新传感技术有限公司 Eye tracking system and method of detecting the dominant eye
CN108829239A (zh) * 2018-05-07 2018-11-16 北京七鑫易维信息技术有限公司 Terminal control method and apparatus, and terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105260008B (zh) * 2014-07-15 2018-10-12 华为技术有限公司 Position locating method and apparatus

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104238726A (zh) * 2013-06-17 2014-12-24 腾讯科技(深圳)有限公司 Smart glasses control method and apparatus, and smart glasses
CN107771051A (zh) * 2014-11-14 2018-03-06 Smi创新传感技术有限公司 Eye tracking system and method of detecting the dominant eye
CN105320261A (zh) * 2015-01-07 2016-02-10 维沃移动通信有限公司 Mobile terminal control method and mobile terminal
CN108829239A (zh) * 2018-05-07 2018-11-16 北京七鑫易维信息技术有限公司 Terminal control method and apparatus, and terminal

Also Published As

Publication number Publication date
CN108829239A (zh) 2018-11-16
TW201947361A (zh) 2019-12-16

Similar Documents

Publication Publication Date Title
WO2019214329A1 (zh) Terminal control method and apparatus, and terminal
US11809784B2 (en) Audio assisted enrollment
US11119573B2 (en) Pupil modulation as a cognitive control signal
CN102662462B (zh) 电子装置、手势识别方法及手势应用方法
US9965039B2 (en) Device and method for displaying user interface of virtual input device based on motion recognition
US20160216761A1 (en) System for gaze interaction
US20150002475A1 (en) Mobile device and method for controlling graphical user interface thereof
US20130152002A1 (en) Data collection and analysis for adaptive user interfaces
US20140049462A1 (en) User interface element focus based on user's gaze
KR102326489B1 (ko) 디스플레이를 제어하는 전자 장치 및 방법
US11670144B2 (en) User interfaces for indicating distance
KR20140035358A (ko) 시선-보조 컴퓨터 인터페이스
US10620748B2 (en) Method and device for providing a touch-based user interface
US20210117078A1 (en) Gesture Input Method for Wearable Device and Wearable Device
JP2017208638A (ja) 虹彩認証装置、虹彩認証方法、及びプログラム
WO2012145142A2 (en) Control of electronic device using nerve analysis
CN110851048A (zh) Method for adjusting a control, and electronic device
WO2016147498A1 (ja) 情報処理装置、情報処理方法及びプログラム
Park et al. Gazel: Runtime gaze tracking for smartphones
CN113906725B (zh) Method, electronic device, and computer-readable storage medium for volume control
US20230100689A1 (en) Methods for interacting with an electronic device
CN118302735A (zh) 用于沉浸式现实应用中的智能眼镜的具有助手特征的基于凝视的用户界面
Deng Multimodal interactions in virtual environments using eye tracking and gesture control.
CN112445328A (zh) Mapping control method and apparatus
Skovsgaard Noise challenges in monomodal gaze interaction

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19799117

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 10.03.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19799117

Country of ref document: EP

Kind code of ref document: A1