WO2023082952A1 - Method for interacting with electronic device, and electronic device - Google Patents


Info

Publication number
WO2023082952A1
WO2023082952A1 · PCT/CN2022/125889 · CN2022125889W
Authority
WO
WIPO (PCT)
Prior art keywords
user
electronic device
gesture
type
eye movement
Prior art date
Application number
PCT/CN2022/125889
Other languages
French (fr)
Chinese (zh)
Inventor
刘璕
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2023082952A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 — Eye tracking input arrangements
    • G06F 3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • the present application relates to the technical field of terminals, and in particular to a method for interacting with an electronic device and the electronic device.
  • the present application provides a method for interacting with an electronic device, an electronic device, a computer storage medium, and a computer program product, which can realize efficient and convenient control of the electronic device by combining eye movement control and gesture control, thereby improving user experience.
  • the present application provides a method for interacting with an electronic device, the method including: the electronic device acquires the eye movement type of the user's eyes; when it is determined that the eye movement type is the first type, the electronic device controls the electronic device according to the first type; when it is determined that the eye movement type is the second type, the electronic device instructs the user to perform gesture control; the electronic device obtains the user's gesture; and the electronic device controls the electronic device according to the obtained gesture.
  • the first type may be a saccade (a rapid jump of the eyes between gaze points).
  • the second type may be non-saccadic, such as fixation or smooth pursuit.
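The first-aspect flow above (classify the eye movement type, control the device directly for the first type, hand over to gesture control for the second type) can be sketched as a small dispatch routine. This is an illustrative sketch only; the class and method names are hypothetical and not taken from the patent:

```python
FIRST_TYPE, SECOND_TYPE = "first_type", "second_type"

def handle_eye_movement(eye_type, device):
    """Dispatch on the classified eye movement type, per the first aspect."""
    if eye_type == FIRST_TYPE:
        # First type: control the device directly from eye movement
        # (e.g. update the gaze position, switch displayed content).
        device.control_by_eye()
    elif eye_type == SECOND_TYPE:
        # Second type: prompt the user for gesture control, then act
        # on whatever gesture is acquired.
        device.instruct_gesture_control()
        gesture = device.acquire_gesture()
        device.control_by_gesture(gesture)

class FakeDevice:
    """Stand-in for the electronic device, recording what it was told to do."""
    def __init__(self):
        self.log = []
    def control_by_eye(self):
        self.log.append("eye")
    def instruct_gesture_control(self):
        self.log.append("prompt")
    def acquire_gesture(self):
        return "pinch"
    def control_by_gesture(self, g):
        self.log.append(f"gesture:{g}")

d = FakeDevice()
handle_eye_movement(SECOND_TYPE, d)
print(d.log)  # ['prompt', 'gesture:pinch']
```

Running this with the second type yields the prompt-then-gesture sequence; with the first type only the eye-control branch fires.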
  • instructing the user to perform gesture control specifically includes: obtaining a gaze position of the user; and instructing the user to perform gesture control within a first preset area where the gaze position is located.
  • the first preset area can be understood as either an area or a point.
  • when understood as an area, the first preset area can be the area formed by taking the user's gaze position as the center and extending a preset distance outward; when understood as a point, the first preset area can be the point where the user's gaze position is located, that is, the user's gaze point.
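Treating the first preset area as the region extending a preset distance outward from the gaze position, a membership test can be sketched as follows. A circular region is one plausible reading; the patent does not fix the shape, so this is an assumption:

```python
import math

def in_first_preset_area(point, gaze_pos, preset_distance):
    """True if `point` lies within `preset_distance` of the user's
    gaze position, i.e. inside the (assumed circular) first preset area."""
    dx = point[0] - gaze_pos[0]
    dy = point[1] - gaze_pos[1]
    return math.hypot(dx, dy) <= preset_distance

print(in_first_preset_area((110, 100), (100, 100), 50))  # True
print(in_first_preset_area((200, 100), (100, 100), 50))  # False
```

Setting `preset_distance` to zero degenerates the area into the gaze point itself, matching the "point" reading.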
  • instructing the user to perform gesture control specifically includes: determining a target area according to the type of the first content displayed on the electronic device; and instructing the user to perform gesture control in the target area, wherein the target area is an area on the electronic device that does not contain the first content.
  • the first content may be text content or picture content and so on.
  • instructing the user to perform gesture control specifically includes: obtaining the gaze position of the user; determining the gaze area of the user according to the gaze position; and, when the gaze area is the second preset area, instructing the user to perform gesture control.
  • in this way, the user is instructed to perform gesture control only when looking at a specific area, which prevents the electronic device from falsely instructing the user to perform gesture control while the user is immersively viewing content, thereby improving user experience. For example, when the user is watching a movie, this prevents the user from being falsely prompted to perform gesture control.
  • instructing the user to perform gesture control includes one or more of the following: controlling the electronic device to display a cursor; controlling the electronic device to enlarge at least part of the displayed content; or controlling a sound component matched with the electronic device to broadcast a first voice, where the first voice is used to instruct the user to perform gesture control. The user is thus prompted dynamically or audibly to perform gesture control.
  • instructing the user to perform gesture control includes controlling the electronic device to display a cursor, the cursor being displayed in a first area of the display area of the electronic device; the method further includes: when it is determined that the eye movement type continues to be the second type, and the user's eyes move such that the user's gaze area switches from the first area to a second area in the display area of the electronic device, controlling the cursor to move from the first area to the second area. In this way, as the user's eyes move and the gaze area changes, the cursor moves synchronously, so that the user can easily select the desired content, improving user experience.
  • instructing the user to perform gesture control includes controlling the electronic device to enlarge at least part of the displayed content, the enlarged content being what is displayed in a first area of the display area of the electronic device; the method further includes: when it is determined that the eye movement type continues to be the second type, and the user's eyes move such that the user's gaze area switches from the first area to a second area of the electronic device, restoring the enlarged content of the first area to its original state and enlarging the content in the second area. In this way, as the user's eyes move and the gaze area changes, the content enlarged on the electronic device changes synchronously, so that the user can easily select the desired content, improving user experience.
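The synchronized movement of the prompt (cursor or magnified region) with the gaze area reduces to a small update rule: move only while the eye movement stays the second type and the gaze area actually changes. A minimal sketch with hypothetical region labels:

```python
def update_prompt_region(current_region, gaze_region, eye_type):
    """Move the prompt (cursor / magnified content) to the gaze region
    when the eye movement type remains the second type and the gaze
    area has switched; otherwise leave the prompt where it is."""
    if eye_type == "second_type" and gaze_region != current_region:
        return gaze_region
    return current_region

print(update_prompt_region("area1", "area2", "second_type"))  # area2
print(update_prompt_region("area1", "area2", "first_type"))   # area1
```

For the magnification variant, switching the returned region would also trigger restoring the old region to its original state and enlarging the new one.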
  • the method further includes: when it is determined that the eye movement type is switched from the second type to the first type, stopping instructing the user to perform gesture control, and/or controlling the electronic device according to the first type and restricting the electronic device from acquiring the user's gestures.
  • after determining that the eye movement type is the second type, the method further includes: obtaining the gaze position of the user, and determining the size of the user's gaze area according to the gaze position; and determining, according to the size of the gaze area, the size of the operation area in which the user performs gesture control. In this way, the user can perform gesture operations in an operation area whose size matches the size of the gaze area, improving both the accuracy of gesture operations and user experience.
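One plausible way to derive a gaze-area size, and from it a matching operation-area size, is from the viewing distance and the foveal visual angle (roughly 1-2 degrees of sharp vision). This model and its constants are assumptions for illustration; the patent only states that the two sizes should match:

```python
import math

def gaze_area_radius(viewing_distance_cm, visual_angle_deg=2.0):
    """Radius of the gaze area, from viewing distance and the foveal
    visual angle. Hypothetical model, not from the patent."""
    return viewing_distance_cm * math.tan(math.radians(visual_angle_deg))

def operation_area_size(gaze_radius_cm, scale=2.0):
    """Side length of a square gesture operation area matched to the
    gaze area; `scale` leaves margin for hand jitter (assumed)."""
    return 2 * gaze_radius_cm * scale

r = gaze_area_radius(250.0)  # watching a TV from 2.5 m away
print(round(r, 2), round(operation_area_size(r), 2))  # 8.73 34.92
```

A larger viewing distance yields a larger gaze area and therefore a larger operation area, which is consistent with gestures needing less precision when the user sits farther away.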
  • the electronic device controls the electronic device according to the first type, specifically including: the electronic device determines the gaze position of the user in real time according to the first type, and/or the electronic device switches the content displayed on the electronic device according to the first type.
  • the present application provides a method for interacting with an electronic device, the method comprising: the electronic device obtains a user's gesture; the electronic device controls the electronic device according to the obtained gesture; the electronic device obtains the user's eye movement type; when it is determined that the user's eye movement type is the first type, the electronic device controls the electronic device according to the first type and restricts the electronic device from controlling the electronic device according to the user's gesture; when it is determined that the user's eye movement type is the second type, the electronic device obtains the user's gesture and controls the electronic device according to the obtained gesture.
  • in this way, interaction between eye movement control and gesture control is realized, so that the user can efficiently and conveniently control the electronic device, improving user experience.
  • the first type may be a saccade.
  • the second type may be non-saccadic, such as fixation or smooth pursuit.
  • the method further includes: when it is determined that the user's eye movement type is switched from the first type to the second type, the electronic device instructs the user to perform gesture control; the electronic device obtains the user's gesture; and the electronic device controls the electronic device according to the obtained gesture. Eye movement control and gesture control are thus combined in an orderly manner, so that the two can hand over seamlessly, allowing the user to efficiently and conveniently control the electronic device and improving user experience.
  • restricting the electronic device from controlling the electronic device according to the user's gestures specifically includes: the electronic device does not obtain the user's gestures, or the electronic device obtains the user's gestures but does not process them. In this way, the purpose of restricting the electronic device from being controlled according to the user's gestures is achieved.
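The two restriction modes described above (not acquiring gestures at all, or acquiring but not processing them) can be sketched with a small gate object. The names and structure are illustrative, not from the patent:

```python
class GestureGate:
    """Sketch of the two restriction modes: either stop acquiring
    gestures entirely ('no_acquire'), or keep acquiring them but
    discard them without processing ('acquire_discard')."""
    def __init__(self, mode="no_acquire"):
        self.mode = mode
        self.restricted = False
        self.acquired = []
        self.processed = []

    def on_raw_gesture(self, gesture):
        if self.restricted and self.mode == "no_acquire":
            return                      # gesture never acquired
        self.acquired.append(gesture)   # gesture acquired...
        if self.restricted:
            return                      # ...but deliberately not processed
        self.processed.append(gesture)  # normal gesture control

g = GestureGate(mode="acquire_discard")
g.restricted = True
g.on_raw_gesture("swipe")
g.restricted = False
g.on_raw_gesture("pinch")
print(g.acquired, g.processed)  # ['swipe', 'pinch'] ['pinch']
```

Either mode achieves the same end: while restricted, no gesture results in device control.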
  • the present application provides a method for interacting with an electronic device, the method comprising: the electronic device acquires a first gesture performed by a user on a first area of the electronic device; the electronic device controls the electronic device according to the acquired first gesture; the electronic device obtains the user's eye movement type; when it is determined that the user's eye movement type is the first type, the electronic device controls the electronic device according to the first type and restricts the electronic device from obtaining the user's gesture; the electronic device continues to obtain the user's eye movement type; when it is determined that the user's eye movement type is switched from the first type to the second type, and the user's gaze position is switched from the first area to a second area on the electronic device, the electronic device instructs the user to perform gesture control; the electronic device acquires a second gesture performed by the user on the second area; and the electronic device controls the electronic device according to the acquired second gesture.
  • in this way, the user experiences a seamless handover when switching the control area, so that the user can efficiently and conveniently control the electronic device, improving user experience.
  • the first type may be a saccade.
  • the second type may be non-saccadic, such as fixation or smooth pursuit.
  • the present application provides a method for interacting with an electronic device, the method comprising: the electronic device obtains the user's first eye movement information; the electronic device determines a first eye movement type according to the first eye movement information; the electronic device controls the electronic device according to the first eye movement type; the electronic device obtains the user's second eye movement information; the electronic device determines a second eye movement type according to the second eye movement information; the electronic device obtains the user's gesture; and the electronic device controls the electronic device according to the acquired gesture.
  • the first type of eye movement may be a saccade.
  • the second type of eye movement may be non-saccadic, such as fixation or smooth pursuit.
  • the first eye movement type may also be referred to as the first type
  • the second eye movement type may also be referred to as the second type.
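Separating saccadic from non-saccadic samples is commonly done with a velocity-threshold (I-VT style) classifier: samples whose angular velocity exceeds a threshold are saccadic, the rest are fixation or smooth pursuit. The threshold below is a typical default from the eye-tracking literature, not a value from the patent:

```python
def classify_eye_movement(velocities_deg_s, saccade_threshold=30.0):
    """I-VT style classification: samples faster than the threshold
    (in degrees of visual angle per second) are saccadic ('first_type');
    slower samples are fixation or smooth pursuit ('second_type')."""
    return ["first_type" if v > saccade_threshold else "second_type"
            for v in velocities_deg_s]

print(classify_eye_movement([5.0, 400.0, 12.0]))
# ['second_type', 'first_type', 'second_type']
```

Real classifiers also smooth the velocity signal and merge short runs, but the threshold test captures the first/second type distinction the method relies on.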
  • the method also includes: the electronic device acquires the user's third eye movement information; the electronic device determines the first eye movement type according to the third eye movement information; and the electronic device controls the electronic device according to the first eye movement type and restricts the electronic device from acquiring the user's gestures.
  • the present application provides an electronic device, including: an eye tracking device for acquiring the user's eye movement information; a gesture tracking device for acquiring the user's gestures; at least one memory for storing programs; and at least one processor for executing the programs stored in the memory.
  • when the program stored in the memory is executed, the processor is configured to: control the eye movement tracking device to obtain the user's eye movement information; when the eye movement type of the user's eyes determined by the eye movement information is the first type, control the electronic device according to the first type; when the eye movement type of the user's eyes determined by the eye movement information is the second type, control the electronic device to instruct the user to perform gesture control; control the gesture tracking device to obtain the user's gesture; and control the electronic device according to the gesture acquired by the gesture tracking device.
  • the processor controls the electronic device to instruct the user to perform gesture control, specifically including: the processor determines the user's gaze position according to the user's eye movement information; and the processor controls the electronic device to prompt the user to perform gesture control within the first preset area where the gaze position is located.
  • the processor controls the electronic device to instruct the user to perform gesture control, specifically including: the processor determines the target area according to the type of the first content displayed on the electronic device; and the processor controls the electronic device to prompt the user to perform gesture control in the target area, wherein the target area is an area on the electronic device that does not contain the first content.
  • the processor controls the electronic device to instruct the user to perform gesture control, specifically including: the processor obtains the user's gaze position according to the user's eye movement information; the processor determines the user's gaze area according to the gaze position; and, when the gaze area is the second preset area, the processor controls the electronic device to prompt the user to perform gesture control.
  • the processor controls the electronic device to instruct the user to perform gesture control, including one or more of the following: the processor controls the electronic device to display a cursor; the processor controls the electronic device to enlarge at least part of the displayed content; or the processor controls a sound component matched with the electronic device to broadcast a first voice, where the first voice is used to prompt the user to perform gesture control.
  • the processor controlling the electronic device to instruct the user to perform gesture control includes the processor controlling the electronic device to display a cursor, the cursor being displayed in a first area of the display area of the electronic device; the processor is further configured to: when the eye movement type of the user's eyes determined by the eye movement information continues to be the second type, and the user's eyes move such that the user's gaze area switches from the first area to a second area in the display area of the electronic device, control the electronic device to move the cursor from the first area to the second area, wherein the gaze area is determined by the gaze position of the user's eyes.
  • the processor controlling the electronic device to instruct the user to perform gesture control includes the processor controlling the electronic device to enlarge at least part of the displayed content, the enlarged content being what is displayed in a first area of the display area of the electronic device; the processor is further configured to: when the eye movement type of the user's eyes determined by the eye movement information continues to be the second type, and the user's eyes move such that the user's gaze area switches from the first area to a second area of the electronic device, control the electronic device to restore the enlarged content in the first area to its initial state and enlarge the content in the second area.
  • the processor is further configured to: when the eye movement type of the user's eyes determined by the eye movement information is switched from the second type to the first type, control the electronic device to stop instructing the user to perform gesture control, and/or control the electronic device according to the first type and restrict the gesture tracking device from acquiring the user's gestures.
  • the processor is also configured to: obtain the user's gaze position; determine the size of the user's gaze area according to the gaze position; and determine, according to the size of the gaze area, the size of the operation area in which the user performs gesture control.
  • the processor controls the electronic device according to the first type, specifically including: the processor determines the gaze position of the user in real time according to the first type, and/or the processor controls the electronic device to switch the displayed content according to the first type.
  • the present application provides an electronic device, including: an eye tracking device for acquiring the user's eye movement information; a gesture tracking device for acquiring the user's gestures; at least one memory for storing programs; and at least one processor for executing the programs stored in the memory.
  • the processor is configured to: control the gesture tracking device to acquire the user's gesture; control the electronic device according to the gesture acquired by the gesture tracking device; when the eye movement type of the user's eyes determined by the eye movement information is the first type, control the electronic device according to the first type and restrict controlling the electronic device according to the user's gesture; and, when the eye movement type of the user's eyes determined by the eye movement information is the second type, control the electronic device according to the gesture acquired by the gesture tracking device.
  • after the processor controls the electronic device according to the first type and restricts controlling the electronic device according to the user's gesture, the processor is further configured to: when the eye movement type of the user's eyes determined by the eye movement information is switched from the first type to the second type, control the electronic device to instruct the user to perform gesture control; control the gesture tracking device to acquire the user's gesture; and control the electronic device according to the gesture acquired by the gesture tracking device.
  • the processor restricts controlling the electronic device according to the user's gestures, specifically including: the processor controls the gesture tracking device not to acquire the user's gestures, or the processor controls the gesture tracking device to continue acquiring the user's gestures but does not process them.
  • the present application provides an electronic device, including: an eye tracking device for acquiring the user's eye movement information; a gesture tracking device for acquiring the user's gestures; at least one memory for storing programs; and at least one processor for executing the programs stored in the memory.
  • the processor is configured to: control the gesture tracking device to obtain the user's gesture; control the electronic device according to a first gesture, acquired by the gesture tracking device, that the user performs on a first area of the electronic device; control the eye movement tracking device to obtain the user's eye movement information; when the eye movement type of the user's eyes determined by the eye movement information is the first type, control the electronic device according to the first type and restrict the gesture tracking device from obtaining the user's gesture; continue to control the eye movement tracking device to obtain the user's eye movement information; when the eye movement type of the user's eyes determined by the eye movement information is switched from the first type to the second type, and the user's gaze position is switched from the first area to a second area, control the electronic device to instruct the user to perform gesture control; control the gesture tracking device to obtain a second gesture performed by the user on the second area; and control the electronic device according to the acquired second gesture.
  • the present application provides an electronic device, including: an eye tracking device for acquiring the user's eye movement information; a gesture tracking device for acquiring the user's gestures; at least one memory for storing programs; and at least one processor for executing the programs stored in the memory.
  • the processor is configured to: control the eye movement tracking device to obtain the user's first eye movement information; determine a first eye movement type according to the first eye movement information; control the electronic device according to the first eye movement type; control the eye movement tracking device to obtain the user's second eye movement information; determine a second eye movement type according to the second eye movement information; control the gesture tracking device to obtain the user's gesture; and control the electronic device according to the gesture acquired by the gesture tracking device.
  • the processor is further configured to: control the eye tracking device to obtain the user's third eye movement information; determine the first eye movement type according to the third eye movement information; and control the electronic device according to the first eye movement type while restricting the gesture tracking device from acquiring the user's gestures.
  • the present application provides a computer-readable storage medium.
  • the computer-readable storage medium stores a computer program.
  • when the computer program runs on the electronic device, the electronic device executes the method provided in any one of the first to fourth aspects or in any implementation manner of the first to fourth aspects.
  • the present application provides a computer program product.
  • when the computer program product runs on an electronic device, the electronic device executes the method provided in any one of the first to fourth aspects or in any implementation manner of the first to fourth aspects.
  • Fig. 1 is a schematic diagram of an application scenario provided by an embodiment of the present application;
  • Fig. 2 is a schematic structural diagram of an electronic device provided by an embodiment of the present application;
  • Fig. 3 is a schematic diagram of eye movement tracking by an eye tracking device according to an embodiment of the present application;
  • Fig. 4 is a schematic diagram of the relative position between the pupil center of the user's eye and the light spot on the cornea provided by an embodiment of the present application;
  • Fig. 5 is a schematic diagram of the coordinates of the gaze position on the display screen of an electronic device provided by an embodiment of the present application;
  • Fig. 6 is a schematic layout diagram of an infrared emitter on an electronic device provided by an embodiment of the present application;
  • Fig. 7 is a schematic diagram of a gesture tracking device tracking gestures provided by an embodiment of the present application;
  • Fig. 8 is a schematic flowchart of a method for interacting with an electronic device provided by an embodiment of the present application;
  • Fig. 9 is a schematic diagram of instructing a user to perform gesture control provided in an embodiment of the present application;
  • Fig. 10 is another schematic diagram of instructing a user to perform gesture control provided in an embodiment of the present application;
  • Fig. 11 is a schematic diagram of the change of the cursor displayed on the electronic device when the user's gaze area changes according to an embodiment of the present application;
  • Fig. 12 is a schematic diagram of changes in the content enlarged on an electronic device when the user's gaze area changes according to an embodiment of the present application;
  • Fig. 13 is a schematic diagram of the correspondence between an operation area and the display screen on an electronic device provided by an embodiment of the present application;
  • Fig. 14 is a schematic diagram of the correspondence between an operation area and a gaze area on an electronic device according to an embodiment of the present application;
  • Fig. 15 is a schematic diagram of an eye's viewing angle provided by an embodiment of the present application;
  • Fig. 16 is a schematic diagram of displaying a cursor in a gaze area on an electronic device according to an embodiment of the present application;
  • Fig. 17 is a schematic diagram of the steps of a control process provided by an embodiment of the present application;
  • Fig. 18 is a schematic diagram of a user's hand movement process provided by an embodiment of the present application;
  • Fig. 19 is a schematic flowchart of a method for interacting with an electronic device provided by an embodiment of the present application;
  • Fig. 20 is a schematic flowchart of another method for interacting with an electronic device provided by an embodiment of the present application;
  • Fig. 21 is a schematic flowchart of another method for interacting with an electronic device provided by an embodiment of the present application;
  • Fig. 22 is a schematic flowchart of another method for interacting with an electronic device provided by an embodiment of the present application;
  • Fig. 23 is a schematic diagram of a control process provided by an embodiment of the present application;
  • Fig. 24 is a schematic diagram of another control process provided by an embodiment of the present application;
  • Fig. 25 is a schematic diagram of another control process provided by an embodiment of the present application;
  • Fig. 26 is a schematic diagram of another control process provided by an embodiment of the present application;
  • Fig. 27 is a schematic diagram of another control process provided by an embodiment of the present application.
  • terms such as "first" and "second" in the specification and claims herein are used to distinguish between different objects, rather than to describe a specific order of objects.
  • for example, a first response message and a second response message are used to distinguish different response messages, rather than to describe a specific order of the response messages.
  • words such as "exemplary" or "for example" are used to present examples or illustrations. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present application shall not be interpreted as more preferred or advantageous than other embodiments or designs. Rather, such words are intended to present related concepts in a concrete manner.
  • "multiple" means two or more; for example, multiple processing units refers to two or more processing units, and multiple components refers to two or more components.
  • Fig. 1 shows an application scenario in some embodiments of the present application.
  • user A is using an electronic device 100, wherein the electronic device 100 may be, but not limited to, a smart TV.
  • the electronic device 100 may also be a TV or another large-screen electronic device used for interaction; for example, the user interface of a smartphone can be transmitted wirelessly and presented on the smart TV, and user operations on the smart TV can likewise act on the smartphone.
  • the electronic device 100 may be provided with an eye tracking device 110 and/or a gesture tracking device 120 .
  • the eye tracking device 110 can be used to detect the movement of the user A's eyes 21
  • the gesture tracking device 120 can be used to detect the gesture and/or hand movement of the user A's hand 22 .
  • user A can use eye movement manipulation to control the electronic device 100 .
  • an eye tracking device 110 may be provided on the electronic device 100 .
  • the user A may intentionally perform some control-related actions through his eyes 21 .
  • the electronic device 100 may generate a control command according to the motion of the eye 21 detected by the eye-tracking device 110 , and output the control command. It can be seen that during the manipulation of the electronic device 100 by the user A, the eyes 21 of the user A need to make frequent deliberate movements, which easily causes eye fatigue of the user A.
  • eye movement manipulation may be replaced by gesture manipulation.
  • the eye tracking device 110 on the electronic device 100 can be replaced by the gesture tracking device 120 .
  • the user A may intentionally make some control-related gestures through his hand 22 .
  • the electronic device 100 can generate a control command according to the gesture made by the hand 22 detected by the gesture tracking device 120, and output the control command.
  • when the control area on the electronic device 100 is large, the distance that user A's hand 22 needs to move also increases, which easily reduces control efficiency and causes fatigue.
  • the present application also provides a solution.
  • the electronic device 100 can be provided with an eye tracking device 110 and a gesture tracking device 120.
  • the eye tracking device 110 on the electronic device 100 can detect the eyes 21 of user A; when it is detected that user A intends to control the electronic device 100, the electronic device 100 can instruct user A to perform gesture control; the gesture tracking device 120 then detects the gesture of user A's hand 22, and the electronic device 100 generates a control command according to the gesture and outputs the control command.
  • in this way, eye movement control and gesture control are combined in an orderly manner, so that the transition between eye movement control and gesture control is imperceptible to the user, realizing efficient, convenient and high-precision control of the electronic device.
  • the electronic device 100 shown in FIG. 1 can also be replaced with other electronic devices, and the replaced solution is still within the protection scope of the present application.
  • the electronic device 100 can be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device and/or a smart city device; the embodiment of the present application does not specifically limit the specific type of the electronic device 100.
  • FIG. 2 shows a schematic structural diagram of the electronic device 100 .
  • the electronic device 100 may include: an eye tracking device 110 , a gesture tracking device 120 , a processor 130 and a memory 140 .
  • the eye movement tracking device 110 may be used to detect eye movement information, eye movement type and/or gaze position of the user's eyes.
  • the eye movement information may include eye movement velocity and/or eye movement acceleration and the like.
  • the eye movement type may include jumping (saccade) and non-jumping, wherein non-jumping refers to an eye movement type other than jumping, such as fixation or smooth tracking.
  • the eye-tracking device 110 may detect the eye movement type through, but not limited to, technologies such as pupil center corneal reflection (PCCR) or appearance-based methods.
  • the eye tracking device 110 may include: an infrared emitter 111 and a camera 112 .
  • the infrared emitter 111 can emit infrared light, which can be reflected on the cornea of user A's eye 21 and project light spots on the cornea.
  • the camera 112 can collect the face image of user A, determine the relative position between the pupil center of the user's eye 21 and the light spot on the cornea through a preset image algorithm, and determine the gaze point and eye movement type of user A's eye 21 based on the relative position.
  • the eye-tracking device 110 can be started together with the electronic device 100, or can be started after obtaining the start instruction.
  • exemplarily, when the electronic device 100 is started, the electronic device 100 may send a start command to the eye tracking device 110.
  • the electronic device 100 may also send an activation instruction to the eye tracking device 110 after obtaining the instruction from the user to start the eye tracking device 110 .
  • exemplarily, the mapping relationship between the relative position of the pupil center of the user's eye relative to the light spot on the cornea and the gaze point of the user on the electronic device 100 can be preset, so that after the relative position between the pupil center of the user's eye and the light spot on the cornea is determined, the gaze position of the user on the electronic device 100 can be determined.
  • for example, when the user gazes at the central area of the left edge of the electronic device 100, the pupil center 41 of the user's eye 21 is located directly to the right of the light spot 42 projected on the cornea, the distance between the two is L1, and the coordinates of this area can be (-X1, 0).
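  • the preset mapping described above can be sketched as a simple calibrated linear function; the function name, gain, and offset values below are hypothetical stand-ins for a per-user calibration, not values from this application:

```python
# Minimal sketch of mapping the pupil-center-to-corneal-glint offset to a
# screen gaze position, assuming a pre-calibrated linear mapping. The
# calibration gains and offsets are hypothetical; a real PCCR system would
# fit them from a per-user calibration procedure.

def gaze_position(pupil_xy, glint_xy, gain=(100.0, 100.0), offset=(0.0, 0.0)):
    """Map the pupil-glint vector (in image pixels) to screen coordinates."""
    dx = pupil_xy[0] - glint_xy[0]
    dy = pupil_xy[1] - glint_xy[1]
    return (gain[0] * dx + offset[0], gain[1] * dy + offset[1])

# Example: pupil center directly to the right of the glint, which in the
# text corresponds to a left-edge area with a negative x coordinate.
print(gaze_position((10.5, 20.0), (10.0, 20.0), gain=(-100.0, 100.0)))
```

in practice the mapping is usually nonlinear and fitted from several calibration points; a linear form is used here only to illustrate the lookup step.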
  • when determining the eye movement type, it may be determined based on, but not limited to, the eye movement velocity, the eye movement acceleration, or the spatial dispersion of the eye movement.
  • the determination of the eye movement type based on the eye movement speed is used as an example for illustration.
  • when the eye movement speed V of the user's eyes exceeds a threshold V1, that is, V ∈ (V1, +∞), the user's eye movement speed is fast, and it can be determined that the eye movement type is jumping; when the eye movement speed of the user's eyes is between V2 and V1, that is, V ∈ [V2, V1], the user's eye movement speed is relatively slow, and it can be determined that the eye movement type is smooth tracking, where V2 is less than V1; when the eye movement speed of the user's eyes is less than V2, that is, V ∈ [0, V2), the user's eye movement speed is very slow, and it can be determined that the eye movement type is fixation.
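  • the speed-threshold classification above can be sketched as follows, where the concrete threshold values V1 and V2 are hypothetical examples (the application does not specify numbers):

```python
# Sketch of the velocity-threshold classification of eye movement types.
# Threshold values are hypothetical (in deg/s); real values would be tuned.

V1 = 100.0  # above this speed: jumping (saccade)
V2 = 30.0   # between V2 and V1: smooth tracking; below V2: fixation

def eye_movement_type(v):
    if v > V1:
        return "jumping"          # V in (V1, +inf)
    if v >= V2:
        return "smooth tracking"  # V in [V2, V1]
    return "fixation"             # V in [0, V2)

print(eye_movement_type(150.0))  # jumping
print(eye_movement_type(50.0))   # smooth tracking
print(eye_movement_type(5.0))    # fixation
```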
  • the number of infrared emitters 111 may be multiple.
  • a plurality of infrared emitters 111 may be arranged on the electronic device 100 at intervals. For example, as shown in FIG. 6 , when the number of infrared emitters 111 is four, the four infrared emitters 111 may be respectively arranged at four corners of the electronic device 100 .
  • the number of cameras 112 may also be multiple, which may be determined according to specific circumstances, and is not limited here.
  • at least part of the functions in the eye tracking device 110 (such as data processing functions, etc.) may be implemented by the processor 130.
  • the eye-tracking device 110 may be integrated on the electronic device 100 or arranged separately, which is not limited here.
  • the gesture tracking device 120 can be used to detect gestures made by the user.
  • the gesture tracking device 120 may detect the gestures made by the user through, but not limited to, methods such as camera tracking based on computer vision or comparison between transmitted signals and reflected signals. Exemplarily, taking the comparison method based on transmitted and reflected signals as an example, as shown in FIG. 7, the gesture tracking device 120 may include a signal transmitter 121, a signal receiver 122 and a signal processing unit 123.
  • the signal transmitter 121 can send signals (such as millimeter wave, infrared light, ultrasound, or wireless fidelity (Wi-Fi) signals); the signal receiver 122 can then receive the signal reflected by the hand 22 of user A; finally, the signal processing unit 123 compares the original signal sent by the signal transmitter 121 with the reflected signal received by the signal receiver 122, tracks the movement of the hand 22 of user A by using principles such as the Doppler effect, phase shift, and time difference, and then determines the gesture made by user A.
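  • as a minimal illustration of the Doppler principle mentioned above, the radial velocity of the hand can be recovered from the frequency shift between the transmitted and reflected signal; the carrier frequency and shift below are hypothetical example values, not parameters from this application:

```python
# Sketch of recovering hand radial velocity from the Doppler shift between
# the transmitted and reflected signal, one of the principles named above.

C = 3e8  # propagation speed of an RF signal (m/s)

def radial_velocity(f_tx, f_rx):
    """For a reflecting target, f_shift = 2 * v * f_tx / c, so solve for v."""
    return (f_rx - f_tx) * C / (2.0 * f_tx)

# Hypothetical 60 GHz millimeter-wave carrier with a 200 Hz measured shift
# corresponds to a hand moving toward the device at about 0.5 m/s.
print(radial_velocity(60e9, 60e9 + 200.0))
```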
  • when the gesture tracking device 120 uses a camera tracking method based on computer vision to detect the gestures made by the user, the gesture tracking device 120 may also include a camera, which can collect images of the user's hand, and a preset image processing algorithm detects the gestures made by the user from these images. Wherein, the camera may be the same camera as the one in the eye tracking device 110, or may be a different camera. In one example, at least some components in the gesture tracking device 120 (such as the signal processing unit 123) can be integrated in the processor 130. In one example, the gesture tracking device 120 may be integrated on the electronic device 100 or arranged separately, or a part may be integrated on the electronic device 100 while another part is arranged separately, which is not limited here.
  • the gesture tracking device 120 can be started together with the electronic device 100, or can be started after obtaining a start command; for example, when the electronic device 100 determines that the eye movement type of the user's eyes is not jumping, the electronic device 100 may send an activation instruction to the gesture tracking device 120.
  • the electronic device 100 may also send an activation instruction to the gesture tracking device 120 after obtaining an instruction from the user to start the gesture tracking device 120.
  • the gesture tracking device 120 may include peripheral components such as an electromyogram (electromyogram, EMG) bracelet or an infrared remote control pen.
  • the signal processing unit in the gesture tracking device 120 can obtain the signals emitted by the peripheral components, and detect the user's gestures and hand movements according to the signals emitted by the peripheral components.
  • the eye tracking device 110 may also be in a working state, so that the eye tracking device 110 continues to acquire the eye movement type of the user's eyes.
  • when the eye tracking device 110 is working, the gesture tracking device 120 can stop working to save power consumption, or the gesture tracking device 120 can continue to work so that it can continue to acquire the user's gestures.
  • the processor 130 is a calculation core and a control core of the electronic device 100 .
  • Processor 130 may include one or more processing units.
  • the processor 130 may include one or more of an application processor (AP), a modem, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units may be independent devices, or may be integrated in one or more processors.
  • the processor 130 may issue control commands to the eye tracking device 110 and/or the gesture tracking device 120, for example: controlling the eye tracking device 110 to start or shut down, so as to control the eye tracking device 110 to acquire or stop acquiring the user's eye movement information; and controlling the gesture tracking device 120 to start or shut down, so as to control the gesture tracking device 120 to acquire or stop acquiring the user's gestures.
  • the processor 130 may determine the user's eye movement type according to the user's eye movement information acquired by the eye movement tracking device 110; for details, refer to the above description of determining the eye movement type based on the eye movement information, which will not be repeated here.
  • the memory 140 can store a program, and the program can be executed by the processor 130, so that the processor 130 can at least execute the method provided in the embodiment of the present application.
  • the memory 140 may also store data.
  • the processor 130 can read data stored in the memory 140 .
  • the memory 140 and the processor 130 may be provided separately.
  • the memory 140 may also be integrated in the processor 130 .
  • the electronic device 100 further includes a display screen (not shown in the figure).
  • the display screen can be used to display images, videos, etc.
  • the display screen may include a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the electronic device 100 may include 1 or N display screens, where N is a positive integer greater than 1. Exemplarily, the electronic device may display information on the display screen indicating that the user performs gesture control.
  • the structure shown in FIG. 2 of the present application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the electronic device 100 may display a user interface as shown in (A) of FIG.
  • the eye movement type may be detected through technologies such as pupil center corneal reflection or appearance-based detection (see the above description for details, which will not be repeated here).
  • the electronic device 100 may instruct the user to perform gesture control.
  • the electronic device 100 may display a cursor 92 to direct the user to gesture control.
  • the electronic device 100 can activate the gesture tracking device 120 on it to acquire the user's operation gesture, and then respond to the user's operation.
  • FIG. 8 shows a schematic flowchart of a method for interacting with an electronic device provided in an embodiment of the present application.
  • the electronic device involved in FIG. 8 may be the electronic device 100 described above in FIG. 2, which has the eye tracking device 110 and the gesture tracking device 120 described above.
  • the method for interacting with an electronic device may include the following steps:
  • the eye movement type of the user's eyes may be acquired in real time or periodically by using the eye movement tracking device 110 described above.
  • the eye movement type may include jumping or non-jumping, where non-jumping means that the eye movement type is not jumping, such as fixation or smooth tracking.
  • the processor 130 in the electronic device 100 can also determine the eye movement type of the user's eyes based on the eye movement information of the user's eyes obtained by the eye movement tracking device 110 in real time or periodically, which is not limited here.
  • the user can be instructed to perform gesture control according to the eye movement type, so that the user learns that the electronic device can subsequently be controlled by gestures.
  • the user may be instructed to perform gesture control.
  • the detection of the user's eye movement type may continue, and at this time, the display interface on the electronic device may not change.
  • when the eye movement type at the previous moment is not jumping, but the eye movement type at the current moment is jumping, the user is no longer instructed to perform gesture control.
  • wherein, non-jumping means that the eye movement type is not jumping.
  • exemplarily, the electronic device 100 has a display screen, the display screen is in a bright state, and the display screen of the electronic device 100 may display content that the user is watching, such as application icons, text, or pictures.
  • a cursor 92 may be displayed on the display screen of the electronic device 100, so as to instruct the user to perform gesture control through the cursor 92.
  • as shown in (B) of FIG. , the user may also be instructed to perform gesture control by enlarging the content being gazed at.
  • the position of the cursor 902 may be the starting position of the gesture operation.
  • the position of the enlarged icon 901 may be the starting position of the gesture operation.
  • the starting position of the gesture operation can be understood as the initial moving position of the cursor when the user performs the gesture operation.
  • voice instructions and other methods can also be used to instruct the user to perform gesture control, which may be determined according to actual conditions, and is not limited here.
  • the gesture tracking device 120 described above may be used to obtain the user's operation gestures, see the above description for details, and will not be repeated here.
  • an operation gesture may also be referred to as a gesture.
  • the electronic device can be controlled according to the operation gesture.
  • exemplarily, the mapping relationship between preset gestures and control instructions can be queried according to the operation gesture, so as to determine the control instruction corresponding to the operation gesture, and the electronic device can then be controlled based on the control instruction.
  • the gaze position of the user's eyes can be determined in real time according to the eye movement type during jumping, so as to determine the user's potential control area, and then the user can be accurately instructed to perform gesture control.
  • the way of determining the gaze position of the user's eyes can refer to the relevant introduction about the eye tracking device 110 in FIG. 2 above, and will not be repeated here.
  • the gaze position of the user's eyes may represent the user's interest area.
  • "determining the gaze position of the user's eyes" can be understood as "controlling the electronic device according to the type of eye movement during jumping".
  • the user when precisely instructing the user to perform gesture control, the user may be instructed to perform gesture control within the determined preset area where the gaze position of the user's eyes is located. For example: continue to refer to (A) of FIG. 9 , when it is determined that the user's gaze position is the position where the icon 91 is, as shown in (B) of FIG. 9 , the cursor 92 can be displayed at the position where the icon 91 is. Similarly, continuing to refer to (A) of FIG. 10 , when it is determined that the user's gaze position is where the icon 91 is located, as shown in (B) of FIG. 10 , the icon 91 can be enlarged.
  • since the user's gaze position can indicate the user's area of interest, instructing the user to perform gesture control there allows the user to directly operate on the area of interest, which avoids the need for the user to reselect the content to be controlled and improves the user experience.
  • the content displayed on the electronic device may be switched according to the current eye movement type.
  • the user interface displayed on the electronic device can be controlled to be switched.
  • the switching direction may be determined by the changing direction of the gaze position of the user's eyes.
  • exemplarily, suppose a file has 10 pages in total, the electronic device can only display 1 page at a time, and the user is browsing page 5. When it is determined that the eye movement type of the user's eyes is jumping and the gaze position of the user's eyes moves from the top to the bottom of the electronic device, then, when the gaze position of the user's eyes reaches the bottom of the display area on the electronic device, the electronic device can switch the displayed content of the file from page 5 to page 6. When it is determined that the eye movement type of the user's eyes is jumping and the gaze position of the user's eyes moves from the bottom to the top of the electronic device, then, when the gaze position of the user's eyes reaches the top of the display area on the electronic device, the electronic device can switch the displayed content of the file from page 5 to page 4.
  • "when the eye movement type of the user's eyes is jumping, control and switch the content displayed on the electronic device according to the eye movement type at this time" can be understood as "controlling the electronic device according to the eye movement type of jumping".
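  • the page-switching behavior described above can be sketched as follows; the function and event names are illustrative stand-ins, not part of this application:

```python
# Sketch of the page-switching logic: when the eye movement type is jumping
# and the gaze reaches the top/bottom of the display area, switch the page.

TOTAL_PAGES = 10  # the 10-page file from the example above

def switch_page(current_page, eye_type, gaze_moved, gaze_at):
    """gaze_moved: 'down' or 'up'; gaze_at: 'bottom' or 'top' of display."""
    if eye_type != "jumping":
        return current_page
    if gaze_moved == "down" and gaze_at == "bottom":
        return min(current_page + 1, TOTAL_PAGES)
    if gaze_moved == "up" and gaze_at == "top":
        return max(current_page - 1, 1)
    return current_page

print(switch_page(5, "jumping", "down", "bottom"))  # 6
print(switch_page(5, "jumping", "up", "top"))       # 4
print(switch_page(5, "fixation", "down", "bottom")) # 5 (no switch)
```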
  • when the electronic device is controlled according to the eye movement type during jumping, the electronic device may be restricted from acquiring the user's gestures, so as to avoid interference from the user's gestures.
  • restricting the electronic device from acquiring the user's gesture may include: the electronic device does not acquire the user's gesture, or, although the electronic device acquires the user's gesture, it does not process the gesture.
  • the gesture tracking device 120 can be controlled to stop working, or the gesture tracking device 120 can be continuously controlled to work, but the acquired gestures are not processed.
  • in a possible implementation, when the eye movement type of the user's eyes is not jumping, the content used to instruct the user to perform gesture control, such as the cursor, can move following the user's gaze area.
  • exemplarily, the cursor 92 can be displayed in the area 1101 on the display screen of the electronic device 100 to instruct the user to perform gesture control; then, when the user's eye movement type is smooth tracking and the gaze area moves to the area 1102 on the display screen of the electronic device 100, as shown in (B) of FIG. 11, the cursor 92 can be controlled to move from the area 1101 to the area 1102.
  • taking enlarged content as an example of instructing the user to perform gesture control, as shown in (A) of FIG. , the icon 1202 in the area 1201 can be enlarged to instruct the user to perform gesture control; when the gaze area moves to the area 1203, the icon 1202 in the area 1201 can be controlled to return to its initial state (that is, the non-enlarged state), and the icon 1204 in the area 1203 can be enlarged.
  • when the user's hand needs to move a large distance during gesture control, the user's hand is easily fatigued, resulting in poor user experience.
  • when the hand 22 of user A is allowed to move only within the area 50 to complete control of the electronic device 100, the fatigue caused by large hand movements can be alleviated; however, since the display area 51 on the electronic device 100 is obviously larger than the area 50, the moving distance of the hand 22 of user A in the area 50 is often mapped to a relatively large distance in the display area 51 on the electronic device 100, which reduces the control precision and makes it difficult for the user to precisely control the electronic device 100.
  • in view of this, the user's gaze area can be determined based on the user's gaze position, and the gaze area can be mapped to an operation area for the user's hand, where the ratio between the size of the gaze area and the size of the operation area can be set in advance, for example 1:1, 1:0.8, or 1:1.2, so that the user's hand can operate within the operation area (of course, the user's hand can also move beyond the operation area during operation; when the eye movement type of the user's eyes is not jumping, the cursor and other content used to instruct the user to perform gesture control can follow the user's hand to move outside the gaze area), thereby completing control of the content in the gaze area and improving operating comfort.
  • exemplarily, suppose the user's gaze position is position x and the gaze area determined based on position x is the area 53, where the area 53 is mapped to the area 52, the operation area at the position where user A is located; user A can then control his hand 22 to make control gestures in the area 52, that is, control the content in the area 53, so that the moving distance of the user's hand is kept within an area comfortable for the user, improving the user's operating experience.
  • wherein, the size of the area 53 and the size of the area 52 can be the same, so that the moving distance of the hand 22 of user A in the area 52 is mapped to the area 53 at a 1:1 ratio, thereby precisely controlling the electronic device 100 and improving the control precision.
  • the ratio between the size of the gaze area and the size of the operation area mainly determines the scale of the operation; for example, when very high-precision manipulation is required, the ratio between the size of the gaze area and the size of the operation area can be 2:1, etc., and when lower-precision manipulation is sufficient, the ratio between the size of the gaze area and the size of the operation area can be 1:2 or the like.
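  • the ratio-based mapping between the gaze area and the operation area can be sketched as follows, with the ratio expressed as gaze size : operation size (the function name and example sizes are illustrative, not from this application):

```python
# Sketch of deriving the operation-area size from the gaze-area size using
# a preset ratio, as described above (1:1, 2:1, 1:2, ...).

def operation_area_size(gaze_w, gaze_h, ratio=(1, 1)):
    """ratio = (gaze_part, operation_part), e.g. (2, 1) for 2:1."""
    g, o = ratio
    return (gaze_w * o / g, gaze_h * o / g)

print(operation_area_size(300, 200, ratio=(1, 1)))  # same size, 1:1 mapping
print(operation_area_size(300, 200, ratio=(2, 1)))  # smaller operation area
print(operation_area_size(300, 200, ratio=(1, 2)))  # larger operation area
```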
  • the gaze area may be an area extending outward for a preset distance centered on the gaze position of the user.
  • the gaze area may also be calculated in real time or periodically based on factors such as the gaze position of the user, the distance between the user and the electronic device, or the angle at which the user watches the electronic device.
  • exemplarily, the distance between the user and the electronic device can be used to determine a target distance S, and the gaze area can then be obtained by extending the target distance S outward centered on the user's gaze position.
  • the gaze area may be a regular shape such as a circle or a rectangle, or may be an irregular shape.
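  • as one possible way to compute such a gaze area from the user-device distance, the area can be treated as a circle subtending a fixed visual angle; the 5-degree half-angle below is a hypothetical choice, not taken from this application:

```python
# Sketch of computing the gaze-area radius (target distance S) from the
# user-device distance, assuming the gaze area subtends a fixed visual angle.
import math

def gaze_radius(distance_mm, half_angle_deg=5.0):
    """Target distance S extended outward from the gaze position."""
    return distance_mm * math.tan(math.radians(half_angle_deg))

# At a 500 mm viewing distance the gaze area extends roughly 44 mm outward.
print(round(gaze_radius(500.0), 1))
```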
  • information instructing the user to perform gesture control may be placed in the gaze area in S802 above.
  • exemplarily, when the determined gaze area is the area 54 and the cursor 92 is used to instruct the user to perform gesture control, the cursor 92 may be placed in the gaze area 54.
  • the indication may also be performed in the determined gaze area, which will not be repeated here.
  • when the cursor is used to instruct the user to perform gesture control, in order to prevent the cursor from blocking the content the user is watching, the cursor can be displayed at a position within the gaze area that does not block the user's line of sight.
  • the display position of the cursor in the gaze area may be determined according to the type of application currently being used by the user; for example, if the application that the user is currently using is a reading application, the cursor can be displayed at a position in the gaze area where no text is displayed, and if the application that the user is currently using is a video application, the cursor may be displayed at the lower left corner of the gaze area, and so on.
  • in a possible implementation, a preset area can be set on the electronic device, and the user is instructed to perform gesture control according to the eye movement type only when the user gazes at the preset area. Exemplarily, it may be determined based on the user's gaze position whether the user's gaze area is the preset area; for example, the user's gaze position may be compared with the preset area to determine whether the user's gaze position is located in the preset area.
  • the electronic device may determine the scene where the user uses the electronic device based on the type of the application (APP) being used by the user and the content displayed on the APP; for example, when the APP being used by the user is a video application and a video is being played on the APP, it can be determined that the scene where the user uses the electronic device is a scene of watching a video.
  • FIG. 17 shows a process of controlling an electronic device.
  • the electronic device is equipped with the eye tracking device and the gesture tracking device described above, and a cursor is used when instructing the user to perform gesture control.
  • the process may include the following steps:
  • the eye tracking device 110 described above may be used to perform eye tracking, see the above description for details, and details will not be repeated here.
  • the eye movement type may be analyzed according to the tracked eye movement data, for example, according to the eye movement speed of the user's eyes; see the above description for details, which will not be repeated here.
  • after the eye movement type is determined, it can be determined whether the eye movement type is jumping. Wherein, when the eye movement type is jumping, it indicates that the current user has no intention of manipulating the electronic device; therefore, at this time, S1304 is executed, that is, the execution returns to S1301. When the eye movement type is not jumping, it indicates that the current user has an intention to manipulate the electronic device, so at this time the precise control stage can be entered, that is, S1305 is executed.
  • the gaze position of the user may be determined, so as to subsequently determine the operation area.
  • the manner of determining the gaze position refer to the above description for details, and details will not be repeated here.
  • a gazing area can be determined according to the gazing position, and then a cursor can be displayed at an appropriate position in the gazing area.
  • the gaze area may be the area 53 described in FIG. 14
  • the display cursor may be the display manner described in FIG. 16 . See the above description for details, and will not repeat them here.
  • the user's hand may be tracked by the gesture tracking device 120 described above, see the above description for details, and will not be repeated here.
  • the cursor can be controlled to follow the user's hand movement according to a specified scale.
  • exemplarily, as shown in FIG. 18, at time T1, user A's hand 22 is at the position 521 in the area 52, and the cursor 92 is at the position 531 in the area 53; at time T2, user A's hand 22 moves to the position 522 in the area 52, and the cursor 92 moves with the hand 22 according to the prescribed scale to the position 532 in the area 53.
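  • the cursor-follows-hand mapping at a prescribed scale can be sketched as follows; the area origins and coordinates are hypothetical values standing in for the areas 52 and 53:

```python
# Sketch of moving the cursor with the hand "according to a prescribed
# scale": the hand's offset inside the operation area is mapped to a cursor
# offset inside the gaze area.

def hand_to_cursor(hand_xy, op_origin, gaze_origin, scale=1.0):
    """Map a hand position in the operation area to a cursor position."""
    cx = gaze_origin[0] + (hand_xy[0] - op_origin[0]) * scale
    cy = gaze_origin[1] + (hand_xy[1] - op_origin[1]) * scale
    return (cx, cy)

# T1: hand at (10, 10) in the operation area -> cursor in the gaze area;
# T2: hand moves to (30, 20) -> cursor follows at a 1:1 scale.
print(hand_to_cursor((10, 10), (0, 0), (100, 50)))
print(hand_to_cursor((30, 20), (0, 0), (100, 50)))
```

a scale other than 1.0 would realize the 2:1 or 1:2 area ratios discussed earlier.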
  • the gesture tracking device 120 described above may perform hand tracking on the user's hand, and then determine whether there is a gesture operation. Wherein, when there is gesture operation, execute S1310; otherwise, return to execute S1307.
  • a control instruction corresponding to the currently determined gesture is generated, and the control instruction is output to complete the corresponding operation.
  • exemplarily, when the determined gesture is a confirmation gesture, a confirmation instruction is output, thereby completing the operation of confirming and selecting "setting".
  • in addition, in the gesture control stage, it is possible to continue to judge the user's eye movement type; when the eye movement type is jumping, the cursor is controlled to disappear and the process returns to S1301, and when the eye movement type is not jumping, hand tracking continues, that is, S1306 is executed.
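  • the overall flow (S1301-S1310) can be condensed into a small state-machine sketch; the state names and event encodings are illustrative assumptions, not part of this application:

```python
# Condensed sketch of the control flow: track eyes, enter the precise
# control stage when the eye movement type is not jumping, then track the
# hand until a gesture operation occurs. Events are (kind, value) tuples.

def control_step(state, event):
    """state: 'eye_tracking' or 'hand_tracking'; returns (state, action)."""
    kind, value = event
    if state == "eye_tracking":
        # S1303: a non-jumping type indicates an intention to manipulate,
        # so enter the precise control stage and show the cursor (S1305).
        if kind == "eye" and value != "jumping":
            return "hand_tracking", "show_cursor"
        return "eye_tracking", None
    # Gesture control stage: jumping hides the cursor and returns to S1301.
    if kind == "eye" and value == "jumping":
        return "eye_tracking", "hide_cursor"
    if kind == "gesture":
        return "hand_tracking", "execute:" + value  # S1310
    return "hand_tracking", None

state, action = control_step("eye_tracking", ("eye", "fixation"))
print(state, action)  # hand_tracking show_cursor
state, action = control_step(state, ("gesture", "confirm"))
print(state, action)  # hand_tracking execute:confirm
```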
  • the sequence numbers of the steps in the embodiments of the present application do not imply an order of execution; the execution order of each process should be determined by its functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
  • the execution order of each execution step in the embodiment of the present application may be adjusted and/or selectively executed according to actual conditions, which is not limited here.
  • the eye movement control and gesture control mentioned in the embodiments of this application can be combined as needed, for example: eye movement control first and then gesture control, or gesture control first and then eye movement control, or eye movement control and gesture control Simultaneously etc.
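The patent distinguishes a saccadic eye movement type from non-saccadic types (such as fixation or smooth pursuit) but does not specify how the distinction is computed. A common approach in eye tracking is velocity-threshold identification (I-VT): gaze moving faster than a threshold angular velocity is classified as a saccade. The sketch below uses an illustrative 30 deg/s threshold; the function name and threshold are assumptions, not part of the patent.

```python
# Hypothetical velocity-threshold (I-VT style) classifier for the saccadic /
# non-saccadic distinction used throughout the description. The threshold
# value of 30 deg/s is an illustrative assumption.

def classify_eye_movement(prev_deg, cur_deg, dt_s, threshold_deg_per_s=30.0):
    """Classify one gaze sample pair by angular velocity."""
    velocity = abs(cur_deg - prev_deg) / dt_s
    return "saccadic" if velocity > threshold_deg_per_s else "non-saccadic"

slow = classify_eye_movement(10.0, 10.2, 0.02)   # ~10 deg/s drift
fast = classify_eye_movement(10.0, 14.0, 0.02)   # ~200 deg/s jump
```

A slow drift (reading, smooth pursuit) stays below the threshold and is treated as non-saccadic, while a rapid jump between targets exceeds it and is treated as saccadic.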
  • the electronic device may obtain an eye movement type of the user's eyes.
  • the electronic device controls the electronic device according to the first type.
  • the electronic device instructs the user to perform gesture control.
  • the electronic device may acquire the user's gesture.
  • the electronic device may control the electronic device according to the acquired gesture.
  • the electronic device may first acquire a user's gesture. Then, at S2002, the electronic device is controlled according to the acquired gesture. Next, at S2003, the electronic device may obtain the user's eye movement type; the electronic device may also execute S2003 while acquiring the user's gesture or while controlling the electronic device. Then, at S2004, when it is determined that the user's eye movement type is the second type (for example: non-saccadic), the electronic device may continue to acquire the user's gesture and, at S2005, control the electronic device according to the acquired gesture.
  • when it is determined that the user's eye movement type is the first type (for example: saccadic), the electronic device can control itself according to the eye movement type at this time and restrict control according to the user's gesture; for example, the electronic device may not acquire the user's gesture, or may acquire the user's gesture but not process it.
  • the electronic device controls itself according to the first type of eye movement.
  • the electronic device determines that the eye movement type of the user's eyes switches from the first type (for example: saccadic) to the second type (for example: non-saccadic).
  • the electronic device can then instruct the user to perform gesture control; the electronic device can then acquire the user's gesture and control the electronic device according to the acquired gesture.
  • for the process by which the electronic device acquires the eye movement type and the gesture, and the process of controlling the electronic device, refer to the relevant description above, which will not be repeated here.
  • the electronic device may first obtain eye movement information of the user.
  • the eye movement type determined by the electronic device according to the eye movement information is the first type (for example: saccadic); then, at S2103, the electronic device may control itself according to the eye movement type at this time.
  • the electronic device continues to obtain the user's eye movement information; then, at S2105, the electronic device determines that the eye movement type according to the eye movement information is the second type (for example: non-saccadic); then, at S2106, the electronic device can acquire the user's gesture, and at S2107 control the electronic device according to the acquired gesture.
  • the electronic device can continue to acquire the user's eye movement information.
  • the eye movement type determined by the electronic device from the eye movement information is the first type (for example: saccadic)
  • the electronic device can control itself according to the eye movement type at this time and restrict acquisition of the user's gestures. It can be understood that, for the process by which the electronic device acquires the eye movement type and the gesture, and the process of controlling the electronic device, refer to the relevant description above, which will not be repeated here.
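The mode switching described in the flows above (saccadic eye movement puts the device under eye-movement control and suppresses gesture input; non-saccadic eye movement enables gesture control) can be sketched as a two-state controller. Class and method names are illustrative assumptions, not from the patent.

```python
# Minimal sketch of the eye/gesture mode switching described above.
# All names are illustrative assumptions.

class InteractionController:
    def __init__(self):
        self.mode = "eye"  # start under eye-movement control

    def on_eye_sample(self, eye_type):
        # Saccadic -> eye-movement control, gestures restricted;
        # non-saccadic -> gesture control enabled.
        self.mode = "eye" if eye_type == "saccadic" else "gesture"

    def on_gesture(self, gesture):
        # In eye mode the gesture is not processed (the device may not even
        # acquire it); in gesture mode it yields a control instruction.
        if self.mode != "gesture":
            return None
        return f"execute:{gesture}"

ctrl = InteractionController()
ctrl.on_eye_sample("saccadic")
ignored = ctrl.on_gesture("confirm")    # restricted while saccadic
ctrl.on_eye_sample("non-saccadic")
executed = ctrl.on_gesture("confirm")   # processed after the switch
```

The same gesture is ignored during a saccade and executed once the eye movement type switches to non-saccadic, mirroring S2101 to S2107.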
  • the electronic device may acquire a first gesture performed by the user on the first area on the electronic device.
  • the electronic device may control the electronic device according to the first gesture acquired at this time.
  • the electronic device may acquire the eye movement type of the user.
  • the electronic device may control the electronic device according to the current eye movement type, and restrict the electronic device from acquiring the user's gestures.
  • the electronic device may continue to acquire the eye movement type of the user.
  • when it is determined that the user's eye movement type switches from the first type (for example: saccadic) to the second type (for example: non-saccadic), and the user's gaze position switches from the first area on the electronic device to the second area, the electronic device can instruct the user to perform gesture control.
  • the electronic device may acquire a second gesture performed by the user on the second area, and at S2208 control the electronic device according to the acquired second gesture.
  • the process described in FIG. 22 can be understood as follows: the user first performs gesture control on one area of the electronic device, then shifts his or her gaze to another area of the electronic device, and then performs gesture control on the new area.
  • the user performs gesture control on the content in area 53; the display then switches from (D) of FIG. 24 to (E) of FIG. 24.
  • the user's eye movement type acquired by the electronic device is saccadic; at this time, the electronic device can determine the gaze position of the user's eyes in real time (that is, control the electronic device), and during this process the electronic device no longer acquires the user's gestures.
  • the electronic device continues to acquire the user's eye movement type; at this time the user's eye movement type switches from saccadic to non-saccadic, and the user's gaze position also switches from the position in (D) of FIG. 24 to the position of picture 63.
  • the electronic device can then instruct the user to perform gesture control, subsequently acquire the gestures performed by the user on the area where picture 63 is located, and control the electronic device according to the acquired gestures.
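The region switch in FIG. 22/FIG. 24 can be sketched as gaze-gated targeting: gesture control applies to the region currently fixated, and during a saccade there is no gesture target until fixation resumes on the new region. The function name and region labels are illustrative assumptions.

```python
# Sketch of the region switch described above: the gesture target follows the
# user's fixation and is cleared during saccades. Names are illustrative.

def gesture_target(samples):
    """samples: list of (eye_type, gaze_region).
    Returns the region gesture control currently applies to, or None
    while the eyes are mid-saccade."""
    target = None
    for eye_type, region in samples:
        target = None if eye_type == "saccadic" else region
    return target

target = gesture_target([
    ("non-saccadic", "area 53"),     # gesture control on area 53
    ("saccadic", None),              # gaze shifting: gestures restricted
    ("non-saccadic", "picture 63"),  # gesture control moves to picture 63
])
```

After the saccade ends on picture 63, subsequent gestures act on that area, as the passage describes.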
  • multiple items are displayed on the display screen of the electronic device 100, such as applications, files, pictures, videos and so on.
  • user A can browse items on the electronic device 100; during user A's browsing, the eye movement tracking device matched with the electronic device 100 can determine that the eye movement type of user A's eyes is saccadic, and at this time it continues to judge the eye movement type of user A's eyes.
  • user A's eyes then begin to fixate on item 61, and the eye tracking device on the electronic device 100 can determine that the eye movement type of user A's eyes is not saccadic.
  • the gaze position of the user A's eyes can be determined by using the eye tracking device, and then the gaze area 53 can be determined, and the cursor 92 can be displayed at the lower left corner of the gaze area 53 .
  • user A can use his hand 22 to perform gesture operations in the operation area 52 corresponding to the gaze area 53. However, at this point user A changes his mind and resumes browsing the items.
  • the eye tracking device can be used to determine that the eye movement type of user A's eyes is saccadic.
  • the cursor 92 is no longer displayed, that is, the cursor 92 is controlled to disappear, and the interface displayed by the electronic device 100 shown in (C) of FIG. 23 is presented.
  • the cursor 92 can be placed on item 61, so that user A can select item 61 directly, without first moving the cursor 92.
  • the user A's eyes start to focus on the item 62 .
  • the eye tracking device on the electronic device 100 can determine that the eye movement type of user A's eyes is not saccadic.
  • the gaze position of user A's eyes can be determined by using the eye tracking device matched with the electronic device 100 , and then the gaze area 53 can be determined, and the cursor 92 can be displayed at the lower left corner of the gaze area 53 .
  • user A can use his hand 22 to perform gesture operations in the operation area 52 corresponding to the gaze area 53 .
  • the cursor 92 can be placed on item 62, so that user A can select item 62 directly, without first moving the cursor 92.
  • the user A moves his hand 22 as shown in (E) of FIG. 23 .
  • the gesture tracking device matched with the electronic device 100 can track the movement of the hand 22 .
  • the electronic device 100 can control the cursor 92 in its gaze area 53 to move along with the hand 22 .
  • when the cursor 92 moves to item 62, user A can complete the "confirmation" operation through the confirmation gesture; at this point, the operation of selecting item 62 is completed.
  • content such as text may be displayed on the electronic device 100 , and the user may read the text on the electronic device 100 at this time.
  • the scene may be a meeting, education and other scenes.
  • the user A reads the text being displayed on the display screen of the electronic device 100 .
  • the eye movement tracking device matched with the electronic device 100 detects that the eye movement type is smooth pursuit.
  • the gaze position of user A's eyes can then be determined by using the eye tracking device, and the gaze area 53 can be determined; that is, the eye movement type of the user's eyes is not saccadic, and at this time a cursor 92 can be displayed within the gaze area 53.
  • the cursor 92 may appear in a place that does not block the reading line of sight.
  • the operation area 52 corresponding to the gaze area can also be determined in (B) of FIG. 24 , and the hand 22 of the user A can make a corresponding gesture operation in the operation area 52 .
  • the user A moves his hand 22 as shown in (C) of FIG. 24 .
  • the gesture tracking device matched with the electronic device 100 can track the movement of the hand 22 .
  • the electronic device 100 can control the cursor 92 in its gaze area 53 to move along with the hand 22 .
  • when the cursor 92 moves to the position where user A wants to mark, user A can stop moving his hand 22 and make a start-marking gesture to begin marking.
  • user A wants to mark text on the picture 63 , at this time, user A can move the cursor 92 by moving his hand 22 .
  • the user A can use a text writing gesture to write text in the manipulation area 52 , for example, write "smile" and so on.
  • the gesture tracking device on the electronic device 100 can track the writing track of user A's hand 22, and the display screen of the electronic device 100 can then display the same content as the user's writing track, for example, display "smile".
  • user A can use a specific gesture to switch the writing position to the left side of the operation area 52, and then continue writing.
  • the type of annotation includes but not limited to pattern, text, symbol and so on.
  • close-range touch marking requires the user to approach the screen each time a mark is made, which is inconvenient; remote voice marking, for its part, can only input text information, makes it difficult to input pattern information, and its speech recognition is also affected by environmental noise.
  • users can complete natural and efficient annotation of various information remotely, which improves the efficiency of annotation and user experience.
  • the scene may be an entertainment scene.
  • the scene may be a game scene.
  • the user may be experiencing first-person shooting games, somatosensory games, puzzle games, chess and card games, and the like.
  • the following uses a first-person shooter game as an example to introduce this scenario.
  • the eye movement tracking device matched with the electronic device 100 detects that the eye movement type is fixation. Then, the gaze position of user A's eyes can be determined by using the eye tracking device, the gaze area 53 can be determined, and the cursor 92 for aiming can be displayed in the gaze area 53. Afterwards, user A's hand 22 can make a shooting gesture in the operation area 52 corresponding to the gaze area 53 to complete the operation of shooting the target 71.
  • the user A finds the target 72 and shifts his gaze to the target 72 .
  • the eye movement tracking device matched with the electronic device 100 detects that the eye movement type is saccadic, and at this time the cursor 92 can be controlled to disappear.
  • the eye movement tracking device matched with the electronic device 100 detects that the eye movement type is fixation.
  • a cursor 92 for aiming then appears within the gaze area 53.
  • since the cursor 92 is not completely on the target 72, user A can adjust the cursor 92 to improve shooting accuracy.
  • the hand 22 of user A can make a movement gesture in the operation area 52 corresponding to the gaze area 53, and at this time, the cursor 92 can follow the hand 22 of user A to move .
  • the user A's hand 22 can make a shooting gesture in the operation area 52 corresponding to the gaze area 53 to complete the operation of shooting the target 72 .
  • this scene may be a usage scenario of a virtual large-screen interface created by electronic devices such as virtual reality (VR), augmented reality (AR), or mixed reality (MR) devices.
  • the eye tracking device matched with the electronic device 100 can determine that the eye movement type of user A is not saccadic. Afterwards, the eye tracking device can be used to determine the gaze position of user A's eyes, the gaze area 53 can be determined, and a cursor 92 is displayed in the gaze area 53 to instruct user A to perform gesture control. At this time, user A can use his hand 26 to perform gesture operations in the operation area 52 corresponding to the gaze area 53.
  • the user A wants to close the item 64 , and the user A can move his hand 26 in the operation area 52 .
  • the gesture tracking device matched with the electronic device 100 can track the movement of the hand 26 .
  • the electronic device 100 can control the cursor 92 in its gaze area 53 to move along with the hand 26 .
  • user A may stop moving hand 26 when the cursor 92 moves over item 64.
  • user A can make a confirmation gesture with his hand 26 to complete the "confirmation” operation.
  • the gesture tracking device matched with the electronic device 100 can track the “confirm” operation of the hand 26 .
  • the electronic device 100 can close the item 64 on its FOV. So far, the operation of closing item 64 is completed.
  • the scene may be an entertainment scene.
  • the scene may be a scene of watching a video (such as watching a movie, etc.).
  • a preset area is set on the electronic device. When the user looks at the preset area, it can be judged that the user currently intends to control the electronic device; when the user looks at other areas, it can be judged that the user does not currently intend to control the electronic device. It is understandable that when a user watches a video, the eye movement type of the user's eyes is not saccadic in most cases; such a design therefore avoids misrecognition and improves user experience.
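The preset-area gating just described can be sketched as a simple check: control intent is recognized only when a non-saccadic gaze lands inside the preset area. The function name, coordinates, and area bounds below are illustrative assumptions, not from the patent.

```python
# Sketch of the preset-area gating described above: steady (non-saccadic)
# gaze inside a preset area (e.g., area 56) signals control intent; gaze
# elsewhere, or during a saccade, does not. Bounds are illustrative.

PRESET_AREA = (0, 0, 400, 100)  # x, y, width, height

def has_control_intent(eye_type, gaze_xy, area=PRESET_AREA):
    if eye_type == "saccadic":
        return False
    x, y, w, h = area
    gx, gy = gaze_xy
    return x <= gx < x + w and y <= gy < y + h

inside = has_control_intent("non-saccadic", (10, 10))    # steady gaze in area
outside = has_control_intent("non-saccadic", (500, 10))  # watching the video
```

While watching a video the gaze stays outside the preset area, so no control UI is triggered, which is the misrecognition-avoidance property the passage claims.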
  • the eye movement tracking device matched with the electronic device 100 can detect the user's eye movement type in real time or periodically.
  • the user A is looking at the area 56 at this time.
  • the area 56 is a preset area on the electronic device 100 .
  • the eye tracking device matched with the electronic device 100 can detect that the eye movement type of user A is not saccadic and determine that the gaze position of user A's eyes is in area 56; the display then turns to (C) of FIG. 27.
  • the eye tracking device can be used to determine the gaze position of the user A's eyes, and then the gaze area 53 can be determined, and the playback progress and the cursor 92 can be displayed in the gaze area 53, At this time, the cursor 92 may be displayed in the lower left corner of the gaze area 53 .
  • the operation area 52 corresponding to the gaze area 53 can also be determined in (C) of FIG. 27 , and the hand 22 of the user A can perform corresponding gesture operations in the operation area 52 . It can be understood that when a user watches a video, he generally wants to adjust the progress or volume, etc.
  • the cursor 92 can be displayed at the position where the progress is adjusted or the volume is adjusted.
  • the volume adjustment key and the playback progress bar can be displayed at the same time at this time, so that the user can make a selection based on needs.
  • the electronic device 100 can also be provided with an area corresponding to the volume adjustment and an area corresponding to the playback progress. When the user looks at the area corresponding to the volume adjustment, the volume adjustment key will be displayed. When the user looks at the area corresponding to the playback progress area, the playback progress bar will be displayed.
  • a target area corresponding to the target function may be set on the electronic device 100 , and when the user looks at the target area, control keys related to the target function will be displayed on the electronic device.
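The target-area design just described (a volume area reveals volume keys, a progress area reveals the progress bar) can be sketched as a lookup from gaze position to the controls to display. Region names and bounds are illustrative assumptions, not from the patent.

```python
# Sketch of mapping preset target areas to the controls they reveal, as
# described above. Region names and bounds are illustrative assumptions.

AREAS = {
    "volume": (0, 100, 100, 200),    # x0, y0, x1, y1
    "progress": (0, 300, 400, 340),
}
CONTROLS = {"volume": "volume keys", "progress": "progress bar"}

def controls_for_gaze(gaze_xy):
    """Return the control keys to display for a gaze position, if any."""
    gx, gy = gaze_xy
    for name, (x0, y0, x1, y1) in AREAS.items():
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return CONTROLS[name]
    return None
```

Gazing at the volume area reveals the volume keys, gazing at the progress area reveals the progress bar, and gaze anywhere else reveals nothing, matching the target-function design.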
  • the user A moves his hand 22 as shown in (D) of FIG. 27 .
  • the gesture tracking device matched with the electronic device 100 can track the movement of the hand 22 .
  • the electronic device 100 can control the cursor 92 in its gaze area 53 to move along with the hand 22 .
  • when the cursor 92 moves to a position that user A wants to adjust, user A can stop moving his hand 22.
  • user A can make a gesture of selecting the progress bar and drag the progress bar.
  • the gesture tracking device matched with the electronic device 100 can track the movement of the hand 22 .
  • the electronic device 100 can control the cursor 92 in its gaze area 53 to move along with the hand 22 .
  • when the cursor 92 moves to the desired position, user A can stop moving his hand 22.
  • the eye tracking device matched with the electronic device 100 can detect that the eye movement type of user A is saccadic, so the cursor 92 can be controlled to disappear.
  • seamless interaction between eye movement control and gesture control allows users to naturally and efficiently adjust playback progress, volume, and the like in entertainment scenes, and also enables higher-precision operation, thereby providing a high-quality control and viewing experience.
  • processor in the embodiments of the present application may be a central processing unit (central processing unit, CPU), and may also be other general processors, digital signal processors (digital signal processor, DSP), application specific integrated circuits (application specific integrated circuit, ASIC), field programmable gate array (field programmable gate array, FPGA) or other programmable logic devices, transistor logic devices, hardware components or any combination thereof.
  • a general-purpose processor can be a microprocessor, or any conventional processor.
  • the method steps in the embodiments of the present application may be implemented by means of hardware, or may be implemented by means of a processor executing software instructions.
  • the software instructions may consist of corresponding software modules, and the software modules may be stored in random access memory (RAM), flash memory, read-only memory (ROM), programmable read-only memory (programmable ROM, PROM), erasable programmable read-only memory (erasable PROM, EPROM), electrically erasable programmable read-only memory (electrically EPROM, EEPROM), a register, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may also be a component of the processor.
  • the processor and storage medium can be located in the ASIC.
  • all or part of them may be implemented by software, hardware, firmware or any combination thereof.
  • When implemented using software, it may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the processes or functions according to the embodiments of the present application will be generated in whole or in part.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in or transmitted via a computer-readable storage medium.
  • the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (such as infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrated with one or more available media.
  • the available medium may be a magnetic medium (such as a floppy disk, a hard disk, or a magnetic tape), an optical medium (such as a DVD), or a semiconductor medium (such as a solid state disk (solid state disk, SSD)), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method for interacting with an electronic device, and an electronic device, relating to the field of terminal technologies. According to the method, during operation of the electronic device, the eye movement type of a user's eyes can be detected; when the eye movement type is a saccade, the electronic device is controlled according to the eye movement type at that moment, and when the eye movement type is not a saccade, the user can be instructed to perform gesture control. After the user is instructed to perform gesture control, the user's operation gesture can be acquired, and the electronic device is controlled according to the acquired gesture. Eye movement control and gesture control are thus combined on the electronic device, enabling seamless interaction between them, so that the user can control the electronic device efficiently and conveniently, which improves user experience.

Description

A method for interacting with an electronic device, and an electronic device
This application claims priority to the Chinese patent application No. 202111329181.9, filed with the China National Intellectual Property Administration on November 10, 2021 and entitled "A Method for Interacting with an Electronic Device, and an Electronic Device", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a method for interacting with an electronic device, and an electronic device.
Background
Electronic devices have become an indispensable part of people's lives. To control an electronic device, people usually use a handheld remote control or control buttons on the device itself, which is cumbersome.
In the related art, an electronic device can be controlled by detecting the gaze point and movements of the user's eyes and matching them against preset gaze points and movements. During such control, however, the user's eyes need to frequently make deliberate movements, which easily causes eye fatigue. How to control an electronic device efficiently and conveniently is therefore a technical problem that urgently needs to be solved.
Summary
The present application provides a method for interacting with an electronic device, an electronic device, a computer storage medium, and a computer program product, which combine eye movement control with gesture control to enable efficient and convenient control of the electronic device and improve user experience.
In a first aspect, the present application provides a method for interacting with an electronic device, the method including: the electronic device acquires the eye movement type of a user's eyes; when it is determined that the eye movement type is a first type, the electronic device controls itself according to the first type; when it is determined that the eye movement type is a second type, the electronic device instructs the user to perform gesture control; the electronic device acquires the user's gesture; and the electronic device controls itself according to the acquired gesture. By combining eye movement control and gesture control on the electronic device, seamless interaction between the two is achieved, so that the user can control the electronic device efficiently and conveniently, which improves user experience. Exemplarily, the first type may be saccadic, and the second type may be non-saccadic, for example, fixation or smooth pursuit.
According to the first aspect, instructing the user to perform gesture control specifically includes: acquiring the user's gaze position; and instructing the user to perform gesture control within a first preset area where the gaze position is located. In this way, the user is instructed to perform gesture control in an area matching the user's gaze position, which improves the accuracy of the indication and the convenience of subsequent operations. Exemplarily, the first preset area may be understood as either a surface or a point. When understood as a surface, the first preset area may be the area formed by extending a preset distance outward from the user's gaze position as its center; when understood as a point, the first preset area may be the point at which the user's gaze position is located, that is, the user's gaze point.
According to the first aspect, or any implementation of the first aspect above, instructing the user to perform gesture control specifically includes: determining a target area according to the type of first content displayed on the electronic device; and instructing the user to perform gesture control within the target area, where the target area is an area on the electronic device that does not contain the first content. In this way, instructing the user to perform gesture control does not interfere with the content the user is browsing, which improves user experience. Exemplarily, the first content may be text content, picture content, or the like.
According to the first aspect, or any implementation of the first aspect above, instructing the user to perform gesture control specifically includes: acquiring the user's gaze position; determining the user's gaze area according to the gaze position; and instructing the user to perform gesture control when the gaze area is a second preset area. In this way, the user is instructed to perform gesture control only when looking at a specific area, which prevents false triggering while the user is immersively viewing the electronic device and improves user experience. For example, when the user is watching a movie, this prevents the user from being falsely instructed to perform gesture control.
According to the first aspect, or any implementation of the first aspect above, instructing the user to perform gesture control includes one or more of the following: controlling the electronic device to display a cursor; controlling the electronic device to enlarge at least part of the displayed content; or controlling a sound component matched with the electronic device to play a first voice, where the first voice instructs the user to perform gesture control. The user is thus prompted to perform gesture control dynamically or audibly.
According to the first aspect, or any implementation of the first aspect above, instructing the user to perform gesture control includes controlling the electronic device to display a cursor, and the cursor is displayed in a first area of the display area of the electronic device. The method further includes: determining that the eye movement type remains the second type, and, when the user's eyes move and the user's gaze area switches from the first area to a second area of the display area, controlling the cursor to move from the first area to the second area. In this way, as the user's eyes move, the cursor moves synchronously when the gaze area changes, making it easy for the user to select the desired content and improving user experience.
According to the first aspect, or any implementation of the first aspect above, instructing the user to perform gesture control includes enlarging at least part of the content displayed on the electronic device, the enlarged content being the content displayed in a first region of the display area of the electronic device. The method further includes: determining that the eye movement type remains the second type, and, when the user's eyes move and the user's gaze region switches from the first region to a second region of the electronic device, restoring the enlarged content in the first region to its initial state and enlarging the content in the second region. In this way, as the user's eyes move, the enlarged content displayed on the electronic device is updated synchronously whenever the gaze region changes, making it easy for the user to select the desired content and improving the user experience.
According to the first aspect, or any implementation of the first aspect above, the method further includes: determining that the eye movement type has switched from the second type to the first type, stopping instructing the user to perform gesture control, and/or controlling the electronic device according to the first type while restricting the electronic device from acquiring the user's gestures.
According to the first aspect, or any implementation of the first aspect above, after determining that the eye movement type is the second type, the method further includes: obtaining the user's gaze position, and determining the size of the user's gaze region according to the gaze position; and determining, according to the size of the gaze region, the size of the operation area in which the user performs gesture control. This lets the user perform gesture operations in an operation area whose size matches the size of the gaze region, improving the precision of gesture operations and the user experience.
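The sizing step above amounts to mapping the on-screen gaze-region dimensions to the dimensions of the hand operation area. A minimal sketch, with an assumed linear scale factor (the actual mapping is not specified in the text):

```python
# Illustrative sketch: scale the gesture operation area to the gaze region,
# so a small on-screen target gets a proportionally small (more precise)
# hand operation box. The 0.25 scale factor is an assumed example value.

def operation_area_size(gaze_area_size, hand_scale=0.25):
    """Map an on-screen gaze region (w, h) to a hand operation area (w, h)."""
    w, h = gaze_area_size
    return (w * hand_scale, h * hand_scale)
```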
According to the first aspect, or any implementation of the first aspect above, controlling the electronic device according to the first type specifically includes: the electronic device determines the user's gaze position in real time according to the first type, and/or the electronic device switches the content displayed on the electronic device according to the first type.
In a second aspect, this application provides a method for interacting with an electronic device, the method including: the electronic device acquires the user's gesture; the electronic device controls itself according to the acquired gesture; the electronic device acquires the user's eye movement type; when the user's eye movement type is determined to be a first type, the electronic device controls itself according to the first type and restricts control of itself by the user's gestures; when the user's eye movement type is determined to be a second type, the electronic device acquires the user's gesture and controls itself according to the acquired gesture. By combining eye-movement control and gesture control on the electronic device, the two modes hand over to each other seamlessly, so the user can control the electronic device efficiently and conveniently, improving the user experience. Exemplarily, the first type may be saccadic (jumping) eye movement, and the second type may be non-saccadic eye movement, for example a steady gaze or smooth pursuit.
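The mode arbitration described in this aspect can be sketched as a small dispatch function: saccadic eye movement keeps the device under eye control and suppresses gestures, while non-saccadic movement hands control to gestures. The type labels follow the text; the dispatch itself is an illustrative sketch, not the claimed implementation.

```python
# Illustrative arbitration between eye-movement control and gesture control.
# "saccade" = first type (jumping gaze); anything else = second type
# (steady gaze / smooth pursuit), during which gestures are acted on.

def control_mode(eye_movement_type):
    if eye_movement_type == "saccade":
        return "eye_control"      # gestures are ignored or not acquired
    return "gesture_control"      # gestures are acquired and processed
```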
According to the second aspect, or any implementation of the second aspect above, after the electronic device controls itself according to the first type and restricts itself from acquiring the user's gestures, the method further includes: when it is determined that the user's eye movement type has switched from the first type to the second type, the electronic device instructs the user to perform gesture control; the electronic device acquires the user's gesture; and the electronic device controls itself according to the acquired gesture. Eye-movement control and gesture control are thus combined in an orderly way and hand over to each other seamlessly, so the user can control the electronic device efficiently and conveniently, improving the user experience.
According to the second aspect, or any implementation of the second aspect above, restricting control of the electronic device by the user's gestures specifically includes: the electronic device does not acquire the user's gestures, or the electronic device acquires the user's gestures but does not process them. This achieves the purpose of restricting control of the electronic device by the user's gestures.
In a third aspect, this application provides a method for interacting with an electronic device, the method including: the electronic device acquires a first gesture with which the user operates on a first region of the electronic device; the electronic device controls itself according to the acquired first gesture; the electronic device acquires the user's eye movement type; when the user's eye movement type is determined to be a first type, the electronic device controls itself according to the first type and restricts itself from acquiring the user's gestures; the electronic device continues to acquire the user's eye movement type; when it is determined that the user's eye movement type has switched from the first type to a second type and the user's gaze position has switched from the first region to a second region of the electronic device, the electronic device instructs the user to perform gesture control; the electronic device acquires a second gesture with which the user operates on the second region; and the electronic device controls itself according to the acquired second gesture. By combining eye-movement control and gesture control on the electronic device, the user can switch control regions seamlessly, and can therefore control the electronic device efficiently and conveniently, improving the user experience. Exemplarily, the first type may be saccadic (jumping) eye movement, and the second type may be non-saccadic eye movement, for example a steady gaze or smooth pursuit.
In a fourth aspect, this application provides a method for interacting with an electronic device, the method including: the electronic device acquires the user's first eye movement information; the electronic device determines a first eye movement type according to the first eye movement information; the electronic device controls itself according to the first eye movement type; the electronic device acquires the user's second eye movement information; the electronic device determines a second eye movement type according to the second eye movement information; the electronic device acquires the user's gesture; and the electronic device controls itself according to the acquired gesture. By combining eye-movement control and gesture control on the electronic device, the two modes hand over to each other seamlessly, so the user can control the electronic device efficiently and conveniently, improving the user experience. Exemplarily, the first eye movement type may be saccadic (jumping), and the second eye movement type may be non-saccadic, for example a steady gaze or smooth pursuit. The first eye movement type may also be referred to as the first type, and the second eye movement type as the second type.
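Determining an eye movement type from raw eye movement information is commonly done with a velocity threshold (the I-VT idea): high angular velocity between consecutive gaze samples indicates a saccade, low velocity indicates a steady gaze or smooth pursuit. This is one common approach, not necessarily the classification method used in this application; the 30°/s threshold is a frequently cited but assumed value.

```python
# Illustrative velocity-threshold classification of eye-movement type
# from two consecutive gaze samples (in degrees of visual angle).

def eye_movement_type(p0, p1, dt, threshold_deg_s=30.0):
    """Classify the movement between gaze samples p0 and p1 taken dt
    seconds apart as "saccade" (first type) or "non_saccade" (second type)."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    velocity = (dx * dx + dy * dy) ** 0.5 / dt  # angular velocity, deg/s
    return "saccade" if velocity > threshold_deg_s else "non_saccade"
```

For example, a 5° jump within a 20 ms sampling interval (250°/s) is classified as a saccade, while a 0.2° drift in the same interval (10°/s) is not.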
According to the fourth aspect, or any implementation of the fourth aspect above, the method further includes: the electronic device acquires the user's third eye movement information; the electronic device determines the first eye movement type according to the third eye movement information; and the electronic device controls itself according to the first eye movement type while restricting itself from acquiring the user's gestures.
In a fifth aspect, this application provides an electronic device, including: an eye-tracking apparatus, configured to acquire the user's eye movement information; a gesture-tracking apparatus, configured to acquire the user's gestures; at least one memory, configured to store a program; and at least one processor, configured to execute the program stored in the memory. When the program stored in the memory is executed, the processor is configured to: control the eye-tracking apparatus to acquire the user's eye movement information; when the eye movement type of the user's eyes determined from the eye movement information is a first type, control the electronic device according to the first type; when the eye movement type determined from the eye movement information is a second type, control the electronic device to instruct the user to perform gesture control; control the gesture-tracking apparatus to acquire the user's gesture; and control the electronic device according to the gesture acquired by the gesture-tracking apparatus.
According to the fifth aspect, or any implementation of the fifth aspect above, the processor controlling the electronic device to instruct the user to perform gesture control specifically includes: the processor determines the user's gaze position according to the user's eye movement information; and the processor controls the electronic device to prompt the user to perform gesture control within a first preset region in which the gaze position is located.
According to the fifth aspect, or any implementation of the fifth aspect above, the processor controlling the electronic device to instruct the user to perform gesture control specifically includes: the processor determines a target region according to the type of first content displayed on the electronic device; and the processor controls the electronic device to prompt the user to perform gesture control within the target region, where the target region is a region of the electronic device that does not contain the first content.
According to the fifth aspect, or any implementation of the fifth aspect above, the processor controlling the electronic device to instruct the user to perform gesture control specifically includes: the processor obtains the user's gaze position according to the user's eye movement information; the processor determines the user's gaze region according to the gaze position; and, when the gaze region is a second preset region, the processor controls the electronic device to prompt the user to perform gesture control.
According to the fifth aspect, or any implementation of the fifth aspect above, the processor controlling the electronic device to instruct the user to perform gesture control includes one or more of the following: the processor controls the electronic device to display a cursor, the processor enlarges at least part of the content displayed on the electronic device, or the processor controls a sound component associated with the electronic device to play a first voice, where the first voice prompts the user to perform gesture control.
According to the fifth aspect, or any implementation of the fifth aspect above, the processor controlling the electronic device to instruct the user to perform gesture control includes the processor controlling the electronic device to display a cursor, the cursor being displayed in a first region of the display area of the electronic device. The processor is further configured to: when the eye movement type of the user's eyes determined from the eye movement information remains the second type, and the user's eyes move and the user's gaze region switches from the first region of the display area to a second region, control the electronic device to move the cursor from the first region to the second region, where the gaze region is determined by the gaze position of the user's eyes.
According to the fifth aspect, or any implementation of the fifth aspect above, the processor controlling the electronic device to instruct the user to perform gesture control includes the processor enlarging at least part of the content displayed on the electronic device, the enlarged content being the content displayed in a first region of the display area of the electronic device. The processor is further configured to: when the eye movement type of the user's eyes determined from the eye movement information remains the second type, and the user's eyes move and the user's gaze region switches from the first region to a second region of the electronic device, control the electronic device to restore the enlarged content in the first region to its initial state and to enlarge the content in the second region.
According to the fifth aspect, or any implementation of the fifth aspect above, after the processor controls the electronic device to instruct the user to perform gesture control, the processor is further configured to: when the eye movement type of the user's eyes determined from the eye movement information switches from the second type to the first type, control the electronic device to stop instructing the user to perform gesture control, and/or control the electronic device according to the first type while restricting the gesture-tracking apparatus from acquiring the user's gestures.
According to the fifth aspect, or any implementation of the fifth aspect above, after the eye movement type of the user's eyes determined from the eye movement information is the second type, the processor is further configured to: obtain the user's gaze position; determine the size of the user's gaze region according to the gaze position; and determine, according to the size of the gaze region, the size of the operation area in which the user performs gesture control.
According to the fifth aspect, or any implementation of the fifth aspect above, the processor controlling the electronic device according to the first type specifically includes: the processor determines the user's gaze position in real time according to the first type, and/or the processor controls the electronic device to switch the content displayed on the electronic device according to the first type.
In a sixth aspect, this application provides an electronic device, including: an eye-tracking apparatus, configured to acquire the user's eye movement information; a gesture-tracking apparatus, configured to acquire the user's gestures; at least one memory, configured to store a program; and at least one processor, configured to execute the program stored in the memory. When the program stored in the memory is executed, the processor is configured to: control the gesture-tracking apparatus to acquire the user's gesture; control the electronic device according to the gesture acquired by the gesture-tracking apparatus; when the eye movement type of the user's eyes determined from the eye movement information is a first type, control the electronic device according to the first type and restrict control of the electronic device by the user's gestures; and, when the eye movement type of the user's eyes determined from the eye movement information is a second type, control the electronic device according to the gesture acquired by the gesture-tracking apparatus.
According to the sixth aspect, or any implementation of the sixth aspect above, after controlling the electronic device according to the first type and restricting control of the electronic device by the user's gestures, the processor is further configured to: when the eye movement type of the user's eyes determined from the eye movement information switches from the first type to the second type, control the electronic device to instruct the user to perform gesture control; control the gesture-tracking apparatus to acquire the user's gesture; and control the electronic device according to the gesture acquired by the gesture-tracking apparatus.
According to the sixth aspect, or any implementation of the sixth aspect above, the processor restricting control of the electronic device by the user's gestures specifically includes: the processor controls the gesture-tracking apparatus not to acquire the user's gestures, or the processor controls the gesture-tracking apparatus to continue acquiring the user's gestures but does not process them.
In a seventh aspect, this application provides an electronic device, including: an eye-tracking apparatus, configured to acquire the user's eye movement information; a gesture-tracking apparatus, configured to acquire the user's gestures; at least one memory, configured to store a program; and at least one processor, configured to execute the program stored in the memory. When the program stored in the memory is executed, the processor is configured to: control the gesture-tracking apparatus to acquire the user's gesture; control the electronic device according to a first gesture, acquired by the gesture-tracking apparatus, with which the user operates on a first region of the electronic device; control the eye-tracking apparatus to acquire the user's eye movement information; when the eye movement type of the user's eyes determined from the eye movement information is a first type, control the electronic device according to the first type and restrict the gesture-tracking apparatus from acquiring the user's gestures; continue to control the eye-tracking apparatus to acquire the user's eye movement information; when the eye movement type of the user's eyes determined from the eye movement information switches from the first type to a second type and the user's gaze position switches from the first region to a second region of the electronic device, control the electronic device to instruct the user to perform gesture control; control the gesture-tracking apparatus to acquire the user's gesture; and control the electronic device according to a second gesture, acquired by the gesture-tracking apparatus, with which the user operates on the second region.
In an eighth aspect, this application provides an electronic device, including: an eye-tracking apparatus, configured to acquire the user's eye movement information; a gesture-tracking apparatus, configured to acquire the user's gestures; at least one memory, configured to store a program; and at least one processor, configured to execute the program stored in the memory. When the program stored in the memory is executed, the processor is configured to: control the eye-tracking apparatus to acquire the user's first eye movement information; determine a first eye movement type according to the first eye movement information; control the electronic device according to the first eye movement type; control the eye-tracking apparatus to acquire the user's second eye movement information; determine a second eye movement type according to the second eye movement information; control the gesture-tracking apparatus to acquire the user's gesture; and control the electronic device according to the gesture acquired by the gesture-tracking apparatus.
According to the eighth aspect, or any implementation of the eighth aspect above, the processor is further configured to: control the eye-tracking apparatus to acquire the user's third eye movement information; determine the first eye movement type according to the third eye movement information; and control the electronic device according to the first eye movement type while restricting the gesture-tracking apparatus from acquiring the user's gestures.
In a ninth aspect, this application provides a computer-readable storage medium storing a computer program that, when run on an electronic device, causes the electronic device to perform the method provided in any one of the first to fourth aspects or any implementation of the first to fourth aspects.
In a tenth aspect, this application provides a computer program product that, when run on an electronic device, causes the electronic device to perform the method provided in any one of the first to fourth aspects or any implementation of the first to fourth aspects.
It can be understood that, for the beneficial effects of the fifth to tenth aspects above, reference may be made to the relevant description in the first aspect; details are not repeated here.
Description of Drawings
Fig. 1 is a schematic diagram of an application scenario provided by an embodiment of this application;
Fig. 2 is a schematic structural diagram of an electronic device provided by an embodiment of this application;
Fig. 3 is a schematic diagram of an eye-tracking apparatus tracking eye movement provided by an embodiment of this application;
Fig. 4 is a schematic diagram of the relative position between the pupil center of a user's eye and a light spot on the cornea provided by an embodiment of this application;
Fig. 5 is a schematic diagram of the coordinates of a gaze position on the display screen of an electronic device provided by an embodiment of this application;
Fig. 6 is a schematic diagram of the arrangement of infrared emitters on an electronic device provided by an embodiment of this application;
Fig. 7 is a schematic diagram of a gesture-tracking apparatus tracking gestures provided by an embodiment of this application;
Fig. 8 is a schematic flowchart of a method for interacting with an electronic device provided by an embodiment of this application;
Fig. 9 is a schematic diagram of instructing a user to perform gesture control provided by an embodiment of this application;
Fig. 10 is another schematic diagram of instructing a user to perform gesture control provided by an embodiment of this application;
Fig. 11 is a schematic diagram of the change of the cursor displayed on an electronic device when the user's gaze region changes, provided by an embodiment of this application;
Fig. 12 is a schematic diagram of the change of the enlarged content on an electronic device when the user's gaze region changes, provided by an embodiment of this application;
Fig. 13 is a schematic diagram of the correspondence between an operation area and the display screen of an electronic device provided by an embodiment of this application;
Fig. 14 is a schematic diagram of the correspondence between an operation area and a gaze region on an electronic device provided by an embodiment of this application;
Fig. 15 is a schematic diagram of the visual angle of an eye provided by an embodiment of this application;
Fig. 16 is a schematic diagram of displaying a cursor within a gaze region on an electronic device provided by an embodiment of this application;
Fig. 17 is a schematic diagram of the steps of a control process provided by an embodiment of this application;
Fig. 18 is a schematic diagram of a process of a user's hand movement provided by an embodiment of this application;
Fig. 19 is a schematic flowchart of a method for interacting with an electronic device provided by an embodiment of this application;
Fig. 20 is a schematic flowchart of another method for interacting with an electronic device provided by an embodiment of this application;
Fig. 21 is a schematic flowchart of yet another method for interacting with an electronic device provided by an embodiment of this application;
Fig. 22 is a schematic flowchart of a further method for interacting with an electronic device provided by an embodiment of this application;
Fig. 23 is a schematic diagram of a control process provided by an embodiment of this application;
Fig. 24 is a schematic diagram of another control process provided by an embodiment of this application;
Fig. 25 is a schematic diagram of yet another control process provided by an embodiment of this application;
Fig. 26 is a schematic diagram of a further control process provided by an embodiment of this application;
Fig. 27 is a schematic diagram of a further control process provided by an embodiment of this application.
Detailed Description
本文中术语“和/或”,是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。本文中符号“/”表示关联对象是或者的关系,例如A/B表示A或者B。The term "and/or" in this article is an association relationship describing associated objects, which means that there can be three relationships, for example, A and/or B can mean: A exists alone, A and B exist simultaneously, and B exists alone These three situations. The symbol "/" in this document indicates that the associated object is an or relationship, for example, A/B indicates A or B.
本文中的说明书和权利要求书中的术语“第一”和“第二”等是用于区别不同的对象,而不是用于描述对象的特定顺序。例如,第一响应消息和第二响应消息等是用于区别不同的响应消息,而不是用于描述响应消息的特定顺序。The terms "first" and "second" and the like in the specification and claims herein are used to distinguish different objects, rather than to describe a specific order of objects. For example, the first response message and the second response message are used to distinguish different response messages, rather than describing a specific order of the response messages.
在本申请实施例中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请实施例中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。In the embodiments of the present application, words such as "exemplary" or "for example" are used as examples, illustrations or illustrations. Any embodiment or design scheme described as "exemplary" or "for example" in the embodiments of the present application shall not be interpreted as being more preferred or more advantageous than other embodiments or design schemes. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete manner.
In the description of the embodiments of the present application, unless otherwise specified, "multiple" means two or more; for example, multiple processing units means two or more processing units, and multiple elements means two or more elements.
Exemplarily, Fig. 1 shows an application scenario in some embodiments of the present application. As shown in Fig. 1, user A is using an electronic device 100. The electronic device 100 may be, but is not limited to, a smart TV. The smart TV referred to in the embodiments of the present application may be a TV, or another electronic device with a large screen, that can interact with a mobile device such as a smartphone or a tablet computer. For example, a user interface of a smartphone may be transmitted wirelessly and presented on the smart TV, and the user's operations on the smart TV may in turn affect the smartphone. The electronic device 100 may be provided with an eye tracking device 110 and/or a gesture tracking device 120. The eye tracking device 110 may be used to detect the movement of user A's eyes 21, and the gesture tracking device 120 may be used to detect the gestures and/or hand movements of user A's hand 22.
Generally, user A can control the electronic device 100 through eye movement. In this case, an eye tracking device 110 may be provided on the electronic device 100. While using the electronic device 100, user A may deliberately perform some control-related actions with his or her eyes 21. After the eye tracking device 110 detects an action of the eyes 21, the electronic device 100 may generate a control instruction according to the detected action and output the control instruction. It can be seen that during such manipulation, user A's eyes 21 need to make frequent deliberate movements, which easily causes eye fatigue.
To avoid the eye fatigue caused by eye movement control, eye movement control may be replaced with gesture control. In this case, the eye tracking device 110 on the electronic device 100 may be replaced with a gesture tracking device 120. While using the electronic device 100, user A may deliberately make some control-related gestures with his or her hand 22. After the gesture tracking device 120 detects a gesture made by the hand 22, the electronic device 100 may generate a control instruction according to the detected gesture and output the control instruction. However, when the control area on the electronic device 100 is large, the distance that user A's hand 22 needs to move also becomes large, which easily reduces control efficiency and causes fatigue.
Further, to avoid the low efficiency and fatigue caused by using eye movement control or gesture control alone, and to control the electronic device efficiently and conveniently, the present application further provides a solution. In this solution, the electronic device 100 may be provided with both an eye tracking device 110 and a gesture tracking device 120. While user A uses the electronic device 100, the eye tracking device 110 may detect user A's eyes 21; when it is detected that user A intends to control the electronic device 100, the electronic device 100 may instruct user A to perform gesture control. Afterwards, the electronic device 100 may generate a control instruction according to the gesture of user A's hand 22 detected by the gesture tracking device 120, and output the control instruction. Combining eye movement control and gesture control in an orderly manner in this way enables seamless interaction between the two, so that the electronic device can be controlled efficiently, conveniently, and with high precision.
It can be understood that the electronic device 100 shown in Fig. 1 may also be replaced with another electronic device, and the resulting solution still falls within the protection scope of the present application. Exemplarily, the electronic device 100 may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device. The embodiments of the present application do not specifically limit the type of the electronic device 100.
示例性的,图2示出了电子设备100的结构示意图。如图2所示,该电子设备100可以包括:眼动追踪装置110、手势追踪装置120、处理器130和存储器140。Exemplarily, FIG. 2 shows a schematic structural diagram of the electronic device 100 . As shown in FIG. 2 , the electronic device 100 may include: an eye tracking device 110 , a gesture tracking device 120 , a processor 130 and a memory 140 .
The eye tracking device 110 may be used to detect eye movement information, an eye movement type, and/or a gaze position of the user's eyes. Exemplarily, the eye movement information may include eye movement velocity and/or eye movement acceleration. Exemplarily, the eye movement type may include saccade (a rapid jump of the eyes) and non-saccade, where non-saccade refers to any eye movement type other than a saccade, such as fixation or smooth pursuit. The eye tracking device 110 may detect the eye movement type through, but not limited to, pupil center corneal reflection (PCCR), appearance-based detection, or other technologies. Exemplarily, taking PCCR as an example, as shown in Fig. 3, the eye tracking device 110 may include an infrared emitter 111 and a camera 112. During operation, the infrared emitter 111 emits infrared light, which is reflected on the cornea of user A's eye 21 and projects a glint (light spot) on the cornea. The camera 112 captures an image of user A's face, determines the relative position between the pupil center of the eye 21 and the glint on the cornea through a preset image algorithm, and determines the gaze point and eye movement type of user A's eye 21 based on that relative position.
Exemplarily, the eye tracking device 110 may be started together with the electronic device 100, or may be started after an activation instruction is obtained. For example, when an image capture device (such as a camera) is configured on the electronic device 100 and the electronic device 100 determines that the image capture device has captured an image of the user's face, the electronic device 100 may send an activation instruction to the eye tracking device 110. In addition, the electronic device 100 may also send an activation instruction to the eye tracking device 110 after obtaining an instruction from the user to start the eye tracking device 110. Furthermore, when the electronic device 100 obtains an instruction from the user to turn on the electronic device 100, the electronic device 100 may likewise send an activation instruction to the eye tracking device 110.
In one example, a mapping relationship may be preset between (a) the relative position of the pupil center of the user's eye and the glint on the cornea, and (b) the gaze point of the user on the electronic device 100, so that once the relative position is determined, the user's gaze position on the electronic device 100 can be determined. Exemplarily, as shown in (A) of Fig. 4, suppose that when the pupil center 41 of the user's eye 21 coincides with the glint 42 projected on the cornea by the infrared light from the infrared emitter 111, the user is gazing at the central area of the electronic device 100. Then, when it is determined that the pupil center of the user's eye 21 coincides with the glint on the cornea, it can be determined that the user is gazing at the central area of the electronic device 100, whose coordinates, as shown in Fig. 5, may be (0, 0). As shown in (B) of Fig. 4, suppose that when the pupil center 41 of the user's eye 21 is directly to the left of the glint 42 at a distance L1, the user is gazing at the central area of the right edge of the electronic device 100. Then, when it is determined that the pupil center 41 is directly to the left of the glint 42 at a distance L1, it can be determined that the user is gazing at the central area of the right edge of the electronic device 100, whose coordinates, as shown in Fig. 5, may be (X1, 0). Likewise, as shown in (C) of Fig. 4, suppose that when the pupil center 41 is directly to the right of the glint 42 at a distance L1, the user is gazing at the central area of the left edge of the electronic device 100. Then, when it is determined that the pupil center 41 is directly to the right of the glint 42 at a distance L1, it can be determined that the user is gazing at the central area of the left edge of the electronic device 100, whose coordinates, as shown in Fig. 5, may be (-X1, 0).
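The pupil-to-glint mapping above can be sketched as a simple calibration function. This is a hypothetical illustration only: the function name, the linear-mapping assumption, and the calibration factors are not from the patent, which leaves the mapping to a preset image algorithm.

```python
# Hypothetical sketch: map the pupil-center offset from the corneal glint
# (in camera pixels) to a gaze position on the screen, assuming the linear
# calibration described above (pupil a distance L1 left of the glint
# corresponds to the right-edge center (X1, 0)).
def estimate_gaze_point(pupil_xy, glint_xy, px_per_unit_x, px_per_unit_y):
    """Return screen coordinates (x, y) with (0, 0) at the screen center.

    pupil_xy, glint_xy: (x, y) positions in the camera image.
    px_per_unit_*: calibration factors relating image offset to screen units.
    """
    dx = glint_xy[0] - pupil_xy[0]  # pupil left of glint -> gaze toward right edge
    dy = glint_xy[1] - pupil_xy[1]
    return (dx / px_per_unit_x, dy / px_per_unit_y)

# Pupil exactly on the glint -> screen center (0, 0); pupil 4 px left of the
# glint with 2 px-per-unit calibration -> (2, 0), i.e. toward the right edge.
print(estimate_gaze_point((100, 50), (100, 50), 2.0, 2.0))  # (0.0, 0.0)
print(estimate_gaze_point((96, 50), (100, 50), 2.0, 2.0))   # (2.0, 0.0)
```

In practice such a mapping is obtained by a per-user calibration step rather than fixed constants.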
In an embodiment, the eye movement type may be determined based on, but not limited to, eye movement velocity, eye movement acceleration, or spatial dispersion. Exemplarily, taking velocity-based determination as an example: when the eye movement velocity V of the user's eyes exceeds a threshold V1, i.e., V ∈ (V1, +∞), the eyes are moving fast and the eye movement type can be determined to be a saccade; when the velocity is between V2 and V1, i.e., V ∈ [V2, V1], the eyes are moving relatively slowly and the eye movement type can be determined to be smooth pursuit, where V2 is less than V1; and when the velocity is less than V2, i.e., V ∈ [0, V2), the eyes are moving very slowly and the eye movement type can be determined to be fixation.
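The velocity-threshold classification above (saccade/jump, smooth pursuit, fixation) can be sketched as follows. The threshold values are made-up placeholders, not values from the patent.

```python
# Hypothetical sketch of the velocity-threshold classification of eye
# movement type. V1 and V2 (V2 < V1) are calibration constants; the values
# below are illustrative only.
V1 = 100.0  # deg/s: above this, the movement is a saccade (jump)
V2 = 30.0   # deg/s: between V2 and V1 -> smooth pursuit; below V2 -> fixation

def classify_eye_movement(velocity):
    if velocity > V1:
        return "saccade"         # V in (V1, +inf)
    if velocity >= V2:
        return "smooth pursuit"  # V in [V2, V1]
    return "fixation"            # V in [0, V2)

print(classify_eye_movement(250.0))  # saccade
print(classify_eye_movement(50.0))   # smooth pursuit
print(classify_eye_movement(5.0))    # fixation
```

Note that the boundary value V = V1 falls in the smooth-pursuit interval [V2, V1], matching the half-open intervals in the text.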
In other embodiments, there may be multiple infrared emitters 111, arranged at intervals on the electronic device 100. For example, as shown in Fig. 6, when there are four infrared emitters 111, they may be arranged at the four corners of the electronic device 100. Similarly, there may be multiple cameras 112, depending on the specific situation, which is not limited here. In one example, at least some functions of the eye tracking device 110 (such as data processing) may be implemented by the processor 130. In one example, the eye tracking device 110 may be integrated on the electronic device 100 or arranged separately, which is not limited here.
The gesture tracking device 120 may be used to detect gestures made by the user. The gesture tracking device 120 may detect the user's gestures through, but not limited to, computer-vision-based camera tracking, or comparison of transmitted and reflected signals. Exemplarily, taking the method based on comparing transmitted and reflected signals as an example, as shown in Fig. 7, the gesture tracking device 120 may include a signal transmitter 121, a signal receiver 122, and a signal processing unit 123.
During operation of the gesture tracking device 120, the signal transmitter 121 sends a signal (such as a millimeter-wave, infrared, ultrasonic, or wireless fidelity (Wi-Fi) signal); the signal receiver 122 then receives the signal reflected by user A's hand 22; finally, the signal processing unit 123 compares the original signal sent by the signal transmitter 121 with the reflected signal received by the signal receiver 122, and tracks the movement of user A's hand 22 using principles such as the Doppler effect, phase shift, and time difference, thereby determining the gesture made by user A. In one example, when the gesture tracking device 120 uses computer-vision-based camera tracking to detect the user's gestures, the gesture tracking device 120 may further include a camera, which captures images of the user's hand and detects the user's gestures based on a preset image processing algorithm. This camera may be the same camera as the one in the eye tracking device 110, or a different one. In one example, at least some components of the gesture tracking device 120 (such as the signal processing unit 123) may be integrated in the processor 130. In one example, the gesture tracking device 120 may be integrated on the electronic device 100, arranged separately, or partly integrated on the electronic device 100 with the remainder arranged separately, which is not limited here. In one example, at least some functions of the gesture tracking device 120 (such as data processing) may be implemented by the processor 130.
Exemplarily, the gesture tracking device 120 may be started together with the electronic device 100, or may be started after an activation instruction is obtained. For example, when the electronic device 100 determines through the eye tracking device 110 that the eye movement type of the user's eyes is not a saccade, the electronic device 100 may send an activation instruction to the gesture tracking device 120. In addition, the electronic device 100 may also send an activation instruction to the gesture tracking device 120 after obtaining an instruction from the user to start the gesture tracking device 120. Furthermore, when the electronic device 100 obtains an instruction from the user to turn on the electronic device 100, the electronic device 100 may likewise send an activation instruction to the gesture tracking device 120.
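The Doppler-effect comparison between the transmitted and reflected signal mentioned above can be illustrated with the standard radar relation. This is a hypothetical sketch of the physics only, not the patent's signal-processing pipeline; function and parameter names are invented, and real millimeter-wave pipelines are far more involved.

```python
# Hypothetical sketch: recover the radial velocity of the hand from the
# Doppler shift between the transmitted and the reflected signal.
def radial_velocity_from_doppler(f_tx_hz, f_rx_hz, wave_speed_m_s):
    """Velocity of the reflector toward the sensor (positive = approaching).

    For a wave reflected off a moving target, the Doppler shift is
    f_rx - f_tx ~= 2 * v * f_tx / c, hence v ~= (f_rx - f_tx) * c / (2 * f_tx).
    """
    return (f_rx_hz - f_tx_hz) * wave_speed_m_s / (2.0 * f_tx_hz)

# A 60 GHz carrier reflected with a +400 Hz shift corresponds to a hand
# moving toward the sensor at about 1 m/s.
v = radial_velocity_from_doppler(60e9, 60e9 + 400.0, 3e8)
print(round(v, 2))  # 1.0
```

A sequence of such velocity estimates over time is what the signal processing unit would match against gesture templates.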
In one embodiment, the gesture tracking device 120 may include a peripheral component such as an electromyogram (EMG) wristband or an infrared remote-control pen. While the user is using such a peripheral component, the signal processing unit in the gesture tracking device 120 may obtain the signal emitted by the peripheral component and detect the user's gestures and movements according to that signal.
在一个实施例中,在手势追踪装置120工作过程中,眼动追踪装置110也可以处于工作状态,以便眼动追踪装置110继续获取用户的眼睛的眼动类型。In one embodiment, during the working process of the gesture tracking device 120 , the eye tracking device 110 may also be in a working state, so that the eye tracking device 110 continues to acquire the eye movement type of the user's eyes.
在一个实施例中,在眼动追踪装置110工作过程中,手势追踪装置120可以停止工作,以节省功耗,或者,手势追踪装置120也可以继续工作,以便手势追踪装置120继续获取用户的手势。In one embodiment, when the eye tracking device 110 is working, the gesture tracking device 120 can stop working to save power consumption, or the gesture tracking device 120 can also continue to work so that the gesture tracking device 120 can continue to acquire the user's gestures .
The processor 130 is the computing core and control core of the electronic device 100. The processor 130 may include one or more processing units. For example, the processor 130 may include one or more of an application processor (AP), a modem, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). The different processing units may be independent devices or may be integrated in one or more processors. Exemplarily, the processor 130 may issue control instructions to the eye tracking device 110 and/or the gesture tracking device 120, for example: turning the eye tracking device 110 on or off, so as to control it to acquire, or stop acquiring, the user's eye movement information; and turning the gesture tracking device 120 on or off, so as to control it to acquire, or stop acquiring, the user's gestures. Exemplarily, the processor 130 may determine the user's eye movement type according to the eye movement information acquired by the eye tracking device 110; for details, refer to the description above of determining the eye movement type based on eye movement information, which is not repeated here.
存储器140可以存储有程序,程序可被处理器130运行,使得处理器130至少可 以执行本申请实施例中提供的方法。存储器140还可以存储有数据。处理器130可以读取存储器140中存储的数据。存储器140和处理器130可以单独设置。另外,存储器140也可以集成在处理器130中。The memory 140 can store a program, and the program can be executed by the processor 130, so that the processor 130 can at least execute the method provided in the embodiment of the present application. The memory 140 may also store data. The processor 130 can read data stored in the memory 140 . The memory 140 and the processor 130 may be provided separately. In addition, the memory 140 may also be integrated in the processor 130 .
In other embodiments, the electronic device 100 further includes a display screen (not shown in the figure). The display screen may be used to display images, videos, and the like, and may include a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens, where N is a positive integer greater than 1. Exemplarily, the electronic device may display, on the display screen, information instructing the user to perform gesture control.
可以理解的是,本申请图2示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。It can be understood that the structure shown in FIG. 2 of the present application does not constitute a specific limitation on the electronic device 100 . In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components. The illustrated components can be realized in hardware, software or a combination of software and hardware.
In some embodiments, the electronic device 100, such as a smart TV, may display the user interface shown in (A) of Fig. 9. At this time, the eye tracking device 110 may acquire the user's eye movement information and/or eye movement type in real time or periodically, for example through pupil center corneal reflection or appearance-based detection (see the description above; details are not repeated here). When the eye movement type acquired by the eye tracking device 110 is not a saccade (i.e., the eye movement type is non-saccade), the electronic device 100 may instruct the user to perform gesture control. For example, as shown in (B) of Fig. 9, the electronic device 100 may display a cursor 92 to instruct the user to perform gesture control. In addition, when the eye movement type acquired by the eye tracking device 110 is not a saccade, the electronic device 100 may start the gesture tracking device 120 on it, so as to acquire the user's operation gesture and respond to the user's operation.
Based on the content described above and with reference to Fig. 8, the technical solution provided by the present application is described in detail below. Exemplarily, Fig. 8 shows a schematic flowchart of a method for interacting with an electronic device provided in an embodiment of the present application. The electronic device involved in Fig. 8 may be the electronic device 100 described above in Fig. 2, which has the eye tracking device 110 and the gesture tracking device 120 described above. As shown in Fig. 8, the method for interacting with an electronic device may include the following steps:
S801. Acquire the eye movement type of the user's eyes, where the eye movement type may include saccade or non-saccade, and non-saccade means that the eye movement type is not a saccade.
Specifically, the eye movement type of the user's eyes may be acquired in real time or periodically by the eye tracking device 110 described above. The eye movement type may include saccade or non-saccade, where non-saccade means that the eye movement type is not a saccade, for example fixation or smooth pursuit. Alternatively, the processor 130 in the electronic device 100 may determine the eye movement type of the user's eyes based on the eye movement information acquired by the eye tracking device 110 in real time or periodically, depending on the actual situation, which is not limited here.
S802. When it is determined that the eye movement type is non-saccade, instruct the user to perform gesture control.
Specifically, after the eye movement type is acquired, the user may be instructed, according to the eye movement type, to perform gesture control, so that the user knows that the electronic device can subsequently be controlled by gestures. In some embodiments, when it is determined that the eye movement type is not a saccade (i.e., it is fixation, smooth pursuit, or the like), the user may be instructed to perform gesture control. When it is determined that the eye movement type is a saccade, the user's eye movement type may continue to be detected, and the display interface on the electronic device may remain unchanged. In addition, when the eye movement type at the previous moment was not a saccade but the eye movement type at the current moment is a saccade, the user is no longer instructed to perform gesture control. That is to say, when the eye movement type of the user's eyes switches from non-saccade to saccade, the user is no longer instructed to perform gesture control; when it switches from saccade to non-saccade, the user is instructed to perform gesture control. Here, non-saccade means not a saccade.
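The switching behavior described above can be sketched as a small state transition: the gesture-control indicator (e.g. the cursor 92) is shown when the eye movement type switches from saccade to non-saccade and hidden on the reverse switch. The function name and return values are illustrative, not from the patent.

```python
# Hypothetical sketch of the S802 switching logic: show the gesture-control
# indicator on a saccade -> non-saccade transition, hide it on the reverse
# transition, and leave the interface unchanged otherwise.
def update_indicator(prev_type, curr_type):
    """Return 'show', 'hide', or 'keep' for the gesture-control indicator."""
    if prev_type == "saccade" and curr_type != "saccade":
        return "show"  # switched to non-saccade: indicate gesture control
    if prev_type != "saccade" and curr_type == "saccade":
        return "hide"  # switched to saccade: stop indicating gesture control
    return "keep"      # no switch: display interface unchanged

print(update_indicator("saccade", "fixation"))  # show
print(update_indicator("fixation", "saccade"))  # hide
print(update_indicator("saccade", "saccade"))   # keep
```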
Exemplarily, as shown in (A) of Fig. 9, the electronic device 100 has a display screen in a lit state, on which application icons, or other content the user is viewing (such as text or pictures), may be displayed. Taking displayed application icons as an example, when instructing the user to perform gesture control, as shown in (B) of Fig. 9, a cursor 92 may be displayed on the display screen of the electronic device 100, thereby instructing the user to perform gesture control through the cursor 92. Exemplarily, as shown in (A) of Fig. 10, the electronic device 100 likewise has a display screen in a lit state, on which application icons or other content the user is viewing may be displayed; when instructing the user to perform gesture control, as shown in (B) of Fig. 10, an icon, such as icon 91, may be enlarged on the display screen of the electronic device 100, thereby instructing the user to perform gesture control through the enlarged icon.
In some embodiments, referring again to (B) of Fig. 9, the position of the cursor 92 may be the starting position of the gesture operation. Referring again to (B) of Fig. 10, the position of the enlarged icon 91 may be the starting position of the gesture operation. Exemplarily, the starting position of the gesture operation can be understood as the initial position of the cursor when the user performs the gesture operation.
It can be understood that, in addition to the indication manners shown in Fig. 9 and Fig. 10, the user may also be instructed to perform gesture control in other ways, such as by voice, depending on the actual situation, which is not limited here.
S803、获取用户的操作手势。S803. Obtain an operation gesture of the user.
具体地,可以利用上文所描述的手势追踪装置120获取用户的操作手势,详见上文描述,此处不再赘述。示例性的,操作手势也可以称之为手势。Specifically, the gesture tracking device 120 described above may be used to obtain the user's operation gestures, see the above description for details, and will not be repeated here. Exemplarily, an operation gesture may also be referred to as a gesture.
S804、根据操作手势,对电子设备进行控制。S804. Control the electronic device according to the operation gesture.
具体地，获取到用户的操作手势后，即可以根据该操作手势对电子设备进行控制。示例性的，获取到操作手势后，可以根据该操作手势，查询预设的手势和控制指令之间的映射关系，从而确定出该操作手势对应的控制指令，进而基于该控制指令对电子设备进行控制。Specifically, after the user's operation gesture is acquired, the electronic device can be controlled according to the operation gesture. Exemplarily, after the operation gesture is obtained, a preset mapping relationship between gestures and control instructions can be queried according to the operation gesture, so as to determine the control instruction corresponding to the operation gesture, and the electronic device can then be controlled based on that control instruction.
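As an illustrative sketch only, the query of a preset gesture-to-instruction mapping described above may be implemented as a simple table lookup; the gesture names and instruction identifiers below are assumed examples and are not a mapping defined by this application:

```python
# Assumed example mapping between recognized gestures and control instructions;
# the actual gestures and instructions are not limited by this sketch.
GESTURE_TO_INSTRUCTION = {
    "confirm": "CMD_CONFIRM",
    "swipe_left": "CMD_NEXT_PAGE",
    "swipe_right": "CMD_PREVIOUS_PAGE",
}

def instruction_for_gesture(gesture):
    # Query the preset mapping; return None when the gesture is unmapped,
    # in which case the electronic device may simply ignore the gesture.
    return GESTURE_TO_INSTRUCTION.get(gesture)
```

The electronic device would then execute the returned instruction, while unmapped gestures can safely be ignored.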
由此，通过将眼动操控和手势操控有序结合，使得眼动操控和手势操控之间可以无感交互，从而使得用户可以高效便捷和高精度的对电子设备进行操控。Thus, by combining eye movement control and gesture control in an orderly manner, the transition between eye movement control and gesture control can be seamless and imperceptible to the user, so that the user can control the electronic device efficiently, conveniently, and with high precision.
在一些实施例中，在上述S802中指示用户进行手势控制之前，还可以根据跳动时的眼动类型，实时确定用户眼睛的注视位置，以便确定出用户潜在的控制区域，进而可以精准的指示用户进行手势控制。其中，确定用户眼睛的注视位置的方式可以参考上文图2中有关眼动追踪装置110的相关介绍，此处不再赘述。示例性的，用户眼睛的注视位置可以表示用户的感兴趣区域。示例性的，“确定用户眼睛的注视位置”可以理解为是“根据跳动时的眼动类型对电子设备进行控制”。In some embodiments, before the user is instructed to perform gesture control in S802 above, the gaze position of the user's eyes may also be determined in real time according to the saccade-type eye movement, so as to determine the user's potential control area, and the user can then be accurately instructed to perform gesture control. The manner of determining the gaze position of the user's eyes may refer to the relevant introduction of the eye tracking device 110 in FIG. 2 above, which is not repeated here. Exemplarily, the gaze position of the user's eyes may represent the user's area of interest. Exemplarily, "determining the gaze position of the user's eyes" can be understood as "controlling the electronic device according to the saccade-type eye movement".
示例性的，在精准的指示用户进行手势控制时，可以在确定出的用户的眼睛的注视位置所在的预设区域内指示用户进行手势控制。例如：继续参阅图9的(A)，当确定出用户的注视位置是图标91所在的位置，此时如图9的(B)所示，可以将光标92显示在图标91所在的位置处。同样的，继续参阅图10的(A)，当确定出用户的注视位置是图标91所在的位置，此时如图10的(B)所示，可以将图标91进行放大。由于用户的注视位置可以表示用户的感兴趣区域，因此，这样可以在指示用户可以进行手势控制的同时，使得用户可以直接对其感兴趣的区域进行操作，避免了用户需要重新选择其所希望控制的内容，提升了用户体验。Exemplarily, when accurately instructing the user to perform gesture control, the user may be instructed to perform gesture control within a preset area in which the determined gaze position of the user's eyes is located. For example, continuing to refer to (A) of FIG. 9 , when it is determined that the user's gaze position is the position of the icon 91, then, as shown in (B) of FIG. 9 , the cursor 92 can be displayed at the position of the icon 91. Similarly, continuing to refer to (A) of FIG. 10 , when it is determined that the user's gaze position is the position of the icon 91, then, as shown in (B) of FIG. 10 , the icon 91 can be enlarged. Since the user's gaze position can represent the user's area of interest, this indicates to the user that gesture control is available while allowing the user to directly operate on the area of interest, avoiding the need for the user to reselect the content to be controlled, which improves the user experience.
在一些实施例中，当用户眼睛的眼动类型为跳动时，还可以根据此时的眼动类型，控制切换电子设备上所显示的内容。示例性的，可以控制切换电子设备上所显示的用户界面。其中，切换方向可以由用户眼睛的注视位置的变化方向确定。举例来说，当用户正在浏览文件，该文件总共有10页，电子设备一次只能显示1页，且用户正在浏览第5页时，当确定出用户眼睛的眼动类型为跳动，且用户眼睛的注视位置是由电子设备的上方朝向下方移动，则可以在用户眼睛的注视位置处于电子设备上的显示区域的底部时，电子设备可以将其所显示的文件的内容由第5页切换至第6页；当确定出用户眼睛的眼动类型为跳动，且用户眼睛的注视位置是由电子设备的下方朝向上方移动，则可以在用户眼睛的注视位置处于电子设备上的显示区域的顶部时，电子设备可以将其所显示的文件的内容由第5页切换至第4页。示例性的，“当用户眼睛的眼动类型为跳动时，根据此时的眼动类型，控制切换电子设备上所显示的内容”可以理解为是“根据跳动时的眼动类型对电子设备进行控制”。In some embodiments, when the eye movement type of the user's eyes is a saccade, the content displayed on the electronic device may also be switched according to this eye movement type. Exemplarily, the user interface displayed on the electronic device may be switched, where the switching direction may be determined by the direction in which the gaze position of the user's eyes changes. For example, suppose the user is browsing a document of 10 pages in total, the electronic device can display only 1 page at a time, and the user is browsing page 5. When it is determined that the eye movement type is a saccade and the gaze position moves from the top of the electronic device toward the bottom, then, when the gaze position reaches the bottom of the display area on the electronic device, the electronic device may switch the displayed content of the document from page 5 to page 6; when it is determined that the eye movement type is a saccade and the gaze position moves from the bottom of the electronic device toward the top, then, when the gaze position reaches the top of the display area, the electronic device may switch the displayed content from page 5 to page 4. Exemplarily, "when the eye movement type of the user's eyes is a saccade, controlling the switching of the content displayed on the electronic device according to this eye movement type" can be understood as "controlling the electronic device according to the saccade-type eye movement".
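The page-switching rule described above can be sketched as follows; this is a minimal illustration in which the direction names and the clamping behavior at the first and last pages are assumptions, not limitations of this application:

```python
def switch_page(current_page, total_pages, gaze_direction):
    # A saccade toward the bottom of the display advances one page;
    # a saccade toward the top goes back one page. Page numbers are
    # clamped to the range [1, total_pages].
    if gaze_direction == "down" and current_page < total_pages:
        return current_page + 1
    if gaze_direction == "up" and current_page > 1:
        return current_page - 1
    return current_page
```

For the 10-page document example above, a downward saccade while viewing page 5 yields page 6, and an upward one yields page 4.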
在一些实施例中，当根据跳动时的眼动类型对电子设备进行控制时，可以限制电子设备获取用户的手势，以避免用户的手势的干扰。示例性的，限制电子设备获取用户的手势可以包括：电子设备不获取用户的手势，或者，电子设备虽然获取了用户的手势但不对手势进行处理。作为一种可能的实现方式，可以控制手势追踪装置120停止工作，或者，可以继续控制手势追踪装置120工作，但对其获取到的手势不做处理。In some embodiments, when the electronic device is controlled according to the saccade-type eye movement, the electronic device may be restricted from acquiring the user's gestures, so as to avoid interference from the user's gestures. Exemplarily, restricting the electronic device from acquiring the user's gestures may include: the electronic device does not acquire the user's gestures, or the electronic device acquires the user's gestures but does not process them. As a possible implementation, the gesture tracking device 120 may be controlled to stop working, or the gesture tracking device 120 may be kept working while the gestures it acquires are not processed.
在一些实施例中，考虑到用户在对某个项目进行操作时，一般需要注视该项目，因此为了便于用户操控，在指示用户进行手势控制时，还可以在用户的眼动类型不为跳动时，当用户在电子设备100的显示区域上的注视区域发生变化时，可以根据用户的眼睛的移动(此时用户的眼动类型为平缓追踪)，控制用于指示用户进行手势控制的内容(比如光标等)移动，直至用户的眼动类型为注视，由此以便告知用户其当前所注视的位置，便于用户选择项目。示例性的，以用于指示用户进行手势控制的内容为光标为例，如图11的(A)所示，当用户的眼动类型不为跳动，且当前用户的注视区域为电子设备100的显示屏上的区域1101时，可以在电子设备100的显示屏上的区域1101中显示光标92，以指示用户进行手势控制，接着，当用户的眼动类型为平缓追踪，且其注视区域移动至电子设备100的显示屏上的区域1102时，如图11的(B)所示，可以控制光标92由区域1101移动至区域1102。示例性的，以用于指示用户进行手势控制的内容为放大内容为例，如图12的(A)所示，当用户的眼动类型不为跳动，且当前用户的注视区域为电子设备100的显示屏上的区域1201时，则可以控制放大区域1201中的图标1202，以指示用户进行手势控制，接着，当用户的眼动类型为平缓追踪，且其注视区域移动至电子设备100的显示屏上的区域1203时，如图12的(B)所示，可以控制区域1201中的图标1202恢复至初始状态(即未放大的状态)，并控制放大区域1203中的图标1204。In some embodiments, considering that a user generally needs to gaze at an item when operating on it, in order to facilitate the user's manipulation, when instructing the user to perform gesture control and when the user's eye movement type is not a saccade, if the user's gaze area on the display area of the electronic device 100 changes, the content used to instruct the user to perform gesture control (such as a cursor) may be controlled to move following the movement of the user's eyes (at this time the user's eye movement type is smooth pursuit), until the user's eye movement type becomes fixation, thereby informing the user of the position currently being gazed at and facilitating the selection of items. Exemplarily, taking a cursor as the content used to instruct the user to perform gesture control, as shown in (A) of FIG. 11 , when the user's eye movement type is not a saccade and the user's current gaze area is the area 1101 on the display screen of the electronic device 100, the cursor 92 may be displayed in the area 1101 to instruct the user to perform gesture control; then, when the user's eye movement type is smooth pursuit and the gaze area moves to the area 1102 on the display screen of the electronic device 100, as shown in (B) of FIG. 11 , the cursor 92 may be controlled to move from the area 1101 to the area 1102. Exemplarily, taking enlarged content as the content used to instruct the user to perform gesture control, as shown in (A) of FIG. 12 , when the user's eye movement type is not a saccade and the user's current gaze area is the area 1201 on the display screen of the electronic device 100, the icon 1202 in the area 1201 may be enlarged to instruct the user to perform gesture control; then, when the user's eye movement type is smooth pursuit and the gaze area moves to the area 1203 on the display screen of the electronic device 100, as shown in (B) of FIG. 12 , the icon 1202 in the area 1201 may be controlled to return to its initial (un-enlarged) state, and the icon 1204 in the area 1203 may be enlarged.
在一些实施例中，在用户进行手势控制过程中，当用户的手部需要移动的距离较大时，则容易造成用户手部的疲劳，导致用户体验较差。此外，如图13所示，当用户A的手部22可以在区域50中移动即可以完成对电子设备100的控制时，虽然可以解决用户手部移动较大而造成的手部易疲劳的情况，但由于电子设备100上的显示区域51明显大于区域50，这就使得用户A的手部22在区域50中移动距离映射到电子设备100上的显示区域51上时往往移动距离较大，进而造成控制精度下降，使得用户难以精准对电子设备100进行控制。为避免用户手部移动距离较大的情况出现，同时为了提升控制精度，还可以在确定出用户的注视位置后，基于用户的注视位置，确定用户的注视区域，并预先设定注视区域的尺寸与用户手部的操作区域的尺寸之间的对应关系，例如，注视区域的尺寸和操作区域的尺寸之间的比例可以是1:1，也可以是1:0.8，亦可以是1:1.2等等，这样用户在操作时其手部可以在用户手部的操作区域进行操作(当然了，用户在操作过程中其手部也可以超出操作区域，此时，在用户的眼睛的眼动类型不为跳动时，光标等用于指示用户进行手势控制的内容可以跟随用户的手部移动至注视区域外部)，进而完成对注视区域中内容的控制，提升了操作的舒适性。例如，如图14所示，用户的注视位置为位置x，此时基于位置x确定出的注视区域为区域53，其中，区域53映射到用户A所在的位置处的操作区域为区域52，之后，用户A可以控制其手部22在该区域52中做出控制手势，即可以实现对区域53中的内容的控制，由此使得可以将用户手部的移动距离控制在用户较为舒适的区域中，提升了用户的操作体验。此外，在图14中，区域53和区域52之间大小可以是相同的，这样使得用户A的手部22在区域52中的移动距离可以1:1的映射到区域53中，进而可以精准的对电子设备100进行控制，提升了控制精度。在一个例子中，注视区域和操作区域的尺寸主要是为了确定操作的比例尺，例如当需要实现非常高精度的操控时，注视区域与操作区域的尺寸之间的比例可以为2:1等；当需要实现较低精度的操控时，注视区域与操作区域的尺寸之间的比例可以为1:2等。In some embodiments, when the user's hand needs to move a large distance during gesture control, the user's hand is easily fatigued, resulting in a poor user experience. In addition, as shown in FIG. 13 , when the hand 22 of the user A only needs to move within the area 50 to complete control of the electronic device 100, the hand fatigue caused by large hand movements can be alleviated; however, since the display area 51 on the electronic device 100 is obviously larger than the area 50, a movement of the hand 22 of the user A within the area 50 is often mapped to a much larger movement on the display area 51, which reduces the control precision and makes it difficult for the user to precisely control the electronic device 100. To avoid large hand movements and, at the same time, to improve control precision, after the user's gaze position is determined, the user's gaze area may be determined based on the gaze position, and a correspondence between the size of the gaze area and the size of the operation area of the user's hand may be set in advance; for example, the ratio between the size of the gaze area and the size of the operation area may be 1:1, 1:0.8, 1:1.2, and so on. In this way, the user's hand can operate within the operation area (of course, the hand may also move beyond the operation area during operation; at this time, when the user's eye movement type is not a saccade, the cursor or other content used to instruct the user to perform gesture control can follow the user's hand and move outside the gaze area), thereby completing control of the content in the gaze area and improving the comfort of operation. For example, as shown in FIG. 14 , the user's gaze position is the position x, and the gaze area determined based on the position x is the area 53, where the area 53 is mapped to the area 52 as the operation area at the location of the user A; then, the user A can make control gestures with the hand 22 within the area 52 to control the content in the area 53, so that the movement distance of the user's hand is kept within an area that is comfortable for the user, improving the user's operating experience. In addition, in FIG. 14 , the area 53 and the area 52 may be of the same size, so that the movement distance of the hand 22 of the user A in the area 52 can be mapped 1:1 into the area 53, allowing precise control of the electronic device 100 and improving the control precision. In one example, the sizes of the gaze area and the operation area mainly determine the scale of the operation; for example, when very high-precision manipulation is required, the ratio between the size of the gaze area and the size of the operation area may be 2:1, etc.; when lower-precision manipulation is required, the ratio may be 1:2, etc.
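The correspondence between the gaze-area size and the operation-area size can be sketched as below; the ratio values follow the examples in the text (1:1, 2:1 for high-precision control, 1:2 for coarser control), while the function and parameter names are assumed for illustration:

```python
def operation_area_size(gaze_width, gaze_height, gaze_to_operation_ratio):
    # gaze_to_operation_ratio is the preset ratio gaze-area : operation-area.
    # 2.0 (i.e. 2:1) yields a small operation area mapped onto a larger gaze
    # area, giving high-precision control; 0.5 (i.e. 1:2) gives coarse control.
    return (gaze_width / gaze_to_operation_ratio,
            gaze_height / gaze_to_operation_ratio)
```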
示例性的，该注视区域可以是以用户的注视位置为中心向外延伸预设距离的区域。此外，也可以基于用户的注视位置、用户与电子设备之间的距离或者用户观看电子设备的角度等中的多项实时或周期性计算该注视区域。示例性的，可以由用户与电子设备之间的距离确定出以用户的注视位置为中心向外延伸的目标距离，之后，以用户的注视位置为中心向外延伸目标距离，即可以得到该注视区域。Exemplarily, the gaze area may be an area extending outward by a preset distance centered on the user's gaze position. In addition, the gaze area may also be calculated in real time or periodically based on several of the user's gaze position, the distance between the user and the electronic device, the angle at which the user views the electronic device, and the like. Exemplarily, a target distance by which the area extends outward from the user's gaze position may be determined from the distance between the user and the electronic device; afterwards, the gaze area can be obtained by extending outward by the target distance centered on the user's gaze position.
举例来说，如图15所示，由于人的眼睛最灵敏的区域的可视角度A为2°，因此在获取到用户与电子设备之间的距离D后，可以根据下述“公式一”计算出目标距离S，之后，即可以以用户的注视位置为中心向外延伸目标距离S，从而得到注视区域。示例性的，注视区域可以为圆形、矩形等规则图形，也可以为非规则图形。For example, as shown in FIG. 15 , since the visual angle A of the most sensitive region of the human eye is 2°, after the distance D between the user and the electronic device is obtained, the target distance S can be calculated according to "Formula 1" below; afterwards, the target distance S can be extended outward centered on the user's gaze position to obtain the gaze area. Exemplarily, the gaze area may be a regular shape such as a circle or a rectangle, or may be an irregular shape.
公式一：S = D × tan(A/2)　　Formula 1: S = D × tan(A/2)
其中,A=2;D为用户与电子设备之间的距离,D可以由电子设备上的距离传感器检测得到;S为目标距离。Wherein, A=2; D is the distance between the user and the electronic device, D can be detected by the distance sensor on the electronic device; S is the target distance.
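Assuming "Formula 1" expresses the small-angle geometry implied by the surrounding text (S is the radius extended outward from the gaze point, A = 2° is the visual angle of the most sensitive foveal region, and D is the user-device distance measured by the distance sensor), it can be sketched as S = D · tan(A/2); the function name and units are illustrative assumptions:

```python
import math

def target_distance(d, a_deg=2.0):
    # S = D * tan(A / 2): radius of the gaze region, extended outward from
    # the gaze position. A is in degrees; D and S share the same length unit.
    return d * math.tan(math.radians(a_deg) / 2.0)
```

At a viewing distance of 500 mm this gives a gaze-region radius of roughly 8.7 mm.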
进一步地,在确定出注视区域后,在上述S802中可以将指示用户进行手势控制的 信息置于该注视区域中。示例性的,如图16所示,当确定出的注视区域为区域54,且通过光标92指示用户进行手势控制时,可以将光标92置于注视区域54中。当然了,当使用其他的指示方式指示用户进行手势控制时,也可以在确定出的注视区域中进行指示,此处不再一一赘述。Further, after the gaze area is determined, information instructing the user to perform gesture control may be placed in the gaze area in S802 above. Exemplarily, as shown in FIG. 16 , when the determined gaze area is the area 54 and the cursor 92 is used to instruct the user to perform gesture control, the cursor 92 may be placed in the gaze area 54 . Of course, when using other indication methods to instruct the user to perform gesture control, the indication may also be performed in the determined gaze area, which will not be repeated here.
在一些实施例中,当使用光标等方式指示用户进行手势控制时,为了避免光标遮挡用户正在观看的内容,在显示光标时,可以将该光标显示在注视区域内不遮挡用户观看视线的位置。示例性的,可以根据用户当前正在使用的应用的类型,确定光标在注视区域内的显示位置。例如,若用户当前正在使用的应用是阅读类应用,此时在显示光标时可以将光标显示在注视区域内未显示有文字的位置;若用户当前正在使用的应用是视频类应用,此时在显示光标时可以将光标显示在注视区域的左下角的位置,等等。In some embodiments, when the cursor is used to instruct the user to perform gesture control, in order to prevent the cursor from blocking the content the user is watching, when the cursor is displayed, the cursor can be displayed at a position within the gaze area that does not block the user's line of sight. Exemplarily, the display position of the cursor in the gaze area may be determined according to the type of application currently being used by the user. For example, if the application that the user is currently using is a reading application, the cursor can be displayed at a position where no text is displayed in the gaze area when displaying the cursor; if the application that the user is currently using is a video application, at this time When displaying the cursor, the cursor may be displayed at the lower left corner of the gaze area, and so on.
在一些实施例中，由于用户观看视频时，其眼睛的眼动类型在大部分情况下不为跳动，因此，在这种场景（即观影场景）下，如果指示用户进行手势控制，则容易影响到用户观看视频。因此，在该种场景下，为了避免误指示用户进行手势控制的情况，可以在电子设备上设置有预设区域，当用户注视该预设区域，且眼动类型不为跳动时，则可以根据眼动类型指示用户进行手势控制。示例性的，可以通过用户的注视位置，确定用户的注视区域是否为预设区域。例如，可以将用户的注视位置与预设区域进行对比，以确定用户的注视位置是否位于该预设区域中。此外，在执行这部分操作时，也可以先对用户使用电子设备的场景进行判断，以确定当前的使用场景是否是观看视频的场景。其中，电子设备可以基于用户正在使用的应用（application，APP）的类型和APP上显示的内容，确定用户使用电子设备的场景。例如，当用户正在使用的APP为视频类应用，且APP上正在播放视频时，则可以确定出用户使用电子设备的场景为观看视频的场景。In some embodiments, since the eye movement type of the user's eyes is, in most cases, not a saccade while the user is watching a video, instructing the user to perform gesture control in such a scenario (i.e., a video-watching scenario) can easily disturb the user's viewing. Therefore, in this scenario, in order to avoid erroneously instructing the user to perform gesture control, a preset area may be set on the electronic device; when the user gazes at the preset area and the eye movement type is not a saccade, the user may be instructed to perform gesture control according to the eye movement type. Exemplarily, whether the user's gaze area is the preset area may be determined from the user's gaze position. For example, the user's gaze position may be compared with the preset area to determine whether the gaze position is located in the preset area. In addition, when performing this part of the operation, the scenario in which the user is using the electronic device may also be judged first, so as to determine whether the current usage scenario is a video-watching scenario. The electronic device may determine this scenario based on the type of the application (APP) being used by the user and the content displayed in the APP. For example, when the APP being used is a video application and a video is playing in the APP, it can be determined that the scenario in which the user is using the electronic device is a video-watching scenario.
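Comparing the gaze position against a preset area, as described above, reduces to a point-in-rectangle test; the coordinate convention and the rectangle representation below are assumed for illustration:

```python
def gaze_in_preset_area(gaze_x, gaze_y, preset_area):
    # preset_area = (left, top, right, bottom) in display coordinates,
    # with the y axis growing downward as is common on screens.
    left, top, right, bottom = preset_area
    return left <= gaze_x <= right and top <= gaze_y <= bottom
```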
以上即是对本申请提供的与电子设备进行交互的方法的介绍,为便于理解下面举例进行说明。The above is the introduction of the method for interacting with the electronic device provided by the present application, and for the convenience of understanding, an example is given below for illustration.
示例性的,图17示出了一种对电子设备进行控制的过程。在图17中,电子设备上具备上文所描述的眼动追踪装置和手势追踪装置,其中,在指示用户进行手势控制时是通过光标进行指示。如图17所示,该过程可以包括以下步骤:Exemplarily, FIG. 17 shows a process of controlling an electronic device. In FIG. 17 , the electronic device is equipped with the eye tracking device and the gesture tracking device described above, wherein the cursor is used to indicate when instructing the user to perform gesture control. As shown in Figure 17, the process may include the following steps:
S1301、进行眼动追踪。S1301. Perform eye movement tracking.
具体地,可以通过上文所描述的眼动追踪装置110进行眼动追踪,详见上文描述,此处不再赘述。Specifically, the eye tracking device 110 described above may be used to perform eye tracking, see the above description for details, and details will not be repeated here.
S1302、分析眼动类型。S1302. Analyzing eye movement types.
具体地,在对用户的眼睛的眼动进行追踪后,可以根据追踪到的眼动数据,分析眼动类型。比如,根据用户眼睛的眼动速度分析等。详见上文描述,此处不再赘述。Specifically, after the eye movement of the user's eyes is tracked, the eye movement type may be analyzed according to the tracked eye movement data. For example, according to the eye movement speed analysis of the user's eyes, etc. See the above description for details, and will not repeat them here.
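A coarse sketch of classifying the eye movement type from the tracked eye movement speed follows; the angular-velocity thresholds are assumed illustrative values, not thresholds defined by this application:

```python
# Assumed thresholds in degrees per second: saccades are fast ballistic
# movements, smooth pursuit is slow, and fixation is nearly stationary.
SACCADE_THRESHOLD = 100.0
PURSUIT_THRESHOLD = 5.0

def classify_eye_movement(angular_velocity):
    if angular_velocity >= SACCADE_THRESHOLD:
        return "saccade"          # 跳动
    if angular_velocity >= PURSUIT_THRESHOLD:
        return "smooth_pursuit"   # 平缓追踪
    return "fixation"             # 注视
```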
S1303、判断眼动类型是否为跳动。S1303. Determine whether the eye movement type is jitter.
具体地，确定出眼动类型后，即可以判断出眼动类型是否是跳动。其中，当眼动类型为跳动时，表明当前用户没有操控电子设备的意图，因此，此时返回执行S1301。当眼动类型不为跳动时，表明当前用户存在操控电子设备的意图，因此此时可以进入到精准控制阶段，即执行S1304。Specifically, after the eye movement type is determined, it can be judged whether the eye movement type is a saccade. When the eye movement type is a saccade, it indicates that the current user has no intention of manipulating the electronic device; therefore, the process returns to S1301. When the eye movement type is not a saccade, it indicates that the current user has an intention to manipulate the electronic device; therefore, the precise control stage can be entered, that is, S1304 is executed.
S1304、确定注视位置。S1304. Determine the gaze position.
具体地,当用户存在操控电子设备的意图时,则可以确定用户的注视位置,以便后续确定操作区域。对于确定注视位置的方式,详见上文描述,此处不再赘述。Specifically, when the user intends to manipulate the electronic device, the gaze position of the user may be determined, so as to subsequently determine the operation area. For the manner of determining the gaze position, refer to the above description for details, and details will not be repeated here.
在确定出注视位置后,即可以执行后续流程。其中,在后续的控制阶段,可以同时执行S1305和S1310。After the fixation position is determined, subsequent procedures can be performed. Wherein, in the subsequent control stage, S1305 and S1310 may be executed simultaneously.
S1305、根据注视位置,确定注视区域,并在适宜位置显示光标。S1305. Determine a gaze area according to the gaze position, and display a cursor at an appropriate position.
具体地,确定出注视位置后,即可以根据该注视位置,确定出注视区域,进而可以在该注视区域中的适宜位置显示光标。示例性的,注视区域可以为图14中描述的区域53,显示光标可以为图16中描述的显示方式。详见上文描述,此处不再赘述。Specifically, after the gazing position is determined, a gazing area can be determined according to the gazing position, and then a cursor can be displayed at an appropriate position in the gazing area. Exemplarily, the gaze area may be the area 53 described in FIG. 14 , and the display cursor may be the display manner described in FIG. 16 . See the above description for details, and will not repeat them here.
S1306、进行手部追踪。S1306. Perform hand tracking.
具体地,可以通过上文所描述的手势追踪装置120对用户的手部进行手部追踪,详见上文描述,此处不再赘述。Specifically, the user's hand may be tracked by the gesture tracking device 120 described above, see the above description for details, and will not be repeated here.
S1307、光标按规定比例尺随手部运动。S1307. The cursor moves along with the hand according to the prescribed scale.
具体地,当追踪到用户手部移动后,则可以控制光标按照规定的比例尺跟随用户的手部运动。示例性的,如图18所示,在T1时刻,用户A的手部22处于区域52中的位置521处,此时光标92处于区域53中的位置531处;在T2时刻,用户A的手部22产生移动,并移动至区域52中的位置522处,此时光标92可以按规定比例尺随手部22移动,并移动至区域53中的位置532处。Specifically, after the user's hand movement is tracked, the cursor can be controlled to follow the user's hand movement according to a specified scale. Exemplarily, as shown in FIG. 18 , at time T1, user A's hand 22 is at position 521 in area 52, and at this time cursor 92 is at position 531 in area 53; at time T2, user A's hand The hand part 22 moves and moves to the position 522 in the area 52 . At this time, the cursor 92 can move with the hand part 22 according to a prescribed scale and move to the position 532 in the area 53 .
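The cursor-follows-hand behavior at a prescribed scale (S1307 above) can be sketched as a displacement mapping from the operation area to the gaze area; the 1:1 scale matches the example of FIG. 18, and other scale values are assumptions:

```python
def cursor_position(cursor_start, hand_start, hand_now, scale=1.0):
    # Map the hand's displacement in the operation area onto the cursor's
    # displacement in the gaze area, multiplied by the prescribed scale.
    dx = (hand_now[0] - hand_start[0]) * scale
    dy = (hand_now[1] - hand_start[1]) * scale
    return (cursor_start[0] + dx, cursor_start[1] + dy)
```

With `scale=1.0`, moving the hand from position 521 to 522 moves the cursor from 531 to 532 by exactly the same displacement.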
S1308、判断是否有手势操作。S1308. Determine whether there is a gesture operation.
具体地,可以通过上文所描述的手势追踪装置120对用户的手部进行手部追踪,进而确定出是否有手势操作。其中,当有手势操作时,则执行S1310;否则,则返回执行S1307。Specifically, the gesture tracking device 120 described above may perform hand tracking on the user's hand, and then determine whether there is a gesture operation. Wherein, when there is gesture operation, execute S1310; otherwise, return to execute S1307.
S1309、完成对应操作。S1309. Complete the corresponding operation.
具体地,当确定出有手势操作时,则根据预先设定的手势与控制指令之间的对应关系,生成与当前确定出的手势对应的控制指令,并输出该控制指令,进而完成相应的操作。例如,继续参阅图9的(B),此时,当确定出的手势为确认手势时,则输出确认指令,从而完成确认选取“设置”的操作。Specifically, when it is determined that there is a gesture operation, according to the preset correspondence between gestures and control instructions, a control instruction corresponding to the currently determined gesture is generated, and the control instruction is output to complete the corresponding operation . For example, continue to refer to (B) of FIG. 9 , at this time, when the determined gesture is a confirmation gesture, a confirmation instruction is output, thereby completing the operation of confirming and selecting "setting".
S1310、判断眼动类型是否为跳动,以及在眼动类型为跳动时,控制光标消失并返回执行S1301,在眼动类型不为跳动时,继续进行手部追踪,即执行S1306。S1310. Determine whether the eye movement type is jitter, and if the eye movement type is jitter, control the cursor to disappear and return to execute S1301; if the eye movement type is not jitter, continue hand tracking, that is, execute S1306.
具体地，在手势控制阶段，可以继续对用户的眼动类型进行判断，以及在眼动类型为跳动时，控制光标消失并返回执行S1301，在眼动类型不为跳动时，继续进行手部追踪，即执行S1306。Specifically, in the gesture control stage, the user's eye movement type may continue to be judged; when the eye movement type is a saccade, the cursor is controlled to disappear and the process returns to S1301, and when the eye movement type is not a saccade, hand tracking continues, that is, S1306 is executed.
可以理解的是,本申请实施例中各步骤的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。此外,在一些可能的实现方式中,本申请实施例中的各执行步骤可以根据实际情况调整执行顺序和/或选择性执行,此处不做限定。此外,本申请实施例中提及的眼动控制和手势控制可以根据需要进行组合,比如:先眼动控制再手势控制,或者,先手势控制再眼动控制,或者,眼动控制和手势控制同时进行等等。It can be understood that the sequence numbers of the steps in the embodiments of the present application do not mean the order of execution, and the execution order of each process should be determined by its functions and internal logic, and should not constitute the implementation process of the embodiments of the present application. Any restrictions. In addition, in some possible implementation manners, the execution order of each execution step in the embodiment of the present application may be adjusted and/or selectively executed according to actual conditions, which is not limited here. In addition, the eye movement control and gesture control mentioned in the embodiments of this application can be combined as needed, for example: eye movement control first and then gesture control, or gesture control first and then eye movement control, or eye movement control and gesture control Simultaneously etc.
作为一种可能的实现方式,如图19所示,在S1901,电子设备可以获取用户眼睛的眼动类型。在S1902,当确定眼动类型为第一类型(比如:跳动)时,电子设备根据第一类型对电子设备进行控制。在S1903,当确定眼动类型为第二类型(比如:非跳动)时,电子设备指示用户进行手势控制。之后,在S1904,电子设备可以获取用户的手势。接着,在S1905,电子设备可以根据获取的手势对电子设备进行控制。可以理解的是,对于电子设备获取眼动类型和获取手势的过程,以及对电子设备进行控制的过程等,详见上文中的相关描述,此处就不再一一赘述。As a possible implementation manner, as shown in FIG. 19 , at S1901, the electronic device may obtain an eye movement type of the user's eyes. At S1902, when it is determined that the eye movement type is the first type (for example: jitter), the electronic device controls the electronic device according to the first type. At S1903, when it is determined that the eye movement type is the second type (for example: non-jumping), the electronic device instructs the user to perform gesture control. Afterwards, at S1904, the electronic device may acquire the user's gesture. Next, at S1905, the electronic device may control the electronic device according to the acquired gesture. It can be understood that, for the process of obtaining the eye movement type and the gesture of the electronic device, and the process of controlling the electronic device, etc., please refer to the relevant description above, and will not repeat them here.
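The S1901–S1905 flow can be condensed into a single dispatch step; the event names and the trace list used here are illustrative assumptions rather than part of the claimed method:

```python
def handle_step(eye_movement_type, gesture, trace):
    # First type (saccade): control the device by eye movement alone (S1902).
    if eye_movement_type == "saccade":
        trace.append("eye_movement_control")
        return trace
    # Second type (non-saccade): instruct the user to perform gesture control
    # (S1903), then acquire and apply any gesture (S1904-S1905).
    trace.append("instruct_gesture_control")
    if gesture is not None:
        trace.append("gesture_control:" + gesture)
    return trace
```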
作为另一种可能的实现方式,如图20所示,在S2001,电子设备可以先获取用户的手势。然后,在S2002,根据获取到的手势对电子设备进行控制。接着,在S2003,电子设备可以获取用户的眼动类型,其中,电子设备在获取用户的手势的过程或者在对电子设备进行控制的过程中,也可以执行S2003。接着,在S2004,当确定用户的眼动类型为第二类型(比如:非跳动)时,电子设备可以继续获取用户的手势,并在S2005,根据获取的手势对电子设备进行控制。或者,在S2006,当确定用户的眼动类型为第一类型(比如:跳动)时,电子设备可以根据此时的眼动类型对电子设备进行控制,并限制电子设备根据用户的手势控制电子设备,例如:电子设备可以不获取用户的手势,或者,电子设备虽然获取了用户的手势但不对该手势进行处理等。在一个例子中,在电子设备根据第一类型的眼动类型对电子设备进行控制后,若电子设备确定出用户眼睛的眼动类型由第一类型(比如:跳动)切换至第二类型(比如:非跳动),电子设备则可以指示用户进行手势控制;然后,电子设备可以获取用户的手势,并根据获取的手势对电子设备进行控制。可以理解的是,对于电子设备获取眼动类型和获取手势的过程,以及对电子设备进行控制的过程等,详见上文中的相关描述,此处就不再一一赘述。As another possible implementation manner, as shown in FIG. 20 , at S2001, the electronic device may acquire a user's gesture first. Then, at S2002, the electronic device is controlled according to the acquired gesture. Next, in S2003, the electronic device may obtain the user's eye movement type, wherein the electronic device may also execute S2003 during the process of obtaining the user's gesture or during the process of controlling the electronic device. Next, at S2004, when it is determined that the user's eye movement type is the second type (for example: non-jumping), the electronic device may continue to obtain the user's gesture, and at S2005, control the electronic device according to the obtained gesture. Or, in S2006, when it is determined that the user's eye movement type is the first type (for example: jumping), the electronic device can control the electronic device according to the eye movement type at this time, and limit the electronic device to control the electronic device according to the user's gesture , for example: the electronic device may not obtain the user's gesture, or the electronic device may not process the user's gesture although it has obtained the user's gesture. 
In an example, after the electronic device controls the electronic device according to the first type of eye movement type, if the electronic device determines that the eye movement type of the user's eyes is switched from the first type (such as: jumping) to the second type (such as : non-jumping), the electronic device can instruct the user to perform gesture control; then, the electronic device can obtain the gesture of the user, and control the electronic device according to the obtained gesture. It can be understood that, for the process of obtaining the eye movement type and the gesture of the electronic device, and the process of controlling the electronic device, etc., please refer to the relevant description above, and will not repeat them here.
作为又一种可能的实现方式，如图21所示，在S2101，电子设备可以先获取用户的眼动信息。在S2102，电子设备根据眼动信息确定出的眼动类型为第一类型（比如：跳动），然后，在S2103，电子设备可以根据此时的眼动类型对电子设备进行控制。接着，在S2104，电子设备继续获取用户的眼动信息，然后，在S2105，电子设备根据眼动信息确定出的眼动类型为第二类型（比如：非跳动），接着，在S2106，电子设备可以获取用户的手势，并在S2107可以根据获取的手势对电子设备进行控制。之后，电子设备可以继续获取用户的眼动信息，当电子设备由眼动信息确定出的眼动类型为第一类型（比如：跳动）时，电子设备可以根据此时的眼动类型对电子设备进行控制，并限制获取用户的手势。可以理解的是，对于电子设备获取眼动类型和获取手势的过程，以及对电子设备进行控制的过程等，详见上文中的相关描述，此处就不再一一赘述。As yet another possible implementation manner, as shown in FIG. 21 , at S2101, the electronic device may first obtain eye movement information of the user. At S2102, the eye movement type determined by the electronic device from the eye movement information is the first type (for example: a saccade); then, at S2103, the electronic device may control the electronic device according to this eye movement type. Next, at S2104, the electronic device continues to obtain the user's eye movement information; then, at S2105, the eye movement type determined from the eye movement information is the second type (for example: non-saccade); next, at S2106, the electronic device may acquire the user's gesture, and at S2107 the electronic device may be controlled according to the acquired gesture. Afterwards, the electronic device may continue to acquire the user's eye movement information; when the eye movement type determined from the eye movement information is the first type (for example: a saccade), the electronic device may control the electronic device according to this eye movement type and restrict the acquisition of the user's gestures. It can be understood that, for the process in which the electronic device acquires the eye movement type and the gestures, and the process of controlling the electronic device, reference may be made to the relevant description above, which is not repeated here.
As still another possible implementation, as shown in FIG. 22, at S2201 the electronic device may acquire a first gesture performed by the user on a first area of the electronic device. At S2202, the electronic device may control the electronic device according to the first gesture. At S2203, the electronic device may obtain the user's eye movement type. At S2204, when it determines that the user's eye movement type is the first type (for example, saccadic), the electronic device may control the electronic device according to the current eye movement type and restrict acquisition of the user's gestures. At S2205, the electronic device may continue to obtain the user's eye movement type. At S2206, when it determines that the user's eye movement type has switched from the first type (for example, saccadic) to the second type (for example, non-saccadic), and the user's gaze position has switched from the first area to a second area of the electronic device, the electronic device may instruct the user to perform gesture control. At S2207, the electronic device may acquire a second gesture performed by the user on the second area, and at S2208 it may control the electronic device according to the second gesture. It can be understood that the processes of obtaining the eye movement type and the gesture, instructing the user to perform gesture control, and controlling the electronic device are described in detail above and are not repeated here. Illustratively, the process described in FIG. 22 can be understood as the user first performing gesture control on one area of the electronic device, then shifting the line of sight to another area, and then performing gesture control on the new area. For example, as shown in FIG. 24, in (B) to (D) of FIG. 24 the user performs gesture control on the content in area 53. During the switch from (D) to (E) of FIG. 24, the eye movement type acquired by the electronic device is saccadic; at this time the electronic device may determine the gaze position of the user's eyes in real time (that is, control the electronic device), and during this process the electronic device may stop acquiring the user's gestures, or continue to acquire them without responding (that is, restrict acquisition of the user's gestures). Then, as shown in (E) of FIG. 24, the electronic device continues to obtain the user's eye movement type; at this point the eye movement type switches from saccadic to non-saccadic, and the user's gaze position also switches from the position in (D) of FIG. 24 to the position of picture 63. At this time, as shown in (F) of FIG. 24, the electronic device may instruct the user to perform gesture control, subsequently acquire the gesture performed by the user on the area where picture 63 is located, and control the electronic device according to the acquired gesture.
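The FIG. 22 flow — gesture control on one area, a saccade to another area, then gesture control on the new area — amounts to tracking which screen area is active and whether gestures are currently accepted. A minimal sketch, assuming a dict-based state and string-valued eye movement types (both illustrative conventions, not from the application):

```python
def update_active_region(state, eye_type, gaze_region):
    """Update the active control region per the FIG. 22 flow.

    During a saccade (first type) the device is eye-driven and gesture
    acquisition is restricted (S2204); once the eyes settle on a region
    (S2206), that region becomes the active one and the user is again
    invited to perform gesture control there.
    """
    if eye_type == "saccade":
        state["gestures_enabled"] = False   # restrict gesture acquisition
    else:
        state["region"] = gaze_region       # gaze settled: adopt this region
        state["gestures_enabled"] = True    # indicate gesture control is available
    return state
```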
The above is an introduction to the technical solution provided by this application. For ease of understanding, example scenarios are described below.
Scenario 1
In this scenario, multiple items are displayed on the display screen of the electronic device 100, such as applications, files, pictures, and videos.
As shown in (A) of FIG. 23, user A may browse items on the electronic device 100. While user A is browsing, the eye tracking device matched with the electronic device 100 may determine that the eye movement type of user A's eyes is saccadic, in which case it continues to judge the eye movement type of user A's eyes.
Next, as shown in (B) of FIG. 23, user A's eyes begin to fixate on item 61, and the eye tracking device on the electronic device 100 may determine that the eye movement type of user A's eyes is not saccadic. The eye tracking device may then determine the gaze position of user A's eyes, from which the gaze area 53 is determined, and the cursor 92 is displayed at the lower left corner of the gaze area 53. At this time, user A may use his hand 22 to perform gesture operations in the operation area 52 corresponding to the gaze area 53. Suppose, however, that user A changes his mind and starts browsing the items again. The eye tracking device then determines that the eye movement type of user A's eyes is saccadic; at this time the cursor 92 is no longer displayed (that is, the cursor 92 is controlled to disappear), and the electronic device 100 displays the interface shown in (C) of FIG. 23. In one example, in (B) of FIG. 23, after it is determined that user A's eyes are fixated on item 61, the cursor 92 may be placed directly on item 61 to improve control efficiency, so that user A can select item 61 without first moving the cursor 92.
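The cursor behaviour in (B) and (C) of FIG. 23 — shown on fixation, optionally snapped onto the gazed item, hidden again when browsing resumes — can be sketched as a small rule. The function name and the placeholder return values below are illustrative assumptions, not part of the application:

```python
def cursor_state(eye_type, gazed_item=None, snap_to_item=True):
    """Return (visible, position) for the cursor.

    A saccadic eye movement hides the cursor; a non-saccadic one shows
    it, either directly on the gazed item (skipping the coarse cursor
    move, as in the item 61 example) or at the gaze-area corner.
    """
    if eye_type == "saccade":
        return (False, None)                    # browsing: hide the cursor
    if snap_to_item and gazed_item is not None:
        return (True, gazed_item)               # place the cursor on the item
    return (True, "gaze_area_lower_left")       # default corner of gaze area 53
```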
Next, as shown in (D) of FIG. 23, user A's eyes begin to fixate on item 62, and the eye tracking device on the electronic device 100 may determine that the eye movement type of user A's eyes is not saccadic. The eye tracking device matched with the electronic device 100 may then determine the gaze position of user A's eyes, from which the gaze area 53 is determined, and the cursor 92 is displayed at the lower left corner of the gaze area 53. At this time, user A may use his hand 22 to perform gesture operations in the operation area 52 corresponding to the gaze area 53. In one example, in (D) of FIG. 23, after it is determined that user A's eyes are fixated on item 62, the cursor 92 may be placed directly on item 62 to improve control efficiency, so that user A can select item 62 without first moving the cursor 92.
Next, as shown in (E) of FIG. 23, user A moves his hand 22. As the hand 22 moves, the gesture tracking device matched with the electronic device 100 may track its movement, and the electronic device 100 may control the cursor 92 in the gaze area 53 to move along with the hand 22. When the cursor 92 moves onto item 62, user A can complete the "confirm" operation with a confirmation gesture, which completes the selection of item 62.
In this way, by combining eye tracking and gesture tracking, the user can complete item selection naturally and efficiently without using peripherals, and the system allows the user to change his mind midway and make a second or further selection. Moreover, because eye movement control performs the coarse selection for gesture control, the distance and time of the user's hand movement during the whole operation are short, so fatigue is low, improving the user experience and control precision.
Scenario 2
In this scenario, content such as text may be displayed on the electronic device 100, and the user may read the text on the electronic device 100. Illustratively, this may be a meeting or education scenario.
As shown in (A) of FIG. 24, user A reads the text being displayed on the display screen of the electronic device 100. While user A is reading, the eye tracking device matched with the electronic device 100 detects that the eye movement type is smooth pursuit.
Next, as shown in (B) of FIG. 24, the eye tracking device may determine the gaze position of user A's eyes, from which the gaze area 53 is determined; that is, the eye movement type of the user's eyes is not saccadic, so the cursor 92 may be displayed in the gaze area 53. To prevent the cursor 92 from interfering with user A's reading, the cursor 92 may appear in a place that does not block the reading line of sight. The operation area 52 corresponding to the gaze area may also be determined in (B) of FIG. 24, and user A's hand 22 may perform corresponding gesture operations in the operation area 52.
Next, as shown in (C) of FIG. 24, user A moves his hand 22. As the hand 22 moves, the gesture tracking device matched with the electronic device 100 may track its movement, and the electronic device 100 may control the cursor 92 in the gaze area 53 to move along with the hand 22. When the cursor 92 reaches the position user A wants to annotate, user A may stop moving his hand 22 and make a start-annotation gesture to begin annotating.
Next, as shown in (D) of FIG. 24, after making the start-annotation gesture, user A may continue to move his hand 22 and, upon reaching the position where he wants to end the annotation, make a stop-annotation gesture, which ends the annotation and yields the annotated area 531. During annotation, the cursor 92 may follow user A's hand 22. It can be understood that, besides the annotation method described in (D) of FIG. 24, user A may also use other annotation methods, such as annotating with patterns, which is not limited here.
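The start/stop annotation gestures of (C) and (D) of FIG. 24 bound a stroke: the positions reported by the hand tracker between the two gestures make up the annotation. A hedged sketch — the event names and tuple convention are invented for illustration:

```python
def collect_annotation(events):
    """Collect cursor positions between 'start' and 'stop' gesture events.

    `events` is a sequence of (kind, payload) pairs, where kind is
    'start', 'move' (payload = cursor position), or 'stop'. Returns the
    list of positions forming the annotation stroke.
    """
    stroke, recording = [], False
    for kind, payload in events:
        if kind == "start":
            recording = True
        elif kind == "stop":
            recording = False
        elif kind == "move" and recording:
            stroke.append(payload)  # cursor follows the hand while annotating
    return stroke
```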
Next, as shown in (E) of FIG. 24, after user A finishes annotating, user A's line of sight shifts to picture 63. At this time, the eye tracking device matched with the electronic device 100 may detect that user A's eye movement type is saccadic, so the cursor 92 may be controlled to disappear.
Next, as shown in (F) of FIG. 24, after user A's line of sight shifts to picture 63, he is attracted by picture 63. The eye tracking device matched with the electronic device 100 may now detect that user A's eye movement type is not saccadic, so the cursor 92 may be displayed again, for example in the gaze area 53, to instruct user A to perform gesture control. The operation area 52 corresponding to the gaze area may also be determined in (F) of FIG. 24, and user A's hand 22 may perform corresponding gesture operations in the operation area 52.
Next, as shown in (G) of FIG. 24, user A wants to annotate text on picture 63. User A may move the cursor 92 by moving his hand 22. Once the cursor 92 is at the desired position, user A may use a text-writing gesture to write text in the operation area 52, for example "smile". While user A writes, the gesture tracking device on the electronic device 100 may track the writing trajectory of user A's hand 22, after which the display screen of the electronic device 100 may display the same content as the user's writing trajectory, for example "smile". To prevent user A's hand 22 from having to extend ever further to the right while writing, after writing a certain number of characters user A may use a specific gesture to switch the writing position back to the left side of the operation area 52 and then continue writing.
In this way, by combining eye movement and gesture control, the user can complete remote annotation on the electronic device naturally and efficiently. Annotation types include, but are not limited to, patterns, text, and symbols. In the related art, annotation is usually done either by close-range touch or remotely by voice. Close-range touch annotation requires the user to walk up to the screen for every annotation, which is inconvenient; voice-based remote annotation can only input text, can hardly input patterns, and its speech recognition is affected by environmental noise. In this solution, the user can complete natural and efficient annotation of various kinds of information from a distance, improving annotation efficiency and user experience.
Scenario 3
This scenario may be an entertainment scenario, for example a gaming scenario in which the user is playing a first-person shooter, a motion-sensing game, a puzzle game, or a board or card game. The scenario is introduced below using a first-person shooter as an example.
As shown in (A) of FIG. 25, during user A's game, when user A aims at target 71, the eye tracking device matched with the electronic device 100 detects that the eye movement type is fixation. The eye tracking device may then determine the gaze position of user A's eyes, from which the gaze area 53 is determined, and the cursor 92 used for aiming is displayed in the gaze area 53. User A's hand 22 may then make a shooting gesture in the operation area 52 corresponding to the gaze area 53 to complete the operation of shooting target 71.
Next, as shown in (B) of FIG. 25, after user A finishes shooting target 71, user A spots target 72 and shifts his gaze to it. During this process, the eye tracking device matched with the electronic device 100 detects that the eye movement type is saccadic, and the cursor 92 may be controlled to disappear.
Next, as shown in (C) of FIG. 25, when user A aims at target 72, the eye tracking device matched with the electronic device 100 detects that the eye movement type is fixation, and the aiming cursor 92 appears in the gaze area 53. At this point the cursor 92 is not fully on target 72, so to improve shooting accuracy user A may adjust it. As shown in (D) of FIG. 25, user A's hand 22 may make a movement gesture in the operation area 52 corresponding to the gaze area 53, and the cursor 92 follows the hand 22. Once aiming is complete, user A's hand 22 may make a shooting gesture in the operation area 52 to complete the operation of shooting target 72.
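The aim-then-shoot sequence of (C) and (D) of FIG. 25 is a fine correction of a gaze-placed cursor by hand-movement offsets, followed by a trigger gesture. A simplified sketch — the function name, the offset convention, and the axis-aligned hit test are all illustrative assumptions:

```python
def aim_and_fire(gaze_pos, hand_offsets, target_rect):
    """Start the aiming cursor at the gaze position, apply the hand's
    movement offsets in order, then report whether a shot at the final
    cursor position hits the target rectangle (x0, y0, x1, y1)."""
    x, y = gaze_pos
    for dx, dy in hand_offsets:   # cursor 92 follows the hand
        x, y = x + dx, y + dy
    x0, y0, x1, y1 = target_rect
    return (x0 <= x <= x1) and (y0 <= y <= y1)
```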
In this way, by combining eye movement and gesture control, the user can switch targets naturally and efficiently in entertainment scenarios and perform higher-precision operations, obtaining a smooth entertainment experience. In the related art, the interaction of motion-sensing games often requires the user to make large movements, which is inefficient, tiring, and imprecise. In this solution, the user only needs to move the hand within a small range to complete the operation, so fatigue is less likely and both control efficiency and precision are higher.
Scenario 4
This scenario may be the use of a virtual large-screen interface created by an electronic device such as a virtual reality (VR), augmented reality (AR), or mixed reality (MR) device.
As shown in (A) of FIG. 26, multiple items are displayed in the field of view (FOV) of the virtual large-screen interface created by the electronic device 100, and user A is browsing items such as item 64 and item 65. While user A browses, the eye tracking device matched with the electronic device 100 may detect the eye movement type of user A's eyes. At this time the cursor 92 is not displayed in the FOV.
Next, as shown in (B) of FIG. 26, when user A fixates on item 64, the eye tracking device matched with the electronic device 100 may determine that user A's eye movement type is not saccadic. The eye tracking device may then determine the gaze position of user A's eyes, from which the gaze area 53 is determined, and the cursor 92 is displayed in the gaze area 53 to instruct user A to perform gesture control. At this time, user A may use his hand 26 to perform gesture operations in the operation area 52 corresponding to the gaze area 53.
Next, as shown in (C) of FIG. 26, user A wants to close item 64 and may move his hand 26 in the operation area 52. As the hand 26 moves, the gesture tracking device matched with the electronic device 100 may track its movement, and the electronic device 100 may control the cursor 92 in the gaze area 53 to move along with the hand 26. When the cursor 92 reaches item 64, user A may stop moving the hand 26.
Next, as shown in (D) of FIG. 26, user A may make a confirmation gesture with his hand 26 to complete the "confirm" operation. The gesture tracking device matched with the electronic device 100 may track this "confirm" operation, after which the electronic device 100 closes item 64 in its FOV, completing the operation of closing item 64.
In this way, by combining eye movement and gesture control, the user can precisely operate a VR/AR/MR interface with small gesture movements. In the related art, VR/AR/MR devices mostly use cameras to capture gesture movements for control: on one hand, the hand must operate within the camera's field of view, limiting the freedom of control; on the other hand, the control precision does not meet user needs. In this solution, eye movement control first selects the gaze area, which provides a starting point and a scale for gesture control and solves the problem of mapping out-of-sight gesture control to FOV interface coordinates, so that the user can operate the interface of a VR/AR/MR device efficiently and with high precision, improving the user experience.
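The statement that eye movement control "provides a starting point and a scale for gesture control" can be made concrete as an affine mapping from the hand's operation area to the on-screen gaze area. A sketch under an assumed rectangle convention of (x0, y0, width, height); none of these names or conventions are mandated by the application:

```python
def map_hand_to_screen(hand_pos, op_area, gaze_area):
    """Map a hand position in the operation area to screen coordinates
    in the gaze area: the gaze area supplies the origin (starting
    point), and the ratio of the two rectangles supplies the scale."""
    hx, hy = hand_pos
    ox, oy, ow, oh = op_area
    gx, gy, gw, gh = gaze_area
    u = (hx - ox) / ow            # normalised position inside the operation area
    v = (hy - oy) / oh
    return (gx + u * gw, gy + v * gh)
```

This is why a small hand movement suffices: the operation area is mapped only onto the (smaller) gaze area rather than onto the whole FOV.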
Scenario 5
This scenario may be an entertainment scenario, for example watching a video (such as a movie). In this scenario, a preset area is set on the electronic device. When the user gazes at the preset area, it can be judged that the user currently intends to control the electronic device; when the user gazes at other areas, it can be judged that the user currently has no such intention. It can be understood that, because the eye movement type of a user watching a video is in most cases not saccadic, this design can avoid misrecognition and improve the user experience.
Specifically, as shown in (A) of FIG. 27, user A is watching a movie on the electronic device 100. While user A watches, the eye tracking device matched with the electronic device 100 may detect the user's eye movement type in real time or periodically.
As shown in (B) of FIG. 27, user A now gazes at area 56, which is a preset area on the electronic device 100. When the user gazes at area 56, in (B) of FIG. 27 the eye tracking device matched with the electronic device 100 may detect that user A's eye movement type is not saccadic and determine that the gaze position of user A's eyes is in area 56, at which point the flow proceeds to (C) of FIG. 27.
Next, as shown in (C) of FIG. 27, the eye tracking device may determine the gaze position of user A's eyes, from which the gaze area 53 is determined, and the playback progress bar and the cursor 92 are displayed in the gaze area 53; the cursor 92 may be displayed at the lower left corner of the gaze area 53. The operation area 52 corresponding to the gaze area 53 may also be determined in (C) of FIG. 27, and user A's hand 22 may perform corresponding gesture operations in the operation area 52. It can be understood that when watching a video the user generally wants to adjust the progress or the volume; therefore, to prevent the cursor 92 from affecting the viewing experience, the cursor 92 may be displayed at the progress-adjustment or volume-adjustment position. In addition, since progress adjustment and volume adjustment are generally not at the same position, the volume keys and the playback progress bar may be displayed at the same time, so that the user can choose according to need. Of course, the electronic device 100 may also be provided with an area corresponding to volume adjustment and an area corresponding to playback progress: when the user gazes at the volume-adjustment area, the volume keys are displayed, and when the user gazes at the playback-progress area, the playback progress bar is displayed. In other words, a target area corresponding to a target function may be set on the electronic device 100, and when the user gazes at the target area, control keys related to the target function are displayed on the electronic device.
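The "target area → target function" association just described is essentially a lookup from the gazed preset region to the control keys to display. A minimal illustrative sketch — the region names and key lists are assumptions of this sketch, not values from the application:

```python
# Mapping from preset gaze regions to the controls they reveal.
REGION_CONTROLS = {
    "volume_area": ["volume_keys"],
    "progress_area": ["progress_bar"],
    "generic_area": ["volume_keys", "progress_bar"],  # show both when ambiguous
}

def controls_for_gaze(region):
    """Return the control keys to display for the gazed region
    (empty list when the region has no associated target function)."""
    return REGION_CONTROLS.get(region, [])
```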
Next, as shown in (D) of FIG. 27, user A moves his hand 22. As the hand 22 moves, the gesture tracking device matched with the electronic device 100 may track its movement, and the electronic device 100 may control the cursor 92 in the gaze area 53 to move along with the hand 22. When the cursor 92 reaches the position user A wants to adjust, user A may stop moving his hand 22.
Next, as shown in (E) of FIG. 27, user A may make a gesture that selects the progress bar and drags it. During this process, the gesture tracking device matched with the electronic device 100 may track the movement of the hand 22, and the electronic device 100 may control the cursor 92 in the gaze area 53 to move along with the hand 22. When the cursor 92 reaches the desired position, user A may stop moving his hand 22.
Next, as shown in (F) of FIG. 27, after user A stops dragging the progress bar, the eye tracking device matched with the electronic device 100 may detect that user A's eye movement type is saccadic, so the cursor 92 may be controlled to disappear.
In this way, by combining eye movement control and gesture control, the two can interact seamlessly, so that in entertainment scenarios the user can adjust playback progress, volume, and so on naturally and efficiently, and can also perform higher-precision operations, obtaining a high-quality control and viewing experience.
It can be understood that the processor in the embodiments of this application may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. A general-purpose processor may be a microprocessor or any conventional processor.
The method steps in the embodiments of this application may be implemented by hardware, or by a processor executing software instructions. The software instructions may consist of corresponding software modules, and the software modules may be stored in random access memory (RAM), flash memory, read-only memory (ROM), programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), a register, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium well known in the art. An exemplary storage medium is coupled to the processor so that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be a component of the processor. The processor and the storage medium may be located in an ASIC.
In the above embodiments, the implementation may be entirely or partly by software, hardware, firmware, or any combination thereof. When software is used, the implementation may be entirely or partly in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of this application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted via a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another by wired (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) means. The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, hard disk, or magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid-state drive (SSD)).
It can be understood that the various numbers involved in the embodiments of this application are only for convenience of description and are not used to limit the scope of the embodiments of this application.

Claims (34)

  1. A method for interacting with an electronic device, wherein the method comprises:
    obtaining, by the electronic device, an eye movement type of a user's eyes;
    when it is determined that the eye movement type is a first type, controlling, by the electronic device, the electronic device according to the first type;
    when it is determined that the eye movement type is a second type, instructing, by the electronic device, the user to perform gesture control;
    acquiring, by the electronic device, a gesture of the user; and
    controlling, by the electronic device, the electronic device according to the acquired gesture.
  2. 根据权利要求1所述的方法,其特征在于,所述指示用户进行手势控制,具体包括:The method according to claim 1, wherein the instructing the user to perform gesture control specifically comprises:
    获取所述用户的注视位置;Obtain the gaze position of the user;
    在所述注视位置所在的第一预设区域内提示所述用户进行手势控制。Prompting the user to perform gesture control within a first preset area where the gaze position is located.
  3. 根据权利要求1或2所述的方法,其特征在于,所述指示用户进行手势控制,具体包括:The method according to claim 1 or 2, wherein the instructing the user to perform gesture control specifically comprises:
    根据所述电子设备上所显示的第一内容的类型,确定目标区域;determining a target area according to the type of the first content displayed on the electronic device;
    在所述目标区域内提示所述用户进行手势控制,其中,所述目标区域为所述电子设备上未包含所述第一内容的区域。Prompting the user to perform gesture control in the target area, wherein the target area is an area on the electronic device that does not contain the first content.
  4. 根据权利要求1-3任一所述的方法,其特征在于,所述指示用户进行手势控制,具体包括:The method according to any one of claims 1-3, wherein the instructing the user to perform gesture control specifically includes:
    获取所述用户的注视位置;Obtain the gaze position of the user;
    根据所述注视位置,确定所述用户的注视区域;determining a gaze area of the user according to the gaze position;
    当所述注视区域为第二预设区域时,提示所述用户进行手势控制。When the gaze area is the second preset area, the user is prompted to perform gesture control.
  5. 根据权利要求1-4任一所述的方法,其特征在于,所述指示用户进行手势控制,包括以下一项或多项:The method according to any one of claims 1-4, wherein the instructing the user to perform gesture control includes one or more of the following:
    控制所述电子设备显示光标,控制放大所述电子设备所显示的至少部分内容,或者,控制与所述电子设备配套的声音组件播报第一语音,所述第一语音用于提示所述用户进行手势控制。Controlling the electronic device to display a cursor, controlling to enlarge at least part of the content displayed by the electronic device, or controlling a sound component matched with the electronic device to broadcast a first voice, the first voice is used to prompt the user to perform gesture control.
  6. 根据权利要求5所述的方法,其特征在于,所述指示用户进行手势控制包括控制所述电子设备显示光标,且所述光标显示在所述电子设备的显示区域中的第一区域内;The method according to claim 5, wherein the instructing the user to perform gesture control comprises controlling the electronic device to display a cursor, and the cursor is displayed in a first area of the display area of the electronic device;
    所述方法还包括:The method also includes:
    确定所述眼动类型持续为所述第二类型,在所述用户的眼睛出现移动且所述用户的注视区域由所述电子设备的显示区域中的所述第一区域切换至第二区域时,控制所述光标由所述第一区域移动至所述第二区域,其中,所述注视区域由所述用户的眼睛的注视位置确定。determining that the eye movement type continues to be the second type, when the user's eyes move and the user's gaze area is switched from the first area to the second area in the display area of the electronic device , controlling the cursor to move from the first area to the second area, wherein the gaze area is determined by the gaze position of the user's eyes.
  7. 根据权利要求5或6所述的方法，其特征在于，所述指示用户进行手势控制包括控制放大所述电子设备所显示的至少部分内容，且放大的内容为所述电子设备的显示区域中第一区域中所显示的内容;The method according to claim 5 or 6, wherein the instructing the user to perform gesture control includes controlling enlargement of at least part of the content displayed by the electronic device, and the enlarged content is the content displayed in a first area of the display area of the electronic device;
    所述方法还包括:The method also includes:
    确定所述眼动类型持续为所述第二类型，在所述用户的眼睛出现移动且用户的注视区域由所述电子设备的所述第一区域切换至第二区域时，将所述第一区域中放大的内容恢复至初始状态，以及放大所述第二区域中的内容。determining that the eye movement type continues to be the second type, and when the user's eyes move and the user's gaze area switches from the first area to the second area of the electronic device, restoring the enlarged content in the first area to its initial state, and enlarging the content in the second area.
  8. 根据权利要求1-7任一所述的方法,其特征在于,所述电子设备指示所述用户进行手势控制之后,所述方法还包括:The method according to any one of claims 1-7, wherein after the electronic device instructs the user to perform gesture control, the method further comprises:
    当确定所述眼动类型由所述第二类型切换为所述第一类型时，所述电子设备停止指示所述用户进行手势控制，和/或，所述电子设备根据所述第一类型对所述电子设备进行控制，并限制所述电子设备获取所述用户的手势。When it is determined that the eye movement type switches from the second type to the first type, the electronic device stops instructing the user to perform gesture control, and/or, the electronic device controls the electronic device according to the first type and restricts the electronic device from acquiring the user's gesture.
  9. 根据权利要求1-8任一所述的方法,其特征在于,在确定所述眼动类型为所述第二类型之后,还包括:The method according to any one of claims 1-8, further comprising: after determining that the eye movement type is the second type:
    获取所述用户的注视位置;Obtain the gaze position of the user;
    根据所述注视位置,确定所述用户的注视区域的尺寸;determining the size of the user's gaze area according to the gaze position;
    根据所述注视区域的尺寸,确定所述用户进行手势控制时的操作区域的尺寸。According to the size of the gaze area, the size of the operation area when the user performs gesture control is determined.
  10. 根据权利要求1-9任一所述的方法,其特征在于,所述电子设备根据所述第一类型对所述电子设备进行控制,具体包括:The method according to any one of claims 1-9, wherein the electronic device controls the electronic device according to the first type, specifically comprising:
    所述电子设备根据所述第一类型,实时确定所述用户的注视位置,和/或,所述电子设备根据所述第一类型,控制切换所述电子设备上所显示的内容。The electronic device determines the gaze position of the user in real time according to the first type, and/or, the electronic device controls switching of content displayed on the electronic device according to the first type.
  11. 一种与电子设备进行交互的方法,其特征在于,所述方法包括:A method for interacting with an electronic device, characterized in that the method includes:
    所述电子设备获取用户的手势;The electronic device acquires a user's gesture;
    所述电子设备根据获取的所述手势对所述电子设备进行控制;The electronic device controls the electronic device according to the acquired gesture;
    所述电子设备获取所述用户的眼动类型;The electronic device obtains the eye movement type of the user;
    当确定所述用户的眼动类型为第一类型时，所述电子设备根据所述第一类型对所述电子设备进行控制，并限制所述电子设备根据所述用户的手势控制所述电子设备；When it is determined that the user's eye movement type is the first type, the electronic device controls the electronic device according to the first type, and restricts the electronic device from controlling the electronic device according to the user's gesture;
    当确定所述用户的眼动类型为第二类型时,所述电子设备获取所述用户的手势;When determining that the user's eye movement type is the second type, the electronic device acquires the user's gesture;
    所述电子设备根据获取的所述手势对所述电子设备进行控制。The electronic device controls the electronic device according to the acquired gesture.
  12. 根据权利要求11所述的方法,其特征在于,所述电子设备根据所述第一类型对所述电子设备进行控制,并限制所述电子设备获取所述用户的手势之后,所述方法还包括:The method according to claim 11, wherein after the electronic device controls the electronic device according to the first type and restricts the electronic device from acquiring the user's gesture, the method further comprises :
    当确定所述用户的眼动类型由所述第一类型切换至所述第二类型时,所述电子设备指示所述用户进行手势控制;When it is determined that the eye movement type of the user is switched from the first type to the second type, the electronic device instructs the user to perform gesture control;
    所述电子设备获取所述用户的手势;The electronic device obtains the user's gesture;
    所述电子设备根据获取的所述手势对所述电子设备进行控制。The electronic device controls the electronic device according to the acquired gesture.
  13. 根据权利要求11或12所述的方法，其特征在于，所述限制所述电子设备根据所述用户的手势控制所述电子设备，具体包括：所述电子设备不获取所述用户的手势，或者，所述电子设备虽然获取了所述用户的手势但不对所述手势进行处理。The method according to claim 11 or 12, wherein the restricting the electronic device from controlling the electronic device according to the user's gesture specifically comprises: the electronic device does not acquire the user's gesture, or, although the electronic device acquires the user's gesture, it does not process the gesture.
  14. 一种与电子设备进行交互的方法,其特征在于,所述方法包括:A method for interacting with an electronic device, characterized in that the method comprises:
    所述电子设备获取用户针对所述电子设备上第一区域进行操作的第一手势;The electronic device acquires a first gesture performed by the user on the first area on the electronic device;
    所述电子设备根据获取的所述第一手势对所述电子设备进行控制;The electronic device controls the electronic device according to the acquired first gesture;
    所述电子设备获取所述用户的眼动类型;The electronic device obtains the eye movement type of the user;
    当确定所述用户的眼动类型为第一类型时,所述电子设备根据所述第一类型对所述电子设备进行控制,并限制所述电子设备获取所述用户的手势;When it is determined that the user's eye movement type is the first type, the electronic device controls the electronic device according to the first type, and restricts the electronic device from acquiring gestures of the user;
    所述电子设备继续获取所述用户的眼动类型;The electronic device continues to obtain the eye movement type of the user;
    当确定所述用户的眼动类型由所述第一类型切换至第二类型，且所述用户的注视位置由所述电子设备上的所述第一区域切换至第二区域时，所述电子设备指示所述用户进行手势控制；When it is determined that the user's eye movement type switches from the first type to the second type and the user's gaze position switches from the first area on the electronic device to a second area, the electronic device instructs the user to perform gesture control;
    所述电子设备获取所述用户针对所述第二区域进行操作的第二手势；The electronic device acquires a second gesture performed by the user on the second area;
    所述电子设备根据获取的所述第二手势对所述电子设备进行控制。The electronic device controls the electronic device according to the acquired second gesture.
  15. 一种与电子设备进行交互的方法,其特征在于,所述方法包括:A method for interacting with an electronic device, characterized in that the method comprises:
    所述电子设备获取用户的第一眼动信息;The electronic device obtains the user's first eye movement information;
    所述电子设备根据所述第一眼动信息确定第一眼动类型;The electronic device determines a first eye movement type according to the first eye movement information;
    所述电子设备根据所述第一眼动类型对所述电子设备进行控制;The electronic device controls the electronic device according to the first eye movement type;
    所述电子设备获取所述用户的第二眼动信息;The electronic device acquires the second eye movement information of the user;
    所述电子设备根据所述第二眼动信息确定第二眼动类型;The electronic device determines a second eye movement type according to the second eye movement information;
    所述电子设备获取所述用户的手势;The electronic device obtains the user's gesture;
    所述电子设备根据获取的所述手势对所述电子设备进行控制。The electronic device controls the electronic device according to the acquired gesture.
  16. 根据权利要求15所述的方法，其特征在于，所述方法还包括：The method according to claim 15, wherein the method further comprises:
    所述电子设备获取用户的第三眼动信息;The electronic device obtains the user's third eye movement information;
    所述电子设备根据所述第三眼动信息确定所述第一眼动类型;The electronic device determines the first eye movement type according to the third eye movement information;
    所述电子设备根据所述第一眼动类型对所述电子设备进行控制,并限制所述电子设备获取所述用户的手势。The electronic device controls the electronic device according to the first eye movement type, and restricts the electronic device from acquiring gestures of the user.
  17. 一种电子设备,其特征在于,包括:An electronic device, characterized in that it comprises:
    眼动追踪装置,用于获取用户的眼动信息;An eye-tracking device for obtaining user's eye-movement information;
    手势追踪装置,用于获取所述用户的手势;a gesture tracking device, configured to acquire gestures of the user;
    至少一个存储器,用于存储程序;at least one memory for storing programs;
    至少一个处理器,用于执行所述存储器存储的程序,当所述存储器存储的程序被执行时,所述处理器用于:At least one processor configured to execute the program stored in the memory, when the program stored in the memory is executed, the processor is configured to:
    控制所述眼动追踪装置获取所述用户的眼动信息;controlling the eye-tracking device to acquire eye-movement information of the user;
    当由所述眼动信息确定的用户眼睛的眼动类型为第一类型时,根据所述第一类型对所述电子设备进行控制;When the eye movement type of the user's eyes determined by the eye movement information is a first type, control the electronic device according to the first type;
    当由所述眼动信息确定的所述用户眼睛的眼动类型为第二类型时,控制所述电子设备指示所述用户进行手势控制;When the eye movement type of the user's eyes determined by the eye movement information is the second type, controlling the electronic device to instruct the user to perform gesture control;
    控制所述手势追踪装置获取所述用户的手势;controlling the gesture tracking device to acquire gestures of the user;
    根据所述手势追踪装置获取的所述手势对所述电子设备进行控制。The electronic device is controlled according to the gesture acquired by the gesture tracking device.
  18. 根据权利要求17所述的电子设备，其特征在于，所述处理器控制所述电子设备指示所述用户进行手势控制，具体包括：The electronic device according to claim 17, wherein the processor controlling the electronic device to instruct the user to perform gesture control specifically comprises:
    所述处理器根据所述用户的眼动信息,确定所述用户的注视位置;The processor determines the user's gaze position according to the user's eye movement information;
    所述处理器控制所述电子设备在所述注视位置所在的第一预设区域内提示所述用户进行手势控制。The processor controls the electronic device to prompt the user to perform gesture control within a first preset area where the gaze position is located.
  19. 根据权利要求17或18所述的电子设备,其特征在于,所述处理器控制所述电子设备指示所述用户进行手势控制,具体包括:The electronic device according to claim 17 or 18, wherein the processor controls the electronic device to instruct the user to perform gesture control, specifically comprising:
    所述处理器根据所述电子设备上所显示的第一内容的类型,确定目标区域;The processor determines a target area according to the type of the first content displayed on the electronic device;
    所述处理器控制所述电子设备在所述目标区域内提示所述用户进行手势控制,其中,所述目标区域为所述电子设备上未包含所述第一内容的区域。The processor controls the electronic device to prompt the user to perform gesture control in the target area, wherein the target area is an area on the electronic device that does not contain the first content.
  20. 根据权利要求17-19任一所述的电子设备,其特征在于,所述处理器控制所述电子设备指示所述用户进行手势控制,具体包括:The electronic device according to any one of claims 17-19, wherein the processor controls the electronic device to instruct the user to perform gesture control, specifically including:
    所述处理器根据所述用户的眼动信息,获取所述用户的注视位置;The processor obtains the user's gaze position according to the user's eye movement information;
    所述处理器根据所述注视位置,确定所述用户的注视区域;The processor determines the gaze area of the user according to the gaze position;
    当所述注视区域为第二预设区域时,所述处理器控制所述电子设备提示所述用户进行手势控制。When the gaze area is the second preset area, the processor controls the electronic device to prompt the user to perform gesture control.
  21. 根据权利要求17-20任一所述的电子设备,其特征在于,所述处理器控制所述电子设备指示所述用户进行手势控制,包括以下一项或多项:The electronic device according to any one of claims 17-20, wherein the processor controls the electronic device to instruct the user to perform gesture control, including one or more of the following:
    所述处理器控制所述电子设备显示光标，所述处理器控制放大所述电子设备所显示的至少部分内容，或者，所述处理器控制与所述电子设备配套的声音组件播报第一语音，所述第一语音用于提示所述用户进行手势控制。The processor controls the electronic device to display a cursor, the processor controls enlargement of at least part of the content displayed by the electronic device, or the processor controls a sound component matched with the electronic device to broadcast a first voice, where the first voice is used to prompt the user to perform gesture control.
  22. 根据权利要求21所述的电子设备，其特征在于，所述处理器控制所述电子设备指示所述用户进行手势控制包括所述处理器控制所述电子设备显示光标，且所述光标显示在所述电子设备的显示区域中的第一区域内;The electronic device according to claim 21, wherein the processor controlling the electronic device to instruct the user to perform gesture control comprises the processor controlling the electronic device to display a cursor, and the cursor is displayed in a first area of the display area of the electronic device;
    所述处理器还用于:The processor is also used to:
    当由所述眼动信息确定的用户眼睛的眼动类型持续为所述第二类型，且在所述用户的眼睛出现移动且所述用户的注视区域由所述电子设备的显示区域中的所述第一区域切换至第二区域时，控制所述电子设备将所述光标由所述第一区域移动至所述第二区域，其中，所述注视区域由所述用户的眼睛的注视位置确定。When the eye movement type of the user's eyes determined by the eye movement information continues to be the second type, and the user's eyes move and the user's gaze area switches from the first area to a second area in the display area of the electronic device, controlling the electronic device to move the cursor from the first area to the second area, wherein the gaze area is determined by the gaze position of the user's eyes.
  23. 根据权利要求21或22所述的电子设备，其特征在于，所述处理器控制所述电子设备指示所述用户进行手势控制包括所述处理器控制放大所述电子设备所显示的至少部分内容，且放大的内容为所述电子设备的显示区域中第一区域中所显示的内容;The electronic device according to claim 21 or 22, wherein the processor controlling the electronic device to instruct the user to perform gesture control comprises the processor controlling enlargement of at least part of the content displayed by the electronic device, and the enlarged content is the content displayed in a first area of the display area of the electronic device;
    所述处理器还用于:The processor is also used to:
    当由所述眼动信息确定的用户眼睛的眼动类型持续为所述第二类型，且在所述用户的眼睛出现移动且用户的注视区域由所述电子设备的所述第一区域切换至第二区域时，控制所述电子设备将所述第一区域中放大的内容恢复至初始状态，以及放大所述第二区域中的内容。When the eye movement type of the user's eyes determined by the eye movement information continues to be the second type, and the user's eyes move and the user's gaze area switches from the first area of the electronic device to the second area, controlling the electronic device to restore the enlarged content in the first area to its initial state and to enlarge the content in the second area.
  24. 根据权利要求17-23任一所述的电子设备,其特征在于,所述处理器控制所述电子设备指示所述用户进行手势控制之后,所述处理器还用于:The electronic device according to any one of claims 17-23, wherein after the processor controls the electronic device to instruct the user to perform gesture control, the processor is further configured to:
    当由所述眼动信息确定的用户眼睛的眼动类型由所述第二类型切换为所述第一类型时，控制所述电子设备停止指示所述用户进行手势控制，和/或，根据所述第一类型对所述电子设备进行控制，并限制所述手势追踪装置获取所述用户的手势。When the eye movement type of the user's eyes determined by the eye movement information switches from the second type to the first type, controlling the electronic device to stop instructing the user to perform gesture control, and/or controlling the electronic device according to the first type and restricting the gesture tracking device from acquiring the user's gesture.
  25. 根据权利要求17-24任一所述的电子设备,其特征在于,在由所述眼动信息确定的所述用户眼睛的眼动类型为第二类型之后,所述处理器还用于:The electronic device according to any one of claims 17-24, wherein after the eye movement type of the user's eyes determined by the eye movement information is the second type, the processor is further configured to:
    获取所述用户的注视位置;Obtain the gaze position of the user;
    根据所述注视位置,确定所述用户的注视区域的尺寸;determining the size of the user's gaze area according to the gaze position;
    根据所述注视区域的尺寸,确定所述用户进行手势控制时的操作区域的尺寸。According to the size of the gaze area, the size of the operation area when the user performs gesture control is determined.
  26. 根据权利要求17-25任一所述的电子设备,其特征在于,所述处理器根据所述第一类型对所述电子设备进行控制,具体包括:The electronic device according to any one of claims 17-25, wherein the processor controls the electronic device according to the first type, specifically comprising:
    所述处理器根据所述第一类型，实时确定所述用户的注视位置，和/或，所述处理器根据所述第一类型，控制所述电子设备切换所述电子设备上所显示的内容。The processor determines the gaze position of the user in real time according to the first type, and/or, the processor controls the electronic device to switch the content displayed on the electronic device according to the first type.
  27. 一种电子设备,其特征在于,包括:An electronic device, characterized in that it comprises:
    眼动追踪装置,用于获取用户的眼动信息;An eye-tracking device for obtaining user's eye-movement information;
    手势追踪装置,用于获取所述用户的手势;a gesture tracking device, configured to acquire gestures of the user;
    至少一个存储器,用于存储程序;at least one memory for storing programs;
    至少一个处理器,用于执行所述存储器存储的程序,当所述存储器存储的程序被执行时,所述处理器用于:At least one processor configured to execute the program stored in the memory, when the program stored in the memory is executed, the processor is configured to:
    控制所述手势追踪装置获取所述用户的手势；controlling the gesture tracking device to acquire the user's gesture;
    根据所述手势追踪装置获取的所述手势对所述电子设备进行控制;controlling the electronic device according to the gesture acquired by the gesture tracking device;
    当由所述眼动信息确定的用户眼睛的眼动类型为第一类型时,根据所述第一类型对所述电子设备进行控制,并限制根据所述用户的手势控制所述电子设备;When the eye movement type of the user's eyes determined by the eye movement information is the first type, control the electronic device according to the first type, and limit the control of the electronic device according to the user's gesture;
    当由所述眼动信息确定的用户眼睛的眼动类型为第二类型时,根据所述手势追踪装置获取的所述手势对所述电子设备进行控制。When the eye movement type of the user's eyes determined by the eye movement information is the second type, the electronic device is controlled according to the gesture acquired by the gesture tracking device.
  28. 根据权利要求27所述的电子设备，其特征在于，所述处理器在根据所述第一类型对所述电子设备进行控制，并限制根据所述用户的手势控制所述电子设备之后，还用于：The electronic device according to claim 27, wherein after controlling the electronic device according to the first type and restricting control of the electronic device according to the user's gesture, the processor is further configured to:
    当由所述眼动信息确定的用户眼睛的眼动类型由所述第一类型切换至所述第二类型时,控制所述电子设备指示所述用户进行手势控制;When the eye movement type of the user's eyes determined by the eye movement information is switched from the first type to the second type, controlling the electronic device to instruct the user to perform gesture control;
    控制所述手势追踪装置获取所述用户的手势；controlling the gesture tracking device to acquire the user's gesture;
    根据所述手势追踪装置获取的所述手势对所述电子设备进行控制。The electronic device is controlled according to the gesture acquired by the gesture tracking device.
  29. 根据权利要求27或28所述的电子设备,其特征在于,所述处理器限制根据所述用户的手势控制所述电子设备,具体包括:The electronic device according to claim 27 or 28, wherein the processor limits the control of the electronic device according to the user's gesture, specifically comprising:
    所述处理器控制所述手势追踪装置不获取所述用户的手势,或者,所述处理器虽然控制所述手势追踪装置继续获取所述用户的手势但不对所述手势进行处理。The processor controls the gesture tracking device not to acquire the user's gesture, or the processor controls the gesture tracking device to continue acquiring the user's gesture but does not process the gesture.
  30. 一种电子设备,其特征在于,包括:An electronic device, characterized in that it comprises:
    眼动追踪装置,用于获取用户的眼动信息;An eye-tracking device for obtaining user's eye-movement information;
    手势追踪装置,用于获取所述用户的手势;a gesture tracking device, configured to acquire gestures of the user;
    至少一个存储器,用于存储程序;at least one memory for storing programs;
    至少一个处理器,用于执行所述存储器存储的程序,当所述存储器存储的程序被执行时,所述处理器用于:At least one processor configured to execute the program stored in the memory, when the program stored in the memory is executed, the processor is configured to:
    控制所述手势追踪装置获取所述用户的手势;controlling the gesture tracking device to acquire gestures of the user;
    根据所述手势追踪装置获取的所述用户针对所述电子设备上第一区域进行操作的第一手势对所述电子设备进行控制;controlling the electronic device according to the first gesture of the user operating on the first area on the electronic device acquired by the gesture tracking device;
    控制所述眼动追踪装置获取所述用户的眼动信息;controlling the eye-tracking device to acquire eye-movement information of the user;
    当由所述眼动信息确定的用户眼睛的眼动类型为第一类型时,根据所述第一类型对所述电子设备进行控制,并限制所述手势追踪装置获取所述用户的手势;When the eye movement type of the user's eyes determined by the eye movement information is the first type, control the electronic device according to the first type, and restrict the gesture tracking device from acquiring the user's gesture;
    继续控制所述眼动追踪装置获取所述用户的眼动信息;continue to control the eye-tracking device to obtain eye-movement information of the user;
    当由所述眼动信息确定的用户眼睛的眼动类型由所述第一类型切换至第二类型,且所述用户的注视位置由所述电子设备上的所述第一区域切换至第二区域时,控制所述电子设备指示所述用户进行手势控制;When the eye movement type of the user's eyes determined by the eye movement information is switched from the first type to the second type, and the gaze position of the user is switched from the first area on the electronic device to the second area, controlling the electronic device to instruct the user to perform gesture control;
    控制所述手势追踪装置获取所述用户的手势;controlling the gesture tracking device to acquire gestures of the user;
    根据所述手势追踪装置获取的所述用户针对所述第二区域进行操作的第二手势对所述电子设备进行控制。The electronic device is controlled according to the second gesture of the user operating on the second area acquired by the gesture tracking device.
  31. 一种电子设备,其特征在于,包括:An electronic device, characterized in that it comprises:
    眼动追踪装置,用于获取用户的眼动信息;An eye-tracking device for obtaining user's eye-movement information;
    手势追踪装置,用于获取所述用户的手势;a gesture tracking device, configured to acquire gestures of the user;
    至少一个存储器,用于存储程序;at least one memory for storing programs;
    至少一个处理器,用于执行所述存储器存储的程序,当所述存储器存储的程序被执行时,所述处理器用于:At least one processor configured to execute the program stored in the memory, when the program stored in the memory is executed, the processor is configured to:
    控制所述眼动追踪装置获取所述用户的第一眼动信息;controlling the eye-tracking device to acquire first eye-movement information of the user;
    根据所述第一眼动信息确定第一眼动类型;determining a first eye movement type according to the first eye movement information;
    根据所述第一眼动类型对所述电子设备进行控制;controlling the electronic device according to the first eye movement type;
    控制所述眼动追踪装置获取所述用户的第二眼动信息;controlling the eye-tracking device to obtain second eye-movement information of the user;
    根据所述第二眼动信息确定第二眼动类型;determining a second eye movement type according to the second eye movement information;
    控制所述手势追踪装置获取所述用户的手势;controlling the gesture tracking device to acquire gestures of the user;
    根据所述手势追踪装置获取的所述手势对所述电子设备进行控制。The electronic device is controlled according to the gesture acquired by the gesture tracking device.
  32. 根据权利要求31所述的电子设备,其特征在于,所述处理器还用于:The electronic device according to claim 31, wherein the processor is also used for:
    控制所述眼动追踪装置获取所述用户的第三眼动信息;controlling the eye-tracking device to acquire third eye-movement information of the user;
    根据所述第三眼动信息确定所述第一眼动类型;determining the first eye movement type according to the third eye movement information;
    根据所述第一眼动类型对所述电子设备进行控制,并限制所述手势追踪装置获取所述用户的手势。The electronic device is controlled according to the first eye movement type, and the gesture tracking device is restricted from acquiring gestures of the user.
  33. 一种计算机可读存储介质，所述计算机可读存储介质存储有计算机程序，当所述计算机程序在电子设备上运行时，使得所述电子设备执行如权利要求1-16任一所述的方法。A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, and when the computer program is run on an electronic device, the electronic device is caused to execute the method according to any one of claims 1-16.
  34. 一种计算机程序产品,其特征在于,当所述计算机程序产品在电子设备上运行时,使得所述电子设备执行如权利要求1-16任一所述的方法。A computer program product, characterized in that, when the computer program product is run on an electronic device, the electronic device is made to execute the method according to any one of claims 1-16.
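The modality-switching flow recited in claims 1 and 8 — dispatching on the detected eye movement type, prompting for gesture control only while the second type is detected, and withdrawing the prompt (and ignoring gestures) when the type switches back to the first type — can be sketched as follows. This is an illustrative sketch only, not part of the claims; the type names `FIRST_TYPE`/`SECOND_TYPE` and the `Device` class are assumptions introduced for illustration, standing in for real eye-tracking and gesture-tracking hardware.

```python
# Illustrative sketch of the dispatch described in claims 1 and 8.
# The concrete eye movement type values are hypothetical; the claims
# only distinguish a "first type" from a "second type".

FIRST_TYPE = "saccade"    # assumed first type: eyes moving, eye control active
SECOND_TYPE = "fixation"  # assumed second type: eyes dwelling, gestures invited

class Device:
    def __init__(self):
        self.prompting_gesture = False  # whether the user is prompted to gesture
        self.actions = []               # control actions taken, for inspection

    def on_eye_movement(self, eye_type: str) -> None:
        if eye_type == FIRST_TYPE:
            # Claim 8: on switching back to the first type, stop prompting
            # and restrict gesture control; control the device by eye movement.
            self.prompting_gesture = False
            self.actions.append("eye-control")
        elif eye_type == SECOND_TYPE:
            # Claim 1: on the second type, instruct the user to gesture.
            self.prompting_gesture = True
            self.actions.append("prompt-gesture")

    def on_gesture(self, gesture: str) -> None:
        # Gestures control the device only while the prompt is active.
        if self.prompting_gesture:
            self.actions.append(f"gesture:{gesture}")

device = Device()
device.on_eye_movement(SECOND_TYPE)  # fixation: user is prompted to gesture
device.on_gesture("pinch")           # gesture accepted and acted on
device.on_eye_movement(FIRST_TYPE)   # saccade: back to eye control
device.on_gesture("pinch")           # gesture ignored (claim 8 restriction)
print(device.actions)
```

Running the sketch prints `['prompt-gesture', 'gesture:pinch', 'eye-control']`: the second pinch is dropped because the device has reverted to eye control, mirroring the "limit the electronic device from acquiring the user's gesture" step.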
PCT/CN2022/125889 2021-11-10 2022-10-18 Method for interacting with electronic device, and electronic device WO2023082952A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111329181.9 2021-11-10
CN202111329181.9A CN116107419A (en) 2021-11-10 2021-11-10 Method for interacting with electronic equipment and electronic equipment

Publications (1)

Publication Number Publication Date
WO2023082952A1 true WO2023082952A1 (en) 2023-05-19

Family

ID=86254918

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/125889 WO2023082952A1 (en) 2021-11-10 2022-10-18 Method for interacting with electronic device, and electronic device

Country Status (2)

Country Link
CN (1) CN116107419A (en)
WO (1) WO2023082952A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116909407B (en) * 2023-09-12 2024-01-12 深圳康荣电子有限公司 Touch display screen panoramic interaction method and control system based on virtual reality

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015133889A1 (en) * 2014-03-07 2015-09-11 -Mimos Berhad Method and apparatus to combine ocular control with motion control for human computer interaction
CN109814722A (en) * 2019-02-25 2019-05-28 苏州长风航空电子有限公司 A kind of multi-modal man-machine interactive system and exchange method
US20190265802A1 (en) * 2013-06-20 2019-08-29 Uday Parshionikar Gesture based user interfaces, apparatuses and control systems
CN110297540A (en) * 2019-06-12 2019-10-01 浩博泰德(北京)科技有限公司 A kind of human-computer interaction device and man-machine interaction method
CN110456911A (en) * 2019-08-09 2019-11-15 Oppo广东移动通信有限公司 Electronic equipment control method and device, electronic equipment and readable storage medium
CN111324202A (en) * 2020-02-19 2020-06-23 中国第一汽车股份有限公司 Interaction method, device, equipment and storage medium
US20200382717A1 (en) * 2019-05-28 2020-12-03 Ganzin Technology, Inc. Eye-tracking module with scenario-based mode switching function
CN112286358A (en) * 2020-11-02 2021-01-29 恒大新能源汽车投资控股集团有限公司 Screen operation method and device, electronic equipment and computer-readable storage medium
CN113534950A (en) * 2021-03-30 2021-10-22 北京航空航天大学 Virtual object interaction method based on mixed reality technology

Also Published As

Publication number Publication date
CN116107419A (en) 2023-05-12

Similar Documents

Publication Publication Date Title
US11797146B2 (en) Systems, methods, and graphical user interfaces for annotating, measuring, and modeling environments
US11231777B2 (en) Method for controlling device on the basis of eyeball motion, and device therefor
US20210263593A1 (en) Hand gesture input for wearable system
US11539876B2 (en) User interfaces for altering visual media
US20230386146A1 (en) Systems, Methods, and Graphical User Interfaces for Displaying and Manipulating Virtual Objects in Augmented Reality Environments
US10048763B2 (en) Distance scalable no touch computing
US9268404B2 (en) Application gesture interpretation
KR20220040493A (en) Devices, methods, and graphical user interfaces for interacting with three-dimensional environments
US9507417B2 (en) Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9298266B2 (en) Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
CN117032519A (en) Apparatus, method and graphical user interface for interacting with a three-dimensional environment
US11778339B2 (en) User interfaces for altering visual media
KR101919009B1 (en) Method for controlling using eye action and device thereof
US20170068416A1 (en) Systems And Methods for Gesture Input
DK201770387A1 (en) User interface for a flashlight mode on an electronic device
JP2010145861A (en) Head mount display
JP2010145860A (en) Head mount display
CN118159935A (en) Apparatus, method and graphical user interface for content applications
WO2023082952A1 (en) Method for interacting with electronic device, and electronic device
US12032754B2 (en) Information processing apparatus, information processing method, and non-transitory computer readable medium
Nguyen et al. Direct manipulation video navigation on touch screens
CN118215903A (en) Apparatus, method, and graphical user interface for rendering virtual objects in a virtual environment
Schmieder et al. Thumbs up: 3D gesture input on mobile phones using the front facing camera
US20240211025A1 (en) Control device and control method
US12101567B2 (en) User interfaces for altering visual media

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22891749

Country of ref document: EP

Kind code of ref document: A1