CN115480639A - Human-computer interaction system, human-computer interaction method, wearable device and head display device - Google Patents


Info

Publication number
CN115480639A
CN115480639A (application CN202211146467.8A)
Authority
CN
China
Prior art keywords
target
control instruction
head display
virtual
virtual scene
Prior art date
Legal status
Pending
Application number
CN202211146467.8A
Other languages
Chinese (zh)
Inventor
杨天翼
尹子硕
陈昊芝
Current Assignee
Beijing Positive Negative Infinite Technology Co ltd
Original Assignee
Beijing Positive Negative Infinite Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Positive Negative Infinite Technology Co ltd
Priority to CN202211146467.8A
Publication of CN115480639A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 — Eye tracking input arrangements
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 — Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application provides a human-computer interaction system, a human-computer interaction method, a wearable device and a head display device, and relates to the technical field of human-computer interaction. The human-computer interaction system comprises: a head display device, which is used for acquiring a virtual scene interface and eye activity information of a target object in real time and controlling a virtual cursor to point to a target element based on the eye activity information; and a wearable device, which is used for responding to a target trigger operation detected in an operation interface, and generating and sending a target control instruction to the head display device displaying the virtual scene interface. The head display device is further used for receiving and responding to the target control instruction sent by the wearable device and adjusting the target element to a target state. According to the embodiment of the application, the target element can be adjusted by a simple trigger operation performed on the wearable device; the operation mode is simple, the hand movement of the target object is small, fatigue is avoided even during long-time operation, and the operation experience of the target object is enhanced.

Description

Human-computer interaction system, human-computer interaction method, wearable device and head display device
Technical Field
The application relates to the technical field of human-computer interaction, in particular to a human-computer interaction system, a human-computer interaction method, wearable equipment and head display equipment.
Background
VR (Virtual Reality)/AR (Augmented Reality) technology applies virtual information to the real world, so that the real environment and virtual objects are superimposed in the same picture or space in real time and coexist.
A VR/AR head display device is an application of VR/AR technology. When a target object interacts with a target element in a scene interface displayed by the head display device, a cursor movement operation and a trigger operation need to be carried out. Existing solutions implement the cursor movement operation and the trigger operation in the following two ways. The first way uses a handheld device: a ray is emitted from the front of the handheld device, the point where the ray meets the virtual interface is the cursor position, rotating the handheld device moves the cursor, and the trigger operation is then performed with a button or touch pad on the head display device. The second way performs the cursor movement operation through head rotation or eye movement tracking, and then performs the trigger operation with a button or touch pad on the head display device.
Although both of the above ways can satisfy the requirement of interaction between the target object and the target element, the first way requires the hand to hold the device for a long time even when the target object is not interacting with any target element; it is not suitable for long-time operation and obstructs daily activities of the hand. In the second way, the hand is far from the head display device, the operation is inconvenient, and raising the hand for a long time easily causes fatigue.
Disclosure of Invention
Embodiments of the present application provide a human-computer interaction system, a human-computer interaction method, a wearable device, a head display device, an electronic device, a computer-readable storage medium, and a computer program product, which can solve at least one problem in the background art.
According to a first aspect of embodiments of the present application, there is provided a human-computer interaction system, including: a head display device and a wearable device arranged on a limb;
the head display equipment collects a virtual scene interface and eye activity information of a target object in real time, and controls a virtual cursor to point to a target element in at least one interactive element in the virtual scene interface based on the eye activity information;
the wearable device responds to a target trigger operation detected in an operation interface, generates a target control instruction and sends the target control instruction to the head display device, wherein the target control instruction is used for adjusting a target element pointed to by a virtual cursor in a virtual scene interface to a target state;
and the head display equipment receives and responds to a target control instruction sent by the wearable equipment, and adjusts the target element to be in a target state.
According to a second aspect of embodiments of the present application, there is provided a human-computer interaction method performed by a wearable device provided to a limb, the method including:
responding to a target trigger operation detected in an operation interface, and generating a target control instruction, wherein the target control instruction is used for adjusting a target element pointed by a virtual cursor in a virtual scene interface into a target state;
and sending the target control instruction to a head display device displaying the virtual scene interface, so that the head display device adjusts the target element to the target state.
According to a third aspect of the embodiments of the present application, there is provided a human-computer interaction method, performed by a head display device, the method including:
acquiring a virtual scene interface and eye activity information of a target object in real time;
controlling a virtual cursor to point to a target element in at least one interactive element in the virtual scene interface based on the eye activity information;
and receiving and responding to a target control instruction sent by the wearable equipment, and adjusting the target element to be in a target state.
According to a fourth aspect of embodiments of the present application, there is provided a wearable device provided to a limb, the wearable device including:
the target control instruction generation module is used for responding to a target trigger operation detected in the operation interface and generating a target control instruction, and the target control instruction is used for adjusting a target element pointed by a virtual cursor in the virtual scene interface into a target state;
and the target control instruction sending module is used for sending the target control instruction to the head display device displaying the virtual scene interface, so that the head display device adjusts the target element to the target state.
According to a fifth aspect of embodiments of the present application, there is provided a head display apparatus, including:
the acquisition module is used for acquiring the virtual scene interface and the eye activity information of the target object in real time;
the control module is used for controlling the virtual cursor to point to a target element in at least one interactive element in the virtual scene interface based on the eye activity information;
and the adjusting module is used for receiving and responding to the target control instruction sent by the wearable equipment and adjusting the target element to be in the target state.
According to a sixth aspect of embodiments of the present application, there is provided an electronic device comprising a memory, a processor and a computer program stored on the memory, the processor implementing the steps of the method as provided in the second and third aspects when executing the program.
According to a seventh aspect of embodiments of the present application, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method as provided in the second and third aspects.
According to an eighth aspect of embodiments of the present application, there is provided a computer program product comprising computer instructions stored in a computer-readable storage medium, which, when read by a processor of a computer device from the computer-readable storage medium, cause the processor to execute the computer instructions, so that the computer device performs the steps of implementing the method as provided in the second and third aspects.
The technical solutions provided by the embodiments of the application have the following beneficial effects: the movement of the virtual cursor is controlled through the eye movement information of the target object; when the virtual cursor points to the target element, a target trigger operation is performed in the operation interface of the wearable device arranged on the limb, and the wearable device generates a target control instruction corresponding to the target trigger operation, the target control instruction being used for adjusting the target element pointed to by the virtual cursor in the virtual scene interface displayed by the head display device to a target state. The target object can thus adjust the state of the target element through a simple trigger operation on the wearable device; the operation mode is simple, the learning cost is low, it is easy to get started, the hand movement of the target object is small, fatigue is avoided even during long-time operation, and the operation experience of the target object is enhanced.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments of the present application will be briefly described below.
Fig. 1 is a schematic structural diagram of a human-computer interaction system according to an embodiment of the present disclosure;
fig. 2 is a schematic view illustrating an interaction flow between a head display device and a wearable device according to an embodiment of the present application;
fig. 3a is a schematic diagram illustrating a current state of a target element pointed by a virtual cursor in a virtual scene interface according to an embodiment of the present disclosure;
Fig. 3b is a schematic diagram of a triggering operation performed in a wearable device smart watch according to an embodiment of the present application;
fig. 3c is a schematic diagram illustrating a target element pointed by a virtual cursor in a virtual scene interface according to the present application after being adjusted to a target state;
fig. 4a is a schematic diagram of a menu in a folded state in a virtual scene interface according to an embodiment of the present application;
Fig. 4b is a schematic diagram of a target object performing a sliding-up type triggering operation in a wearable device according to an embodiment of the present application;
fig. 4c is a schematic diagram illustrating a target element pointed by a virtual cursor in a virtual scene interface being adjusted to a target state according to an embodiment of the present application;
fig. 5 is a schematic flowchart of a human-computer interaction method according to an embodiment of the present disclosure;
fig. 6 is a schematic flowchart of another human-computer interaction method according to an embodiment of the present disclosure;
fig. 7 is a schematic diagram of interaction between a head display device and a smart watch according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a head display device according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a wearable device placed on a limb according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below in conjunction with the drawings in the present application. It should be understood that the embodiments set forth below in connection with the drawings are exemplary descriptions for explaining technical solutions of the embodiments of the present application, and do not limit the technical solutions of the embodiments of the present application.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, information, data, steps, operations, elements, and/or components, but do not preclude the presence or addition of other features, information, data, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. The term "and/or" as used herein indicates at least one of the items defined by the term, e.g., "a and/or B" may be implemented as "a", or as "B", or as "a and B".
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Several terms referred to in this application will first be introduced and explained:
Head-Mounted Display (HMD), namely a head display device: by sending optical signals to the eyes, various head display devices can realize different effects such as virtual reality, augmented reality and mixed reality. Head display devices include VR all-in-one machines, VR split machines, VR boxes, VR glasses or smart glasses, personal mobile cinema devices and other related devices. A head display device can give the experiencer an immersive feeling when viewing the virtual scene interface.
Human-Computer Interaction (HCI): the process of information exchange between a person and a computer, in which the person and the computer use a certain dialogue language and a certain interaction mode to complete a certain task. In the embodiments of the present application, it refers to the process of interaction between the target object and the head display device.
The technical solutions of the embodiments of the present application and the technical effects produced by them will be described below through several exemplary embodiments. It should be noted that the following embodiments may refer to or be combined with each other, and descriptions of the same terms, similar features, similar implementation steps, etc. are not repeated in different embodiments.
Fig. 1 is a schematic structural diagram of a human-computer interaction system provided in an embodiment of the present application, including a wearable device 101 disposed on a limb and a head display device 102. The wearable device 101 may be any terminal device that can be worn on a limb, such as a smart watch, a smart ring, a smart wristband, a smart glove, a mobile phone, or a tablet computer. The head display device 102 may be at least one of a VR head display device, an AR head display device, and an MR head display device. A communication connection is established between the wearable device 101 and the head display device 102; a wireless communication connection can be established between the two devices through Bluetooth.
The head display device 102 acquires the virtual scene interface and the eye activity information of the target object in real time, and controls the virtual cursor to point to a target element in at least one interactive element in the virtual scene interface based on the eye activity information;
the wearable device 101 detects a target trigger operation in the operation interface and generates a target control instruction in response to the target trigger operation, where the target control instruction is used to adjust a target element pointed to by a virtual cursor in the virtual scene interface to a target state; the wearable device 101 then sends the target control instruction to the head display device displaying the virtual scene interface, so that the head display device adjusts the target element to the target state.
The head display device 102 receives and responds to the target control instruction to adjust the target element to be in the target state.
The embodiment of the application provides a human-computer interaction system, which comprises: a head display device and a wearable device arranged on a limb. Fig. 2 is a schematic diagram of an interaction flow between the head display device and the wearable device provided in an embodiment of the present application, including:
step S201, a head display device collects a virtual scene interface and eye activity information of a target object in real time;
step S202, the head display device controls a virtual cursor to point to a target element in at least one interactive element in a virtual scene interface based on the eye activity information;
step S203, the wearable device generates a target control instruction in response to the target trigger operation detected in the operation interface, and sends the target control instruction to the head display device, where the target control instruction is used to adjust a target element pointed by a virtual cursor in the virtual scene interface to a target state.
And step S204, the head display equipment receives and responds to the target control instruction sent by the wearable equipment, and the target element is adjusted to be in a target state.
The embodiment of the application relates to interaction between a head display device and a wearable device arranged on a limb. After the target object wears the head display device, the target object can view the virtual scene interface displayed in the head display device; the virtual scene interface can be any type of interactive interface, such as a game interface.
When a target object views a virtual scene interface, the head display device controls the virtual cursor to move in the virtual scene interface based on the eye activity information of the target object, namely, the position of the virtual cursor changes along with the change of the eye activity information.
The head display equipment comprises an eye movement tracking sensor, wherein the eye movement tracking sensor is used for detecting eye movement information of a target object in real time, and the eye movement information is the state of eye movement.
The virtual scene interface of the embodiment of the application includes at least one interactable element, which may be any element of the virtual interface that can be interacted with, for example an application icon, a virtual button, or a window.
When the target object interacts with the target element in the virtual scene interface, the virtual cursor can be controlled to point to the target element through the eye activity information, the target element is triggered through the wearable device, and after the target element is triggered, the state of the target element can be changed to be the target state.
The wearable device of the embodiment of the application can be arranged on the limb of the target object, for example, on the wrist, and the wearable device comprises at least one of a smart watch, a smart ring, a smart wristband, a smart glove, a mobile phone and a tablet computer, and specifically, for example, the smart watch is worn on the wrist of the target object.
The wearable device is arranged on a limb of the target object, so the hand movement of the target object is small, and the target object does not become fatigued even after long-time operation.
The wearable device is provided with a target application program, a target object can perform triggering operation on an operation interface of the target application program, a touch sensor is arranged in the wearable device, the target triggering operation performed on the operation interface by the target object can be detected, the target triggering operation comprises at least one of clicking, double clicking and sliding towards a preset direction, and the preset direction can be any direction, such as an upward direction, a downward direction and the like.
It is to be noted that when the virtual cursor does not point to an interactable element, a trigger operation performed by the target object in the target application is an invalid trigger operation; when the virtual cursor points to an interactable element, the trigger operation performed by the target object in the target application is a valid trigger operation. The target trigger operation in the embodiment of the present application refers to a valid trigger operation.
Each type of target trigger operation in the embodiment of the application has a corresponding control instruction, the wearable device generates the target control instruction corresponding to the target trigger operation after detecting the target trigger operation, and the target control instruction is used for adjusting a target element pointed by a virtual cursor in a virtual scene interface to a target state.
The wearable device sends the target control instruction to the head display device displaying the virtual scene interface after generating the target control instruction, and the head display device responds to the target control instruction after receiving the target control instruction and adjusts the target element to be in the target state.
Specifically, as shown in fig. 3a, a schematic diagram exemplarily showing a current state of a target element pointed by a virtual cursor in a virtual scene interface is provided, assuming that the target element is a virtual control and is in an OFF state;
as shown in fig. 3b, which exemplarily shows a schematic diagram of performing a trigger operation in a wearable device smart watch, after a target application is installed in the smart watch, a target object may directly perform a target trigger operation on the smart watch, such as performing a click operation in the smart watch, and after the wearable device detects the click operation, the wearable device may respond to the click operation to generate a corresponding target control instruction and send the target control instruction to a head display device;
as shown in fig. 3c, after receiving the target control instruction, the head display device responds to the target control instruction and adjusts the target element in the virtual scene interface to the target state, here the ON state.
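To make the flow just described concrete, the following is a minimal sketch of the interaction of figs. 3a to 3c. The class and method names, and the in-memory call standing in for the Bluetooth link, are illustrative assumptions rather than details taken from the patent.

# Illustrative sketch only: gaze keeps the cursor on an element, a click on the
# wearable device becomes a control instruction, and the head display device
# applies that instruction to the element the cursor points at (OFF -> ON).

class HeadDisplayDevice:
    def __init__(self, elements):
        self.elements = elements          # element name -> current state
        self.pointed_element = None       # element the virtual cursor points at

    def update_cursor(self, gaze_element):
        # In the real system the sight line falling point is resolved to an element;
        # here the element under the gaze is passed in directly.
        self.pointed_element = gaze_element

    def apply_instruction(self, target_state):
        if self.pointed_element is not None:
            self.elements[self.pointed_element] = target_state


class WearableDevice:
    def __init__(self, head_display):
        self.head_display = head_display  # stands in for the Bluetooth connection

    def on_target_trigger(self, operation_type):
        # A click on the watch is turned into a target control instruction.
        if operation_type == "click":
            self.head_display.apply_instruction("ON")


hmd = HeadDisplayDevice({"virtual_switch": "OFF"})
watch = WearableDevice(hmd)
hmd.update_cursor("virtual_switch")       # eye tracking points the cursor at the switch
watch.on_target_trigger("click")          # target trigger operation on the watch
print(hmd.elements["virtual_switch"])     # -> "ON", as in the fig. 3 example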
According to the embodiment of the application, the movement of the virtual cursor is controlled through the eye movement information of the target object. When the virtual cursor points to the target element, a target trigger operation is performed in the operation interface of the wearable device arranged on the limb, and the wearable device generates the target control instruction corresponding to the target trigger operation, the target control instruction being used for adjusting the target element pointed to by the virtual cursor in the virtual scene interface displayed by the head display device to the target state. The target object can adjust the state of the target element through a simple trigger operation on the wearable device; the operation mode is simple, the learning cost is low, it is easy to get started, the hand movement of the target object is small, fatigue is avoided even during long-time operation, and the operation experience of the target object is enhanced.
The embodiment of the present application provides a possible implementation manner, where controlling a virtual cursor to point to a target element in at least one interactive element in a scene interface based on eye activity information includes:
determining a sight line falling point of the eyes of the target object in the virtual scene interface based on the eye activity information, and taking an interactive element where the sight line falling point is located as a target element pointed by the virtual cursor.
In the embodiment of the application, the sight line falling point of the eyes of the target object in the virtual scene interface is determined based on the eye activity information; the position of the sight line falling point is the position of the virtual cursor, and the interactable element at the sight line falling point is the target element pointed to by the virtual cursor.
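A minimal hit-test sketch of this step is given below. The element fields and the rectangle test are assumptions for illustration only; the patent states only that the element at the sight line falling point becomes the target element.

from dataclasses import dataclass

@dataclass
class InteractableElement:
    name: str
    x: float          # top-left corner in virtual-interface coordinates
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.width and self.y <= py <= self.y + self.height

def target_element_at(gaze_point, elements):
    """Return the interactable element under the sight line falling point, or None."""
    px, py = gaze_point               # the falling point is also the virtual cursor position
    for element in elements:
        if element.contains(px, py):
            return element
    return None

# Example: the gaze falls inside the button, so the button is the target element.
button = InteractableElement("virtual_button", 100, 100, 80, 40)
print(target_element_at((120, 110), [button]).name)   # -> "virtual_button"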
The embodiment of the present application provides a possible implementation manner, and generating a target control instruction in response to a target trigger operation detected in an operation interface includes:
determining the type of a target trigger operation; the type is at least one of single click, double click and sliding towards a preset direction;
and determining a target control instruction corresponding to the type of the target trigger operation according to the pre-established corresponding relation between the type of the trigger operation and the control instruction.
When the target application program in the wearable device is opened, the target object can perform a target trigger operation on the operation interface of the target application program. A touch sensor is provided on the screen of the wearable device; it can detect the target trigger operation performed by the target object and determine the type of the target trigger operation, the target trigger operation being any one of a click, a double click, and a slide in a preset direction.
Each type of trigger operation in the embodiment of the application has a corresponding control instruction, the embodiment of the application can establish a corresponding relationship between the type of the trigger operation and the control instruction in advance, and after the type of the target trigger operation is determined, the target control instruction corresponding to the type of the target trigger operation is determined according to the corresponding relationship between the type of the trigger operation and the control instruction.
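A sketch of such a pre-established correspondence is shown below. The concrete instruction names are assumptions; the patent only requires that each trigger-operation type map to a control instruction.

# Hypothetical correspondence between trigger-operation types and control instructions.
OPERATION_TO_INSTRUCTION = {
    "click": "TOGGLE",          # e.g. switch a virtual button between ON and OFF
    "double_click": "CONFIRM",
    "slide_up": "EXPAND",       # e.g. unfold a menu, as in the fig. 4 example
    "slide_down": "COLLAPSE",
}

def target_control_instruction(operation_type):
    # Operations outside the pre-established correspondence yield no instruction.
    return OPERATION_TO_INSTRUCTION.get(operation_type)

print(target_control_instruction("slide_up"))   # -> "EXPAND"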
The embodiment of the application provides a possible implementation mode, wherein an operation interface is divided into a plurality of grids;
determining the type of the target trigger operation, including:
acquiring touch data corresponding to target trigger operation; the touch data comprise touch times in a preset time period and Boolean values of grids in the operation interface;
for any grid, if the Boolean value of the grid is determined to be the target Boolean value, determining the grid to be the target grid, and recording the coordinates of the target grid;
and determining the type of the target trigger operation according to the touch times and the coordinates of each target grid.
The screen of the wearable device in the embodiment of the application may be divided into a plurality of grids, for example 64 × 64 grids, each grid having corresponding coordinates. The operation interface is displayed on the screen, so the operation interface is also divided into a plurality of grids. When the target object performs the target trigger operation on the operation interface, the touch sensor detects the touch data corresponding to the target trigger operation, where the touch data includes the number of touches within a preset time period and the Boolean value of each grid in the operation interface.
For any grid, if its Boolean value is true, the grid belongs to the position of the target trigger operation and is a target grid, and its coordinates are recorded; if its Boolean value is false, the grid does not belong to the position of the target trigger operation.
After the coordinates of the target grid are recorded, whether the target triggering operation is single-click (or double-click) or sliding can be judged according to an image or track formed by the coordinates of the target grid.
The touch data also comprises data such as touch times in preset time, and whether the target trigger operation is single-click or double-click can be determined according to the touch times.
According to the embodiment of the application, the type of the target trigger operation performed by the target object can be determined through the touch data, and the target trigger operation can be judged to belong to any one of single click, double click or sliding through the touch data.
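The following sketch shows one way the type could be decided from the touch data. The span threshold and grid layout are assumptions; the patent states only that the touch count distinguishes a click from a double click and that the track formed by the target-grid coordinates distinguishes a tap from a slide.

def classify_trigger(touch_count, grid_booleans, slide_span=8):
    """grid_booleans maps (row, col) grid coordinates (e.g. on a 64 x 64 grid)
    to the Boolean value reported for that grid in the preset time period."""
    # Grids whose Boolean value is true are the target grids touched by the operation.
    target_grids = [coord for coord, touched in grid_booleans.items() if touched]
    if not target_grids:
        return None                                  # no valid touch in this period

    rows = [r for r, _ in target_grids]
    cols = [c for _, c in target_grids]
    span = max(max(rows) - min(rows), max(cols) - min(cols))

    # A long track of target grids is treated as a slide; the slide direction would
    # be read from the ordering of the recorded coordinates (omitted here).
    if span > slide_span:
        return "slide"

    # The touch stayed in one small area: the touch count separates click from double click.
    return "double_click" if touch_count >= 2 else "click"

# Example: two touches confined to neighbouring grids -> a double click.
print(classify_trigger(2, {(10, 10): True, (10, 11): True}))   # -> "double_click"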
The embodiment of the present application provides a possible implementation manner, receiving and responding to a target control instruction, and adjusting a target element to a target state, including:
and determining a target state corresponding to the target control instruction according to the pre-established association relationship between the state of the interactive element and the control instruction, and adjusting the target element to be the target state.
The types of trigger operation that the target object can perform on the operation interface of the wearable device are limited, namely a click, a double click, and a slide in a preset direction, whereas the number of kinds of interactable elements is larger than the number of types of trigger operation. To avoid confusion, an association between the states of the interactable elements and the control instructions corresponding to the trigger operations is established in advance, so that when different interactable elements receive the control instruction corresponding to the same trigger operation, their corresponding target states are different.
Specifically, for example, the virtual scene interface includes an element a and an element B, both of which belong to interactable elements.
When the virtual cursor points to the element A, the element A is the target element. The element A is a virtual button whose current state is the closed state. When the target object performs a click-type trigger operation on the operation interface of the wearable device, the wearable device, on detecting the click-type trigger operation, generates and sends a target control instruction to the head display device, and after receiving the target control instruction the head display device adjusts the state of the element A to the open state.
Continuing the above example, when the virtual cursor points to the element B, the element B is a target element, the element B is a menu, the current state of the element B is a folded state, and when the target object performs a single-click type trigger operation on an operation interface of the wearable device, the wearable device generates and sends a target control instruction to the head display device when detecting the single-click type trigger operation, and the head display device adjusts the state of the element B to an expanded state after receiving the target control instruction.
According to the embodiment of the application, the target state corresponding to the target control instruction can be determined according to the pre-established association relationship between the state of the interactive element and the control instruction, and the target element is adjusted to be the target state.
In the embodiment of the application, the interactable elements include a menu; when the virtual cursor points to it, the menu is the target element. In the virtual scene interface, the menu is generally in a folded state, as shown in fig. 4a, which exemplarily shows the menu in the folded state in the virtual scene interface. As shown in fig. 4b, the target trigger operation performed by the target object in the wearable device is an upward slide, and the wearable device generates a target control instruction corresponding to the upward slide and sends it to the head display device. After receiving the target control instruction, the head display device adjusts the menu in the virtual scene interface to an expanded state, as shown in fig. 4c, which exemplarily shows the menu in the expanded state in the virtual scene interface.
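A sketch of such a pre-established association is given below. The element kinds, states, and instruction names follow the examples of elements A and B and the fig. 4 menu, and are illustrative assumptions only.

# (element kind, current state, control instruction) -> target state
STATE_ASSOCIATION = {
    ("virtual_button", "OFF", "CLICK"): "ON",        # element A: a click turns the button on
    ("virtual_button", "ON", "CLICK"): "OFF",
    ("menu", "folded", "CLICK"): "expanded",         # element B: the same click expands the menu
    ("menu", "folded", "SLIDE_UP"): "expanded",      # the fig. 4 example: an upward slide expands it
}

def target_state_for(element_kind, current_state, instruction):
    # Unknown combinations leave the element in its current state.
    return STATE_ASSOCIATION.get((element_kind, current_state, instruction), current_state)

print(target_state_for("virtual_button", "OFF", "CLICK"))   # -> "ON"
print(target_state_for("menu", "folded", "CLICK"))          # -> "expanded"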
The embodiment of the application provides a possible implementation manner, and the wearable device includes at least one of a smart watch, a smart ring, a smart wristband, a smart glove, a mobile phone, and a tablet computer, and of course, other terminals or devices may also be used, which is not limited in the embodiment of the application.
The embodiment of the application provides a human-computer interaction method, which is executed by a wearable device arranged on a limb, and as shown in fig. 5, the method comprises the following steps:
step S501, responding to a target trigger operation detected in an operation interface, and generating a target control instruction, wherein the control instruction is used for adjusting a target element pointed by a virtual cursor in a virtual scene interface to a target state;
step S502, sending the control instruction to the head display equipment for displaying the virtual scene interface, so that the head display equipment adjusts the target element to be in the target state.
After the head display device collects the eye activity information of the virtual scene interface and the target object, the virtual cursor is controlled to point to a target element in at least one interactive element in the virtual scene interface based on the eye activity information, namely, the virtual cursor is controlled to move through eye activity, and the interactive element pointed by the virtual cursor is the target element.
In the embodiment of the application, after it is determined that the virtual cursor points to the target element, a target trigger operation may be performed on the wearable device. The wearable device generates a target control instruction in response to the target trigger operation detected in the operation interface and sends the target control instruction to the head display device, where the target control instruction is used to adjust the target element pointed to by the virtual cursor in the virtual scene interface to the target state. After receiving the target control instruction, the head display device responds to it and adjusts the target element to the target state.
The embodiment of the application provides a human-computer interaction method, which is executed by a head display device, and as shown in fig. 6, the method includes:
step S601, collecting a virtual scene interface and eye activity information of a target object in real time;
step S602, based on the eye activity information, controlling the virtual cursor to point to a target element in at least one interactive element in the virtual scene interface;
step S603, receiving and responding to the target control instruction sent by the wearable device, and adjusting the target element to be in the target state.
After the target object wears the head display device, the virtual scene interface displayed in the head display device can be viewed, and the virtual scene interface can be any type of interactive interface, such as a game interface.
When a target object views a virtual scene interface, the head display device controls the virtual cursor to move in the virtual scene interface based on the eye activity information of the target object, namely, the position of the virtual cursor changes along with the change of the eye activity information.
The virtual scene interface of the embodiment of the application includes at least one interactable element, which may be any element of the virtual interface that can be interacted with, for example an application icon, a virtual button, or a window.
the head display equipment comprises an eye movement tracking sensor, wherein the eye movement tracking sensor is used for detecting eye movement information of a target object in real time, and the eye movement information is the state of eye movement.
Each type of target trigger operation in the embodiment of the application has a corresponding control instruction, the wearable device generates a target control instruction corresponding to the target trigger operation after detecting the target trigger operation, the target control instruction is used for adjusting a target element pointed by a virtual cursor in a virtual scene interface to a target state, and the head display device responds to the target control instruction after receiving the target control instruction and adjusts the target element to the target state.
The embodiment of the application controls the movement of the virtual cursor through the eye movement information of the target object. When the virtual cursor points to the target element, a target trigger operation is performed in the operation interface of the wearable device arranged on the limb, and the wearable device generates the target control instruction corresponding to the target trigger operation, so that the target object can adjust the state of the target element through a simple trigger operation on the wearable device; the operation mode is simple, the learning cost is low, the operation is easy, the hand movement of the target object is small, long-time operation can be performed without fatigue, and the operation experience of the target object is enhanced.
Fig. 7 exemplarily shows an interaction diagram between a head display device and a smart watch provided in an embodiment of the present application; a communication connection is established between the head display device and the smart watch through Bluetooth.
The head display equipment comprises an eye movement tracking sensor, can monitor the state of eye movement of a target object in real time through the eye movement tracking sensor, determines a sight line falling point of the target object according to the state of the eye movement, controls the movement of a virtual cursor in a virtual scene interface according to the sight line falling point, and takes an interactive element where the sight line falling point is located as a target element pointed by the virtual cursor.
The smart watch is provided with a target application program, and after the target application program is opened, the target application program can monitor target trigger operation of a target object on an operation interface of the target application program in real time, generate a target control instruction corresponding to the target trigger operation and send the target control instruction to the head display equipment;
after receiving the target control instruction, the head display device determines a target state corresponding to the target control instruction according to a pre-established association relationship between the state of the interactive element and the control instruction, and adjusts the target element to be the target state.
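The patent does not specify how the instruction is encoded on the Bluetooth link; the sketch below assumes a small JSON payload purely for illustration, with the association lookup of the previous section reused on the head display side.

import json

def build_instruction_payload(operation_type, instruction):
    # Composed on the smart watch after the target trigger operation has been classified.
    return json.dumps({"operation": operation_type, "instruction": instruction}).encode("utf-8")

def handle_instruction_payload(payload, pointed_element, state_association):
    # Executed on the head display device when the payload arrives over Bluetooth.
    message = json.loads(payload.decode("utf-8"))
    key = (pointed_element["kind"], pointed_element["state"], message["instruction"])
    pointed_element["state"] = state_association.get(key, pointed_element["state"])
    return pointed_element["state"]

# Example: the watch reports an upward slide while the cursor rests on a folded menu.
payload = build_instruction_payload("slide_up", "SLIDE_UP")
menu = {"kind": "menu", "state": "folded"}
association = {("menu", "folded", "SLIDE_UP"): "expanded"}
print(handle_instruction_payload(payload, menu, association))   # -> "expanded"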
An embodiment of the present application provides a head display device; as shown in fig. 8, the head display device 80 includes:
the acquisition module 810 is configured to acquire the virtual scene interface and the eye activity information of the target object in real time;
a control module 820 configured to control a virtual cursor to point to a target element of at least one interactable element in the virtual scene interface based on the eye activity information;
and an adjusting module 830, configured to receive and respond to the target control instruction sent by the wearable device, and adjust the target element to be in the target state.
The embodiment of the application controls the movement of the virtual cursor through the eye movement information of the target object. When the virtual cursor points to the target element, a target trigger operation is performed in the operation interface of the wearable device arranged on the limb, and the wearable device generates the target control instruction corresponding to the target trigger operation, so that the target object can adjust the state of the target element through a simple trigger operation on the wearable device; the operation mode is simple, the learning cost is low, the operation is easy, the hand movement of the target object is small, long-time operation can be performed without fatigue, and the operation experience of the target object is enhanced.
The embodiment of the present application provides a wearable device provided on a limb, and as shown in fig. 9, the wearable device 90 provided on a limb may include:
a target control instruction generating module 910, configured to generate a target control instruction in response to a target trigger operation detected in the operation interface, where the target control instruction is used to adjust a target element pointed by a virtual cursor in the virtual scene interface to a target state;
and a target control instruction sending module 920, configured to send a control instruction to a head display device that displays a virtual scene interface, so that the head display device adjusts a target element to be in a target state.
According to the embodiment of the application, the target triggering operation is performed in the operation interface of the wearable device arranged on the limb, the wearable device generates the target control instruction corresponding to the target triggering operation, the target control instruction is used for adjusting the target element pointed by the virtual cursor in the virtual scene interface displayed by the head display device to be in the target state, the target object can adjust the state of the target element by performing simple triggering operation in the wearable device, the operation mode is simple, the learning cost is low, the operation is easy, the hand movement of the target object is less, fatigue is avoided after long-time operation, and the operation experience of the target object is enhanced.
The apparatus of the embodiment of the present application may execute the method provided by the embodiment of the present application, and the implementation principle is similar, the actions executed by the modules in the apparatus of the embodiments of the present application correspond to the steps in the method of the embodiments of the present application, and for the detailed functional description of the modules of the apparatus, reference may be specifically made to the description in the corresponding method shown in the foregoing, and details are not repeated here.
The embodiment of the application provides an electronic device, which comprises a memory, a processor and a computer program stored on the memory; when the processor executes the computer program, the steps of the human-computer interaction method are implemented. Compared with the related art, the following can be achieved:
According to the embodiment of the application, the movement of the virtual cursor is controlled through the eye movement information of the target object. When the virtual cursor points to the target element, a target trigger operation is performed in the operation interface of the wearable device arranged on the limb, and the wearable device generates the target control instruction corresponding to the target trigger operation, the target control instruction being used for adjusting the target element pointed to by the virtual cursor in the virtual scene interface displayed by the head display device to the target state. The target object can adjust the state of the target element through a simple trigger operation on the wearable device; the operation mode is simple, the learning cost is low, it is easy to get started, the hand movement of the target object is small, fatigue is avoided even during long-time operation, and the operation experience of the target object is enhanced.
In an alternative embodiment, an electronic device is provided, as shown in fig. 10, the electronic device 4000 shown in fig. 10 comprising: a processor 4001 and a memory 4003. Processor 4001 is coupled to memory 4003, such as via bus 4002. Optionally, the electronic device 4000 may further include a transceiver 4004, and the transceiver 4004 may be used for data interaction between the electronic device and other electronic devices, such as transmission of data and/or reception of data. It should be noted that the transceiver 4004 is not limited to one in practical applications, and the structure of the electronic device 4000 is not limited to the embodiment of the present application.
The Processor 4001 may be a CPU (Central Processing Unit), a general-purpose Processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or other Programmable logic device, a transistor logic device, a hardware component, or any combination thereof. Which may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the disclosure herein. The processor 4001 may also be a combination that performs a computing function, e.g., comprising one or more microprocessors, a combination of DSPs and microprocessors, etc.
Bus 4002 may include a path that carries information between the aforementioned components. The bus 4002 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus 4002 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 10, but this is not intended to represent only one bus or type of bus.
The Memory 4003 may be a ROM (Read Only Memory) or other types of static storage devices that can store static information and instructions, a RAM (Random Access Memory) or other types of dynamic storage devices that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read Only Memory), a CD-ROM (Compact Disc Read Only Memory) or other optical Disc storage, optical Disc storage (including Compact Disc, laser Disc, optical Disc, digital versatile Disc, blu-ray Disc, etc.), a magnetic Disc storage medium, other magnetic storage devices, or any other medium that can be used to carry or store a computer program and that can be Read by a computer, without limitation.
The memory 4003 is used for storing computer programs for executing the embodiments of the present application, and is controlled by the processor 4001 to execute. The processor 4001 is used to execute computer programs stored in the memory 4003 to implement the steps shown in the foregoing method embodiments.
The electronic device may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), or a vehicle-mounted terminal (e.g., a car navigation terminal), and a stationary terminal such as a digital TV or a desktop computer. The electronic device shown in fig. 10 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
Embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored; when executed by a processor, the computer program can implement the steps and corresponding contents of the foregoing method embodiments. Compared with the prior art, the following can be realized: the movement of the virtual cursor is controlled through the eye movement information of the target object; when the virtual cursor points to the target element, a target trigger operation is performed in the operation interface of the wearable device arranged on the limb, and the wearable device generates the target control instruction corresponding to the target trigger operation, the target control instruction being used for adjusting the target element pointed to by the virtual cursor in the virtual scene interface displayed by the head display device to the target state. The target object can adjust the state of the target element through a simple trigger operation on the wearable device; the operation mode is simple, the learning cost is low, it is easy to get started, the hand movement of the target object is small, fatigue is avoided even during long-time operation, and the operation experience of the target object is enhanced.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
Embodiments of the present application further provide a computer program product, which includes a computer program, and when the computer program is executed by a processor, the steps and corresponding contents of the foregoing method embodiments can be implemented. Compared with the prior art, the method can realize that:
According to the embodiment of the application, the movement of the virtual cursor is controlled through the eye movement information of the target object. When the virtual cursor points to the target element, a target trigger operation is performed in the operation interface of the wearable device arranged on the limb, and the wearable device generates a target control instruction corresponding to the target trigger operation, the target control instruction being used for adjusting the target element pointed to by the virtual cursor in the virtual scene interface displayed by the head display device to a target state. The target object can adjust the state of the target element through a simple trigger operation on the wearable device; the operation mode is simple, the learning cost is low, it is easy to get started, the hand movement of the target object is small, fatigue is avoided even during long-time operation, and the operation experience of the target object is enhanced.
The terms "first," "second," "third," "fourth," "1," "2," and the like in the description and claims of this application and in the preceding drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used are interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in other sequences than illustrated or otherwise described herein.
It should be understood that, although each operation step is indicated by an arrow in the flowcharts of the embodiments of the present application, the order in which these steps are performed is not limited to the order indicated by the arrows. Unless explicitly stated otherwise herein, in some implementation scenarios of the embodiments of the present application the steps in the flowcharts may be performed in other orders as required. In addition, some or all of the steps in each flowchart may include multiple sub-steps or multiple stages, depending on the actual implementation scenario. Some or all of these sub-steps or stages may be performed at the same time, or each may be performed at a different time. Where the execution times differ, the execution order of the sub-steps or stages may be flexibly configured as required, which is not limited in the embodiments of the present application.
The foregoing describes only optional implementations of some of the implementation scenarios of the present application. It should be noted that, for those skilled in the art, other similar implementation means based on the technical idea of the present application, without departing from that technical idea, also fall within the protection scope of the embodiments of the present application.

Claims (12)

1. A human-computer interaction system, comprising: a head display device and a wearable device arranged on a limb; wherein
the head display device acquires a virtual scene interface and eye activity information of a target object in real time, and controls a virtual cursor to point to a target element among at least one interactable element in the virtual scene interface based on the eye activity information;
the wearable device, in response to a target trigger operation detected in an operation interface, generates a target control instruction and sends the target control instruction to the head display device, wherein the target control instruction is used for adjusting the target element pointed to by the virtual cursor in the virtual scene interface to a target state; and
the head display device receives and responds to the target control instruction sent by the wearable device, and adjusts the target element to the target state.
2. The system of claim 1, wherein controlling the virtual cursor to point to the target element among the at least one interactable element in the virtual scene interface based on the eye activity information comprises:
determining a gaze point of the eyes of the target object in the virtual scene interface based on the eye activity information, and taking the interactable element at which the gaze point falls as the target element pointed to by the virtual cursor.
3. The system of claim 1, wherein generating the target control instruction in response to the target trigger operation detected in the operation interface comprises:
determining a type of the target trigger operation, the type being at least one of a single click, a double click, and a slide in a preset direction; and
determining the target control instruction corresponding to the type of the target trigger operation according to a pre-established correspondence between trigger operation types and control instructions.
4. The system of claim 3, wherein the operation interface is divided into a plurality of grids; and
determining the type of the target trigger operation comprises:
acquiring touch data corresponding to the target trigger operation, the touch data comprising a number of touches within a preset time period and a Boolean value of each grid in the operation interface;
for any grid, if the Boolean value of the grid is the target Boolean value, determining the grid to be a target grid and recording the coordinates of the target grid; and
determining the type of the target trigger operation according to the number of touches and the coordinates of each target grid.
5. The system of claim 1, wherein receiving and responding to the target control instruction to adjust the target element to the target state comprises:
determining the target state corresponding to the target control instruction according to a pre-established association between states of interactable elements and control instructions, and adjusting the target element to the target state.
6. The system of any one of claims 1-5, wherein the wearable device comprises at least one of a smart watch, a smart ring, a smart wristband, a smart glove, a cell phone, and a tablet.
7. A human-computer interaction method, characterized in that the method is executed by a wearable device arranged on a limb, the method comprising:
generating a target control instruction in response to a target trigger operation detected in an operation interface, wherein the target control instruction is used for adjusting a target element pointed to by a virtual cursor in a virtual scene interface to a target state; and
sending the target control instruction to a head display device that displays the virtual scene interface, so that the head display device adjusts the target element to the target state.
8. A human-computer interaction method, performed by a head display device, the method comprising:
acquiring a virtual scene interface and eye activity information of a target object in real time;
controlling a virtual cursor to point to a target element among at least one interactable element in the virtual scene interface based on the eye activity information; and
receiving and responding to a target control instruction sent by a wearable device, and adjusting the target element to a target state.
9. A wearable device arranged on a limb, the wearable device comprising:
a target control instruction generation module, configured to generate a target control instruction in response to a target trigger operation detected in an operation interface, wherein the target control instruction is used for adjusting a target element pointed to by a virtual cursor in a virtual scene interface to a target state; and
a target control instruction sending module, configured to send the target control instruction to a head display device that displays the virtual scene interface, so that the head display device adjusts the target element to the target state.
10. A head display device, the device comprising:
an acquisition module, configured to acquire a virtual scene interface and eye activity information of a target object in real time;
a control module, configured to control a virtual cursor to point to a target element among at least one interactable element in the virtual scene interface based on the eye activity information; and
an adjustment module, configured to receive and respond to a target control instruction sent by a wearable device, and adjust the target element to a target state.
11. An electronic device comprising a memory, a processor and a computer program stored on the memory, characterized in that the processor executes the computer program to implement the steps of the method of claim 7 or claim 8.
12. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of the method of claim 7 or claim 8.
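Claims 3 to 5 describe a pipeline from touch data (a number of touches within a preset time period plus a Boolean value per grid of the operation interface) to a trigger operation type, then to a control instruction, and finally to a target state. The sketch below is one possible, non-normative reading of that pipeline; the grid layout, the direction heuristic, the time window, and both correspondence tables are hypothetical choices and are not specified by the claims.

from typing import Dict, List, Tuple

GridCoord = Tuple[int, int]

def target_grids(booleans: Dict[GridCoord, bool], target_value: bool = True) -> List[GridCoord]:
    # Claim 4: record the coordinates of every grid whose Boolean value equals
    # the target Boolean value (i.e. the grids that were touched). For this
    # sketch the dictionary is assumed to list grids in temporal touch order.
    return [coord for coord, value in booleans.items() if value == target_value]

def classify_trigger(touch_count: int, coords: List[GridCoord]) -> str:
    # Claim 4: derive the trigger type from the number of touches within the
    # preset time period and the coordinates of the target grids.
    if not coords:
        return "none"
    if touch_count >= 2 and len(set(coords)) == 1:
        return "double_click"
    if touch_count == 1 and len(set(coords)) == 1:
        return "single_click"
    # Several distinct grids touched in one gesture are read as a slide;
    # compare the first and last grid to pick a (hypothetical) preset direction.
    (x0, y0), (x1, y1) = coords[0], coords[-1]
    if abs(x1 - x0) >= abs(y1 - y0):
        return "slide_right" if x1 > x0 else "slide_left"
    return "slide_down" if y1 > y0 else "slide_up"

# Claim 3: pre-established correspondence between trigger operation types and
# control instructions; claim 5: association between control instructions and
# target states. Both tables are invented here purely for illustration.
INSTRUCTION_FOR_TRIGGER = {"single_click": "CONFIRM", "double_click": "CANCEL",
                           "slide_up": "SCROLL_UP", "slide_down": "SCROLL_DOWN",
                           "slide_left": "PREVIOUS", "slide_right": "NEXT"}
STATE_FOR_INSTRUCTION = {"CONFIRM": "selected", "CANCEL": "closed",
                         "SCROLL_UP": "scrolled_up", "SCROLL_DOWN": "scrolled_down",
                         "PREVIOUS": "previous_page", "NEXT": "next_page"}

# Example: three adjacent grids touched once within the time window.
grid_values = {(0, 0): False, (1, 0): True, (2, 0): True, (3, 0): True}
coords = target_grids(grid_values)                        # [(1, 0), (2, 0), (3, 0)]
trigger = classify_trigger(touch_count=1, coords=coords)  # "slide_right"
instruction = INSTRUCTION_FOR_TRIGGER[trigger]            # "NEXT"
target_state = STATE_FOR_INSTRUCTION[instruction]         # "next_page", applied by the head display device

Keeping both lookups as plain tables mirrors the "pre-established correspondence" wording of the claims: changing which gesture maps to which instruction, or which instruction maps to which state, requires only editing the tables, not the classification logic.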
CN202211146467.8A 2022-09-20 2022-09-20 Human-computer interaction system, human-computer interaction method, wearable device and head display device Pending CN115480639A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211146467.8A CN115480639A (en) 2022-09-20 2022-09-20 Human-computer interaction system, human-computer interaction method, wearable device and head display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211146467.8A CN115480639A (en) 2022-09-20 2022-09-20 Human-computer interaction system, human-computer interaction method, wearable device and head display device

Publications (1)

Publication Number Publication Date
CN115480639A true CN115480639A (en) 2022-12-16

Family

ID=84393129

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211146467.8A Pending CN115480639A (en) 2022-09-20 2022-09-20 Human-computer interaction system, human-computer interaction method, wearable device and head display device

Country Status (1)

Country Link
CN (1) CN115480639A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116107436A (en) * 2023-04-13 2023-05-12 北京乐开科技有限责任公司 VR virtual image interaction method and system based on mobile device

Similar Documents

Publication Publication Date Title
EP3465620B1 (en) Shared experience with contextual augmentation
EP3164785B1 (en) Wearable device user interface control
WO2017209979A1 (en) Video pinning
US20230315197A1 (en) Gaze timer based augmentation of functionality of a user input device
Budhiraja et al. Using a HHD with a HMD for mobile AR interaction
US20190324539A1 (en) Systems and methods for providing dynamic haptic playback for an augmented or virtual reality environments
CN111176764B (en) Display control method and terminal equipment
WO2019067482A1 (en) Displaying applications in a simulated reality setting
CN112817453A (en) Virtual reality equipment and sight following method of object in virtual reality scene
CN115480639A (en) Human-computer interaction system, human-computer interaction method, wearable device and head display device
CN110192169A (en) Menu treating method, device and storage medium in virtual scene
US10558340B2 (en) Inadvertent dismissal prevention for graphical content
CN117130518A (en) Control display method, head display device, electronic device and readable storage medium
CN110717993A (en) Interaction method, system and medium of split type AR glasses system
CN115002551A (en) Video playing method and device, electronic equipment and medium
CN113485625A (en) Electronic equipment response method and device and electronic equipment
JPWO2020031493A1 (en) Terminal device and control method of terminal device
US20240153211A1 (en) Methods, apparatuses, terminals and storage media for display control based on extended reality
WO2024131405A1 (en) Object movement control method and apparatus, device, and medium
JP7246390B2 (en) Direct manipulation of display devices using wearable computing devices
US20240103625A1 (en) Interaction method and apparatus, electronic device, storage medium, and computer program product
US20240028130A1 (en) Object movement control method, apparatus, and device
CN117453037A (en) Interactive method, head display device, electronic device and readable storage medium
CN115562779A (en) Media information processing method, device, equipment and storage medium
CN117991967A (en) Virtual keyboard interaction method, device, equipment, storage medium and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination