CN113413585A - Interaction method and device of head-mounted display equipment and electronic equipment - Google Patents

Interaction method and device of head-mounted display equipment and electronic equipment

Info

Publication number
CN113413585A
Authority
CN
China
Prior art keywords
interactive
interaction
head
mounted display
interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110689938.9A
Other languages
Chinese (zh)
Other versions
CN113413585B (en)
Inventor
李炳耀 (Li Bingyao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110689938.9A
Publication of CN113413585A
Application granted
Publication of CN113413585B
Legal status: Active (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212 Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 Output arrangements for video game devices
    • A63F13/28 Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285 Generating tactile feedback signals via the game input device, e.g. force feedback
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1037 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted for converting control signals received from the game device into a haptic signal, e.g. using force feedback
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Cardiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to an interaction method and apparatus of a head-mounted display device, a computer device, and a storage medium. The method is applied to a head-mounted display device worn by a user, where a wearable device worn by the user is connected to the head-mounted display device. The method includes: displaying an interactive interface corresponding to the state of waiting to enter a functional application; and, in response to an interactive operation triggered on an interactive element in the interactive interface, executing the interactive event that the interactive operation triggers for the interactive element, where the interactive operation is obtained by matching pose data sent by the wearable device. The method simplifies the interactive operation of the head-mounted display device and improves interaction efficiency.

Description

Interaction method and device of head-mounted display equipment and electronic equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to an interaction method and apparatus for a head-mounted display device, an electronic device, and a computer-readable storage medium.
Background
With the development of computer technology, intelligent electronic products such as head-mounted display devices implementing augmented reality and virtual reality technologies are increasingly widely used. A head-mounted display device can move the user from a traditional two-dimensional screen to a three-dimensional space, providing a better visual effect. However, in current interactive control of head-mounted display devices, interaction efficiency is low because the interactive operations are difficult to perform, which hinders user interaction with the head-mounted display device.
Disclosure of Invention
The embodiments of the present application provide an interaction method and apparatus for a head-mounted display device, an electronic device, and a computer-readable storage medium, which can simplify interactive operations on the head-mounted display device and improve interaction efficiency.
An interaction method of a head-mounted display device is applied to a head-mounted display device worn by a user, where a wearable device worn by the user is connected to the head-mounted display device; the method includes:
displaying an interactive interface corresponding to the state of waiting to enter a functional application; and
in response to an interactive operation triggered on an interactive element in the interactive interface, executing the interactive event that the interactive operation triggers for the interactive element, where the interactive operation is obtained by matching pose data sent by the wearable device.
An interaction apparatus of a head-mounted display device is applied to a head-mounted display device worn by a user, where a wearable device worn by the user is connected to the head-mounted display device; the apparatus includes:
an interactive interface display module, configured to display an interactive interface corresponding to the state of waiting to enter a functional application; and
an interactive operation response module, configured to respond to an interactive operation triggered on an interactive element in the interactive interface and to execute the interactive event that the interactive operation triggers for the interactive element, where the interactive operation is obtained by matching pose data sent by the wearable device.
An electronic device includes a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the following steps:
displaying an interactive interface corresponding to the state of waiting to enter a functional application; and
in response to an interactive operation triggered on an interactive element in the interactive interface, executing the interactive event that the interactive operation triggers for the interactive element, where the interactive operation is obtained by matching pose data sent by the wearable device.
A computer-readable storage medium stores a computer program that, when executed by a processor, carries out the following steps:
displaying an interactive interface corresponding to the state of waiting to enter a functional application; and
in response to an interactive operation triggered on an interactive element in the interactive interface, executing the interactive event that the interactive operation triggers for the interactive element, where the interactive operation is obtained by matching pose data sent by the wearable device.
According to the interaction method and apparatus of the head-mounted display device, the electronic device, and the computer-readable storage medium described above, a corresponding interactive interface is displayed while a functional application is yet to be entered, and, in response to an interactive operation obtained in that interface by matching pose data of the wearable device connected to the head-mounted display device, the interactive event triggered on the interactive element is executed, thereby implementing interactive processing on the head-mounted display device. During this process, while the head-mounted display device is waiting to enter a functional application, it responds in the corresponding interactive interface to interactive operations triggered through the wearable device worn by the user, the operations being matched from pose data sent by the connected wearable device. Interaction with the head-mounted display device can therefore be achieved through the wearable device's pose data before a functional application is entered, which simplifies the interactive operation mode and improves interaction efficiency.
Drawings
To illustrate the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed to describe the embodiments or the prior art are briefly introduced below. The drawings described below are clearly only some embodiments of the present application; for those skilled in the art, other drawings can be derived from them without creative effort.
FIG. 1 is a diagram of an application environment of an interaction method of a head-mounted display device in one embodiment;
FIG. 2 is a flowchart of an interaction method of a head-mounted display device in one embodiment;
FIG. 3 is a schematic diagram of a displayed device configuration interactive interface in one embodiment;
FIG. 4 is an interface diagram illustrating selection of an interactive element in the device configuration interactive interface in one embodiment;
FIG. 5 is a schematic diagram of an interface for entering color settings in one embodiment;
FIG. 6 is a schematic diagram of a displayed application information interactive interface in one embodiment;
FIG. 7 is an interface diagram illustrating selection of an interactive element in the application information interactive interface in one embodiment;
FIG. 8 is a diagram illustrating the interface of an accessed functional application in one embodiment;
FIG. 9 is a flowchart of determining an interactive operation in one embodiment;
FIG. 10 is a diagram of an application environment of an interaction method of a head-mounted display device in another embodiment;
FIG. 11 is a flowchart of an interaction method of a head-mounted display device in another embodiment;
FIG. 12 is a block diagram of the structure of an interaction apparatus of a head-mounted display device in one embodiment;
FIG. 13 is a diagram of the internal structure of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
FIG. 1 is a diagram of an application environment of an interaction method of a head-mounted display device in one embodiment. As shown in FIG. 1, the application environment includes a wearable device 102 worn by a user, a head-mounted display device 104 worn by the same user, and a server 106. The wearable device 102 communicates with the head-mounted display device 104 over a network, and the head-mounted display device 104 communicates with the server 106 over a network. After the user puts on the head-mounted display device 104, the server 106 may deliver display data to the head-mounted display device 104, which displays a corresponding interactive interface while a functional application is yet to be entered. In that interface, the head-mounted display device 104 executes the interactive event triggered on an interactive element in response to an interactive operation obtained by matching pose data of the wearable device 102, thereby implementing interactive processing on the head-mounted display device 104. In addition, when the data to be displayed is stored locally on the head-mounted display device 104, the head-mounted display device 104 need not be connected to the server 106, and the wearable device 102 can interact with the head-mounted display device 104 directly.
The wearable device 102 may be, but is not limited to, a portable wearable device such as a smart watch or a smart bracelet; the head-mounted display device 104 may be, but is not limited to, a device such as AR (Augmented Reality) glasses or VR (Virtual Reality) glasses; and the server 106 may be implemented as an independent server or as a server cluster composed of multiple servers.
FIG. 2 is a flowchart of an interaction method of a head-mounted display device in one embodiment. The interaction method in this embodiment is described using the example of the method running on the head-mounted display device of FIG. 1, which is worn by the user and connected to the wearable device. As shown in FIG. 2, the interaction method of the head-mounted display device includes steps 202 to 204.
Step 202: displaying an interactive interface corresponding to the state of waiting to enter a functional application.
A functional application is an application that can be installed and run by a client on the head-mounted display device and provides a corresponding functional service; it may be of various types, such as a game application, a video application, a news media application, or a social application. When a functional application runs on the head-mounted display device, it can be displayed in the display area of the device to present its application content. For example, the head-mounted display device may be AR glasses or an AR helmet, or VR glasses or a VR helmet; after the user puts on the device, a three-dimensional picture of the functional application can be displayed, and the user can interact within that picture, for example in game interactions. VR (virtual reality) is a computer simulation technique for creating and experiencing a virtual world: a computer generates a simulated environment into which the user is immersed. Virtual reality technology combines electronic signals generated by computer technology with real-life data to produce phenomena people can perceive, whether real objects in reality or substances invisible to the naked eye expressed through a three-dimensional model. AR (augmented reality) is a technique for increasing the user's perception of the real world through information provided by a computer system; it overlays content such as computer-generated virtual objects, scenes, or system cues onto the real scene to enhance or modify the perception of the real-world environment, or of data representing it.
Different functional applications have corresponding interaction modes, such as touch, remote control, and gestures. After the head-mounted display device enters a functional application, it displays the application's interactive interface, in which the user can interact through the modes the application supports. When the head-mounted display device has not entered a functional application, that is, while it is waiting to enter one, it may display an interactive interface corresponding to that state: for example, a start-up interactive interface during boot, the main interface after boot, or a configuration interface for configuring the device. In such an interface, the head-mounted display device has not yet entered a functional application and cannot interact through the modes an application supports; interaction often has to go through a remote controller associated with the device, which requires the user to trigger operations with keys on the remote controller, making the operations difficult and the interaction inefficient. In particular, a user wearing the head-mounted display device cannot conveniently look at the remote controller, which makes remote-controller interaction even harder and less efficient.
Specifically, when the head-mounted display device has not entered a functional application and is waiting to enter one, it displays a corresponding interactive interface. In this interface, the user can perform interactions related to the device itself, such as controlling its start-up or shutdown, configuring its operating parameters, or viewing application information of the functional applications installed on it.
Step 204: in response to an interactive operation triggered on an interactive element in the interactive interface, executing the interactive event that the interactive operation triggers for the interactive element, where the interactive operation is obtained by matching pose data sent by the wearable device.
The interactive operation is obtained by matching pose data sent by the wearable device. The wearable device is a device worn by the same user who wears the head-mounted display device, such as a watch or a bracelet, and is connected to the head-mounted display device, for example via Bluetooth, so that it can send its pose data to the head-mounted display device, which then matches the data to the corresponding interactive operation. Pose data describe the position and posture of the wearable device. When the user wants to trigger an interaction on the head-mounted display device, the user changes the pose of the wearable device; for example, a user wearing a smart watch can change its position and posture by moving the arm and rotating the wrist. The wearable device captures position and posture data through built-in sensors, from which its motion state can be determined, thereby producing the pose data.
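As a rough illustration of the pipeline just described, the Python sketch below shows how a wearable might package accelerometer and gyroscope readings into pose samples and push them to the head-mounted display. The sensor helpers, field layout, and send call are illustrative assumptions of this sketch, not an API from the patent.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class PoseSample:
    timestamp: float   # seconds since epoch
    accel: tuple       # (ax, ay, az) in m/s^2 from the accelerometer
    gyro: tuple        # (gx, gy, gz) in rad/s from the gyroscope

def read_accelerometer():
    # Hypothetical driver call; a real wearable would query its IMU here.
    return (0.0, 0.0, 9.81)

def read_gyroscope():
    # Hypothetical driver call.
    return (0.0, 0.0, 0.0)

def sample_pose() -> PoseSample:
    """Capture one pose sample from the built-in sensors."""
    return PoseSample(time.time(), read_accelerometer(), read_gyroscope())

def send_to_hmd(sample: PoseSample, link):
    """Serialize a sample and push it over an already paired link object."""
    link.send(json.dumps(asdict(sample)).encode())
```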
The wearable device sends the collected pose data to the head-mounted display device, which matches the pose data to an interactive operation and determines the triggered operation from the matching result, such as a move, select, confirm, or cancel operation. Interactive elements are elements in the interactive interface that can respond to interactive operations and trigger corresponding interactive events; for example, they may be controls of various types. Triggering an interactive operation on a control executes the control's interactive event; for instance, triggering a confirm operation on a confirm button confirms and applies the corresponding setting, thereby implementing interactive processing on the head-mounted display device.
Specifically, the user may trigger an interactive operation in the interactive interface displayed by the head-mounted display device. Because the operation is obtained by matching pose data sent by the worn wearable device, the user can trigger different operations on an interactive element by changing the wearable device's pose. Once the user triggers an interactive operation on an interactive element, the head-mounted display device executes the interactive event that the operation triggers for that element, such as configuring the head-mounted display device or entering the selected functional application.
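A minimal sketch of the matching step, assuming the head-mounted display receives short windows of (accel, gyro) pairs like those above: sustained wrist rotation maps to confirm or cancel, and lateral arm movement maps to a focus move. The thresholds are illustrative assumptions, not values from the patent.

```python
def classify_operation(samples):
    """Map a window of (accel, gyro) pairs to an interactive operation.

    accel is (ax, ay, az) in m/s^2; gyro is (gx, gy, gz) in rad/s.
    Returns 'move', 'confirm', 'cancel', or None if nothing matches.
    """
    if not samples:
        return None
    n = len(samples)
    mean_roll = sum(g[2] for _, g in samples) / n     # wrist-axis rotation
    mean_lateral = sum(abs(a[0]) + abs(a[1]) for a, _ in samples) / n
    if mean_roll > 2.0:      # sustained clockwise wrist rotation
        return "confirm"
    if mean_roll < -2.0:     # counter-clockwise rotation
        return "cancel"
    if mean_lateral > 1.0:   # lateral arm movement shifts the focus
        return "move"
    return None
```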
In the interaction method of this embodiment, a corresponding interactive interface is displayed while a functional application is yet to be entered, and, in response to an interactive operation obtained by matching pose data of the wearable device connected to the head-mounted display device, the interactive event triggered on the interactive element is executed, thereby implementing interactive processing on the head-mounted display device. During this process, while the head-mounted display device is waiting to enter a functional application, it responds in the corresponding interactive interface to interactive operations triggered through the wearable device worn by the user, the operations being matched from pose data sent by the connected wearable device. Interaction with the head-mounted display device can thus be achieved through the wearable device's pose data before a functional application is entered, which simplifies the interactive operation mode and improves interaction efficiency.
In one embodiment, displaying an interactive interface corresponding to the state of waiting to enter a functional application includes: displaying a device configuration interactive interface while a functional application is yet to be entered, in which interactive elements for configuring the head-mounted display device are displayed.
The device configuration interactive interface may include various interactive elements for configuring the head-mounted display device; for example, it may include interactive controls for the device's display parameters, such as brightness, contrast, saturation, color scheme, and theme, and the user can trigger configuration of the corresponding display parameter through each control. Different configuration functions may have their own device configuration interactive interfaces containing different interactive elements, and the specific interface content can be set flexibly according to actual needs.
Specifically, while a functional application is yet to be entered, the head-mounted display device displays the device configuration interactive interface; in particular, when the user triggers configuration processing of the device, it displays the interface corresponding to that configuration processing. The interface shows various interactive elements, which are interface elements for configuring the head-mounted display device, typically controls of various types, providing the user with an entry point for configuration interactions.
Further, responding to an interactive operation triggered on an interactive element in the interactive interface and executing the interactive event the operation triggers includes: in response to an interactive operation triggered on an interactive element in the device configuration interactive interface, displaying the device configuration result of configuring the head-mounted display device through the interactive operation and the interactive element.
Specifically, for each interactive element displayed in the device configuration interactive interface, the user can put the wearable device into different motion states to trigger different interactive operations based on its pose data. After the user triggers an interactive operation on an interactive element, the head-mounted display device responds to the operation, determines the element it targets, configures the device through the operation and the element, and displays the device configuration result. In a specific implementation, the head-mounted display device may display a waiting interface while the configuration is in progress, show the processing progress in that interface, and display the corresponding device configuration result once configuration completes.
In a specific application, as shown in FIG. 3, while a functional application is yet to be entered, the head-mounted display device displays a device configuration interactive interface that includes an interactive element for color settings. The user moves an arm to change the pose of the smart bracelet and thereby trigger interactive operations, and the interaction focus is marked by a displayed ray. As shown in FIG. 4, the ray is moved onto the interactive element for color settings. Further, as shown in FIG. 5, when the interactive operation on the color settings is triggered, the head-mounted display device executes the corresponding interactive event and enters the color settings interface, where color parameters such as brightness, contrast, and saturation can be set to configure the device.
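The ray-marked focus in FIGS. 3 to 5 can be approximated as a 2D point driven by wrist angular velocity and hit-tested against element bounds. The sketch below, with illustrative gains and a simple rectangle layout, is one way to realize this; it is not taken from the patent.

```python
def update_focus(focus, gyro, dt, gain=200.0):
    """Move the (x, y) focus point according to wrist yaw/pitch rates."""
    x, y = focus
    return (x + gain * gyro[0] * dt, y - gain * gyro[1] * dt)

def hit_test(focus, elements):
    """Return the interactive element whose bounds contain the focus."""
    x, y = focus
    for elem in elements:  # each elem is a dict with a 'rect' key
        left, top, width, height = elem["rect"]
        if left <= x <= left + width and top <= y <= top + height:
            return elem
    return None
```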
In this embodiment, the head-mounted display device displays the device configuration interactive interface while a functional application is yet to be entered and, in response to an interactive operation the user triggers on an interactive element in that interface, displays the device configuration result obtained through the operation and the element. Configuration of the head-mounted display device can thus be achieved through the wearable device's pose data before a functional application is entered, simplifying the interactive operation mode and improving interaction efficiency.
In one embodiment, displaying an interactive interface corresponding to the state of waiting to enter a functional application includes: displaying an application information interactive interface while a functional application is yet to be entered, in which interactive elements carrying the application description information of each functional application are displayed.
The application information interactive interface is an interface for interacting with the application information of the functional applications on the head-mounted display device, and may include various interactive elements for that purpose. For example, it may include interactive controls for accessing the device's functional applications; specifically, it may display the application icon of each functional application, through which the user can trigger interactions with the corresponding application, such as entering it, uninstalling it, or updating its information. The specific interface content corresponds to the functional applications the head-mounted display device supports and runs, and the types of interactive elements can be set flexibly according to actual needs, for example application icons, application names, or application tab pages.
Specifically, while a functional application is yet to be entered, the head-mounted display device displays the application information interactive interface; in particular, when the user triggers a preview of the device's functional applications, it displays the interface corresponding to that preview. The interface shows various interactive elements representing the corresponding functional applications, typically their application icons, providing the user with an entry point for browsing or entering the applications.
Further, responding to an interactive operation triggered on an interactive element in the interactive interface and executing the interactive event the operation triggers includes: in response to an interactive operation on an interactive element in the application information interactive interface, displaying the application interaction result corresponding to the interactive element the operation acts on.
Specifically, for each interactive element displayed in the application information interactive interface, the user can put the wearable device into different motion states to trigger different interactive operations based on its pose data. After the user triggers an interactive operation on an interactive element, the head-mounted display device responds to the operation, determines the element it targets, processes that element through the operation, and displays the corresponding application interaction result. In a specific implementation, the head-mounted display device may display a waiting interface while the interaction is processed, show the processing progress in that interface, and display the application interaction result once processing completes. For example, if the user triggers an access operation on interactive element A in the application information interactive interface, indicating that the user wants to enter the functional application corresponding to element A, the head-mounted display device may display the start animation of that application and, once it has started, enter it, thereby implementing access to the functional application.
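Tying these pieces together for this interface, a dispatcher might mark the icon under the focus on a move operation and enter the application on a confirm. Both launch_application and the state dictionary are hypothetical constructs of this sketch.

```python
def launch_application(app_id):
    """Hypothetical placeholder for starting a functional application."""
    print(f"entering functional application {app_id}")

def handle_operation(op, focused_icon, state):
    """Route a matched operation to the icon under the interaction focus."""
    if op == "move":
        state["selected"] = focused_icon            # mark the targeted icon
    elif op == "confirm" and state.get("selected") is not None:
        launch_application(state["selected"]["app_id"])  # enter the app
    elif op == "cancel":
        state["selected"] = None                    # drop the selection
```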
In a specific application, as shown in FIG. 6, the head-mounted display device displays the application information interactive interface while a functional application is yet to be entered; application icons of four functional applications are displayed in the interface. The user moves an arm to change the pose of the worn smart bracelet and thereby trigger interactive operations, and the interaction focus is marked by a displayed ray. As shown in FIG. 7, after selecting the application icon of the wilderness play application, the user triggers access to it by rotating the wrist. As shown in FIG. 8, once access is triggered on the wilderness play icon, the head-mounted display device enters the application and displays its content.
In this embodiment, the head-mounted display device displays the application information interactive interface while a functional application is yet to be entered and, in response to an interactive operation the user triggers on an interactive element, displays the corresponding application interaction result. The device's functional applications can thus be interacted with through the wearable device's pose data before one is entered, simplifying the interactive operation mode and improving interaction efficiency.
In one embodiment, responding to an interactive operation triggered on an interactive element in the interactive interface and executing the interactive event the operation triggers includes: in response to an element selection operation triggered on the interactive elements in the interactive interface, marking in the interface the target interactive element selected by the operation; and, when the execution condition of the target interactive element is satisfied, executing the interactive event triggered for it.
The element selection operation is an operation the user triggers through the wearable device to select among the interactive elements in the interactive interface, determining the object the user wants to interact with. The target interactive element is the element the user selects for interaction through the triggered selection operation. The execution condition is the condition that triggers execution of the interactive event corresponding to the target interactive element, and it can be set flexibly according to actual needs: for example, it may be considered satisfied when the selection operation has lasted a preset duration after the target element is selected, or when the user further triggers a confirmation of execution after selecting the target element.
Specifically, after the head-mounted display device displays the interactive interface, the user triggers an element selection operation in it through the wearable device. The operation acts on an interactive element in the interface, and the element it acts on is the target interactive element. The head-mounted display device can mark the selected target element in the interface, for example by highlighting it to indicate that it is selected, or by marking it with a ray or a selection mark, such as pointing the ray at it or placing the selection mark within its area. Further, the head-mounted display device determines whether the execution condition of the target element is satisfied; if, for example, the user triggers a confirmation for the target element, the condition is satisfied and the device executes the interactive event triggered for the element, implementing interactive processing on the head-mounted display device.
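The dwell-based execution condition mentioned above can be kept in a small state object that fires once the same target stays selected for a preset duration; a minimal sketch follows, with the 0.8 s threshold as an illustrative assumption.

```python
import time

class DwellTrigger:
    DWELL_SECONDS = 0.8   # illustrative preset duration

    def __init__(self):
        self.target = None
        self.since = 0.0

    def update(self, selected):
        """Call once per frame with the currently selected element.

        Returns the element whose interactive event should execute,
        or None if the execution condition is not yet satisfied."""
        now = time.time()
        if selected is not self.target:
            self.target, self.since = selected, now   # selection changed
            return None
        if selected is not None and now - self.since >= self.DWELL_SECONDS:
            self.since = now                          # re-arm after firing
            return selected
        return None
```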
In this embodiment, the user triggers an element selection operation in the interactive interface through the wearable device, the head-mounted display device marks the target interactive element selected by that operation, and, when the element's execution condition is satisfied, executes the interactive event triggered for it. Interactive processing of the head-mounted display device can thus be achieved through the wearable device's pose data before a functional application is entered, simplifying the interactive operation mode and improving interaction efficiency.
In one embodiment, after the corresponding interactive interface is displayed while a functional application is yet to be entered, the method further includes: displaying an interaction preselection mark at a preset position in the interactive interface.
The interaction preselection mark marks the interaction focus of the user's interactive operations. It may specifically be a ray cast into the interactive interface displayed by the head-mounted display device, whose end point is the interaction focus; it may also take other forms, such as a mouse pointer or a selection frame. The specific type of the mark can be configured in advance according to actual needs and can be personalized by the user.
Specifically, after the head-mounted display device displays the corresponding interactive interface while a functional application is yet to be entered, it displays the interaction preselection mark at a preset position in the interface; for example, the mark may be displayed at the geometric center of the interface, indicating that the interaction focus of the user's operations is the interface's geometric center.
Further, marking the target interactive element selected by the element selection operation in the interactive interface includes: moving the interaction preselection mark from the preset position to the position of the target interactive element selected by the operation.
Specifically, after the user triggers the element selection operation, the head-mounted display device moves the interaction preselection mark from the preset position to the position of the selected target interactive element, so that the mark represents the focus of the user's interaction as that element, that is, the element the user has selected for interaction. In a specific implementation, the head-mounted display device may analyze the pose data corresponding to the selection operation, determine the selected target element, and move the mark from the preset position to the element's position.
In this embodiment, the interaction focus of the user's operations is marked by the interaction preselection mark, so the user can identify the targeted interactive element from the object the mark indicates. This visually displays the selected target element, gives real-time feedback on the user's operations, facilitates interaction with the target element, and improves interaction efficiency.
In one embodiment, executing the interactive event triggered by the target interactive element when its execution condition is satisfied includes: in response to an execution confirmation operation triggered for the target interactive element, executing the interactive event that the confirmation triggers for the element.
The execution confirmation operation confirms that execution is triggered for the target interactive element, and may be a further operation the user triggers on it. For example, when the wearable device is a smart watch, the user may change its pose by rotating the wrist, such as rotating it clockwise, to trigger the execution confirmation and thereby the interactive event for the target element.
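One plausible way to detect that clockwise wrist rotation is to integrate the gyroscope roll rate and fire once the accumulated angle passes a threshold; the 60-degree threshold below is an illustrative assumption.

```python
import math

class WristRotationDetector:
    THRESHOLD = math.radians(60)   # illustrative confirm angle

    def __init__(self):
        self.angle = 0.0

    def feed(self, roll_rate, dt):
        """Accumulate roll; return True once a confirm gesture completes."""
        self.angle += roll_rate * dt
        if self.angle >= self.THRESHOLD:
            self.angle = 0.0       # re-arm for the next gesture
            return True
        if roll_rate <= 0.0:
            self.angle = 0.0       # rotation stopped or reversed; reset
        return False
```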
Specifically, after selecting the target interactive element, the user may further trigger an execution confirmation through the wearable device, indicating that the interactive event of the selected element should be executed. In response, the head-mounted display device executes the event the confirmation triggers for the element, such as entering the functional application corresponding to the element or performing the device configuration matched to it.
In this embodiment, the head-mounted display device responds to the execution confirmation the user triggers for the target interactive element and executes the interactive event the confirmation triggers. Interaction with the device's functional applications can thus be achieved through the wearable device's pose data before an application is entered, simplifying the interactive operation mode and improving interaction efficiency.
In one embodiment, the interaction method of the head-mounted display device further includes: displaying a calibration interface in response to a calibration operation triggered through the wearable device; and marking in the calibration interface the interactive operation result corresponding to the calibration interactive operation, where the calibration interactive operation is matched from the current pose data of the wearable device.
The calibration operation is an operation of calibrating the pose of the wearable device, the pose of the wearable device can be calibrated by calibrating the pose of the wearable device, so that the sensitivity and accuracy of the user for triggering the interactive operation through the wearable device are ensured, and the interactive efficiency of the head-mounted display device is ensured. The specific trigger of the calibration operation can be set according to actual needs, for example, the calibration operation can be triggered according to pose data or a calibration instruction of the wearable device, for example, when a specific area of the wearable device, such as a display screen area, is blocked and continues for a certain time, or when a user sends a calibration instruction to the head-mounted display device through the wearable device, the calibration operation is triggered. The calibration interface is an interface for calibrating the pose of the wearable device, and a user can trigger calibration interactive operation in the calibration interface so as to calibrate the pose of the wearable device. The calibration interactive operation is matched with the current pose data of the wearable device, namely after the calibration operation is triggered and the head-mounted display device displays a calibration interface, the corresponding calibration interactive operation can be matched and determined according to the current pose data of the wearable device, and the calibration interactive operation is matched with the current pose data of the wearable device.
Specifically, when a user needs to calibrate the pose of the wearable device to ensure the sensitivity and accuracy of the interactive operation, the user may trigger the calibration operation through the wearable device, for example, the user may cover a specific area of the wearable device to trigger the calibration operation, and the head-mounted display device displays a calibration interface in response to the calibration operation. Further, the head-mounted display device matches the corresponding calibration interactive operation according to the current pose data of the wearable device, and marks an interactive operation result corresponding to the calibration interactive operation in a calibration interface. For example, the head-mounted display device may match corresponding calibration interactive operations according to the current pose data of the wearable device, mark an action focus corresponding to the calibration interactive operations in an interface geometric center of the calibration interface, so as to mark an interactive operation result corresponding to the calibration interactive operations, which indicates that the action focus of the interactive operations corresponding to the current pose data of the wearable device worn by the user is the interface geometric center of the calibration interface, and the user may change the current pose data of the wearable device when the calibration is triggered, so as to implement calibration processing of the pose of the wearable device.
In this embodiment, in response to a calibration operation the user triggers through the wearable device, the head-mounted display device displays in the calibration interface the interactive operation result corresponding to the calibration interactive operation matched with the wearable device's current pose data, and calibrates the pose of the wearable device. The sensitivity and accuracy of interaction through the wearable device are thus ensured, which helps improve interaction efficiency.
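The following is a minimal sketch of this calibration flow, under the assumption that the interaction focus is stored as 2-D screen coordinates and the wearable's pose as roll/pitch/yaw angles. The class and field names are illustrative, not from the patent.

```python
SCREEN_W, SCREEN_H = 1280, 720

class Calibrator:
    def __init__(self):
        self.reference_pose = None          # pose matched to the screen centre
        self.focus = (SCREEN_W / 2, SCREEN_H / 2)

    def on_calibration_triggered(self, current_pose):
        # Reset the action focus to the geometric centre of the
        # calibration interface ...
        self.focus = (SCREEN_W / 2, SCREEN_H / 2)
        # ... and bind the wearable's current pose to that focus, so that
        # later pose changes are interpreted relative to this reference.
        self.reference_pose = dict(current_pose)
        return self.focus

cal = Calibrator()
centre = cal.on_calibration_triggered({"roll": 5.0, "pitch": -2.0, "yaw": 12.0})
print("focus reset to", centre)
```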
In one embodiment, the method for interacting with a head-mounted display device further comprises: in the event that vibration feedback is triggered, vibration feedback is performed by a vibration component in the wearable device.
The vibration component can be arranged in the wearable device to provide vibration feedback to the body part on which the device is worn. Vibration feedback is triggered, for example, when an interactive operation triggered through the wearable device takes effect, or when the action focus of the interactive operation reaches the edge of the interactive interface; in such cases the vibration component in the wearable device performs the feedback. In a specific implementation, the head-mounted display device monitors whether the condition for triggering vibration feedback is satisfied and, if so, sends a vibration instruction to the wearable device, which then controls the vibration component to provide the feedback. In a specific application, the vibration intensity can be set flexibly according to actual needs; for example, it can correspond to, and be positively correlated with, the operation amplitude of the interactive operation: the larger the amplitude, the stronger the vibration. In addition, when the focus of the interactive operation reaches the edge of the interactive interface, the vibration intensity may be set to its maximum to prompt the user that the edge of the operating range has been reached.
In this embodiment, when vibration feedback is triggered, it is performed through the vibration component in the wearable device. Feedback information is thus delivered to the user by vibration, so the user learns the effect of an interactive operation in time, which facilitates interaction processing and improves the interaction efficiency of the head-mounted display device.
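A minimal sketch of the trigger flow just described: the head-mounted display device checks whether the feedback condition is met and, if so, sends a vibration instruction to the wearable device, which drives its vibration component. The message format, the intensity values and the condition checks are illustrative assumptions.

```python
def should_vibrate(operation_effective, focus_at_edge):
    # Feedback is triggered when an interactive operation takes effect
    # or when the action focus reaches the interactive interface edge.
    return operation_effective or focus_at_edge

def hmd_tick(operation_effective, focus_at_edge, send_to_wearable):
    # Head-mounted display side: monitor the condition and, when it is
    # satisfied, send a vibration instruction to the wearable device.
    if should_vibrate(operation_effective, focus_at_edge):
        send_to_wearable({"type": "vibrate",
                          "intensity": 1.0 if focus_at_edge else 0.5})

def wearable_on_message(msg, vibration_component):
    # Wearable side: on a vibration instruction, drive the component.
    if msg.get("type") == "vibrate":
        vibration_component(msg["intensity"])

# Wire the two sides together with plain callables for illustration.
hmd_tick(operation_effective=True, focus_at_edge=False,
         send_to_wearable=lambda m: wearable_on_message(
             m, lambda i: print("vibrate at intensity", i)))
```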
In an embodiment, as shown in fig. 9, the interaction method of the head-mounted display device further includes a process for determining the interactive operation, specifically including:
Step 902: acquiring pose data of the wearable device, the pose data being collected by a sensor assembly in the wearable device.
Pose data describe the position and posture of the wearable device. When a user wearing the wearable device triggers interaction with the head-mounted display device, the user can deliberately change the device's pose; wearing a smart watch, for example, the user can change the watch's position and posture by moving the arm and rotating the wrist. The wearable device captures position and posture data through built-in sensors, from which its motion state can be determined, yielding the pose data. These data are collected by the sensor assembly in the wearable device, which may specifically include an orientation sensor, a speed sensor, a distance sensor, and the like.
In a specific implementation, the sensor assembly may detect the accelerations of the wearable device along the X, Y and Z axes of a spatial coordinate system and the angular velocities of its rotation about those three axes. From the three-axis accelerations the head-mounted display device can calculate the posture of the wearable device; from the changes in acceleration and angular velocity it can judge the device's motion direction and rotation direction; by integrating the acceleration and angular velocity it can calculate the displacement and rotation angle on the three axes; and by synthesizing the displacement components on the three axes it can obtain the actual displacement, thereby determining the motion state of the wearable device.
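The following is a minimal sketch of recovering displacement and rotation by numerically integrating the three-axis accelerations and angular velocities, as described above. A real implementation would also correct for gravity and sensor drift; those steps are omitted here for brevity, and the sample format is an assumption.

```python
def integrate_imu(samples, dt):
    """samples: list of (ax, ay, az, wx, wy, wz) tuples; dt: sample period.
    Returns (displacement, rotation) as 3-component lists."""
    vel = [0.0, 0.0, 0.0]    # running velocity: integral of acceleration
    disp = [0.0, 0.0, 0.0]   # displacement: double integral of acceleration
    rot = [0.0, 0.0, 0.0]    # rotation angle: integral of angular velocity
    for ax, ay, az, wx, wy, wz in samples:
        for i, a in enumerate((ax, ay, az)):
            vel[i] += a * dt
            disp[i] += vel[i] * dt
        for i, w in enumerate((wx, wy, wz)):
            rot[i] += w * dt
    return disp, rot

# 100 samples at 100 Hz: constant acceleration along X, rotation about Z.
samples = [(0.5, 0.0, 0.0, 0.0, 0.0, 10.0)] * 100
disp, rot = integrate_imu(samples, dt=0.01)
print("displacement:", disp)   # ~0.25 m along X after one second
print("rotation:", rot)        # ~10 degrees about Z (if w is in deg/s)
```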
Specifically, the user changes the motion state of the wearable device by manipulating it while worn; the sensor assembly in the wearable device collects the pose data and transmits them to the head-mounted display device, which receives them.
Step 904: parsing the pose state from the pose data to obtain the pose state of the wearable device.
After receiving the pose data sent by the wearable device, the head-mounted display device parses them for the pose state; for example, it can analyze the motion state of the wearable device from the pose data and obtain the pose state from it. Since the sensor assembly detects the three-axis accelerations of the wearable device in a spatial coordinate system and the angular velocities of rotation about the three axes, the head-mounted display device can, as above, calculate the device's posture from the accelerations, judge its motion and rotation directions from the changes in acceleration and angular velocity, calculate its displacement and rotation angle on the three axes by integration, and synthesize its actual displacement from the three displacement components, thereby obtaining the motion state and determining the pose state of the wearable device.
Step 906: matching the pose state against a preset pose-operation mapping relation to obtain the interactive operation triggered through the wearable device.
After obtaining the pose state of the wearable device, the head-mounted display device can query a preset pose-operation mapping relation, which records the interactive operation corresponding to each pose state. The head-mounted display device matches the pose state against this mapping and obtains from the matching result the interactive operation triggered through the wearable device, thereby determining the interactive operation the user initiated by manipulating the wearable device.
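A minimal sketch of the mapping lookup is shown below. The pose-state labels and the operations they map to are illustrative assumptions, consistent with the examples in this document (clockwise wrist rotation triggering "confirm", counter-clockwise triggering "return/cancel").

```python
POSE_OPERATION_MAP = {
    "wrist_rotate_clockwise": "confirm",
    "wrist_rotate_counter_clockwise": "return_cancel",
    "arm_swing_left": "move_focus_left",
    "arm_swing_right": "move_focus_right",
    "arm_swing_up": "move_focus_up",
    "arm_swing_down": "move_focus_down",
}

def match_operation(pose_state):
    # Return the interactive operation for the parsed pose state,
    # or None if the pose state maps to no operation.
    return POSE_OPERATION_MAP.get(pose_state)

print(match_operation("wrist_rotate_clockwise"))   # -> "confirm"
print(match_operation("arm_swing_up"))             # -> "move_focus_up"
```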
In this embodiment, the head-mounted display device parses the pose data collected and sent by the wearable device, matches the resulting pose state against the preset pose-operation mapping relation, and obtains the triggered interactive operation from the matching result. The user can therefore trigger interactive operations simply by changing the pose of the wearable device, realizing interaction with the head-mounted display device while simplifying the mode of interactive operation and improving interaction efficiency.
The application further provides an application scenario for the interaction method of the head-mounted display device. Specifically, the method is applied in this scenario as follows:
The head-mounted display device can be an intelligent display device such as AR glasses, VR glasses or a smart television. A traditional head-mounted display device is controlled through a remote controller or a mobile phone. When the head-mounted display device is about to enter a functional application, such as during start-up or on the main interface, it is generally controlled through a remote controller or a mobile phone; after entering a specific functional application, an interaction mode is selected according to the requirements of that application.
Specifically, when the head-mounted display device is about to enter a functional application, controlling it through a remote controller is currently the most common approach. The remote controller connects to the head-mounted display device by wire or wirelessly and is generally provided with up/down and left/right direction keys, a return key, a confirmation key and a menu key. When the remote controller is awakened, the head-mounted display device highlights the cursor position in the current display interface; the user can move the cursor with the direction keys and perform operations such as confirmation and return.
Alternatively, a mobile phone can serve as the remote controller, which requires running a specific application. The phone connects to the head-mounted display device by wire or wirelessly; its current orientation can be calculated from the data of its IMU (Inertial Measurement Unit) sensor, and a ray is displayed on the screen of the head-mounted device to indicate that orientation. The user changes the area the ray points to by adjusting the phone's pose and operates on that area through buttons on the phone's screen. A reset-ray button on the phone's screen moves the ray to a specified position and calibrates it against the phone's current orientation.
However, a user wearing the head-mounted display device cannot see the real world, and hence cannot see the keys of the remote controller or mobile phone; operating them blind affects the accuracy and efficiency of interaction. Even with a head-mounted display device through which the virtual image and the real world are visible simultaneously, controlling with a remote controller or phone forces the user's attention to switch back and forth between the real world and the virtual image, and the eyes to refocus continually, harming both the immersive experience and the interaction efficiency.
Based on this, the interaction method of the head-mounted display device provided by this embodiment is applied to a head-mounted display device worn by the user, with a wearable device worn by the user connected to it. The wearable device may be any of various wearable devices such as a smart watch or a smart bracelet. As shown in fig. 10, the smart watch or smart bracelet may consist of a band 100, a case 200, a touch screen 300 and a sensor assembly, with the touch screen 300 and the sensor assembly mounted on the case 200 and the band 100 fixing the case 200 at the user's wrist. The wearable device establishes a wireless connection, for example over Bluetooth, with the head-mounted display device; after the connection succeeds, the control application for the head-mounted display device is opened automatically or by the user, and once the wearable device has opened it successfully, the head-mounted display device displays a ray 500 at a default position in the screen 400.
The sensor assembly can detect the accelerations along the X, Y and Z axes and the angular velocities of rotation about them. From the three-axis accelerations the head-mounted display device can calculate the posture of the wearable device; from the changes in acceleration and angular velocity it can judge the motion direction and rotation direction; by integration it can calculate the displacement and rotation angle on the three axes; and it can synthesize the actual displacement from the three displacement components. Specifically, the sensor assembly senses the motion state of the wearable device and obtains motion parameters such as direction, speed and distance; the wearable device sends these to the head-mounted display device, which resolves the motion data and changes the pointing direction of the ray 500 in the screen 400 accordingly. If the user rotates the wrist clockwise, the wearable device issues a "confirm" command, equivalent to "clicking" the position the ray 500 points to, and the screen 400 of the head-mounted display device switches to the corresponding content, i.e., executes the corresponding interactive event.
Specifically, as shown in fig. 11, after connecting to the head-mounted display device, the wearable device starts the interactive control application, collects the direction, speed and distance of the user's arm swing to obtain motion data, and sends the collected data to the head-mounted display device. The head-mounted display device resolves the received motion data, changes the ray direction and, when the user triggers the confirmation operation, executes the interactive event of the interactive element the ray points at. The sensors in the wearable device can detect its moving distances in the X, Y and Z directions and its rotation angles about the three axes; moving the wearable device up, down, left or right rotates the ray 500 in the screen 400 about its starting point in the corresponding direction, changing the position in the screen 400 that the ray 500 points to. The user may rotate the wrist clockwise to issue a "confirm" command and counter-clockwise to issue a "return/cancel" command; the gestures bound to these commands may be user-defined. In practice a user can rarely make the hand translate strictly up-down or left-right; in most usage scenarios the elbow rests on a chair armrest or a desk and the forearm swings up, down, left and right around the elbow. By detecting the motion components in the X, Y and Z directions, the sensors in the wearable device can recover the real motion trajectory of the user's forearm, so swinging the forearm changes where the ray 500 points in the screen 400 of the head-mounted display device.
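A minimal sketch of turning the wearable's up/down and left/right motion components into a change of the ray's pointing position on the screen is shown below. The sensitivity constant and the clamping behaviour are assumptions for illustration, not values from the patent.

```python
def update_ray(ray_x, ray_y, dx, dy, sensitivity=800.0,
               screen_w=1280, screen_h=720):
    """dx, dy: horizontal/vertical displacement components of the
    wearable (metres); returns the new ray position, clamped to the screen."""
    new_x = ray_x + dx * sensitivity
    new_y = ray_y - dy * sensitivity        # screen Y grows downwards
    new_x = max(0, min(screen_w - 1, new_x))
    new_y = max(0, min(screen_h - 1, new_y))
    return new_x, new_y

ray = (640, 360)                            # ray starts at the screen centre
ray = update_ray(*ray, dx=0.05, dy=0.02)    # small swing right and up
print("ray now points at", ray)
```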
In addition, the wearable device has a calibration function. When it is triggered, the ray 500 in the screen 400 of the head-mounted display device is reset, and the pose the wearable device holds at that moment is matched with the ray 500 at its reset position. The calibration function lets the user quickly find the ray 500 and adjust the hand posture, preventing the ray from being lost or the arm stroke from growing too large during control. It can be triggered by tapping the wearable device's touch screen 300 or by covering the touch screen 300; the specific trigger can be user-defined and defaults to covering the touch screen 300.
The wearable device also includes a vibration component through which the user's operations are fed back. When the user performs a valid forearm swing, the vibration component responds with vibration whose intensity is proportional to the swing amplitude; the vibration amplitude is greatest when the position the ray 500 points to reaches the edge of the screen 400 of the head-mounted display device, reminding the user that the ray 500 has reached its limit position. When the user rotates the wrist, the vibration component likewise provides feedback, giving the user a way to sense the wrist's rotation angle beyond the twist of the arm itself: once the rotation angle exceeds the threshold, the vibration stops, reminding the user that the rotation has successfully triggered the corresponding instruction to the head-mounted display device.
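A minimal sketch of the feedback rules in this scenario: intensity proportional to the forearm swing amplitude, maximal when the ray reaches the screen edge, and vibration stopping once the wrist rotation exceeds the confirmation threshold. The constants and the 0..1 intensity range are illustrative assumptions.

```python
def swing_intensity(swing_amplitude, ray_at_edge, gain=0.02):
    if ray_at_edge:
        return 1.0                            # maximum: ray at screen edge
    return min(gain * swing_amplitude, 0.99)  # proportional to amplitude

def rotation_intensity(rotation_deg, threshold_deg=45.0):
    # Vibrate while the wrist rotates; stop once the threshold is
    # exceeded, signalling that the command has been issued.
    return 0.0 if rotation_deg > threshold_deg else 0.6

print(swing_intensity(20.0, ray_at_edge=False))  # moderate swing
print(swing_intensity(20.0, ray_at_edge=True))   # edge reached -> 1.0
print(rotation_intensity(50.0))                  # threshold passed -> 0.0
```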
In this embodiment, for a virtual-reality head-mounted display device, the user cannot see the real world once wearing it and so cannot see the key positions of an existing remote controller or mobile phone, making misoperation likely and forcing the user to feel around for the correct keys. Here the user controls the head-mounted display device through hand movement, with no key presses at all, which improves both the accuracy and the speed of operation. For an augmented-reality head-mounted display device, although the user can see the real world while wearing it, controlling it with a remote controller or mobile phone makes the user's attention switch back and forth between the head-mounted display device and the controller, and the eyes refocus accordingly, degrading the experience. In this embodiment the user can keep full attention on the display screen of the head-mounted device and control it through simple hand movement alone, which improves the immersive experience and the interaction efficiency.
It should be understood that although the steps in the flowcharts of figs. 2, 9 and 11 are shown sequentially as indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, their execution is not strictly ordered and they may be performed in other orders. Moreover, at least some of the steps in figs. 2, 9 and 11 may comprise sub-steps or stages that are not necessarily completed at the same moment but may be executed at different times; these sub-steps or stages are not necessarily executed sequentially, and may be performed in turn, or in alternation, with other steps or with at least some of the sub-steps or stages of other steps.
Fig. 12 is a block diagram illustrating an interaction apparatus 1200 of a head-mounted display device according to an embodiment. The interaction apparatus 1200 of the head-mounted display device is applied to the head-mounted display device worn by the user, and the wearable device worn by the user is connected with the head-mounted display device. As shown in fig. 12, the interaction means 1200 of the head-mounted display device includes an interaction interface display module 1202 and an interaction operation response module 1204; wherein:
an interactive interface display module 1202, configured to display an interactive interface corresponding to the function application to be entered;
the interactive operation response module 1204 is used for responding to interactive operation triggered by interactive elements in the interactive interface and executing interactive events triggered by the interactive elements by the interactive operation; and the interactive operation is obtained by matching the pose data sent by the wearable equipment.
In an embodiment, the interactive interface display module 1202 is further configured to, in a case that the functional application is to be entered, display a device configuration interactive interface, and display an interactive element configured for the head-mounted display device in the device configuration interactive interface; the interactive operation response module 1204 is further configured to display a device configuration result obtained by configuring the head-mounted display device through the interactive operation and the interactive element in response to the interactive operation triggered by the interactive element in the device configuration interactive interface.
In an embodiment, the interactive interface display module 1202 is further configured to display an application information interactive interface under the condition that the functional application is to be entered, and display an interactive element of application description information corresponding to each functional application in the application information interactive interface; the interactive operation response module 1204 is further configured to respond to an interactive operation on an interactive element in the application information interactive interface, and display an application interactive result corresponding to the interactive element acted by the interactive operation.
In one embodiment, the interoperation response module 1204 includes a target interactive element determination module and a target interactive element execution module; wherein: the target interactive element determining module is used for responding to element selection operation triggered by interactive elements in the interactive interface and marking the target interactive elements selected by the element selection operation in the interactive interface; and the target interactive element execution module is used for executing the interactive event triggered by the target interactive element when the execution condition of the target interactive element is met.
In one embodiment, the apparatus further comprises a preselection mark display module for displaying the interaction preselection mark at a preset position in the interactive interface; the target interactive element determining module is further configured to move the interaction preselection mark from the preset position to the position corresponding to the target interactive element selected by the element selection operation.
In one embodiment, the target interactive element execution module is further configured to execute, in response to the execution confirmation operation triggered by the target interactive element, the interactive event triggered by the execution confirmation operation for the target interactive element.
In one embodiment, the system further comprises a calibration interface display module and a calibration result module; wherein: the calibration interface display module is used for responding to calibration operation triggered by the wearable equipment and displaying a calibration interface; the calibration result module is used for marking an interactive operation result corresponding to the calibration interactive operation in the calibration interface; and matching the calibration interactive operation with the current pose data of the wearable equipment.
In one embodiment, the apparatus further comprises a vibration feedback module for performing vibration feedback through a vibration component in the wearable device when vibration feedback is triggered.
In one embodiment, the system further comprises a pose data acquisition module, a pose data analysis module and an interactive operation determination module; wherein: the pose data acquisition module is used for acquiring pose data of the wearable equipment, and the pose data is acquired by a sensor assembly in the wearable equipment; the pose data analysis module is used for carrying out pose state analysis on the pose data to obtain a pose state of the wearable device; and the interactive operation determining module is used for matching to obtain interactive operation triggered by the wearable equipment based on the pose state and a preset pose operation mapping relation.
The division of each module in the interaction apparatus of the head-mounted display device is merely used for illustration, and in other embodiments, the interaction apparatus of the head-mounted display device may be divided into different modules as needed to complete all or part of the functions of the interaction apparatus of the head-mounted display device.
For the specific definition of the interaction apparatus of the head-mounted display device, reference may be made to the definition of the interaction method of the head-mounted display device above, which is not repeated here. The modules in the interaction apparatus can be implemented wholly or partially by software, hardware, or a combination thereof. The modules can be embedded in hardware form in, or independent of, a processor in the computer device, or stored in software form in a memory in the computer device, so that the processor can invoke and execute the operations corresponding to them.
Fig. 13 is a schematic diagram of the internal structure of an electronic device in one embodiment. The electronic device can be a head-mounted display device, specifically a display device realizing effects such as virtual reality, augmented reality or mixed reality. The electronic device includes a processor and a memory connected by a system bus. The processor may include one or more processing units and may be a CPU (Central Processing Unit), a DSP (Digital Signal Processor), or the like. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The computer program can be executed by the processor to implement the interaction method of the head-mounted display device provided in the embodiments of the present application. The internal memory provides a cached execution environment for the operating system and the computer program in the non-volatile storage medium.
The modules of the interaction apparatus of the head-mounted display device provided in the embodiments of the present application may each be implemented in the form of a computer program. The computer program may run on a terminal or a server. The program modules constituted by such computer programs may be stored on the memory of the electronic device; when the computer program is executed by a processor, the steps of the methods described in the embodiments of the present application are performed.
The embodiments of the present application also provide a computer-readable storage medium: one or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the interaction method of the head-mounted display device.
The embodiments of the present application also provide a computer program product containing instructions that, when run on a computer, cause the computer to perform the interaction method of the head-mounted display device.
Any reference to memory, storage, a database or another medium used herein may include non-volatile and/or volatile memory. Non-volatile memory may include ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory) or flash memory. Volatile memory may include RAM (Random Access Memory), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), SDRAM (Synchronous Dynamic Random Access Memory), DDR SDRAM (Double Data Rate Synchronous Dynamic Random Access Memory), ESDRAM (Enhanced Synchronous Dynamic Random Access Memory), SLDRAM (Synchronous Link Dynamic Random Access Memory), RDRAM (Rambus Dynamic Random Access Memory) and DRDRAM (Direct Rambus Dynamic Random Access Memory).
The above embodiments express only several implementations of the present application, and while their description is relatively specific and detailed, it should not be construed as limiting the scope of the application. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (12)

1. An interaction method of a head-mounted display device, applied to a head-mounted display device worn by a user, wherein a wearable device worn by the user is connected with the head-mounted display device; the method comprises:
displaying a corresponding interactive interface under the condition that a functional application is to be entered;
in response to an interactive operation triggered by an interactive element in the interactive interface, executing an interactive event triggered by the interactive operation aiming at the interactive element; and the interactive operation is obtained by matching according to the pose data sent by the wearable equipment.
2. The method of claim 1, wherein the displaying a corresponding interactive interface under the condition that a functional application is to be entered comprises:
displaying a device configuration interactive interface under the condition that a functional application is to be entered, and displaying, in the device configuration interactive interface, interactive elements configured for the head-mounted display device;
the executing, in response to an interactive operation triggered on an interactive element in the interactive interface, the interactive event triggered by the interactive operation for the interactive element comprises:
in response to an interactive operation triggered on an interactive element in the device configuration interactive interface, displaying a device configuration result of configuring the head-mounted display device through the interactive operation and the interactive element.
3. The method of claim 1, wherein the displaying a corresponding interactive interface under the condition that a functional application is to be entered comprises:
displaying an application information interactive interface under the condition that a functional application is to be entered, and displaying, in the application information interactive interface, interactive elements of the application description information corresponding to each functional application;
the executing, in response to an interactive operation triggered on an interactive element in the interactive interface, the interactive event triggered by the interactive operation for the interactive element comprises:
in response to an interactive operation on an interactive element in the application information interactive interface, displaying an application interaction result corresponding to the interactive element acted on by the interactive operation.
4. The method of claim 1, wherein the executing, in response to an interactive operation triggered on an interactive element in the interactive interface, the interactive event triggered by the interactive operation for the interactive element comprises:
in response to an element selection operation triggered on an interactive element in the interactive interface, marking a target interactive element selected through the element selection operation in the interactive interface;
and when the execution condition of the target interactive element is met, executing the interactive event triggered by the target interactive element.
5. The method of claim 4, wherein after displaying the interactive interface corresponding to the application of functionality to be entered, further comprising:
displaying an interaction preselection mark at a preset position in the interactive interface;
the marking out the target interactive element selected by the element selection operation in the interactive interface comprises:
and moving the interaction preselection mark from the preset position to a position corresponding to the target interaction element selected by the element selection operation.
6. The method of claim 4, wherein the executing the interaction event triggered by the target interactive element when the execution condition of the target interactive element is satisfied comprises:
in response to an execution confirmation operation triggered by the target interactive element, executing the interactive event triggered by the execution confirmation operation aiming at the target interactive element.
7. The method of claim 1, further comprising:
displaying a calibration interface in response to a calibration operation triggered by the wearable device;
marking an interactive operation result corresponding to the calibration interactive operation in the calibration interface; the calibration interactive operation is matched with the current pose data of the wearable device.
8. The method of claim 1, further comprising:
in the event that vibration feedback is triggered, vibration feedback is performed by a vibration component in the wearable device.
9. The method according to any one of claims 1 to 8, further comprising:
acquiring pose data of the wearable device, wherein the pose data is acquired by a sensor assembly in the wearable device;
analyzing the pose state of the pose data to obtain the pose state of the wearable equipment;
and matching to obtain the interactive operation triggered by the wearable equipment based on the pose state and a preset pose operation mapping relation.
10. An interaction apparatus of a head-mounted display device, applied to a head-mounted display device worn by a user, wherein a wearable device worn by the user is connected with the head-mounted display device; the apparatus comprises:
an interactive interface display module, configured to display a corresponding interactive interface under the condition that a functional application is to be entered;
an interactive operation response module, configured to respond to an interactive operation triggered on an interactive element in the interactive interface and execute the interactive event triggered by the interactive operation for the interactive element; the interactive operation is obtained by matching against the pose data sent by the wearable device.
11. An electronic device comprising a memory and a processor, the memory having stored therein a computer program, wherein the computer program, when executed by the processor, causes the processor to perform the steps of the method of interacting with a head mounted display device according to any of claims 1 to 9.
12. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 9.
CN202110689938.9A 2021-06-21 2021-06-21 Interaction method and device of head-mounted display equipment and electronic equipment Active CN113413585B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110689938.9A CN113413585B (en) 2021-06-21 2021-06-21 Interaction method and device of head-mounted display equipment and electronic equipment

Publications (2)

Publication Number Publication Date
CN113413585A true CN113413585A (en) 2021-09-21
CN113413585B CN113413585B (en) 2024-03-22

Family

ID=77789862

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110689938.9A Active CN113413585B (en) 2021-06-21 2021-06-21 Interaction method and device of head-mounted display equipment and electronic equipment

Country Status (1)

Country Link
CN (1) CN113413585B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023216930A1 (en) * 2022-05-12 2023-11-16 华为技术有限公司 Wearable-device based vibration feedback method, system, wearable device and electronic device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110069101A (en) * 2019-04-24 2019-07-30 洪浛檩 A kind of wearable calculating equipment and a kind of man-machine interaction method
CN111766937A (en) * 2019-04-02 2020-10-13 广东虚拟现实科技有限公司 Virtual content interaction method and device, terminal equipment and storage medium
CN112308569A (en) * 2020-11-19 2021-02-02 Oppo(重庆)智能科技有限公司 Application function calling method, device, terminal and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018107695A (en) * 2016-12-27 2018-07-05 フォーブ インコーポレーテッド Estimation system, estimation method, and estimation program
US10867593B1 (en) * 2018-02-08 2020-12-15 Facebook Technologies, Llc In-ear emitter configuration for audio delivery

Also Published As

Publication number Publication date
CN113413585B (en) 2024-03-22


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant