CN113413585B - Interaction method and device of head-mounted display equipment and electronic equipment

Interaction method and device of head-mounted display equipment and electronic equipment

Info

Publication number
CN113413585B
Authority
CN
China
Prior art keywords
interaction
interactive
interface
head
mounted display
Prior art date
Legal status
Active
Application number
CN202110689938.9A
Other languages
Chinese (zh)
Other versions
CN113413585A (en)
Inventor
李炳耀
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110689938.9A
Publication of CN113413585A
Application granted
Publication of CN113413585B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • A63F13/28Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285Generating tactile feedback signals via the game input device, e.g. force feedback
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1037Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted for converting control signals received from the game device into a haptic signal, e.g. using force feedback
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/105Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Cardiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to an interaction method and apparatus for a head-mounted display device, a computer device, and a storage medium. The method is applied to a head-mounted display device worn by a user, a wearable device worn by the user being connected to the head-mounted display device, and comprises the following steps: displaying a corresponding interactive interface when a functional application is to be entered; and, in response to an interactive operation triggered on an interactive element in the interactive interface, executing the interactive event triggered by the interactive operation for the interactive element, the interactive operation being obtained by matching against pose data sent by the wearable device. The method simplifies interactive operation of the head-mounted display device and improves interaction efficiency.

Description

Interaction method and device of head-mounted display equipment and electronic equipment
Technical Field
The present disclosure relates to the field of computer technology, and in particular to an interaction method and apparatus for a head-mounted display device, an electronic device, and a computer-readable storage medium.
Background
With the development of computer technology, intelligent electronic products such as head-mounted display devices implementing augmented reality and virtual reality technology are increasingly widely used. A head-mounted display device can move the user from a traditional two-dimensional screen to a three-dimensional space through augmented reality and virtual reality technology, providing a better visual experience. In current interactive control of head-mounted display devices, however, interactive operations are cumbersome and interaction efficiency is therefore low, which hinders interaction between the user and the head-mounted display device.
Disclosure of Invention
The embodiments of the present application provide an interaction method and apparatus for a head-mounted display device, an electronic device, and a computer-readable storage medium, which can simplify interactive operation of the head-mounted display device and improve interaction efficiency.
An interaction method for a head-mounted display device is applied to a head-mounted display device worn by a user, a wearable device worn by the user being connected to the head-mounted display device; the method comprises the following steps:
displaying a corresponding interactive interface when a functional application is to be entered;
in response to an interactive operation triggered on an interactive element in the interactive interface, executing the interactive event triggered by the interactive operation for the interactive element; the interactive operation is obtained by matching against pose data sent by the wearable device.
An interaction apparatus for a head-mounted display device is applied to a head-mounted display device worn by a user, a wearable device worn by the user being connected to the head-mounted display device; the apparatus comprises:
an interactive interface display module, configured to display a corresponding interactive interface when a functional application is to be entered;
an interactive operation response module, configured to respond to an interactive operation triggered on an interactive element in the interactive interface and execute the interactive event triggered by the interactive operation for the interactive element; the interactive operation is obtained by matching against pose data sent by the wearable device.
An electronic device comprises a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the following steps:
displaying a corresponding interactive interface when a functional application is to be entered;
in response to an interactive operation triggered on an interactive element in the interactive interface, executing the interactive event triggered by the interactive operation for the interactive element; the interactive operation is obtained by matching against pose data sent by the wearable device.
A computer-readable storage medium has stored thereon a computer program which, when executed by a processor, causes the processor to perform the following steps:
displaying a corresponding interactive interface when a functional application is to be entered;
in response to an interactive operation triggered on an interactive element in the interactive interface, executing the interactive event triggered by the interactive operation for the interactive element; the interactive operation is obtained by matching against pose data sent by the wearable device.
According to the interaction method and apparatus for a head-mounted display device, the electronic device, and the computer-readable storage medium described above, a corresponding interactive interface is displayed when a functional application is to be entered, and in response to an interactive operation obtained by matching against pose data of the wearable device connected to the head-mounted display device, the interactive event triggered by that operation for the interactive element is executed, thereby implementing interactive processing of the head-mounted display device. Because the interactive operation is matched from pose data sent by the wearable device connected to the head-mounted display device, the user can interact with the head-mounted display device through the pose of the wearable device before a functional application is entered; this simplifies the mode of interactive operation and improves interaction efficiency.
Drawings
In order to explain the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; for a person skilled in the art, other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is an application environment diagram of an interaction method of a head mounted display device in one embodiment;
FIG. 2 is a flow diagram of a method of interaction of a head mounted display device in one embodiment;
FIG. 3 is an interface diagram of a device configuration interactive interface in one embodiment;
FIG. 4 is an interface diagram of a selected interactive element in the device configuration interactive interface in one embodiment;
FIG. 5 is a schematic diagram of an interface for entering color settings in one embodiment;
FIG. 6 is an interface diagram of an application information interactive interface in one embodiment;
FIG. 7 is an interface diagram of a selected interactive element in the application information interactive interface in one embodiment;
FIG. 8 is a schematic diagram of an interface after entering a functional application in one embodiment;
FIG. 9 is a flow diagram of determining an interaction operation in one embodiment;
FIG. 10 is an application environment diagram of interactions of a head mounted display device in another embodiment;
FIG. 11 is a flow chart of a method of interaction of a head mounted display device in another embodiment;
FIG. 12 is a block diagram of an interactive apparatus of a head mounted display device in one embodiment;
FIG. 13 is a diagram of the internal structure of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
It will be understood that the terms "first," "second," and the like, as used herein, may be used to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another element. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
FIG. 1 is a schematic diagram of an application environment of an interaction method of a head-mounted display device in one embodiment. As shown in FIG. 1, the application environment includes a wearable device 102 worn by a user, a head-mounted display device 104 worn by the same user, and a server 106. The wearable device 102 communicates with the head-mounted display device 104 over a network, and the head-mounted display device 104 communicates with the server 106 over a network. After the user puts on the head-mounted display device 104, the server 106 can deliver display data to it for display; when the head-mounted display device 104 is about to enter a functional application, it displays a corresponding interactive interface, responds in that interface to the interactive operation obtained by matching against the pose data of the wearable device 102, and executes the interactive event triggered by that operation for the interactive element, thereby implementing interactive processing of the head-mounted display device 104. In addition, when the data to be displayed are stored locally on the head-mounted display device 104, it need not be connected to the server 106, and the interactive processing of the head-mounted display device 104 is implemented directly through the wearable device 102.
The wearable device 102 may be, but is not limited to, any of various portable wearable devices such as a smart watch or a smart bracelet; the head-mounted display device 104 may be, but is not limited to, any of various head-mounted display devices such as AR (Augmented Reality) glasses or VR (Virtual Reality) glasses; and the server 106 may be implemented as a stand-alone server or as a server cluster composed of a plurality of servers.
FIG. 2 is a flow diagram of an interaction method of a head-mounted display device in one embodiment. The interaction method in this embodiment is described using the example of its running on the head-mounted display device of FIG. 1, worn by the user and connected to the wearable device. As shown in FIG. 2, the interaction method of the head-mounted display device includes steps 202 to 204.
Step 202: display the corresponding interactive interface when a functional application is to be entered.
A functional application is an application that can be installed and run on the head-mounted display device and provides a corresponding functional service; it may be any of various types of application, such as a game application, a video application, a news-media application, or a social application. When a functional application runs on the head-mounted display device, it can be displayed in the display area of the device to present its application content. For example, the head-mounted display device may specifically be AR glasses or an AR helmet, or VR glasses or a VR helmet; after the user puts the device on, a three-dimensional image of the functional application can be displayed, and the user can interact within the displayed three-dimensional image, for example to play a game. VR (virtual reality) is a computer simulation technology for creating and experiencing a virtual world: a computer generates a simulated environment into which the user is immersed. Virtual reality technology combines data from real life with electronic signals generated by computer technology and various output devices to convert them into phenomena the user can perceive; these phenomena may be renderings of real objects or three-dimensional models of things invisible to the naked eye. AR (augmented reality) is a technology that enhances the user's perception of the real world with information supplied by a computer system; it superimposes computer-generated virtual objects, scenes, or content such as system prompts onto the real scene to augment or modify the perception of the real-world environment or of data representing it.
Different functional applications have corresponding interaction modes, such as touch, remote control, or gesture. After the head-mounted display device enters a functional application, it displays the application interaction interface corresponding to that application, in which the user can interact through the interaction modes the application supports. When the head-mounted display device has not entered a functional application, that is, when it is in the state of a functional application to be entered, it may display the interactive interface corresponding to that state, for example a start-up interactive interface during booting, the main interface after booting, or a configuration interface for configuring the device. In these interactive interfaces the device has not yet entered a functional application and cannot interact in an application-supported mode; interaction conventionally requires a remote controller associated with the head-mounted display device, with the corresponding interactive operations triggered by touching keys on the remote controller. For a head-mounted display device, however, a user who is already wearing the device cannot conveniently look at the remote controller, so interaction through the remote controller is difficult and interaction efficiency is low.
Specifically, when the head-mounted display device has not entered a functional application and is in the state of a functional application to be entered, it displays a corresponding interactive interface in which the user can perform interactions related to the device itself, for example controlling its start-up or shut-down, configuring its operating parameters, or viewing application information of the functional applications installed on it.
Step 204: in response to an interactive operation triggered on an interactive element in the interactive interface, execute the interactive event triggered by the interactive operation for the interactive element; the interactive operation is obtained by matching against pose data sent by the wearable device.
The wearable device is a device worn by the user who wears the head-mounted display device, such as a watch or a bracelet, and it is connected to the head-mounted display device, for example over Bluetooth, so that it can send its own pose data to the head-mounted display device, which matches the pose data to the corresponding interactive operation. Pose data are data characterizing the position and attitude of the wearable device. When the user triggers interaction with the head-mounted display device through the worn device, the user changes the position and attitude of the wearable device; for instance, a user wearing a smart watch can change its position and attitude by moving the arm and rotating the wrist. The wearable device captures position data and attitude data through built-in sensors, and its motion state can be determined from these data, yielding the pose data of the wearable device.
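The description leaves the concrete format of the pose data open. As a minimal sketch (all type and field names below are hypothetical, not taken from the patent), the pose stream sent over the Bluetooth link might be modeled as timestamped position and attitude samples, with the motion state derived from the difference between consecutive samples:

```kotlin
import kotlin.math.abs

// Hypothetical pose sample as the wearable device might report it:
// a timestamp, a position estimate, and an attitude as Euler angles.
data class PoseSample(
    val timestampMs: Long,
    val x: Float, val y: Float, val z: Float,          // position, meters
    val roll: Float, val pitch: Float, val yaw: Float  // attitude, degrees
)

// Difference between two consecutive samples, from which the motion state
// of the wearable device (arm moved, wrist rotated) can be inferred.
data class PoseDelta(val dx: Float, val dy: Float, val dRoll: Float) {
    val wristRotated: Boolean get() = abs(dRoll) > 30f               // assumed threshold
    val armMoved: Boolean get() = abs(dx) > 0.02f || abs(dy) > 0.02f // assumed threshold
}

fun delta(prev: PoseSample, curr: PoseSample): PoseDelta =
    PoseDelta(curr.x - prev.x, curr.y - prev.y, curr.roll - prev.roll)
```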
The wearable device sends the collected pose data to the head-mounted display device, which matches the pose data against candidate interactions and determines the triggered interactive operation from the matching result, for example movement, selection, confirmation, or cancellation. An interactive element is an element in the interactive interface that can respond to an interactive operation by triggering a corresponding interactive event; for example, interactive elements may be controls of various types, and triggering an interactive operation on a control executes the control's interactive event, as when a confirm button triggers a confirmation operation that commits the corresponding setting, thereby implementing interactive processing of the head-mounted display device.
Specifically, the user can trigger interactive operations in the interactive interface displayed by the head-mounted display device. Because each interactive operation is obtained by matching against pose data sent by the wearable device the user wears, the user can trigger different interactive operations on the interactive elements in the interactive interface by changing the pose of the wearable device. Once the user triggers an interactive operation on an interactive element, the head-mounted display device executes the interactive event that the operation triggers for that element, for example configuring the head-mounted display device or entering the selected functional application.
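The matching step itself is also left abstract. One plausible reading, reusing the PoseDelta type and thresholds from the sketch above, is a classifier that maps each pose delta to one of the operations named in the description; the gesture-to-operation mapping here is an assumption:

```kotlin
// Interaction operations named in the description. SELECT is derived
// downstream: a MOVE that lands the interaction focus on an element selects it.
enum class InteractionOp { MOVE, CONFIRM, CANCEL, NONE }

// Illustrative matcher: clockwise wrist rotation confirms, counter-clockwise
// cancels, and arm movement moves the interaction focus.
fun matchOperation(d: PoseDelta): InteractionOp = when {
    d.wristRotated && d.dRoll > 0f -> InteractionOp.CONFIRM
    d.wristRotated && d.dRoll < 0f -> InteractionOp.CANCEL
    d.armMoved -> InteractionOp.MOVE
    else -> InteractionOp.NONE
}
```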
According to the above interaction method for a head-mounted display device, a corresponding interactive interface is displayed when a functional application is to be entered, and the device executes, in response to an interactive operation matched from the pose data of the connected wearable device, the interactive event that the operation triggers for the interactive element, thereby implementing interactive processing of the head-mounted display device. Because the interactive operation is matched from pose data sent by the wearable device, the user can interact with the head-mounted display device through the pose of the wearable device before a functional application is entered, which simplifies the mode of interactive operation and improves interaction efficiency.
In one embodiment, displaying a corresponding interactive interface when a functional application is to be entered includes: displaying a device configuration interactive interface when a functional application is to be entered, and displaying, in the device configuration interactive interface, interactive elements for configuring the head-mounted display device.
The device configuration interactive interface is an interface for configuration interactions with the head-mounted display device. It may contain various interactive elements for configuring the device; for example, it may include interactive controls for configuring display parameters such as brightness, contrast, saturation, color scheme, and theme, and the user can trigger configuration of the corresponding display parameter through each control. Different configuration functions may have their own device configuration interactive interfaces containing different interactive elements, and the specific interface content can be set flexibly according to actual needs.
Specifically, when the head-mounted display device is in the state of a functional application to be entered, and in particular when the user triggers configuration processing for the device, the head-mounted display device displays the device configuration interactive interface corresponding to that configuration processing. Various interactive elements are displayed in this interface: they are interface elements for configuring the head-mounted display device, specifically controls of various types, which provide the user with an entry point for configuration interactions.
Further, executing, in response to an interactive operation triggered on an interactive element in the interactive interface, the interactive event triggered by the interactive operation for the interactive element includes: in response to an interactive operation triggered on an interactive element in the device configuration interactive interface, displaying the device configuration result of configuring the head-mounted display device through the interactive operation and the interactive element.
Specifically, for each interactive element displayed in the device configuration interactive interface, the user can produce different motion states by controlling the wearable device, so that different interactive operations are triggered from the wearable device's pose data. After the user triggers an interactive operation on an interactive element in the device configuration interactive interface, the head-mounted display device responds by determining the targeted interactive element, configuring the head-mounted display device through the interactive operation and that element, and displaying the device configuration result. While the configuration is being carried out, the head-mounted display device may display a waiting interface showing the progress of the configuration, and display the corresponding device configuration result once the configuration is completed.
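As an illustration of this configuration flow (the handler and the brightness parameter are hypothetical, not specified by the patent), a configuration interaction event might be applied stepwise, surfacing progress to a waiting interface before showing the device configuration result:

```kotlin
// Hypothetical configuration handler mirroring the flow described above:
// apply a display-parameter change, report progress, then show the result.
class ConfigHandler(
    private val showProgress: (Int) -> Unit,   // waiting interface callback
    private val showResult: (String) -> Unit   // device configuration result
) {
    fun applyBrightness(target: Int) {
        for (percent in listOf(25, 50, 75, 100)) {
            showProgress(percent)
        }
        showResult("Brightness set to $target")
    }
}

fun main() {
    ConfigHandler(
        showProgress = { println("configuring: $it%") },
        showResult = { println(it) }
    ).applyBrightness(80)
}
```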
In a specific application, as shown in FIG. 3, when the head-mounted display device is to enter a functional application, it displays the device configuration interactive interface, which includes an interactive element for color settings; the user changes the attitude of the worn smart bracelet by moving the arm to trigger the interactive operation, and the interaction focus is marked by the displayed ray. As shown in FIG. 4, the ray moves to the interactive element corresponding to the color settings. Further, as shown in FIG. 5, when the interactive operation on the color settings is triggered, the head-mounted display device executes the corresponding interactive event and enters the color-settings interface to set color parameters such as brightness, contrast, and saturation, thereby configuring the head-mounted display device.
In this embodiment, when the head-mounted display device is to enter a functional application, it displays the device configuration interactive interface and, in response to the interactive operation the user triggers on an interactive element there, displays the device configuration result of configuring the device through that operation and element. Configuration of the head-mounted display device can thus be carried out through the pose data of the wearable device before a functional application is entered, which simplifies the mode of interactive operation and improves interaction efficiency.
In one embodiment, displaying a corresponding interactive interface when a functional application is to be entered includes: displaying an application information interactive interface when a functional application is to be entered, and displaying, in the application information interactive interface, interactive elements of the application description information corresponding to each functional application.
The application information interactive interface is an interface for interacting with the application information of the functional applications on the head-mounted display device. It may contain various interactive elements for this purpose; for example, it may include interactive controls for accessing the functional applications, and specifically may display the application icon corresponding to each functional application, through which the user can trigger interaction with that application, such as accessing it, uninstalling it, or updating its information. The specific interface content corresponds to the functional applications the head-mounted display device supports, and the types of interactive element, such as application icons, application names, or application tab pages, can be set flexibly according to actual needs.
Specifically, when the head-mounted display device is in the state of a functional application to be entered, and in particular when the user triggers a preview of the device's functional applications, the head-mounted display device displays the corresponding application information interactive interface. Various interactive elements are displayed in this interface; they are interface elements representing the corresponding functional applications, specifically the application icons of the functional applications, which provide the user with an entry point for browsing or accessing the applications.
Further, executing, in response to an interactive operation triggered on an interactive element in the interactive interface, the interactive event triggered by the interactive operation for the interactive element includes: in response to an interactive operation on an interactive element in the application information interactive interface, displaying the application interaction result of the interactive operation acting on that interactive element.
Specifically, for each interactive element displayed in the application information interactive interface, the user can produce different motion states by controlling the wearable device, so that different interactive operations are triggered from the wearable device's pose data. After the user triggers an interactive operation on an interactive element, the head-mounted display device responds by determining the targeted element, processing it through the interactive operation, and displaying the corresponding application interaction result. In a specific implementation, while the interaction is being processed the device may display a waiting interface with progress information, and display the application interaction result once processing is completed. For example, if the user triggers an access operation on interactive element A in the application information interactive interface, indicating that the user wants to access the functional application corresponding to element A, the head-mounted display device can display the start-up animation of that application and enter it once it has started, thereby implementing access to the functional application.
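The access flow for a functional application admits a similar sketch, again with hypothetical names: on a confirmed access operation, play the application's start-up animation and then enter the application.

```kotlin
// Illustrative application access: on a confirmed access operation over an
// application icon, play the start-up animation, then enter the application.
data class FunctionalApp(val name: String)

fun onAccessConfirmed(
    selected: FunctionalApp?,
    playStartAnimation: (String) -> Unit,
    enter: (FunctionalApp) -> Unit
) {
    val app = selected ?: return  // no icon selected, nothing to access
    playStartAnimation(app.name)  // application starting animation
    enter(app)                    // enter the functional application
}
```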
In a specific application, as shown in FIG. 6, when the head-mounted display device is to enter a functional application, it displays the application information interactive interface, in which the application icons of four functional applications are shown; the user changes the attitude of the worn smart bracelet by moving the arm to trigger the interactive operation, and the focus of the interactive operation is marked by the displayed ray. As shown in FIG. 7, after selecting the application icon of the wild game, the user triggers access to it by rotating the wrist. As shown in FIG. 8, once access is triggered through the application icon, the head-mounted display device enters the wild game and displays its application content.
In this embodiment, the head-mounted display device displays the application information interactive interface when a functional application is to be entered and, in response to the interactive operation the user triggers on an interactive element there, displays the corresponding application interaction result. Interaction with the functional applications of the head-mounted display device can thus be carried out through the pose data of the wearable device before a functional application is entered, which simplifies the mode of interactive operation and improves interaction efficiency.
In one embodiment, executing, in response to an interactive operation triggered on an interactive element in the interactive interface, the interactive event triggered by the interactive operation for the interactive element includes: in response to an element selection operation triggered on the interactive elements in the interactive interface, marking in the interactive interface the target interactive element selected by the element selection operation; and executing the interactive event triggered for the target interactive element when the execution condition of the target interactive element is satisfied.
The element selection operation is an operation, triggered by the user in the interactive interface through the wearable device, of selecting an interactive element; it can act on any interactive element in the interface to determine the object the user wants to interact with. The target interactive element is the interactive element that the user's element selection operation picks out from the interactive elements in the interface for interaction. The execution condition is the condition that triggers execution of the interactive event corresponding to the target interactive element, and it can be set flexibly according to actual needs: for example, the condition may be considered satisfied when the element selection operation has lasted a preset duration after the target element was selected, or when the user, after selecting the target element, further triggers a confirmation of the corresponding interactive event.
Specifically, after the head-mounted display device displays the interactive interface, the user triggers an element selection operation in the interface through the wearable device; the interactive element the operation acts on is the target interactive element. The head-mounted display device can mark the selected target element in the interface, for example by highlighting it to indicate that it is the selected target, or by a ray or a selection mark, such as the ray pointing at the target element with its end point falling within the element's area. The head-mounted display device then checks whether the execution condition of the target interactive element is satisfied and, if so, executes the interactive event triggered for it, thereby implementing interactive processing of the head-mounted display device.
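The dwell-time variant of the execution condition (the selection lasting a preset duration) can be sketched directly; the 1500 ms threshold is an assumption, as the description says only "preset duration":

```kotlin
// Dwell-based execution condition: the interactive event fires once the same
// target interactive element has stayed selected for a preset duration.
class DwellCondition(private val requiredMs: Long = 1500L) {
    private var target: String? = null
    private var selectedSinceMs: Long = 0L

    // Returns true when the selection of [element] at [nowMs] satisfies the condition.
    fun update(element: String?, nowMs: Long): Boolean {
        if (element != target) {      // selection changed: restart the timer
            target = element
            selectedSinceMs = nowMs
            return false
        }
        return element != null && nowMs - selectedSinceMs >= requiredMs
    }
}
```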
In this embodiment, the user triggers an element selection operation in the interactive interface through the wearable device, and the head-mounted display device marks the selected target interactive element in the interactive interface and executes the interactive event for it once the execution condition is satisfied, which gives the user intuitive feedback on the selection, simplifies the mode of interactive operation, and improves interaction efficiency.
In one embodiment, after displaying the corresponding interactive interface when a functional application is to be entered, the method further includes: displaying an interaction preselection mark at a preset position in the interactive interface.
The interaction preselection mark is a mark that indicates the interaction focus corresponding to the user's interactive operation. It may specifically be a ray cast into the interactive interface displayed by the head-mounted display device, the end point of the ray being the interaction focus of the user's interactive operation; it may also take other forms, such as a mouse pointer or a selection box. The specific type of the mark can be configured in advance according to actual needs, or personalized by the user.
Specifically, after the head-mounted display device displays the corresponding interactive interface when a functional application is to be entered, it displays the interaction preselection mark at a preset position in the interface; for example, the mark may be displayed at the geometric midpoint of the interface to indicate that the interaction focus of the user's interactive operation is currently the geometric midpoint.
Further, marking in the interactive interface the target interactive element selected by the element selection operation includes: moving the interaction preselection mark from the preset position to the position corresponding to the target interactive element selected by the element selection operation.
Specifically, after the user triggers the element selection operation, the head-mounted display device moves the interaction preselection mark from the preset position to the position of the target interactive element selected by the operation, so that the mark indicates that the interaction focus is now the target element, that is, that the user has selected it for interaction. In a specific implementation, the head-mounted display device can analyze the pose data corresponding to the element selection operation, determine the target interactive element it selects, and move the mark from the preset position to the element's position.
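Under an assumed two-dimensional interface coordinate system (the patent fixes none), the preselection mark's behavior reduces to initializing it at the geometric midpoint and re-anchoring it to the selected element's position:

```kotlin
// Hypothetical preselection mark: starts at the geometric midpoint of the
// interactive interface and moves to the target interactive element's position.
data class Point(val x: Float, val y: Float)

class PreselectionMark(interfaceWidth: Float, interfaceHeight: Float) {
    var position: Point = Point(interfaceWidth / 2f, interfaceHeight / 2f)
        private set

    fun moveToElement(elementCenter: Point) {
        position = elementCenter  // the mark now indicates the target element
    }
}
```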
In this embodiment, the interaction focus of the user's interactive operation is marked by the interaction preselection mark, so the user can identify the interactive element from the object the mark indicates. The selected target interactive element is displayed intuitively, the user's interactive operations are fed back in real time, interaction with the target element is made convenient, and interaction efficiency is improved.
In one embodiment, executing the interactive event triggered for the target interactive element when the execution condition of the target interactive element is satisfied includes: in response to an execution confirmation operation triggered on the target interactive element, executing the interactive event triggered by the execution confirmation operation for the target interactive element.
The execution confirmation operation is an operation confirming that the event for the target interactive element should be executed; specifically, it may be a further operation the user triggers on the target element. For example, when the wearable device is a smart watch, the user may trigger the execution confirmation operation by rotating the wrist to change the watch's pose, such as rotating the wrist clockwise, thereby triggering the interactive event for the target interactive element.
Specifically, after selecting the target interactive element, the user may further trigger an execution confirmation operation through the wearable device, indicating confirmation that the interactive event of the selected target element should be executed. The head-mounted display device responds by executing the event triggered by the confirmation for the target element, for example accessing the functional application corresponding to the element, or applying to the head-mounted display device the configuration that matches the element.
In this embodiment, the head-mounted display device responds to the execution confirmation operation the user triggers on the target interactive element by executing the corresponding interactive event, so that interaction with the functional applications of the head-mounted display device can be carried out through the pose data of the wearable device before a functional application is entered, which simplifies the mode of interactive operation and improves interaction efficiency.
In one embodiment, the interaction method of the head-mounted display device further includes: displaying a calibration interface in response to a calibration operation triggered through the wearable device; and marking in the calibration interface the interactive operation result corresponding to the calibration interactive operation, the calibration interactive operation being matched against the current pose data of the wearable device.
The calibration operation is an operation for calibrating the pose of the wearable device. Calibrating the pose ensures the sensitivity and accuracy with which the user triggers interactive operations through the wearable device, and hence the interaction efficiency of the head-mounted display device. How the calibration operation is triggered can be set according to actual needs; for example, it may be triggered by the wearable device's pose data or by a calibration instruction, such as when a specific area of the wearable device, for instance the display area, is covered for a certain duration, or when the user sends a calibration instruction to the head-mounted display device through the wearable device. The calibration interface is the interface in which the pose of the wearable device is calibrated, and the user can trigger the calibration interactive operation there to calibrate the pose. The calibration interactive operation is matched against the current pose data of the wearable device: after the calibration operation is triggered and the head-mounted display device displays the calibration interface, the corresponding calibration interactive operation is determined by matching against the wearable device's current pose data.
Specifically, when the user needs to calibrate the pose of the wearable device to ensure the sensitivity and accuracy of interactive operations, the user can trigger the calibration operation through the wearable device, for example by covering a specific area of it; the head-mounted display device responds by displaying the calibration interface. The head-mounted display device then matches the corresponding calibration interactive operation according to the current pose data of the wearable device and marks the corresponding interactive operation result in the calibration interface. For example, it may mark the interaction focus corresponding to the calibration interactive operation at the geometric center of the calibration interface, indicating that the interaction focus corresponding to the current pose data of the worn wearable device is now the geometric center of the interface. By changing the pose data of the wearable device at the moment calibration is triggered, the user thus calibrates the pose of the wearable device.
In this embodiment, the head-mounted display device may respond to the calibration operation triggered by the user through the wearable device and display, in the calibration interface, the interactive operation result corresponding to the calibration interactive operation matched with the current pose data of the wearable device.
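As a purely illustrative sketch of this calibration behavior (all names are hypothetical, not the patent's implementation), the head-mounted display device could bind the wearable device's pose at the moment of calibration to the geometric center of the calibration interface:

    class PoseCalibrator:
        def __init__(self, interface_width, interface_height):
            # geometric center of the calibration interface
            self.center = (interface_width / 2.0, interface_height / 2.0)
            self.reference_pose = None

        def on_calibration(self, current_pose):
            """Store the current pose as the reference that maps to the
            interface center, and return where to mark the focus."""
            self.reference_pose = current_pose
            return self.center

Subsequent pose data would then be interpreted relative to reference_pose, so the marked interaction focus starts from the interface center.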
In one embodiment, the interaction method of the head-mounted display device further comprises: when vibration feedback is triggered, performing the vibration feedback through a vibration component in the wearable device.
The vibration component can be arranged in the wearable device to give vibration feedback to the user wearing it. Specifically, vibration feedback is considered necessary when an interactive operation triggered by the user through the wearable device takes effect, or when the action focus of the interactive operation reaches the edge of the interactive interface; in those cases the vibration component in the wearable device performs the feedback. In a specific implementation, the head-mounted display device monitors whether the condition for triggering vibration feedback is met and, if so, sends a vibration instruction to the wearable device, which controls the vibration component to provide feedback to the user according to the instruction. In a specific application, the vibration intensity of the vibration component can be set flexibly according to actual needs; for example, it can correspond to, and specifically be positively correlated with, the operation amplitude of the interactive operation, so that the larger the operation amplitude, the greater the vibration intensity. In addition, when the action focus of the interactive operation is located at the edge of the interactive interface, the vibration intensity may be set to its maximum to prompt the user that the edge of the operable area has been reached.
In this embodiment, when vibration feedback is triggered it is performed through the vibration component in the wearable device, so that the user receives feedback on the effect of an interactive operation in a timely manner. This facilitates the user's interaction and improves the interaction efficiency of the head-mounted display device.
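A minimal sketch of the intensity rule described above, assuming a hypothetical full-scale amplitude constant:

    MAX_AMPLITUDE = 100.0  # hypothetical full-scale operation amplitude

    def vibration_intensity(operation_amplitude, focus_at_edge):
        """Intensity grows with operation amplitude (positive correlation)
        and saturates at the maximum when the focus reaches the edge."""
        if focus_at_edge:
            return 1.0  # maximum intensity prompts the user at the edge
        return min(operation_amplitude / MAX_AMPLITUDE, 1.0)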
In one embodiment, as shown in fig. 9, the interaction method of the head-mounted display device further includes a process of determining interaction operation, specifically including:
Step 902, acquire pose data of the wearable device, wherein the pose data are collected by a sensor component in the wearable device.
Pose data are data characterizing the position and posture of the wearable device. When the user wearing the wearable device triggers interaction with the head-mounted display device, the user can change the position and posture of the wearable device; for example, a user wearing a smart watch can change its position and posture by moving the arm and rotating the wrist. The wearable device captures position data and posture data through built-in sensors, and the motion state of the wearable device can be determined from these data, yielding the pose data of the wearable device. The pose data are collected by a sensor component in the wearable device, which may specifically include a direction sensor, a speed sensor, a distance sensor, and the like.
In a specific implementation, the sensor component can detect the acceleration of the wearable device along the X, Y, and Z axes of a spatial coordinate system and the angular velocity of rotation about those axes. From the three-axis acceleration, the head-mounted display device can calculate the posture of the wearable device; from the changes in three-axis acceleration and angular velocity, it can determine the movement direction and rotation direction of the wearable device; by integrating the acceleration and angular velocity, it can calculate the displacement and rotation angle of the wearable device on each of the three axes; and by combining the displacement components on the three axes, it can synthesize the actual displacement of the wearable device and so determine its motion state.
Specifically, the user changes the motion state of the wearable device by manipulating it; the sensor component in the wearable device collects the pose data and sends it to the head-mounted display device, which receives it.
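The integration described above can be sketched as follows (illustrative only; a real IMU pipeline would also need gravity compensation and drift correction, which this sketch omits):

    def integrate_imu(accel_samples, gyro_samples, dt):
        """accel_samples: per-sample (ax, ay, az) in m/s^2;
        gyro_samples: per-sample (wx, wy, wz) in deg/s.
        Returns per-axis displacement, per-axis rotation angle, and the
        synthesized displacement magnitude."""
        velocity = [0.0, 0.0, 0.0]
        displacement = [0.0, 0.0, 0.0]
        rotation = [0.0, 0.0, 0.0]
        for accel, omega in zip(accel_samples, gyro_samples):
            for i in range(3):                       # X, Y, Z axes
                velocity[i] += accel[i] * dt         # integrate acceleration
                displacement[i] += velocity[i] * dt  # integrate velocity
                rotation[i] += omega[i] * dt         # integrate angular velocity
        magnitude = sum(d * d for d in displacement) ** 0.5
        return displacement, rotation, magnitude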
Step 904, perform pose-state analysis on the pose data to obtain the pose state of the wearable device.
After receiving the pose data sent by the wearable device, the head-mounted display device performs pose-state analysis on it, for example by analyzing the motion state of the wearable device from the pose data to obtain its pose state. As described above, when the sensor component detects the acceleration of the wearable device along the X, Y, and Z axes and the angular velocity of rotation about them, the head-mounted display device can calculate the posture of the wearable device from the three-axis acceleration, determine its movement and rotation directions from the changes in acceleration and angular velocity, calculate the per-axis displacement and rotation angles by integration, and synthesize the actual displacement from the three displacement components, thereby obtaining the motion state of the wearable device and determining its pose state.
Step 906, match the pose state against a preset pose-operation mapping relation to obtain the interactive operation triggered through the wearable device.
After the pose state of the wearable device is obtained, the head-mounted display device can query a preset pose-operation mapping relation, which records the interactive operation matched to each kind of pose state. The head-mounted display device matches the pose state against this mapping relation and obtains, from the matching result, the interactive operation triggered through the wearable device, thereby determining the interactive operation the user performed by manipulating the wearable device.
In this embodiment, the head-mounted display device analyzes the pose data collected and sent by the wearable device, matches the determined pose state against a preset pose-operation mapping relation, and obtains the triggered interactive operation from the matching result. The user can therefore trigger interactive operations on the head-mounted display device simply by changing the pose of the wearable device, which simplifies the interaction and improves interaction efficiency.
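By way of illustration, such a pose-operation mapping relation could be as simple as a lookup table; the pose-state labels below are hypothetical, not taken from the patent:

    POSE_OPERATION_MAP = {
        "swing_up": "move_focus_up",
        "swing_down": "move_focus_down",
        "swing_left": "move_focus_left",
        "swing_right": "move_focus_right",
        "rotate_clockwise": "confirm",
        "rotate_counterclockwise": "return_cancel",
    }

    def match_interaction(pose_state):
        """Match the analyzed pose state against the preset mapping;
        returns None when no interactive operation corresponds."""
        return POSE_OPERATION_MAP.get(pose_state)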
The interaction method of the head-mounted display device described above can be applied in the following application scenario. Specifically, the method is applied in this scenario as follows:
The head-mounted display device can be an intelligent display device, such as AR glasses, VR glasses, or a smart television. Conventionally, a head-mounted display device is controlled within functional applications by a remote controller, by a mobile phone serving as a remote controller, or by gesture recognition. When the head-mounted display device has yet to enter a functional application, for example during startup or on the main interface, it is generally controlled through a remote controller or a mobile phone; after entering a specific functional application, an interaction mode is chosen according to the needs of that application.
Specifically, when the head-mounted display device has yet to enter a functional application, control by a remote controller is currently the most common mode. The remote controller connects to the head-mounted display device in a wired or wireless manner and is generally provided with four direction keys (up, down, left, and right), a return key, a confirm key, and a menu key. When the remote controller is woken up, the head-mounted display device highlights the cursor position in the current display interface. The user can move the cursor with the direction keys and perform operations such as confirm and return.
Using a mobile phone as a remote controller, on the other hand, requires running a specific application. The mobile phone connects to the head-mounted display device in a wired or wireless manner; the phone's current pointing direction can be calculated from the data of its IMU (Inertial Measurement Unit) sensor, and a ray indicating that direction is displayed on the screen of the head-mounted display device. The user can change the area the ray points to by adjusting the pose of the phone, and can operate on the currently pointed area through buttons on the phone's display screen. The phone screen provides a reset-ray button; clicking it moves the ray to a designated position and calibrates it with the phone's current pointing direction.
However, a user wearing a virtual reality head-mounted display device cannot see the real world, and so cannot see the keys of the remote controller or mobile phone; the user can only operate them blindly, which affects interaction accuracy and efficiency. Even with a head-mounted display device through which the user can see the virtual image and the real world at the same time, control by remote controller or mobile phone requires the user's attention to switch back and forth between the real world and the virtual image, and the eyes to continually refocus, which harms the immersive experience and likewise affects interaction efficiency.
To address this, the interaction method of the head-mounted display device provided by this embodiment is applied to a head-mounted display device worn by a user, with a wearable device worn by the same user connected to it. The wearable device may be of various kinds, such as a smart watch or a smart bracelet. As shown in fig. 10, the smart watch or smart bracelet may consist of a wristband 100, a case 200, a touch screen 300, and a sensor component, with the touch screen 300 and the sensor component mounted on the case 200 and the wristband 100 securing the case 200 at the user's wrist. The wearable device connects to the head-mounted display device wirelessly, for example through Bluetooth. After a successful connection, the control application for the head-mounted display device is opened automatically or by the user, and once the application is opened the head-mounted display device displays a ray 500 at a default position in its screen 400.
The sensor component can detect acceleration along the X, Y, and Z axes and the angular velocity of rotation about them. From the three-axis acceleration the posture of the wearable device can be calculated; from the changes in acceleration and angular velocity its movement and rotation directions can be determined; by integrating acceleration and angular velocity its per-axis displacement and rotation angles can be calculated; and the actual displacement is synthesized from the three displacement components. Specifically, the sensor component senses the motion state of the wearable device and obtains motion parameters such as direction, speed, and distance; the wearable device sends these parameters to the head-mounted display device, which computes the motion data and changes the direction of the ray 500 in the screen 400 accordingly. If the user rotates the wrist clockwise, the wearable device issues a "confirm" command, equivalent to a "click" at the position the ray 500 points to, and the screen 400 of the head-mounted display device switches to the corresponding content, i.e., executes the corresponding interaction event.
Specifically, as shown in fig. 11, after the wearable device establishes a connection with the head-mounted display device, it starts the interactive control application, collects the direction, speed, and distance of the user's arm swing to obtain motion data, and sends the collected motion data to the head-mounted display device. The head-mounted display device computes on the received motion data, changes the ray direction, and, when the user triggers the confirmation operation, executes the interaction event of the interaction element the ray points to. The sensor in the wearable device can detect the distance moved along the X, Y, and Z directions and the angle rotated about the three axes; moving the wearable device up, down, left, and right rotates the ray 500 in the screen 400 correspondingly about its start point, redirecting the ray 500 to different positions in the screen 400. The user can rotate the wrist-worn device clockwise to issue a "confirm" command and counterclockwise to issue a "return/cancel" command; the gestures corresponding to these commands can be customized. In practice a user can rarely make the hand perform strictly vertical or horizontal translation; in most usage scenarios the elbow rests on a chair armrest or a table and the forearm swings up, down, left, and right about the elbow. By detecting the motion components in the X, Y, and Z directions, the sensor in the wearable device can recover the real motion trajectory of the user's forearm, so the direction of the ray 500 in the screen 400 of the head-mounted display device can be changed by swinging the forearm.
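As an illustrative sketch of this ray control (the gain and clamp values are assumptions, not from the patent), forearm-swing components could drive the yaw and pitch of ray 500, clamped so the ray stays within screen 400:

    GAIN = 0.5          # hypothetical degrees of ray rotation per unit of motion
    YAW_LIMIT = 45.0    # hypothetical horizontal limit of screen 400
    PITCH_LIMIT = 30.0  # hypothetical vertical limit

    class RayController:
        def __init__(self):
            self.yaw = 0.0    # left/right rotation about the ray start point
            self.pitch = 0.0  # up/down rotation

        def update(self, motion_x, motion_y):
            """Left/right and up/down swing components rotate the ray;
            clamping keeps it inside the screen, where maximum vibration
            feedback would be triggered."""
            self.yaw = max(-YAW_LIMIT, min(YAW_LIMIT, self.yaw + motion_x * GAIN))
            self.pitch = max(-PITCH_LIMIT, min(PITCH_LIMIT, self.pitch + motion_y * GAIN))
            return self.yaw, self.pitch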
In addition, the wearable device provides a calibration function. When it is triggered, the ray 500 in the screen 400 of the head-mounted display device is reset, and the pose of the wearable device at that moment is matched with the ray 500 at the reset position. The calibration function lets the user quickly locate the ray 500 and adjust the hand posture, preventing the ray from being lost or the arm travel from becoming excessive during control. The calibration function can be triggered by tapping or covering the touch screen 300 of the wearable device; the specific trigger can be customized, with covering the touch screen 300 as the default.
The wearable device further comprises a vibration component through which the user's operations are fed back. When the user performs an effective forearm swing, the vibration component responds with vibration whose intensity is proportional to the swing amplitude; when the position pointed to by the ray 500 reaches the edge of the screen 400 of the head-mounted display device, the vibration amplitude is at its maximum, reminding the user that the ray 500 has reached its limit. When the user rotates the wrist, the vibration component gives corresponding feedback, so that beyond the felt twist of the arm the user has one more way to sense the wrist's rotation angle; when the rotation angle exceeds the threshold, the vibration stops, signaling that the rotation has successfully triggered the corresponding instruction to the head-mounted display device.
In this embodiment, with a virtual reality head-mounted display device the user cannot see the real world when wearing it and, under existing remote controller or mobile phone control, cannot see the key positions, making misoperation likely and the correct keys slow to find. Here the user controls the head-mounted display device through hand movement with no key operation, which improves the accuracy and speed of operation. With an augmented reality head-mounted display device the user can see the real world when wearing it, but attention must switch back and forth between the head-mounted display device and the remote controller or mobile phone, and the eyes must refocus accordingly, harming the user experience. Here the user can keep full attention on the display screen of the headset and control it with simple hand movements alone, which improves the immersive experience and the interaction efficiency.
It should be understood that, although the steps in the flowcharts of fig. 2, 9, and 11 are shown in order as indicated by the arrows, these steps are not necessarily performed in order as indicated by the arrows. The steps are not strictly limited to the order of execution unless explicitly recited herein, and the steps may be executed in other orders. Moreover, at least some of the steps in fig. 2, 9, and 11 may include multiple sub-steps or multiple stages, which are not necessarily performed at the same time, but may be performed at different times, and the order of execution of the sub-steps or stages is not necessarily sequential, but may be performed in turn or alternately with at least some of the other steps or sub-steps of other steps.
Fig. 12 is a block diagram of an interaction apparatus 1200 of a head-mounted display device according to an embodiment. The interaction device 1200 of the head-mounted display device is applied to the head-mounted display device worn by a user, and the wearable device worn by the user is connected with the head-mounted display device. As shown in fig. 12, the interactive apparatus 1200 of the head-mounted display device includes an interactive interface display module 1202 and an interactive operation response module 1204; wherein:
the interactive interface display module 1202 is configured to display an interactive interface corresponding to a function application to be entered;
an interactive operation response module 1204, configured to execute an interactive event triggered by an interactive operation for an interactive element in response to the interactive operation triggered by the interactive element in the interactive interface; the interaction operation is obtained by matching according to pose data sent by the wearable equipment.
In one embodiment, the interactive interface display module 1202 is further configured to display, when a functional application is to be entered, a device configuration interactive interface in which the interaction elements configured for the head-mounted display device are displayed; the interactive operation response module 1204 is further configured to respond to an interactive operation triggered on an interaction element in the device configuration interactive interface by displaying the device configuration result of configuring the head-mounted display device through the interactive operation and the interaction element.
In one embodiment, the interactive interface display module 1202 is further configured to display an application information interactive interface when the function application is to be entered, and display interactive elements of application description information corresponding to each function application in the application information interactive interface; the interactive operation response module 1204 is further configured to respond to an interactive operation of the interactive element in the application information interactive interface, and display an application interaction result corresponding to the interactive element acted by the interactive operation.
In one embodiment, the interactive operation response module 1204 includes a target interactive element determination module and a target interactive element execution module; wherein: the target interactive element determining module is used for responding to the element selection operation triggered by the interactive elements in the interactive interface and marking the target interactive elements selected by the element selection operation in the interactive interface; and the target interaction element execution module is used for executing the interaction event triggered by the target interaction element when the execution condition of the target interaction element is met.
In one embodiment, the apparatus further comprises a preselection mark display module, configured to display the interaction preselection mark at a preset position in the interactive interface; the target interaction element determining module is further configured to move the interaction preselection mark from the preset position to the position corresponding to the target interaction element selected by the element selection operation.
In one embodiment, the target interactive element execution module is further configured to, in response to an execution validation operation triggered on the target interactive element, execute an interactive event triggered by the execution validation operation on the target interactive element.
In one embodiment, the device further comprises a calibration interface display module and a calibration result module; wherein: the calibration interface display module is used for responding to the calibration operation triggered by the wearable equipment and displaying a calibration interface; the calibration result module is used for marking interactive operation results corresponding to the calibration interactive operation in the calibration interface; the calibration interaction operation is matched with the current pose data of the wearable device.
In one embodiment, the device further comprises a vibration feedback module for performing vibration feedback by a vibration component in the wearable device in case the vibration feedback is triggered.
In one embodiment, the apparatus further comprises a pose data acquisition module, a pose data analysis module, and an interactive operation determination module; wherein: the pose data acquisition module is configured to acquire pose data of the wearable device, the pose data being collected by a sensor component in the wearable device; the pose data analysis module is configured to perform pose-state analysis on the pose data to obtain the pose state of the wearable device; and the interactive operation determination module is configured to obtain the interactive operation triggered through the wearable device by matching the pose state against a preset pose-operation mapping relation.
The division of the modules in the interaction device of the head-mounted display apparatus is merely for illustration, and in other embodiments, the interaction device of the head-mounted display apparatus may be divided into different modules as needed to complete all or part of the functions of the interaction device of the head-mounted display apparatus.
For specific limitations of the interaction apparatus of the head-mounted display device, reference may be made to the limitations of the interaction method of the head-mounted display device above, which are not repeated here. Each module in the interaction apparatus of the head-mounted display device may be implemented in whole or in part by software, hardware, or combinations thereof. The above modules may be embedded in hardware in, or independent of, a processor in the computer device, or may be stored as software in a memory in the computer device, so that the processor may call and execute the operations corresponding to the above modules.
Fig. 13 is a schematic diagram of the internal structure of an electronic device in one embodiment. The electronic device can be a head-mounted display device, in particular a display device for realizing virtual reality, augmented reality, mixed reality, and other effects. The electronic device includes a processor and a memory connected by a system bus. The processor may comprise one or more processing units and may be a CPU (Central Processing Unit), a DSP (Digital Signal Processor), or the like. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program; the computer program is executable by the processor to implement the interaction method of the head-mounted display device provided in the embodiments of the present application. The internal memory provides a cached operating environment for the operating system and the computer program in the non-volatile storage medium.
Each module in the interaction apparatus of the head-mounted display device provided in the embodiments of the present application may be implemented in the form of a computer program. The computer program may run on a terminal or a server, and its program modules may be stored in the memory of the electronic device. When the computer program is executed by a processor, the steps of the methods described in the embodiments of the present application are performed.
Embodiments of the present application also provide a computer-readable storage medium: one or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the interaction method of the head-mounted display device.

Embodiments of the present application also provide a computer program product containing instructions that, when run on a computer, cause the computer to perform the interaction method of the head-mounted display device.
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. The non-volatile memory may include ROM (Read-Only Memory), PROM (Programmable ROM), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), or flash memory. Volatile memory can include RAM (Random Access Memory), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as SRAM (Static RAM), DRAM (Dynamic RAM), SDRAM (Synchronous Dynamic RAM), DDR SDRAM (Double Data Rate Synchronous Dynamic RAM), ESDRAM (Enhanced Synchronous Dynamic RAM), SLDRAM (Synchronous Link Dynamic RAM), RDRAM (Rambus Dynamic RAM), and DRDRAM (Direct Rambus Dynamic RAM).
The above examples represent only a few embodiments of the present application; although described specifically and in detail, they are not to be construed as limiting the scope of the application. It should be noted that those skilled in the art could make various modifications and improvements without departing from the concept of the present application, all of which fall within its protection scope. Accordingly, the scope of protection of the present application is to be determined by the appended claims.

Claims (12)

1. An interaction method of a head-mounted display device, applied to a head-mounted display device worn by a user, wherein a wearable device worn by the user is connected with the head-mounted display device; the method comprises:

responding to a calibration operation triggered through the wearable device by displaying, in a calibration interface, an interactive operation result corresponding to the calibration interactive operation matched with the current pose data of the wearable device, which comprises:

matching the corresponding calibration interactive operation according to the current pose data of the wearable device, and marking the action focus corresponding to the calibration interactive operation at the geometric center point of the calibration interface, so as to mark the interactive operation result corresponding to the calibration interactive operation;

wherein the calibration operation is triggered when the display screen of the wearable device has been covered for a preset duration;

displaying the corresponding interactive interface when a functional application is to be entered;

responding to an interactive operation triggered on an interaction element in the interactive interface and, when the action focus of the interactive operation is located at an edge of the interactive interface or rotation of the wearable device is detected, sending a vibration instruction to the wearable device, wherein the vibration instruction is used by the wearable device to control a vibration component to give vibration feedback to the user, the vibration intensity of the feedback being proportional to the operation amplitude of the interactive operation;

receiving an instruction sent by the wearable device, the instruction being issued when the rotation angle of the wearable device exceeds a threshold, whereupon the vibration of the wearable device stops to prompt that the rotation has been successfully triggered; and executing the interaction event triggered by the interactive operation for the interaction element, wherein the interactive operation is obtained by matching according to the pose data sent by the wearable device.
2. The method according to claim 1, wherein displaying the corresponding interactive interface when a functional application is to be entered comprises:

displaying a device configuration interactive interface when a functional application is to be entered, wherein interaction elements configured for the head-mounted display device are displayed in the device configuration interactive interface;

and wherein responding to the interactive operation triggered on the interaction element in the interactive interface and executing the interaction event triggered by the interactive operation for the interaction element comprises:

responding to the interactive operation triggered on an interaction element in the device configuration interactive interface by displaying the device configuration result of configuring the head-mounted display device through the interactive operation and the interaction element.
3. The method according to claim 1, wherein displaying the corresponding interactive interface when a functional application is to be entered comprises:

displaying an application information interactive interface when a functional application is to be entered, and displaying in it the interaction elements of the application description information corresponding to each functional application;

and wherein responding to the interactive operation triggered on the interaction element in the interactive interface and executing the interaction event triggered by the interactive operation for the interaction element comprises:

responding to the interactive operation on an interaction element in the application information interactive interface by displaying the application interaction result corresponding to the interaction element acted on by the interactive operation.
4. The method according to claim 1, wherein responding to the interactive operation triggered on the interaction element in the interactive interface and executing the interaction event triggered by the interactive operation for the interaction element comprises:

responding to an element selection operation triggered on the interaction elements in the interactive interface by marking, in the interactive interface, the target interaction element selected by the element selection operation;

and executing the interaction event triggered for the target interaction element when the execution condition of the target interaction element is satisfied.
5. The method according to claim 4, further comprising, after displaying the corresponding interactive interface when a functional application is to be entered:

displaying an interaction preselection mark at a preset position in the interactive interface;

wherein marking the target interaction element selected by the element selection operation in the interactive interface comprises:

moving the interaction preselection mark from the preset position to the position corresponding to the target interaction element selected by the element selection operation.
6. The method according to claim 4, wherein executing the interaction event triggered for the target interaction element when the execution condition of the target interaction element is satisfied comprises:

responding to an execution confirmation operation triggered on the target interaction element by executing the interaction event triggered by the execution confirmation operation for the target interaction element.
7. The method according to claim 1, further comprising:

responding to a calibration operation triggered through the wearable device by displaying a calibration interface;

and marking in the calibration interface the interactive operation result corresponding to the calibration interactive operation, wherein the calibration interactive operation is matched with the current pose data of the wearable device.
8. The method according to claim 1, further comprising: when vibration feedback is triggered, performing the vibration feedback through the vibration component in the wearable device.
9. The method according to any one of claims 1 to 8, further comprising: acquiring pose data of the wearable device, wherein the pose data are collected by a sensor component in the wearable device;

performing pose-state analysis on the pose data to obtain the pose state of the wearable device;

and matching the pose state against a preset pose-operation mapping relation to obtain the interactive operation triggered through the wearable device.
10. An interaction apparatus of a head-mounted display device, applied to a head-mounted display device worn by a user, wherein a wearable device worn by the user is connected with the head-mounted display device; the apparatus comprises:

an interactive interface display module, configured to respond to a calibration operation triggered through the wearable device by displaying, in a calibration interface, the interactive operation result corresponding to the calibration interactive operation matched with the current pose data of the wearable device, which comprises: matching the corresponding calibration interactive operation according to the current pose data of the wearable device, and marking the action focus corresponding to the calibration interactive operation at the geometric center point of the calibration interface, so as to mark the interactive operation result corresponding to the calibration interactive operation, wherein the calibration operation is triggered when the display screen of the wearable device has been covered for a preset duration; and configured to display the corresponding interactive interface when a functional application is to be entered;

an interactive operation response module, configured to respond to an interactive operation triggered on an interaction element in the interactive interface and, when the action focus of the interactive operation is located at an edge of the interactive interface or rotation of the wearable device is detected, send a vibration instruction to the wearable device, wherein the vibration instruction is used by the wearable device to control a vibration component to give vibration feedback to the user, the vibration intensity of the feedback being proportional to the operation amplitude of the interactive operation; to receive an instruction sent by the wearable device, the instruction being issued when the rotation angle of the wearable device exceeds a threshold, whereupon the vibration of the wearable device stops to prompt that the rotation has been successfully triggered; and to execute the interaction event triggered by the interactive operation for the interaction element, wherein the interactive operation is obtained by matching according to the pose data sent by the wearable device.
11. An electronic device comprising a memory and a processor, the memory having stored therein a computer program, characterized in that the computer program, when executed by the processor, causes the processor to perform the steps of the interaction method of a head mounted display device as claimed in any of claims 1 to 9.
12. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method according to any one of claims 1 to 9.
CN202110689938.9A 2021-06-21 2021-06-21 Interaction method and device of head-mounted display equipment and electronic equipment Active CN113413585B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110689938.9A CN113413585B (en) 2021-06-21 2021-06-21 Interaction method and device of head-mounted display equipment and electronic equipment


Publications (2)

Publication Number Publication Date
CN113413585A CN113413585A (en) 2021-09-21
CN113413585B true CN113413585B (en) 2024-03-22

Family

ID=77789862

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110689938.9A Active CN113413585B (en) 2021-06-21 2021-06-21 Interaction method and device of head-mounted display equipment and electronic equipment

Country Status (1)

Country Link
CN (1) CN113413585B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117093068A (en) * 2022-05-12 2023-11-21 华为技术有限公司 Vibration feedback method and system based on wearable device, wearable device and electronic device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108259886A (en) * 2016-12-27 2018-07-06 Fove股份有限公司 Deduction system, presumption method and program for estimating
CN110069101A (en) * 2019-04-24 2019-07-30 洪浛檩 A kind of wearable calculating equipment and a kind of man-machine interaction method
CN111766937A (en) * 2019-04-02 2020-10-13 广东虚拟现实科技有限公司 Virtual content interaction method and device, terminal equipment and storage medium
US10867593B1 (en) * 2018-02-08 2020-12-15 Facebook Technologies, Llc In-ear emitter configuration for audio delivery
CN112308569A (en) * 2020-11-19 2021-02-02 Oppo(重庆)智能科技有限公司 Application function calling method, device, terminal and storage medium


Also Published As

Publication number Publication date
CN113413585A (en) 2021-09-21

Similar Documents

Publication Publication Date Title
EP3908906B1 (en) Near interaction mode for far virtual object
US9268400B2 (en) Controlling a graphical user interface
CN110456907A (en) Control method, device, terminal device and the storage medium of virtual screen
US20220269333A1 (en) User interfaces and device settings based on user identification
WO2014141504A1 (en) Three-dimensional user interface device and three-dimensional operation processing method
WO2013168171A1 (en) Method for gesture-based operation control
CN112817453A (en) Virtual reality equipment and sight following method of object in virtual reality scene
CN113413585B (en) Interaction method and device of head-mounted display equipment and electronic equipment
KR20180094875A (en) Information processing apparatus, information processing method, and program
WO2022204664A1 (en) Devices, methods, and graphical user interfaces for maps
CN110717993A (en) Interaction method, system and medium of split type AR glasses system
US20240019938A1 (en) Systems for detecting gestures performed within activation-threshold distances of artificial-reality objects to cause operations at physical electronic devices, and methods of use thereof
US20230092874A1 (en) Devices, Methods, and Graphical User Interfaces for Interacting with Three-Dimensional Environments
US20230367403A1 (en) Terminal device, virtual object manipulation method, and virtual object manipulation program
CN115002443A (en) Image acquisition processing method and device, electronic equipment and storage medium
JP7500866B2 (en) Wearable terminal device, program, and display method
CN112162631B (en) Interactive device, data processing method and medium
US20240103686A1 (en) Methods for controlling and interacting with a three-dimensional environment
CN118012265A (en) Man-machine interaction method, device, equipment and medium
WO2023275919A1 (en) Wearable terminal device, program, and display method
EP3958095A1 (en) A mobile computer-tethered virtual reality/augmented reality system using the mobile computer as a man machine interface
CN117827072A (en) Equipment control method and device and electronic equipment
JP2023168746A (en) Information processing apparatus, information processing system, information processing method, and program
CN113220110A (en) Display system and method
WO2023049111A1 (en) Devices, methods, and graphical user interfaces for interacting with three-dimensional environments

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant