CN114397961A - Head-mounted display device control method, head-mounted display device assembly, and medium - Google Patents


Info

Publication number
CN114397961A
CN114397961A (application CN202111627725.XA)
Authority
CN
China
Prior art keywords
head
cursor
display device
mounted display
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111627725.XA
Other languages
Chinese (zh)
Inventor
王佳辉
王文博
王俊杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Companion Technology Co ltd
Original Assignee
Hangzhou Companion Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Companion Technology Co ltd filed Critical Hangzhou Companion Technology Co ltd
Priority to CN202111627725.XA priority Critical patent/CN114397961A/en
Publication of CN114397961A publication Critical patent/CN114397961A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present disclosure disclose a head-mounted display device control method, a head-mounted display device assembly, and a medium. One embodiment of the method comprises: in response to determining that the head-mounted display device enters a cursor interaction mode, displaying a cursor on a display screen of the head-mounted display device and performing touch initialization setting on a touch area for the cursor interaction mode; in response to detecting that the mobile device has been displaced, controlling the cursor to move on the display screen according to displacement information of the mobile device; and in response to detecting a touch operation acting on the touch area, updating the display interface shown on the display screen according to the touch operation and the position of the cursor. This implementation enables efficient and flexible interaction in scenes where interaction is frequent and complex, improving the user experience.

Description

Head-mounted display device control method, head-mounted display device assembly, and medium
Technical Field
Embodiments of the present disclosure relate to the field of computer technologies, and in particular, to a head-mounted display device control method, a head-mounted display device assembly, and a medium.
Background
A head-mounted display device, for example AR (Augmented Reality) glasses or MR (Mixed Reality) glasses, allows a user to view displayed content while wearing it. The user can control the displayed content by interacting with the device. Currently, two interaction manners are common: the touch-sensitive display screen of a mobile phone is used as a touchpad, with interaction achieved through sliding operations on it; or a ray is displayed on the display screen of the head-mounted display device and controlled, for interaction, by an inertial sensor inside the device.
However, controlling the head-mounted display device in these ways often runs into the following technical problem:
the interaction is cumbersome and suits only applications with infrequent, simple interaction (such as video and audio playback); in scenes with frequent, complex interaction (such as office work), efficient and flexible interaction is hard to achieve, degrading the user experience.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose a head-mounted display device control method, a head-mounted display device assembly and a computer readable medium to solve the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a head-mounted display device control method applied to a head-mounted display device assembly, where the head-mounted display device assembly includes a head-mounted display device having a display screen and a mobile device having a touch area, and the mobile device is communicatively connected to the head-mounted display device, the method includes: in response to determining that the head-mounted display device enters a cursor interaction mode, displaying a cursor on a display screen of the head-mounted display device, and performing touch initialization setting on the touch area in the cursor interaction mode; in response to the detection that the mobile device generates displacement, controlling the cursor to move in a display screen of the head-mounted display device according to displacement information of the mobile device; and in response to the detection of the touch operation acting on the touch area, controlling a display interface displayed on a display screen of the head-mounted display device to update according to the touch operation and the position of the cursor on the display screen of the head-mounted display device.
In a second aspect, some embodiments of the present disclosure provide a head mounted display device assembly comprising: one or more processors; a head-mounted display device having a display screen disposed thereon, the display screen configured to display a cursor and a display interface; a mobile device having a touch-sensitive display screen disposed thereon, the touch-sensitive display screen configured to display a cursor control interface; a storage device having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement the method described in any of the implementations of the first aspect.
In a third aspect, some embodiments of the present disclosure provide a computer readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method described in any of the implementations of the first aspect.
The above embodiments of the present disclosure have the following advantages: with the head-mounted display device control method of some embodiments of the disclosure, efficient and flexible interaction can be realized in scenes with frequent, complex interaction, improving the user experience. Specifically, the existing interaction manners are cumbersome and suit only applications with infrequent, simple interaction for the following reasons. First, when the touch-sensitive display screen of a mobile phone serves as a touchpad and interaction is realized through sliding operations on it, a user holding the phone in one hand has a limited range of finger motion, making flexible interaction and control difficult. Second, when a ray is controlled by the inertial sensor in the head-mounted display device, the user must move their head to interact; in scenes with frequent, complex interaction this demands complex, frequent head movement, greatly degrading the wearing experience. Based on this, the head-mounted display device control method of some embodiments of the present disclosure first, in response to determining that the head-mounted display device enters a cursor interaction mode, displays a cursor on the display screen of the head-mounted display device and performs touch initialization setting on the touch area for the cursor interaction mode. Then, in response to detecting that the mobile device has been displaced, it controls the cursor to move on the display screen according to the displacement information of the mobile device.
Thus the mobile device can be used resting on a stand or held in mid-air; the range of finger motion is no longer a limitation, and flexible interaction becomes possible. Finally, in response to detecting a touch operation acting on the touch area, it updates the display interface shown on the display screen according to the touch operation and the position of the cursor. Interaction is therefore realized by combining the displacement of the mobile device with touch operations on the touch area, and this combination supports complex, frequent interaction, improving the user experience.
Drawings
The above and other features, advantages, and aspects of various embodiments of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements are not necessarily drawn to scale.
FIG. 1 is an architectural diagram of an exemplary system in which some embodiments of the present disclosure may be applied;
FIGS. 2-4 are schematic diagrams of an application scenario of a head-mounted display device control method of some embodiments of the present disclosure;
FIG. 5 is a flow diagram of some embodiments of a head mounted display device control method according to the present disclosure;
FIG. 6 is a flow chart of further embodiments of a head mounted display device control method according to the present disclosure;
FIG. 7 is a schematic structural diagram of a head mounted display device assembly suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the portions relevant to the invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that the modifiers "a", "an", and "the" in this disclosure are illustrative rather than limiting; those skilled in the art will understand them to mean "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is an architectural diagram 100 of an exemplary system in which some embodiments of the present disclosure may be applied.
As shown in fig. 1, exemplary system architecture 100 may include a head mounted display device assembly 11. Head mounted display device assembly 11 may include a head mounted display device 111 and a mobile device 112.
The head mounted display device 111 may include at least one display screen 1111. The display screen is used for imaging in front of the user's eyes. In addition, head mounted display device 111 also includes a frame 1112. In some embodiments, the sensors, processing unit, memory, and battery of head mounted display device 111 can be placed inside the frame 1112. In some optional implementations of some embodiments, one or more of the sensors, processing unit, memory, and battery may instead be integrated into a separate accessory (not shown) connected to the frame 1112 via a data cable. In some optional implementations of some embodiments, the head mounted display device 111 may provide only display functionality and some of the sensors, while data processing, data storage, and power supply are provided by the mobile device 112.
A touch region 1121 is disposed on the mobile device 112. The mobile device 112 and the head mounted display device 111 may communicate through a wireless connection or a wired connection. In some optional implementations of some embodiments, a touch-sensitive display screen (not shown) is further disposed on the mobile device 112, and the touch region 1121 may be a touch-sensitive display screen region.
It should be understood that the number of head mounted display devices and mobile devices in fig. 1 is merely illustrative. There may be any number of head mounted display devices and mobile devices, as desired for implementation.
Fig. 2-4 are schematic diagrams of one application scenario of a head mounted display device control method of some embodiments of the present disclosure.
The head-mounted display device assembly includes a head-mounted display device 201 having a display screen 2011 and a mobile device 202 having a touch area 2021; the mobile device 202 is communicatively connected to the head-mounted display device 201. In some of these embodiments, the mobile device 202 may default to a normal interaction mode. In the normal interaction mode, the touch-sensitive display screen of the mobile device 202 (e.g., a mobile phone) may be used as a touchpad, with interaction achieved through sliding operations on it; alternatively, a ray is displayed on the display screen of the head-mounted display device and controlled, for interaction, by an inertial sensor in the device.
In the application scenario of fig. 2, in response to determining that the head-mounted display device 201 enters the cursor interaction mode, the computing device may display a cursor 203 on a display screen 2011 of the head-mounted display device 201, and perform a touch initialization setting in the cursor interaction mode on the touch area 2021.
In the application scenario of fig. 3, in response to detecting that the mobile device 202 generates a displacement, the computing device may control the cursor 203 to move in the display screen 2011 of the head-mounted display device 201 according to the displacement information of the mobile device 202.
In the application scenario of fig. 4, in response to detecting the touch operation 204 acting on the touch area 2021, the computing device may control the display interface 205 displayed on the display screen 2011 of the head-mounted display device 201 to update according to the position of the cursor 203 on the display screen 2011 of the head-mounted display device 201 and the touch operation 204.
The computing device may be hardware or software. When the computing device is hardware, it may be implemented as a distributed cluster formed by a plurality of servers or terminal devices, or may be implemented as a single server or a single terminal device, for example, the computing device may be the head-mounted display device or the mobile device. When the computing device is embodied as software, it may be installed in the hardware devices enumerated above. It may be implemented, for example, as multiple software or software modules to provide distributed services, or as a single software or software module. And is not particularly limited herein.
It should be understood that the number of computing devices, head mounted display devices, and mobile devices described above are merely illustrative. There may be any number of computing devices, head mounted display devices, and mobile devices, as desired for implementation.
With continued reference to fig. 5, a flow 500 of some embodiments of a head mounted display device control method according to the present disclosure is shown. The head-mounted display device control method is applied to a head-mounted display device assembly, the head-mounted display device assembly comprises a head-mounted display device with a display screen and a mobile device with a touch area, the mobile device is in communication connection with the head-mounted display device, and the method comprises the following steps:
step 501, in response to determining that the head-mounted display device enters a cursor interaction mode, displaying a cursor on a display screen of the head-mounted display device, and performing touch initialization setting on a touch area in the cursor interaction mode.
In some embodiments, entry into the cursor interaction mode may be confirmed by the user through a physical or virtual key, or determined from an interaction-mode instruction issued when an application starts, causing the head-mounted display device 201 to enter the cursor interaction mode. In one embodiment, an interactive key for mode switching can be provided on the mobile device, so that the head-mounted display device can be switched between the normal interaction mode and the cursor interaction mode to meet users' needs in different scenes. In another embodiment, the cursor interaction mode may be entered automatically depending on the currently running application. For example, the head-mounted display device may be switched into the cursor interaction mode from the normal interaction mode or another interaction mode according to a request the application sends to the system at startup; in this embodiment, when the application stops running, the head-mounted display device also automatically exits the cursor interaction mode. In yet another embodiment, whether to enter the cursor interaction mode may be determined from the current display mode of the head-mounted display device. For example, head-mounted display modes can generally be divided into a 2D mode and a 3D mode; in the 2D mode, all display content is laid out in one plane, where cursor interaction is more intuitive for the user, so the head-mounted display device can be set to the cursor interaction mode in response to entering the 2D mode.
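As a sketch, the three entry triggers above (an explicit mode-switch key, an application's startup request, and the 2D display mode) could be combined as follows. This is illustrative Python, not from the patent; all names are hypothetical.

```python
from enum import Enum, auto

class InteractionMode(Enum):
    NORMAL = auto()   # touchpad / ray interaction
    CURSOR = auto()   # cursor interaction mode

def resolve_interaction_mode(key_toggled, app_requested_cursor, display_mode):
    """Return the interaction mode the headset should enter.

    Any one trigger suffices: the user pressed the mode-switch key,
    the starting application requested cursor interaction, or the
    headset is in the 2D display mode, where all content sits in one
    plane and a cursor is more intuitive.
    """
    if key_toggled or app_requested_cursor or display_mode == "2D":
        return InteractionMode.CURSOR
    return InteractionMode.NORMAL
```

A real implementation would also emit the exit transition when the requesting application stops; the sketch only decides the target mode.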
In some embodiments, an executing body of the head-mounted display device control method (e.g., the head-mounted display device 111 or the mobile device 112 shown in fig. 1) may, in response to determining that the head-mounted display device enters a cursor interaction mode, display a cursor on the display screen of the head-mounted display device and perform touch initialization setting on the touch area for the cursor interaction mode. The cursor interaction mode may be a mode in which the display interface on the display screen of the head-mounted display device is updated through the movement and operation of a cursor. The head-mounted display device may enter the cursor interaction mode in scenes such as mobile office or video entertainment. The mobile device may be a control device without a display screen, in which case the touch area has no display function and provides only touch input. When the head-mounted display device enters the cursor interaction mode, the cursor may be displayed at the center of the display screen, or at a preset position on it.
The touch area may be divided into a left-key touch area and a right-key touch area according to preset configuration information. The configuration information may be the preset position information of the left-key and right-key touch areas, where the position information consists of coordinates in the touch area's coordinate system; the touch area can thus be divided by the configuration information. Optionally, the executing body's touch initialization setting of the touch area in the cursor interaction mode may include the following step:
and dividing the touch area into a left key touch area, a right key touch area and a cursor sensitivity adjusting area according to preset configuration information.
In this way, setting and adjusting the cursor sensitivity can meet users' needs in different application scenes.
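The division by preset configuration information might look like the following sketch. The region proportions are illustrative assumptions; the patent only names the three regions.

```python
def build_touch_layout(width, height):
    """Divide the touch area into left-key, right-key and
    cursor-sensitivity regions. Rectangles are (x0, y0, x1, y1)
    in the touch area's own coordinate system; the bottom-strip
    proportion for sensitivity adjustment is a made-up choice.
    """
    sensitivity_h = height // 5
    return {
        "left_key":    (0, 0, width // 2, height - sensitivity_h),
        "right_key":   (width // 2, 0, width, height - sensitivity_h),
        "sensitivity": (0, height - sensitivity_h, width, height),
    }

def classify_touch(layout, x, y):
    """Return the name of the region a touch point falls in, or None."""
    for name, (x0, y0, x1, y1) in layout.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None
```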
And step 502, in response to detecting that the mobile device generates displacement, controlling a cursor to move in a display screen of the head-mounted display device according to displacement information of the mobile device.
In some embodiments, the executing body may, in response to detecting that the mobile device has been displaced, control the cursor to move on the display screen of the head-mounted display device according to the displacement information of the mobile device. A laser displacement sensor may be arranged in the head-mounted display device and used to determine the displacement information of the mobile device.
First, the laser displacement sensor in the head-mounted display device detects the displacement of the mobile device along the three coordinate axes of three-dimensional space, yielding three-dimensional displacement information. This is then converted into the screen coordinate system of the head-mounted display device to obtain two-dimensional displacement information, which is finally used to control the cursor's movement on the display screen.
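The conversion chain just described could be sketched as follows. The flat projection (dropping the depth axis) and the scale factor are hypothetical simplifications of the device's calibrated sensor-to-screen transform.

```python
def to_screen_displacement(d3, scale=1000.0):
    """Project a 3-axis displacement (dx, dy, dz) of the mobile
    device into 2D screen coordinates. This sketch discards the
    depth axis and scales to pixels; screen y grows downward."""
    dx, dy, _dz = d3
    return (dx * scale, -dy * scale)

def move_cursor(cursor_pos, d3, screen_size, scale=1000.0):
    """Apply the 2D displacement and clamp the cursor to the
    display screen bounds."""
    sx, sy = to_screen_displacement(d3, scale)
    w, h = screen_size
    x = min(max(cursor_pos[0] + sx, 0), w - 1)
    y = min(max(cursor_pos[1] + sy, 0), h - 1)
    return (x, y)
```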
In some optional implementations of some embodiments, the touch initialization setting further includes setting the cursor sensitivity. The cursor sensitivity comprises a cursor movement speed and a touch operation response speed; the latter controls how quickly touch operations acting on the touch area are responded to. Accordingly, the executing body's controlling of the cursor to move on the display screen of the head-mounted display device according to the displacement information of the mobile device further comprises:
controlling the cursor to move on the display screen of the head-mounted display device according to the set cursor sensitivity and the displacement information of the mobile device.
Step 503, in response to detecting the touch operation acting on the touch area, controlling a display interface displayed in the display screen of the head-mounted display device to update according to the touch operation and the position of the cursor in the display screen of the head-mounted display device.
In some embodiments, the executing body may, in response to detecting a touch operation acting on the touch area, control the display interface shown on the display screen of the head-mounted display device to update according to the touch operation and the position of the cursor on the display screen. The touch operation may act on the left-key or right-key touch area and may be a single click, a double click, a long press, and the like. Optionally, the touch operation may also act on the cursor sensitivity adjustment area.
In practice, the executing body may first generate a control instruction according to the touch operation and the position of the cursor on the display screen of the head-mounted display device, and then update the display interface shown on the display screen according to the control instruction.
As an example, the display interface shown on the display screen of the head-mounted display device may be an office document interface, the touch operation may be a single click on the left-key touch area, and the position of the cursor may correspond to a close control in the office document interface. A document-close instruction may then be generated, and the office document closed according to it, thereby updating the display interface shown on the display screen of the head-mounted display device.
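Generating a control instruction from the touch operation plus the cursor position could be sketched as below. The `hit_test` callback, the region names, and the instruction tuples are all hypothetical.

```python
def generate_control_instruction(region, touch_op, cursor_pos, hit_test):
    """Translate a touch operation and the cursor position into a
    control instruction for the display interface. `hit_test` maps
    a screen position to the interface control under the cursor
    (e.g. a document's close control), or None if there is none."""
    control = hit_test(cursor_pos)
    if control is None:
        return None
    if region == "left_key" and touch_op == "click":
        return ("activate", control)      # e.g. close the document
    if region == "right_key" and touch_op == "click":
        return ("context_menu", control)
    if touch_op == "long_press":
        return ("drag_start", control)
    return None
```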
In some optional implementations of some embodiments, the executing body may perform the following steps:
the method comprises the steps of responding to the fact that a scene corresponding to the display interface is detected to be an input scene, and performing input initialization setting on the touch area under the input scene. The input scene may be an office input scene, a search input scene, a chat input scene, and the like. The initialization setting may be to set the entire touch area as a handwriting input area.
In a second step, in response to detecting a touch input operation acting on the touch area, the input information is displayed in the display interface on the display screen of the head-mounted display device. The touch input operation may be a handwriting input operation, and the input information may be characters or figures.
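The two steps above might be sketched as follows. Handwriting recognition itself is outside the sketch, and all names are hypothetical.

```python
def input_initialize(width, height):
    """On entering an input scene (office, search or chat input),
    re-initialise the touch area: the whole area becomes one
    handwriting input region, replacing the left/right-key layout."""
    return {"handwriting": (0, 0, width, height)}

def handle_touch_input(interface_text, recognized):
    """Append the characters recognised from the handwriting strokes
    to the text shown in the display interface."""
    return interface_text + recognized
```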
Optionally, the controlling the display interface displayed in the display screen of the head-mounted display device to update further includes: and adjusting the display style of the cursor to the display style corresponding to the touch operation. A corresponding UI (User Interface) icon may be set in advance for each touch operation, so that different display styles of the cursor are embodied by different UI icons. When a user performs a touch operation in the touch area, a cursor displayed on a display screen of the head-mounted display device may be adjusted to a UI icon corresponding to the touch operation. Therefore, the interactive experience of the user can be improved.
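The per-operation cursor styles could be kept in a simple mapping, as in this sketch. The icon names are hypothetical; the patent says only that each touch operation has a preset UI icon.

```python
# Hypothetical mapping from touch operation to preset cursor UI icon.
CURSOR_ICONS = {
    "click": "cursor_click.png",
    "double_click": "cursor_double.png",
    "long_press": "cursor_grab.png",
}

def cursor_style_for(touch_op, default="cursor_arrow.png"):
    """Pick the cursor's display style for the current touch
    operation, falling back to the default arrow icon."""
    return CURSOR_ICONS.get(touch_op, default)
```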
The above embodiments of the present disclosure have the following advantages: by the head-mounted display equipment control method of some embodiments of the disclosure, efficient and flexible interaction can be realized in scenes with frequent interaction and complex interaction, and user experience is improved. Specifically, the reason why the existing interaction mode is complex and is only applicable to application scenarios with less interaction and simple interaction is that: firstly, a touch sensitive display screen of a mobile phone is used as a touch pad, and interaction is realized through sliding operation on the touch pad. In this way, the mobile phone needs to be placed on the support, and the mobile phone can be slid on the touch pad in a large range through fingers, so that interaction and control are realized. When a user holds the mobile phone by one hand, the range of motion of the fingers of the user is limited, and flexible interaction and control are difficult to realize. Secondly, the rays are controlled by the inertial sensor in the head-mounted display device, the head of a user who needs to wear the head-mounted display device moves to realize interaction, and in a scene with frequent and complex interaction, the user needs to perform complex and frequent head movement, so that the experience of the user when wearing the head-mounted display device is greatly reduced. Based on this, the head-mounted display device control method according to some embodiments of the present disclosure first displays a cursor on a display screen of the head-mounted display device in response to determining that the head-mounted display device enters a cursor interaction mode, and performs a touch initialization setting in the cursor interaction mode on the touch area. 
Then, in response to detecting that the mobile device is displaced, the cursor is controlled to move in the display screen of the head-mounted display device according to the displacement information of the mobile device. In this way, the mobile device can be placed on a support or held in the air; the range of motion of the fingers is no longer a limitation, and flexible interaction can be achieved. Finally, in response to detecting a touch operation acting on the touch area, the display interface displayed in the display screen of the head-mounted display device is controlled to update according to the touch operation and the position of the cursor in the display screen. Interaction thus combines the displacement of the mobile device with touch operations acting on the touch area, enabling complex and frequent interaction and improving the user experience.
With further reference to FIG. 6, a flow 600 of further embodiments of a head mounted display device control method is shown. The process 600 of the method for controlling a head-mounted display device is applied to a head-mounted display device assembly, where the head-mounted display device assembly includes a head-mounted display device having a display screen and a mobile device having a touch area, and the mobile device is in communication connection with the head-mounted display device, and includes the following steps:
step 601, in response to determining that the head-mounted display device enters the cursor interaction mode, displaying a cursor in a display screen of the head-mounted display device, and performing touch initialization setting on a touch area in the cursor interaction mode.
In some embodiments, an executing body of the head-mounted display device control method (e.g., the head-mounted display device 111 shown in fig. 1 or the mobile device 112, such as a computing device) may display a cursor on a display screen of the head-mounted display device in response to determining that the head-mounted display device enters a cursor interaction mode, and perform touch initialization setting on the touch area in the cursor interaction mode. The mobile device may have a touch-sensitive display screen, and the touch area may be the display area of that touch-sensitive display screen. For example, the mobile device may be a smartphone with a touch-sensitive display screen. The cursor interaction mode may be a mode in which the display interface in the display screen of the head-mounted display device is controlled to be updated through the movement and operation of the cursor. The head-mounted display device may enter the cursor interaction mode in scenarios such as mobile office or video entertainment. When the head-mounted display device enters the cursor interaction mode, the cursor may be displayed in the center of the display screen, or at another preset position in the display screen.
The touch initialization setting in the cursor interaction mode for the touch area may be displaying a cursor control interface on the touch-sensitive display screen of the mobile device. The cursor control interface may include a virtual left key and a virtual right key, which may be controls displayed in the cursor control interface. The touch initialization setting may further include setting the cursor sensitivity. The cursor sensitivity comprises a cursor movement speed and a touch operation response speed, the latter controlling how quickly touch operations acting on the touch area are responded to.
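The initialization state just described — the two virtual keys plus the two sensitivity components — can be modeled as a small settings structure. This is a minimal sketch under stated assumptions; the field names and default values are illustrative, not from the specification:

```python
from dataclasses import dataclass


@dataclass
class CursorSensitivity:
    """The two components of cursor sensitivity named in the text."""
    move_speed: float       # cursor movement speed (illustrative units)
    response_speed_ms: int  # touch-operation response time in milliseconds


@dataclass
class TouchInitSettings:
    """State established by touch initialization in cursor interaction mode."""
    show_virtual_left_key: bool
    show_virtual_right_key: bool
    sensitivity: CursorSensitivity


def default_touch_init() -> TouchInitSettings:
    # Default values are hypothetical placeholders.
    return TouchInitSettings(True, True, CursorSensitivity(1.0, 50))
```

Entering cursor interaction mode would construct such a settings object and use it to lay out the cursor control interface.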
In some optional implementations of some embodiments, the cursor control interface further includes a virtual cursor sensitivity adjustment key. And the executing body can also respond to the detection of the sensitivity adjustment touch operation acting on the virtual cursor sensitivity adjustment key and update and set the cursor sensitivity.
Step 602, determining displacement amounts of the mobile device in a first target direction and a second target direction respectively according to displacement data generated by a displacement sensor in the mobile device, so as to obtain a first mobile device displacement amount and a second mobile device displacement amount.
In some embodiments, the executing body may determine displacement amounts of the mobile device in the first target direction and the second target direction according to displacement data generated by a displacement sensor in the mobile device, so as to obtain a first mobile device displacement amount and a second mobile device displacement amount. The first target direction and the second target direction are determined according to a positive target direction. The first target direction may be a direction perpendicular to the positive target direction and parallel to the touch area at the moment the positive target direction is determined. The second target direction may be a direction perpendicular to both the positive target direction and the first target direction. The displacement sensor may be any sensor that can be used to detect displacement, including but not limited to: inductive displacement sensors, capacitive displacement sensors, eddy current displacement sensors, inertial sensors, and the like.
The positive target direction may be determined by: and determining the current direction of the mobile device as a target positive direction in response to detecting a positive direction setting operation acting on the virtual left key or the virtual right key. In practice, the above-mentioned positive direction setting operation may be set according to actual conditions, and is not limited herein. For example, the positive direction setting operation may be a long press operation. The current orientation of the mobile device may be a direction perpendicular to a touch area of the mobile device.
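The long-press example above can be sketched as a small handler. This is an illustrative sketch only; the 500 ms threshold and the orientation representation are assumptions, not values from the specification:

```python
LONG_PRESS_MS = 500  # hypothetical long-press threshold


def handle_key_press(duration_ms: int, current_orientation):
    """If a press on the virtual left or right key lasts long enough to
    qualify as a positive-direction setting operation, return the device's
    current orientation as the new positive target direction; else None."""
    if duration_ms >= LONG_PRESS_MS:
        # The current orientation (e.g., the normal of the touch area)
        # becomes the positive target direction.
        return current_orientation
    return None
```

Shorter presses fall through to the ordinary click handling of the virtual keys.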
The positive target direction may be determined by: and in response to detecting that the head-mounted display equipment enters the cursor interaction mode, determining the current orientation of the mobile equipment as a target positive direction.
Step 603: determining the displacement amount of the cursor in the first cursor moving direction and the displacement amount and movement speed of the cursor in the second cursor moving direction of the display screen of the head-mounted display device using the first mobile device displacement amount, the second mobile device displacement amount, and the set cursor sensitivity, respectively.
In some embodiments, the executing body may determine the displacement amount of the cursor in the first cursor moving direction and the displacement amount and movement speed of the cursor in the second cursor moving direction of the display screen of the head-mounted display device using the first mobile device displacement amount, the second mobile device displacement amount, and the set cursor sensitivity. The correspondence between the first and second target directions and the first and second cursor moving directions may be preset. The products of the first mobile device displacement amount and the second mobile device displacement amount with a preset scaling ratio may be determined as the displacement amounts in the first and second cursor moving directions, respectively. The cursor movement speed included in the set cursor sensitivity may then be used as the movement speed of the cursor.
For example, if the first target direction corresponds to the first cursor moving direction, the second target direction corresponds to the second cursor moving direction. The displacement of the cursor in the first cursor moving direction can be determined by using the displacement of the first mobile device. And determining the displacement of the cursor in the second cursor moving direction by using the displacement of the second mobile equipment.
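The scaling step described in steps 602–603 amounts to multiplying each device displacement by a preset ratio and carrying the sensitivity's movement speed along. A minimal sketch, assuming a single scalar scaling ratio (the function and parameter names are hypothetical):

```python
def cursor_displacement(d1_mobile: float, d2_mobile: float,
                        scale: float, move_speed: float):
    """Map mobile-device displacement amounts in the first/second target
    directions to cursor displacement amounts in the first/second cursor
    moving directions, using a preset scaling ratio, and pass through the
    movement speed taken from the set cursor sensitivity."""
    dx = d1_mobile * scale  # displacement in the first cursor moving direction
    dy = d2_mobile * scale  # displacement in the second cursor moving direction
    return dx, dy, move_speed
```

Step 604 would then animate the cursor by (dx, dy) at the returned speed.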
In some optional implementations of some embodiments, the executing body may further perform the following steps:
the method comprises a first step of responding to the detection that the head-mounted display device generates displacement, respectively determining displacement amounts of the head-mounted display device in a first target direction and a second target direction according to displacement data generated by a displacement sensor in the head-mounted display device, and obtaining a first target head-mounted display device displacement amount and a second target head-mounted display device displacement amount. The displacement sensor in the head-mounted display device may include, but is not limited to: inductive displacement sensors, capacitive displacement sensors, eddy current displacement sensors or inertial sensors, etc.
And a second step of controlling the cursor to move in the display screen of the head-mounted display device according to the displacement of the first head-mounted display device and the displacement of the second head-mounted display device. The cursor can be controlled to move in a first cursor moving direction and a second cursor moving direction of a display screen of the head-mounted display device at the same time.
The correspondence between the first target direction and the second target direction and the first cursor moving direction and the second cursor moving direction may be set in advance. The products of the first head-mounted display device displacement amount and the second head-mounted display device displacement amount with a preset scaling ratio may be determined as the displacement amounts in the first and second cursor moving directions, respectively.
Optionally, the executing body controls the cursor to move in the display screen of the head-mounted display device according to the displacement information of the mobile device in response to detecting that the mobile device generates a displacement, and may further include:
a first step of, in response to detecting that the mobile device is displaced and the head-mounted display device is displaced, executing the following displacement amount determination step:
and respectively determining the displacement of the mobile equipment in a first target direction and a second target direction according to displacement data generated by a displacement sensor in the mobile equipment to obtain the displacement of the first mobile equipment and the displacement of the second mobile equipment.
And respectively determining the displacement of the head-mounted display device in the first target direction and the second target direction according to the displacement sensor in the head-mounted display device to obtain a first head-mounted display device displacement and a second head-mounted display device displacement.
And a second step of controlling the cursor to move in the display screen of the head-mounted display device based on the first mobile device displacement amount, the second mobile device displacement amount, the first head-mounted display device displacement amount, and the second head-mounted display device displacement amount.
The sum of the first mobile device displacement amount and the first head mounted display device displacement amount may be determined as a displacement amount in the first cursor moving direction. And determining the sum of the displacement of the second mobile equipment and the displacement of the second head-mounted display equipment as the displacement in the second cursor moving direction. And simultaneously controlling the cursor to move in a first cursor moving direction and a second cursor moving direction of a display screen of the head-mounted display equipment. The displacement of the cursor is the displacement in the first cursor moving direction and the displacement in the second cursor moving direction.
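The per-axis summation just described — mobile displacement plus head-mounted display displacement along each target direction — can be sketched as follows. This is an illustrative sketch; the optional `scale` parameter stands in for the preset scaling ratio and is an assumption:

```python
def combined_cursor_displacement(mobile_d1: float, mobile_d2: float,
                                 hmd_d1: float, hmd_d2: float,
                                 scale: float = 1.0):
    """Sum mobile-device and head-mounted-display-device displacement
    amounts per target direction, then apply the preset scaling ratio,
    yielding displacements in the first and second cursor moving
    directions."""
    return ((mobile_d1 + hmd_d1) * scale,
            (mobile_d2 + hmd_d2) * scale)
```

Both sources thus contribute simultaneously to a single cursor motion, which is why the cursor can be driven by hand movement, head movement, or both at once.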
And step 604, controlling the cursor to move in the display screen of the head-mounted display device according to the displacement amount in the first cursor moving direction, the displacement amount in the second cursor moving direction and the moving speed.
In some embodiments, the executing body may control the cursor to move in the display screen of the head-mounted display device according to the displacement amount in the first cursor moving direction, the displacement amount in the second cursor moving direction, and the moving speed.
The cursor may be controlled to move in the first cursor moving direction and the second cursor moving direction. The displacement of the cursor is the displacement in the first cursor moving direction and the displacement in the second cursor moving direction.
Step 605, in response to detecting the touch operation acting on the touch area, controlling a display interface displayed in the display screen of the head-mounted display device to update according to the touch operation and the position of the cursor in the display screen of the head-mounted display device.
In some embodiments, the specific implementation manner and technical effects of step 605 may refer to step 503 in those embodiments corresponding to fig. 5, and are not described herein again.
Step 606, in response to detecting that the scene corresponding to the display interface is an input scene, displaying a virtual input control in the cursor control interface.
In some embodiments, the execution subject may display a virtual input control in the cursor control interface in response to detecting that a scene corresponding to the display interface is an input scene. The input scene may be an office input scene, a search input scene, a chat input scene, and the like. The virtual input control can be a virtual keyboard, a writing input area or a voice input control, etc.
Step 607, in response to detecting the input operation acting on the virtual input control, displaying the input information in a display interface in a display screen of the head-mounted display device.
In some embodiments, the execution subject may display the input information in a display interface in a display screen of the head-mounted display device in response to detecting the input operation acting on the virtual input control. The input operation may be a keyboard input operation, a handwriting input operation, a voice input operation, or the like.
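Dispatching the three input-operation types named above (keyboard, handwriting, voice) into text for the head-mounted display's interface could look like the following. This is a hypothetical sketch: the operation names are illustrative, and the recognizer/transcriber functions are placeholder stubs, not real APIs:

```python
def recognize_handwriting(strokes: str) -> str:
    # Placeholder for a handwriting-recognition engine.
    return f"<recognized:{strokes}>"


def transcribe_voice(audio: str) -> str:
    # Placeholder for a speech-to-text engine.
    return f"<transcribed:{audio}>"


def handle_input_operation(kind: str, payload: str) -> str:
    """Convert a detected input operation on the virtual input control
    into the text to display in the head-mounted display's interface."""
    if kind == "keyboard":
        return payload  # characters typed on the virtual keyboard
    if kind == "handwriting":
        return recognize_handwriting(payload)
    if kind == "voice":
        return transcribe_voice(payload)
    raise ValueError(f"unknown input operation: {kind}")
```

The returned string would then be inserted into the display interface shown on the head-mounted display's screen.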
As can be seen from fig. 6, compared with the description of some embodiments corresponding to fig. 5, the flow 600 of the head-mounted display device control method in some embodiments corresponding to fig. 6 embodies the detailed steps of controlling the movement of the cursor using the displacement information of the mobile device, as well as the step of performing input through the touch area of the mobile device. The scheme described by these embodiments can therefore control the movement of the cursor in the display screen of the head-mounted display device more accurately, realizing more precise interaction and control. At the same time, input can be performed through the touch area, further enriching the interaction capabilities of the mobile device.
Referring now to FIG. 7, a schematic structural diagram of a head-mounted display device assembly 700 (e.g., the head-mounted display device assembly 11 of FIG. 1) suitable for implementing some embodiments of the present disclosure is shown. The head-mounted display device assembly shown in fig. 7 is only one example and should not impose any limitation on the functionality or scope of use of the embodiments of the present disclosure.
As shown in fig. 7, the head mounted display device assembly 700 may include a processing device (e.g., central processing unit, graphics processor, etc.) 701 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 702 or a program loaded from a storage device 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for the operation of the head mounted display device assembly 700 are also stored. The processing device 701, the ROM 702, and the RAM 703 are connected to each other by a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
Generally, the following devices may be connected to the I/O interface 705: input devices 706 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, mobile device, etc.; an output device 707 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, a head-mounted display device, or the like; and a communication device 709. The communications device 709 may allow the head mounted display device assembly 700 to communicate wirelessly or by wire with other devices to exchange data. While fig. 7 illustrates a head mounted display device assembly 700 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 7 may represent one device or may represent multiple devices as desired.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In some such embodiments, the computer program may be downloaded and installed from a network via communications means 709, or may be installed from storage 708, or may be installed from ROM 702. The computer program, when executed by the processing device 701, performs the above-described functions defined in the methods of some embodiments of the present disclosure.
It should be noted that the computer readable medium described in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may interconnect with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: in response to determining that the head-mounted display device enters a cursor interaction mode, displaying a cursor on a display screen of the head-mounted display device, and performing touch initialization setting on the touch area in the cursor interaction mode; in response to the detection that the mobile device generates displacement, controlling the cursor to move in a display screen of the head-mounted display device according to displacement information of the mobile device; and in response to the detection of the touch operation acting on the touch area, controlling a display interface displayed on a display screen of the head-mounted display device to update according to the touch operation and the position of the cursor on the display screen of the head-mounted display device.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.

Claims (19)

1. A head-mounted display device control method is applied to a head-mounted display device assembly, the head-mounted display device assembly comprises a head-mounted display device with a display screen and a mobile device with a touch area, the mobile device is in communication connection with the head-mounted display device, and the method comprises the following steps:
in response to determining that the head-mounted display device enters a cursor interaction mode, displaying a cursor in a display screen of the head-mounted display device, and performing touch initialization setting on the touch area in the cursor interaction mode;
in response to the detection that the mobile device generates displacement, controlling the cursor to move in a display screen of the head-mounted display device according to displacement information of the mobile device;
and in response to detecting the touch operation acting on the touch area, controlling a display interface displayed in a display screen of the head-mounted display device to update according to the touch operation and the position of the cursor in the display screen of the head-mounted display device.
2. The method of claim 1, wherein the touch initialization settings further comprise settings for cursor sensitivity, the cursor sensitivity comprising a cursor movement speed and a touch operation response speed for controlling a response speed of a touch operation applied to the touch area; and
the controlling the cursor to move in the display screen of the head-mounted display device according to the displacement information of the mobile device includes:
respectively determining displacement amounts of the mobile equipment in a first target direction and a second target direction according to displacement data generated by a displacement sensor in the mobile equipment to obtain a first mobile equipment displacement amount and a second mobile equipment displacement amount, wherein the first target direction and the second target direction are determined according to a target positive direction;
determining the displacement amount of the cursor in a first cursor movement direction and the displacement amount and movement speed of the cursor in a second cursor movement direction of a display screen of the head-mounted display device by respectively using the first mobile device displacement amount, the second mobile device displacement amount and the set cursor sensitivity;
and controlling the cursor to move in the display screen of the head-mounted display device according to the displacement in the first cursor moving direction, the displacement in the second cursor moving direction and the moving speed.
3. The method of claim 1, wherein the touch initialization settings further comprise settings for cursor sensitivity, the cursor sensitivity comprising a cursor movement speed and a touch operation response speed for controlling a response speed of a touch operation applied to the touch area; and
the controlling the cursor to move in the display screen of the head-mounted display device according to the displacement information of the mobile device further comprises:
and controlling the cursor to move in the display screen of the head-mounted display device according to the set cursor moving speed and the displacement information of the mobile device.
4. The method of claim 2, wherein the positive target direction is determined by:
in response to detecting a positive direction setting operation acting on the virtual left key or the virtual right key, determining a current orientation of the mobile device as a target positive direction.
5. The method of claim 1, wherein the mobile device has a touch-sensitive display screen, the touch area being the display area of the touch-sensitive display screen; and
the performing of the touch initialization setting in the cursor interaction mode on the touch area includes:
displaying a cursor control interface in a touch-sensitive display screen of the mobile device, wherein the cursor control interface includes a virtual left key and a virtual right key.
6. The method of claim 5, wherein the touch initialization settings further comprise settings for cursor sensitivity, the cursor sensitivity comprising a cursor movement speed and a touch operation response speed for controlling a response speed of a touch operation applied to the touch area, the cursor control interface further comprising virtual cursor sensitivity adjustment keys; and
the method further comprises the following steps:
and updating and setting the cursor sensitivity in response to the detection of the sensitivity adjustment touch operation acting on the virtual cursor sensitivity adjustment key.
7. The method of claim 5, wherein the touch operation applied to the touch area is a touch operation applied to the virtual left key or the virtual right key in the cursor control interface.
8. The method of claim 1, wherein the performing of the touch initialization setting in the cursor interaction mode on the touch area comprises:
and dividing the touch area into a left key touch area and a right key touch area according to preset configuration information.
9. The method of claim 1, wherein the performing of the touch initialization setting in the cursor interaction mode on the touch area comprises:
and dividing the touch area into a left key touch area, a right key touch area and a cursor sensitivity adjusting area according to preset configuration information.
10. The method according to claim 8, wherein the touch operation applied to the touch area is a touch operation applied to the left key touch area or the right key touch area in the touch area.
11. The method according to claim 9, wherein the touch operation applied to the touch area is a touch operation applied to the left key touch area, the right key touch area, or the cursor sensitivity adjustment area in the touch area.
12. The method of claim 4, wherein the method further comprises:
responding to the detected scene corresponding to the display interface as an input scene, and displaying a virtual input control in the cursor control interface;
in response to detecting an input operation acting on the virtual input control, displaying the input information in a display interface in a display screen of the head-mounted display device.
13. The method of claim 1, wherein the method further comprises:
in response to the fact that the scene corresponding to the display interface is detected to be an input scene, performing input initialization setting on the touch area under the input scene;
in response to detecting a touch input operation acting on the touch area, displaying the input information in a display interface in a display screen of the head-mounted display device.
14. The method of claim 1, wherein the method further comprises:
in response to the detection that the head-mounted display device generates displacement, respectively determining displacement amounts of the head-mounted display device in a first target direction and a second target direction according to displacement data generated by a displacement sensor in the head-mounted display device, and obtaining a first head-mounted display device displacement amount and a second head-mounted display device displacement amount;
and controlling the cursor to move in the display screen of the head-mounted display device according to the first head-mounted display device displacement and the second head-mounted display device displacement.
15. The method of claim 1, wherein controlling the cursor to move in the display screen of the head-mounted display device according to the displacement information of the mobile device, in response to detecting that the mobile device generates displacement, comprises:
in response to detecting that the head-mounted display device generates displacement while the mobile device generates displacement, executing the following displacement amount determination steps:
respectively determining displacement amounts of the mobile device in a first target direction and a second target direction according to displacement data generated by a displacement sensor in the mobile device, obtaining a first mobile device displacement amount and a second mobile device displacement amount;
respectively determining displacement amounts of the head-mounted display device in the first target direction and the second target direction according to displacement data generated by a displacement sensor in the head-mounted display device, obtaining a first head-mounted display device displacement amount and a second head-mounted display device displacement amount;
controlling the cursor to move in the display screen of the head-mounted display device based on the first mobile device displacement amount, the second mobile device displacement amount, the first head-mounted display device displacement amount, and the second head-mounted display device displacement amount.
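Claim 15 does not state how the four displacement amounts are combined. One plausible reading, sketched below purely as an assumption, is that the head-mounted display's own displacement is subtracted from the mobile device's so that head motion does not drag the cursor.

```python
# Illustrative sketch (the compensation formula is an assumption, not the patent's
# stated method): combine the mobile device's and the head-mounted display's
# displacement amounts in each target direction to produce a cursor delta.

def combined_cursor_delta(mobile_d1, mobile_d2, hmd_d1, hmd_d2, sensitivity=100.0):
    """Cursor delta driven by mobile-device motion, with HMD self-motion removed."""
    return ((mobile_d1 - hmd_d1) * sensitivity,
            (mobile_d2 - hmd_d2) * sensitivity)

# Mobile moves 0.5 along the first direction; the headset itself drifted 0.25,
# so only the relative 0.25 of motion moves the cursor.
print(combined_cursor_delta(0.5, 0.125, 0.25, 0.125))  # → (25.0, 0.0)
```

Subtraction is only one candidate; weighted blending of the two motion sources would also fit the claim language.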
16. The method of claim 1, wherein controlling the updating of the display interface displayed in the display screen of the head-mounted display device further comprises:
adjusting the display style of the cursor to the display style corresponding to the touch operation.
17. The method of claim 4, wherein the positive target direction is further determined by:
in response to detecting that the head-mounted display device enters the cursor interaction mode, determining a current orientation of the mobile device as a positive target direction.
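Claim 17's step of capturing the mobile device's current orientation as the positive target direction can be sketched as freezing and normalizing the device's forward vector on entry into cursor interaction mode. The normalization and the function name are assumptions for illustration.

```python
# Illustrative sketch (an assumed reading of the claimed step): on entering cursor
# interaction mode, capture the mobile device's current orientation vector and
# normalize it to serve as the positive target direction for later projections.

def on_enter_cursor_mode(mobile_orientation):
    """Freeze the mobile device's current forward vector as the positive target direction."""
    norm = sum(c * c for c in mobile_orientation) ** 0.5
    return tuple(c / norm for c in mobile_orientation)

print(on_enter_cursor_mode((0.0, 0.0, 2.0)))  # → (0.0, 0.0, 1.0)
```

Fixing the reference direction at mode entry means later displacement projections are measured relative to how the user was holding the device at that moment, which matches the claim's "current orientation" wording.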
18. A head-mounted display device assembly, comprising:
one or more processors;
a head-mounted display device having a display screen disposed thereon, the display screen configured to display a cursor and a display interface;
a mobile device having a touch-sensitive display screen disposed thereon, the touch-sensitive display screen configured to display a cursor control interface;
storage means having one or more programs stored thereon which, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-17.
19. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-17.
CN202111627725.XA 2021-12-28 2021-12-28 Head-mounted display device control method, head-mounted display device assembly, and medium Pending CN114397961A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111627725.XA CN114397961A (en) 2021-12-28 2021-12-28 Head-mounted display device control method, head-mounted display device assembly, and medium
Publications (1)

Publication Number Publication Date
CN114397961A true CN114397961A (en) 2022-04-26

Family

ID=81229100

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111627725.XA Pending CN114397961A (en) 2021-12-28 2021-12-28 Head-mounted display device control method, head-mounted display device assembly, and medium

Country Status (1)

Country Link
CN (1) CN114397961A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102981626A (en) * 2012-12-12 2013-03-20 紫光股份有限公司 Wearing-on-head type computer
JP2013164736A (en) * 2012-02-10 2013-08-22 Kyushu Institute Of Technology Input device for portable information device
US20170115839A1 (en) * 2015-10-21 2017-04-27 Lg Electronics Inc. Mobile terminal and method for controlling the same
CN107024981A (en) * 2016-10-26 2017-08-08 阿里巴巴集团控股有限公司 Exchange method and device based on virtual reality
CN107977083A (en) * 2017-12-20 2018-05-01 北京小米移动软件有限公司 Operation based on VR systems performs method and device
CN110196629A (en) * 2018-02-27 2019-09-03 优酷网络技术(北京)有限公司 Virtual reality interface shows control method and device
CN112527174A (en) * 2019-09-19 2021-03-19 华为技术有限公司 Information processing method and electronic equipment

Similar Documents

Publication Publication Date Title
CN112015314B (en) Information display method and device, electronic equipment and medium
WO2022183887A1 (en) Video editing method and apparatus, video playback method and apparatus, device and medium
WO2023284791A1 (en) Virtual interface operation method, head-mounted display device and computer-readable medium
CN114363686B (en) Method, device, equipment and medium for publishing multimedia content
US20180253160A1 (en) Hop Navigation
US20230199262A1 (en) Information display method and device, and terminal and storage medium
CN115097984B (en) Interaction method, interaction device, electronic equipment and storage medium
EP4328725A1 (en) Display method and apparatus, electronic device, and storage medium
CN114397961A (en) Head-mounted display device control method, head-mounted display device assembly, and medium
CN113703704A (en) Interface display method, head-mounted display device and computer readable medium
CN111290692B (en) Picture display method and device, electronic equipment and computer readable medium
CN113342440A (en) Screen splicing method and device, electronic equipment and storage medium
EP3264203A1 (en) Intelligent interaction method, equipment and system
KR101488096B1 (en) Method and terminal of sharing positions between terminals
CN114415932A (en) Head-mounted display device control method, head-mounted display device, and readable medium
US20240096043A1 (en) Display method, apparatus, electronic device and storage medium for a virtual input device
CN117075770A (en) Interaction control method and device based on augmented reality, electronic equipment and storage medium
CN117632063A (en) Display processing method, device, equipment and medium based on virtual reality space
CN116808589A (en) Motion control method and device, readable medium and electronic equipment
CN117826977A (en) Interaction method, interaction device, electronic equipment, storage medium and computer program product
CN116048281A (en) Interaction method, device, equipment and storage medium in virtual reality scene
CN117193543A (en) Three-dimensional information input method, head-mounted display device, and readable medium
CN117631810A (en) Operation processing method, device, equipment and medium based on virtual reality space
CN116301530A (en) Virtual scene processing method and device, electronic equipment and storage medium
CN114168063A (en) Virtual key display method, head-mounted display device, and computer-readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination