CN116449963A - Virtual reality interaction method and device based on VR (virtual reality) head-mounted equipment - Google Patents

Virtual reality interaction method and device based on VR (virtual reality) head-mounted equipment

Info

Publication number
CN116449963A
CN116449963A
Authority
CN
China
Prior art keywords
virtual reality
user
probe
response
reality environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310705479.8A
Other languages
Chinese (zh)
Inventor
楼彦昕
罗皓
周旭东
邓楠
朱旭
李梦赫
丁可
蒋玉洁
邱思源
胡霁月
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shahe Technology Beijing Co ltd
Original Assignee
Shahe Technology Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Shahe Technology Beijing Co ltd
Priority to CN202310705479.8A
Publication of CN116449963A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485: Scrolling or panning
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0489: Interaction techniques using dedicated keyboard keys or combinations thereof
    • G06F 3/04892: Arrangements for controlling cursor position based on codes indicative of cursor displacements from one discrete location to another, e.g. using cursor control keys associated to different directions or using the tab key

Abstract

The application discloses a virtual reality interaction method and device based on a VR head-mounted device. The method includes: providing a handle in a virtual reality environment, the handle configured as a component of a control interface, as a guide for a grabbing action, or as part of the virtual environment; and, in response to a user's operation of the handle, performing direct manipulation of an object associated with the handle. By providing interaction elements such as the handle in the virtual reality environment, the method and device realize an interaction mode in virtual reality based on a brand-new direct-manipulation approach, improving the user's immersion and comfort.

Description

Virtual reality interaction method and device based on VR (virtual reality) head-mounted equipment
Technical Field
The application relates to the technical field of virtual reality, and in particular to a virtual reality interaction method and device based on a VR head-mounted device.
Background
In virtual reality (Virtual Reality, VR), direct manipulation means that a user can accomplish certain tasks by directly operating a virtual object or the environment. This way of interacting is very important in VR because it makes users feel that they really exist in the virtual world and lets them directly feel their effect on virtual objects. Since most VR devices still rely on controllers at present, most close-range operations performed with controller gestures are also commonly classified as direct manipulation. Examples of direct manipulation in VR include, but are not limited to: (1) the user picks up a cup with a hand (or a hand-held controller); (2) the user presses a button with a finger (or with a controller gesture); (3) the user pulls a lever with a hand (or a hand-held controller).
Indirect manipulation stands in contrast to direct manipulation: it requires the user to interact with an object through an intermediary. Examples of indirect manipulation in VR include, but are not limited to: (1) the user interacts with buttons on a virtual screen using a laser pointer, which in this example is the intermediary; (2) the user presses a button on the VR controller to open a menu or perform an action.
In current virtual reality interaction, indirect manipulation is used most often, and the laser-pointer-driven 2D interface is still the mainstream design pattern. Regardless of distance, users always interact with a conventional 2D interface through a laser pointer. In the early years, most VR hardware used three-degree-of-freedom controllers and headsets, and the laser pointer was one of the very few interaction modalities that could be used universally. Meanwhile, the 2D interface paired with the laser pointer is a continuation of the conventional screen interface, and most developers and users already have experience with screen interfaces. This has made the laser-pointer-based 2D interface the dominant design of current interaction in VR headsets.
As six-degree-of-freedom devices have become increasingly popular and the boundary between controller and gesture operation increasingly blurred, the shortcomings of this non-direct-manipulation approach have begun to show: in virtual reality, laser-pointer-based interaction is not ergonomic and violates the user's intuition. Compared with indirect interaction media in the real world (a mouse moves along only two axes, X and Y, on a plane), a laser pointer in virtual reality carries more input dimensions, and the error at its far end is larger. The tracking accuracy of the VR device further amplifies this error, which greatly reduces the accuracy and efficiency of information input. At the same time, indirect manipulation with a laser pointer greatly reduces the user's immersion, since it is not a common tool in the real world. In addition, this interaction modality is unsuitable for long-term use because the user's elbow has no physical support. In general, the laser pointer, as an extension of the conventional interface into the virtual reality environment, does not make good use of the advantages that virtual reality offers.
In addition, current virtual reality interaction also includes displacement design. Displacement determines how a player moves from point A to point B. Wearing a six-degree-of-freedom device, players can move directly in the real world, but they also need auxiliary interactions to reach locations that are unreachable due to real-world space limitations. Only two displacement modes are widely adopted at present: "panning" and "teleportation". Panning means the player moves linearly over the ground by pushing a stick on the controller. Teleportation means the player selects a location on the ground with an arc-shaped, fishing-line-like pointer and then moves to that location instantaneously.
Strictly speaking, both teleportation and panning are indirect manipulations, since displacement is achieved through an intermediate medium. Panning is a common movement mode in 2D screen games, but in virtual reality the visual motion does not match the body's sense of motion, so users often feel uncomfortable, and new users may even experience motion sickness, which greatly impairs the user experience. Teleportation, on the other hand, is a movement mode the real world does not possess, and in actual use it severely disrupts the immersion of virtual reality. Therefore, current virtual reality interaction needs a movement mode that is more intuitive for users and does not make them uncomfortable.
Disclosure of Invention
The application provides a virtual reality interaction method and device based on VR (virtual reality) head-mounted equipment.
In a first aspect, the present application provides a virtual reality interaction method based on a VR head-mounted device, including: providing a handle in a virtual reality environment, the handle configured as a component of a control interface, as a guide for a grabbing action, or as part of the virtual environment; and, in response to a user's operation of the handle, performing direct manipulation of an object associated with the handle.
In some alternative embodiments, the method further comprises: providing the probe in the virtual reality environment in response to detecting an operation of the user evoking the probe with the hand-held controller; in response to the user pointing the probe in a direction, marking a plurality of interactable elements within a preset range of the pointed direction; in response to the user pressing a selection key on the controller, expanding the meta information or virtual interface of the interactable element closest to the center of the probe's detection range; and, in response to detecting an operation of the user hiding the probe with the controller, hiding the probe in the virtual reality environment.
In some alternative embodiments, the method further comprises: providing a button in a virtual reality environment, the button configured to have a height; in response to a user pressing or releasing the button, at least one of the following forms of feedback is provided: the height of the button is correspondingly lowered or raised, and the controller held by the user starts vibrating or stops vibrating.
In some alternative embodiments, the method further comprises: popping up a virtual keyboard in response to an operation of the user clicking an input box in the virtual reality environment; and providing an auxiliary positioning element at the fingertip in response to an operation of placing a finger of the avatar in the virtual reality environment on the keyboard.
In some alternative embodiments, the keys on the keyboard have a height, the method further comprising: in response to a user pressing or releasing a key on the keyboard, providing at least one of the following feedback forms: the height of the key is correspondingly reduced or increased, and the controller held by the user starts vibrating or stops vibrating.
In some alternative embodiments, the method further comprises: in response to an operation of the user grabbing a blank position in the virtual reality environment, moving the user's avatar in the virtual reality environment to that position, realizing a displacement operation.
In a second aspect, the present application provides a virtual reality interaction device based on a VR head-mounted device, including: a detection module configured to detect a user's operations; and a handle operation module configured to provide a handle in a virtual reality environment, the handle configured as a component of a control interface, as a guide for a grabbing action, or as part of the virtual environment, and, in response to a user's operation of the handle, to perform direct manipulation of an object associated with the handle.
In some alternative embodiments, the apparatus further comprises at least one of a probe operation module, a button operation module, a keyboard operation module, and a displacement operation module, wherein:

the probe operation module is configured to: provide the probe in the virtual reality environment in response to detecting an operation of the user evoking the probe with the hand-held controller; in response to the user pointing the probe in a direction, mark a plurality of interactable elements within a preset range of the pointed direction; in response to the user pressing a selection key on the controller, expand the meta information or virtual interface of the interactable element closest to the center of the probe's detection range; and hide the probe in the virtual reality environment in response to detecting an operation of the user hiding the probe with the controller;

the button operation module is configured to: provide a button in the virtual reality environment, the button configured to have a height; and, in response to the user pressing or releasing the button, provide at least one of the following feedback forms: the height of the button is correspondingly lowered or raised, and the controller held by the user starts vibrating or stops vibrating;

the keyboard operation module is configured to pop up a virtual keyboard in response to an operation of the user clicking an input box in the virtual reality environment, and to provide an auxiliary positioning element at the fingertip in response to an operation of placing a finger of the avatar in the virtual reality environment on the keyboard;

the displacement operation module is configured to move the user's avatar in the virtual reality environment to a blank position, in response to an operation of the user grabbing that position, to realize the displacement operation.
In a third aspect, the present application provides a computer device comprising: one or more processors; a storage device having one or more programs stored thereon, which when executed by the one or more processors, cause the one or more processors to implement the VR headset based virtual reality interaction method as set forth in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by one or more processors, implements the VR headset based virtual reality interaction method of the first aspect.
In order to solve the technical problems caused by the indirect manipulation commonly used in current virtual reality interaction, the application provides a virtual reality interaction method and device based on a VR head-mounted device. By providing interaction elements such as the handle in the virtual reality environment, the method and device use a brand-new direct-operation (i.e., direct-manipulation) approach to realize interaction in virtual reality, improving the user's immersion and comfort.
In some optional embodiments, the design of five basic components, namely the handle, the probe, the button, the keyboard and displacement, is combined into a complete set of direct-manipulation interaction modes, realizing direct interaction in virtual reality scenes, greatly improving the accuracy and efficiency of the user's information input, and enhancing the user's immersion.
Additionally, in some alternative embodiments, a new movement scheme is designed that keeps the visual motion consistent with the user's physical arm motion, which can significantly reduce motion sickness; its learning cost is also lower than that of panning or teleportation, because it is a movement with a strong real-world metaphor.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading the following detailed description of non-limiting embodiments with reference to the accompanying drawings. The drawings are only for the purpose of illustrating particular embodiments and are not to be construed as limiting the application. In the drawings:
FIG. 1 is a system architecture diagram of VR headset based virtual reality interaction methods and apparatus in accordance with the present application;
FIG. 2 is a schematic diagram of a basic interaction element design of one interaction modality provided by the VR headset-based virtual reality interaction method and apparatus in accordance with the present application;
FIG. 3 is a flow chart of one embodiment of a VR headset based virtual reality interaction method in accordance with the present application;
FIG. 4 is a block diagram of one embodiment of a VR headset based virtual reality interaction device in accordance with the present application;
fig. 5 is a schematic diagram of a hardware composition structure of an embodiment of a computer device according to the present application.
Detailed Description
For a more complete understanding of the features and technical content of the embodiments of the present application, reference should be made to the following detailed description of the embodiments of the present application, taken in conjunction with the accompanying drawings, which are for purposes of illustration only and not intended to limit the embodiments of the present application.
Referring to fig. 1, fig. 1 illustrates an exemplary system architecture 100 of at least one embodiment of VR headset based virtual reality interaction methods and apparatus according to this application.
As shown in fig. 1, the system architecture 100 may include a VR headset 101 and hand-held controllers 102, 103. The VR headset 101 and the controllers 102, 103 are communicatively coupled to each other, illustratively by a wireless connection such as Bluetooth, or by a wired connection. Various client applications may be installed on the VR headset 101, such as 3D game applications, shopping applications, social platform software, and the like. The controllers 102, 103 are hand-held devices paired with the VR headset 101, on which at least one key, including but not limited to a selection key, may be disposed. The controllers 102 and 103 are a handle-type controller and a glove-type controller, respectively; in practical applications either may be used. Of course, controller types other than handles or gloves may also be used, and no limitation is imposed here.
Referring to fig. 2, fig. 2 is a schematic diagram of the basic interaction element design of an interaction modality in the virtual reality interaction method and apparatus based on a VR headset according to the present application.
As shown in FIG. 2, in the virtual world, basic interaction modalities can be divided into three interaction types: high-frequency interaction, semantic interaction and tool interaction.
1. High-frequency interactions are the things done everywhere while exploring the virtual world; they constitute the core experience of a multi-user online platform and are closely related to functions in other parts of the system. The application broadly designs this type of interaction as simulated operations.
2. Semantic interactions are abstract operations that are uncommon in the physical world but common in software. Although not as frequent as high-frequency interactions, they are the basis of most functions. The application broadly designs this type of interaction as simple interfaces.
3. Tool interactions are single in function, highly integrated, and weakly associated with other functions. Such interactions are important but not frequent. The application broadly designs this type of interaction as simulated tools.
Based on the above classification of interaction modes, in order to realize direct manipulation in a virtual reality environment, the application takes the action of "grabbing" as the core of interaction and designs five basic yet universal interaction elements to realize all of a user's operations in the virtual world, from low frequency to high frequency. These five basic interaction elements are the handle, the probe, the button, the keyboard and displacement. The "simulated operations", "simple interfaces" and "simulated tools" described in the three interaction types above may each be implemented using one or more of these five basic elements.
With these five basic elements, a user can obtain an efficient, smooth and immersive virtual reality experience. The five basic elements are described in detail below.
1. Handle
The handle is the most central element of the interaction modality. Through a handle, a user can directly manipulate an object with a controller gesture or directly with the hand. The handle is the critical interaction component that enables direct manipulation. It supports the following three interaction scenarios:
1. as a component of the control interface;
2. as a guide for the grabbing action;
3. as part of a virtual environment.
In a virtual reality environment, a handle is a visible, directly operable control that may be provided on any object. For example, a handle in the shape of a sphere or another form may be placed on a room door, a drawer, or any other operable object. The handle is associated with that object, and the associated object is operated by operating the handle: for example, grabbing and pulling a door handle opens or closes the door.
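By way of illustration, the direct-manipulation logic of a handle may be sketched as follows. This is a minimal sketch under assumed names (Door, Handle, the per-frame callbacks and the hand-position convention are all hypothetical), not the application's actual implementation:

    # Minimal sketch: a grabbed handle forwards the hand's motion directly to
    # its associated object (here, a hinged door). All names are hypothetical.
    class Door:
        def __init__(self):
            self.angle = 0.0  # opening angle in degrees

        def apply_handle_motion(self, pull):
            # Map the pull distance (meters) onto the hinge angle, clamped to
            # the door's physical range of motion.
            self.angle = max(0.0, min(110.0, self.angle + pull * 90.0))

    class Handle:
        def __init__(self, associated_object):
            self.obj = associated_object
            self.grabbed = False
            self.last_pos = None

        def on_grab(self, hand_pos):
            self.grabbed, self.last_pos = True, list(hand_pos)

        def on_hand_move(self, hand_pos):
            if not self.grabbed:
                return
            # Direct manipulation: each frame, the hand's displacement is
            # applied to the associated object immediately, with no intermediary.
            self.obj.apply_handle_motion(hand_pos[2] - self.last_pos[2])
            self.last_pos = list(hand_pos)

        def on_release(self):
            self.grabbed = False

In this sketch, grabbing the handle and pulling along the door's normal axis (taken here to be the z component of the hand position) swings the door, and releasing the handle leaves the door where it is.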
2. Probe
The probe is the most important basic control after the handle in the interaction modality. With the probe, the user can view the world, view the meta information of all interactable elements in the world, or evoke the virtual interface of a selected object.
The use of the probe comprises four steps:
1. calling out a probe;
2. selecting interactable elements in the world;
3. expanding the virtual interface;
4. hiding the probe.
Unlike the laser pointer most commonly used in traditional virtual reality interaction systems, the probe of this interaction modality uses an invisible, variable-thickness "ray", following the design concept of direct manipulation. When the user points the probe in a direction, the interactable elements within a certain range of that direction are marked. When the user presses the selection key, the meta information or virtual interface of the interactable element closest to the center of the probe's detection range is expanded. This design increases the tolerance when selecting distant or moving interactable elements in the world.
By way of example, the user may evoke the probe through the hand-held controller. The evoking operation may be defined, for example, as the user pressing a specific button on the controller or simultaneously making a preset gesture (e.g., waving or flicking the controller in a certain direction). The operation of hiding the probe may be similar or opposite to that of evoking it and is not described in detail here.
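As an illustration of the closest-to-center selection described above, the following is a minimal sketch that approximates the invisible, variable-thickness "ray" with a cone cast of fixed angular tolerance; the function name, the position attribute and the parameter values are assumptions, not the application's implementation:

    import math

    def probe_select(origin, direction, interactables,
                     half_angle_deg=10.0, max_range=30.0):
        # Mark every interactable element inside the probe's detection cone and
        # return the one closest to the cone's center line. 'direction' is
        # assumed to be a unit vector; each element carries a .position triple.
        cone = math.radians(half_angle_deg)
        marked, best, best_angle = [], None, cone
        for obj in interactables:
            to_obj = [obj.position[i] - origin[i] for i in range(3)]
            dist = math.sqrt(sum(c * c for c in to_obj))
            if dist == 0.0 or dist > max_range:
                continue
            cos_a = sum(to_obj[i] * direction[i] for i in range(3)) / dist
            angle = math.acos(max(-1.0, min(1.0, cos_a)))
            if angle <= cone:
                marked.append(obj)        # to be highlighted in the scene
                if angle <= best_angle:   # element closest to the center wins
                    best, best_angle = obj, angle
        return marked, best

Pressing the selection key would then expand the meta information or virtual interface of the returned best element; because the tolerance cone is wider than a thin ray, distant or moving elements remain easy to select.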
3. Button
In various graphical operating systems, the button control is a common interface element for interaction. A button is typically provided with a text label or an icon, and the user triggers the corresponding operation by pressing it. In this interaction modality, unlike the buttons of a conventional 2D interface, the buttons of the physical world are simulated as closely as possible in both appearance and feedback. Most interactions involving abstract operations can be expressed using buttons.
For example, in response to a user pressing or releasing a button, at least one of the following feedback forms may be provided: the height of the button is correspondingly lowered or raised to provide visual feedback, and/or the controller held by the user starts vibrating or stops vibrating to provide tactile feedback.
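A minimal sketch of such a button follows; the fingertip penetration depth is assumed to be supplied by the physics layer, and controller.set_vibration is a hypothetical haptics interface rather than the API of any particular runtime:

    # Minimal sketch: a button whose cap height follows the fingertip, with a
    # haptic pulse on press and on release. All names are hypothetical.
    class Button:
        def __init__(self, rest_height=0.02, press_depth=0.015, on_activate=None):
            self.rest_height = rest_height  # cap height above the panel (m)
            self.press_depth = press_depth  # travel required to trigger (m)
            self.height = rest_height
            self.on_activate = on_activate
            self.pressed = False

        def update(self, penetration, controller):
            # Visual feedback: the cap lowers with the fingertip, then rises back.
            self.height = max(self.rest_height - self.press_depth,
                              self.rest_height - penetration)
            now_pressed = penetration >= self.press_depth
            if now_pressed and not self.pressed:
                controller.set_vibration(amplitude=0.8, duration_ms=30)  # press pulse
                if self.on_activate:
                    self.on_activate()
            elif self.pressed and not now_pressed:
                controller.set_vibration(amplitude=0.3, duration_ms=15)  # release tick
            self.pressed = now_pressed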
4. Keyboard
In a virtual reality environment, the keyboard is typically used in much the same way as in an ordinary computer environment. The user can use the individual keys of the keyboard to input various types of text, such as a user name, a password or search keywords.
Like the other interaction components, the keyboard of this interaction modality supports direct interaction with the bare hand or with the controller's index-finger gesture. When the keyboard pops up, an auxiliary positioning element such as a small sphere appears at the index fingertip of the user's avatar in the virtual environment; the sphere on the fingertip helps the user aim at the key positions more accurately.
In this application, the keyboard is understood as a combination of buttons and handles. Buttons with physical feedback serve as the keys (key caps) and provide good visual and tactile feedback, while the handle allows the user to freely move the keyboard to a comfortable position and adjust it to a suitable size.
For example, the keyboard may pop up when the user clicks an input box in the virtual reality environment and be hidden when the input is completed.
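Combining the elements above, the keyboard can be sketched as a handle plus one button per key, with the positioning sphere attached to the index fingertip. This again is a hypothetical sketch: it reuses the Button class from the previous sketch, and the key layout and coordinate conventions are assumptions:

    # Minimal sketch: a virtual keyboard as a handle plus one Button per key,
    # with a positioning sphere on the index fingertip. Names are hypothetical.
    class VirtualKeyboard:
        def __init__(self, keys, handle):
            # keys: dict mapping a label to (Button, cap_top_y), where cap_top_y
            # is the top of that key cap in keyboard-local coordinates.
            self.keys = keys
            self.handle = handle        # lets the user drag/scale the keyboard
            self.visible = False
            self.sphere_radius = 0.006  # ~6 mm sphere rendered at the fingertip

        def show(self):
            self.visible = True         # e.g. when an input box is clicked

        def hide(self):
            self.visible = False        # e.g. when the input is completed

        def on_fingertip(self, fingertip_pos, controller):
            if not self.visible:
                return
            # The positioning sphere is drawn at fingertip_pos (rendering omitted).
            # Press depth is how far the fingertip sits below a key cap's top;
            # the x/z hit test against each individual cap is omitted here.
            for label, (button, cap_top_y) in self.keys.items():
                depth = max(0.0, cap_top_y - fingertip_pos[1])
                button.update(depth, controller)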
5. Displacement
In order to keep the experience consistent with the direct-manipulation components described above, the application introduces a new movement scheme, "drag", into the interaction modality.
When using drag, the user grabs a blank position in space and thereby "pulls" the avatar in virtual reality in that direction. This displacement is similar to real rock climbing: imagine countless invisible climbing holds distributed throughout the air of the virtual environment; the user moves the avatar by grabbing these holds.
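A minimal sketch of this drag displacement follows. It assumes controller positions expressed in playspace-local coordinates and a hypothetical grabbed_empty_space flag indicating that the grip landed on empty space rather than on a handle or another interactable:

    # Minimal sketch: while the user grips empty space, the grabbed point stays
    # fixed under the hand, so the avatar moves opposite to the hand's motion,
    # as if pulling along an invisible climbing hold. Names are hypothetical.
    class DragLocomotion:
        def __init__(self, avatar):
            self.avatar = avatar   # exposes a mutable .position = [x, y, z]
            self.anchor = None     # controller position at grab time
            self.start = None      # avatar position at grab time

        def on_grip_down(self, controller_pos, grabbed_empty_space):
            if grabbed_empty_space:  # ignore grips on handles or objects
                self.anchor = list(controller_pos)
                self.start = list(self.avatar.position)

        def on_controller_move(self, controller_pos):
            if self.anchor is None:
                return
            # Pulling the hand back by some distance moves the avatar forward
            # by the same distance, keeping the grabbed point under the hand.
            for i in range(3):
                self.avatar.position[i] = self.start[i] - (
                    controller_pos[i] - self.anchor[i])

        def on_grip_up(self):
            self.anchor = self.start = None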
In the embodiments of the application, some or all of the above basic elements may be further combined to realize various types of interactions.
Referring to fig. 3, fig. 3 is a flow chart of one embodiment of a VR headset based virtual reality interaction method in accordance with the present application. Based on the interaction modality and the basic interaction element design described above, the virtual reality interaction method based on the VR headset in the embodiment of the application may include the following steps:
step 31, providing a handle in the virtual reality environment, the handle configured as a component of the control interface, as a guide for a grabbing action, or as part of the virtual environment;
step 32, in response to the user operating the handle, directly manipulating the object associated with the handle.
In some optional implementations, the example methods of the present application further include:
providing the probe in the virtual reality environment in response to detecting operation of the user evoking the probe with the handheld controller;
in response to the user pointing the probe in a direction, marking a plurality of interactable elements within a preset range of the pointed direction;
in response to the user pressing a selection key on the controller, expanding the meta information or virtual interface of the interactable element closest to the center of the probe's detection range;
in response to detecting operation of hiding the probe by the user with the controller, hiding the probe in the virtual reality environment.
In some optional implementations, the example methods of the present application further include:
providing a button in a virtual reality environment, the button configured to have a height;
in response to a user pressing or releasing a button, at least one of the following forms of feedback is provided: the height of the button is correspondingly lowered or raised, and the controller held by the user starts vibrating or stops vibrating.
In some optional implementations, the example methods of the present application further include:
in response to a user clicking an input box in a virtual reality environment, popping up a virtual keyboard;
in response to a user's finger placement on a keyboard of an avatar in a virtual reality environment, an auxiliary pointing element is provided at a tip of the finger.
In some optional implementations, the keys on the keyboard have a height, and the method further includes:
in response to a user pressing or releasing a key on a keyboard, at least one of the following forms of feedback is provided: the height of the key is correspondingly lowered or raised, and the controller held by the user starts vibrating or stops vibrating.
In some optional implementations, the example methods of the present application further include:
in response to an operation of the user grabbing a blank position in the virtual reality environment, moving the user's avatar in the virtual reality environment to that position, realizing a displacement operation.
Referring to fig. 4, fig. 4 is a block diagram of one embodiment of a VR headset based virtual reality interaction apparatus according to this application. As shown in fig. 4, a virtual reality interaction apparatus 400 based on VR headset according to an embodiment of the present application may include:
a detection module 41 configured to detect an operation of a user;
a handle operation module 42 configured to provide a handle in the virtual reality environment, the handle configured as a component of the control interface, as a guide for a grabbing action, or as part of the virtual environment, and, in response to a user's operation of the handle, to perform direct manipulation of the object associated with the handle.
In some alternative implementations, the example apparatus further includes at least one of a probe operation module 43, a button operation module 44, a keyboard operation module 45, and a displacement operation module 46, wherein:
a probe operation module 43 configured to: provide the probe in the virtual reality environment in response to detecting an operation of the user evoking the probe with the hand-held controller; in response to the user pointing the probe in a direction, mark a plurality of interactable elements within a preset range of the pointed direction; in response to the user pressing a selection key on the controller, expand the meta information or virtual interface of the interactable element closest to the center of the probe's detection range; and hide the probe in the virtual reality environment in response to detecting an operation of the user hiding the probe with the controller;
a button operation module 44 configured to: providing a button in a virtual reality environment, the button configured to have a height; in response to a user pressing or releasing a button, at least one of the following forms of feedback is provided: the height of the button is correspondingly lowered or raised, and the controller held by the user starts vibrating or stops vibrating.
a keyboard operation module 45 configured to pop up a virtual keyboard in response to an operation of the user clicking an input box in the virtual reality environment, and to provide an auxiliary positioning element at the fingertip in response to an operation of placing a finger of the user's avatar on the keyboard in the virtual reality environment;
the displacement operation module 46 is configured to move an avatar of the user in the virtual reality environment to a position in response to an operation of grabbing the position in the virtual reality environment by the user, and to implement the displacement operation.
It should be noted that, for the implementation details and technical effects of each module in the apparatus of this embodiment, reference may be made to the descriptions of other embodiments in this application, which are not repeated here. Each module may be implemented in a variety of ways; as long as the purpose of a module is achieved, practical deployments are not limited to any specific implementation.
Referring to fig. 5, fig. 5 is a schematic structural diagram of one embodiment of a computer device according to the present application. As shown in fig. 5, a computer device 500 of the present application may include:
one or more processors 501;
a memory 502 having one or more programs 503 stored thereon;
components such as processor 501 and memory 502 may be coupled together by bus system 504; bus system 504 is used to enable connected communications between these components;
the one or more programs 503, when executed by the one or more processors 501, cause the one or more processors 501 to implement the VR headset based virtual reality interaction method as disclosed in the method embodiments above.
The bus system 504 may include a power bus, a control bus, and a status signal bus in addition to the data bus. The memory 502 may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memories. The processor 501 may be an integrated circuit chip with signal processing capability, such as a general-purpose processor, a digital signal processor (DSP, Digital Signal Processor), another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
Embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program that, when executed by one or more processors, implements a VR headset based virtual reality interaction method as disclosed in the method embodiments above.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be understood that the terms "system" and "network" are often used interchangeably herein. The term "and/or" in this application merely describes an association relation between associated objects and indicates that three relations may exist; for example, A and/or B may indicate: A exists alone, A and B exist together, or B exists alone. In this application, the character "/" generally indicates that the associated objects are in an "or" relationship.
The foregoing description of the preferred embodiments of the present application is not intended to limit the scope of the present application, but is intended to cover any modifications, equivalents, and alternatives falling within the spirit and principles of the present application.

Claims (10)

1. A virtual reality interaction method based on a VR head-mounted device, characterized by comprising the following steps:
providing a handle in a virtual reality environment, the handle configured as a component of a control interface, as a guide for a grabbing action, or as part of the virtual environment;
in response to a user's operation of the handle, performing direct manipulation of an object associated with the handle.
2. The method as recited in claim 1, further comprising:
providing the probe in the virtual reality environment in response to detecting operation of the user evoking the probe with the handheld controller;
in response to the user pointing the probe in a direction, marking a plurality of interactable elements within a preset range of the pointed direction;
in response to the user pressing a selection key on the controller, expanding the meta information or virtual interface of the interactable element closest to the center of the probe's detection range;
in response to detecting operation of hiding a probe by a user with the controller, hiding the probe in a virtual reality environment.
3. The method as recited in claim 1, further comprising:
providing a button in a virtual reality environment, the button configured to have a height;
in response to a user pressing or releasing the button, at least one of the following forms of feedback is provided: the height of the button is correspondingly lowered or raised, and the controller held by the user starts vibrating or stops vibrating.
4. The method as recited in claim 1, further comprising:
in response to a user clicking an input box in a virtual reality environment, popping up a virtual keyboard;
in response to an operation of placing a finger of an avatar in the virtual reality environment on the keyboard, providing an auxiliary positioning element at the tip of the finger.
5. The method of claim 4, wherein keys on the keyboard have a height, the method further comprising:
in response to a user pressing or releasing a key on the keyboard, providing at least one of the following feedback forms: the height of the key is correspondingly reduced or increased, and the controller held by the user starts vibrating or stops vibrating.
6. The method as recited in claim 1, further comprising:
in response to an operation of the user grabbing a blank position in the virtual reality environment, moving the user's avatar in the virtual reality environment to that position, realizing a displacement operation.
7. A virtual reality interaction device based on a VR head-mounted device, characterized by comprising:
a detection module configured to detect an operation of a user;
a handle operation module configured to provide a handle in a virtual reality environment, the handle configured as a component of a control interface, as a guide for a grabbing action, or as part of the virtual environment, and, in response to a user's operation of the handle, to perform direct manipulation of an object associated with the handle.
8. The apparatus of claim 7, further comprising at least one of a probe operation module, a button operation module, a keyboard operation module, and a displacement operation module:
the probe manipulation module is configured to: providing the probe in the virtual reality environment in response to detecting operation of the user evoking the probe with the handheld controller; responding to the operation of pointing to one direction by using the probe by a user, and marking a plurality of interactable elements within a preset range of the pointed direction; in response to a user pressing a selection key on the controller, expanding source information or a virtual interface of the interactable element closest to the center in the detection range of the probe; and hiding the probe in the virtual reality environment in response to detecting an operation of hiding the probe by the user using the controller;
the button operation module is configured to: providing a button in a virtual reality environment, the button configured to have a height; in response to a user pressing or releasing the button, at least one of the following forms of feedback is provided: the height of the button is correspondingly lowered or raised, and the controller held by the user starts vibrating or stops vibrating;
the keyboard operation module is configured to pop up a virtual keyboard in response to an operation of the user clicking an input box in the virtual reality environment, and to provide an auxiliary positioning element at the fingertip in response to an operation of placing a finger of the avatar on the keyboard in the virtual reality environment;
the displacement operation module is configured to respond to the operation of grabbing a blank position in the virtual reality environment by a user, and move the avatar of the user in the virtual reality environment to the position to realize displacement operation.
9. A computer device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the VR headset based virtual reality interaction method of any of claims 1-6.
10. A computer-readable storage medium, having stored thereon a computer program which, when executed by one or more processors, implements the VR headset based virtual reality interaction method of any of claims 1-6.
CN202310705479.8A 2023-06-15 2023-06-15 Virtual reality interaction method and device based on VR (virtual reality) head-mounted equipment Pending CN116449963A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310705479.8A CN116449963A (en) 2023-06-15 2023-06-15 Virtual reality interaction method and device based on VR (virtual reality) head-mounted equipment

Publications (1)

Publication Number Publication Date
CN116449963A 2023-07-18

Family

ID=87134090

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310705479.8A Pending CN116449963A (en) 2023-06-15 2023-06-15 Virtual reality interaction method and device based on VR (virtual reality) head-mounted equipment

Country Status (1)

Country Link
CN (1) CN116449963A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160371888A1 (en) * 2014-03-10 2016-12-22 Bae Systems Plc Interactive information display
CN109284000A (en) * 2018-08-10 2019-01-29 西交利物浦大学 Three-dimensional geometry object visualization method and system under a kind of reality environment
CN109669538A (en) * 2018-12-05 2019-04-23 中国航天员科研训练中心 One kind in virtual reality compound movement constraint under grasping body exchange method
CN110162179A (en) * 2019-05-24 2019-08-23 北京理工大学 A kind of Intellisense virtual assembly system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination