CN109847343B - Virtual reality interaction method and device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN109847343B
CN109847343B (application CN201811634513.2A)
Authority
CN
China
Prior art keywords
controller
angle range
virtual reality
user
viewpoint position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811634513.2A
Other languages
Chinese (zh)
Other versions
CN109847343A (en)
Inventor
许世杰
吴伟迪
谭清宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201811634513.2A priority Critical patent/CN109847343B/en
Publication of CN109847343A publication Critical patent/CN109847343A/en
Application granted granted Critical
Publication of CN109847343B publication Critical patent/CN109847343B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a virtual reality interaction method and apparatus, a storage medium, and an electronic device. The method includes: detecting a first control operation through the head-mounted display device, and acquiring a viewpoint position and a line of sight of a user; detecting a second control operation through the controller, and acquiring the position and the flip angle of the controller; acquiring the field-of-view angle range of the user and the flip angle range of the controller; judging whether a first preset condition is met according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range; and if so, displaying a control on the graphical user interface. Embodiments of the application do not interrupt interactive immersion in the virtual reality scene, improve operation efficiency, and improve the user experience.

Description

Virtual reality interaction method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of virtual reality technologies, and in particular, to a virtual reality interaction method and apparatus, a storage medium, and an electronic device.
Background
Virtual Reality (VR) technology is an emerging human-computer interface technology. Through an optical structure, a display system, a virtual reality engine, and the like, it can provide the user with a virtual reality scene offering comprehensive perception, chiefly visual but also including hearing, touch, and more. The user not only perceives the virtual reality scene through multiple sensory channels such as vision, hearing, touch, and acceleration, but also interacts with it via a handle, a remote controller, voice, motion, expression, gesture, line of sight, and so on, producing an immersive, on-the-scene experience. Virtual reality technology is now widely applied in fields such as gaming, medical care, education, and engineering training.
Taking games as an example, the greatest benefit of virtual reality technology is that it creates a strong sense of immersion for the user, greatly increasing the enjoyment of the game. By contrast, in an ordinary flat information-processing interface, operations are performed on a two-dimensional screen through input devices such as fingers, a handle, or a keyboard, enabling convenient, fast, and accurate interaction.
For example, in a game a player needs to open a menu in the interface or view character information. Two prior-art solutions exist: the first opens the menu by clicking a specific button on the handle; the second attaches the menu-opening button to the wrist, and the menu is opened by clicking that button with the other hand.
The first existing solution has two disadvantages. After entering a VR game, the player wears a VR head-mounted display device that blocks outside sight, so the player cannot directly see their own hands or the handle; if a particular button sits far from the player's fingers, a novice player may have difficulty finding it, and the learning cost is high. In addition, VR games emphasize immersion, and the buttons on the handle belong to the world outside the game, so relying on them weakens immersion to some extent.
The main disadvantage of the second prior-art solution is that it requires both hands to cooperate: in a combat game the player often holds a weapon in the other hand, and the menu can be opened only after the weapon is put down, which is cumbersome.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide a virtual reality interaction method and apparatus, a storage medium, and an electronic device, thereby overcoming, at least to some extent, one or more problems due to limitations and disadvantages of the related art.
According to an aspect of the present disclosure, there is provided a virtual reality interaction method applied to a virtual reality system. The virtual reality system includes at least a head-mounted display device and at least one controller; a program runs on the virtual reality system, and a graphical user interface is rendered on the head-mounted display device. The method includes:
detecting a first control operation through the head-mounted display device, and acquiring a viewpoint position and a line of sight of a user;
detecting a second control operation through the controller, and acquiring the position and the flip angle of the controller;
acquiring the field-of-view angle range of the user and the flip angle range of the controller;
judging whether a first preset condition is met according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range;
and if so, displaying a control on the graphical user interface.
According to an aspect of the present disclosure, there is provided a virtual reality interaction apparatus applied to a virtual reality system. The virtual reality system includes at least a head-mounted display device and at least one controller; a program runs on the virtual reality system, and a graphical user interface is rendered on the head-mounted display device. The apparatus includes:
a first obtaining module, configured to acquire a viewpoint position and a line of sight of a user by detecting a first control operation through the head-mounted display device;
a second obtaining module, configured to acquire the position and the flip angle of the controller by detecting a second control operation through the controller;
a third obtaining module, configured to acquire the field-of-view angle range of the user and the flip angle range of the controller;
a first judging module, configured to judge whether a first preset condition is met according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range;
and a first display module, configured to display a control on the graphical user interface if the first preset condition is met.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the virtual reality interaction method of any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the virtual reality interaction method of any one of the above via execution of the executable instructions.
The present disclosure provides a virtual reality interaction method and apparatus, a storage medium, and an electronic device. A first control operation may be detected through the head-mounted display device to acquire the viewpoint position and line of sight of the user; a second control operation may be detected through the controller to acquire the position and flip angle of the controller; the field-of-view angle range of the user and the flip angle range of the controller may be acquired; whether a first preset condition is met is judged according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range; and if so, a control is displayed on the graphical user interface. By acquiring the corresponding parameters of each device in the virtual reality system and triggering the display of a control on the graphical user interface of the head-mounted display device when the preset conditions are met, the entire operation can be performed with one hand. The trigger rule of the control matches the user's operating habits, interactive immersion in the virtual reality scene is not interrupted, operation efficiency is improved, and the user experience is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other features and advantages of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
fig. 1 is a flow chart of a virtual reality interaction method in an exemplary embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a virtual reality interaction in an exemplary embodiment of the present disclosure;
FIG. 3 is a block diagram of a virtual reality interaction apparatus in an exemplary embodiment of the present disclosure;
FIG. 4 is a block diagram of an electronic device in an exemplary embodiment of the present disclosure;
FIG. 5 is a schematic diagram illustrating a program product in an exemplary embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure can be practiced without one or more of the specific details, or with other methods, components, materials, devices, steps, and so forth. In other instances, well-known structures, methods, devices, implementations, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in the form of software, in one or more hardware modules or integrated circuits, or in different network and/or processor devices and/or microcontroller devices.
This exemplary embodiment first discloses a virtual reality interaction method, applied mainly to a virtual reality system, which may include a head-mounted display device and at least one controller. The head-mounted display device may consist of an optical structure and a display system, where the display system connects to an external virtual reality engine, receives the display content processed by that engine, and presents the virtual reality scene to the user through the optical structure; alternatively, the head-mounted display device may include only an optical structure, with the display system and virtual reality engine provided by an external device such as a smartphone. That is, the form of the virtual reality system to which the virtual reality interaction method applies is not particularly limited in this exemplary embodiment.
referring to fig. 1, the virtual reality interaction method may include the following steps:
S1, detecting a first control operation through the head-mounted display device to acquire a viewpoint position and a line of sight of the user;
S2, detecting a second control operation through the controller, and acquiring the position and the flip angle of the controller;
S3, acquiring the field-of-view angle range of the user and the flip angle range of the controller;
S4, judging whether a first preset condition is met according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range;
and S5, if yes, displaying a control on the graphical user interface.
According to the virtual reality interaction method in this exemplary embodiment, a first control operation may be detected through the head-mounted display device to acquire the viewpoint position and line of sight of the user; a second control operation may be detected through the controller to acquire the position and flip angle of the controller; the field-of-view angle range of the user and the flip angle range of the controller may be acquired; whether a first preset condition is met is judged according to these parameters; and if so, a control is displayed on the graphical user interface. By acquiring the corresponding parameters of each device in the virtual reality system and triggering the display of a control on the graphical user interface of the head-mounted display device when the preset conditions are met, the entire operation can be performed with one hand, the trigger rule of the control matches the user's operating habits, interactive immersion in the virtual reality scene is not interrupted, operation efficiency is improved, and the user experience is improved.
The virtual reality interaction method in the present exemplary embodiment will be further described with reference to fig. 2.
In an exemplary embodiment of the present disclosure, the method includes:
S1, detecting a first control operation through the head-mounted display device to acquire a viewpoint position and a line of sight of the user;
S2, detecting a second control operation through the controller, and acquiring the position and the flip angle of the controller;
S3, acquiring the field-of-view angle range of the user and the flip angle range of the controller;
S4, judging whether a first preset condition is met according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range;
and S5, if yes, displaying a control on the graphical user interface.
In an exemplary embodiment of the present disclosure, in step S1, a first control operation is detected through the head-mounted display device to acquire the viewpoint position and line of sight of the user. Specifically, the head-mounted display device may be a wearable device in the form of a helmet. The user wears it on the head and, once the virtual reality application is opened, sees the graphical user interface rendered on the device's display screen, with the corresponding content shown in that interface. The head-mounted display device may include multiple sensors, such as a gyroscope and a gravity sensor, through which head-motion information of the user is obtained; for example, when the user's head swings left or right, the device senses the swing information and changes the content displayed on the graphical user interface accordingly. The first control operation detected through the head-mounted display device may be a left-right swing of the head, or a raising or lowering of the head in the vertical direction. From the first control operation, the viewpoint position and line of sight of the user can be acquired: the viewpoint position may be set as the center position of the head-mounted display device, and the line of sight is a ray starting from that center position, at the angle at which the device senses the user raising or lowering the head.
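As a concrete illustration of this viewpoint/line-of-sight model, the following Python sketch (an assumption for illustration; the patent does not specify an implementation) builds the sight ray from the device's center position and the sensed pitch angle, in a 2-D vertical plane with x pointing forward and y pointing up:

```python
import math

def sight_ray(viewpoint, pitch_deg):
    """Return (origin, unit direction) of the sight ray: the ray starts at
    the viewpoint (taken as the centre of the head-mounted display) and is
    tilted by the pitch the device senses when the user raises (positive)
    or lowers (negative) the head."""
    rad = math.radians(pitch_deg)
    return viewpoint, (math.cos(rad), math.sin(rad))

origin, ahead = sight_ray((0.0, 1.6), 0.0)   # looking straight ahead
_, down = sight_ray((0.0, 1.6), -30.0)       # head lowered by 30 degrees
```

With a 0° pitch the direction is the horizontal unit vector (1, 0); lowering the head yields a downward-pointing ray.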
In an exemplary embodiment of the present disclosure, in step S2, a second control operation is detected through the controller to acquire the position and flip angle of the controller. The virtual reality system usually further includes at least one controller, which may be a hand-held device such as a handle, or a wearable device worn by the user such as a watch or a bracelet. Interaction in the virtual reality scene is mapped from user operations on the controller, or performed in response to the user's limb movements. A second control operation is detected through the controller, and the position and flip angle of the controller are acquired; in the virtual reality scene, the user operates the head-mounted display device and the controller in concert to trigger and complete the corresponding operation in the application.
In an exemplary embodiment of the present disclosure, in step S3, the field-of-view angle range of the user and the flip angle range of the controller are acquired. In a virtual reality application, both the head-mounted display device and the controller have a configured effective operation range, which can be set according to the needs of the developer or the user. For example, if the head-raising and head-lowering angle sensed by the head-mounted display device is limited to 30°, operations beyond that 30° range are regarded as invalid. The flip angle range of the controller can be set in the same way: the flip angle produced when the user raises or flicks the hand is detected, and the operation is effective if the angle falls within the flip angle range and invalid if it exceeds the range.
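The validity rule just described fits in a couple of lines; the 30° figure below is the example range from the text, not a value fixed by the patent:

```python
def is_effective(angle_deg, range_deg):
    """An operation is effective exactly when its sensed angle lies within
    the configured effective operation range; anything beyond the range is
    treated as an invalid operation."""
    return abs(angle_deg) <= range_deg

ok = is_effective(25, 30)    # within the 30-degree head-pitch range
bad = is_effective(40, 30)   # beyond the range: invalid operation
```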
In an exemplary embodiment of the present disclosure, in step S4, whether a first preset condition is met is judged according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range. Specifically, based on the parameters acquired in the previous steps, it is judged whether they satisfy a preset trigger condition, and the corresponding operation in the virtual reality scene is triggered accordingly.
And S5, if yes, displaying a control on the graphical user interface.
If the first preset condition is met, a control is displayed on the graphical user interface. The control may be the main control of a virtual reality game, or the most frequently used control in an application; it can be set by the developer or the user as required, which is not limited in this embodiment.
According to the virtual reality interaction method in this exemplary embodiment, a first control operation may be detected through the head-mounted display device to acquire the viewpoint position and line of sight of the user; a second control operation may be detected through the controller to acquire the position and flip angle of the controller; the field-of-view angle range of the user and the flip angle range of the controller may be acquired; whether a first preset condition is met is judged according to these parameters; and if so, a control is displayed on the graphical user interface. By acquiring the corresponding parameters of each device in the virtual reality system and triggering the display of a control on the graphical user interface of the head-mounted display device when the preset conditions are met, the entire operation can be performed with one hand, the trigger rule of the control matches the user's operating habits, interactive immersion in the virtual reality scene is not interrupted, operation efficiency is improved, and the user experience is improved.
Further, as an optional scheme, the first preset condition is:
x1 < y1 and x2 < y2;
where x1 is the included angle between the line connecting the viewpoint position to the controller position and the line of sight; x2 is the included angle between that connecting line and the ray perpendicular to the controller; y1 is the field-of-view angle range of the user; and y2 is the flip angle range of the controller.
Fig. 2 is a schematic diagram of a user wearing a head-mounted display device (not shown) and interacting through a hand-worn controller (a bracelet). As shown in the figure, the user's line of sight is L1, the line connecting the user's viewpoint position to the controller's position is L0, the ray perpendicular to the controller is L2, the included angle between L1 and L0 is x1, and the included angle between L2 and L0 is x2. The field-of-view angle range of the user is y1, and the flip angle range of the controller is y2. For example, if y1 is 30° and y2 is 45°, then with x1 = 25° and x2 = 30° the first preset condition is judged to be satisfied. With x1 = 25° and x2 = 50° it is not satisfied; likewise, with x1 = 35° and x2 = 35° it is not satisfied.
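The geometry of fig. 2 can be checked numerically. The sketch below is a hypothetical implementation (the patent gives only the condition, not code); vector names follow the labels L0, L1, L2 above, and it derives x1 and x2 from positions and direction vectors before evaluating the first preset condition with the example values y1 = 30° and y2 = 45°:

```python
import math

def angle_between(u, v):
    """Included angle in degrees between two 3-D vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def first_condition(viewpoint, sight_dir, ctrl_pos, ctrl_normal, y1, y2):
    """True when x1 < y1 and x2 < y2: x1 is the angle between the
    viewpoint-to-controller line L0 and the sight line L1, and x2 is the
    angle between L0 and the ray L2 perpendicular to the controller."""
    l0 = tuple(c - v for c, v in zip(ctrl_pos, viewpoint))
    x1 = angle_between(l0, sight_dir)
    x2 = angle_between(l0, ctrl_normal)
    return x1 < y1 and x2 < y2

# Geometry built so that x1 = 25 degrees; the two normals give x2 = 30 and 50.
d25 = math.radians(25)
ctrl = (math.cos(d25), -math.sin(d25), 0.0)                 # 25 deg below sight
norm_30 = (math.cos(math.radians(5)), math.sin(math.radians(5)), 0.0)
norm_50 = (math.cos(d25), math.sin(d25), 0.0)
shown = first_condition((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), ctrl, norm_30, 30, 45)
hidden_case = first_condition((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), ctrl, norm_50, 30, 45)
```

The first call reproduces the satisfied example (x1 = 25°, x2 = 30°); the second reproduces the x2 = 50° failure case.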
Further, as an optional scheme, after the step S5, the method further includes:
step S6: detecting in real time a change in at least one of the user viewpoint position, the line of sight, the position of the controller, and the flip angle, and acquiring the corresponding parameters after the change;
step S7: judging whether a second preset condition is met or not according to the changed corresponding parameters;
step S8: and if so, hiding the control on the graphical user interface.
Since the user's limb movements change dynamically in real time, in step S6 a change in at least one of the user viewpoint position, the line of sight, the position of the controller, and the flip angle is detected in real time, and the corresponding parameters after the change are acquired. These parameter changes need to be read and recorded promptly, because each change may express an operation intention of the user in the virtual reality scene. In step S7, whether a second preset condition is met is judged according to the changed parameters. In step S8, if it is met, the control is hidden on the graphical user interface. After the user has kept the invoked control open long enough, or when the control was invoked by an accidental operation, the user needs to close it. In a conventional design, a close button is placed on the interface, and the user clicks it through coordinated operation of the head-mounted display and the controller to close the control. However, this takes considerable time, and the close button is usually rendered small on the interface, demanding high operation precision and degrading the user experience. Therefore, in this embodiment a second preset condition is constructed from the changes of the corresponding parameters, and when it is met, closing of the control on the interface is triggered.
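Steps S6–S8 can be sketched as a per-frame monitoring loop. The frame-list structure and names below are assumptions for illustration; each frame supplies the angles x1 and x2 re-derived from the changed parameters:

```python
def monitor(frames, y1, y2):
    """Run steps S6-S8 over a sequence of frames: each frame supplies the
    re-derived angles (x1, x2); the control is hidden on the first frame
    where the second preset condition (x1 > y1 or x2 > y2) holds."""
    visible = True
    for x1, x2 in frames:                      # S6: real-time parameter changes
        if visible and (x1 > y1 or x2 > y2):   # S7: second preset condition
            visible = False                    # S8: hide the control
    return visible

still_visible = monitor([(20, 30), (25, 35)], 30, 45)  # stays within range
hidden = monitor([(20, 30), (45, 35)], 30, 45)         # x1 exceeds y1
```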
Further, as an optional scheme, the second preset condition is:
x1 > y1 or x2 > y2;
where x1 is the included angle between the line connecting the viewpoint position to the controller position and the line of sight; x2 is the included angle between that connecting line and the ray perpendicular to the controller; y1 is the field-of-view angle range of the user; and y2 is the flip angle range of the controller.
As shown in fig. 2, the second preset condition involves the same parameters as the first preset condition, so the same mechanism performs the trigger judgment for both operations, which saves system resources to a certain extent. The user's line of sight is L1, the line connecting the user's viewpoint position to the controller's position is L0, the ray perpendicular to the controller is L2, the included angle between L1 and L0 is x1, and the included angle between L2 and L0 is x2. The field-of-view angle range of the user is y1, and the flip angle range of the controller is y2. For example, if y1 is 30° and y2 is 45°, then when x1 = 45° the second preset condition is judged to be satisfied; the same holds when x1 = 50°. That is, whenever x1 > y1 or x2 > y2 is detected, the second preset condition is determined to be met, and closing of the control on the interface is triggered.
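A minimal sketch of the second preset condition itself, using the example ranges quoted above (y1 = 30°, y2 = 45°) as defaults; the function name and defaults are illustrative assumptions:

```python
def second_condition(x1, x2, y1=30.0, y2=45.0):
    """The control is hidden when either angle leaves its range:
    x1 > y1 or x2 > y2."""
    return x1 > y1 or x2 > y2

hide = second_condition(45, 30)   # x1 = 45 deg > y1 = 30 deg: hide the control
keep = second_condition(25, 30)   # both angles within range: keep it shown
```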
In summary, according to the virtual reality interaction method in this exemplary embodiment, a first control operation may be detected through the head-mounted display device to acquire the viewpoint position and line of sight of the user; a second control operation may be detected through the controller to acquire the position and flip angle of the controller; the field-of-view angle range of the user and the flip angle range of the controller may be acquired; whether a first preset condition is met is judged according to these parameters; and if so, a control is displayed on the graphical user interface. By acquiring the corresponding parameters of each device in the virtual reality system and triggering the display of a control on the graphical user interface of the head-mounted display device when the preset conditions are met, the entire operation can be performed with one hand, the trigger rule of the control matches the user's operating habits, and interactive immersion in the virtual reality scene is not interrupted. Furthermore, the same trigger mechanism enables quick closing of the control on the interface, improving operation efficiency and the user experience.
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
In an exemplary embodiment of the present disclosure, there is also provided a virtual reality interaction apparatus applied to a virtual reality system. The virtual reality system includes at least a head-mounted display device and at least one controller; a program runs on the virtual reality system, and a graphical user interface is rendered on the head-mounted display device. As shown in fig. 3, the virtual reality interaction apparatus 10 may include: a first obtaining module 101, a second obtaining module 102, a third obtaining module 103, a first judging module 104, and a first display module 105. The apparatus includes:
a first obtaining module, configured to acquire a viewpoint position and a line of sight of a user by detecting a first control operation through the head-mounted display device;
a second obtaining module, configured to acquire the position and the flip angle of the controller by detecting a second control operation through the controller;
a third obtaining module, configured to acquire the field-of-view angle range of the user and the flip angle range of the controller;
a first judging module, configured to judge whether a first preset condition is met according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range;
and a first display module, configured to display a control on the graphical user interface if the first preset condition is met.
In the present exemplary embodiment, the first preset condition is:
x1 < y1 and x2 < y2;
where x1 is the included angle between the line connecting the viewpoint position to the controller position and the line of sight; x2 is the included angle between that connecting line and the ray perpendicular to the controller; y1 is the field-of-view angle range of the user; and y2 is the flip angle range of the controller.
In this example embodiment, the virtual reality interacting device may further include:
a detection and acquisition module (not shown in the figure), configured to detect in real time a change in at least one of the user viewpoint position, the sight line, the position of the controller, and the turning angle, and to acquire the changed parameters;
a second judgment module (not shown in the figure), configured to judge whether a second preset condition is satisfied according to the changed parameters;
and a second display module (not shown in the figure), configured to hide the control on the graphical user interface if the second preset condition is satisfied.
In this example embodiment, the second preset condition is:
x1 > y1 or x2 > y2;
wherein x1 is the included angle between the sight line and the line connecting the viewpoint position and the position of the controller, x2 is the included angle between that connecting line and a ray perpendicular to the controller, y1 is the visual field angle range of the user, and y2 is the turning angle range of the controller.
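Read together, the two preset conditions define a per-frame show/hide decision. The sketch below is illustrative only: the function name and the choice to leave the state unchanged on the exact boundary (where neither strict inequality holds) are assumptions not specified by the patent; x1, x2, y1, and y2 are the angles and ranges defined above.

```python
def update_control_visibility(visible, x1, x2, y1, y2):
    """Return the new visibility of the control.

    First preset condition (show):  x1 < y1 and x2 < y2
    Second preset condition (hide): x1 > y1 or  x2 > y2
    """
    if not visible and x1 < y1 and x2 < y2:
        return True   # first preset condition met: display the control
    if visible and (x1 > y1 or x2 > y2):
        return False  # second preset condition met: hide the control
    return visible    # on the boundary, keep the current state
```

Because the show test uses strict `<` and the hide test strict `>`, an angle sitting exactly on a range boundary leaves the control unchanged, which avoids flicker when a tracked value hovers at the threshold.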
The specific details of each module of the virtual reality interaction apparatus have already been described in the corresponding virtual reality interaction method, and are therefore not repeated here.
It should be noted that although several modules or units of the apparatus are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided into and embodied by a plurality of modules or units.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 4. The electronic device 600 shown in fig. 4 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 4, the electronic device 600 is embodied in the form of a general-purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one storage unit 620, a bus 630 connecting different system components (including the storage unit 620 and the processing unit 610), and a display unit 640.
Wherein the storage unit stores program code that is executable by the processing unit 610 to cause the processing unit 610 to perform steps according to various exemplary embodiments of the present invention as described in the above section "exemplary methods" of the present specification.
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 6201 and/or a cache memory unit 6202, and may further include a read-only memory unit (ROM) 6203.
The storage unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205. Such program modules 6205 include, but are not limited to: an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment.
Bus 630 may be one or more of several types of bus structures, including a storage unit bus or storage unit controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. As shown, the network adapter 660 communicates with the other modules of the electronic device 600 over the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash disk, a removable hard disk, etc.) or on a network, and which includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 5, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (8)

1. A virtual reality interaction method, applied to a virtual reality system, the virtual reality system at least comprising a head-mounted display device and at least one controller, wherein a program runs on the virtual reality system and a graphical user interface is rendered on the head-mounted display device, the method comprising:
detecting a first control operation through the head-mounted display device, and acquiring a viewpoint position and a sight line of a user;
detecting a second control operation through the controller, and acquiring the position and the turning angle of the controller;
acquiring the visual field angle range of the user and the turning angle range of the controller;
judging whether a first preset condition is satisfied according to the viewpoint position, the sight line, the position and the turning angle of the controller, the visual field angle range, and the turning angle range, wherein the first preset condition is:
x1 < y1 and x2 < y2;
wherein x1 is the included angle between the sight line and the line connecting the viewpoint position and the position of the controller, x2 is the included angle between that connecting line and a ray perpendicular to the controller, y1 is the visual field angle range of the user, and y2 is the turning angle range of the controller;
and if so, displaying a control on the graphical user interface.
2. The method of claim 1, wherein after the step of displaying a control on the graphical user interface, further comprises:
detecting in real time a change in at least one of the user viewpoint position, the sight line, the position of the controller, and the turning angle, and acquiring the changed parameters;
judging whether a second preset condition is satisfied according to the changed parameters;
and if so, hiding the control on the graphical user interface.
3. The method according to claim 2, wherein the second preset condition is:
x1 > y1 or x2 > y2;
wherein x1 is the included angle between the sight line and the line connecting the viewpoint position and the position of the controller, x2 is the included angle between that connecting line and a ray perpendicular to the controller, y1 is the visual field angle range of the user, and y2 is the turning angle range of the controller.
4. A virtual reality interaction apparatus, applied to a virtual reality system, the virtual reality system at least comprising a head-mounted display device and at least one controller, wherein a program runs on the virtual reality system and a graphical user interface is rendered on the head-mounted display device, the apparatus comprising:
the first acquisition module is used for acquiring the viewpoint position and the sight line of a user by detecting a first control operation through the head-mounted display device;
the second acquisition module is used for acquiring the position and the turning angle of the controller by detecting a second control operation through the controller;
the third acquisition module is used for acquiring the visual field angle range of the user and the turning angle range of the controller;
the first judgment module is used for judging whether a first preset condition is satisfied according to the viewpoint position, the sight line, the position and the turning angle of the controller, the visual field angle range, and the turning angle range, the first preset condition being:
x1 < y1 and x2 < y2;
wherein x1 is the included angle between the sight line and the line connecting the viewpoint position and the position of the controller, x2 is the included angle between that connecting line and a ray perpendicular to the controller, y1 is the visual field angle range of the user, and y2 is the turning angle range of the controller;
and the first display module is used for displaying a control on the graphical user interface if the first preset condition is met.
5. The apparatus of claim 4, wherein the apparatus further comprises:
the detection and acquisition module is used for detecting in real time a change in at least one of the user viewpoint position, the sight line, the position of the controller, and the turning angle, and acquiring the changed parameters;
the second judgment module is used for judging whether a second preset condition is satisfied according to the changed parameters;
and the second display module is used for hiding the control on the graphical user interface if the second preset condition is met.
6. The apparatus of claim 5, wherein the second preset condition is:
x1 > y1 or x2 > y2;
wherein x1 is the included angle between the sight line and the line connecting the viewpoint position and the position of the controller, x2 is the included angle between that connecting line and a ray perpendicular to the controller, y1 is the visual field angle range of the user, and y2 is the turning angle range of the controller.
7. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the virtual reality interaction method of any one of claims 1 to 3.
8. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the virtual reality interaction method of any of claims 1-3 via execution of the executable instructions.
CN201811634513.2A 2018-12-29 2018-12-29 Virtual reality interaction method and device, storage medium and electronic equipment Active CN109847343B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811634513.2A CN109847343B (en) 2018-12-29 2018-12-29 Virtual reality interaction method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811634513.2A CN109847343B (en) 2018-12-29 2018-12-29 Virtual reality interaction method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN109847343A CN109847343A (en) 2019-06-07
CN109847343B true CN109847343B (en) 2022-02-15

Family

ID=66893293

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811634513.2A Active CN109847343B (en) 2018-12-29 2018-12-29 Virtual reality interaction method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN109847343B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105393190A (en) * 2013-06-25 2016-03-09 微软技术许可有限责任公司 Selecting user interface elements via position signal
CN106462231A (en) * 2014-03-17 2017-02-22 ITU Business Development Corporation Computer-implemented gaze interaction method and apparatus
WO2017100755A1 (en) * 2015-12-10 2017-06-15 Appelago Inc. Automated migration of animated icons for dynamic push notifications
CN106861184A (en) * 2016-12-28 2017-06-20 北京乐动卓越科技有限公司 A kind of method and system that man-machine interaction is realized in immersion VR game
CN108536374A (en) * 2018-04-13 2018-09-14 网易(杭州)网络有限公司 Virtual objects direction-controlling method and device, electronic equipment, storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10921896B2 (en) * 2015-03-16 2021-02-16 Facebook Technologies, Llc Device interaction in augmented reality


Also Published As

Publication number Publication date
CN109847343A (en) 2019-06-07

Similar Documents

Publication Publication Date Title
CN107469354B (en) Visible sensation method and device, storage medium, the electronic equipment of compensating sound information
CN106843498B (en) Dynamic interface interaction method and device based on virtual reality
CN107930119B (en) Information processing method, information processing device, electronic equipment and storage medium
CN110115842B (en) Application processing system, application processing method, and application processing program
US20200387286A1 (en) Arm gaze-driven user interface element gating for artificial reality systems
CN104281260A (en) Method and device for operating computer and mobile phone in virtual world and glasses adopting method and device
CN110568929B (en) Virtual scene interaction method and device based on virtual keyboard and electronic equipment
JP2017530438A (en) Object placement based on gaze in a virtual reality environment
US10921879B2 (en) Artificial reality systems with personal assistant element for gating user interface elements
US20190026589A1 (en) Information processing device, information processing method, and program
US11086475B1 (en) Artificial reality systems with hand gesture-contained content window
KR20220032059A (en) Touch free interface for augmented reality systems
US11043192B2 (en) Corner-identifiying gesture-driven user interface element gating for artificial reality systems
CN106598246B (en) Interaction control method and device based on virtual reality
CN108776544B (en) Interaction method and device in augmented reality, storage medium and electronic equipment
US10852839B1 (en) Artificial reality systems with detachable personal assistant for gating user interface elements
CN109542323A (en) Interaction control method and device, storage medium, electronic equipment
US20240211053A1 (en) Intention-based user interface control for electronic devices
CN110231910A (en) A kind of control method and terminal device
CN109847343B (en) Virtual reality interaction method and device, storage medium and electronic equipment
CN110075534B (en) Real-time voice method and device, storage medium and electronic equipment
CN110908568B (en) Control method and device for virtual object
CN115480639A (en) Human-computer interaction system, human-computer interaction method, wearable device and head display device
JP6999822B2 (en) Terminal device and control method of terminal device
US11816757B1 (en) Device-side capture of data representative of an artificial reality environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant