CN109847343A - Virtual reality interaction method and device, storage medium, electronic equipment - Google Patents

Virtual reality interaction method and device, storage medium, electronic equipment

Info

Publication number
CN109847343A
Authority
CN
China
Prior art keywords
controller
virtual reality
flip angle
user
angle range
Prior art date
Legal status
Granted
Application number
CN201811634513.2A
Other languages
Chinese (zh)
Other versions
CN109847343B (en)
Inventor
许世杰
吴伟迪
谭清宇
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201811634513.2A
Publication of CN109847343A
Application granted
Publication of CN109847343B
Legal status: Active
Anticipated expiration

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a virtual reality interaction method and apparatus, a storage medium, and an electronic device. The method comprises: detecting a first control operation through a head-mounted display device, and obtaining a user viewpoint position and line of sight; detecting a second control operation through a controller, and obtaining the position and flip angle of the controller; obtaining the field-of-view angle range of the user and the flip angle range of the controller; determining, according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range, whether a first preset condition is met; and if so, displaying a control on the graphical user interface. The embodiments of the present disclosure do not interrupt the sense of immersion during interaction in a virtual reality scene, improve operating efficiency, and improve the user experience.

Description

Virtual reality interaction method and device, storage medium, electronic equipment
Technical field
The present disclosure relates to the technical field of virtual reality, and in particular to a virtual reality interaction method and apparatus, a storage medium, and an electronic device.
Background technique
Virtual reality (VR) technology is an emerging, digital human-machine interface technology. In virtual reality technology, components such as an optical structure, a display system, and a virtual reality engine jointly provide the user with a virtual reality scene that is based primarily on visual experience but also integrates perception such as hearing and touch. Moreover, the user can not only perceive the virtual reality scene through multiple sensory channels such as vision, hearing, touch, and acceleration, but can also interact with the virtual reality scene through handles, remote controls, voice, motion, expressions, gestures, gaze, and the like, producing an immersive experience. At present, virtual reality technology has been widely applied in fields such as gaming, medical treatment, education, and engineering training.
Taking game applications as an example, the greatest benefit brought by virtual reality technology is that it can build a very strong sense of immersion for the user, greatly improving the enjoyment of the game. On the other hand, in a conventional plane-based information processing interface, convenient, fast, and accurate interactive operations can be achieved on a two-dimensional screen through input devices such as fingers, handles, or keyboards.
For example, in a game, a player needs to open a menu in the interface or check character information. The prior art offers two schemes: in the first, the menu is opened by clicking a certain key on the handle; in the second, the button for opening the menu is attached to the wrist, and the menu is opened by clicking the button with the other hand.
The first existing scheme has the following two disadvantages: after entering a VR game, since the player wears a VR head-mounted display device that blocks the line of sight, the player cannot directly see his or her own hands and handle. If a certain key is relatively far from the player's finger, a novice player may not easily find the key, and the learning cost is high. In addition, VR games emphasize immersion, and the keys on the handle belong to things outside the game world, so they weaken the sense of immersion to a certain degree.
The main drawback of the second existing scheme is that it requires both hands to cooperate. In a battle game, the player often holds a weapon in the other hand and must first put down the weapon before the menu can be opened, which is rather troublesome.
It should be noted that the information disclosed in the above background section is only intended to reinforce the understanding of the background of the present disclosure, and therefore may include information that does not constitute prior art known to a person of ordinary skill in the art.
Summary of the invention
An object of the present disclosure is to provide a virtual reality interaction method and apparatus, a storage medium, and an electronic device, thereby overcoming, at least to a certain extent, one or more of the problems caused by the limitations and defects of the related art.
According to one aspect of the present disclosure, a virtual reality interaction method is provided, which is applied to a virtual reality system. The virtual reality system includes at least one head-mounted display device and at least one controller; a program runs in the virtual reality system, and a graphical user interface is rendered on the head-mounted display device. The method comprises:
detecting a first control operation through the head-mounted display device, and obtaining a user viewpoint position and line of sight;
detecting a second control operation through the controller, and obtaining the position and flip angle of the controller;
obtaining the field-of-view angle range of the user and the flip angle range of the controller;
determining, according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range, whether a first preset condition is met;
and if so, displaying a control on the graphical user interface.
According to one aspect of the present disclosure, a virtual reality interaction apparatus is provided, which is applied to a virtual reality system. The virtual reality system includes at least one head-mounted display device and at least one controller; a program runs in the virtual reality system, and a graphical user interface is rendered on the head-mounted display device. The apparatus comprises:
a first obtaining module, configured to detect a first control operation through the head-mounted display device and obtain a user viewpoint position and line of sight;
a second obtaining module, configured to detect a second control operation through the controller and obtain the position and flip angle of the controller;
a third obtaining module, configured to obtain the field-of-view angle range of the user and the flip angle range of the controller;
a first determination module, configured to determine, according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range, whether a first preset condition is met;
a first display module, configured to display a control on the graphical user interface when the first preset condition is met.
According to one aspect of the present disclosure, a computer-readable storage medium is provided, on which a computer program is stored; when executed by a processor, the computer program implements the virtual reality interaction method described in any one of the above.
According to one aspect of the present disclosure, an electronic device is provided, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to execute the virtual reality interaction method described in any one of the above by executing the executable instructions.
In the virtual reality interaction method and apparatus, storage medium, and electronic device provided by an example embodiment of the present disclosure, a first control operation may be detected through the head-mounted display device to obtain the user viewpoint position and line of sight; a second control operation is detected through the controller to obtain the position and flip angle of the controller; the field-of-view angle range of the user and the flip angle range of the controller are obtained; according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range, it is determined whether a first preset condition is met; and if so, a control is displayed on the graphical user interface. By obtaining the relevant parameters of each device in the virtual reality system and, when certain preset conditions are met, triggering display of the corresponding control on the graphical user interface of the head-mounted display device, the whole operation can be performed with one hand, the triggering rule of the control conforms to the user's operating habits, and the sense of immersion during interaction in the virtual reality scene is not interrupted, which improves operating efficiency and the user experience.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief description of the drawings
The above and other features and advantages of the present disclosure will become more apparent from the detailed description of its exemplary embodiments with reference to the accompanying drawings. Obviously, the drawings in the following description are only some embodiments of the present disclosure; for a person of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort. In the drawings:
Fig. 1 is a flowchart of a virtual reality interaction method in an exemplary embodiment of the present disclosure;
Fig. 2 is a schematic diagram of virtual reality interaction in an exemplary embodiment of the present disclosure;
Fig. 3 is a block diagram of a virtual reality interaction apparatus of the present disclosure;
Fig. 4 is a schematic diagram of the modules of an electronic device in an exemplary embodiment of the present disclosure;
Fig. 5 is a schematic diagram of a program product in an exemplary embodiment of the present disclosure.
Detailed description of embodiments
Example embodiments will now be described more fully with reference to the accompanying drawings. However, example embodiments can be implemented in various forms and should not be construed as being limited to the embodiments set forth herein; rather, these embodiments are provided so that the present disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar parts, and repeated description thereof will be omitted.
In addition, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, many specific details are provided to give a thorough understanding of the embodiments of the present disclosure. However, those skilled in the art will appreciate that the technical solutions of the present disclosure may be practiced without one or more of the specific details, or other methods, components, materials, devices, steps, and the like may be employed. In other cases, well-known structures, methods, devices, implementations, materials, or operations are not shown or described in detail in order to avoid obscuring aspects of the present disclosure.
The block diagrams shown in the drawings are merely functional entities and do not necessarily correspond to physically separate entities. That is, these functional entities, or parts of them, may be implemented in software form, in one or more hardened software modules, or in different network and/or processor devices and/or microcontroller devices.
In the present exemplary embodiment, a virtual reality interaction method is first disclosed, which is mainly applied to a virtual reality system. The virtual reality system may include a head-mounted display device and at least one controller. The head-mounted display device may, for example, consist of an optical structure and a display system, where the display system is connected to an external virtual reality engine to receive the display content processed by the external virtual reality engine, and a virtual reality scene is then presented to the user through the optical structure; alternatively, it may include only the optical structure, with the display system and the virtual reality engine provided by an external device such as a smartphone. That is, the present exemplary embodiment does not place any particular limitation on the virtual reality system to which the virtual reality interaction method is applied.
Referring to Fig. 1, the virtual reality interaction method in this exemplary embodiment may include the following steps:
Step S1: detecting a first control operation through the head-mounted display device, and obtaining the user viewpoint position and line of sight;
Step S2: detecting a second control operation through the controller, and obtaining the position and flip angle of the controller;
Step S3: obtaining the field-of-view angle range of the user and the flip angle range of the controller;
Step S4: determining, according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range, whether a first preset condition is met;
Step S5: if so, displaying a control on the graphical user interface.
According to the virtual reality interaction method of the present exemplary embodiment, a first control operation can be detected through the head-mounted display device to obtain the user viewpoint position and line of sight; a second control operation is detected through the controller to obtain the position and flip angle of the controller; the field-of-view angle range of the user and the flip angle range of the controller are obtained; according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range, it is determined whether a first preset condition is met; and if so, a control is displayed on the graphical user interface. By obtaining the relevant parameters of each device in the virtual reality system and, when certain preset conditions are met, triggering display of the corresponding control on the graphical user interface of the head-mounted display device, the whole operation can be performed with one hand, the triggering rule of the control conforms to the user's operating habits, and the sense of immersion during interaction in the virtual reality scene is not interrupted, which improves operating efficiency and the user experience.
In the following, the virtual reality interaction method in the present exemplary embodiment will be further described with reference to Fig. 2.
In an exemplary embodiment of the present disclosure, the method comprises:
Step S1: detecting a first control operation through the head-mounted display device, and obtaining the user viewpoint position and line of sight;
Step S2: detecting a second control operation through the controller, and obtaining the position and flip angle of the controller;
Step S3: obtaining the field-of-view angle range of the user and the flip angle range of the controller;
Step S4: determining, according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range, whether a first preset condition is met;
Step S5: if so, displaying a control on the graphical user interface.
In an exemplary embodiment of the present disclosure, in step S1, a first control operation is detected through the head-mounted display device, and the user viewpoint position and line of sight are obtained. Specifically, the head-mounted display device may be a wearable device in the form of a helmet that the user wears on the head. In response to the launch of a virtual reality application, the user can see that a graphical user interface is rendered on the display screen of the head-mounted display device, and corresponding content is displayed on the graphical user interface. The head-mounted display device may include multiple sensors, such as a gyroscope and a gravity sensor; through these sensors, the head movement information of the user can be obtained. For example, if the user's head swings, the head-mounted display device senses this swing information accordingly and, according to the swing information, controls the content displayed on the graphical user interface to change correspondingly. A first control operation is detected through the head-mounted display device; the first control operation may be a swing in the left-right direction, or raising or lowering the head in the vertical direction. For the first control operation, the viewpoint position and line of sight of the user can be obtained, where the viewpoint position may be set as the center of the head-mounted display device, and the line of sight is a ray that starts at the center of the head-mounted display device and is generated according to the head-raising or head-lowering angle of the user sensed by the head-mounted display device.
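For illustration, the viewpoint and line of sight described above can be derived from the head pose with a few lines of vector math. The following is a minimal sketch, assuming the head-mounted display reports its center position and Euler pitch/yaw angles; the function and parameter names are illustrative and not taken from the patent:

```python
import numpy as np

def gaze_ray(hmd_center, pitch_deg, yaw_deg):
    """Return (viewpoint, gaze direction): the viewpoint is taken as the HMD center,
    and the unit gaze direction is built from head pitch (up/down) and yaw (left/right)."""
    pitch, yaw = np.radians(pitch_deg), np.radians(yaw_deg)
    # Forward vector in a right-handed, y-up coordinate system.
    direction = np.array([
        np.cos(pitch) * np.sin(yaw),  # x
        np.sin(pitch),                # y: positive pitch looks up
        np.cos(pitch) * np.cos(yaw),  # z
    ])
    return np.asarray(hmd_center, dtype=float), direction / np.linalg.norm(direction)
```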
In an exemplary embodiment of the present disclosure, in step S2, a second control operation is detected through the controller, and the position and flip angle of the controller are obtained. A virtual reality system usually further includes at least one controller, which may be a controller held by the user, such as a handle, or a wearable device worn by the user, such as a watch or a wristband. User operations based on the controller, or limb movements responded to by the controller, are mapped to interactive operations on the interface in the virtual reality scene. A second control operation is detected through the controller, and the position and flip angle of the controller are obtained. In a virtual reality scene, the user needs to coordinate operations between the head-mounted display device and the controller in order to trigger and complete the corresponding operations in the application.
In an exemplary embodiment of the present disclosure, in step S3, the field-of-view angle range of the user and the flip angle range of the controller are obtained. In a virtual reality application, both the head-mounted display device and the controller are given an effective operating range, which can be configured according to the needs of the developer or the user. For example, the angle over which the head-mounted display device senses the user raising or lowering the head may be 30°, and operations beyond the 30° range are regarded as invalid. Similarly, the flip angle range of the controller can be configured accordingly: the flip angle at which the user raises a hand or swings an arm is detected, and the operation is valid if it falls within the flip angle range, and invalid if it goes beyond the range.
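As a concrete illustration of these configurable ranges, the sketch below groups the two thresholds used in the numeric examples later in this description (a 30° field-of-view angle range and a 45° flip angle range) into one configuration object; the class and field names are illustrative and not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class InteractionConfig:
    fov_angle_range_deg: float = 30.0   # y1: field-of-view angle range of the user
    flip_angle_range_deg: float = 45.0  # y2: flip angle range of the controller
```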
In an exemplary embodiment of the present disclosure, in step S4, it is determined, according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range, whether a first preset condition is met. Specifically, based on the user viewpoint position, line of sight, controller position and flip angle, field-of-view angle range, and flip angle range obtained in the preceding steps, it is determined whether these parameters meet a preset trigger condition; if so, the corresponding operation in the virtual reality scene is triggered accordingly.
Step S5: if so, displaying a control on the graphical user interface.
If the first preset condition is met, a control is displayed on the graphical user interface. The control may be the dominant control of a virtual reality game, or the most frequently used control in the application, and may be configured by the developer or the user according to requirements; the present embodiment is not limited in this respect.
According to the virtual reality interaction method of the present exemplary embodiment, a first control operation can be detected through the head-mounted display device to obtain the user viewpoint position and line of sight; a second control operation is detected through the controller to obtain the position and flip angle of the controller; the field-of-view angle range of the user and the flip angle range of the controller are obtained; according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range, it is determined whether a first preset condition is met; and if so, a control is displayed on the graphical user interface. By obtaining the relevant parameters of each device in the virtual reality system and, when certain preset conditions are met, triggering display of the corresponding control on the graphical user interface of the head-mounted display device, the whole operation can be performed with one hand, the triggering rule of the control conforms to the user's operating habits, and the sense of immersion during interaction in the virtual reality scene is not interrupted, which improves operating efficiency and the user experience.
Further, as an optional solution, the first preset condition is:
x1 < y1, and x2 < y2;
wherein x1 is the angle between the line connecting the viewpoint position and the position of the controller and the line of sight, x2 is the angle between the line connecting the viewpoint position and the position of the controller and the ray perpendicular to the controller, y1 is the field-of-view angle range of the user, and y2 is the flip angle range of the controller.
Fig. 2 is a schematic diagram of the interaction between a user wearing the head-mounted display device (not shown) and a hand-held controller (wristband). As shown, the line of sight of the user is L1, the line between the user viewpoint position and the position of the controller is L0, and the ray perpendicular to the controller is L2; the angle between L1 and L0 is x1, and the angle between L2 and L0 is x2. The field-of-view angle range of the user is y1, and the flip angle range of the controller is y2. For example, if the field-of-view angle range y1 of the user is 30° and the flip angle range y2 of the controller is 45°, then when x1 = 25° and x2 = 30°, the first preset condition is determined to be met. If x1 = 25° and x2 = 50°, the first preset condition is not met; similarly, if x1 = 35° and x2 = 35°, the first preset condition is also not met.
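The geometric test above can be expressed directly in code. The following is a minimal sketch, assuming the positions are 3D vectors and the controller reports a unit normal for the ray L2; the function names are illustrative and not taken from the patent:

```python
import numpy as np

def angle_deg(u, v):
    """Angle in degrees between two 3D vectors."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def first_condition_met(viewpoint, gaze_dir, ctrl_pos, ctrl_normal, y1_deg, y2_deg):
    """First preset condition: x1 < y1 and x2 < y2, where L0 runs from the viewpoint
    to the controller, L1 is the gaze direction, and L2 is the ray perpendicular
    to the controller."""
    l0 = np.asarray(ctrl_pos, dtype=float) - np.asarray(viewpoint, dtype=float)
    x1 = angle_deg(l0, gaze_dir)     # angle between L0 and L1
    x2 = angle_deg(l0, ctrl_normal)  # angle between L0 and L2
    return x1 < y1_deg and x2 < y2_deg
```

With y1 = 30° and y2 = 45° as in the example, inputs that yield x1 = 25° and x2 = 30° satisfy the condition, whereas x1 = 25°, x2 = 50° or x1 = 35°, x2 = 35° do not.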
Further, as an optional solution, after step S5 the method further includes:
Step S6: detecting in real time a change in at least one of the user viewpoint position, the line of sight, the position of the controller, and the flip angle, and obtaining the relevant parameters after the change;
Step S7: determining, according to the relevant parameters after the change, whether a second preset condition is met;
Step S8: if so, hiding the control on the graphical user interface.
Since the limb movements of the user change dynamically in real time, in step S6 a change in at least one of the user viewpoint position, the line of sight, the position of the controller, and the flip angle is detected in real time, and the relevant parameters after the change are obtained. Changes in these parameters need to be read and recorded promptly, because a change in any parameter may express the user's operating intention in the virtual reality scene. In step S7, it is determined, according to the relevant parameters after the change, whether a second preset condition is met. In step S8, if so, the control is hidden on the graphical user interface. After calling up the control, the user may have finished with it, or may have called it up by mistake, and needs to close it. The conventional approach is to provide a close button on the interface, which the user clicks through the coordinated operation of the head-mounted display and the controller to trigger closing of the control. However, this operation takes a considerable amount of time, and the close control is usually displayed as a small element on the interface, which places high demands on the accuracy of the operation and results in a poor user experience. Therefore, the present embodiment again uses the changes in the relevant parameters to construct a second preset condition, and when the second preset condition is met, closing of the control in the interface is triggered.
Further, as an optional solution, the second preset condition is:
x1 > y1, or x2 > y2;
wherein x1 is the angle between the line connecting the viewpoint position and the position of the controller and the line of sight, x2 is the angle between the line connecting the viewpoint position and the position of the controller and the ray perpendicular to the controller, y1 is the field-of-view angle range of the user, and y2 is the flip angle range of the controller.
As shown in Fig. 2, the second preset condition involves the same parameters as the first preset condition, and the same mechanism is used to determine whether the corresponding operation is triggered, which can also save system resources to a certain extent. The line of sight of the user is L1, the line between the user viewpoint position and the position of the controller is L0, and the ray perpendicular to the controller is L2; the angle between L1 and L0 is x1, and the angle between L2 and L0 is x2. The field-of-view angle range of the user is y1, and the flip angle range of the controller is y2. For example, if the field-of-view angle range y1 of the user is 30° and the flip angle range y2 of the controller is 45°, then when x1 = 45°, the second preset condition is determined to be met; if x1 = 50°, the second preset condition is likewise met. Whenever x1 > y1 or x2 > y2 is detected, the second preset condition is determined to be met, and closing of the control in the interface is triggered.
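Combining the two conditions, a per-frame update can toggle the visibility of the control with one hand. The sketch below is only an illustration of the show/hide logic under the same assumptions as the previous example (whose angle_deg helper it reuses); the function name and parameter defaults (30°/45°, as in the examples above) are not taken from the patent:

```python
import numpy as np  # angle_deg from the previous sketch is assumed to be in scope

def update_control_visibility(visible, viewpoint, gaze_dir, ctrl_pos, ctrl_normal,
                              y1_deg=30.0, y2_deg=45.0):
    """Show the control when x1 < y1 and x2 < y2 (first preset condition);
    hide it when x1 > y1 or x2 > y2 (second preset condition);
    on the exact boundary, keep the previous state."""
    l0 = np.asarray(ctrl_pos, dtype=float) - np.asarray(viewpoint, dtype=float)
    x1 = angle_deg(l0, gaze_dir)
    x2 = angle_deg(l0, ctrl_normal)
    if x1 < y1_deg and x2 < y2_deg:
        return True
    if x1 > y1_deg or x2 > y2_deg:
        return False
    return visible
```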
In conclusion the virtual reality exchange method according to the present exemplary embodiment, according to the present exemplary embodiment in Virtual reality exchange method, can by wear display equipment detect the first control operate, obtain user's viewpoint position and Sight;It detects that the second control operates by controller, obtains position and the flip angle of controller;Obtain the view of the user The flip angle range of wild angular range and the controller;According to the viewpoint position, the sight, the controller Position and flip angle and the field-of-view angle range and the flip angle range, it is pre- to determine whether to meet one first If condition;If so, showing a control on the graphical user interface.According to obtain virtual reality system in each equipment it is corresponding Parameter triggers under the conditions of meeting certain predetermined and shows corresponding control on the graphic user interface for wearing display equipment, whole A operation only needs one hand that can carry out, and the triggering rule of control also complies with the operating habit of user, and will not interrupt virtual The feeling of immersion of interaction in reality scene.Further, by identical trigger mechanism, the quick closedown of control in interface is realized Operation improves operating efficiency and the user experience is improved.
It should be noted that although the steps of the method of the present disclosure are described in a particular order in the drawings, this does not require or imply that these steps must be executed in that particular order, or that all of the steps shown must be executed to achieve the desired result. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step for execution, and/or one step may be decomposed into multiple steps for execution, and so on.
In an exemplary embodiment of the present disclosure, a virtual reality interaction apparatus is further provided, which is applied to a virtual reality system. The virtual reality system includes at least one head-mounted display device and at least one controller; a program runs in the virtual reality system, and a graphical user interface is rendered on the head-mounted display device. As shown in Fig. 3, the virtual reality interaction apparatus 10 may include: a first obtaining module 101, a second obtaining module 102, a third obtaining module 103, a first determination module 104, and a first display module 105. The apparatus comprises:
a first obtaining module, configured to detect a first control operation through the head-mounted display device and obtain a user viewpoint position and line of sight;
a second obtaining module, configured to detect a second control operation through the controller and obtain the position and flip angle of the controller;
a third obtaining module, configured to obtain the field-of-view angle range of the user and the flip angle range of the controller;
a first determination module, configured to determine, according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range, whether a first preset condition is met;
a first display module, configured to display a control on the graphical user interface when the first preset condition is met.
In this exemplary embodiment, the first preset condition is:
x1 < y1, and x2 < y2;
wherein x1 is the angle between the line connecting the viewpoint position and the position of the controller and the line of sight, x2 is the angle between the line connecting the viewpoint position and the position of the controller and the ray perpendicular to the controller, y1 is the field-of-view angle range of the user, and y2 is the flip angle range of the controller.
In this exemplary embodiment, the virtual reality interaction apparatus may further include:
a detection obtaining module (not shown), configured to detect in real time a change in at least one of the user viewpoint position, the line of sight, the position of the controller, and the flip angle, and obtain the relevant parameters after the change;
a second determination module (not shown), configured to determine, according to the relevant parameters after the change, whether a second preset condition is met;
a second display module (not shown), configured to hide the control on the graphical user interface if the second preset condition is met.
In this exemplary embodiment, the second preset condition is:
x1 > y1, or x2 > y2;
wherein x1 is the angle between the line connecting the viewpoint position and the position of the controller and the line of sight, x2 is the angle between the line connecting the viewpoint position and the position of the controller and the ray perpendicular to the controller, y1 is the field-of-view angle range of the user, and y2 is the flip angle range of the controller.
The details of each module of the virtual reality interaction apparatus have already been described in detail in the corresponding virtual reality interaction method above, and are therefore not repeated here.
It should be noted that although several modules or units of the device for executing actions are mentioned in the detailed description above, such division is not mandatory. In fact, according to the embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided and embodied by multiple modules or units.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is further provided.
Those skilled in the art will appreciate that various aspects of the present invention may be implemented as a system, a method, or a program product. Therefore, various aspects of the present invention may be embodied in the following forms: a complete hardware embodiment, a complete software embodiment (including firmware, microcode, and the like), or an embodiment combining hardware and software, which may be collectively referred to herein as a "circuit", "module", or "system".
An electronic device 600 according to this embodiment of the present invention is described below with reference to Fig. 4. The electronic device 600 shown in Fig. 4 is only an example and should not impose any limitation on the function and scope of use of the embodiments of the present invention.
As shown in Fig. 4, the electronic device 600 takes the form of a general-purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one storage unit 620, a bus 630 connecting different system components (including the storage unit 620 and the processing unit 610), and a display unit 640.
The storage unit stores program code, which can be executed by the processing unit 610 so that the processing unit 610 performs the steps of the various exemplary embodiments according to the present invention described in the "Exemplary Methods" section of this specification.
The storage unit 620 may include a readable medium in the form of a volatile storage unit, such as a random access storage unit (RAM) 6201 and/or a cache storage unit 6202, and may further include a read-only storage unit (ROM) 6203.
The storage unit 620 may also include a program/utility 6204 having a set of (at least one) program modules 6205, including but not limited to: an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment.
The bus 630 may represent one or more of several types of bus structures, including a storage unit bus or storage unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus structures.
The electronic device 600 may also communicate with one or more external devices 700 (such as a keyboard, a pointing device, a Bluetooth device, and the like), with one or more devices that enable a user to interact with the electronic device 600, and/or with any device (such as a router, a modem, and the like) that enables the electronic device 600 to communicate with one or more other computing devices. Such communication may be performed through an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through a network adapter 660. As shown, the network adapter 660 communicates with the other modules of the electronic device 600 through the bus 630. It should be understood that, although not shown in the figure, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
Through the above description of the embodiments, those skilled in the art will readily understand that the example embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solutions according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, and the like) or on a network, and which includes several instructions to cause a computing device (which may be a personal computer, a server, a terminal device, a network device, or the like) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, a computer-readable storage medium is further provided, on which a program product capable of implementing the above method of this specification is stored. In some possible embodiments, various aspects of the present invention may also be implemented in the form of a program product comprising program code; when the program product runs on a terminal device, the program code causes the terminal device to perform the steps of the various exemplary embodiments according to the present invention described in the "Exemplary Methods" section of this specification.
Referring to Fig. 5, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may adopt a portable compact disc read-only memory (CD-ROM), include program code, and run on a terminal device such as a personal computer. However, the program product of the present invention is not limited thereto. In this document, a readable storage medium may be any tangible medium that contains or stores a program that can be used by, or in connection with, an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which readable program code is carried. Such a propagated data signal may take various forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A readable signal medium may also be any readable medium other than a readable storage medium, which can send, propagate, or transmit a program for use by, or in connection with, an instruction execution system, apparatus, or device.
The program code contained on the readable medium may be transmitted by any suitable medium, including but not limited to wireless, wired, optical cable, RF, and the like, or any suitable combination of the above.
The program code for performing the operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may be executed entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on a remote computing device or server. Where a remote computing device is involved, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
In addition, the above drawings are merely schematic illustrations of the processing included in the methods according to the exemplary embodiments of the present invention, and are not intended to be limiting. It is easy to understand that the processing shown in the above drawings does not indicate or limit the chronological order of these processes. It is also easy to understand that these processes may be executed, for example, synchronously or asynchronously in multiple modules.
Other embodiments of the present disclosure will readily occur to those skilled in the art after considering the specification and practicing the invention disclosed herein. The present application is intended to cover any variations, uses, or adaptations of the present disclosure that follow the general principles of the present disclosure and include common knowledge or conventional technical means in the art not disclosed by the present disclosure. The specification and examples are to be regarded as exemplary only, and the true scope and spirit of the present disclosure are indicated by the claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the drawings, and various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A virtual reality interaction method, characterized in that the method is applied to a virtual reality system, the virtual reality system comprising at least one head-mounted display device and at least one controller, a program running in the virtual reality system, and a graphical user interface being rendered on the head-mounted display device, the method comprising:
detecting a first control operation through the head-mounted display device, and obtaining a user viewpoint position and line of sight;
detecting a second control operation through the controller, and obtaining the position and flip angle of the controller;
obtaining the field-of-view angle range of the user and the flip angle range of the controller;
determining, according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range, whether a first preset condition is met;
and if so, displaying a control on the graphical user interface.
2. The method according to claim 1, characterized in that the first preset condition is:
x1 < y1, and x2 < y2;
wherein x1 is the angle between the line connecting the viewpoint position and the position of the controller and the line of sight, x2 is the angle between the line connecting the viewpoint position and the position of the controller and the ray perpendicular to the controller, y1 is the field-of-view angle range of the user, and y2 is the flip angle range of the controller.
3. The method according to claim 1 or 2, characterized in that after the step of displaying a control on the graphical user interface, the method further comprises:
detecting in real time a change in at least one of the user viewpoint position, the line of sight, the position of the controller, and the flip angle, and obtaining the relevant parameters after the change;
determining, according to the relevant parameters after the change, whether a second preset condition is met;
and if so, hiding the control on the graphical user interface.
4. The method according to claim 3, characterized in that the second preset condition is:
x1 > y1, or x2 > y2;
wherein x1 is the angle between the line connecting the viewpoint position and the position of the controller and the line of sight, x2 is the angle between the line connecting the viewpoint position and the position of the controller and the ray perpendicular to the controller, y1 is the field-of-view angle range of the user, and y2 is the flip angle range of the controller.
5. A virtual reality interaction apparatus, applied to a virtual reality system, the virtual reality system comprising at least one head-mounted display device and at least one controller, a program running in the virtual reality system, and a graphical user interface being rendered on the head-mounted display device, the apparatus comprising:
a first obtaining module, configured to detect a first control operation through the head-mounted display device and obtain a user viewpoint position and line of sight;
a second obtaining module, configured to detect a second control operation through the controller and obtain the position and flip angle of the controller;
a third obtaining module, configured to obtain the field-of-view angle range of the user and the flip angle range of the controller;
a first determination module, configured to determine, according to the viewpoint position, the line of sight, the position and flip angle of the controller, the field-of-view angle range, and the flip angle range, whether a first preset condition is met;
a first display module, configured to display a control on the graphical user interface when the first preset condition is met.
6. The apparatus according to claim 5, characterized in that the first preset condition is:
x1 < y1, and x2 < y2;
wherein x1 is the angle between the line connecting the viewpoint position and the position of the controller and the line of sight, x2 is the angle between the line connecting the viewpoint position and the position of the controller and the ray perpendicular to the controller, y1 is the field-of-view angle range of the user, and y2 is the flip angle range of the controller.
7. The apparatus according to claim 5 or 6, characterized in that the apparatus further comprises:
a detection obtaining module, configured to detect in real time a change in at least one of the user viewpoint position, the line of sight, the position of the controller, and the flip angle, and obtain the relevant parameters after the change;
a second determination module, configured to determine, according to the relevant parameters after the change, whether a second preset condition is met;
a second display module, configured to hide the control on the graphical user interface if the second preset condition is met.
8. The apparatus according to claim 7, characterized in that the second preset condition is:
x1 > y1, or x2 > y2;
wherein x1 is the angle between the line connecting the viewpoint position and the position of the controller and the line of sight, x2 is the angle between the line connecting the viewpoint position and the position of the controller and the ray perpendicular to the controller, y1 is the field-of-view angle range of the user, and y2 is the flip angle range of the controller.
9. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the virtual reality interaction method according to any one of claims 1-4.
10. An electronic device, characterized by comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to execute the virtual reality interaction method according to any one of claims 1-4 by executing the executable instructions.
CN201811634513.2A 2018-12-29 2018-12-29 Virtual reality interaction method and device, storage medium and electronic equipment Active CN109847343B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811634513.2A CN109847343B (en) 2018-12-29 2018-12-29 Virtual reality interaction method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811634513.2A CN109847343B (en) 2018-12-29 2018-12-29 Virtual reality interaction method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN109847343A true CN109847343A (en) 2019-06-07
CN109847343B CN109847343B (en) 2022-02-15

Family

ID=66893293

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811634513.2A Active CN109847343B (en) 2018-12-29 2018-12-29 Virtual reality interaction method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN109847343B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105393190A (en) * 2013-06-25 2016-03-09 微软技术许可有限责任公司 Selecting user interface elements via position signal
US20160274762A1 (en) * 2015-03-16 2016-09-22 The Eye Tribe Aps Device interaction in augmented reality
CN106462231A (en) * 2014-03-17 2017-02-22 Itu 商业发展公司 Computer-implemented gaze interaction method and apparatus
WO2017100755A1 (en) * 2015-12-10 2017-06-15 Appelago Inc. Automated migration of animated icons for dynamic push notifications
CN106861184A (en) * 2016-12-28 2017-06-20 北京乐动卓越科技有限公司 A kind of method and system that man-machine interaction is realized in immersion VR game
CN108536374A (en) * 2018-04-13 2018-09-14 网易(杭州)网络有限公司 Virtual objects direction-controlling method and device, electronic equipment, storage medium

Also Published As

Publication number Publication date
CN109847343B (en) 2022-02-15

Similar Documents

Publication Publication Date Title
US11580711B2 (en) Systems and methods for controlling virtual scene perspective via physical touch input
US11003307B1 (en) Artificial reality systems with drawer simulation gesture for gating user interface elements
CN107890672B (en) Visible sensation method and device, storage medium, the electronic equipment of compensating sound information
US20200387286A1 (en) Arm gaze-driven user interface element gating for artificial reality systems
KR20170036704A (en) Multi-user gaze projection using head mounted display devices
US10921879B2 (en) Artificial reality systems with personal assistant element for gating user interface elements
CN108671539A (en) Target object exchange method and device, electronic equipment, storage medium
US11294475B1 (en) Artificial reality multi-modal input switching model
US20200388247A1 (en) Corner-identifiying gesture-driven user interface element gating for artificial reality systems
US10852839B1 (en) Artificial reality systems with detachable personal assistant for gating user interface elements
CN108549487A (en) Virtual reality exchange method and device
CN108776544A (en) Exchange method and device, storage medium, electronic equipment in augmented reality
CN110215686A (en) Display control method and device, storage medium and electronic equipment in scene of game
CN110908568B (en) Control method and device for virtual object
CN109847343A (en) Virtual reality exchange method and device, storage medium, electronic equipment
Khalaf et al. A framework of input devices to support designing composite wearable computers
US20240126373A1 (en) Tractable body-based ar system input
US20240070994A1 (en) One-handed zoom operation for ar/vr devices
US20230342026A1 (en) Gesture-based keyboard text entry
US20230315208A1 (en) Gesture-based application invocation
WO2023235672A1 (en) Ar-based virtual keyboard
CN116166161A (en) Interaction method based on multi-level menu and related equipment
Halonen Interaction Design Principles for Industrial XR
CN117762243A (en) Motion mapping for continuous gestures
WO2024050263A1 (en) Wrist rotation manipulation of virtual objects

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant