CN109697001A - Display method and apparatus for an interactive interface, storage medium, and electronic device - Google Patents

Display method and apparatus for an interactive interface, storage medium, and electronic device

Info

Publication number
CN109697001A
Authority
CN
China
Prior art keywords
display
interaction interface
dimension interaction
grid
display mode
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711000972.0A
Other languages
Chinese (zh)
Other versions
CN109697001B (en)
Inventor
沈超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201711000972.0A priority Critical patent/CN109697001B/en
Priority to PCT/CN2018/111650 priority patent/WO2019080870A1/en
Publication of CN109697001A publication Critical patent/CN109697001A/en
Application granted granted Critical
Publication of CN109697001B publication Critical patent/CN109697001B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a display method and apparatus for an interactive interface, a storage medium, and an electronic device. The method comprises: displaying a three-dimensional interactive interface in a first display mode in a virtual reality scene of a target application, the three-dimensional interactive interface being used for configuring the target application; obtaining an operation instruction of a first account, the first account being an account of the target application, the operation instruction indicating that a first operation is to be performed on a target object on the three-dimensional interactive interface, the first operation being used for configuring the target application; and, in response to the operation instruction, performing the first operation on the target object and displaying the three-dimensional interactive interface in a second display mode, the second display mode identifying the first operation by using a display manner different from that of the first display mode. The invention solves the technical problem in the related art that a user's operation cannot be fed back.

Description

Display method and apparatus for an interactive interface, storage medium, and electronic device
Technical field
The present invention relates to the field of the Internet, and in particular to a display method and apparatus for an interactive interface, a storage medium, and an electronic device.
Background Art
Virtual reality (VR), also referred to as virtual technology or virtual environment, uses computer simulation to generate a virtual three-dimensional world and provides the user with simulated sensory input such as vision, so that the user feels personally present and can observe the things in the three-dimensional space in real time and without restriction. VR can be realized by a combination of software and hardware devices.
Common VR software includes Steam and Oculus. Steam is a digital-distribution, digital-rights-management, and social system used to distribute digital software, sell games, and deliver subsequent updates; it supports operating systems such as Windows, OS X, and Linux and is currently the largest PC digital game platform in the world. Oculus VR is a virtual reality technology company.
Steam's hardware product is Steam VR, a fully functional 360-degree room-scale virtual reality experience. The development kit contains a head-mounted display, two single-hand controllers, and a positioning system that can simultaneously track the display and the controllers in space; combined with the other devices offered on Steam, it provides a high-end virtual reality experience.
Oculus VR's hardware products include the Oculus Rift and the Oculus Touch. The Oculus Rift is a realistic virtual reality head-mounted display that is already on the market. The Oculus Touch is the motion-capture controller of the Oculus Rift and is used together with a spatial positioning system; it adopts a bracelet-like design so that a camera can track the user's hand while sensors track the finger movements, and its grip is convenient for the user.
By using the above hardware products, a user can experience a virtual reality scene. However, during that experience, when the user reaches out to touch an object in the virtual reality scene, the scene gives no feedback to the touch, so the user cannot tell whether the object has actually been touched.
For the technical problem in the related art that a user's operation cannot be fed back, no effective solution has been proposed so far.
Summary of the invention
Embodiments of the present invention provide a display method and apparatus for an interactive interface, a storage medium, and an electronic device, so as to at least solve the technical problem in the related art that a user's operation cannot be fed back.
According to one aspect of the embodiments of the present invention, a display method of an interactive interface is provided. The display method includes: displaying a three-dimensional interactive interface in a first display mode in a virtual reality scene of a target application, the three-dimensional interactive interface being used for configuring the target application; obtaining an operation instruction of a first account, where the first account is an account of the target application, the operation instruction indicates that a first operation is to be performed on a target object on the three-dimensional interactive interface, and the first operation is used for configuring the target application; and, in response to the operation instruction, performing the first operation on the target object and displaying the three-dimensional interactive interface in a second display mode, where the second display mode identifies the first operation by using a display manner different from that of the first display mode.
According to another aspect of the embodiments of the present invention, a display apparatus of an interactive interface is further provided. The display apparatus includes: a first display unit, configured to display a three-dimensional interactive interface in a first display mode in a virtual reality scene of a target application, where the three-dimensional interactive interface is used for configuring the target application; an obtaining unit, configured to obtain an operation instruction of a first account, where the first account is an account of the target application, the operation instruction indicates that a first operation is to be performed on a target object on the three-dimensional interactive interface, and the first operation is used for configuring the target application; and a second display unit, configured to, in response to the operation instruction, perform the first operation on the target object and display the three-dimensional interactive interface in a second display mode, where the second display mode identifies the first operation by using a display manner different from that of the first display mode.
In the embodiments of the present invention, a three-dimensional interactive interface is displayed in a first display mode in the virtual reality scene of a target application, the three-dimensional interactive interface being used for configuring the target application; an operation instruction of a first account is obtained, the operation instruction indicating that a first operation is to be performed on a target object on the three-dimensional interactive interface, the first operation being used for configuring the target application; and, in response to the operation instruction, the first operation is performed on the target object and the three-dimensional interactive interface is displayed in a second display mode, the second display mode identifying the first operation by using a display manner different from that of the first display mode. By displaying the interface in a manner different from the one used when it is not touched, the first operation of the user is fed back, which solves the technical problem in the related art that a user's operation cannot be fed back and thus achieves the technical effect of providing feedback to the user's operation.
Brief description of the drawings
The drawings described here are provided for a further understanding of the present invention and constitute a part of this application. The illustrative embodiments of the present invention and their descriptions are used to explain the present invention and do not constitute an improper limitation of the present invention. In the drawings:
Fig. 1 is a schematic diagram of a hardware environment of a display method of an interactive interface according to an embodiment of the present invention;
Fig. 2 is a flowchart of an optional display method of an interactive interface according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of an optional interactive interface according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of an optional interactive interface according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of an optional interactive interface according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of optional panel parameters according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of optional material information according to an embodiment of the present invention;
Fig. 8 is a flowchart of an optional display method of an interactive interface according to an embodiment of the present invention;
Fig. 9 is a schematic diagram of an optional display apparatus of an interactive interface according to an embodiment of the present invention;
and
Fig. 10 is a structural block diagram of a terminal according to an embodiment of the present invention.
Detailed description of embodiments
In order to enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present invention rather than all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
It should be noted that the terms "first", "second", and the like in the description, the claims, and the above drawings are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments of the present invention described herein can be implemented in orders other than those illustrated or described herein. In addition, the terms "comprise" and "have" and any variations thereof are intended to cover a non-exclusive inclusion; for example, a process, method, system, product, or device that contains a series of steps or units is not necessarily limited to the steps or units expressly listed, but may include other steps or units that are not expressly listed or that are inherent to such process, method, product, or device.
First, some of the nouns or terms that appear in the description of the embodiments of the present invention are explained as follows:
HUD: head-up display (Head Up Display), hereinafter referred to as HUD, originally a flight-assistance instrument on aircraft; it can also be used in other fields, such as games.
According to the embodiments of the present invention, a method embodiment of a display method of an interactive interface is provided.
Optionally, in this embodiment, the above display method of an interactive interface may be applied to the hardware environment shown in Fig. 1, which consists of a server 102, a terminal 104, and VR glasses 106. As shown in Fig. 1, the server 102 is connected to the terminal 104 through a network, which includes but is not limited to a wide area network, a metropolitan area network, or a local area network, and the terminal 104 is not limited to a PC, a mobile phone, a tablet computer, or the like. The display method of an interactive interface of the embodiment of the present invention may be executed by the server 102, by the terminal 104, or jointly by the server 102 and the terminal 104. When the terminal 104 executes the display method of an interactive interface of the embodiment of the present invention, the method may also be executed by a client installed on the terminal, and the execution result of the method is displayed through the VR glasses 106.
When the display method of an interactive interface of the embodiment of the present invention is executed solely by the terminal, the program code corresponding to the method of the present application is executed directly on the terminal.
Fig. 2 is a flowchart of an optional display method of an interactive interface according to an embodiment of the present invention. As shown in Fig. 2, the method may include the following steps:
Step S202: display a three-dimensional interactive interface in a first display mode in a virtual reality scene of a target application, the three-dimensional interactive interface being used for configuring the target application.
The above virtual reality scene can be realized by a combination of software and hardware devices, the target application being the software used to realize the virtual reality scene, and the hardware device being used to display the three-dimensional interactive interface. The above first display mode may be the default display mode of the three-dimensional interactive interface in the target application.
Optionally, the above target application includes, but is not limited to, a social application and a game application.
Step S204: obtain an operation instruction of a first account, where the first account is an account of the target application, the operation instruction indicates that a first operation is to be performed on a target object on the three-dimensional interactive interface, and the first operation is used for configuring the target application.
The above first account is an account that identifies a virtual object in the virtual reality scene. The actions of this virtual object in the virtual reality scene are directed by the user of the first account in reality; for example, the virtual object performs the operation indicated by the user's instruction, or the virtual object follows the user's movements in reality.
The three-dimensional interactive interface includes one or more operation controls (such as operation buttons and sliders), and the region where each operation control is located can be understood as a target object. The operation instruction is an instruction generated when the virtual object touches the three-dimensional interactive interface.
Step S206: in response to the operation instruction, perform the first operation on the target object and display the three-dimensional interactive interface in a second display mode, the second display mode identifying the first operation by using a display manner different from that of the first display mode.
When the virtual object in the virtual reality scene touches the three-dimensional interactive interface, the three-dimensional interactive interface is displayed in a manner different from the one used when it is not touched, i.e., it is displayed in the second display mode. Displaying in the second display mode amounts to feedback on the first operation of the virtual object, that is, feedback on the user's operation; when the user observes that the display mode has changed, the user knows that the first operation has touched the three-dimensional interactive interface.
Through the above steps S202 to S206, a three-dimensional interactive interface is displayed in a first display mode in the virtual reality scene of a target application, the three-dimensional interactive interface being used for configuring the target application; an operation instruction of a first account is obtained, the operation instruction indicating that a first operation is to be performed on a target object on the three-dimensional interactive interface, the first operation being used for configuring the target application; and, in response to the operation instruction, the first operation is performed on the target object and the three-dimensional interactive interface is displayed in a second display mode, the second display mode identifying the first operation by using a display manner different from that of the first display mode. By displaying the interface in a manner different from the one used when it is not touched, the first operation of the user is fed back, as sketched below; this solves the technical problem in the related art that a user's operation cannot be fed back and thus achieves the technical effect of providing feedback to the user's operation.
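The flow of steps S202 to S206 can be summarized by the following minimal C++ sketch. It only illustrates the mode-switching logic; all names such as DisplayMode, OperationInstruction, and InteractivePanel are hypothetical and do not appear in the patent, and the actual embodiment renders the interface in a VR scene rather than printing to a console.

```cpp
#include <iostream>
#include <string>

enum class DisplayMode { First, Second };   // hypothetical names for the two display modes

struct OperationInstruction {
    std::string account;    // the "first account" issuing the operation
    int targetObjectId;     // the control on the 3D interactive interface being operated on
};

struct InteractivePanel {
    DisplayMode mode = DisplayMode::First;

    void show() const {
        std::cout << "panel shown in "
                  << (mode == DisplayMode::First ? "first (default)" : "second (feedback)")
                  << " display mode\n";
    }

    // S206: apply the first operation and switch to the second display mode as feedback.
    void handle(const OperationInstruction& op) {
        std::cout << "account " << op.account << " operates object " << op.targetObjectId << "\n";
        mode = DisplayMode::Second;
        show();
    }
};

int main() {
    InteractivePanel panel;
    panel.show();                              // S202: first display mode
    OperationInstruction op{"player-1", 42};   // S204: obtained operation instruction
    panel.handle(op);                          // S206: feedback via second display mode
}
```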
For applications in a non-VR environment (such as social applications and games), the applicant has recognized that real touch feedback and in-application touch feedback are isolated from each other:
(1) Console and PC games are non-touch-screen games in a 2D display environment. Whether the input is a gamepad or a keyboard and mouse, the touch mechanism of the real world does not exist, because in reality the player is always operating controls (the touch mechanism has been converted into the operation of controls). For touch inside the virtual world, for example a player-controlled character being hit by another player or by a bullet, the usual way of letting the player perceive this collision when designing the game is to use audiovisual methods, often assisted by gamepad vibration to strengthen the user's perception. But in general the real touch feedback and the in-game touch feedback are isolated, because the player never actually performs the touching motion.
(2) In touch-screen games, typified by mobile games, the biggest change brought by the touch screen is that the player really does perform the touching motion; because the 2D screen really exists, the player receives true tactile feedback, which is a genuine haptic interaction means and the biggest advantage of touch-screen games. That is why playing a clicking game on a touch screen feels very real: the accuracy and naturalness of the operation are exactly what the player wants. But from the perspective of the virtual world versus the real world, the real touch feedback and the in-game touch feedback are still isolated: the virtual world is a 3D world and the real world is also a 3D world, yet for mobile games there is only a 2D screen acting as a window, like a mirror folding the player's perception; the player's real finger touches the screen, and a mapping is then required before it can affect the running of the virtual world, which greatly harms the player's immersion and sense of presence.
Therefore, for touch feedback in a VR environment, applications in non-VR environments provide no usable implementation mechanism.
Further, by analyzing VR environments, the applicant has recognized that the biggest advantage brought by a VR environment is that the user can perceive the changes and sensation of 3D space, because the user's perception of a real 3D position corresponds one-to-one to a position in the virtual world; consequently, for the user, real movements and in-game movements are not isolated but fully fused. For touch feedback, this provides the possibility that the user's real touch feedback and the in-game touch feedback are not isolated from each other, so that touch feedback can be simulated by strengthening visual feedback in order to improve the user experience, given that current VR devices have no ability to simulate touch.
Inside the VR application of the present application, the following implementations of touch feedback are provided:
(1) Vibration feedback: as long as a collision occurs, the user is prompted by vibration, and the user's operation is not affected in any way;
(2) Feedback through a logic change: for example, the hand in the virtual world should always follow the position changes of the real hand, but if a virtual table in the virtual world blocks the player's hand while no such table exists in the real world, the player's hand can reach the position of the table in the virtual world; at that moment the hand in the virtual world stops at the edge of the table instead of piercing into the table as the real hand would (with the first method, the hand would pierce into the table and the controller would vibrate). This is also a common processing method;
(3) Changing the position of the touched object: for example, the hand in the virtual world should always follow the position changes of the real hand, but if a virtual table in the virtual world blocks the player's hand while no such table exists in the real world, the player's hand can reach the position of the table in the virtual world; if the hand in the virtual world keeps pushing the table, the table is knocked out of its original position by the hand and appears to move in the virtual world.
Among the above three manners, the third is the most realistic in feeling but affects the game scene, so it is not suitable for components that should not affect the game logic, such as a 3D panel used to display UI; the second can be regarded as a visual-enhancement trick, but because the virtual hand the player sees and the actual hand position the player feels are inconsistent, it can cause discomfort; the first is rather simple and crude, and its visual feedback does not feel comfortable. Therefore, all three kinds of feedback more or less harm the user experience.
In order to further improve the user experience, the present invention also provides a feedback mechanism in a VR environment, which is an implementation method of a visual feedback effect based on a finger click operation. A visual feedback effect is realized so that, when the player's finger clicks a plane, the plane shows a rippling dynamic effect, and tactile feedback is simulated and expressed through visual feedback. The advantage of this manner is that the player's perception in the virtual world is relatively true and natural, without the sense of incongruity of the methods described above, and at the same time this manner does not affect the logic of the original virtual world. Therefore, this manner is particularly suitable for a 3D panel that displays UI content in the virtual world; for interaction with such a panel, it strengthens the visual representation of the accuracy of the user's click, letting the user know exactly where the click landed.
The embodiments of the present application are described in further detail below with reference to the steps shown in Fig. 2:
In the technical solution provided in step S202, a three-dimensional interactive interface is displayed in a first display mode in the virtual reality scene of the target application.
Fig. 3 shows a first display mode of the three-dimensional interactive interface, i.e., the default display mode. In the three-dimensional interactive interface, the user or the virtual object can configure the target application, for example configure one of its functions, and a function (namely a target object) can be embodied in the form of an icon.
In the technical solution provided in step S204, obtaining the operation instruction of the first account includes, but is not limited to, the following implementations:
(1) An operation instruction generated according to the user's operation behavior
When the user operates, the position data of the user is collected in real time by a positioning device, and the user's operation action is mapped to the virtual object according to the collected position data; if the operation of the virtual object touches the three-dimensional interactive interface, the generation of the above operation instruction is triggered;
(2) An operation instruction generated in response to an input operation of an input device
The above input device may be a part of the hardware device, or a device connected to the hardware device. The user can control the virtual object in the virtual reality scene through the input device; when the virtual object controlled by the input device performs configuration on the three-dimensional interactive panel, the above operation instruction is generated.
In the technical solution provided in step S206, in response to the operation instruction, the first operation is performed on the target object, and the three-dimensional interactive interface is displayed in the second display mode.
The second display mode identifies the first operation by using a display manner different from that of the first display mode, which can be embodied in, but is not limited to, the following forms:
(1) the second display mode differs from the first display mode in the colors used, such as the background color, the font color, or the overall color of the three-dimensional interactive interface;
(2) the second display mode differs from the first display mode in the background picture used;
(3) the second display mode differs from the first display mode in the way the content in the three-dimensional interactive interface is displayed.
The first two are relatively easy to realize; the third is described in detail below:
When the three-dimensional interactive interface is displayed in the second display mode, the texture indicated by the second display mode is determined, a three-dimensional texture is formed at least in a first area of the three-dimensional interactive interface, and the three-dimensional interactive interface with the three-dimensional texture formed at least in the first area is displayed.
The above first area is the region where the target object is located on the three-dimensional interactive interface, namely the position clicked by the virtual object.
Optionally, displaying, according to the indication of the second display mode, the three-dimensional interactive interface with the preset three-dimensional texture formed at least in the first area can be realized by the following process: the three-dimensional interactive interface with the three-dimensional texture is displayed within a preset time period, and the distance between the three-dimensional texture displayed at a first moment within the preset time period and the target object is smaller than the distance between the three-dimensional texture displayed at a second moment and the target object, the second moment within the preset time period being later than the first moment.
It should be noted that, when the three-dimensional interactive interface with the three-dimensional texture is displayed within the preset time period, the three-dimensional texture is displayed near the target object; if the three-dimensional texture is displayed centered on the target object, the display effect is better.
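As an illustration of the relation between the two display moments described above, the following sketch (with an assumed spreading speed and preset period) shows how the distance between the ripple front and the target object grows with time within the preset time period.

```cpp
#include <cstdio>

// Distance between the ripple front and the target object at time t after the click.
double rippleRadius(double t, double speed = 0.5 /* metres per second, assumed */) {
    return speed * t;
}

int main() {
    const double presetPeriod = 2.0;   // seconds, assumed
    for (double t = 0.0; t <= presetPeriod; t += 0.5)
        std::printf("t=%.1fs  radius=%.2fm\n", t, rippleRadius(t));
}
```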
The above three-dimensional texture includes a three-dimensional ripple. Displaying the three-dimensional interactive interface with the three-dimensional texture within the preset time period can be achieved by the following steps:
Step S2062: at the first moment, display the three-dimensional interactive interface with a first three-dimensional ripple formed on it, the first three-dimensional ripple being centered on the target object. Step S2062 can be realized by the following sub-steps (step 1 and step 2):
Step 1: obtain a first data set and a second data set, where the first data set includes multiple pieces of first data, each piece of first data indicating the position of a vertex of the grid of a grid panel at the first moment, the grid panel being used to display the three-dimensional interactive interface in a second area, and the second area being the region where the three-dimensional interactive interface displayed in the first display mode is located; the second data set includes multiple pieces of second data, each piece of second data indicating the normal of a vertex of the grid of the grid panel at the position it occupies at the first moment.
(1) Obtain the operation strength of the first operation indicated in the operation instruction
For each operation strength, an initial offset generated at the position of a target vertex (denoted as a first vertex) under the influence of that strength can be pre-configured. As time passes, the influence range of the operation strength extends to other regions (a vertex on the spread ripple is denoted as a second vertex, and the radius of the ripple on which the second vertex lies is larger than that of the ripple on which the first vertex lies), but the offset caused there is smaller than the above initial offset; meanwhile, at the position where the ripple was originally generated (i.e., the first vertex), the influence of the operation strength decreases, that is, the offset generated at the position of the first vertex becomes smaller than the initial offset. This can be configured specifically; one optional configuration is as follows:
Offset y = y0 - a*t, where y0 is the initial offset, t is the time elapsed since the three-dimensional interactive interface was clicked, and a is a constant indicating the amount of decay per second.
Optionally, the offset may also have a non-linear relationship with time, such as a conic-curve relationship or a logarithmic-curve relationship.
Having obtained the position offset corresponding to the operation strength, the position offset of each vertex can be obtained in the manner described above.
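The decay rule described above can be illustrated by the following sketch; the initial offset y0 and the decay constant a are assumed values, and the quadratic variant stands in for one of the non-linear alternatives mentioned.

```cpp
#include <algorithm>
#include <cstdio>

// y = y0 - a*t, clamped at zero: the offset shrinks by 'a' every second after the click.
double linearOffset(double y0, double a, double t) {
    return std::max(0.0, y0 - a * t);
}

// Example of a non-linear (conic-curve) decay of the offset over time.
double quadraticOffset(double y0, double a, double t) {
    return std::max(0.0, y0 - a * t * t);
}

int main() {
    const double y0 = 1.0;   // initial offset at the clicked vertex (assumed)
    const double a  = 0.4;   // decay per second (assumed)
    for (double t = 0.0; t <= 3.0; t += 1.0)
        std::printf("t=%.0f  linear=%.2f  quadratic=%.2f\n",
                    t, linearOffset(y0, a, t), quadraticOffset(y0, a, t));
}
```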
(2) Obtain, according to the position offset, the first data indicating a first position
The first position is determined according to the above position offset and a second position, where the second position is the position of the target vertex before the offset.
The above first data is position data indicating the position of the target vertex after the offset.
Optionally, in order to make the resulting curve more uniform, the following data-optimization processing can be performed.
(3) Data-optimization processing
The data of each vertex is averaged with the first data of its adjacent vertices. For example, a target vertex (denoted as a third vertex) is averaged with the adjacent vertex close to the target object, and the third vertex is also averaged with the adjacent vertex far from the target object.
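A minimal sketch of this averaging step is shown below; it blends each vertex's displacement with the average of its four grid neighbours, using assumed weights, which is what makes the ripple spread and flatten over successive frames.

```cpp
#include <vector>

using Grid = std::vector<std::vector<float>>;   // per-vertex displacement values

// One smoothing pass: each vertex is blended with the average of its existing neighbours.
Grid smooth(const Grid& h) {
    const int rows = static_cast<int>(h.size());
    const int cols = static_cast<int>(h[0].size());
    Grid out = h;
    const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
    for (int y = 0; y < rows; ++y)
        for (int x = 0; x < cols; ++x) {
            float sum = 0.f;
            int n = 0;
            for (int k = 0; k < 4; ++k) {
                int nx = x + dx[k], ny = y + dy[k];
                if (nx >= 0 && nx < cols && ny >= 0 && ny < rows) { sum += h[ny][nx]; ++n; }
            }
            out[y][x] = 0.5f * h[y][x] + 0.5f * (sum / n);   // blend weights are assumed
        }
    return out;
}

int main() {
    Grid g(8, std::vector<float>(8, 0.f));
    g[4][4] = 1.f;    // a single displaced vertex, i.e. the clicked point
    g = smooth(g);    // one pass spreads part of the displacement to the neighbours
}
```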
(4) Obtain the second data in the second data set
The above second data indicates the normal of a vertex of the grid at the position it occupies at the first moment; the second data may be the vector of that normal.
A vertex is usually located at the junction of four grid cells, so the vertex has four normals, each corresponding to one of the grid cells (which is equivalent to a plane). This is equivalent to the vertex corresponding to four vectors, and in this application the second data may be a vector that has a binding relationship with these four vectors, such as the average of the four vectors; the second data may also be a vector that has some other binding relationship with the four vectors.
After the above second data is obtained, the "data-optimization processing" can be performed; the specific processing is similar to the optimization processing of the first data described above and is not repeated here.
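The following sketch illustrates one possible choice of the second data: taking the vertex normal as the normalized average of the four face normals meeting at the vertex. The types and the averaging choice are illustrative, not mandated by the patent.

```cpp
#include <array>
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// faceNormals: the normals of the four grid cells that meet at this vertex.
Vec3 vertexNormal(const std::array<Vec3, 4>& faceNormals) {
    Vec3 sum{0.f, 0.f, 0.f};
    for (const Vec3& n : faceNormals) { sum.x += n.x; sum.y += n.y; sum.z += n.z; }
    return normalize(sum);   // one vector bound to the four face normals: their average direction
}
```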
Step 2: render the grid of the grid panel according to the first data set and the second data set, so as to display the three-dimensional interactive interface with the first three-dimensional ripple formed on it, where the material of the grid of the grid panel is set to a liquid and the first three-dimensional texture is the texture generated by the disturbance of the liquid.
Optionally, rendering the grid of the grid panel according to the first data set and the second data set includes: determining shading information of a target grid cell according to the first data and the second data of the vertices of the target grid cell, the target grid cell being the grid cell currently to be rendered in the grid panel; and rendering the material of the target grid cell according to the shading information.
The above shading information includes one or more of the incident direction of light, the reflection angle, the refraction angle, and the like.
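As a hedged illustration of how shading information could be derived per vertex from the first data (position) and the second data (normal), the sketch below computes a simple Lambert diffuse term from an assumed light direction; the patent itself only names the incident direction, reflection angle, and refraction angle, so this is a stand-in, not the claimed rendering.

```cpp
#include <algorithm>

struct Vec3f { float x, y, z; };

float dot(const Vec3f& a, const Vec3f& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Diffuse shading of one vertex from its normal (second data) and an assumed light direction.
float lambert(const Vec3f& vertexNormal, const Vec3f& lightDir /* unit vector, assumed */) {
    Vec3f toLight{-lightDir.x, -lightDir.y, -lightDir.z};
    return std::max(0.0f, dot(vertexNormal, toLight));
}
```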
Step S2064: at the second moment, display the three-dimensional interactive interface with a second three-dimensional ripple formed on it, the second three-dimensional ripple being the ripple formed after the first three-dimensional ripple has spread.
The implementation of step S2064 is similar to that of step S2062; the difference is that, when the first data and the second data are obtained, the second moment is used as the reference and the decay of the position offset needs to be taken into account.
As an optional implementation, the embodiments of the present application are described in detail below from the product side.
For the interactive interface shown in Fig. 3 (a piece of 3D mesh with a translucent UI displayed on top of it) and the operation controls on it (i.e., target objects), the user can click an operation control (as shown in Fig. 4) by reaching out with a finger to click the UI interactive interface. When the user's finger touches the panel, the panel generates a disturbance effect (i.e., a ripple) at the touched place, similar to a finger touching the water surface: there is a vibrating ripple and a diffusion effect, as shown in Fig. 5. The obviousness of this effect is parameter-adjustable, and the effect is generated wherever the mesh is clicked; at the same time, if the clicked position is an interactive position on the UI panel, such as a button, the visual representation of the button also responds synchronously.
The present invention also provides a preferred embodiment; the realization process of the above product is described in detail below from the technical side.
(1) On the architectural logic of the entire product
The logical structure that realizes this component contains a grid panel (3D mesh) onto which a UI Widget can be attached. Fig. 6 illustrates the parameters of this panel. It should be noted here that the number of vertices of the panel must be large enough: the panel used here contains 1000*512 grid vertices and its material, rather than a simple quad with only four vertices. The reason is that the subsequent ripple effect is produced by real changes of the vertex positions, so enough vertices are needed to realize it. The second piece of information shown in Fig. 6 is the use of a material named "Water Material_Widget" (a water material component); the specific implementation of this material is shown in Fig. 7, and the final in-game effect of this 3D panel is achieved by this material. The realization mainly involves three aspects: one is SlateUI, namely attaching the target UI Widget panel onto this mesh as a texture; another is using the pre-computed normal map (corresponding to the second data set); the third is using the pre-computed position-disturbance map of the panel (corresponding to the first data set).
Slate is a cross-platform UI framework that can be used both for application UI (such as the UE4 Editor and tool UI) and for in-game UI; the above SlateUI is the in-game UI, equivalent to the HUD in the game.
The above UI Widget is a basic UI component: it is a rectangle that can be positioned anywhere on the screen as needed. This widget has a region but is not visible at runtime; it is an ideal container for accommodating other components.
The above mesh is a grid that can produce effects such as terrain and water waves; creating a mesh mainly involves the vertex, triangle, and segment-count parameters.
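A plain C++ sketch of building such a grid panel (not the engine API) is given below: a rows-by-columns vertex grid with two triangles per cell. The panel described above uses on the order of 1000*512 vertices so that per-vertex displacement is visible; the structure and dimensions here are illustrative.

```cpp
#include <cstdint>
#include <vector>

struct GridMesh {
    std::vector<float>    positions;   // x, y, z per vertex; the panel starts out flat
    std::vector<uint32_t> indices;     // three vertex indices per triangle
};

GridMesh buildPanel(int cols, int rows, float cellSize) {
    GridMesh m;
    for (int y = 0; y < rows; ++y)
        for (int x = 0; x < cols; ++x)
            m.positions.insert(m.positions.end(), {x * cellSize, y * cellSize, 0.0f});

    for (int y = 0; y + 1 < rows; ++y)
        for (int x = 0; x + 1 < cols; ++x) {
            uint32_t i = y * cols + x;                      // top-left corner of this cell
            m.indices.insert(m.indices.end(), {i, i + 1, i + (uint32_t)cols});
            m.indices.insert(m.indices.end(), {i + 1, i + 1 + (uint32_t)cols, i + (uint32_t)cols});
        }
    return m;
}

int main() {
    GridMesh panel = buildPanel(1000, 512, 0.01f);   // dimensions as described, cell size assumed
    (void)panel;
}
```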
(2) On the operation logic of the entire product:
As shown in Fig. 8:
Step S801: obtain the information of the position of the click or touch, for example the coordinates at the moment the finger contacts the panel or leaves the panel.
Step S802: set the intensity of the click; the initial intensity can be determined according to the speed at which the finger moves per unit time.
Step S803: draw the clicked point (namely Draw Material to Render Target A).
Step S804: draw the animation of each frame; this can be realized in the following way.
What each frame does serves the display of the final 3D grid panel mentioned above. Starting from the "first frame" node: in the first frame in the game, the first thing to do is to obtain the texture that currently describes the vertex displacement information of the grid panel (the first data), and then modify this texture to smooth its data, so that, like ripples on water, the disturbance slowly diffuses outward until it is completely calm. The specific approach is to take, for each position in the texture, the average of the surrounding data and render the result into another texture (i.e., an updated image); this is exactly what the "Draw Material to Render Target A" step does. Another thing the first frame has to do is "update normals" (i.e., Draw Material to Render Target AN); this step is similar to the previous one: obtain the texture that currently describes the vertex-normal information of the grid panel and, similarly, average the surrounding data to compute the new normal map. One more step is needed: obtain the current appearance of the UI Widget (i.e., update the UI panel) and draw its result onto a texture prepared in advance specifically for storing the UI Widget's appearance; this is what the "UI Widget" step does. At this point, all the data required by the material used by the finally displayed 3D grid is ready.
In the above steps it should be noted that more than one copy of each kind of texture needs to be prepared (i.e., the first data set and the second data set are each copied into at least two textures), in order to prevent read-write access conflicts and avoid waiting during rendering. For example, in frame N the result rendered by modifying the data of texture A is stored into texture B, and in frame N+1 the result obtained by modifying the data of texture B is stored into texture A. In the steps described below, the result of texture A is used in frame N and the result of texture B is used in frame N+1, which avoids data access conflicts during repeated rendering.
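The double-buffering rule described above can be sketched as follows: frame N reads one texture and writes its smoothed result into the other, and the roles swap every frame, so a texture is never read and written in the same pass. The textures here are plain float arrays and the one-dimensional smoothing is only a stand-in for the two-dimensional neighbour average.

```cpp
#include <vector>

using Texture = std::vector<float>;   // flattened displacement map (stand-in for a render target)

// Assumed helper: produces the next frame's data from the previous frame's data.
Texture smoothed(const Texture& src) {
    Texture dst(src.size());
    for (size_t i = 0; i < src.size(); ++i) {
        float left  = (i > 0) ? src[i - 1] : src[i];
        float right = (i + 1 < src.size()) ? src[i + 1] : src[i];
        dst[i] = 0.5f * src[i] + 0.25f * (left + right);   // 1-D stand-in for the 2-D average
    }
    return dst;
}

struct PingPong {
    Texture a, b;
    bool readFromA = true;

    void step() {
        const Texture& read = readFromA ? a : b;
        Texture& write      = readFromA ? b : a;
        write = smoothed(read);          // the render pass writes only the other buffer
        readFromA = !readFromA;          // the next frame swaps the roles
    }
    const Texture& current() const { return readFromA ? a : b; }
};

int main() {
    PingPong maps{Texture(16, 0.f), Texture(16, 0.f)};
    maps.step();   // frame N: read A, write B
    maps.step();   // frame N+1: read B, write A
}
```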
The preparation of the data required by the "Water Material_Widget" material has been completed above, so the following three steps use the prepared data on the 3D grid panel to obtain the final desired effect. The "update grid vertex positions" step obtains the texture containing the grid-vertex displacement data computed above, looks up the corresponding texture data for each vertex position of the 3D panel, and then, according to that data, applies a position change in world space to the vertices of the 3D panel; this is why, after the panel is clicked, the panel can be seen changing position like a water surface. Because the panel is transparent, the normals of the panel need to be updated in real time for a correct display effect; the "update grid material" step obtains the normal map prepared above and adjusts the normal of each vertex of the 3D grid panel according to the positional correspondence. The "update grid texture" step obtains the texture that stores the current appearance of the UI Widget and attaches it directly onto the grid panel for display. In this way, the UI grid panel can be seen in the game changing in response to finger clicks.
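A minimal sketch of the "update grid vertex positions" step is given below: each panel vertex looks up its texel in the displacement map and is displaced accordingly. Plain arrays stand in for the engine's mesh and render-target objects, and the panel is assumed to lie in the XY plane with the displacement applied along Z.

```cpp
#include <vector>

struct Vertex { float x, y, z; };

void applyDisplacement(std::vector<Vertex>& verts, int cols, int rows,
                       const std::vector<float>& displacementMap /* cols*rows texels */) {
    for (int y = 0; y < rows; ++y)
        for (int x = 0; x < cols; ++x) {
            int i = y * cols + x;
            verts[i].z = displacementMap[i];   // push the vertex out of the panel plane
        }
}
```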
The above describes what every frame does: make the entire panel smoother and smoother until it is completely calm, like still water. When the finger touches the panel or leaves the panel, it is just like throwing a stone into the water surface, causing the water to fluctuate, and a response is generated; this is realized as shown in the flowchart of Fig. 8 above. When a "finger touches panel" or "finger leaves panel" event occurs, the position and intensity of the finger poke are confirmed first, these two parameters are mapped to the corresponding position of the texture that stores the vertex displacement, and then a large circle is drawn at that corresponding position on the texture according to the intensity, expressing that the vertices at that place will undergo a very large offset, thereby producing the ripple effect.
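The touch and release response can be sketched as follows: a disc whose radius grows with the click intensity is stamped into the displacement map at the hit position, and later frames then smooth it outward into a spreading ripple. The radius and offset scale factors are assumed.

```cpp
#include <cmath>
#include <vector>

void stampClick(std::vector<float>& displacementMap, int cols, int rows,
                int hitX, int hitY, float intensity) {
    const float radius = 4.0f + 8.0f * intensity;    // bigger circle for a harder poke (assumed)
    for (int y = 0; y < rows; ++y)
        for (int x = 0; x < cols; ++x) {
            float dx = float(x - hitX), dy = float(y - hitY);
            if (std::sqrt(dx * dx + dy * dy) <= radius)
                displacementMap[y * cols + x] = intensity;   // large initial offset at the hit
        }
}
```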
The present invention mainly describes an implementation method of a visual feedback effect based on a finger click operation in a VR environment. A virtual reality environment provides an environment for experiences that cannot be achieved in the real world, in particular an effective improvement of visual and auditory immersion. However, current VR devices also have a fatal shortcoming: the input and output modes of human-computer interaction are very limited, especially on the output side, where there is only vibration besides vision and hearing. The present application realizes a visual feedback effect: when the player's finger clicks a plane, the plane shows a rippling dynamic effect, and tactile feedback is simulated and expressed through visual feedback. Strengthening visual feedback in this way allows the user to become more immersed in the virtual world.
It should be noted that, for the foregoing method embodiments, for simplicity of description they are all expressed as a series of action combinations; however, those skilled in the art should understand that the present invention is not limited by the described order of actions, because according to the present invention some steps may be performed in other orders or simultaneously. Secondly, those skilled in the art should also understand that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
Through the description of the above embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be realized by means of software plus a necessary general hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or the part contributing to the prior art, can be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and including several instructions for enabling a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to execute the method described in each embodiment of the present invention.
According to the embodiments of the present invention, a display apparatus of an interactive interface for implementing the above display method of an interactive interface is further provided. Fig. 9 is a schematic diagram of an optional display apparatus of an interactive interface according to an embodiment of the present invention. As shown in Fig. 9, the apparatus may include: a first display unit 91, an obtaining unit 93, and a second display unit 95.
The first display unit 91 is configured to display a three-dimensional interactive interface in a first display mode in the virtual reality scene of a target application, the three-dimensional interactive interface being used for configuring the target application.
The above virtual reality scene can be realized by a combination of software and hardware devices, the target application being the software used to realize the virtual reality scene, and the hardware device being used to display the three-dimensional interactive interface. The above first display mode may be the default display mode of the three-dimensional interactive interface in the target application.
Optionally, the above target application includes, but is not limited to, a social application and a game application.
The obtaining unit 93 is configured to obtain an operation instruction of a first account, where the first account is an account of the target application, the operation instruction indicates that a first operation is to be performed on a target object on the three-dimensional interactive interface, and the first operation is used for configuring the target application.
The above first account is an account that identifies a virtual object in the virtual reality scene. The actions of this virtual object in the virtual reality scene are directed by the user of the first account in reality; for example, the virtual object performs the operation indicated by the user's instruction, or the virtual object follows the user's movements in reality.
The three-dimensional interactive interface includes one or more operation controls (such as operation buttons and sliders), and the region where each operation control is located can be understood as a target object. The operation instruction is an instruction generated when the virtual object touches the three-dimensional interactive interface.
The second display unit 95 is configured to, in response to the operation instruction, perform the first operation on the target object and display the three-dimensional interactive interface in a second display mode, where the second display mode identifies the first operation by using a display manner different from that of the first display mode.
When the virtual object in the virtual reality scene touches the three-dimensional interactive interface, the three-dimensional interactive interface is displayed in a manner different from the one used when it is not touched, i.e., it is displayed in the second display mode. Displaying in the second display mode amounts to feedback on the first operation of the virtual object, that is, feedback on the user's operation; when the user observes that the display mode has changed, the user knows that the first operation has touched the three-dimensional interactive interface.
It should be noted that the first display unit 91 in this embodiment may be used to execute step S202 in the embodiments of the present application, the obtaining unit 93 in this embodiment may be used to execute step S204 in the embodiments of the present application, and the second display unit 95 in this embodiment may be used to execute step S206 in the embodiments of the present application.
It should be noted here that the examples and application scenarios realized by the above modules are the same as those of the corresponding steps, but are not limited to the content disclosed in the above embodiments. It should be noted that, as part of the apparatus, the above modules may run in the hardware environment shown in Fig. 1, and may be realized by software or by hardware.
Through the above modules, a three-dimensional interactive interface is displayed in a first display mode in the virtual reality scene of a target application, the three-dimensional interactive interface being used for configuring the target application; an operation instruction of a first account is obtained, the operation instruction indicating that a first operation is to be performed on a target object on the three-dimensional interactive interface, the first operation being used for configuring the target application; and, in response to the operation instruction, the first operation is performed on the target object and the three-dimensional interactive interface is displayed in a second display mode, the second display mode identifying the first operation by using a display manner different from that of the first display mode. By displaying the interface in a manner different from the one used when it is not touched, the first operation of the user is fed back, which solves the technical problem in the related art that a user's operation cannot be fed back and thus achieves the technical effect of providing feedback to the user's operation.
The second above-mentioned display unit is also used to the instruction according to the second display mode, and display is at least formed in first area There is the three-dimension interaction interface of three-D grain, wherein first area is the region on three-dimension interaction interface where target object.
Optionally, the second display unit is also used to show three-dimension interaction circle for being formed with three-D grain within a preset period of time Face, wherein when the distance between three-D grain and target object of the first moment display within a preset period of time are less than second Carve the distance between three-D grain and the target object of display, wherein the second moment in preset time period was later than for the first moment.
The second above-mentioned display unit may include: the first display module, be formed with first for the display at the first moment The three-dimension interaction interface of three-dimensional ripple, wherein the first three-dimensional ripple is centered on target object;Second display module is used for The display of second moment is formed with the three-dimension interaction interface of the second three-dimensional ripple, wherein the second three-dimensional ripple is the first three-dimensional ripple The ripple formed after diffusion.
Optionally, the first above-mentioned display module includes: acquisition submodule, for obtaining the first data acquisition system and the second number According to set, wherein the first data acquisition system includes multiple first data, and each first data are used to indicate the grid of grid panel At the location of first moment, grid panel is used to show three-dimension interaction interface, second area in second area on one vertex Region where the three-dimension interaction interface that shows according to the first display mode, the second data set includes multiple second data, Each second data are used to indicate the normal on a vertex of the grid of grid panel at the location of first moment;Display Module is formed with for being rendered according to the first data acquisition system and the second data set to the grid of grid panel with display The three-dimension interaction interface of first three-dimensional ripple, wherein the material of the grid of grid panel is arranged to liquid, the first three-D grain The texture generated for the disturbance of liquid.
Above-mentioned acquisition submodule is also used to obtain the operating force of the first operation indicated in operational order;Obtain with The corresponding position offset of operating force, wherein position offset is used to indicate representative points under the influence of operating force The offset that position generates, representative points are any one vertex of the grid of grid panel;It is obtained and is used according to position offset The first data in instruction first position, wherein first position is determined according to position offset and the second position, wherein The second position is the position where before representative points shift.
The above display submodule is further configured to: determine shading information of a target grid according to the first data and the second data of the vertices of the target grid, wherein the target grid is the grid currently to be rendered in the grid panel; and render the material of the target grid according to the shading information.
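As one possible reading of this shading step, the sketch below derives per-vertex shading information from the first data (positions) and second data (normals) using simple Lambert shading; the lighting model and function name are assumptions and not prescribed by this application.

import numpy as np

def shade_target_grid(vertex_positions, vertex_normals, light_pos, base_color):
    """Return one RGB value per vertex of the grid currently being rendered."""
    to_light = light_pos - vertex_positions
    to_light /= np.linalg.norm(to_light, axis=1, keepdims=True)
    lambert = np.clip(np.sum(vertex_normals * to_light, axis=1), 0.0, 1.0)
    return lambert[:, None] * np.asarray(base_color)       # shading info per vertex

colors = shade_target_grid(np.zeros((4, 3)),
                           np.tile([0.0, 0.0, 1.0], (4, 1)),
                           np.array([1.0, 1.0, 2.0]),
                           [0.2, 0.5, 0.9])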
The greatest advantage brought by a VR environment is that the user can perceive changes in, and the feel of, three-dimensional space, because the user's perception of a real 3D spatial position corresponds one-to-one with a position in the virtual world. For the user, real-world movement and in-game movement are therefore not isolated from each other but fully merged. For touch feedback, this offers the possibility of unifying touch feedback in reality and in the game rather than treating them in isolation, namely simulating touch feedback by strengthening visual feedback so as to improve the user experience, since current VR devices lack the ability to simulate touch.
This application provides a display device for an interactive interface in a VR environment that realizes a visual feedback effect: when the player's finger clicks a plane, the plane shows a dynamic fluctuation effect, so that touch feedback is simulated and expressed through visual feedback. The advantage of this approach is that the player's experience of the virtual world feels natural, without the sense of incongruity of the methods described earlier, and it does not interfere with the logic of the original virtual world. This approach is therefore particularly suitable for 3D panels used to display UI content in the virtual world; for interactions with such a panel it strengthens the visual representation of the accuracy of the user's click, letting the user know exactly where the click landed.
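The overall interaction can be pictured with the following simplified Python sketch, in which a click on the 3D UI panel merely spawns a ripple for visual feedback and leaves the panel's UI logic untouched; all class and method names here are illustrative assumptions rather than elements of this application.

class UIPanel3D:
    def __init__(self):
        self.ripples = []                       # (center, start_time) pairs

    def on_click(self, hit_point, now):
        self.ripples.append((hit_point, now))   # visual feedback only; UI logic unchanged

    def update(self, now, max_age=1.0):
        # Drop ripples that have finished diffusing; the panel is otherwise static.
        self.ripples = [(c, t) for (c, t) in self.ripples if now - t < max_age]

panel = UIPanel3D()
panel.on_click((0.4, 0.7), now=0.0)
panel.update(now=0.5)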
It should be noted here that the examples and application scenarios realized by the above modules are the same as those of the corresponding steps, but are not limited to the content disclosed in the above embodiments. It should also be noted that, as part of the device, the above modules may run in the hardware environment shown in FIG. 1 and may be implemented in software or in hardware, wherein the hardware environment includes a network environment.
According to an embodiment of the present invention, a server or terminal for implementing the above display method of the interactive interface is further provided.
FIG. 10 is a structural block diagram of a terminal according to an embodiment of the present invention. As shown in FIG. 10, the terminal may include one or more processors 1001 (only one is shown in FIG. 10), a memory 1003, and a transmitting device 1005 (such as the sending device in the above embodiments); as shown in FIG. 10, the terminal may further include an input/output device 1007.
The memory 1003 may be used to store software programs and modules, such as the program instructions/modules corresponding to the display method and device of the interactive interface in the embodiments of the present invention. By running the software programs and modules stored in the memory 1003, the processor 1001 executes various functional applications and data processing, thereby realizing the above display method of the interactive interface. The memory 1003 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memories, or other non-volatile solid-state memories. In some examples, the memory 1003 may further include memories remotely located relative to the processor 1001, and these remote memories may be connected to the terminal through a network. Examples of the above network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The above transmitting device 1005 is used for receiving or sending data via a network, and may also be used for data transmission between the processor and the memory. Specific examples of the above network may include wired networks and wireless networks. In one example, the transmitting device 1005 includes a network interface controller (NIC), which may be connected to other network devices and a router through a cable so as to communicate with the Internet or a local area network. In another example, the transmitting device 1005 is a radio frequency (RF) module, which is used to communicate with the Internet wirelessly.
Specifically, the memory 1003 is used for storing an application program.
The processor 1001 may, through the transmitting device 1005, call the application program stored in the memory 1003 to execute the following steps:
displaying a three-dimensional interaction interface in the virtual reality scene of a target application according to a first display mode, wherein the three-dimensional interaction interface is used for configuring the target application;
obtaining an operation instruction of a first account, wherein the first account is an account of the target application, and the operation instruction indicates that a first operation is to be performed on a target object on the three-dimensional interaction interface, the first operation being used for configuring the target application; and
in response to the operation instruction, performing the first operation on the target object and displaying the three-dimensional interaction interface according to a second display mode, wherein the second display mode identifies the first operation by using a display manner different from the first display mode.
The processor 1001 is further configured to execute the following steps:
obtaining a first data set and a second data set, wherein the first data set includes multiple pieces of first data, each piece of first data indicating the position of one vertex of the grid of a grid panel at a first moment, the grid panel being used for displaying the three-dimensional interaction interface in a second area, and the second area being the area where the three-dimensional interaction interface displayed according to the first display mode is located; and the second data set includes multiple pieces of second data, each piece of second data indicating the normal of one vertex of the grid of the grid panel at the first moment;
rendering the grid of the grid panel according to the first data set and the second data set, so as to display the three-dimensional interaction interface on which a first three-dimensional ripple is formed, wherein the material of the grid of the grid panel is set to a liquid, and the first three-dimensional texture is the texture generated by a disturbance of the liquid.
With this embodiment of the present invention, the three-dimensional interaction interface is displayed in the virtual reality scene of the target application according to the first display mode, the three-dimensional interaction interface being used for configuring the target application; the operation instruction of the first account is obtained, the operation instruction indicating that the first operation is to be performed on the target object on the three-dimensional interaction interface, the first operation being used for configuring the target application; and, in response to the operation instruction, the first operation is performed on the target object and the three-dimensional interaction interface is displayed according to the second display mode, the second display mode identifying the first operation by using a display manner different from the first display mode. By displaying the touched element in a manner different from the untouched elements, the user's first operation is fed back, which solves the technical problem in the related art that a user's operation cannot be fed back, thereby achieving the technical effect of providing feedback on the user's operation.
Optionally, for specific examples of this embodiment, reference may be made to the examples described in the above embodiments, and details are not repeated here.
Those skilled in the art can understand that the structure shown in FIG. 10 is only illustrative. The terminal may be a smart phone (such as an Android phone or an iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (MID), a PAD, or another terminal device. FIG. 10 does not limit the structure of the above electronic device; for example, the terminal may include more or fewer components than those shown in FIG. 10 (such as a network interface or a display device), or may have a configuration different from that shown in FIG. 10.
Those of ordinary skill in the art can understand that all or part of the steps in the various methods of the above embodiments may be completed by instructing hardware related to a terminal device through a program, and the program may be stored in a computer-readable storage medium. The storage medium may include a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.
An embodiment of the present invention further provides a storage medium. Optionally, in this embodiment, the above storage medium may store program code for executing the display method of the interactive interface.
Optionally, in this embodiment, the above storage medium may be located on at least one of multiple network devices in the network shown in the above embodiments.
Optionally, in this embodiment, the storage medium is configured to store program code for executing the following steps:
S21: displaying a three-dimensional interaction interface in the virtual reality scene of a target application according to a first display mode, wherein the three-dimensional interaction interface is used for configuring the target application;
S22: obtaining an operation instruction of a first account, wherein the first account is an account of the target application, and the operation instruction indicates that a first operation is to be performed on a target object on the three-dimensional interaction interface, the first operation being used for configuring the target application;
S23: in response to the operation instruction, performing the first operation on the target object and displaying the three-dimensional interaction interface according to a second display mode, wherein the second display mode identifies the first operation by using a display manner different from the first display mode.
Optionally, the storage medium is further configured to store program code for executing the following steps:
S31: obtaining a first data set and a second data set, wherein the first data set includes multiple pieces of first data, each piece of first data indicating the position of one vertex of the grid of a grid panel at a first moment, the grid panel being used for displaying the three-dimensional interaction interface in a second area, and the second area being the area where the three-dimensional interaction interface displayed according to the first display mode is located; and the second data set includes multiple pieces of second data, each piece of second data indicating the normal of one vertex of the grid of the grid panel at the first moment;
S32: rendering the grid of the grid panel according to the first data set and the second data set, so as to display the three-dimensional interaction interface on which a first three-dimensional ripple is formed, wherein the material of the grid of the grid panel is set to a liquid, and the first three-dimensional texture is the texture generated by a disturbance of the liquid.
Optionally, for specific examples of this embodiment, reference may be made to the examples described in the above embodiments, and details are not repeated here.
Optionally, in this embodiment, the above storage medium may include, but is not limited to, a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disc, and other media that can store program code.
The serial numbers of the above embodiments of the present invention are only for description and do not represent the relative merits of the embodiments.
If the integrated units in the above embodiments are realized in the form of software functional units and sold or used as independent products, they may be stored in the above computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, or the like) to execute all or part of the steps of the methods described in the embodiments of the present invention.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis. For parts not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed client may be realized in other ways. The device embodiments described above are merely illustrative; for example, the division of the units is only a division of logical functions, and there may be other division manners in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the various embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The above integrated unit may be realized in the form of hardware or in the form of a software functional unit.
The above are only preferred embodiments of the present invention. It should be noted that, for those of ordinary skill in the art, various improvements and modifications may be made without departing from the principles of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (14)

1. A display method for an interactive interface, comprising:
displaying a three-dimensional interaction interface in the virtual reality scene of a target application according to a first display mode, wherein the three-dimensional interaction interface is used for configuring the target application;
obtaining an operation instruction of a first account, wherein the first account is an account of the target application, and the operation instruction indicates that a first operation is to be performed on a target object on the three-dimensional interaction interface, the first operation being used for configuring the target application; and
in response to the operation instruction, performing the first operation on the target object and displaying the three-dimensional interaction interface according to a second display mode, wherein the second display mode identifies the first operation by using a display manner different from the first display mode.
2. The method according to claim 1, wherein displaying the three-dimensional interaction interface according to the second display mode comprises:
displaying, according to the indication of the second display mode, the three-dimensional interaction interface on which a three-dimensional texture is formed at least in a first area, wherein the first area is the area on the three-dimensional interaction interface where the target object is located, and the three-dimensional texture is used for identifying the first operation.
3. The method according to claim 2, wherein displaying, according to the indication of the second display mode, the three-dimensional interaction interface on which the three-dimensional texture is formed at least in the first area comprises:
displaying, within a preset time period, the three-dimensional interaction interface on which the three-dimensional texture is formed, wherein the distance between the three-dimensional texture displayed at a first moment within the preset time period and the target object is smaller than the distance between the three-dimensional texture displayed at a second moment and the target object, the second moment in the preset time period being later than the first moment.
4. The method according to claim 3, wherein the three-dimensional texture comprises a three-dimensional ripple, and displaying, within the preset time period, the three-dimensional interaction interface on which the three-dimensional texture is formed comprises:
displaying, at the first moment, the three-dimensional interaction interface on which a first three-dimensional ripple is formed, wherein the first three-dimensional ripple is centered on the target object; and
displaying, at the second moment, the three-dimensional interaction interface on which a second three-dimensional ripple is formed, wherein the second three-dimensional ripple is the ripple formed after the first three-dimensional ripple has diffused.
5. The method according to claim 4, wherein displaying, at the first moment, the three-dimensional interaction interface on which the first three-dimensional ripple is formed comprises:
obtaining a first data set and a second data set, wherein the first data set includes multiple pieces of first data, each piece of first data indicating the position of one vertex of the grid of a grid panel at the first moment, the grid panel being used for displaying the three-dimensional interaction interface in a second area, the second area being the area where the three-dimensional interaction interface displayed according to the first display mode is located, and the second data set includes multiple pieces of second data, each piece of second data indicating the normal of one vertex of the grid of the grid panel at the first moment; and
rendering the grid of the grid panel according to the first data set and the second data set, so as to display the three-dimensional interaction interface on which the first three-dimensional ripple is formed, wherein the material of the grid of the grid panel is set to a liquid, and the first three-dimensional texture is the texture generated by a disturbance of the liquid.
6. The method according to claim 5, wherein rendering the grid of the grid panel according to the first data set and the second data set comprises:
determining shading information of a target grid according to the first data and the second data of the vertices of the target grid, wherein the target grid is the grid currently to be rendered in the grid panel; and
rendering the material of the target grid according to the shading information.
7. The method according to claim 5, wherein obtaining the first data set comprises obtaining the first data of each vertex of the grid of the grid panel as follows:
obtaining the operation force of the first operation indicated in the operation instruction;
obtaining a position offset corresponding to the operation force, wherein the position offset indicates the offset produced, under the influence of the operation force, in the position of a representative vertex, the representative vertex being any one vertex of the grid of the grid panel; and
obtaining, according to the position offset, the first data indicating a first position, wherein the first position is determined according to the position offset and a second position, the second position being the position where the representative vertex was located before being offset.
8. A display device for an interactive interface, comprising:
a first display unit, configured to display a three-dimensional interaction interface in the virtual reality scene of a target application according to a first display mode, wherein the three-dimensional interaction interface is used for configuring the target application;
an acquiring unit, configured to obtain an operation instruction of a first account, wherein the first account is an account of the target application, and the operation instruction indicates that a first operation is to be performed on a target object on the three-dimensional interaction interface, the first operation being used for configuring the target application; and
a second display unit, configured to, in response to the operation instruction, perform the first operation on the target object and display the three-dimensional interaction interface according to a second display mode, wherein the second display mode identifies the first operation by using a display manner different from the first display mode.
9. The device according to claim 8, wherein the second display unit is further configured to display, according to the indication of the second display mode, the three-dimensional interaction interface on which a three-dimensional texture is formed at least in a first area, wherein the first area is the area on the three-dimensional interaction interface where the target object is located, and the three-dimensional texture is used for identifying the first operation.
10. The device according to claim 9, wherein the second display unit is further configured to display, within a preset time period, the three-dimensional interaction interface on which the three-dimensional texture is formed, wherein the distance between the three-dimensional texture displayed at a first moment within the preset time period and the target object is smaller than the distance between the three-dimensional texture displayed at a second moment and the target object, the second moment in the preset time period being later than the first moment.
11. The device according to claim 10, wherein the second display unit comprises:
a first display module, configured to display, at the first moment, the three-dimensional interaction interface on which a first three-dimensional ripple is formed, wherein the first three-dimensional ripple is centered on the target object; and
a second display module, configured to display, at the second moment, the three-dimensional interaction interface on which a second three-dimensional ripple is formed, wherein the second three-dimensional ripple is the ripple formed after the first three-dimensional ripple has diffused.
12. The device according to claim 11, wherein the first display module comprises:
an acquisition submodule, configured to obtain a first data set and a second data set, wherein the first data set includes multiple pieces of first data, each piece of first data indicating the position of one vertex of the grid of a grid panel at the first moment, the grid panel being used for displaying the three-dimensional interaction interface in a second area, the second area being the area where the three-dimensional interaction interface displayed according to the first display mode is located, and the second data set includes multiple pieces of second data, each piece of second data indicating the normal of one vertex of the grid of the grid panel at the first moment; and
a display submodule, configured to render the grid of the grid panel according to the first data set and the second data set, so as to display the three-dimensional interaction interface on which the first three-dimensional ripple is formed, wherein the material of the grid of the grid panel is set to a liquid, and the first three-dimensional texture is the texture generated by a disturbance of the liquid.
13. A storage medium, comprising a stored program, wherein, when the program runs, the method according to any one of claims 1 to 7 is executed.
14. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the method according to any one of claims 1 to 7 by means of the computer program.
CN201711000972.0A 2017-10-24 2017-10-24 Interactive interface display method and device, storage medium and electronic device Active CN109697001B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201711000972.0A CN109697001B (en) 2017-10-24 2017-10-24 Interactive interface display method and device, storage medium and electronic device
PCT/CN2018/111650 WO2019080870A1 (en) 2017-10-24 2018-10-24 Interaction interface display method and device, storage medium, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711000972.0A CN109697001B (en) 2017-10-24 2017-10-24 Interactive interface display method and device, storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN109697001A true CN109697001A (en) 2019-04-30
CN109697001B CN109697001B (en) 2021-07-27

Family

ID=66227798

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711000972.0A Active CN109697001B (en) 2017-10-24 2017-10-24 Interactive interface display method and device, storage medium and electronic device

Country Status (2)

Country Link
CN (1) CN109697001B (en)
WO (1) WO2019080870A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112587927A (en) * 2020-12-29 2021-04-02 苏州幻塔网络科技有限公司 Prop control method and device, electronic equipment and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101183276A (en) * 2007-12-13 2008-05-21 上海交通大学 Interactive system based on CCD camera porjector technology
CN103474007A (en) * 2013-08-27 2013-12-25 湖南华凯创意展览服务有限公司 Interactive display method and system
CN104281260A (en) * 2014-06-08 2015-01-14 朱金彪 Method and device for operating computer and mobile phone in virtual world and glasses adopting method and device
WO2016153618A1 (en) * 2015-03-20 2016-09-29 Sony Computer Entertainment Inc. Dynamic gloves to convey sense of touch and movement for virtual objects in hmd rendered environments
US20170092086A1 (en) * 2015-09-25 2017-03-30 Oculus Vr, Llc Transversal actuator for haptic feedback
CN106774824A (en) * 2016-10-26 2017-05-31 网易(杭州)网络有限公司 Virtual reality exchange method and device
CN106775258A (en) * 2017-01-04 2017-05-31 虹软(杭州)多媒体信息技术有限公司 The method and apparatus that virtual reality is interacted are realized using gesture control
CN106896915A (en) * 2017-02-15 2017-06-27 传线网络科技(上海)有限公司 Input control method and device based on virtual reality

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102289337A (en) * 2010-06-18 2011-12-21 上海三旗通信科技有限公司 Brand new display method of mobile terminal interface
CN102430244B (en) * 2011-12-30 2014-11-05 领航数位国际股份有限公司 Method for generating visual man-machine interaction by touching with finger
US9378592B2 (en) * 2012-09-14 2016-06-28 Lg Electronics Inc. Apparatus and method of providing user interface on head mounted display and head mounted display thereof
CN104460988B (en) * 2014-11-11 2017-12-22 陈琦 A kind of input control method of smart mobile phone virtual reality device
CN105630160A (en) * 2015-12-21 2016-06-01 黄鸣生 Virtual reality using interface system


Also Published As

Publication number Publication date
WO2019080870A1 (en) 2019-05-02
CN109697001B (en) 2021-07-27

Similar Documents

Publication Publication Date Title
JP6659644B2 (en) Low latency visual response to input by pre-generation of alternative graphic representations of application elements and input processing of graphic processing unit
Park et al. Tangible augmented prototyping of digital handheld products
CN106873767B (en) Operation control method and device for virtual reality application
CN108273265A (en) The display methods and device of virtual objects
US20220044490A1 (en) Virtual reality presentation of layers of clothing on avatars
KR101723823B1 (en) Interaction Implementation device of Dynamic objects and Virtual objects for Interactive Augmented space experience
CN104199542A (en) Intelligent mirror obtaining method and device and intelligent mirror
Linowes Unity 2020 virtual reality projects: Learn VR development by building immersive applications and games with Unity 2019.4 and later versions
Linowes Unity virtual reality projects: Learn virtual reality by developing more than 10 engaging projects with unity 2018
CN108109209A (en) A kind of method for processing video frequency and its device based on augmented reality
CN113318428B (en) Game display control method, nonvolatile storage medium, and electronic device
Glover et al. Complete Virtual Reality and Augmented Reality Development with Unity: Leverage the power of Unity and become a pro at creating mixed reality applications
Capece et al. Graphvr: A virtual reality tool for the exploration of graphs with htc vive system
Alshaal et al. Enhancing virtual reality systems with smart wearable devices
CN107735758A (en) Synchronous digital ink strokes are presented
US20180315253A1 (en) Virtual Reality Presentation of Clothing Fitted on Avatars
US10282897B2 (en) Automatic generation of three-dimensional entities
Donovan Mastering Oculus Rift Development
US12106421B1 (en) Surface attribute transfer
CN109697001A (en) The display methods and device of interactive interface, storage medium, electronic device
US11977725B2 (en) Authoring system for interactive virtual reality environments
Balcomb et al. The Effects of Hand Representation on Experience and Performance for 3D Interactions in Virtual Reality Games
Casiez et al. Towards VE that are more closely related to the real world
Ramsbottom A virtual reality interface for previsualization
US20240331323A1 (en) Systems, methods, and computer-readable media for 3d asset compiler

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant