CN107908281A - Virtual reality interaction method, device and computer-readable storage medium - Google Patents
Virtual reality interaction method, device and computer-readable storage medium Download PDF Info
- Publication number
- CN107908281A CN107908281A CN201711080735.XA CN201711080735A CN107908281A CN 107908281 A CN107908281 A CN 107908281A CN 201711080735 A CN201711080735 A CN 201711080735A CN 107908281 A CN107908281 A CN 107908281A
- Authority
- CN
- China
- Prior art keywords
- animation
- target object
- virtual reality
- target
- motion parameter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present disclosure relates to a virtual reality interaction method, a virtual reality interaction device, and a computer-readable storage medium. The virtual reality interaction method includes: when it is detected that a target object in a virtual reality scene is selected, detecting an input motion parameter; obtaining a target display animation of the target object according to the motion parameter and a preset correspondence between motion parameters and display animations of the target object; and displaying the target display animation in the virtual reality scene. With the technical solution of the present disclosure, the display animation of the target object can be shown in the virtual reality scene according to the input motion parameter, giving the target object a lively and rich display effect, thereby enhancing the interactivity and vividness of the virtual reality scene and improving the user experience.
Description
Technical field
The present disclosure relates to the technical field of virtual reality, and in particular to a virtual reality interaction method, a virtual reality interaction device, and a computer-readable storage medium.
Background technology
With the development of science and technology, virtual reality (VR) technology has emerged. Virtual reality technology is an important branch of simulation technology: it combines simulation technology with computer graphics, human-machine interface technology, multimedia sensing technology, network technology, and other technologies. Using a virtual reality device, it simulates and generates a virtual reality scene in three-dimensional space and provides the user with simulated visual, auditory, tactile, and other sensory experiences, making the user feel immersed in the scene.
In the related art, virtual reality interaction typically changes the viewing angle of the virtual reality scene according to detected movement of the user's head. As a result, interaction with the virtual reality scene is mostly monotonous and lacks a sense of engagement.
Summary of the invention
To overcome the problems in the related art, the present disclosure provides a virtual reality interaction method, a virtual reality interaction device, and a computer-readable storage medium.
According to a first aspect of the embodiments of the present disclosure, a virtual reality interaction method is provided, including:
when it is detected that a target object in a virtual reality scene is selected, detecting an input motion parameter;
obtaining a target display animation of the target object according to the motion parameter and a preset correspondence between motion parameters and display animations of the target object; and
displaying the target display animation in the virtual reality scene.
Optionally, detecting the input motion parameter includes:
detecting a motion parameter of a display device of the virtual reality scene;
and obtaining the target display animation of the target object according to the motion parameter and the preset correspondence includes:
obtaining a target floating animation of the target object according to the motion parameter of the display device and a preset correspondence between motion parameters and floating animations of the target object, where the target display animation includes the target floating animation.
Optionally, detecting the input motion parameter includes:
receiving an input rotation parameter;
and obtaining the target display animation of the target object includes:
obtaining a target rotation animation of the target object according to the rotation parameter and a preset correspondence between rotation parameters and rotation animations of the target object, where the target display animation includes the target rotation animation.
Optionally, the method further includes:
creating multiple display animations of the target object according to a preset model of the target object and preset actions of the preset model; and
performing transition processing on the multiple display animations.
Optionally, the multiple display animations are stored in a database that provides a calling interface, and obtaining the target display animation of the target object includes:
calling the calling interface according to the motion parameter to obtain the target display animation corresponding to the motion parameter.
According to a second aspect of the embodiments of the present disclosure, a virtual reality interaction device is provided, including:
a detection module configured to detect an input motion parameter when it is detected that a target object in a virtual reality scene is selected;
an acquisition module configured to obtain a target display animation of the target object according to the motion parameter and a preset correspondence between motion parameters and display animations of the target object; and
a display module configured to display the target display animation in the virtual reality scene.
Optionally, the detection module includes:
a detection submodule configured to detect a motion parameter of the display device of the virtual reality scene;
and the acquisition module includes:
a first acquisition submodule configured to obtain a target floating animation of the target object according to the motion parameter of the display device and a preset correspondence between motion parameters and floating animations of the target object, where the target display animation includes the target floating animation.
Optionally, the detection module includes:
a receiving submodule configured to receive an input rotation parameter;
and the acquisition module includes:
a second acquisition submodule configured to obtain a target rotation animation of the target object according to the rotation parameter and a preset correspondence between rotation parameters and rotation animations of the target object, where the target display animation includes the target rotation animation.
Optionally, the device further includes:
a creation module configured to create multiple display animations of the target object according to a preset model of the target object and preset actions of the preset model; and
a transition module configured to perform transition processing on the multiple display animations.
Optionally, the multiple display animations are stored in a database that provides a calling interface, and the acquisition module includes:
a third acquisition submodule configured to call the calling interface according to the motion parameter to obtain the target display animation corresponding to the motion parameter.
According to a third aspect of the embodiments of the present disclosure, a virtual reality interaction device is provided, including:
a processor; and
a memory for storing processor-executable instructions;
where the processor is configured to:
when it is detected that a target object in a virtual reality scene is selected, detect an input motion parameter;
obtain a target display animation of the target object according to the motion parameter and a preset correspondence between motion parameters and display animations of the target object; and
display the target display animation in the virtual reality scene.
According to a fourth aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided, on which computer program instructions are stored. When executed by a processor, the program instructions implement the steps of the virtual reality interaction method provided in the first aspect of the present disclosure.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects: by associating motion parameters with display animations of the target object, the display animation of the target object can be shown in the virtual reality scene according to the input motion parameter, giving the target object a lively and rich display effect, thereby enhancing the interactivity and vividness of the virtual reality scene and improving the user experience.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic diagram of an implementation environment for virtual reality interaction according to an exemplary embodiment;
Fig. 2 is a flowchart of a virtual reality interaction method according to an exemplary embodiment;
Fig. 3A is a flowchart of a virtual reality interaction method according to another exemplary embodiment;
Fig. 3B is a flowchart of a virtual reality interaction method according to another exemplary embodiment;
Fig. 4 is a schematic diagram of a scene in which the virtual reality interaction method provided by the present disclosure is implemented;
Fig. 5 is a schematic diagram of a scene in which the virtual reality interaction method provided by the present disclosure is implemented;
Fig. 6 is a flowchart of a virtual reality interaction method according to another exemplary embodiment;
Fig. 7 is a block diagram of a virtual reality interaction device according to an exemplary embodiment;
Figs. 8A and 8B are block diagrams of a virtual reality interaction device according to another exemplary embodiment;
Fig. 9 is a block diagram of a virtual reality interaction device according to another exemplary embodiment;
Fig. 10 is a block diagram of a device for a virtual reality interaction method according to an exemplary embodiment.
Detailed description
Exemplary embodiments will now be described in detail, with examples illustrated in the accompanying drawings. In the following description, unless otherwise indicated, the same numerals in different drawings refer to the same or similar elements. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the disclosure as recited in the appended claims.
Fig. 1 is a schematic diagram of an implementation environment for virtual reality interaction according to an exemplary embodiment. As shown in Fig. 1, the implementation environment may include a VR device 100 and an external control device 200. In the present disclosure, the VR device 100 may be, for example, a VR headset, VR glasses, or the like. The VR device may be a split-type VR device or an integrated VR device, which is not limited in the present disclosure. The external control device 200 may be, for example, a handle, a remote controller, or the like. Fig. 1 illustrates the case where the VR device 100 is a VR headset and the external control device 200 is a handle.
The VR device 100 can simulate and generate a virtual reality scene in three-dimensional space in which one or more objects exist. A built-in sensor component of the VR device 100 can track movement of the user's head, eyes, and so on, and recognize voice information input by the user, so as to control the virtual reality scene according to the tracking result and/or the voice information, for example by selecting one of the objects or adjusting the size and distance of that object. Interaction between the user and the virtual reality scene can thereby be realized.
In addition, a communication connection may be established between the VR device 100 and the external control device 200 using various wired or wireless technologies. The connection may include, but is not limited to, Bluetooth, WiFi (Wireless Fidelity), a 2G network, a 3G network, a 4G network, and the like.
After the external control device 200 establishes a communication connection with the VR device 100, the external control device 200 can interact with the VR device 100 to control the virtual reality scene generated by the VR device 100, for example to adjust the size and distance of an object.
The virtual reality interaction method and device provided by the present disclosure are described below with reference to the accompanying drawings.
Fig. 2 is a flowchart of a virtual reality interaction method according to an exemplary embodiment. The method can be applied to a VR device, for example the VR device 100 shown in Fig. 1. As shown in Fig. 2, the method may include the following steps.
In step S201, when it is detected that a target object in a virtual reality scene is selected, an input motion parameter is detected.
In the virtual reality scene simulated and generated by the VR device, the user can select any object in the virtual reality scene through the VR device and input a corresponding motion parameter to control the selected object. In the embodiments of the present disclosure, the object selected by the user is the target object.
In step S202, a target display animation of the target object is obtained according to the input motion parameter and a preset correspondence between motion parameters and display animations of the target object.
In step S203, the target display animation is displayed in the virtual reality scene.
The display animations of each object and the preset correspondence between motion parameters and the display animations of each object are stored in the VR device. In the embodiments of the present disclosure, the correspondence between motion parameters and the display animations of each object can be set when the VR device leaves the factory, or can be user-defined.
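As a rough illustration only, a factory-set correspondence with user-defined overrides might be combined as below. All names and mapping values here are assumptions for the sketch, not from the patent:

```python
# Hypothetical factory-preset correspondence between motion parameters
# and display animations; keys and values are illustrative.
FACTORY_CORRESPONDENCE = {
    "rotate_left": "float_left",
    "rotate_right": "float_right",
}

def resolve_correspondence(user_defined=None):
    """Return the effective motion-parameter-to-animation mapping,
    with user-defined entries taking precedence over factory defaults."""
    mapping = dict(FACTORY_CORRESPONDENCE)
    if user_defined:
        mapping.update(user_defined)
    return mapping
```

A user-defined entry simply shadows the factory default for the same motion parameter, so the device never needs to delete its shipped mapping.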
After detecting the input motion parameter, the VR device can call the target display animation of the target object according to the input motion parameter and the preset correspondence between motion parameters and display animations of the target object, and display it in the virtual reality scene.
Thus, by associating motion parameters with the display animations of the target object, the display animation of the target object in the virtual reality scene can be shown according to the input motion parameter, giving the target object a lively and rich display effect, thereby enhancing the interactivity and vividness of the virtual reality scene and improving the user experience.
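The detect-look-up-display flow of steps S201-S203 can be sketched as a simple table lookup. This is a minimal illustration under assumed names (the mapping keys, animation names, and the list standing in for the scene are all mine):

```python
# Hypothetical preset correspondence for steps S201-S203.
ANIMATION_MAP = {
    ("head_rotation", "left"): "float_left",
    ("head_rotation", "right"): "float_right",
}

def get_target_display_animation(motion_type, direction):
    """Step S202: look up the target display animation for an input motion parameter."""
    return ANIMATION_MAP.get((motion_type, direction))

def interact(scene, motion_type, direction):
    """Steps S201/S203: on a detected motion parameter for the selected
    target object, show the matching animation in the scene (here a list)."""
    animation = get_target_display_animation(motion_type, direction)
    if animation is not None:
        scene.append(animation)
    return animation
```

An unmapped motion parameter simply yields no animation, which matches the idea that only preset correspondences trigger a display.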
In one embodiment, the input motion parameter may come from a display device of the virtual reality scene. Correspondingly, as shown in Fig. 3A, the above step S201 includes:
In step S211, a motion parameter of the display device of the virtual reality scene is detected.
In the embodiments of the present disclosure, the display device of the virtual reality scene may be the VR device described above. The user wears the VR device and turns the head; in this process, the VR device also rotates, and a sensor component configured in the VR device (for example, a gyro sensor) can detect the motion parameter of the VR device in real time. In the embodiments of the present disclosure, the motion parameter of the display device may include, but is not limited to, the rotation direction of the display device.
Correspondingly, the above step S202 may include:
In step S221, a target floating animation of the target object is obtained according to the motion parameter of the display device and a preset correspondence between motion parameters and floating animations of the target object, where the target display animation includes the target floating animation.
Multiple preset floating animations of the target object and the correspondence between motion parameters and the floating animations of the target object are stored in the VR device. Table 1 provides an example of a correspondence between motion parameters and floating animations of the target object.
Table 1
The VR device can call the target floating animation of the target object according to the preset correspondence between motion parameters and the floating animations of the target object, and display the target floating animation in the virtual reality scene.
For example, taking the case where the VR device 100 is a VR headset, the user wears the VR device, binds an application to the VR device, opens the application to enter the virtual reality scene generated by the VR device, and selects the object to be controlled in the virtual reality scene. As shown in Fig. 4, the user selects the chair 300 (the target object) in the virtual reality scene and then turns the head in the direction of arrow C1, driving the VR device to rotate in the same direction. After the VR device detects that the chair 300 is selected, its built-in sensor component detects its rotation direction; the VR device then calls the floating animation of the chair 300 in the direction of arrow C2 (the target floating animation) according to the rotation direction and displays it in the virtual reality scene, so that the chair 300 appears to float as if on water.
Thus, by associating the movement of the display device of the virtual reality scene with the floating animations of the target object, the target object can be made to float slightly when the user wearing the VR device turns the head, thereby enhancing the vividness and fun of interaction with the virtual reality scene and improving the user experience.
In another embodiment, the input motion parameter may come from an external control device (such as the external control device 200 shown in Fig. 1). Correspondingly, as shown in Fig. 3B, the above step S201 may include:
In step S212, an input rotation parameter is received.
The external control device may have a target-object rotation control function; when this function is triggered, the external control device starts to control the target object. During control, the user can input a rotation parameter on the external control device; for example, the user can hold the external control device and rotate (or move) it to change its posture. In this process, a sensor component configured in the external control device (for example, a gyro sensor) can detect the rotation parameter of the external control device in real time. In the embodiments of the present disclosure, the rotation parameter may include, but is not limited to, a rotation direction, a rotation angle, and the like. Table 2 provides an example of a correspondence between rotation parameters and rotation animations of the target object.
Table 2
Through the communication connection with the VR device, the external control device can send a control signal carrying the input rotation parameter to the VR device; the VR device receives the control signal and thereby obtains the rotation parameter input by the user.
Correspondingly, the above step S202 may include:
In step S222, a target rotation animation of the target object is obtained according to the input rotation parameter and a preset correspondence between rotation parameters and rotation animations of the target object, where the target display animation includes the target rotation animation.
Multiple preset rotation animations of the target object and the correspondence between rotation parameters and the rotation animations of the target object are stored in the VR device. After the VR device receives the input rotation parameter, it can call the target rotation animation of the target object according to the preset correspondence between rotation parameters and the rotation animations of the target object, and display the target rotation animation in the virtual reality scene.
For example, as shown in Fig. 5, taking the case where the VR device 100 is a VR headset and the external control device 200 is a handle, the user selects the chair 300 (the target object) in the virtual reality scene, then holds the handle and rotates it 60° around the longitudinal axis A1 in the direction shown by arrow C3 (the rotation angle is not shown in the figure). The sensor component built into the handle (for example, a gyro sensor) collects the rotation direction and rotation angle of the handle, which constitute the input rotation parameter. The handle sends a control signal carrying the rotation parameter to the VR headset; after receiving the control signal, the VR headset calls the animation of the chair 300 rotating 60° around the longitudinal axis A2 in the direction shown by arrow C4 (the target rotation animation) according to the input rotation parameter and displays it in the virtual reality scene. Thus, the user can control the rotation of the target object through the external control device.
It should be noted that in other embodiments, the user may also input a rotation parameter through the external control device while turning the head. In this case, the VR device obtains the motion parameter of the display device and the input rotation parameter at the same time, and then displays the target object floating and rotating in the virtual reality scene.
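A target rotation animation such as the 60° chair rotation above could be represented as per-frame angles interpolated from the controller's rotation parameter. A minimal sketch with an assumed frame count and naming, not the patent's implementation:

```python
def build_rotation_animation(axis, angle_deg, frames=30):
    """Interpolate per-frame rotation angles around the given axis,
    from the first frame up to the angle reported by the control device."""
    step = angle_deg / frames
    return [(axis, step * (i + 1)) for i in range(frames)]
```

For the Fig. 5 example, `build_rotation_animation("A2", 60.0)` would yield 30 keyframes ending at 60° around the (hypothetically labeled) axis A2; playing them in order produces the smooth rotation.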
Fig. 6 is a flowchart of a virtual reality interaction method according to an exemplary embodiment. The method can be applied to a VR device, for example the VR device 100 shown in Fig. 1. As shown in Fig. 6, the method may include the following steps.
In step S601, multiple display animations of a target object are created according to a preset model of the target object and preset actions of the preset model.
The preset model of the target object and the preset actions of the preset model are stored in the VR device. The VR device associates the preset model with the preset actions of the preset model, thereby creating multiple display animations of the target object.
In step S602, transition processing is performed on the multiple display animations.
To improve the display effect of the display animations of the target object, after creating the multiple display animations, the VR device performs transition processing on them so that switching between the display animations is smoother and more natural.
In step S603, when it is detected that the target object in the virtual reality scene is selected, a motion parameter input by the user is detected.
In step S604, a target display animation of the target object is obtained according to the input motion parameter and a preset correspondence between motion parameters and display animations of the target object.
In step S605, the target display animation is displayed in the virtual reality scene.
It should be noted that in one embodiment, the multiple display animations of the target object can be stored in a database, and the database can provide a calling interface, such as an API (Application Programming Interface). The VR device can call the calling interface according to the motion parameter input by the user to obtain the target display animation corresponding to the input motion parameter.
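The animation store and its calling interface could be organized as below. This is a hypothetical sketch: the class and method names are mine, and "transition processing" is reduced to recording a crossfade between adjacent animations rather than any real blending:

```python
class AnimationDatabase:
    """Hypothetical store for display animations with a calling interface."""

    def __init__(self):
        self._animations = {}   # motion parameter -> animation name
        self._transitions = []  # crossfades registered between animations

    def create_from_model(self, model, preset_actions):
        """S601: create one display animation per preset action of the model.
        `preset_actions` maps a motion parameter to an action name."""
        for motion_parameter, action in preset_actions.items():
            self._animations[motion_parameter] = f"{model}_{action}"

    def add_transitions(self):
        """S602: transition processing, here just pairing adjacent animations
        so a crossfade exists between them."""
        names = sorted(self._animations.values())
        self._transitions = list(zip(names, names[1:]))

    def get(self, motion_parameter):
        """Calling interface (the 'API' in the text): fetch the animation
        corresponding to the input motion parameter, or None."""
        return self._animations.get(motion_parameter)
```

Keeping the lookup behind a single `get` call mirrors the idea that the acquisition step only needs the calling interface, not the storage layout.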
Fig. 7 is a block diagram of a virtual reality interaction device 700 according to an exemplary embodiment. The device 700 can be configured in a VR device, for example the VR device 100 shown in Fig. 1. Referring to Fig. 7, the device 700 may include a detection module 701, an acquisition module 702, and a display module 703.
The detection module 701 is configured to detect an input motion parameter when it is detected that a target object in a virtual reality scene is selected.
The acquisition module 702 is configured to obtain a target display animation of the target object according to the motion parameter and a preset correspondence between motion parameters and display animations of the target object.
The display module 703 is configured to display the target display animation in the virtual reality scene.
Optionally, in one embodiment, as shown in Fig. 8A, the detection module 701 may include:
a detection submodule 711 configured to detect a motion parameter of the display device of the virtual reality scene.
The acquisition module 702 may include:
a first acquisition submodule 721 configured to obtain a target floating animation of the target object according to the motion parameter of the display device and a preset correspondence between motion parameters and floating animations of the target object, where the target display animation includes the target floating animation.
Optionally, in another embodiment, as shown in Fig. 8B, the detection module 701 may include:
a receiving submodule 712 configured to receive an input rotation parameter.
The acquisition module 702 may include:
a second acquisition submodule 722 configured to obtain a target rotation animation of the target object according to the rotation parameter and a preset correspondence between rotation parameters and rotation animations of the target object, where the target display animation includes the target rotation animation.
Alternatively, in another embodiment, as shown in Fig. 9, the device 700 may further include:
A creation module 704, configured to create multiple display animations of the target object according to a preset model of the target object and preset actions of the preset model;
A transition module 705, configured to perform transition processing on the multiple display animations.
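The creation and transition modules (704, 705) can be sketched as: build one animation per preset action of the model, then blend between clips when playback switches. A linear crossfade is assumed here; the patent does not specify the transition processing.

```python
# Hypothetical sketch of the creation module (704) and transition
# module (705). Clip representation and linear blending are assumptions.

def create_display_animations(model: str, actions: list) -> dict:
    """Creation module (704): one display animation per preset action
    of the preset model (represented here as a simple clip id)."""
    return {action: f"{model}:{action}" for action in actions}

def crossfade_weight(t: float, duration: float = 0.25) -> float:
    """Transition module (705): blend weight of the incoming clip
    t seconds into a crossfade, clamped to [0, 1]."""
    return min(max(t / duration, 0.0), 1.0)
```

With this shape, switching from a floating clip to a rotation clip plays both for `duration` seconds, weighting the incoming clip by `crossfade_weight(t)` and the outgoing one by the complement, so the object never snaps between poses.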
Alternatively, in another embodiment, as shown in Fig. 9, the multiple display animations are stored in a database, and the database provides a calling interface. The acquisition module 702 may include:
A third acquisition submodule 723, configured to call the calling interface according to the motion parameter, so as to obtain the target display animation corresponding to the motion parameter.
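The third acquisition submodule (723) can be sketched as a keyed query against the animation database's calling interface. The patent does not name a database, so `sqlite3` stands in here, and the schema is an assumption.

```python
# Sketch of submodule 723: the display animations live in a database
# that exposes a calling interface keyed by motion parameter.
# sqlite3 and the schema below are stand-ins for the unspecified store.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE animations (motion_param TEXT PRIMARY KEY, clip TEXT)")
db.executemany(
    "INSERT INTO animations VALUES (?, ?)",
    [("rotate_fast", "spin_fast"), ("float_slow", "float_gentle")],
)

def call_interface(motion_param: str):
    """The database's calling interface: motion parameter -> target
    display animation, or None if no correspondence is preset."""
    row = db.execute(
        "SELECT clip FROM animations WHERE motion_param = ?", (motion_param,)
    ).fetchone()
    return row[0] if row else None

print(call_interface("rotate_fast"))  # -> spin_fast
```

Keeping the correspondence in a database rather than hard-coded means new animations can be registered without changing the acquisition logic.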
Regarding the devices in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the related method, and will not be elaborated here.
The present disclosure also provides a computer-readable storage medium on which computer program instructions are stored; when the program instructions are executed by a processor, the steps of the virtual reality interaction method provided by the present disclosure are implemented.
Fig. 10 is a block diagram of a device 800 for a virtual reality interaction method according to an exemplary embodiment. For example, the device 800 may be a VR device.
Referring to Fig. 10, the device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls the overall operation of the device 800, such as operations associated with display, telephone calls, data communication, camera operation, and recording operation. The processing component 802 may include one or more processors 820 to execute instructions, so as to complete all or part of the steps of the virtual reality interaction method described above. In addition, the processing component 802 may include one or more modules to facilitate interaction between the processing component 802 and other components. For example, the processing component 802 may include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operation on the device 800. Examples of such data include instructions for any application or method operated on the device 800, contact data, phone book data, messages, pictures, videos, and so on. The memory 804 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The power component 806 provides power to the various components of the device 800. The power component 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen providing an output interface between the device 800 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the device 800 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each of the front camera and the rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a microphone (MIC), which is configured to receive external audio signals when the device 800 is in an operation mode, such as a call mode, a recording mode, or a speech recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, the audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 814 includes one or more sensors for providing status assessments of various aspects of the device 800. For example, the sensor component 814 may detect the open/closed state of the device 800 and the relative positioning of components, such as the display and keypad of the device 800; the sensor component 814 may also detect a change in position of the device 800 or a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in temperature of the device 800. The sensor component 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 814 may also include an accelerometer, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the device 800 and other devices. The device 800 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 800 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the virtual reality interaction method described above.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 804 including instructions, which can be executed by the processor 820 of the device 800 to complete the virtual reality interaction method described above. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Those skilled in the art will readily conceive of other embodiments of the disclosure after considering the specification and practicing the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow the general principles of the disclosure and include common knowledge or customary technical means in the art not disclosed by the disclosure. The description and embodiments are to be considered exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It should be appreciated that the present disclosure is not limited to the precise structures described above and shown in the drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.
Claims (12)
- 1. A virtual reality interaction method, comprising:
when detecting that a target object in a virtual reality scene is selected, detecting an input motion parameter;
obtaining a target display animation of the target object according to the motion parameter and a preset correspondence between motion parameters and display animations of the target object; and
displaying the target display animation in the virtual reality scene.
- 2. The virtual reality interaction method according to claim 1, wherein detecting the input motion parameter comprises:
detecting a motion parameter of a display device of the virtual reality scene;
and wherein obtaining the target display animation of the target object according to the motion parameter and the preset correspondence between motion parameters and display animations of the target object comprises:
obtaining a target floating animation of the target object according to the motion parameter of the display device and a preset correspondence between motion parameters and floating animations of the target object, the target display animation including the target floating animation.
- 3. The virtual reality interaction method according to claim 1, wherein detecting the input motion parameter comprises:
receiving an input rotation parameter;
and wherein obtaining the target display animation of the target object according to the motion parameter and the preset correspondence between motion parameters and display animations of the target object comprises:
obtaining a target rotation animation of the target object according to the rotation parameter and a preset correspondence between rotation parameters and rotation animations of the target object, the target display animation including the target rotation animation.
- 4. The virtual reality interaction method according to claim 1, further comprising:
creating multiple display animations of the target object according to a preset model of the target object and preset actions of the preset model; and
performing transition processing on the multiple display animations.
- 5. The virtual reality interaction method according to claim 4, wherein the multiple display animations are stored in a database, the database provides a calling interface, and obtaining the target display animation of the target object comprises:
calling the calling interface according to the motion parameter, so as to obtain the target display animation corresponding to the motion parameter.
- 6. A virtual reality interaction device, comprising:
a detection module, configured to detect an input motion parameter when a target object in a virtual reality scene is detected to be selected;
an acquisition module, configured to obtain a target display animation of the target object according to the motion parameter and a preset correspondence between motion parameters and display animations of the target object; and
a display module, configured to display the target display animation in the virtual reality scene.
- 7. The virtual reality interaction device according to claim 6, wherein the detection module comprises:
a detection submodule, configured to detect a motion parameter of a display device of the virtual reality scene;
and the acquisition module comprises:
a first acquisition submodule, configured to obtain a target floating animation of the target object according to the motion parameter of the display device and a preset correspondence between motion parameters and floating animations of the target object, the target display animation including the target floating animation.
- 8. The virtual reality interaction device according to claim 6, wherein the detection module comprises:
a receiving submodule, configured to receive an input rotation parameter;
and the acquisition module comprises:
a second acquisition submodule, configured to obtain a target rotation animation of the target object according to the rotation parameter and a preset correspondence between rotation parameters and rotation animations of the target object, the target display animation including the target rotation animation.
- 9. The virtual reality interaction device according to claim 6, further comprising:
a creation module, configured to create multiple display animations of the target object according to a preset model of the target object and preset actions of the preset model; and
a transition module, configured to perform transition processing on the multiple display animations.
- 10. The virtual reality interaction device according to claim 9, wherein the multiple display animations are stored in a database, the database provides a calling interface, and the acquisition module comprises:
a third acquisition submodule, configured to call the calling interface according to the motion parameter, so as to obtain the target display animation corresponding to the motion parameter.
- 11. A virtual reality interaction device, comprising:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to:
when detecting that a target object in a virtual reality scene is selected, detect an input motion parameter;
obtain a target display animation of the target object according to the motion parameter and a preset correspondence between motion parameters and display animations of the target object; and
display the target display animation in the virtual reality scene.
- 12. A computer-readable storage medium on which computer program instructions are stored, wherein when the program instructions are executed by a processor, the steps of the method according to any one of claims 1 to 5 are implemented.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711080735.XA CN107908281A (en) | 2017-11-06 | 2017-11-06 | Virtual reality exchange method, device and computer-readable recording medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711080735.XA CN107908281A (en) | 2017-11-06 | 2017-11-06 | Virtual reality exchange method, device and computer-readable recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107908281A true CN107908281A (en) | 2018-04-13 |
Family
ID=61843460
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711080735.XA Pending CN107908281A (en) | 2017-11-06 | 2017-11-06 | Virtual reality exchange method, device and computer-readable recording medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107908281A (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101414217A (en) * | 2007-10-16 | 2009-04-22 | 康佳集团股份有限公司 | Interesting display method of terminal dialing character style |
CN103744513A (en) * | 2014-01-17 | 2014-04-23 | 深圳好未来智能科技有限公司 | Interaction system and interaction method both with interactive-type 3D (three-dimensional) figure |
CN105094523A (en) * | 2015-06-17 | 2015-11-25 | 厦门幻世网络科技有限公司 | 3D animation display method and apparatus |
CN105389011A (en) * | 2015-12-01 | 2016-03-09 | 深圳还是威健康科技有限公司 | Display rotation control method and intelligent wearable device |
CN106547352A (en) * | 2016-10-18 | 2017-03-29 | 小派科技(上海)有限责任公司 | A kind of display packing of virtual reality picture, wear display device and its system |
CN106873767A (en) * | 2016-12-30 | 2017-06-20 | 深圳超多维科技有限公司 | The progress control method and device of a kind of virtual reality applications |
CN106873783A (en) * | 2017-03-29 | 2017-06-20 | 联想(北京)有限公司 | Information processing method, electronic equipment and input unit |
- 2017-11-06: Application CN201711080735.XA filed in China; published as CN107908281A (status: pending)
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110751707A (en) * | 2019-10-24 | 2020-02-04 | 北京达佳互联信息技术有限公司 | Animation display method, animation display device, electronic equipment and storage medium |
CN110751707B (en) * | 2019-10-24 | 2021-02-05 | 北京达佳互联信息技术有限公司 | Animation display method, animation display device, electronic equipment and storage medium |
CN110992453A (en) * | 2019-12-17 | 2020-04-10 | 上海米哈游天命科技有限公司 | Scene object display method and device, electronic equipment and storage medium |
CN110992453B (en) * | 2019-12-17 | 2024-01-23 | 上海米哈游天命科技有限公司 | Scene object display method and device, electronic equipment and storage medium |
CN111080756A (en) * | 2019-12-19 | 2020-04-28 | 米哈游科技(上海)有限公司 | Interactive animation generation method, device, equipment and medium |
CN111080756B (en) * | 2019-12-19 | 2023-09-08 | 米哈游科技(上海)有限公司 | Interactive animation generation method, device, equipment and medium |
CN111124137A (en) * | 2019-12-31 | 2020-05-08 | 广州华多网络科技有限公司 | Image display method, device, equipment and storage medium |
CN111124137B (en) * | 2019-12-31 | 2023-08-08 | 广州华多网络科技有限公司 | Image display method, device, equipment and storage medium |
CN111626254A (en) * | 2020-06-02 | 2020-09-04 | 上海商汤智能科技有限公司 | Display animation triggering method and device |
CN111626254B (en) * | 2020-06-02 | 2024-04-16 | 上海商汤智能科技有限公司 | Method and device for triggering display animation |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107908281A (en) | Virtual reality exchange method, device and computer-readable recording medium | |
CN105259654B (en) | Spectacle terminal and its control method | |
CN107800945A (en) | Method and device that panorama is taken pictures, electronic equipment | |
CN102467343B (en) | Mobile terminal and the method for controlling mobile terminal | |
CN104639843B (en) | Image processing method and device | |
CN109089170A (en) | Barrage display methods and device | |
CN108510597A (en) | Edit methods, device and the non-transitorycomputer readable storage medium of virtual scene | |
CN106791893A (en) | Net cast method and device | |
CN107040646A (en) | Mobile terminal and its control method | |
CN106537319A (en) | Screen-splitting display method and device | |
CN106341522A (en) | Mobile Terminal And Method For Controlling The Same | |
CN111701238A (en) | Virtual picture volume display method, device, equipment and storage medium | |
CN104991752B (en) | Control the method and device of screen rotation | |
CN106804000A (en) | Direct playing and playback method and device | |
CN105549732A (en) | Method and device for controlling virtual reality device and virtual reality device | |
CN107832036A (en) | Sound control method, device and computer-readable recording medium | |
CN109151546A (en) | A kind of method for processing video frequency, terminal and computer readable storage medium | |
CN106993229A (en) | Interactive attribute methods of exhibiting and device | |
CN108260020A (en) | The method and apparatus that interactive information is shown in panoramic video | |
CN112634416A (en) | Method and device for generating virtual image model, electronic equipment and storage medium | |
CN107977083A (en) | Operation based on VR systems performs method and device | |
CN107515669A (en) | Display methods and device | |
CN108319363A (en) | Product introduction method, apparatus based on VR and electronic equipment | |
CN106067833A (en) | Mobile terminal and control method thereof | |
CN107832746A (en) | Expression recognition method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20180413 |