CN105955483A - Virtual reality terminal and visual virtualization method and device thereof - Google Patents
- Publication number
- CN105955483A (application CN201610295037.0A)
- Authority
- CN
- China
- Prior art keywords
- virtual
- data
- virtual reality
- reality terminal
- build
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
Abstract
An embodiment of the invention provides a virtual reality terminal and a visual virtualization method and device for it. The method comprises the following steps: constructing virtual body-shape data and virtual motion data in a three-dimensional virtual scene according to the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user; and generating a corresponding virtual display image from the constructed virtual body-shape data and virtual motion data. When using the virtual reality terminal provided by the invention, the user can obtain a first-person sense of presence through a reference object: immersion is improved, the user can devote himself more fully to the entertainment experience of the virtual world, and the user's acceptance of the realism of the virtual scene is greatly increased.
Description
Technical field
The present embodiments relate to the technical field of virtual reality, and in particular to a virtual reality terminal and a visual virtualization method and device for it.
Background technology
Virtual reality technology is a computer simulation technique with which a virtual world can be created and experienced. In essence, such a system uses a computer to generate a virtual environment: an interactive, three-dimensional dynamic view that fuses multi-source information with a simulation of entity behavior, so as to give the user an immersive experience.
The multi-source information involved in virtual reality includes real-time three-dimensional computer graphics, wide-angle (wide-field-of-view) stereoscopic display, tracking of the observer's head, eyes and hands, haptic/force feedback, stereo sound, network transmission, and speech input/output. In addition, binocular stereo vision plays a large role. In binocular stereo vision, the different images seen by the two eyes are generated separately and displayed on two different displays. Some virtual reality systems instead use a single display: after putting on special glasses, one eye sees only the odd-numbered frames and the other eye only the even-numbered frames; the disparity between odd and even frames produces the sense of depth.
However, although current virtual reality technology fuses the techniques above to provide an immersive visual experience, most current virtual reality devices, such as virtual reality glasses, are head-mounted devices. For example, when a user wearing such a device is placed in a three-dimensional room, the display of the device renders a virtual three-dimensional space model, and the user's direct feeling is of entering another space. But the user cannot obtain a first-person sense of presence through any reference object, which substantially reduces the user's actual experience of the virtual scene. For example, while using the virtual reality terminal the user must keep moving the viewpoint, yet during viewpoint movement the user cannot see his own head and body, nor the gestures he makes, so he cannot devote himself whole-heartedly to the entertainment experience of the virtual world.
Summary of the invention
The purpose of the embodiments of the present invention is to provide a virtual reality terminal and a visual virtualization method and device for it, so as to solve the prior-art problem that a user of a virtual reality terminal cannot obtain a first-person sense of presence through a reference object.
The technical scheme is as follows:
An embodiment of the present invention provides a visual virtualization method for a virtual reality terminal, comprising:
according to the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user, constructing virtual body-shape data and virtual motion data in a three-dimensional virtual scene; and
generating a corresponding virtual display image from the constructed virtual body-shape data and virtual motion data.
Preferably, in one embodiment of the invention, the real body-shape data and real motion data of the user are generated as follows:
collecting body-shape image data and motion image data of the virtual reality terminal user; and
parsing the collected body-shape image data and motion image data to generate the corresponding real body-shape data and real motion data respectively.
Preferably, in one embodiment of the invention, the wearing-pose data of the virtual reality terminal are generated as follows:
collecting wearing-pose information of the virtual reality terminal as worn by the user; and
parsing the wearing-pose information to generate the corresponding wearing-pose data.
Preferably, in one embodiment of the invention, collecting the wearing-pose information of the virtual reality terminal as worn by the user comprises: collecting the wearing-pose information through at least one infrared camera; or collecting the wearing-pose information through a gyroscope and/or an acceleration sensor.
Preferably, in one embodiment of the invention, constructing the virtual body-shape data and virtual motion data in the three-dimensional virtual scene comprises: converting, according to a three-dimensional geometric model and a normal map in the three-dimensional virtual scene, the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user into the virtual body-shape data and virtual motion data in the three-dimensional virtual scene.
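The real-to-virtual conversion above can be illustrated with a minimal sketch. This is an assumption for illustration, not the patent's method: a simple uniform scale plus origin offset stands in for the full conversion against the scene's geometric model and normal map, and all names are hypothetical.

```python
# Sketch (assumption, not from the patent text): mapping a real-space
# point of the wearing pose into the coordinate frame of the 3D virtual
# scene with a uniform scale and an origin offset.

def real_to_virtual(position, scale=1.0, scene_origin=(0.0, 0.0, 0.0)):
    """Convert a real-space point (metres) into virtual-scene units."""
    return tuple(p * scale + o for p, o in zip(position, scene_origin))

# The user's head, tracked 1.7 m above the physical floor, is placed in
# a scene whose floor sits at y = 0 and which uses a 2x scale.
head_virtual = real_to_virtual((0.0, 1.7, 0.0), scale=2.0)
print(head_virtual)  # (0.0, 3.4, 0.0)
```

A real implementation would also carry orientation and map body-shape and motion data onto the scene's avatar model.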
An embodiment of the present invention provides a visual virtualization device for a virtual reality terminal, comprising:
a construction unit, for constructing virtual body-shape data and virtual motion data in a three-dimensional virtual scene according to the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user; and
a virtualization unit, for generating a corresponding virtual display image from the constructed virtual body-shape data and virtual motion data.
Preferably, in one embodiment of the invention, the device further includes a first collecting unit, for collecting body-shape image data and motion image data of the virtual reality terminal user, and for parsing the collected body-shape image data and motion image data to generate the corresponding real body-shape data and real motion data respectively.
Preferably, in one embodiment of the invention, the device further includes a second collecting unit, for collecting wearing-pose information of the virtual reality terminal as worn by the user, and for parsing the wearing-pose information to generate the corresponding wearing-pose data.
Preferably, in one embodiment of the invention, the second collecting unit is further used for obtaining wearing-pose image information of the virtual reality terminal as worn by the user, collected by at least one infrared camera; or for obtaining wearing-pose information of the virtual reality terminal as worn by the user, collected by a gyroscope and/or an acceleration sensor.
Preferably, in one embodiment of the invention, the construction unit is further used for converting, according to a three-dimensional geometric model and a normal map in the three-dimensional virtual scene, the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user into the virtual body-shape data and virtual motion data in the three-dimensional virtual scene.
An embodiment of the present invention provides a virtual reality terminal that includes a hardware processor, the hardware processor being configured to construct virtual body-shape data and virtual motion data in a three-dimensional virtual scene according to the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user, and to generate a corresponding virtual display image from the constructed virtual body-shape data and virtual motion data.
The technical scheme of the embodiments of the present invention has the following advantage: by constructing virtual body-shape data and virtual motion data in a three-dimensional virtual scene according to the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user, and generating a corresponding virtual display image from them, the user of the virtual reality terminal provided by the invention can obtain a first-person sense of presence through a reference object. For example, while the user moves the viewpoint to look around, he can see on the screen of the terminal a virtual image of his own limbs and movements, generated from his real limbs and movements, so that his observation of the virtual scene is anchored to a reference object he consciously recognizes as real. This improves the sense of presence, lets the user devote himself more fully to the entertainment experience of the virtual world, and greatly increases the user's acceptance of the realism of the virtual scene.
Accompanying drawing explanation
In order to be illustrated more clearly that the embodiment of the present invention or technical scheme of the prior art, below will be to reality
Execute the required accompanying drawing used in example or description of the prior art to be briefly described, it should be apparent that under,
Accompanying drawing during face describes is some embodiments of the present invention, for those of ordinary skill in the art,
On the premise of not paying creative work, it is also possible to obtain other accompanying drawing according to these accompanying drawings.
Fig. 1 is a schematic flow chart of the visual virtualization method of a virtual reality terminal according to embodiment one of the present invention;
Fig. 2 is a schematic flow chart of the visual virtualization method of a virtual reality terminal according to embodiment two of the present invention;
Fig. 3 is a schematic flow chart of the visual virtualization method of a virtual reality terminal according to embodiment three of the present invention;
Fig. 4 is a schematic structural diagram of the visual virtualization device of a virtual reality terminal according to embodiment four of the present invention;
Fig. 5 is a schematic structural diagram of the visual virtualization device of a virtual reality terminal according to embodiment five of the present invention;
Fig. 6 is a schematic structural diagram of the visual virtualization device of a virtual reality terminal according to embodiment six of the present invention;
Fig. 7 is a schematic structural diagram of the virtual reality terminal according to embodiment seven of the present invention.
Detailed description of the invention
To make the purpose, technical scheme, and advantages of the embodiments of the present invention clearer, the technical schemes in the embodiments are described clearly and completely below with reference to the drawings of the embodiments. Obviously, the described embodiments are a part, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the scope of protection of the present invention.
In the following embodiments of the present invention, virtual body-shape data and virtual motion data in a three-dimensional virtual scene are constructed according to the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user, and a corresponding virtual display image is generated from them, so that when using the virtual reality terminal provided by the invention the user can obtain a first-person sense of presence through a reference object. For example, while the user moves the viewpoint to look around, he can see on the screen of the terminal a virtual image of his own limbs and movements, generated from his real limbs and movements; his observation of the virtual scene is thereby anchored to a consciously real reference object, the sense of presence is improved, the user can devote himself more fully to the entertainment experience of the virtual world, and the user's acceptance of the realism of the virtual scene is greatly increased.
Fig. 1 is a schematic flow chart of the visual virtualization method of a virtual reality terminal according to embodiment one of the present invention. As shown in Fig. 1, the technical scheme of this embodiment comprises the following steps:
S101: according to the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user, construct virtual body-shape data and virtual motion data in the three-dimensional virtual scene.
In this embodiment, the wearing-pose data, the user's real body-shape data, and the real motion data correspond to a specific user and to the virtual scene that follows that user, for example hand movements such as punching, or leg movements such as running. The wearing-pose data refer to the attitude data of the virtual reality terminal itself while in use: when the user bows the head, raises the head, or turns left or right, the terminal correspondingly moves downward, upward, to the left or to the right, and the terminal produces attitude data for each such state.
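The head movements above can be sketched as updates to a wearing-pose state. This is a hedged illustration only: the simple yaw/pitch representation and all names are assumptions, not the patent's data format.

```python
# Sketch (assumption): accumulating the terminal's wearing pose from
# per-frame head rotations, as when the user bows, raises, or turns the
# head. A (yaw, pitch) pair in degrees stands in for real attitude data.

def update_attitude(attitude, d_yaw, d_pitch):
    """Return a new (yaw, pitch) attitude after a head movement."""
    yaw, pitch = attitude
    yaw = (yaw + d_yaw) % 360.0                      # turn left/right wraps
    pitch = max(-90.0, min(90.0, pitch + d_pitch))   # bow/raise is clamped
    return (yaw, pitch)

pose = (0.0, 0.0)
pose = update_attitude(pose, d_yaw=30.0, d_pitch=0.0)    # turn right
pose = update_attitude(pose, d_yaw=0.0, d_pitch=-120.0)  # bow past the limit
print(pose)  # (30.0, -90.0)
```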
In this embodiment, constructing the virtual body-shape data and virtual motion data in the three-dimensional virtual scene may comprise: converting, according to a three-dimensional geometric model and a normal map in the three-dimensional virtual scene, the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user into the virtual body-shape data and virtual motion data in the three-dimensional virtual scene.
S102: generate a corresponding virtual display image from the constructed virtual body-shape data and virtual motion data.
In this embodiment, when the corresponding virtual display image is generated in step S102, the constructed virtual body-shape data and virtual motion data may be loaded into the models of the scene during scene rendering, so that the corresponding virtual display image is generated.
Specifically, in this embodiment the scene may be rendered with OpenGL on a graphics processor (GPU), the constructed virtual body-shape data and virtual motion data being loaded into the models of the scene.
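The loading step of S102 can be sketched as follows. This is an assumption for illustration: a dict-based "scene" and stub data stand in for a real OpenGL/GPU pipeline, and all names are hypothetical.

```python
# Sketch (assumption): loading the constructed virtual body-shape and
# motion data into the scene model before each frame is rendered, as
# step S102 describes.

def load_into_scene(scene, body_data, motion_data):
    """Attach the user's virtual body and current motion to the scene model."""
    avatar = scene.setdefault("avatar", {})
    avatar["shape"] = body_data      # e.g. limb lengths / proportions
    avatar["pose"] = motion_data     # e.g. current joint angles
    return scene

scene = {"room": "3d-room-model"}
scene = load_into_scene(scene,
                        body_data={"arm_len": 0.6},
                        motion_data={"right_elbow": 45.0})
print(scene["avatar"]["pose"]["right_elbow"])  # 45.0
```

In a real renderer this update would feed vertex or skeleton data to the GPU each frame rather than mutating a dict.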
Fig. 2 is a schematic flow chart of the visual virtualization method of a virtual reality terminal according to embodiment two of the present invention. As shown in Fig. 2, the technical scheme of this embodiment comprises the following steps:
S201: collect body-shape image data and motion image data of the virtual reality terminal user.
In this embodiment, the body-shape image data and motion image data of the user may be collected in step S201 through at least one camera. Specifically, they may be collected through an infrared camera; in that case, multiple infrared emitting points may be arranged on the user's body, for example at the head, shoulders, arms, trunk, waist, hips and legs, and the body-shape image data and motion image data of the user are collected from the infrared emitting points at these positions as captured by the infrared camera.
In this embodiment, the camera may be arranged directly on the virtual reality terminal, or in the environment in which the terminal is used. When the camera is arranged on the terminal, cameras may be placed, for ease of image collection, on the two sides of the terminal, or on its top, or on both the sides and the top. For example, two cameras are placed on the front of the device, one on each side; when the user makes a head movement while using the device, the images shot by the two front cameras match the user's forward direction of observation. When the user bows the head, for instance, the two front cameras photograph the images of the user's hands and legs, so that the device can virtualize in real time a virtual limb image and limb motion consistent with the visual effect.
In another embodiment, the body-shape image data of the user may be collected through a camera, while the motion trajectory is tracked through a gyroscope/acceleration sensor to form the motion image data.
S202: parse the collected body-shape image data and motion image data of the virtual reality terminal user to generate the corresponding real body-shape data and real motion data respectively.
In this embodiment, if the user's body-shape data and motion data are collected through a camera, a series of infrared images, i.e. an infrared image sequence, is formed; the corresponding real body-shape data and real motion data are then generated by parsing this infrared image sequence. During parsing, set feature points in the images may be identified and tracked, so that the corresponding real body-shape data and real motion data are generated respectively. Before the infrared image sequence is parsed, in order to remove unwanted noise, the mean of each pixel value over the infrared image sequence may be computed and subtracted from each pixel value; this eliminates noise and also reduces the amount of data to be processed, improving processing efficiency.
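The per-pixel mean subtraction described above can be sketched on a toy sequence. This is a minimal illustration under the stated assumptions: frames are nested lists, and a real implementation would operate on full camera images.

```python
# Sketch (assumption): the temporal mean-subtraction described above,
# applied to a tiny 2x2 "infrared image" sequence. Static background
# cancels out; per-pixel changes survive.

def subtract_temporal_mean(frames):
    """Subtract, per pixel, the mean over the sequence from every frame."""
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    mean = [[sum(f[y][x] for f in frames) / n for x in range(w)]
            for y in range(h)]
    return [[[f[y][x] - mean[y][x] for x in range(w)] for y in range(h)]
            for f in frames]

frames = [[[10, 10], [10, 10]],
          [[12, 10], [10, 10]]]   # one pixel brightens by 2 in frame 2
out = subtract_temporal_mean(frames)
print(out[1][0][0])  # 1.0 -- the change stands out against zeroed background
```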
S203: according to the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user, construct virtual body-shape data and virtual motion data in the three-dimensional virtual scene.
S204: generate a corresponding virtual display image from the constructed virtual body-shape data and virtual motion data.
In this embodiment, steps S203-S204 are similar to S101-S102 of Fig. 1 above and are not described again in detail.
Fig. 3 is a schematic flow chart of the visual virtualization method of a virtual reality terminal according to embodiment three of the present invention. As shown in Fig. 3, the technical scheme of this embodiment comprises the following steps:
S301: collect wearing-pose information of the virtual reality terminal as worn by the user.
In this embodiment, collecting the wearing-pose information in step S301 may comprise collecting, through at least one camera, wearing-pose image information of the virtual reality terminal as worn by the user.
In this embodiment, the camera arrangement and wearing-pose collection may follow those described with reference to Fig. 2 above: multiple infrared emitting points are arranged on the virtual reality terminal, for example on the top and on the two sides, and the wearing-pose information of the virtual reality terminal is collected from the infrared emitting points at these positions as captured by the infrared camera.
S302: parse the wearing-pose information to generate the corresponding wearing-pose data.
Similarly to the embodiments of Fig. 1 and Fig. 2 above, in this embodiment, if the wearing-pose information is collected through a camera, a series of infrared images, i.e. an infrared image sequence, is formed, and the corresponding wearing-pose data are generated by parsing this infrared image sequence. During parsing, set feature points in the images may be identified and tracked to generate the corresponding wearing-pose data. Before the infrared image sequence is parsed, in order to remove unwanted noise, the mean of each pixel value over the sequence may be computed and subtracted from each pixel value; this eliminates noise, reduces the amount of data to be processed, and improves processing efficiency.
S303: according to the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user, construct virtual body-shape data and virtual motion data in the three-dimensional virtual scene.
S304: generate a corresponding virtual display image from the constructed virtual body-shape data and virtual motion data.
Steps S303-S304 are similar to S101-S102 in Fig. 1 and S201-S202 in Fig. 2, and are not described again in detail.
Alternatively, in another visual virtualization method embodiment, the wearing-pose information of the virtual reality terminal as worn by the user may be collected in step S301 through a gyroscope and/or an acceleration sensor. The number of gyroscopes and/or acceleration sensors may be one or more; when there are multiple gyroscopes and/or acceleration sensors, the wearing-pose information collected by them may be fused.
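The fusion of readings from multiple sensors mentioned above can be sketched minimally. This is an assumption for illustration: simple per-axis averaging stands in for the unspecified fusion processing, and real systems would more likely use a complementary or Kalman filter.

```python
# Sketch (assumption): fusing wearing-pose readings from several
# gyroscopes/accelerometers by simple per-axis averaging.

def fuse_readings(readings):
    """Average per-axis readings from multiple sensors."""
    n = len(readings)
    return tuple(sum(r[i] for r in readings) / n
                 for i in range(len(readings[0])))

# Two gyroscopes report slightly different (yaw, pitch, roll) rates.
fused = fuse_readings([(10.0, -2.0, 0.5),
                       (12.0, -2.0, 0.3)])
print(fused)  # (11.0, -2.0, 0.4)
```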
Fig. 4 is a schematic structural diagram of the visual virtualization device of a virtual reality terminal according to embodiment four of the present invention. As shown in Fig. 4, the technical scheme of this embodiment includes:
a construction unit 401, for constructing virtual body-shape data and virtual motion data in a three-dimensional virtual scene according to the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user; and
a virtualization unit 402, for generating a corresponding virtual display image from the constructed virtual body-shape data and virtual motion data.
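The two-unit division of Fig. 4 can be expressed as a pair of plain classes. Only the division of responsibilities comes from the text; the method bodies and all names are illustrative stubs, not the patent's implementation.

```python
# Sketch (assumption): the two-unit device structure of Fig. 4.

class ConstructionUnit:                      # unit 401
    def build(self, wear_pose, body_data, motion_data):
        """Construct virtual body-shape and motion data (stubbed)."""
        return {"virtual_body": body_data,
                "virtual_motion": motion_data,
                "pose": wear_pose}

class VirtualizationUnit:                    # unit 402
    def render(self, virtual_data):
        """Generate a virtual display image descriptor (stubbed)."""
        return f"frame(pose={virtual_data['pose']})"

unit401, unit402 = ConstructionUnit(), VirtualizationUnit()
virtual = unit401.build(wear_pose="head-up", body_data={}, motion_data={})
print(unit402.render(virtual))  # frame(pose=head-up)
```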
Fig. 5 is a schematic structural diagram of the visual virtualization device of a virtual reality terminal according to embodiment five of the present invention. As shown in Fig. 5, the technical scheme of this embodiment includes:
a first collecting unit 500, for collecting body-shape image data and motion image data of the virtual reality terminal user, and for parsing the collected body-shape image data and motion image data to generate the corresponding real body-shape data and real motion data respectively;
a construction unit 401, for constructing virtual body-shape data and virtual motion data in a three-dimensional virtual scene according to the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user; and
a virtualization unit 402, for generating a corresponding virtual display image from the constructed virtual body-shape data and virtual motion data.
Preferably, in any embodiment of the present invention, the first collecting unit is further used for obtaining the body-shape image data and motion image data of the virtual reality terminal user collected through at least one camera.
Fig. 6 is a schematic structural diagram of the visual virtualization device of a virtual reality terminal according to embodiment six of the present invention. As shown in Fig. 6, the technical scheme of this embodiment includes:
a second collecting unit 600, for collecting wearing-pose information of the virtual reality terminal as worn by the user, and for parsing the wearing-pose information to generate the corresponding wearing-pose data;
the second collecting unit in this embodiment is further used for obtaining the wearing-pose image information of the virtual reality terminal as worn by the user, collected by at least one camera;
a construction unit 401, for constructing virtual body-shape data and virtual motion data in a three-dimensional virtual scene according to the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user; and
a virtualization unit 402, for generating a corresponding virtual display image from the constructed virtual body-shape data and virtual motion data.
Alternatively, the second collecting unit is further used for obtaining the wearing-pose information of the virtual reality terminal as worn by the user, collected by a gyroscope and/or an acceleration sensor.
Fig. 7 is a schematic structural diagram of the virtual reality terminal according to embodiment seven of the present invention. As shown in Fig. 7, the virtual reality terminal 700 includes a hardware processor 701, the hardware processor 701 being configured to construct virtual body-shape data and virtual motion data in a three-dimensional virtual scene according to the wearing-pose data of the virtual reality terminal in the real physical space and the real body-shape data and real motion data of the user, and to generate a corresponding virtual display image from the constructed virtual body-shape data and virtual motion data.
In this or any other embodiment, the hardware processor 701 further obtains the collected body-shape image data and motion image data of the virtual reality terminal user, and parses the collected body-shape image data and motion image data to generate the corresponding real body-shape data and real motion data respectively.
In this or any other embodiment, the hardware processor 701 further obtains the collected wearing-pose information of the virtual reality terminal as worn by the user, and parses the wearing-pose information to generate the corresponding wearing-pose data.
In this or any other embodiment, the hardware processor 701 is further used for obtaining the body-shape image data and motion image data of the virtual reality terminal user collected through at least one camera.
In this or any other embodiment, the hardware processor 701 is further used for obtaining the wearing-pose image information of the virtual reality terminal as worn by the user, collected through at least one camera.
In this or any other embodiment, the hardware processor 701 is further used for obtaining the wearing-pose information of the virtual reality terminal as worn by the user, collected through a gyroscope and/or an acceleration sensor.
The device embodiments described above are merely schematic. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units: they may be located in one place, or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the scheme of this embodiment. Those of ordinary skill in the art can understand and implement this without creative effort.
Through the above description of the embodiments, those skilled in the art can clearly understand that each embodiment can be implemented by software plus a necessary general-purpose hardware platform, or, of course, by hardware. Based on this understanding, the above technical solution, or the part of it that contributes over the prior art, can be embodied in the form of a software product. This computer software product can be stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk, or an optical disc, and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform the method described in each embodiment or in certain parts of the embodiments.
Last it is noted that above example is only in order to illustrate technical scheme, rather than to it
Limit;Although the present invention being described in detail with reference to previous embodiment, the ordinary skill of this area
Personnel it is understood that the technical scheme described in foregoing embodiments still can be modified by it, or
Person carries out equivalent to wherein portion of techniques feature;And these amendments or replacement, do not make corresponding skill
The essence of art scheme departs from the spirit and scope of various embodiments of the present invention technical scheme.
Claims (11)
1. A visual virtualization method for a virtual reality terminal, characterized by comprising:
constructing virtual build data and virtual action data in a three-dimensional virtual scene according to wearing pose data of the virtual reality terminal in actual physical space and actual build data and actual motion data corresponding to a user; and
generating a corresponding virtual display image according to the constructed virtual build data and the virtual action data.
2. The visual virtualization method of a virtual reality terminal according to claim 1, wherein the actual build data and actual motion data corresponding to the user are generated in the following way:
collecting build image data and action image data of the virtual reality terminal user; and
parsing the collected build image data and action image data to generate the corresponding actual build data and actual motion data respectively.
3. The visual virtualization method of a virtual reality terminal according to claim 1, characterized in that the wearing pose data of the virtual reality terminal is generated in the following way:
collecting wearing pose information of the virtual reality terminal worn by the virtual reality terminal user; and
parsing the wearing pose information to generate the corresponding wearing pose data.
4. The visual virtualization method of a virtual reality terminal according to claim 3, characterized in that collecting the wearing pose information of the virtual reality terminal worn by the virtual reality terminal user comprises:
collecting, by at least one infrared camera, the wearing pose information of the virtual reality terminal worn by the virtual reality terminal user; or
collecting, by a gyroscope and/or an acceleration sensor, the wearing pose information of the virtual reality terminal worn by the virtual reality terminal user.
5. The visual virtualization method of a virtual reality terminal according to any one of claims 1-4, wherein constructing the virtual build data and virtual action data in the three-dimensional virtual scene comprises:
converting, according to a three-dimensional geometric model and a normal map in the three-dimensional virtual scene, the wearing pose data of the virtual reality terminal in actual physical space and the actual build data and actual motion data corresponding to the user into the virtual build data and virtual action data in the three-dimensional virtual scene.
6. A visual virtualization device for a virtual reality terminal, characterized by comprising:
a construction unit, configured to construct virtual build data and virtual action data in a three-dimensional virtual scene according to wearing pose data of the virtual reality terminal in actual physical space and actual build data and actual motion data corresponding to a user; and
a virtualization unit, configured to generate a corresponding virtual display image according to the constructed virtual build data and the virtual action data.
7. The visual virtualization device of a virtual reality terminal according to claim 6, further comprising a first collection unit, configured to collect build image data and action image data of the virtual reality terminal user, and to parse the collected build image data and action image data to generate the corresponding actual build data and actual motion data respectively.
8. The visual virtualization device of a virtual reality terminal according to claim 6, further comprising a second collection unit, configured to collect wearing pose information of the virtual reality terminal worn by the virtual reality terminal user, and to parse the wearing pose information to generate the corresponding wearing pose data.
9. The visual virtualization device of a virtual reality terminal according to claim 8, characterized in that the second collection unit is further configured to obtain wearing pose image information, collected by at least one infrared camera, of the virtual reality terminal worn by the virtual reality terminal user; or to obtain wearing pose information, collected by a gyroscope and/or an acceleration sensor, of the virtual reality terminal worn by the virtual reality terminal user.
10. The visual virtualization device of a virtual reality terminal according to any one of claims 6-9, wherein the construction unit is further configured to convert, according to a three-dimensional geometric model and a normal map in the three-dimensional virtual scene, the wearing pose data of the virtual reality terminal in actual physical space and the actual build data and actual motion data corresponding to the user into the virtual build data and virtual action data in the three-dimensional virtual scene.
11. A virtual reality terminal, characterized by comprising a hardware processor, the hardware processor being configured to construct virtual build data and virtual action data in a three-dimensional virtual scene according to wearing pose data of the virtual reality terminal in actual physical space and actual build data and actual motion data corresponding to a user, and to generate a corresponding virtual display image according to the constructed virtual build data and the virtual action data.
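As a loose illustration of the conversion recited in claims 5 and 10 (physical-space wearing pose and build data into three-dimensional virtual-scene data), the sketch below applies a uniform scale plus translation. The scene scale, origin, and avatar base height are hypothetical stand-ins; the patent's actual conversion additionally consults the scene's 3D geometric model and normal map:

```python
# Assumed mapping parameters, not taken from the specification.
SCENE_SCALE = 2.0                 # virtual metres per physical metre
SCENE_ORIGIN = (10.0, 0.0, 5.0)   # where the play area maps into the scene

def to_virtual(physical_pos, scale=SCENE_SCALE, origin=SCENE_ORIGIN):
    """Convert an (x, y, z) position in the physical play area into
    virtual-scene coordinates by uniform scaling plus translation."""
    return tuple(o + scale * p for p, o in zip(physical_pos, origin))

def virtual_build_scale(actual_height_m, avatar_base_height_m=1.75):
    """Scale factor applied to the avatar model so the virtual build
    matches the user's actual build (height used as a simple proxy)."""
    return actual_height_m / avatar_base_height_m

# Headset 'wearing pose' position and a 1.68 m user, as sample inputs.
head_pos = to_virtual((1.0, 1.6, 0.5))
avatar_scale = virtual_build_scale(1.68)
```

The same transform would be applied to every tracked joint position from the actual motion data, so the avatar moves through the virtual scene in proportion to the user's movement through the physical room.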
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610295037.0A CN105955483A (en) | 2016-05-06 | 2016-05-06 | Virtual reality terminal and visual virtualization method and device thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105955483A true CN105955483A (en) | 2016-09-21 |
Family
ID=56914801
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610295037.0A Pending CN105955483A (en) | 2016-05-06 | 2016-05-06 | Virtual reality terminal and visual virtualization method and device thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105955483A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7212197B1 (en) * | 1999-02-01 | 2007-05-01 | California Institute Of Technology | Three dimensional surface drawing controlled by hand motion |
CN102789313A (en) * | 2012-03-19 | 2012-11-21 | 乾行讯科(北京)科技有限公司 | User interaction system and method |
CN104699247A (en) * | 2015-03-18 | 2015-06-10 | 北京七鑫易维信息技术有限公司 | Virtual reality interactive system and method based on machine vision |
CN105528799A (en) * | 2014-10-21 | 2016-04-27 | 三星电子株式会社 | Virtual fitting device and virtual fitting method thereof |
- 2016-05-06: application CN201610295037.0A filed, patent CN105955483A (en), status: active, Pending
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106502388A (en) * | 2016-09-26 | 2017-03-15 | 惠州Tcl移动通信有限公司 | A kind of interactive movement technique and head-wearing type intelligent equipment |
CN106502388B (en) * | 2016-09-26 | 2020-06-02 | 惠州Tcl移动通信有限公司 | Interactive motion method and head-mounted intelligent equipment |
CN106484110A (en) * | 2016-09-30 | 2017-03-08 | 珠海市魅族科技有限公司 | A kind of method of simulation body action and virtual reality device |
CN108074278A (en) * | 2016-11-17 | 2018-05-25 | 百度在线网络技术(北京)有限公司 | Video presentation method, device and equipment |
CN106412810A (en) * | 2016-11-23 | 2017-02-15 | 北京小米移动软件有限公司 | Data transmission method and device |
CN106598237A (en) * | 2016-11-30 | 2017-04-26 | 宇龙计算机通信科技(深圳)有限公司 | Game interaction method and device based on virtual reality |
WO2018196107A1 (en) * | 2017-04-26 | 2018-11-01 | 歌尔科技有限公司 | Gaming input method and device for virtual reality device, and virtual reality system |
CN107185229A (en) * | 2017-04-26 | 2017-09-22 | 歌尔科技有限公司 | Game input method and device, the virtual reality system of virtual reality device |
CN108108026A (en) * | 2018-01-18 | 2018-06-01 | 珠海金山网络游戏科技有限公司 | A kind of VR virtual realities motion capture system and motion capture method |
CN108648281A (en) * | 2018-05-16 | 2018-10-12 | 热芯科技有限公司 | Mixed reality method and system |
CN112068703A (en) * | 2020-09-07 | 2020-12-11 | 北京字节跳动网络技术有限公司 | Target object control method and device, electronic device and storage medium |
CN112068703B (en) * | 2020-09-07 | 2021-11-16 | 北京字节跳动网络技术有限公司 | Target object control method and device, electronic device and storage medium |
US11869195B2 (en) | 2020-09-07 | 2024-01-09 | Beijing Bytedance Network Technology Co., Ltd. | Target object controlling method, apparatus, electronic device, and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105955483A (en) | Virtual reality terminal and visual virtualization method and device thereof | |
Biocca | Virtual reality technology: A tutorial | |
CN107315470B (en) | Graphic processing method, processor and virtual reality system | |
CN108282648B (en) | VR rendering method and device, wearable device and readable storage medium | |
CN106873778A (en) | A kind of progress control method of application, device and virtual reality device | |
CN204695231U (en) | Portable helmet immersion systems | |
CN107004296A (en) | For the method and system that face is reconstructed that blocks to reality environment | |
CN102801994B (en) | Physical image information fusion device and method | |
JPH10188034A (en) | Three-dimensional image generator and distance operating device | |
CN104536579A (en) | Interactive three-dimensional scenery and digital image high-speed fusing processing system and method | |
CN104702936A (en) | Virtual reality interaction method based on glasses-free 3D display | |
CN106708270A (en) | Display method and apparatus for virtual reality device, and virtual reality device | |
JP7060544B6 (en) | Exercise equipment | |
CN204406327U (en) | Based on the limb rehabilitating analog simulation training system of said three-dimensional body sense video camera | |
CN106127846A (en) | Virtual reality terminal and vision virtual method thereof and device | |
Lou et al. | Reducing cybersickness by geometry deformation | |
CN204496117U (en) | 3d glasses | |
CN114708408A (en) | Experience system for virtual reality and meta-universe scene building in water | |
CN105721857A (en) | Helmet with double cameras | |
CN105979239A (en) | Virtual reality terminal, display method of video of virtual reality terminal and device | |
CN107632702B (en) | Holographic projection system adopting light-sensing data gloves and working method thereof | |
Saggio et al. | Augmented reality for restoration/reconstruction of artefacts with artistic or historical value | |
CN113253843B (en) | Indoor virtual roaming realization method and realization system based on panorama | |
CN205880817U (en) | AR and VR data processing equipment | |
CN105913379A (en) | Virtual reality terminal, its picture display method and apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | | Application publication date: 20160921 |