CN108830939A - Mixed-reality-based scene roaming experience method and experience system - Google Patents
Mixed-reality-based scene roaming experience method and experience system
- Publication number
- CN108830939A (Application CN201810586111.3A)
- Authority
- CN
- China
- Prior art keywords
- scene
- virtual
- real
- mobile terminal
- viewpoint
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
Abstract
The invention discloses a mixed-reality-based scene roaming experience method that uses a mobile terminal equipped with environment-sensing and motion-tracking technology together with an AR display device having a semi-transparent display screen. In the method, the mobile terminal matches received virtual scene data against real scene data acquired in real time; it updates a first viewpoint and a second viewpoint according to the motion data it acquires, renders the received virtual scene data in real time into a first virtual scene and a second virtual scene from the first and second viewpoints, and displays them in binocular split-screen mode; the displayed result is projected onto the semi-transparent display screen of the AR display device; the user's left and right eyes respectively receive a first real scene and a second real scene, which combine with the first and second virtual scenes presented on the semi-transparent display screen, realizing a roaming experience of the scene. A mixed-reality-based scene roaming experience system is also disclosed. Both can give the user an immersive scene-experience effect.
Description
Technical field
The invention belongs to the field of image processing, and in particular relates to a mixed-reality-based scene roaming experience method and experience system.
Background art
When a design scenario (for example, an architectural interior design scheme) is previewed, the designed scene is typically presented to the user as two-dimensional renderings. Although such renderings are clear and refined, they have no stereoscopic effect, so the user cannot personally experience the true effect the designed scene would produce, and the user's needs cannot be met.
With the development of VR (Virtual Reality) technology, people view designed scenes through VR head-mounted display devices (VR glasses, VR visors, VR helmets, and the like). Although the stereoscopic effect of the designed scene can then be seen, the following problems arise: high-latency problems caused by the refresh rate of the VR glasses, screen flicker, the gyroscope, and so on induce visually induced motion sickness, and when the state the user observes visually is inconsistent with the actual state of the body, simulator sickness results. Both kinds of motion sickness can leave the user with strong dizziness, fatigue, blurred vision, nausea, and other discomfort, giving the user a poor experience. Furthermore, because VR head-mounted display devices are expensive, the range of users who can experience them is limited.
With the development of AR (Augmented Reality) technology, people use AR to present the 3D model of a designed scene in the real environment: an AR display device (AR glasses, AR visors, AR helmets, and the like) blends virtual objects into the real environment and presents the user with a new environment whose sensory effect is realistic. Although AR technology can present the stereoscopic effect of the designed scene to the user, it merely displays the scene; it establishes no connection with the physical floor plan, so it cannot provide an immersive experience and still cannot meet the user's needs.
The patent application with publication number CN107909654A discloses an AR-based home-decoration experience system comprising: model units printed with image information of the physical finishing materials that home decoration requires; an acquisition unit that captures all the image information on the model units; a design unit that receives the data transmitted by the acquisition unit and combines each indoor furnishing with the floor or wallpaper to produce an interior design; and an AR display unit. That system integrates scanned sofa images, tea-table images, floor images, and so on into one application scenario to set up and plan the decoration scheme, then displays it in augmented reality through the AR display unit, letting the user view the future decoration effect comprehensively and see the details of each household item. Although that technique shows the user the future decoration effect by means of augmented reality, it only displays the 3D model of the building's decoration scheme; when the user walks around indoors, it cannot show the region of the decoration scheme corresponding to wherever the user has walked, i.e., it cannot realize an immersive experience.
Summary of the invention
The object of the present invention is to provide a mixed-reality-based scene roaming experience method and experience system that can give the user an immersive roaming experience.
To achieve the above object, the present invention provides the following technical solutions:
A first embodiment of the invention provides a mixed-reality-based scene roaming experience method. The method uses a mobile terminal and an AR display device; the mobile terminal has environment-sensing and motion-tracking technology, and the AR display device has a semi-transparent display screen, which can effectively reduce the onset of motion sickness.
The scene roaming experience method comprises the following steps:
the mobile terminal receives virtual scene data, acquires real scene data in real time, and matches the virtual scene data against the real scene data acquired in real time so that the virtual scene corresponds to the real scene;
the mobile terminal updates a first viewpoint and a second viewpoint according to the motion data it acquires in real time, renders the received virtual scene data into a first virtual scene and a second virtual scene from the first and second viewpoints, and displays them on the left and right halves of the mobile terminal's display screen, realizing a binocular split-screen effect;
the first and second virtual scenes on the mobile terminal's display screen are projected onto the semi-transparent display screen of the AR display device;
the user's left and right eyes respectively receive a first real scene and a second real scene, which combine with the first and second virtual scenes presented on the semi-transparent display screen, realizing a roaming experience of the scene.
A second embodiment of the invention provides a mixed-reality-based scene roaming experience system comprising a server, a mobile terminal in communication with the server, and an AR display device; the mobile terminal has environment-sensing and motion-tracking technology, and the AR display device has a semi-transparent display screen.
The server simplifies and stores the 3D model of the constructed virtual scene, performs occlusion-culling bake processing on the 3D model according to the spatial relationships within the virtual scene, and stores the processing result.
Upon receiving a request command sent by the mobile terminal, the server packs the simplified 3D model corresponding to the virtual scene named in the request, together with its occlusion-culling bake result, into a scene resource package for the mobile terminal to download.
After downloading the scene resource package from the server, the mobile terminal, in combination with the AR display device, carries out the steps of the scene roaming experience method above to realize a roaming experience of the scene.
A third embodiment of the invention provides a mixed-reality-based scene roaming experience method comprising the following steps:
receiving virtual scene data, acquiring real scene data in real time, and matching the virtual scene data against the real scene data acquired in real time so that the virtual scene matches the real scene in size;
acquiring the user's motion data in real time, updating a first viewpoint and a second viewpoint according to the acquired motion data, and rendering the received virtual scene data in real time into a first virtual scene and a second virtual scene from the first and second viewpoints, where the Euclidean distance between the first viewpoint and the second viewpoint approximates the user's interpupillary distance;
the user's left and right eyes respectively receive a first real scene and a second real scene, which combine with the first and second virtual scenes to realize a roaming experience of the scene.
Compared with the prior art, the invention has the following advantages:
The scene roaming experience method and system provided by the invention acquire the real scene in real time and render the virtual scene in real time according to real-time motion information, combining the real scene with the virtual scene. They therefore provide the user with an immersive roaming experience while effectively avoiding motion sickness.
Detailed description of the invention
In order to explain the embodiments of the invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; those of ordinary skill in the art can obtain other drawings from them without creative labor.
Fig. 1 is a flow chart of the mixed-reality-based scene roaming experience method provided by an embodiment of the invention.
Specific embodiment
To make the objectives, technical solutions, and advantages of the present invention clearer, the invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the invention and do not limit its scope of protection.
The object of the invention is, based on mixed reality, to project a constructed virtual scene to a user located in a real scene; the user views the virtual scene through the AR display device together with the real scene and experiences an immersive roaming effect while avoiding motion sickness. The real scene is the real environment that the naked eye can see, with definite boundaries; the virtual scene is the scene obtained by rendering a constructed 3D model, which can only be seen through a certain display device. Taking the roaming of an architectural interior design scheme as an example, the virtual scene is the interior design, and the real scene is the real floor plan to be designed (for example, an unfinished apartment).
The mixed-reality-based scene roaming experience method and scene roaming experience system provided by the invention are elaborated below with reference to specific embodiments.
The mixed-reality-based scene roaming experience system provided by one embodiment of the invention comprises a server, a mobile terminal in communication with the server, and an AR display device.
The mobile terminal may be a mobile phone, a laptop, a tablet computer, or the like; it has environment-sensing and motion-tracking technology, can realize real-time perception of the real scene as well as motion tracking, and also has a communication function for communicating with the server. The AR display device may be AR glasses, an AR visor, an AR helmet, or the like, and has a semi-transparent display screen. In particular, the AR display device may carry a bracket that supports a mobile terminal such as a mobile phone or tablet computer.
The server simplifies and stores the 3D model of the constructed virtual scene, performs occlusion-culling bake processing on the 3D model according to the spatial relationships within the virtual scene, and stores the processing result.
The 3D model of a building can contain a very large number of faces (polygons), so rendering it on the mobile terminal is computationally expensive: the rendering frame rate drops, the rendered picture stutters, and the user experience suffers. To improve the mobile terminal's rendering efficiency, the building's 3D model can be preprocessed on the server side, which specifically includes simplifying the 3D model, building hierarchical levels of detail, and occlusion-culling bake processing.
For the simplification of the 3D model and the hierarchical levels of detail, LOD (Level of Detail) technology can be used to simplify the original 3D model into several levels of different fineness, reducing the model's total face count while having little influence on the picture. At render time, the simplified 3D model of the required level is selected on demand, improving rendering efficiency, and thus the user experience, while still satisfying the display requirements.
Some virtual scenes contain spaces that occlude one another; rendering every space would be so computationally expensive that it would degrade the rendered picture. Therefore, occlusion-culling bake processing is applied to the 3D model. Specifically, the space of the virtual scene is evenly divided into multiple connected cubes, and the occlusion relationships between viewpoints in each cube and the objects in the scene are precomputed. At render time, models the camera cannot see are not rendered, reducing the number of faces currently rendered, so that a high-quality picture can be rendered in real time at an improved frame rate.
Because the server has simplified the 3D model of the virtual scene and performed the occlusion-culling bake in advance, when it receives a call request it can automatically construct a scene resource package usable by the mobile terminal within a very short time, greatly reducing cost.
After receiving the request command sent by the mobile terminal, the server packs the simplified 3D model corresponding to the virtual scene in the request, together with its occlusion-culling bake result, into a scene resource package for the mobile terminal to download.
Specifically, Unity3d can be used both as the mobile terminal's rendering engine and as the server's automatic scene-building application. The mobile terminal sends an HTTP request to the server, the application builds and packs the specified design scheme, and the packed scene resource package is downloaded through a URL for mixed-reality rendering. After receiving the HTTP request, the server runs Unity3d in command-line batch mode to build and pack the specified design scheme automatically, and provides a URL link for the mobile terminal to download.
After downloading the scene resource package from the server, the mobile terminal, in combination with the AR display device, carries out the steps of the scene roaming experience method shown in Fig. 1 to realize a roaming experience of the scene.
As shown in Fig. 1, the scene roaming experience method provided by this embodiment comprises the following steps:
S101: the mobile terminal receives virtual scene data, acquires real scene data in real time, and matches the virtual scene data against the real scene data acquired in real time so that the virtual scene corresponds to the real scene.
The virtual scene data are the constructed 3D model data, and the real scene data are the real-environment data captured by the mobile terminal's camera. Before roaming, the virtual scene is first adjusted to the size of the real scene to give the user a more realistic experience.
Specifically, matching the virtual scene data against the real scene data acquired in real time comprises:
(a) determining arbitrary virtual marker points in the virtual scene data.
The number of virtual marker points is unrestricted; normally, prominent positions in the virtual scene are selected as virtual marker points, which serve as base points that facilitate the subsequent matching of the virtual scene to the real scene.
(b) preparing a marker picture and placing it in the real scene so that the orientation and position of the marker picture correspond to the virtual marker point's position in the virtual scene.
The marker picture can be a picture with relatively complex texture information; its shape is set according to the virtual marker point, so as to guarantee as far as possible that the orientation of the marker picture corresponds to the virtual marker point's position in the virtual scene, making it convenient to determine the real marker point from the marker picture.
(c) capturing an image containing the marker picture with the mobile terminal's camera, and emitting a ray from the camera toward the marker picture corresponding to the virtual marker point's position; the intersection of this ray with the ground identified by the mobile terminal's environment-sensing and motion-tracking technology is the real marker point. The number of real marker points equals the number of virtual marker points.
Because the environment-sensing and motion-tracking technology can obtain camera parameters such as the camera's orientation and spatial coordinates in real time, and can also identify the spatial position of planes in real time, the spatial position of the marker picture can be obtained. From the camera parameters and the marker picture's spatial position, the spatial relationship between the camera and the marker picture is known, so a ray can be emitted from the camera toward the marker picture corresponding to the virtual marker point's position; the intersection of that ray with the real ground identified by environment sensing and motion tracking is the real marker point. By using environment sensing and motion tracking together with the marker picture as a reference, the invention obtains the real marker points accurately and elegantly, providing a reliable data basis for matching the virtual scene to the real scene.
(d) calculating the virtual centroid of all virtual marker points and the real centroid of all real marker points.
(e) translating the virtual space and the virtual marker points by the spatial-coordinate difference between the real centroid and the virtual centroid, so that the virtual centroid coincides with the real centroid. Specifically, the spatial-coordinate difference between the real centroid and the virtual centroid is computed first; then this difference is added to the virtual marker points and the virtual scene data, so that the virtual centroid coincides with the real centroid, completing the initial matching of the virtual scene to the real scene.
(f) rotating the virtual space and the virtual marker points about the real centroid by the angle formed by each virtual marker point, the real centroid, and the corresponding real marker point, completing the matching of the virtual scene data to the real scene data acquired in real time.
After the initial matching of the virtual scene to the real scene, the virtual scene has only been moved to the position corresponding to the real scene; some deviation remains in orientation, so the virtual scene must also be rotated so that it corresponds one-to-one with the real scene, achieving an exact match. Specifically, the angle angle(i) formed by a virtual marker point, the real centroid, and the corresponding real marker point is computed first; if there are multiple virtual marker points, the average of angle(i), i = 1, 2, ..., n, is used. Then the virtual space and the virtual marker points are rotated about the real centroid by angle(i), or by that average, completing the matching of the virtual scene data to the real scene data.
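Steps (d)–(f) can be sketched in ground-plane (2-D) coordinates — an illustrative simplification, since the patent works in 3-D and rotates about the vertical axis through the real centroid. All names are assumptions, and a robust version would additionally wrap each angle difference into (−π, π] before averaging.

```python
import math

def centroid(points):
    """Arithmetic mean of a list of 2-D points."""
    n = len(points)
    return tuple(sum(c) / n for c in zip(*points))

def align(virtual_pts, real_pts):
    """Translate virtual marker points so the centroids coincide, then
    rotate them about the real centroid by the mean marker angle."""
    vc, rc = centroid(virtual_pts), centroid(real_pts)
    dx, dy = rc[0] - vc[0], rc[1] - vc[1]
    moved = [(x + dx, y + dy) for x, y in virtual_pts]       # step (e)
    # Step (f): mean signed angle between matching points, seen from rc.
    # (No wrap-around handling here -- see the caveat in the lead-in.)
    angles = [
        math.atan2(ry - rc[1], rx - rc[0]) - math.atan2(vy - rc[1], vx - rc[0])
        for (vx, vy), (rx, ry) in zip(moved, real_pts)
    ]
    a = sum(angles) / len(angles)
    cos_a, sin_a = math.cos(a), math.sin(a)
    return [
        (rc[0] + (x - rc[0]) * cos_a - (y - rc[1]) * sin_a,
         rc[1] + (x - rc[0]) * sin_a + (y - rc[1]) * cos_a)
        for x, y in moved
    ]
```

Applying the returned translation and rotation to the whole virtual scene, rather than just the marker points, is what brings the virtual scene into register with the real one.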
S102: the mobile terminal updates the first viewpoint and the second viewpoint according to the motion data it acquires in real time, renders the received virtual scene data into the first virtual scene and the second virtual scene from the first and second viewpoints, and displays them on the left and right halves of the mobile terminal's display screen, realizing a binocular split-screen effect.
For the user to obtain an immersive experience while roaming, the rendered virtual scene must correspond to the place the user has reached, i.e., the corresponding virtual scene is rendered according to that place. The mobile terminal's environment-sensing and motion-tracking technology obtains the camera's motion data in real time, and the rendering viewpoints (the first viewpoint and the second viewpoint) are updated from that data, i.e., from the camera's spatial position. The first and second viewpoints are the spatial positions of the rendering cameras; they simulate the user's two eyes, so there is a fixed spacing between them. Specifically, the first and second viewpoints can be placed on either side of the spatial position of the mobile terminal's camera, separated by a fixed distance (the distance simulating the user's interpupillary distance may be 6 cm), giving the spatial positions of the first and second viewpoints.
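The viewpoint placement can be sketched as follows. This is an illustrative assumption, not the patent's code: orientation is reduced to a yaw angle, whereas a real implementation would offset along the right vector of the full tracked camera pose.

```python
import math

IPD = 0.06  # simulated interpupillary distance in metres (6 cm, per the text)

def eye_positions(cam_pos, yaw):
    """Place the two rendering viewpoints half the IPD to each side of the
    tracked camera, along the camera's right vector (yaw in radians)."""
    # Right vector of a camera looking along (sin(yaw), 0, cos(yaw)).
    rx, rz = math.cos(yaw), -math.sin(yaw)
    x, y, z = cam_pos
    half = IPD / 2.0
    left = (x - rx * half, y, z - rz * half)
    right = (x + rx * half, y, z + rz * half)
    return left, right

left, right = eye_positions((0.0, 1.6, 0.0), 0.0)
print(left, right)  # -> (-0.03, 1.6, 0.0) (0.03, 1.6, 0.0)
```

The Euclidean distance between the two returned viewpoints is exactly the simulated interpupillary distance, matching the third embodiment's requirement.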
After the first and second viewpoints are updated, the virtual scene data are rendered into the first and second virtual scenes from those viewpoints. Because the first and second virtual scenes are displayed split-screen on the mobile terminal's display, the virtual scene would be squeezed along the screen's long axis: if the virtual scene is normally displayed at 9×16, after split-screen display the first and second virtual scenes are each compressed to 9×8. The squeezed virtual scenes would directly harm the roaming experience, so the first and second virtual scenes must be adjusted. Specifically, rendering the received virtual scene data into the first and second virtual scenes from the first and second viewpoints comprises: after rendering the received virtual scene data from the first and second viewpoints, modifying the perspective projection matrix of the virtual scene's camera so that the first virtual scene displayed on the left half of the mobile terminal's display screen and the second virtual scene displayed on the right half scale normally and their viewpoint centers do not shift.
By means of the camera's perspective projection matrix, the virtual scene that would occupy the central half-region of the screen under normal display is captured as the first virtual scene and the second virtual scene, so that the first virtual scene displayed on the left half of the mobile terminal's screen and the second virtual scene displayed on the right half scale normally and their viewpoint centers do not shift.
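The matrix fix can be sketched as follows, assuming an OpenGL-style projection (the patent does not give the exact matrix) and reading the text's 9×16 screen as a 16:9 landscape display whose split halves are each 8:9. Rebuilding the projection with the half-viewport aspect, instead of letting the viewport squeeze the image, is what keeps the scene's proportions correct.

```python
import math

def perspective(fov_y_deg, aspect, near, far):
    """OpenGL-style perspective projection matrix (row-major 4x4)."""
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

# Full landscape screen vs one half of the binocular split screen:
full_screen = perspective(60.0, 16.0 / 9.0, 0.1, 100.0)
half_screen = perspective(60.0, 8.0 / 9.0, 0.1, 100.0)
# Halving the aspect doubles the horizontal scale term m[0][0], which exactly
# cancels the horizontal squeeze of rendering into a half-width viewport.
```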
Compared with the face count of the 3D model, the rendering material has an even greater influence on the visual effect of the rendered picture. Therefore, the mobile terminal renders the 3D model of the received virtual scene using a PBR (Physically Based Rendering) material system, and applies anti-aliasing, ambient occlusion, full-screen bloom, and tone-mapping processing to the rendered first and second virtual scenes.
A PBR material system is a rendering technique that simulates physical laws; using such materials can greatly improve rendering quality. When rendering with a PBR material system, some special materials also require dedicated treatment, specifically:
for transparent materials: because a translucent material requires the current object and the objects behind it to be rendered simultaneously, the depth test is disabled at render time so that the objects behind are rendered, and a transparency value is set to control the blending ratio of the front and back materials;
for self-luminous materials: self-luminous materials are mainly used for light bulbs, fluorescent tubes, and the like; they are little affected by other lighting, have uniform color, and also act as light sources that affect the environment. To reduce computing-resource consumption, the light-source attribute of a self-luminous material is disabled at render time and its illumination intensity is adjusted to a relatively large value.
The number and kinds of lights greatly affect computing-resource consumption. For this reason, a simplified lighting system is used: the indoor lighting is built jointly from a small number of directional lights, point lights, and gradient ambient light.
To reduce the rendering workload, both the material system and the lighting system adopt the special treatments above, which can lower the rendered picture quality. Therefore, to raise the picture quality substantially at low cost, the rendered output must be optimized. Specifically, anti-aliasing, ambient occlusion, full-screen bloom, and tone mapping are applied to the rendered first and second virtual scenes, compensating for the detail lost when the lighting system was simplified and thereby improving the rendered picture quality.
When the mobile terminal is used with an AR/VR head-mounted peripheral, picture details are magnified, so anti-aliasing must be applied to smooth edges and avoid harming the user experience. The invention uses the fast approximate anti-aliasing algorithm (FXAA) to smooth edges and textures; FXAA's computational cost is low, making it suitable for rendering on mobile terminals.
The simplified lighting system above causes a loss of shadow detail, giving the user a very flat impression; to solve this problem, ambient-occlusion processing is applied to the rendered picture.
Full-screen bloom refers, in high-dynamic-range rendering, to reinforcing and spreading highlight regions; it is mainly used for effects such as light shining through a window. Applying full-screen bloom to the rendered picture restores the realism of self-luminous materials, improving the rendered picture quality.
After anti-aliasing, ambient occlusion, and full-screen bloom, the rendered picture can exhibit tonal distortion, directly harming picture quality. To bring the picture's tones back to normal, tone mapping is applied — specifically, mainly to the highlight regions of the rendered picture.
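The patent does not name a specific tone-mapping operator; as an illustrative assumption, the global Reinhard operator is one common choice for pulling over-bright values back into displayable range:

```python
def tone_map(pixel):
    """Reinhard global tone mapping: compresses HDR channel values from
    [0, inf) into [0, 1), affecting over-bright highlights the most."""
    return tuple(c / (1.0 + c) for c in pixel)

print(tone_map((0.0, 1.0, 4.0)))  # -> (0.0, 0.5, 0.8)
```

Note how the highlight channel (4.0) is compressed far more strongly than the mid-range one, which is why the text describes the processing as acting mainly on highlight regions.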
In addition, because the server has simplified the 3D model and baked the occlusion culling in advance, when the simplified 3D model corresponding to the virtual scene is rendered, it is rendered according to the occlusion-culling bake result in order to reduce the rendering cost and raise the frame rate; that is, according to the virtual scene's spatial occlusion relationships, only the scene observable by the rendering camera is rendered.
S103: the first and second virtual scenes on the mobile terminal's display screen are projected onto the semi-transparent display screen of the AR display device.
S104: the user's left and right eyes respectively receive the first real scene and the second real scene, which combine with the first and second virtual scenes presented on the semi-transparent display screen, realizing a roaming experience of the scene.
Because the AR display device has a semi-transparent display screen, the real scenes received by the user's two eyes while roaming are the real environment the user sees directly: the first real scene and the second real scene are the real environment seen directly by the user's left eye and right eye, respectively.
Since the mobile terminal has a camera, the real scenes received by the two eyes while roaming can also be the real scene captured by the mobile terminal's camera. In that case, the scene roaming experience method comprises:
the mobile terminal displays the captured real scene on the left and right halves of its display screen; the real scene displayed on the left half is the first real scene and the real scene displayed on the right half is the second real scene, realizing a binocular split-screen effect;
the first and second real scenes on the mobile terminal's display screen are projected onto the semi-transparent display screen of the AR display device;
the first real scene received by the user's left eye is the superposition of the first real scene presented on the semi-transparent display screen and the real environment the left eye sees through that screen;
The second real scene and use that received second real scene of user's right eye is presented on translucent display screen
The superposition for the real scene that family right eye is seen through translucent display screen.
When the real scene captured by the camera is displayed in split screen, the real scene may appear squeezed along the length direction of the screen, so the first real scene and the second real scene displayed in the two halves of the screen need to be adjusted. Specifically, displaying the captured real scene on the left half and the right half of the mobile terminal display screen includes:
if the operating system of the mobile terminal is iOS, modifying the camera background transformation matrix so that the first virtual scene displayed on the left half of the mobile terminal display screen and the second virtual scene displayed on the right half of the mobile terminal display screen are scaled normally and their viewpoint centers are not offset;
if the operating system of the mobile terminal is Android, modifying the scaling and offset of the camera background so that the first virtual scene displayed on the left half of the mobile terminal display screen and the second virtual scene displayed on the right half of the mobile terminal display screen are scaled normally and their viewpoint centers are not offset.
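The platform-specific fix above amounts to one correction: when the camera background is drawn into a half-width viewport, it must be rescaled along the squeezed axis with no translation, so the aspect ratio is restored and the viewpoint center stays put. The following is a minimal sketch of that math; it is an assumption for illustration and shows neither the actual iOS nor the actual Android API.

```python
import numpy as np

def half_viewport_background_matrix():
    """Corrective 4x4 transform for a camera background drawn into the
    left or right half of the screen: the half viewport squeezes the
    image 2x horizontally, so scale x by 2 (cropping the sides) and
    keep the translation at zero so the viewpoint center is not offset."""
    m = np.eye(4)
    m[0, 0] = 2.0  # undo the horizontal squeeze of the half-width viewport
    return m

def apply_to_uv(m, u, v):
    """Apply the correction to a background coordinate centered at (0, 0)."""
    p = m @ np.array([u, v, 0.0, 1.0])
    return float(p[0]), float(p[1])

m = half_viewport_background_matrix()
print(apply_to_uv(m, 0.0, 0.0))   # → (0.0, 0.0): the center does not move
```

The key property, on both platforms, is that the matrix has a pure scale and zero offset: the image center (the viewpoint center) maps to itself, while off-center coordinates are stretched back to the correct aspect ratio.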
When a user roams using the above mixed-reality-based scene roaming experience system and method, the virtual scene corresponds to the real scene: while seeing the virtual scene, the user can also see the real environment through the semi-transparent display screen. Motion sickness can thus be effectively avoided, and an immersive roaming experience can be enjoyed.
The above mixed-reality-based scene roaming experience system is illustrated below by taking the roaming of an unfurnished apartment as an example. During the experience, the user wears an AR display device (for example, a mirror-type AR headset) and places a mobile phone equipped with environment sensing and motion tracking technology directly on the bracket of the AR display device.
First, the mobile phone sends a request to download the interior decoration design scheme of the unfurnished apartment, that is, the 3D model and related data of the virtual scene. After receiving the request, the remote server packages the simplified 3D model corresponding to the virtual scene, together with its occlusion-culling baking result, into a scene resource package for the phone to download.
Then, after the phone downloads the resource package, the received virtual room is matched with the captured real room, completing the initialization before roaming.
Next, the phone updates the first viewpoint and the second viewpoint according to the motion data acquired in real time, renders the received 3D model into a first virtual room and a second virtual room with the first viewpoint and the second viewpoint in real time, and displays them in split screen on the phone screen; meanwhile, the first real room and the second real room captured by the phone are also displayed in split screen on the phone screen.
Finally, the virtual renderings and the real images on the phone screen are projected onto the semi-transparent display screen of the AR display device, and the user enjoys roaming the virtual room while observing the real room and the virtual room simultaneously on the semi-transparent display screen. Because the user can also see the real room through the semi-transparent display screen while seeing the virtual room, no motion sickness arises, and an immersive roaming experience is obtained.
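The per-frame viewpoint update in this example can be illustrated with a small sketch: the two eye viewpoints are derived from the tracked head position by offsetting half the interpupillary distance (IPD) along the head's right direction, so their Euclidean separation equals the IPD. The math and names below are illustrative assumptions, not code from the patent.

```python
import numpy as np

def eye_viewpoints(head_pos, right_dir, ipd=0.064):
    """Offset the tracked head position by +/- half the IPD along the
    head's (normalized) right direction to obtain the first (left-eye)
    and second (right-eye) viewpoints."""
    head_pos = np.asarray(head_pos, dtype=float)
    right_dir = np.asarray(right_dir, dtype=float)
    right_dir = right_dir / np.linalg.norm(right_dir)
    half = 0.5 * ipd * right_dir
    return head_pos - half, head_pos + half

left, right = eye_viewpoints([0.0, 1.6, 0.0], [1.0, 0.0, 0.0])
print(np.linalg.norm(right - left))   # → 0.064, the assumed IPD
```

Each viewpoint then feeds one rendering camera, producing the first and second virtual rooms that are shown on the left and right halves of the phone screen.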
Another embodiment of the present invention provides a mixed-reality-based scene roaming experience method, comprising the following steps:
receiving virtual scene data, acquiring real scene data in real time, and matching the virtual scene data with the real scene data acquired in real time, so that the virtual scene and the real scene are equal in size;
obtaining the user's motion data in real time, updating the first viewpoint and the second viewpoint according to the obtained motion data, and rendering the received virtual scene data into a first virtual scene and a second virtual scene with the first viewpoint and the second viewpoint in real time, the Euclidean distance between the first viewpoint and the second viewpoint being close to the user's interpupillary distance;
the user's left and right eyes respectively receive the first real scene and the second real scene, and respectively interact with the first virtual scene and the second virtual scene, realizing the roaming experience of the scene.
This method obtains the real scene in real time and renders the virtual scene in real time according to real-time motion information; the real scene is combined with the virtual scene, which effectively avoids motion sickness while providing the user with an immersive roaming experience.
The specific embodiments described above explain the technical solution and beneficial effects of the present invention in detail. It should be understood that the foregoing is merely the preferred embodiment of the present invention and is not intended to limit the invention; any modification, supplement, or equivalent replacement made within the scope of the principles of the present invention shall be included in the protection scope of the present invention.
Claims (10)
1. A mixed-reality-based scene roaming experience method, the scene roaming experience method using a mobile terminal and an AR display device, the mobile terminal having environment sensing and motion tracking technology and the AR display device having a semi-transparent display screen;
the scene roaming experience method comprising the following steps:
the mobile terminal receives virtual scene data, acquires real scene data in real time, and matches the virtual scene data with the real scene data acquired in real time, so that the virtual scene corresponds to the real scene;
the mobile terminal updates a first viewpoint and a second viewpoint according to motion data acquired in real time, renders the received virtual scene data into a first virtual scene and a second virtual scene with the first viewpoint and the second viewpoint in real time, and displays them on the left half and the right half of the mobile terminal display screen, realizing a binocular split-screen effect;
the first virtual scene and the second virtual scene on the mobile terminal display screen are projected onto the semi-transparent display screen of the AR display device;
the user's left and right eyes respectively receive a first real scene and a second real scene, and respectively interact with the first virtual scene and the second virtual scene presented on the semi-transparent display screen, realizing the roaming experience of the scene.
2. The mixed-reality-based scene roaming experience method according to claim 1, characterized in that matching the virtual scene data with the real scene data acquired in real time comprises:
determining virtual marker points in the virtual scene data;
preparing a marker picture and placing the marker picture in the real scene, so that a corner of the marker picture corresponds to the position of a virtual marker point in the virtual scene;
using the camera of the mobile terminal to acquire an image containing the marker picture, and emitting a ray from the camera toward the corner of the marker picture that corresponds to the position of the virtual marker point; the intersection of the ray with the ground identified by the environment sensing and motion tracking technology of the mobile terminal is a real marker point, the number of real marker points being equal to the number of virtual marker points;
calculating the virtual centroid of all virtual marker points and the real centroid of all real marker points;
translating the virtual space and the virtual marker points according to the coordinate difference between the real centroid and the virtual centroid, so that the virtual centroid coincides with the real centroid;
rotating the virtual space and the virtual marker points about the real centroid according to the angle formed by a virtual marker point, the real centroid, and the real marker point corresponding to that virtual marker point, thereby completing the matching of the virtual scene data with the real scene data acquired in real time.
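The alignment recited in claim 2 (translate so the centroids coincide, then rotate about the shared centroid by the marker angle) can be sketched in 2D on the ground plane. The patent publishes no code, so the reading below is an assumption for illustration only.

```python
import math

def centroid(points):
    """Centroid of a list of 2D (x, y) points."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def align(virtual_pts, real_pts):
    """Align virtual marker points to real marker points:
    1. translate so the virtual centroid coincides with the real centroid;
    2. rotate about that centroid by the angle between the first virtual
       marker and its corresponding real marker."""
    vc, rc = centroid(virtual_pts), centroid(real_pts)
    dx, dy = rc[0] - vc[0], rc[1] - vc[1]
    moved = [(x + dx, y + dy) for x, y in virtual_pts]
    a = math.atan2(real_pts[0][1] - rc[1], real_pts[0][0] - rc[0]) \
        - math.atan2(moved[0][1] - rc[1], moved[0][0] - rc[0])
    c, s = math.cos(a), math.sin(a)
    return [(rc[0] + c * (x - rc[0]) - s * (y - rc[1]),
             rc[1] + s * (x - rc[0]) + c * (y - rc[1])) for x, y in moved]
```

For instance, a unit square that has been translated by (5, 5) and rotated 90° about its centroid is recovered exactly. In the claimed method the same translation and rotation are applied to the whole virtual space, not just to the marker points.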
3. The mixed-reality-based scene roaming experience method according to claim 1, characterized in that rendering the received virtual scene data into the first virtual scene and the second virtual scene with the first viewpoint and the second viewpoint comprises:
after the received virtual scene data are rendered with the first viewpoint and the second viewpoint, modifying the camera perspective projection matrix of the virtual scene, so that the first virtual scene displayed on the left half of the mobile terminal display screen and the second virtual scene displayed on the right half of the mobile terminal display screen are scaled normally and their viewpoint centers are not offset.
4. The mixed-reality-based scene roaming experience method according to claim 1 or 3, characterized in that the mobile terminal renders the 3D model of the received virtual scene using a PBR material system, and after rendering, applies anti-aliasing, ambient occlusion, full-screen bloom, and tone mapping to the resulting first virtual scene and second virtual scene.
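As a small illustration of the tone-mapping step named in this claim, the Reinhard operator below compresses HDR radiance values into the displayable [0, 1) range. The claim does not specify which operator is used; Reinhard is an assumed example.

```python
def reinhard_tonemap(hdr_values):
    """Reinhard tone mapping L / (1 + L): compresses arbitrarily large
    HDR radiance values into [0, 1) while leaving dark values nearly
    unchanged."""
    return [x / (1.0 + x) for x in hdr_values]

print(reinhard_tonemap([0.0, 1.0]))  # → [0.0, 0.5]
```

Dark values pass through almost untouched while highlights are compressed below 1.0, which is what lets a PBR-rendered HDR frame be shown on the phone's low-dynamic-range screen.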
5. The mixed-reality-based scene roaming experience method according to claim 1, characterized in that the first real scene and the second real scene are, respectively, the real scene seen directly by the user's left eye and the real scene seen directly by the user's right eye.
6. The mixed-reality-based scene roaming experience method according to claim 1, characterized in that the scene roaming experience method comprises:
the mobile terminal displays the captured real scene on the left half and the right half of the mobile terminal display screen, the real scene displayed on the left half being the first real scene and the real scene displayed on the right half being the second real scene, realizing the binocular split-screen effect;
the first real scene and the second real scene on the mobile terminal display screen are projected onto the semi-transparent display screen of the AR display device;
the first real scene received by the user's left eye is the superposition of the first real scene presented on the semi-transparent display screen and the real scene seen by the user's left eye through the semi-transparent display screen;
the second real scene received by the user's right eye is the superposition of the second real scene presented on the semi-transparent display screen and the real scene seen by the user's right eye through the semi-transparent display screen.
7. The mixed-reality-based scene roaming experience method according to claim 6, characterized in that displaying the captured real scene on the left half and the right half of the mobile terminal display screen comprises:
if the operating system of the mobile terminal is iOS, modifying the camera background transformation matrix so that the first virtual scene displayed on the left half of the mobile terminal display screen and the second virtual scene displayed on the right half of the mobile terminal display screen are scaled normally and their viewpoint centers are not offset;
if the operating system of the mobile terminal is Android, modifying the scaling and offset of the camera background so that the first virtual scene displayed on the left half of the mobile terminal display screen and the second virtual scene displayed on the right half of the mobile terminal display screen are scaled normally and their viewpoint centers are not offset.
8. A mixed-reality-based scene roaming experience system, comprising a server, a mobile terminal communicating with the server, and an AR display device, the mobile terminal having environment sensing and motion tracking technology and the AR display device having a semi-transparent display screen, characterized in that:
the server simplifies and stores the 3D model of the constructed virtual scene, performs occlusion-culling baking on the 3D model of the virtual scene according to the spatial position relationships of the virtual scene, and stores the processing result;
after receiving a request command sent by the mobile terminal, the server packages the simplified 3D model corresponding to the virtual scene in the request command, together with its occlusion-culling baking result, into a scene resource package for the mobile terminal to download;
after downloading the scene resource package from the server, the mobile terminal, in combination with the AR display device, realizes the scene roaming experience using the steps of the scene roaming experience method according to claims 1 to 8.
9. The mixed-reality-based scene roaming experience system according to claim 1, characterized in that when the simplified 3D model corresponding to the virtual scene is rendered, the simplified 3D model is rendered according to the occlusion-culling baking result.
10. A mixed-reality-based scene roaming experience method, comprising the following steps:
receiving virtual scene data, acquiring real scene data in real time, and matching the virtual scene data with the real scene data acquired in real time, so that the virtual scene and the real scene are equal in size;
obtaining the user's motion data in real time, updating a first viewpoint and a second viewpoint according to the obtained motion data, and rendering the received virtual scene data into a first virtual scene and a second virtual scene with the first viewpoint and the second viewpoint in real time, the Euclidean distance between the first viewpoint and the second viewpoint being close to the user's interpupillary distance;
the user's left and right eyes respectively receive a first real scene and a second real scene, and respectively interact with the first virtual scene and the second virtual scene, realizing the roaming experience of the scene.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810586111.3A CN108830939B (en) | 2018-06-08 | 2018-06-08 | Scene roaming experience method and experience system based on mixed reality |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108830939A true CN108830939A (en) | 2018-11-16 |
CN108830939B CN108830939B (en) | 2022-06-10 |
Family
ID=64143311
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810586111.3A Active CN108830939B (en) | 2018-06-08 | 2018-06-08 | Scene roaming experience method and experience system based on mixed reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108830939B (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109725728A (en) * | 2018-12-29 | 2019-05-07 | 三星电子(中国)研发中心 | A kind of the display modification method and device of AR equipment |
CN109901713A (en) * | 2019-02-25 | 2019-06-18 | 山东大学 | A kind of multi-person synergy assembly system and method |
CN110164233A (en) * | 2019-06-06 | 2019-08-23 | 西安勺子智能科技有限公司 | A kind of Portable individual weapon simulated training system based on MR |
CN110517580A (en) * | 2019-07-26 | 2019-11-29 | 北京林业大学 | A kind of manifolding formula interaction landscape system for the reconstruct of postindustrial resource landscape |
CN110889889A (en) * | 2019-11-12 | 2020-03-17 | 四川大学 | Oblique photography modeling data generation method applied to immersive display equipment |
CN111744180A (en) * | 2020-06-29 | 2020-10-09 | 完美世界(重庆)互动科技有限公司 | Method and device for loading virtual game, storage medium and electronic device |
CN111915736A (en) * | 2020-08-06 | 2020-11-10 | 黄得锋 | AR interaction control system, device and application |
CN112631424A (en) * | 2020-12-18 | 2021-04-09 | 上海影创信息科技有限公司 | Gesture priority control method and system and VR glasses thereof |
CN113190142A (en) * | 2021-04-28 | 2021-07-30 | 北京航空航天大学云南创新研究院 | Cubic model-based 3D environment interaction method and device |
WO2021249390A1 (en) * | 2020-06-12 | 2021-12-16 | 贝壳技术有限公司 | Method and apparatus for implementing augmented reality, storage medium, and electronic device |
CN113941138A (en) * | 2020-08-06 | 2022-01-18 | 黄得锋 | AR interaction control system, device and application |
CN113963100A (en) * | 2021-10-25 | 2022-01-21 | 广东工业大学 | Three-dimensional model rendering method and system for digital twin simulation scene |
CN115482325A (en) * | 2022-09-29 | 2022-12-16 | 北京百度网讯科技有限公司 | Picture rendering method, device, system, equipment and medium |
WO2023011216A1 (en) * | 2021-08-06 | 2023-02-09 | 华为技术有限公司 | Device hot-plugging method and terminal device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1802586A (en) * | 2003-06-12 | 2006-07-12 | 西门子共同研究公司 | Calibrating real and virtual views |
CN105938250A (en) * | 2016-05-12 | 2016-09-14 | 深圳增强现实技术有限公司 | Multilayer augmented reality smart glasses |
CN105955456A (en) * | 2016-04-15 | 2016-09-21 | 深圳超多维光电子有限公司 | Virtual reality and augmented reality fusion method, device and intelligent wearable equipment |
CN106919262A (en) * | 2017-03-20 | 2017-07-04 | 广州数娱信息科技有限公司 | Augmented reality equipment |
Non-Patent Citations (1)
Title |
---|
MICHAEL FIGL ET AL.: "A Fully Automated Calibration Method for an Optical See-Through Head-Mounted Operating Microscope With Variable Zoom and Focus", 《IEEE TRANSACTIONS ON MEDICAL IMAGING》 * |
Also Published As
Publication number | Publication date |
---|---|
CN108830939B (en) | 2022-06-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108830939A (en) | A kind of scene walkthrough experiential method and experiencing system based on mixed reality | |
CN102540464B (en) | Head-mounted display device which provides surround video | |
CN112967390B (en) | Scene switching method and device and storage medium | |
CN103918011B (en) | Rendering system, rendering server and its control method | |
CN106101741A (en) | Internet video live broadcasting platform is watched the method and system of panoramic video | |
CN109461210B (en) | Panoramic roaming method for online home decoration | |
JP2016018560A (en) | Device and method to display object with visual effect | |
US20190213975A1 (en) | Image processing system, image processing method, and computer program | |
CN107430785A (en) | For showing the method and system of three dimensional object | |
CN107862718B (en) | 4D holographic video capture method | |
CN110517356A (en) | Realize system, the method and apparatus of the three-dimensional enhanced reality of multi-channel video fusion | |
CN106125491B (en) | More optical projection systems | |
CN109255841A (en) | AR image presentation method, device, terminal and storage medium | |
CN108509173A (en) | Image shows system and method, storage medium, processor | |
CN106210856A (en) | Internet video live broadcasting platform is watched the method and system of 3D panoramic video | |
GB2590871A (en) | System and method for providing a computer-generated environment | |
CN107005689B (en) | Digital video rendering | |
CN208506731U (en) | Image display systems | |
US11601636B2 (en) | Methods, systems, and media for generating an immersive light field video with a layered mesh representation | |
CN105282535A (en) | 3D projection system and 3D projection method in 3D space environment | |
WO2021139456A1 (en) | Calculation method and system for dynamic rendering of image having infinite visual boundary | |
CN110060349B (en) | Method for expanding field angle of augmented reality head-mounted display equipment | |
CN106547557A (en) | A kind of multi-screen interactive exchange method based on virtual reality and bore hole 3D | |
CN105959677A (en) | Naked eye 3D interaction method based on Android equipment | |
CN110458929A (en) | A kind of interiors rendering method and system based on Three.js |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |