CN109727318B - Method for realizing transfer door effect and presenting VR panoramic video picture in AR equipment - Google Patents


Info

Publication number: CN109727318B (application number CN201910024597.6A)
Authority: CN (China)
Prior art keywords: scene, virtual, model, instruction, wall
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN109727318A
Inventor: 吴锦坚
Current assignee: Visual Innovation Technology Co ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Visual Innovation Technology Co ltd
Application filed by Visual Innovation Technology Co ltd, with priority to CN201910024597.6A; CN109727318A published, application granted, CN109727318B published.

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention relates to the technical field of virtual reality (VR) and augmented reality (AR) development, and provides a method for realizing a transfer door effect and presenting VR panoramic video pictures in an AR device. First, a pre-stored virtual object model is loaded according to a loading instruction; the virtual scene is constructed from the model according to a scene construction instruction; an initial engineering file is generated from the constructed virtual scene and adapted into an AR engineering file. Next, the scene's pre-run state is configured according to the scene construction instruction. Then a shader that does not draw color channels is written and assigned to the outer wall model, so that only the model picture of the virtual door and the physical scene picture returned by the external camera are retained. Finally the scene is run and the final engineering file is packaged and output. The invention integrates the real scene, the AR scene and the VR panoramic video picture, giving the user the sensation of crossing space and time and of being present in the scene, greatly improving the user experience.

Description

Method for realizing transfer door effect and presenting VR panoramic video picture in AR equipment
Technical Field
The invention relates to the technical field of virtual reality and augmented reality development, and in particular to a method for realizing a transfer door effect and presenting VR panoramic video pictures in an AR device.
Background
VR is a computer simulation system for creating a virtual world, focused on fully immersing the user in a virtual environment; AR is a technology that calculates the position and angle of a camera image in real time and superimposes corresponding images, videos and 3D models on it, focused on the interaction between the real world and virtual objects.
A panoramic video is a video shot omnidirectionally, through 360 degrees, with a 3D camera. VR panoramic video, as a novel video format, refers to video in which the on-site environment is faithfully recorded with professional VR photography and then post-processed by computer, yielding a video with a three-dimensional spatial display function.
At present, VR and AR have developed independently in their respective ecosystems as two different virtual technologies; the two have not been combined so that a VR panoramic video is presented in an AR device while the interaction between virtual objects and the real environment provided by AR is preserved. Meanwhile, current VR panoramic video technology has no ability to capture the real world or to make virtual objects interact with it.
In summary, the prior art lacks a mature technical solution that tightly combines VR panoramic video, AR virtual objects and real-world interaction.
Disclosure of Invention
In order to solve the above problems of the prior art, the present invention provides a method for implementing a transfer door effect and presenting VR panoramic video pictures in an AR device.
The technical scheme adopted by the invention is as follows:
a method for implementing a transfer door effect and presenting VR panoramic video pictures in an AR device, comprising the following steps:
s1, receiving a loading instruction from a human-computer interface and loading the pre-stored virtual object models accordingly;
s2, receiving a scene construction instruction from the human-computer interface and constructing the virtual object models accordingly into the model structures of a virtual door, a window, an outer wall and a sky box in the scene;
s3, generating an initial engineering file from the constructed virtual scene and adapting it into an AR engineering file;
s4, configuring the scene's pre-run state according to the scene construction instruction, attaching the VR panoramic video to the sky box, and turning off the rendering of all objects except the virtual door, window and outer wall;
s5, writing a shader that does not draw color channels and assigning it to the outer wall model, so that only the model picture of the virtual door and the physical scene picture returned by the external camera are retained;
s6, receiving a scene running instruction from the human-computer interface and running the scene, in which the virtual door and the physical scene picture appear;
s7, receiving a virtual door opening instruction from the human-computer interface, and opening the door and rendering the objects inside it, so that the opened virtual door, the VR panoramic video picture inside the door and the physical scene picture outside the door appear in the scene;
s8, packaging and outputting the final engineering file.
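The S1 to S8 flow above can be sketched as plain functions. This is a minimal illustrative outline only; all function and object names are hypothetical, not taken from the patent or from the Unity3D API, and the patent itself performs these steps inside the Unity3D engine.

```python
# Hypothetical sketch of steps S1, S2 and S4 as plain functions.

def load_models(names):
    # S1: load the pre-stored virtual object models named in the instruction
    return {name: {"name": name, "rendered": True} for name in names}

def build_scene(models, build_instruction):
    # S2: assemble door, window, outer wall and sky box into one scene
    return {part: models[part] for part in build_instruction}

def configure_pre_run(scene, keep=("door", "window", "wall")):
    # S4: turn off rendering for everything except door, window and outer wall
    for name, obj in scene.items():
        obj["rendered"] = name in keep
    return scene

parts = ["door", "window", "wall", "skybox"]
scene = configure_pre_run(build_scene(load_models(parts), parts))
print([n for n, o in scene.items() if o["rendered"]])  # ['door', 'window', 'wall']
```

In the real method, "turning rendering off" would correspond to disabling renderer components in the engine rather than toggling a flag in a dictionary.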
Preferably, steps S1-S8 are all completed in the Unity3D game engine.
Preferably, in steps S2 and S4, a construction operation is used to create virtual objects; a virtual object comprises a model, lighting, maps and materials.
Preferably, in step S3, the AR technical framework is loaded before the initial engineering file is adapted into the AR engineering file; the AR technical framework is the ARKit framework and/or the ARCore framework.
Preferably, in step S2, the virtual door in the scene is constructed as the exchange entrance between the virtual scene and the real scene; the window is constructed as the exchange window between the AR scene and the VR scene; the outer wall is constructed as the medium isolating the virtual scene from the real scene; and the sky box is constructed as a cube-shaped polygonal model with spherical UV mapping.
Preferably, in step S4, the VR panoramic video is attached to the sky box by the following specific steps:
s41, receiving a loading instruction from the human-computer interface and loading a pre-stored cube-shaped polygonal model with spherical UV mapping;
s42, receiving an assignment instruction from the human-computer interface and assigning to the model loaded in S41 a material that uses self-illumination and accepts a UV map;
s43, mapping the VR panoramic image onto the polygonal model and displaying the final panoramic picture.
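The spherical UV mapping that step S41 relies on can be illustrated with the standard equirectangular formula, which maps a view direction to texture coordinates on the panorama. This is a hedged sketch, not the patent's actual implementation; `spherical_uv` is an illustrative name.

```python
import math

def spherical_uv(x, y, z):
    """Map a view direction on the sky box to equirectangular (u, v) in [0, 1]."""
    length = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / length, y / length, z / length
    u = 0.5 + math.atan2(z, x) / (2 * math.pi)  # longitude across the image
    v = 0.5 + math.asin(y) / math.pi            # latitude up the image
    return u, v

print(spherical_uv(1, 0, 0))  # a horizon direction maps to the image centre row
print(spherical_uv(0, 1, 0))  # straight up maps to the top row of the panorama
```

Assigning such UVs to the cube model is what lets a flat equirectangular panoramic frame wrap seamlessly around the viewer.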
Preferably, the specific steps of step S5 are as follows:
s51, writing a shader that does not draw color channels in a GPU programming language;
s52, receiving an attribute instruction from the human-computer interface and assigning the shader to the outer wall model so that the outer wall becomes a transparent mask: the models inside the outer wall are hidden when viewed from the front of the wall and displayed normally when viewed from the back;
s53, the virtual door is not covered by the outer wall mask area and is rendered and displayed normally in the scene.
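The occlusion trick in S51 to S53 can be demonstrated with a toy one-dimensional framebuffer: a wall that writes depth but no color hides everything behind it without covering the camera feed, while the doorway gap lets the interior show through. This is an illustrative reconstruction under the assumption that the mask works through ordinary depth testing; the layout and names are invented.

```python
# Toy 1-D framebuffer illustrating a depth-only "mask" wall.
WIDTH = 10
color = ["camera"] * WIDTH          # background = real-world camera picture
depth = [float("inf")] * WIDTH

def draw(span, z, label, write_color=True):
    """Rasterize a span of pixels at depth z; optionally skip the color write."""
    for px in range(*span):
        if z < depth[px]:           # standard depth test
            depth[px] = z
            if write_color:
                color[px] = label

draw((0, 4), 1.0, "wall", write_color=False)   # wall left of the door opening
draw((6, 10), 1.0, "wall", write_color=False)  # wall right of the door opening
draw((0, 10), 2.0, "interior")                 # VR interior behind the wall

print(color)
# only the door pixels (4 and 5) show the interior; the rest keep the camera feed
```

The same idea in a real engine would be a shader with the color write mask disabled, rendered before the interior geometry so its depth values occlude everything except the doorway.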
Preferably, in step S8, the final AR engineering file is packaged and output to the human-computer interface and the VR display device.
The beneficial effects of the invention are as follows:
according to the invention, a virtual transmission gate is constructed in a real scene shot by an external camera and is used as a real and virtual junction, so that the transmission gates in art works such as a plurality of science fiction movies and videos, science fiction literature and the like can be displayed in an AR mode, and a user can feel the unsightly spatial transformation in a tangential manner. Meanwhile, the VR panoramic video technology is combined at the same time, the VR panoramic video of the photographed real scene is added outside the AR virtual three-dimensional scene, the virtual scene and the real scene, the AR scene and the VR panorama are fused to a high degree, the user can generate the space-time crossing and the feeling of being in the scene, the inconvenience caused by the fact that the VR panorama does not have the capability of capturing the real world and the virtual things interact with the real world is avoided, the practicability is higher, the user experience is improved greatly, and the VR panoramic video capturing device is suitable for popularization and use.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow diagram of an embodiment.
Detailed Description
The invention is further illustrated by the following description of specific embodiments in conjunction with the accompanying drawings.
Before making a description of the embodiments, it is necessary to explain some terminology:
unity3D: unity is a multi-platform comprehensive game development tool developed by Unity Technologies, which enables players to easily create types of interactive content such as three-dimensional video games, building visualizations, real-time three-dimensional animations, etc., and is a fully integrated professional game engine.
VR: the full-scale Virtual Reality, chinese is a Virtual Reality, is a computer simulation system capable of creating and experiencing a Virtual world, utilizes a computer to generate a simulation environment, is a system simulation of multi-source information fusion, interactive three-dimensional dynamic view and entity behaviors, and enables a user to be immersed in the simulation environment.
AR: the full scale Augmented Reality, chinese is an augmented reality technology, which calculates the position and angle of a camera image in real time and adds corresponding images, video and 3D models, and the goal of the technology is to fit the virtual world around the real world and interact with the virtual world on the screen.
Panoramic video: is a video shot by a 3D camera in 360 degrees in all directions.
VR panoramic video: the method is characterized in that a professional VR photographing function is used for truly recording the field environment, and then a computer is used for post-processing, so that a video capable of realizing a three-dimensional space display function is formed.
Example 1:
a method of implementing a transfer gate effect and rendering VR panoramic video frames in an AR device, comprising the steps of:
s1, receiving a loading instruction from a human-computer interface, and loading a pre-stored virtual object model according to the loading instruction.
S2, receiving a scene construction instruction from the human-computer interface and constructing the virtual object models accordingly into the model structures of a virtual door, a window, an outer wall and a sky box in the scene. In this embodiment these model structures can present a multi-level, interrelated structure, and the scene is built at real-world size, forming a virtual space large enough for the audience to roam in.
S3, generating an initial engineering file from the constructed virtual scene and adapting it into an AR engineering file; with the AR engineering file, the various functions and experiences of augmented reality technology can be realized.
In this embodiment, in step S3, the AR technical framework is loaded before the initial engineering file is adapted into the AR engineering file; the AR technical framework is the ARKit framework and/or the ARCore framework.
S4, configuring the scene running state according to the scene construction instruction, attaching the VR panoramic video to the cube-shaped sky box, and turning off the rendering of all objects except the virtual door, window and outer wall.
In this embodiment, in steps S2 and S4, a construction operation is used to create virtual objects; a virtual object comprises a model, lighting, maps and materials.
In this embodiment, in step S4, before the scene is run, the rendering of all virtual objects except the virtual door, window and outer wall is turned off by control instructions issued from the CPU.
S5, writing a shader that does not draw color channels and assigning it to the outer wall model, so that only the model picture of the virtual door and the physical scene picture returned by the external camera are retained. It should be understood that the external camera may be, but is not limited to, the camera of the AR device; at this point the user sees only a single door standing in the real scene.
S6, receiving a scene running instruction from the human-computer interface and running the scene, in which the virtual door and the physical scene picture appear. While the scene is running, the user can walk all around the virtual door and will see only the virtual door and the physical scene picture; no other virtual scenery is visible.
In this embodiment, in step S6, the rendering of all virtual objects except the virtual door is turned off by control instructions issued from the CPU; at this time only the virtual door and the real scene picture returned by the external camera are displayed in the AR device, presenting the transfer door effect.
S7, receiving a virtual door opening instruction from the human-computer interface, the objects inside the virtual door being opened and rendered under the control of a program script, so that the opened virtual door, the VR panoramic video picture inside the door and the physical scene picture outside the door appear in the scene. The user can now walk through the virtual door into the virtual space, reach an open area such as a window, and view the VR panorama through the virtual window.
In step S7, the user may send the virtual door opening instruction through a touch on the human-computer interface that triggers the door-opening animation; the internal mechanism of the running scene is as follows:
the CPU receives the virtual door opening instruction, sends it to the model control component, and starts the pre-loaded door-opening animation and the rendering operations on the models. At this time the transparent mask of the outer wall is still in effect, and only the virtual door, the virtual space inside it and the real scene picture returned by the external camera are displayed in the AR device.
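The door-opening flow just described can be sketched as a small state change: the instruction starts the animation and re-enables rendering of the objects inside the door. The class and method names below are hypothetical, chosen only to illustrate the mechanism.

```python
# Illustrative sketch of handling the S7 door-opening instruction.

class VirtualDoor:
    def __init__(self, interior_objects):
        self.open = False
        # rendering is off for the interior until the door opens
        self.interior = {name: False for name in interior_objects}

    def handle_open_instruction(self):
        self.open = True                 # start the pre-loaded door animation
        for name in self.interior:
            self.interior[name] = True   # render the objects inside the door

door = VirtualDoor(["skybox", "floor", "window"])
door.handle_open_instruction()
print(door.open, all(door.interior.values()))  # True True
```

The outer wall mask is unaffected by this state change, which is why only the doorway reveals the interior.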
In step S7, when the user walks into the virtual space through the virtual gate, the internal mechanism of the scene operation is as follows:
s71, a polygonal model serving as the carrier of the VR panoramic video completely wraps almost all virtual models, including the virtual door, window and outer wall, to create the immersive VR experience;
s72, the transparent mask of the outer wall is still in effect, and a normal outer wall is wrapped around the user's range of movement in front of the virtual door, to prevent the VR panorama from being exposed outside the transfer door effect;
s73, the open areas of the space, i.e. areas such as windows and balconies, carry no outer wall mask, preserving the normal view of the VR panorama.
S8, packaging and outputting the final engineering file.
In this embodiment, in step S8, after the final AR engineering file is packaged, the final AR engineering file is output to the human-computer interface and the VR display device.
In this embodiment, steps S1-S8 are all completed in the Unity3D game engine.
Example 2:
a method of implementing a transfer gate effect and rendering VR panoramic video frames in an AR device, comprising the steps of:
s1, receiving a loading instruction from a human-computer interface, and loading a pre-stored virtual object model according to the loading instruction.
S2, receiving a scene construction instruction from the human-computer interface and constructing the virtual object models accordingly into the model structures of a virtual door, a window, an outer wall and a sky box in the scene. In this embodiment these model structures can present a multi-level, interrelated structure, and the scene is built at real-world size, forming a virtual space large enough for the audience to roam in.
In this embodiment, in step S2, a pre-stored virtual object model may be loaded on receipt of a loading instruction from the human-computer interface, or the object models in the scene may be constructed according to the scene construction instruction. The virtual door in the constructed scene is the exchange entrance between the virtual scene and the real scene, and is not limited to the various forms of everyday doors; the window is the exchange window between the AR scene and the VR scene, and is not limited to the various forms of everyday windows; the outer wall is the medium isolating the virtual scene from the real scene, and is not limited to the various forms of everyday outer walls; the sky box is a cube-shaped polygonal model with spherical UV mapping, loaded as a pre-stored model on receipt of a loading instruction from the human-computer interface.
S3, generating an initial engineering file according to the constructed virtual scene, and adapting the initial engineering file to an AR engineering file; various functions and experiences related to the augmented reality technology can be realized by adopting the AR engineering file.
In this embodiment, in step S3, the AR technical framework is loaded before the initial engineering file is adapted into the AR engineering file; the AR technical framework is the ARKit framework and/or the ARCore framework.
S4, configuring the scene running state according to the scene construction instruction, attaching the VR panoramic video to the cube-shaped sky box, and turning off the rendering of all objects except the virtual door, window and outer wall.
In this embodiment, in steps S2 and S4, a construction operation is used to create virtual objects; a virtual object comprises a model, lighting, maps and materials.
In this embodiment, in step S4, the VR panoramic video is attached to the sky box by the following specific steps:
s41, receiving a loading instruction from the human-computer interface and loading a pre-stored cube-shaped polygonal model with spherical UV mapping;
s42, receiving an assignment instruction from the human-computer interface and assigning to the model loaded in S41 a material that uses self-illumination and accepts a UV map;
s43, mapping the VR panoramic image onto the polygonal model and displaying the final panoramic picture, which is an intact and correct panorama.
In this embodiment, in step S4, before the scene is run, the rendering of all virtual objects except the virtual door, window and outer wall is turned off by control instructions issued from the CPU.
S5, writing a shader that does not draw color channels and assigning it to the outer wall model, so that only the model picture of the virtual door and the physical scene picture returned by the external camera are retained. It should be understood that the external camera may be, but is not limited to, the camera of the AR device; at this point the user sees only a single door standing in the real scene.
In this embodiment, the specific steps of step S5 are as follows:
s51, writing a shader that does not draw color channels in a GPU programming language;
s52, receiving an attribute instruction from the human-computer interface and assigning the shader to the outer wall model so that the outer wall becomes a transparent mask: the models inside the outer wall are hidden when viewed from the front of the wall and displayed normally when viewed from the back;
s53, the virtual door is not covered by the outer wall mask area and is rendered and displayed normally in the scene.
S6, receiving a scene running instruction from the human-computer interface and running the scene, in which the virtual door and the physical scene picture appear. While the scene is running, the user can walk all around the virtual door and will see only the virtual door and the physical scene picture; no other virtual scenery is visible.
In this embodiment, in step S6, the rendering of all virtual objects except the virtual door is turned off by control instructions issued from the CPU; at this time only the virtual door and the real scene picture returned by the external camera are displayed in the AR device, presenting the transfer door effect.
S7, receiving a virtual door opening instruction from the human-computer interface, the objects inside the virtual door being opened and rendered under the control of a program script, so that the opened virtual door, the VR panoramic video picture inside the door and the physical scene picture outside the door appear in the scene. The user can now walk through the virtual door into the virtual space, reach an open area such as a window, and view the VR panorama through the virtual window.
In step S7, the user may send the virtual door opening instruction through a touch on the human-computer interface that triggers the door-opening animation; the internal mechanism of the running scene is as follows:
the CPU receives the virtual door opening instruction, sends it to the model control component, and starts the pre-loaded door-opening animation and the rendering operations on the models. At this time the transparent mask of the outer wall is still in effect, and only the virtual door, the virtual space inside it and the real scene picture returned by the external camera are displayed in the AR device.
In step S7, when the user walks into the virtual space through the virtual gate, the internal mechanism of the scene operation is as follows:
s71, a polygonal model serving as the carrier of the VR panoramic video completely wraps almost all virtual models, including the virtual door, window and outer wall, to create the immersive VR experience;
s72, the transparent mask of the outer wall is still in effect, and a normal outer wall is wrapped around the user's range of movement in front of the virtual door, to prevent the VR panorama from being exposed outside the transfer door effect;
s73, the open areas of the space, i.e. areas such as windows and balconies, carry no outer wall mask, preserving the normal view of the VR panorama.
S8, packaging and outputting the final engineering file.
In this embodiment, in step S8, after the final AR engineering file is packaged, the final AR engineering file is output to the human-computer interface and the VR display device.
In this embodiment, steps S1-S8 are all completed in the Unity3D game engine.
In this embodiment, a virtual transfer door is constructed in the real scene captured by the external camera and serves as the junction between the real and the virtual, so that the transfer doors familiar from works of art such as science-fiction films, videos and literature can be presented in AR, letting the user experience a marvelous spatial transformation first-hand. At the same time, VR panoramic video technology is combined with this: a VR panoramic video of a recorded real scene is added outside the AR virtual three-dimensional scene, so that while the user is still marveling after stepping through the transfer door into the AR scene, the VR panoramic video outside the window carries the user into yet another world.
In use, this embodiment lets the user experience two different spatial transitions, with the real and the virtual interwoven; the virtual and real scenes, the AR scene and the VR panorama are fused, giving the user the sensation of crossing space and time and of being present in the scene. It avoids the inconvenience that a VR panorama alone cannot capture the real world or make virtual objects interact with it, is highly practical, greatly improves the user experience, and is suitable for popularization and use.
The invention is of great practical value in, but not limited to, industries such as real-estate show homes and interior decoration display. For these industries, the technical method provided by the invention can influence both the display effect and the display technology, and can play a great role in improving social productivity.
In this embodiment, in step S51, when the shader that does not draw color channels is written in a GPU programming language, the program code used may be, but is not limited to, the following:
[The shader source code is reproduced in the original publication only as images (Figure SMS_1 and Figure SMS_2).]
the foregoing description is only illustrative of the present invention and is not intended to limit the scope of the invention, and all equivalent structures or equivalent processes or direct or indirect application in other related technical fields are included in the scope of the present invention.

Claims (5)

1. A method for implementing a transfer door effect and presenting VR panoramic video pictures in an AR device, characterized in that it comprises the following steps:
s1, receiving a loading instruction from a human-computer interface and loading the pre-stored virtual object models accordingly;
s2, receiving a scene construction instruction from the human-computer interface and constructing the virtual object models accordingly into the model structures of a door, a window, an outer wall and a sky box in the scene;
s3, generating an initial engineering file from the constructed virtual scene and adapting it into an AR engineering file;
s4, configuring the scene running state according to the scene construction instruction, attaching the VR panoramic video to the sky box, and turning off the rendering of all objects except the door, window and outer wall;
s5, writing a shader that does not draw color channels and assigning it to the outer wall model, so that only the model picture of the virtual door and the physical scene picture returned by the external camera are retained;
s6, receiving a scene running instruction from the human-computer interface and running the scene, in which the virtual door and the physical scene picture appear;
s7, receiving a virtual door opening instruction from the human-computer interface, and opening the door and rendering the objects inside it, so that the opened virtual door, the VR panoramic video picture inside the door and the physical scene picture outside the door appear in the scene;
s8, packaging and outputting the final engineering file;
in step S2, the door in the scene is constructed as the exchange entrance between the virtual scene and the real scene; the window is constructed as the exchange window between the AR scene and the VR scene; the outer wall is constructed as the medium isolating the virtual scene from the real scene; and the sky box is constructed as a cube-shaped polygonal model with spherical UV mapping;
in step S4, the VR panoramic video is attached to the sky box by the following specific steps:
s41, receiving a loading instruction from the human-computer interface and loading a pre-stored cube-shaped polygonal model with spherical UV mapping;
s42, receiving an assignment instruction from the human-computer interface and assigning to the model loaded in S41 a material that uses self-illumination and accepts a UV map;
s43, mapping the VR panoramic image onto the polygonal model and displaying the final panoramic picture;
the specific steps of step S5 are as follows:
s51, writing a shader that does not draw color channels in a GPU programming language;
s52, receiving an attribute instruction from the human-computer interface and assigning the shader to the outer wall model so that the outer wall becomes a transparent mask: the models inside the outer wall are hidden when viewed from the front of the wall and displayed normally when viewed from the back;
s53, the virtual door is not covered by the outer wall mask area and is rendered and displayed normally in the scene;
in step S7, after the virtual door opening instruction is received from the human-computer interface, the internal mechanism of the running scene is as follows:
the CPU receives the virtual door opening instruction, sends it to the model control component, and starts the pre-loaded door-opening animation and the rendering operations on the models;
in step S7, when the user walks into the virtual space through the virtual gate, the internal mechanism of the scene operation is as follows:
s71, taking the virtual model as a polygonal model of the VR panoramic video carrier, and completely wrapping almost all virtual models including doors, windows and outer walls;
s72, wrapping a normal outer wall outside the user movement range in front of the virtual door to prevent VR panorama from being exposed outside the effect of the transmission door;
s73, the open area in the space is free of an outer wall shade, and viewing effects on VR panorama are kept.
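The wrapping arrangement of S71 to S73 amounts to deciding, from the viewer's position, whether the VR panorama or the physical scene should fill the view outside the door: once the user has walked through the virtual door into the carrier volume, the panorama surrounds the whole view; outside it, the masked outer wall keeps the panorama hidden except through the door. A minimal sketch, assuming an axis-aligned box as the carrier volume (a deliberate simplification of the actual wrapping model; all names are illustrative):

```python
def inside_portal_space(pos, box_min, box_max):
    """True if the viewer is inside the panorama carrier volume,
    here simplified to an axis-aligned bounding box."""
    return all(lo <= p <= hi for p, lo, hi in zip(pos, box_min, box_max))

def visible_background(pos, box_min, box_max):
    """What fills the view away from the door at this viewer position:
    inside, the panorama carrier wraps everything (S71); outside, the
    masked outer wall reveals the physical AR scene instead (S72/S73)."""
    if inside_portal_space(pos, box_min, box_max):
        return "vr_panorama"
    return "physical_scene"
```

For example, a viewer at the origin of a 10-unit box sees the panorama, while a viewer well outside it sees the physical scene, with the door acting as the only crossing point between the two states.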
2. The method for realizing a transfer door effect and presenting a VR panoramic video picture in an AR device according to claim 1, wherein: steps S1 to S8 are all completed in the Unity3D game engine.
3. The method for realizing a transfer door effect and presenting a VR panoramic video picture in an AR device according to claim 1 or 2, wherein: in step S2, a construction operation is used to create virtual objects; the virtual objects comprise models, lights, textures, and materials.
4. The method for realizing a transfer door effect and presenting a VR panoramic video picture in an AR device according to claim 1 or 2, wherein: in step S3, before the initial project file is adapted into the AR project file, the AR technical framework is loaded first; the AR technical framework is the ARKit framework and/or the ARCore framework.
5. The method for realizing a transfer door effect and presenting a VR panoramic video picture in an AR device according to claim 1, wherein: in step S8, after the final AR project file is packaged, it is output to the human-computer interface and the VR display equipment.
CN201910024597.6A 2019-01-10 2019-01-10 Method for realizing transfer door effect and presenting VR panoramic video picture in AR equipment Active CN109727318B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910024597.6A CN109727318B (en) 2019-01-10 2019-01-10 Method for realizing transfer door effect and presenting VR panoramic video picture in AR equipment

Publications (2)

Publication Number Publication Date
CN109727318A CN109727318A (en) 2019-05-07
CN109727318B true CN109727318B (en) 2023-04-28

Family

ID=66298987

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910024597.6A Active CN109727318B (en) 2019-01-10 2019-01-10 Method for realizing transfer door effect and presenting VR panoramic video picture in AR equipment

Country Status (1)

Country Link
CN (1) CN109727318B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110174950B (en) * 2019-05-28 2022-05-17 广州视革科技有限公司 Scene switching method based on transmission gate
CN110427107A (en) * 2019-07-23 2019-11-08 德普信(天津)软件技术有限责任公司 Virtually with real interactive teaching method and system, server, storage medium
CN112925982B (en) * 2021-03-12 2023-04-07 上海意略明数字科技股份有限公司 User redirection method and device, storage medium and computer equipment
CN112891940B (en) * 2021-03-16 2024-01-09 天津亚克互动科技有限公司 Image data processing method and device, storage medium and computer equipment
CN115990335A (en) * 2021-10-19 2023-04-21 北京字节跳动网络技术有限公司 Virtual scene construction method and device, electronic equipment, medium and product
CN114265496A (en) * 2021-11-30 2022-04-01 歌尔光学科技有限公司 VR scene switching method and device, VR head-mounted equipment and storage medium
CN117572997A (en) * 2024-01-15 2024-02-20 南京维赛客网络科技有限公司 Method, system and storage medium for mutual transmission in model space and panoramic space

Citations (1)

Publication number Priority date Publication date Assignee Title
CN103426195A (en) * 2013-09-09 2013-12-04 天津常青藤文化传播有限公司 Method for generating three-dimensional virtual animation scenes watched through naked eyes

Family Cites Families (10)

Publication number Priority date Publication date Assignee Title
EP1060772B1 (en) * 1999-06-11 2012-02-01 Canon Kabushiki Kaisha Apparatus and method to represent mixed reality space shared by plural operators, game apparatus using mixed reality apparatus and interface method thereof
US20090132309A1 (en) * 2007-11-21 2009-05-21 International Business Machines Corporation Generation of a three-dimensional virtual reality environment from a business process model
US20140240351A1 (en) * 2013-02-27 2014-08-28 Michael Scavezze Mixed reality augmentation
US20160163063A1 (en) * 2014-12-04 2016-06-09 Matthew Ashman Mixed-reality visualization and method
CN106527857A (en) * 2016-10-10 2017-03-22 成都斯斐德科技有限公司 Virtual reality-based panoramic video interaction method
CN106683197A (en) * 2017-01-11 2017-05-17 福建佳视数码文化发展有限公司 VR (virtual reality) and AR (augmented reality) technology fused building exhibition system and VR and AR technology fused building exhibition method
WO2018227098A1 (en) * 2017-06-09 2018-12-13 Vid Scale, Inc. External camera assisted virtual reality
CN108665553B (en) * 2018-04-28 2023-03-17 腾讯科技(深圳)有限公司 Method and equipment for realizing virtual scene conversion
CN108671545B (en) * 2018-05-24 2022-02-25 腾讯科技(深圳)有限公司 Method, device and storage medium for controlling interaction between virtual object and virtual scene
CN108986232B (en) * 2018-07-27 2023-11-10 江苏洪旭德生科技发展集团有限公司 Method for presenting AR environment picture in VR display device

Similar Documents

Publication Publication Date Title
CN109727318B (en) Method for realizing transfer door effect and presenting VR panoramic video picture in AR equipment
US11962741B2 (en) Methods and system for generating and displaying 3D videos in a virtual, augmented, or mixed reality environment
US11335379B2 (en) Video processing method, device and electronic equipment
McCormac et al. Scenenet rgb-d: Can 5m synthetic images beat generic imagenet pre-training on indoor segmentation?
CN107861714B (en) Development method and system of automobile display application based on Intel RealSense
CN113228625A (en) Video conference supporting composite video streams
CN105389090B (en) Method and device, mobile terminal and the computer terminal of game interaction interface display
US11276150B2 (en) Environment map generation and hole filling
AU2019226134B2 (en) Environment map hole-filling
CN110673743A (en) Virtual-real interaction system and method based on three-dimensional space scene
CN106990961A (en) A kind of method for building up of WebGL graphics rendering engines
CN114387445A (en) Object key point identification method and device, electronic equipment and storage medium
CN112929750B (en) Camera adjusting method and display device
Takatori et al. Large-scale projection-based immersive display: The design and implementation of largespace
CN114779948A (en) Method, device and equipment for controlling instant interaction of animation characters based on facial recognition
Matos et al. The visorama system: A functional overview of a new virtual reality environment
Baillard et al. Mixed reality extended TV
WO2023097707A1 (en) Vr scene switching method and apparatus, and vr head-mounted device and storage medium
JIN et al. A Research on the Construction and Realization of the Indoor Building Model Suitable for Holographic Projection
US20230326161A1 (en) Data processing method and apparatus, electronic device, computer-readable storage medium, and computer program product
CN114187427A (en) Method for presenting virtual space based on iOS (Internet operating System) end ARkit
Jacquemin et al. Alice on both sides of the looking glass: Performance, installations, and the real/virtual continuity
Alawadhi et al. Deep Learning Through Parametrically Generated Virtual Building Information Models for Real-World Object Recognition
Sarmiento et al. Panoramic immersive videos-3d production and visualization framework
Wang Immersive and Interactive Digital Stage Design Based on Computer Automatic Virtual Environment and Performance Experience Innovation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant