CN114390268B - Virtual reality panoramic video manufacturing method based on Rhino and Enscape - Google Patents

Virtual reality panoramic video manufacturing method based on Rhino and Enscape

Info

Publication number
CN114390268B
Authority
CN
China
Prior art keywords
enscape
virtual reality
rhino
model
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111661925.7A
Other languages
Chinese (zh)
Other versions
CN114390268A (en)
Inventor
朱卓晖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Central South Architectural Design Institute Co Ltd
Original Assignee
Central South Architectural Design Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Central South Architectural Design Institute Co Ltd filed Critical Central South Architectural Design Institute Co Ltd
Priority to CN202111661925.7A priority Critical patent/CN114390268B/en
Publication of CN114390268A publication Critical patent/CN114390268A/en
Application granted granted Critical
Publication of CN114390268B publication Critical patent/CN114390268B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/296 Synchronisation thereof; Control thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses a virtual reality panoramic video production method based on Rhino and Enscape, which uses Rhino and Enscape software to produce high-quality panoramic video quickly and efficiently. The method has high applicability: it lowers the barrier to use by relying on a common modeling software platform, is easy to operate, greatly simplifies the rendering workflow by building on Enscape's efficient, high-quality rendering effects and functions, and offers strong controllability, since the various camera properties can be adjusted through parameters, so that fast, efficient and high-quality panoramic video production is achieved.

Description

Virtual reality panoramic video manufacturing method based on Rhino and Enscape
Technical Field
The application belongs to the technical field of panoramic video production, and particularly relates to a virtual reality panoramic video production method based on Rhino and Enscape.
Background
With the continuous development of technology, the way engineering designs are presented has changed: from hand-drawn renderings, to computer renderings produced from models, to the now widespread walkthrough animation. As virtual reality technology becomes more popular, panoramic images and panoramic videos are used more and more. Compared with the static, single-frame expression of a panoramic image, a panoramic video conveys far richer content and space, a difference similar to that between a rendering and a walkthrough animation. In engineering design, it allows the design to be experienced during the design process and expresses design concepts, space, dimensions and other details more completely, so that the cost and time wasted on adjustments after construction are avoided as far as possible.
At present, design work commonly uses Rhino together with Grasshopper (visual programming software on the Rhino platform) as the modeling software in the design stage, with a renderer used for visual presentation. Conventional renderers such as Lumion, Vray and Enscape can render panoramic images but cannot produce panoramic video. Current panoramic video production falls mainly into two approaches: one uses a panoramic camera to shoot and record a real scene, which is simple and convenient to operate but not applicable to a design process in which no physical object yet exists; the other renders and produces the video with professional video software, which can take a model from the design process as its basis.
Disclosure of Invention
The application aims to provide a virtual reality panoramic video production method based on Rhino and Enscape, which uses Rhino and Enscape software to produce high-quality panoramic video quickly and efficiently.
In order to achieve the above purpose, the application provides a virtual reality panoramic video production method based on Rhino and Enscape, which comprises the following steps:
S1, drawing the corresponding three-dimensional model in Rhino according to the design requirements;
S2, enriching the model environment with scenery models from the Enscape asset library;
S3, drawing a camera path and a sight-line target path in Rhino;
S4, picking up the camera path and the sight-line target path from step S3 in Grasshopper, respectively;
S5, controlling the camera node in Grasshopper, using the two paths picked up in step S4 as the position basis in combination with a timeline node;
S6, starting the Enscape real-time rendering window and the model and view-angle synchronization function in Rhino;
S7, adjusting the timeline node in Grasshopper on the basis of step S5 to control the camera position, with the view angle in Enscape updating synchronously and automatically;
S8, viewing details in Enscape along the camera path from step S7 and adjusting the visual effect through Enscape's visual settings, and repeating step S1 or step S2 to refine the model scene until the intended scene effect is achieved;
S9, invoking the Enscape panoramic image rendering command with a Python node in Grasshopper and rendering frame by frame along the timeline;
S10, parsing the xml-format result files produced by the Enscape rendering in step S9 in batches with Python and converting them into corresponding png-format picture files, referred to as a picture group;
S11, combining the picture group from step S10 into frames with Adobe After Effects, matching background music, and exporting an mp4-format video file.
Further, the method comprises the following step:
S12, watching the video with a player built into a computer or mobile phone, through a website that supports panoramic video, or by wearing virtual reality glasses.
Further, in step S1, the three-dimensional model includes all scene models along the circulation route traversed in the video, including building models, architectural decoration models, special furniture and characters.
Further, models from Enscape's built-in asset library are used to enrich the environment while keeping the model's face count low.
Further, two curve paths are drawn with the Curve command in Rhino, serving as the camera position path and the sight-line target path respectively.
Further, in step S4, the two drawn curves are each picked up by a Curve component in Grasshopper and converted into Grasshopper curves.
Further, in Grasshopper, the camera node is controlled by using the two picked-up paths as the position basis in combination with a Number Slider timeline node.
Further, the Enscape real-time rendering window and the model and view-angle synchronization function are started in Rhino via Enscape's Live Updates.
Compared with the prior art, the application has the following advantages:
the application has high applicability, reduces the use threshold by using a common modeling software platform, is easy to use, simplifies the rendering operation most, has strong controllability, can adjust various properties of a camera by parameters, and realizes the rapid, efficient and high-quality panoramic video production based on the high-efficient and high-quality rendering effect and function of Enscape.
Drawings
FIG. 1 is a flow chart of a method according to an embodiment of the application.
FIG. 2 is a schematic diagram of a Rhino model in accordance with an embodiment of the present application.
FIG. 3 is a schematic view of the Enscape real-time rendering effect according to an embodiment of the present application; fig. 3(a) shows the rendering with materials, and fig. 3(b) shows the white-model rendering.
Fig. 4 is a schematic diagram of a panoramic frame picture in png format according to an embodiment of the present application; fig. 4(a) shows the frame with materials, and fig. 4(b) shows the white-model frame.
Detailed Description
The present application will be described in further detail below with reference to the drawings and embodiments, in order to make the objects, technical solutions and advantages of the present application clearer. It should be understood that the specific embodiments described here are for purposes of illustration only and are not intended to limit the scope of the application. In addition, the technical features of the embodiments of the present application described below may be combined with each other as long as they do not conflict with one another.
The application uses Rhino and Enscape software to produce high-quality panoramic video quickly and efficiently, providing a better visual presentation method for the design stage.
As shown in fig. 1, the virtual reality panoramic video production method based on Rhino and Enscape in this embodiment includes the following steps:
s1, drawing a corresponding three-dimensional model in the Rhino according to design requirements.
In Rhino, a base model is created from the design, and the three-dimensional model contains all scene models of streamlines walked within the video, including building models, architectural decoration models, special furniture, and characters, etc., as shown in fig. 2.
S2, enriching model environments by using the scenery model in the Enscape resource library. Scene scenery uses models in the Enscape's own repository for enriching the environment while keeping the number of faces of the model as low as possible.
The Enscape resource library has the advantages of large quantity of scenery, high quality, capability of calling the scene-rich environments such as furniture, lamps and green plants, and the like, and meanwhile, due to the fact that the environment-rich environment is a proxy low-surface number model, the LOD technology is utilized, the rendering precision and effect in a rendering engine are ensured, and meanwhile, the model pressure in a Rhino modeling software platform is reduced.
S3, drawing a camera path and a sight line target path in the Rhino.
Two curved paths are drawn by Curve or Polyline isoline related commands in Rhino, respectively as a camera position path and a sight line target path.
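The paths are normally drawn interactively with the commands above; purely as an illustration, the same two curves could also be created by script. The following minimal sketch uses rhinoscriptsyntax inside Rhino, and every coordinate value in it is an assumed placeholder rather than data from the patent.

```python
import rhinoscriptsyntax as rs

# Assumed, illustrative control points for the two paths (camera roughly at eye height).
camera_points = [(0, 0, 1.6), (5, 2, 1.6), (10, 0, 1.8), (15, 3, 1.8)]
target_points = [(2, 5, 1.5), (7, 6, 1.5), (12, 5, 1.5), (17, 7, 1.5)]

# Two degree-3 control-point curves: one for the camera position, one for the sight-line target.
camera_path = rs.AddCurve(camera_points, degree=3)
target_path = rs.AddCurve(target_points, degree=3)

# Name the objects so they are easy to pick up later in Grasshopper (step S4).
rs.ObjectName(camera_path, "camera_path")
rs.ObjectName(target_path, "target_path")
```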
S4, picking up the camera path and the sight-line target path from step S3 in Grasshopper, respectively.
Grasshopper is visual programming software on the Rhino platform. Each corresponding curve is picked up by a Curve node in Grasshopper and converted into a Grasshopper curve.
S5, controlling the camera node in Grasshopper, using the two paths picked up in step S4 as the position basis in combination with a timeline node.
For the curves picked up in step S4, camera position and camera sight-line direction data are generated with an Evaluate Curve node driven by the timeline of a Number Slider node, and the data are fed into the respective inputs of the camera node, as sketched below.
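The patent drives the camera with native Grasshopper nodes (Evaluate Curve, Number Slider and a camera node); the GhPython sketch below is only an assumed equivalent of that wiring, shown to make the data flow concrete. The input names camera_crv and target_crv (typed as Curve) and t (a 0-1 Number Slider acting as the timeline) are hypothetical component inputs, not names from the patent.

```python
import Rhino

# Sample both paths at the same normalized timeline position t (0.0 - 1.0).
cam_pos = camera_crv.PointAtNormalizedLength(t)
cam_tgt = target_crv.PointAtNormalizedLength(t)

# Aim the active Rhino viewport camera: first argument is the target, second the camera location.
view = Rhino.RhinoDoc.ActiveDoc.Views.ActiveView
view.ActiveViewport.SetCameraLocations(cam_tgt, cam_pos)
view.Redraw()  # with Enscape live updates enabled (step S6), the rendered view follows automatically
```

Dragging the slider t then sweeps the camera along its path while the sight line stays locked on the target curve, which is the behaviour that steps S6 and S7 rely on.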
S6, starting the Enscape real-time rendering window and the model and view-angle synchronization function in Rhino.
The Enscape real-time rendering window and the model and view-angle synchronization function are opened via Live Updates of Enscape in Rhino.
S7, adjusting the timeline node in Grasshopper on the basis of step S5 to control the camera position; the view angle in Enscape updates synchronously and automatically.
S8, following the camera path from step S7, details are viewed in Enscape and the visual effect is adjusted through Enscape's visual settings; step S1 or step S2 can be repeated to refine the model scene until the intended scene effect is achieved.
S9, invoking the Enscape panoramic image rendering command with a Python node in Grasshopper and rendering frame by frame along the timeline.
When the render-detection switch in Grasshopper is turned on and Enscape is detected to have finished rendering, the timeline automatically advances by one unit, the Rhino camera moves to the corresponding position, the Enscape view updates and rendering starts again, and so on until the timeline is finished, as shown in fig. 3(b) and fig. 3(a). The Python functionality is implemented in its own complete interpreter; a hedged sketch of this loop is given below.
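The sketch below is an assumed outline of that frame loop, not the patent's exact definition. The Enscape panorama command name is a placeholder (the patent only states that a panorama rendering command is invoked from the Python node), and the wait-for-render-finished handshake driven by the detection switch is reduced to a comment.

```python
import Rhino

ENSCAPE_PANORAMA_COMMAND = "_EnscapePanoramaCommandPlaceholder"  # hypothetical command name


def render_frame(camera_crv, target_crv, frame_index, frame_count):
    """Move the camera to one timeline position and trigger a panorama render."""
    t = frame_index / float(max(frame_count - 1, 1))   # normalized timeline position
    cam_pos = camera_crv.PointAtNormalizedLength(t)
    cam_tgt = target_crv.PointAtNormalizedLength(t)

    view = Rhino.RhinoDoc.ActiveDoc.Views.ActiveView
    view.ActiveViewport.SetCameraLocations(cam_tgt, cam_pos)
    view.Redraw()                                       # Enscape live update follows the viewport

    # In the patent, the loop only advances after Enscape reports that the previous
    # frame has finished rendering; that handshake is omitted in this sketch.
    Rhino.RhinoApp.RunScript(ENSCAPE_PANORAMA_COMMAND, False)


# Example: render a 300-frame timeline one frame at a time.
# for i in range(300):
#     render_frame(camera_crv, target_crv, i, 300)
```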
S10, parsing the xml-format result files produced after the Enscape panoramic rendering in step S9 in batches with Python and converting them into corresponding png-format picture files, referred to as the picture group.
After rendering a panorama, Enscape saves the result to disk as an xml-format file; the file can be viewed and downloaded in png format through Enscape's own viewer, but not downloaded in batch. A decoding program written in Python therefore decodes the files in batch, converting each xml file into a png picture and saving it to disk, as shown in fig. 4(b) and fig. 4(a); a sketch of such a program is given below.
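A minimal, hedged sketch of such a batch converter follows. It assumes each XML result file carries the image bytes as a base64 string; the element lookup ("imageData"), the folder paths and the frame naming are placeholder assumptions, not a documented Enscape schema, and would need to be adapted to the files actually produced.

```python
import base64
import glob
import os
import xml.etree.ElementTree as ET

SOURCE_DIR = r"C:\EnscapePanoramas"      # assumed folder holding the rendered .xml results
OUTPUT_DIR = r"C:\EnscapePanoramas\png"  # assumed destination for the picture group
os.makedirs(OUTPUT_DIR, exist_ok=True)

xml_files = sorted(glob.glob(os.path.join(SOURCE_DIR, "*.xml")))
for index, xml_path in enumerate(xml_files):
    root = ET.parse(xml_path).getroot()
    node = root.find(".//imageData")                  # placeholder element name
    if node is None or not node.text:
        continue
    image_bytes = base64.b64decode(node.text)         # decode the embedded panorama image
    out_path = os.path.join(OUTPUT_DIR, "frame_%04d.png" % index)
    with open(out_path, "wb") as f:
        f.write(image_bytes)
```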
S11, combining the picture group from step S10 into frames with Adobe After Effects, matching background music, and exporting an mp4-format video file.
The frame pictures are combined into a video with the png-sequence function of Adobe After Effects, where the frame rate can be modified and background music and a logo added, before the result is finally exported in a video format; an alternative command-line sketch follows.
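The patent exports the video from Adobe After Effects. Purely as an alternative sketch, the same picture group can also be assembled with background music by a script that calls ffmpeg (assumed to be installed and on PATH); the file names and frame rate below are illustrative only.

```python
import subprocess

subprocess.run([
    "ffmpeg",
    "-framerate", "30",                               # assumed target frame rate
    "-i", r"C:\EnscapePanoramas\png\frame_%04d.png",  # assumed numbered frame files
    "-i", r"C:\EnscapePanoramas\music.mp3",           # background music track
    "-c:v", "libx264", "-pix_fmt", "yuv420p",         # widely compatible H.264 output
    "-shortest",                                      # stop when the shorter input ends
    r"C:\EnscapePanoramas\panorama_video.mp4",
], check=True)
```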
S12, watching the video on equipment that supports panoramic video.
The panoramic video can be watched with a computer's built-in player, by uploading it to a website that supports panoramic video such as Bilibili, or by wearing virtual reality glasses. The viewing procedure may vary somewhat because each piece of software has its own format requirements.
The application realizes virtual reality panoramic video production on the basis of the conventional design workflow, reduces the technical difficulty of producing virtual reality panoramic video, remedies the design workflow's gap with respect to virtual reality panoramic video production, and provides a new visual application solution for virtual reality panoramic video.
It should be noted that each step/component described in the present application may be split into more steps/components, or two or more steps/components or part of operations of the steps/components may be combined into new steps/components, according to the implementation needs, to achieve the object of the present application.
It will be readily appreciated by those skilled in the art that the foregoing is merely a preferred embodiment of the application and is not intended to limit the application, but any modifications, equivalents, improvements or alternatives falling within the spirit and principles of the application are intended to be included within the scope of the application.

Claims (8)

1. A virtual reality panoramic video manufacturing method based on Rhino and Enscape, characterized by comprising the following steps:
S1, drawing the corresponding three-dimensional model in Rhino according to the design requirements;
S2, enriching the model environment with scenery models from the Enscape asset library;
S3, drawing a camera path and a sight-line target path in Rhino;
S4, picking up the camera path and the sight-line target path from step S3 in Grasshopper, respectively;
S5, controlling the camera node in Grasshopper, using the two paths picked up in step S4 as the position basis in combination with a timeline node;
S6, starting the Enscape real-time rendering window and the model and view-angle synchronization function in Rhino;
S7, adjusting the timeline node in Grasshopper on the basis of step S5 to control the camera position, with the view angle in Enscape updating synchronously and automatically;
S8, viewing details in Enscape along the camera path from step S7 and adjusting the visual effect through Enscape's visual settings, and repeating step S1 or step S2 to refine the model scene until the intended scene effect is achieved;
S9, invoking the Enscape panoramic image rendering command with a Python node in Grasshopper and rendering frame by frame along the timeline;
S10, parsing the xml-format result files produced by the Enscape rendering in step S9 in batches with Python and converting them into corresponding png-format picture files, referred to as a picture group;
S11, combining the picture group from step S10 into frames with Adobe After Effects, matching background music, and exporting an mp4-format video file.
2. The method for producing a virtual reality panorama video based on Rhino and Enscape according to claim 1, further comprising the steps of:
S12, watching the video with a player built into a computer or mobile phone, through a website that supports panoramic video, or by wearing virtual reality glasses.
3. The method for producing a virtual reality panorama video according to claim 1, wherein in step S1, the three-dimensional model comprises all scene models along the circulation route traversed in the video, including building models, architectural decoration models, special furniture, and characters.
4. The method for producing a virtual reality panorama video based on Rhino and Enscape according to claim 1, wherein the environment is enriched with models from Enscape's built-in asset library while the model's face count is kept low.
5. The method for producing a virtual reality panorama video based on Rhino and Enscape according to claim 1, wherein two curve paths are drawn with the Curve command in Rhino, serving as the camera position path and the sight-line target path respectively.
6. The method for producing a virtual reality panorama video based on Rhino and Enscape according to claim 5, wherein in step S4, the two drawn curves are each picked up by a Curve component in Grasshopper and converted into Grasshopper curves.
7. The method for producing a virtual reality panorama video based on Rhino and Enscape according to claim 1, wherein the camera node is controlled in Grasshopper by using the two picked-up paths as the position basis in combination with a Number Slider timeline node.
8. The method for producing a virtual reality panorama video based on Rhino and Enscape according to claim 1, wherein the Enscape real-time rendering window and the model and view-angle synchronization function are started via Live Updates of Enscape in Rhino.
CN202111661925.7A 2021-12-31 2021-12-31 Virtual reality panoramic video manufacturing method based on Rhino and Enscape Active CN114390268B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111661925.7A CN114390268B (en) 2021-12-31 2021-12-31 Virtual reality panoramic video manufacturing method based on Rhino and Enscape

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111661925.7A CN114390268B (en) 2021-12-31 2021-12-31 Virtual reality panoramic video manufacturing method based on Rhino and Enscape

Publications (2)

Publication Number Publication Date
CN114390268A CN114390268A (en) 2022-04-22
CN114390268B (en) 2023-08-11

Family

ID=81199108

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111661925.7A Active CN114390268B (en) 2021-12-31 2021-12-31 Virtual reality panoramic video manufacturing method based on Rhino and Enscape

Country Status (1)

Country Link
CN (1) CN114390268B (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018106198A1 (en) * 2016-12-10 2018-06-14 Yasar Universitesi Viewing three-dimensional models through mobile-assisted virtual reality (vr) glasses
CN108769648A (en) * 2018-06-08 2018-11-06 宿迁霖云软件科技有限公司 A kind of 3D scene rendering methods based on 720 degree of panorama VR
CN110544314A (en) * 2019-09-05 2019-12-06 上海电气集团股份有限公司 Fusion method, system, medium and device of virtual reality and simulation model

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Analysis of animation effect expression approaches based on BIM models (基于BIM模型的动画效果表现途径分析); Xing Wencui, Zhao Yongsheng; Urbanism and Architecture (城市建筑); Vol. 18, No. 20; pp. 115-117 *

Also Published As

Publication number Publication date
CN114390268A (en) 2022-04-22

Similar Documents

Publication Publication Date Title
CN108200445B (en) Virtual playing system and method of virtual image
US9041899B2 (en) Digital, virtual director apparatus and method
CN107197341B (en) Dazzle screen display method and device based on GPU and storage equipment
CN110225224B (en) Virtual image guiding and broadcasting method, device and system
US11488348B1 (en) Computing virtual screen imagery based on a stage environment, camera position, and/or camera settings
CN105069827A (en) Method for processing video transitions through three-dimensional model
CN112929627B (en) Virtual reality scene implementation method and device, storage medium and electronic equipment
US11176716B2 (en) Multi-source image data synchronization
CN110996150A (en) Video fusion method, electronic device and storage medium
CN114581566A (en) Animation special effect generation method, device, equipment and medium
CN114390268B (en) Virtual reality panoramic video manufacturing method based on Rhino and Enscape
JP2022028091A (en) Image processing device, image processing method, and program
CN102438108B (en) Film processing method
US20230018921A1 (en) Smoothly changing a focus of a camera between multiple target objects
CN113516761A (en) Optical illusion type naked eye 3D content manufacturing method and device
JP2002271692A (en) Image processing method, image processing unit, studio apparatus, studio system and program
Luntraru et al. Harmonizing 2D and 3D in Modern Animation.
US9715900B2 (en) Methods, circuits, devices, systems and associated computer executable code for composing composite content
WO2023285873A1 (en) Smoothly changing a focus of a camera between multiple target objects
WO2023285872A1 (en) Smoothly changing a focus of a camera between multiple target objects
WO2023285871A1 (en) Smoothly changing a focus of a camera between multiple target objects
WO2023196845A2 (en) System and method for providing dynamic backgrounds in live-action videography
Wang The production of the movie stunt scene based on AE
WO2023196850A2 (en) System and method for providing dynamic backgrounds in live-action videography
KR101717777B1 (en) 3D Animation production methods

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant