CN115222927A - Stepping virtual roaming scene construction method - Google Patents


Info

Publication number
CN115222927A
Authority
CN
China
Prior art keywords
camera
scene
stepping
model
acquiring
Prior art date
Legal status
Pending
Application number
CN202210876538.3A
Other languages
Chinese (zh)
Inventor
孙克超
罗建超
Current Assignee
Shanghai Jiongyan Network Technology Co ltd
Original Assignee
Shanghai Jiongyan Network Technology Co ltd
Priority date: 2022-07-25 · Filing date: 2022-07-25 · Publication date: 2022-10-21
Application filed by Shanghai Jiongyan Network Technology Co ltd filed Critical Shanghai Jiongyan Network Technology Co ltd
Priority to CN202210876538.3A priority Critical patent/CN115222927A/en
Publication of CN115222927A publication Critical patent/CN115222927A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS · G06 COMPUTING; CALCULATING OR COUNTING · G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics · G06T19/003 Navigation within 3D models or images
    • G06T19/00 Manipulating 3D models or images for computer graphics · G06T19/006 Mixed reality
    • G06T7/00 Image analysis · G06T7/70 Determining position or orientation of objects or cameras · G06T7/73 using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a method for constructing a stepping virtual roaming scene, which comprises the following steps: acquiring a panoramic picture; acquiring a scene model; acquiring camera coordinate data, wherein the camera coordinate data is generated according to the scene model; acquiring camera associated coordinates, namely the coordinates of the cameras adjacent to each camera; acquiring compiled code, wherein the compiled code is obtained according to the camera associated coordinates; and outputting a stepping roaming scene, which comprises importing the panoramic picture, the data model, and the compiled code. By combining the rendered two-dimensional panoramic pictures with the three-dimensional spatial positions of the panorama cameras, the method achieves a fast stepping roaming experience and greatly enhances visual consistency.

Description

Stepping virtual roaming scene construction method
Technical Field
The invention relates to the technical field of digital virtual exhibition halls, and in particular to a method and system for constructing a stepping virtual roaming scene.
Background
Virtual reality technology has become very popular in recent years, opening the door to new 3D worlds and extending flat 3D displays into virtual 3D worlds that carry richer position and pose interaction information. A virtual exhibition hall lets a user intuitively experience real or virtual scenes in all directions as if present in person; like visiting a physical exhibition hall, the user can choose the viewing position. However, existing virtual exhibitions offer a poor interactive experience with unsmooth view switching. Concretely, existing virtual exhibition hall panorama roaming is a set of jump relations between scene panoramas, that is, jumps between two-dimensional pictures, and the perceived 3D effect is only an image based on the viewing habits of the human eye. The drawback is that the two-dimensional pictures, by the way they are generated, carry no correlation between their three-dimensional spaces; only switching between two-dimensional images occurs. After such a jump, the change of viewing angle easily leaves the viewer visually disoriented.
Chinese patent document CN112132961A discloses a method and system for generating a digital virtual exhibition hall based on a panorama template. The method comprises: acquiring exhibit information; determining a corresponding digital exhibition hall template according to the exhibit information; and replacing the replaceable positions in the digital exhibition hall template according to the exhibit information. Digital virtual exhibition halls suited to different exhibitors can thus be generated quickly, without modeling every exhibit.
In summary, there is a need for a method that combines a rendered two-dimensional panorama with the three-dimensional position of the panorama camera to achieve a fast stepping roaming experience, establishing a brand-new construction method for stepping virtual roaming scenes in 3D live-action space.
Disclosure of Invention
The invention aims to provide a method that combines a rendered two-dimensional panorama with the three-dimensional spatial position of the panorama camera, so as to realize a fast stepping roaming experience and establish a brand-new construction method for stepping virtual roaming scenes in 3D live-action space.
In order to achieve the above purpose, the invention adopts the following technical scheme:
A method for constructing a stepping virtual roaming scene comprises the following steps: acquiring a panoramic picture; acquiring a scene model; acquiring camera coordinate data, wherein the camera coordinate data is generated according to the scene model; acquiring camera associated coordinates; acquiring compiled code, wherein the compiled code is obtained according to the camera associated coordinates; and outputting a stepping roaming scene, which comprises importing the panoramic picture, the data model, and the compiled code.
As a preferred technical solution, acquiring a panoramic picture includes rendering the panoramic picture, where the resolution of the panoramic picture is 8000 × 4000.
As a preferred technical solution, the scene model is a 3D model comprising cameras, an exhibition hall, and exhibits.
As a preferred technical solution, the cameras have a uniform height and fixed orientation, and adjacent cameras are mutually visible; the exhibition hall comprises an outer wall, a ceiling, and a floor; the exhibits include character models.
As a preferred technical solution, the scene model is polygon-reduced to keep its size small.
As a preferred technical solution, the camera includes an original lens and stepping lenses; the original lens is numbered 000, and the stepping lenses are distributed along the path with numbers increasing sequentially from 001; the camera height is 130-160 cm.
As a preferred technical solution, the scene model is imported into Blender to obtain the camera coordinate data.
As a preferred technical solution, the camera associated coordinates are the coordinates of the cameras adjacent to each camera.
As a preferred technical solution, the compiled code is generated by a generator from the camera associated coordinate data.
The invention has the advantages that:
the method for constructing the stepping type virtual roaming scene combines the relevance between the two-dimensional panoramic image and the three-dimensional space. And fast manufacturing stepping roaming is carried out through a corresponding technology, so that the switching of three-dimensional space relation is formed by switching continuous two-dimensional pictures, and then the two-dimensional panoramic pictures are combined. The visual consistency is enhanced, and the continuity of visual appearance is formed, namely, the user can obtain the experience of almost freely advancing in the three-dimensional space.
Drawings
Fig. 1 is a flowchart of a method for constructing a stepping virtual roaming scene according to the present invention.
Fig. 2 is a schematic view of the panoramic picture and its number.
Fig. 3 is a standard kr item list file as described in the present embodiment.
Fig. 4 is a camera distribution diagram in the scene model according to the embodiment.
Fig. 5 is a schematic view of a model of the exhibit according to the present embodiment.
Fig. 6 is a view showing the representation of the camera coordinate data according to the present embodiment.
Fig. 7 is a view showing the representation of the camera related coordinate data according to the present embodiment.
Fig. 8 is a compiled code list table according to the present embodiment.
Fig. 9 is a schematic diagram of a data packet according to the embodiment.
Fig. 10 is a schematic diagram of the generator according to the present embodiment.
Detailed Description
The present invention will be further described with reference to the following embodiments. It should be understood that these examples are for illustration only and are not intended to limit the scope of the invention. Furthermore, various changes and modifications made by those skilled in the art after reading this disclosure, together with their equivalents, fall within the scope of the appended claims.
Example 1
Referring to fig. 1, fig. 1 is a flowchart of a method for constructing a stepping virtual roaming scene according to the present invention. A method for constructing a stepping virtual roaming scene at least comprises the following steps:
step S10: acquiring a panoramic picture;
After the maps, models, lights, and proxy files are confirmed, render a normal panoramic image; preferably the resolution of the panoramic image is 8000 × 4000, and it can be set according to actual requirements;
referring to fig. 2, fig. 2 is a schematic view of a panoramic picture and its number according to the present embodiment. Specifically, the panoramic pictures are numbered in sequence from 000, and the numbers of the panoramic pictures correspond to the cameras with corresponding numbers; drawing all panoramic pictures into krpano for slicing generation to obtain a standard kr item list, as shown in fig. 3, where fig. 3 is a standard kr item list file in the embodiment, and the list includes a panos folder and a tour. Js file for subsequently generating a roaming scene;
referring to fig. 4 and 5, fig. 4 is a distribution diagram of cameras in the scene model of the present embodiment; fig. 5 is a schematic view of a model of the exhibit according to the present embodiment. Step S20: acquiring a scene model; the scene model is a 3D model, wherein the scene model comprises a camera, an exhibition hall and a display object; the cameras are uniform in height and fixed in orientation, and adjacent cameras can be seen; the exhibition hall comprises an outer wall body, a top and a ground; the exhibit includes a character model; the camera comprises an original lens and a stepping lens; the number of the original lens is Camera000, the stepping lenses are distributed along the path, and the numbers of the stepping lenses are sequentially increased by Camera 001; the camera height is 130-160;
Specifically, the scene model is built in 3ds Max, preferably with centimeters as the model unit to ease subsequent operations. The scene model is an exhibition hall including its exhibits; cameras are placed inside the hall, and each camera position is a stepping hotspot. On a flat scene the camera heights must be uniform, and adjacent cameras must be mutually visible without occlusion. Where a door appears in the scene, the door is left open and one camera is placed in the doorway. The outer wall, ceiling, and floor of the exhibition hall model are single-sided polygons (hard-surfaced outer wall, ceiling, and floor), while an outdoor exhibition hall's paving is displayed double-sided.
Referring to fig. 6, fig. 6 is a schematic diagram illustrating the camera coordinate data of the embodiment. Step S30: acquiring camera coordinate data, wherein the camera coordinate data is generated according to the scene model; the camera coordinate data can be obtained by importing the scene model into Blender;
Specifically, export the obtained scene model as an FBX file, delete the default camera and object in Blender, import the FBX file into Blender, and generate and export the camera coordinate data through Blender; the camera coordinate data is a table file with camera numbers and coordinates;
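The exported coordinate table can be sketched as follows. The Blender side (importing the FBX and reading each camera object's location) is omitted here; `cameras` stands in for that data, so the names and the CSV layout are assumptions, not the patent's exact file format.

```python
import csv
import io

def camera_table(cameras):
    """Write (camera name, x, y, z) rows as a CSV table, i.e. the
    'table file with camera numbers and coordinates' the method
    exports. `cameras` is assumed to be a list of
    (name, (x, y, z)) pairs collected from the imported scene."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["camera", "x", "y", "z"])
    for name, (x, y, z) in cameras:
        writer.writerow([name, x, y, z])
    return buf.getvalue()

# Hypothetical cameras at uniform 150 cm height
table = camera_table([("Camera000", (0.0, 0.0, 150.0)),
                      ("Camera001", (200.0, 0.0, 150.0))])
```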
referring to fig. 7, fig. 7 is a representation of the camera-associated coordinate data according to the embodiment. Step S40: acquiring camera associated coordinates; the camera association coordinates are cameras adjacent to the camera;
Specifically, open the 3ds Max scene model and inspect its top view to conveniently identify the cameras adjacent to each camera. Appending the numbers of the adjacent cameras to the end of the corresponding camera's row in the camera coordinate data table completes the association between each camera and its neighbours, yielding a table recording the camera associated coordinates;
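A distance-based sketch of the association step. The embodiment picks neighbours manually from the 3ds Max top view; the `max_dist` radius here is an assumed stand-in for that manual judgment and does not check occlusion between cameras.

```python
import math

def associate_cameras(coords, max_dist=250.0):
    """For each camera number, list the numbers of cameras lying
    within `max_dist` of it (an assumed adjacency radius, in the
    model's centimeter units). `coords` maps camera number to an
    (x, y, z) position from the coordinate table."""
    assoc = {}
    for a, pa in coords.items():
        assoc[a] = sorted(
            b for b, pb in coords.items()
            if b != a and math.dist(pa, pb) <= max_dist)
    return assoc

# Hypothetical layout: 000 and 001 are 200 cm apart (adjacent),
# 002 is 400 cm from 001 and so not associated at this radius.
coords = {"000": (0, 0, 150), "001": (200, 0, 150), "002": (600, 0, 150)}
links = associate_cameras(coords)
```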
referring to fig. 8, fig. 8 is a compiled code list according to the embodiment. Step S50: acquiring a compiled code, wherein the compiled code is acquired according to camera associated coordinates; generating the compiled code by a generator using the camera associated coordinate data;
Specifically, the generator is a 3D stepping virtual exhibition hall code generator; importing the camera associated coordinate data into the generator yields the corresponding standard documents, including the scene and style files (scene.txt and style.txt).
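A minimal sketch of what such a generator might emit. The patent does not publish the generator's output format; the krpano-style scene and hotspot markup below is an assumption based on the kr item list mentioned earlier, with one stepping hotspot per associated neighbour.

```python
def generate_scene_xml(assoc):
    """Emit one krpano-style <scene> block per camera, each with
    a hotspot per associated neighbour that loads that
    neighbour's scene. Attribute names follow common krpano
    usage and are illustrative, not the patent's exact output."""
    parts = []
    for cam, neighbours in sorted(assoc.items()):
        hotspots = "".join(
            f'<hotspot name="to_{n}" onclick="loadscene(scene_{n});" />'
            for n in neighbours)
        parts.append(
            f'<scene name="scene_{cam}">'
            f'<image><sphere url="panos/{cam}.tiles/pano.jpg" /></image>'
            f'{hotspots}</scene>')
    return "\n".join(parts)

# Two mutually associated cameras produce two linked scenes
xml = generate_scene_xml({"000": ["001"], "001": ["000"]})
```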
Step S60: outputting a stepping roaming scene, wherein the stepping roaming scene comprises the steps of importing the panoramic picture, a data model and a compiling code;
Specifically, import the panos folder and the tour.js file generated from the panoramic pictures into the corresponding data package, the data package being the one used to generate the stepping roaming scene. Export the scene model as an STL file named 00.stl and import it into the model3d folder of the data package to complete the scene model import. Replace the corresponding files in the data package with the obtained compiled code, i.e. substitute the contents of the style.txt and scene.txt files into the project3d.xml file of the data package. Complete the remaining substitutions against the corresponding data in the package: for example, adjust multires='512,768,1664,3200' according to the slice sizes of the panoramic pictures (this value lives in root.xml of the standard kr item list); add autoload='true' to the first scene to be shown at opening; and replace the camera height value in the data package, i.e. set the height value in the package's floorspot file to the Z coordinate of the camera.
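The textual replacements described above can be sketched as follows, assuming the template stores multires in single quotes and keys scenes by a name attribute; both details are assumptions about the package's exact contents rather than published facts.

```python
import re

def patch_project_xml(xml_text, multires, first_scene):
    """Apply the two replacements described above to the
    project3d.xml text: swap in the panorama slice sizes
    (multires) and mark the opening scene with autoload."""
    xml_text = re.sub(r"multires='[^']*'",
                      f"multires='{multires}'", xml_text)
    return xml_text.replace(
        f'name="{first_scene}"',
        f'name="{first_scene}" autoload="true"', 1)

# Hypothetical template fragment with a larger slice added
patched = patch_project_xml(
    "<scene name=\"scene_000\" multires='512,768,1664,3200'>",
    "512,768,1664,3200,6144", "scene_000")
```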
It should be understood that, to ensure loading speed, the scene model needs polygon reduction; keep the model size within roughly 5 MB to 10 MB, since a smaller scene model loads faster. For example, reduce the face count as much as possible, delete unnecessary small objects, and apply per-surface independent reduction to high-face-count shapes such as circles. Note that when a model is over-reduced to a single face, the stepping hotspot behind it is no longer occluded; this is solved by simply extruding the STL in Blender or handling it during the polygon reduction in 3ds Max.
Referring to fig. 9, fig. 9 is a schematic diagram of the data package according to the embodiment. After the import, association, and data replacement of the related files are completed, the data package can be run, producing the stepping roaming scene as output.
It should be noted that: the method for constructing the stepping type virtual roaming scene combines the relevance between the two-dimensional panoramic image and the three-dimensional space. And fast manufacturing stepping roaming is carried out through a corresponding technology, so that the switching of three-dimensional space relation is formed by switching continuous two-dimensional pictures, and then the two-dimensional panoramic pictures are combined. Resulting in enhanced visual consistency. The user can get an almost free-going experience in three-dimensional space.
Example 2
This embodiment describes constructing a compound (multi-storey) stepping virtual scene and is substantially the same as embodiment 1, except that the scene model is sunk by one camera height relative to the ground coordinates so that the Z coordinate of the first-floor cameras is 0. For example, if the 1F camera height is the usual 160 cm, the scene model is moved down 160 cm along the Z axis; with the 2F floor 320 cm above the 1F floor, the Z coordinate of the 2F cameras becomes 320. See fig. 10, a schematic diagram of the generator of this embodiment: the compound mode is selected in the 3D stepping virtual exhibition hall code generator to generate the corresponding code, and the height values in the floorspot file of the data package use the sunk values of the scene model.
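The floor arithmetic above reduces to a small helper: sinking the whole model by one camera height cancels each camera's height above its own floor, leaving the camera's Z equal to its floor's ground offset (0 on 1F, 320 on 2F in the example).

```python
def compound_camera_z(storey_heights):
    """Per-floor camera Z after sinking the model by one camera
    height. The sink cancels the camera's fixed height above its
    own floor, so each camera's Z equals the cumulative ground
    height of its floor: with 320 cm storeys, 1F cameras sit at
    Z = 0 and 2F cameras at Z = 320, as in the embodiment."""
    zs, ground = [], 0.0
    for height in storey_heights:
        zs.append(ground)   # camera Z on this floor
        ground += height    # ground offset of the next floor
    return zs

z_values = compound_camera_z([320.0, 320.0])  # [0.0, 320.0]
```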
The above is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and additions without departing from the principle of the invention, and such modifications and additions also fall within the protection scope of the invention.

Claims (9)

1. A method for constructing a stepping virtual roaming scene is characterized by comprising the following steps:
acquiring a panoramic picture;
acquiring a scene model;
acquiring camera coordinate data, wherein the camera coordinate data is generated according to the scene model;
acquiring camera associated coordinates, wherein the camera associated coordinates are the coordinates of cameras adjacent to each camera;
acquiring a compiling code, wherein the compiling code is obtained according to the camera association coordinates;
and outputting a stepping roaming scene, wherein the stepping roaming scene comprises importing the panoramic picture, a data model and the compiled code.
2. The method of claim 1, wherein obtaining a panoramic image comprises rendering the panoramic image, and the panoramic image has a resolution of 8000 x 4000.
3. The method of claim 1, wherein the scene model is a 3D model, which includes a camera, an exhibition hall, and a show.
4. The method for constructing a stepping virtual roaming scene according to claim 3, characterized in that the cameras are uniform in height and fixed in orientation, and adjacent cameras are mutually visible; the exhibition hall comprises an outer wall, a ceiling, and a floor; the exhibit includes a character model.
5. The method of claim 1, wherein the scene model is polygon-reduced.
6. The method according to claim 1, wherein the camera comprises an original lens and stepping lenses; the original lens is numbered 000, and the stepping lenses are distributed along the path with numbers increasing sequentially from 001; the camera height is 130-160 cm.
7. The method as claimed in claim 1, wherein the scene model is imported into a blender to obtain the camera coordinate data.
8. The method of claim 1, wherein the camera association coordinates are of a camera adjacent to the camera.
9. The method of claim 1, wherein the compiled code is generated by a generator using the camera-associated coordinate data.
CN202210876538.3A 2022-07-25 2022-07-25 Stepping virtual roaming scene construction method Pending CN115222927A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210876538.3A CN115222927A (en) 2022-07-25 2022-07-25 Stepping virtual roaming scene construction method


Publications (1)

Publication Number Publication Date
CN115222927A (en) 2022-10-21

Family

ID=83614188

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210876538.3A Pending CN115222927A (en) 2022-07-25 2022-07-25 Stepping virtual roaming scene construction method

Country Status (1)

Country Link
CN (1) CN115222927A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116243831A (en) * 2023-05-12 2023-06-09 青岛道可云网络科技有限公司 Virtual cloud exhibition hall interaction method and system
CN116243831B (en) * 2023-05-12 2023-08-08 青岛道可云网络科技有限公司 Virtual cloud exhibition hall interaction method and system


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination