CN110853143B - Scene realization method, device, computer equipment and storage medium - Google Patents


Info

Publication number: CN110853143B
Application number: CN201910970163.5A
Authority: CN (China)
Prior art keywords: scene, camera, cylinder, stage, picture
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN110853143A
Inventors: 王征, 冯智泉, 江勇
Current assignee: Yamei Zhilian Data Technology Co ltd
Original assignee: Guangzhou Yame Information Technology Co ltd
Application filed by Guangzhou Yame Information Technology Co ltd; priority to CN201910970163.5A; published as CN110853143A, granted as CN110853143B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/10: Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/005: General purpose rendering architectures
    • G06T15/04: Texture mapping
    • G06T2210/00: Indexing scheme for image generation or computer graphics
    • G06T2210/61: Scene description
    • Y02P: Climate change mitigation technologies in the production or processing of goods
    • Y02P90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Television Signal Processing For Recording (AREA)
  • Image Generation (AREA)

Abstract

The application relates to a scene realization method, a scene realization device, computer equipment and a storage medium. The method comprises the following steps: constructing a material set according to an end-to-end continuous planar scene graph and preset material objects; loading a pre-created cylinder and the material set into a grid object to obtain a new grid object, wherein the end-to-end continuous planar scene graph serves as the map of the cylinder's inner wall and the preset material objects serve as the cylinder's top and bottom surfaces, respectively; and loading the new grid object into a pre-created stage scene, and generating a scene picture after rendering the stage scene. With this method, panoramic material shot by professional equipment is not needed, nor is professional software needed to stitch panoramic material, so the difficulty of realizing a panoramic scene is reduced while the panoramic effect is preserved.

Description

Scene realization method, device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a scene implementation method, apparatus, computer device, and storage medium.
Background
A panoramic view represents as much of the surrounding environment as possible through wide-angle imagery, in the form of drawings, photographs, video, three-dimensional models and so on. Panoramic pictures, i.e., pictures that capture the image information of an entire scene with a professional camera or that are rendered by modeling software, are stitched together by software and played by a dedicated player to form a panoramic scene for virtual-reality browsing, so that a two-dimensional plan is simulated as a real three-dimensional space and presented to the user.
In the conventional technology, implementing a panoramic scene generally requires professional equipment to acquire the original panoramic material, such as a combined rig of a single-lens or fisheye-lens camera, a tripod and a pan-tilt head, or a dedicated panoramic camera; after the panoramic material is acquired, a professional is still required to stitch the material and map it onto a spherical space using professional software.
It can be seen that the traditional panoramic-scene implementation scheme not only places high professional demands on the photographer and requires expensive shooting equipment, but also makes the panoramic material difficult to process. The threshold for constructing a panoramic scene is therefore too high, and an ordinary user cannot realize one quickly.
Disclosure of Invention
Based on the foregoing, it is necessary to provide a simple and quick scene implementation method, device, computer equipment and storage medium that address the above technical problems.
In a first aspect, the present application provides a scenario implementation method, the method including:
constructing a material set according to the end-to-end consecutive planar scene graph and a preset material object;
loading the pre-created cylinder and the material set into a grid object to obtain a new grid object; the end-to-end continuous planar scene graph is the inner wall map of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder, respectively;
and loading the new grid object into a pre-created stage scene, and generating a scene picture after rendering the stage scene.
In one embodiment of the present application, the method further comprises:
acquiring camera initialization parameters, wherein the camera initialization parameters comprise a visual angle, an aspect ratio, a near-end distance and a far-end distance;
creating a camera according to the camera initialization parameters;
determining a position of the camera and adjusting an angle of the camera; the angle of the camera is used for adjusting the display angle of the stage scene.
In one embodiment of the present application, the method further comprises:
acquiring preset construction parameters, wherein the construction parameters comprise a top surface radius, a bottom surface radius, a cylinder height, the number of segments of the top surface and the bottom surface, the number of segments of the side surface and a top surface and bottom surface Boolean value;
the cylinder is created from the build parameters.
In one embodiment of the present application, after the creating the cylinder according to the build parameters, the method further includes:
and scaling the size of the cylinder according to the size of the stage scene so that the size of the cylinder is matched with the size of the pre-created stage scene.
In one embodiment of the present application, the building a material set according to the end-to-end continuous planar scene graph and the preset material object includes:
creating a texture picture object according to the end-to-end consecutive planar scene graph;
constructing a material set according to the texture picture object and the preset material object; the preset material object comprises two transparent plane views.
In one embodiment of the present application, the creating a texture picture object according to the end-to-end consecutive planar scene graph further includes:
acquiring picture addresses of the head-tail consecutive planar scene pictures;
loading the end-to-end continuous planar scene graph according to the picture address to form the texture picture object; the texture picture object is in a multi-level progressive (mipmap) texture format.
In one embodiment of the present application, the generating a scene picture after rendering the stage scene includes:
acquiring an initialization parameter of a renderer, wherein the initialization parameter of the renderer comprises an antialiasing parameter, a coloring precision parameter and a texture precision parameter;
and rendering the stage scene according to the renderer initialization parameters and the rendering function to generate a panoramic scene picture.
In a second aspect, the present application provides a scene implementation device, the device including:
the material set construction module is used for constructing a material set according to the head-tail coherent planar scene graph and the preset material objects;
the grid object generation module is used for loading the pre-created cylinder and the material set into the grid object to obtain a new grid object; the texture picture object is an inner wall map of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder, respectively;
and the rendering module is used for loading the new grid object into a pre-created stage scene, and generating a scene picture after rendering the stage scene.
In one embodiment of the present application, the material set construction module is specifically configured to:
creating a texture picture object according to the end-to-end consecutive planar scene graph;
constructing a material set according to the texture picture object and a preset material object; the preset material object comprises two transparent plan views.
In one embodiment of the present application, the material set construction module is further specifically configured to:
acquiring picture addresses of the end-to-end consecutive planar scene graphs;
loading a head-to-tail coherent planar scene graph according to the picture address to form a texture picture object; the texture picture object is in a multi-level progressive texture format.
In one embodiment of the present application, the rendering module is specifically configured to:
acquiring an initialization parameter of a renderer, wherein the initialization parameter of the renderer comprises an antialiasing parameter, a coloring precision parameter and a texture precision parameter;
and rendering the stage scene according to the renderer initialization parameters and the rendering function to generate a panoramic scene picture.
In one embodiment of the present application, the apparatus may further include: a camera build module and a cylinder build module. Wherein:
the camera construction module is specifically configured to:
acquiring camera initialization parameters, wherein the camera initialization parameters comprise a visual angle, an aspect ratio, a near-end distance and a far-end distance;
creating a camera according to the camera initialization parameters;
determining the position of a camera and adjusting the angle of the camera; the angle of the camera is used for adjusting the display angle of the stage scene.
The cylinder construction module is specifically used for:
obtaining preset construction parameters, wherein the construction parameters comprise a top surface radius, a bottom surface radius, a cylinder height, the number of segments of the top surface and the bottom surface, the number of segments of the side surface and a top surface and bottom surface Boolean value;
a cylinder is created from the build parameters.
In one embodiment of the present application, the cylinder construction module is further specifically configured to:
the size of the cylinder is scaled according to the size of the stage scene so that the size of the cylinder matches the size of the pre-created stage scene.
In a third aspect, the present application provides a computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
constructing a material set according to the end-to-end consecutive planar scene graph and a preset material object;
loading the pre-created cylinder and the material set into a grid object to obtain a new grid object; the texture picture object is an inner wall map of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively;
and loading the new grid object into a pre-created stage scene, and generating a scene picture after rendering the stage scene.
In a fourth aspect, the present application provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of:
constructing a material set according to the end-to-end consecutive planar scene graph and a preset material object;
loading the pre-created cylinder and the material set into a grid object to obtain a new grid object; the texture picture object is an inner wall map of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively;
and loading the new grid object into a pre-created stage scene, and generating a scene picture after rendering the stage scene.
The scene realization method, device, computer equipment and storage medium construct a material set according to the end-to-end continuous planar scene graph and the preset material objects; load a pre-created cylinder and the material set into a grid object to obtain a new grid object, the planar scene graph being the map of the cylinder's inner wall and the preset material objects being the top and bottom surfaces of the cylinder, respectively; and load the new grid object into a pre-created stage scene, generating a scene picture after rendering the stage scene. With the scene realization method provided by the embodiments of the application, a panoramic scene picture can be obtained by acquiring an end-to-end continuous planar scene graph, using it as the inner surface of a cylindrical object, and rendering the stage scene containing that object. Panoramic material shot by professional equipment is not needed, nor is professional software needed to stitch and map panoramic material into a spherical space, so the difficulty of realizing a panoramic scene is reduced while the panoramic effect is ensured.
Drawings
Fig. 1 is an implementation environment diagram of a scenario implementation method provided in an embodiment of the present application;
fig. 2 is a flowchart of a scenario implementation method provided in an embodiment of the present application;
FIG. 3a is a schematic representation of a planar scene provided in an embodiment of the present application;
FIG. 3b is a perspective view of a cylinder according to an embodiment of the present application;
fig. 3c is a schematic view of a panoramic scene according to an embodiment of the present application;
FIG. 4 is a flowchart of another scenario implementation method provided in an embodiment of the present application;
FIG. 5 is a flowchart of another scenario implementation method provided in an embodiment of the present application;
FIG. 6 is a flowchart of another scenario implementation method provided in an embodiment of the present application;
FIG. 7 is a flowchart of another scenario implementation method provided in an embodiment of the present application;
FIG. 8 is a flowchart of another scenario implementation method provided in an embodiment of the present application;
fig. 9 is a block diagram of a scene implementation device according to an embodiment of the present application;
fig. 10 is a block diagram of another scenario implementation apparatus provided in an embodiment of the present application;
fig. 11 is a block diagram of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The scene realization method provided by the application can be applied to the application environment shown in fig. 1, in which the terminal 102 communicates with the server 104 via a network. The terminal 102 may be, but is not limited to, a personal computer, notebook computer, smartphone, tablet computer or portable wearable device, and the server 104 may be implemented by a stand-alone server or a server cluster composed of a plurality of servers. The server 104 stores the image materials and construction functions required in the panoramic-scene construction process; when a user accesses the panoramic scene of the server 104 through the terminal 102, the server 104 transmits the image materials and construction functions to the terminal 102, so that the terminal 102 generates the panoramic scene and displays it on a corresponding display device.
Referring to fig. 2, a scene implementation method provided in this embodiment is illustrated. The method is applied to the terminal in fig. 1 and includes the following steps:
step 202, constructing a material set according to the head-to-tail consecutive planar scene graph and the preset material objects.
In one embodiment of the present application, when a user accesses a panoramic scene on the server through the terminal, the terminal needs to acquire the material set corresponding to the panoramic scene. Specifically, the terminal may request the server to send the corresponding material by sending a material acquisition request. The material set corresponding to the panoramic scene comprises at least an end-to-end continuous planar scene graph and material objects.
In an embodiment of the present application, the planar scene graph is associated with the panoramic scenes, and optionally, for each panoramic scene, the server side is provided with a corresponding planar scene graph, and when the terminal accesses one of the panoramic scenes, the server sends the planar scene graph corresponding to the accessed panoramic scene to the terminal; optionally, scene attributes are set in the panoramic scene, and the server can screen out a planar scene graph most suitable for the panoramic scene from the planar scene material library according to the scene attributes of the panoramic scene and send the planar scene graph to the terminal. For example, if the panoramic scene is an urban scene, a street type plan view can be searched in a plan scene material library, and if the panoramic scene is an outdoor scene, a forest and mountain and water type plan view can be searched in the plan scene material library.
In one embodiment of the present application, the planar scene graph is generally a rectangular planar picture whose content is continuous from end to end; one pair of its opposite sides can therefore be joined by rolling the picture up to form a columnar panorama. In the planar scene diagram of fig. 3a, the left and right sides are continuous, so they can be joined as a pair of opposite sides by rolling to form the cylindrical panorama shown in fig. 3b. Because the planar scene diagram of fig. 3a is continuous left to right, the joint made during the head-to-tail connection is not visible in fig. 3b.
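As an illustration of the rolling operation (a sketch, not part of the original disclosure; the function names are assumptions), the horizontal pixel coordinate of the end-to-end continuous picture can be mapped to an angle on the cylinder wall, so that the first and last columns meet at the same seam point:

```javascript
// Sketch: wrap a seamless rectangular panorama onto a cylinder side wall.
// A pixel column x in [0, imageWidth] maps to an angle in [0, 2*PI].
function columnToAngle(x, imageWidth) {
  return (2 * Math.PI * x) / imageWidth;
}

// Point on the inner wall at height y for a cylinder of radius r.
function wallPoint(x, imageWidth, r, y) {
  const theta = columnToAngle(x, imageWidth);
  return { x: r * Math.cos(theta), y: y, z: r * Math.sin(theta) };
}
```

Because the picture is continuous left to right, columns 0 and imageWidth land on the same point in space, which is why no joint is visible after rolling.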
Step 204, loading the pre-created cylinder and the material set into the grid object to obtain a new grid object; the end-to-end continuous plane scene graph is the inner wall map of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively.
In one embodiment of the present application, the pre-built cylinder cannot be rendered on its own, i.e., a separately built cylinder cannot be rendered and displayed in the display device of the terminal; it must therefore be combined with the corresponding material set to form a new mesh object, which can be rendered and displayed. Specifically, the mesh object may be created by the THREE.Mesh(geometry, material) function, where geometry is the shape of the mesh object and material is its material. In this embodiment, geometry is the pre-built cylinder and material is the material set generated in step 202.
Specifically, the pre-constructed cylinder has a side surface, a top surface and a bottom surface; the terminal uses the obtained end-to-end continuous planar scene graph as the texture of the side surface, and uses the preset material objects as the textures of the top surface and the bottom surface, respectively. Optionally, the server may search the material object database for a sky-type material and send it to the terminal, so that the terminal uses it as the material object corresponding to the top surface; likewise, the server may search the material object database for a ground-type material and send it to the terminal, so that the terminal uses it as the material object corresponding to the bottom surface.
In one embodiment of the present application, the properties of the grid object may be set to change the position and display effect of the grid object in the stage scene. Specifically, the properties of the mesh object may include position, rotation, scale, X-axis translation, Y-axis translation, and Z-axis translation.
In a further embodiment, the side attribute of each surface of the cylinder may be set to back-side ("back") or double-sided ("double") rendering so that each material object in the material set is displayed on the inside of the cylinder, thereby creating the panoramic scene. The panoramic scene diagram of fig. 3c shows the scene obtained when the side attribute of each surface of the cylinder is set to "back", with the planar scene diagram of fig. 3a as the side wall of the cylinder.
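The need for the "back" (or "double") side setting can be checked with a short sketch (illustrative, not from the patent): a face is front-facing only when its outward normal points toward the camera, and every outward normal of the cylinder wall points away from a camera placed inside it, so default front-side rendering would cull the whole wall.

```javascript
// Sketch: a surface point is front-facing for the viewer only when its
// outward normal points toward the camera position.
function facesCamera(point, outwardNormal, cameraPos) {
  const toCam = {
    x: cameraPos.x - point.x,
    y: cameraPos.y - point.y,
    z: cameraPos.z - point.z,
  };
  const dot =
    toCam.x * outwardNormal.x +
    toCam.y * outwardNormal.y +
    toCam.z * outwardNormal.z;
  return dot > 0; // true: visible under default front-side rendering
}
```

For a camera at the origin and a wall point (12, 0, 0) with outward normal (1, 0, 0), the result is false; only back-side or double-sided rendering shows the map on the inner wall.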
Step 206, loading the new grid object into the pre-created stage scene, and generating a scene picture after rendering the stage scene.
In one embodiment of the present application, for the new mesh object, a scene picture is generated by loading it into a pre-created stage scene and rendering the stage scene with a renderer set in the stage scene.
In one embodiment of the present application, a light source object is also preset in the stage scene. Specifically, the light source object may be set as an ambient light source (AmbientLight), so that the light acts on all objects but produces no shadow effect. This improves the visibility of the panoramic scene while reducing the processing load on the terminal.
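Why an ambient light source is cheap can be seen from a minimal shading sketch (illustrative, not code from the patent): the ambient term is the same for every surface point, independent of position or normal, so no shadow or direction computation is involved.

```javascript
// Sketch: ambient shading multiplies the material colour by the light
// colour and intensity uniformly; the surface normal is never consulted.
function ambientShade(materialColor, lightColor, intensity) {
  return {
    r: materialColor.r * lightColor.r * intensity,
    g: materialColor.g * lightColor.g * intensity,
    b: materialColor.b * lightColor.b * intensity,
  };
}
```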
In the scene implementation method provided by this embodiment of the application, a material set is constructed according to the end-to-end continuous planar scene graph and the preset material objects; a pre-created cylinder and the material set are loaded into a grid object to obtain a new grid object, the planar scene graph being the map of the cylinder's inner wall and the preset material objects being the top and bottom surfaces of the cylinder, respectively; and the new grid object is loaded into a pre-created stage scene, generating a scene picture after rendering the stage scene. In this way a panoramic scene picture is obtained by acquiring an end-to-end continuous planar scene graph, using it as the inner surface of a cylindrical object, and rendering the stage scene containing that object. Panoramic material shot by professional equipment is not needed, nor is professional software needed to stitch and map panoramic material into a spherical space, so the difficulty of realizing a panoramic scene is reduced while the panoramic effect is ensured.
Referring to fig. 4, a flowchart of another scenario implementation method provided in this embodiment is shown, where the scenario implementation method may be applied to the terminal 102 in the implementation environment described above. On the basis of the embodiment shown in fig. 2, the method may further include the following steps:
step 302, obtaining camera initialization parameters, wherein the camera initialization parameters comprise a visual angle, an aspect ratio, a near-end distance and a far-end distance.
Step 304, a camera is created according to the camera initialization parameters.
In one embodiment of the present application, the camera may be either an orthographic projection camera or a perspective projection camera. When the camera type is set to orthographic projection, objects are drawn at a uniform scale regardless of their distance from the camera; when the camera type is set to perspective projection, an object becomes smaller as its distance from the camera increases.
In one embodiment of the present application, a perspective projection camera is selected as the camera type, and the camera may be created from the camera initialization parameters and a perspective-projection-camera creation function. Specifically, the THREE.PerspectiveCamera function may be used to create the camera, which requires the camera initialization parameters corresponding to the camera to be created: the visual angle fov, the aspect ratio aspect, the near-end distance near and the far-end distance far. The visual angle is the opening angle of the camera in the vertical direction; the aspect ratio is the ratio of the camera's horizontal and vertical extents and is equal to width/height; the near-end distance is the nearest distance from the camera to the observation area; the far-end distance is the furthest distance from the camera to the observation area.
In a further embodiment, the aspect ratio is the ratio of the width to the height of the current browser window; specifically, the width and height of the browser window may be obtained through window.innerWidth and window.innerHeight, respectively.
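As an illustration of how the four initialization parameters determine the projection (a sketch under the assumption of a standard OpenGL-style column-major perspective matrix; this formula is not given in the patent), the matrix can be built as:

```javascript
// Sketch: perspective projection matrix from the camera initialization
// parameters: vertical visual angle (degrees), aspect ratio, near and
// far clipping distances. Column-major, OpenGL convention.
function perspectiveMatrix(fovDeg, aspect, near, far) {
  const f = 1 / Math.tan((fovDeg * Math.PI) / 360); // cot(fov / 2)
  const nf = 1 / (near - far);
  return [
    f / aspect, 0, 0, 0,
    0, f, 0, 0,
    0, 0, (far + near) * nf, -1,
    0, 0, 2 * far * near * nf, 0,
  ];
}
```

A wider visual angle shrinks f, fitting more of the cylinder wall into view; near and far only affect the depth terms.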
Step 306, determining the position of the camera and adjusting the angle of the camera; the angle of the camera is used for adjusting the display angle of the stage scene.
In one embodiment of the present application, the position of the camera may be set by the THREE.Vector3 function, and the angle of the camera may be set by the scene.rotation.set function, through which the presentation angle of the stage scene may be adjusted. Preferably, the camera position may be set to the origin.
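The relation between the camera's rotation about the vertical axis and the part of the panorama shown can be sketched as follows (illustrative only; the mapping and names are assumptions, not the patent's code):

```javascript
// Sketch: a yaw rotation selects which pixel column of the cylindrical
// panorama sits at the centre of the view.
function centeredColumn(yawRadians, imageWidth) {
  const twoPi = 2 * Math.PI;
  const wrapped = ((yawRadians % twoPi) + twoPi) % twoPi; // fold into [0, 2*PI)
  return (wrapped / twoPi) * imageWidth;
}
```

Setting the initial yaw therefore fixes the initial display angle of the stage scene.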
In the scene implementation method provided by the embodiment of the application, camera initialization parameters are obtained, wherein the camera initialization parameters comprise a visual angle, an aspect ratio, a near-end distance and a far-end distance; creating a camera according to the camera initialization parameters; determining the position of a camera and adjusting the angle of the camera; the angle of the camera is used for adjusting the display angle of the stage scene. According to the scene realization method provided by the embodiment of the application, the camera can be created according to the preset camera initialization parameters, and the initial display angle of the stage scene is controlled by adjusting the initial angle of the camera, so that the controllability of the panoramic scene is improved.
Referring to fig. 5, a flowchart of another scenario implementation method provided in this embodiment is shown, where the scenario implementation method may be applied to the terminal 102 in the implementation environment described above. On the basis of the embodiment shown in fig. 2 or fig. 4, the method may further include the following steps:
step 402, obtaining preset construction parameters, wherein the construction parameters comprise a top surface radius, a bottom surface radius, a cylinder height, the number of segments of the top surface and the bottom surface, the number of segments of the side surface and a top surface and bottom surface Boolean value.
Step 404, creating a cylinder according to the build parameters.
In one embodiment of the present application, the cylinder is created according to preset build parameters and a cylinder creation function. Specifically, the cylinder may be created with the THREE.CylinderGeometry function, which requires the build parameters of the cylinder to be created: the top-surface radius, the bottom-surface radius, the cylinder height, the number of top- and bottom-surface segments, the number of side-surface segments, and the top/bottom Boolean value. The top/bottom Boolean value is of Boolean type and defaults to false, indicating that the top and bottom surfaces are present; when the top and bottom radii are set to the same value, a standard cylinder is created. In a specific embodiment, the top and bottom radii are set to 12, the cylinder height to 11, the number of top- and bottom-surface segments and of side-surface segments to 72, and the top/bottom Boolean value to false, meaning that a cylinder with top and bottom radii of 12 and a height of 11 is created.
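A minimal sketch of the side-wall geometry implied by these build parameters (illustrative; not the THREE.CylinderGeometry source): two vertex rings whose radial segment count controls how round the cylinder looks, with equal radii giving a standard cylinder.

```javascript
// Sketch: vertex rings for a cylinder side wall. radialSegments + 1
// angular samples are generated so that the last column duplicates the
// first, closing the wall where the panorama's head and tail meet.
function cylinderSideVertices(radiusTop, radiusBottom, height, radialSegments) {
  const verts = [];
  for (let i = 0; i <= radialSegments; i++) {
    const theta = (2 * Math.PI * i) / radialSegments;
    // one vertex on the top ring, one on the bottom ring
    verts.push([radiusTop * Math.cos(theta), height / 2, radiusTop * Math.sin(theta)]);
    verts.push([radiusBottom * Math.cos(theta), -height / 2, radiusBottom * Math.sin(theta)]);
  }
  return verts;
}
```

With the embodiment's values (radii 12, height 11, 72 segments) this yields 146 vertices describing a standard cylinder.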
In order to match the created cylinder with the size of the stage scene, the method further comprises the following steps:
in step 406, the size of the cylinder is scaled according to the size of the stage scene, so that the size of the cylinder matches the size of the pre-created stage scene.
In one embodiment of the present application, the terminal uses the size of the current browser window as the size of the stage scene, so that the created cylinder needs to be scaled in equal proportion according to the size of the stage scene, and the scaled cylinder can be matched with the size of the stage scene created in advance.
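The equal-proportion scaling can be sketched as a single uniform factor (an assumption about the matching rule; the patent does not give the formula):

```javascript
// Sketch: uniform scale factor so that the cylinder's diameter spans
// the smaller dimension of the stage (browser window).
function fitScale(cylinderDiameter, stageWidth, stageHeight) {
  return Math.min(stageWidth, stageHeight) / cylinderDiameter;
}
```

Applying the same factor to all three axes keeps the cylinder's proportions while matching it to the stage scene.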
In the scene implementation method provided by this embodiment of the application, the preset construction parameters are obtained, the construction parameters comprising the top-surface radius, the bottom-surface radius, the cylinder height, the number of top- and bottom-surface segments, the number of side-surface segments and the top/bottom Boolean value; the cylinder is created from the construction parameters; and the size of the cylinder is scaled according to the size of the stage scene so that it matches the size of the pre-created stage scene. Because the cylinder is created from preset construction parameters and then scaled to the stage scene, a cylinder better matched to the browser window is obtained, and the final panoramic scene is presented more faithfully in the browser, improving the realism of the panoramic scene.
Referring to fig. 6, a flowchart of another scenario implementation method provided in this embodiment is shown, where the scenario implementation method may be applied to the terminal 102 in the implementation environment described above. Based on the embodiment shown in fig. 2 or fig. 4, the step 202 may specifically include the following steps:
step 502, creating a texture picture object according to the end-to-end consecutive planar scene graphs.
Step 504, constructing a material set according to the texture picture object and a preset material object; the preset material object comprises two transparent plan views.
In one embodiment of the present application, the texture picture object and the preset material object may be stored by creating a texture array to form a material set. The texture array comprises the texture picture object generated from the end-to-end consecutive planar scene graph and two transparent plan-view material objects. Specifically, the texture picture object may be saved to the texture array by the THREE.MeshBasicMaterial function; the two transparent plan views, with the transparency attribute set to true, can also be saved as preset material objects into the texture array through the THREE.MeshBasicMaterial function.
In another embodiment of the present application, the color, wireframe width, wireframe line segment end points, wireframe line segment connection points, shading, vertex color, and fog properties of the texture picture object may also be set by the THREE.MeshBasicMaterial function.
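The material set described above can be sketched as a plain array of descriptors: one textured material for the cylinder's inner wall plus two fully transparent materials for the caps. The field names mirror THREE.MeshBasicMaterial options, but this is an illustration in plain objects, not the three.js API itself, and the [side, top, bottom] ordering is an assumption:

```javascript
// Sketch of the material set: wall texture plus two transparent caps.
function buildMaterialSet(texturePictureObject) {
  const wallMaterial = { map: texturePictureObject, transparent: false };
  const capMaterial = { map: null, transparent: true, opacity: 0 };
  // Assumed order: [side, top, bottom], the grouping a cylinder geometry
  // uses when a material array is supplied.
  return [wallMaterial, capMaterial, { ...capMaterial }];
}

const materials = buildMaterialSet({ id: "panorama-texture" });
console.log(materials.length); // 3
console.log(materials[1].transparent); // true
console.log(materials[0].map.id); // 'panorama-texture'
```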
In the scene implementation method provided by the embodiment of the application, a texture picture object is created according to the end-to-end consecutive planar scene graphs; constructing a material set according to the texture picture object and a preset material object; the preset material object comprises two transparent plan views. According to the scene realization method provided by the embodiment of the application, the material set required for constructing the panoramic scene can be rapidly generated.
Referring to fig. 7, a flowchart of another scenario implementation method provided in this embodiment is shown, where the scenario implementation method may be applied to the terminal 102 in the implementation environment described above. Based on the embodiment shown in fig. 6, the step 502 may specifically include the following steps:
step 602, obtaining picture addresses of end-to-end consecutive planar scene graphs.
In one embodiment of the present application, the terminal may request and receive a picture address of the end-to-end consecutive planar scene graph from the server, where the picture address may be a network address, a local path, and so on. The planar scene graph can be obtained through the picture address.
Step 604, loading the end-to-end consecutive planar scene graph according to the picture address to form a texture picture object; the texture picture object is in a multi-level progressive texture format.
In one embodiment of the present application, the planar scene graph corresponding to the picture address may be loaded by a three.js texture loading function (for example, THREE.TextureLoader) to generate the texture picture object.
In one embodiment of the present application, when the texture picture object is generated from the planar scene graph, the multi-level progressive texture property of the texture picture object is activated. Specifically, the multi-level progressive texture attribute generateMipmaps of the texture picture object may be set to true to place it in the multi-level progressive texture format. In this embodiment, after generateMipmaps is set to true, a set of scaled-down texture maps is generated in advance, and a smaller texture map is automatically used when the camera is far from the texture. This typically occupies about 33% more memory space, a typical trade of space for time.
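The 33% figure follows from the mipmap chain itself: each successive level halves the width and height, so it holds one quarter as many texels, and the overhead is the geometric series 1/4 + 1/16 + ... which converges to 1/3. A small sketch verifies this:

```javascript
// Sketch: the memory overhead of a full mipmap chain for a square
// power-of-two texture, as a fraction of the base level's texel count.
function mipmapOverhead(size) {
  const base = size * size; // texels in level 0
  let extra = 0;
  for (let s = size >> 1; s >= 1; s >>= 1) {
    extra += s * s; // texels in each successively smaller level
  }
  return extra / base;
}

console.log(mipmapOverhead(1024).toFixed(4)); // 0.3333, i.e. about 33%
```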
In a further embodiment, the minFilter and magFilter attributes also need to be set. Specifically, both the minFilter attribute and the magFilter attribute may be set to THREE.LinearFilter. Since the texture changes frequently in this embodiment, a simple and efficient filter is needed, namely the LinearFilter linear filter.
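To illustrate what linear filtering does (a one-dimensional sketch of the idea only; THREE.LinearFilter itself runs on the GPU and interpolates in two dimensions), sampling between texels interpolates their values instead of snapping to the nearest texel:

```javascript
// Sketch: linear filtering over a one-dimensional row of texels.
// Sampling at a coordinate between two texels blends them, which is
// cheap and avoids the blockiness of nearest-neighbour filtering.
function sampleLinear(texels, u) {
  const x = u * (texels.length - 1); // u in [0, 1] maps across the row
  const i = Math.floor(x);
  if (i >= texels.length - 1) return texels[texels.length - 1];
  const t = x - i; // fractional position between texel i and i + 1
  return texels[i] * (1 - t) + texels[i + 1] * t;
}

const row = [0, 100]; // a two-texel black-to-white gradient
console.log(sampleLinear(row, 0.5)); // 50: halfway between the texels
```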
In the scene implementation method provided by the embodiment of the application, the picture address of the end-to-end consecutive planar scene graph is obtained; the end-to-end consecutive planar scene graph is loaded according to the picture address to form a texture picture object; and the texture picture object is in a multi-level progressive texture format. According to the scene implementation method provided by the embodiment of the application, setting the format of the texture picture object to the multi-level progressive texture format allows a group of reduced texture maps to be generated in advance; when the distance between the camera and the texture changes, the corresponding texture map is called directly, so that the response time of the panoramic scene is shortened and the viewing experience of the user is improved.
Referring to fig. 8, a flowchart of another scenario implementation method provided in the present embodiment is shown, where the scenario implementation method may be applied to the terminal 102 in the implementation environment described above. Based on the embodiment shown in fig. 2 or fig. 4, the step 206 may specifically include the following steps:
step 702, obtaining renderer initialization parameters, wherein the renderer initialization parameters include an antialiasing parameter, a shading precision parameter, and a texture precision parameter.
Step 704, rendering the stage scene according to the renderer initialization parameter and the rendering function, and generating a panoramic scene picture.
In one embodiment of the present application, the stage scene may be rendered by the THREE.WebGLRenderer function and the renderer initialization parameters to generate a panoramic scene picture. Specifically, the renderer initialization parameters include an antialiasing parameter antialias, indicating whether antialiasing is enabled; a shading precision parameter precision, representing the shading precision selection; and a texture precision parameter mipmap, representing the texture precision selection. In a specific embodiment, the antialiasing parameter antialias may be set to false, indicating that antialiasing is not enabled; the shading precision parameter precision and the texture precision parameter mipmap are set to highp, indicating that the shading precision and the texture precision are high.
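The renderer initialization parameters of the specific embodiment can be sketched as an options object. The antialias and precision fields match real THREE.WebGLRenderer constructor options; the mipmap field is the patent's own texture-precision parameter, not a standard three.js option:

```javascript
// Sketch of the renderer initialization parameters from the embodiment.
const rendererInitParams = {
  antialias: false,   // antialiasing not enabled
  precision: "highp", // high shading precision
  mipmap: "highp"     // high texture precision (patent-specific field)
};

// Summarize the configuration; in three.js the object would be passed
// as new THREE.WebGLRenderer(rendererInitParams).
function describe(params) {
  return `antialias=${params.antialias}, precision=${params.precision}`;
}

console.log(describe(rendererInitParams));
// antialias=false, precision=highp
```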
In the scene implementation method provided by the embodiment of the application, the renderer initialization parameters are acquired, wherein the renderer initialization parameters comprise an antialiasing parameter, a coloring precision parameter and a texture precision parameter; and rendering the stage scene according to the renderer initialization parameters and the rendering function to generate a panoramic scene picture. According to the scene realization method provided by the embodiment of the application, the obtained stage scene can be rapidly rendered according to the initialization parameters of the renderer, and the panoramic scene is obtained.
It should be understood that, although the steps in the above-described flowcharts are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, there is no strict restriction on the execution order of these steps, and they may be performed in other orders. Moreover, at least some of the steps in the above flowcharts may include a plurality of sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different moments; their execution order is likewise not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Referring to fig. 9, a block diagram of a scenario implementation apparatus 900 provided in an embodiment of the present application is shown. As shown in fig. 9, the scene implementation apparatus 900 may include: a material set construction module 901, a mesh object generation module 902, and a rendering module 903, wherein:
the material set construction module 901 is configured to construct a material set according to the end-to-end consecutive planar scene graphs and the preset material objects;
the grid object generating module 902 is configured to load a pre-created cylinder and the material set into a grid object to obtain a new grid object; the end-to-end consecutive planar scene graph is a map of the inner wall of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively;
the rendering module 903 is configured to load a new mesh object into a pre-created stage scene, and render the stage scene to generate a scene frame.
In one embodiment of the present application, the material set construction module 901 is specifically configured to:
creating a texture picture object according to the end-to-end consecutive planar scene graph;
constructing a material set according to the texture picture object and a preset material object; the preset material object comprises two transparent plan views.
In one embodiment of the present application, the material set construction module 901 is further specifically configured to:
acquiring picture addresses of the end-to-end consecutive planar scene graphs;
loading a head-to-tail coherent planar scene graph according to the picture address to form a texture picture object; the texture picture object is in a multi-level progressive texture format.
In one embodiment of the present application, the rendering module 903 is specifically configured to:
acquiring an initialization parameter of a renderer, wherein the initialization parameter of the renderer comprises an antialiasing parameter, a coloring precision parameter and a texture precision parameter;
and rendering the stage scene according to the renderer initialization parameters and the rendering function to generate a panoramic scene picture.
Referring to fig. 10, a block diagram of a scenario implementation apparatus 1000 provided in an embodiment of the present application is shown. As shown in fig. 10, the scenario implementation apparatus 1000 may optionally further include, in addition to the modules included in the scenario implementation apparatus 900: a camera build module 904 and a cylinder build module 905. Wherein:
the camera construction module 904 is specifically configured to:
acquiring camera initialization parameters, wherein the camera initialization parameters comprise a visual angle, an aspect ratio, a near-end distance and a far-end distance;
creating a camera according to the camera initialization parameters;
determining the position of a camera and adjusting the angle of the camera; the angle of the camera is used for adjusting the display angle of the stage scene.
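The four camera initialization parameters above define a perspective frustum. As an illustrative sketch (the specific values are assumptions; the parameters correspond to those of a three.js PerspectiveCamera), the visible height at a given distance follows directly from the view angle as h = 2 · d · tan(fov/2):

```javascript
// Sketch: visible height of a perspective frustum at a given distance.
function visibleHeightAt(fovDegrees, distance) {
  const fovRad = (fovDegrees * Math.PI) / 180;
  return 2 * distance * Math.tan(fovRad / 2);
}

// Illustrative camera initialization parameters (assumed values).
const cameraInit = {
  fov: 90,        // visual angle in degrees
  aspect: 16 / 9, // aspect ratio, e.g. window width / height
  near: 0.1,      // near-end distance of the frustum
  far: 1000       // far-end distance of the frustum
};

// At distance 10 with a 90-degree view angle the camera sees a height
// of 2 * 10 * tan(45 deg) = 20 units; the visible width is height * aspect.
console.log(visibleHeightAt(cameraInit.fov, 10)); // approximately 20
```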
The cylinder construction module 905 is specifically configured to:
obtaining preset construction parameters, wherein the construction parameters comprise a top surface radius, a bottom surface radius, a cylinder height, the number of segments of the top surface and the bottom surface, the number of segments of the side surface and a top surface and bottom surface Boolean value;
a cylinder is created from the build parameters.
In one embodiment of the present application, the cylinder construction module 905 is also specifically configured to:
the size of the cylinder is scaled according to the size of the stage scene so that the size of the cylinder matches the size of the pre-created stage scene.
For specific limitations of the scenario implementation apparatus, reference may be made to the above limitations of the scenario implementation method, which are not repeated here. The respective modules in the above-described scene implementation apparatus may be implemented in whole or in part by software, hardware, and combinations thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a terminal, and the internal structure thereof may be as shown in fig. 11. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, implements a scene implementation method. The display screen of the computer device may be a liquid crystal display or an electronic ink display; the input device may be a touch layer covering the display screen, a key, a trackball, or a touchpad provided on the housing of the computer device, or an external keyboard, touchpad, or mouse.
It will be appreciated by those skilled in the art that the structure shown in fig. 11 is merely a block diagram of a portion of the structure associated with the present application and is not limiting of the computer device to which the present application applies, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In one embodiment, a computer device is provided comprising a memory and a processor, the memory having stored therein a computer program, the processor when executing the computer program performing the steps of:
constructing a material set according to the end-to-end consecutive planar scene graph and a preset material object;
loading a pre-created cylinder and a material set into a grid object to obtain a new grid object; the end-to-end continuous plane scene graph is a map of the inner wall of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively;
and loading the new grid object into a pre-created stage scene, and generating a scene picture after rendering the stage scene.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
constructing a material set according to the end-to-end consecutive planar scene graph and a preset material object;
loading a pre-created cylinder and a material set into a grid object to obtain a new grid object; the end-to-end continuous plane scene graph is a map of the inner wall of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively;
and loading the new grid object into a pre-created stage scene, and generating a scene picture after rendering the stage scene.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the various embodiments provided herein may include non-volatile and/or volatile memory. The nonvolatile memory can include Read Only Memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double Data Rate SDRAM (DDRSDRAM), enhanced SDRAM (ESDRAM), synchronous Link DRAM (SLDRAM), memory bus direct RAM (RDRAM), direct memory bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM), among others.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above examples merely represent a few embodiments of the present application, which are described in more detail and are not to be construed as limiting the scope of the invention. It should be noted that it would be apparent to those skilled in the art that various modifications and improvements could be made without departing from the spirit of the present application, which would be within the scope of the present application. Accordingly, the scope of protection of the present application is to be determined by the claims appended hereto.

Claims (10)

1. A method for implementing a scene, the method comprising:
constructing a material set according to the end-to-end consecutive planar scene graph and a preset material object;
loading the pre-created cylinder and the material set into a grid object to obtain a new grid object; the end-to-end consecutive plane scene diagrams are inner wall maps of the cylinder, and the preset material objects are respectively the top surface and the bottom surface of the cylinder;
loading the new grid object into a pre-created stage scene, and generating a scene picture after rendering the stage scene;
acquiring a camera initialization parameter, creating a camera according to the camera initialization parameter, determining the position of the camera, and adjusting the angle of the camera; the angle of the camera is used for adjusting the display angle of the stage scene, and the camera initialization parameters comprise a visual angle, an aspect ratio, a near-end distance and a far-end distance.
2. The method according to claim 1, wherein the method further comprises:
setting the attribute of the grid object, and changing the position and display effect of the grid object in the stage scene; wherein the properties of the mesh object include position, rotation, scale, X-axis translation, Y-axis translation, and Z-axis translation.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
acquiring preset construction parameters, wherein the construction parameters comprise a top surface radius, a bottom surface radius, a cylinder height, the number of segments of the top surface and the bottom surface, the number of segments of the side surface and a top surface and bottom surface Boolean value;
the cylinder is created from the build parameters.
4. A method according to claim 3, wherein after said creating said cylinder from said build parameters, further comprising:
and scaling the size of the cylinder according to the size of the stage scene so that the size of the cylinder is matched with the size of the pre-created stage scene.
5. The method according to claim 1 or 2, wherein constructing the material set from the end-to-end consecutive planar scene graphs and the preset material objects comprises:
creating a texture picture object according to the end-to-end consecutive planar scene graph;
constructing a material set according to the texture picture object and the preset material object; the preset material object comprises two transparent plane views.
6. The method of claim 5, wherein creating the texture picture object according to the end-to-end consecutive planar scene graph further comprises:
acquiring picture addresses of the head-tail consecutive planar scene pictures;
loading the end-to-end coherent planar scene graph according to the picture address to form the texture picture object; the texture picture object is in a multi-stage progressive texture format.
7. The method according to claim 1 or 2, wherein generating a scene picture after rendering the stage scene comprises:
acquiring an initialization parameter of a renderer, wherein the initialization parameter of the renderer comprises an antialiasing parameter, a coloring precision parameter and a texture precision parameter;
and rendering the stage scene according to the renderer initialization parameters and the rendering function to generate a panoramic scene picture.
8. A scene realization apparatus, the apparatus comprising:
the material set construction module is used for constructing a material set according to the head-tail coherent planar scene graph and the preset material objects;
the grid object generation module loads the pre-created cylinder and the material set into the grid object to obtain a new grid object; the end-to-end consecutive plane scene diagrams are inner wall maps of the cylinder, and the preset material objects are respectively the top surface and the bottom surface of the cylinder;
the rendering module is used for loading the new grid object into a pre-created stage scene, and generating a scene picture after rendering the stage scene;
the camera construction module is used for acquiring camera initialization parameters, creating a camera according to the camera initialization parameters, determining the position of the camera and adjusting the angle of the camera; the angle of the camera is used for adjusting the display angle of the stage scene, and the camera initialization parameters comprise a visual angle, an aspect ratio, a near-end distance and a far-end distance.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 7 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 7.
CN201910970163.5A 2019-10-12 2019-10-12 Scene realization method, device, computer equipment and storage medium Active CN110853143B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910970163.5A CN110853143B (en) 2019-10-12 2019-10-12 Scene realization method, device, computer equipment and storage medium


Publications (2)

Publication Number Publication Date
CN110853143A CN110853143A (en) 2020-02-28
CN110853143B true CN110853143B (en) 2023-05-16

Family

ID=69596285

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910970163.5A Active CN110853143B (en) 2019-10-12 2019-10-12 Scene realization method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110853143B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112464126B (en) * 2020-12-14 2022-07-15 厦门市美亚柏科信息股份有限公司 Method for generating panoramic chart based on Threejs, terminal equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102750724B (en) * 2012-04-13 2018-12-21 广东赛百威信息科技有限公司 A kind of three peacekeeping panoramic system automatic-generationmethods based on image
CN107103638B (en) * 2017-05-27 2020-10-16 杭州万维镜像科技有限公司 Rapid rendering method of virtual scene and model
WO2019058266A1 (en) * 2017-09-21 2019-03-28 Varghese Thombra Sobin A system and method for conversion of a floor plan to a 3d scene for creation & rendering of virtual reality architectural scenes, walk through videos and images
CN110262763B (en) * 2018-03-21 2021-10-15 腾讯科技(深圳)有限公司 Augmented reality-based display method and apparatus, storage medium, and electronic device
CN108681987A (en) * 2018-05-10 2018-10-19 广州腾讯科技有限公司 The method and apparatus for generating panorama slice map



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230524

Address after: Room 101, No. 227 Gaotang Road, Tianhe District, Guangzhou City, Guangdong Province, 510665 (Location: Room 601)

Patentee after: Yamei Zhilian Data Technology Co.,Ltd.

Address before: Room 201, No.1 Hanjing Road, Tianhe District, Guangzhou City, Guangdong Province

Patentee before: GUANGZHOU YAME INFORMATION TECHNOLOGY Co.,Ltd.