CN110853143A - Scene implementation method and device, computer equipment and storage medium - Google Patents


Info

Publication number: CN110853143A (application CN201910970163.5A); granted publication: CN110853143B
Authority: CN (China)
Prior art keywords: scene, cylinder, picture, camera, tail
Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Inventors: 王征, 冯智泉, 江勇
Current assignee: Yamei Zhilian Data Technology Co., Ltd. (the listed assignees may be inaccurate)
Original assignee: Guangzhou Yamei Information Science & Technology Co., Ltd.
Filing: application CN201910970163.5A filed by Guangzhou Yamei Information Science & Technology Co., Ltd.; publication of CN110853143A; application granted and published as CN110853143B; legal status active.

Classifications

    • G (Physics) > G06 (Computing; calculating or counting) > G06T (Image data processing or generation, in general)
    • G06T 17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/10: Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
    • G06T 15/00: 3D [three-dimensional] image rendering
    • G06T 15/005: General-purpose rendering architectures
    • G06T 15/04: Texture mapping
    • G06T 2210/00: Indexing scheme for image generation or computer graphics
    • G06T 2210/61: Scene description
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Image Generation (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

The application relates to a scene implementation method and apparatus, a computer device, and a storage medium. The method comprises the following steps: constructing a material set according to a head-to-tail consecutive planar scene graph and preset material objects; loading a pre-created cylinder and the material set into a grid object to obtain a new grid object, wherein the head-to-tail consecutive planar scene graph is the map of the cylinder's inner wall and the preset material objects are the top surface and the bottom surface of the cylinder respectively; and loading the new grid object into a pre-created stage scene and rendering the stage scene to generate a scene picture. With this method, neither panoramic material shot with professional equipment nor stitching of that material with professional software is needed, so the panoramic effect is ensured while the difficulty of implementing a panoramic scene is reduced.

Description

Scene implementation method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a scene implementation method and apparatus, a computer device, and a storage medium.
Background
A panoramic view represents the surrounding environment as completely as possible through wide-angle forms such as paintings, photographs, videos and three-dimensional models. A panoramic picture is a picture obtained by capturing image information of an entire scene with a professional camera, or rendered by modeling software; after the pictures are stitched with software and played back with a dedicated player, they form a panoramic scene for virtual-reality browsing, simulating a two-dimensional planar picture as a real three-dimensional space presented to the user.
In the conventional technology, implementing a panoramic scene usually requires professional equipment to capture the original panoramic material, such as a single-lens reflex camera combined with a fisheye lens, tripod and pan-tilt head, or a dedicated panoramic camera; after the panoramic material is obtained, professional personnel must still stitch the material and map it onto a spherical space with professional software.
It can be seen that the traditional implementation of a panoramic scene not only places high professional demands on the photographer and requires expensive shooting equipment, but the material is also difficult to process, so the threshold for constructing a panoramic scene is too high and an ordinary user cannot implement one quickly.
Disclosure of Invention
Therefore, it is necessary to provide a simple and fast scene implementation method, apparatus, computer device and storage medium for solving the above technical problems.
In a first aspect, the present application provides a method for implementing a scene, where the method includes:
constructing a material set according to the head-to-tail coherent plane scene graph and a preset material object;
loading a cylinder and the material set which are created in advance into a grid object to obtain a new grid object; the texture picture object is an inner wall map of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively;
and loading the new grid object into a pre-created stage scene, and rendering the stage scene to generate a scene picture.
In one embodiment of the present application, the method further comprises:
acquiring camera initialization parameters, wherein the camera initialization parameters comprise a visual angle, an aspect ratio, a near-end distance and a far-end distance;
creating a camera according to the camera initialization parameters;
determining a position of the camera and adjusting an angle of the camera; the angle of the camera is used for adjusting the display angle of the stage scene.
In one embodiment of the present application, the method further comprises:
acquiring preset construction parameters, wherein the construction parameters comprise a top surface radius, a bottom surface radius, a cylinder height, the number of sections of the top surface and the bottom surface, the number of sections of the side surface and a Boolean value of the top surface and the bottom surface;
creating the cylinder according to the build parameters.
In an embodiment of the application, after the creating the cylinder according to the building parameters, the method further includes:
scaling the size of the cylinder according to the size of the stage scene to match the size of the cylinder with the size of the pre-created stage scene.
In an embodiment of the present application, the constructing a material set according to an end-to-end consecutive planar scene graph and a preset material object includes:
creating texture picture objects according to head-to-tail coherent plane scene graphs;
constructing a material set according to the texture picture object and the preset material object; the preset material object comprises two transparent plan views.
In an embodiment of the present application, the creating a texture picture object according to a head-to-tail consecutive planar scene graph further includes:
acquiring picture addresses of the head-to-tail consecutive planar scene graphs;
loading the head-to-tail consecutive planar scene graphs according to the picture addresses to form the texture picture object; the format of the texture picture object is a multi-level progressive texture (mipmap) format.
In an embodiment of the present application, the generating a scene picture after rendering the stage scene includes:
acquiring renderer initialization parameters which comprise an anti-aliasing parameter, a coloring precision parameter and a texture precision parameter;
and rendering the stage scene according to the renderer initialization parameter and the rendering function to generate a panoramic scene picture.
In a second aspect, the present application provides a scene implementation apparatus, including:
the material set building module is used for building a material set according to the head-to-tail consecutive plane scene graphs and preset material objects;
the grid object generation module is used for loading a cylinder and the material set which are created in advance into a grid object to obtain a new grid object; the texture picture object is an inner wall map of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively;
and the rendering module is used for loading the new grid object into a pre-created stage scene, and generating a scene picture after rendering the stage scene.
In an embodiment of the application, the material set building module is specifically configured to:
creating texture picture objects according to head-to-tail coherent plane scene graphs;
constructing a material set according to the texture picture object and a preset material object; the preset material object comprises two transparent plan views.
In an embodiment of the application, the material set building module is further specifically configured to:
acquiring picture addresses of head-to-tail consecutive planar scene graphs;
loading head-to-tail consecutive planar scene graphs according to the picture addresses to form texture picture objects; the format of the texture picture object is a multi-level progressive texture format.
In an embodiment of the present application, the rendering module is specifically configured to:
acquiring renderer initialization parameters which comprise an anti-aliasing parameter, a coloring precision parameter and a texture precision parameter;
and rendering the stage scene according to the renderer initialization parameter and the rendering function to generate a panoramic scene picture.
In one embodiment of the present application, the apparatus may further include: a camera building module and a cylinder building module. Wherein:
the camera building module is specifically configured to:
acquiring camera initialization parameters, wherein the camera initialization parameters comprise a visual angle, an aspect ratio, a near-end distance and a far-end distance;
establishing a camera according to the camera initialization parameters;
determining the position of a camera and adjusting the angle of the camera; the angle of the camera is used to adjust the display angle of the stage scene.
The cylinder building module is specifically configured to:
acquiring preset construction parameters, wherein the construction parameters comprise a top surface radius, a bottom surface radius, a cylinder height, the number of sections of the top surface and the bottom surface, the number of sections of the side surface and a Boolean value of the top surface and the bottom surface;
a cylinder is created according to the build parameters.
In an embodiment of the present application, the cylinder building module is further specifically configured to:
the size of the cylinder is scaled according to the size of the stage scene to match the size of the cylinder to the size of the pre-created stage scene.
In a third aspect, the present application provides a computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
constructing a material set according to the head-to-tail coherent plane scene graph and a preset material object;
loading a cylinder and the material set which are created in advance into a grid object to obtain a new grid object; the texture picture object is an inner wall map of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively;
and loading the new grid object into a pre-created stage scene, and rendering the stage scene to generate a scene picture.
In a fourth aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of:
constructing a material set according to the head-to-tail coherent plane scene graph and a preset material object;
loading a cylinder and the material set which are created in advance into a grid object to obtain a new grid object; the texture picture object is an inner wall map of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively;
and loading the new grid object into a pre-created stage scene, and rendering the stage scene to generate a scene picture.
According to the scene implementation method, the scene implementation device, the computer equipment and the storage medium, a material set is constructed according to the head-to-tail consecutive planar scene graphs and the preset material objects; loading a cylinder and a material set which are created in advance into a grid object to obtain a new grid object; the head-to-tail consecutive plane scene pictures are the pictures pasted on the inner wall of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively; and loading the new grid object into the pre-created stage scene, and rendering the stage scene to generate a scene picture. According to the scene implementation method provided by the embodiment of the application, the panoramic scene picture can be obtained by acquiring the head-to-tail connected planar scene pictures, taking the planar scene pictures as the inner surface of the cylindrical object and rendering the stage scene containing the cylindrical object.
Drawings
Fig. 1 is a diagram of an implementation environment of a scene implementation method provided in an embodiment of the present application;
Fig. 2 is a flowchart of a scene implementation method provided in an embodiment of the present application;
Fig. 3a is a schematic view of a planar scene provided in an embodiment of the present application;
Fig. 3b is a cylindrical panorama provided in an embodiment of the present application;
Fig. 3c is a schematic view of a panoramic scene provided in an embodiment of the present application;
Fig. 4 is a flowchart of another scene implementation method provided in an embodiment of the present application;
Fig. 5 is a flowchart of another scene implementation method provided in an embodiment of the present application;
Fig. 6 is a flowchart of another scene implementation method provided in an embodiment of the present application;
Fig. 7 is a flowchart of another scene implementation method provided in an embodiment of the present application;
Fig. 8 is a flowchart of another scene implementation method provided in an embodiment of the present application;
Fig. 9 is a block diagram of a scene implementation apparatus provided in an embodiment of the present application;
Fig. 10 is a block diagram of another scene implementation apparatus provided in an embodiment of the present application;
Fig. 11 is a block diagram of a computer device provided in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The scene implementation method provided by the application can be applied to the application environment shown in fig. 1, in which the terminal 102 and the server 104 communicate via a network. The terminal 102 may be, but is not limited to, a personal computer, notebook computer, smart phone, tablet computer or portable wearable device, and the server 104 may be implemented by an independent server or a server cluster formed by a plurality of servers. The server 104 stores the image materials and construction functions required in the panoramic scene construction process; when a user accesses a panoramic scene on the server 104 through the terminal 102, the server 104 transmits the image materials and construction functions to the terminal 102, so that the terminal 102 generates the panoramic scene and displays it on a corresponding display device.
Please refer to fig. 2, which shows a scene implementation method provided in this embodiment, and taking the application of this method to the terminal in fig. 1 as an example for description, the method includes the following steps:
and 202, constructing a material set according to the head-to-tail consecutive plane scene graph and a preset material object.
In an embodiment of the present application, when a user accesses a panoramic scene in the server through the terminal, the terminal first needs to obtain the material set corresponding to the panoramic scene. Specifically, the terminal may send a material obtaining request asking the server to return the material corresponding to the panoramic scene; the material set corresponding to the panoramic scene comprises at least a head-to-tail consecutive planar scene graph and material objects.
In an embodiment of the present application, the planar scene graph is associated with the panoramic scene, optionally, for each panoramic scene, the server side is provided with a corresponding planar scene graph, and when the terminal accesses one of the panoramic scenes, the server sends the planar scene graph corresponding to the accessed panoramic scene to the terminal; optionally, a scene attribute is set in the panoramic scene, and the server may screen a planar scene graph most suitable for the panoramic scene in the planar scene material library according to the scene attribute of the panoramic scene and send the planar scene graph to the terminal. For example, if the panoramic scene is an urban scene, a street-type planar scene graph can be searched in a planar scene material library, and if the panoramic scene is an outdoor scene, a forest-type planar scene graph and a mountain-and-water-type planar scene graph can be searched in the planar scene material library.
In one embodiment of the present application, the planar scene graph is generally a rectangular planar picture whose content is continuous from end to end; therefore, one pair of opposite sides of the rectangle can be curled around and joined to form a cylindrical panorama. In the plan view of fig. 3a, the left and right sides are continuous with each other, so these two sides can be joined as a pair of opposite sides by curling, forming the cylindrical panorama shown in fig. 3b. Because the plan view of fig. 3a is continuous, the seam produced by joining the two sides end to end is not visible in the cylindrical panorama of fig. 3b.
Step 204, loading a cylinder and a material set which are created in advance into a grid object to obtain a new grid object; the head-to-tail consecutive plane scene pictures are the pictures pasted on the inner wall of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively.
In one embodiment of the present application, the pre-constructed cylinder cannot be rendered by itself; that is, the separately constructed cylinder cannot be rendered and displayed on the display device of the terminal. Therefore, the pre-constructed cylinder needs to be combined with the corresponding material set to form a new mesh object, which can be rendered and displayed. Specifically, the mesh object may be created through the mesh object construction function Mesh(geometry, material), where geometry represents the shape of the mesh object and material represents its material. In this embodiment, geometry is the pre-constructed cylinder and material is the material set generated in step 202.
Specifically, the pre-constructed cylinder has a side surface, a top surface and a bottom surface, the terminal takes the obtained head-to-tail coherent plane scene images as the texture of the side surface, and the preset material objects are the texture of the top surface and the texture of the bottom surface respectively. Optionally, the server may search the material of the sky category in the material object database and send the material to the terminal, so that the terminal uses the material of the sky category as the material object corresponding to the top surface; the server can search the material object database for the material of the ground class and send the material to the terminal, so that the terminal takes the material of the ground class as the material object corresponding to the bottom surface.
In one embodiment of the present application, attributes of a mesh object may be set to change the position and display effect of the mesh object in a stage scene. Specifically, the attributes of the mesh object may include position, rotation, scale, X-axis translation, Y-axis translation, and Z-axis translation.
In a further embodiment, the side property of each surface of the cylinder may be set to back (THREE.BackSide) or double (THREE.DoubleSide) so that each material object in the material set is displayed on the inside of the cylinder, thereby creating a panoramic scene. The panoramic scene diagram of fig. 3c shows the result when the side property of each surface is set to back and the planar scene graph of fig. 3a is used as the side of the cylinder.
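In three.js terms, the mesh assembly of step 204 might be sketched as follows. The helper name is illustrative, the geometry values are taken from the specific embodiment in this application (radii 12, height 11, 72 segments), and `THREE` is passed in explicitly so the sketch stays independent of how the library is loaded:

```javascript
// Sketch of step 204: combine the pre-created cylinder and the material set
// into a new mesh object. For a CylinderGeometry, three.js expects a material
// array ordered [side wall, top cap, bottom cap], so the head-to-tail
// consecutive scene texture goes in slot 0 and the preset materials follow.
function buildPanoramaMesh(THREE, sceneTexture, topMaterial, bottomMaterial) {
  const geometry = new THREE.CylinderGeometry(12, 12, 11, 72, 72, false);
  const materials = [
    // THREE.BackSide makes the texture visible from inside the cylinder
    new THREE.MeshBasicMaterial({ map: sceneTexture, side: THREE.BackSide }),
    topMaterial,    // top surface (preset material object)
    bottomMaterial, // bottom surface (preset material object)
  ];
  return new THREE.Mesh(geometry, materials);
}
```

A caller would then adjust the mesh's position, rotation and scale attributes before adding it to the stage scene, as described above.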
And step 206, loading the new grid object into the pre-created stage scene, rendering the stage scene, and generating a scene picture.
In an embodiment of the present application, for the new mesh object, the new mesh object is loaded into a pre-created stage scene, and a scene picture is generated after the stage scene is rendered by using a renderer arranged in the stage scene.
In an embodiment of the present application, a light source object is also preset in the stage scene. Specifically, the light source object may be set as an ambient light source (AmbientLight), so that the light acts on all objects without producing shadow effects; this improves the visibility of the panoramic scene while reducing the processing load on the terminal.
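The rendering step with the ambient light source might be sketched as below; the white light color, the sizing call, and the single-frame render are assumptions for illustration, not details fixed by the application:

```javascript
// Sketch of step 206: light the stage scene with an ambient light (no
// shadows, low processing cost) and render it once with anti-aliasing on.
function renderStageScene(THREE, scene, camera, width, height) {
  scene.add(new THREE.AmbientLight(0xffffff)); // lights every object uniformly
  const renderer = new THREE.WebGLRenderer({ antialias: true });
  renderer.setSize(width, height);             // match the browser window
  renderer.render(scene, camera);              // produce the scene picture
  return renderer;
}
```

In a browser the renderer's canvas would then be attached to the page, and the render call repeated whenever the camera angle changes.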
In the scene implementation method provided by the embodiment of the application, a material set is constructed according to head-to-tail consecutive planar scene graphs and preset material objects; loading a cylinder and a material set which are created in advance into a grid object to obtain a new grid object; the head-to-tail consecutive plane scene pictures are the pictures pasted on the inner wall of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively; and loading the new grid object into the pre-created stage scene, and rendering the stage scene to generate a scene picture. According to the scene implementation method provided by the embodiment of the application, the panoramic scene picture can be obtained by acquiring the head-to-tail connected planar scene pictures, taking the planar scene pictures as the inner surface of the cylindrical object and rendering the stage scene containing the cylindrical object.
Referring to fig. 4, a flowchart of another scenario implementation method provided in this embodiment is shown, where the scenario implementation method can be applied to the terminal 102 in the implementation environment described above. On the basis of the embodiment shown in fig. 2, the method may further include the following steps:
step 302, camera initialization parameters are obtained, the camera initialization parameters including a viewing angle, an aspect ratio, a near-end distance and a far-end distance.
Step 304, a camera is created according to the camera initialization parameters.
In one embodiment of the present application, the camera may be one of an orthogonal projection camera and a perspective projection camera, and when the camera type is set as the orthogonal projection camera, the camera is drawn in a uniform size ratio regardless of a distance between the object and the camera; when the camera type is set as a perspective projection camera, the object becomes smaller as the distance between the object and the camera increases.
In one embodiment of the present application, a perspective projection camera is chosen as the camera type, and the camera may be created from the camera initialization parameters with a perspective-projection-camera creation function. Specifically, the camera may be created with the THREE.PerspectiveCamera function, which requires the initialization parameters of the camera to be created: the visible angle fov, the aspect ratio aspect, the near-end distance near, and the far-end distance far. The visible angle is the camera's field angle in the vertical direction; the aspect ratio is the ratio of the camera's horizontal extent to its vertical extent, i.e. width/height; the near-end distance is the closest distance from the camera to the observation area; and the far-end distance is the farthest distance from the camera to the observation area.
In a further embodiment, the aspect ratio is the ratio of the width to the height of the current browser window; specifically, the width of the browser window may be obtained through the window.innerWidth attribute and the height through the window.innerHeight attribute.
Step 306, determining the position of the camera and adjusting the angle of the camera; the angle of the camera is used to adjust the display angle of the stage scene.
In one embodiment of the present application, the position of the camera may be set with the THREE.Vector3 function, and the angle of the camera may be set with the scene.rotation.set function, through which the display angle of the stage scene can be adjusted. Preferably, the camera may be placed at the origin.
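Steps 302 to 306 might be sketched as follows; the numeric fov and clipping distances are assumed values rather than ones stated in this application, and `THREE` is injected so the sketch is self-contained:

```javascript
// Sketch of camera creation (steps 302-306): a perspective projection camera
// built from the four initialization parameters and placed at the origin.
function createPanoramaCamera(THREE, windowWidth, windowHeight) {
  const fov = 45;                            // visible angle (assumed value)
  const aspect = windowWidth / windowHeight; // browser-window aspect ratio
  const near = 0.1;                          // near-end distance (assumed)
  const far = 1000;                          // far-end distance (assumed)
  const camera = new THREE.PerspectiveCamera(fov, aspect, near, far);
  camera.position.set(0, 0, 0);              // place the camera at the origin
  return camera;
}
```

In a browser, windowWidth and windowHeight would come from window.innerWidth and window.innerHeight.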
In the scene implementation method provided by the embodiment of the application, camera initialization parameters are obtained, wherein the camera initialization parameters comprise a visual angle, an aspect ratio, a near-end distance and a far-end distance; establishing a camera according to the camera initialization parameters; determining the position of a camera and adjusting the angle of the camera; the angle of the camera is used to adjust the display angle of the stage scene. According to the scene implementation method provided by the embodiment of the application, the camera can be created according to the preset camera initialization parameters, the initial display angle of the stage scene is controlled by adjusting the initial angle of the camera, and the controllability of the panoramic scene is improved.
Referring to fig. 5, a flowchart of another scenario implementation method provided in this embodiment is shown, where the scenario implementation method can be applied to the terminal 102 in the implementation environment described above. On the basis of the embodiment shown in fig. 2 or fig. 4, the method may further include the following steps:
step 402, obtaining preset construction parameters, wherein the construction parameters comprise a top surface radius, a bottom surface radius, a cylinder height, a number of sections of the top surface and the bottom surface, a number of sections of the side surface, and a top surface and bottom surface boolean value.
At step 404, a cylinder is created according to the build parameters.
In one embodiment of the present application, the cylinder is created according to the preset construction parameters and a cylinder creation function. Specifically, the cylinder may be created with the THREE.CylinderGeometry function, which requires the construction parameters of the cylinder to be created: the top surface radius radiusTop, the bottom surface radius radiusBottom, the cylinder height height, the number of segments of the top and bottom surfaces radialSegments, the number of segments of the side surface heightSegments, and the top/bottom Boolean value openEnded. The top/bottom Boolean value is of Boolean type with a default value of false, indicating that the top and bottom surfaces are present; when the top radius and the bottom radius are set to the same value, a standard cylinder is created. In a specific embodiment, the top and bottom radii are set to 12, the cylinder height to 11, the number of segments of the top and bottom surfaces and of the side surface to 72, and the top/bottom Boolean value to false, creating a capped cylinder with top and bottom radii of 12 and a height of 11.
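The embodiment's construction parameters map onto the THREE.CylinderGeometry argument order as sketched below; the helper function is illustrative, the parameter names follow three.js:

```javascript
// Sketch of steps 402/404: collect the preset construction parameters and
// translate them into the positional arguments THREE.CylinderGeometry takes.
function cylinderArgs({ radiusTop, radiusBottom, height,
                        radialSegments, heightSegments, openEnded }) {
  return [radiusTop, radiusBottom, height,
          radialSegments, heightSegments, openEnded];
}

// The specific embodiment: radii 12, height 11, 72 segments, capped ends.
const embodimentArgs = cylinderArgs({
  radiusTop: 12, radiusBottom: 12, height: 11,
  radialSegments: 72, heightSegments: 72, openEnded: false,
});
// new THREE.CylinderGeometry(...embodimentArgs) would then create the cylinder.
```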
In order to match the created cylinder with the size of the stage scene, on the basis of the above steps, the method further comprises the following steps:
the size of the cylinder is scaled 406 according to the size of the stage scene to match the size of the cylinder with the size of the pre-created stage scene.
In an embodiment of the application, the terminal may use the size of the current browser window as the size of the stage scene, and in order to enable the created cylinder to be displayed normally, the created cylinder needs to be scaled according to the size of the stage scene, and the scaled cylinder may have a size matching the size of the pre-created stage scene.
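One way to perform this scaling is sketched below, assuming a uniform scale factor derived from the stage (browser-window) size relative to the cylinder height; the specific formula is an illustrative assumption, since the application does not fix one:

```javascript
// Sketch of step 406: scale the cylinder mesh so its size matches the
// pre-created stage scene. stageHeight would come from the browser window,
// cylinderHeight from the construction parameters of step 404.
function fitCylinderToStage(mesh, stageHeight, cylinderHeight) {
  const s = stageHeight / cylinderHeight; // uniform scale factor (assumed rule)
  mesh.scale.set(s, s, s);                // scale all three axes equally
  return s;
}
```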
In the scene implementation method provided by the embodiment of the application, preset construction parameters are obtained, and the construction parameters include a top surface radius, a bottom surface radius, a cylinder height, a number of sections of the top surface and the bottom surface, a number of sections of the side surface, and a top surface bottom surface boolean value; creating a cylinder according to the construction parameters; the size of the cylinder is scaled according to the size of the stage scene to match the size of the cylinder to the size of the pre-created stage scene. According to the scene implementation method provided by the embodiment of the application, the cylinder is created through the preset construction parameters and the construction parameters, and the size of the generated cylinder is zoomed according to the size of the stage scene, so that the cylinder which is more matched with a browser window can be obtained, the finally obtained panoramic scene can be presented in the browser more perfectly, and the reality of the panoramic scene is improved.
Referring to fig. 6, a flowchart of another scenario implementation method provided in this embodiment is shown, where the scenario implementation method can be applied to the terminal 102 in the implementation environment described above. On the basis of the embodiment shown in fig. 2 or fig. 4, the step 202 may specifically include the following steps:
step 502, a texture picture object is created according to the head-to-tail consecutive planar scene graph.
Step 504, constructing a material set according to the texture picture object and a preset material object; the preset material object comprises two transparent plane maps.
In one embodiment of the present application, the texture picture object and the preset material objects may be stored by creating a texture array, so as to form the material set. The texture array comprises the texture picture object generated from the head-to-tail coherent planar scene graph and two transparent plane-map material objects. Specifically, the texture picture object may be stored into the texture array through a THREE material function, and two transparent plane maps whose transparent attribute is set to true may also be saved into the texture array as the preset material objects through the same function.
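The texture array described above might be assembled as in the following sketch. The THREE constructor calls are shown in comments because the exact functions are not named in full here (the loader and material names are assumptions); plain objects stand in for the real material objects so that the ordering is visible.

```javascript
// With three.js loaded, the entries would be real material objects, e.g.:
//   const texture = new THREE.TextureLoader().load(pictureUrl);        // assumption
//   const sideMaterial = new THREE.MeshBasicMaterial({ map: texture });
//   const capMaterial  = new THREE.MeshBasicMaterial({ transparent: true });

// Stand-in objects: one textured side plus two transparent end caps.
const materialSet = [
  { role: 'side', map: 'texturePictureObject' }, // inner-wall texture picture object
  { role: 'top', transparent: true },            // transparent plane map (top surface)
  { role: 'bottom', transparent: true },         // transparent plane map (bottom surface)
];
console.log(materialSet.length); // 3
```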
In another embodiment of the present application, the color, the wireframe line width, the wireframe line-segment end points, the wireframe line-segment joints, the shading, the vertex colors, and the fog attributes of the texture picture object may also be set through the same material function.
In the scene implementation method provided by this embodiment of the application, the texture picture object is created according to the head-to-tail coherent planar scene graph, and a material set is constructed according to the texture picture object and a preset material object, the preset material object comprising two transparent plane maps. In this way, the material set required for constructing the panoramic scene can be quickly generated.
Referring to fig. 7, a flowchart of another scenario implementation method provided in this embodiment is shown, where the scenario implementation method can be applied to the terminal 102 in the implementation environment described above. On the basis of the embodiment shown in fig. 6, the step 502 may specifically include the following steps:
step 602, acquiring the picture addresses of the head-to-tail consecutive planar scene graphs.
In an embodiment of the present application, the terminal may request and receive, from the server, a picture address of the head-to-tail consecutive planar scene graph, where the picture address may be a network address, a local path, or the like. The planar scene graph can be obtained through the picture address.
Step 604, loading head-to-tail consecutive planar scene graphs according to picture addresses to form texture picture objects; the format of the texture picture object is a multi-level progressive texture format.
In one embodiment of the present application, the planar scene graph corresponding to the picture address may be loaded through a THREE texture loading function, so as to form the texture picture object.
In one embodiment of the present application, when the texture picture object is generated from the planar scene graph, the multi-level gradually-distant (mipmap) texture attribute of the texture picture object is activated. Specifically, the generateMipmaps attribute of the texture picture object may be set to true, so as to set the texture picture object to the multi-level gradually-distant texture format. In this embodiment, after generateMipmaps is set to true, a set of reduced texture maps is generated in advance, and when the camera is farther from the texture, a smaller texture map is automatically used. This usually occupies about 33% more storage space, a typical trade of space for time.
In a further embodiment, a texture minification method minFilter and a texture magnification method magFilter also need to be set; specifically, the minFilter attribute may be set to a linear filter (THREE.LinearFilter). Since the texture changes frequently in this embodiment, a simple and efficient filter, namely a linear filter, is needed.
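The texture settings described in the two paragraphs above can be sketched as a plain configuration object, with the corresponding three.js assignments in comments; the stand-in constant for THREE.LinearFilter is an assumption (in three.js it is a numeric constant).

```javascript
// Assumed stand-in for THREE.LinearFilter.
const LinearFilter = 'LinearFilter';

const textureSettings = {
  generateMipmaps: true,   // pre-generate the reduced (mipmap) texture chain
  minFilter: LinearFilter, // texture minification: simple, efficient linear filter
  magFilter: LinearFilter, // texture magnification: linear filter as well
};

// With three.js loaded, the same settings would be applied as:
//   texture.generateMipmaps = true;
//   texture.minFilter = THREE.LinearFilter;
//   texture.magFilter = THREE.LinearFilter;
console.log(textureSettings.generateMipmaps); // true
```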
In the scene implementation method provided by the embodiment of the application, the picture addresses of head-to-tail consecutive planar scene graphs are obtained; loading head-to-tail consecutive planar scene graphs according to the picture addresses to form texture picture objects; the format of the texture picture object is a multi-level progressive texture format. According to the scene implementation method provided by the embodiment of the application, the format of the texture picture object is set to be the multi-level gradually-distant texture format, a group of reduced texture maps can be generated in advance, and when the distance between a camera and the texture changes, the corresponding texture map is directly called, so that the response time of the panoramic scene is shortened, and the watching experience of a user is improved.
Referring to fig. 8, a flowchart of another scenario implementation method provided in this embodiment is shown, where the scenario implementation method can be applied to the terminal 102 in the implementation environment described above. On the basis of the embodiment shown in fig. 2 or fig. 4, the step 206 may specifically include the following steps:
step 702, obtaining renderer initialization parameters, wherein the renderer initialization parameters comprise an anti-aliasing parameter, a coloring precision parameter and a texture precision parameter.
Step 704, rendering the stage scene according to the renderer initialization parameters and a rendering function to generate a panoramic scene picture.
In an embodiment of the present application, the stage scene may be rendered through a THREE renderer function, with the renderer initialization parameters passed in at initialization. Specifically, the renderer initialization parameters include the anti-aliasing parameter antialias, indicating whether anti-aliasing is enabled; the shading precision parameter precision, representing the shading precision selection; and the texture precision parameter mipmap, representing the texture precision selection. In a specific embodiment, the anti-aliasing parameter antialias may be set to false, indicating that anti-aliasing is not turned on; the shading precision parameter precision and the texture precision parameter mipmap are set to highp, which means that the shading precision and the texture precision are both high.
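The renderer initialization of this embodiment can be sketched as follows. The THREE.WebGLRenderer call is shown in a comment because it needs the three.js library; grouping all three listed parameters into one object follows the embodiment's description rather than any particular library API.

```javascript
// Renderer initialization parameters from the specific embodiment.
const rendererParams = {
  antialias: false,   // anti-aliasing not turned on
  precision: 'highp', // high shading precision
  mipmap: 'highp',    // high texture precision (as described in this embodiment)
};

// With three.js loaded:
//   const renderer = new THREE.WebGLRenderer(rendererParams);
//   renderer.setSize(window.innerWidth, window.innerHeight);
//   renderer.render(stageScene, camera); // produces the panoramic scene picture
console.log(rendererParams.precision); // highp
```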
In the scene implementation method provided by the embodiment of the application, the initialization parameters of the renderer are obtained, wherein the initialization parameters of the renderer comprise an anti-aliasing parameter, a coloring precision parameter and a texture precision parameter; and rendering the stage scene according to the renderer initialization parameter and the rendering function to generate a panoramic scene picture. According to the scene implementation method provided by the embodiment of the application, the obtained stage scene can be quickly rendered according to the initialization parameters of the renderer, and the panoramic scene is obtained.
It should be understood that, although the steps in the above-described flowcharts are shown sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, the execution order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least a portion of the steps in the above-described flowcharts may include multiple sub-steps or multiple stages, which are not necessarily completed at the same moment but may be performed at different moments, and their execution order is not necessarily sequential; they may be performed in turn or in alternation with other steps or with at least a portion of the sub-steps or stages of other steps.
Referring to fig. 9, a block diagram of a scene implementation apparatus 900 according to an embodiment of the present application is shown. As shown in fig. 9, the scenario implementation apparatus 900 may include: a material set construction module 901, a mesh object generation module 902, and a rendering module 903, wherein:
the material set building module 901 is configured to build a material set according to a head-to-tail consecutive planar scene graph and a preset material object;
the grid object generation module 902 is configured to load a pre-created cylinder and the material set into a grid object to obtain a new grid object; the texture picture object is an inner wall map of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively;
the rendering module 903 is configured to load a new mesh object into a pre-created stage scene, and generate a scene picture after rendering the stage scene.
In an embodiment of the present application, the material set building module 901 is specifically configured to:
creating texture picture objects according to head-to-tail coherent plane scene graphs;
constructing a material set according to the texture picture object and a preset material object; the preset material object comprises two transparent plane maps.
In an embodiment of the present application, the material set building module 901 is further specifically configured to:
acquiring picture addresses of head-to-tail consecutive planar scene graphs;
loading head-to-tail consecutive planar scene graphs according to the picture addresses to form texture picture objects; the format of the texture picture object is a multi-level progressive texture format.
In an embodiment of the present application, the rendering module 903 is specifically configured to:
acquiring renderer initialization parameters which comprise an anti-aliasing parameter, a coloring precision parameter and a texture precision parameter;
and rendering the stage scene according to the renderer initialization parameter and the rendering function to generate a panoramic scene picture.
Referring to fig. 10, a block diagram of a scene implementation apparatus 1000 according to an embodiment of the present application is shown. As shown in fig. 10, in addition to the modules included in the scene implementation apparatus 900, the scene implementation apparatus 1000 may optionally include: a camera building module 904 and a cylinder building module 905. Wherein:
the camera building module 904 is specifically configured to:
acquiring camera initialization parameters, wherein the camera initialization parameters comprise a visual angle, an aspect ratio, a near-end distance and a far-end distance;
establishing a camera according to the camera initialization parameters;
determining the position of a camera and adjusting the angle of the camera; the angle of the camera is used to adjust the display angle of the stage scene.
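The camera parameters handled by this module could be sketched as follows; the numeric values are illustrative assumptions, and the THREE.PerspectiveCamera call is shown in a comment because it requires the three.js library.

```javascript
// Illustrative camera initialization parameters (values are assumptions).
const fov = 75;            // visual angle (field of view), in degrees
const aspect = 1280 / 720; // aspect ratio: stage width / stage height
const near = 0.1;          // near-end distance of the view frustum
const far = 1000;          // far-end distance of the view frustum

// With three.js loaded:
//   const camera = new THREE.PerspectiveCamera(fov, aspect, near, far);
//   camera.position.set(0, 0, 0);              // determine the camera position
//   camera.lookAt(new THREE.Vector3(1, 0, 0)); // adjust the camera angle,
//                                              // i.e. the stage display angle
console.log(near < far); // true
```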
The cylinder building module 905 is specifically configured to:
acquiring preset construction parameters, wherein the construction parameters comprise a top surface radius, a bottom surface radius, a cylinder height, the number of sections of the top surface and the bottom surface, the number of sections of the side surface and a Boolean value of the top surface and the bottom surface;
a cylinder is created according to the build parameters.
In an embodiment of the present application, the cylinder building module 905 is further specifically configured to:
the size of the cylinder is scaled according to the size of the stage scene to match the size of the cylinder to the size of the pre-created stage scene.
For specific limitations of the scene implementation apparatus, reference may be made to the above limitations of the scene implementation method, which is not described herein again. The various modules in the scene realization device described above may be realized in whole or in part by software, hardware, and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 11. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a scenario implementation method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 11 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having a computer program stored therein, the processor implementing the following steps when executing the computer program:
constructing a material set according to the head-to-tail coherent plane scene graph and a preset material object;
loading a cylinder and a material set which are created in advance into a grid object to obtain a new grid object; the head-to-tail consecutive plane scene pictures are the pictures pasted on the inner wall of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively;
and loading the new grid object into the pre-created stage scene, and rendering the stage scene to generate a scene picture.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
constructing a material set according to the head-to-tail coherent plane scene graph and a preset material object;
loading a cylinder and a material set which are created in advance into a grid object to obtain a new grid object; the head-to-tail consecutive plane scene pictures are the pictures pasted on the inner wall of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively;
and loading the new grid object into the pre-created stage scene, and rendering the stage scene to generate a scene picture.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. A method for implementing a scene, the method comprising:
constructing a material set according to the head-to-tail coherent plane scene graph and a preset material object;
loading a cylinder and the material set which are created in advance into a grid object to obtain a new grid object; the head-to-tail consecutive plane scene graphs are the inner wall maps of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively;
and loading the new grid object into a pre-created stage scene, and rendering the stage scene to generate a scene picture.
2. The method of claim 1, further comprising:
acquiring camera initialization parameters, wherein the camera initialization parameters comprise a visual angle, an aspect ratio, a near-end distance and a far-end distance;
creating a camera according to the camera initialization parameters;
determining a position of the camera and adjusting an angle of the camera; the angle of the camera is used for adjusting the display angle of the stage scene.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
acquiring preset construction parameters, wherein the construction parameters comprise a top surface radius, a bottom surface radius, a cylinder height, the number of sections of the top surface and the bottom surface, the number of sections of the side surface and a Boolean value of the top surface and the bottom surface;
creating the cylinder according to the build parameters.
4. The method of claim 3, wherein after creating the cylinder according to the build parameters, further comprising:
scaling the size of the cylinder according to the size of the stage scene to match the size of the cylinder with the size of the pre-created stage scene.
5. The method according to claim 1 or 2, wherein the building of the material set from the head-to-tail connected planar scene graph and the preset material objects comprises:
creating texture picture objects according to head-to-tail coherent plane scene graphs;
constructing a material set according to the texture picture object and the preset material object; the preset material object comprises two transparent plane maps.
6. The method of claim 5, wherein creating texture picture objects from head-to-tail coherent planar scene graphs further comprises:
acquiring picture addresses of the head-to-tail consecutive planar scene graphs;
loading the head-to-tail consecutive planar scene graphs according to the picture addresses to form the texture picture object; the format of the texture picture object is a multi-level progressive texture format.
7. The method according to claim 1 or 2, wherein the generating a scene picture after rendering the stage scene comprises:
acquiring renderer initialization parameters which comprise an anti-aliasing parameter, a coloring precision parameter and a texture precision parameter;
and rendering the stage scene according to the renderer initialization parameter and the rendering function to generate a panoramic scene picture.
8. A scene realization apparatus, characterized in that the apparatus comprises:
the material set building module is used for building a material set according to the head-to-tail consecutive plane scene graphs and preset material objects;
the grid object generation module loads a cylinder and the material set which are created in advance into a grid object to obtain a new grid object; the texture picture object is an inner wall map of the cylinder, and the preset material objects are the top surface and the bottom surface of the cylinder respectively;
and the rendering module is used for loading the new grid object into a pre-created stage scene, and generating a scene picture after rendering the stage scene.
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN201910970163.5A 2019-10-12 2019-10-12 Scene realization method, device, computer equipment and storage medium Active CN110853143B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910970163.5A CN110853143B (en) 2019-10-12 2019-10-12 Scene realization method, device, computer equipment and storage medium


Publications (2)

Publication Number Publication Date
CN110853143A true CN110853143A (en) 2020-02-28
CN110853143B CN110853143B (en) 2023-05-16

Family

ID=69596285

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910970163.5A Active CN110853143B (en) 2019-10-12 2019-10-12 Scene realization method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110853143B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112464126A (en) * 2020-12-14 2021-03-09 厦门市美亚柏科信息股份有限公司 Method for generating panoramic chart based on Threejs, terminal equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102750724A (en) * 2012-04-13 2012-10-24 广州市赛百威电脑有限公司 Three-dimensional and panoramic system automatic-generation method based on images
CN107103638A (en) * 2017-05-27 2017-08-29 杭州万维镜像科技有限公司 A kind of Fast rendering method of virtual scene and model
CN108681987A (en) * 2018-05-10 2018-10-19 广州腾讯科技有限公司 The method and apparatus for generating panorama slice map
WO2019058266A1 (en) * 2017-09-21 2019-03-28 Varghese Thombra Sobin A system and method for conversion of a floor plan to a 3d scene for creation & rendering of virtual reality architectural scenes, walk through videos and images
CN110262763A (en) * 2018-03-21 2019-09-20 腾讯科技(深圳)有限公司 Display methods and device and storage medium and electronic equipment based on augmented reality


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112464126A (en) * 2020-12-14 2021-03-09 厦门市美亚柏科信息股份有限公司 Method for generating panoramic chart based on Threejs, terminal equipment and storage medium
CN112464126B (en) * 2020-12-14 2022-07-15 厦门市美亚柏科信息股份有限公司 Method for generating panoramic chart based on Threejs, terminal equipment and storage medium

Also Published As

Publication number Publication date
CN110853143B (en) 2023-05-16


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230524

Address after: Room 101, No. 227 Gaotang Road, Tianhe District, Guangzhou City, Guangdong Province, 510665 (Location: Room 601)

Patentee after: Yamei Zhilian Data Technology Co.,Ltd.

Address before: Room 201, No.1 Hanjing Road, Tianhe District, Guangzhou City, Guangdong Province

Patentee before: GUANGZHOU YAME INFORMATION TECHNOLOGY Co.,Ltd.

TR01 Transfer of patent right