CN117808951A - Three-dimensional scene updating method and device, storage medium and electronic equipment

Info

Publication number: CN117808951A
Application number: CN202311841349.3A
Authority: CN (China)
Legal status: Pending
Applicant/Assignee: Luoyang Zhongzhi Software Technology Co., Ltd.
Inventors: 丁伟, 刘从丰, 陈硕, 田广辉
Priority/filing date: 2023-12-28
Publication date: 2024-04-02
Other languages: Chinese (zh)
Prior art keywords: updated, bounding box, three-dimensional scene, rendered, vertex
Landscapes: Processing Or Creating Images (AREA)

Abstract

A three-dimensional scene updating method and device, a storage medium, and an electronic device relate to the field of computer vision and can improve the processing efficiency of a three-dimensional scene. The three-dimensional scene updating method comprises the following steps: acquiring a region to be updated in a three-dimensional scene, wherein the region to be updated comprises a plurality of vertices; obtaining a bounding box of the region to be updated according to the vertices included in the region to be updated; acquiring, from among the bounding boxes, the bounding boxes to be updated that are within the current view volume, and generating a texture map according to the bounding boxes to be updated; and sampling the texture map based on the vertices to be rendered in the three-dimensional scene, and determining whether to update the vertices to be rendered according to the sampled values.

Description

Three-dimensional scene updating method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer vision, and in particular, to a method and apparatus for updating a three-dimensional scene, a storage medium, and an electronic device.
Background
With the progress and development of technology, smart cities play an increasingly important role in the field of urban planning. In urban planning, the planning effect can be simulated through a virtual three-dimensional scene.
Simulating effects in a three-dimensional scene often involves processing a specific region of the scene. For example, when a renovation plan for an old residential district is developed for a city, a city model is loaded into a three-dimensional scene, the old district to be renovated is cut out, and the model data corresponding to the renovation scheme is loaded at the position of the old district to form the renovated city model, so that the overall renovation effect can be seen intuitively. In this process, it is first necessary to determine whether a model in the three-dimensional scene lies within the range to be cut, and then to cut it.
When processing a region of a three-dimensional scene, it is necessary to determine whether a point falls within the region to be processed. Three methods are commonly used for this range judgment at present. The first traverses all vertices of the model and intersects each of them, one by one, with the points of the range to judge whether the vertex is within the range. The second passes the coordinates of the points of the range to the GPU and tests in a GPU shader whether each vertex falls within the range. The third passes the coordinates of the points of the range to the GPU through a float texture, parses the point coordinates in the GPU, and then intersects them with the vertices of the model. However, in the first method each vertex of the model must be intersected with the points of the range one by one, which takes a great deal of time and is inefficient; in the second method the number of points the GPU can receive is limited, so complex regions cannot be processed; and in the third method, although the number of points passed to the GPU at one time is not limited, parsing the passed coordinates still takes considerable time, so the efficiency is still not high.
Disclosure of Invention
The present application provides a three-dimensional scene updating method and device, a storage medium, and electronic equipment, which can improve the processing efficiency of a three-dimensional scene.
In a first aspect, the present application provides a three-dimensional scene updating method, including: acquiring a region to be updated in a three-dimensional scene, wherein the region to be updated comprises a plurality of vertexes; obtaining a bounding box of the area to be updated according to the vertexes included in the area to be updated; acquiring a bounding box to be updated in the current view in the bounding box, and generating a texture map according to the bounding box to be updated; and sampling the texture map based on the vertexes to be rendered in the three-dimensional scene, and determining whether to update the vertexes to be rendered according to the sampling value.
According to this three-dimensional scene updating method, for a region to be updated in the three-dimensional scene, the bounding box of the region is obtained, a texture map is generated according to the bounding boxes to be updated that lie within the current view volume, and whether a vertex of the three-dimensional scene is within the region to be updated is determined by sampling the texture map, with no need to compute each vertex one by one. The efficiency of range determination can therefore be improved, and with it the efficiency of updating the three-dimensional scene. In addition, the texture map can accommodate multiple regions and is not limited by the number of vertices in a region, which improves compatibility and usability. Moreover, the region to be updated can be changed in real time in the three-dimensional scene and the corresponding texture map updated in real time, so that the three-dimensional scene is updated in real time with high flexibility.
In an exemplary embodiment, the generating a texture map according to the bounding box to be updated includes: and generating a canvas consistent with the aspect ratio of the bounding box to be updated, determining the coordinates of the vertexes in the area to be updated in the canvas according to the relative coordinates of the vertexes in the area to be updated in the bounding box to be updated, and modifying the color at the coordinates into a set value to obtain the texture map.
In an exemplary embodiment, the generating a canvas consistent with the aspect ratio of the bounding box to be updated includes: when the bounding boxes to be updated comprise a plurality of bounding boxes, determining a total bounding box of the plurality of bounding boxes to be updated; and generating a canvas consistent with the aspect ratio of the total bounding box.
In an exemplary embodiment, the sampling the texture map based on vertices to be rendered in the three-dimensional scene includes: obtaining the maximum point and the minimum point in the bounding box to be updated; comparing the vertex to be rendered with the maximum point and the minimum point, and determining whether the vertex to be rendered is in the bounding box to be updated; and if the vertex to be rendered is in the bounding box to be updated, sampling the texture map to obtain a sampling value corresponding to the vertex to be rendered.
In an exemplary embodiment, the sampling the texture map to obtain a sampling value corresponding to the vertex to be rendered includes: and sampling the texture map according to the relative coordinates of the vertex to be rendered in the bounding box to be updated, so as to obtain a sampling value at the relative coordinates.
In an exemplary embodiment, the determining whether to update the vertex to be rendered according to the sampling value includes: if the sampling value is the same as the set value, determining to update the vertex to be rendered; and if the sampling value is different from the set value, not updating the vertex to be rendered.
In an exemplary embodiment, the acquiring the bounding box to be updated in the current view of the bounding box includes: determining whether a distance between a center point of the bounding box and a camera position is within a preset threshold; when the distance is within the preset threshold value, determining whether the bounding box is within the current view; and if the bounding box is in the current view, determining the bounding box as a bounding box to be updated.
In a second aspect, the present application provides a three-dimensional scene updating apparatus, including: the updating area acquisition module is used for acquiring an area to be updated in the three-dimensional scene, wherein the area to be updated comprises a plurality of vertexes; the bounding box determining module is used for obtaining a bounding box of the area to be updated according to the vertexes included in the area to be updated; the texture map generation module is used for acquiring a bounding box to be updated in the current view in the bounding box and generating a texture map according to the bounding box to be updated; and the scene updating module is used for sampling the texture map based on the vertexes to be rendered in the three-dimensional scene and determining whether to update the vertexes to be rendered according to the sampling value.
In an exemplary embodiment, the texture map generation module specifically includes: and the canvas drawing module is used for generating a canvas consistent with the aspect ratio of the bounding box to be updated, determining the coordinates of the vertexes in the area to be updated in the canvas according to the relative coordinates of the vertexes in the area to be updated in the bounding box to be updated, and modifying the color at the coordinates into a set value to obtain the texture map.
In an exemplary embodiment, the canvas drawing module is specifically configured to: when the bounding boxes to be updated comprise a plurality of bounding boxes, determine a total bounding box of the plurality of bounding boxes to be updated; and generate a canvas consistent with the aspect ratio of the total bounding box.
In an exemplary embodiment, the scene update module specifically includes: the vertex data acquisition module is used for acquiring the maximum point and the minimum point in the bounding box to be updated; the range judging module is used for comparing the vertex to be rendered with the maximum point and the minimum point and determining whether the vertex to be rendered is in the bounding box to be updated or not; and the sampling module is used for sampling the texture map if the vertex to be rendered is in the bounding box to be updated, so as to obtain a sampling value corresponding to the vertex to be rendered.
In an exemplary embodiment, the sampling module is specifically configured to: and sampling the texture map according to the relative coordinates of the vertex to be rendered in the bounding box to be updated, so as to obtain a sampling value at the relative coordinates.
In an exemplary embodiment, the scene update module specifically includes: the first judging module is used for determining to update the vertex to be rendered if the sampling value is the same as the set value; and the second judging module is used for not updating the vertex to be rendered if the sampling value is different from the set value.
In an exemplary embodiment, the bounding box determining module specifically includes: the distance determining module is used for determining whether the distance between the center point of the bounding box and the camera position is within a preset threshold value; the range determining module is used for determining whether the bounding box is in the current view when the distance is in the preset threshold value; and the bounding box screening module is used for determining the bounding box as a bounding box to be updated if the bounding box is in the current view.
In a third aspect, the present application provides an electronic device comprising a memory, one or more processors. Wherein the memory has stored therein one or more computer programs, the computer programs comprising instructions which, when executed by the processor, cause the electronic device to perform the three-dimensional scene updating method as in the first aspect.
In a fourth aspect, the present application provides a computer readable medium having instructions stored therein which, when executed on an electronic device, cause the electronic device to perform the three-dimensional scene updating method as in the first aspect.
In a fifth aspect, the present application provides a computer program product for, when run on an electronic device, causing the electronic device to perform the three-dimensional scene updating method according to the first aspect.
It may be appreciated that, for the benefits achieved by the three-dimensional scene updating apparatus, the electronic device, the computer readable medium and the computer program product provided above, reference may be made to the benefits described in the first aspect, which are not repeated here.
Drawings
Fig. 1 is a flow chart of a three-dimensional scene updating method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a three-dimensional scene updating device according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to clearly describe the technical solutions of the embodiments of the present application, words such as "first" and "second" are used in the embodiments of the present application to distinguish between identical or similar items having substantially the same function and effect. For example, a first chip and a second chip are merely distinguished as different chips, and no order between them is implied. It will be appreciated by those skilled in the art that the words "first", "second", and the like do not limit the number or order of execution, and do not necessarily indicate that the items referred to are different. It should also be noted that, in the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs; rather, such words are intended to present the related concepts in a concrete fashion. In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more.
The term "at … …" in the embodiment of the present application may be instantaneous when a certain situation occurs, or may be a period of time after a certain situation occurs, which is not particularly limited in the embodiment of the present application.
The implementation of the present embodiment will be described in detail below with reference to the accompanying drawings.
The present embodiment provides a three-dimensional scene updating method, which may be applied, by way of example, to various electronic devices such as a computer (PC), a tablet computer, a virtual reality/augmented reality device, a wearable device, an industrial computer, an in-vehicle terminal, and the like; it may also be applied to a server, a cloud, a server cluster, and the like, which is not specifically limited in this embodiment.
The three-dimensional scene updating method of the embodiment can be suitable for various updating scenes such as model clipping, model addition, model modification and the like in the three-dimensional scene. For example, taking model clipping as an example, by using the three-dimensional scene updating method of the present embodiment, it can be quickly determined whether the vertices in the three-dimensional scene need clipping, and then clipping the positions that need clipping.
Fig. 1 shows a flow chart of a three-dimensional scene updating method according to an embodiment of the present application. As shown in fig. 1, the three-dimensional scene update method may include the steps of:
step 101: and acquiring an area to be updated in the three-dimensional scene, wherein the area to be updated comprises a plurality of vertexes.
Step 102: and obtaining the bounding box of the area to be updated according to the vertexes included in the area to be updated.
Step 103: and acquiring a bounding box to be updated in the current view in the bounding box, and generating a texture map according to the bounding box to be updated.
Step 104: and sampling the texture map based on the vertexes to be rendered in the three-dimensional scene, and determining whether to update the vertexes to be rendered according to the sampling value.
The region to be updated refers to a region of the three-dimensional scene that needs to be cut, added to, or modified.
A vertex refers to a point in a three-dimensional scene, which can be represented by three coordinate values of x, y, and z.
A view volume refers to the region of a three-dimensional scene that is visible on the screen, i.e., the field of view of the virtual camera; it is also called the viewing frustum or view cone.
According to the vertices included in the region to be updated in the three-dimensional scene, the bounding box of the region to be updated can be calculated, and a texture map is generated according to the bounding boxes to be updated that lie within the current view volume. Whether a vertex of the three-dimensional scene is within the region to be updated is determined by sampling the texture map, with no need to compute each vertex one by one, so the efficiency of range determination can be improved, and with it the efficiency of updating the three-dimensional scene. In addition, the texture map can accommodate multiple regions and is not limited by the number of vertices in a region, which improves compatibility and usability. Moreover, the region to be updated can be changed in real time in the three-dimensional scene and the corresponding texture map updated in real time, so that the three-dimensional scene is updated in real time with high flexibility.
Each step in the above three-dimensional scene updating method is described in detail below.
In step 101, a region to be updated in a three-dimensional scene is acquired, the region to be updated including a plurality of vertices.
The three-dimensional scene may include various virtual scenes, such as a smart city scene or a game scene. The region to be updated is data set by the user according to the updating requirements of the three-dimensional scene and can be uploaded to the electronic device. A region to be updated may include any number of vertices, which is not specifically limited in this embodiment, and the three-dimensional scene may include a plurality of regions to be updated. Assuming that the three-dimensional scene comprises a three-dimensional model of a city and the regions corresponding to 10 old residential districts in the model need to be cut out, the vertex data of the 10 regions can be uploaded to the electronic device, and the electronic device acquires the vertex data of the 10 regions.
In step 102, a bounding box of the area to be updated is obtained according to the vertices included in the area to be updated.
Each region to be updated can include a plurality of vertices, and the optimal enclosing space of these vertices, i.e., the bounding box, can be calculated from the vertices of the region to be updated. When 10 regions to be updated are obtained, 10 bounding boxes are obtained by calculation.
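As an illustration, a minimal TypeScript sketch of this step might look as follows; the Vertex and BoundingBox types and the computeBoundingBox name are assumptions introduced for the example rather than terms defined by the patent.

```typescript
// Sketch: axis-aligned bounding box of a region's vertices.
interface Vertex { x: number; y: number; z: number; }
interface BoundingBox { min: Vertex; max: Vertex; } // minimum and maximum points

function computeBoundingBox(vertices: Vertex[]): BoundingBox {
  const min = { x: Infinity, y: Infinity, z: Infinity };
  const max = { x: -Infinity, y: -Infinity, z: -Infinity };
  for (const v of vertices) {
    min.x = Math.min(min.x, v.x); min.y = Math.min(min.y, v.y); min.z = Math.min(min.z, v.z);
    max.x = Math.max(max.x, v.x); max.y = Math.max(max.y, v.y); max.z = Math.max(max.z, v.z);
  }
  return { min, max };
}

// One bounding box per region to be updated, e.g. for the 10 regions to be cut out:
// const boxes = regions.map(region => computeBoundingBox(region.vertices));
```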
In step 103, a bounding box to be updated in the current view is obtained, and a texture map is generated according to the bounding box to be updated.
According to the current camera position, the region covered by the current view volume can be obtained. The camera position in the virtual three-dimensional scene can be updated according to the user's operations; when the camera position is updated, the current view volume is also updated, and the scene picture seen by the user changes.
And determining whether the bounding box of the area to be updated is in the current view according to the current view, and obtaining the bounding box to be updated in the current view. Illustratively, determining whether a distance between a center point of the bounding box and the camera position is within a preset threshold; when the distance is within the preset threshold value, determining whether the bounding box is within the current view; and if the bounding box is in the current view, determining the bounding box as a bounding box to be updated.
The preset threshold value may be set according to actual requirements, for example, 5km, 6km, etc. It will be appreciated that the specific details of the model are difficult for the human eye to discern when the camera is located further away, so that the decision process can be skipped for areas to be updated that exceed a preset threshold.
In this embodiment, if the distance between the center point of the bounding box and the camera position is within the preset threshold, the bounding box is intersected with the current view volume to determine whether the bounding box is in the current view volume. Judging the distance between the center point of the bounding box and the camera position first screens out the bounding boxes whose distance exceeds the preset threshold, which reduces the amount of intersection calculation and improves the calculation efficiency.
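For illustration only, the two-stage screening described above (a cheap distance test followed by a view-volume intersection test) could be sketched in TypeScript as follows; the intersectsViewVolume helper stands in for whatever frustum-intersection routine the rendering engine provides, and the threshold value and other names are assumptions of the example.

```typescript
// Sketch: select the bounding boxes to be updated by first checking the distance
// from the box center to the camera, then intersecting with the current view volume.
interface Vertex { x: number; y: number; z: number; }
interface BoundingBox { min: Vertex; max: Vertex; }

const PRESET_THRESHOLD = 5000; // e.g. 5 km, expressed in the scene's world units (assumed)

function centerOf(box: BoundingBox): Vertex {
  return {
    x: (box.min.x + box.max.x) / 2,
    y: (box.min.y + box.max.y) / 2,
    z: (box.min.z + box.max.z) / 2,
  };
}

function distance(a: Vertex, b: Vertex): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// Stand-in for the engine's own frustum/view-volume intersection test.
declare function intersectsViewVolume(box: BoundingBox, viewVolume: unknown): boolean;

function selectBoxesToUpdate(
  boxes: BoundingBox[],
  cameraPosition: Vertex,
  viewVolume: unknown,
): BoundingBox[] {
  // Boxes farther than the preset threshold are skipped without any intersection test.
  return boxes.filter(box =>
    distance(centerOf(box), cameraPosition) <= PRESET_THRESHOLD &&
    intersectsViewVolume(box, viewVolume));
}
```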
In this embodiment, the current view volume of the three-dimensional scene may be updated in real time according to the user's operations. When the current view volume is updated, the bounding boxes to be updated that fall within the current view volume are determined from the bounding boxes of the multiple regions to be updated. For example, the bounding boxes to be updated may be re-determined once per frame, and the texture map updated in real time according to the updated bounding boxes to be updated, thereby updating the region to be updated in the three-dimensional scene.
A texture map may then be generated from the bounding box to be updated. Specifically, a canvas consistent with the aspect ratio of the bounding box to be updated is generated, the coordinates of the vertexes in the area to be updated in the canvas are determined according to the relative coordinates of the vertexes in the area to be updated in the bounding box to be updated, and the color at the coordinates is changed into a set value to obtain the texture map.
After the bounding box to be updated is obtained, its length-to-width ratio, i.e., its aspect ratio, can be obtained. From this aspect ratio, a canvas whose aspect ratio is consistent with that of the bounding box to be updated can be generated. For example, the maximum side length of the canvas may be preset to N, where N may be set according to the actual situation, e.g., 1024 or 2048, and the canvas is then generated from the maximum side length and the aspect ratio. An initial color may be set when the canvas is generated, for example black or red, which is not specifically limited in this embodiment.
The relative coordinates of the vertices of the region to be updated in the canvas are then determined. The relative coordinates refer to the coordinates of a vertex of the region to be updated relative to the minimum point of the canvas, and can be determined from the coordinates of the vertex relative to the minimum point of the bounding box to be updated. For example, let the minimum point of the bounding box to be updated be P1(Xmin, Ymin), let the maximum point be P2(Xmax, Ymax), and let a vertex of the region to be updated be (x1, y1). The relative x coordinate of the vertex in the bounding box to be updated is x_rel = (x1 - Xmin)/(Xmax - Xmin), and the x value of the vertex in the canvas can then be calculated as x_canvas = x_rel × W, where W is the length of the canvas. Similarly, y_rel = (y1 - Ymin)/(Ymax - Ymin) and y_canvas = y_rel × H, where H is the width of the canvas, giving the coordinates of the vertex in the canvas.
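Written as code, the mapping above might look like the following TypeScript sketch; the Box2D type and the toCanvasCoords name are chosen for the example and are not terms from the patent.

```typescript
// Sketch: map a vertex of the region to be updated to pixel coordinates in the canvas.
interface Box2D { minX: number; minY: number; maxX: number; maxY: number; }

function toCanvasCoords(
  x1: number, y1: number,  // vertex of the region to be updated
  box: Box2D,              // bounding box to be updated (or the total bounding box)
  W: number, H: number,    // length and width of the canvas
): { cx: number; cy: number } {
  const xRel = (x1 - box.minX) / (box.maxX - box.minX); // relative coordinate in [0, 1]
  const yRel = (y1 - box.minY) / (box.maxY - box.minY);
  return { cx: xRel * W, cy: yRel * H };
}
```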
After the coordinates of the vertex in the canvas are calculated, the color at those coordinates in the canvas is modified to a set value. The set value may be any color different from the initial color of the canvas, such as red or blue, which is not specifically limited in this embodiment.
For example, taking the web side as an example, a canvas with a maximum side length of N can be created according to the aspect ratio of the bounding box to be updated. After the canvas is created, the initial color is black. According to the coordinates of each vertex in each bounding box to be updated in the canvas, the color at the coordinates can be filled in red, and the filled canvas is the texture map.
In this embodiment, there may be a plurality of bounding boxes to be updated. When the bounding boxes to be updated comprise a plurality of bounding boxes, a total bounding box of the plurality of bounding boxes to be updated is determined, and a canvas consistent with the aspect ratio of the total bounding box is generated. Accordingly, when calculating the coordinates of the vertices of the region to be updated in the canvas, the coordinates are calculated from the relative coordinates of those vertices within the total bounding box. It will be appreciated that the vertices of the region to be updated occupy the same relative positions within the total bounding box as they do within the canvas.
For example, taking the web side as an example, assume that there are currently three bounding boxes (boxes) within the view volume. A total bounding box boxTotal is calculated using the vertices of the three regions, and a canvas with a maximum side length of N is created according to the aspect ratio of boxTotal (N may be set according to the user's own needs, such as 1024 or 2048). After the canvas is created, its initial color can be set to black; then, according to the relative coordinates of each vertex with respect to the minimum point of boxTotal, canvas methods such as beginPath, moveTo, lineTo and closePath are called to draw the regions at those relative coordinates in the canvas, and the fill color is set to red, which completes the creation of the texture map.
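A minimal sketch of this web-side texture-map creation is given below in TypeScript; it follows the steps just described, while the drawRegionTexture and Region2D names and the default N of 1024 are assumptions of the example.

```typescript
// Sketch: create the texture map on the web side. The canvas keeps the aspect
// ratio of boxTotal (longest side N), is filled with the initial color black,
// and each region's polygon is drawn and filled in red (the set value).
interface Region2D { vertices: { x: number; y: number }[]; }
interface Box2D { minX: number; minY: number; maxX: number; maxY: number; }

function drawRegionTexture(regions: Region2D[], boxTotal: Box2D, N = 1024): HTMLCanvasElement {
  const width = boxTotal.maxX - boxTotal.minX;
  const height = boxTotal.maxY - boxTotal.minY;
  const aspect = width / height;

  const canvas = document.createElement('canvas');
  canvas.width = aspect >= 1 ? N : Math.round(N * aspect);
  canvas.height = aspect >= 1 ? Math.round(N / aspect) : N;

  const ctx = canvas.getContext('2d')!;
  ctx.fillStyle = 'black';                              // initial color
  ctx.fillRect(0, 0, canvas.width, canvas.height);

  ctx.fillStyle = 'red';                                // set value
  for (const region of regions) {
    ctx.beginPath();
    region.vertices.forEach((v, i) => {
      // Relative coordinates of the vertex with respect to the minimum point of
      // boxTotal, scaled to the canvas size.
      const cx = ((v.x - boxTotal.minX) / width) * canvas.width;
      const cy = ((v.y - boxTotal.minY) / height) * canvas.height;
      if (i === 0) { ctx.moveTo(cx, cy); } else { ctx.lineTo(cx, cy); }
    });
    ctx.closePath();
    ctx.fill();
  }
  return canvas; // used as the texture map passed to the GPU
}
```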
Next, in step 104, the texture map is sampled based on vertices to be rendered in the three-dimensional scene, and whether to update the vertices to be rendered is determined according to the sampled values.
The vertices to be rendered are the vertices of the three-dimensional scene. When the three-dimensional scene is displayed, it can be rendered in real time by the GPU. Specifically, the texture map is passed to the GPU, the GPU shader samples the texture map to obtain the sampling value corresponding to a vertex to be rendered in the three-dimensional scene, and whether that vertex is in the region to be updated can be determined directly from the sampling value. In this embodiment, no intersection operation between the region to be updated and the vertices of the three-dimensional scene is needed; whether a vertex is in the region to be updated is determined by sampling the texture map, which saves computing resources and improves computing efficiency.
Specifically, the process of sampling includes the following: obtaining the maximum point and the minimum point in the bounding box to be updated; comparing the vertex to be rendered with the maximum point and the minimum point, and determining whether the vertex to be rendered is in the bounding box to be updated; and if the vertex to be rendered is in the bounding box to be updated, sampling the texture map to obtain a sampling value corresponding to the vertex to be rendered.
The maximum point and the minimum point of the bounding box to be updated, together with the texture map, are passed to the GPU. After the GPU shader obtains the passed texture map, maximum point and minimum point, it compares each vertex to be rendered in the three-dimensional scene with the maximum point and the minimum point to determine whether the vertex falls within the bounding box to be updated. If the vertex to be rendered falls between the maximum point and the minimum point, it is within the bounding box to be updated.
For a vertex to be rendered that falls within the bounding box to be updated, the texture map is sampled to obtain the sampling value corresponding to that vertex. Specifically, the texture map is sampled according to the relative coordinates of the vertex to be rendered within the bounding box to be updated, giving the sampling value at those relative coordinates.
The relative coordinates of the vertex to be rendered within the bounding box to be updated are its texture coordinates, and the texture coordinates are used to sample the texture map and obtain the sampling value. For example, if the vertex to be rendered is within the bounding box to be updated, the x and y coordinates of the minimum point of the bounding box to be updated are subtracted from the x and y coordinates of the current vertex to be rendered, and the results are divided by the length and width of boxTotal to obtain the texture coordinates of the vertex; the texture map is then sampled at these texture coordinates to obtain the sampling value.
It is understood that when there are a plurality of bounding boxes to be updated, the maximum point and the minimum point may be the maximum point and the minimum point of the total bounding box.
If the sampling value is the same as the set value, it is determined that the vertex to be rendered is to be updated; if the sampling value is different from the set value, the vertex to be rendered is not updated. Assuming the set value is red and taking model clipping as an example: if the sampled value is red, the current vertex to be rendered is in the region to be updated, i.e., it needs to be clipped, so the vertex is discarded and not rendered; if the sampled color is black, the vertex to be rendered is outside the region to be updated, no updating (i.e., clipping) is needed, and the vertex is retained.
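To make the per-vertex decision concrete, the following CPU-side TypeScript sketch mirrors the logic the shader performs (in practice this runs on the GPU as described above). It compares only the x and y coordinates, since the texture map is two-dimensional, and the sampleTexture helper, SET_VALUE constant and shouldUpdateVertex name are assumptions of the example.

```typescript
// Sketch: decide whether a vertex to be rendered should be updated. The vertex is
// first tested against the maximum and minimum points of the bounding box to be
// updated (here, the total bounding box), then its texture coordinates are used to
// sample the texture map, and the sampled value is compared with the set value.
interface Vertex { x: number; y: number; z: number; }
interface BoundingBox { min: Vertex; max: Vertex; }

// Stand-in for sampling the texture map at texture coordinates (u, v).
declare function sampleTexture(u: number, v: number): { r: number; g: number; b: number };

const SET_VALUE = { r: 255, g: 0, b: 0 }; // red, as in the clipping example above

function shouldUpdateVertex(vertex: Vertex, boxTotal: BoundingBox): boolean {
  // Outside the bounding box to be updated: the vertex is not updated (it is kept).
  const inside =
    vertex.x >= boxTotal.min.x && vertex.x <= boxTotal.max.x &&
    vertex.y >= boxTotal.min.y && vertex.y <= boxTotal.max.y;
  if (!inside) return false;

  // Texture coordinates: offset from the minimum point divided by the length
  // and width of boxTotal.
  const u = (vertex.x - boxTotal.min.x) / (boxTotal.max.x - boxTotal.min.x);
  const v = (vertex.y - boxTotal.min.y) / (boxTotal.max.y - boxTotal.min.y);

  const sample = sampleTexture(u, v);
  // Same as the set value (red): the vertex lies in the region to be updated,
  // so in the clipping example it would be discarded rather than rendered.
  return sample.r === SET_VALUE.r && sample.g === SET_VALUE.g && sample.b === SET_VALUE.b;
}
```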
Further, the embodiment also provides a three-dimensional scene updating device, which can be used for executing the three-dimensional scene updating method.
As shown in fig. 2, the three-dimensional scene updating apparatus 200 may include: an update region obtaining module 201, configured to obtain a region to be updated in a three-dimensional scene, where the region to be updated includes a plurality of vertices; a bounding box determining module 202, configured to obtain a bounding box of the area to be updated according to vertices included in the area to be updated; the texture map generation module 203 is configured to obtain a bounding box to be updated in the current view from the bounding boxes, and generate a texture map according to the bounding box to be updated; the scene updating module 204 is configured to sample the texture map based on vertices to be rendered in the three-dimensional scene, and determine whether to update the vertices to be rendered according to the sampling value.
In an exemplary embodiment, the texture map generation module 203 specifically includes: and the canvas drawing module is used for generating a canvas consistent with the aspect ratio of the bounding box to be updated, determining the coordinates of the vertexes in the area to be updated in the canvas according to the relative coordinates of the vertexes in the area to be updated in the bounding box to be updated, and modifying the color at the coordinates into a set value to obtain the texture map.
In an exemplary embodiment, the canvas drawing module is specifically configured to: when the bounding boxes to be updated comprise a plurality of bounding boxes, determine a total bounding box of the plurality of bounding boxes to be updated; and generate a canvas consistent with the aspect ratio of the total bounding box.
In an exemplary embodiment, the scene update module 204 specifically includes: the vertex data acquisition module is used for acquiring the maximum point and the minimum point in the bounding box to be updated; the range judging module is used for comparing the vertex to be rendered with the maximum point and the minimum point and determining whether the vertex to be rendered is in the bounding box to be updated or not; and the sampling module is used for sampling the texture map if the vertex to be rendered is in the bounding box to be updated, so as to obtain a sampling value corresponding to the vertex to be rendered.
In an exemplary embodiment, the sampling module is specifically configured to: and sampling the texture map according to the relative coordinates of the vertex to be rendered in the bounding box to be updated, so as to obtain a sampling value at the relative coordinates.
In an exemplary embodiment, the scene update module 204 specifically includes: the first judging module is used for determining to update the vertex to be rendered if the sampling value is the same as the set value; and the second judging module is used for not updating the vertex to be rendered if the sampling value is different from the set value.
In an exemplary embodiment, the bounding box determining module 202 specifically includes: the distance determining module is used for determining whether the distance between the center point of the bounding box and the camera position is within a preset threshold value; the range determining module is used for determining whether the bounding box is in the current view when the distance is in the preset threshold value; and the bounding box screening module is used for determining the bounding box as a bounding box to be updated if the bounding box is in the current view.
The specific details of each module or unit in the above three-dimensional scene updating device are described in detail in the corresponding three-dimensional scene updating method, so that the details are not repeated here.
The embodiment of the application also provides electronic equipment, and fig. 3 shows a schematic structural diagram of the electronic equipment suitable for implementing the embodiment of the disclosure. The electronic device 300 shown in fig. 3 is only one example and should not be construed as limiting the functionality and scope of use of the disclosed embodiments.
As shown in fig. 3, the electronic device 300 includes a Central Processing Unit (CPU) 301 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 302 or a program loaded from a storage section 308 into a Random Access Memory (RAM) 303. In the RAM303, various programs and data required for the system operation are also stored. The CPU 301, ROM 302, and RAM303 are connected to each other through a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
The following components are connected to the I/O interface 305: an input section 306 including a keyboard, a mouse, and the like; an output section 307 including a cathode ray tube (CRT) or liquid crystal display (LCD), a speaker, and the like; a storage section 308 including a hard disk or the like; and a communication section 309 including a network interface card such as a LAN card or a modem. The communication section 309 performs communication processing via a network such as the Internet. A drive 310 is also connected to the I/O interface 305 as needed. A removable medium 311, such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory, is mounted on the drive 310 as needed, so that a computer program read from it can be installed into the storage section 308 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 309, and/or installed from the removable medium 311. The above-described functions defined in the embodiments of the present application are performed when the computer program is executed by a Central Processing Unit (CPU) 301.
For example, the computer program, when executed by the Central Processing Unit (CPU) 301, may perform the following: acquiring a region to be updated in a three-dimensional scene, wherein the region to be updated comprises a plurality of vertexes; obtaining a bounding box of the area to be updated according to the vertexes included in the area to be updated; acquiring a bounding box to be updated in the current view in the bounding box, and generating a texture map according to the bounding box to be updated; and sampling the texture map based on the vertexes to be rendered in the three-dimensional scene, and determining whether to update the vertexes to be rendered according to the sampling value.
It should be noted that the computer readable medium shown in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware, and the described units may also be provided in a processor. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
As another aspect, the present application also provides a computer-readable medium that may be contained in the electronic device described in the above embodiment; or may exist alone without being incorporated into the electronic device. The computer readable medium carries one or more programs comprising instructions which, when executed by an electronic device, cause the electronic device to implement the methods described in the above embodiments.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit, in accordance with embodiments of the present application. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
The foregoing is merely a specific embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A three-dimensional scene updating method, characterized by comprising:
acquiring a region to be updated in a three-dimensional scene, wherein the region to be updated comprises a plurality of vertexes;
obtaining a bounding box of the area to be updated according to the vertexes included in the area to be updated;
acquiring a bounding box to be updated in the current view in the bounding box, and generating a texture map according to the bounding box to be updated;
and sampling the texture map based on the vertexes to be rendered in the three-dimensional scene, and determining whether to update the vertexes to be rendered according to the sampling value.
2. The method of claim 1, wherein generating a texture map from the bounding box to be updated comprises:
and generating a canvas consistent with the aspect ratio of the bounding box to be updated, determining the coordinates of the vertexes in the area to be updated in the canvas according to the relative coordinates of the vertexes in the area to be updated in the bounding box to be updated, and modifying the color at the coordinates into a set value to obtain the texture map.
3. The method of three-dimensional scene updating according to claim 2, wherein the generating canvas consistent with the aspect ratio of the bounding box to be updated comprises:
when the bounding boxes to be updated comprise a plurality of bounding boxes, determining a total bounding box of the plurality of bounding boxes to be updated;
and generating a canvas consistent with the aspect ratio of the total bounding box.
4. The method of claim 1, wherein the sampling the texture map based on vertices to be rendered in the three-dimensional scene comprises:
obtaining the maximum point and the minimum point in the bounding box to be updated;
comparing the vertex to be rendered with the maximum point and the minimum point, and determining whether the vertex to be rendered is in the bounding box to be updated;
and if the vertex to be rendered is in the bounding box to be updated, sampling the texture map to obtain a sampling value corresponding to the vertex to be rendered.
5. The method for updating a three-dimensional scene according to claim 4, wherein the step of sampling the texture map to obtain a sampling value corresponding to the vertex to be rendered comprises:
and sampling the texture map according to the relative coordinates of the vertex to be rendered in the bounding box to be updated, so as to obtain a sampling value at the relative coordinates.
6. The method for updating a three-dimensional scene according to claim 2, wherein the determining whether to update the vertex to be rendered according to the sampling value comprises:
if the sampling value is the same as the set value, determining to update the vertex to be rendered;
and if the sampling value is different from the set value, not updating the vertex to be rendered.
7. The method of claim 1, wherein the obtaining the bounding box to be updated in the current view of the bounding box comprises:
determining whether a distance between a center point of the bounding box and a camera position is within a preset threshold;
when the distance is within the preset threshold value, determining whether the bounding box is within the current view;
and if the bounding box is in the current view, determining the bounding box as a bounding box to be updated.
8. A three-dimensional scene updating apparatus, characterized by comprising:
the updating area acquisition module is used for acquiring an area to be updated in the three-dimensional scene, wherein the area to be updated comprises a plurality of vertexes;
the bounding box determining module is used for obtaining a bounding box of the area to be updated according to the vertexes included in the area to be updated;
the texture map generation module is used for acquiring a bounding box to be updated in the current view in the bounding box and generating a texture map according to the bounding box to be updated;
and the scene updating module is used for sampling the texture map based on the vertexes to be rendered in the three-dimensional scene and determining whether to update the vertexes to be rendered according to the sampling value.
9. A computer readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the three-dimensional scene updating method according to any of claims 1 to 7.
10. An electronic device comprising a processor and a memory, the memory having stored therein one or more computer programs, the one or more computer programs comprising instructions, which when executed by the electronic device, cause the electronic device to perform the three-dimensional scene updating method of any of claims 1-7.


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination