CN117132695A - Three-dimensional drawing method, three-dimensional drawing device, electronic equipment and storage medium - Google Patents
- Publication number
- CN117132695A (application CN202311068903.9A)
- Authority
- CN
- China
- Prior art keywords
- pixel
- image
- dimensional
- coordinates
- coordinate
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2012—Colour editing, changing, or manipulating; Use of colour codes
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Graphics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- Architecture (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Geometry (AREA)
- Image Generation (AREA)
- Processing Or Creating Images (AREA)
Abstract
The present application provides a three-dimensional drawing method, a three-dimensional drawing device, an electronic device, and a storage medium, and relates to the field of image processing. The method includes: determining a three-dimensional model, the three-dimensional model including a plurality of image patches; for any image patch, performing rasterization processing on the image patch to obtain a pixel image corresponding to the image patch and pixel information corresponding to each target pixel point in the pixel image; calculating the pixel information corresponding to each target pixel point through an image digital signal processor to obtain the color of the pixel image; and generating a three-dimensional image corresponding to the image patch through the pixel image and the color. In this scheme, the image digital signal processor performs three-dimensional drawing in place of the graphics processor, offloading work from the graphics processor and improving the efficiency of three-dimensional drawing.
Description
Technical Field
The present application relates to the field of image processing, and in particular, to a three-dimensional drawing method, apparatus, electronic device, and storage medium.
Background
With the continuous development of display device technology, display devices can apply various processing optimizations to an image, improving image quality and the user experience. Display devices include mobile phones, computers, in-vehicle units, tablet computers, and the like.
On the basis, the display device can perform three-dimensional drawing according to the three-dimensional model to obtain and display a three-dimensional image, and the three-dimensional image can more intuitively represent the image content.
In the related art, the graphics processor of the display device includes a three-dimensional drawing function, and generation of the three-dimensional image is performed by the graphics processor. However, when the graphics processor handles multiple image processing tasks simultaneously, its performance may degrade, making three-dimensional drawing inefficient.
Disclosure of Invention
The application provides a three-dimensional drawing method, a three-dimensional drawing device, electronic equipment and a storage medium, which are used for improving the efficiency of three-dimensional drawing.
In a first aspect, the present application provides a three-dimensional drawing method, including: determining a three-dimensional model, the three-dimensional model comprising a plurality of image patches; performing rasterization processing on any image patch to obtain a pixel image corresponding to the image patch and pixel information corresponding to each target pixel point in the pixel image; calculating the pixel information corresponding to each target pixel point through an image digital signal processor to obtain the color of the pixel image; and generating a three-dimensional image corresponding to the image patch through the pixel image and the color.
In one possible implementation manner, the rasterizing the image patch to obtain a pixel image corresponding to the image patch and pixel information corresponding to each target pixel point in the pixel image includes: determining a model observation matrix and a projection matrix of the three-dimensional model; calculating to obtain a model observation projection matrix through the model observation matrix and the projection matrix; and carrying out rasterization processing on the image patch by the model observation projection matrix to obtain a pixel image corresponding to the image patch and the pixel information corresponding to each target pixel point in the pixel image.
In one possible embodiment, the pixel patch is triangular; the pixel image is displayed in a display window; performing rasterization processing on the image patch by the model observation projection matrix to obtain a pixel image corresponding to the image patch and the pixel information corresponding to each target pixel point in the pixel image, wherein the pixel information comprises; determining three vertex model coordinates of the image patch, wherein each vertex model coordinate corresponds to one of the vertices of the image patch; mapping each vertex model coordinate through the model observation projection matrix to obtain a corresponding vertex three-dimensional coordinate; determining the length and the width of the display window; and carrying out transformation processing on the three-dimensional coordinates of the vertex, the length and the width to obtain the pixel image and the pixel information corresponding to each target pixel point in the pixel image.
In one possible embodiment, the vertex pixel coordinates include a length coordinate, a width coordinate, and a depth coordinate; and transforming the vertex three-dimensional coordinates, the length, and the width to obtain the pixel image and the pixel information corresponding to each target pixel point in the pixel image includes: transforming the vertex three-dimensional coordinates, the length, and the width to obtain the vertex pixel coordinates of the pixel image; determining, through the vertex pixel coordinates, a pixel range of the pixel image, a plurality of to-be-selected pixel points in the pixel range, and the length coordinate and the width coordinate of each to-be-selected pixel point; performing interpolation calculation processing through the depth coordinates of the vertex pixel coordinates to obtain the depth coordinate of each to-be-selected pixel point; determining coordinates of a plurality of historical pixel points, where the historical pixel points are pixel points in the pixel image corresponding to each historical image patch; and determining, through the coordinates of the plurality of historical pixel points, a plurality of target pixel points from the plurality of to-be-selected pixel points and the pixel information corresponding to each target pixel point.
In one possible implementation manner, determining, from the plurality of candidate pixels, a plurality of target pixels and the pixel information corresponding to each target pixel by using coordinates of the plurality of historical pixels includes: for any one pixel point to be selected, determining whether a corresponding pixel point exists in the plurality of historical pixel points, wherein the length coordinate of the corresponding pixel point is the same as the length coordinate of the pixel point to be selected, and the width coordinate of the corresponding pixel point is the same as the width coordinate of the pixel point to be selected; if the corresponding pixel point does not exist, determining the pixel point to be selected as the target pixel point; if the corresponding pixel point exists, determining whether the depth coordinate of the pixel point to be selected is smaller than the depth coordinate of the corresponding pixel point; if the depth coordinate of the pixel point to be selected is smaller than the depth coordinate of the corresponding pixel point, determining the pixel point to be selected as the target pixel point, and updating the depth coordinate of the corresponding pixel point through the depth coordinate of the pixel point to be selected; determining model information of each vertex in the image patch, and determining the model information as pixel information of each vertex in the pixel image; and determining the pixel information of each target pixel point in the pixel image according to the pixel information of each vertex.
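By way of illustration only, the selection described above resembles a classic depth-buffer test. The sketch below (all names hypothetical) assumes each to-be-selected pixel point carries (length, width, depth) coordinates and that historical depth coordinates are kept in a map keyed by (length, width):

```python
def select_target_pixels(candidates, depth_buffer):
    """Depth-comparison sketch of target-pixel selection.

    `candidates` is a list of (x, y, depth) pixel points for the current
    image patch; `depth_buffer` maps (x, y) -> depth of the nearest
    historical pixel point. A candidate becomes a target pixel point if no
    historical pixel point occupies its (x, y), or if it is nearer
    (smaller depth); in that case the stored depth is updated.
    """
    targets = []
    for x, y, depth in candidates:
        prev = depth_buffer.get((x, y))
        if prev is None or depth < prev:
            depth_buffer[(x, y)] = depth  # update with the nearer depth
            targets.append((x, y, depth))
    return targets

# Illustrative run: the second patch's nearer pixel replaces the first.
buffer = {}
first = select_target_pixels([(0, 0, 2.0)], buffer)
second = select_target_pixels([(0, 0, 1.0), (0, 0, 3.0)], buffer)
```

With this sketch, `first` keeps the only candidate, while in `second` only the nearer (depth 1.0) candidate survives and the buffer is updated accordingly.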
In one possible implementation, the pixel information includes texture information and material information; and calculating, by the image digital signal processor, the pixel information corresponding to each target pixel point to obtain the color of the pixel image includes: performing calculation processing on the texture information to obtain a plurality of color types; determining the mixing proportion of each color type through the material information; and performing color mixing processing through the plurality of color types and the mixing proportions to obtain the color of the pixel image.
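As an illustrative sketch of the color mixing step (the RGB triples and the assumption that the mixing proportions sum to 1 are ours, not stated in the application):

```python
def blend_color(color_types, mix_ratios):
    """Weighted mix: each color type (an RGB triple derived from the
    texture information) is weighted by its mixing proportion (derived
    from the material information)."""
    r = sum(c[0] * w for c, w in zip(color_types, mix_ratios))
    g = sum(c[1] * w for c, w in zip(color_types, mix_ratios))
    b = sum(c[2] * w for c, w in zip(color_types, mix_ratios))
    return (r, g, b)

# Half red, half blue -> purple.
mixed = blend_color([(1.0, 0.0, 0.0), (0.0, 0.0, 1.0)], [0.5, 0.5])
```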
In one possible embodiment, the method further comprises: storing the three-dimensional image corresponding to the image patch in an image buffer area; and combining the three-dimensional images in the image buffer area until the three-dimensional image corresponding to each image patch is generated, so as to obtain a three-dimensional combined image corresponding to the three-dimensional model.
In a second aspect, the present application provides a three-dimensional drawing device, including: a determination module, configured to determine a three-dimensional model, the three-dimensional model including a plurality of image patches; a pixelation module, configured to perform rasterization processing on any image patch to obtain a pixel image corresponding to the image patch and pixel information corresponding to each target pixel point in the pixel image; a calculation module, configured to calculate, through an image digital signal processor, the pixel information corresponding to each target pixel point to obtain the color of the pixel image; and a generation module, configured to generate a three-dimensional image corresponding to the image patch through the pixel image and the color.
In a possible implementation manner, the pixelation module is specifically configured to determine a model observation matrix and a projection matrix of the three-dimensional model; the pixelation module is specifically further configured to calculate a model observation projection matrix through the model observation matrix and the projection matrix; and the pixelation module is specifically further configured to perform rasterization processing on the image patch through the model observation projection matrix to obtain the pixel image corresponding to the image patch and the pixel information corresponding to each target pixel point in the pixel image.
In one possible embodiment, the image patch is triangular; the pixel image is displayed in a display window; the pixelation module is specifically configured to determine three vertex model coordinates of the image patch, where each vertex model coordinate corresponds to one vertex of the image patch; the pixelation module is specifically further configured to map each vertex model coordinate through the model observation projection matrix to obtain a corresponding vertex three-dimensional coordinate; the pixelation module is specifically further configured to determine the length and the width of the display window; and the pixelation module is specifically further configured to perform transformation processing through the vertex three-dimensional coordinates, the length, and the width to obtain the pixel image and the pixel information corresponding to each target pixel point in the pixel image.
In one possible embodiment, the vertex pixel coordinates include length coordinates, width coordinates, and depth coordinates; the pixelation module is specifically configured to perform transformation processing through the three-dimensional coordinate of the vertex, the length and the width to obtain a coordinate of a vertex pixel of the pixel image; the pixelation module is specifically further configured to determine a pixel range of the pixel image, a plurality of to-be-selected pixel points in the pixel range, and a length coordinate and a width coordinate of each to-be-selected pixel point according to the vertex pixel coordinates; the pixelation module is specifically further configured to perform interpolation calculation processing through the depth coordinate of the vertex pixel coordinate to obtain the depth coordinate of each pixel point to be selected; the pixelation module is specifically configured to determine coordinates of a plurality of historical pixel points, where the historical pixel points are pixel points in a pixel image corresponding to each historical image patch; the pixelation module is specifically further configured to determine a plurality of target pixel points from the plurality of candidate pixel points according to coordinates of the plurality of historical pixel points, and the pixel information corresponding to each target pixel point.
In a possible implementation manner, the pixelation module is specifically configured to determine, for any one pixel to be selected, whether a corresponding pixel exists in the plurality of historical pixel points, where a length coordinate of the corresponding pixel is the same as a length coordinate of the pixel to be selected, and a width coordinate of the corresponding pixel is the same as a width coordinate of the pixel to be selected; the pixelation module is specifically further configured to determine the pixel to be selected as the target pixel if the corresponding pixel does not exist; the pixelation module is specifically further configured to determine whether the depth coordinate of the pixel to be selected is smaller than the depth coordinate of the corresponding pixel if the corresponding pixel exists; the pixelation module is specifically further configured to determine the pixel to be selected as the target pixel if the depth coordinate of the pixel to be selected is smaller than the depth coordinate of the corresponding pixel, and update the depth coordinate of the corresponding pixel according to the depth coordinate of the pixel to be selected; the pixelation module is specifically further configured to determine model information of each vertex in the image patch, and determine the model information as pixel information of each vertex in the pixel image; the pixelation module is specifically further configured to determine pixel information of each target pixel point in the pixel image according to the pixel information of each vertex.
In one possible implementation, the pixel information includes texture information and material information; the apparatus further includes: a mixing module, configured to perform calculation processing through the texture information to obtain a plurality of color types; the mixing module is further configured to determine the mixing proportion of each color type through the material information; and the mixing module is further configured to perform color mixing processing through the plurality of color types and the mixing proportions to obtain the color of the pixel image.
In one possible embodiment, the apparatus further comprises: the combination module is used for storing the three-dimensional image corresponding to the image patch in an image buffer area; and the combination module is also used for carrying out combination processing on the three-dimensional images in the image buffer area until the three-dimensional images corresponding to each image patch are generated, so as to obtain the three-dimensional combined image corresponding to the three-dimensional model.
In a third aspect, the present application provides an electronic device comprising: a processor, and a memory communicatively coupled to the processor; the memory stores computer-executable instructions; the processor executes computer-executable instructions stored in the memory to implement the method of any one of the first aspects.
In a fourth aspect, the present application provides a computer-readable storage medium having stored therein computer-executable instructions for performing the method of any of the first aspects by a processor.
In a fifth aspect, the present application provides a computer program product comprising a computer program for execution by a processor of the method according to any one of the first aspects.
The present application provides a three-dimensional drawing method, a three-dimensional drawing device, an electronic device, and a storage medium. The method includes the following steps: determining a three-dimensional model, the three-dimensional model including a plurality of image patches; performing rasterization processing on any image patch to obtain a pixel image corresponding to the image patch and pixel information corresponding to each target pixel point in the pixel image; calculating the pixel information corresponding to each target pixel point through an image digital signal processor to obtain the color of the pixel image; and generating a three-dimensional image corresponding to the image patch through the pixel image and the color. In this scheme, the image digital signal processor performs three-dimensional drawing in place of the graphics processor, offloading work from the graphics processor and improving the efficiency of three-dimensional drawing.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic diagram of an application scenario of a three-dimensional drawing method according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of a three-dimensional drawing method according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of a three-dimensional drawing method according to an embodiment of the present application;
FIG. 4 is a schematic view of a model observation projection provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of determining a pixel range according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a determined history pixel provided in an embodiment of the present application;
FIG. 7 is a schematic diagram of pixel information according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a three-dimensional drawing device according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of a three-dimensional drawing device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Specific embodiments of the present application have been shown by way of the above drawings and will be described in more detail below. The drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but rather to illustrate the inventive concepts to those skilled in the art by reference to the specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
Fig. 1 is a schematic view of an application scenario of a three-dimensional drawing method according to an embodiment of the present application. With reference to the illustrated scenario: the three-dimensional model is obtained by modeling, and it can be displayed only after being converted into a three-dimensional image. The three-dimensional model is rasterized to obtain a pixel image in the display window, the pixel image covering the range of pixel points corresponding to the three-dimensional image; the pixel image is then rendered, adding a corresponding color to each pixel point, so as to obtain the three-dimensional image.
The technical solution of the present application will be described in detail below with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments. Unless explicitly stated and limited otherwise, terms used in describing the present application should be construed broadly as understood in the art. Embodiments of the present application are described below with reference to the accompanying drawings.
Fig. 2 is a flow chart of a three-dimensional drawing method according to an embodiment of the application, the method includes the following steps:
s201, determining a three-dimensional model, wherein the three-dimensional model comprises a plurality of image patches.
As an example, the execution subject of this embodiment may be a three-dimensional drawing device, which can be implemented in various ways. For example, it may be software, or a medium storing a related computer program, such as a USB flash drive; alternatively, it may be a physical device in which the relevant computer program is integrated or installed, such as a chip, a smart terminal, a computer, or a server.
The image surface patch is a basic image of a three-dimensional model, and the three-dimensional model is composed of a plurality of image surface patches.
In practical applications, triangles are used as image patches.
S202, for any image patch, rasterizing the image patch to obtain a pixel image corresponding to the image patch and pixel information corresponding to each target pixel point in the pixel image.
Optionally, the pixel information includes, but is not limited to: color texture, concave-convex texture, normal, material, etc.
The three-dimensional image obtained by drawing the three-dimensional model is displayed in a display window, and the display window is composed of a plurality of pixel points, so that the three-dimensional image is generated through the pixel images.
For example, the target pixel is a pixel displayed by the three-dimensional image on the display window.
Optionally, each rasterized image patch is stored in a rasterization buffer.
S203, calculating the pixel information corresponding to each target pixel point through an image digital signal processor to obtain the color of the pixel image.
Optionally, the data in the rasterization buffer is homogeneous (uniformly structured) data, which the image digital signal processor reads and processes item by item.
It can be understood that, in the related art, the colors of a three-dimensional image are processed by a shader of the graphics processor, so when the graphics processor handles multiple tasks simultaneously, the efficiency of color processing may decrease. The present application processes color through the image digital signal processor, which can offload this task from the graphics processor.
S204, generating a three-dimensional image corresponding to the image patch through the pixel image and the color.
The colors are filled into the corresponding target pixel points to obtain the three-dimensional image.
Optionally, the image patches are in one-to-one correspondence with the three-dimensional images, and the three-dimensional images are generated one by one.
One possible implementation manner is to store the three-dimensional image corresponding to the image patch in an image buffer area; and combining the three-dimensional images in the image buffer area until the three-dimensional image corresponding to each image patch is generated, so as to obtain a three-dimensional combined image corresponding to the three-dimensional model.
Wherein the three-dimensional combined image is the final output image.
In combination with the scene example, the image patches correspond one-to-one with the three-dimensional images; when the number of three-dimensional images in the image buffer equals the number of image patches, all image patches have been processed.
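A sketch of the depth-aware combination, assuming each buffered three-dimensional image records a depth coordinate and a color per pixel point (the dict representation is illustrative only):

```python
def combine_images(image_buffer):
    """Merge the per-patch three-dimensional images held in the image
    buffer into one combined image, honouring the occlusion relation:
    at each (x, y), keep the color whose depth coordinate is smallest
    (nearest). Each image is a dict (x, y) -> (depth, color)."""
    combined = {}
    for image in image_buffer:
        for xy, (depth, color) in image.items():
            if xy not in combined or depth < combined[xy][0]:
                combined[xy] = (depth, color)
    return combined

# Illustrative run: the nearer blue pixel occludes the red one at (0, 0).
image_buffer = [{(0, 0): (2.0, "red")},
                {(0, 0): (1.0, "blue"), (1, 0): (3.0, "green")}]
combined = combine_images(image_buffer)
```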
In this possible implementation, each three-dimensional image contains depth coordinates, i.e., an occlusion relation exists among the three-dimensional images. Storing the three-dimensional images in the image buffer allows them to be combined according to the occlusion relation, improving the accuracy of the generated three-dimensional combined image.
Optionally, the generated three-dimensional image is stored in an image buffer until a three-dimensional image corresponding to each image patch is generated.
The three-dimensional drawing method provided by this embodiment of the present application determines a three-dimensional model including a plurality of image patches; performs rasterization processing on any image patch to obtain a pixel image corresponding to the image patch and pixel information corresponding to each target pixel point in the pixel image; calculates the pixel information corresponding to each target pixel point through an image digital signal processor to obtain the color of the pixel image; and generates a three-dimensional image corresponding to the image patch through the pixel image and the color. In this scheme, the image digital signal processor performs three-dimensional drawing in place of the graphics processor, offloading work from the graphics processor and improving the efficiency of three-dimensional drawing.
On the basis of any of the above embodiments, a detailed process of three-dimensional drawing will be described below with reference to fig. 3.
Fig. 3 is a flow chart of a three-dimensional drawing method according to an embodiment of the application. As shown in fig. 3, the method includes:
s301, determining a three-dimensional model, wherein the three-dimensional model comprises a plurality of image patches.
It should be noted that, the execution process of S301 is referred to S201, and will not be described herein.
S302, determining a model observation matrix and a projection matrix of the three-dimensional model; and calculating to obtain a model observation projection matrix through the model observation matrix and the projection matrix.
The model observation matrix and the projection matrix represent the mapping relation between the image patch and the pixel image, and the pixel image corresponding to the image patch is determined through the model observation matrix and the projection matrix.
Optionally, the product of the model observation matrix and the projection matrix is determined as the model observation projection matrix.
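As a non-authoritative illustration, this composition of the two matrices can be sketched in Python; the `mat_mul` helper and the example matrices are hypothetical, with plain nested lists standing in for whatever matrix representation an implementation uses:

```python
def mat_mul(a, b):
    """Multiply two 4x4 matrices stored as row-major nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Illustrative inputs: an identity model observation matrix and a simple
# diagonal projection matrix.
model_observation = [[1.0 if i == j else 0.0 for j in range(4)]
                     for i in range(4)]
projection = [[2.0, 0, 0, 0], [0, 2.0, 0, 0], [0, 0, 1.0, 0], [0, 0, 0, 1.0]]

# Model observation projection matrix = projection x model observation.
mvp = mat_mul(projection, model_observation)
```

With an identity model observation matrix the product is simply the projection matrix, which makes the composition easy to check.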
Next, a model observation projection will be described with reference to fig. 4.
Fig. 4 is a schematic view of a model observation projection provided in an embodiment of the present application. As shown in fig. 4, when the three-dimensional model is observed through a fixed observation point, an image patch of the three-dimensional model is observed, and the image patch is projected on a display window to obtain a pixel image.
S303, determining three vertex model coordinates of the image surface patch, wherein each vertex model coordinate corresponds to one vertex of the image surface patch.
Wherein the image patch is triangular; the pixel image is displayed in a display window; and the vertex model coordinates are the coordinates of the vertices of the image patch in the three-dimensional model.
S304, mapping each vertex model coordinate through the model observation projection matrix to obtain a corresponding vertex three-dimensional coordinate.
Wherein the vertex three-dimensional coordinates describe the position of the vertex of the pixel image in the display window.
Optionally, the product of the model observation projection matrix and the vertex model coordinates is determined as vertex three-dimensional coordinates.
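The mapping of S304 can then be sketched as this matrix-vector product; the homogeneous-coordinate treatment and the names are assumptions for illustration:

```python
import numpy as np

def map_vertex(mvp: np.ndarray, model_coord) -> np.ndarray:
    """Map a vertex model coordinate to a vertex three-dimensional coordinate
    by multiplying with the model observation projection matrix."""
    v = np.append(np.asarray(model_coord, dtype=float), 1.0)  # homogeneous form (x, y, z, 1)
    return (mvp @ v)[:3]
```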
S305, determining the length and the width of the display window.
The length and the width of the display window are the size of the display window, the size of the display window is the same as the size of the display screen, and the length and the width are both in units of pixels.
For example, a three-dimensional image is displayed through a display screen, and a user views the three-dimensional image through the display screen.
And S306, carrying out transformation processing on the three-dimensional coordinates of the vertexes, the length and the width to obtain the pixel coordinates of the vertexes of the pixel image.
For example, the three-dimensional coordinates of the vertex are three-dimensional coordinates including a length coordinate, a width coordinate and a depth coordinate, and the length coordinate and the width coordinate of the three-dimensional coordinates of the vertex are normalized by the depth coordinate of the three-dimensional coordinates of the vertex to obtain the length coordinate and the width coordinate of the pixel coordinate of the vertex.
For example, if the three-dimensional coordinates of the vertex are (xt, yt, zt), the length is w, and the width is h, then the pixel coordinates of the vertex are (xt·w/zt, yt·h/zt).
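The transformation of S306, read as (xt·w/zt, yt·h/zt), can be sketched as follows (the function name is an assumption):

```python
def vertex_to_pixel(xt: float, yt: float, zt: float, w: int, h: int):
    """Normalize the length and width coordinates by the depth coordinate and
    scale by the display window size, giving the vertex pixel coordinates."""
    return (xt * w / zt, yt * h / zt)

# A vertex at (0.5, 0.25, 2.0) in an 800x600 window:
vertex_to_pixel(0.5, 0.25, 2.0, 800, 600)  # → (200.0, 75.0)
```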
S307, determining a pixel range of the pixel image, a plurality of to-be-selected pixel points in the pixel range, and length coordinates and width coordinates of each to-be-selected pixel point through the vertex pixel coordinates.
Next, a description will be given of determining a pixel range with reference to fig. 5.
Fig. 5 is a schematic diagram of determining a pixel range according to an embodiment of the present application. As shown in fig. 5, vertex pixel coordinates of three vertices are determined, a triangle range within the vertex pixel coordinates is determined as a pixel range of the pixel image, and a pixel point within the pixel range is a pixel point to be selected.
Optionally, a rectangular range is determined according to the minimum length coordinate, the maximum length coordinate, the minimum width coordinate and the maximum width coordinate among the vertex pixel coordinates of the three vertices; this rectangle is the circumscribed rectangle of the triangle. The pixel points within the rectangular range are scanned, and the pixel points that fall within the triangular range corresponding to the three vertices are determined as the pixel points to be selected.
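The bounding-rectangle scan of S307 can be sketched with an edge-function inside test, a common rasterization technique; the patent itself does not prescribe which inside test is used, and the names below are assumptions:

```python
def candidate_pixels(v0, v1, v2):
    """Scan the circumscribed rectangle of a triangle's vertex pixel coordinates
    and keep the integer pixel points inside the triangle."""
    def edge(a, b, p):
        # Signed area test: positive when p lies to the left of edge a->b.
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

    xs = [v0[0], v1[0], v2[0]]
    ys = [v0[1], v1[1], v2[1]]
    picked = []
    for x in range(int(min(xs)), int(max(xs)) + 1):
        for y in range(int(min(ys)), int(max(ys)) + 1):
            w0 = edge(v1, v2, (x, y))
            w1 = edge(v2, v0, (x, y))
            w2 = edge(v0, v1, (x, y))
            # Inside when all edge functions share a sign (handles both windings).
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                picked.append((x, y))
    return picked
```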
And S308, carrying out interpolation calculation processing through the depth coordinates of the vertex pixel coordinates to obtain the depth coordinates of each pixel point to be selected.
In combination with a scene example, the range of the depth coordinates of the pixel points within the pixel image is determined through the depth coordinates of the vertex pixel coordinates, and interpolation calculation is performed within this range to obtain the depth coordinate of each pixel point to be selected.
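One common way to realise the interpolation of S308 is barycentric weighting of the three vertex depths; this is an assumption for illustration, as the text only specifies that interpolation is performed:

```python
def interpolate_depth(p, v0, v1, v2, z0, z1, z2):
    """Interpolate a candidate pixel's depth coordinate from the depth
    coordinates of the three vertex pixel coordinates using barycentric weights."""
    def edge(a, b, c):
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

    area = edge(v0, v1, v2)            # twice the signed triangle area
    w0 = edge(v1, v2, p) / area        # barycentric weight of vertex 0
    w1 = edge(v2, v0, p) / area        # barycentric weight of vertex 1
    w2 = edge(v0, v1, p) / area        # barycentric weight of vertex 2
    return w0 * z0 + w1 * z1 + w2 * z2
```

At the centroid of a triangle the three weights are each 1/3, so the interpolated depth is the mean of the vertex depths.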
S309, determining coordinates of a plurality of historical pixel points, wherein the historical pixel points are pixel points in a pixel image corresponding to each historical image patch.
Optionally, the rasterization buffer stores pixel coordinates of a plurality of historical pixel points for each pixel image. And merging the historical pixel points with the same length coordinates and width coordinates.
Alternatively, after the historical pixel points are merged, the depth coordinate of the merged historical pixel point is determined as the smaller of their depth coordinates.
Next, the determination of the history pixel point will be described with reference to fig. 6.
Fig. 6 is a schematic diagram of determining a historical pixel point according to an embodiment of the present application. As shown in fig. 6, the length coordinates of pixel point 1 and pixel point 2 are the same, the width coordinates of pixel point 1 and pixel point 2 are the same, and the smaller depth coordinate value, 300, is determined as the depth coordinate of the historical pixel point.
For example, in connection with a scene, a smaller value of the depth coordinate indicates that the pixel is in front, and a larger value of the depth coordinate indicates that the pixel is behind, and when displayed, the pixel in front obscures the pixel behind.
S310, determining a plurality of target pixel points and the pixel information corresponding to each target pixel point from the plurality of candidate pixel points through the coordinates of the plurality of historical pixel points.
For any one pixel point to be selected, it is determined whether a corresponding pixel point exists among the plurality of historical pixel points, where the length coordinate of the corresponding pixel point is the same as the length coordinate of the pixel point to be selected and the width coordinate of the corresponding pixel point is the same as the width coordinate of the pixel point to be selected. If the corresponding pixel point does not exist, the pixel point to be selected is determined as the target pixel point. If the corresponding pixel point exists, it is determined whether the depth coordinate of the pixel point to be selected is smaller than the depth coordinate of the corresponding pixel point; if so, the pixel point to be selected is determined as the target pixel point, and the depth coordinate of the corresponding pixel point is updated with the depth coordinate of the pixel point to be selected. The model information of each vertex in the image patch is determined and used as the pixel information of that vertex in the pixel image, and the pixel information of each target pixel point in the pixel image is then determined from the pixel information of each vertex.
In combination with the scene example, if no corresponding pixel exists, it is indicated that there is no shielding of the historical pixel at the position of the pixel to be selected, the pixel to be selected is determined as the target pixel, and if the corresponding pixel exists, whether the corresponding pixel is shielded or not is required to be judged. If the depth coordinate of the pixel to be selected is smaller than the depth coordinate of the corresponding pixel, the corresponding pixel is displayed at the back, the pixel to be selected is not blocked, and the pixel to be selected is determined to be the target pixel.
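The occlusion logic of S310 amounts to a depth-buffer test and can be sketched as follows (the names and the dictionary representation of the historical pixel points are assumptions):

```python
def depth_test(candidates, depth_buffer):
    """Keep the candidate pixel points that are unoccluded: a candidate becomes
    a target pixel point when no historical pixel point shares its (x, y), or
    when its depth coordinate is smaller (nearer to the observation point).
    `depth_buffer` maps (x, y) -> historical depth and is updated in place."""
    targets = []
    for (x, y, z) in candidates:
        prev = depth_buffer.get((x, y))
        if prev is None or z < prev:
            targets.append((x, y, z))
            depth_buffer[(x, y)] = z   # update the corresponding pixel's depth
    return targets
```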
In this possible implementation, the occlusion relation of the pixel points can be accurately determined by comparing depth coordinates, which improves the accuracy of the three-dimensional drawing.
S311, calculating the pixel information corresponding to each target pixel point through an image digital signal processor to obtain the color of the pixel image.
Next, pixel information will be described with reference to fig. 7.
Fig. 7 is a schematic diagram of pixel information according to an embodiment of the present application. As shown in fig. 7, the pixel information is determined by a three-dimensional model, and each target pixel point in the pixel image has corresponding pixel information. And determining the color of the target pixel point through the pixel information, and sequentially executing the processing on each target pixel point to obtain the color of the pixel image.
A possible implementation manner, the pixel information includes material information and texture information; calculating the texture information to obtain a plurality of color types; determining the mixing proportion of each color type through the material information; and performing color mixing processing through the plurality of color types and the mixing proportion to obtain the color of the pixel image.
In combination with a scene example, the pixel information includes a material identifier, and the mixing proportion corresponding to the material identifier is determined through a preset corresponding relation. The texture information includes a color texture identifier and a concave-convex texture identifier, and the texture color is determined through the color texture identifier and a preset corresponding relation. The pixel information further includes a normal value; a tangent vector is determined through the concave-convex texture identifier and a preset corresponding relation, the tangent vector and the normal value are normalized to obtain the normal, and the diffuse reflection color and the highlight color are calculated from preset light source information and the normal. The texture color, the diffuse reflection color and the highlight color are mixed with the mixing proportion as weights to obtain the color of the target pixel point. The preset corresponding relation is stored in the memory.
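The final mixing step can be sketched as a weighted sum of the three colour contributions; the function name and the RGB tuple representation are assumptions for illustration:

```python
def blend_color(texture_color, diffuse_color, specular_color, weights):
    """Mix the texture color, diffuse reflection color and highlight (specular)
    color per channel, using the mixing proportions from the material
    information as weights."""
    wt, wd, ws = weights
    return tuple(wt * t + wd * d + ws * s
                 for t, d, s in zip(texture_color, diffuse_color, specular_color))
```

For instance, with weights (0.5, 0.3, 0.2) and pure red, green and blue inputs, the mixed colour is simply (0.5, 0.3, 0.2).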
In this possible implementation, the color of the pixel image is obtained by mixing the pixel information derived from the three-dimensional model, so the model color of the three-dimensional model can be accurately reflected.
And S312, generating a three-dimensional image corresponding to the image patch through the pixel image and the color.
It should be noted that the execution process of S312 is the same as that of S204, and will not be described again here.
Fig. 8 is a schematic structural diagram of a three-dimensional drawing device according to an embodiment of the present application. As shown in fig. 8, the three-dimensional drawing device 80 may include: a determination module 81, a pixelation module 82, a calculation module 83, and a generation module 84, wherein,
the determining module 81 is configured to determine a three-dimensional model, where the three-dimensional model includes a plurality of image patches.
The pixelation module 82 is configured to perform rasterization processing on an arbitrary image patch, so as to obtain a pixel image corresponding to the image patch and pixel information corresponding to each target pixel point in the pixel image.
The calculating module 83 is configured to calculate, by using an image digital signal processor, the pixel information corresponding to each target pixel point, so as to obtain a color of the pixel image.
The generating module 84 is configured to generate a three-dimensional image corresponding to the image patch according to the pixel image and the color.
Alternatively, the determining module 81 may perform S201 in the embodiment of fig. 2.
Alternatively, the pixelation module 82 may perform S202 in the embodiment of fig. 2.
Alternatively, the calculation module 83 may perform S203 in the embodiment of fig. 2.
Alternatively, the generation module 84 may perform S204 in the embodiment of fig. 2.
It should be noted that, the three-dimensional drawing device shown in the embodiment of the present application may execute the technical scheme shown in the embodiment of the method, and its implementation principle and beneficial effects are similar, and will not be described herein again.
In one possible implementation, the pixelation module 82 is specifically configured to:
determining a model observation matrix and a projection matrix of the three-dimensional model;
calculating to obtain a model observation projection matrix through the model observation matrix and the projection matrix;
and carrying out rasterization processing on the image patch by the model observation projection matrix to obtain a pixel image corresponding to the image patch and the pixel information corresponding to each target pixel point in the pixel image.
In one possible embodiment, the image patch is triangular; the pixel image is displayed in a display window; the pixelation module 82 is specifically configured to:
determining three vertex model coordinates of the image patch, wherein each vertex model coordinate corresponds to one of the vertices of the image patch;
mapping each vertex model coordinate through the model observation projection matrix to obtain a corresponding vertex three-dimensional coordinate;
determining the length and the width of the display window;
and carrying out transformation processing on the three-dimensional coordinates of the vertex, the length and the width to obtain the pixel image and the pixel information corresponding to each target pixel point in the pixel image.
In one possible embodiment, the vertex pixel coordinates include length coordinates, width coordinates, and depth coordinates; the pixelation module 82 is specifically configured to:
transforming the three-dimensional coordinates of the vertexes, the length and the width to obtain the coordinates of the vertexes of the pixel image;
determining a pixel range of the pixel image, a plurality of to-be-selected pixel points in the pixel range, and length coordinates and width coordinates of each to-be-selected pixel point through the vertex pixel coordinates;
Performing interpolation calculation processing through the depth coordinates of the vertex pixel coordinates to obtain the depth coordinates of each pixel point to be selected;
determining coordinates of a plurality of historical pixel points, wherein the historical pixel points are pixel points in a pixel image corresponding to each historical image patch;
and determining a plurality of target pixel points and the pixel information corresponding to each target pixel point from the plurality of to-be-selected pixel points through the coordinates of the plurality of historical pixel points.
In one possible implementation, the pixelation module 82 is specifically configured to:
for any one pixel point to be selected, determining whether a corresponding pixel point exists in the plurality of historical pixel points, wherein the length coordinate of the corresponding pixel point is the same as the length coordinate of the pixel point to be selected, and the width coordinate of the corresponding pixel point is the same as the width coordinate of the pixel point to be selected;
if the corresponding pixel point does not exist, determining the pixel point to be selected as the target pixel point;
if the corresponding pixel point exists, determining whether the depth coordinate of the pixel point to be selected is smaller than the depth coordinate of the corresponding pixel point;
if the depth coordinate of the pixel point to be selected is smaller than the depth coordinate of the corresponding pixel point, determining the pixel point to be selected as the target pixel point, and updating the depth coordinate of the corresponding pixel point through the depth coordinate of the pixel point to be selected;
Determining model information of each vertex in the image patch, and determining the model information as pixel information of each vertex in the pixel image;
and determining the pixel information of each target pixel point in the pixel image according to the pixel information of each vertex.
Fig. 9 is a schematic structural diagram of a three-dimensional drawing device according to an embodiment of the present application. On the basis of the embodiment shown in fig. 8, as shown in fig. 9, the three-dimensional drawing device 90 further includes: a mixing module 85 and a combining module 86, wherein:
the pixel information comprises material information and texture information; the mixing module 85 is configured to:
calculating the texture information to obtain a plurality of color types;
determining the mixing proportion of each color type through the material information;
and performing color mixing processing through the plurality of color types and the mixing proportion to obtain the color of the pixel image.
The combination module 86 is configured to:
storing the three-dimensional image corresponding to the image patch in an image buffer area;
and combining the three-dimensional images in the image buffer area until the three-dimensional image corresponding to each image patch is generated, so as to obtain a three-dimensional combined image corresponding to the three-dimensional model.
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application, as shown in fig. 10, where the electronic device includes:
a processor 291, the electronic device further comprising a memory 292; a communication interface (Communication Interface) 293 and bus 294 may also be included. The processor 291, the memory 292, and the communication interface 293 may communicate with each other via the bus 294. Communication interface 293 may be used for information transfer. The processor 291 may call logic instructions in the memory 292 to perform the methods of the above-described embodiments.
Further, the logic instructions in memory 292 described above may be implemented in the form of software functional units and stored in a computer-readable storage medium when sold or used as a stand-alone product.
The memory 292 is a computer readable storage medium, and may be used to store a software program, a computer executable program, and program instructions/modules corresponding to the methods in the embodiments of the present application. The processor 291 executes functional applications and data processing by running software programs, instructions and modules stored in the memory 292, i.e., implements the methods of the method embodiments described above.
Memory 292 may include a storage program area that may store an operating system, at least one application program required for functionality, and a storage data area; the storage data area may store data created according to the use of the terminal device, etc. Further, memory 292 may include high-speed random access memory, and may also include non-volatile memory.
Embodiments of the present application provide a non-transitory computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, are configured to implement a method as described in the previous embodiments.
The embodiment of the application also provides a computer program product, which comprises a computer program stored in a computer readable storage medium, wherein at least one processor can read the computer program from the computer readable storage medium, and the technical scheme of the three-dimensional drawing method in the embodiment can be realized when the at least one processor executes the computer program.
The embodiment of the application also provides a chip for running the instructions, wherein a computer program is stored on the chip, and when the computer program is executed by the chip, the method for three-dimensional drawing is realized.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or fully authorized by each party, and the collection, use and processing of the related data need to comply with the related laws and regulations and standards of the related country and region, and provide corresponding operation entries for the user to select authorization or rejection.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.
Claims (17)
1. A method of three-dimensional mapping comprising:
determining a three-dimensional model, the three-dimensional model comprising a plurality of image patches;
performing rasterization processing on any image patch to obtain a pixel image corresponding to the image patch and pixel information corresponding to each target pixel point in the pixel image;
calculating the pixel information corresponding to each target pixel point through an image digital signal processor to obtain the color of the pixel image;
and generating a three-dimensional image corresponding to the image patch through the pixel image and the color.
2. The method of claim 1, wherein rasterizing the image patch to obtain a pixel image corresponding to the image patch and pixel information corresponding to each target pixel point in the pixel image comprises:
determining a model observation matrix and a projection matrix of the three-dimensional model;
calculating to obtain a model observation projection matrix through the model observation matrix and the projection matrix;
and carrying out rasterization processing on the image patch by the model observation projection matrix to obtain a pixel image corresponding to the image patch and the pixel information corresponding to each target pixel point in the pixel image.
3. The method of claim 2, wherein the image patch is triangular; the pixel image is displayed in a display window; and performing rasterization processing on the image patch by the model observation projection matrix to obtain a pixel image corresponding to the image patch and the pixel information corresponding to each target pixel point in the pixel image comprises:
determining three vertex model coordinates of the image patch, wherein each vertex model coordinate corresponds to one of the vertices of the image patch;
mapping each vertex model coordinate through the model observation projection matrix to obtain a corresponding vertex three-dimensional coordinate;
determining the length and the width of the display window;
and carrying out transformation processing on the three-dimensional coordinates of the vertex, the length and the width to obtain the pixel image and the pixel information corresponding to each target pixel point in the pixel image.
4. A method according to claim 3, wherein the vertex pixel coordinates comprise length coordinates, width coordinates, and depth coordinates; and transforming the vertex three-dimensional coordinates, the length and the width to obtain the pixel image and the pixel information corresponding to each target pixel point in the pixel image, wherein the pixel information comprises:
Transforming the three-dimensional coordinates of the vertexes, the length and the width to obtain the coordinates of the vertexes of the pixel image;
determining a pixel range of the pixel image, a plurality of to-be-selected pixel points in the pixel range, and length coordinates and width coordinates of each to-be-selected pixel point through the vertex pixel coordinates;
performing interpolation calculation processing through the depth coordinates of the vertex pixel coordinates to obtain the depth coordinates of each pixel point to be selected;
determining coordinates of a plurality of historical pixel points, wherein the historical pixel points are pixel points in a pixel image corresponding to each historical image patch;
and determining a plurality of target pixel points and the pixel information corresponding to each target pixel point from the plurality of to-be-selected pixel points through the coordinates of the plurality of historical pixel points.
5. The method of claim 4, wherein determining a plurality of target pixel points from the plurality of candidate pixel points by coordinates of the plurality of history pixel points, and the pixel information corresponding to each of the target pixel points, comprises:
for any one pixel point to be selected, determining whether a corresponding pixel point exists in the plurality of historical pixel points, wherein the length coordinate of the corresponding pixel point is the same as the length coordinate of the pixel point to be selected, and the width coordinate of the corresponding pixel point is the same as the width coordinate of the pixel point to be selected;
If the corresponding pixel point does not exist, determining the pixel point to be selected as the target pixel point;
if the corresponding pixel point exists, determining whether the depth coordinate of the pixel point to be selected is smaller than the depth coordinate of the corresponding pixel point;
if the depth coordinate of the pixel point to be selected is smaller than the depth coordinate of the corresponding pixel point, determining the pixel point to be selected as the target pixel point, and updating the depth coordinate of the corresponding pixel point through the depth coordinate of the pixel point to be selected;
determining model information of each vertex in the image patch, and determining the model information as pixel information of each vertex in the pixel image;
and determining the pixel information of each target pixel point in the pixel image according to the pixel information of each vertex.
6. The method of any one of claims 1-5, wherein the pixel information includes material information and texture information; calculating, by an image digital signal processor, the pixel information corresponding to each target pixel point to obtain a color of the pixel image, including:
Calculating the texture information to obtain a plurality of color types;
determining the mixing proportion of each color type through the material information;
and performing color mixing processing through the plurality of color types and the mixing proportion to obtain the color of the pixel image.
7. The method according to any one of claims 1-6, further comprising:
storing the three-dimensional image corresponding to the image patch in an image buffer area;
and combining the three-dimensional images in the image buffer area until the three-dimensional image corresponding to each image patch is generated, so as to obtain a three-dimensional combined image corresponding to the three-dimensional model.
8. A three-dimensional drawing device, comprising:
a determination module for determining a three-dimensional model, the three-dimensional model comprising a plurality of image patches;
the pixelation module is used for carrying out rasterization processing on any image patch to obtain a pixel image corresponding to the image patch and pixel information corresponding to each target pixel point in the pixel image;
the calculation module is used for calculating the pixel information corresponding to each target pixel point through an image digital signal processor to obtain the color of the pixel image;
And the generation module is used for generating a three-dimensional image corresponding to the image patch through the pixel image and the color.
9. The apparatus of claim 8, wherein:
the pixelation module is specifically used for determining a model observation matrix and a projection matrix of the three-dimensional model;
the pixelation module is specifically further used for calculating a model observation projection matrix through the model observation matrix and the projection matrix;
the pixelized module is specifically further configured to perform rasterization processing on the image patch by using the model observation projection matrix, so as to obtain a pixel image corresponding to the image patch and the pixel information corresponding to each target pixel point in the pixel image.
10. The apparatus of claim 9, wherein the image patch is triangular; the pixel image is displayed in a display window;
the pixelation module is specifically configured to determine three vertex model coordinates of the image patch, where each vertex model coordinate corresponds to one of vertices of the image patch;
the pixelized module is specifically further configured to map each vertex model coordinate through the model observation projection matrix to obtain a corresponding vertex three-dimensional coordinate;
The pixelation module is specifically used for determining the length and the width of the display window;
the pixelation module is specifically further configured to perform transformation processing through the three-dimensional coordinates of the vertex, the length and the width, so as to obtain the pixel image and the pixel information corresponding to each target pixel point in the pixel image.
11. The apparatus of claim 10, wherein the vertex pixel coordinates comprise length coordinates, width coordinates, and depth coordinates;
the pixelation module is specifically configured to perform transformation processing through the three-dimensional coordinate of the vertex, the length and the width to obtain a coordinate of a vertex pixel of the pixel image;
the pixelation module is specifically further configured to determine a pixel range of the pixel image, a plurality of to-be-selected pixel points in the pixel range, and a length coordinate and a width coordinate of each to-be-selected pixel point according to the vertex pixel coordinates;
the pixelation module is specifically further configured to perform interpolation calculation processing through the depth coordinate of the vertex pixel coordinate to obtain the depth coordinate of each pixel point to be selected;
The pixelation module is specifically configured to determine coordinates of a plurality of historical pixel points, where the historical pixel points are pixel points in a pixel image corresponding to each historical image patch;
the pixelation module is specifically further configured to determine a plurality of target pixel points from the plurality of candidate pixel points according to coordinates of the plurality of historical pixel points, and the pixel information corresponding to each target pixel point.
12. The apparatus of claim 11, wherein:
the pixelation module is specifically configured to determine, for any one pixel point to be selected, whether a corresponding pixel point exists from the plurality of historical pixel points, where a length coordinate of the corresponding pixel point is the same as a length coordinate of the pixel point to be selected, and a width coordinate of the corresponding pixel point is the same as a width coordinate of the pixel point to be selected;
the pixelation module is specifically further configured to determine the pixel to be selected as the target pixel if the corresponding pixel does not exist;
the pixelation module is specifically further configured to determine whether the depth coordinate of the pixel to be selected is smaller than the depth coordinate of the corresponding pixel if the corresponding pixel exists;
The pixelation module is specifically further configured to determine the pixel to be selected as the target pixel if the depth coordinate of the pixel to be selected is smaller than the depth coordinate of the corresponding pixel, and update the depth coordinate of the corresponding pixel according to the depth coordinate of the pixel to be selected;
the pixelation module is specifically further configured to determine model information of each vertex in the image patch, and determine the model information as pixel information of each vertex in the pixel image;
the pixelation module is specifically further configured to determine pixel information of each target pixel point in the pixel image according to the pixel information of each vertex.
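Claim 12 above describes what is, in effect, a depth-buffer (z-buffer) style visibility test: a pixel point to be selected is kept only if no historical pixel occupies the same length/width position, or if it is closer to the viewer (smaller depth coordinate) than that historical pixel. A minimal sketch of this selection logic; the function name `select_target_pixels`, the dict-based depth store, and recording the depth of newly added pixels are illustrative assumptions, not details specified by the patent:

```python
def select_target_pixels(candidates, historical):
    """Select target pixels per the depth test described in claim 12.

    candidates: list of (x, y, depth) tuples for pixels to be selected.
    historical: dict mapping (x, y) -> depth for historical pixels;
                updated in place when a candidate wins the depth test.
    Returns the list of selected target pixels as (x, y, depth) tuples.
    """
    targets = []
    for x, y, depth in candidates:
        key = (x, y)
        if key not in historical:
            # No corresponding historical pixel at this (x, y):
            # the candidate becomes a target pixel.
            targets.append((x, y, depth))
            # Assumption beyond the literal claim text: record its depth
            # so later candidates at the same position are depth-tested.
            historical[key] = depth
        elif depth < historical[key]:
            # Candidate is closer to the viewer: it becomes a target and
            # the stored depth for this position is updated.
            targets.append((x, y, depth))
            historical[key] = depth
    return targets
```

Candidates that share a position with a nearer historical pixel are silently discarded, matching the claim's omission of any action for that case.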
13. The apparatus according to any one of claims 8-12, wherein the pixel information includes texture information and material information; the apparatus further comprises:
the mixing module is configured to perform calculation processing on the texture information to obtain a plurality of color types;
the mixing module is further configured to determine the mixing proportion of each color type according to the material information;
the mixing module is further configured to perform color mixing processing with the plurality of color types and the mixing proportions to obtain the color of the pixel image.
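The color mixing described in claim 13 amounts to a weighted blend of the candidate colors, with the material information supplying the per-color mixing proportions. A hedged sketch of such a blend; normalizing the proportions so they sum to 1 is an assumption on our part, since the patent does not state whether they are already normalized:

```python
def blend_colors(colors, proportions):
    """Blend RGB colors by their mixing proportions.

    colors: list of (r, g, b) tuples (the "color types").
    proportions: list of non-negative weights, one per color.
    Returns the blended (r, g, b) as a tuple of floats.
    """
    total = sum(proportions)
    if total == 0:
        raise ValueError("mixing proportions must not all be zero")
    # Normalize so the proportions sum to 1 (assumption, see lead-in).
    norm = [p / total for p in proportions]
    return tuple(
        sum(w * color[channel] for w, color in zip(norm, colors))
        for channel in range(3)
    )
```

For example, blending pure red and pure blue in equal proportion yields an even purple.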
14. The apparatus according to any one of claims 8-13, wherein the apparatus further comprises:
the combination module is configured to store the three-dimensional image corresponding to each image patch in an image buffer;
the combination module is further configured to, once the three-dimensional image corresponding to each image patch has been generated, combine the three-dimensional images in the image buffer to obtain the three-dimensional combined image corresponding to the three-dimensional model.
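Claim 14 describes accumulating per-patch images in a buffer and combining them only after every patch has been rendered. A toy sketch of that buffering scheme, where each image is represented as a dict of (x, y) → color and the class name `ImageBuffer`, the patch count, and the dict-merge combination step are all hypothetical choices for illustration:

```python
class ImageBuffer:
    """Accumulates per-patch images; combines them once all have arrived."""

    def __init__(self, expected_patches):
        self.expected = expected_patches
        self.images = []

    def store(self, image):
        """Store one patch's image (a dict of (x, y) -> color)."""
        self.images.append(image)

    def combine(self):
        """Return the combined image, or None if patches are still missing."""
        if len(self.images) < self.expected:
            return None  # not every patch image has been generated yet
        # Hypothetical combination step: merge the per-patch pixel dicts.
        combined = {}
        for image in self.images:
            combined.update(image)
        return combined
```

A real renderer would combine with a depth test rather than a plain merge; the merge here just illustrates the "store, then combine when complete" control flow of the claim.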
15. An electronic device, comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored in the memory to implement the method of any one of claims 1-7.
16. A computer readable storage medium having computer-executable instructions stored therein, which, when executed by a processor, implement the method of any one of claims 1-7.
17. A computer program product comprising a computer program which, when executed by a processor, implements the method of any of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311068903.9A CN117132695A (en) | 2023-08-23 | 2023-08-23 | Three-dimensional drawing method, three-dimensional drawing device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117132695A true CN117132695A (en) | 2023-11-28 |
Family ID: 88852134
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311068903.9A Pending CN117132695A (en) | 2023-08-23 | 2023-08-23 | Three-dimensional drawing method, three-dimensional drawing device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117132695A (en) |
2023-08-23: Application CN202311068903.9A filed (CN); publication CN117132695A, status Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7085012B2 (en) | Map rendering methods, equipment, computer equipment and computer programs | |
US20230053462A1 (en) | Image rendering method and apparatus, device, medium, and computer program product | |
CN112200900B (en) | Volume cloud rendering method and device, electronic equipment and storage medium | |
EP3180773B1 (en) | Bandwidth reduction using texture lookup by adaptive shading | |
US11348308B2 (en) | Hybrid frustum traced shadows systems and methods | |
US10593096B2 (en) | Graphics processing employing cube map texturing | |
US9224233B2 (en) | Blending 3D model textures by image projection | |
EP4213102A1 (en) | Rendering method and apparatus, and device | |
RU2680355C1 (en) | Method and system of removing invisible surfaces of a three-dimensional scene | |
CN111583381B (en) | Game resource map rendering method and device and electronic equipment | |
CN110428504B (en) | Text image synthesis method, apparatus, computer device and storage medium | |
CN105550973B (en) | Graphics processing unit, graphics processing system and anti-aliasing processing method | |
EP3343516A1 (en) | Method and device for applying an effect of an augmented or mixed reality application | |
CN114742931A (en) | Method and device for rendering image, electronic equipment and storage medium | |
CN112419460A (en) | Method, apparatus, computer device and storage medium for baking model charting | |
CN115512025A (en) | Method and device for detecting model rendering performance, electronic device and storage medium | |
CN116630516B (en) | 3D characteristic-based 2D rendering ordering method, device, equipment and medium | |
KR101118597B1 (en) | Method and System for Rendering Mobile Computer Graphic | |
CN112116719A (en) | Method and device for determining object in three-dimensional scene, storage medium and electronic equipment | |
CN117132695A (en) | Three-dimensional drawing method, three-dimensional drawing device, electronic equipment and storage medium | |
CN113593028B (en) | Three-dimensional digital earth construction method for avionics display control | |
CN115359172A (en) | Rendering method and related device | |
CN112848312B (en) | Method and device for detecting three-dimensional model object, computer equipment and storage medium | |
CN115690365A (en) | Method, apparatus, device and medium for managing three-dimensional model | |
CN112419459A (en) | Method, apparatus, computer device and storage medium for baked model AO mapping |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||