CN116597063A - Picture rendering method, device, equipment and medium - Google Patents


Info

Publication number
CN116597063A
CN116597063A (application number CN202310885913.5A)
Authority
CN
China
Prior art keywords
local
texture
grid
mesh
local texture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310885913.5A
Other languages
Chinese (zh)
Other versions
CN116597063B (en)
Inventor
桑琪
梅元乔
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202310885913.5A
Publication of CN116597063A
Application granted
Publication of CN116597063B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The application discloses a picture rendering method, apparatus, device, and medium. The method includes: acquiring a regularity parameter of a target picture to be rendered, where the regularity parameter characterizes the regularity of the target picture; acquiring a first local texture of the target picture, where the global texture of the target picture is segmented into a plurality of local textures according to that regularity and the first local texture is any one of the plurality of local textures; mapping the local mesh of the first local texture based on the regularity parameter to obtain the local meshes of all local textures other than the first local texture; and rendering the target picture based on the first local texture and the local meshes of the plurality of local textures. With this method and apparatus, the efficiency of rendering the target picture can be improved.

Description

Picture rendering method, device, equipment and medium
Technical Field
The present application relates to the field of image rendering technologies, and in particular, to a method, an apparatus, a device, and a medium for image rendering.
Background
Games frequently need to render pictures, such as pictures of interface elements (e.g., controls) to be displayed in the game interface. Existing applications must load the entire texture of a picture and then render and display the picture based on that whole texture. If the texture's data volume is large, loading it is slow, and the picture's rendering efficiency is therefore low.
Disclosure of Invention
The application provides a picture rendering method, a device, equipment and a medium, which can improve the efficiency of rendering a target picture.
In one aspect, the present application provides a method for rendering a picture, including:
acquiring regularity parameters of a target picture to be rendered; the regularity parameter is used for representing the regularity of the target picture;
acquiring a first local texture of a target picture; the global texture of the target picture is segmented into a plurality of local textures according to the regularity of the target picture, and the first local texture is any local texture in the plurality of local textures;
mapping the local grids of the first local texture based on the regularity parameter to obtain local grids of all local textures except the first local texture in the plurality of local textures;
rendering the target picture based on the first local texture and the local mesh of each of the plurality of local textures.
In one aspect, the present application provides a picture rendering apparatus, including:
the first acquisition module is used for acquiring regularity parameters of the target picture to be rendered; the regularity parameter is used for representing the regularity of the target picture;
the second acquisition module is used for acquiring a first local texture of the target picture; the global texture of the target picture is segmented into a plurality of local textures according to the regularity of the target picture, and the first local texture is any local texture in the plurality of local textures;
The mapping module is used for carrying out mapping processing on the local grids of the first local texture based on the regularity parameters to obtain local grids of all local textures except the first local texture in the plurality of local textures;
and the rendering module is used for rendering the target picture based on the first local texture and the local grids of the local textures in the plurality of local textures.
Optionally, the regularity of the target picture includes symmetry, and the regularity parameter includes a symmetry mode of the target picture and symmetry parameters in the symmetry mode;
the symmetric mode includes any of the following: a bipartite symmetrical mode, a quarternary symmetrical mode;
if the symmetric mode is a bipartite symmetric mode, the symmetric parameters include any of the following: a bisector axisymmetric parameter, a bisector center symmetric parameter;
if the symmetry mode is a quarter symmetry mode, the symmetry parameters include any of the following: a quarter axisymmetric parameter, a quarter centrosymmetric parameter.
Optionally, the device is further configured to:
generating a local mesh of the first local texture based on the first local texture;
the local grid of the first local texture comprises grid vertexes of the first local texture, and the grid vertexes of the first local texture have corresponding texture coordinates on the first local texture.
Optionally, the symmetrical mode is a bipartite symmetrical mode, and the global texture of the target picture is segmented into a first local texture and a second local texture, and the first local texture and the second local texture are symmetrical to each other;
The manner in which the mapping module performs mapping processing on the local mesh of the first local texture based on the regularity parameter to obtain the local mesh of each local texture other than the first local texture includes the following steps:
and performing binary symmetrical mapping processing on the local grids of the first local texture based on the regularity parameters to generate local grids of the second local texture.
Optionally, the symmetry parameter includes a bisector symmetry parameter, and the bisector symmetry parameter includes a target symmetry axis of the target picture; the mapping module performs binary symmetrical mapping processing on the local grid of the first local texture based on the regularity parameter, and generates a local grid of the second local texture, which comprises the following steps:
performing axisymmetric processing on grid vertexes of the first local texture based on the target symmetry axis to generate first grid vertexes of the second local texture;
setting the texture coordinates of the mesh vertices of the first local texture as the texture coordinates of the first mesh vertices that are symmetric to them about the target symmetry axis;
A local mesh of the second local texture is generated based on the first mesh vertices of the second local texture having texture coordinates.
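The three steps above amount to reflecting each vertex position across the target symmetry axis while leaving its texture coordinate untouched. A minimal sketch, assuming a vertical symmetry axis x = axis_x and vertices stored as hypothetical (x, y, u, v) tuples:

```python
def mirror_across_axis(vertices, axis_x):
    """Reflect each (x, y, u, v) vertex of the first local mesh across the
    vertical line x = axis_x. The UV is copied unchanged, so the mirrored
    vertex still samples the first local texture. (In a real renderer the
    triangle winding order would also need to be flipped after mirroring.)"""
    return [(2.0 * axis_x - x, y, u, v) for (x, y, u, v) in vertices]
```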
Optionally, the symmetry parameter includes a bisection center symmetry parameter, and the bisection center symmetry parameter is used to indicate the associated first center symmetry angle; the mapping module performs binary symmetrical mapping processing on the local grid of the first local texture based on the regularity parameter, and generates a local grid of the second local texture, which comprises the following steps:
rotating the mesh vertices of the first local texture by the first central symmetry angle to generate second mesh vertices of the second local texture; each mesh vertex of the first local texture, after rotation, yields one corresponding second mesh vertex;
setting the texture coordinates of each grid vertex of the first local texture as the texture coordinates of a second grid vertex corresponding to each grid vertex of the first local texture;
a local mesh of the second local texture is generated based on the second mesh vertices of the second local texture having texture coordinates.
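The rotation step above can be sketched as follows, assuming vertices are (x, y, u, v) tuples and the picture center is (cx, cy); the UV is carried over unchanged so the rotated half still samples the first local texture:

```python
import math

def rotate_about_center(vertices, cx, cy, angle_deg=180.0):
    """Rotate each (x, y, u, v) vertex about (cx, cy) by angle_deg degrees.
    For bisection central symmetry the angle is 180 degrees, so each vertex
    maps to its point reflection through the centre."""
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    out = []
    for x, y, u, v in vertices:
        dx, dy = x - cx, y - cy
        out.append((cx + dx * cos_a - dy * sin_a,
                    cy + dx * sin_a + dy * cos_a, u, v))
    return out
```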
Optionally, the symmetrical mode is a quarter symmetrical mode, the global texture of the target picture is segmented into a first local texture, a third local texture, a fourth local texture and a fifth local texture, and these local textures are symmetric to one another;
The manner in which the mapping module performs mapping processing on the local mesh of the first local texture based on the regularity parameter to obtain the local mesh of each local texture other than the first local texture includes the following steps:
and performing quarter symmetrical mapping processing on the local grids of the first local texture based on the regularity parameters to generate a local grid of the third local texture, a local grid of the fourth local texture and a local grid of the fifth local texture.
Optionally, the symmetry parameter includes a quarter axis symmetry parameter, and the quarter axis symmetry parameter includes a first symmetry axis and a second symmetry axis of the target picture; the mapping module performs quarter symmetrical mapping processing on the local grids of the first local texture based on the regularity parameter, and generates a local grid of the third local texture, a local grid of the fourth local texture and a local grid of the fifth local texture, which comprises the following steps:
symmetrically processing grid vertexes of the first local texture based on the first symmetry axis to generate third grid vertexes of the third local texture, and setting texture coordinates of the grid vertexes of the first local texture to be texture coordinates of the third grid vertexes symmetrical to the grid vertexes of the first local texture based on the first symmetry axis;
Symmetrically processing grid vertexes of the first local texture based on the second symmetry axis to generate fourth grid vertexes of the fourth local texture, and setting texture coordinates of the grid vertexes of the first local texture to be texture coordinates of the fourth grid vertexes symmetrical to the grid vertexes of the first local texture based on the second symmetry axis;
symmetrically processing grid vertexes of the first local texture based on the first symmetry axis and the second symmetry axis to generate fifth grid vertexes of the fifth local texture, and setting texture coordinates of the grid vertexes of the first local texture to be texture coordinates of the fifth grid vertexes symmetrical to the grid vertexes of the first local texture based on the first symmetry axis and the second symmetry axis;
generating a local mesh of the third local texture based on the third mesh vertex of the third local texture having texture coordinates, generating a local mesh of the fourth local texture based on the fourth mesh vertex of the fourth local texture having texture coordinates, and generating a local mesh of the fifth local texture based on the fifth mesh vertex of the fifth local texture having texture coordinates.
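In coordinates, the three reflections above reduce to negating vertex offsets from one or both symmetry axes while keeping every UV on the first local texture. A sketch under the assumption of a vertical axis x = axis_x and a horizontal axis y = axis_y (names are illustrative):

```python
def quarter_axis_map(vertices, axis_x, axis_y):
    """From the first local texture's mesh, derive the other three local
    meshes: reflect across x = axis_x (third), across y = axis_y (fourth),
    and across both axes (fifth). UVs are copied unchanged throughout."""
    third  = [(2 * axis_x - x, y, u, v) for (x, y, u, v) in vertices]
    fourth = [(x, 2 * axis_y - y, u, v) for (x, y, u, v) in vertices]
    fifth  = [(2 * axis_x - x, 2 * axis_y - y, u, v) for (x, y, u, v) in vertices]
    return third, fourth, fifth
```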
Optionally, the symmetry parameters include a quarter-centered symmetry parameter, and the quarter-centered symmetry parameter is used to indicate the associated second, third, and fourth central symmetry angles; the mapping module performs quarter symmetrical mapping processing on the local grids of the first local texture based on the regularity parameter, and generates a local grid of the third local texture, a local grid of the fourth local texture and a local grid of the fifth local texture, which comprises the following steps:
Rotating the grid vertexes of the first local texture by a second central symmetry angle to generate sixth grid vertexes of the third local texture, and setting the texture coordinates of the grid vertexes of the first local texture to be the texture coordinates of the sixth grid vertexes generated after the grid vertexes of the first local texture are rotated;
rotating the grid vertexes of the first local texture by a third central symmetry angle to generate seventh grid vertexes of the fourth local texture, and setting the texture coordinates of the grid vertexes of the first local texture to be the texture coordinates of the seventh grid vertexes generated after the grid vertexes of the first local texture are rotated;
rotating the grid vertexes of the first local texture by a fourth central symmetry angle to generate eighth grid vertexes of the fifth local texture, and setting texture coordinates of the grid vertexes of the first local texture to be texture coordinates of the eighth grid vertexes generated after the grid vertexes of the first local texture are rotated;
generating a local mesh of the third local texture based on the sixth mesh vertex of the third local texture having texture coordinates, generating a local mesh of the fourth local texture based on the seventh mesh vertex of the fourth local texture having texture coordinates, and generating a local mesh of the fifth local texture based on the eighth mesh vertex of the fifth local texture having texture coordinates.
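The text does not fix the three central symmetry angles; the sketch below assumes the illustrative values 90, 180, and 270 degrees and derives the three remaining local meshes by rotating the first local mesh about the picture center (cx, cy):

```python
import math

def quarter_center_map(vertices, cx, cy, angles=(90.0, 180.0, 270.0)):
    """Derive the third/fourth/fifth local meshes by rotating the first
    local mesh about (cx, cy) by the three central-symmetry angles.
    The 90/180/270-degree defaults are assumptions, not from the patent."""
    def rot(angle_deg):
        a = math.radians(angle_deg)
        c, s = math.cos(a), math.sin(a)
        return [(cx + (x - cx) * c - (y - cy) * s,
                 cy + (x - cx) * s + (y - cy) * c, u, v)
                for (x, y, u, v) in vertices]
    return tuple(rot(a) for a in angles)
```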
Optionally, the local mesh of each local texture in the plurality of local textures includes mesh vertices of the local texture; texture coordinates of grid vertices of each local texture in the plurality of local textures all belong to texture coordinates on the first local texture;
the rendering module renders the target picture based on the first local texture and the local grid of each local texture in the plurality of local textures, and the method comprises the following steps:
based on the texture coordinates of the grid vertexes of each local texture in the plurality of local textures, respectively performing color sampling in the first local texture to obtain the pixel colors of the grid vertexes of each local texture in the plurality of local textures;
coloring the grid vertexes of each local texture in the plurality of local textures based on the pixel colors of the grid vertexes of each local texture in the plurality of local textures so as to render and display local pictures of each local texture in the plurality of local textures;
the target picture is composed of local pictures to which each local texture belongs in a plurality of local textures.
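Because every vertex's UV lies on the first local texture, a single small texture can color all the local meshes. A minimal nearest-neighbour sampling sketch (the texture is modelled as a 2-D list of colours; names are illustrative, and a real pipeline would sample per fragment with filtering, not per vertex):

```python
def sample_vertex_colors(texture, vertices):
    """texture: 2-D list of pixel colours for the FIRST local texture only.
    vertices: (x, y, u, v) tuples from ANY of the local meshes; since all
    UVs point into the first local texture, one texture colours them all."""
    h, w = len(texture), len(texture[0])
    colors = []
    for _, _, u, v in vertices:
        px = min(int(u * (w - 1) + 0.5), w - 1)  # nearest texel column
        py = min(int(v * (h - 1) + 0.5), h - 1)  # nearest texel row
        colors.append(texture[py][px])
    return colors
```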
Optionally, the global texture is the texture of a contracted picture, where the contracted picture is obtained by shrinking the picture region at the center position of the target picture;
The rendering module renders the target picture based on the first local texture and the local grid of each local texture in the plurality of local textures, and the method comprises the following steps:
rendering a contracted picture based on the first local texture and a local mesh of each local texture of the plurality of local textures;
and stretching the picture at the central position of the rendered contracted picture to obtain a target picture.
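This shrink-then-stretch step resembles slice-based UI scaling: the stored texture omits most of a uniform center region, which is re-expanded after rendering. A deliberately simplified 1-D sketch of the stretch (a real implementation works on the 2-D rendered picture; the function name and behavior are assumptions for illustration):

```python
def stretch_center(row, insert_count):
    """1-D sketch of the centre-stretch step: the contracted picture
    collapsed a uniform centre region down to (roughly) one pixel;
    re-expand it by repeating the middle pixel insert_count extra times."""
    mid = len(row) // 2
    return row[:mid] + [row[mid]] * insert_count + row[mid:]
```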
Optionally, the device is further configured to:
acquiring a target game control which is required to be rendered and displayed by referring to global textures in a game;
updating the rendering display mode of the target game control to a rendering display mode based on the first local texture and the local grids of each local texture in the plurality of local textures;
the target picture is a picture of a target game control to be rendered and displayed.
In one aspect the application provides a computer device comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the method of one aspect of the application.
An aspect of the application provides a computer readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the method of the above aspect.
According to one aspect of the present application, there is provided a computer program product comprising a computer program stored in a computer readable storage medium. The processor of the computer device reads the computer program from the computer-readable storage medium, and the processor executes the computer program to cause the computer device to execute the method provided in various optional manners of the above aspect and the like.
The method and apparatus acquire the regularity parameter of the target picture to be rendered, where the regularity parameter characterizes the regularity of the target picture; acquire a first local texture of the target picture, where the global texture of the target picture is segmented into a plurality of local textures according to that regularity and the first local texture is any one of them; map the local mesh of the first local texture based on the regularity parameter to obtain the local meshes of the remaining local textures; and render the target picture based on the first local texture and the local meshes of the plurality of local textures. In this way, the global texture of a regular target picture can be segmented into a plurality of local textures, the local meshes of the other local textures can be derived by mapping the local mesh of the first local texture, and the target picture can then be rendered from the first local texture and those local meshes. When the target picture is rendered, only one local texture (for example, the first local texture) needs to be loaded rather than the global texture, which speeds up texture loading and thus improves rendering efficiency. In addition, since only a local texture of the picture needs to be stored, the memory pressure of storing picture textures is reduced, effectively optimizing memory usage.
Drawings
To illustrate the technical solutions of the application or of the prior art more clearly, the drawings used in the description of the embodiments or of the prior art are briefly introduced below. The drawings described below are only some embodiments of the application; other drawings can be derived from them by a person skilled in the art without inventive effort.
FIG. 1 is a schematic view of a scene rendered by a picture provided by the present application;
fig. 2 is a schematic flow chart of a picture rendering method provided by the application;
FIG. 3 is a schematic view of another picture rendering scene provided by the present application;
FIG. 4 is a schematic view of a scene of an image cropping provided by the present application;
FIG. 5 is a flow chart of a binary symmetrical mapping method for a grid according to the present application;
FIGS. 6 a-6 c are schematic diagrams of a scenario for generating a local mesh provided by the present application;
FIG. 7 is a flow chart of another method of bipartite symmetric mapping for grids according to the present application;
FIG. 8 is a schematic view of another image cropping scenario provided by the present application;
FIG. 9 is a flow chart of a method of quarter-symmetric mapping for a grid provided by the present application;
FIGS. 10 a-10 b are schematic views of another scenario for generating a local mesh provided by the present application;
FIG. 11 is a flow chart of another method of four-way symmetric mapping for a grid provided by the present application;
FIGS. 12 a-12 b are schematic views of yet another scenario for generating a local mesh provided by the present application;
FIG. 13 is a schematic diagram of an interface for a texture set provided by the present application;
fig. 14 is a schematic structural diagram of a picture rendering device according to the present application;
fig. 15 is a schematic structural diagram of a computer device according to the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings, in which some, but not all, embodiments of the application are shown. All other embodiments obtained by a person skilled in the art based on the embodiments of the application without inventive effort fall within the scope of the application.
Firstly, it should be noted that all data collected by the present application (such as the target picture, the texture of the target picture, and the related data of the grid of the texture) may be collected under the condition that the object (such as the user, the organization, or the enterprise) to which the data belongs agrees and authorizes, and the collection, the use, and the processing of the related data need to comply with the related laws and regulations and standards of the related country and region.
Here, the related concept to which the present application relates will be described:
unity3D: a game engine.
UI: the User Interface (User Interface) refers to the overall design of man-machine interaction, operation logic and attractive Interface for software.
UGUI: the UGUI is a set of UI components of the Unity3D, and can conveniently realize the visual operation of a UI interface.
Vertex (Vertex): may also be referred to as a mesh vertex, and represents a vertex in a mesh, and includes attributes such as vertex Position (Position), vertex Color (Color), normal (Normal), and texture coordinates (UV coordinates).
Triangle (Triangle): one face in the mesh is composed of three vertices. In computer graphics, triangles are the most basic graphical elements.
UV: texture coordinates of the vertex, representing the position of the vertex on the texture.
Referring to fig. 1, fig. 1 is a schematic view of a scene rendered by a picture according to the present application. As shown in fig. 1, a game application may be installed in the terminal device 100, and an installation package of the game application may include all textures (may also be referred to as texture maps) of pictures to be rendered and displayed in the game application (such as pictures of any interface elements to be rendered and displayed in a game interface), so that after the game application is installed, the terminal device 100 may take all textures required for interface rendering in the game application.
Alternatively, the terminal device 100 may be: smart terminals such as smart phones, tablet computers, notebook computers, desktop computers, smart televisions, portable devices, vehicle terminals and the like.
The target picture may refer to a picture of any object that needs to be rendered and displayed in the game interface in the game application, for example, the target picture may be a picture of the whole game interface, the target picture may also be a picture of a certain UI element (such as a game control) in the game interface, and so on, which may be specifically determined according to an actual application scenario.
The target picture has regularity (such as symmetry), and the target picture also has a global texture, which is an entire texture for rendering and displaying the target picture, and the global texture of the target picture may be divided into a plurality of parts according to the regularity of the target picture, and the plurality of parts may be referred to as a plurality of local textures of the target picture. The first local texture as shown in fig. 1 may be any one of the plurality of local textures.
Therefore, the terminal device of the present application may store only the first local texture of the target picture's global texture. When the target picture needs to be rendered and displayed, only the first local texture needs to be loaded (e.g., from the terminal device's disk). Based on the first local texture and the regularity of the target picture, both the first local texture's contribution to the target picture and the contributions of the other local textures can be rendered, so the entire target picture is rendered from the first local texture alone. For the specific process, see the related description in the embodiment corresponding to fig. 2 below.
By adopting the method provided by the application, for the global texture of a picture with regularity (such as the target picture), only a part of the global texture (such as the first local texture) needs to be stored in the terminal device, reducing the memory occupied by textures. Moreover, when rendering a regular picture, the terminal device only needs to load part of the picture's texture to render the whole picture; since loading a partial texture is faster than loading the global texture, the efficiency of rendering the picture from the loaded texture is improved as well.
Referring to fig. 2, fig. 2 is a flow chart of a picture rendering method provided by the present application. The execution body in the embodiment of the present application may be a picture rendering device (may be simply referred to as a rendering device), where the rendering device may be a computer device or a cluster of computer devices formed by a plurality of computer devices, and the computer device may be a server, a terminal device, or other devices, which is not limited to this. As shown in fig. 2, the method may include:
step S101, obtaining regularity parameters of a target picture to be rendered; the regularity parameter is used to characterize the regularity of the target picture.
Specifically, the target picture may be a picture of any interface element (such as a UI element) to be rendered and displayed in the application interface. If the application interface can be a game interface, the interface element to be rendered and displayed can be a game control to be rendered and displayed in the game, and therefore, the target picture can be a picture of the game control to be rendered and displayed by the game interface. The target picture is specifically a picture of which interface element can be determined according to the actual application scene, and the method is not limited.
The target picture may be an image with regularity, and the regularity parameter of the target picture may be a parameter set for the target picture based on the regularity of the target picture, so the regularity parameter may also be used to characterize (or reflect) the regularity of the target picture. In other words, the regularity parameter of the target picture may be used to describe which specific distribution rule the image content contained in the target picture has.
Alternatively, the regularity parameter of the target picture may be preset, for example, the target picture is a picture of an object to be rendered and displayed in the game interface, the game may be included in the rendering device (e.g., the game is installed), and then the regularity parameter of the target picture may be set in a game engine of the rendering device. Thus, the rendering device may obtain (e.g., from the game engine) the regularity parameters of the target image.
Alternatively, the regularity of the target picture may include symmetry, i.e. the target picture may be one symmetrical picture, and thus the regularity parameter of the target picture may include a symmetry mode of the target picture and symmetry parameters of the target picture in the symmetry mode. The symmetry pattern of the target picture may be used to indicate the symmetry manner of the image content in the target picture.
Alternatively, the symmetrical pattern of the target picture may include any one of the following: a bipartite symmetrical mode and a quarternary symmetrical mode. If the symmetrical mode of the target picture is a bipartite symmetrical mode, the target picture can comprise pictures of two parts which are symmetrical to each other, namely the target picture can be formed based on the pictures of the two parts which are symmetrical to each other; if the symmetrical mode of the target picture is a quarter symmetrical mode, it indicates that the target picture may include four symmetrical pictures, i.e., the target picture may be formed based on the four symmetrical pictures.
Further, if the symmetry mode of the target picture is the binary symmetry mode, the symmetry parameters in the symmetry mode may include any of the following: a bisector axis symmetry parameter, a bisector center symmetry parameter.
If the symmetry parameter is a bipartite axisymmetric parameter, it indicates that the target picture may be symmetric based on one coordinate axis of a plane coordinate system (may also be referred to as a reference coordinate system), that is, the bipartite axisymmetric parameter is used to indicate that the target picture is a bipartite picture based on axisymmetry, and the bipartite axisymmetric parameter may include symmetry axes (number is 1) of the target picture, and may refer to the symmetry axes of the target picture as target symmetry axes.
The reference coordinate system may include two coordinate axes: an x-axis (horizontal axis), which may be referred to as the first coordinate axis, and a y-axis (vertical axis), which may be referred to as the second coordinate axis.
If the target picture is in the bipartite symmetrical mode and the symmetry parameter is the bipartite axis symmetry parameter, the target symmetry axis of the target picture may be the first coordinate axis or the second coordinate axis. That is, the target picture may be symmetrical based on the x-axis (i.e., symmetrical about its own center line parallel to the x-axis, this center line being transverse and aligned with the direction of the x-axis), or symmetrical based on the y-axis (i.e., symmetrical about its own center line parallel to the y-axis, this center line being longitudinal and aligned with the direction of the y-axis).
In addition, if the symmetry parameter is the bipartite center symmetry parameter, one half of the target picture may be obtained by rotating the other half by 180 degrees (about the center point of the target picture); that is, the bipartite center symmetry parameter indicates that the target picture is a bipartite, centrally symmetrical picture. In the bipartite center symmetry mode, this 180-degree rotation may be referred to as the central symmetry angle associated with the bipartite center symmetry parameter (it may be referred to as the first central symmetry angle); that is, the bipartite center symmetry parameter may be used to indicate the associated first central symmetry angle.
Further, if the symmetry mode of the target picture is the quarter symmetrical mode, the symmetry parameter in that mode may be either of the following: a quarter axis symmetry parameter or a quarter center symmetry parameter.
If the symmetry parameter is the quarter axis symmetry parameter, the target picture may be symmetrical about both coordinate axes of the reference coordinate system; that is, the quarter axis symmetry parameter indicates that the target picture is a quarter, axisymmetrical picture. The quarter axis symmetry parameter may include the two symmetry axes of the target picture, namely the first coordinate axis (which may be referred to as the first symmetry axis of the target picture) and the second coordinate axis (which may be referred to as the second symmetry axis of the target picture). In other words, if the symmetry parameter is the quarter axis symmetry parameter, the target picture is symmetrical in both the x-axis direction and the y-axis direction.
In addition, if the symmetry parameter is the quarter center symmetry parameter, one 1/4 picture of the target picture may be rotated by different angles to obtain the other three 1/4 pictures; that is, the quarter center symmetry parameter indicates that the target picture is a quarter, centrally symmetrical picture. For example, one 1/4 picture of the target picture may be rotated by 90 degrees to obtain a second 1/4 picture, by 180 degrees to obtain a third 1/4 picture, and by 270 degrees to obtain the fourth 1/4 picture.
The rotations of 90, 180, and 270 degrees may all be referred to as central symmetry angles associated with the quarter center symmetry parameter: the 90-degree rotation may be referred to as the second central symmetry angle, the 180-degree rotation as the third central symmetry angle, and the 270-degree rotation as the fourth central symmetry angle. That is, the quarter center symmetry parameter may be used to indicate the associated second, third, and fourth central symmetry angles.
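The central symmetry angles above are plain rotations about the picture center. As an illustrative sketch (Python; not part of the original disclosure, with a hypothetical unit picture centered at (0.5, 0.5)), rotating a point of the first 1/4 picture by the second, third, and fourth central symmetry angles maps it into the other three quarters:

```python
import math

def rotate_about_center(x, y, cx, cy, degrees):
    """Rotate point (x, y) about center (cx, cy) by the given angle in degrees."""
    rad = math.radians(degrees)
    dx, dy = x - cx, y - cy
    rx = cx + dx * math.cos(rad) - dy * math.sin(rad)
    ry = cy + dx * math.sin(rad) + dy * math.cos(rad)
    return (round(rx, 6), round(ry, 6))  # rounding absorbs float noise

# A point in the top-left quarter of a unit picture centered at (0.5, 0.5):
p = (0.25, 0.75)
center = (0.5, 0.5)
# The second, third and fourth central symmetry angles place its images
# in the three remaining quarters:
q90  = rotate_about_center(*p, *center, 90)   # -> (0.25, 0.25)
q180 = rotate_about_center(*p, *center, 180)  # -> (0.75, 0.25)
q270 = rotate_about_center(*p, *center, 270)  # -> (0.75, 0.75)
```

In a quarter center symmetry mode every pixel of the other three quarters is recoverable this way from the one stored quarter.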
It should be noted that the symmetry mode of the target picture in the present application is mainly the bipartite symmetrical mode or the quarter symmetrical mode. In some extended scenarios, the target picture may also be in an eight-part symmetrical mode (i.e., the target picture may be split into 8 parts, and the other seven 1/8 pictures may be converted from one 1/8 picture of the target picture), a sixteen-part symmetrical mode (i.e., the target picture may be split into 16 parts, and the other fifteen 1/16 pictures may be converted from one 1/16 picture), or modes with even more parts. Note, however, that in symmetry modes with more than four parts, the computation required to render the whole target picture from a local texture may be excessive, and jagged edges may appear between the parts of the rendered picture: in modes with more than two or four parts, the boundaries between parts are oblique lines, and since an oblique line is formed by stacking pixels, any mismatch between the oblique boundaries of adjacent parts produces a visible saw-tooth (aliasing) effect, making the rendering of the target picture less than ideal. Therefore, if a symmetry mode with more than four parts is used, the rendering computation and the rendering quality need to be weighed together.
In the embodiments of the present application, the rendering process of the target picture is described in detail by taking the bipartite symmetrical mode and the quarter symmetrical mode as examples.
Step S102, a first local texture of a target picture is obtained; the global texture of the target picture is segmented into a plurality of local textures according to the regularity of the target picture, and the first local texture is any one of the plurality of local textures.
Specifically, the target picture may have a global texture, which is the entire, complete texture used to render the target picture. The global texture may be segmented into a plurality of local textures according to the regularity of the target picture, and these local textures may be converted into one another based on that regularity; that is, any one of the plurality of local textures may be converted into each of the others. Illustratively, the local textures in the plurality of local textures may all have the same size.
It is understood that the global texture of the target picture may also have regularity of the target picture. As described below.
If the symmetry mode of the target picture is the bipartite symmetrical mode, the global texture may be segmented into two mutually symmetrical local textures (each local texture being half of the global texture) according to the regularity of the target picture, and either of the two local textures may yield the other in an axisymmetrical manner or by rotation about the center. Here, one local texture may be the texture of one half of the target picture, where that half refers to the minimum symmetric unit of the target picture in the bipartite symmetrical mode; the other half may be obtained by mirroring (axial symmetry) or rotating (central symmetry) that half.
If the symmetry mode of the target picture is the quarter symmetrical mode, the global texture may be segmented into four mutually symmetrical local textures (each local texture being 1/4 of the global texture) according to the regularity of the target picture, and any one of the four local textures may yield the other three in an axisymmetrical manner or by rotation about the center. Here, one local texture may be the texture of one 1/4 picture of the target picture, where that 1/4 picture refers to the minimum symmetric unit of the target picture in the quarter symmetrical mode; the other three 1/4 pictures may be obtained by mirroring (axial symmetry) or rotating (central symmetry) that 1/4 picture.
Alternatively, the minimum symmetric unit of the target picture described above may refer to the smallest portion of the target picture from which the whole target picture can be restored.
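As an illustrative sketch (Python; not part of the original disclosure), segmenting the minimum symmetric unit out of a global texture stored as a row-major pixel grid might look like this, assuming a left/right split for the bipartite (y-axis) case, a top-left quarter for the quarter case, and even dimensions:

```python
def crop_first_local_texture(global_texture, mode):
    """Crop the minimum symmetric unit (the first local texture) out of a
    row-major global texture: the left half in bipartite (y-axis) mode,
    the top-left quarter in quarter mode. Dimensions are assumed even."""
    h = len(global_texture)
    w = len(global_texture[0])
    if mode == "bipartite":
        return [row[: w // 2] for row in global_texture]
    if mode == "quarter":
        return [row[: w // 2] for row in global_texture[: h // 2]]
    return global_texture  # no regularity: keep the whole texture

# A hypothetical 4x4 global texture of pixel ids:
g = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]]
half = crop_first_local_texture(g, "bipartite")   # 4x2 left half
quarter = crop_first_local_texture(g, "quarter")  # 2x2 top-left quarter
```

Only the cropped grid would then be stored on the rendering device; a quarter-mode unit holds 1/4 of the global texture's data.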
Any one of the plurality of local textures may be used as the first local texture (which local texture is selected as the first local texture may be determined according to the actual application scenario). The first local texture is the local texture that the rendering device subsequently uses to render the target picture; the rendering device may store only the first local texture, and need not store the global texture of the target picture or any of the other local textures segmented from it. In this way, the storage pressure on the rendering device for the textures of pictures to be rendered can be reduced.
Alternatively, the plurality of local textures may be pre-segmented from the global texture of the target picture. This process may be performed by the rendering device, or by another device (such as a background device of the game); it is described below taking the rendering device as an example, and may include the following:
the rendering device may obtain an atlas containing a plurality of pictures to be rendered, for example a picture set containing all renderable UI elements in the game. The rendering device may detect and screen out (which may be understood as locating) the pictures with regularity (such as symmetry) from the atlas as the above target pictures; the number of target pictures screened out may be determined according to the actual application scenario.
In one embodiment, the present application may use a cosine similarity algorithm to detect and screen out pictures with regularity from the atlas. Taking the detection of one picture in the atlas as an example, the process may include: the rendering device may split the picture into 4 parts, e.g., evenly along the horizontal center line and the vertical center line of the picture (yielding a top-left 1/4 picture, a bottom-left 1/4 picture, a top-right 1/4 picture, and a bottom-right 1/4 picture). For example, in fig. 10a, the target picture c1 may comprise these four 1/4 pictures, and the local picture c3 obtained by cropping the target picture c1 may be the top-left 1/4 picture of the target picture.
Furthermore, the rendering device may calculate the cosine similarity between the 4 partial pictures, where any two partial pictures have one cosine similarity. If the cosine similarities between all 4 partial pictures are relatively high (e.g., higher than a similarity threshold), the picture may be considered symmetrical (i.e., regular), and its symmetry mode may be the quarter symmetrical mode. If the cosine similarity between the top-left and top-right pictures is higher than the similarity threshold, the cosine similarity between the bottom-left and bottom-right pictures is higher than the threshold, but the cosine similarities between the top-left and bottom-left pictures and between the top-right and bottom-right pictures are lower than the threshold, the picture may also be considered symmetrical, and its symmetry mode may be the bipartite symmetrical mode.
Likewise, if the cosine similarity between the top-left and bottom-left pictures is higher than the similarity threshold, the cosine similarity between the top-right and bottom-right pictures is higher than the threshold, but the cosine similarities between the top-left and top-right pictures and between the bottom-left and bottom-right pictures are lower than the threshold, the picture may also be considered symmetrical, and its symmetry mode may be the bipartite symmetrical mode.
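The detection logic of the preceding paragraphs might be sketched as follows (illustrative Python, not the claimed implementation; the threshold value, the treatment of quadrants as flattened pixel vectors, and the omission of mirroring the quadrants before comparison are all simplifying assumptions):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length pixel vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def classify_symmetry(tl, tr, bl, br, threshold=0.95):
    """Classify the symmetry mode of a picture from its four quadrant
    pixel vectors (top-left, top-right, bottom-left, bottom-right)."""
    lr = (cosine_similarity(tl, tr) >= threshold and
          cosine_similarity(bl, br) >= threshold)   # left/right halves match
    tb = (cosine_similarity(tl, bl) >= threshold and
          cosine_similarity(tr, br) >= threshold)   # top/bottom halves match
    if lr and tb:
        return "quarter"
    if lr or tb:
        return "bipartite"
    return "none"
```

A picture classified as "quarter" or "bipartite" would then be cropped to its minimum symmetric unit, and only that unit's texture stored.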
Step S103, mapping the local mesh of the first local texture based on the regularity parameter to obtain the local meshes of all the local textures other than the first local texture among the plurality of local textures.
Alternatively, the rendering device may generate a mesh for the first local texture directly from the first local texture; the mesh of a local texture may be referred to as a local mesh, i.e., the mesh of the first local texture may be referred to as the local mesh of the first local texture. Alternatively, the local mesh of the first local texture may be a mesh generated in advance for the first local texture, or a corresponding mesh generation method may be invoked to generate it quickly. The local mesh of any local texture may be used to render the local picture of the target picture that is to be rendered and displayed based on that local texture.
The local mesh of the first local texture may comprise mesh triangles (abbreviated as triangles) and mesh vertices (i.e., the vertices of the triangles); that is, the local mesh of the first local texture may be formed of several triangles and their vertices. The mesh vertices in the local mesh of the first local texture may be referred to as the mesh vertices of the first local texture. Each mesh vertex has corresponding texture coordinates (UV coordinates) on the first local texture; that is, the texture coordinates of a mesh vertex of the first local texture indicate a corresponding texture position on the first local texture. This texture position may also be referred to as a pixel position, i.e., the texture position may be one pixel position on the first local texture.
The local mesh of a local texture may be rectangular, and the local mesh of the first local texture may be composed of at least two triangles, since at least two triangles are required to form a rectangle.
If the symmetry mode of the target picture is the bipartite symmetrical mode, the plurality of local textures of the global texture refers to two mutually symmetrical local textures (which may be axisymmetrical or centrally symmetrical), and the local texture other than the first local texture may be referred to as the second local texture; that is, the global texture of the target picture may be segmented into a first local texture and a second local texture that are symmetrical to each other.
Therefore, the rendering device may perform bipartite symmetrical mapping processing on the local mesh of the first local texture to generate the local mesh of the second local texture. The bipartite symmetrical mapping process refers to performing one symmetry-and-mapping operation (axisymmetrical or centrally symmetrical, as appropriate) on the local mesh of the first local texture (since one of the two halves represents the first local texture itself), generating the local mesh of the other half (i.e., the second local texture). This process is detailed in the embodiments corresponding to fig. 5 and fig. 7 below.
Moreover, if the symmetry mode of the target picture is the quarter symmetrical mode, the plurality of local textures of the global texture refers to four mutually symmetrical local textures, and the three local textures other than the first local texture may be referred to as the third local texture, the fourth local texture, and the fifth local texture, respectively; that is, the global texture of the target picture may be segmented into a first, a third, a fourth, and a fifth local texture that are symmetrical to one another.
Therefore, the rendering device may perform quarter symmetrical mapping processing on the local mesh of the first local texture, i.e., generate the local meshes of the third, fourth, and fifth local textures. The quarter symmetrical mapping process refers to performing the symmetry-and-mapping operation (axisymmetrical or centrally symmetrical, as appropriate) on the local mesh of the first local texture three times (since one of the four quarters represents the first local texture itself), generating the local meshes of the other three local textures respectively. This process is detailed in the embodiments corresponding to fig. 9 and fig. 11 below.
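By way of illustration only (a Python sketch under assumed conventions, not the patented implementation), the quarter axis-symmetrical variant of this mapping can be expressed as mirroring each vertex position of the first local mesh across the two symmetry axes while reusing its texture coordinates, so that only the first local texture is ever sampled:

```python
def quarter_symmetric_mapping(vertices, cx=0.0, cy=0.0):
    """Map the local mesh vertices of the first local texture, given as
    (position, uv) pairs, into the three mirrored local meshes. Positions
    are mirrored across the two symmetry axes through (cx, cy); texture
    coordinates are copied unchanged to the mirrored vertices."""
    def mirror(vs, fx, fy):
        return [((cx + fx * (x - cx), cy + fy * (y - cy)), uv)
                for (x, y), uv in vs]
    third  = mirror(vertices, -1,  1)   # mirrored across the second (y) axis
    fourth = mirror(vertices,  1, -1)   # mirrored across the first (x) axis
    fifth  = mirror(vertices, -1, -1)   # mirrored across both axes
    return third, fourth, fifth

# Hypothetical example: one vertex of a top-left quarter mesh, with the
# symmetry axes through the origin:
first_mesh = [((-1.0, 1.0), (0.0, 1.0))]
third, fourth, fifth = quarter_symmetric_mapping(first_mesh)
```

The mirrored vertices keep the source UVs, which is what later allows all four local meshes to be colored from the single stored local texture.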
Step S104, rendering the target picture based on the first local texture and the local mesh of each of the plurality of local textures.
The mesh vertices in the local mesh of any local texture may be referred to as the mesh vertices of that local texture. Since the mesh vertices of each local texture other than the first local texture are symmetrically mapped from the mesh vertices of the first local texture, and the texture coordinates of the mesh vertices of the first local texture are texture coordinates on the first local texture, the texture coordinates of the mesh vertices of every other local texture are also texture coordinates on the first local texture.
Therefore, the present application can render the local meshes of all the local textures from the first local texture alone, thereby rendering the whole target picture, without any other local texture participating in the rendering, as described below.
The rendering device may sample colors in the first local texture using the texture coordinates of the mesh vertices of each of the plurality of local textures, thereby obtaining the pixel color of each mesh vertex (the pixel color being the color obtained by sampling). The pixel color at the pixel position on the first local texture indicated by the texture coordinates of a mesh vertex of any local texture is taken as the pixel color sampled for that mesh vertex; one mesh vertex has one texture coordinate, so each mesh vertex corresponds to one sampled pixel color.
Furthermore, the rendering device may perform coloring processing on the mesh vertices of each local texture using the sampled pixel colors, thereby rendering and displaying the local picture to which each local texture belongs. The local pictures to which the local textures belong are all local pictures of the target picture, and together they form the whole target picture to be rendered and displayed.
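A minimal sketch of this sampling-and-coloring step (Python; illustrative only — the row-major texture layout and the UV convention with v = 1 at the top row are assumptions, and a real engine would sample with filtering on the GPU):

```python
def sample(texture, uv):
    """Nearest-pixel sample of the first local texture at texture
    coordinate uv. `texture` is a row-major grid of colors; uv = (u, v)
    in [0, 1] with v = 1 at the top row (assumed convention)."""
    h = len(texture)
    w = len(texture[0])
    u, v = uv
    col = min(int(u * w), w - 1)
    row = min(int((1.0 - v) * h), h - 1)
    return texture[row][col]

def shade(vertex_uvs, texture):
    """Color every mesh vertex (of every local mesh) from the single
    stored local texture -- no other local texture participates."""
    return [sample(texture, uv) for uv in vertex_uvs]

# Hypothetical 2x2 first local texture of named colors:
tex = [["r", "g"], ["b", "w"]]
colors = shade([(0.0, 1.0), (0.9, 0.1)], tex)  # one vertex per corner region
```

Because mirrored mesh vertices carry UVs on the first local texture, the same `sample` call covers every part of the target picture.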
Referring to fig. 3, fig. 3 is a schematic view of another picture rendering scene provided in the present application. As shown in fig. 3, the rendering device may render the local mesh of the first local texture using the first local texture to display the local picture to which the first local texture belongs, and may likewise render the local mesh of each other local texture using the first local texture to display the local picture to which that local texture belongs; the local pictures so rendered and displayed together form the target picture.
Further, in a possible implementation, the global texture may itself be the texture of a contracted picture, where the contracted picture is obtained by contracting the picture at the center position of the target picture; the contracted picture retains the symmetry mode and symmetry parameters of the target picture.
When rendering, the scale (i.e., amplitude) by which the center portion of the contracted picture is stretched back is the same as the scale by which the center portion of the target picture was contracted, so the rendered contracted picture can accurately restore the target picture at its original size.
Optionally, this applies to the case where the target picture is a picture of a target frame whose center portion looks the same everywhere (e.g., the center of the target frame consists of two straight upper and lower borders). The center portion can therefore be shrunk (further reducing the amount of texture data stored), while the 4 corners of the target frame, which do not belong to the center portion, are never stretched, so their shapes are not deformed.
Referring to fig. 4, fig. 4 is a schematic view of a picture cropping scene according to the present application. As shown in fig. 4, the symmetry mode of the target picture may be the quarter symmetrical mode with the quarter axis symmetry parameter. The picture at the center position of the target picture may be shrunk to obtain the contracted picture of the target picture, which is also in the quarter symmetrical mode; the contracted picture may then be cropped according to the quarter axial symmetry to obtain a local picture of the contracted picture, e.g., the top-left 1/4 of the contracted picture, and the first local texture may be the texture of this local picture.
In other specific scenarios, if the top-left corner and the top-right corner of the target frame in the target picture are the same, the bottom-left corner and the bottom-right corner are the same, but the top-left corner differs from the bottom-left corner and the top-right corner differs from the bottom-right corner, then the target picture and its contracted picture may be in the bipartite symmetrical mode with the bipartite axis symmetry parameter (symmetry based on the y-axis). The contracted picture may then be cropped in half along the y-axis, and the first local texture may be the texture of the left 1/2 picture or of the right 1/2 picture of the contracted picture.
Moreover, the rendering device may obtain the game controls that are rendered and displayed by referencing (i.e., calling) the global texture of a target picture in the game; such a control may be referred to as a target game control. The rendering device may update the original rendering mode of the target game control (i.e., rendering it using the global texture and the mesh of the global texture) to rendering based on the first local texture and the local meshes of the local textures, and the picture of the target game control rendered and displayed on the game interface is the target picture.
In the present application, the target picture can be rendered and displayed from a local texture of the target picture (such as the first local texture). Before the target picture needs to be rendered, the rendering device therefore only needs to store the local texture rather than the global texture of the whole picture, reducing the memory occupied by texture maps on the rendering device. In addition, when the rendering device needs to render and display the target picture, it only needs to load the local texture (such as the first local texture), so the texture can be loaded quickly; the target picture can then be rendered and displayed from the loaded local texture, improving rendering efficiency and reducing the time consumed in rendering and displaying the target picture.
The present application can acquire the regularity parameter of the target picture to be rendered, the regularity parameter representing the regularity of the target picture; acquire a first local texture of the target picture, the global texture of the target picture being segmented into a plurality of local textures according to that regularity, with the first local texture being any one of them; map the local mesh of the first local texture based on the regularity parameter to obtain the local meshes of the other local textures; and render the target picture based on the first local texture and the local meshes of all the local textures. On this basis, when rendering the target picture, only one local texture (such as the first local texture) needs to be loaded rather than the global texture, which increases the texture loading speed and thus the rendering efficiency; in addition, since only the local texture of the picture needs to be stored, the memory pressure for picture textures is reduced, achieving effective memory optimization.
Referring to fig. 5, fig. 5 is a flow chart of a bipartite symmetrical mapping method for a mesh according to the present application. This embodiment describes the process of generating the local mesh of the second local texture from the local mesh of the first local texture when the symmetry mode of the target picture is the bipartite symmetrical mode, the symmetry parameter is the bipartite axis symmetry parameter, and that parameter includes the above-mentioned target symmetry axis of the target picture. As shown in fig. 5, the method may include:
Step S201, performing symmetry processing on the mesh vertices of the first local texture based on the target symmetry axis to generate the first mesh vertices of the second local texture.
Alternatively, the rendering device may perform axisymmetrical processing on the mesh vertices of the first local texture about the target symmetry axis to generate the mesh vertices of the second local texture, which may be referred to as first mesh vertices; each first mesh vertex is symmetrical to a mesh vertex of the first local texture based on the direction of the target symmetry axis.
Each mesh vertex of the first local texture may thus be used to generate one first mesh vertex that is symmetrical to it based on the direction of the target symmetry axis; this first mesh vertex may be referred to as the first mesh vertex corresponding to that mesh vertex of the first local texture.
Step S202, setting the texture coordinates of the mesh vertices of the first local texture as the texture coordinates of the first mesh vertices that are symmetrical to them based on the target symmetry axis.
Alternatively, the rendering device may set the texture coordinates of each mesh vertex of the first local texture as the texture coordinates of its corresponding first mesh vertex.
In other words, the texture coordinates of any one of the mesh vertices of the first local texture may be set as the texture coordinates of the first mesh vertex corresponding to the mesh vertex.
Step S203, generating a local mesh of the second local texture based on the first mesh vertex of the second local texture having the texture coordinates.
Alternatively, the rendering device may generate the local mesh of the second local texture from the first mesh vertices of the second local texture once they have texture coordinates. After each first mesh vertex of the second local texture is determined, the corresponding mesh triangles (which may be referred to as triangles) may be generated with the first mesh vertices as their vertices; the local mesh of the second local texture may then comprise the first mesh vertices of the second local texture and the triangles whose vertices they are.
Referring to fig. 6 a-6 c, fig. 6 a-6 c are schematic views of a scene for generating a local mesh according to the present application. As shown in fig. 6a, the target picture may be a picture a1 herein, and the frame a2 includes related symmetry attribute parameters (such as the symmetry parameters described above) configured for the target picture, where the symmetry mode of the target picture may be a bipartite symmetry mode, the symmetry parameter may be a bipartite axis symmetry parameter, and the target symmetry axis in the bipartite axis symmetry parameter may be a y axis.
Therefore, the target picture may be symmetrically cropped in the y-axis direction to obtain a local picture a3 of the target picture, where the local picture a3 may be a picture of a left half of the target picture, the first local texture may be a texture of the local picture a3, and only the texture of the local picture a3 may be stored in the rendering device.
As further shown in fig. 6b, the mesh vertices in the local mesh of the first local texture (such as the texture of the local picture a3 herein) may include mesh vertex b1, mesh vertex b2, mesh vertex b3, and mesh vertex b4 herein. Wherein, the texture coordinate of the mesh vertex b1 is (0, 1), the texture coordinate of the mesh vertex b2 is (0, 0), the texture coordinate of the mesh vertex b3 is (1, 0), and the texture coordinate of the mesh vertex b4 is (1, 1).
Since the first local texture and the second local texture are symmetrical based on the direction of the y-axis, the rendering device may generate mesh vertex b5 symmetrical to mesh vertex b1 based on the direction of the y-axis, and mesh vertex b6 symmetrical to mesh vertex b2. Mesh vertices b3 and b4 lie on the symmetry axis, so the vertices symmetrical to them are mesh vertices b3 and b4 themselves. That is, the first mesh vertices of the second local texture may include mesh vertex b3, mesh vertex b4, mesh vertex b5, and mesh vertex b6.
Accordingly, the rendering device may set the texture coordinates (0, 1) of the mesh vertex b1 to be the texture coordinates of the mesh vertex b5, and may set the texture coordinates (0, 0) of the mesh vertex b2 to be the texture coordinates of the mesh vertex b6; that is, each first mesh vertex may have the same texture coordinates as the mesh vertex of the first local texture to which it is symmetrical about the y-axis. The local mesh of the second local texture includes the mesh vertex b3, the mesh vertex b4, the mesh vertex b5, and the mesh vertex b6, and two triangles whose vertices are these four mesh vertices.
Through the above procedure, the local mesh of the second local texture is generated from the local mesh of the first local texture, and the two mesh vertices b3 and b4 are shared between the local mesh of the first local texture and the local mesh of the second local texture.
It will be appreciated that the first local texture and the second local texture may in practice have a very large number of mesh vertices, of which mesh vertices b1 to b6 are merely an example. Regardless of how many mesh vertices the two local textures have, each mesh vertex of the second local texture and its texture coordinates may be generated based on the mesh vertices and texture coordinates of the first local texture according to the above principle.
In another possible embodiment, as further shown in fig. 6c, fig. 6c shows the principle of how the local mesh of the second local texture is generated by the local mesh of the first local texture in case the target picture is symmetrical based on the direction of the x-axis (i.e. in case the target symmetry axis is the x-axis).
Here, the mesh vertex of the first local texture may include a mesh vertex b7, a mesh vertex b8, a mesh vertex b9, and a mesh vertex b10, the texture coordinate of the mesh vertex b7 may be (0, 1), the texture coordinate of the mesh vertex b8 may be (1, 1), the texture coordinate of the mesh vertex b9 may be (1, 0), and the texture coordinate of the mesh vertex b10 may be (0, 0).
Similarly, the rendering apparatus may generate the mesh vertex b11 symmetrical to the mesh vertex b7 about the x-axis and the mesh vertex b12 symmetrical to the mesh vertex b8 about the x-axis. Because the mesh vertex b9 and the mesh vertex b10 lie on the x-axis, the mesh vertices symmetrical to them are the mesh vertex b9 and the mesh vertex b10 themselves. That is, the plurality of first mesh vertices of the second local texture may include mesh vertex b9, mesh vertex b10, mesh vertex b11, and mesh vertex b12.
Accordingly, the rendering device may set the texture coordinates (0, 1) of the mesh vertex b7 to be the texture coordinates of the mesh vertex b11, and may set the texture coordinates (1, 1) of the mesh vertex b8 to be the texture coordinates of the mesh vertex b12; that is, each first mesh vertex may have the same texture coordinates as the mesh vertex of the first local texture to which it is symmetrical about the x-axis. The local mesh of the second local texture includes the mesh vertex b9, the mesh vertex b10, the mesh vertex b11, and the mesh vertex b12, and two triangles whose vertices are these four mesh vertices.
Through the above procedure, the local mesh of the second local texture is generated from the local mesh of the first local texture, and the two mesh vertices b9 and b10 are shared between the local mesh of the first local texture and the local mesh of the second local texture.
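The axis-symmetric vertex generation of figs. 6b-6c can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, vertex positions, and the `((x, y), (u, v))` data layout are all assumptions. The rule it demonstrates is the one described above: mirror each vertex position across the target symmetry axis and reuse its texture coordinates, with vertices lying on the axis mapping onto themselves and therefore being shared.

```python
def mirror_mesh(vertices, axis):
    """Mirror vertex positions across the y-axis (axis='y': x -> -x) or the
    x-axis (axis='x': y -> -y). Each mirrored vertex keeps the texture
    coordinates of its source vertex; a vertex on the symmetry axis maps
    onto itself, so it is shared between the two halves."""
    out = []
    for (x, y), uv in vertices:
        if axis == "y":
            out.append(((-x, y), uv))
        else:
            out.append(((x, -y), uv))
    return out

# Left half of the picture (first local texture), with the shared edge
# b3-b4 placed on the y-axis; positions are assumed for illustration,
# texture coordinates follow fig. 6b.
first_half = [
    ((-1.0, 1.0), (0.0, 1.0)),   # b1
    ((-1.0, 0.0), (0.0, 0.0)),   # b2
    (( 0.0, 0.0), (1.0, 0.0)),   # b3 (on the axis, shared)
    (( 0.0, 1.0), (1.0, 1.0)),   # b4 (on the axis, shared)
]

second_half = mirror_mesh(first_half, "y")
# b5 (mirror of b1) keeps b1's texture coordinates (0, 1):
assert second_half[0] == ((1.0, 1.0), (0.0, 1.0))
# b3 and b4 map onto themselves, so the two halves share them:
assert second_half[2] == first_half[2] and second_half[3] == first_half[3]
```

The x-axis case of fig. 6c is the same sketch with `axis="x"`; in an actual renderer the triangle index lists would be built over these shared vertices, which the sketch omits.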
Referring to fig. 7, fig. 7 is a flow chart of another grid mapping method for bipartite symmetry according to the present application. In the embodiment of the application, when the symmetrical mode of the target picture is a bipartite symmetrical mode and the symmetrical parameter is a bipartite center symmetrical parameter, the process of generating the local grid of the second local texture through the local grid of the first local texture is described. As shown in fig. 7, the method may include:
step S301, rotating grid vertexes of a first local texture by a first central symmetry angle to generate second grid vertexes of a second local texture; one grid vertex of the first local texture is rotated and then used for generating a corresponding second grid vertex.
Alternatively, when the target picture is a bipartite center symmetric picture, the bipartite center symmetry parameter may be used to indicate the first central symmetry angle, which may be 180 degrees. At this time, the first local texture may be the texture of half of the target picture.
The rendering device may rotate the grid vertex of the first local texture by the first central symmetry angle, so as to generate the grid vertex of the second local texture at this time, and may refer to the grid vertex as the second grid vertex, where one grid vertex of the first local texture rotates and is then used to generate a corresponding second grid vertex.
In step S302, the texture coordinates of each grid vertex of the first local texture are set as the texture coordinates of the second grid vertex corresponding to each grid vertex of the first local texture.
Alternatively, the rendering device may set the texture coordinates of each mesh vertex of the first local texture to be the texture coordinates of the second mesh vertex corresponding to each mesh vertex of the first local texture, respectively.
In other words, the texture coordinates of any one of the mesh vertices of the first local texture may be set to be the texture coordinates of the second mesh vertex generated by rotating the any one of the mesh vertices by 180 degrees.
Step S303, generating a local mesh of the second local texture based on the second mesh vertices of the second local texture having the texture coordinates.
Alternatively, the rendering device may generate the local mesh of the second local texture from the second mesh vertices of the second local texture having texture coordinates. After each second mesh vertex of the second local texture is determined, corresponding mesh triangles (which may be abbreviated as triangles) may be generated based on those vertices, the vertices of each mesh triangle being second mesh vertices of the second local texture; the local mesh of the second local texture may include each second mesh vertex of the second local texture and the triangles whose vertices are those second mesh vertices.
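Steps S301-S303 can be sketched as follows. This is an illustration under assumed names and coordinates, not the patent's implementation: rotate each vertex position 180 degrees about the picture center and copy the source vertex's texture coordinates onto the rotated (second) vertex.

```python
def rotate_180(vertices, center):
    """Rotate vertex positions 180 degrees about `center`. A 180-degree
    rotation maps (x, y) to (2*cx - x, 2*cy - y); each rotated vertex
    keeps the texture coordinates of its source vertex."""
    cx, cy = center
    return [((2 * cx - x, 2 * cy - y), uv) for (x, y), uv in vertices]

# Upper half of a unit-square picture as the first local texture; the
# picture center is (0.5, 0.5). Positions and texture coordinates are
# assumed for illustration.
upper_half = [
    ((0.0, 1.0), (0.0, 1.0)),   # top left
    ((1.0, 1.0), (1.0, 1.0)),   # top right
    ((1.0, 0.5), (1.0, 0.0)),   # middle right
    ((0.0, 0.5), (0.0, 0.0)),   # middle left
]

lower_half = rotate_180(upper_half, (0.5, 0.5))
# The top-left vertex rotates onto the bottom-right corner and keeps
# its texture coordinates (0, 1):
assert lower_half[0] == ((1.0, 0.0), (0.0, 1.0))
```

Triangles for the second local mesh would then be built over these rotated vertices, as described in step S303.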
Referring to fig. 8, fig. 8 is a schematic view of another image cropping scene provided by the present application. As shown in fig. 8, the symmetry mode of the target picture g1 may be a bipartite symmetry mode, and the symmetry parameter may be a bipartite center symmetry parameter, that is, the target picture g1 is a bipartite center symmetric picture.
The picture of the upper half of the target picture g1 may be cropped to obtain a local picture g2 of the target picture g1; the other half of the target picture g1 (namely, the picture of the lower half of the target picture g1) may be obtained by rotating the local picture g2 by 180 degrees. The first local texture may be the texture of the local picture g2, so it can be understood that, by rotating the mesh vertices in the local mesh of the local picture g2 by 180 degrees, the mesh vertices (i.e., the second mesh vertices) of the texture of the other half of the target picture g1 (i.e., the second local texture) may be generated.
Through the above process, the local mesh of the second local texture is generated from the local mesh of the first local texture: the mesh vertices in the local mesh of the second local texture are obtained by rotating the mesh vertices in the local mesh of the first local texture by the first central symmetry angle.
Referring to fig. 9, fig. 9 is a flow chart of a grid mapping method for quarter axis symmetry according to the present application. In the embodiment of the application, when the symmetrical mode of the target picture is a quarter symmetrical mode and the symmetrical parameter is a quarter axis symmetrical parameter, the process of generating the local grids of the other local textures through the local grid of the first local texture is described. As shown in fig. 9, the method may include:
step S401, performing symmetry processing on the grid vertices of the first local texture based on the first symmetry axis, generating third grid vertices of the third local texture, and setting the texture coordinates of the grid vertices of the first local texture to the texture coordinates of the third grid vertices symmetric to the grid vertices of the first local texture based on the first symmetry axis.
Specifically, in this scenario, the target picture may have two symmetry axes, i.e., a first symmetry axis and a second symmetry axis, where the first symmetry axis may be the first coordinate axis (i.e., the x-axis), and the second symmetry axis may be the second coordinate axis (i.e., the y-axis). The target picture may be equally divided into 4 parts based on the direction of the first axis of symmetry and the direction of the second axis of symmetry.
Optionally, the quarter axisymmetric parameter may further include distribution information of the first local texture in the global texture, and since the target picture may be quarter-divided into four local textures, the four local textures may be understood to be distributed in 4 quadrants (including the first quadrant, the second quadrant, the third quadrant, and the fourth quadrant), the texture of the 1/4 local picture in the upper right portion of the target picture may be understood to be distributed in the first quadrant, the texture of the 1/4 local picture in the upper left portion of the target picture may be understood to be distributed in the second quadrant, the texture of the 1/4 local picture in the lower left portion of the target picture may be understood to be distributed in the third quadrant, and the texture of the 1/4 local picture in the lower right portion of the target picture may be understood to be distributed in the fourth quadrant. Thus, the distribution information of the first local texture may be used to indicate which quadrant in the global texture the first local texture is distributed in.
The rendering device may perform symmetric processing on the mesh vertex of the first local texture through the first symmetry axis to generate a corresponding mesh vertex, may use the mesh vertex generated by performing symmetric processing on the mesh vertex of the first local texture through the first symmetry axis as the mesh vertex of the third local texture, and may refer to the mesh vertex as a third mesh vertex.
Wherein one mesh vertex of the first local texture is used for performing symmetry processing based on the first symmetry axis to generate a third mesh vertex, and the third mesh vertex is symmetrical with the mesh vertex of the first local texture based on the direction of the first symmetry axis.
The rendering device may further set the texture coordinates of the mesh vertex of the first local texture to be texture coordinates of a third mesh vertex symmetrical to the mesh vertex of the first local texture based on the first symmetry axis. In other words, the rendering apparatus may set the texture coordinates of any one of the mesh vertices of the first local texture to be the texture coordinates of the third mesh vertex obtained by symmetrically processing the any one of the mesh vertices based on the first symmetry axis.
If the first local texture is distributed in the second quadrant, the rendering device may perform downward symmetric processing on the mesh vertex of the first local texture based on the first symmetry axis (e.g., the x-axis) to generate a third mesh vertex of the third local texture, where the third local texture may be distributed in the third quadrant (i.e., the local picture to which the third local texture belongs is a picture of the lower left 1/4 portion of the target picture).
Step S402, performing symmetry processing on the grid vertex of the first local texture based on the second symmetry axis, generating a fourth grid vertex of the fourth local texture, and setting the texture coordinates of the grid vertex of the first local texture to the texture coordinates of the fourth grid vertex symmetrical to the grid vertex of the first local texture based on the second symmetry axis.
Similarly, the rendering device may perform symmetric processing on the mesh vertex of the first local texture through the second symmetry axis to generate a corresponding mesh vertex, and may use the mesh vertex generated by performing symmetric processing on the mesh vertex of the first local texture through the second symmetry axis as the mesh vertex of the fourth local texture, and may refer to the mesh vertex as the fourth mesh vertex.
Wherein one mesh vertex of the first local texture is used for symmetrical processing based on the second symmetry axis to generate a fourth mesh vertex, and the fourth mesh vertex is symmetrical with the mesh vertex of the first local texture based on the direction of the second symmetry axis.
The rendering device may further set the texture coordinates of the mesh vertex of the first partial texture to be texture coordinates of a fourth mesh vertex symmetrical to the mesh vertex of the first partial texture based on the second symmetry axis. In other words, the rendering apparatus may set the texture coordinates of any one of the mesh vertices of the first local texture to be the texture coordinates of a fourth mesh vertex obtained by symmetrically processing the any one of the mesh vertices based on the second symmetry axis.
If the first local texture is distributed in the second quadrant, the rendering device may perform rightward symmetric processing on the mesh vertex of the first local texture based on the second symmetry axis (e.g., the y-axis) to generate a fourth mesh vertex of the fourth local texture, where the fourth local texture may be distributed in the first quadrant (i.e., the local picture to which the fourth local texture belongs is a picture of the top right 1/4 portion of the target picture).
Step S403, performing symmetry processing on the grid vertices of the first local texture based on the first symmetry axis and the second symmetry axis, generating a fifth grid vertex of the fifth local texture, and setting the texture coordinates of the grid vertices of the first local texture to the texture coordinates of the fifth grid vertex symmetrical to the grid vertices of the first local texture based on the first symmetry axis and the second symmetry axis.
Further, the rendering device may perform symmetric processing on the mesh vertex of the first local texture through the first symmetry axis and the second symmetry axis to generate a corresponding mesh vertex, and may use the mesh vertex generated by performing symmetric processing on the mesh vertex of the first local texture through the first symmetry axis and the second symmetry axis as the mesh vertex of the fifth local texture, and may refer to the mesh vertex as a fifth mesh vertex.
Wherein one mesh vertex of the first local texture is used for performing symmetry processing based on the first symmetry axis and the second symmetry axis to generate a fifth mesh vertex, and the fifth mesh vertex is symmetrical with that mesh vertex of the first local texture based on the directions of the first symmetry axis and the second symmetry axis.
The rendering device may further set texture coordinates of the mesh vertices of the first local texture to texture coordinates of a fifth mesh vertex generated after symmetric processing of the mesh vertices of the first local texture based on the first symmetry axis and the second symmetry axis. In other words, the rendering apparatus may set the texture coordinates of any one of the mesh vertices of the first local texture to be the texture coordinates of the fifth mesh vertex obtained by symmetrically processing the any one of the mesh vertices based on the first symmetry axis and the second symmetry axis.
If the first local texture is distributed in the second quadrant, the rendering device may perform downward symmetry processing on the mesh vertex of the first local texture based on the first symmetry axis and then perform rightward symmetry processing based on the second symmetry axis to generate a fifth mesh vertex of the fifth local texture, or the rendering device may perform rightward symmetry processing on the mesh vertex of the first local texture based on the second symmetry axis and then perform downward symmetry processing based on the first symmetry axis and then generate the fifth mesh vertex of the fifth local texture. In this case, the fifth local texture may be distributed in the fourth quadrant (i.e., the local picture to which the fifth local texture belongs is the picture of the lower right 1/4 portion of the target picture).
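The quadrant relationships in the three examples above (a first local texture in the second quadrant yielding the third, first, and fourth quadrants) can be expressed as a small lookup. This sketch is illustrative only; the function name and the convention of numbering quadrants 1 (upper right) to 4 (lower right) counter-clockwise are assumptions, not part of the patent.

```python
def mirrored_quadrant(quadrant, across):
    """Quadrant reached by mirroring across the x-axis ('x'), the y-axis
    ('y'), or both axes in sequence ('xy'), with quadrants numbered
    1 (upper right) to 4 (lower right) counter-clockwise."""
    mirror_x = {1: 4, 2: 3, 3: 2, 4: 1}   # upper <-> lower
    mirror_y = {1: 2, 2: 1, 3: 4, 4: 3}   # left  <-> right
    if across == "x":
        return mirror_x[quadrant]
    if across == "y":
        return mirror_y[quadrant]
    if across == "xy":
        return mirror_x[mirror_y[quadrant]]
    raise ValueError(f"unknown axis spec: {across}")

# First local texture distributed in the second quadrant, as in the text:
assert mirrored_quadrant(2, "x") == 3    # third local texture: lower left
assert mirrored_quadrant(2, "y") == 1    # fourth local texture: upper right
assert mirrored_quadrant(2, "xy") == 4   # fifth local texture: lower right
```

The same lookup covers a first local texture placed in any other quadrant, which is what the distribution information in the quarter axis symmetry parameter selects.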
Step S404, generating a local mesh of the third local texture based on the third mesh vertex of the third local texture having texture coordinates, generating a local mesh of the fourth local texture based on the fourth mesh vertex of the fourth local texture having texture coordinates, and generating a local mesh of the fifth local texture based on the fifth mesh vertex of the fifth local texture having texture coordinates.
Alternatively, the rendering device may generate a local mesh of the third local texture by using third mesh vertices of the third local texture having texture coordinates, and the local mesh of the third local texture may include each third mesh vertex having texture coordinates and a triangle having each third mesh vertex as a vertex.
The rendering device may further generate a local mesh of the fourth local texture by fourth mesh vertices of the fourth local texture having texture coordinates, and the local mesh of the fourth local texture may include respective fourth mesh vertices having texture coordinates and triangles having the respective fourth mesh vertices as vertices.
Similarly, the rendering device may further generate a local mesh of the fifth local texture by using the fifth mesh vertices of the fifth local texture having texture coordinates, where the local mesh of the fifth local texture may include each fifth mesh vertex having texture coordinates and a triangle having each fifth mesh vertex as a vertex.
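Putting steps S401-S404 together, the three remaining local meshes can be generated from the first one by mirroring vertex positions while reusing texture coordinates. The sketch below is a hedged illustration: the helper name, the vertex positions, and the `((x, y), (u, v))` layout are assumptions; the texture coordinates follow fig. 10b.

```python
def mirror_axis(vertices, axis):
    """Mirror positions across the x-axis ('x': y -> -y) or the y-axis
    ('y': x -> -x); mirrored vertices keep their source texture coords."""
    return [(((x, -y) if axis == "x" else (-x, y)), uv)
            for (x, y), uv in vertices]

# First local texture: upper-left 1/4 (second quadrant); vertices d1..d4
# with the texture coordinates of fig. 10b, positions assumed so that d3
# sits at the origin (the intersection of the two symmetry axes).
first = [
    ((-1.0, 1.0), (1.0, 1.0)),   # d1
    ((-1.0, 0.0), (1.0, 0.0)),   # d2 (on the x-axis)
    (( 0.0, 0.0), (0.0, 0.0)),   # d3 (at the origin)
    (( 0.0, 1.0), (0.0, 1.0)),   # d4 (on the y-axis)
]

third  = mirror_axis(first, "x")                     # lower left,  step S401
fourth = mirror_axis(first, "y")                     # upper right, step S402
fifth  = mirror_axis(mirror_axis(first, "x"), "y")   # lower right, step S403

# d5 (mirror of d1 across the x-axis) keeps d1's texture coordinates (1, 1):
assert third[0] == ((-1.0, -1.0), (1.0, 1.0))
# d3 lies on both axes, so all four local meshes share it:
assert first[2] == third[2] == fourth[2] == fifth[2]
```

Step S404 would then build each local mesh from its mirrored vertices plus the triangles over them, exactly as for the first local mesh.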
Referring to figs. 10a-10b, figs. 10a-10b are schematic views of another scenario for generating a local mesh according to the present application. As shown in fig. 10a, a relevant type parameter of the target picture (such as a parameter related to the symmetry type) is configured in the box c2. The parameter may be used to indicate that the symmetry mode of the target picture c1 is a quarter symmetry mode and the symmetry parameter is a quarter axis symmetry parameter, that is, the target picture c1 may be a quarter axisymmetric picture. The target picture c1 may be cropped to obtain a local picture c3 of the target picture c1, where the local picture c3 may be the local picture of the upper-left 1/4 portion of the target picture; the texture of the local picture c3 may be regarded as the first local texture, that is, the first local texture may be distributed in the second quadrant.
As further shown in fig. 10b, the local mesh of the first local texture may include mesh vertex d1, mesh vertex d2, mesh vertex d3, and mesh vertex d4. Wherein, the texture coordinate of the mesh vertex d1 may be (1, 1), the texture coordinate of the mesh vertex d2 may be (1, 0), the texture coordinate of the mesh vertex d3 may be (0, 0), and the texture coordinate of the mesh vertex d4 may be (0, 1).
Wherein, the grid vertex d1 is symmetrically processed based on the first symmetry axis, and the generated third grid vertex may be the grid vertex d5, so the texture coordinates of the grid vertex d5 may be the same as the texture coordinates of the grid vertex d1; the grid vertex d4 is symmetrically processed based on the first symmetry axis, and the generated third grid vertex may be the grid vertex d6, so the texture coordinates of the grid vertex d6 may be the same as the texture coordinates of the grid vertex d4. The grid vertex d2 is symmetrically processed based on the first symmetry axis and the generated third grid vertex is the grid vertex d2 itself; likewise, the grid vertex d3 is symmetrically processed based on the first symmetry axis and the generated third grid vertex is the grid vertex d3 itself.
Through the above procedure, each third grid vertex of the third local texture (including grid vertex d2, grid vertex d3, grid vertex d5, and grid vertex d6) is generated, and the third local texture shares the two grid vertices d2 and d3 with the first local texture.
Similarly, the grid vertex d1 is symmetrically processed based on the second symmetry axis, and the generated fourth grid vertex may be the grid vertex d8, so the texture coordinates of the grid vertex d8 may be the same as the texture coordinates of the grid vertex d1; the grid vertex d2 is symmetrically processed based on the second symmetry axis, and the generated fourth grid vertex may be the grid vertex d9, so the texture coordinates of the grid vertex d9 may be the same as the texture coordinates of the grid vertex d2. The grid vertex d3 is symmetrically processed based on the second symmetry axis and the generated fourth grid vertex is the grid vertex d3 itself; likewise, the grid vertex d4 is symmetrically processed based on the second symmetry axis and the generated fourth grid vertex is the grid vertex d4 itself.
Through the above procedure, each fourth grid vertex of the fourth local texture (including grid vertex d3, grid vertex d4, grid vertex d8, and grid vertex d9) is generated, and the fourth local texture shares the two grid vertices d3 and d4 with the first local texture.
Further, the rendering device performs symmetric processing on the grid vertex d1 based on the first symmetry axis and the second symmetry axis, and the generated fifth grid vertex may be the grid vertex d7, so the texture coordinates of the grid vertex d7 may be the same as the texture coordinates of the grid vertex d1. The rendering device performs symmetry processing on the grid vertex d2 based on the first symmetry axis and the second symmetry axis, and the generated fifth grid vertex may be the grid vertex d9 (because the position after symmetry processing through the first symmetry axis is the position of the grid vertex d2 itself, and the position of the grid vertex d9 may then be obtained by symmetry processing of that position based on the second symmetry axis), so the texture coordinates of the grid vertex d9 may be the same as the texture coordinates of the grid vertex d2.
Similarly, the rendering device performs symmetric processing on the grid vertex d4 based on the first symmetry axis and the second symmetry axis, and the generated fifth grid vertex may be the grid vertex d6, so the texture coordinates of the grid vertex d6 may be the same as the texture coordinates of the grid vertex d4; the rendering device performs symmetry processing on the grid vertex d3 based on the first symmetry axis and the second symmetry axis, and the generated fifth grid vertex is the grid vertex d3 itself.
Through the above procedure, each fifth grid vertex of the fifth local texture (including grid vertex d3, grid vertex d6, grid vertex d7, and grid vertex d9) is generated, and the fifth local texture and the first local texture share the grid vertex d3.
Through the above process, the local grids of the other local textures are accurately and quickly generated from the local grid of the first local texture.
Referring to fig. 11, fig. 11 is a flow chart of another grid mapping method for quarter symmetry according to the present application. In the embodiment of the application, when the symmetrical mode of the target picture is a quarter symmetrical mode and the symmetrical parameter is a quarter central symmetrical parameter, the process of generating the local grids of the other local textures through the local grid of the first local texture is described. As shown in fig. 11, the method may include:
step S501, rotating the grid vertex of the first local texture by a second central symmetry angle to generate a sixth grid vertex of the third local texture, and setting the texture coordinate of the grid vertex of the first local texture to be the texture coordinate of the sixth grid vertex generated after rotating the grid vertex of the first local texture.
In particular, the second angle of central symmetry may be 90 degrees. The rendering device may rotate the mesh vertex of the first local texture by the second central symmetry angle to generate a corresponding mesh vertex, where the mesh vertex generated by rotating the mesh vertex of the first local texture by the second central symmetry angle may be used as the mesh vertex of the third local texture, and the mesh vertex may be referred to as a sixth mesh vertex.
After any grid vertex of the first local texture is rotated by the second central symmetry angle, a corresponding sixth grid vertex is generated.
The rendering apparatus may use texture coordinates of each mesh vertex of the first local texture as texture coordinates of a sixth mesh vertex corresponding to each mesh vertex of the first local texture, respectively. In other words, the rendering apparatus may use the texture coordinates of any one of the mesh vertices of the first local texture as the texture coordinates of the sixth mesh vertex generated by rotating the any one of the mesh vertices by the second centrosymmetric angle.
Alternatively, the quarter-centered symmetrical parameter may also include information about the distribution of the first local texture in the global texture, and since the target picture may be quarter-divided into four local textures, it may be understood that the four local textures are distributed in 4 quadrants (including the first quadrant, the second quadrant, the third quadrant, and the fourth quadrant), the texture of the 1/4 local picture in the upper right portion of the target picture may be understood as being distributed in the first quadrant, the texture of the 1/4 local picture in the upper left portion of the target picture may be understood as being distributed in the second quadrant, the texture of the 1/4 local picture in the lower left portion of the target picture may be understood as being distributed in the third quadrant, and the texture of the 1/4 local picture in the lower right portion of the target picture may be understood as being distributed in the fourth quadrant. Thus, the distribution information of the first local texture may be used to indicate which quadrant in the global texture the first local texture is distributed in.
If the first local texture is distributed in the second quadrant, the grid vertices of the first local texture may be rotated 90 degrees to the right (as may be understood, the position of the lower right corner of the first local texture may be the center position of the global texture) to generate the sixth grid vertices of the third local texture, which may be distributed in the first quadrant at this time.
Step S502, rotating the grid vertex of the first local texture by a third central symmetry angle to generate a seventh grid vertex of the fourth local texture, and setting the texture coordinate of the grid vertex of the first local texture to be the texture coordinate of the seventh grid vertex generated after rotating the grid vertex of the first local texture.
Alternatively, the third central symmetry angle may be 180 degrees. The rendering device may rotate the mesh vertex of the first local texture by the third central symmetry angle to generate a corresponding mesh vertex; the mesh vertex generated by rotating the mesh vertex of the first local texture by the third central symmetry angle may be regarded as a mesh vertex of the fourth local texture and may be referred to as a seventh mesh vertex.
After any grid vertex of the first local texture is rotated by the third central symmetry angle, a corresponding seventh grid vertex is generated.
The rendering apparatus may use texture coordinates of each mesh vertex of the first local texture as texture coordinates of a seventh mesh vertex corresponding to each mesh vertex of the first local texture, respectively. In other words, the rendering apparatus may use the texture coordinates of any one of the mesh vertices of the first local texture as the texture coordinates of the seventh mesh vertex generated by rotating the any one of the mesh vertices by the third centrosymmetric angle.
If the first local texture is distributed in the second quadrant, the grid vertex of the first local texture may be rotated 180 degrees to the right (for example, the grid vertex may be rotated by taking the position point of the lower right corner of the first local texture as the rotation point) to generate a seventh grid vertex of the fourth local texture, and the fourth local texture may be distributed in the fourth quadrant at this time.
Step S503, rotating the grid vertex of the first local texture by a fourth central symmetry angle to generate an eighth grid vertex of the fifth local texture, and setting the texture coordinates of the grid vertex of the first local texture to the texture coordinates of the eighth grid vertex generated after rotating the grid vertex of the first local texture.
Alternatively, the fourth angle of central symmetry may be 270 degrees. The rendering device may rotate the mesh vertex of the first local texture by the fourth central symmetry angle to generate a corresponding mesh vertex, where the mesh vertex generated by rotating the mesh vertex of the first local texture by the fourth central symmetry angle may be regarded as a mesh vertex of the fifth local texture, and the mesh vertex may be referred to as an eighth mesh vertex.
After any grid vertex of the first local texture is rotated by the fourth central symmetry angle, a corresponding eighth grid vertex is generated.
The rendering apparatus may use texture coordinates of each mesh vertex of the first local texture as texture coordinates of an eighth mesh vertex corresponding to each mesh vertex of the first local texture, respectively. In other words, the rendering apparatus may use the texture coordinates of any one of the mesh vertices of the first local texture as the texture coordinates of the eighth mesh vertex generated by rotating the any one of the mesh vertices by the fourth central symmetry angle.
If the first local texture is distributed in the second quadrant, the mesh vertex of the first local texture may be rotated to the right by 270 degrees (for example, the mesh vertex may be rotated by taking the position point of the lower right corner of the first local texture as the rotation point), so as to generate the eighth mesh vertex of the fifth local texture, and the fifth local texture may be distributed in the third quadrant.
Step S504, generating a local mesh of the third local texture based on the sixth mesh vertex of the third local texture having texture coordinates, generating a local mesh of the fourth local texture based on the seventh mesh vertex of the fourth local texture having texture coordinates, and generating a local mesh of the fifth local texture based on the eighth mesh vertex of the fifth local texture having texture coordinates.
Alternatively, the rendering apparatus may generate the local mesh of the third local texture by using sixth mesh vertices of the third local texture having texture coordinates, and the local mesh of the third local texture may include each of the sixth mesh vertices having texture coordinates and a triangle having each of the sixth mesh vertices as a vertex.
The rendering device may further generate a local mesh of the fourth local texture by a seventh mesh vertex of the fourth local texture having texture coordinates, and the local mesh of the fourth local texture may include each seventh mesh vertex having texture coordinates and a triangle having each seventh mesh vertex as a vertex.
Similarly, the rendering device may further generate a local mesh of the fifth local texture by using the eighth mesh vertices of the fifth local texture having texture coordinates, where the local mesh of the fifth local texture may include each eighth mesh vertex having texture coordinates and a triangle having each eighth mesh vertex as a vertex.
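The "vertices plus triangles" structure of a local mesh described above can be sketched as follows (the dictionary layout and names are illustrative assumptions, not the application's actual data structures):

```python
def quad_mesh(verts, uvs):
    """Build a minimal local mesh for a four-vertex local texture: the vertex
    positions, their texture coordinates, and two triangles (index triples into
    the vertex list) that together cover the quadrilateral."""
    assert len(verts) == 4 and len(uvs) == 4
    return {
        "vertices": list(verts),
        "uvs": list(uvs),
        "triangles": [(0, 1, 2), (0, 2, 3)],  # split along the 0-2 diagonal
    }

# Example: a unit quad whose UVs span the whole first local texture.
mesh = quad_mesh(
    verts=[(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)],
    uvs=[(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)],
)
```

The same construction applies to the local meshes of the third, fourth and fifth local textures: only the vertex positions differ, while the UVs remain coordinates on the first local texture.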
Referring to fig. 12a-12b, fig. 12a-12b are schematic views of still another scenario for generating a local mesh according to the present application. As shown in fig. 12a, a type parameter of the target picture (such as a parameter related to the symmetry type) is configured in box e2. The parameter may indicate that the symmetry mode of the target picture e1 is the quarter symmetry mode and that the symmetry parameter is the quarter center symmetry parameter; that is, the target picture e1 may be a quarter centrally symmetric picture. The target picture e1 may be clipped to obtain a local picture e3 of the target picture e1, where the local picture e3 may be the local picture of the lower-left 1/4 portion of the target picture. The texture of the local picture e3 may be regarded as the first local texture, i.e., the first local texture may be distributed in the third quadrant (Third).
As further shown in fig. 12b, the local mesh of the first local texture may include mesh vertex f1, mesh vertex f2, mesh vertex f3, and mesh vertex f4. Wherein, the texture coordinate of the mesh vertex f1 may be (1, 0), the texture coordinate of the mesh vertex f2 may be (0, 0), the texture coordinate of the mesh vertex f3 may be (0, 1), and the texture coordinate of the mesh vertex f4 may be (1, 1).
After the grid vertex f1 is rotated rightward by the second central symmetry angle, the generated sixth grid vertex may be the grid vertex f6, so the texture coordinates of the grid vertex f6 may be the same as the texture coordinates of the grid vertex f1; after the grid vertex f2 is rotated rightward by the second central symmetry angle, the generated sixth grid vertex is still the grid vertex f2; after the grid vertex f3 is rotated rightward by the second central symmetry angle, the generated sixth grid vertex may be the grid vertex f7, so the texture coordinates of the grid vertex f7 may be the same as those of the grid vertex f3; after the grid vertex f4 is rotated rightward by the second central symmetry angle, the generated sixth grid vertex may be the grid vertex f5, so the texture coordinates of the grid vertex f5 may be the same as those of the grid vertex f4.
Through the above procedure, each sixth mesh vertex of the third local texture is generated, and the third local texture may share the mesh vertex f2 as one mesh vertex with the first local texture.
Similarly, after the grid vertex f1 is rotated rightward by the third central symmetry angle, the generated seventh grid vertex may be the grid vertex f10, so the texture coordinates of the grid vertex f10 may be the same as the texture coordinates of the grid vertex f1; after the grid vertex f2 is rotated rightward by the third central symmetry angle, the generated seventh grid vertex is still the grid vertex f2; after the grid vertex f3 is rotated rightward by the third central symmetry angle, the generated seventh grid vertex may be the grid vertex f8, so the texture coordinates of the grid vertex f8 may be the same as those of the grid vertex f3; after the grid vertex f4 is rotated rightward by the third central symmetry angle, the generated seventh grid vertex may be the grid vertex f9, so the texture coordinates of the grid vertex f9 may be the same as those of the grid vertex f4.
Through the above procedure, each seventh mesh vertex of the fourth local texture is generated, and the fourth local texture may share the mesh vertex f2 with the first local texture.
Further, after the grid vertex f1 is rotated rightward by the fourth central symmetry angle, the generated eighth grid vertex may be the grid vertex f13, so the texture coordinates of the grid vertex f13 may be the same as the texture coordinates of the grid vertex f1; after the grid vertex f2 is rotated rightward by the fourth central symmetry angle, the generated eighth grid vertex is still the grid vertex f2; after the grid vertex f3 is rotated rightward by the fourth central symmetry angle, the generated eighth grid vertex may be the grid vertex f11, so the texture coordinates of the grid vertex f11 may be the same as those of the grid vertex f3; after the grid vertex f4 is rotated rightward by the fourth central symmetry angle, the generated eighth grid vertex may be the grid vertex f12, so the texture coordinates of the grid vertex f12 may be the same as those of the grid vertex f4.
Through the above procedure, each eighth mesh vertex of the fifth local texture is generated, and the fifth local texture may share the mesh vertex f2 as one mesh vertex with the first local texture.
The local grids of the local textures generated by the above process can be combined and placed together; after combination, one position may carry one or two texture coordinates (belonging to different local textures, respectively).
Through the above process, the local grids of the other local textures are accurately and quickly generated from the local grid of the first local texture.
Referring to fig. 13, fig. 13 is a schematic diagram of an interface of texture sets according to the present application. As shown in fig. 13, for regular textures (i.e., global textures of pictures having regularity), only a partial texture (e.g., 1/2 or 1/4) of each such texture needs to exist in the texture set, which greatly reduces the data volume of the texture set.
The method provided by the application can rapidly render and display the target picture through the local texture of the target picture. By clipping symmetric textures, the application reduces the texture size severalfold and thus reduces the size of the atlas (such as the texture set where the texture is located). Without lowering the quality of the texture (so the rendered target picture is not distorted), this improves the loading speed of the in-game UI, reduces the memory (shared video memory) occupied when the UI is opened, and correspondingly reduces the size of the game package run at the front end, so the performance of running the game at the front end can be improved. In addition, the method provided by the application has good usability: it can be enabled through simple settings (such as setting the symmetry mode and symmetry parameters of the target picture, and configuring the logic that generates the local grids of the other local textures from the local grid of the first local texture of the target picture) without writing complicated code or using additional rendering plug-ins.
Further, the picture rendering method provided in the present application may be implemented in the above Unity3D. The UGUI of Unity3D has an Image component, which is a basic UI element for displaying a picture; the Image component inherits from the MaskableGraphic class (an abstract class of image processing) that handles display and rendering as well as the generation and rendering of the Mesh (grid).
The application can additionally extend (i.e., newly add) two picture types to be rendered in Unity3D: one picture type uses the bipartite symmetric mode (including bipartite axisymmetry and bipartite central symmetry), and the other uses the quarter symmetric mode (including quarter axisymmetry and quarter central symmetry). Different symmetry parameters can then be set for different types of pictures, and the corresponding rendering of each type of picture can be realized through its symmetry parameters.
In the application, a Mesh (such as the local mesh of each local texture of the target picture) can first be generated in the OnPopulateMesh method (a hook function) of Unity3D; the generated Mesh is used in the Rebuild method of Graphic, which provides the vertex positions, vertex colors, UV coordinates and triangle information (such as the mesh triangles) of the vertices (including the mesh vertices of each local texture) to the Mesh of the canvas renderer. In the OnPopulateMesh method, the vertices, vertex colors, UVs and triangle information corresponding to the other local textures can be generated from the loaded local texture (such as the first local texture) according to the type set for the picture (such as the different symmetric modes). Finally, the canvas renderer is responsible for rendering the Image (such as the target picture) through the generated Mesh, so that the complete target picture is rendered and displayed.
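The actual implementation is C# inside Unity's OnPopulateMesh, but the mesh-population step can be sketched language-neutrally in Python (the enum names and function signature are hypothetical, and the axisymmetric branches are omitted):

```python
from enum import Enum

class SymmetryMode(Enum):
    HALF_AXIS = "bipartite axisymmetric"
    HALF_CENTER = "bipartite centrosymmetric"
    QUARTER_AXIS = "quarter axisymmetric"
    QUARTER_CENTER = "quarter centrosymmetric"

def populate_mesh(first_verts, first_uvs, mode):
    """Sketch of the mesh-population step: starting from the loaded first
    local texture's vertices and UVs, emit the vertex/UV lists for every
    local texture the renderer needs. All rotations are about the origin."""
    meshes = [(first_verts, first_uvs)]
    if mode is SymmetryMode.HALF_CENTER:
        rotated = [(-x, -y) for (x, y) in first_verts]  # 180 degrees
        meshes.append((rotated, list(first_uvs)))
    elif mode is SymmetryMode.QUARTER_CENTER:
        for k in (1, 2, 3):  # 90, 180, 270 degrees clockwise
            rot = first_verts
            for _ in range(k):
                rot = [(y, -x) for (x, y) in rot]  # one 90-degree clockwise turn
            meshes.append((rot, list(first_uvs)))
    # axisymmetric modes would mirror instead of rotate (omitted here)
    return meshes
```

With a first local texture in the third quadrant (as in fig. 12a-12b) and the quarter center symmetric mode, the three rotated copies land in the second, first and fourth quadrants, all sharing the origin vertex and all reusing the first local texture's UVs.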
In the application, the above-mentioned OnPopulateMesh method in Unity3D can be overridden, and vertices, triangles and the like can be added in a custom manner (such as a custom method of generating the grid vertices and triangles of the other local textures from the grid vertices and triangles of the first local texture), so that custom UI elements with the display modes Symmetry (the bipartite symmetric mode) and Sector (the quarter symmetric mode) can be realized.
In addition, the present application may modify the native code of the Unity UGUI so that the relevant configuration of the picture (such as the configuration of the symmetry parameters) can be displayed visually on a corresponding configuration interface. The configuration interface may provide selection items for configuring the relevant parameters of the picture (such as selection items for the symmetric mode, for the symmetry method (such as axisymmetry or central symmetry), and for the specific symmetry parameters under that method (such as the symmetry axis)), as well as a preview of the effect of the cropped picture (such as a preview of the local picture to which the first local texture belongs). The configuration interface may thus display the content shown in figs. 8, 10a and 12a, making the parameter configuration process more convenient for the relevant technician.
Fig. 14 is a schematic structural diagram of a picture rendering device according to the present application. As shown in fig. 14, the picture rendering apparatus 140 may include: a first acquisition module 1401, a second acquisition module 1402, a mapping module 1403, and a rendering module 1404.
A first obtaining module 1401, configured to obtain a regularity parameter of a target picture to be rendered; the regularity parameter is used for representing the regularity of the target picture;
a second obtaining module 1402, configured to obtain a first local texture of a target picture; the global texture of the target picture is segmented into a plurality of local textures according to the regularity of the target picture, and the first local texture is any local texture in the plurality of local textures;
a mapping module 1403, configured to perform mapping processing on the local grids of the first local texture based on the regularity parameter, so as to obtain local grids of each local texture except the first local texture in the plurality of local textures;
a rendering module 1404 is configured to render the target picture based on the first local texture and the local mesh of each of the plurality of local textures.
Optionally, the regularity of the target picture includes symmetry, and the regularity parameter includes a symmetry mode of the target picture and symmetry parameters in the symmetry mode;
The symmetric mode includes any of the following: a bipartite symmetrical mode, a quarternary symmetrical mode;
if the symmetric mode is a bipartite symmetric mode, the symmetric parameters include any of the following: a bisector axisymmetric parameter, a bisector center symmetric parameter;
if the symmetry mode is a quarter symmetry mode, the symmetry parameters include any of the following: a quarter axisymmetric parameter, a quarter centrosymmetric parameter.
Optionally, the above device 140 is further configured to:
generating a local mesh of the first local texture based on the first local texture;
the local grid of the first local texture comprises grid vertexes of the first local texture, and the grid vertexes of the first local texture have corresponding texture coordinates on the first local texture.
Optionally, the symmetrical mode is a bipartite symmetrical mode, and the global texture of the target picture is segmented into a first local texture and a second local texture, and the first local texture and the second local texture are symmetrical to each other;
the mapping module 1403 performs mapping processing on the local meshes of the first local texture based on the regularity parameter to obtain a local mesh of each local texture except the first local texture in the plurality of local textures, including:
and performing binary symmetrical mapping processing on the local grids of the first local texture based on the regularity parameters to generate local grids of the second local texture.
Optionally, the symmetry parameter includes a bisector symmetry parameter, and the bisector symmetry parameter includes a target symmetry axis of the target picture; the mapping module 1403 performs a binary symmetrical mapping process on the local mesh of the first local texture based on the regularity parameter, and generates a local mesh of the second local texture, including:
performing axisymmetric processing on grid vertexes of the first local texture based on the target symmetry axis to generate first grid vertexes of the second local texture;
setting the texture coordinates of the grid vertexes of the first local texture as the texture coordinates of the first grid vertexes symmetrical to the grid vertexes of the first local texture based on the symmetry axis of the target;
a local mesh of the second local texture is generated based on the first mesh vertices of the second local texture having texture coordinates.
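The axisymmetric mapping described by these three steps can be sketched as follows (a minimal illustration, assuming a vertical target symmetry axis; the names are hypothetical):

```python
def mirror_across_vertical_axis(verts, axis_x):
    """Reflect mesh vertices across the vertical line x = axis_x."""
    return [(2.0 * axis_x - x, y) for (x, y) in verts]

# First local texture: the left half of the picture; the target symmetry
# axis is the vertical line through x = 1.
first_verts = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
first_uvs   = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]

# First mesh vertices of the second local texture: mirrored positions,
# with each mirrored vertex reusing the UV of its source vertex.
second_verts = mirror_across_vertical_axis(first_verts, axis_x=1.0)
second_uvs = list(first_uvs)
```

Because the UVs are copied rather than mirrored, sampling the first local texture at `second_uvs` reproduces the mirror image in the second half of the picture.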
Optionally, the symmetry parameter includes a bisection center symmetry parameter, and the bisection center symmetry parameter is used to indicate the associated first center symmetry angle; the mapping module 1403 performs a binary symmetrical mapping process on the local mesh of the first local texture based on the regularity parameter, and generates a local mesh of the second local texture, including:
rotating the grid vertexes of the first local texture by a first central symmetry angle to generate second grid vertexes of the second local texture; each grid vertex of the first local texture, after rotation, generates a corresponding second grid vertex;
Setting the texture coordinates of each grid vertex of the first local texture as the texture coordinates of a second grid vertex corresponding to each grid vertex of the first local texture;
a local mesh of the second local texture is generated based on the second mesh vertices of the second local texture having texture coordinates.
Optionally, the symmetrical mode is a quarter symmetrical mode, the global texture of the target picture is segmented into a first local texture, a third local texture, a fourth local texture and a fifth local texture, and the local textures are symmetrical to each other;
the mapping module 1403 performs mapping processing on the local meshes of the first local texture based on the regularity parameter to obtain a local mesh of each local texture except the first local texture in the plurality of local textures, including:
and performing quarter symmetrical mapping processing on the local grids of the first local texture based on the regularity parameters to generate a local grid of the third local texture, a local grid of the fourth local texture and a local grid of the fifth local texture.
Optionally, the symmetry parameter includes a quarter axis symmetry parameter, and the quarter axis symmetry parameter includes a first symmetry axis and a second symmetry axis of the target picture; the mapping module 1403 performs a quarter-symmetric mapping process on the local mesh of the first local texture based on the regularity parameter, and generates a local mesh of the third local texture, a local mesh of the fourth local texture, and a local mesh of the fifth local texture, including:
Symmetrically processing grid vertexes of the first local texture based on the first symmetry axis to generate third grid vertexes of the third local texture, and setting texture coordinates of the grid vertexes of the first local texture to be texture coordinates of the third grid vertexes symmetrical to the grid vertexes of the first local texture based on the first symmetry axis;
symmetrically processing grid vertexes of the first local texture based on the second symmetry axis to generate fourth grid vertexes of the fourth local texture, and setting texture coordinates of the grid vertexes of the first local texture to be texture coordinates of the fourth grid vertexes symmetrical to the grid vertexes of the first local texture based on the second symmetry axis;
symmetrically processing grid vertexes of the first local texture based on the first symmetry axis and the second symmetry axis to generate fifth grid vertexes of the fifth local texture, and setting texture coordinates of the grid vertexes of the first local texture to be texture coordinates of the fifth grid vertexes symmetrical to the grid vertexes of the first local texture based on the first symmetry axis and the second symmetry axis;
generating a local mesh of the third local texture based on the third mesh vertex of the third local texture having texture coordinates, generating a local mesh of the fourth local texture based on the fourth mesh vertex of the fourth local texture having texture coordinates, and generating a local mesh of the fifth local texture based on the fifth mesh vertex of the fifth local texture having texture coordinates.
Optionally, the symmetry parameters include a quarter-centered symmetry parameter, and the quarter-centered symmetry parameter is used to indicate the associated second, third, and fourth central symmetry angles; the mapping module 1403 performs a quarter-symmetric mapping process on the local mesh of the first local texture based on the regularity parameter, and generates a local mesh of the third local texture, a local mesh of the fourth local texture, and a local mesh of the fifth local texture, including:
rotating the grid vertexes of the first local texture by a second central symmetry angle to generate sixth grid vertexes of the third local texture, and setting the texture coordinates of the grid vertexes of the first local texture to be the texture coordinates of the sixth grid vertexes generated after the grid vertexes of the first local texture are rotated;
rotating the grid vertexes of the first local texture by a third central symmetry angle to generate seventh grid vertexes of the fourth local texture, and setting the texture coordinates of the grid vertexes of the first local texture to be the texture coordinates of the seventh grid vertexes generated after the grid vertexes of the first local texture are rotated;
rotating the grid vertexes of the first local texture by a fourth central symmetry angle to generate eighth grid vertexes of the fifth local texture, and setting texture coordinates of the grid vertexes of the first local texture to be texture coordinates of the eighth grid vertexes generated after the grid vertexes of the first local texture are rotated;
Generating a local mesh of the third local texture based on the sixth mesh vertex of the third local texture having texture coordinates, generating a local mesh of the fourth local texture based on the seventh mesh vertex of the fourth local texture having texture coordinates, and generating a local mesh of the fifth local texture based on the eighth mesh vertex of the fifth local texture having texture coordinates.
Optionally, the local mesh of each local texture in the plurality of local textures includes mesh vertices of the local texture; texture coordinates of grid vertices of each local texture in the plurality of local textures all belong to texture coordinates on the first local texture;
the rendering module 1404 renders the target picture based on the first local texture and the local mesh of each of the plurality of local textures, including:
based on the texture coordinates of the grid vertexes of each local texture in the plurality of local textures, respectively performing color sampling in the first local texture to obtain the pixel colors of the grid vertexes of each local texture in the plurality of local textures;
coloring the grid vertexes of each local texture in the plurality of local textures based on the pixel colors of the grid vertexes of each local texture in the plurality of local textures so as to render and display local pictures of each local texture in the plurality of local textures;
The target picture is composed of local pictures to which each local texture belongs in a plurality of local textures.
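The color-sampling step above can be illustrated with a tiny sketch (a hypothetical 2x2 texture and nearest-neighbour sampling stand in for the real GPU texture fetch):

```python
def sample_nearest(texture, uv):
    """Nearest-neighbour colour sample. `texture` is a row-major grid of
    RGB tuples; `uv` lies in [0, 1]^2 with (0, 0) at the bottom-left texel."""
    h, w = len(texture), len(texture[0])
    u, v = uv
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[h - 1 - y][x]  # flip v: row 0 stores the top of the image

# A hypothetical 2x2 first local texture. Grid vertices of every local
# texture carry UVs on this one texture, so all quadrants sample from it.
first_local = [
    [(255, 0, 0), (0, 255, 0)],    # top row
    [(0, 0, 255), (255, 255, 0)],  # bottom row
]
```

For example, a vertex whose texture coordinate is (0, 0) is colored with the bottom-left texel, regardless of which local texture's mesh it belongs to.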
Optionally, the global texture is a texture of a contracted picture, and the contracted picture is obtained by performing contraction treatment on a picture at the center position of the target picture;
the rendering module 1404 renders the target picture based on the first local texture and the local mesh of each of the plurality of local textures, including:
rendering a contracted picture based on the first local texture and a local mesh of each local texture of the plurality of local textures;
and stretching the picture at the central position of the rendered contracted picture to obtain a target picture.
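A one-dimensional sketch of this stretch-back step (an assumption-laden simplification: the contracted centre is treated as a single repeatable pixel, similar in spirit to nine-slice scaling):

```python
def stretch_center_row(row, target_len):
    """Stretch a pixel row back to target_len by repeating its middle pixel,
    leaving the left and right ends untouched (a 1-D sketch of re-stretching
    the contracted centre of the rendered picture)."""
    extra = target_len - len(row)
    if extra <= 0:
        return list(row)
    mid = len(row) // 2
    return row[:mid] + [row[mid]] * extra + row[mid:]
```

Applying the same operation along both axes would restore the contracted picture to the target picture's full size while keeping the detailed border regions unscaled.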
Optionally, the above device 140 is further configured to:
acquiring a target game control which is required to be rendered and displayed by referring to global textures in a game;
updating the rendering display mode of the target game control to a rendering display mode based on the first local texture and the local grids of each local texture in the plurality of local textures;
the target picture is a picture of a target game control to be rendered and displayed.
According to one embodiment of the present application, the steps involved in the picture rendering method shown in fig. 2 may be performed by respective modules in the picture rendering apparatus 140 shown in fig. 14. For example, step S101 shown in fig. 2 may be performed by the first acquisition module 1401 in fig. 14, and step S102 shown in fig. 2 may be performed by the second acquisition module 1402 in fig. 14; step S103 shown in fig. 2 may be performed by the mapping module 1403 in fig. 14; step S104 shown in fig. 2 may be performed by the rendering module 1404 in fig. 14.
The method and the device can acquire the regularity parameter of the target picture to be rendered, the regularity parameter being used to characterize the regularity of the target picture; acquire a first local texture of the target picture, where the global texture of the target picture is segmented into a plurality of local textures according to the regularity of the target picture and the first local texture is any one of the plurality of local textures; map the local grid of the first local texture based on the regularity parameter to obtain the local grid of each local texture other than the first local texture; and render the target picture based on the first local texture and the local grid of each of the plurality of local textures. Based on the above, the device provided by the application can divide the global texture of the target picture into a plurality of local textures according to the regularity of the target picture, can map out the local grids of the other local textures from the local grid of the first local texture, and can then render the target picture through the first local texture and the local grids of the local textures. Therefore, when the target picture is rendered, only one local texture of the target picture (such as the first local texture) needs to be loaded rather than its global texture, which improves the texture loading speed and hence the rendering efficiency of the target picture. In addition, since only the local texture of the picture needs to be stored, the memory pressure of storing picture textures can be reduced, realizing effective optimization of the memory.
According to an embodiment of the present application, each module in the picture rendering device 140 shown in fig. 14 may be separately or jointly combined into one or several units, or some of the units may be further split into multiple functionally smaller sub-units, which can implement the same operations without affecting the technical effects of the embodiments of the present application. The above modules are divided based on logical functions; in practical applications, the function of one module may be implemented by multiple units, or the functions of multiple modules may be implemented by one unit. In other embodiments of the present application, the picture rendering device 140 may also include other units; in practical applications, these functions may likewise be implemented with the assistance of other units and by the cooperation of multiple units.
According to one embodiment of the present application, the picture rendering apparatus 140 shown in fig. 14 may be constructed by running a computer program capable of executing the steps of the methods shown in the embodiments of the present application on a general-purpose computer device that includes processing elements such as a central processing unit (CPU) and storage elements such as a random access memory (RAM) and a read-only memory (ROM). The computer program may be recorded on, for example, a computer-readable recording medium, loaded into the above computer device through the computer-readable recording medium, and executed therein.
Referring to fig. 15, fig. 15 is a schematic structural diagram of a computer device according to the present application. As shown in fig. 15, the computer device 1000 may include: processor 1001, network interface 1004, and memory 1005, and, in some embodiments, computer device 1000 may further comprise: a user interface 1003, and at least one communication bus 1002. Wherein the communication bus 1002 is used to enable connected communication between these components. The user interface 1003 may include a Display (Display), a Keyboard (Keyboard), and the optional user interface 1003 may further include a standard wired interface, a wireless interface, among others. The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (non-volatile memory), such as at least one disk memory. The memory 1005 may also optionally be at least one storage device located remotely from the processor 1001. As shown in fig. 15, an operating system, a network communication module, a user interface module, and a device control application program may be included in the memory 1005, which is one type of computer storage medium.
In the computer device 1000 shown in FIG. 15, the network interface 1004 may provide network communication functions; while user interface 1003 is primarily used as an interface for providing input to a user; and the processor 1001 may be used to invoke a device control application stored in the memory 1005 to implement:
acquiring regularity parameters of a target picture to be rendered; the regularity parameter is used for representing the regularity of the target picture;
acquiring a first local texture of a target picture; the global texture of the target picture is segmented into a plurality of local textures according to the regularity of the target picture, and the first local texture is any local texture in the plurality of local textures;
mapping the local grids of the first local texture based on the regularity parameter to obtain local grids of all local textures except the first local texture in the plurality of local textures;
rendering the target picture based on the first local texture and the local mesh of each of the plurality of local textures.
It should be understood that the computer device 1000 described in the embodiments of the present application may perform the description of the above-mentioned image rendering method in the embodiments of the present application, and may also perform the description of the above-mentioned image rendering apparatus 140 in the embodiment corresponding to fig. 14, which is not repeated herein. In addition, the description of the beneficial effects of the same method is omitted.
Furthermore, it should be noted here that: the present application also provides a computer readable storage medium, and a computer program is stored in the computer readable storage medium, and when the processor executes the computer program, the description of the picture rendering method in each embodiment of the present application can be executed, so that a detailed description will not be given here. In addition, the description of the beneficial effects of the same method is omitted. For technical details not disclosed in the embodiments of the computer storage medium according to the present application, please refer to the description of the method embodiments of the present application.
As an example, the above-described computer program may be deployed to be executed on one computer device or on a plurality of computer devices that are located at one site, or alternatively, may be executed on a plurality of computer devices that are distributed across a plurality of sites and interconnected by a communication network, and the plurality of computer devices that are distributed across the plurality of sites and interconnected by the communication network may constitute a blockchain network.
The computer readable storage medium may be an internal storage unit of the computer device, such as a hard disk or a memory of the computer device. The computer readable storage medium may also be an external storage device of the computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash card (flash card) or the like, which are provided on the computer device. Further, the computer-readable storage medium may also include both internal storage units and external storage devices of the computer device. The computer-readable storage medium is used to store the computer program and other programs and data required by the computer device. The computer-readable storage medium may also be used to temporarily store data that has been output or is to be output.
The present application also provides a computer program product comprising a computer program stored in a computer-readable storage medium. A processor of the computer device reads the computer program from the computer-readable storage medium and executes it, so that the computer device performs the picture rendering method described in the embodiments of the present application; a detailed description is therefore not repeated here. Likewise, the description of the beneficial effects of the same method is omitted. For technical details not disclosed in the embodiments of the computer program product of the present application, please refer to the description of the method embodiments of the present application.
The terms "first", "second", and the like in the description, claims, and drawings of the embodiments of the application are used for distinguishing between different objects, not for describing a particular sequential order. Furthermore, the term "include" and any variations thereof are intended to cover a non-exclusive inclusion. For example, a process, method, apparatus, article, or device that comprises a list of steps or elements is not limited to the listed steps or elements, but may alternatively include other steps or elements that are not listed or that are inherent to such process, method, apparatus, article, or device.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented in electronic hardware, in computer software, or in a combination of the two. To clearly illustrate the interchangeability of hardware and software, the elements and steps of the examples have been described above generally in terms of their functions. Whether such functions are implemented as hardware or software depends on the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The foregoing disclosure is illustrative of the present application and is not to be construed as limiting the scope of the application, which is defined by the appended claims.

Claims (15)

1. A picture rendering method, the method comprising:
acquiring a regularity parameter of a target picture to be rendered, the regularity parameter representing the regularity of the target picture;
acquiring a first local texture of the target picture, wherein a global texture of the target picture is segmented into a plurality of local textures according to the regularity of the target picture, and the first local texture is any one of the plurality of local textures;
mapping a local mesh of the first local texture based on the regularity parameter to obtain local meshes of all local textures, other than the first local texture, among the plurality of local textures; and
rendering the target picture based on the first local texture and the local mesh of each local texture of the plurality of local textures.
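As an illustrative aid (not part of the claims, and not the patented implementation), the claimed flow can be sketched as follows. All names are hypothetical: the `(x, y, u, v)` vertex layout, the `regularity` dict, and `build_meshes` are assumptions, and only the bipartite axis-symmetric case is shown.

```python
# Sketch: derive the remaining local meshes from one stored local mesh.
# A vertex is (x, y, u, v): 2D position plus texture coordinates that
# always index into the single stored ("first") local texture.

def build_meshes(first_mesh, regularity):
    """Return local meshes for every local texture of the picture.

    regularity is an assumed dict such as
      {"mode": "bipartite", "kind": "axis", "axis_x": 0.5}
    """
    if regularity["mode"] == "bipartite" and regularity["kind"] == "axis":
        ax = regularity["axis_x"]
        # Mirror positions across x = ax; reuse (u, v) unchanged so the
        # mirrored half samples colors from the first local texture.
        mirrored = [(2 * ax - x, y, u, v) for (x, y, u, v) in first_mesh]
        return [first_mesh, mirrored]
    raise NotImplementedError("other symmetry modes omitted in this sketch")

# Left half of a unit-wide picture whose symmetry axis is x = 0.5.
first_mesh = [(0.0, 0.0, 0.0, 0.0), (0.5, 0.0, 1.0, 0.0)]
meshes = build_meshes(first_mesh,
                      {"mode": "bipartite", "kind": "axis", "axis_x": 0.5})
```

Only half of the texture data ever needs to be stored; the second mesh is pure geometry pointing back into the first local texture.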
2. The method of claim 1, wherein the regularity of the target picture comprises symmetry, and the regularity parameter comprises a symmetric mode of the target picture and symmetry parameters in the symmetric mode;
the symmetric mode comprises any one of the following: a bipartite symmetric mode and a quarter symmetric mode;
if the symmetric mode is the bipartite symmetric mode, the symmetry parameters comprise any one of the following: a bipartite axis-symmetry parameter and a bipartite center-symmetry parameter; and
if the symmetric mode is the quarter symmetric mode, the symmetry parameters comprise any one of the following: a quarter axis-symmetry parameter and a quarter center-symmetry parameter.
3. The method of claim 2, wherein the method further comprises:
generating the local mesh of the first local texture based on the first local texture,
wherein the local mesh of the first local texture comprises mesh vertices of the first local texture, and the mesh vertices of the first local texture have texture coordinates corresponding to the first local texture.
4. The method of claim 3, wherein the symmetric mode is the bipartite symmetric mode, the global texture of the target picture is segmented into the first local texture and a second local texture, and the first local texture and the second local texture are mutually symmetric;
wherein mapping the local mesh of the first local texture based on the regularity parameter to obtain the local meshes of all local textures other than the first local texture comprises:
performing bipartite symmetric mapping on the local mesh of the first local texture based on the regularity parameter to generate a local mesh of the second local texture.
5. The method of claim 4, wherein the symmetry parameters comprise the bipartite axis-symmetry parameter, and the bipartite axis-symmetry parameter comprises a target symmetry axis of the target picture; wherein performing the bipartite symmetric mapping on the local mesh of the first local texture based on the regularity parameter to generate the local mesh of the second local texture comprises:
performing axis-symmetric processing on the mesh vertices of the first local texture about the target symmetry axis to generate first mesh vertices of the second local texture;
setting the texture coordinates of each mesh vertex of the first local texture as the texture coordinates of the first mesh vertex symmetric to that mesh vertex about the target symmetry axis; and
generating the local mesh of the second local texture based on the first mesh vertices of the second local texture having texture coordinates.
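The two steps of this axis-symmetric mapping (mirror the positions, then let each mirrored vertex inherit its source vertex's texture coordinates) can be sketched as below. This is an illustrative reconstruction with assumed names and data layout, not the patented code.

```python
# Sketch of bipartite axis-symmetric mesh mapping (hypothetical layout):
# positions and uv coordinates are kept in parallel lists.

def bipartite_axis_map(positions, uvs, axis_x):
    # Step 1: axis-symmetric processing of positions about x = axis_x.
    mirrored_positions = [(2 * axis_x - x, y) for (x, y) in positions]
    # Step 2: each mirrored vertex inherits the uv of its source vertex,
    # so color sampling for the second half stays in the first texture.
    mirrored_uvs = list(uvs)
    return mirrored_positions, mirrored_uvs

pos = [(0.0, 0.0), (0.5, 1.0)]
uv = [(0.0, 0.0), (1.0, 1.0)]
mpos, muv = bipartite_axis_map(pos, uv, axis_x=0.5)
```

A vertex on the axis itself (x = 0.5) maps to itself, which keeps the two halves seam-free.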
6. The method of claim 4, wherein the symmetry parameters comprise the bipartite center-symmetry parameter, and the bipartite center-symmetry parameter indicates an associated first central symmetry angle; wherein performing the bipartite symmetric mapping on the local mesh of the first local texture based on the regularity parameter to generate the local mesh of the second local texture comprises:
rotating the mesh vertices of the first local texture by the first central symmetry angle to generate second mesh vertices of the second local texture, wherein each mesh vertex of the first local texture, after rotation, is used to generate a corresponding second mesh vertex;
setting the texture coordinates of each mesh vertex of the first local texture as the texture coordinates of the second mesh vertex corresponding to that mesh vertex; and
generating the local mesh of the second local texture based on the second mesh vertices of the second local texture having texture coordinates.
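For the center-symmetric case, the mapping is a rotation of vertex positions about the picture center while the texture coordinates are again inherited. A minimal sketch, assuming a 180° first central symmetry angle and hypothetical names:

```python
import math

def rotate_about(p, center, angle_deg):
    """Rotate 2D point p about center by angle_deg (counterclockwise)."""
    ang = math.radians(angle_deg)
    c, s = math.cos(ang), math.sin(ang)
    dx, dy = p[0] - center[0], p[1] - center[1]
    return (center[0] + c * dx - s * dy, center[1] + s * dx + c * dy)

def bipartite_center_map(positions, uvs, center, angle_deg=180.0):
    # Rotate positions by the central symmetry angle; uvs are inherited
    # unchanged so the rotated half samples the first local texture.
    rotated = [rotate_about(p, center, angle_deg) for p in positions]
    return rotated, list(uvs)

rot, ruv = bipartite_center_map([(0.0, 0.0)], [(0.0, 0.0)], center=(0.5, 0.5))
```

Rotating the corner (0, 0) by 180° about (0.5, 0.5) lands (up to floating-point error) on the opposite corner (1, 1).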
7. The method of claim 3, wherein the symmetric mode is the quarter symmetric mode, the global texture of the target picture is segmented into the first local texture, a third local texture, a fourth local texture, and a fifth local texture, and the respective local textures are mutually symmetric;
wherein mapping the local mesh of the first local texture based on the regularity parameter to obtain the local meshes of all local textures other than the first local texture comprises:
performing quarter symmetric mapping on the local mesh of the first local texture based on the regularity parameter to generate a local mesh of the third local texture, a local mesh of the fourth local texture, and a local mesh of the fifth local texture.
8. The method of claim 7, wherein the symmetry parameters comprise the quarter axis-symmetry parameter, and the quarter axis-symmetry parameter comprises a first symmetry axis and a second symmetry axis of the target picture; wherein performing the quarter symmetric mapping on the local mesh of the first local texture based on the regularity parameter to generate the local mesh of the third local texture, the local mesh of the fourth local texture, and the local mesh of the fifth local texture comprises:
symmetrically processing the mesh vertices of the first local texture about the first symmetry axis to generate third mesh vertices of the third local texture, and setting the texture coordinates of each mesh vertex of the first local texture as the texture coordinates of the third mesh vertex symmetric to that mesh vertex about the first symmetry axis;
symmetrically processing the mesh vertices of the first local texture about the second symmetry axis to generate fourth mesh vertices of the fourth local texture, and setting the texture coordinates of each mesh vertex of the first local texture as the texture coordinates of the fourth mesh vertex symmetric to that mesh vertex about the second symmetry axis;
symmetrically processing the mesh vertices of the first local texture about both the first symmetry axis and the second symmetry axis to generate fifth mesh vertices of the fifth local texture, and setting the texture coordinates of each mesh vertex of the first local texture as the texture coordinates of the fifth mesh vertex symmetric to that mesh vertex about the first symmetry axis and the second symmetry axis; and
generating the local mesh of the third local texture based on the third mesh vertices having texture coordinates, generating the local mesh of the fourth local texture based on the fourth mesh vertices having texture coordinates, and generating the local mesh of the fifth local texture based on the fifth mesh vertices having texture coordinates.
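The three mirrorings of the quarter axis-symmetric case (across the vertical axis, the horizontal axis, and both) can be sketched as below, assuming axes of the form x = ax_x and y = ax_y and an illustrative parallel-list layout; none of the names come from the patent.

```python
def quarter_axis_map(positions, uvs, ax_x, ax_y):
    """From one quadrant's mesh, derive the other three by mirroring.

    Mirrors across x = ax_x (-> third texture), y = ax_y (-> fourth),
    and both axes (-> fifth). uv coordinates are inherited unchanged
    in all three copies, so every quadrant samples the first texture.
    """
    m3 = [(2 * ax_x - x, y) for (x, y) in positions]
    m4 = [(x, 2 * ax_y - y) for (x, y) in positions]
    m5 = [(2 * ax_x - x, 2 * ax_y - y) for (x, y) in positions]
    return (m3, list(uvs)), (m4, list(uvs)), (m5, list(uvs))

# One vertex of the top-left quadrant of a unit picture, axes at 0.5.
(m3, _), (m4, _), (m5, _) = quarter_axis_map([(0.25, 0.25)], [(0.5, 0.5)],
                                             ax_x=0.5, ax_y=0.5)
```

The fifth mesh is equivalently the composition of the first two mirrorings, which is why the claim describes it as symmetric about both axes.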
9. The method of claim 7, wherein the symmetry parameters comprise the quarter center-symmetry parameter, and the quarter center-symmetry parameter indicates an associated second central symmetry angle, third central symmetry angle, and fourth central symmetry angle; wherein performing the quarter symmetric mapping on the local mesh of the first local texture based on the regularity parameter to generate the local mesh of the third local texture, the local mesh of the fourth local texture, and the local mesh of the fifth local texture comprises:
rotating the mesh vertices of the first local texture by the second central symmetry angle to generate sixth mesh vertices of the third local texture, and setting the texture coordinates of each mesh vertex of the first local texture as the texture coordinates of the sixth mesh vertex generated after that mesh vertex is rotated;
rotating the mesh vertices of the first local texture by the third central symmetry angle to generate seventh mesh vertices of the fourth local texture, and setting the texture coordinates of each mesh vertex of the first local texture as the texture coordinates of the seventh mesh vertex generated after that mesh vertex is rotated;
rotating the mesh vertices of the first local texture by the fourth central symmetry angle to generate eighth mesh vertices of the fifth local texture, and setting the texture coordinates of each mesh vertex of the first local texture as the texture coordinates of the eighth mesh vertex generated after that mesh vertex is rotated; and
generating the local mesh of the third local texture based on the sixth mesh vertices having texture coordinates, generating the local mesh of the fourth local texture based on the seventh mesh vertices having texture coordinates, and generating the local mesh of the fifth local texture based on the eighth mesh vertices having texture coordinates.
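For the quarter center-symmetric case, three rotated copies of the first quadrant's mesh are produced, one per central symmetry angle. A sketch assuming the angles are 90°, 180°, and 270° (the patent only says the angles are indicated by the parameter) and using complex multiplication for the rotation:

```python
import cmath
import math

def rotate(p, center, angle_deg):
    """Rotate 2D point p about center by angle_deg via complex math."""
    z = complex(*p) - complex(*center)
    z *= cmath.exp(1j * math.radians(angle_deg))
    z += complex(*center)
    return (z.real, z.imag)

def quarter_center_map(positions, uvs, center, angles=(90.0, 180.0, 270.0)):
    # One rotated mesh per central symmetry angle; uvs are inherited
    # unchanged so all four quadrants sample the first local texture.
    return [([rotate(p, center, a) for p in positions], list(uvs))
            for a in angles]

meshes = quarter_center_map([(1.0, 0.0)], [(0.0, 0.0)], center=(0.0, 0.0))
```

Rotating (1, 0) about the origin gives, up to floating-point error, (0, 1) at 90°, (-1, 0) at 180°, and (0, -1) at 270°.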
10. The method of any of claims 1-9, wherein the local mesh of each local texture of the plurality of local textures comprises mesh vertices of that local texture, and the texture coordinates of the mesh vertices of each local texture all belong to texture coordinates on the first local texture;
wherein rendering the target picture based on the first local texture and the local mesh of each local texture of the plurality of local textures comprises:
performing color sampling in the first local texture based on the texture coordinates of the mesh vertices of each local texture to obtain pixel colors of the mesh vertices of each local texture;
coloring the mesh vertices of each local texture based on the pixel colors of those mesh vertices, so as to render and display a local picture of each local texture,
wherein the target picture is composed of the local pictures to which the local textures of the plurality of local textures belong.
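The key point of this claim is that every mesh vertex, whichever local texture it belongs to, samples its color from the single stored first local texture. A minimal CPU stand-in for that sampling step (a real renderer would typically sample bilinearly on the GPU; `sample_nearest` and the row-major grid layout are assumptions):

```python
def sample_nearest(texture, u, v):
    """Nearest-neighbour sample from a row-major grid of pixel colors.

    texture: list of rows, each a list of colors; (u, v) in [0, 1],
    with v = 0 at the first row.
    """
    h, w = len(texture), len(texture[0])
    x = min(w - 1, int(u * w))
    y = min(h - 1, int(v * h))
    return texture[y][x]

# A 2x2 "first local texture": mesh vertices of every local texture
# carry uv coordinates into this one texture, so all halves/quadrants
# are colored from the same pixel data.
tex = [["red", "green"],
       ["blue", "white"]]
color = sample_nearest(tex, 0.9, 0.1)   # samples the top-right pixel
```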
11. The method of claim 1, wherein the global texture is a texture of a contracted picture, the contracted picture being obtained by contracting the target picture at its center position;
wherein rendering the target picture based on the first local texture and the local mesh of each local texture of the plurality of local textures comprises:
rendering the contracted picture based on the first local texture and the local mesh of each local texture of the plurality of local textures; and
stretching the rendered picture at the center position of the contracted picture to obtain the target picture.
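One way to picture this claim, along one axis, is a nine-slice-style stretch: the borders keep their size while the central band is widened back to the target width. This is a hedged stand-in under that assumption, not the patent's exact construction; all names and the band layout are illustrative.

```python
def stretch_center(x, contracted_mid, target_mid, left):
    """Map an x coordinate of the contracted picture to the target picture.

    Layout assumed: [0, left] border + middle band + right border.
    The middle band widens from contracted_mid to target_mid while
    both borders keep their size (one-axis nine-slice-style stretch).
    """
    if x <= left:
        return x                                  # left border unchanged
    if x <= left + contracted_mid:                # stretch the middle band
        t = (x - left) / contracted_mid
        return left + t * target_mid
    return x + (target_mid - contracted_mid)      # shift the right border

# Contracted picture: border width 1, middle width 1; target middle 4.
xs = [0.0, 1.0, 1.5, 2.0, 3.0]
stretched = [stretch_center(x, contracted_mid=1.0, target_mid=4.0, left=1.0)
             for x in xs]
```

Storing the contracted picture's texture keeps the texture small; only the cheap geometric stretch restores the target size.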
12. The method of claim 1, wherein the method further comprises:
acquiring a target game control that is to be rendered and displayed in a game with reference to the global texture; and
updating the rendering and display mode of the target game control to a rendering and display mode based on the first local texture and the local meshes of the local textures of the plurality of local textures,
wherein the target picture is a picture of the target game control to be rendered and displayed.
13. A picture rendering apparatus, the apparatus comprising:
a first acquisition module, configured to acquire a regularity parameter of a target picture to be rendered, the regularity parameter representing the regularity of the target picture;
a second acquisition module, configured to acquire a first local texture of the target picture, wherein a global texture of the target picture is segmented into a plurality of local textures according to the regularity of the target picture, and the first local texture is any one of the plurality of local textures;
a mapping module, configured to map a local mesh of the first local texture based on the regularity parameter to obtain local meshes of all local textures, other than the first local texture, among the plurality of local textures; and
a rendering module, configured to render the target picture based on the first local texture and the local mesh of each local texture of the plurality of local textures.
14. A computer device comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the method of any of claims 1-12.
15. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program adapted to be loaded by a processor and to perform the method of any of claims 1-12.
CN202310885913.5A 2023-07-19 2023-07-19 Picture rendering method, device, equipment and medium Active CN116597063B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310885913.5A CN116597063B (en) 2023-07-19 2023-07-19 Picture rendering method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN116597063A 2023-08-15
CN116597063B 2023-12-05

Family

ID=87606709

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310885913.5A Active CN116597063B (en) 2023-07-19 2023-07-19 Picture rendering method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN116597063B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050059253A * 2005-04-12 2005-06-17 Nokia Corporation Three dimensional image processing
US20060214931A1 (en) * 2005-03-22 2006-09-28 Microsoft Corporation Local, deformable precomputed radiance transfer
CN108765539A * 2018-05-24 2018-11-06 武汉斗鱼网络科技有限公司 Image rendering method, device, equipment and storage medium based on OpenGL ES
CN110097525A (en) * 2019-04-23 2019-08-06 厦门美图之家科技有限公司 A kind of image rendering method, device and calculate equipment
CN112206519A (en) * 2020-10-28 2021-01-12 网易(杭州)网络有限公司 Method, device, storage medium and computer equipment for realizing game scene environment change
CN112700525A (en) * 2019-10-21 2021-04-23 华为技术有限公司 Image processing method and electronic equipment
US20210312167A1 (en) * 2018-12-18 2021-10-07 Gree, Inc. Server device, terminal device, and display method for controlling facial expressions of a virtual character
CN114677469A (en) * 2022-03-25 2022-06-28 北京字跳网络技术有限公司 Method and device for rendering target image, electronic equipment and storage medium
CN115187711A (en) * 2022-07-18 2022-10-14 亿咖通(湖北)技术有限公司 Rendering method and device of road model, processing equipment and storage medium
CN115908685A (en) * 2022-11-16 2023-04-04 北京字跳网络技术有限公司 Scene rendering method, device, equipment and storage medium
CN115888085A (en) * 2022-11-29 2023-04-04 网易(杭州)网络有限公司 Game information processing method, device and storage medium
CN115937461A (en) * 2022-11-16 2023-04-07 泰瑞数创科技(北京)股份有限公司 Multi-source fusion model construction and texture generation method, device, medium and equipment
CN116188720A (en) * 2022-11-28 2023-05-30 厦门黑镜科技有限公司 Digital person generation method, device, electronic equipment and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117557710A (en) * 2024-01-12 2024-02-13 深圳市其域创新科技有限公司 Texture rendering method and device, terminal equipment and storage medium
CN117557710B (en) * 2024-01-12 2024-05-03 深圳市其域创新科技有限公司 Texture rendering method and device, terminal equipment and storage medium

Similar Documents

Publication Publication Date Title
CN107680042B (en) Rendering method, device, engine and storage medium combining texture and convolution network
CN107358649B (en) Processing method and device of terrain file
CN116597063B (en) Picture rendering method, device, equipment and medium
CN110990516A (en) Map data processing method and device and server
KR101975049B1 (en) Method and apparatus for setting background of ui control, and terminal
CN113628331B (en) Data organization and scheduling method for photogrammetry model in illusion engine
US11488347B2 (en) Method for instant rendering of voxels
CN109448123B (en) Model control method and device, storage medium and electronic equipment
CN115222806B (en) Polygon processing method, device, equipment and computer readable storage medium
CN112132941B (en) Text rendering method, device, equipment and storage medium
US10733782B2 (en) Graphics processing systems
WO2023197762A1 (en) Image rendering method and apparatus, electronic device, computer-readable storage medium, and computer program product
US20230053462A1 (en) Image rendering method and apparatus, device, medium, and computer program product
CN113256764A (en) Rasterization device and method and computer storage medium
CN108986034B (en) Raster data coordinate conversion method, system, terminal equipment and storage medium
CN114429513A (en) Method and device for determining visible element, storage medium and electronic equipment
CN112188087B (en) Panoramic video screenshot method and device, storage medium and computer equipment
CN117611703A (en) Barrage character rendering method, barrage character rendering device, barrage character rendering equipment, storage medium and program product
CN112149745B (en) Method, device, equipment and storage medium for determining difficult example sample
CN110038302B (en) Unity 3D-based grid generation method and device
CN109816761B (en) Graph conversion method, graph conversion device, storage medium and electronic equipment
CN112465692A (en) Image processing method, device, equipment and storage medium
US11869123B2 (en) Anti-aliasing two-dimensional vector graphics using a compressed vertex buffer
CN115222867A (en) Overlap detection method, overlap detection device, electronic equipment and storage medium
CN111681317B (en) Data processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40091131

Country of ref document: HK

GR01 Patent grant