CN112419137A - Method and device for displaying mask picture and method and device for displaying mask picture - Google Patents


Info

Publication number
CN112419137A
Authority
CN
China
Prior art keywords: mask, mesh, picture, vertex, coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010749875.7A
Other languages
Chinese (zh)
Inventor
储勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Hode Information Technology Co Ltd
Original Assignee
Shanghai Hode Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Hode Information Technology Co Ltd filed Critical Shanghai Hode Information Technology Co Ltd
Priority to CN202010749875.7A
Publication of CN112419137A
Legal status: Pending

Classifications

    • G - PHYSICS
        • G06 - COMPUTING; CALCULATING OR COUNTING
            • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 7/00 - Image analysis
                    • G06T 7/40 - Analysis of texture
                    • G06T 7/60 - Analysis of geometric attributes
                        • G06T 7/62 - Analysis of geometric attributes of area, perimeter, diameter or volume
                • G06T 1/00 - General purpose image data processing
                    • G06T 1/20 - Processor architectures; Processor configuration, e.g. pipelining
                • G06T 11/00 - 2D [Two Dimensional] image generation
                    • G06T 11/001 - Texturing; Colouring; Generation of texture or colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Image Generation (AREA)

Abstract

The application discloses a method and a device for displaying a mask picture, and belongs to the technical field of picture processing. The method comprises the following steps: receiving a mask instruction triggered by a user based on an original picture, wherein the mask instruction carries mask parameter information; acquiring first attribute information of a first grid corresponding to the original picture according to the mask instruction; generating second attribute information of a second grid corresponding to the mask picture according to the first attribute information and the mask parameter information; and acquiring texture corresponding to the original picture, and rendering the second grid according to the texture and the second attribute information so as to display the mask picture. The method and the device can reduce the amount of computation the GPU performs when rendering the picture.

Description

Method and device for displaying mask picture and method and device for displaying mask picture
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for displaying a mask picture.
Background
The current built-in GUI system of Unity (a game engine) provides the RectMask2D component to support rectangular masks when implementing a mask for an original picture. However, when Unity uses the RectMask2D component to implement a rectangular mask, the RectMask2D component needs to establish a parent-child relationship structure (GameObject) and take the masked object (the original picture) as the child level, and the rectangular mask is then added to the original picture by adjusting the sizes and positions of the parent level and the child level. Because the size and position of the parent level and the child level of the rectangular mask need to be adjusted simultaneously in this existing way of realizing a rectangular mask through a parent-child relationship structure, a Graphics Processing Unit (GPU) needs to perform calculation on both the original picture and the mask picture in the parent and child levels, so the calculation is complex and the calculation amount is large.
Disclosure of Invention
In view of the above, a method, an apparatus, a computer device and a computer readable storage medium for displaying a mask picture are provided to solve the problem that, in the conventional mask picture display method, the GPU calculation is complex and the computation workload is large.
The application provides a mask picture display method, which comprises the following steps:
receiving a mask instruction triggered by a user based on an original picture, wherein the mask instruction carries mask parameter information;
acquiring first attribute information of a first grid corresponding to the original picture according to the mask instruction;
generating second attribute information of a second grid corresponding to the mask picture according to the first attribute information and the mask parameter information;
and acquiring texture corresponding to the original picture, and rendering the second grid according to the texture and the second attribute information so as to display the mask picture.
Optionally, the mask picture display method further includes:
and displaying a mask parameter setting interface for a user to set the mask parameter information based on the parameter setting interface.
Optionally, the first attribute information includes vertex coordinates of a plurality of vertices of the first mesh and texture coordinates corresponding to the plurality of vertices of the first mesh, the second attribute information includes vertex coordinates of a plurality of vertices of the second mesh and texture coordinates corresponding to a plurality of vertices of the second mesh, and the generating second attribute information of the second mesh corresponding to the mask picture according to the first attribute information and the mask parameter information includes:
generating vertex coordinates of a plurality of vertexes of a second grid corresponding to the mask picture according to the vertex coordinates of the plurality of vertexes of the first grid and the mask parameter information;
and generating texture coordinates corresponding to a plurality of vertexes of a second grid corresponding to the mask picture according to the texture coordinates corresponding to the plurality of vertexes of the first grid and the mask parameter information.
Optionally, the mask parameter information includes a ratio of a top mask of the original picture, a ratio of a bottom mask of the original picture, a ratio of a left mask of the original picture, and a ratio of a right mask of the original picture, and the generating vertex coordinates of a plurality of vertices of a second mesh corresponding to the mask picture according to the vertex coordinates of a plurality of vertices of the first mesh and the mask parameter information includes:
determining vertex coordinates of a plurality of vertices of the second mesh according to the vertex coordinates of the plurality of vertices of the first mesh, the proportion of the upper edge mask, the proportion of the lower edge mask, the proportion of the left edge mask, and the proportion of the right edge mask;
generating texture coordinates corresponding to a plurality of vertexes of a second mesh corresponding to a mask picture according to the texture coordinates corresponding to the plurality of vertexes of the first mesh and the mask parameter information includes:
and determining texture coordinates corresponding to a plurality of vertexes of the second grid according to the texture coordinates corresponding to the plurality of vertexes of the first grid, the proportion of the upper side mask, the proportion of the lower side mask, the proportion of the left side mask and the proportion of the right side mask.
Optionally, the mask parameter information includes a ratio of a top mask of the original picture, a ratio of a bottom mask of the original picture, a ratio of a left mask of the original picture, a ratio of a right mask of the original picture, a tilt value of a horizontal direction of the mask, and a tilt value of a vertical direction of the mask, and generating vertex coordinates of a plurality of vertices of a second mesh corresponding to the mask picture according to the vertex coordinates of the plurality of vertices of the first mesh and the mask parameter information includes:
determining vertex coordinates of a plurality of vertexes of the second mesh according to the vertex coordinates of the plurality of vertexes of the first mesh, the proportion of the upper edge mask, the proportion of the lower edge mask, the proportion of the left edge mask, the proportion of the right edge mask, the inclination value of the mask in the horizontal direction and the inclination value of the mask in the vertical direction;
generating texture coordinates corresponding to a plurality of vertexes of a second mesh corresponding to a mask picture according to the texture coordinates corresponding to the plurality of vertexes of the first mesh and the mask parameter information includes:
and determining texture coordinates corresponding to a plurality of vertexes of the second grid according to texture coordinates corresponding to the plurality of vertexes of the first grid, the proportion of the upper side mask, the proportion of the lower side mask, the proportion of the left side mask, the proportion of the right side mask, the inclination value of the mask in the horizontal direction and the inclination value of the mask in the vertical direction.
Optionally, the mask parameter information includes a radius of a rounded rectangle and a smoothness of the rounded rectangle, and the generating, according to the vertex coordinates of the multiple vertices of the first mesh and the mask parameter information, the vertex coordinates of the multiple vertices of the second mesh corresponding to the mask picture includes:
determining vertex coordinates of a plurality of vertices of the second mesh according to the vertex coordinates of the plurality of vertices of the first mesh, the radius of the rounded rectangle, and the smoothness of the rounded rectangle;
generating texture coordinates corresponding to a plurality of vertexes of a second mesh corresponding to a mask picture according to the texture coordinates corresponding to the plurality of vertexes of the first mesh and the mask parameter information includes:
and determining texture coordinates corresponding to a plurality of vertexes of the second mesh according to the texture coordinates corresponding to the plurality of vertexes of the first mesh, the radius of the rounded rectangle and the smoothness of the rounded rectangle.
Optionally, before the step of determining vertex coordinates of a plurality of vertices of the second mesh according to the vertex coordinates of the plurality of vertices of the first mesh, the radius of the rounded rectangle, and the smoothness of the rounded rectangle, the method further includes:
and determining the number of the vertexes of the second mesh according to the smoothness of the rounded rectangles and a preset formula.
The present application also provides a mask picture display apparatus, including:
the receiving module is used for receiving a mask instruction triggered by a user based on an original picture, wherein the mask instruction carries mask parameter information;
the obtaining module is used for obtaining first attribute information of a first grid corresponding to the original picture according to the mask instruction;
the generating module is used for generating second attribute information of a second grid corresponding to the mask picture according to the first attribute information and the mask parameter information;
and the rendering module is used for acquiring the texture corresponding to the original picture and rendering the second grid according to the texture and the second attribute information so as to display the mask picture.
The present application further provides a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the above method when executing the computer program.
The beneficial effects of the above technical scheme are that:
the method comprises the steps that a mask instruction triggered by a user based on an original picture is received, wherein the mask instruction carries mask parameter information; acquiring first attribute information of a first grid corresponding to the original picture according to the mask instruction; generating second attribute information of a second grid corresponding to the mask picture according to the first attribute information and the mask parameter information; and acquiring texture corresponding to the original picture, and rendering the second grid according to the texture and the second attribute information so as to display the mask picture. According to the embodiment of the invention, the second attribute information of the second grid corresponding to the mask picture obtained after the mask processing is determined according to the mask parameter information set by the user and the first attribute information of the first grid corresponding to the original picture, and then the mask picture is generated according to the second attribute information of the second grid and the texture corresponding to the original picture, so that the calculation processing on all texture data of the original picture and the mask picture is not needed, the calculation amount can be saved, and the display speed can be improved.
Drawings
Fig. 1 is a schematic diagram of the system framework of an embodiment of a mask picture display method according to the present application;
fig. 2 is a flowchart illustrating an embodiment of a method for displaying a mask image according to the present application;
FIG. 3 is a diagram illustrating a mask parameter setting interface according to an embodiment of the present disclosure;
fig. 4 is a flowchart illustrating a detailed process of generating second attribute information of a second mesh corresponding to a mask picture according to the first attribute information and the mask parameter information in an embodiment of the present application;
FIG. 5 is a schematic diagram of an original picture in an embodiment of the present application;
fig. 6 is a schematic diagram of a mask picture obtained by performing rectangular mask processing on an original picture according to an embodiment of the present application;
fig. 7 is a schematic diagram of a mask picture obtained by performing parallelogram mask processing on an original picture according to an embodiment of the present application;
fig. 8 is a schematic diagram of a mask picture obtained by performing rounded-rectangle mask processing on an original picture according to an embodiment of the present application;
fig. 9 is a block diagram of an embodiment of a masking image display device according to the present application;
fig. 10 is a schematic diagram of the hardware structure of a computer device for executing a mask picture display method according to an embodiment of the present application.
Detailed Description
The advantages of the present application are further illustrated below with reference to the accompanying drawings and specific embodiments.
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining", depending on the context.
In the description of the present application, it should be understood that the numerical references before the steps do not identify the order of performing the steps, but merely serve to facilitate the description of the present application and to distinguish each step, and therefore should not be construed as limiting the present application.
Fig. 1 schematically illustrates an application environment of a mask picture display method according to an embodiment of the present application. In an exemplary embodiment, the system of the application environment may include a user terminal 10 and a background server 20. The user terminal 10 and the background server 20 form a wireless or wired connection, and the user terminal 10 has a corresponding application client or web page client. The user terminal 10 may be a PC, a mobile phone, an iPad, a tablet computer, a notebook computer, a personal digital assistant, or the like. The background server 20 may be a rack server, a blade server, a tower server, or a cabinet server (including an independent server or a server cluster composed of a plurality of servers).
Fig. 2 is a schematic flow chart illustrating a method for displaying a mask image according to an embodiment of the present application. It is to be understood that the flow charts in the embodiments of the present method are not intended to limit the order in which the steps are performed. As can be seen from the figure, the method for displaying a mask picture provided in the embodiment includes:
and step S20, receiving a mask instruction triggered by a user based on the original picture, wherein the mask instruction carries mask parameter information.
In the prior art, when displaying an original picture, Unity implements the display in a mesh (Mesh) manner. Specifically, when displaying an original picture, Unity creates a GameObject and adds a MeshFilter component and a MeshRenderer component to it, then obtains the mesh (Mesh) corresponding to the original picture through the MeshFilter component, and passes the obtained mesh to the MeshRenderer component, so that the original picture is rendered by the MeshRenderer component.
Here, Unity is a client engine for creating 3D games. The Mesh is the mesh corresponding to the original picture, and contains vertex coordinates, normals, texture coordinates, triangle drawing sequences and other useful attribute and function information. The GameObject is a basic structure built into Unity. The MeshRenderer component is the component that renders the mesh, and the MeshFilter component is used to obtain the mesh corresponding to the original picture and hand it to the MeshRenderer, so that the original picture is drawn and displayed.
In this embodiment, after Unity displays the original picture through the mesh, the user may trigger a mask instruction based on the displayed original picture, where the mask instruction is an instruction for cropping the original picture to keep part of the picture content visible.
In this embodiment, Unity crops the original picture according to the mask parameter information. Referring to fig. 3, the mask parameter information may include the following parameter information: the ratio of the mask on the top of the original picture (Mask Top), the ratio of the mask on the bottom of the original picture (Mask Bottom), the ratio of the mask on the left side of the original picture (Mask Left), the ratio of the mask on the right side of the original picture (Mask Right), the tilt value of the mask in the horizontal direction (Lean Horizontal), the tilt value of the mask in the vertical direction (Lean Vertical), the radius of the rounded rectangle (Radius), the smoothness of the rounded rectangle (Smoothening), the offset position of the original picture within the mask (Scroll X, Scroll Y), and the like.
It should be noted that the ratio of the top mask of the original picture refers to the proportion by which the top of the original picture is clipped; the ratio of the bottom mask of the original picture refers to the proportion by which the bottom of the original picture is clipped; the ratio of the left mask of the original picture refers to the proportion by which the left side of the original picture is clipped; the ratio of the right mask of the original picture refers to the proportion by which the right side of the original picture is clipped; the tilt value of the mask in the horizontal direction refers to the tilt value used to tilt the mask picture to the left/right; the tilt value of the mask in the vertical direction refers to the tilt value used to tilt the mask picture up/down; the radius of the rounded rectangle refers to the radius of the rounded corners when rounded-rectangle mask processing is performed on the original picture; the smoothness of the rounded rectangle is a parameter that determines how many triangles each rounded-corner region is divided into when rounded-rectangle mask processing is performed on the original picture; for example, if the smoothness of the rounded rectangle is n, each rounded-corner region is divided into n triangles.
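For illustration only, the sketch below groups these parameters into a plain C# data holder as they might appear in a Unity script; the class and field names are hypothetical, and the ratios are assumed to be fractions of a picture that spans the unit square.

```csharp
// Hypothetical container for the mask parameter information described above.
// All names are illustrative; ratios are fractions of the picture's
// width/height, assuming the picture spans [0,1] x [0,1].
[System.Serializable]
public class MaskParams
{
    public float maskTop;        // fraction clipped from the top edge
    public float maskBottom;     // fraction clipped from the bottom edge
    public float maskLeft;       // fraction clipped from the left edge
    public float maskRight;      // fraction clipped from the right edge
    public float leanHorizontal; // tilt of the mask in the horizontal direction
    public float leanVertical;   // tilt of the mask in the vertical direction
    public float radius;         // corner radius for a rounded-rectangle mask
    public int smoothness;       // triangles per rounded-corner region
    public float scrollX;        // horizontal offset of the picture in the mask
    public float scrollY;        // vertical offset of the picture in the mask
}
```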
In an embodiment, when the user triggers the mask instruction, the user may trigger the mask instruction by clicking a preset control. It can be understood that the user needs to set the mask parameter information before triggering the mask instruction.
In an exemplary embodiment, in order to facilitate the setting of the mask parameter information, the mask picture display method further includes:
and displaying a mask parameter setting interface for a user to set the mask parameter information based on the parameter setting interface.
Specifically, when the user masks the original picture, a visual mask parameter setting interface may be displayed on the screen, and the user may set various mask parameter information for performing mask processing on the original picture through the mask parameter setting interface, in an exemplary embodiment, the mask parameter setting interface is as shown in fig. 3.
According to the embodiment, the mask parameter information can be conveniently set by a user through the display of the mask parameter setting interface.
Step S21, obtaining first attribute information of a first grid corresponding to the original picture according to the mask instruction.
Specifically, the first attribute information includes vertex coordinates (vertex), normal (normal), texture coordinates (uv), triangle sequence (triangle), and the like.
The spatial coordinates of each vertex of the first mesh can be stored in a vertex coordinate array; because the first mesh has 4 vertices, the size of the vertex array is 4. The normal of each vertex of the first mesh is stored in a normal array whose size corresponds to that of the vertex array, i.e. normal[i] is the normal of vertex[i]. The texture coordinates (UV) define how each point on the picture maps onto the 3D model corresponding to the original picture, that is, the exact correspondence between each point on the original picture and the surface of the model object; uv[i] corresponds to vertex[i]. The triangle sequence (triangle) is an int array containing a list of triangles indexed into the vertex array. In this embodiment, the first mesh is composed of 2 triangles, and the three points of each triangle are points from the vertex coordinates. For example, the first mesh has four vertices 0, 1, 2, 3 with coordinates V0(1,1,0), V1(-1,1,0), V2(1,-1,0), V3(-1,-1,0), so these four vertices can form a triangle sequence containing two triangles: tri[0] = {ver[0], ver[3], ver[1]} and tri[1] = {ver[0], ver[2], ver[3]}.
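As a concrete illustration of these mesh attributes, the following Unity C# sketch builds the quad described above (4 vertices, 2 triangles) and hands it to a MeshFilter; the UV values and the component setup are assumptions made for the example and are not taken from the patent text.

```csharp
using UnityEngine;

// Sketch: build the quad mesh described above (4 vertices, 2 triangles)
// and hand it to a MeshFilter so that a MeshRenderer can draw it.
public class QuadMeshExample : MonoBehaviour
{
    void Start()
    {
        var mesh = new Mesh();

        // vertex: spatial coordinates of the 4 vertices (as in the example above).
        mesh.vertices = new[]
        {
            new Vector3( 1f,  1f, 0f), // V0
            new Vector3(-1f,  1f, 0f), // V1
            new Vector3( 1f, -1f, 0f), // V2
            new Vector3(-1f, -1f, 0f), // V3
        };

        // uv: texture coordinates, uv[i] corresponds to vertices[i]
        // (this particular mapping is an assumption for the example).
        mesh.uv = new[]
        {
            new Vector2(1f, 1f), new Vector2(0f, 1f),
            new Vector2(1f, 0f), new Vector2(0f, 0f),
        };

        // triangles: two triangles indexed into the vertex array,
        // matching tri[0] = {0, 3, 1} and tri[1] = {0, 2, 3}.
        mesh.triangles = new[] { 0, 3, 1, 0, 2, 3 };

        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```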
Step S22, generating second attribute information of a second mesh corresponding to the mask picture according to the first attribute information and the mask parameter information.
Specifically, the mask picture is a picture obtained by performing mask processing on an original picture. In this embodiment, when the mask picture is displayed, the mask picture is also displayed in a mesh manner, and therefore, in order to display the mask picture, the second attribute information of the second mesh corresponding to the mask picture is determined according to the first attribute information of the mesh corresponding to the original picture and the mask parameter information.
In this embodiment, the data type included in the second attribute information of the second mesh is completely identical to the data type included in the first attribute information of the first mesh. That is, the second attribute information also includes vertex coordinates (vertex), normal (normal), texture coordinates (uv), triangle sequence (triangle), and the like.
In an exemplary embodiment, referring to fig. 4, the first attribute information may include vertex coordinates of a plurality of vertices of the first mesh and texture coordinates corresponding to the plurality of vertices of the first mesh, the second attribute information includes vertex coordinates of a plurality of vertices of the second mesh and texture coordinates corresponding to a plurality of vertices of the second mesh, and the generating the second attribute information of the second mesh corresponding to the mask picture according to the first attribute information and the mask parameter information includes:
step S40, generating vertex coordinates of multiple vertices of the second mesh corresponding to the mask picture according to the vertex coordinates of the multiple vertices of the first mesh and the mask parameter information.
Specifically, after vertex coordinates of each vertex forming the first mesh are obtained, the vertex coordinates of the multiple vertices of the second mesh corresponding to the mask picture may be recalculated by using a mathematical rule according to mask parameter information set by the user and the vertex coordinates of the first mesh.
Step S41, generating texture coordinates corresponding to the plurality of vertices of the second mesh corresponding to the mask picture according to the texture coordinates corresponding to the plurality of vertices of the first mesh and the mask parameter information.
Specifically, after texture coordinates of each vertex forming the first mesh are obtained, the texture coordinates of the multiple vertices of the second mesh corresponding to the mask picture may be recalculated by using a mathematical rule according to mask parameter information set by the user and the texture coordinates of each vertex of the first mesh.
It can be understood that, different masking processes are performed on the original picture, and the masking parameter information that the user needs to set is different, in an exemplary embodiment, when rectangular masking is required to be performed on an original picture, a user may set a ratio of a top mask of the original picture, a ratio of a bottom mask of the original picture, a ratio of a left mask of the original picture and a ratio of a right mask of the original picture in a mask parameter setting interface, i.e. the mask parameter information comprises the proportion of the top mask of the original picture, the proportion of the bottom mask of the original picture, the proportion of the left mask of the original picture and the proportion of the right mask of the original picture, generating vertex coordinates of a plurality of vertices of a second mesh corresponding to a mask picture according to the vertex coordinates of the plurality of vertices of the first mesh and the mask parameter information comprises:
determining vertex coordinates of a plurality of vertices of the second mesh according to the vertex coordinates of the plurality of vertices of the first mesh, the proportion of the upper edge mask, the proportion of the lower edge mask, the proportion of the left edge mask, and the proportion of the right edge mask.
Generating texture coordinates corresponding to a plurality of vertexes of a second mesh corresponding to a mask picture according to the texture coordinates corresponding to the plurality of vertexes of the first mesh and the mask parameter information includes:
and determining texture coordinates corresponding to a plurality of vertexes of the second grid according to the texture coordinates corresponding to the plurality of vertexes of the first grid, the proportion of the upper side mask, the proportion of the lower side mask, the proportion of the left side mask and the proportion of the right side mask.
Specifically, referring to fig. 5, assuming that the width and height of the original picture are both 1, the ratio of the upper side mask is T, the ratio of the lower side mask is B, the ratio of the left side mask is L, and the ratio of the right side mask is R.
Because the original picture is displayed in a mesh manner, when the original picture is masked, the coordinates of the four vertices included in the first mesh corresponding to the original picture are obtained as follows: the vertex coordinates of the upper left corner are (0,1), the vertex coordinates of the upper right corner are (1,1), the vertex coordinates of the lower left corner are (0,0), and the vertex coordinates of the lower right corner are (1,0). The texture coordinates of the four vertices included in the first mesh corresponding to the original picture are as follows: the texture coordinates of the upper left vertex are (0,1), the texture coordinates of the upper right vertex are (1,1), the texture coordinates of the lower left vertex are (0,0), and the texture coordinates of the lower right vertex are (1,0).
After vertex coordinates of a plurality of vertexes of the first grid, a proportion T of the upper side mask, a proportion B of the lower side mask, a proportion L of the left side mask and a proportion R of the right side mask are obtained, according to a mathematical law, coordinates of four vertexes contained in a second grid corresponding to a mask picture obtained after rectangular mask processing is performed on the original picture are respectively: the vertex coordinates of the upper left corner are (L,1-T), the vertex coordinates of the upper right corner are (1-R,1-T), the vertex coordinates of the lower left corner are (L, B), and the vertex coordinates of the lower right corner are (1-R, B).
Similarly, after obtaining texture coordinates of multiple vertexes of the first mesh, the proportion T of the upper mask, the proportion B of the lower mask, the proportion L of the left mask, and the proportion R of the right mask, it can be known according to a mathematical law that texture coordinates of four vertexes included in a second mesh corresponding to a mask picture obtained by performing rectangular mask processing on the original picture are respectively: the texture coordinate of the vertex at the upper left corner is (L,1-T), the texture coordinate of the vertex at the upper right corner is (1-R,1-T), the texture coordinate of the vertex at the lower left corner is (L, B), and the texture coordinate of the vertex at the lower right corner is (1-R, B). For example, a mask picture obtained by performing rectangular mask processing on an original picture is shown in fig. 6.
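The rectangular-mask coordinates derived above can be computed in a few lines of code; the sketch below assumes the picture spans the unit square, and the class and method names are illustrative rather than part of the patent.

```csharp
using UnityEngine;

// Sketch: the 4 corner coordinates of the second mesh for a rectangular mask,
// given the clip ratios T (top), B (bottom), L (left), R (right) and assuming
// the original picture spans [0,1] x [0,1].
public static class RectMaskMath
{
    public static Vector2[] MaskedCorners(float T, float B, float L, float R)
    {
        return new[]
        {
            new Vector2(L,      1f - T), // upper left
            new Vector2(1f - R, 1f - T), // upper right
            new Vector2(L,      B),      // lower left
            new Vector2(1f - R, B),      // lower right
        };
    }
}
// In the rectangular case the texture coordinates equal the vertex
// coordinates, so the same array can be used for both.
```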
In another exemplary embodiment, when the original picture needs to be parallelogram-masked, the user can set the proportion of the upper mask of the original picture, the proportion of the lower mask of the original picture, the proportion of the left mask of the original picture, the proportion of the right mask of the original picture, the inclination value of the horizontal direction of the mask and the inclination value of the vertical direction of the mask in the mask parameter setting interface, namely the mask parameter information comprises the proportion of the upper mask of the original picture, the proportion of the lower mask of the original picture, the proportion of the left mask of the original picture, the proportion of the right mask of the original picture, the inclination value of the horizontal direction of the mask and the inclination value of the vertical direction of the mask, generating vertex coordinates of a plurality of vertices of a second mesh corresponding to the mask picture according to the vertex coordinates of the plurality of vertices of the first mesh and the mask parameter information comprises:
determining vertex coordinates of a plurality of vertexes of the second mesh according to the vertex coordinates of the plurality of vertexes of the first mesh, the proportion of the upper edge mask, the proportion of the lower edge mask, the proportion of the left edge mask, the proportion of the right edge mask, the inclination value of the mask in the horizontal direction and the inclination value of the mask in the vertical direction;
generating texture coordinates corresponding to a plurality of vertexes of a second mesh corresponding to a mask picture according to the texture coordinates corresponding to the plurality of vertexes of the first mesh and the mask parameter information includes:
and determining texture coordinates corresponding to a plurality of vertexes of the second grid according to texture coordinates corresponding to the plurality of vertexes of the first grid, the proportion of the upper side mask, the proportion of the lower side mask, the proportion of the left side mask, the proportion of the right side mask, the inclination value of the mask in the horizontal direction and the inclination value of the mask in the vertical direction.
Specifically, referring to fig. 5, assuming that the width and height of the original picture are both 1, the ratio of the upper side mask is T, the ratio of the lower side mask is B, the ratio of the left side mask is L, and the ratio of the right side mask is R, the inclination value of the mask in the horizontal direction is H, and the inclination value of the mask in the vertical direction is V.
The parallelogram masking processing of the original picture can be regarded as that the picture obtained after the rectangle masking processing of the original picture is subjected to the inclination processing of the horizontal direction of the mask and the inclination processing of the vertical direction of the mask.
After the vertex coordinates of the plurality of vertices of the first mesh, the proportion T of the upper side mask, the proportion B of the lower side mask, the proportion L of the left side mask, the proportion R of the right side mask, the tilt value H of the mask in the horizontal direction and the tilt value V of the mask in the vertical direction are obtained, it can be known according to a mathematical law that the coordinates of the four vertices included in the second mesh corresponding to the mask picture obtained after parallelogram mask processing is performed on the original picture are respectively: the vertex coordinates of the upper left corner are (L+H, 1-T-V), the vertex coordinates of the upper right corner are (1-R+H, 1-T+V), the vertex coordinates of the lower left corner are (L-H, B-V), and the vertex coordinates of the lower right corner are (1-R-H, B+V).
Similarly, after the texture coordinates of the plurality of vertices of the first mesh, the proportion T of the upper side mask, the proportion B of the lower side mask, the proportion L of the left side mask, the proportion R of the right side mask, the tilt value H of the mask in the horizontal direction and the tilt value V of the mask in the vertical direction are obtained, it can be known according to a mathematical law that the texture coordinates of the four vertices included in the second mesh corresponding to the mask picture obtained after parallelogram mask processing is performed on the original picture are respectively: the texture coordinates of the upper left vertex are (L+H, 1-T-V), the texture coordinates of the upper right vertex are (1-R+H, 1-T+V), the texture coordinates of the lower left vertex are (L-H, B-V), and the texture coordinates of the lower right vertex are (1-R-H, B+V). For example, a mask picture obtained by performing parallelogram mask processing on an original picture is shown in fig. 7.
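The same corner computation with the horizontal tilt H and vertical tilt V applied is sketched below; again the names are illustrative and the picture is assumed to span the unit square.

```csharp
using UnityEngine;

// Sketch: corners of the second mesh for a parallelogram mask, obtained by
// adding the horizontal tilt H to the top/bottom edges and the vertical
// tilt V to the left/right edges of the rectangular-mask result.
public static class ParallelogramMaskMath
{
    public static Vector2[] MaskedCorners(float T, float B, float L, float R,
                                          float H, float V)
    {
        return new[]
        {
            new Vector2(L + H,      1f - T - V), // upper left
            new Vector2(1f - R + H, 1f - T + V), // upper right
            new Vector2(L - H,      B - V),      // lower left
            new Vector2(1f - R - H, B + V),      // lower right
        };
    }
}
```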
In another exemplary embodiment, when the rounded corner mask processing is required to be performed on the original picture, a user may set a radius of a rounded rectangle and a smoothness of the rounded corner rectangle in a mask parameter setting interface, that is, the mask parameter information includes the radius of the rounded corner rectangle and the smoothness of the rounded corner rectangle, and then the generating the vertex coordinates of the multiple vertices of the second mesh corresponding to the mask picture according to the vertex coordinates of the multiple vertices of the first mesh and the mask parameter information includes:
determining vertex coordinates of a plurality of vertices of the second mesh according to the vertex coordinates of the plurality of vertices of the first mesh, the radius of the rounded rectangle, and the smoothness of the rounded rectangle;
generating texture coordinates corresponding to a plurality of vertexes of a second mesh corresponding to a mask picture according to the texture coordinates corresponding to the plurality of vertexes of the first mesh and the mask parameter information includes:
and determining texture coordinates corresponding to a plurality of vertexes of the second mesh according to the texture coordinates corresponding to the plurality of vertexes of the first mesh, the radius of the rounded rectangle and the smoothness of the rounded rectangle.
Specifically, assume that the width and height of the original picture are both 1, the radius of the rounded rectangle is 0.2, and the smoothness of the rounded rectangle is 4. When rounded-rectangle mask processing is performed on the original picture, the original picture may be regarded as being divided into three rectangles in the middle and rounded-corner regions at the 4 corners, as shown in fig. 8. In the prior art, Unity draws a rounded rectangle by dividing it into a plurality of triangles of the same size, so in this embodiment, when rounded-rectangle mask processing is performed on the original picture, the original picture may be decomposed into three rectangles in the middle and 4 rounded-corner regions each composed of a plurality of triangles.
In addition, in this embodiment, since the number of triangles into which each rounded-corner region is divided is determined by the smoothness of the rounded rectangle, after the smoothness of the rounded rectangle is obtained, the number of vertices of the second mesh needs to be determined according to the smoothness of the rounded rectangle and a preset formula. The preset formula is: the number of vertices of the second mesh obtained after rounded-rectangle mask processing is performed on the original picture = 4 × (smoothness of the rounded rectangle + 2). That is, if the number of triangles into which each rounded-corner region is divided is 4, the number of vertices of the second mesh obtained after rounded-rectangle mask processing is performed on the original picture is (4 + 2) × 4 = 24, and the included angle of each triangle is α = 90°/4 = 22.5°.
For example, after the vertex coordinates of the plurality of vertices of the first mesh are obtained, with the radius of the rounded rectangle being 0.2 and the smoothness of the rounded rectangle being 4, it can be known according to a mathematical law that the vertex coordinates of the 24 vertices included in the second mesh corresponding to the mask picture obtained by performing rounded-rectangle mask processing on the original picture are respectively:
the coordinates of the four vertexes of the middle rectangle are respectively: the vertex coordinates of the upper left corner are (0.2,1), the vertex coordinates of the upper right corner are (0.8,1), the vertex coordinates of the lower left corner are (0.2,0), and the vertex coordinates of the lower right corner are (0.8, 0).
The coordinates of the four vertices of the left rectangle are: the vertex coordinates of the upper left corner are (0,0.8), the vertex coordinates of the upper right corner are (0.2,0.8), the vertex coordinates of the lower left corner are (0,0.2), and the vertex coordinates of the lower right corner are (0.2,0.2).
The coordinates of the four vertices of the right rectangle are: the vertex coordinates of the upper left corner are (0.8,0.8), the vertex coordinates of the upper right corner are (1,0.8), the vertex coordinates of the lower left corner are (0.8,0.2), and the vertex coordinates of the lower right corner are (1,0.2).
The coordinates of the three vertices of the upper left rounded-corner region whose coordinates have not yet been given (vertex 1, vertex 2 and vertex 3) are, in order, (0.2 - 0.2cos α, 0.8 + 0.2sin α), (0.2 - 0.2cos 2α, 0.8 + 0.2sin 2α) and (0.2 - 0.2cos 3α, 0.8 + 0.2sin 3α).
The coordinates of the three vertices of the upper right rounded-corner region whose coordinates have not yet been given (vertex 4, vertex 5 and vertex 6) are, in order, (0.8 + 0.2cos α, 0.8 + 0.2sin α), (0.8 + 0.2cos 2α, 0.8 + 0.2sin 2α) and (0.8 + 0.2cos 3α, 0.8 + 0.2sin 3α).
The coordinates of the three vertices of the lower left rounded-corner region whose coordinates have not yet been given (vertex 7, vertex 8 and vertex 9) are, in order, (0.2 - 0.2cos α, 0.2 - 0.2sin α), (0.2 - 0.2cos 2α, 0.2 - 0.2sin 2α) and (0.2 - 0.2cos 3α, 0.2 - 0.2sin 3α).
The coordinates of the three vertices of the lower right rounded-corner region whose coordinates have not yet been given (vertex 10, vertex 11 and vertex 12) are, in order, (0.8 + 0.2cos α, 0.2 - 0.2sin α), (0.8 + 0.2cos 2α, 0.2 - 0.2sin 2α) and (0.8 + 0.2cos 3α, 0.2 - 0.2sin 3α).
Similarly, after the texture coordinates of the plurality of vertices of the first mesh are obtained, with the radius of the rounded rectangle being 0.2 and the smoothness of the rounded rectangle being 4, it can be known according to a mathematical law that the texture coordinates of the 24 vertices included in the second mesh corresponding to the mask picture obtained by performing rounded-rectangle mask processing on the original picture are respectively:
the texture coordinates of the four vertices of the middle rectangle are: the texture coordinates of the top left corner vertex are (0.2,1), the texture coordinates of the top right corner vertex are (0.8,1), the texture coordinates of the bottom left corner vertex are (0.2,0), and the texture coordinates of the bottom right corner vertex are (0.8, 0).
The texture coordinates of the four vertices of the left rectangle are: the texture coordinates of the top left vertex are (0,0.8), the texture coordinates of the top right vertex are (0.2,0.8), the texture coordinates of the bottom left vertex are (0,0.2), and the texture coordinates of the bottom right vertex are (0.2,0.2).
The texture coordinates of the four vertices of the right rectangle are: the texture coordinates of the top left vertex are (0.8,0.8), the texture coordinates of the top right vertex are (1,0.8), the texture coordinates of the bottom left vertex are (0.8,0.2), and the texture coordinates of the bottom right vertex are (1,0.2).
The texture coordinates of the three vertices of the upper left rounded-corner region whose coordinates have not yet been given (vertex 1, vertex 2 and vertex 3) are, in order, (0.2 - 0.2cos α, 0.8 + 0.2sin α), (0.2 - 0.2cos 2α, 0.8 + 0.2sin 2α) and (0.2 - 0.2cos 3α, 0.8 + 0.2sin 3α).
The texture coordinates of the three vertices of the upper right rounded-corner region whose coordinates have not yet been given (vertex 4, vertex 5 and vertex 6) are, in order, (0.8 + 0.2cos α, 0.8 + 0.2sin α), (0.8 + 0.2cos 2α, 0.8 + 0.2sin 2α) and (0.8 + 0.2cos 3α, 0.8 + 0.2sin 3α).
The texture coordinates of the three vertices of the lower left rounded-corner region whose coordinates have not yet been given (vertex 7, vertex 8 and vertex 9) are, in order, (0.2 - 0.2cos α, 0.2 - 0.2sin α), (0.2 - 0.2cos 2α, 0.2 - 0.2sin 2α) and (0.2 - 0.2cos 3α, 0.2 - 0.2sin 3α).
The texture coordinates of the three vertices of the lower right rounded-corner region whose coordinates have not yet been given (vertex 10, vertex 11 and vertex 12) are, in order, (0.8 + 0.2cos α, 0.2 - 0.2sin α), (0.8 + 0.2cos 2α, 0.2 - 0.2sin 2α) and (0.8 + 0.2cos 3α, 0.2 - 0.2sin 3α).
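The arc vertices enumerated above follow a regular pattern (angle steps of α = 90°/smoothness around each corner centre), so they can be generated in a loop. The sketch below reproduces that pattern for a unit-square picture; the class, method, and variable names are illustrative.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: the intermediate arc vertices of the four rounded-corner regions
// for a unit-square picture with corner radius r and the given smoothness
// (number of triangles per corner). Total mesh vertices: 4 * (smoothness + 2),
// i.e. 12 rectangle vertices plus (smoothness - 1) arc vertices per corner.
public static class RoundedRectMaskMath
{
    public static List<Vector2> CornerArcVertices(float r, int smoothness)
    {
        float alpha = 90f / smoothness;               // e.g. 22.5 degrees for smoothness 4
        var corners = new[]                           // corner centres and arc directions
        {
            (cx: r,      cy: 1f - r, sx: -1f, sy: +1f), // upper left
            (cx: 1f - r, cy: 1f - r, sx: +1f, sy: +1f), // upper right
            (cx: r,      cy: r,      sx: -1f, sy: -1f), // lower left
            (cx: 1f - r, cy: r,      sx: +1f, sy: -1f), // lower right
        };

        var points = new List<Vector2>();
        foreach (var c in corners)
        {
            for (int k = 1; k < smoothness; k++)      // intermediate arc vertices only
            {
                float a = alpha * k * Mathf.Deg2Rad;
                points.Add(new Vector2(c.cx + c.sx * r * Mathf.Cos(a),
                                       c.cy + c.sy * r * Mathf.Sin(a)));
            }
        }
        return points;
    }
}
```

With r = 0.2 and smoothness = 4 this yields the twelve corner vertices listed above, e.g. (0.2 - 0.2cos α, 0.8 + 0.2sin α) for the first vertex of the upper left corner.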
In this embodiment, when mask processing is performed on an original picture, the vertex coordinates and texture coordinates of the second mesh corresponding to the mask picture may be calculated according to the mask parameter information set by the user and the vertex coordinates and texture coordinates of each vertex of the first mesh, so that the second mesh may subsequently be rendered according to the vertex coordinates and texture coordinates of the second mesh to obtain the mask picture, and calculation does not need to be performed on all texture data of the original picture, thereby saving computation and improving the display speed.
Step S23, obtaining a texture corresponding to the original picture, and rendering the second mesh according to the texture and the second attribute information to display the mask picture.
Specifically, the texture is a format converted from a picture, and is a format that a GPU (Graphics Processing Unit) can process.
The second attribute information includes attribute information such as vertex coordinates and texture coordinates of each vertex constituting the second mesh, and a triangle sequence (triangle) composed of each vertex.
Therefore, after obtaining the attribute information, a second mesh may be created according to the attribute information, and then a texture region corresponding to the original picture is sampled according to texture coordinates in the second mesh, and the second mesh is rendered according to the sampled texture, so as to display the mask picture.
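A minimal Unity-style sketch of this final step is shown below, assuming the second mesh's vertex coordinates, texture coordinates, and triangle sequence have already been computed; the component layout and names are assumptions made for illustration.

```csharp
using UnityEngine;

// Sketch: build the second mesh from the computed attribute information and
// render it with the texture of the original picture. The masking is achieved
// purely by the vertex and texture coordinates of the second mesh, so the
// clipped-away regions of the original picture are never processed.
public class MaskMeshRenderer : MonoBehaviour
{
    public Texture2D originalTexture;   // texture converted from the original picture

    public void Show(Vector3[] vertices, Vector2[] uvs, int[] triangles)
    {
        var mesh = new Mesh();
        mesh.vertices = vertices;
        mesh.uv = uvs;              // samples the matching region of the texture
        mesh.triangles = triangles;
        mesh.RecalculateNormals();

        GetComponent<MeshFilter>().mesh = mesh;
        GetComponent<MeshRenderer>().material.mainTexture = originalTexture;
    }
}
```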
According to the embodiment of the invention, the second attribute information of the second grid corresponding to the mask picture obtained after the mask processing is determined according to the mask parameter information set by the user and the first attribute information of the first grid corresponding to the original picture, and then the mask picture is generated according to the second attribute information of the second grid and the texture corresponding to the original picture, so that the calculation processing on all texture data of the original picture and the mask picture is not needed, the calculation amount can be saved, and the display speed can be improved.
Fig. 9 is a block diagram of a mask image display apparatus 900 according to an embodiment of the present invention.
In this embodiment, the masking image display apparatus 900 includes a series of computer program instructions stored in a memory, and when the computer program instructions are executed by a processor, the masking image display function of the embodiments of the present application can be realized. In some embodiments, the mask picture display device 900 may be divided into one or more modules based on the particular operations implemented by the portions of the computer program instructions. For example, in fig. 9, the mask picture display device 900 may be divided into a receiving module 901, an obtaining module 902, a generating module 903, and a rendering module 904. Wherein:
the receiving module 901 is configured to receive a mask instruction triggered by a user based on an original picture, where the mask instruction carries mask parameter information.
In the prior art, when displaying an original picture, Unity implements the display in a mesh (Mesh) manner. Specifically, when displaying an original picture, Unity creates a GameObject and adds a MeshFilter component and a MeshRenderer component to it, then obtains the mesh (Mesh) corresponding to the original picture through the MeshFilter component, and passes the obtained mesh to the MeshRenderer component, so that the original picture is rendered by the MeshRenderer component.
Here, Unity is a client engine for creating 3D games. The Mesh is the mesh corresponding to the original picture, and contains vertex coordinates, normals, texture coordinates, triangle drawing sequences and other useful attribute and function information. The GameObject is a basic structure built into Unity. The MeshRenderer component is the component that renders the mesh, and the MeshFilter component is used to obtain the mesh corresponding to the original picture and hand it to the MeshRenderer, so that the original picture is drawn and displayed.
In this embodiment, after Unity displays the original picture through the mesh, the user may trigger a mask instruction based on the displayed original picture, where the mask instruction is an instruction for cropping the original picture to keep part of the picture content visible.
In this embodiment, Unity performs cropping on the original picture according to the mask parameter information. Wherein the mask parameter information may include the following parameter information: the method comprises the following steps of obtaining the ratio of a mask on the upper side of an original picture, the ratio of a mask on the lower side of the original picture, the ratio of a mask on the left side of the original picture, the ratio of a mask on the right side of the original picture, a tilt value of the horizontal direction of the mask, a tilt value of the vertical direction of the mask, the radius of a rounded rectangle, the smoothness of the rounded rectangle, the offset position of the original picture in the mask and the like.
It should be noted that the ratio of the top mask of the original picture refers to the proportion by which the top of the original picture is clipped; the ratio of the bottom mask of the original picture refers to the proportion by which the bottom of the original picture is clipped; the ratio of the left mask of the original picture refers to the proportion by which the left side of the original picture is clipped; the ratio of the right mask of the original picture refers to the proportion by which the right side of the original picture is clipped; the tilt value of the mask in the horizontal direction refers to the tilt value used to tilt the mask picture to the left/right; the tilt value of the mask in the vertical direction refers to the tilt value used to tilt the mask picture up/down; the radius of the rounded rectangle refers to the radius of the rounded corners when rounded-rectangle mask processing is performed on the original picture; the smoothness of the rounded rectangle is a parameter that determines how many triangles each rounded-corner region is divided into when rounded-rectangle mask processing is performed on the original picture; for example, if the smoothness of the rounded rectangle is n, each rounded-corner region is divided into n triangles.
In an embodiment, when the user triggers the mask instruction, the user may trigger the mask instruction by clicking a preset control. It can be understood that the user needs to set the mask parameter information before triggering the mask instruction.
In an exemplary embodiment, in order to facilitate the setting of the mask parameter information, the mask picture display device 900 further includes: a setting module.
The setting module is used for displaying a mask parameter setting interface so that a user can set the mask parameter information based on the parameter setting interface.
Specifically, when the user masks the original picture, a visual mask parameter setting interface may be displayed on the screen, and the user may set various mask parameter information for performing mask processing on the original picture through the mask parameter setting interface, in an exemplary embodiment, the mask parameter setting interface is as shown in fig. 3.
According to the embodiment, the mask parameter information can be conveniently set by a user through the display of the mask parameter setting interface.
An obtaining module 902, configured to obtain, according to the mask instruction, first attribute information of a first grid corresponding to the original picture.
Specifically, the first attribute information includes vertex coordinates (vertex), normal (normal), texture coordinates (uv), triangle sequence (triangle), and the like.
The spatial coordinates of each vertex of the first mesh can be stored in a vertex coordinate array; because the first mesh has 4 vertices, the size of the vertex array is 4. The normal of each vertex of the first mesh is stored in a normal array whose size corresponds to that of the vertex array, i.e. normal[i] is the normal of vertex[i]. The texture coordinates (UV) define how each point on the picture maps onto the 3D model corresponding to the original picture, that is, the exact correspondence between each point on the original picture and the surface of the model object; uv[i] corresponds to vertex[i]. The triangle sequence (triangle) is an int array containing a list of triangles indexed into the vertex array. In this embodiment, the first mesh is composed of 2 triangles, and the three points of each triangle are points from the vertex coordinates. For example, the first mesh has four vertices 0, 1, 2, 3 with coordinates V0(1,1,0), V1(-1,1,0), V2(1,-1,0), V3(-1,-1,0), so these four vertices can form a triangle sequence containing two triangles: tri[0] = {ver[0], ver[3], ver[1]} and tri[1] = {ver[0], ver[2], ver[3]}.
A generating module 903, configured to generate second attribute information of a second grid corresponding to the mask picture according to the first attribute information and the mask parameter information.
Specifically, the mask picture is a picture obtained by performing mask processing on the original picture. In this embodiment, the mask picture is also displayed in a mesh manner; therefore, in order to display the mask picture, the second attribute information of the second mesh corresponding to the mask picture is determined according to the first attribute information of the first mesh corresponding to the original picture and the mask parameter information.
In this embodiment, the data type included in the second attribute information of the second mesh is completely identical to the data type included in the first attribute information of the first mesh. That is, the second attribute information also includes vertex coordinates (vertex), normal (normal), texture coordinates (uv), triangle sequence (triangle), and the like.
In an exemplary embodiment, referring to fig. 4, the first attribute information may include vertex coordinates of a plurality of vertices of the first mesh and texture coordinates corresponding to the plurality of vertices of the first mesh, and the second attribute information includes vertex coordinates of a plurality of vertices of the second mesh and texture coordinates corresponding to the plurality of vertices of the second mesh. The generating module 903 is further configured to generate the vertex coordinates of the plurality of vertices of the second mesh corresponding to the mask picture according to the vertex coordinates of the plurality of vertices of the first mesh and the mask parameter information.
Specifically, after vertex coordinates of each vertex forming the first mesh are obtained, the vertex coordinates of the multiple vertices of the second mesh corresponding to the mask picture may be recalculated by using a mathematical rule according to mask parameter information set by the user and the vertex coordinates of the first mesh.
The generating module 903 is further configured to generate the texture coordinates corresponding to the plurality of vertices of the second mesh corresponding to the mask picture according to the texture coordinates corresponding to the plurality of vertices of the first mesh and the mask parameter information.
Specifically, after texture coordinates of each vertex forming the first mesh are obtained, the texture coordinates of the multiple vertices of the second mesh corresponding to the mask picture may be recalculated by using a mathematical rule according to mask parameter information set by the user and the texture coordinates of each vertex of the first mesh.
It can be understood that different mask processing performed on the original picture requires the user to set different mask parameter information. In an exemplary embodiment, when rectangular mask processing needs to be performed on the original picture, the user may set, in the mask parameter setting interface, the proportion of the upper mask of the original picture, the proportion of the lower mask of the original picture, the proportion of the left mask of the original picture and the proportion of the right mask of the original picture; that is, the mask parameter information includes the proportion of the upper mask of the original picture, the proportion of the lower mask of the original picture, the proportion of the left mask of the original picture and the proportion of the right mask of the original picture. In this case, generating the vertex coordinates of the plurality of vertices of the second mesh corresponding to the mask picture according to the vertex coordinates of the plurality of vertices of the first mesh and the mask parameter information includes:
determining vertex coordinates of a plurality of vertices of the second mesh according to the vertex coordinates of the plurality of vertices of the first mesh, the proportion of the upper edge mask, the proportion of the lower edge mask, the proportion of the left edge mask, and the proportion of the right edge mask.
Generating texture coordinates corresponding to a plurality of vertexes of a second mesh corresponding to a mask picture according to the texture coordinates corresponding to the plurality of vertexes of the first mesh and the mask parameter information includes:
and determining texture coordinates corresponding to a plurality of vertexes of the second grid according to the texture coordinates corresponding to the plurality of vertexes of the first grid, the proportion of the upper side mask, the proportion of the lower side mask, the proportion of the left side mask and the proportion of the right side mask.
Specifically, referring to fig. 5, assuming that the width and height of the original picture are both 1, the ratio of the upper side mask is T, the ratio of the lower side mask is B, the ratio of the left side mask is L, and the ratio of the right side mask is R.
Because the original picture is displayed in a mesh manner, when the original picture is masked, the coordinates of the four vertices included in the first mesh corresponding to the original picture are obtained as follows: the vertex coordinates of the upper left corner are (0,1), the vertex coordinates of the upper right corner are (1,1), the vertex coordinates of the lower left corner are (0,0), and the vertex coordinates of the lower right corner are (1,0). The texture coordinates of the four vertices included in the first mesh corresponding to the original picture are: the texture coordinates of the vertex at the upper left corner are (0,1), the texture coordinates of the vertex at the upper right corner are (1,1), the texture coordinates of the vertex at the lower left corner are (0,0), and the texture coordinates of the vertex at the lower right corner are (1,0).
After vertex coordinates of a plurality of vertexes of the first grid, a proportion T of the upper side mask, a proportion B of the lower side mask, a proportion L of the left side mask and a proportion R of the right side mask are obtained, according to a mathematical law, coordinates of four vertexes contained in a second grid corresponding to a mask picture obtained after rectangular mask processing is performed on the original picture are respectively: the vertex coordinates of the upper left corner are (L,1-T), the vertex coordinates of the upper right corner are (1-R,1-T), the vertex coordinates of the lower left corner are (L, B), and the vertex coordinates of the lower right corner are (1-R, B).
Similarly, after obtaining texture coordinates of multiple vertexes of the first mesh, the proportion T of the upper mask, the proportion B of the lower mask, the proportion L of the left mask, and the proportion R of the right mask, it can be known according to a mathematical law that texture coordinates of four vertexes included in a second mesh corresponding to a mask picture obtained by performing rectangular mask processing on the original picture are respectively: the texture coordinate of the vertex at the upper left corner is (L,1-T), the texture coordinate of the vertex at the upper right corner is (1-R,1-T), the texture coordinate of the vertex at the lower left corner is (L, B), and the texture coordinate of the vertex at the lower right corner is (1-R, B). For example, a mask picture obtained by performing rectangular mask processing on an original picture is shown in fig. 6.
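As a concrete illustration of the formulas above, the following Python sketch computes the four corners of the second mesh for a rectangular mask from the proportions T, B, L and R, with the width and height of the picture normalized to 1. The function name rect_mask_corners and the sample values are assumptions made for this example only.

```python
# Illustrative only: rectangular-mask corner formulas from the text above.
def rect_mask_corners(T, B, L, R):
    """Return ([vertex coords], [texture coords]) for the rectangular-mask second mesh."""
    corners = [
        (L, 1 - T),      # upper left
        (1 - R, 1 - T),  # upper right
        (L, B),          # lower left
        (1 - R, B),      # lower right
    ]
    # With width/height normalized to 1, the texture coordinates equal the vertex coordinates.
    return corners, list(corners)

vertices, uvs = rect_mask_corners(T=0.1, B=0.1, L=0.2, R=0.2)
print(vertices)  # [(0.2, 0.9), (0.8, 0.9), (0.2, 0.1), (0.8, 0.1)]
```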
In another exemplary embodiment, when parallelogram mask processing needs to be performed on the original picture, the user may set, in the mask parameter setting interface, the proportion of the upper mask of the original picture, the proportion of the lower mask of the original picture, the proportion of the left mask of the original picture, the proportion of the right mask of the original picture, the inclination value of the mask in the horizontal direction and the inclination value of the mask in the vertical direction; that is, the mask parameter information includes the proportion of the upper mask of the original picture, the proportion of the lower mask of the original picture, the proportion of the left mask of the original picture, the proportion of the right mask of the original picture, the inclination value of the mask in the horizontal direction and the inclination value of the mask in the vertical direction. In this case, generating the vertex coordinates of the plurality of vertices of the second mesh corresponding to the mask picture according to the vertex coordinates of the plurality of vertices of the first mesh and the mask parameter information includes:
determining vertex coordinates of a plurality of vertexes of the second mesh according to the vertex coordinates of the plurality of vertexes of the first mesh, the proportion of the upper edge mask, the proportion of the lower edge mask, the proportion of the left edge mask, the proportion of the right edge mask, the inclination value of the mask in the horizontal direction and the inclination value of the mask in the vertical direction;
generating texture coordinates corresponding to a plurality of vertexes of a second mesh corresponding to a mask picture according to the texture coordinates corresponding to the plurality of vertexes of the first mesh and the mask parameter information includes:
and determining texture coordinates corresponding to a plurality of vertexes of the second grid according to texture coordinates corresponding to the plurality of vertexes of the first grid, the proportion of the upper side mask, the proportion of the lower side mask, the proportion of the left side mask, the proportion of the right side mask, the inclination value of the mask in the horizontal direction and the inclination value of the mask in the vertical direction.
Specifically, referring to fig. 5, assume that the width and height of the original picture are both 1, the ratio of the upper side mask is T, the ratio of the lower side mask is B, the ratio of the left side mask is L, the ratio of the right side mask is R, the inclination value of the mask in the horizontal direction is H, and the inclination value of the mask in the vertical direction is V.
Parallelogram mask processing of the original picture can be regarded as taking the picture obtained after rectangular mask processing of the original picture and then applying to it the inclination of the mask in the horizontal direction and the inclination of the mask in the vertical direction.
After the vertex coordinates of the plurality of vertices of the first mesh, the proportion T of the upper mask, the proportion B of the lower mask, the proportion L of the left mask, the proportion R of the right mask, the inclination value H of the mask in the horizontal direction and the inclination value V of the mask in the vertical direction are obtained, it can be known according to a mathematical law that the coordinates of the four vertices included in the second mesh corresponding to the mask picture obtained after parallelogram mask processing is performed on the original picture are respectively: the vertex coordinates of the upper left corner are (L+H, 1-T-V), the vertex coordinates of the upper right corner are (1-R+H, 1-T+V), the vertex coordinates of the lower left corner are (L-H, B-V), and the vertex coordinates of the lower right corner are (1-R-H, B+V).
Similarly, after the texture coordinates of the plurality of vertices of the first mesh, the proportion T of the upper mask, the proportion B of the lower mask, the proportion L of the left mask, the proportion R of the right mask, the inclination value H of the mask in the horizontal direction and the inclination value V of the mask in the vertical direction are obtained, it can be known according to a mathematical law that the texture coordinates of the four vertices included in the second mesh corresponding to the mask picture obtained after parallelogram mask processing is performed on the original picture are respectively: the texture coordinates of the vertex at the upper left corner are (L+H, 1-T-V), the texture coordinates of the vertex at the upper right corner are (1-R+H, 1-T+V), the texture coordinates of the vertex at the lower left corner are (L-H, B-V), and the texture coordinates of the vertex at the lower right corner are (1-R-H, B+V). For example, a mask picture obtained by performing parallelogram mask processing on an original picture is shown in fig. 7.
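The parallelogram case can be sketched in the same way: the rectangular-mask corners are shifted by H and V exactly as in the formulas above. The function name and the sample values below are assumptions for this illustration.

```python
# Illustrative only: parallelogram-mask corner formulas from the text above.
def parallelogram_mask_corners(T, B, L, R, H, V):
    """Return ([vertex coords], [texture coords]) for the parallelogram-mask second mesh."""
    corners = [
        (L + H, 1 - T - V),      # upper left
        (1 - R + H, 1 - T + V),  # upper right
        (L - H, B - V),          # lower left
        (1 - R - H, B + V),      # lower right
    ]
    # Texture coordinates follow the same formulas in this normalized setup.
    return corners, list(corners)

# With H = V = 0 this reduces to the rectangular-mask corners of the previous sketch.
corners, _ = parallelogram_mask_corners(T=0.1, B=0.1, L=0.2, R=0.2, H=0.05, V=0.0)
# corners is approximately [(0.25, 0.9), (0.85, 0.9), (0.15, 0.1), (0.75, 0.1)]
```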
In another exemplary embodiment, when rounded-rectangle mask processing needs to be performed on the original picture, the user may set the radius of the rounded rectangle and the smoothness of the rounded rectangle in the mask parameter setting interface; that is, the mask parameter information includes the radius of the rounded rectangle and the smoothness of the rounded rectangle. In this case, generating the vertex coordinates of the plurality of vertices of the second mesh corresponding to the mask picture according to the vertex coordinates of the plurality of vertices of the first mesh and the mask parameter information includes:
determining vertex coordinates of a plurality of vertices of the second mesh according to the vertex coordinates of the plurality of vertices of the first mesh, the radius of the rounded rectangle, and the smoothness of the rounded rectangle;
generating texture coordinates corresponding to a plurality of vertexes of a second mesh corresponding to a mask picture according to the texture coordinates corresponding to the plurality of vertexes of the first mesh and the mask parameter information includes:
and determining texture coordinates corresponding to a plurality of vertexes of the second mesh according to the texture coordinates corresponding to the plurality of vertexes of the first mesh, the radius of the rounded rectangle and the smoothness of the rounded rectangle.
Specifically, assume that the width and height of the original picture are both 1, the radius of the rounded rectangle is 0.2, and the smoothness of the rounded rectangle is 4. When rounded-rectangle mask processing is performed on the original picture, the mask picture obtained after the processing may be divided, as shown in fig. 8, into three rectangles in the middle and rounded-corner regions at the 4 corners. In the prior art, Unity draws a rounded rectangle by dividing it into a plurality of triangles of the same size; therefore, in this embodiment, when rounded-rectangle mask processing is performed on the original picture, the original picture may be decomposed into three rectangles in the middle and 4 rounded-corner regions each composed of a plurality of triangles.
In addition, in this embodiment, since the number of triangles into which each rounded-corner region can be divided is determined by the smoothness of the rounded rectangle, after the smoothness of the rounded rectangle is obtained, the number of vertices of the second mesh needs to be determined according to the smoothness of the rounded rectangle and a preset formula, where the preset formula is: the number of vertices of the second mesh obtained after rounded-rectangle mask processing is performed on the original picture = 4 × (smoothness of the rounded rectangle + 2). That is, if the number of triangles into which each rounded-corner region is divided is 4, the number of vertices of the second mesh obtained after rounded-rectangle mask processing is performed on the original picture is (4 + 2) × 4 = 24, and the included angle of each triangle is α = 90°/4 = 22.5°.
For example, after the vertex coordinates of the plurality of vertices of the first mesh, the radius of the rounded rectangle (0.2) and the smoothness of the rounded rectangle (4) are obtained, it can be known according to a mathematical rule that the vertex coordinates of the 24 vertices included in the second mesh corresponding to the mask picture obtained after rounded-rectangle mask processing is performed on the original picture are respectively:
the coordinates of the four vertexes of the middle rectangle are respectively: the vertex coordinates of the upper left corner are (0.2,1), the vertex coordinates of the upper right corner are (0.8,1), the vertex coordinates of the lower left corner are (0.2,0), and the vertex coordinates of the lower right corner are (0.8, 0).
The coordinates of the four vertices of the left rectangle are: the vertex coordinates of the upper left corner are (0, 0.8), the vertex coordinates of the upper right corner are (0.2, 0.8), the vertex coordinates of the lower left corner are (0, 0.2), and the vertex coordinates of the lower right corner are (0.2, 0.2).
The coordinates of the four vertices of the right rectangle are: the vertex coordinates of the upper left corner are (0.8, 0.8), the vertex coordinates of the upper right corner are (1, 0.8), the vertex coordinates of the lower left corner are (0.8, 0.2), and the vertex coordinates of the lower right corner are (1, 0.2).
The coordinates of the three vertices of the upper left rounded-corner region whose coordinates have not yet been calculated (vertex 1, vertex 2 and vertex 3) are, in this order, (0.2 - 0.2 × cos α, 0.8 + 0.2 × sin α), (0.2 - 0.2 × cos 2α, 0.8 + 0.2 × sin 2α) and (0.2 - 0.2 × cos 3α, 0.8 + 0.2 × sin 3α).
The coordinates of the three vertices of the upper right rounded-corner region whose coordinates have not yet been calculated (vertex 4, vertex 5 and vertex 6) are, in this order, (0.8 + 0.2 × cos α, 0.8 + 0.2 × sin α), (0.8 + 0.2 × cos 2α, 0.8 + 0.2 × sin 2α) and (0.8 + 0.2 × cos 3α, 0.8 + 0.2 × sin 3α).
The coordinates of the three vertices of the lower left rounded-corner region whose coordinates have not yet been calculated (vertex 7, vertex 8 and vertex 9) are, in this order, (0.2 - 0.2 × cos α, 0.2 - 0.2 × sin α), (0.2 - 0.2 × cos 2α, 0.2 - 0.2 × sin 2α) and (0.2 - 0.2 × cos 3α, 0.2 - 0.2 × sin 3α).
The coordinates of the three vertices of the lower right rounded-corner region whose coordinates have not yet been calculated (vertex 10, vertex 11 and vertex 12) are, in this order, (0.8 + 0.2 × cos α, 0.2 - 0.2 × sin α), (0.8 + 0.2 × cos 2α, 0.2 - 0.2 × sin 2α) and (0.8 + 0.2 × cos 3α, 0.2 - 0.2 × sin 3α).
Similarly, after the texture coordinates of the plurality of vertices of the first mesh, the radius of the rounded rectangle (0.2) and the smoothness of the rounded rectangle (4) are obtained, it can be known according to a mathematical law that the texture coordinates of the 24 vertices included in the second mesh corresponding to the mask picture obtained after rounded-rectangle mask processing is performed on the original picture are respectively:
the texture coordinates of the four vertices of the middle rectangle are: the texture coordinates of the top left corner vertex are (0.2,1), the texture coordinates of the top right corner vertex are (0.8,1), the texture coordinates of the bottom left corner vertex are (0.2,0), and the texture coordinates of the bottom right corner vertex are (0.8, 0).
The texture coordinates of the four vertices of the left rectangle are: the texture coordinates of the upper left vertex are (0, 0.8), the texture coordinates of the upper right vertex are (0.2, 0.8), the texture coordinates of the lower left vertex are (0, 0.2), and the texture coordinates of the lower right vertex are (0.2, 0.2).
The texture coordinates of the four vertices of the right rectangle are: the texture coordinates of the upper left vertex are (0.8, 0.8), the texture coordinates of the upper right vertex are (1, 0.8), the texture coordinates of the lower left vertex are (0.8, 0.2), and the texture coordinates of the lower right vertex are (1, 0.2).
The texture coordinates of the three vertices of the upper left rounded-corner region whose coordinates have not yet been calculated (vertex 1, vertex 2 and vertex 3) are, in this order, (0.2 - 0.2 × cos α, 0.8 + 0.2 × sin α), (0.2 - 0.2 × cos 2α, 0.8 + 0.2 × sin 2α) and (0.2 - 0.2 × cos 3α, 0.8 + 0.2 × sin 3α).
The texture coordinates of the three vertices of the upper right rounded-corner region whose coordinates have not yet been calculated (vertex 4, vertex 5 and vertex 6) are, in this order, (0.8 + 0.2 × cos α, 0.8 + 0.2 × sin α), (0.8 + 0.2 × cos 2α, 0.8 + 0.2 × sin 2α) and (0.8 + 0.2 × cos 3α, 0.8 + 0.2 × sin 3α).
The texture coordinates of the three vertices of the lower left rounded-corner region whose coordinates have not yet been calculated (vertex 7, vertex 8 and vertex 9) are, in this order, (0.2 - 0.2 × cos α, 0.2 - 0.2 × sin α), (0.2 - 0.2 × cos 2α, 0.2 - 0.2 × sin 2α) and (0.2 - 0.2 × cos 3α, 0.2 - 0.2 × sin 3α).
The texture coordinates of the three vertices of the lower right rounded-corner region whose coordinates have not yet been calculated (vertex 10, vertex 11 and vertex 12) are, in this order, (0.8 + 0.2 × cos α, 0.2 - 0.2 × sin α), (0.8 + 0.2 × cos 2α, 0.2 - 0.2 × sin 2α) and (0.8 + 0.2 × cos 3α, 0.2 - 0.2 × sin 3α).
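Putting the rounded-rectangle example together, the following Python sketch enumerates the 24 vertices (three rectangles plus three extra vertices per corner region) for radius 0.2 and smoothness 4; in this normalized setup the texture coordinates equal the vertex coordinates. The function name, the vertex ordering and the loop structure are assumptions made for this illustration.

```python
# Illustrative only: enumerating the rounded-rectangle vertex layout described above.
import math

def rounded_rect_vertices(r=0.2, n=4):
    """Enumerate the 4 x (n + 2) vertices of the rounded-rectangle second mesh."""
    alpha = math.radians(90.0 / n)            # included angle of each corner triangle
    verts = []
    verts += [(r, 1), (1 - r, 1), (r, 0), (1 - r, 0)]           # middle rectangle
    verts += [(0, 1 - r), (r, 1 - r), (0, r), (r, r)]           # left rectangle
    verts += [(1 - r, 1 - r), (1, 1 - r), (1 - r, r), (1, r)]   # right rectangle
    for k in range(1, n):                                       # upper-left corner arc
        verts.append((r - r * math.cos(k * alpha), 1 - r + r * math.sin(k * alpha)))
    for k in range(1, n):                                       # upper-right corner arc
        verts.append((1 - r + r * math.cos(k * alpha), 1 - r + r * math.sin(k * alpha)))
    for k in range(1, n):                                       # lower-left corner arc
        verts.append((r - r * math.cos(k * alpha), r - r * math.sin(k * alpha)))
    for k in range(1, n):                                       # lower-right corner arc
        verts.append((1 - r + r * math.cos(k * alpha), r - r * math.sin(k * alpha)))
    return verts

print(len(rounded_rect_vertices()))  # 24, i.e. 4 x (smoothness + 2)
```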
In this embodiment, when mask processing is performed on the original picture, the vertex coordinates and texture coordinates of the second mesh corresponding to the mask picture may be calculated according to the mask parameter information set by the user and the vertex coordinates and texture coordinates of each vertex of the first mesh, so that the second mesh may subsequently be rendered according to the vertex coordinates and texture coordinates of the second mesh to obtain the mask picture, without performing calculation processing on all the texture data of the original picture, thereby saving calculation and improving the display speed.
A rendering module 904, configured to obtain a texture corresponding to the original picture, and render the second mesh according to the texture and the second attribute information, so as to display the mask picture.
Specifically, the second attribute information includes attribute information such as the vertex coordinates and texture coordinates of each vertex constituting the second mesh, and the triangle sequence (triangle) composed of these vertices.
Therefore, after the second attribute information is obtained, the second mesh may be created according to it; the texture region corresponding to the original picture is then sampled according to the texture coordinates in the second attribute information, and the second mesh is rendered with the sampled texture, so as to display the mask picture.
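As one possible illustration of how the second attribute information can be assembled before rendering, the Python sketch below builds the data for the rectangular-mask case. The dictionary layout, the assumed normal direction and the triangle indexing are choices made for this example only and are not prescribed by the application.

```python
# Illustrative only: assembling second attribute information for a rectangular mask.
def build_second_mesh(T, B, L, R):
    """Sketch of the second-mesh data a renderer would consume (picture normalized to 1 x 1)."""
    corners = [(L, 1 - T), (1 - R, 1 - T), (L, B), (1 - R, B)]  # ul, ur, ll, lr
    return {
        "vertices":  corners,                       # vertex coordinates of the second mesh
        "uv":        list(corners),                 # texture coordinates (equal to vertices here)
        "normals":   [(0, 0, -1)] * len(corners),   # one normal per vertex (direction assumed)
        "triangles": [0, 3, 2, 0, 1, 3],            # two triangles covering the four corners
    }

second_mesh = build_second_mesh(T=0.1, B=0.1, L=0.2, R=0.2)
# A renderer can now sample the original picture's texture at second_mesh["uv"]
# and draw the triangles in second_mesh["triangles"] to display the mask picture.
```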
According to the embodiment of the invention, the second attribute information of the second grid corresponding to the mask picture obtained after the mask processing is determined according to the mask parameter information set by the user and the first attribute information of the first grid corresponding to the original picture, and then the mask picture is generated according to the second attribute information of the second grid and the texture corresponding to the original picture, so that the calculation processing on all texture data of the original picture and the mask picture is not needed, the calculation amount can be saved, and the display speed can be improved.
Fig. 10 schematically shows a hardware architecture diagram of a computer device 10 suitable for implementing the mask picture display method according to an embodiment of the present application. In the present embodiment, the computer device 10 is a device capable of automatically performing numerical calculation and/or information processing according to instructions that are set or stored in advance. For example, it may be a tablet computer, a notebook computer, a desktop computer, a rack server, a blade server, a tower server or a cabinet server (including an independent server or a server cluster composed of a plurality of servers). As shown in fig. 10, the computer device 10 at least includes, but is not limited to, a memory 901, a processor 902 and a network interface 903, which may be communicatively connected to each other through a system bus. Wherein:
the memory 901 includes at least one type of computer-readable storage medium including a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, etc. In some embodiments, the storage 901 may be an internal storage module of the computer device 10, such as a hard disk or a memory of the computer device 10. In other embodiments, the memory 901 may also be an external storage device of the computer device 10, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like, provided on the computer device 10. Of course, memory 901 may also include both internal and external memory modules of computer device 10. In this embodiment, the memory 901 is generally used for storing an operating system installed in the computer device 10 and various application software, such as program codes of the mask picture display method. Further, the memory 901 may also be used to temporarily store various types of data that have been output or are to be output.
The processor 902 may be a Central Processing Unit (CPU), controller, microcontroller, microprocessor, or other data Processing chip in some embodiments. The processor 902 is generally configured to control the overall operation of the computer device 10, such as performing control and processing related to data interaction or communication with the computer device 10. In this embodiment, the processor 902 is configured to execute program codes stored in the memory 901 or process data.
The network interface 903 may comprise a wireless network interface or a wired network interface, and is typically used to establish a communication link between the computer device 10 and other computer devices. For example, the network interface 903 is used to connect the computer device 10 to an external terminal via a network, and to establish a data transmission channel and a communication link between the computer device 10 and the external terminal. The network may be a wireless or wired network such as an Intranet, the Internet, a Global System for Mobile Communications (GSM) network, a Wideband Code Division Multiple Access (WCDMA) network, a 4G network, a 5G network, Bluetooth, or Wi-Fi.
It is noted that FIG. 10 only shows a computer device having components 901-903, but it is understood that not all of the shown components are required to be implemented, and that more or fewer components may be implemented instead.
In this embodiment, the method for displaying a mask image stored in the memory 901 may be further divided into one or more program modules, and executed by one or more processors (in this embodiment, the processor 902) to complete the present application.
The embodiments of the present application provide a non-volatile computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the steps of the method for displaying a mask picture in the embodiments are implemented.
In this embodiment, the computer-readable storage medium includes a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the computer readable storage medium may be an internal storage unit of the computer device, such as a hard disk or a memory of the computer device. In other embodiments, the computer readable storage medium may be an external storage device of the computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the computer device. Of course, the computer-readable storage medium may also include both internal and external storage devices of the computer device. In this embodiment, the computer-readable storage medium is generally used for storing an operating system and various types of application software installed in the computer device, for example, the program code of the mask image display method in the embodiment, and the like. Further, the computer-readable storage medium may also be used to temporarily store various types of data that have been output or are to be output.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on at least two network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of the embodiments of the present application. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a general hardware platform, and certainly can also be implemented by hardware. It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware related to instructions of a computer program, which can be stored in a computer readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-only memory (ROM), a Random Access Memory (RAM), or the like.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A method for displaying a mask picture, comprising:
receiving a mask instruction triggered by a user based on an original picture, wherein the mask instruction carries mask parameter information;
acquiring first attribute information of a first grid corresponding to the original picture according to the mask instruction;
generating second attribute information of a second grid corresponding to the mask picture according to the first attribute information and the mask parameter information;
and acquiring texture corresponding to the original picture, and rendering the second grid according to the texture and the second attribute information so as to display the mask picture.
2. The method for displaying a mask picture according to claim 1, wherein the method for displaying a mask picture further comprises:
and displaying a mask parameter setting interface for a user to set the mask parameter information based on the parameter setting interface.
3. The mask picture display method according to claim 1, wherein the first attribute information includes vertex coordinates of a plurality of vertices of the first mesh and texture coordinates corresponding to the plurality of vertices of the first mesh, the second attribute information includes vertex coordinates of a plurality of vertices of the second mesh and texture coordinates corresponding to a plurality of vertices of the second mesh, and the generating second attribute information of the second mesh corresponding to the mask picture according to the first attribute information and the mask parameter information includes:
generating vertex coordinates of a plurality of vertexes of a second grid corresponding to the mask picture according to the vertex coordinates of the plurality of vertexes of the first grid and the mask parameter information;
and generating texture coordinates corresponding to a plurality of vertexes of a second grid corresponding to the mask picture according to the texture coordinates corresponding to the plurality of vertexes of the first grid and the mask parameter information.
4. The method according to claim 3, wherein the mask parameter information includes a ratio of an upper mask of the original picture, a ratio of a lower mask of the original picture, a ratio of a left mask of the original picture, and a ratio of a right mask of the original picture, and the generating the vertex coordinates of the vertices of the second mesh corresponding to the mask picture according to the vertex coordinates of the vertices of the first mesh and the mask parameter information includes:
determining vertex coordinates of a plurality of vertices of the second mesh according to the vertex coordinates of the plurality of vertices of the first mesh, the proportion of the upper edge mask, the proportion of the lower edge mask, the proportion of the left edge mask, and the proportion of the right edge mask;
generating texture coordinates corresponding to a plurality of vertexes of a second mesh corresponding to a mask picture according to the texture coordinates corresponding to the plurality of vertexes of the first mesh and the mask parameter information includes:
and determining texture coordinates corresponding to a plurality of vertexes of the second grid according to the texture coordinates corresponding to the plurality of vertexes of the first grid, the proportion of the upper side mask, the proportion of the lower side mask, the proportion of the left side mask and the proportion of the right side mask.
5. The method according to claim 3, wherein the mask parameter information includes a ratio of an upper mask of the original picture, a ratio of a lower mask of the original picture, a ratio of a left mask of the original picture, a ratio of a right mask of the original picture, a tilt value of a horizontal mask and a tilt value of a vertical mask, and the generating the vertex coordinates of the vertices of the second mesh corresponding to the mask picture according to the vertex coordinates of the vertices of the first mesh and the mask parameter information includes:
determining vertex coordinates of a plurality of vertexes of the second mesh according to the vertex coordinates of the plurality of vertexes of the first mesh, the proportion of the upper edge mask, the proportion of the lower edge mask, the proportion of the left edge mask, the proportion of the right edge mask, the inclination value of the mask in the horizontal direction and the inclination value of the mask in the vertical direction;
generating texture coordinates corresponding to a plurality of vertexes of a second mesh corresponding to a mask picture according to the texture coordinates corresponding to the plurality of vertexes of the first mesh and the mask parameter information includes:
and determining texture coordinates corresponding to a plurality of vertexes of the second grid according to texture coordinates corresponding to the plurality of vertexes of the first grid, the proportion of the upper side mask, the proportion of the lower side mask, the proportion of the left side mask, the proportion of the right side mask, the inclination value of the mask in the horizontal direction and the inclination value of the mask in the vertical direction.
6. The mask picture display method according to claim 3, wherein the mask parameter information includes a radius of a rounded rectangle and a smoothness of the rounded rectangle, and the generating of the vertex coordinates of the plurality of vertices of the second mesh corresponding to the mask picture from the vertex coordinates of the plurality of vertices of the first mesh and the mask parameter information includes:
determining vertex coordinates of a plurality of vertices of the second mesh according to the vertex coordinates of the plurality of vertices of the first mesh, the radius of the rounded rectangle, and the smoothness of the rounded rectangle;
generating texture coordinates corresponding to a plurality of vertexes of a second mesh corresponding to a mask picture according to the texture coordinates corresponding to the plurality of vertexes of the first mesh and the mask parameter information includes:
and determining texture coordinates corresponding to a plurality of vertexes of the second mesh according to the texture coordinates corresponding to the plurality of vertexes of the first mesh, the radius of the rounded rectangle and the smoothness of the rounded rectangle.
7. The mask picture display method according to claim 6, wherein the step of determining the vertex coordinates of the vertices of the second mesh according to the vertex coordinates of the vertices of the first mesh, the radius of the rounded rectangle, and the smoothness of the rounded rectangle further comprises:
and determining the number of the vertexes of the second mesh according to the smoothness of the rounded rectangles and a preset formula.
8. A mask picture display device, comprising:
the receiving module is used for receiving a mask instruction triggered by a user based on an original picture, wherein the mask instruction carries mask parameter information;
the obtaining module is used for obtaining first attribute information of a first grid corresponding to the original picture according to the mask instruction;
the generating module is used for generating second attribute information of a second grid corresponding to the mask picture according to the first attribute information and the mask parameter information;
and the rendering module is used for acquiring the texture corresponding to the original picture and rendering the second grid according to the texture and the second attribute information so as to display the mask picture.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method of displaying a matte picture according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium having stored thereon a computer program, characterized in that: the computer program, when being executed by a processor, realizes the steps of the method for displaying a mask picture according to any one of claims 1 to 7.
CN202010749875.7A 2020-07-30 2020-07-30 Method and device for displaying mask picture and method and device for displaying mask picture Pending CN112419137A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010749875.7A CN112419137A (en) 2020-07-30 2020-07-30 Method and device for displaying mask picture and method and device for displaying mask picture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010749875.7A CN112419137A (en) 2020-07-30 2020-07-30 Method and device for displaying mask picture and method and device for displaying mask picture

Publications (1)

Publication Number Publication Date
CN112419137A true CN112419137A (en) 2021-02-26

Family

ID=74844045

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010749875.7A Pending CN112419137A (en) 2020-07-30 2020-07-30 Method and device for displaying mask picture and method and device for displaying mask picture

Country Status (1)

Country Link
CN (1) CN112419137A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113536173A (en) * 2021-07-14 2021-10-22 北京字节跳动网络技术有限公司 Page processing method and device, electronic equipment and readable storage medium
CN113536173B (en) * 2021-07-14 2024-01-16 抖音视界有限公司 Page processing method and device, electronic equipment and readable storage medium

Similar Documents

Publication Publication Date Title
CN110990516B (en) Map data processing method, device and server
CN111127576B (en) Game picture rendering method and device and electronic equipment
US20140225894A1 (en) 3d-rendering method and device for logical window
CN110033507B (en) Method, device and equipment for drawing internal trace of model map and readable storage medium
CN112991558B (en) Map editing method and map editor
CN109697748B (en) Model compression processing method, model mapping processing method, model compression processing device, and storage medium
US20230033319A1 (en) Method, apparatus and device for processing shadow texture, computer-readable storage medium, and program product
CN115237522A (en) Page self-adaptive display method and device
CN112381907A (en) Multimedia track drawing method and system
US8823728B2 (en) Dynamically generated images and effects
CN113538502A (en) Picture clipping method and device, electronic equipment and storage medium
CN113240783A (en) Stylized rendering method and device, readable storage medium and electronic equipment
CN110428504B (en) Text image synthesis method, apparatus, computer device and storage medium
CN112419137A (en) Method and device for displaying mask picture and method and device for displaying mask picture
CN111583398A (en) Image display method and device, electronic equipment and computer readable storage medium
CN112419460B (en) Method, apparatus, computer device and storage medium for baking model map
CN110838167B (en) Model rendering method, device and storage medium
CN106846498B (en) Laser point cloud rendering method and device
CN113486941B (en) Live image training sample generation method, model training method and electronic equipment
CN115228083A (en) Resource rendering method and device
CN114820853A (en) Vector graphics processing method and device, computer equipment and storage medium
CN111681317B (en) Data processing method and device, electronic equipment and storage medium
JP2003331313A (en) Image processing program
CN114238528A (en) Map loading method and device, electronic equipment and storage medium
CN114627225A (en) Method and device for rendering graphics and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination