CN114757861A - Texture image fusion method and device, computer equipment and readable medium - Google Patents

Texture image fusion method and device, computer equipment and readable medium

Info

Publication number
CN114757861A
Authority
CN
China
Prior art keywords
texture
boundary
target
determining
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210358299.2A
Other languages
Chinese (zh)
Inventor
马光辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Foshan Huya Huxin Technology Co ltd
Original Assignee
Foshan Huya Huxin Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Foshan Huya Huxin Technology Co ltd
Priority to CN202210358299.2A
Publication of CN114757861A
Legal status: Pending

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Generation (AREA)

Abstract

The present disclosure provides a texture image fusion method, apparatus, computer device and readable medium. The method comprises: inputting a texture block, a designated boundary region and a target color value determined in a target texture map into a Poisson equation to generate a fused image, wherein the designated boundary region is a region of preset width around a boundary having a stitching relationship, so that the designated boundary region is an image extracted from the texture block itself. Unlike the prior art, in which two separate images are input into the Poisson equation to be fused, the method uses the texture block and the designated boundary region extracted from it as the input images of the Poisson equation and fuses the designated boundary region with the Poisson equation, so that no fault or color difference appears when the designated boundary region is stitched, and the fusion boundary no longer needs to be adjusted manually.

Description

Texture image fusion method and device, computer equipment and readable medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a texture image fusion method, apparatus, computer device, and readable medium.
Background
Image fusion is applied in a wide variety of scenes, such as three-dimensional modeling, special-effect stickers, and virtual portraits in live streaming. All of these scenes involve mapping a texture image onto a designated area; the texture image displays the texture of an object's surface or the uneven grooves of that surface, so that by mapping the texture image onto a target subject in scenes such as three-dimensional modeling, special-effect stickers and live virtual portraits, a viewer obtains a more detailed and realistic experience when watching the target subject.
Texture image mapping is the process of mapping a two-dimensional texture image onto another subject. The process generally requires stitching the boundaries of the texture image, and the stitched boundaries often show obvious faults or unnatural colors. The traditional remedy is to adjust the stitched boundary manually to reduce the faults and unnatural colors, but manual adjustment entails heavy workload and low efficiency.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a texture image fusion method, apparatus, computer device, and readable medium.
According to a first aspect of the embodiments of the present disclosure, there is provided a texture image fusion method, including:
determining each texture block containing a boundary in the target texture map;
determining a boundary having a stitching relationship;
taking the statistic of the color value of each pixel point on the boundary with the stitching relationship as a target color value;
inputting the texture block, a target color value and a designated boundary area into a Poisson equation, wherein the target color value corresponds to the boundary of the texture block, and the designated boundary area is an area with a preset width around the boundary and having a stitching relation;
and carrying out fusion processing on the pixel points of the specified boundary region with the stitching relation by using a Poisson equation to generate a fusion image.
According to a second aspect of the embodiments of the present disclosure, there is provided a texture image fusion apparatus, including:
the dividing module is used for determining each texture block containing a boundary in the target texture map;
the association module is used for determining a boundary with a stitching relation;
the statistical module is used for taking the statistics of the color values of all the pixel points on the boundary with the stitching relation as target color values;
the input module is used for inputting the texture block, the target color value and the designated boundary area into a Poisson equation, wherein the target color value corresponds to the boundary of the texture block, and the designated boundary area is an area with a preset width around the boundary with a stitching relation;
And the processing module is used for carrying out fusion processing on the pixel points of the specified boundary region with the stitching relationship by utilizing the Poisson equation to generate a fusion image.
According to a third aspect of embodiments of the present disclosure, there is provided a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method according to the first aspect when executing the program.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the method of the first aspect.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
in the embodiments of the disclosure, a fused image is generated by inputting the texture block, the designated boundary region and the target color value determined in the target texture map into a Poisson equation, where the designated boundary region is a region of preset width around a boundary having a stitching relationship, so the designated boundary region is an image extracted from the texture block itself. In the prior art, the Poisson equation is usually fed two separate images to be fused; here, the texture block and the designated boundary region extracted from it serve as the input images of the Poisson equation, and the designated boundary region is fused with the Poisson equation. On the one hand, because the data fed into the Poisson equation is targeted, the processing efficiency of the Poisson equation is improved; on the other hand, because the designated boundary region is fused through the Poisson equation, no fault or color difference appears when the designated boundary region is stitched, so the fusion boundary no longer needs to be adjusted manually.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flowchart illustrating a texture image fusion method according to an exemplary embodiment of the present disclosure.
FIG. 2 is a schematic diagram of a target texture image shown in the present disclosure according to an exemplary embodiment.
FIG. 3 is a schematic diagram of a texture block in a target texture image shown by the present disclosure in accordance with an exemplary embodiment.
FIG. 4 is a schematic diagram illustrating texture coordinates of a target texture image according to an exemplary embodiment of the present disclosure.
FIG. 5 is a schematic diagram illustrating a designated boundary region in a target texture image according to an exemplary embodiment of the present disclosure.
FIG. 6 is a schematic diagram of a particular target texture image shown in accordance with an exemplary embodiment of the present disclosure.
FIG. 7 is a partial flow diagram of an example application in the usage scenario of FIG. 6.
FIG. 8 is a schematic diagram of the texture blocks of the application example in the usage scenario of FIG. 6.
FIG. 9 is a diagram illustrating the boundaries of an application example in the usage scenario of FIG. 6.
FIG. 10 is a diagram illustrating a designated bounding area of an application example in the usage scenario of FIG. 6.
FIG. 11 is a diagram illustrating mapping to a three-dimensional model of an application example in the usage scenario of FIG. 6.
Fig. 12 is a block diagram of a texture image fusion apparatus according to another exemplary embodiment shown in the present disclosure.
Fig. 13 is a block diagram of a texture image fusion apparatus according to another exemplary embodiment of the present disclosure.
Fig. 14 is a hardware configuration diagram of a computer device in which the texture image fusion apparatus according to the embodiment of the present disclosure is located.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if," as used herein, may be interpreted as "when" or "upon" or "in response to determining," depending on the context.
At present, image processing is no longer limited to processing single images in isolation, and its application scenes keep extending. For example, in three-dimensional model building, textures need to be mapped onto the three-dimensional model so that it looks more realistic and detailed; in scenes where a real person is converted into a virtual character during live broadcasting or video recording, the outline of the real person is obtained first and the texture is then mapped onto that outline, so that the real person appears in the image of the virtual character.
These scenes all involve fusing existing texture images onto other subjects, and the fusion is mainly accomplished through texture mapping. Textures are the uneven grooves, color patterns and so on of an object's surface; a texture block is the set of textures used for mapping one region, and an image that stores all the texture blocks used for mapping onto a three-dimensional model or virtual character is called a texture image. In other words, a texture image contains a plurality of texture blocks, each used for mapping onto a different region of the three-dimensional model or virtual character. Because the three-dimensional model or virtual character is a complete, connected object, each of its regions must be filled by texture-block mapping so that the subject presents a complete surface, which requires multiple texture blocks to be mapped in cooperation. For example, the eyes, nose and mouth are regions mapped with different texture blocks, yet they must join up, so stitching with the boundaries of other texture blocks is inevitable when mapping each of these regions. Since the texture image is a two-dimensional image, it is difficult to guarantee continuity across texture block boundaries, so obvious faults or color differences usually appear when the boundaries of texture blocks are stitched together.
In the conventional technology, the obvious faults or color differences left after texture image mapping are removed by manual adjustment: the positions showing faults or color differences are adjusted by hand and those regions are processed separately, which resolves the faults or color differences in the regions concerned.
According to a first aspect of an embodiment of the present disclosure, a texture image fusion method is provided.
Fig. 1 is a flowchart illustrating a texture image fusion method according to an exemplary embodiment of the present disclosure. As shown in fig. 1, the texture image fusion method includes:
step S101, determining each texture block containing a boundary in a target texture map;
step S102, determining a boundary with a stitching relationship;
step S103, taking the statistic of the color value of each pixel point on the boundary with the stitching relation as a target color value;
step S104, inputting the texture block, the target color value and the designated boundary area into a Poisson equation;
and S105, performing fusion processing on the pixel points of the specified boundary region with the stitching relation by using a Poisson equation to generate a fusion image.
The target texture image in this embodiment is the image information to be mapped onto a target subject. Since the target subject contains multiple regions and each region needs a different texture, the target texture image may contain multiple texture blocks for mapping those regions, with one texture block mapped onto one region of the target subject. The target subject may be a three-dimensional model or an avatar. As an example, the eyes, ears, mouth and nose of an avatar each have a different texture, so separate texture blocks for the eyes, ears, mouth and nose are generated in advance and then mapped onto the corresponding regions of the avatar; the image storing the eye, ear, mouth and nose texture blocks is determined as the target texture image.
The target texture image is therefore an image prepared in advance for the target subject to be mapped, and it contains all the texture blocks that the current subject requires, with different texture blocks mapped onto different regions of the target subject. For texture mapping of an avatar, for example, a texture image containing the eyes, ears, mouth and nose must be completed in advance; after the avatar body is generated, the texture blocks corresponding to the eyes, ears, mouth and nose are mapped from the target texture image onto the corresponding eye, ear, mouth and nose regions of the avatar body, completing the avatar's appearance.
As shown in fig. 2 and fig. 3, in step S101, both the texture blocks 110 and the target texture map are images composed of many pixel points 112, but each texture block 110 is an independent, complete image, so each texture block 110 must be determined from the target texture image 100; this is equivalent to determining the four texture blocks a, b, c and d shown in fig. 3 from fig. 2. Determining each texture block 110 from the target texture image 100 is implemented based on connectivity between pixel points 112, and the process may be as follows. The relationships between pixel points 112 include neighborhoods and connectivity. Neighborhoods include the 4-neighborhood, the D-neighborhood and the 8-neighborhood: the 4-neighborhood consists of the horizontally and vertically adjacent pixels of a target pixel p; for example, if p has coordinates (x, y), the pixels in its 4-neighborhood have coordinates (x+1, y), (x-1, y), (x, y+1) and (x, y-1). The D-neighborhood consists of the diagonally adjacent pixels of p, with coordinates (x+1, y+1), (x+1, y-1), (x-1, y+1) and (x-1, y-1). The 8-neighborhood is the union of the 4-neighborhood and the D-neighborhood. Connectivity includes 4-connectivity, 8-connectivity and m-connectivity: two pixel points p and q are 4-connected if p is in the 4-neighborhood of q, and 8-connected if p is in the 8-neighborhood of q; p and q are m-connected if p is in the 4-neighborhood of q, or p is in the D-neighborhood of q and the intersection of the 4-neighborhoods of p and q is empty, so m-connectivity is a mixture of 4-connectivity and D-connectivity. An image region formed by a set of connected pixel points 112 in the target texture image 100 is determined as a texture block 110. As shown in fig. 2 and fig. 3, connectivity between adjacent pixel points 112 is represented by connecting lines: pixel points 112 that are connected belong to the same texture block 110, while pixel points 112 belonging to different texture blocks 110 have no connectivity and therefore no connecting lines; thus all pixel points 112 within a texture block 110 are connected, and pixel points 112 of different texture blocks 110 are not.
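For illustration only, the following is a minimal Python sketch of how the connectivity-based determination of texture blocks 110 in step S101 could be implemented; the use of scipy, the assumption that background pixels are marked by a zero alpha channel, and the function name find_texture_blocks are not part of the patent.

```python
import numpy as np
from scipy import ndimage

def find_texture_blocks(texture_rgba):
    """Split the target texture map into texture blocks by pixel connectivity
    (step S101). Background pixels are assumed to be fully transparent."""
    foreground = texture_rgba[..., 3] > 0
    # 8-connectivity: the 4-neighbourhood plus the D-neighbourhood
    structure = np.ones((3, 3), dtype=bool)
    labels, num_blocks = ndimage.label(foreground, structure=structure)
    # one boolean mask per texture block
    return [labels == i for i in range(1, num_blocks + 1)]
```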
As shown in fig. 4, connectivity between pixel points 112 has to be identified through coordinates, so the coordinates of each pixel point 112 in the target texture image 100 must be determined before connectivity can be judged. In the conventional approach, image processing is usually based on the pixel coordinates of the image, and pixel coordinates change with image resolution (for example 400 x 400, 600 x 600 or 900 x 900), so images with different pixel coordinate ranges would require adaptive coordinate adjustment before they could be processed. In the present disclosure, the step of determining the texture coordinates of each pixel point 112 in the target texture image 100 comprises: normalizing the pixel coordinates of each pixel point 112 in the target texture image 100 to generate texture coordinates. This is equivalent to converting the pixel coordinates of the target texture image 100 into world coordinates, so that images of different resolutions are converted to the same standard for processing. The preset function may be, but is not limited to, a normalization function; converting pixel coordinates into texture coordinates between 0 and 1 with a normalization function allows the disclosed method to adaptively process images of different resolutions. Once the texture coordinates of the target texture image 100 are determined, each texture block 110 in it can be determined based on the texture coordinates combined with the connectivity decision logic above.
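A minimal sketch of the normalization step, assuming (the patent does not fix this) that pixel coordinates are simply divided by the image size so that texture coordinates fall between 0 and 1:

```python
import numpy as np

def pixel_to_texture_coords(height, width):
    """Normalise pixel coordinates of an H x W image to texture coordinates
    in [0, 1], so images of different resolutions share one coordinate range.
    Dividing by (size - 1) is an assumed convention, not fixed by the patent."""
    ys, xs = np.mgrid[0:height, 0:width]
    u = xs / (width - 1)   # horizontal texture coordinate
    v = ys / (height - 1)  # vertical texture coordinate
    return u, v
```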
As shown in fig. 2-4, in step S102, each boundary 111 must be determined before the boundaries 111 with a stitching relationship can be determined: the coordinates of the edge pixel points 112 of each texture block 110 are determined from the texture coordinates, and each boundary 111 of a texture block 110 is determined based on a set of those edge pixel points 112, so one texture block 110 may contain several boundaries 111. As an example, when the target texture map is used for mapping onto a target three-dimensional model, the three-dimensional coordinates of the model must first be obtained. The target three-dimensional model is composed of many points that form boundary lines, triangles, rectangles and other shapes, and each point corresponds to a unique three-dimensional coordinate; these coordinates can be created while the model is built or generated afterwards from the model, so obtaining them means either reading pre-created three-dimensional coordinates or subsequently creating the corresponding coordinates from the model. Since points are the basis of boundary lines, triangles and other shapes, the three-dimensional coordinates on each boundary line are also known, i.e. every boundary line has a corresponding set of three-dimensional coordinates. The three-dimensional coordinates on the target three-dimensional model corresponding to the texture coordinates of each pixel point 112 on a boundary 111 are then determined, and a set of pixel points 112 sharing the same three-dimensional coordinates is determined as boundaries with a stitching relationship. The process of determining the boundaries 111 with a stitching relationship is thus: determine the three-dimensional coordinates of the target three-dimensional model and map the pixels of the target texture map onto it; during mapping, the pixel point 112 at one texture coordinate is mapped onto one three-dimensional coordinate, so each texture coordinate corresponds to only one three-dimensional coordinate and the texture is mapped accurately onto the model. Because continuity of the pixel points 112 must be preserved after each texture block 110 is mapped, the boundaries 111 of the texture blocks 110 must be stitched, where stitching means that at least two boundaries 111 with a stitching relationship are mapped onto the same boundary line of the target three-dimensional model. A three-dimensional coordinate on that boundary line therefore corresponds to at least two texture coordinates, i.e. the three-dimensional coordinate set of a boundary line on the model corresponds to several texture coordinate sets, and one texture coordinate set corresponds to one boundary 111. All boundaries whose texture coordinate sets correspond to the three-dimensional coordinate set of the same boundary line therefore have a stitching relationship, and stitching of the boundaries 111 is realized by mapping the pixel points 112 of several boundaries 111, according to their texture coordinate sets, onto the boundary line of the corresponding three-dimensional coordinate set.
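As a hedged illustration of step S102, the sketch below groups boundaries by the set of three-dimensional coordinates their texture coordinates map to; boundaries that share the same three-dimensional boundary line end up in the same group and therefore have a stitching relationship. The dictionary layout and the function name are assumptions made for the example, not the patent's data structures.

```python
from collections import defaultdict

def find_stitched_boundaries(boundary_uv_to_vertex):
    """Group boundaries that map onto the same 3D boundary line (step S102).
    boundary_uv_to_vertex is assumed to look like
        {boundary_id: {(u, v): vertex_id, ...}, ...}
    and to have been built when the model's UV layout was created."""
    vertex_sets = {bid: frozenset(uv_map.values())
                   for bid, uv_map in boundary_uv_to_vertex.items()}
    groups = defaultdict(list)
    for bid, vertices in vertex_sets.items():
        groups[vertices].append(bid)
    # groups with at least two boundaries share a boundary line -> stitched
    return [bids for bids in groups.values() if len(bids) > 1]
```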
Although continuity of the pixel points 112 on the target three-dimensional model is ensured, differences in pattern or color between the boundaries 111 of the texture blocks 110 easily produce faults or color differences after the boundaries 111 are stitched. To eliminate this stitching problem, the conventional technique manually adjusts the stitched boundary 111; the present disclosure instead fuses each texture block 110 with the Poisson equation. Because the Poisson equation normally fuses two images and each texture block 110 here needs its own fusion, another image besides the texture block 110 must be determined as an input image of the Poisson equation. Since it is the stitched boundary 111 that tends to be problematic when mapping onto the target three-dimensional model, the area of preset width around the boundary 111 is determined as the designated boundary area 113, and the designated boundary area 113 and the texture block 110 are then input into the Poisson equation together; fusing the texture block 110 with its boundary 111 through the Poisson equation eliminates the differences at the boundary 111 of the texture block 110.
On the other hand, the Poisson equation has to determine boundary points while fusing two images; by inputting the texture block 110 and the designated boundary region 113, the information of the boundary 111 is determined in advance, which amounts to feeding the Poisson equation targeted data, reducing its workload and improving efficiency.
In step S103, a statistic of the color values of the pixel points 112 on the boundaries 111 having a stitching relationship is taken as the target color value. The target color value serves as an adjustment parameter of the Poisson equation and adjusts its output. The statistic may be, but is not limited to, an average, a median, and so on. As an example, when the statistic is the average over corresponding pixel points 112 on the boundaries 111, the target color value is the set of averages of the color values of corresponding pixel points 112 on the boundaries 111 having a stitching relationship: if the color values of the pixel points 112 on two stitched boundaries 111 are (1, 2, 3, 4) and (3, 4, 5, 6), the target color value is (2, 3, 4, 5). This target color value is shared by both stitched boundaries 111, i.e. boundaries 111 with a stitching relationship have a consistent target color value, and it is input to the Poisson equation as an adjustment parameter, so that the stitched boundaries 111 are fused toward matching color values and the faults or color differences at the stitched boundary 111 are reduced.
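A small sketch of step S103 with the statistic chosen as the mean, reproducing the (1, 2, 3, 4) and (3, 4, 5, 6) to (2, 3, 4, 5) example above; the assumption that the two boundaries provide equal-length, position-aligned lists of pixel coordinates is made purely for illustration.

```python
import numpy as np

def boundary_target_color(texture, boundary_a, boundary_b):
    """Step S103 with the statistic chosen as the mean: average the colour
    values of position-aligned pixels on two stitched boundaries, e.g.
    (1, 2, 3, 4) and (3, 4, 5, 6) give the target colour value (2, 3, 4, 5)."""
    colors_a = np.array([texture[r, c] for r, c in boundary_a], dtype=float)
    colors_b = np.array([texture[r, c] for r, c in boundary_b], dtype=float)
    return (colors_a + colors_b) / 2.0   # one target colour per position
```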
In step S104, the texture block 110, the target color value and the designated boundary area 113 are input into the Poisson equation, where the designated boundary area 113 is an area of preset width around the boundary 111 having a stitching relationship;
in this embodiment, a texture block 110 may be stitched at several places, i.e. one texture block 110 may contain multiple boundaries 111, and the stitched counterpart of each boundary 111 may lie in the same texture block 110 or in another texture block 110. When fusing each texture block 110 and its boundaries 111, the texture block 110, the target color values corresponding to its boundaries 111, and the designated boundary areas 113 are input into the Poisson equation in turn; the designated boundary areas 113 input are those of all boundaries 111 in that texture block 110, and the target color values input are those corresponding to all of its boundaries 111. The target color values, being the average color values of boundaries 111 with a stitching relationship, act as adjustment parameters that steer how the Poisson equation processes the boundaries 111 of the texture block 110: the texture information of both stitched boundaries 111 is taken into account, and since the stitched boundaries 111 are processed with the same, consistent target color values, their color values become consistent and continuous after the Poisson equation, the stitched boundaries 111 end up with closer textures and color values, and no fault or color difference appears when they are stitched.
FIG. 5 is a schematic diagram of a specified boundary region in a target texture image shown in the present disclosure in accordance with an exemplary embodiment;
as shown in fig. 5, further, the designated boundary regions 113 are boundaries 111 having a stitching relationship, which extend inward by 20 pixels 112, and since one texture block 110 may include a plurality of boundaries 111, one texture block 110 may include a plurality of designated boundary regions 113, each designated boundary region 113 is generated based on the boundaries 111 having the stitching relationship, and therefore, the designated boundary regions 113 also have the stitching relationship therebetween, as shown in fig. 5, the designated boundary region 113 at the bottom of the texture block c and the designated boundary region 113 at the top of the texture block d are the designated boundary regions 113 having the stitching relationship, and the designated boundary regions 113 at the left and right sides of the texture block d are the designated boundary regions 113 having the stitching relationship, so that the designated boundary regions 113 having the stitching relationship may be in the same texture block 110 or in different texture blocks 110; the preset width of the designated boundary region 113 easily affects the effect of the final fused image, and on one hand, the preset width affects the speed of the poisson equation fusion, wherein the larger the preset width is, the larger the designated boundary region 113 of the poisson equation fusion is, and the longer the processing time of the poisson equation is; conversely, the smaller the designated boundary area 113 is, the shorter the processing time of the poisson equation is, and on the other hand, even if the preset width is too small, the problem that the boundary 111 with the stitching relationship still has the possible fault and color difference is still present, while the calculated amount of the poisson equation is large when the preset width is too large and the more remarkable effect is difficult to occur on the fusion effect of the boundary 111 with the stitching relationship, so that the problem of low efficiency is present; experiments have shown that a preset width of between 10 and 30, preferably 20, solves the problems of faults and color differences of the boundary 111 and ensures the efficiency of the poisson equation.
Step S105, carrying out fusion processing on the pixel points 112 of the designated boundary area 113 with the stitching relation by using a Poisson equation to generate a fusion image.
In this embodiment, the Poisson equation fuses the texture block 110 with the designated boundary region 113 and adjusts the designated boundary region 113 according to the target color value. Because boundaries 111 with a stitching relationship share the same target color value, the color difference between their corresponding designated boundary regions 113 is reduced, which in turn reduces the faults and color differences left after the boundary 111 is stitched. At the same time, fusing the whole texture block 110 with the designated boundary region 113 brings the color values of the pixel points 112 of the texture block 110 closer to those of the designated boundary region 113, keeping the overall color of the texture block 110 uniform and preventing the incongruity that a large color difference between the designated boundary area 113 and the rest of the texture block 110 would cause.
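The following single-channel sketch illustrates the idea of step S105: within the designated boundary region the pixel values are re-solved from a Poisson equation that keeps the source texture's gradients, while the values outside the region (including the stitched boundary, pre-filled with the target color values) stay fixed. The Jacobi iteration, the wrap-around handling of image edges and the function signature are simplifications chosen for readability; they are not the patent's formulation, and a production solver would use a sparse linear system instead.

```python
import numpy as np

def poisson_fuse_region(texture, region_mask, boundary_values, iterations=500):
    """Re-solve the designated boundary region from a Poisson equation:
    keep the gradients of `texture` inside `region_mask`, hold everything
    outside the region (Dirichlet values in `boundary_values`) fixed."""
    g = texture.astype(float)                  # source whose gradients are kept
    f = boundary_values.astype(float).copy()   # solution, fixed outside the region
    lap_g = (np.roll(g, 1, 0) + np.roll(g, -1, 0) +
             np.roll(g, 1, 1) + np.roll(g, -1, 1) - 4.0 * g)
    for _ in range(iterations):
        neighbour_sum = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                         np.roll(f, 1, 1) + np.roll(f, -1, 1))
        # only pixels inside the designated boundary region are updated
        f = np.where(region_mask, (neighbour_sum - lap_g) / 4.0, f)
    return f
```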
FIG. 6 is a schematic diagram of a particular target texture image shown in accordance with an exemplary embodiment of the present disclosure.
FIG. 7 is a partial flow diagram of an example application in the usage scenario of FIG. 6.
As an example, when the method of the present disclosure is implemented with fig. 6 as the target texture image:
S201, establishing texture coordinates based on the target texture map of FIG. 6, determining connectivity among the pixel points from the texture coordinates, and determining four texture blocks a, b, c and d through that connectivity.
FIG. 8 is a schematic diagram of the texture blocks of the application example in the usage scenario of FIG. 6.
As shown in fig. 8, the texture coordinates are established by passing the pixel coordinates of the target texture map of fig. 6 through the normalization function to generate the corresponding texture coordinates, ensuring that target texture images with different pixel coordinates can be processed uniformly in texture coordinates.
S202, determining boundaries with stitching relations in the texture blocks based on the three-dimensional coordinates of the three-dimensional model and the texture coordinates of the target texture block.
FIG. 9 is a diagram illustrating the boundaries of the application example in the usage scenario of FIG. 6.
Taking the texture block d as an example, as shown in fig. 9, the coordinate sets corresponding to the boundaries 111a and 111b on the left and right sides of texture block d are determined, from the texture coordinates, to correspond to the same three-dimensional coordinate set, so the left and right boundaries of texture block d are determined to be boundaries with a stitching relationship; likewise, the coordinate sets corresponding to the boundary 111d at the bottom of texture block c and the boundary 111c at the top of texture block d correspond to the same three-dimensional coordinate set, so the boundary 111d at the bottom of texture block c and the boundary 111c at the top of texture block d are determined to be boundaries with a stitching relationship. The other boundaries with stitching relationships are determined in the same way and are not described here.
And S203, taking the average value of the color values of all the pixel points on the boundary with the stitching relation as a target color value.
Taking the texture block d as an example, the color values of the corresponding pixel points on the boundary 111a and the boundary 111b are obtained and averaged to obtain the target color values of the boundary 111a and the boundary 111b.
S204, determining the designated boundary area.
FIG. 10 is a diagram illustrating a designated bounding region for an application instance in the usage scenario of FIG. 6.
Taking the texture block d as an example, as shown in fig. 10, the boundary 111a, the boundary 111b and the boundary 111c are each extended by 20 pixels toward the central area of texture block d, determining the designated boundary area 113a, the designated boundary area 113b and the designated boundary area 113c corresponding to the boundary 111a, the boundary 111b and the boundary 111c;
and S205, generating a fusion image by using the texture block, the target color value and the input value Poisson equation of the specified boundary area.
Taking the texture block d as an example, the texture block d, the designated boundary region 113a, the designated boundary region 113b, the designated boundary region 113c, and the target color values corresponding to its boundaries 111a, 111b and 111c need to be input into the Poisson equation; the other texture blocks are input into the Poisson equation in the same way, and the fused image is finally generated.
And S206, mapping the fused image to the three-dimensional model.
As shown in fig. 11, 11A in fig. 11 shows the target texture image mapped directly onto the three-dimensional model, in which case the faults and color differences at the stitched boundaries still have to be adjusted manually on the three-dimensional model; after the fused image generated by the method of the present disclosure is mapped onto the three-dimensional model, as shown in 11B of fig. 11, no obvious fault or color difference appears at the stitched boundaries and no manual correction is needed, which improves the efficiency of texture image mapping and fusion.
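For completeness, a tiny sketch of how the fused texture could be looked up at a texture coordinate (u, v) when mapping onto the three-dimensional model; a real renderer would do this per fragment with bilinear filtering, so the nearest-neighbour sampling here is only illustrative.

```python
import numpy as np

def sample_texture(texture, u, v):
    """Look up the fused texture at texture coordinate (u, v) in [0, 1]
    with nearest-neighbour sampling."""
    h, w = texture.shape[:2]
    row = int(round(v * (h - 1)))
    col = int(round(u * (w - 1)))
    return texture[row, col]
```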
FIG. 12 is a block diagram illustrating a texture image fusion apparatus according to an exemplary embodiment of the present disclosure.
According to a second aspect of the embodiments of the present disclosure, as shown in fig. 12, there is provided a texture image fusion apparatus including:
a partitioning module 1210, configured to determine texture blocks including boundaries in a target texture map;
an association module 1220 for determining a boundary having a stitching relationship;
the statistical module 1230 is configured to use a statistical amount of color values of each pixel point on the boundary with the stitching relationship as a target color value;
the input module 1240 is configured to input the texture block, the target color value, and the designated boundary area into a poisson equation, where the target color value corresponds to the boundary of the texture block, and the designated boundary area is an area with a preset width around the boundary having a stitching relationship;
The processing module 1250 is configured to perform fusion processing on the pixels in the designated boundary area having the stitching relationship by using a poisson equation, so as to generate a fused image.
In another embodiment, the partitioning module 1210 comprises:
the coordinate unit is used for determining the texture coordinate of each pixel point in the target texture map;
the connectivity unit is used for determining the connectivity among all pixel points in the texture image based on the texture coordinates;
and the collection unit is used for determining the pixel point collection with connectivity as a texture block.
Fig. 13 is a block diagram of a texture image fusion apparatus according to another exemplary embodiment of the present disclosure.
As shown in fig. 13, the apparatus further includes: and the mapping module 1260 is used for mapping the fused image to the target three-dimensional model.
Fig. 14 is a hardware configuration diagram of a computer device according to an embodiment of the present disclosure.
According to a third aspect of embodiments herein, as in fig. 14, there is provided a computer device comprising a memory 1402, a processor 1401, and a computer program stored on the memory 1402 and executable on the processor 1401, wherein the processor 1401 when executing the program implements a method as in the first aspect.
The apparatus may include: a processor 1401, a memory 1402, an input/output interface 1403, a communication interface 1404, and a bus 1405. Wherein the processor 1401, the memory 1402, the input/output interface 1403 and the communication interface 1404 are communicatively connected to each other within the device via a bus 1405.
The processor 1401 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits, and is configured to execute related programs to implement the technical solutions provided in the embodiments of the present specification.
The Memory 1402 may be implemented in the form of a ROM (Read Only Memory), a RAM (Random Access Memory), a static Memory device, a dynamic Memory device, or the like. The memory 1402 may store an operating system and other application programs, and when the technical solutions provided by the embodiments of the present specification are implemented by software or firmware, the relevant program codes are stored in the memory 1402 and called to be executed by the processor 1401.
The input/output interface 1403 is used for connecting an input/output module to input and output information. The i/o module may be configured as a component within the device (not shown) or may be external to the device to provide corresponding functionality. Wherein the input devices may include a keyboard, mouse, touch screen, microphone, various sensors, etc., and the output devices may include a display, speaker, vibrator, indicator light, etc.
The communication interface 1404 is used for connecting a communication module (not shown in the figure) to implement communication interaction between the present device and other devices. The communication module can realize communication in a wired mode (for example, USB, network cable, etc.), and can also realize communication in a wireless mode (for example, mobile network, WIFI, bluetooth, etc.).
Bus 1405 includes a path to transfer information between various components of the device, such as processor 1401, memory 1402, input/output interface 1403, and communication interface 1404.
It should be noted that although the above-described device only shows the processor 1401, the memory 1402, the input/output interface 1403, the communication interface 1404 and the bus 1405, in a specific implementation, the device may also comprise other components necessary for normal operation. In addition, those skilled in the art will appreciate that the above-described apparatus may also include only the components necessary to implement the embodiments of the present disclosure, and need not include all of the components shown in the figures.
According to a fourth aspect of embodiments herein, there is provided a computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement a method as in the first aspect.
From the above description of the embodiments, it is clear to those skilled in the art that the embodiments of the present disclosure can be implemented by software plus a necessary general-purpose hardware platform. Based on this understanding, the technical solutions of the embodiments of the present specification may, in essence or in the part contributing to the prior art, be embodied in the form of a software product, which may be stored in a storage medium such as a ROM/RAM, a magnetic disk or an optical disk, and which includes several instructions for enabling a computer device (which may be a personal computer, a server, a network device, etc.) to execute the methods of the embodiments or of some parts of the embodiments of the present specification.
The foregoing description of specific embodiments of the present disclosure has been presented. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Other embodiments of the present disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
The above description is only exemplary of the present disclosure and should not be taken as limiting the disclosure, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (10)

1. A texture image fusion method, the method comprising:
determining each texture block containing a boundary in the target texture map;
Determining the boundary having a stitching relationship;
taking the statistic of the color value of each pixel point on the boundary with the stitching relation as a target color value;
inputting the texture block, the target color value and a specified boundary region into a Poisson equation, wherein the target color value corresponds to the boundary of the texture block, and the specified boundary region is a region with a preset width around the boundary and has a stitching relation;
and performing fusion processing on the pixel points of the specified boundary region with the stitching relation by using the Poisson equation to generate a fusion image.
2. The method of claim 1, wherein determining each texture block of the target texture map that includes a boundary comprises:
determining texture coordinates of each pixel point in the target texture map;
determining connectivity among pixel points in the texture image based on the texture coordinates;
and determining a pixel point set with connectivity as a texture block.
3. The method of claim 2, further comprising: and mapping the fused image to a target three-dimensional model.
4. The method of claim 3, wherein determining a boundary having a stitching relationship comprises:
Determining three-dimensional coordinates of texture coordinates of all pixel points on the boundary, which correspond to the target three-dimensional model; and determining the pixel point set with the same three-dimensional coordinate as a boundary with a stitching relationship.
5. The method of claim 2, wherein the step of determining texture coordinates for each pixel point in the target texture map comprises: and carrying out normalization processing on the pixel coordinates of each pixel point in the target texture map to generate the texture coordinates.
6. A texture image fusion apparatus, characterized in that the apparatus comprises:
the dividing module is used for determining each texture block containing a boundary in the target texture map;
an association module to determine the boundary having a stitching relationship;
the statistical module is used for taking the statistics of the color values of all the pixel points on the boundary with the stitching relation as target color values;
an input module, configured to input the texture block, the target color value, and a specified boundary region into a poisson equation, where the target color value corresponds to the boundary of the texture block, and the specified boundary region is a region with a preset width around the boundary and having a stitching relationship;
And the processing module is used for carrying out fusion processing on the pixel points of the specified boundary region with the stitching relationship by utilizing a Poisson equation to generate a fusion image.
7. The apparatus of claim 6, wherein the partitioning module comprises:
the coordinate unit is used for determining the texture coordinate of each pixel point in the target texture map;
the connectivity unit is used for determining the connectivity among the pixel points in the texture image based on the texture coordinates;
and the collection unit is used for determining the pixel point collection with connectivity as the texture block.
8. The apparatus of claim 7, further comprising: and the mapping module is used for mapping the fused image to a target three-dimensional model.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method of any one of claims 1 to 5 when executing the program.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method of any one of claims 1 to 5.
CN202210358299.2A 2022-04-06 2022-04-06 Texture image fusion method and device, computer equipment and readable medium Pending CN114757861A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210358299.2A CN114757861A (en) 2022-04-06 2022-04-06 Texture image fusion method and device, computer equipment and readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210358299.2A CN114757861A (en) 2022-04-06 2022-04-06 Texture image fusion method and device, computer equipment and readable medium

Publications (1)

Publication Number Publication Date
CN114757861A true CN114757861A (en) 2022-07-15

Family

ID=82328970

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210358299.2A Pending CN114757861A (en) 2022-04-06 2022-04-06 Texture image fusion method and device, computer equipment and readable medium

Country Status (1)

Country Link
CN (1) CN114757861A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104574501A (en) * 2014-12-19 2015-04-29 浙江大学 High-quality texture mapping method aiming at complicated three-dimensional scene
US20160217552A1 (en) * 2015-01-22 2016-07-28 Samsung Electronics Co., Ltd. Video super-resolution by fast video segmentation for boundary accuracy control
CN107154016A (en) * 2016-03-01 2017-09-12 腾讯科技(深圳)有限公司 The joining method and device of destination object in stereo-picture
CN106600544A (en) * 2016-11-10 2017-04-26 北京暴风魔镜科技有限公司 Anti-aliasing method and anti-aliasing system based on texture mapping
CN111738914A (en) * 2020-07-29 2020-10-02 腾讯科技(深圳)有限公司 Image processing method, image processing device, computer equipment and storage medium
CN113221619A (en) * 2021-01-28 2021-08-06 深圳市雄帝科技股份有限公司 Face image highlight removing method and system based on Poisson reconstruction and storage medium thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
姜翰青 (Jiang Hanqing) et al., "High-quality texture mapping for complex three-dimensional scenes" (面向复杂三维场景的高质量纹理映射), Chinese Journal of Computers (计算机学报), 15 December 2015 (2015-12-15) *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115690246A (en) * 2022-10-20 2023-02-03 北京国电通网络技术有限公司 Image texture information generation method, apparatus, device, medium, and program product

Similar Documents

Publication Publication Date Title
WO2020038407A1 (en) Image rendering method and apparatus, image processing device, and storage medium
JP6970283B2 (en) Image stitching method and device, storage medium
JP4071422B2 (en) Motion blur image drawing method and drawing apparatus
TWI636423B (en) Method for efficient construction of high resolution display buffers
US11839820B2 (en) Method and apparatus for generating game character model, processor, and terminal
US20220284679A1 (en) Method and apparatus for constructing three-dimensional face mesh, device, and storage medium
US10217259B2 (en) Method of and apparatus for graphics processing
US8698830B2 (en) Image processing apparatus and method for texture-mapping an image onto a computer graphics image
CN112288665A (en) Image fusion method and device, storage medium and electronic equipment
JPWO2014033886A1 (en) Image processing apparatus, image processing method, and program
CN101477701A (en) Built-in real tri-dimension rendering process oriented to AutoCAD and 3DS MAX
CN101477702B (en) Built-in real tri-dimension driving method for computer display card
CN114757861A (en) Texture image fusion method and device, computer equipment and readable medium
CN101511034A (en) Truly three-dimensional stereo display method facing Skyline
CN110038302B (en) Unity 3D-based grid generation method and device
CN116977539A (en) Image processing method, apparatus, computer device, storage medium, and program product
JP5857606B2 (en) Depth production support apparatus, depth production support method, and program
CN114627225A (en) Method and device for rendering graphics and storage medium
JP4214528B2 (en) Pseudo stereoscopic image generation apparatus, pseudo stereoscopic image generation program, and pseudo stereoscopic image display system
JP2010238110A (en) Image processor, image processing method and image processing program
CN115311395A (en) Three-dimensional scene rendering method, device and equipment
CN101488230B (en) VirtualEarth oriented ture three-dimensional stereo display method
JP2022029239A (en) Image processing apparatus, image processing method, and program
GB2573593A (en) Augmented reality rendering method and apparatus
US20230410406A1 (en) Computer-readable non-transitory storage medium having image processing program stored therein, image processing apparatus, image processing system, and image processing method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination