CN113538549B - Method and system for retaining texture of image texture during image processing - Google Patents
- Publication number: CN113538549B (application CN202111013113.1A)
- Authority
- CN
- China
- Prior art keywords
- normal
- image
- processed
- map
- vector
- Prior art date
- Legal status: Active (the status is an assumption and not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/529—Depth or shape recovery from texture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/77—Retouching; Inpainting; Scratch removal
Abstract
The application relates to a method and system for retaining image texture when processing an image. The method comprises: identifying a region to be eliminated in the image to be processed and painting over it to obtain a mask image; filling the painted region; computing a first normal map of the image to be processed; downsampling the image by a preset scaling factor and computing a second normal map of the downsampled image; processing the first and second normal maps and merging them into a third normal map; restoring texture in the mask image based on the third normal map; and replacing the region to be eliminated with the texture-restored mask image. The pattern in the region to be eliminated is thereby removed while the image texture of that region is preserved.
Description
Technical Field
The present disclosure relates to the field of image processing, and in particular to a method and system for preserving image texture during image processing.
Background
In some design fields, a user may need to paint out a pattern on an object in an image and replace it with a user-defined pattern, for example replacing the LOGO on clothing in the image. In the prior art, such processing generally paints directly over the pattern on the object surface and then restores the painted image region, but the restored image often loses the texture information of that region, such as the wrinkle information of clothing after a pattern on it is removed.
Disclosure of Invention
To overcome the problem in the related art that restoring the painted image region during image processing loses the texture information of the restored region, the present application provides a method and system for retaining image texture when processing an image.
The scheme of the application is as follows:
according to a first aspect of embodiments of the present application, there is provided a method for preserving texture of an image texture when processing an image, including:
confirming a region to be eliminated in an image to be processed, and smearing the region to be eliminated to obtain a smeared mask image;
filling the area to be eliminated after the smearing;
calculating to obtain a first normal map of the image to be processed;
downsampling the image to be processed based on a preset scaling factor, and calculating to obtain a second normal map of the downsampled image to be processed;
processing the first normal map and the second normal map;
merging the processed first normal map and second normal map into a third normal map;
performing texture restoration on the mask map based on the third normal map;
and replacing the region to be eliminated with the texture-restored mask map.
Preferably, in one implementation of the present application, filling the painted area to be eliminated includes:
selecting pixel regions adjacent to the area to be eliminated and using them to fill the pixels of the area to be eliminated.
Preferably, in an implementation manner of the present application, the calculating to obtain the first normal map of the image to be processed includes:
establishing a first coordinate system, and constructing a first normal vector in the first coordinate system;
taking the gradient of the image to be processed in the horizontal direction as the component of the first normal vector on the x axis of the first coordinate system;
taking the gradient of the image to be processed in the vertical direction as the vector quantity of the first normal vector on the y axis of the first coordinate system;
taking a first preset constant as a component of the first normal vector on the z axis of the first coordinate system;
normalizing the first normal vector;
processing the normalized first normal vector;
and generating the first normal map according to the first normal vector obtained through processing.
Preferably, in one implementation manner of the present application, the calculating to obtain the second normal map of the downsampled image to be processed includes:
establishing a second coordinate system, and constructing a second normal vector in the second coordinate system;
taking the gradient of the downsampled image to be processed in the horizontal direction as the component of the second normal vector on the x axis of the second coordinate system;
taking the gradient of the downsampled image to be processed in the vertical direction as the component of the second normal vector on the y axis of the second coordinate system;
taking a second preset constant as the component of the second normal vector on the z axis of the second coordinate system; wherein the second preset constant is greater than the first preset constant;
normalizing the second normal vector;
processing the normalized second normal vector;
and generating the second normal map according to the processed second normal vector.
Preferably, in one implementation of the present application, processing the normalized first normal vector includes: suppressing large values at the two extremes of the normalized first normal vector result;
and processing the normalized second normal vector includes: suppressing large values at the two extremes of the normalized second normal vector result.
Preferably, in one implementable manner of the present application, the method further includes:
acquiring pixel values of each point of the image to be processed;
and calculating the gradient of each point of the image to be processed based on a gradient operator according to the pixel value of each point of the image to be processed.
Preferably, in an implementation manner of the present application, the processing the first normal map and the second normal map includes:
and carrying out Gaussian blur on the first normal line mapping and the second normal line mapping, and smoothing normal line changes of the first normal line mapping and the second normal line mapping.
Preferably, in one implementable manner of the present application, the merging the processed first normal map and the second normal map into a third normal map includes:
and carrying out soft light mixing on the processed first normal line mapping and the processed second normal line mapping to obtain the third normal line mapping.
Preferably, in an implementation manner of the present application, the performing texture recovery on the mask map based on the third normal map includes:
and calculating the world light irradiation result of the third normal map, and recovering the shadow and gradient change of the object surface fold in the mask map according to the world light irradiation result.
According to a second aspect of embodiments of the present application, there is provided a system for preserving texture of an image texture when processing an image, comprising:
the eliminating module is used for confirming an area to be eliminated in the image to be processed, and smearing the area to be eliminated to obtain a smeared mask image;
the filling module is used for filling the area to be eliminated after being smeared;
the first normal map generation module is used for calculating and obtaining a first normal map of the image to be processed;
the second normal map generation module is used for downsampling the image to be processed based on a preset scaling coefficient, and calculating to obtain a second normal map of the downsampled image to be processed;
the normal map processing module, used to process the first normal map and the second normal map;
the normal map merging module, used to merge the processed first normal map and second normal map into a third normal map;
a texture restoration module, configured to perform texture restoration on the mask map based on the third normal map;
and the replacing module is used for replacing the mask image after texture recovery to the area to be eliminated.
The technical solution provided by this application can have the following beneficial effects. The method identifies a region to be eliminated in the image to be processed, paints over it to obtain a mask image, and fills the painted region. A first normal map is computed from the image to be processed; the image is then downsampled by a preset scaling factor and a second normal map is computed from the downsampled image. Because the first normal map is computed at the full resolution of the image, it captures more detail; because the second normal map is computed from the downsampled image, it captures more structural information. The two normal maps are processed and merged into a third normal map, texture is restored in the mask image based on the third normal map, and the texture-restored mask image replaces the region to be eliminated. The pattern in the region is thus removed while the image texture of that region is preserved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a flow chart of a method for preserving texture of an image while processing the image according to one embodiment of the present application;
FIG. 2 is a flowchart illustrating a method for preserving texture of an image during image processing according to another embodiment of the present application to calculate a first normal map of an image to be processed;
FIG. 3 is a schematic view of an image to be processed provided in one embodiment of the present application;
FIG. 4 is a schematic illustration of a coated mask provided in one embodiment of the present application;
FIG. 5 is a schematic diagram of a region to be eliminated after smearing an image to be processed according to one embodiment of the present application;
FIG. 6 is a third normal map schematic provided by one embodiment of the present application;
FIG. 7 is a schematic representation of the resulting processed image provided by one embodiment of the present application;
FIG. 8 is a schematic diagram of a system for preserving texture of an image while processing the image according to one embodiment of the present application.
Reference numerals: elimination module 21; filling module 22; first normal map generation module 23; second normal map generation module 24; normal map processing module 25; normal map merging module 26; texture restoration module 27; replacement module 28.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the appended claims.
A method of preserving image texture in processing an image, referring to fig. 1, comprising:
s11: confirming a region to be eliminated in the image to be processed, and smearing the region to be eliminated to obtain a smeared mask image;
Fig. 3 serves as the image to be processed in this and the following embodiments. The mask image after painting is shown in fig. 4.
S12: filling the area to be eliminated after the smearing;
specific: and selecting adjacent pixel areas of the area to be eliminated to fill the pixels of the area to be eliminated, so that the filled pixels have continuity. The processed image has more reality and better texture effect. The effect after filling is shown in fig. 5.
S13: calculating to obtain a first normal map of the image to be processed;
referring to fig. 2, in particular:
s131: establishing a first coordinate system, and constructing a first normal vector in the first coordinate system;
s132: taking the gradient of the image to be processed in the horizontal direction as the component of the first normal vector on the x axis of the first coordinate system;
s133: taking the gradient of the image to be processed in the vertical direction as the component of the first normal vector on the y axis of the first coordinate system;
s134: taking a first preset constant as a component of a first normal vector on a z-axis of a first coordinate system;
s135: normalizing the first normal vector;
s136: processing the normalized first normal vector;
s137: and generating a first normal map according to the first normal vector obtained by processing.
The magnitude of the first preset constant determines the degree of relief of the first normal map. When the first preset constant is small, the z-axis component of the first normal vector is small, so the normalized components in the x and y directions are relatively large and the relief of the first normal map is pronounced; conversely, a large constant flattens the map.
The formula for normalizing the first normal vector is as follows:
n=(x,y,z)/length((x,y,z))
where n is the first normal vector, and x, y and z are its components on the x axis and the y axis and the first preset constant, respectively.
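Steps S131 through S135 can be sketched directly from the formula n = (x, y, z) / length((x, y, z)). This is a minimal numpy sketch: np.gradient stands in for whichever gradient operator is used (the description later mentions Sobel or Prewitt), and the function name is illustrative.

```python
import numpy as np

def normal_map(gray, z_const):
    """Build a per-pixel normal map from image gradients.

    The horizontal gradient becomes the x component, the vertical
    gradient the y component, and a preset constant (`z_const`, the
    "first preset constant") fills the z component; the vector is then
    normalised to unit length.
    """
    gray = gray.astype(float)
    gx = np.gradient(gray, axis=1)    # horizontal gradient -> x component
    gy = np.gradient(gray, axis=0)    # vertical gradient   -> y component
    gz = np.full_like(gray, z_const)  # preset constant     -> z component
    n = np.stack([gx, gy, gz], axis=-1)
    length = np.linalg.norm(n, axis=-1, keepdims=True)
    return n / length
```

On a flat image the gradients vanish, so every normal points straight along z, matching the intuition that a small `z_const` exaggerates relief and a large one flattens it.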
S14: downsampling the image to be processed based on a preset scaling factor, and calculating to obtain a second normal map of the downsampled image to be processed;
Specifically:
establishing a second coordinate system, and constructing a second normal vector in the second coordinate system;
taking the gradient of the downsampled image to be processed in the horizontal direction as the component of the second normal vector on the x axis of the second coordinate system;
taking the gradient of the downsampled image to be processed in the vertical direction as the component of the second normal vector on the y axis of the second coordinate system;
taking a second preset constant as the component of the second normal vector on the z axis of the second coordinate system; wherein the second preset constant is greater than the first preset constant;
normalizing the second normal vector;
processing the normalized second normal vector;
and generating a second normal map according to the second normal vector obtained by processing.
The process of generating the second normal map is substantially identical to the process of generating the first normal map.
The difference is that downsampling of the image to be processed based on a preset scaling factor is required before the second normal map is generated.
Preferably, the preset scaling factor is 4.
Downsampling is the process of reducing an image, and downsampling an image to be processed is to reduce the image to be processed.
In this embodiment, the second preset constant is set to be larger than the first preset constant, so that the structural information of the image is more conveniently obtained.
When the first normal map is calculated at the full resolution of the image to be processed, more detail of the image is obtained; when the second normal map is calculated from the downsampled image, more structural information is obtained. The main purpose of computing the second normal map is to recover gradient variation of the image to be processed over a larger spatial range.
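The downsampling of step S14 can be sketched as block averaging by the preset scaling factor. The patent does not specify the resampling method, so block averaging is an assumption, and the function name is illustrative.

```python
import numpy as np

def downsample(img, factor=4):
    """Reduce a 2-D image by the preset scaling factor (the description
    suggests 4) via block averaging: the image is tiled into
    factor-by-factor blocks and each block is replaced by its mean.
    """
    h, w = img.shape
    h2, w2 = h - h % factor, w - w % factor   # crop to a multiple of factor
    blocks = img[:h2, :w2].reshape(h2 // factor, factor, w2 // factor, factor)
    return blocks.mean(axis=(1, 3))
```

The second normal map is then computed from the downsampled image exactly as the first, only with the larger second preset constant on the z axis.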
S15: processing the first normal map and the second normal map;
s16: merging the processed first normal line mapping and the second normal line mapping into a third normal line mapping;
the third normal map is shown in fig. 6.
S17: performing texture recovery on the mask map based on the third normal map;
s18: and replacing the mask image after texture recovery to the area to be eliminated.
The resulting processed image is shown in fig. 7.
The method for retaining texture of image texture when processing an image in the embodiment comprises the following steps: confirming a region to be eliminated in the image to be processed, smearing the region to be eliminated to obtain a smeared mask image, filling the smeared region to be eliminated, calculating to obtain a first normal map of the image to be processed, downsampling the image to be processed based on a preset scaling factor, and calculating to obtain a second normal map of the downsampled image to be processed. The first normal map is calculated by adopting the full resolution of the image to be processed, so that more details of the image to be processed are obtained, and the second normal map is calculated by adopting the downsampled image to be processed, so that more structural information of the image to be processed is obtained. And processing the first normal line mapping and the second normal line mapping, merging the processed first normal line mapping and second normal line mapping into a third normal line mapping, carrying out texture recovery on the mask map based on the third normal line mapping, and replacing the mask map after texture recovery to the region to be eliminated. The pattern of the area to be eliminated in the image to be processed is eliminated, and the texture of the image of the area to be eliminated is reserved.
In some embodiments, a method for preserving texture of an image texture when processing an image, processes a normalized first normal vector, including: inhibiting larger values at two ends of the normalized first normal vector result;
processing the normalized second normal vector, including: and suppressing larger values at two ends of the normalized second normal vector result.
Since image restoration needs to retain gradual gradient information rather than abruptly changing edge gradients, in this embodiment the large values at the two extremes of the normal vectors computed above are additionally suppressed once, which attenuates abrupt edge information and yields the suppressed normal vectors.
Specifically, the inhibition is performed with reference to the following formula:
n' = max(-smoothstep(0.0, 0.35, abs(n)) + 1.0, 0.0) * n
where n is the normalized first or second normal vector and n' is the suppressed first or second normal vector.
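The suppression formula above can be transcribed directly, using the standard GLSL-style smoothstep. This is a minimal numpy sketch; the 0.35 threshold is the one the patent states, and the function names are illustrative.

```python
import numpy as np

def smoothstep(e0, e1, x):
    """GLSL-style smoothstep: 0 at or below e0, 1 at or above e1,
    with a smooth Hermite ramp in between."""
    t = np.clip((x - e0) / (e1 - e0), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def suppress(n):
    """Suppress large (abrupt-edge) values of a normalized gradient
    component: n' = max(-smoothstep(0.0, 0.35, abs(n)) + 1.0, 0.0) * n.
    Values with |n| >= 0.35 are driven to zero; small gradual values
    pass through only mildly attenuated."""
    return np.maximum(-smoothstep(0.0, 0.35, np.abs(n)) + 1.0, 0.0) * n
```

This matches the stated goal: strong edge gradients are zeroed while gentle gradient information, such as fabric wrinkles, survives.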
The method for retaining texture of image texture when processing an image in some embodiments further comprises:
acquiring pixel values of each point of an image to be processed;
and calculating the gradient of each point of the image to be processed based on the gradient operator according to the pixel values of each point of the image to be processed.
In this embodiment, gradients of points of the image to be processed may be calculated based on various gradient operators. The gradient operator may be, but is not limited to, a Sobel operator or a Prewitt operator.
In this embodiment, a Sobel operator is taken as an example to describe the calculation gradient:
the gradient at coordinates (u, v) is:
dx = 2*P(u-1, v) + P(u-1, v+1) + P(u-1, v-1) - 2*P(u+1, v) - P(u+1, v+1) - P(u+1, v-1)
dy = P(u-1, v-1) + 2*P(u, v-1) + P(u+1, v-1) - P(u-1, v+1) - 2*P(u, v+1) - P(u+1, v+1)
where dx and dy are the horizontal and vertical gradient components at point (u, v), and P(u, v) is the pixel value at point (u, v).
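The Sobel expressions above can be checked with a direct transcription. This sketch keeps the patent's sign convention exactly as written (dx grows toward smaller u); the function name is illustrative.

```python
import numpy as np

def sobel_gradient(P, u, v):
    """Sobel gradient at interior point (u, v) of a 2-D array P,
    transcribed term-for-term from the patent's formulas."""
    dx = (2 * P[u - 1, v] + P[u - 1, v + 1] + P[u - 1, v - 1]
          - 2 * P[u + 1, v] - P[u + 1, v + 1] - P[u + 1, v - 1])
    dy = (P[u - 1, v - 1] + 2 * P[u, v - 1] + P[u + 1, v - 1]
          - P[u - 1, v + 1] - 2 * P[u, v + 1] - P[u + 1, v + 1])
    return dx, dy
```

On a ramp image P(u, v) = u the vertical Sobel response is a constant and the horizontal one vanishes, as expected for a gradient that varies only along u.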
In some embodiments, a method for preserving texture of an image texture in processing an image, processes a first normal map and a second normal map, includes:
and carrying out Gaussian blur on the first normal line mapping and the second normal line mapping, and smoothing the normal line changes of the first normal line mapping and the second normal line mapping.
Gaussian blur, also known as gaussian smoothing, is a widely used processing effect in existing image processing software, which is typically used to reduce image noise and to reduce the level of detail. In this embodiment, the first normal map and the second normal map are gaussian blurred to smooth the normal variation of the first normal map and the second normal map.
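The per-channel Gaussian smoothing of step S15 can be sketched as a separable blur. This is a minimal numpy sketch rather than an optimized filter; kernel radius and `mode="same"` zero-padding at the borders are assumptions, and the function name is illustrative.

```python
import numpy as np

def gaussian_blur(channel, sigma=1.0):
    """Separable Gaussian blur of one normal-map channel: build a 1-D
    Gaussian kernel, then convolve along rows and then columns."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()                      # normalise to unit weight
    blurred = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, channel)
    blurred = np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="same"), 0, blurred)
    return blurred
```

Applying this to the x, y, and z channels of both normal maps smooths the normal variation before the soft-light merge.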
In some embodiments, a method for preserving texture of an image texture when processing an image, combining a processed first normal map and a processed second normal map into a third normal map, includes:
and carrying out soft light mixing on the processed first normal line mapping and the processed second normal line mapping to obtain a third normal line mapping.
In this embodiment, the smoothed first normal map and the smoothed second normal map are combined into the final normal map; specifically, they are merged with a soft-light blend.
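The soft-light merge of step S16 can be sketched with a common piecewise soft-light formula on channels mapped into [0, 1]. The patent does not spell out which soft-light variant it assumes, so the simplified W3C/Photoshop-style form below is an assumption, and the function name is illustrative.

```python
import numpy as np

def soft_light(base, blend):
    """Simplified soft-light blend of two arrays in [0, 1]: a blend
    value of 0.5 leaves the base unchanged; darker blend values darken
    the base and lighter ones lighten it."""
    return np.where(
        blend <= 0.5,
        base - (1.0 - 2.0 * blend) * base * (1.0 - base),
        base + (2.0 * blend - 1.0) * (np.sqrt(base) - base),
    )
```

Blending the detail-rich first normal map with the structure-rich second one this way yields a third map that carries both fine wrinkles and large-scale surface shape.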
In some embodiments, a method for preserving texture of an image texture when processing an image, performing texture recovery on a mask map based on a third normal map, includes:
and calculating the world light irradiation result of the third normal map, and recovering the shadow and gradient change of the object surface folds in the mask map according to the world light irradiation result.
In this embodiment, the result of the world light irradiation of the third normal map is calculated to recover the shadow and gradient change of the object surface wrinkles in the mask map.
Specifically, the calculation takes the world light to be white light.
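The world-light illumination of step S17 can be sketched as simple Lambertian shading of the third normal map under a single white light: brightness is the clamped dot product of each normal with the light direction. The light direction below is an assumption; the patent only specifies white light.

```python
import numpy as np

def relight(normals, light_dir=(0.0, 0.0, 1.0)):
    """Lambertian shading of an (H, W, 3) unit-normal map under one
    white directional light: shading = max(n . l, 0), clamped to [0, 1].
    The resulting shading field carries the shadows and gradients of
    surface wrinkles, which is what gets restored into the mask image."""
    l = np.asarray(light_dir, dtype=float)
    l /= np.linalg.norm(l)
    shading = np.einsum("...c,c->...", normals, l)
    return np.clip(shading, 0.0, 1.0)
```

Normals facing the light shade to full brightness; normals perpendicular to it shade to zero, reproducing wrinkle shadows in the filled region.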
A system for preserving image texture in processing an image, referring to fig. 8, comprising:
the elimination module 21 is configured to confirm an area to be eliminated in the image to be processed, and smear the area to be eliminated to obtain a smeared mask image;
a filling module 22, configured to fill the area to be eliminated after being smeared;
a first normal map generating module 23, configured to calculate a first normal map of the image to be processed;
the second normal map generating module 24 is configured to downsample the image to be processed based on a preset scaling factor, and calculate a second normal map of the downsampled image to be processed;
a normal map processing module 25, configured to process the first normal map and the second normal map;
a normal map merging module 26, configured to merge the processed first normal map and the processed second normal map into a third normal map;
a texture restoration module 27 for performing texture restoration on the mask map based on the third normal map;
and a replacing module 28, configured to replace the texture restored mask map with the region to be eliminated.
Specifically, the normal map processing module 25 is configured to perform gaussian blur on the first normal map and the second normal map, and smooth normal variation of the first normal map and the second normal map.
Specifically, the normal map merging module 26 is configured to soft mix the processed first normal map and the processed second normal map to obtain a third normal map.
Specifically, the texture restoring module 27 is configured to calculate a world light irradiation result of the third normal map, and restore shadows and gradient changes of the object surface wrinkles in the mask map according to the world light irradiation result.
In the system of this embodiment, the elimination module identifies the region to be eliminated in the image to be processed and paints over it to obtain a mask image; the filling module fills the painted region. The first normal map generation module computes a first normal map of the image to be processed; the second normal map generation module downsamples the image by a preset scaling factor and computes a second normal map of the downsampled image. The first normal map, computed at the full resolution of the image, captures more detail; the second, computed from the downsampled image, captures more structural information. The normal map processing module processes the two maps, the normal map merging module merges them into a third normal map, the texture restoration module restores texture in the mask image based on the third normal map, and finally the replacement module replaces the region to be eliminated with the texture-restored mask image. The pattern of the region to be eliminated is thus removed while the image texture of that region is maintained.
The system for preserving texture of an image texture when processing an image in some embodiments further comprises:
the gradient calculation module is used for obtaining pixel values of each point of the image to be processed; and calculating the gradient of each point of the image to be processed based on the gradient operator according to the pixel values of each point of the image to be processed.
It is to be understood that the same or similar parts in the above embodiments may be referred to each other, and that in some embodiments, the same or similar parts in other embodiments may be referred to.
It should be noted that in the description of the present application, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Furthermore, in the description of the present application, unless otherwise indicated, the meaning of "plurality" means at least two.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and further implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present application.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques, as is well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, where the program may be stored in a computer readable storage medium, and where the program, when executed, includes one or a combination of the steps of the method embodiments.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product.
The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the application, and that changes, modifications, substitutions, and variations may be made to the above embodiments by those of ordinary skill in the art within the scope of the application.
Claims (8)
1. A method for preserving the texture of an image during image processing, comprising:
identifying a region to be eliminated in an image to be processed, and smearing over the region to be eliminated to obtain a smeared mask map;
filling the area to be eliminated after the smearing;
calculating to obtain a first normal map of the image to be processed;
downsampling the image to be processed based on a preset scaling factor, and calculating to obtain a second normal map of the downsampled image to be processed;
processing the first normal map and the second normal map;
merging the processed first normal map and the processed second normal map into a third normal map;
performing texture restoration on the mask map based on the third normal map;
replacing the mask map after texture restoration into the region to be eliminated;
wherein calculating the first normal map of the image to be processed comprises:
establishing a first coordinate system, and constructing a first normal vector in the first coordinate system;
taking the gradient of the image to be processed in the horizontal direction as the component of the first normal vector on the x axis of the first coordinate system;
taking the gradient of the image to be processed in the vertical direction as the component of the first normal vector on the y axis of the first coordinate system;
taking a first preset constant as a component of the first normal vector on the z axis of the first coordinate system;
normalizing the first normal vector;
processing the normalized first normal vector;
generating the first normal map according to the processed first normal vector;
wherein calculating the second normal map of the downsampled image to be processed comprises:
establishing a second coordinate system, and constructing a second normal vector in the second coordinate system;
taking the gradient of the downsampled image to be processed in the horizontal direction as the component of the second normal vector on the x axis of the second coordinate system;
taking the gradient of the downsampled image to be processed in the vertical direction as the component of the second normal vector on the y axis of the second coordinate system;
taking a second preset constant as the component of the second normal vector on the z axis of the second coordinate system; wherein the second preset constant is greater than the first preset constant;
normalizing the second normal vector;
processing the normalized second normal vector;
and generating the second normal map according to the processed second normal vector.
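By way of illustration only (and not as the claimed implementation), the gradient-to-normal construction recited in claim 1 can be sketched as follows; the use of `np.gradient` as the gradient operator and the value of the preset constant are assumptions not fixed by the claim:

```python
import numpy as np

def normal_map_from_image(gray, z_const=1.0):
    # Horizontal and vertical gradients become the x and y components
    # of the per-pixel normal; a preset constant becomes the z component.
    gy, gx = np.gradient(gray.astype(np.float64))
    z = np.full_like(gx, z_const)
    n = np.stack([gx, gy, z], axis=-1)
    # Normalize each normal vector to unit length.
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    return n

# A flat image has zero gradient, so every normal points straight along +z.
flat = np.ones((4, 4))
nm = normal_map_from_image(flat)
```

A larger preset z constant (as for the downsampled second map) flattens the normals, which is why the claim requires the second constant to exceed the first.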
2. The method of claim 1, wherein filling the area to be eliminated after the smearing comprises:
and selecting pixel areas adjacent to the area to be eliminated to perform pixel filling of the area to be eliminated.
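The claim does not specify the fill strategy; as one plausible reading, a minimal 4-connected iterative fill that copies in values from adjacent known pixels can be sketched as:

```python
import numpy as np

def fill_from_neighbors(img, mask):
    # Repeatedly copy values from adjacent (4-connected) known pixels
    # into masked pixels until the whole masked region is filled.
    img = img.astype(np.float64).copy()
    known = ~mask
    while not known.all():
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            shifted_known = np.roll(known, (dy, dx), axis=(0, 1))
            shifted_vals = np.roll(img, (dy, dx), axis=(0, 1))
            newly = ~known & shifted_known
            img[newly] = shifted_vals[newly]
            known = known | newly
    return img

img = np.arange(16, dtype=np.float64).reshape(4, 4)
mask = np.zeros((4, 4), dtype=bool)
mask[1, 1] = True                 # one pixel to eliminate and refill
filled = fill_from_neighbors(img, mask)
```

This is only a placeholder fill; the patented method restores the fine texture afterwards via the normal maps.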
3. The method of claim 1, wherein processing the normalized first normal vector comprises: suppressing larger values at the two ends of the normalized first normal vector result;
and processing the normalized second normal vector comprises: suppressing larger values at the two ends of the normalized second normal vector result.
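The claims do not name the suppression function; as an assumption, a tanh-style soft clamp reproduces the described behavior (near-identity for small values, compression of large magnitudes at the two ends):

```python
import numpy as np

def suppress_extremes(x, scale=0.5):
    # Soft clamp: roughly the identity near zero, while large magnitudes
    # at the two ends are compressed toward +/- scale.
    return scale * np.tanh(x / scale)

v = np.array([0.0, 0.05, 1.0, -1.0])
s = suppress_extremes(v)
```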
4. The method as recited in claim 1, further comprising:
acquiring pixel values of each point of the image to be processed;
and calculating the gradient of each point of the image to be processed based on a gradient operator according to the pixel value of each point of the image to be processed.
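Claim 4 leaves the gradient operator unspecified; a Sobel kernel is a common choice and is used here purely as an assumption, with a minimal same-size cross-correlation:

```python
import numpy as np

# Sobel kernel for the horizontal gradient (an assumed operator choice).
SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=np.float64)

def correlate2d_same(img, kernel):
    # Minimal same-size cross-correlation with zero padding.
    h, w = img.shape
    kh, kw = kernel.shape
    pad = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros((h, w), dtype=np.float64)
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(pad[i:i + kh, j:j + kw] * kernel)
    return out

# A left-to-right ramp has a constant horizontal gradient; the
# unnormalized Sobel response to unit slope is 8 in the interior.
ramp = np.tile(np.arange(5.0), (5, 1))
gx = correlate2d_same(ramp, SOBEL_X)
```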
5. The method of claim 1, wherein processing the first normal map and the second normal map comprises:
and performing Gaussian blur on the first normal map and the second normal map to smooth the normal changes of the first normal map and the second normal map.
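A sketch of this smoothing step, under the assumption of a separable Gaussian kernel applied per channel with re-normalization so the blurred normals stay unit length (the kernel radius and sigma are assumptions):

```python
import numpy as np

def gaussian_kernel1d(sigma):
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def blur_normal_map(nmap, sigma=1.0):
    # Separable Gaussian blur on each channel; the kernel must be no
    # longer than the image side for np.convolve(..., 'same') to keep shape.
    k = gaussian_kernel1d(sigma)
    out = np.empty_like(nmap)
    for c in range(nmap.shape[-1]):
        ch = nmap[..., c]
        ch = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, ch)
        ch = np.apply_along_axis(lambda col: np.convolve(col, k, mode='same'), 0, ch)
        out[..., c] = ch
    # Re-normalize so the smoothed normals remain unit vectors.
    out /= np.linalg.norm(out, axis=-1, keepdims=True)
    return out

# A uniform +z normal map is unchanged (up to re-normalization).
nm = np.zeros((8, 8, 3))
nm[..., 2] = 1.0
blurred = blur_normal_map(nm, sigma=1.0)
```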
6. The method of claim 1, wherein merging the processed first and second normal maps into a third normal map comprises:
and performing soft-light blending on the processed first normal map and the processed second normal map to obtain the third normal map.
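The claim names soft-light mixing but not a formula; the widely used Photoshop/W3C soft-light blend, applied to maps encoded in [0, 1], is assumed here:

```python
import numpy as np

def soft_light(base, blend):
    # Photoshop-style soft-light blend for inputs in [0, 1]:
    # a 50%-gray blend layer leaves the base unchanged.
    low = 2.0 * base * blend + base ** 2 * (1.0 - 2.0 * blend)
    high = 2.0 * base * (1.0 - blend) + np.sqrt(base) * (2.0 * blend - 1.0)
    return np.where(blend <= 0.5, low, high)

base = np.array([0.2, 0.8, 0.25])
blend = np.array([0.5, 0.5, 1.0])
mixed = soft_light(base, blend)
```

Because a 0.5 blend value is neutral, the fine-detail map only perturbs the coarse map where its normals deviate from flat.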
7. The method of claim 1, wherein the texture restoration of the mask map based on the third normal map comprises:
and calculating the world-light illumination result of the third normal map, and restoring, according to the world-light illumination result, the shadows and gradient changes of the wrinkles on the object surface in the mask map.
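One plausible realization of the world-light illumination step is Lambertian (dot-product) shading of the normal map under a fixed world light direction; the light direction itself is an assumption not stated in the claim:

```python
import numpy as np

def world_light_shading(nmap, light_dir=(0.0, 0.0, 1.0)):
    # Lambertian shading: intensity = max(0, n . l) per pixel,
    # with the light direction normalized to unit length.
    l = np.asarray(light_dir, dtype=np.float64)
    l = l / np.linalg.norm(l)
    return np.clip(np.einsum('...c,c->...', nmap, l), 0.0, None)

# Normals pointing straight at a +z light are fully lit.
nm = np.zeros((2, 2, 3))
nm[..., 2] = 1.0
shading = world_light_shading(nm)
```

Modulating the filled mask region by this shading map is one way to reintroduce the shadow and gradient variation of surface wrinkles.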
8. A system for preserving the texture of an image during image processing, comprising:
the elimination module, configured to identify a region to be eliminated in an image to be processed, and smear over the region to be eliminated to obtain a smeared mask map;
the filling module, configured to fill the area to be eliminated after the smearing;
the first normal map generation module, configured to calculate a first normal map of the image to be processed;
the second normal map generation module, configured to downsample the image to be processed based on a preset scaling coefficient, and calculate a second normal map of the downsampled image to be processed;
the normal map processing module, configured to process the first normal map and the second normal map;
the normal map merging module, configured to merge the processed first normal map and the processed second normal map into a third normal map;
the texture restoration module, configured to perform texture restoration on the mask map based on the third normal map;
the replacing module, configured to replace the mask map after texture restoration into the region to be eliminated;
wherein calculating the first normal map of the image to be processed comprises:
establishing a first coordinate system, and constructing a first normal vector in the first coordinate system;
taking the gradient of the image to be processed in the horizontal direction as the component of the first normal vector on the x axis of the first coordinate system;
taking the gradient of the image to be processed in the vertical direction as the component of the first normal vector on the y axis of the first coordinate system;
taking a first preset constant as a component of the first normal vector on the z axis of the first coordinate system;
normalizing the first normal vector;
processing the normalized first normal vector;
generating the first normal map according to the processed first normal vector;
wherein calculating the second normal map of the downsampled image to be processed comprises:
establishing a second coordinate system, and constructing a second normal vector in the second coordinate system;
taking the gradient of the downsampled image to be processed in the horizontal direction as the component of the second normal vector on the x axis of the second coordinate system;
taking the gradient of the downsampled image to be processed in the vertical direction as the component of the second normal vector on the y axis of the second coordinate system;
taking a second preset constant as the component of the second normal vector on the z axis of the second coordinate system; wherein the second preset constant is greater than the first preset constant;
normalizing the second normal vector;
processing the normalized second normal vector;
and generating the second normal map according to the processed second normal vector.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111013113.1A CN113538549B (en) | 2021-08-31 | 2021-08-31 | Method and system for retaining texture of image texture during image processing |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113538549A CN113538549A (en) | 2021-10-22 |
CN113538549B true CN113538549B (en) | 2023-12-22 |
Family
ID=78122946
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111013113.1A Active CN113538549B (en) | 2021-08-31 | 2021-08-31 | Method and system for retaining texture of image texture during image processing |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113538549B (en) |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2659458A1 (en) * | 2010-12-30 | 2013-11-06 | TomTom Polska SP. Z O.O. | System and method for generating textured map object images |
KR20140019199A (en) * | 2012-08-06 | 2014-02-14 | 동명대학교산학협력단 | Method of producing 3d earth globes based on natural user interface using motion-recognition infrared camera |
CN104392481A (en) * | 2014-11-25 | 2015-03-04 | 无锡梵天信息技术股份有限公司 | Method and device for controlling specular reflection definition by mapping |
CN105574918A (en) * | 2015-12-24 | 2016-05-11 | 网易(杭州)网络有限公司 | Material adding method and apparatus of 3D model, and terminal |
CN107204033A (en) * | 2016-03-16 | 2017-09-26 | 腾讯科技(深圳)有限公司 | The generation method and device of picture |
CN107358643A (en) * | 2017-07-04 | 2017-11-17 | 网易(杭州)网络有限公司 | Image processing method, device, electronic equipment and storage medium |
CN107808372A (en) * | 2017-11-02 | 2018-03-16 | 北京奇虎科技有限公司 | Image penetration management method, apparatus, computing device and computer-readable storage medium |
CN109559319A (en) * | 2018-10-31 | 2019-04-02 | 深圳市创梦天地科技有限公司 | A kind of processing method and terminal of normal map |
CN111243099A (en) * | 2018-11-12 | 2020-06-05 | 联想新视界(天津)科技有限公司 | Method and device for processing image and method and device for displaying image in AR (augmented reality) device |
CN111583398A (en) * | 2020-05-15 | 2020-08-25 | 网易(杭州)网络有限公司 | Image display method and device, electronic equipment and computer readable storage medium |
CN111612882A (en) * | 2020-06-10 | 2020-09-01 | 腾讯科技(深圳)有限公司 | Image processing method, image processing device, computer storage medium and electronic equipment |
CN111773710A (en) * | 2020-08-20 | 2020-10-16 | 网易(杭州)网络有限公司 | Texture image processing method and device, electronic equipment and storage medium |
CN111899325A (en) * | 2020-08-13 | 2020-11-06 | 网易(杭州)网络有限公司 | Rendering method and device of crystal stone model, electronic equipment and storage medium |
JP2020197774A (en) * | 2019-05-31 | 2020-12-10 | キヤノン株式会社 | Image processing method, image processing device, image-capturing device, image processing program, and memory medium |
CN112241933A (en) * | 2020-07-15 | 2021-01-19 | 北京沃东天骏信息技术有限公司 | Face image processing method and device, storage medium and electronic equipment |
CN112870707A (en) * | 2021-03-19 | 2021-06-01 | 腾讯科技(深圳)有限公司 | Virtual object display method in virtual scene, computer device and storage medium |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9892542B2 (en) * | 2015-11-19 | 2018-02-13 | Adobe Systems Incorporated | Creating bump and normal maps from images with multi-scale control |
Non-Patent Citations (2)
Title |
---|
"A general texture mapping framework for image-based 3D modeling";Lin Xu et al;《2010 IEEE International Conference on Image Processing》;全文 * |
浅析水利三维设计中的法线贴图工作原理与绘制方法;柏文;花金祥;张茜茜;;科技视界(第18期);全文 * |
Also Published As
Publication number | Publication date |
---|---|
CN113538549A (en) | 2021-10-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109712067B (en) | Virtual viewpoint drawing method based on depth image | |
Ham et al. | Robust image filtering using joint static and dynamic guidance | |
EP2380132B1 (en) | Denoising medical images | |
US8406548B2 (en) | Method and apparatus for performing a blur rendering process on an image | |
Su et al. | Edge-preserving texture suppression filter based on joint filtering schemes | |
US20130202177A1 (en) | Non-linear resolution reduction for medical imagery | |
JP2004038984A (en) | Interpolated image filtering method and apparatus | |
CN113744142B (en) | Image restoration method, electronic device and storage medium | |
EP2884745A1 (en) | Virtual view generating method and apparatus | |
CN113129207B (en) | Picture background blurring method and device, computer equipment and storage medium | |
CN108537868A (en) | Information processing equipment and information processing method | |
CN113538549B (en) | Method and system for retaining texture of image texture during image processing | |
Steidl et al. | Anisotropic smoothing using double orientations | |
JP5617426B2 (en) | Jaggy mitigation processing apparatus and jaggy mitigation processing method | |
Kang et al. | Inpainting from multiple views | |
Grossauer | Inpainting of movies using optical flow | |
Pavan Kumar et al. | A refined structure preserving image abstraction framework as a pre-processing technique for desire focusing on prominent structure and artistic stylization | |
JP2009512040A (en) | How to segment an image | |
Lederman | Two dimensional image denoising using partial differential equations with a constraint on high dimensional domains | |
CN118314044B (en) | Method and system for enhancing etching line morphology image of multi-layer gluing mask plate | |
Rana et al. | An Enhanced Model for Inpainting on Digital Images Using Dynamic Masking. | |
Cai et al. | Boundary-preserving depth upsampling without texture copying artifacts and holes | |
Krishnamurthy et al. | Image-guided depth map upsampling using normalized cuts-based segmentation and smoothness priors | |
CN112581411B (en) | Image defogging method and terminal | |
Šroubek et al. | Image fusion based on level set segmentation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||