CN113822815A - Method and device for high-performance removal of picture clutter using GPU rendering - Google Patents

Method and device for high-performance removal of picture clutter using GPU rendering

Publication number: CN113822815A (application CN202111121389.1A; granted as CN113822815B)
Authority: CN (China)
Prior art keywords: texture, target, size, image, block
Legal status: Granted; Active
Inventor: 林青山
Assignee (original and current): Guangzhou Guangzhuiyuan Information Technology Co., Ltd.
Original language: Chinese (zh)

Classifications

    • G06T 5/77 (G Physics; G06 Computing; G06T Image data processing or generation; G06T 5/00 Image enhancement or restoration) - Retouching; inpainting; scratch removal
    • G06F 18/22 (G06F Electric digital data processing; G06F 18/00 Pattern recognition; G06F 18/20 Analysing) - Matching criteria, e.g. proximity measures
    • G06T 15/205 (G06T 15/00 3D image rendering; G06T 15/10 Geometric effects; G06T 15/20 Perspective computation) - Image-based rendering
    • G06T 15/80 (G06T 15/00 3D image rendering; G06T 15/50 Lighting effects) - Shading
    • G06T 7/49 (G06T 7/00 Image analysis; G06T 7/40 Analysis of texture) - Analysis of texture based on structural texture description, e.g. using primitives or placement rules
    • G06T 2207/20016 (G06T 2207/00 Indexing scheme for image analysis or enhancement; G06T 2207/20 Special algorithmic details) - Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; pyramid transform
    • Y02D 10/00 (Y02D Climate change mitigation technologies in ICT) - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The method obtains an original image and a binary clutter-mask image, and repairs the masked regions by combining GPU rendering with an image-inpainting algorithm. This removes the unwanted objects from the picture and addresses a weakness of prior approaches, in which removal could leave large, abrupt color blocks or wrongly transplant other pixels of the image, producing a poor result.

Description

Method and device for high-performance removal of picture clutter using GPU rendering
Technical Field
The present application relates to the field of image processing, and in particular to a method and an apparatus for high-performance removal of picture clutter using GPU rendering.
Background
Because mobile terminals are now so widely used, users are accustomed to taking photos anywhere and at any time. These photos often accidentally capture passers-by or other unwanted objects, which users then want to remove. In the traditional approach, the user manually outlines a region and OpenCV searches nearby pixels in the picture to fill it, thereby erasing the objects inside. Some algorithms add blurring or similar post-processing so that the repaired region blends with the original image rather than standing out. Even so, these methods cannot reliably avoid leaving a large, abrupt color block or wrongly transplanting other pixels from the image. Moreover, the OpenCV computation is heavy: it demands too much of a mobile device, requires substantial memory, and cannot run at all on many low- or mid-range handsets, leaving those users unable to use such algorithms.
Disclosure of Invention
The present application provides a method and a device for high-performance removal of picture clutter using GPU rendering, aiming to solve the poor removal quality of prior-art clutter-removal techniques.
The above object of the present application is achieved by the following technical solutions:
An embodiment of the present application provides a method for high-performance removal of picture clutter using GPU rendering, comprising the following steps:

acquiring and decoding an original image and a binary clutter-mask image input by a user, and loading them to obtain an original-image texture and a clutter-mask texture suitable for OpenGL rendering;

down-sampling the clutter-mask texture until the sampled texture data contain no clutter, obtaining one image texture per sampling pass, and stacking the texture from each pass as one layer to form an image texture pyramid, whose bottom layer corresponds to the original image size;

down-sampling the original-image texture to obtain a target texture matching the size of the pyramid's top layer, and creating, at that size, a position texture that stores, for the block around each target-texture pixel, the position of its similar matching block in the original-image texture, and a distance texture that stores the distance between each such block and its matching block;

from the top layer of the pyramid down to the bottom layer, at each layer's size, executing the following loop so as to obtain a final target texture equal in size to the original-image texture, updated from the minimum-distance matching-block information of the block around each pixel:

determining, for the block around each pixel in the target texture, the similar matching block in the original-image texture with the minimum distance;

updating the position texture and the distance texture from that minimum-distance matching-block information;

updating the target texture from the updated position and distance textures;

moving down one pyramid layer from the current layer to obtain a new size, up-sampling the target texture to that new size, and up-sampling the position and distance textures to the same new size;

and encoding the final target texture produced by the loop into a target image file.
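The loop defined by the preceding steps follows a coarse-to-fine schedule in which each pyramid layer doubles the working size until the original resolution is reached. A minimal sketch of that schedule (the 512 x 512 original and the four-level pyramid are illustrative assumptions, not values from the text):

```python
def pyramid_sizes(width, height, levels):
    """Working sizes from the top (coarsest) pyramid layer down to the
    bottom layer, which matches the original image size."""
    sizes = [(width >> l, height >> l) for l in range(levels)]
    return list(reversed(sizes))  # top layer first, original size last

schedule = pyramid_sizes(512, 512, 4)
```

Each iteration of the claimed loop would then run at one entry of `schedule`, up-sampling the target, position, and distance textures to the next entry before continuing.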
Further, down-sampling the clutter-mask texture until the sampled texture data contain no clutter comprises:

sampling the clutter-mask texture, starting from the original-image texture size, to obtain an image texture;

judging whether that texture still contains clutter;

if it does, halving the sampling size and continuing to down-sample;

if it does not, stopping the sampling.
Further, the block where a pixel is located is the region centered on that pixel whose radius is a preset value.
Further, creating the position texture and the distance texture of that size comprises:

creating a position texture and a distance texture with the size of the pyramid's top-layer texture;

randomly generating a similar matching block for the block around each pixel in the target texture;

storing, in the position texture, the position of each block's similar matching block within the original-image texture;

and storing, in the distance texture, the distance between each block and its similar matching block in the original-image texture.
Further, determining the minimum-distance similar matching block in the original-image texture for the block around each pixel in the target texture comprises:

defining the window size as one half of the target texture size;

taking each pixel in the target texture as the center and the window size as the radius to obtain nine window position points, including the pixel's own position;

sampling the position texture at those nine window points to obtain the nine matching-block positions they store;

computing the distances between the current pixel's block in the target texture and the blocks at the nine window points and the nine matching-block positions in the original-image texture, and selecting the point with the minimum distance;

and taking the block at that minimum-distance point in the original-image texture as the pixel block's minimum-distance similar matching block.
Further, the determining step also comprises:

halving the window size to obtain a new window size;

and re-determining the minimum-distance similar matching block using the new window size.
Further, updating the target texture from the updated position and distance textures comprises:

establishing a computation region centered on each target pixel of the target texture, with a preset size as its radius;

obtaining, from the position and distance textures, the matching-block position and distance of every pixel in that region;

and computing the color value at the target pixel as a weighted sum of the color values at the matching-block positions of the region's pixels, using the image texture data, positions, and distances, then writing that color value into the target texture.
Further, the layer change and up-sampling step comprises:

moving the pyramid layer index down one layer from the current layer to obtain a new size;

up-sampling the target texture to the new size with linear filtering;

up-sampling the position texture to the new size with linear filtering;

and deriving a distance texture of the new size from the new size and the new position texture.
In a second aspect, an embodiment of the present application further provides a device for high-performance removal of picture clutter using GPU rendering, comprising:

a decoding and loading module for acquiring and decoding an original image and a binary clutter-mask image input by a user, and loading them to obtain an original-image texture and a clutter-mask texture suitable for OpenGL rendering;

a first processing module for down-sampling the clutter-mask texture into an image texture array and sampling the original-image texture against that array to obtain a target texture;

a second processing module for determining, in the original-image texture, a similar matching block for the block around each target-texture pixel and updating the target texture from those blocks to obtain a final target texture;

a third processing module for encoding the final target texture into a target image file;

and an output module for outputting the target image file.
The technical solutions provided by the embodiments of the present application can have the following beneficial effects:

By taking an original image and a binary clutter-mask image as input and combining GPU rendering with an image-inpainting algorithm, the masked regions of the image are repaired, removing the unwanted objects. This addresses the prior-art problem that clutter removal could leave large, abrupt color blocks or wrongly transplant other pixels of the image, producing a poor result.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic flowchart of a method for high-performance removal of picture clutter using GPU rendering according to an embodiment of the present application;

fig. 2 is a schematic structural diagram of a device for high-performance removal of picture clutter using GPU rendering according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
To solve the above problems, the present application provides a method and a device that use GPU rendering to remove picture clutter with high performance.
Examples
Referring to fig. 1, fig. 1 is a schematic flowchart of a method for high-performance removal of picture clutter using GPU rendering according to an embodiment of the present application. As shown in fig. 1, the method includes at least the following steps:

S101, acquiring and decoding an original image and a binary clutter-mask image input by the user, and loading them to obtain an original-image texture and a clutter-mask texture suitable for OpenGL rendering.

Specifically, the original image and the binary clutter-mask image input by the user are decoded into image data, which are loaded respectively into an original-image texture P and a clutter-mask texture M suitable for OpenGL rendering. In the binary mask, 1 marks a clutter region and 0 marks a non-clutter region.
S102, down-sampling the clutter-mask texture until the sampled texture data contain no clutter, obtaining one image texture per sampling pass; each pass's texture becomes one layer of an image texture pyramid, whose bottom layer corresponds to the original image size.

Specifically, the clutter-mask texture M obtained above is sampled downward repeatedly, halving the texture size at each layer, until the current layer's texture data are all 0. After each pass the sampled result is examined: if it still contains clutter, sampling continues; once no clutter remains, i.e. the texture data are all 0, sampling stops.

The sampling results form an image texture pyramid. The number of down-sampling passes is recorded as the pyramid layer count Level, and the textures of all layers as the array Ms[Level], where Ms[0] = M is the bottom layer of the pyramid (the clutter-mask texture itself) and the top layer is the last sampled result that still contains clutter.
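The mask-pyramid construction above can be sketched in NumPy as follows. The 2 x 2 mean-pool with a 0.5 threshold stands in for the texture down-sampling, which the text does not specify precisely, so treat it as an assumption:

```python
import numpy as np

def build_mask_pyramid(mask):
    """Halve the clutter mask until a sampled level contains no marked
    pixels. Mirrors the Ms[Level] array: Ms[0] is the full-resolution
    mask and the top level is the last result that still has clutter."""
    levels = [mask.astype(np.uint8)]
    while levels[-1].any() and min(levels[-1].shape) >= 2:
        m = levels[-1]
        h, w = m.shape[0] // 2, m.shape[1] // 2
        # 2x2 mean-pool, then re-binarize at 0.5 (assumed sampling rule)
        down = m[:h * 2, :w * 2].reshape(h, 2, w, 2).mean(axis=(1, 3)) >= 0.5
        levels.append(down.astype(np.uint8))
    if len(levels) > 1 and not levels[-1].any():
        levels.pop()  # drop the all-zero level so the top keeps clutter
    return levels

mask = np.zeros((8, 8), np.uint8)
mask[2:4, 2:4] = 1                    # a small 2x2 clutter blob
Ms = build_mask_pyramid(mask)
```

With this small blob the blob survives one halving and vanishes at the next, so the pyramid keeps two layers, the top one still marked.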
Note that the image texture array Ms represents the aforementioned pyramid and contains Level textures: each pyramid layer is a single texture, and the array as a whole represents every layer of the pyramid.

S103, the original-image texture is down-sampled to obtain a target texture matching the size of the pyramid's top layer, and a position texture and a distance texture of that size are created: the position texture stores, for the block around each target-texture pixel, the position of its similar matching block in the original-image texture; the distance texture stores the distance between each such block and its matching block.
Specifically, starting from the top layer of the pyramid, i.e. from L = Level - 1, define S as the size of Ms[L]. The original-image texture P is down-sampled to the size S, producing the target texture T. Note that for each pixel p0 in a texture, the block region with block radius PR = 2 (i.e. 5 x 5 pixels) centered on p0 is defined as the block where p0 is located; blocks for all pixels below are defined in the same way.
After the target texture T is obtained, a position texture O of the same size S is generated to record, for the block around each pixel of T, the position of its similar matching block in texture P; and a distance texture D of the same size S is generated to record the distance between each such block and its matching block in P. Distances range from 0 to 65535; the smaller the value, the more similar the blocks.
Note that the distance between two blocks can be obtained from statistics over the color values of all pixels in the blocks; if a pixel lies in the clutter-mask region Ms[L], its distance is treated as larger. The specific computation is not detailed here.
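A CPU-side sketch of such a block distance, assuming a sum-of-squared-differences over a 5 x 5 block (PR = 2) with a large fixed penalty for clutter or out-of-bounds pixels. The SSD form and the penalty constant are assumptions, since the text leaves the exact computation open:

```python
import numpy as np

def patch_distance(T, P, mask, p, q, radius=2):
    """Distance between the block around p in target T and the block
    around q in source P (assumed same size here). Masked or
    out-of-bounds source pixels count as 'far', as the text requires."""
    h, w = P.shape[:2]
    d = 0.0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            ty, tx = p[0] + dy, p[1] + dx
            sy, sx = q[0] + dy, q[1] + dx
            if not (0 <= ty < h and 0 <= tx < w and 0 <= sy < h and 0 <= sx < w):
                d += 255.0 ** 2            # out-of-bounds penalty (assumed)
                continue
            if mask[sy, sx]:
                d += 255.0 ** 2            # clutter pixels are "far"
                continue
            diff = T[ty, tx] - P[sy, sx]
            d += float(np.dot(diff, diff))
    # normalize by block area and clamp into the patent's 0-65535 range
    return min(d / ((2 * radius + 1) ** 2), 65535.0)

src = np.zeros((10, 10, 3))
msk = np.zeros((10, 10), bool)
d0 = patch_distance(src, src, msk, (5, 5), (5, 5))
```

An identical, unmasked pair of blocks gives distance 0; a fully masked source block saturates near the top of the range.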
In addition, after the position and distance textures are created they must be initialized, as follows: in the OpenGL shader program, traverse all pixels of a viewport of size S and perform the following computation for each pixel p11: randomly generate a position p12 as the pixel's similar matching block and write p12 into texture O; compute the distance d between the block of p11 in texture T and the block of p12 in texture P, and write d into texture D.
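The initialization pass can be sketched as below. Filling the distance texture with the maximum value 65535 instead of evaluating each random match is a simplification of this paragraph, which computes the actual distance d at init time; the fixed seed is also an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed, for reproducibility only

def init_nnf(height, width):
    """Random position texture O (one candidate match per pixel) and a
    distance texture D filled with the worst value until evaluated."""
    O = np.stack([rng.integers(0, height, (height, width)),
                  rng.integers(0, width, (height, width))], axis=-1)
    D = np.full((height, width), 65535.0)
    return O, D

O, D = init_nnf(16, 16)
```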
S104, from the top layer of the pyramid down to the bottom layer, a loop is executed at each layer's size to obtain a final target texture equal in size to the original-image texture, updated from the minimum-distance matching-block information of the block around each pixel.

Specifically, S104 includes S1041: determining, for the block around each pixel in the target texture, the similar matching block with the minimum distance in the original-image texture.
Specifically, S10411 defines the window radius from the target texture size: WR is half of the larger dimension of S, i.e. WR = max(S.width, S.height) / 2.
S10412, taking each pixel in the target texture as the center and the window radius as the offset, obtain nine window position points including the pixel's own position.

Specifically, in the OpenGL shader program, traverse all pixels of a viewport of size S and compute, for each pixel p21:

with p21 as the window center and WR as the window radius, the nine position points p2s: the center (p21.x, p21.y), upper left (p21.x - WR, p21.y - WR), top (p21.x, p21.y - WR), upper right (p21.x + WR, p21.y - WR), left (p21.x - WR, p21.y), right (p21.x + WR, p21.y), lower left (p21.x - WR, p21.y + WR), bottom (p21.x, p21.y + WR), and lower right (p21.x + WR, p21.y + WR).
S10413, sample the position texture at the nine window points to obtain the nine matching-block positions they reference. Specifically, texture O is sampled at the nine points p2s, and each sampled match is shifted back by the offset (p2s - p21), yielding the nine candidate matching-block positions op2s.

S10414, compute the distances between the current pixel's block in the target texture and the blocks at the nine window points and the nine matching-block positions in the original-image texture, and select the point with the minimum distance. Specifically, the distances between the block of p21 in texture T and the blocks at the eighteen points p2s and op2s in texture P are computed; comparison yields the minimum distance mind and its position minp in texture P. The block at minp in the original-image texture is the minimum-distance similar matching block for the block of p21.
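A sketch of the nine probe points for one search pass, clamped to the texture bounds (the clamping rule is an assumption; the text does not say how out-of-bounds window points are handled):

```python
def window_candidates(p, wr, h, w):
    """Nine window position points around p = (y, x) at radius wr:
    the center plus its eight compass neighbors, clamped in-bounds."""
    y, x = p
    pts = [(y + dy, x + dx) for dy in (-wr, 0, wr) for dx in (-wr, 0, wr)]
    return [(min(max(py, 0), h - 1), min(max(px, 0), w - 1)) for py, px in pts]

probes = window_candidates((5, 5), 2, 10, 10)
```

At each probe point the position texture O is also sampled and shifted back by the probe offset, giving up to eighteen candidate blocks per pixel; the radius is then halved and the pass repeated while it stays above zero.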
Further, to improve the accuracy of the resulting matching block, after the above search completes the window radius WR can be halved, i.e. WR = WR / 2. While WR remains greater than 0, the process of S10411 - S10414 is repeated with the new radius, refining the minimum-distance matching block and yielding more accurate data.
S1042, updating the position texture and the distance texture from the minimum-distance matching-block information.

Specifically, the process above yields the minimum-distance similar matching block; given its position minp and distance mind, the position texture O is updated to minp and the distance texture D to mind.
S1043, updating the target texture from the updated position and distance textures.

Specifically, this comprises: establishing a computation region centered on each target pixel of the target texture, with a preset size as its radius; obtaining, from the position and distance textures, the matching-block position and distance of every pixel in the region; and computing the color value at the target pixel as a weighted sum of the color values at those matching-block positions, using the image texture data, positions, and distances, then writing it into the target texture.
For example, in the OpenGL shader program, traverse all pixels of a viewport of size S and compute, for each pixel p31: with p31 as the center and the block radius PR as the window radius, traverse all pixels in the window and accumulate a weighted sum of the color values at each pixel p32's matching-block position, with the running color total initialized to c3 = 0 and the running weight total to w3 = 0, as follows:

First, sample textures O and D at position p32 to obtain the matching position p33 and distance d31. Then sample texture Ms[L] at p33 to obtain m3; if m3 = 1, the match lies in a clutter region, so the current pixel is skipped with weight 0. Otherwise sample texture P at p33 to obtain the color value c31, and derive the weight w31 from the distance d31 by a linear transform in which smaller distance gives larger weight. Update c3 = c3 + c31 * w31 and w3 = w3 + w31. The color at p31 is then c32 = c3 / w3, and the value of texture T is updated to c32.
S1044, moving down one pyramid layer from the current layer to obtain a new size, up-sampling the target texture to that new size, and up-sampling the position and distance textures to the same new size.

After processing at the pyramid's top layer is complete, the layer index changes: the current layer L is decremented by 1, i.e. L = L - 1, taking the next layer down and yielding a new size. The target texture T is up-sampled to this size, and the position texture O and distance texture D are brought to the same size.
Specifically, up-sampling the target texture T comprises: updating S to the size of Ms[L] and up-sampling T to size S with linear filtering.

The position texture O is up-sampled to size S with linear filtering as follows: in the OpenGL shader program, traverse all pixels of a viewport of the new size S and compute, for each pixel p61: first, divide p61 by 2 and round down to get the corresponding pixel position p62 in texture O; then sample texture O at p62 to obtain p63; finally, compute the up-sampled value of texture O as p64 = p63 * 2 - (p61 - p62).

For the distance texture D, once the new position texture is obtained as above, the distance d6 between the block of p61 in the target texture T and the block of p64 in the original-image texture P is computed, and the value of texture D is updated to d6.
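A sketch of the position-texture up-sampling. Note that the formula as printed above appears garbled by translation; the version below uses the conventional coarse-to-fine rule for PatchMatch-style position maps (double the parent's match and add the sub-pixel remainder), and should be treated as an assumed reconstruction rather than the patent's exact formula:

```python
import numpy as np

def upsample_positions(O_coarse):
    """Up-sample a position texture to twice the resolution: each fine
    pixel reads its coarse parent's match, doubles it, and adds the
    sub-pixel remainder (assumed reconstruction of the rule above)."""
    h, w = O_coarse.shape[:2]
    O_fine = np.zeros((h * 2, w * 2, 2), dtype=int)
    for y in range(h * 2):
        for x in range(w * 2):
            py, px = y // 2, x // 2       # coarse parent (p62 in the text)
            my, mx = O_coarse[py, px]     # parent's match (p63 in the text)
            O_fine[y, x] = (my * 2 + (y - py * 2), mx * 2 + (x - px * 2))
    return O_fine

O_c = np.zeros((2, 2, 2), int)
O_c[1, 1] = (1, 1)
O_f = upsample_positions(O_c)
```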
That is, after the target texture T is updated based on the image texture pyramid top layer, the pyramid layer is changed based on the target texture T, that is, the second-level data of the pyramid is taken, the processes of S1041-S1044 are performed again, and a loop is performed to obtain the target texture T updated based on the second-level data.
It can be understood that after the target texture T has been sampled, calculated and updated on each layer of pyramid data, that is, before the pyramid layer is changed, a loop-exit check is performed: if the current pyramid layer is 0, i.e., the bottom layer, then the target texture T obtained by the above process is already the final target texture, having the same size as the original image texture P and with the matching-block information of the block containing each pixel updated to the minimum-distance similar match; the loop then stops and the final target texture T is output.
In addition, to improve the accuracy of the final target texture, after the target texture is updated at each layer, the process of S1041-S1043 is repeated L × 2 times, where L is the current pyramid layer number, so as to ensure the accuracy of the data.
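The per-layer repeat count described above can be captured in a short schedule. This Python sketch only records which layers are visited and how many extra S1041-S1043 repeats each receives; the tuple format is an assumption:

```python
def refinement_schedule(num_layers):
    """Pairs (layer L, extra repeats of S1041-S1043), visiting layers from
    the pyramid top (num_layers - 1) down to the bottom (0); each layer
    repeats the match/update steps L * 2 times, as described in the text."""
    return [(L, L * 2) for L in range(num_layers - 1, -1, -1)]

schedule = refinement_schedule(4)
```

Coarser layers thus receive more refinement passes, which is cheap because they contain fewer pixels.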
S105: according to the final target texture obtained by the above loop, a target image file is generated by encoding.
Specifically, after the final target texture is obtained, its image data is read back and encoded to generate a JPG or PNG image file.
According to the method for high-performance picture clutter removal using GPU rendering provided by the embodiments of the present application, the clutter region in a picture is repaired by taking the original picture and the clutter-marked binary picture as input and applying a GPU-rendered picture repairing algorithm, thereby achieving the effect of removing clutter from the picture and solving the problem of poor removal quality when eliminating picture clutter in the prior art.
Based on the same inventive concept, an embodiment of the present application further provides a device for high-performance picture clutter removal using GPU rendering, including: a decoding and loading module 201, a first processing module 202, a second processing module 203, a third processing module 204 and an output module 205.
The decoding and loading module 201 is configured to acquire and decode an original image and a clutter-marked binary image input by a user, and load them to obtain an original image texture and a clutter-marked binary image texture suitable for OpenGL rendering;
the first processing module 202 is configured to down-sample the clutter-marked binary image texture to obtain an image texture array, and sample the original image texture based on the image texture array to obtain a target texture;
the second processing module 203 is configured to determine, for the block containing each pixel in the target texture, a similar matching block in the original image texture, and update the target texture based on the similar matching block to obtain a final target texture;
the third processing module 204 is configured to encode and generate a target image file based on the final target texture;
and the output module 205 is configured to output the target image file.
It is understood that the same or similar parts of the above embodiments may be cross-referenced; for content not described in detail in one embodiment, reference may be made to the related description in other embodiments.
It should be noted that, in the description of the present application, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Further, in the description of the present application, the meaning of "a plurality" means at least two unless otherwise specified.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (9)

1. A method for high-performance picture clutter removal using GPU rendering, characterized by comprising the following steps:
acquiring and decoding an original image and a clutter-marked binary image input by a user, and loading them to obtain an original image texture and a clutter-marked binary image texture suitable for OpenGL rendering;
down-sampling the clutter-marked binary image texture until no clutter exists in the sampled image texture data, to obtain a plurality of image textures each corresponding to one sampling, and taking the image texture obtained by each sampling as one layer of data to form an image texture pyramid, wherein the image texture of the lowest layer of the image texture pyramid corresponds to the size of the original image;
down-sampling the original image texture to obtain a target texture of the same size as the topmost layer of the image texture pyramid, and establishing a position texture of that size, storing the position information in the original image texture of the similar matching block corresponding to the block containing each pixel in the target texture, and a distance texture of that size, storing the distance information between the block containing each pixel in the target texture and its corresponding similar matching block in the original image texture;
from the topmost layer to the bottommost layer of the image texture pyramid, based on the size of each layer, executing the following loop to obtain a final target texture having the same size as the original image texture, with the matching-block information of the block containing each pixel updated to the minimum-distance similar match:
determining, for the block containing each pixel in the target texture, the similar matching block with the minimum distance in the original image texture;
updating the position texture and the distance texture based on the information of the minimum-distance similar matching block;
updating the target texture based on the updated position texture and distance texture;
changing the pyramid layer number of the image texture to the next layer down from the current layer to obtain a new size, re-sampling the target texture against the original image texture to obtain a target texture of the new size, and up-sampling both the position texture and the distance texture to obtain a position texture of the new size and a distance texture of the new size;
and encoding to generate a target image file according to the final target texture obtained by the loop.
2. The method according to claim 1, wherein down-sampling the clutter-marked binary image texture until no clutter exists in the sampled image texture data comprises:
sampling the clutter-marked binary image texture based on the original image texture size to obtain an image texture;
judging whether the image texture contains clutter;
if so, halving the sampling size and continuing the down-sampling;
if not, stopping the sampling.
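As an illustrative, non-limiting sketch of this down-sampling loop, the pyramid can be built as follows. Nearest-neighbour sampling is an assumption, since the claim does not fix the down-sampling filter; with it, sufficiently small clutter regions vanish at coarser levels:

```python
def build_mask_pyramid(mask):
    """Halve the clutter-mask texture repeatedly until a level contains no
    clutter pixel (non-zero value). layers[0] is the full-resolution mask
    (pyramid bottom); layers[-1] is the clutter-free top."""
    def has_clutter(m):
        return any(v for row in m for v in row)

    def halve(m):
        h, w = max(len(m) // 2, 1), max(len(m[0]) // 2, 1)
        # Nearest-neighbour: take every second pixel.
        return [[m[y * 2][x * 2] for x in range(w)] for y in range(h)]

    layers = [mask]
    while has_clutter(layers[-1]) and (len(layers[-1]) > 1 or len(layers[-1][0]) > 1):
        layers.append(halve(layers[-1]))
    return layers

mask = [[0] * 4 for _ in range(4)]
mask[1][1] = 1                     # a single clutter pixel
pyramid = build_mask_pyramid(mask)
```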
3. The method according to claim 1, wherein the block where a pixel is located is:
an area block with the pixel as its center point and a preset value as its radius.
4. The method according to claim 1, wherein establishing the position texture of that size, storing the position information in the original image texture of the similar matching block corresponding to the block containing each pixel in the target texture, and the distance texture of that size, storing the distance information between the block containing each pixel in the target texture and its corresponding similar matching block in the original image texture, comprises:
establishing a position texture and a distance texture of the size of the topmost image texture of the image texture pyramid;
randomly generating a similar matching block for the block containing each pixel in the target texture;
storing, in the position texture, the position information in the original image texture of the similar matching block corresponding to the block containing each pixel in the target texture;
and storing, in the distance texture, the distance information between the block containing each pixel in the target texture and its corresponding similar matching block in the original image texture.
5. The method according to claim 1, wherein determining, for the block containing each pixel in the target texture, the similar matching block with the minimum distance in the original image texture comprises:
defining a window size as one half of the target texture size;
taking each pixel point in the target texture as a center point and the window size as a radius to obtain nine window position points, including the position point of the pixel itself;
sampling the position texture at the nine window position points to obtain nine similar-matching-block position points respectively corresponding to the nine window position points;
calculating the distances between the current block containing the pixel in the target texture and, in the original image texture, the nine window position points and the nine similar-matching-block position points, and determining the position point with the minimum distance;
and determining the block corresponding to the minimum-distance position point in the original image texture as the minimum-distance similar matching block, in the original image texture, of the block containing the pixel.
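The nine candidate window positions of claim 5 can be enumerated as in this Python sketch (function name and clamping-to-texture behaviour are assumptions; the claim itself only specifies the nine points):

```python
def nine_window_points(p, win, w, h):
    """Return the nine window position points around pixel p: the pixel's
    own position plus the eight points at radius `win` in each direction,
    clamped to the w-by-h texture bounds."""
    x, y = p
    pts = []
    for dy in (-win, 0, win):
        for dx in (-win, 0, win):
            pts.append((min(max(x + dx, 0), w - 1),
                        min(max(y + dy, 0), h - 1)))
    return pts

pts = nine_window_points((4, 4), 2, 10, 10)
```

Per claim 6, the window size is then halved (`win //= 2`) and the candidate search repeated with the smaller radius.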
6. The method according to claim 5, wherein determining, for the block containing each pixel in the target texture, the similar matching block with the minimum distance in the original image texture further comprises:
halving the window size to obtain a new window size;
and re-determining the minimum-distance similar matching block based on the new window size.
7. The method according to claim 1, wherein updating the target texture based on the updated position texture and distance texture comprises:
establishing a calculation region with each target pixel in the target texture as a center point and a preset size as a radius;
acquiring, based on the position texture and the distance texture, the position information and distance information of the similar matching block of each pixel point in the calculation region;
and performing a weighted summation of the color values at the similar-matching-block positions corresponding to each pixel point in the calculation region, based on the image texture data, the position information and the distance information, to obtain the color value of the target pixel position, and updating this color value into the target texture.
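The weighted summation of claim 7 can be sketched as follows. The claim does not specify the weighting function; an inverse-distance weight 1/(d + eps) is a common choice and is an assumption here, as is the single-channel color representation:

```python
def blend_target_pixel(colors, distances, eps=1e-6):
    """Weighted sum of matching-block colors for one target pixel.
    Closer matches (smaller block distance) contribute more; eps avoids
    division by zero for an exact match."""
    weights = [1.0 / (d + eps) for d in distances]
    total = sum(weights)
    return sum(c * w for c, w in zip(colors, weights)) / total

# Two equally distant matches average their colors.
c = blend_target_pixel([1.0, 3.0], [1.0, 1.0])
```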
8. The method according to claim 1, wherein changing the pyramid layer number of the image texture to the next layer down from the current layer to obtain a new size, re-sampling the target texture against the original image texture to obtain a target texture of the new size, and up-sampling both the position texture and the distance texture to obtain a position texture of the new size and a distance texture of the new size comprises:
changing the image texture pyramid layer number to the next layer down from the current layer to obtain a new size;
up-sampling with linear filtering based on the new size to obtain a target texture of the new size;
up-sampling with linear filtering based on the new size to obtain a position texture of the new size;
and obtaining a distance texture of the new size based on the new size and the position texture of the new size.
9. A device for high-performance picture clutter removal using GPU rendering, comprising:
a decoding and loading module, configured to acquire and decode an original image and a clutter-marked binary image input by a user, and load them to obtain an original image texture and a clutter-marked binary image texture suitable for OpenGL rendering;
a first processing module, configured to down-sample the clutter-marked binary image texture to obtain an image texture array, and sample the original image texture based on the image texture array to obtain a target texture;
a second processing module, configured to determine, for the block containing each pixel in the target texture, a similar matching block in the original image texture, and update the target texture based on the similar matching block to obtain a final target texture;
a third processing module, configured to encode and generate a target image file based on the final target texture;
and an output module, configured to output the target image file.
CN202111121389.1A 2021-09-24 2021-09-24 Method and apparatus for high performance picture clutter removal using GPU rendering Active CN113822815B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111121389.1A CN113822815B (en) 2021-09-24 2021-09-24 Method and apparatus for high performance picture clutter removal using GPU rendering


Publications (2)

Publication Number Publication Date
CN113822815A true CN113822815A (en) 2021-12-21
CN113822815B CN113822815B (en) 2024-02-06

Family

ID=78915346


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105447489A (en) * 2015-11-13 2016-03-30 浙江传媒学院 Character and background adhesion noise elimination method for image OCR system
WO2019011046A1 (en) * 2017-07-13 2019-01-17 华为技术有限公司 Image processing method, device and system
CN111710018A (en) * 2020-06-29 2020-09-25 广东小天才科技有限公司 Method and device for manually smearing sundries, electronic equipment and storage medium
CN111861869A (en) * 2020-07-15 2020-10-30 广州光锥元信息科技有限公司 Image processing method and device for beautifying portrait and preventing background distortion
CN113409411A (en) * 2021-05-26 2021-09-17 腾讯科技(深圳)有限公司 Rendering method and device of graphical interface, electronic equipment and storage medium


Non-Patent Citations (1)

Title
Fu Meiyan; Cui Shouliang; Li Zuozhi: "Research on a Simulation Method for Grassland Visual-Scene Image Quality Optimization", Computer Simulation, no. 04, pages 251-255 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant