CN103903241A - Image super-resolution method based on sample with adjacent sides - Google Patents

Image super-resolution method based on sample with adjacent sides

Info

Publication number
CN103903241A
CN103903241A (application CN201410141448.5A; granted as CN103903241B)
Authority
CN
China
Prior art keywords
sample
image
resolution
block
low
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410141448.5A
Other languages
Chinese (zh)
Other versions
CN103903241B (en)
Inventor
端木春江
王泽思
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Sen Bao Textile Technology Co ltd
Original Assignee
Zhejiang Normal University CJNU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Normal University CJNU filed Critical Zhejiang Normal University CJNU
Priority to CN201410141448.5A priority Critical patent/CN103903241B/en
Priority claimed from CN201410141448.5A external-priority patent/CN103903241B/en
Publication of CN103903241A publication Critical patent/CN103903241A/en
Application granted granted Critical
Publication of CN103903241B publication Critical patent/CN103903241B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses an example-based image super-resolution method. Unlike conventional methods, each sample is extended by a border of neighboring pixels added around the traditional low-resolution sample. During online processing, instead of magnifying the image blocks independently, the method magnifies them so that successive blocks overlap. A newly designed matching criterion is used to select, from the database, the high-resolution block that best corresponds to each region to be magnified. Finally, the high-resolution pixel values in the overlapping regions are combined by weighted averaging to obtain the high-resolution image corresponding to the given low-resolution image.

Description

Image super-resolution method based on samples with adjacent borders
Technical field
The present invention relates to super-resolution technology in image processing. Given an image, the goal is to produce a magnified version of that image, and the clearer the magnified image, the better. The technology can be applied, for example, to magnifying small images stored on the Internet at reduced size because of bandwidth limits, and therefore has wide application.
Background technology
Two classes of super-resolution methods currently exist. One class is based on interpolation; the other is based on samples (examples). Interpolation-based super-resolution is fairly simple and has low complexity, but the magnified images it produces are generally blurry and the image edges are not sharp. To overcome this shortcoming, example-based super-resolution methods have been proposed. These methods consist of two steps: the first step builds an image sample database, and the second step performs example-based image super-resolution. In the first step, an offline training process is carried out. A set of sharp high-resolution images is collected, and each high-resolution image is down-sampled or filtered to obtain a corresponding low-resolution image. The low-resolution and high-resolution images are then divided into blocks; the blocks of the low-resolution image may be, for example, 4 × 4 or 5 × 5 pixels, and for every block of the low-resolution image a corresponding block can be found among the high-resolution image blocks. For example, when the required magnification factor is 2, each 4 × 4 block of the low-resolution image corresponds to (matches) an 8 × 8 block of the high-resolution image. Thus, when the low-resolution image block has size N × N, the pixel values of the low-resolution block form the set:
q(x, y, k) = { f_L(x, y, k, i, j) | x ≤ i ≤ x+N−1, y ≤ j ≤ y+N−1 }
Here k indexes the k-th image in the training library, (x, y) is the position of the top-left corner of the block in the low-resolution image, and f_L(x, y, k, i, j) is the pixel value at (i, j) of the k-th low-resolution image. Corresponding to the low-resolution block represented by the set q(x, y, k), the pixel values of the matching high-resolution block form the set:
Q(x, y, k) = { f_H(x, y, k, i, j) | 2x ≤ i ≤ 2x+2N−1, 2y ≤ j ≤ 2y+2N−1 }
Here f_H(x, y, k, i, j) is the pixel value at (i, j) of the k-th high-resolution image. The block represented by each low-resolution set q(x, y, k), the matching high-resolution block represented by Q(x, y, k), and this correspondence are all stored in the sample database. Each stored low-resolution block is called a sample. After every high-resolution image in the training library has been processed in this way, a sample database is obtained that describes a large number of low-resolution blocks (samples) and their corresponding high-resolution blocks.
Once the training database has been built, online super-resolution processing can be performed on a given low-resolution image that is not in the training library. In this step, the low-resolution image to be magnified is first divided into blocks of the same size as the low-resolution blocks in the sample database; for instance, if the samples in the database are all 4 × 4, the low-resolution image to be magnified is divided into 4 × 4 blocks. Then, for each block to be magnified, the sample in the database that is closest to it is found. For the set formed by the pixels of the block to be magnified,
s(x_0, y_0) = { g_L(i, j) | x_0 ≤ i ≤ x_0+N−1, y_0 ≤ j ≤ y_0+N−1 }
the sum of absolute differences between the block to be magnified and a block q(x, y, k) in the training library is computed first:
SAD(x, y, k) = Σ_{i=0}^{N−1} Σ_{j=0}^{N−1} | g_L(x_0+i, y_0+j) − f_L(x, y, k, i, j) |
Here (x_0, y_0) is the position of the top-left corner of the block to be magnified, and g_L(i, j) is the pixel value at (i, j) of the low-resolution image to be magnified. Next, the block in the training library that best matches the block to be magnified is found by computing
(x_o, y_o, k_o) = arg min_{x, y, k} SAD(x, y, k)
The block represented by the set q(x_o, y_o, k_o) in the training library is then the best match to, i.e. the nearest block to, the block to be magnified. The correspondence stored in the training database between the block represented by q(x_o, y_o, k_o) and the block represented by Q(x_o, y_o, k_o) can now be used for super-resolution magnification: the set formed by the pixels of the corresponding block of the high-resolution image,
S(x_0, y_0) = { g_H(i, j) | 2x_0 ≤ i ≤ 2x_0+2N−1, 2y_0 ≤ j ≤ 2y_0+2N−1 }
has its pixel values replaced by the pixel values in the set Q(x_o, y_o, k_o). Therefore,
g_H(2x_0+i, 2y_0+j) = f_H(x_o, y_o, k_o, i, j), where 0 ≤ i ≤ 2N−1 and 0 ≤ j ≤ 2N−1.
For example, for a 4 × 4 block to be magnified, the best-matching high-resolution 8 × 8 block stored in the training database is found first, and this matching block is then used as the 8 × 8 block at the position in the high-resolution image corresponding to the block to be magnified.
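To make this baseline concrete, the following sketch (an illustrative Python/NumPy rendering written for this description, not code from the patent; down-sampling by 2 and non-overlapping 4 × 4 blocks are assumed) builds the block-pair database and magnifies an image by SAD matching:

import numpy as np

def build_database(hr_images, N=4):
    """Pair every N x N low-resolution block with its 2N x 2N high-resolution block."""
    samples, hr_blocks = [], []
    for hr in hr_images:
        lr = hr[::2, ::2]                                  # down-sample by a factor of 2
        H, W = lr.shape
        for y in range(H - N + 1):
            for x in range(W - N + 1):
                samples.append(lr[y:y+N, x:x+N].astype(float))
                hr_blocks.append(hr[2*y:2*y+2*N, 2*x:2*x+2*N].astype(float))
    return np.stack(samples), np.stack(hr_blocks)

def magnify_baseline(lr, samples, hr_blocks, N=4):
    """Replace each N x N block by the HR block of its nearest (minimum SAD) sample."""
    H, W = lr.shape                                        # assumes H and W are multiples of N
    out = np.zeros((2*H, 2*W))
    for y in range(0, H - N + 1, N):
        for x in range(0, W - N + 1, N):
            block = lr[y:y+N, x:x+N].astype(float)
            sad = np.abs(samples - block).sum(axis=(1, 2)) # SAD against every stored sample
            k = int(np.argmin(sad))
            out[2*y:2*y+2*N, 2*x:2*x+2*N] = hr_blocks[k]
    return out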
Other super-resolution methods have also been proposed in the literature. In these methods, for each block to be magnified, the k samples closest to it are found in the sample database, the k high-resolution blocks matching those samples are retrieved, and the required high-resolution block is then obtained by a weighted average of these k high-resolution blocks.
Once every low-resolution block of the low-resolution image has obtained the high-resolution block that replaces it, these high-resolution blocks are stitched together into a high-resolution image, completing the image super-resolution process.
In the above procedure, the training database is used independently for each low-resolution block to obtain its high-resolution block. As a result, the reconstructed high-resolution image generally exhibits strong blocking artifacts: pseudo-edges or spurious jumps appear on or near the boundaries between image blocks. These blocking artifacts severely degrade the visual quality of the reconstructed high-resolution image.
In addition, these methods suffer from a many-to-one problem: low-resolution blocks that differ only slightly may correspond, in the training database, to high-resolution blocks that differ greatly, i.e. several very different high-resolution blocks correspond to nearly the same low-resolution block. Consequently, a small perturbation of a low-resolution block can produce a drastically different high-resolution image. This phenomenon cannot be overcome within this class of methods.
Therefore, an image super-resolution method that effectively overcomes the above shortcomings needs to be proposed. The present invention establishes a new type of sample to overcome these shortcomings and obtain better super-resolution performance.
Summary of the invention
(1) Construction of samples with adjacent borders
In the present invention, samples are extracted differently from the traditional method. In the traditional method, after the low-resolution image is divided into blocks, the block size is the sample size, so the information of the pixels surrounding each block is ignored. The inventors believe that if this information, ignored by the traditional method, can be exploited in image super-resolution processing, the performance of super-resolution processing can be improved considerably.
To this end, the present invention proposes an image super-resolution method based on samples with adjacent borders. In this method, each extracted sample contains not only the image information of the original block but also, as far as possible, the information of the neighboring pixels surrounding the block in the low-resolution image. For example, when the low-resolution block has size N × N, the extracted sample is no longer 4 × 4 but has size (N+m) × (N+m), as shown in Fig. 1 of the drawings, where m is the number of pixels in the horizontal or vertical direction by which the extracted sample extends beyond the block boundary. These pixel values are taken from the low-resolution image.
The size of the high-resolution block corresponding to this sample is the same as in the original method and remains unchanged; for example, when the low-resolution block is 4 × 4, the high-resolution block is still 8 × 8.
Thus, a traditional training database stores a number of N × N samples together with the corresponding (2N) × (2N) high-resolution matching blocks, where N × N is the low-resolution block size. The training library of the proposed method instead stores a number of (N+m) × (N+m) samples, each with its corresponding (2N) × (2N) high-resolution matching block. That is, the pixel values of a low-resolution sample form a new set:
q′(x, y, k) = { f_L(x, y, k, i, j) | x−m ≤ i ≤ x+m+N−1, y−m ≤ j ≤ y+m+N−1 }
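The extraction of such a bordered sample can be pictured with the following minimal sketch (hypothetical names lr_padded, hr, N and m introduced for this example; the window here extends the N × N block by m pixels on every side of a border-padded low-resolution image):

import numpy as np

def extract_pair(lr_padded, hr, x, y, N=4, m=2):
    """Bordered low-resolution sample and its matching 2N x 2N high-resolution block.

    lr_padded is the low-resolution image already padded by m pixels on every side,
    so low-resolution coordinate (x, y) corresponds to (x + m, y + m) in lr_padded.
    """
    sample = lr_padded[y:y + N + 2*m, x:x + N + 2*m]       # block plus an m-pixel border on each side
    hr_block = hr[2*y:2*y + 2*N, 2*x:2*x + 2*N]            # matching HR block, size unchanged
    return sample, hr_block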
(2) Identifying edge pixels and their weights in the matching process
Because the human eye is most sensitive to the edge information of an image, the inventors consider that, in the block-matching optimization carried out during super-resolution, edge pixel values should receive larger weights than ordinary pixel values, so that the magnified image has clearer edges.
Many edge-detection methods exist in image processing, but first-order gradient methods are relatively sensitive to noise. The present invention therefore extracts edges with the second-order Laplacian operator. When a one-dimensional signal is processed with the Laplacian, a positive and a negative value of comparatively large magnitude appear on the two sides of an edge point, and the edge point itself is a zero crossing. Accordingly, the pixels of the image are first convolved with the template
0  1  0
1 -4  1
0  1  0
Then points whose absolute value is less than |δ| are located, where δ is a small value. Around each such point, within a Q × Q region (Q = 5 in the present invention), a search is made along the 0°, 45°, 90°, 135° and 180° directions for a positive or a negative value whose absolute value exceeds T (T = 200 in the present invention), and then, in the opposite direction, for a value of opposite sign whose absolute value also exceeds T. If both are found, the point is a zero crossing of the two-dimensional Laplacian, i.e. an edge point of the image; all other points are non-edge points. Edge points and non-edge points receive the weights A and B respectively:
p(i, j) = A if (i, j) is an edge point, and p(i, j) = B otherwise,
where (i, j) is the position of the pixel in the image (A = 1, B = 1.5 in the present invention).
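A minimal sketch of this edge-weight computation is given below (illustrative Python/NumPy code; δ is only described as a small value, so its default here is an assumption, and the five-direction scan is simplified to a test for a strong positive and a strong negative Laplacian response in the Q × Q neighborhood):

import numpy as np
from scipy.ndimage import convolve

def edge_weights(img, delta=1.0, T=200.0, Q=5, A=1.0, B=1.5):
    """Weight map: A at Laplacian zero-crossing (edge) pixels, B at all other pixels."""
    lap = convolve(img.astype(float),
                   np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=float),
                   mode='nearest')
    h = Q // 2
    weights = np.full(img.shape, B)
    for y in range(h, img.shape[0] - h):
        for x in range(h, img.shape[1] - h):
            if abs(lap[y, x]) >= delta:
                continue                                   # not a zero-crossing candidate
            win = lap[y - h:y + h + 1, x - h:x + h + 1]
            # simplified test: the Q x Q neighbourhood must contain both a strong
            # positive and a strong negative Laplacian response
            if win.max() > T and win.min() < -T:
                weights[y, x] = A
    return weights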
(3) Creation and use of overlapping regions
In traditional super-resolution algorithms there is no overlap between the block to be magnified and the blocks already magnified, which to some extent causes blocking artifacts. How to create such overlapping regions during super-resolution, and how to exploit them to improve the performance of super-resolution processing, is an innovative point of the present invention. Because the samples established in the present invention carry adjacent-border pixel information, each sample overlaps already-magnified regions on the low-resolution image. Moreover, in the present invention the top-left corner of the next block to be magnified is, after each block magnification, moved by a distance smaller than the block length; overlapping regions between the region to be magnified and the already-magnified regions therefore arise not only on the low-resolution image but also on the high-resolution image. For example, if the first magnified block is 4 × 4 with its top-left corner at (0, 0), and the next block to be magnified is also 4 × 4 with its top-left corner moved to (2, 0), then overlapping regions between the region to be magnified and the already-magnified region appear on both the low-resolution and the high-resolution image. The overlapping region on the low-resolution image is the set OL_1 = {(i, j) | 2 ≤ i ≤ 3, 0 ≤ j ≤ 3}, and the overlapping region on the high-resolution image is the set OH_1 = {(i, j) | 4 ≤ i ≤ 7, 0 ≤ j ≤ 7}. As the super-resolution process proceeds, a block to be magnified may in the present invention overlap several already-magnified blocks.
(4) Determining the matching difference between the region to be magnified and the blocks in the training library
Because the samples of the present invention carry adjacent-border pixels, this heuristic information can be exploited when the optimal sample is sought in the training library, which avoids the blocking artifacts and the many-to-one problem of traditional algorithms and yields better image quality after magnification.
Also unlike the traditional method, the proposed method considers not only the matching difference between a low-resolution block in the training library and the current region to be magnified; during magnification it also considers the matching difference between the high-resolution block corresponding to that low-resolution block and the already-magnified regions of the enlarged image, in order to obtain a better result.
After the samples and the training database have been modified as above, the definition of the matching difference between the region to be magnified and a block in the training library during super-resolution magnification also changes somewhat, so that the advantages of the proposed sample pattern can be exploited in super-resolution processing.
Therefore, when the matching block corresponding to each region to be magnified is selected, not only the pixel values inside the region are used, but also the values of the pixels adjacent to this region in the low-resolution image. At the same time, the differences between the region to be magnified and the already-magnified regions over their overlapping parts, on both the low-resolution image and the high-resolution image, must also be considered.
Therefore, first calculate
SADEO(x, y, k, L_ol) = Σ_{(i,j)∈L_ol} | f_L(x, y, k, i, j) − g_L(i, j) | · p_L(i, j)
where L_ol is the intersection of the current region to be magnified with the ol-th already-magnified region, f_L(x, y, k, i, j) is the pixel value at (i, j) of the (x, y, k)-th sample in the training library, g_L(i, j) is the pixel value at (i, j) of the low-resolution image, and p_L(i, j) is the weight indicating whether (i, j) in the low-resolution image is an edge pixel, computed as described above. Next, compute the matching difference between this candidate sample and all overlapping regions on the low-resolution image:
SADLO(x, y, k) = Σ_{ol} SADEO(x, y, k, L_ol)
SADLO(x, y, k) thus represents the sum of absolute differences between the (x, y, k)-th sample and all overlapping regions.
Then compute the absolute difference between the high-resolution block corresponding to the (x, y, k)-th sample in the training library and each part that overlaps an already-magnified region:
SADEH(x, y, k, H_oh) = Σ_{(i,j)∈H_oh} | f_H(x, y, k, i, j) − g_H(i, j) | · p_H(i, j)
Here H_oh is the intersection of the current region to be magnified with the oh-th already-magnified region, f_H(x, y, k, i, j) is the pixel value at (i, j) of the block in the training library corresponding to the (x, y, k)-th sample, g_H(i, j) is the pixel value of the overlap between the oh-th already-magnified block and the block to be magnified, and p_H(i, j) is the weight indicating whether (i, j) in the high-resolution image is an edge pixel, computed as described above. Next, compute the sum of absolute differences between the high-resolution block corresponding to this candidate sample and the already-magnified parts of the high-resolution image that it overlaps:
SADHO(x, y, k) = Σ_{oh} SADEH(x, y, k, H_oh)
SADHO(x, y, k) thus represents the sum of absolute differences between the high-resolution block corresponding to the (x, y, k)-th sample and all already-magnified overlapping regions.
Finally, determine the absolute difference between the sample and the block to be magnified over the non-overlapping region of the low-resolution image, i.e. compute
SADNO(x, y, k) = Σ_{(i,j)∈L_no} | f_L(x, y, k, i, j) − g_L(i, j) | · p_L(i, j)
where L_no denotes the part of the low-resolution image that does not overlap with sample (x, y, k), and p_L(i, j) is the edge-pixel weight in the low-resolution image, computed as described above.
The matching difference corresponding to sample (x, y, k) is then
SADE(x, y, k) = α·SADLO(x, y, k) + β·SADHO(x, y, k) + γ·SADNO(x, y, k)
Here α, β, γ are three balance factors that weight the influence of the different difference terms on the total difference (α = β = γ = 1 in the present invention, for simplicity).
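Putting the three terms together, the matching difference for one candidate could be computed as in the following sketch (hypothetical helper written for this description; the overlap masks and the partly built high-resolution window are assumed to be supplied by the scanning order described above):

import numpy as np

def sade(sample_lr, block_hr, g_lr, g_hr, p_lr, p_hr,
         lr_overlap, hr_overlap, alpha=1.0, beta=1.0, gamma=1.0):
    """Weighted matching difference SADE for one candidate (sample, HR block) pair.

    sample_lr, g_lr : candidate sample and the co-located low-resolution window
    block_hr,  g_hr : candidate HR block and the co-located, partly filled HR window
    p_lr, p_hr      : edge-pixel weight maps on the two windows
    lr_overlap, hr_overlap : boolean masks, True where the window overlaps
                             regions that have already been magnified
    """
    d_lr = np.abs(sample_lr - g_lr) * p_lr
    d_hr = np.abs(block_hr - g_hr) * p_hr
    sadlo = d_lr[lr_overlap].sum()          # SADLO: overlap difference on the low-resolution image
    sadho = d_hr[hr_overlap].sum()          # SADHO: overlap difference on the high-resolution image
    sadno = d_lr[~lr_overlap].sum()         # SADNO: non-overlap difference on the low-resolution image
    return alpha * sadlo + beta * sadho + gamma * sadno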
(5) Determining the pixel values of the high-resolution image
As noted above, some pixels of the high-resolution image lie in the overlapping region of several magnified blocks. The final pixel value of the high-resolution image at such a position must therefore be determined from the values that the individual magnified blocks give there. Because the pixel values of a block with a small mean absolute difference are more reliable, blocks with a small mean absolute difference are given large weights and blocks with a large mean absolute difference are given small weights. To this end, first compute
sum(x, y) = Σ_{i=1}^{L_o} S(i) / SADE(x_o(i), y_o(i), k_o(i))
Here (x, y) is the position of the pixel in the high-resolution image, L_o is the number of overlapping regions covering this pixel, i indexes the i-th overlapping region covering this pixel, SADE(x_o(i), y_o(i), k_o(i)) is the minimum SADE value of the i-th overlapping region (the SADE value is defined in the previous subsection), and S(i) is the number of pixels used in computing SADE(x_o(i), y_o(i), k_o(i)). The pixel value at (x, y) of the high-resolution image is then:
g*_H(x, y) = (1 / sum(x, y)) · Σ_{i=1}^{L_o} [ S(i) / SADE(x_o(i), y_o(i), k_o(i)) ] · f_H(x_o(i), y_o(i), k_o(i), m, n)
where f_H(x_o(i), y_o(i), k_o(i), m, n) is the value of the i-th matching block at this pixel.
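A compact sketch of this weighted averaging at a single pixel (illustrative only; the candidate values, SADE values and pixel counts S(i) are assumed to have been collected during block placement):

import numpy as np

def blend_overlaps(values, sades, sizes):
    """Weighted average of the candidate HR values proposed at one pixel.

    values : pixel values proposed by the overlapping matched HR blocks
    sades  : SADE of the block each value came from
    sizes  : S(i), the number of pixels used when that SADE was computed
    """
    w = np.asarray(sizes, dtype=float) / np.asarray(sades, dtype=float)  # weight ~ 1 / mean absolute difference
    return float((w * np.asarray(values, dtype=float)).sum() / w.sum())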
(6) Initial border padding of the low-resolution and high-resolution images
Because the samples of the present invention carry adjacent borders, for convenience a border is first added around the low-resolution image; the width of this border equals the width of the adjacent border in the samples. The pixel values of the added border pixels are taken from the nearest pixels inside the image.
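This is nearest-pixel (replicate) padding; with NumPy it could be written as the following one-line sketch (m is the assumed border width):

import numpy as np

def add_border(img, m=2):
    """Pad the image by m pixels on every side, replicating the nearest edge pixels."""
    return np.pad(img, m, mode='edge')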
(7) Offline process of the present invention
The present invention is divided into an offline process that establishes the training database and an online image super-resolution magnification process. The offline process for establishing the training database is as follows; each image in the high-resolution image database is processed by the following steps:
Step 1) Down-sample the high-resolution image to obtain the low-resolution image. Set k = 1.
Step 2) Apply the initial border padding described above to the low-resolution image, i.e. add a border to it.
Step 3) Set x_l = 0, y_l = 0.
Step 4) Extract from the low-resolution image the block extending from coordinate (x_l − m, y_l − m) to (x_l + N + m − 1, y_l + N + m − 1) and store it in the database. N × N is the block size and m is the width of the adjacent border of the bordered sample. The values of N and m should be chosen by weighing computational complexity against super-resolution performance.
Step 5) Set x_h = 2·x_l, y_h = 2·y_l.
Step 6) Extract from the high-resolution image the block extending from coordinate (x_h, y_h) to (x_h + 2N − 1, y_h + 2N − 1) and store it in the database. A pair consisting of a low-resolution block (a sample with adjacent border) f_L(x_l, y_l, k) and a high-resolution block f_H(x_h, y_h, k) is thus stored.
Step 7) Set x_l = x_l + 1.
Step 8) If x_l ≤ W − N, jump to step 4) to extract the next pair of low-resolution and high-resolution blocks, where W is the image width.
Step 9) Set x_l = 0, y_l = y_l + 1.
Step 10) If y_l ≤ H − N, jump to step 4) to extract the next pair of low-resolution and high-resolution blocks, where H is the image height.
Step 11) Set k = k + 1 and jump to step 1) to process the next high-resolution image, until all images in the training library have been processed.
After this offline procedure, a database is obtained consisting of a large number of samples usable for super-resolution processing and of the high-resolution blocks matched to those samples.
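A sketch of this offline loop (a hypothetical NumPy rendering of the steps above; down-sampling by 2, replicate padding and a one-pixel scan step follow the description, and the bordered window spans m extra pixels on every side):

import numpy as np

def build_bordered_database(hr_images, N=4, m=2):
    """Offline pass: store every bordered LR sample with its 2N x 2N HR block."""
    samples, hr_blocks = [], []
    for hr in hr_images:                                   # step 11: loop over the training images
        lr = hr[::2, ::2]                                  # step 1: down-sample by 2
        lrp = np.pad(lr, m, mode='edge')                   # step 2: add the m-pixel border
        H, W = lr.shape
        for y in range(H - N + 1):                         # steps 3-10: one-pixel raster scan
            for x in range(W - N + 1):
                samples.append(lrp[y:y + N + 2*m, x:x + N + 2*m].astype(float))
                hr_blocks.append(hr[2*y:2*y + 2*N, 2*x:2*x + 2*N].astype(float))
    return np.stack(samples), np.stack(hr_blocks)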
(8) Online process of the present invention
For a given low-resolution image to be magnified that is not in the database, the present invention uses the following process.
Step 1) Process the initial low-resolution image by adding a border to it.
Step 2) Set x_l = 0, y_l = 0.
Step 3) At this position, compute the proposed matching difference SADE(x, y, k) between, on the one hand, the low-resolution image and the already-magnified parts of the high-resolution image and, on the other hand, the (x, y, k)-th stored sample and its corresponding high-resolution block in the training library.
Step 4) Find the matching sample and its corresponding high-resolution block in the database, i.e. compute at this position
(x_o, y_o, k_o) = arg min_{(x, y, k) ∈ SX} SADE(x, y, k)
where SX is the set of indices of all samples in the training database.
Step 5) Set x_l = x_l + step_x, where step_x is the step by which the next block advances along the x axis of the low-resolution image. Because step_x < N, horizontally overlapping regions are formed.
Step 6) Jump to step 4) to find in the database the matching image-block pair, i.e. the matching sample and its corresponding high-resolution block, for the next overlapping block, until x_l ≥ W − N, where W is the image width and N is the width of a low-resolution block.
Step 7) Set y_l = y_l + step_y, where step_y is the step by which the next block advances along the y axis of the low-resolution image. Because step_y < N, vertically overlapping regions are formed.
Step 8) Jump to step 4) to find in the database the matching image-block pair, i.e. the matching sample and its corresponding high-resolution block, for the next overlapping block, until y_l ≥ H − N, where H is the image height and N is the height of a low-resolution block.
Step 9) Using each high-resolution matching block that has been found, determine the pixel values of the high-resolution image with the method proposed above.
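The online pass could be sketched as follows (illustrative code only; the matching cost here uses just the bordered low-resolution window, whereas the full method also adds the overlap terms on the partly built high-resolution image, and the blending follows the weighted averaging described above):

import numpy as np

def magnify_online(lr, samples, hr_blocks, N=4, m=2, step=2):
    """Online pass: overlapped block matching followed by weighted blending."""
    lrp = np.pad(lr, m, mode='edge')
    H, W = lr.shape
    num = np.zeros((2*H, 2*W))                             # weighted sum of proposed HR values
    den = np.zeros((2*H, 2*W))                             # sum of weights
    for y in range(0, H - N + 1, step):                    # step < N, so blocks overlap
        for x in range(0, W - N + 1, step):
            window = lrp[y:y + N + 2*m, x:x + N + 2*m].astype(float)
            # simplified cost: SAD of the bordered window only; the full method adds
            # the overlap terms computed on the partly built high-resolution image
            cost = np.abs(samples - window).sum(axis=(1, 2))
            k = int(np.argmin(cost))
            w = window.size / max(float(cost[k]), 1e-6)    # weight ~ 1 / mean absolute difference
            num[2*y:2*y + 2*N, 2*x:2*x + 2*N] += w * hr_blocks[k]
            den[2*y:2*y + 2*N, 2*x:2*x + 2*N] += w
    return num / np.maximum(den, 1e-6)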
In summary, the innovative points of the present invention are: 1) in the offline training and in the online super-resolution magnification, samples with adjacent borders are used to overcome the blocking artifacts and the many-to-one problem of traditional methods; 2) in the online magnification, overlapping regions and the proposed matching criterion are used to further overcome the blocking artifacts and the many-to-one problem of traditional methods; 3) when the pixel values in the overlapping regions are determined, the final value is inversely proportional to the absolute difference of each overlapping region and directly proportional to the pixel value, at that point, of the best-matching high-resolution block of the region.
The use of the present invention comprises an online process and an offline process. The offline process can be completed once; it produces a large number of the proposed low-resolution blocks (samples with adjacent borders) and their corresponding high-resolution blocks, and stores this information in the training database. The online process is applied to each image to be magnified; it uses the trained database and the idea of overlapping regions to obtain the high-resolution image.
The concept, specific structure and technical effects of the present invention are further described below with reference to the drawings, so that its objects, features and effects can be fully understood.
Brief description of the drawings
Fig. 1 is a schematic diagram of the proposed sample with adjacent border;
Fig. 2 is a flow chart of the proposed offline method for establishing the samples with adjacent borders and the corresponding database;
Fig. 3 is a flow chart of the proposed online image super-resolution magnification;
Fig. 4 shows some of the face images from the FERET database used by the present invention for training;
Fig. 5 shows experimental results of the present invention and of an existing method on FERET face images outside the training image library. From left to right, the columns are: (a) the input low-resolution image, (b) the result of super-resolution magnification by the best existing example-based method, (c) the result of super-resolution magnification by the proposed method based on samples with adjacent borders, and (d) the actual high-resolution image.
Fig. 6 shows experimental results of the present invention and of an existing method on ID-card face images outside the training library. From left to right, the columns are: (a) the input low-resolution image, (b) the result of super-resolution magnification by the best existing example-based method, (c) the result of super-resolution magnification by the proposed method based on samples with adjacent borders, and (d) the actual high-resolution image.
Embodiment
Embodiments of the present invention are described in detail below with reference to the drawings. The embodiments are implemented on the premise of the technical solution of the present invention, and detailed implementations and specific operating procedures are given, but the protection scope of the present invention is not limited to the following embodiments.
Before the present invention performs image super-resolution processing on an image, a training database must first be established. The database stores the proposed bordered samples and the high-resolution blocks matched to them. The present invention obtains this database with the following steps. First, a large number of high-resolution images are collected; these can be obtained from free online image databases, such as the FERET database and the Yale face database used by the present invention. Every collected high-resolution image is then processed as shown in Fig. 2 of the drawings:
Step 1) Down-sample the high-resolution image to obtain the low-resolution image. Set k = 1 (k = 1 denotes the first image in the database; k is the index of the image).
Step 2) Apply the initial border padding to the low-resolution image, i.e. add a border to it. Because the proposed samples carry adjacent borders, a border must be added to the whole low-resolution image so that bordered samples can also be extracted for the blocks at the image boundary, which simplifies processing. The width of the added border is 2 in the present invention.
Step 3) Set x_l = 0, y_l = 0, where (x_l, y_l) is the coordinate, in the low-resolution image without the added border, of the top-left corner of the low-resolution sample.
Step 4) Extract from the low-resolution image the block extending from coordinate (x_l − 2, y_l − 2) to (x_l + 4 + 2 − 1, y_l + 4 + 2 − 1) and store it in the database. 4 × 4 is the block size and 2 is the width of the adjacent border of the bordered sample. The block size chosen in the present invention is 4 × 4 and the adjacent-border width is 2.
Step 5) Set x_h = 2·x_l, y_h = 2·y_l, where (x_h, y_h) is the coordinate, in the high-resolution image, of the top-left corner of the block that matches the sample.
Step 6) Extract from the high-resolution image the block extending from coordinate (x_h, y_h) to (x_h + 2·4 − 1, y_h + 2·4 − 1) and store it in the database. A pair consisting of a low-resolution block (a sample with adjacent border) f_L(x_l, y_l, k) and a high-resolution block f_H(x_h, y_h, k) is thus stored. Here the bordered sample has size 6 × 6 and the corresponding high-resolution matching block has size 8 × 8.
Step 7) Set x_l = x_l + 1, i.e. move the horizontal coordinate of the sample's top-left corner in the low-resolution image by one pixel to extract the next sample.
Step 8) If x_l ≤ W − 4, jump to step 4) to extract the next pair of low-resolution and high-resolution blocks, where W is the image width. Otherwise the samples of this row have all been extracted and the extraction of the next row of samples begins.
Step 9) Set x_l = 0, y_l = y_l + 1, i.e. move the top-left coordinate of the sample on the low-resolution image down to the beginning of the next row.
Step 10) If y_l ≤ H − N, jump to step 4) to extract the next pair of low-resolution and high-resolution blocks, where H is the image height. Otherwise the sample-extraction process for the current image is complete.
Step 11) Set k = k + 1 and jump to step 1) to process the next high-resolution image, until all images in the training library have been processed.
After this offline procedure, a database is obtained consisting of a large number of samples usable for super-resolution processing and of the high-resolution blocks matched to those samples.
After the database has been built, the present invention can perform super-resolution processing on an arbitrary low-resolution image to magnify it. An image is magnified, as shown in Fig. 3 of the drawings, by the following processing:
Step 1) Process the initial low-resolution image by adding a border of width 2 to it.
Step 2) Set x_l = 0, y_l = 0, where (x_l, y_l) is the coordinate of the top-left corner of the block to be magnified in the low-resolution image.
Step 3) At this position, compute the matching difference SADE(x, y, k) between, on the one hand, the low-resolution image and the already-magnified parts of the high-resolution image and, on the other hand, the (x, y, k)-th stored sample and its corresponding high-resolution block in the training library. This matching difference includes the difference between the sample and the current low-resolution image block as well as the difference over the overlapping regions of the low-resolution image, and also the difference, over the overlapping regions of the high-resolution image, between the already-magnified blocks and the high-resolution block corresponding to the sample.
Step 4) Find the matching sample and its corresponding high-resolution block in the database, i.e. compute at this position
(x_o, y_o, k_o) = arg min_{(x, y, k) ∈ SX} SADE(x, y, k)
where SX is the set of indices of all samples in the training database.
Step 5) Set x_l = x_l + step_x, where step_x is the step by which the next block advances along the x axis of the low-resolution image; step_x = 2 in the present invention. Because step_x < N, horizontally overlapping regions are formed. The present invention considers that only by fully using the information of the overlap between the current region to be magnified and the previously magnified regions, on both the low-resolution and the high-resolution image, can the blocking artifacts and the many-to-one problem of super-resolution processing be further reduced.
Step 6) Jump to step 4) to find in the database the matching image-block pair, i.e. the matching sample and its corresponding high-resolution block, for the next overlapping block, until x_l ≥ W − N, where W is the image width and N = 4 is the width of a low-resolution block.
Step 7) Set y_l = y_l + step_y, where step_y = 2 in the present invention is the step by which the next block advances along the y axis of the low-resolution image. Because step_y < N, vertically overlapping regions are formed.
Step 8) Jump to step 4) to find in the database the matching image-block pair, i.e. the matching sample and its corresponding high-resolution block, for the next overlapping block, until y_l ≥ H − N, where H is the image height and N = 4 is the height of a low-resolution block.
Step 9) Using each high-resolution matching block that has been found, determine the pixel values of the high-resolution image with the method proposed above. For the overlapping regions of the high-resolution image, the final high-resolution pixel values are obtained mainly by weighted averaging.
The present invention is tested on face databases, and the experimental results are evaluated both subjectively and objectively. Subjective evaluation refers to the impression of detail that a human observer obtains from the reconstructed image; objective evaluation mainly uses the MSE, PSNR and MSSIM (mean SSIM) criteria. MSE is the mean squared error, defined as:
MSE = (1 / (2W × 2H)) · Σ_{y=1}^{2H} Σ_{x=1}^{2W} ( g*_H(x, y) − g_H(x, y) )²
Here 2W × 2H is the size of the high-resolution image, g*_H(x, y) is the pixel value at (x, y) of the high-resolution image magnified from the low-resolution image, and g_H(x, y) is the pixel value at (x, y) of the original high-resolution image.
PSNR is defined as:
PSNR = 10 × log_10( max² / MSE )
where max denotes the maximum possible pixel value of the image. SSIM is defined as:
SSIM = ( (2·μ_x·μ_y + C_1) · (2·σ_xy + C_2) ) / ( (μ_x² + μ_y² + C_1) · (σ_x² + σ_y² + C_2) )
where μ_x and μ_y are the means of the original high-resolution image and of the reconstructed image, C_1 and C_2 are two constants, σ_x and σ_y are the standard deviations of the original high-resolution image and of the reconstructed image, and σ_xy is their covariance. MSSIM denotes the mean of the SSIM values of all blocks in the image.
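For reference, these criteria could be computed as in the following sketch (a straightforward NumPy rendering of the formulas above; the SSIM here is evaluated globally, and the constants C1 and C2 are the commonly used defaults rather than values given in the text):

import numpy as np

def mse(ref, rec):
    """Mean squared error between the original and reconstructed images."""
    return float(np.mean((ref.astype(float) - rec.astype(float)) ** 2))

def psnr(ref, rec, max_val=255.0):
    """Peak signal-to-noise ratio in dB."""
    return float(10.0 * np.log10(max_val ** 2 / mse(ref, rec)))

def ssim_global(ref, rec, C1=6.5025, C2=58.5225):
    """Single global SSIM value (MSSIM would average this over local windows)."""
    x, y = ref.astype(float), rec.astype(float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()
    return float((2*mx*my + C1) * (2*cxy + C2) /
                 ((mx**2 + my**2 + C1) * (vx + vy + C2)))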
In Experiment 1, 240 face images from the FERET database are used for training, covering 40 different people with 6 images of different poses per person; some of the face images in the training set are shown in Fig. 4 of the drawings. The original high-resolution images are 120 × 120; after the degrading down-sampling the low-resolution images are 60 × 60, and the magnification factor is set to 2.
In the experiment, four face images not contained in the offline training database are used as input images; moreover, these four images are not strongly related to the images already in the database. The experimental results are shown in Fig. 5 of the drawings.
As can be seen from Fig. 5, the super-resolution image magnification that uses the proposed samples with adjacent-border information reconstructs better results than the best existing example-based super-resolution reconstruction method. The edges of the images magnified by the present invention are more prominent, the details are clearer, and the results are closer to the actual high-resolution images. In Fig. 5, the overall appearance of the first woman is clearer; the cheek and right eye of the second woman are clearer and the eye contour is improved; the sharpness of the left eye and of the facial region of the third man is clearly improved and the overall sharpness is also somewhat improved; and the lip region of the fourth man is also improved to some degree compared with the existing method. In general, after processing by the method of the present invention, the whole image is better than the best existing example-based method in both overall sharpness and edge sharpness. For the super-resolution images reconstructed in Fig. 5, the MSE and MSSIM between the original high-resolution images and the images obtained by the best existing example-based method, and between the original high-resolution images and the images obtained by the method of the present invention, are computed, yielding Table 1 and Table 2. The tables show that the proposed method is closer to the original high-resolution images, with smaller MSE and larger MSSIM, so its reconstruction is better than the example-based reconstruction method. In addition, after statistically averaging the test results of 100 test images, the MSE and MSSIM of the proposed method remain better than those of the best existing example-based method.
Table 1. MSE of the best existing example-based method and of the proposed method on real FERET face database images
Table 2. MSSIM of the best existing example-based method and of the proposed method on real FERET face database images
In Experiment 2, face photographs collected from 120 ID cards are used as the training picture database; the original high-resolution images are 120 × 120 and the down-sampled low-resolution images are 60 × 60. Face images unrelated to the images in the database are used for testing, and the test results are shown in Fig. 6 of the drawings. For the results in Fig. 6, the PSNR and SSIM values are computed between the original images and the images obtained by the best existing example-based method, and between the original images and the reconstructed images obtained by the proposed method; the results are given in Table 3 and Table 4. The tables show that the images reconstructed by the proposed method have larger PSNR and higher MSSIM, so the objective indices show that the performance of the present invention is better. At the same time, Fig. 6 shows that the subjective quality of the proposed method is better and closer to the actual high-resolution images. The high-resolution images reconstructed by the method of the present invention are better than the best current example-based method in detail regions such as the face, eyes and nose, and are better overall; with the proposed method the images can be magnified more clearly.
Table 3. Comparison of the PSNR of the best current example-based method and of the proposed method on a real ID-card face image database
Table 4. Comparison of the MSSIM of the best current example-based method and of the proposed method on a real ID-card face image database
Preferred embodiments of the present invention have been described in detail above. It should be understood that a person of ordinary skill in the art can make many modifications and variations according to the concept of the present invention without creative effort. Therefore, any technical solution that a person skilled in the art can obtain from the prior art by logical analysis, reasoning or limited experiment on the basis of the concept of the present invention shall fall within the protection scope determined by the claims.

Claims (7)

1. A super-resolution image magnification method based on samples with adjacent borders, comprising an offline process and an online process. In the offline process, the training database required for super-resolution image magnification is established. In the online process, an arbitrary given image is magnified. At the coordinate (x_l, y_l) of a low-resolution image block, the proposed sample with adjacent border is the square image block of the region from (x_l − m, y_l − m) to (x_l + N + m − 1, y_l + N + m − 1), of size (N+2m) × (N+2m). Its corresponding high-resolution image block is the square image block of the region from (2x_l, 2y_l) to (2x_l + 2N − 1, 2y_l + 2N − 1) in the high-resolution image, of size 2N × 2N. For the offline database-building process, a large number of high-resolution images for training are first prepared, and the method comprises:
Step 1) Down-sample the high-resolution image to obtain the low-resolution image, in preparation for extracting the samples and their matching high-resolution blocks. Set k = 1 (k = 1 denotes the first image in the database; k is the index of the image).
Step 2) Apply the initial border padding to the low-resolution image. Because the proposed samples carry adjacent borders, for convenience a border whose width equals the width of the adjacent border in the samples is added to the whole low-resolution image.
Step 3) Set x_l = 0, y_l = 0, where (x_l, y_l) is the coordinate, in the low-resolution image without the added border, of the top-left corner of the low-resolution sample.
Step 4) Extract from the low-resolution image the block extending from coordinate (x_l − m, y_l − m) to (x_l + N + m − 1, y_l + N + m − 1) and store it in the database. N × N is the block size and m is the width of the adjacent border of the bordered sample. The block size chosen in the present invention is 4 × 4 and the adjacent-border width is 2.
Step 5) Set x_h = 2·x_l, y_h = 2·y_l, where (x_h, y_h) is the coordinate, in the high-resolution image, of the top-left corner of the block that matches the sample.
Step 6) Extract from the high-resolution image the block extending from coordinate (x_h, y_h) to (x_h + 2N − 1, y_h + 2N − 1) and store it in the database. A pair consisting of a low-resolution block (a sample with adjacent border) f_L(x_l, y_l, k) and a high-resolution block f_H(x_h, y_h, k) is thus stored. Here the bordered sample has size (N+m) × (N+m) and the corresponding high-resolution matching block has size 2N × 2N.
Step 7) Set x_l = x_l + 1, i.e. move the horizontal coordinate of the sample's top-left corner in the low-resolution image by one pixel to extract the next sample.
Step 8) If x_l ≤ W − N, jump to step 4) to extract the next pair of low-resolution and high-resolution blocks, where W is the image width. Otherwise the samples of this row have all been extracted and the extraction of the next row of samples begins.
Step 9) Set x_l = 0, y_l = y_l + 1, i.e. move the top-left coordinate of the sample on the low-resolution image down to the beginning of the next row.
Step 10) If y_l ≤ H − N, jump to step 4) to extract the next pair of low-resolution and high-resolution blocks, where H is the image height. Otherwise the sample-extraction process for the current image is complete.
Step 11) Set k = k + 1 and jump to step 1) to process the next high-resolution image, until all images in the training library have been processed.
After this offline procedure, a database is obtained consisting of a large number of samples usable for super-resolution processing and of the high-resolution blocks matched to those samples.
For the online process, the method comprises:
Step 1) Process the initial low-resolution image by adding a border to it. On the bordered image, the pixel values of the border pixels to be filled are taken from the nearest pixels inside the image.
Step 2) Set x_l = 0, y_l = 0, where (x_l, y_l) is the coordinate of the top-left corner of the block to be magnified in the low-resolution image.
Step 3) At this position, compute the matching difference SADE(x, y, k) between, on the one hand, the low-resolution image and the already-magnified parts of the high-resolution image and, on the other hand, the (x, y, k)-th stored sample and its corresponding high-resolution block in the training library. This matching difference includes the difference between the sample and the current low-resolution image block as well as the difference over the overlapping regions of the low-resolution image, and also the difference, over the overlapping regions of the high-resolution image, between the already-magnified blocks and the high-resolution block corresponding to the sample. Edge pixels and non-edge pixels are given different weights, and edge pixels are determined with the two-dimensional second-order Laplacian operator.
Step 4) Find the matching sample and its corresponding high-resolution block in the database, i.e. compute at this position
(x_o, y_o, k_o) = arg min_{(x, y, k) ∈ SX} SADE(x, y, k)
where SX is the set of indices of all samples in the training database.
Step 5) Set x_l = x_l + step_x, where step_x is the step by which the next block advances along the x axis of the low-resolution image; step_x = 2 in the present invention. Because step_x < N, horizontally overlapping regions are formed.
Step 6) Jump to step 4) to find in the database the matching image-block pair, i.e. the matching sample and its corresponding high-resolution block, for the next overlapping block, until x_l ≥ W − N, where W is the image width and N = 4 is the width of a low-resolution block.
Step 7) Set y_l = y_l + step_y, where step_y = 2 in the present invention is the step by which the next block advances along the y axis of the low-resolution image. Because step_y < N, vertically overlapping regions are formed.
Step 8) Jump to step 4) to find in the database the matching image-block pair, i.e. the matching sample and its corresponding high-resolution block, for the next overlapping block, until y_l ≥ H − N, where H is the image height and N = 4 is the height of a low-resolution block.
Step 9) Using each high-resolution matching block that has been found, determine the pixel values of the high-resolution image. For the overlapping regions of the high-resolution image, the final high-resolution pixel values are obtained mainly by weighted averaging.
2. The method according to claim 1, wherein samples with adjacent borders are established and used in the super-resolution magnification of images in order to overcome the blocking artifacts and the many-to-one problem of traditional methods. At the coordinate (x_l, y_l) of a low-resolution image block, the proposed sample with adjacent border is the square image block of the region from (x_l − m, y_l − m) to (x_l + N + m − 1, y_l + N + m − 1), of size (N+2m) × (N+2m). Its corresponding high-resolution image block is the square image block of the region from (2x_l, 2y_l) to (2x_l + 2N − 1, 2y_l + 2N − 1) in the high-resolution image, of size 2N × 2N. Because the sample carries an adjacent border, the matching area used when comparing it with a low-resolution image block is larger, so a better-matching sample, and the high-resolution image block matched to that sample, can be found.
3. The method according to claim 1, wherein, in the online super-resolution magnification, a magnification scheme is adopted in which the block to be magnified and the already-magnified blocks have overlapping regions, so as to further overcome the blocking artifacts and the many-to-one problem. Specifically, after an image block has been magnified, the horizontal coordinate of the next block along the x axis is x_l = x_l + step_x, where step_x is the step by which the next block advances along the x axis of the low-resolution image; step_x = 2 in the present invention. Because step_x < N (N = 4 in the present invention), horizontally overlapping regions are formed. After all blocks of a row have been magnified, the vertical coordinate along the y axis is y_l = y_l + step_y, where step_y = 2 in the present invention is the step by which the next block advances along the y axis of the low-resolution image. Because step_y < N = 4, vertically overlapping regions are formed.
4. The method according to claim 1, wherein, when the matching difference is computed, the sum of absolute differences over the overlapping regions of the low-resolution image, the sum of absolute differences over the non-overlapping region of the low-resolution image, and the sum of absolute differences on the already-magnified high-resolution image are all included, and the optimal sample and the high-resolution block matched to that sample are then selected. SADE(x, y, k) is computed as follows:
First compute
SADEO(x, y, k, L_ol) = Σ_{(i,j)∈L_ol} | f_L(x, y, k, i, j) − g_L(i, j) | · p_L(i, j)
where L_ol is the intersection of the current region to be magnified with the ol-th already-magnified region, f_L(x, y, k, i, j) is the pixel value at (i, j) of the (x, y, k)-th sample in the training library, g_L(i, j) is the pixel value at (i, j) of the low-resolution image, and p_L(i, j) is the weight indicating whether (i, j) in the low-resolution image is an edge pixel. Next, compute the matching difference between this candidate sample and all overlapping regions on the low-resolution image:
SADLO(x, y, k) = Σ_{ol} SADEO(x, y, k, L_ol)
Here SADLO(x, y, k) represents the sum of absolute differences between the (x, y, k)-th sample and all overlapping regions.
Then compute the absolute difference between the high-resolution block corresponding to the (x, y, k)-th sample in the training library and each part that overlaps an already-magnified region:
SADEH(x, y, k, H_oh) = Σ_{(i,j)∈H_oh} | f_H(x, y, k, i, j) − g_H(i, j) | · p_H(i, j)
Here H_oh is the intersection of the current region to be magnified with the oh-th already-magnified region, f_H(x, y, k, i, j) is the pixel value at (i, j) of the block in the training library corresponding to the (x, y, k)-th sample, g_H(i, j) is the pixel value of the overlap between the oh-th already-magnified block and the block to be magnified, and p_H(i, j) is the weight indicating whether (i, j) in the high-resolution image is an edge pixel. Next, compute the sum of absolute differences between the high-resolution block corresponding to this candidate sample and the already-magnified parts of the high-resolution image that it overlaps:
SADHO(x, y, k) = Σ_{oh} SADEH(x, y, k, H_oh)
SADHO(x, y, k) thus represents the sum of absolute differences between the high-resolution block corresponding to the (x, y, k)-th sample and all already-magnified overlapping regions.
Finally, determine the absolute difference between the sample and the block to be magnified over the non-overlapping region of the low-resolution image, i.e. compute
SADNO(x, y, k) = Σ_{(i,j)∈L_no} | f_L(x, y, k, i, j) − g_L(i, j) | · p_L(i, j)
where L_no denotes the part of the low-resolution image that does not overlap with sample (x, y, k), and p_L(i, j) is the edge-pixel weight in the low-resolution image.
The matching difference corresponding to sample (x, y, k) is then
SADE(x, y, k) = α·SADLO(x, y, k) + β·SADHO(x, y, k) + γ·SADNO(x, y, k)
Here α, β, γ are three balance factors that weight the influence of the different difference terms on the total difference (α = β = γ = 1 in the present invention).
5. a kind of detected edge points as claimed in claim 1, and in the super-resolution of image is amplified, utilize the method for these edge pixel points, can obtain so sharp-edged high-definition picture.Here put different weights to edge pixel point and non-edge pixel.The process of determining edge pixel point is as follows:
First the pixel in image is adopted as lower bolster
Figure FSA0000102842790000051
Carry out convolution algorithm, then finding absolute value is less than | δ | point, δ is a little value, then in the direction of 0 °, 45 °, 90 °, 135 °, 180 ° in the region of Q × Q (Q=5 in the present invention) around the point finding, search out respectively one just or one negative value, and its absolute value is greater than T (T=200 in the present invention), search out in the other direction at it and the value of this value contrary sign, its absolute value is also greater than T again.As find, the point finding is the zero crossing after two-dimentional Laplace's operation, i.e. marginal point in image; Other point is non-marginal point.For the marginal point in image and non-marginal point, its weights are respectively A and B,
Figure FSA0000102842790000052
Wherein (i, j) is the position of the pixel in image.(A=1, B=1.5 in the present invention)
At the absolute difference that is calculated as follows Non-overlapping Domain on absolute difference on low resolution overlapping block, low resolution piece, and use respectively these weights when absolute difference on high-resolution overlay region:
Figure FSA0000102842790000053
Figure FSA0000102842790000054
Figure FSA0000102842790000055
Here, p_l(i, j) and p_h(i, j) are the weights p(i, j) on the low-resolution image and on the high-resolution image, respectively.
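Since the three equations above appear only as images, the following reconstruction is offered purely as an assumption consistent with the surrounding definitions, namely that each term is a weighted sum of absolute differences over the corresponding region, with f_l and g_l denoting, analogously to f_h and g_h, the low-resolution pixel values of the sample and of the region to be magnified:

SADLO(x, y, k) = Σ_{(i, j) ∈ L_oh} p_l(i, j) · |f_l(x, y, k, i, j) − g_l(i, j)|
SADNO(x, y, k) = Σ_{(i, j) ∈ L_no} p_l(i, j) · |f_l(x, y, k, i, j) − g_l(i, j)|
SADHO(x, y, k) = Σ_{(i, j) ∈ H_oh} p_h(i, j) · |f_h(x, y, k, i, j) − g_h(i, j)|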
6. The method as claimed in claim 1, wherein the final pixel values of the high-resolution image are determined on the overlapping regions. On the overlapping regions of the high-resolution image, a weighted-averaging method is used to obtain the final high-resolution pixel values, as follows:
First calculate
[Equation image: definition of the weight of the i-th overlapping region at pixel (x, y), computed from SADE(x_o(i), y_o(i), k_o(i)) and S(i)]
Here, (x, y) is the position of the pixel in the high-resolution image, L_o is the total number of overlapping regions covering this pixel, i denotes the i-th overlapping region covering this pixel, SADE(x_o(i), y_o(i), k_o(i)) is the minimum SADE value on the i-th overlapping region, and S(i) is the number of pixels used when computing SADE(x_o(i), y_o(i), k_o(i)). The pixel value at (x, y) of the high-resolution image is then
[Equation image: weighted average of f_h(x_o(i), y_o(i), k_o(i), m, n) over the L_o overlapping matched blocks, giving the final pixel value at (x, y)]
where f_h(x_o(i), y_o(i), k_o(i), m, n) is the value at this pixel in the i-th matching block.
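Because the weight formula of this weighted average appears only as an image, the Python sketch below assumes that each of the L_o overlapping blocks contributes with a weight inversely proportional to its normalised matching difference SADE(i)/S(i); the function and parameter names are illustrative.

import numpy as np

def fuse_overlapping_pixel(values, sade, s, eps=1e-6):
    # values : pixel value in each of the L_o overlapping matched blocks,
    #          i.e. f_h(x_o(i), y_o(i), k_o(i), m, n) for i = 1..L_o
    # sade   : minimum SADE value of each overlapping region
    # s      : number of pixels S(i) used when computing each SADE value
    values = np.asarray(values, dtype=float)
    w = 1.0 / (np.asarray(sade, dtype=float) / np.asarray(s, dtype=float) + eps)
    w = w / w.sum()                      # normalise so the weights sum to one
    return float(np.sum(w * values))     # final high-resolution pixel value

For example, under this assumption fuse_overlapping_pixel([120, 128, 125], [50, 80, 60], [16, 16, 16]) returns about 123.7, with the blocks of smaller matching difference contributing more.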
7. The method as claimed in claim 1, wherein the low-resolution image corresponding to the samples with adjacent sides is padded. The low-resolution image first undergoes an initial padding step. Because the proposed samples carry adjacent sides, for convenience of processing a border is added around the whole low-resolution image, the width of which equals the width of the adjacent side in the sample. On the padded image, the values of the surrounding pixels to be filled are determined by the pixel values of their nearest neighbours within the image. In this way, samples with adjacent sides can also be extracted for the image blocks along the border. The width of the added border is 2 in the present invention.
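A one-line Python sketch of this padding step, assuming that NumPy's 'edge' padding mode realises the nearest-neighbour filling described above:

import numpy as np

def pad_low_resolution(image, border=2):
    # add a border of the given width (2 in the invention) around the whole
    # low-resolution image, filling the new pixels with their nearest neighbours
    return np.pad(np.asarray(image), border, mode='edge')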
CN201410141448.5A 2014-04-03 Image super-resolution method based on band adjacent side sample Active CN103903241B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410141448.5A CN103903241B (en) 2014-04-03 Image super-resolution method based on band adjacent side sample

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410141448.5A CN103903241B (en) 2014-04-03 Image super-resolution method based on band adjacent side sample

Publications (2)

Publication Number Publication Date
CN103903241A true CN103903241A (en) 2014-07-02
CN103903241B CN103903241B (en) 2016-11-30

Family

ID=

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104881842A (en) * 2015-05-18 2015-09-02 浙江师范大学 Image super resolution method based on image decomposition
CN106709873A (en) * 2016-11-11 2017-05-24 浙江师范大学 Super-resolution method based on cubic spline interpolation and iterative updating
CN106780331A (en) * 2016-11-11 2017-05-31 浙江师范大学 A kind of new super-resolution method based on neighborhood insertion
CN108271061A (en) * 2016-12-30 2018-07-10 央视国际网络无锡有限公司 A kind of method for being inserted into high contrast frame subtitle in video
CN112037135A (en) * 2020-09-11 2020-12-04 上海瞳观智能科技有限公司 Method for selecting image key main body to be amplified and displayed

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
WILLIAM T.FREEMAN ET AL.: "Example-Based Super Resolution", 《IEEE COMPUTER GRAPHICS & APPLICATIONS》 *
万雪芬 et al.: "Research on image super-resolution reconstruction processing algorithms" (图像超分辨率重建处理算法研究), Laser & Infrared (《激光与红外》) *
王春霞 et al.: "A survey of image super-resolution reconstruction techniques" (图像超分辨率重建技术综述), Computer Technology and Development (《计算机技术与发展》) *
范新胜: "Research on example-based image super-resolution reconstruction techniques" (基于例子的图像超分辨率重建技术研究), China Masters' Theses Full-text Database, Information Science and Technology (《中国优秀硕士学位论文全文数据库 信息科技辑》) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104881842A (en) * 2015-05-18 2015-09-02 浙江师范大学 Image super resolution method based on image decomposition
CN104881842B (en) * 2015-05-18 2019-03-01 浙江师范大学 A kind of image super-resolution method based on picture breakdown
CN106709873A (en) * 2016-11-11 2017-05-24 浙江师范大学 Super-resolution method based on cubic spline interpolation and iterative updating
CN106780331A (en) * 2016-11-11 2017-05-31 浙江师范大学 A kind of new super-resolution method based on neighborhood insertion
CN106780331B (en) * 2016-11-11 2020-04-17 浙江师范大学 Novel super-resolution method based on neighborhood embedding
CN106709873B (en) * 2016-11-11 2020-12-18 浙江师范大学 Super-resolution method based on cubic spline interpolation and iterative updating
CN108271061A (en) * 2016-12-30 2018-07-10 央视国际网络无锡有限公司 A kind of method for being inserted into high contrast frame subtitle in video
CN112037135A (en) * 2020-09-11 2020-12-04 上海瞳观智能科技有限公司 Method for selecting image key main body to be amplified and displayed
CN112037135B (en) * 2020-09-11 2023-06-09 上海瞳观智能科技有限公司 Method for magnifying and displaying selected image key main body

Similar Documents

Publication Publication Date Title
CN111062872B (en) Image super-resolution reconstruction method and system based on edge detection
Cheng et al. Inpainting for remotely sensed images with a multichannel nonlocal total variation model
CN101520894B (en) Method for extracting significant object based on region significance
CN101872472B (en) Method for super-resolution reconstruction of facial image on basis of sample learning
CN101877143B (en) Three-dimensional scene reconstruction method of two-dimensional image group
CN106228528B (en) A kind of multi-focus image fusing method based on decision diagram and rarefaction representation
CN101630405B (en) Multi-focusing image fusion method utilizing core Fisher classification and redundant wavelet transformation
CN106339998A (en) Multi-focus image fusion method based on contrast pyramid transformation
CN105354558B (en) Humanface image matching method
CN102354397A (en) Method for reconstructing human facial image super-resolution based on similarity of facial characteristic organs
CN101551852B (en) Training system, training method and detection method
CN107220957B (en) It is a kind of to utilize the remote sensing image fusion method for rolling Steerable filter
CN103903236A (en) Method and device for reconstructing super-resolution facial image
CN106447654B (en) Quality evaluating method is redirected based on statistics similarity and the image of two-way conspicuousness fidelity
CN110532914A (en) Building analyte detection method based on fine-feature study
CN104021523A (en) Novel method for image super-resolution amplification based on edge classification
CN107944437A (en) A kind of Face detection method based on neutral net and integral image
Luo et al. Bi-GANs-ST for perceptual image super-resolution
Das et al. Extracting building footprints from high-resolution aerial imagery using refined cross AttentionNet
CN106022310A (en) HTG-HOG (histograms of temporal gradient and histograms of oriented gradient) and STG (scale of temporal gradient) feature-based human body behavior recognition method
CN103903241A (en) Image super-resolution method based on sample with adjacent sides
CN108986027A (en) Depth image super-resolution reconstruction method based on improved joint trilateral filter
CN104123707A (en) Local rank priori based single-image super-resolution reconstruction method
Zhang et al. A generative adversarial network approach for removing motion blur in the automatic detection of pavement cracks
CN103458154B (en) A kind of super-resolution method of video and device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20200212

Address after: 310052 floor 2, No. 1174, Binhe Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province

Patentee after: Hangzhou fog Technology Co., Ltd.

Address before: 321004 Zhejiang province Jinhua City Yingbin Road No. 688, Zhejiang Normal University

Patentee before: Zhejiang Normal University

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20201202

Address after: 314400 Chunlan West Road, Haining Nongfa District, Haining City, Jiaxing City, Zhejiang Province

Patentee after: Zhejiang Sen Bao Textile Technology Co.,Ltd.

Address before: 310052 floor 2, No. 1174, Binhe Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province

Patentee before: Hangzhou fog Technology Co.,Ltd.