CN102938141A - Image processing method and device and computer system - Google Patents


Info

Publication number
CN102938141A
Authority
CN
China
Prior art keywords
piece
block
image
current
dimensional array
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012103264553A
Other languages
Chinese (zh)
Other versions
CN102938141B (en)
Inventor
金良海
宋恩民
韩明臣
李水平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201210326455.3A priority Critical patent/CN102938141B/en
Publication of CN102938141A publication Critical patent/CN102938141A/en
Application granted granted Critical
Publication of CN102938141B publication Critical patent/CN102938141B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)

Abstract

The invention relates to the field of communications and discloses an image processing method, an image processing device, and a computer system. The method comprises: decomposing a noise image into image blocks and taking each image block as a reference block; determining original candidate blocks, applying geometric transformations to the original candidate blocks to obtain transformed candidate blocks, and selecting from among these image blocks the matching blocks corresponding to each reference block; forming a three-dimensional array from the matching blocks of each reference block and filtering it to obtain a three-dimensional array estimate; for each matching block in a three-dimensional array estimate, if the current matching block is a transformed candidate block, applying a secondary geometric transformation to the corresponding array elements and updating the three-dimensional array estimate; and estimating, from the three-dimensional array estimates, the grey value of each pixel of the noise image to obtain a first-pass processed image. This technical scheme improves the processing effect and the enhancement effect of image processing.

Description

Image processing method, image processing device, and computer system
Technical field
The present invention relates to the field of communications, and in particular to an image processing method, an image processing device, and a computer system.
Background
Image processing is the use of a computer to analyse an image in order to obtain a desired result.
Image processing today generally means digital image processing. A digital image is a large two-dimensional array obtained by sampling and digitizing with a device such as a digital camera or a scanner; the elements of this two-dimensional array are called pixels, and the value of a pixel is an integer called its grey value.
Processing digital images with a computer is called digital image processing, and its history is relatively short. Digital image processing dates from the 1920s, when a photograph was transmitted from London to New York over a submarine cable using digital compression.
Digital image processing helps people perceive the world more objectively and accurately. The human visual system supplies more than three quarters of the information people obtain from the outside world, and images and graphics are the carriers of all visual information. Although the human eye has very high resolving power and can distinguish thousands of colours, in many situations an image is blurred or even invisible to the eye; image enhancement can make such blurred or invisible images clear.
On the other hand, pattern recognition techniques in digital image processing can classify images that the human eye cannot identify, and computer pattern recognition can retrieve, match, and identify all kinds of objects quickly and accurately.
Image denoising with block-matching and 3D filtering (BM3D) is one of the better image denoising techniques available today. The BM3D technique works mainly in the spatial domain of the noise image: for a currently given reference block, it searches a given search range of the noise image, according to a given matching criterion, for the original image blocks most similar to the reference block, takes them as the matching blocks corresponding to that reference block, and then uses the matching blocks to perform block matching and first-stage transform-domain threshold filtering to obtain the denoised image.
In the course of making the present invention, the inventors found that the prior art has at least the following defects:
First, the filtering performance of BM3D depends mainly on the matching blocks of each reference block, and the traditional BM3D technique considers only original image blocks of the original noise image as matching blocks; for a complex or severely deformed image, traditional BM3D has difficulty finding ideal matching blocks for each reference block.
Second, when the noise is severe (sigma > 40) and the digital image suffers serious displacement and deformation, the denoising and enhancement effect of the prior-art BM3D technique is not significant.
Summary of the invention
A first object of the embodiments of the present invention is to provide an image processing method; applying this technical scheme helps improve the processing effect of image processing and improves the enhancement effect.
A second object of the embodiments of the present invention is to provide an image processing device; applying this technical scheme helps improve the processing effect of image processing and improves the enhancement effect.
A third object of the embodiments of the present invention is to provide a computer system; applying this technical scheme helps improve the processing effect of image processing and improves the enhancement effect.
An image processing method provided by an embodiment of the present invention comprises:
decomposing a noise image into at least two image blocks, and taking each of the image blocks as a reference block;
determining the original candidate blocks corresponding to each reference block;
applying at least two first geometric transformations to each original candidate block, and taking each image block obtained after a first geometric transformation as a transformed candidate block corresponding to that original candidate block;
for each original candidate block, among the current original candidate block and all of its transformed candidate blocks, taking the image block whose similarity to the reference block corresponding to the current original candidate block reaches a predetermined degree as one of the matching blocks corresponding to that reference block;
determining, for each reference block, the three-dimensional array formed by its matching blocks, so as to obtain the three-dimensional array corresponding to each reference block, where the elements of each three-dimensional array are the grey values of the pixels of the matching blocks that form the current three-dimensional array;
applying a predetermined three-dimensional filter to each three-dimensional array to obtain a three-dimensional array estimate for each three-dimensional array, where the elements of each three-dimensional array estimate are the estimated grey values of the pixels of the matching blocks of the corresponding three-dimensional array;
for each matching block in each three-dimensional array estimate, if the current matching block in the current three-dimensional array estimate is a transformed candidate block, further applying a secondary geometric transformation to the array elements corresponding to the current transformed candidate block, and updating those array elements of the current three-dimensional array estimate to the array elements after the secondary geometric transformation; otherwise, not updating the three-dimensional array estimate,
where each secondary geometric transformation is the inverse of the corresponding first geometric transformation;
estimating, according to a predetermined estimation method and the estimated grey values of the pixels in all the three-dimensional array estimates corresponding to all the reference blocks, the grey value of each pixel in the noise image, so as to obtain a first-pass processed image.
Optionally, after obtaining the first-pass processed image, the method further comprises:
obtaining a second-pass processed image according to the first-pass processed image and the noise image.
Optionally, obtaining the second-pass processed image according to the first-pass processed image and the noise image specifically comprises:
determining the coordinate position of each reference block in the noise image;
determining, in the first-pass processed image, second reference blocks whose coordinate positions correspond to the coordinate positions of the reference blocks in the noise image;
determining, in the first-pass processed image, the second original candidate blocks corresponding to each second reference block;
applying at least two first geometric transformations to each second original candidate block, and taking each image block obtained after a first geometric transformation as a second transformed candidate block corresponding to that second original candidate block;
for each second original candidate block, among the current second original candidate block and all of its second transformed candidate blocks, taking the image block whose similarity to the second reference block corresponding to the current second original candidate block reaches a predetermined degree as one of the second matching blocks corresponding to that second reference block;
determining, in the first-pass processed image, the coordinate position of each second matching block corresponding to each second reference block;
determining, in the noise image and according to the coordinate positions of the second matching blocks of each second reference block in the first-pass processed image, the secondary matching blocks corresponding to each reference block, where, for each reference block, the coordinate positions of its secondary matching blocks correspond respectively to the coordinate positions of the second matching blocks of the second reference block corresponding to that reference block;
for each secondary matching block, judging whether the second matching block corresponding to the current secondary matching block in the first-pass processed image is a second transformed candidate block; if so, applying to the current secondary matching block the same first geometric transformation that produced that second transformed candidate block, and updating the current secondary matching block to the image block after that first geometric transformation; otherwise, not updating the current secondary matching block;
determining, for each reference block, the second three-dimensional array formed by its secondary matching blocks, so as to obtain the second three-dimensional array corresponding to each reference block, where the elements of each second three-dimensional array are the grey values of the pixels of the secondary matching blocks that form the current second three-dimensional array;
applying a predetermined three-dimensional filter to each second three-dimensional array to obtain a second three-dimensional array estimate for each second three-dimensional array, where the elements of each second three-dimensional array estimate are the estimated grey values of the pixels of the secondary matching blocks of the corresponding second three-dimensional array;
for each secondary matching block in each second three-dimensional array estimate, if the current secondary matching block in the current second three-dimensional array estimate is an image block obtained through a first geometric transformation, applying a secondary geometric transformation to the array elements corresponding to the current secondary matching block, and updating those array elements of the current second three-dimensional array estimate to the array elements after the secondary geometric transformation; otherwise, not updating the array elements of the second three-dimensional array estimate,
where each secondary geometric transformation is the inverse of the corresponding first geometric transformation;
estimating, according to a predetermined estimation method and the estimated grey values of the pixels in all the second three-dimensional array estimates corresponding to all the reference blocks, the grey value of each pixel in the noise image, so as to obtain the second-pass processed image.
Optionally, decomposing the noise image into at least two reference blocks specifically comprises:
taking out image blocks of a predetermined size N*N from the noise image in sequence with a predetermined step, and taking each of the image blocks as a reference block,
where the size of each reference block is N*N and N is a number of pixels.
Optionally, determining the original candidate blocks corresponding to each reference block specifically comprises,
for each reference block, performing respectively:
determining a first neighbourhood of a pixel at a predetermined position in the current reference block,
and taking each second neighbourhood of each pixel in the first neighbourhood as an original candidate block corresponding to the current reference block, where the image size of each second neighbourhood is N*N;
and/or, determining, in the first-pass processed image, the second original candidate blocks corresponding to each second reference block specifically comprises,
for each second reference block, performing respectively:
determining a third neighbourhood of a pixel at a predetermined position in the current second reference block,
and taking each fourth neighbourhood of each pixel in the third neighbourhood as a second original candidate block corresponding to the current second reference block, where the image size of each fourth neighbourhood is N*N.
Optionally, the image size of the first neighbourhood and/or of the third neighbourhood is Ns*Ns,
where Ns is a number of pixels and Ns is greater than N.
Optionally, determining the first neighbourhood of the pixel at the predetermined position in the current reference block is specifically:
determining the first neighbourhood with that pixel as the centre of the first neighbourhood;
and/or,
determining the third neighbourhood of the pixel at the predetermined position in the current second reference block is specifically:
determining the third neighbourhood with that pixel as the centre of the third neighbourhood.
Optionally, determining the first neighbourhood of the pixel at the predetermined position in the current reference block is specifically:
taking the neighbourhood of the top-left vertex pixel of the reference block as the first neighbourhood;
and/or,
determining the third neighbourhood of the pixel at the predetermined position in the current second reference block is specifically:
taking the neighbourhood of the top-left vertex pixel of the second reference block as the third neighbourhood.
Optionally, taking each second neighbourhood of each pixel in the first neighbourhood as an original candidate block corresponding to the current reference block is specifically:
taking each second neighbourhood centred on each pixel in the first neighbourhood as an original candidate block corresponding to the current reference block;
and/or,
taking each fourth neighbourhood of each pixel in the third neighbourhood as a second original candidate block corresponding to the current second reference block is specifically:
taking each fourth neighbourhood centred on each pixel in the third neighbourhood as a second original candidate block corresponding to the current second reference block.
Optionally, taking each second neighbourhood of each pixel in the first neighbourhood as an original candidate block corresponding to the current reference block is specifically:
taking each second neighbourhood whose top-left, bottom-left, top-right, or bottom-right vertex is a pixel in the first neighbourhood as an original candidate block corresponding to the current reference block;
and/or,
taking each fourth neighbourhood of each pixel in the third neighbourhood as a second original candidate block corresponding to the current second reference block is specifically:
taking each fourth neighbourhood whose top-left, bottom-left, top-right, or bottom-right vertex is a pixel in the third neighbourhood as a second original candidate block corresponding to the current second reference block.
Optionally, each first geometric transformation comprises any one of, or any combination of two or more of, the following:
translation, rotation, mirror transformation, transposition, and scaling.
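As an illustration only (not part of the claimed method), the following Python sketch shows how a set of such first geometric transformations could be applied to one original candidate block; the particular transformations chosen and the 5*5 block size are assumptions made for this example.

    import numpy as np

    def transformed_candidates(block):
        """Generate transformed candidate blocks from one original candidate block.
        Illustrative set: three rotations, transposition, and two mirrorings."""
        cands = []
        for k in (1, 2, 3):
            cands.append(np.rot90(block, k=k))   # rotations by 90, 180, 270 degrees
        cands.append(block.T)                    # transposition (x and y exchanged)
        cands.append(np.fliplr(block))           # horizontal mirror
        cands.append(np.flipud(block))           # vertical mirror
        return cands

    # usage: a 5*5 original candidate block
    orig = np.arange(25, dtype=float).reshape(5, 5)
    for c in transformed_candidates(orig):
        assert c.shape == orig.shape             # block size is preserved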
Optionally, each first geometric transformation is specifically a rotation;
applying at least two first geometric transformations to each original candidate block and taking each image block obtained after a first geometric transformation as a transformed candidate block corresponding to that original candidate block is specifically:
for each original candidate block, rotating the current original candidate block by predetermined different angles, and taking each image block obtained after each rotation as a transformed candidate block;
and/or,
applying at least two first geometric transformations to each second original candidate block and taking each image block obtained after a first geometric transformation as a second transformed candidate block corresponding to that second original candidate block is specifically:
for each second original candidate block, rotating the current second original candidate block by predetermined different angles, and taking each image block obtained after each rotation as a second transformed candidate block.
Optionally, the first geometric transformations specifically comprise: rotation, vertical transposition, horizontal transposition, and mirroring;
applying at least two first geometric transformations to each original candidate block and taking each image block obtained after a first geometric transformation as a transformed candidate block corresponding to that original candidate block is specifically,
for each original candidate block, performing respectively:
rotating the current original candidate block by a predetermined angle,
transposing the current original candidate block horizontally, to the left and to the right,
transposing the current original candidate block vertically, upwards and downwards,
transposing the current original candidate block about each of its two diagonals as transposition axes,
mirroring the current original candidate block about a predetermined axis,
and taking the image block obtained after the rotation, the image blocks obtained after each horizontal transposition, the image blocks obtained after each vertical transposition, and the image blocks obtained after the mirroring as the transformed candidate blocks;
and/or,
applying at least two first geometric transformations to each second original candidate block and taking each image block obtained after a first geometric transformation as a second transformed candidate block corresponding to that second original candidate block is specifically:
for each second original candidate block, performing respectively:
rotating the current second original candidate block by a predetermined angle,
transposing the current second original candidate block horizontally, to the left and to the right,
transposing the current second original candidate block vertically, upwards and downwards,
transposing the current second original candidate block about each of its two diagonals as transposition axes,
mirroring the current second original candidate block about a predetermined axis,
and taking the image block obtained after the rotation, the image blocks obtained after each horizontal transposition, the image blocks obtained after each vertical transposition, and the image blocks obtained after the mirroring as the second transformed candidate blocks.
Optionally, for each original candidate block, among the current original candidate block and all of its transformed candidate blocks, taking the image block whose similarity to the corresponding reference block reaches a predetermined degree as one of the matching blocks corresponding to that reference block specifically comprises,
for each original candidate block, performing respectively:
computing the Euclidean distance between the current original candidate block and the reference block corresponding to the current original candidate block, denoted d0,
computing the Euclidean distance between each transformed candidate block of the current original candidate block and the reference block corresponding to the current original candidate block, where the Euclidean distance between the i-th transformed candidate block and the reference block is denoted di, i is any natural number not greater than n, and n is the number of transformed candidate blocks of the current original candidate block;
determining, from each di and d0, the similarity between the corresponding reference block and the current original candidate block and its transformed candidate blocks,
and taking the image block whose similarity reaches the predetermined degree as one of the matching blocks corresponding to the reference block;
and/or,
for each second original candidate block, among the current second original candidate block and all of its second transformed candidate blocks, taking the image block whose similarity to the corresponding second reference block reaches a predetermined degree as one of the second matching blocks corresponding to that second reference block specifically comprises,
for each second original candidate block, performing respectively:
computing the Euclidean distance between the current second original candidate block and the second reference block corresponding to the current second original candidate block, denoted d0',
computing the Euclidean distance between each second transformed candidate block of the current second original candidate block and the corresponding second reference block, where the Euclidean distance between the i-th second transformed candidate block and the second reference block is denoted di', i is any natural number not greater than n, and n is the number of second transformed candidate blocks of the current second original candidate block;
determining, from each di' and d0', the similarity between the corresponding second reference block and the current second original candidate block and its second transformed candidate blocks,
and taking the image block whose similarity reaches the predetermined degree as one of the second matching blocks of the second reference block.
Optionally, taking the image block whose similarity to the corresponding reference block reaches the predetermined degree as one of the matching blocks corresponding to that reference block is specifically:
letting d' = min{d1, d2, ..., dn};
if d' - d0 <= To1, taking the transformed candidate block corresponding to d' as one of the matching blocks corresponding to the reference block,
otherwise taking the original candidate block as one of the matching blocks corresponding to the reference block;
and/or,
taking the image block whose similarity to the corresponding second reference block reaches the predetermined degree as one of the second matching blocks corresponding to that second reference block is specifically:
letting d2' = min{d21', d22', ..., d2n'};
if d2' - d20' <= To2, taking the second transformed candidate block corresponding to d2' as one of the second matching blocks of the second reference block,
otherwise taking the second original candidate block as one of the second matching blocks corresponding to the second reference block;
where To1 and To2 are predetermined positive numbers.
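A minimal Python sketch of this selection rule follows; the threshold value To1 and the block contents are illustrative assumptions, not values fixed by this document.

    import numpy as np

    def choose_matching_block(ref, original, transformed, T_o1=50.0):
        """Pick one matching block for the reference block `ref` from an original
        candidate block and its transformed candidate blocks: take d' as the
        minimum distance over the transformed candidates and prefer that
        transformed candidate only if d' - d0 <= T_o1."""
        d0 = np.linalg.norm(ref - original)                     # distance to the original candidate
        dists = [np.linalg.norm(ref - t) for t in transformed]  # distances to transformed candidates
        i_best = int(np.argmin(dists))
        d_best = dists[i_best]
        if d_best - d0 <= T_o1:
            return transformed[i_best], d_best, True            # transformed candidate is chosen
        return original, d0, False                              # original candidate is kept

    # usage with illustrative random 5*5 blocks
    rng = np.random.default_rng(0)
    ref = rng.random((5, 5))
    orig = rng.random((5, 5))
    cands = [np.rot90(orig, k) for k in (1, 2, 3)]
    block, dist, used_transform = choose_matching_block(ref, orig, cands)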
Optionally, determining, for each reference block, the three-dimensional array formed by its matching blocks, so as to obtain the three-dimensional array corresponding to each reference block, where the elements of each three-dimensional array are the grey values of the pixels of the matching blocks that form the current three-dimensional array, specifically comprises,
for each reference block, performing respectively:
determining the order of the matching blocks in the three-dimensional array according to the ascending order of the Euclidean distances between the matching blocks corresponding to the current reference block and the current reference block,
and, according to the order of the matching blocks in the three-dimensional array, taking the grey values of the pixels of each matching block as the elements of the three-dimensional array;
and/or,
determining, for each reference block, the second three-dimensional array formed by its secondary matching blocks, so as to obtain the second three-dimensional array corresponding to each reference block, where the elements of each second three-dimensional array are the grey values of the pixels of the secondary matching blocks that form the current second three-dimensional array, specifically comprises,
for each second reference block, performing respectively:
determining the order of the second matching blocks in the second three-dimensional array according to the ascending order of the Euclidean distances between the second matching blocks corresponding to the current second reference block and the current second reference block,
and, according to the order of the second matching blocks in the second three-dimensional array, taking the grey values of the pixels of each second matching block as the elements of the second three-dimensional array.
Optionally, the step of determining the order of the matching blocks in the three-dimensional array according to the ascending order of the Euclidean distances between the matching blocks corresponding to the current reference block and the current reference block further comprises:
selecting, from all the matching blocks corresponding to the current reference block, a predetermined number of matching blocks as the matching blocks that form the three-dimensional array;
and/or,
the step of determining the order of the second matching blocks in the second three-dimensional array according to the ascending order of the Euclidean distances between the second matching blocks corresponding to the current second reference block and the current second reference block further comprises:
selecting, from all the second matching blocks corresponding to the current second reference block, a predetermined number of second matching blocks as the second matching blocks that form the second three-dimensional array.
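The following is a minimal Python sketch of forming a three-dimensional array from a predetermined number of matching blocks ordered by ascending Euclidean distance; the block count of 16 and the 5*5 block size are illustrative assumptions.

    import numpy as np

    def build_3d_array(ref, match_blocks, max_blocks=16):
        """Stack the matching blocks of one reference block into a 3-D array,
        ordered by ascending Euclidean distance to the reference block and
        truncated to a predetermined number of blocks."""
        order = np.argsort([np.linalg.norm(ref - m) for m in match_blocks])
        chosen = [match_blocks[i] for i in order[:max_blocks]]
        return np.stack(chosen, axis=0)        # shape (num_blocks, N, N), grey values

    # usage with illustrative random 5*5 blocks
    rng = np.random.default_rng(1)
    ref = rng.random((5, 5))
    matches = [rng.random((5, 5)) for _ in range(30)]
    group = build_3d_array(ref, matches)       # shape (16, 5, 5)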
Optionally, applying the predetermined three-dimensional filter to each three-dimensional array is specifically:
applying three-dimensional hard-threshold filtering to each three-dimensional array;
and/or,
applying the predetermined three-dimensional filter to each second three-dimensional array is specifically:
applying three-dimensional hard-threshold filtering to each second three-dimensional array.
Optionally, applying three-dimensional hard-threshold filtering to each three-dimensional array comprises:
applying a three-dimensional Fourier transform to each three-dimensional array to obtain a three-dimensional complex array for each;
applying a hard-threshold operation to the three-dimensional Fourier coefficients in each three-dimensional complex array: setting to 0 the coefficients whose modulus is less than a predetermined threshold and leaving unchanged the coefficients whose modulus is greater than or equal to the predetermined threshold, so as to obtain each three-dimensional complex array after the hard-threshold operation;
applying a three-dimensional inverse Fourier transform to each three-dimensional complex array after the hard-threshold operation to obtain each three-dimensional array estimate;
and/or,
applying three-dimensional hard-threshold filtering to each second three-dimensional array comprises:
applying a three-dimensional Fourier transform to each second three-dimensional array to obtain a second three-dimensional complex array for each;
applying a hard-threshold operation to the three-dimensional Fourier coefficients in each second three-dimensional complex array: setting to 0 the coefficients whose modulus is less than a predetermined threshold and leaving unchanged the coefficients whose modulus is greater than or equal to the predetermined threshold, so as to obtain each second three-dimensional complex array after the hard-threshold operation;
applying a three-dimensional inverse Fourier transform to each second three-dimensional complex array after the hard-threshold operation to obtain each second three-dimensional array estimate.
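A minimal numpy sketch of the three-dimensional hard-threshold filtering described above follows; the threshold value is an illustrative assumption, and the plain 3-D discrete Fourier transform is used.

    import numpy as np

    def hard_threshold_3d(group, threshold=2.5):
        """Three-dimensional hard-threshold filtering of one 3-D array of matching
        blocks: forward 3-D Fourier transform, set to zero the coefficients whose
        modulus is below the threshold, keep the rest unchanged, inverse transform."""
        spec = np.fft.fftn(group)                  # 3-D complex array
        spec[np.abs(spec) < threshold] = 0         # hard-threshold operation
        return np.real(np.fft.ifftn(spec))         # 3-D array estimate

    # usage on an illustrative (16, 5, 5) group of blocks
    rng = np.random.default_rng(2)
    group = rng.random((16, 5, 5))
    estimate = hard_threshold_3d(group)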
Optionally, estimating, according to the predetermined estimation method and the estimated grey values in all the three-dimensional array estimates corresponding to all the reference blocks, the grey value of each pixel in the noise image specifically comprises,
for each pixel in the noise image, performing respectively:
determining each three-dimensional array estimate, corresponding to each reference block, that contains the current pixel,
and determining the grey-value estimates of the current pixel in each of those three-dimensional array estimates, and applying a predetermined weighted-average operation to those grey-value estimates to obtain the value taken as the grey value of the current pixel in the noise image;
and/or,
estimating, according to the predetermined estimation method and the estimated grey values in all the second three-dimensional array estimates corresponding to all the reference blocks, the grey value of each pixel in the noise image specifically comprises,
for each pixel in the noise image, performing respectively:
determining each second three-dimensional array estimate, corresponding to each reference block, that contains the current pixel,
and determining the grey-value estimates of the current pixel in each of those second three-dimensional array estimates, and applying a predetermined weighted-average operation to those grey-value estimates to obtain the value taken as the grey value of the current pixel in the noise image.
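The following is a simplified Python sketch of the weighted-average aggregation step, assuming equal weights and representing each estimated block by its top-left position; the patent only requires a predetermined weighted-average operation, so the weighting and data layout here are illustrative choices.

    import numpy as np

    def aggregate(image_shape, block_estimates):
        """Estimate each pixel's grey value as a weighted average (here: equal
        weights) of its estimates in all filtered blocks that contain it.
        `block_estimates` is a list of (top, left, estimated N*N block) tuples."""
        acc = np.zeros(image_shape)
        weight = np.zeros(image_shape)
        for top, left, block in block_estimates:
            n = block.shape[0]
            acc[top:top + n, left:left + n] += block
            weight[top:top + n, left:left + n] += 1.0
        weight[weight == 0] = 1.0                  # pixels covered by no block stay 0
        return acc / weight

    # usage with two overlapping illustrative 5*5 estimates on a 10*10 image
    rng = np.random.default_rng(3)
    out = aggregate((10, 10), [(0, 0, rng.random((5, 5))), (2, 2, rng.random((5, 5)))])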
An image processing device provided by an embodiment of the present invention comprises:
a picture decomposition unit, configured to decompose a noise image into at least two image blocks and to take each of the image blocks as a reference block;
an original candidate block determining unit, connected to the picture decomposition unit and configured to determine the original candidate blocks corresponding to each reference block;
a transformed candidate block determining unit, connected to the original candidate block determining unit and configured to apply at least two first geometric transformations to each original candidate block and to take each image block obtained after a first geometric transformation as a transformed candidate block corresponding to that original candidate block;
a matching block determining unit, connected to the original candidate block determining unit and the transformed candidate block determining unit and configured to take, for each original candidate block, among the current original candidate block and all of its transformed candidate blocks, the image block whose similarity to the corresponding reference block reaches a predetermined degree as one of the matching blocks corresponding to that reference block;
a three-dimensional array determining unit, connected to the matching block determining unit and configured to determine, for each reference block, the three-dimensional array formed by its matching blocks, so as to obtain the three-dimensional array corresponding to each reference block, where the elements of each three-dimensional array are the grey values of the pixels of the matching blocks that form the current three-dimensional array;
a three-dimensional filtering unit, connected to the three-dimensional array determining unit and configured to apply a predetermined three-dimensional filter to each three-dimensional array to obtain a three-dimensional array estimate for each three-dimensional array, where the elements of each three-dimensional array estimate are the estimated grey values of the pixels of the matching blocks of the corresponding three-dimensional array;
a geometric inverse transformation unit, connected to the three-dimensional filtering unit and the matching block determining unit and configured, for each matching block in each three-dimensional array estimate, to apply, if the current matching block in the current three-dimensional array estimate is a transformed candidate block, a secondary geometric transformation to the array elements corresponding to the current transformed candidate block and to update those array elements of the current three-dimensional array estimate to the array elements after the secondary geometric transformation, and otherwise not to update the three-dimensional array estimate, where each secondary geometric transformation is the inverse of the corresponding first geometric transformation;
and an image estimation unit, connected to the geometric inverse transformation unit and the three-dimensional filtering unit and configured to estimate, according to a predetermined estimation method and the estimated grey values of the pixels in all the three-dimensional array estimates corresponding to all the reference blocks, the grey value of each pixel in the noise image, so as to obtain a first-pass processed image.
A computer system provided by an embodiment of the present invention comprises a processor, a memory, and a communication line, where the communication line connects the processor and the memory;
the processor calls, over the communication line, code stored in the memory, so as to:
decompose a noise image into at least two image blocks, and take each of the image blocks as a reference block;
determine the original candidate blocks corresponding to each reference block;
apply at least two first geometric transformations to each original candidate block, and take each image block obtained after a first geometric transformation as a transformed candidate block corresponding to that original candidate block;
for each original candidate block, among the current original candidate block and all of its transformed candidate blocks, take the image block whose similarity to the corresponding reference block reaches a predetermined degree as one of the matching blocks corresponding to that reference block;
determine, for each reference block, the three-dimensional array formed by its matching blocks, so as to obtain the three-dimensional array corresponding to each reference block, where the elements of each three-dimensional array are the grey values of the pixels of the matching blocks that form the current three-dimensional array;
apply a predetermined three-dimensional filter to each three-dimensional array to obtain a three-dimensional array estimate for each three-dimensional array, where the elements of each three-dimensional array estimate are the estimated grey values of the pixels of the matching blocks of the corresponding three-dimensional array;
for each matching block in each three-dimensional array estimate, if the current matching block in the current three-dimensional array estimate is a transformed candidate block, further apply a secondary geometric transformation to the array elements corresponding to the current transformed candidate block, and update those array elements of the current three-dimensional array estimate to the array elements after the secondary geometric transformation, and otherwise not update the three-dimensional array estimate,
where each secondary geometric transformation is the inverse of the corresponding first geometric transformation;
and estimate, according to a predetermined estimation method and the estimated grey values of the pixels in all the three-dimensional array estimates corresponding to all the reference blocks, the grey value of each pixel in the noise image, so as to obtain a first-pass processed image.
Therefore, with the technical scheme of this embodiment, when the matching blocks corresponding to a reference block are found, the original candidate blocks obtained directly from the noise image are additionally subjected to geometric transformations, and the transformed candidate blocks obtained after the geometric transformations, together with the original candidate blocks, serve as candidate image blocks for the matching blocks of that reference block, instead of using only original candidate blocks as matching blocks. The technical scheme of this embodiment therefore improves the quality (the similarity to the reference block) of the matching blocks of each reference block, and thus ensures better image recovery and enhancement performance when the pixel values of the noise image are estimated from the matching blocks to recover the noise image.
In addition, if the image suffers serious displacement and deformation during transmission or image sampling under heavy noise (sigma > 40), the technical scheme of this embodiment can still effectively find good matching blocks in a heavily distorted noise image, improving the denoising and enhancement effect.
Brief description of the drawings
In order to explain the embodiments of the present invention or the technical schemes of the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic flow chart of the image processing method provided by embodiment 1 of the present invention;
Fig. 2 is a schematic flow chart of the image processing method provided by embodiment 2 of the present invention;
Fig. 3 is a schematic flow chart of a further image processing method provided by embodiment 2 of the present invention;
Fig. 4 is a schematic structural diagram of the image processing device provided by embodiment 3 of the present invention;
Fig. 5 is a schematic diagram of the image processing system provided by embodiment 4 of the present invention;
Fig. 6 is a schematic structural diagram of the computer system provided by embodiment 5 of the present invention;
Fig. 7 is a first original experimental image;
Fig. 8 is the first noise image obtained by adding Gaussian noise with sigma = 20 to the first original image shown in Fig. 7;
Fig. 9 is the processed image obtained by processing the first noise image shown in Fig. 8 with the technical scheme of embodiment 1;
Fig. 10 is the first-pass processed image obtained by processing the first noise image shown in Fig. 8 with the technical scheme of embodiment 1;
Fig. 11 is a second original experimental image;
Fig. 12 is the second noise image obtained by adding Gaussian noise with sigma = 20 to the second original image shown in Fig. 11;
Fig. 13 is the processed image obtained by processing the second noise image shown in Fig. 12 with the traditional BM3D technical scheme of the prior art;
Fig. 14 is the first-pass processed image obtained by processing the second noise image shown in Fig. 12 with the technical scheme of embodiment 1;
Fig. 15 is a third original experimental image;
Fig. 16 is the third noise image obtained by adding Gaussian noise with sigma = 20 to the third original image shown in Fig. 15;
Fig. 17 is the processed image obtained by processing the third noise image shown in Fig. 16 with the traditional BM3D technical scheme of the prior art;
Fig. 18 is the first-pass processed image obtained by processing the third noise image shown in Fig. 16 with the technical scheme of embodiment 1;
Fig. 19 is a fourth original experimental image;
Fig. 20 is the fourth noise image obtained by adding Gaussian noise with sigma = 20 to the fourth original image shown in Fig. 19;
Fig. 21 is the processed image obtained by processing the fourth noise image shown in Fig. 20 with the traditional BM3D technical scheme of the prior art;
Fig. 22 is the first-pass processed image obtained by processing the fourth noise image shown in Fig. 20 with the technical scheme of embodiment 1.
Detailed description of the embodiments
The technical schemes in the embodiments of the present invention are described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Embodiment 1:
This embodiment provides an image processing method; as shown in Fig. 1, the method mainly comprises the following flow.
Step 101: decompose a noise image into at least two image blocks, and take each image block as a reference block.
The original noise image is denoted as the noise image Z. The noise image Z is decomposed into a number of image blocks, and each image block obtained by the decomposition is taken as a reference block. In this embodiment the reference blocks may partially overlap on the noise image Z, or they may not overlap.
In this embodiment, the noise image Z can be, but is not limited to being, decomposed into image blocks of size N*N: with a predetermined step, say Nstep, the image blocks are taken out in sequence from the noise image as reference blocks, where the step Nstep can be, but is not limited to, a natural number less than or equal to N.
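A minimal Python sketch of step 101 under these assumptions follows (N = 5 and Nstep = 3 are illustrative values only, not values prescribed by this embodiment):

    import numpy as np

    def extract_reference_blocks(Z, N=5, Nstep=3):
        """Take out N*N image blocks from the noise image Z in sequence with step
        Nstep; each block is returned together with its top-left coordinate."""
        blocks = []
        H, W = Z.shape
        for top in range(0, H - N + 1, Nstep):
            for left in range(0, W - N + 1, Nstep):
                blocks.append(((top, left), Z[top:top + N, left:left + N]))
        return blocks

    # usage on an illustrative 64*64 noise image
    rng = np.random.default_rng(4)
    Z = rng.random((64, 64))
    refs = extract_reference_blocks(Z)     # overlapping reference blocks with Nstep < N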
Step 102: determine the original candidate blocks corresponding to each reference block.
In this step, the matching blocks corresponding to each reference block can be, but are not limited to being, determined according to the existing traditional BM3D technique, and these traditional matching blocks are taken as the original candidate blocks.
In this embodiment, each image block found by the prior-art matching-block determination method is used only as a candidate for the final matching blocks of each reference block, not directly as a matching block; see the detailed analysis of the subsequent steps below.
The specific method of determining the original candidate blocks corresponding to each reference block in this embodiment can be, but is not limited to being, the following:
for each reference block, the original candidate blocks corresponding to the reference block are determined according to the following flow:
first, the search region corresponding to the reference block is determined: the Ns*Ns neighbourhood centred on the top-left corner coordinate of the reference block (denoted the first neighbourhood), where Ns is the size of the search region and is usually chosen larger than the reference block size; for example, when the reference block size is 5*5, the chosen search region size is 28*28. Then, for each point in the search region (i.e. the first neighbourhood), the second neighbourhood with that point as its top-left corner, i.e. the image block of the same size as the reference block, is taken as an original candidate block.
It should be noted that, when determining the first and second neighbourhoods in this embodiment, the reference pixel of the neighbourhood is taken as the top-left corner pixel, but it is not limited to this: the pixel may also be at the centre, the top-right corner, the bottom-right corner, the bottom-left corner, or another position, as long as the pixel lies within the neighbourhood.
In addition, the image sizes of each reference block, of the first neighbourhood, and of the second neighbourhood can be, but are not limited to being, square matrices of equal height and width.
Specifically: suppose the current reference block size is 5*5 and the search region size is 28*28; the 28*28 = 784 pixels of the search region are chosen in turn, and with each pixel as the top-left corner, 784 original candidate blocks of size 5*5 are obtained.
If the current noise image is divided into 20 of the above reference blocks, then according to this method each reference block corresponds to 784 original candidate blocks.
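The candidate extraction just described can be sketched in Python as follows; the handling of search positions that fall outside the image border is an added assumption of this example, not specified by the embodiment.

    import numpy as np

    def original_candidates(Z, ref_top, ref_left, N=5, Ns=28):
        """For the reference block whose top-left corner is (ref_top, ref_left),
        take the Ns*Ns first neighbourhood centred on that corner and return the
        N*N second neighbourhood (original candidate block) whose top-left corner
        is each pixel of the first neighbourhood."""
        H, W = Z.shape
        half = Ns // 2
        cands = []
        for top in range(ref_top - half, ref_top - half + Ns):
            for left in range(ref_left - half, ref_left - half + Ns):
                if 0 <= top <= H - N and 0 <= left <= W - N:   # skip positions outside the image
                    cands.append(((top, left), Z[top:top + N, left:left + N]))
        return cands

    # usage: up to 28*28 = 784 original candidate blocks for one reference block
    rng = np.random.default_rng(5)
    Z = rng.random((64, 64))
    cands = original_candidates(Z, 30, 30)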
Step 103: apply at least two kinds of first geometric transformation to each original candidate block, and define each image block obtained after a first geometric transformation as one of the transformed candidate blocks corresponding to the current original candidate block.
For each original candidate block corresponding to each reference block of the noise image, the first geometric transformations are applied according to the predetermined geometric transformation operations; the image blocks obtained by applying the various geometric transformations to the original candidate block are defined as the transformed candidate blocks corresponding to that original candidate block.
As alternative implementations of the present embodiment, a first geometric transformation can be, but is not limited to, one of the following: translation of the image; mirror transformation of the image; transposition of the image; rotation of the image; scaling of the image; and so on.
It can also be a combination of any two of the above; for example, the first geometric transformations may be the following two: rotate by 45 degrees and then apply a horizontal mirror transformation, and rotate by 45 degrees and then apply a vertical mirror transformation.
It can also be a combination of three of the above; for example, the first geometric transformations may be the following two: rotate by 45 degrees, apply a horizontal mirror transformation and then transpose the image, and rotate by 45 degrees, apply a vertical mirror transformation and then transpose the image.
Specifically, image translation moves every point of the image horizontally and vertically by a specified translation amount.
The mirror transformation of an image comes in two kinds: horizontal mirroring and vertical mirroring. Horizontal mirroring takes the vertical center axis of the original image as the axis and exchanges the left and right halves of the image symmetrically; vertical mirroring takes the horizontal center axis as the axis and exchanges the upper and lower halves symmetrically. After a mirror transformation the height and width of the image are unchanged.
The transposition (transpose) of an image exchanges the x and y coordinates of every pixel. This operation changes the height and width of the image: after transposition they are swapped. If a transposed pixel has original coordinates (x0, y0), its coordinates (x1, y1) after transposition satisfy x1 = y0 and y1 = x0.
The rotation of an image rotates the image about a predetermined point; usually the center of the image is taken as the origin and the image is rotated by a given angle.
Before rotation, the coordinates (x0, y0) of a pixel can be written as x0 = γcos(β), y0 = γsin(β). After rotation by an angle α:
x1 = γcos(β−α) = γcos(β)cos(α) + γsin(β)sin(α) = x0·cos(α) + y0·sin(α);
y1 = γsin(β−α) = γsin(β)cos(α) − γcos(β)sin(α) = −x0·sin(α) + y0·cos(α).
The computation of the various geometric transformations of an image can be, but is not limited to being, carried out according to the prior art.
Suppose the current noise image is decomposed into 20 reference blocks as above and the technical scheme of the present embodiment is applied, so that each reference block corresponds to 784 original candidate blocks.
Suppose further that the predetermined first geometric transformations are the following seven: rotation by 45°, 90°, 135°, 180°, 225°, 270° and 315°.
Then, for any one of the 784 original candidate blocks corresponding to each reference block of the noise image, rotate that original candidate block by 45°, 90°, 135°, 180°, 225°, 270° and 315° respectively, obtaining 7 rotated image blocks; each original candidate block of the reference block thus corresponds to 7 transformed candidate blocks.
The above geometric transformations are performed on every original candidate block of the 20 reference blocks: there are 20 reference blocks, each reference block corresponds to 784 original candidate blocks, and each original candidate block corresponds to 7 transformed candidate blocks.
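For illustration only, the sketch below generates transformed candidate blocks from one original candidate block using standard NumPy operations; it covers the 90°-multiple rotations, the mirrorings and the transpose named above, while rotations by 45°, 135°, 225° or 315° on a square pixel grid would additionally need interpolation (for example scipy.ndimage.rotate) and are left out of this assumed example.

```python
import numpy as np

def transform_candidates(patch):
    """Generate transformed candidate blocks from one original candidate block (step 103).
    Only lossless grid transformations are shown; 45-degree-family rotations would
    require interpolation and are omitted in this sketch."""
    return {
        "rot90":    np.rot90(patch, 1),
        "rot180":   np.rot90(patch, 2),
        "rot270":   np.rot90(patch, 3),
        "mirror_h": np.fliplr(patch),   # symmetric about the vertical center axis
        "mirror_v": np.flipud(patch),   # symmetric about the horizontal center axis
        "transpose": patch.T,           # exchange x and y coordinates
    }
```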
Step 104: for each original candidate block, among the current original candidate block and all transformed candidate blocks corresponding to it, define the image block whose similarity to the reference block corresponding to the current original candidate block reaches a predetermined degree as one of the match blocks corresponding to that reference block.
That is, for each original candidate block of each reference block, select, from the original candidate block and all of its transformed candidate blocks, the image block whose similarity to the corresponding reference block reaches the predetermined degree, and take this image block as one of the match blocks corresponding to that reference block.
Suppose the noise image has 20 reference blocks, each reference block corresponds to 784 original candidate blocks, and each original candidate block corresponds to 7 transformed candidate blocks. For each original candidate block, among the 8 image blocks formed by the original candidate block and its 7 transformed candidate blocks, the block whose similarity to the corresponding reference block reaches the predetermined degree is chosen as a match block. Applying this step to each of the 20 reference blocks, each reference block corresponds to 784 match blocks.
When determining the similarity between an image block and the reference block, one can, but is not limited to, compare the Euclidean distances between the image blocks, choosing the image block whose Euclidean distance meets the predetermined requirement as the match block.
In the present embodiment, when choosing the match block corresponding to a reference block from an original candidate block and all of its transformed candidate blocks, one can, but is not limited to, choose the image block with the highest similarity to the reference block as the match block. The present embodiment can also, but is not limited to, preferentially choose the original candidate block as the match block once the similarity reaches the predetermined degree. For example:
Suppose the noise image has 20 reference blocks, each reference block corresponds to 784 original candidate blocks, and each original candidate block corresponds to 7 transformed candidate blocks. Let Img_i denote any reference block, Img'_k any original candidate block corresponding to Img_i, and Img''_j any transformed candidate block corresponding to Img'_k, where i is the index of the reference block and is any natural number from 1 to 20, k is the index of the original candidate block and is any natural number from 1 to 784, and j is the index of the transformed candidate block and is any natural number from 1 to 7.
For any original candidate block Img'_k of any reference block Img_i, compute the following over Img'_k and its 7 transformed candidate blocks Img''_1, Img''_2, Img''_3, Img''_4, Img''_5, Img''_6, Img''_7:
the Euclidean distance between Img'_k and Img_i, denoted d0;
the Euclidean distance between Img''_1 and Img_i, denoted d1;
the Euclidean distance between Img''_2 and Img_i, denoted d2;
the Euclidean distance between Img''_3 and Img_i, denoted d3;
the Euclidean distance between Img''_4 and Img_i, denoted d4;
the Euclidean distance between Img''_5 and Img_i, denoted d5;
the Euclidean distance between Img''_6 and Img_i, denoted d6;
the Euclidean distance between Img''_7 and Img_i, denoted d7.
Let d_min = min{d1, d2, d3, d4, d5, d6, d7} and let j be the index achieving this minimum. Compare d_min − d0 with a predetermined Euclidean-distance threshold Tol: if (d_min − d0) <= Tol, choose the transformed candidate block Img''_j corresponding to d_min as one of the match blocks of the reference block Img_i; otherwise choose the original candidate block Img'_k corresponding to d0 as one of the match blocks of Img_i. Using this method, 784 match blocks are found for each reference block, so the corresponding match blocks are found for each of the 20 reference blocks. With this technical scheme, the geometrically transformed block is used only when the distance d0 between the original candidate block and the reference block is much larger than the minimum min{d1, d2, d3, d4, d5, d6, d7} of the distances between the transformed candidate blocks and the reference block; under this condition the original candidate block is preferentially selected as the match block, which reduces the subsequent computation (for example the secondary geometric transformation, i.e. the inverse of the first geometric transformation); if the condition is not met, the transformed candidate block is chosen as the match block. This helps to reduce the amount of computation while ensuring the similarity between the found match blocks and the reference block.
When computing the Euclidean distance between any two image blocks imgA and imgB, let diff = imgA − imgB be the difference of the gray values of corresponding pixels of the two blocks; the Euclidean distance between the two blocks is then EulerDistance = sqrt(sum(diff(:).*diff(:))).
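Purely as an illustrative sketch of the selection rule above (the helper names, distance convention and threshold value are assumptions, not taken from the patent):

```python
import numpy as np

def euclidean_distance(block_a, block_b):
    diff = block_a.astype(float) - block_b.astype(float)
    return float(np.sqrt(np.sum(diff * diff)))

def select_match_block(reference, original_candidate, transformed_candidates, tol=1.0):
    """Pick one match block among an original candidate block and its transformed
    candidate blocks, following the d_min - d0 <= Tol comparison of step 104."""
    d0 = euclidean_distance(original_candidate, reference)
    names = list(transformed_candidates)
    dists = [euclidean_distance(transformed_candidates[n], reference) for n in names]
    j = int(np.argmin(dists))
    if dists[j] - d0 <= tol:
        return names[j], transformed_candidates[names[j]]
    return "original", original_candidate
```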
Step 105: determine the three-dimensional array formed by the match blocks corresponding to each reference block, obtaining one three-dimensional array per reference block.
Suppose that in step 104 there are 20 reference blocks and 784 match blocks have been found for each reference block Img_i. Each reference block Img_i corresponds to one three-dimensional array, whose elements are the gray values of the pixels of all or part of the match blocks corresponding to Img_i. In this step the 20 reference blocks thus correspond to 20 three-dimensional arrays.
It should be noted that this step may take all the match blocks corresponding to a reference block as elements of the three-dimensional array, or only a predetermined subset of them; for example, among the 784 match blocks found for a reference block Img_i, one may choose the Nx match blocks with the smallest Euclidean distance to Img_i as the elements forming the three-dimensional array.
In addition, the array elements of the three-dimensional array corresponding to each reference block can be, but are not limited to being, arranged in ascending order of the Euclidean distance between each match block and the current reference block.
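A brief sketch, under stated assumptions, of how such a three-dimensional array (a stack of similar blocks) might be assembled; keeping the Nx nearest blocks and the function name are illustrative choices.

```python
import numpy as np

def build_3d_group(reference, match_blocks, nx=8):
    """Stack the Nx match blocks closest to the reference block into a 3-D array,
    ordered by increasing Euclidean distance (step 105)."""
    ref = reference.astype(float)
    scored = sorted(match_blocks,
                    key=lambda b: np.sqrt(np.sum((b.astype(float) - ref) ** 2)))
    return np.stack(scored[:nx], axis=0)   # shape: (Nx, block_height, block_width)
```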
Step 106: apply a predetermined three-dimensional filtering to each three-dimensional array, obtaining a three-dimensional array estimate for each three-dimensional array.
The three-dimensional array F(i) formed by the match blocks corresponding to each reference block Img_i obtained in step 105 is subjected to a predetermined three-dimensional filtering operation; this three-dimensional filtering may be any prior-art filtering algorithm used for restoring digital image information.
The present embodiment provides a three-dimensional hard-threshold filtering scheme. In this scheme, the three-dimensional array F(i) formed by the match blocks of any reference block Img_i is processed as follows; assuming 20 reference blocks, the following processing is applied to each of the 20 corresponding three-dimensional arrays:
Step 1: apply a three-dimensional discrete Fourier transform (3-D Discrete Fourier Transform, 3D-DFT) to the three-dimensional array F(i) formed by the match blocks of the current reference block Img_i, obtaining the three-dimensional complex array F'(i); this also realizes a two-dimensional Fourier transform of each match block in F(i).
Step 2: apply a hard-threshold operation to the three-dimensional Fourier coefficients of the complex array F'(i): coefficients whose modulus is smaller than a predetermined threshold are set to 0, while coefficients whose modulus is greater than or equal to the threshold remain unchanged; update the three-dimensional complex array F'(i) with the thresholded coefficients.
Step 3: apply a three-dimensional inverse Fourier transform to the thresholded complex array F'(i), obtaining the three-dimensional array estimate F''(i).
At this point, the elements of the three-dimensional array estimate are the estimated gray values of the pixels of the match blocks forming the corresponding three-dimensional array.
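A compact sketch of the hard-threshold filtering of one block group, assuming the 3D-DFT is realized with NumPy's FFT routines; the threshold value is an assumed parameter.

```python
import numpy as np

def hard_threshold_filter_3d(group, threshold):
    """Steps 1-3 of the three-dimensional hard-threshold filtering (step 106):
    3-D DFT, zero out small-modulus coefficients, inverse 3-D DFT."""
    spectrum = np.fft.fftn(group)                  # Step 1: 3D-DFT of the block stack
    spectrum[np.abs(spectrum) < threshold] = 0     # Step 2: hard threshold on the modulus
    return np.real(np.fft.ifftn(spectrum))         # Step 3: inverse transform -> array estimate
```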
Step 107: for each match-block element in each three-dimensional array estimate, do the following: if the current match block in the current three-dimensional array estimate is a transformed candidate block, apply a secondary geometric transformation to the array elements corresponding to the current transformed candidate block, and update the corresponding array elements of the current three-dimensional array estimate to the array elements after the secondary geometric transformation; otherwise do not update the array elements of the three-dimensional array estimate.
In the present embodiment, a match block in the three-dimensional array estimate of a reference block may be an original candidate block taken directly from the noise image in its original orientation, or it may be a transformed candidate block obtained by applying a geometric transformation to an image block taken from the noise image. Therefore, before step 108 estimates the gray values of the pixels of the corresponding reference block from the match blocks, the present embodiment also performs the processing of this step:
For each array-element match block in each three-dimensional array estimate, if this match block is a transformed candidate block obtained by applying a first geometric transformation to an original candidate block, apply to it the secondary geometric transformation that is the inverse of that first geometric transformation, transforming the match block back. For example: if a match-block element of a three-dimensional array estimate is a transformed candidate block obtained by rotating an original candidate block clockwise by 90 degrees, this element of the three-dimensional array estimate is given the secondary geometric transformation of a 90-degree counter-clockwise rotation, returning the match block to its original orientation; the array elements corresponding to the current transformed candidate block in the current three-dimensional array estimate are then updated to the array after the secondary geometric transformation.
If an array-element match block of a three-dimensional array estimate is an original candidate block, the array elements corresponding to that original candidate block need not be updated.
After the above secondary geometric transformations have been applied to those match-block elements of the three-dimensional array of each reference block that are transformed candidate blocks, and the array elements have been updated, the updated three-dimensional array estimate is obtained.
This step thus yields, for each reference block, the updated three-dimensional array estimate (updated wherever its match-block elements are transformed candidate blocks).
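A short sketch of the secondary (inverse) geometric transformation of step 107, continuing the transformation labels assumed in the earlier sketches; only the inverses of the transformations shown there are covered.

```python
import numpy as np

def undo_first_transform(block_estimate, transform_name):
    """Apply the secondary geometric transformation (the inverse of the first geometric
    transformation) to one filtered block of the 3-D array estimate (step 107)."""
    inverse = {
        "rot90":    lambda b: np.rot90(b, -1),
        "rot180":   lambda b: np.rot90(b, -2),
        "rot270":   lambda b: np.rot90(b, -3),
        "mirror_h": np.fliplr,        # mirrorings and the transpose are their own inverses
        "mirror_v": np.flipud,
        "transpose": lambda b: b.T,
        "original": lambda b: b,      # original candidate blocks are left unchanged
    }
    return inverse[transform_name](block_estimate)
```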
Step 108: according to the estimated gray values in all the three-dimensional array estimates corresponding to all the reference blocks, estimate and determine, according to a predetermined estimation method, the gray value of each pixel in the noise image, obtaining the single treatment image.
In this step, the gray value of each pixel of the noise image is computed from the three-dimensional array estimates formed by the match blocks of the reference blocks. Specifically, determine the reference blocks containing the pixel to be computed, and compute the gray value of that pixel from the estimated gray values of the corresponding pixels in the match blocks of those reference blocks. If the current noise image has n*n pixels, the above scheme is applied to every pixel to obtain its gray value, and the new image so formed is the single treatment image of the present embodiment.
If a pixel belongs to several reference blocks, its gray value can be, but is not limited to being, determined by a weighted average of the gray values of the corresponding pixels in the three-dimensional array estimates formed by the match blocks of those reference blocks.
Suppose a pixel A(x, y) lies in two reference blocks, each reference block corresponding to a three-dimensional array formed by 8 match blocks; according to the coordinates of A(x, y), the gray-value estimates of the corresponding point in each match block are determined.
For example, in the first reference block the pixel-value estimates of the corresponding points of the eight match blocks of pixel A(x, y) are A1, A2, A3, A4, A5, A6, A7 and A8, and the corresponding weight is predefined as 0.6; the weights may be predefined by the computer system or predefined by the user based on experience.
In the second reference block the pixel-value estimates of the corresponding points of the eight match blocks of pixel A(x, y) are B1, B2, B3, B4, B5, B6, B7 and B8, and the corresponding weight is predefined as 0.4.
The gray value x of pixel A(x, y) can then be computed according to the following expression:
x = sum(A1+A2+A3+A4+A5+A6+A7+A8)/8*0.6 + sum(B1+B2+B3+B4+B5+B6+B7+B8)/8*0.4.
Performing the same processing for every pixel and recording the gray value of each pixel yields the single treatment image.
It should be noted that the present embodiment uses the estimated gray values of the pixels of the match blocks corresponding to each reference block to estimate the gray values of the pixels in the reference blocks, and the weighted-average algorithm given above can be, but is not limited to being, used; other prior-art algorithms may also be adopted, which is not repeated here.
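A minimal aggregation sketch for the weighted averaging just described, under the assumption that each pixel carries a list of (per-reference-block mean estimate, weight) pairs; the accumulator layout and names are illustrative only.

```python
def aggregate_pixel(estimates_and_weights):
    """Weighted average over the reference blocks covering one pixel (step 108).
    Each entry is (mean of the per-match-block estimates for this pixel, block weight)."""
    num = sum(mean_est * w for mean_est, w in estimates_and_weights)
    den = sum(w for _, w in estimates_and_weights)
    return num / den if den > 0 else 0.0

# usage for the example above (weights 0.6 and 0.4):
# gray = aggregate_pixel([(sum_of_A_estimates / 8, 0.6), (sum_of_B_estimates / 8, 0.4)])
```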
Therefore, with the technical scheme of the present embodiment, when the match blocks of a reference block are sought, geometric transformations are applied to the original candidate blocks in addition to taking the original candidate blocks directly from the noise image, and the transformed candidate blocks obtained by these geometric transformations are used, together with the original candidate blocks, as candidate image blocks for the match blocks of the reference block, rather than using only the original candidate blocks as match blocks. This improves the quality (the similarity to the reference block) of the match blocks of each reference block, and therefore improves the restoration and enhancement performance when the pixel values of the noise image are estimated from the match blocks.
In addition, if the image is affected by relatively severe noise (sigma > 40) during transmission or image acquisition, so that the digital image exhibits significant displacement and deformation, the technical scheme of the present embodiment can still find good match blocks in such highly distorted noise images, improving the denoising and enhancement effects.
Embodiment 2:
The present embodiment provides another image processing method, as shown in Figure 2. The main difference between this method and the technical scheme of embodiment 1 is that, after step 108 of embodiment 1, the present method further comprises:
Step 201: obtain the secondary treating image from the single treatment image and the noise image.
In this step, the single treatment image is further used to enhance the noise image, improving the enhancement and restoration effect on the noise image.
This further processing helps to further improve the image enhancement and restoration effect.
The detailed flow of step 201 is decomposed with further reference to Fig. 3; as shown in Figure 3, the following flow is further included after step 108:
Step 301: determine the coordinate position of each reference block in the noise image.
Each reference block here is one of the reference blocks obtained by the decomposition in step 101 of embodiment 1.
Step 302: according to the coordinates of the reference blocks in the noise image, determine the second reference blocks in the single treatment image.
The coordinate positions of the second reference blocks determined in the single treatment image in this step coincide with the coordinate positions of the corresponding reference blocks in the noise image.
Step 303: in the single treatment image, determine the second original candidate blocks corresponding to each second reference block.
The detailed processing of this step can be, but is not limited to being, the same as step 102 of embodiment 1.
For example, but without limitation, for each second reference block the corresponding second original candidate blocks are determined according to the following flow:
First, determine the search region corresponding to the second reference block: an Ns*Ns neighborhood whose upper-left corner is the upper-left coordinate of the second reference block (denoted the third neighborhood), where Ns is the size of the search region; Ns is usually chosen larger than the reference block size, for example when the reference block size is 5*5 the chosen search region size is 28*28. Then, for each point inside the search region (i.e. the third neighborhood), take the neighborhood whose upper-left corner is that point (denoted the fourth neighborhood); each image block so formed (i.e. each fourth neighborhood), of the same size as the second reference block, is one second original candidate block.
It should be noted that, when determining the third and fourth neighborhoods, the present embodiment takes the reference pixel of the neighborhood as its upper-left corner; the reference pixel may instead be the center pixel, or the pixel at the upper-right, lower-right or lower-left corner, or at any other position, as long as the reference pixel lies within the neighborhood.
Specifically: suppose the current second reference block size is 5*5 and the search region size is 28*28. Take each of the 28*28 = 784 pixels of the search region as an upper-left corner, obtaining 784 second original candidate blocks of size 5*5.
If the current single treatment image is divided into 20 second reference blocks as above, then according to this method each second reference block corresponds to 784 second original candidate blocks.
Step 304: apply at least two kinds of first geometric transformation to each second original candidate block, and define each image block obtained after a first geometric transformation as one of the second transformed candidate blocks corresponding to the current second original candidate block.
The detailed processing of this step can be, but is not limited to being, the same as step 103 of embodiment 1.
Step 305: for each second original candidate block, among the current second original candidate block and all second transformed candidate blocks corresponding to it, define the image block whose similarity to the corresponding second reference block reaches a predetermined degree as one of the second match blocks of that second reference block.
The detailed processing of this step can be, but is not limited to being, the same as step 104 of embodiment 1.
For example, but without limitation, the second match blocks corresponding to each second reference block are determined from the Euclidean distances to the second reference block. Specifically, in the single treatment image the following is performed for each second original candidate block:
compute the Euclidean distance between the current second original candidate block and the second reference block corresponding to it, denoted d2_0';
compute the Euclidean distance between each second transformed candidate block of the current second original candidate block and the corresponding second reference block; let the Euclidean distance between the i-th second transformed candidate block and the second reference block be d2_i', where i is any natural number not greater than n, and n is the number of second transformed candidate blocks corresponding to the current second original candidate block;
according to the values d2_i' and d2_0', determine the similarity between the second reference block and, respectively, the current second original candidate block and its second transformed candidate blocks, and define the image block whose similarity reaches the predetermined degree as one of the second match blocks of the second reference block.
In the present embodiment, when choosing the second match block corresponding to a second reference block from a second original candidate block and all of its second transformed candidate blocks, one can, but is not limited to, choose the image block with the highest similarity to the second reference block as the second match block. The present embodiment can also, but is not limited to, preferentially choose the second original candidate block as the second match block once the similarity reaches the predetermined degree. For example:
Suppose the single treatment image has 20 second reference blocks, each second reference block corresponds to 784 second original candidate blocks, and each second original candidate block corresponds to 7 second transformed candidate blocks. Let Img2_i denote any second reference block, Img2'_k any second original candidate block corresponding to Img2_i, and Img2''_j any second transformed candidate block corresponding to Img2'_k, where i is the index of the second reference block and is any natural number from 1 to 20, k is the index of the second original candidate block and is any natural number from 1 to 784, and j is the index of the second transformed candidate block and is any natural number from 1 to 7.
For any second original candidate block Img2'_k of any second reference block Img2_i, compute the following over Img2'_k and its 7 second transformed candidate blocks Img2''_1, Img2''_2, Img2''_3, Img2''_4, Img2''_5, Img2''_6, Img2''_7:
the Euclidean distance between Img2'_k and Img2_i, denoted d0_2;
the Euclidean distance between Img2''_1 and Img2_i, denoted d1_2;
the Euclidean distance between Img2''_2 and Img2_i, denoted d2_2;
the Euclidean distance between Img2''_3 and Img2_i, denoted d3_2;
the Euclidean distance between Img2''_4 and Img2_i, denoted d4_2;
the Euclidean distance between Img2''_5 and Img2_i, denoted d5_2;
the Euclidean distance between Img2''_6 and Img2_i, denoted d6_2;
the Euclidean distance between Img2''_7 and Img2_i, denoted d7_2.
Let d_min2 = min{d1_2, d2_2, d3_2, d4_2, d5_2, d6_2, d7_2} and let j be the index achieving this minimum. Compare d_min2 − d0_2 with a predetermined Euclidean-distance threshold Tol_2 (which can be, but is not limited to being, equal to Tol): if (d_min2 − d0_2) <= Tol_2, choose the second transformed candidate block Img2''_j corresponding to d_min2 as one of the second match blocks of the second reference block Img2_i; otherwise choose the second original candidate block Img2'_k corresponding to d0_2 as one of the second match blocks of Img2_i. Using this method, 784 second match blocks are found for each second reference block, so the corresponding second match blocks are found for each of the 20 second reference blocks. As in embodiment 1, the geometrically transformed block is used only when the distance d0_2 between the second original candidate block and the second reference block is much larger than the minimum of the distances between the second transformed candidate blocks and the second reference block; under this condition the second original candidate block is preferentially selected as the second match block, which reduces the subsequent computation (for example the secondary geometric transformation, i.e. the inverse of the first geometric transformation); if the condition is not met, the second transformed candidate block is chosen as the second match block. This helps to reduce the amount of computation while ensuring the similarity between the found second match blocks and the second reference block.
Step 306: for each second original candidate block, among the current second original candidate block and all second transformed candidate blocks corresponding to it, define the image block whose similarity to the corresponding second reference block reaches the predetermined degree as one of the second match blocks of the second reference block.
The detailed processing of this step can be, but is not limited to being, the same as step 105 of embodiment 1.
Step 307: determine the coordinate position, in the single treatment image, of each second match block corresponding to each second reference block.
Suppose 20 second reference blocks are obtained in step 306; the coordinate positions of their second match blocks in the single treatment image are determined respectively.
Step 308: in the noise image, determine the secondary match blocks corresponding to each of the reference blocks of step 301.
Specifically, after all second match blocks of each second reference block (each second reference block corresponding to a reference block of the noise image) have been determined in the single treatment image in step 307, these second match blocks are used to determine the secondary match blocks corresponding to each reference block in the noise image. The concrete scheme is as follows: for each reference block, the coordinates of each secondary match block corresponding to that reference block are made to coincide, respectively, with the coordinate positions in the single treatment image of the second match blocks of the second reference block corresponding to the current reference block.
Suppose the noise image has 20 reference blocks and the single treatment image has 20 second reference blocks, each second reference block corresponding to 784 second match blocks. In the present embodiment the secondary match blocks are determined in the noise image for each of the 20 reference blocks, and the number of secondary match blocks corresponding to each reference block is 784.
At this point, the secondary match blocks corresponding to each reference block in the noise image have been determined.
Step 309: for each secondary match block, judge whether the second match block corresponding to it in the single treatment image is a second transformed candidate block; if so, apply to the current secondary match block the same first geometric transformation that produced the current second transformed candidate block, and update the current secondary match block to the image block after this first geometric transformation; otherwise do not update the current secondary match block.
That is, after step 308, for each secondary match block it is further judged whether its corresponding second match block in the single treatment image is an image block obtained by applying one of the first geometric transformations described above to a second original candidate block. If so, the secondary match block is given the same first geometric transformation that was applied in the single treatment image (still denoted the first geometric transformation), and the image block formed after this geometric transformation replaces the former secondary match block as one of the final secondary match blocks; if not, no such geometric transformation needs to be applied to the secondary match block.
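A small sketch, with assumed bookkeeping, of how steps 308 and 309 might be realized: blocks are cut from the noise image at the coordinates of the second match blocks found in the single treatment image and re-transformed when the corresponding second match block was a transformed candidate; the coordinate/label record and the limited set of transformations are assumptions of this sketch.

```python
import numpy as np

_FIRST_TRANSFORMS = {
    "rot90":    lambda b: np.rot90(b, 1),
    "rot180":   lambda b: np.rot90(b, 2),
    "rot270":   lambda b: np.rot90(b, 3),
    "original": lambda b: b,
}

def secondary_match_blocks(noise_image, second_matches, block=5):
    """second_matches: list of (top_left, transform_name) records describing the second
    match blocks found in the single treatment image. Each secondary match block is cut
    from the noise image at the same coordinates and, if needed, given the same first
    geometric transformation again (steps 308-309)."""
    result = []
    for (r, c), name in second_matches:
        patch = noise_image[r:r + block, c:c + block]
        result.append((_FIRST_TRANSFORMS[name](patch), name))
    return result
```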
Step 310: determine the second three-dimensional array formed by the secondary match blocks corresponding to each reference block in the noise image, obtaining one second three-dimensional array per reference block.
The array elements of each second three-dimensional array are the gray values of the pixels of the secondary match blocks forming the current second three-dimensional array.
Suppose, as above, that there are 20 reference blocks and that 784 secondary match blocks have been found for each reference block Img_i. Each reference block Img_i corresponds to one second three-dimensional array, whose elements are the gray values of the pixels of all or part of the secondary match blocks corresponding to Img_i. In this step the 20 reference blocks correspond to 20 second three-dimensional arrays.
It should be noted that this step may take all the secondary match blocks corresponding to a reference block as elements of the second three-dimensional array, or only a predetermined subset of them; for example, among the 784 secondary match blocks found for a reference block Img_i, one may choose the Nx secondary match blocks with the smallest Euclidean distance to Img_i as the elements forming the second three-dimensional array.
In addition, the array elements of the second three-dimensional array corresponding to each reference block can be, but are not limited to being, arranged in ascending order of the Euclidean distance between each secondary match block and the current reference block.
The detailed processing of this step can be, but is not limited to being, the same as step 105 of embodiment 1.
Step 311: apply the predetermined three-dimensional filtering to each second three-dimensional array, obtaining a second three-dimensional array estimate for each second three-dimensional array.
At this point, the elements of each second three-dimensional array estimate are the estimated gray values of the pixels of the secondary match blocks forming the corresponding second three-dimensional array.
This three-dimensional filtering can be, but is not limited to being, three-dimensional hard-threshold filtering, for example as follows:
In this scheme, the second three-dimensional array F2(i) formed by the secondary match blocks of any reference block Img_i is processed as follows; assuming 20 reference blocks, the following processing is applied to each of the 20 corresponding second three-dimensional arrays:
Step 1: apply a three-dimensional discrete Fourier transform (3-D Discrete Fourier Transform, 3D-DFT) to the second three-dimensional array F2(i) formed by the secondary match blocks of the current reference block Img_i, obtaining the second three-dimensional complex array F2'(i); this also realizes a two-dimensional Fourier transform of each secondary match block in F2(i).
Step 2: apply a hard-threshold operation to the three-dimensional Fourier coefficients of the second three-dimensional complex array F2'(i): coefficients whose modulus is smaller than a predetermined threshold are set to 0, while coefficients whose modulus is greater than or equal to the threshold remain unchanged; update the second three-dimensional complex array F2'(i) with the thresholded coefficients.
Step 3: apply a three-dimensional inverse Fourier transform to the thresholded second three-dimensional complex array F2'(i), obtaining the second three-dimensional array estimate F2''(i).
The detailed processing of this step can be, but is not limited to being, the same as step 106 of embodiment 1.
Step 312: for each secondary-match-block element in each second three-dimensional array estimate, do the following: if the current secondary match block in the current second three-dimensional array estimate is an image block obtained through the first geometric transformation of step 309, apply to the array elements corresponding to the current secondary match block the secondary geometric transformation that is the inverse of that first geometric transformation, and update the corresponding array elements of the current second three-dimensional array estimate to the array elements after the secondary geometric transformation; otherwise do not update the array elements of this second three-dimensional array.
The detailed processing of this step can be, but is not limited to being, essentially the same as step 107 of embodiment 1.
This step thus yields, for each reference block of the noise image, the updated second three-dimensional array estimate (updated wherever its secondary-match-block elements are image blocks obtained through the first geometric transformation).
Step 313: according to the estimated gray values in all the second three-dimensional array estimates corresponding to all the reference blocks, estimate and determine, according to the predetermined estimation method, the gray value of each pixel in the noise image, obtaining the secondary treating image.
The detailed processing of this step can be, but is not limited to being, the same as step 108 of embodiment 1.
That is, according to the estimated gray values in all the second three-dimensional array estimates corresponding to all the reference blocks of the noise image, the gray value of each pixel in the noise image is estimated and determined according to the predetermined estimation method, and the secondary treating image is obtained.
Therefore, the present embodiment further exploits the single treatment image obtained in embodiment 1, whose enhancement and restoration quality is better than that of the noise image: second match blocks are found in the single treatment image for the second reference blocks corresponding to the reference blocks of the noise image, and the search over geometrically transformed image blocks again ensures that match blocks of good quality and high similarity are found; the correspondence between the single treatment image and the noise image is then used to locate the secondary match blocks directly in the noise image, and the gray value of each pixel of the noise image is re-estimated from the pixel-value estimates of the secondary match blocks, performing a second enhancement and restoration of the noise image. This technical scheme can further improve the restoration effect on the noise image.
Embodiment 3:
As shown in Figure 4, the present embodiment provides an image processing device, which mainly comprises: picture breakdown unit 401, original candidates piece determining unit 402, conversion candidate blocks determining unit 403, match block determining unit 404, three-dimensional array determining unit 405, three-dimensional filtering unit 406, geometry inverse transformation block 407 and image evaluation unit 408.
The connection relationships and working principles of these components are as follows.
Picture breakdown unit 401, for noise image being decomposed into at least two image blocks, be defined as respectively each reference block by each image block;
Original candidates piece determining unit 402, be connected with picture breakdown unit 401, for determining respectively each original candidates piece corresponding to each reference block;
Conversion candidate blocks determining unit 403, with original candidates piece determining unit 402, be connected, for to each original candidates piece, carry out respectively at least two kinds of first geometric transformations, respectively the image block after each first geometric transformation is defined as respectively: each original candidates piece is the corresponding candidate blocks that respectively converts respectively;
Match block determining unit 404, with original candidates piece determining unit 402 and conversion candidate blocks determining unit 403, be connected respectively, be used for each original candidates piece, respectively in current original candidates piece and all conversion candidate blocks corresponding to current original candidates piece, the similarity of wherein corresponding with current original candidates piece reference block is reached to the image block of predetermined extent, be defined as one of match block that reference block is corresponding;
Three-dimensional array determining unit 405, with match block determining unit 404, be connected, determine respectively the three-dimensional array of each match block formation that each reference block is corresponding, obtain each reference block corresponding each three-dimensional array respectively, the array element of each three-dimensional array is respectively: the gray-scale value of each pixel that forms each match block of current three-dimensional array;
Three-dimensional filtering unit 406, with three-dimensional array determining unit 405, be connected, for respectively each three-dimensional array being carried out to predetermined three-dimensional filtering, each three-dimensional array that obtains respectively each three-dimensional array estimates, the array element that each three-dimensional array is estimated is respectively: form the estimated value of gray-scale value of each pixel that current three-dimensional array is estimated each match block of corresponding three-dimensional array;
How much inverse transformation block 407, with three-dimensional filtering unit 406 and match block determining unit 404, be connected respectively, for each match block that each three-dimensional array is estimated, if in current three-dimensional array estimation, the current matching piece is the conversion candidate blocks, further to the Current Transform candidate blocks, corresponding array element carries out secondary geometric transformation, and array element corresponding to Current Transform candidate blocks during current three-dimensional array is estimated is updated to: the array element after each secondary geometric transformation; Otherwise do not upgrade three-dimensional array, do not estimate, wherein each secondary geometric transformation is respectively: with the geometric transformation of each first geometric transformation contrary;
Image evaluation unit 408, with geometry inverse transformation block 407 and three-dimensional filtering unit 406, be connected respectively, for estimate the estimated value of the gray-scale value of each pixels according to all three-dimensional array corresponding to all reference blocks difference, according to predetermined evaluation method, the gray-scale value of each pixel in noise image is determined in estimation, obtains the single treatment image.
Further working principles and the image processing flow can be, but are not limited to being, understood with reference to the detailed descriptions in embodiments 1 and 2.
Embodiment 4:
As shown in Figure 5, the present embodiment provides an image processing system, which mainly comprises: image dissector 501, original candidates piece determiner 502, conversion candidate blocks determiner 503, match block determiner 504, three-dimensional array determiner 505, three-dimensional filter 506, geometry inverse converter 507 and image estimation device 508.
The connection relationships and working principles of these components are as follows.
Image dissector 501, for noise image being decomposed into at least two image blocks, be defined as respectively each reference block by each image block;
Original candidates piece determiner 502, be connected with image dissector 501, for determining respectively each original candidates piece corresponding to each reference block;
Conversion candidate blocks determiner 503, with original candidates piece determiner 502, be connected, for to each original candidates piece, carry out respectively at least two kinds of first geometric transformations, respectively the image block after each first geometric transformation is defined as respectively: each original candidates piece is the corresponding candidate blocks that respectively converts respectively;
Match block determiner 504, with original candidates piece determiner 502 and conversion candidate blocks determiner 503, be connected respectively, be used for each original candidates piece, respectively in current original candidates piece and all conversion candidate blocks corresponding to current original candidates piece, the similarity of wherein corresponding with current original candidates piece reference block is reached to the image block of predetermined extent, be defined as one of match block that reference block is corresponding;
Three-dimensional array determiner 505, with match block determiner 504, be connected, determine respectively the three-dimensional array of each match block formation that each reference block is corresponding, obtain each reference block corresponding each three-dimensional array respectively, the array element of each three-dimensional array is respectively: the gray-scale value of each pixel that forms each match block of current three-dimensional array;
Three-dimensional filter 506, with three-dimensional array determiner 505, be connected, for respectively each three-dimensional array being carried out to predetermined three-dimensional filtering, each three-dimensional array that obtains respectively each three-dimensional array estimates, the array element that each three-dimensional array is estimated is respectively: form the estimated value of gray-scale value of each pixel that current three-dimensional array is estimated each match block of corresponding three-dimensional array;
How much inverse converters 507, with three-dimensional filter 506 and match block determiner 504, be connected respectively, for each match block that each three-dimensional array is estimated, if in current three-dimensional array estimation, the current matching piece is the conversion candidate blocks, further to the Current Transform candidate blocks, corresponding array element carries out secondary geometric transformation, and array element corresponding to Current Transform candidate blocks during current three-dimensional array is estimated is updated to: the array element after each secondary geometric transformation; Otherwise do not upgrade three-dimensional array, do not estimate, wherein each secondary geometric transformation is respectively: with the geometric transformation of each first geometric transformation contrary;
Image estimation device 508, with geometry inverse converter 507 and three-dimensional filter 506, be connected respectively, for estimate the estimated value of the gray-scale value of each pixels according to all three-dimensional array corresponding to all reference blocks difference, according to predetermined evaluation method, the gray-scale value of each pixel in noise image is determined in estimation, obtains the single treatment image.
Further working principles and the image processing flow can be, but are not limited to being, understood with reference to the detailed descriptions in embodiments 1 and 2.
Embodiment 5:
As shown in Figure 6, the present embodiment provides a computer system, which mainly comprises: processor 601, memory 602 and communication line 603.
The communication line 603 connects the processor 601 and the memory 602; the processor 601 calls, through the communication line 603, the code stored in the memory 602, so as to perform the following image processing flow:
Noise image is decomposed into at least two image blocks, each image block is defined as respectively to each reference block;
Determine respectively each original candidates piece corresponding to each reference block,
To each original candidates piece, carry out respectively at least two kinds of first geometric transformations, respectively the image block after each first geometric transformation is defined as respectively: each original candidates piece is the corresponding candidate blocks that respectively converts respectively,
To each original candidates piece, respectively in current original candidates piece and all conversion candidate blocks corresponding to current original candidates piece, the similarity of wherein corresponding with current original candidates piece reference block is reached to the image block of predetermined extent, be defined as one of match block that reference block is corresponding;
Determine respectively the three-dimensional array that each match block that each reference block is corresponding forms, obtain each reference block corresponding each three-dimensional array respectively, the array element of each three-dimensional array is respectively: the gray-scale value of each pixel that forms each match block of current three-dimensional array;
Respectively each three-dimensional array is carried out to predetermined three-dimensional filtering, each three-dimensional array that obtains respectively each three-dimensional array estimates, the array element that each three-dimensional array is estimated is respectively: form the estimated value of gray-scale value of each pixel that current three-dimensional array is estimated each match block of corresponding three-dimensional array;
for each match block in each three-dimensional array estimate: if the current match block in the current three-dimensional array estimate is a conversion candidate block, apply a secondary geometric transformation to the array elements corresponding to the current conversion candidate block, and update the array elements corresponding to the current conversion candidate block in the current three-dimensional array estimate to the array elements after the secondary geometric transformation; otherwise do not update the three-dimensional array estimate,
where each secondary geometric transformation is the geometric transformation inverse to the corresponding first geometric transformation;
according to the estimated gray values in all the three-dimensional array estimates corresponding to all the reference blocks, estimate and determine, according to a predetermined estimation method, the gray value of each pixel in the noise image, obtaining the single treatment image, i.e. the enhanced and restored noise image.
The further processing of the noise image performed by the processor 601 when calling the programs stored in the memory 602 can be, but is not limited to being, understood with further reference to the corresponding descriptions in embodiments 1 and 2.
Test result analysis:
In order to further illustrate the beneficial effects of the technical schemes of the present embodiments, the processing flows of embodiments 1 and 2 were used to process various noise images; the results are shown in the accompanying drawings, where:
The first effect comparison is shown in Figs. 7-10.
Fig. 7 is the first experimental original image;
Fig. 8 is the first noise image obtained after adding Gaussian noise with sigma = 20 to the first original image shown in Fig. 7;
Fig. 9 is the processed image obtained after processing the first noise image shown in Fig. 8 with the conventional prior-art BM3D technical scheme;
Fig. 10 is the single treatment image obtained after processing the first noise image shown in Fig. 8 with the technical scheme of embodiment 1;
The second effect comparison is shown in Figs. 11-14.
Fig. 11 is the second experimental original image;
Fig. 12 is the second noise image obtained after adding Gaussian noise with sigma = 20 to the second original image shown in Fig. 11;
Fig. 13 is the processed image obtained after processing the second noise image shown in Fig. 12 with the conventional prior-art BM3D technical scheme;
Fig. 14 is the single treatment image obtained after processing the second noise image shown in Fig. 12 with the technical scheme of embodiment 1;
The third effect comparison is shown in Figs. 15-18.
Fig. 15 is the third experimental original image;
Fig. 16 is the third noise image obtained after adding Gaussian noise with sigma = 20 to the third original image shown in Fig. 15;
Fig. 17 is the processed image obtained after processing the third noise image shown in Fig. 16 with the conventional prior-art BM3D technical scheme;
Fig. 18 is the single treatment image obtained after processing the third noise image shown in Fig. 16 with the technical scheme of embodiment 1;
The fourth effect comparison is shown in Figs. 19-22.
Fig. 19 is the fourth experimental original image;
Fig. 20 is the fourth noise image obtained after adding Gaussian noise with sigma = 20 to the fourth original image shown in Fig. 19;
Fig. 21 is the processed image obtained after processing the fourth noise image shown in Fig. 20 with the conventional prior-art BM3D technical scheme;
Fig. 22 is the single treatment image obtained after processing the fourth noise image shown in Fig. 20 with the technical scheme of embodiment 1.
The peak signal-to-noise ratios (Peak Signal to Noise Ratio, PSNR) of the images in Figs. 9, 13, 17, 21 and of the images in Figs. 10, 14, 18, 22 were computed, giving the data shown in Table 1:
Table 1: PSNR comparison of the processed images
Therefore, compared with the prior art, the technical schemes of the present embodiments effectively reduce noise, improve the enhancement and restoration effect of the image, and restore more of the image's texture features.
The device embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units: they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the present embodiments, which those of ordinary skill in the art can understand and implement without creative effort.
Through the above description of the embodiments, those skilled in the art can clearly understand that each embodiment can be implemented by software plus a necessary general-purpose hardware platform, or of course by hardware. Based on such an understanding, the part of the above technical scheme that in essence contributes to the prior art can be embodied in the form of a software product; this computer software product can be stored in a computer-readable storage medium, such as a ROM/RAM, magnetic disk or optical disc, and includes instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to execute the methods described in the embodiments or in parts of the embodiments.
The above embodiments do not limit the protection scope of this technical scheme. Any modifications, equivalent replacements and improvements made within the spirit and principles of the above embodiments shall be included within the protection scope of this technical scheme.

Claims (22)

1. A method for image processing, characterized by comprising:
Decomposing a noise image into at least two image blocks, and determining each of the image blocks as a reference block respectively;
Determining respectively each original candidate block corresponding to each of the reference blocks,
Performing respectively at least two kinds of first geometric transformations on each of the original candidate blocks, and determining the image blocks obtained after the first geometric transformations respectively as the conversion candidate blocks corresponding to each original candidate block,
For each of the original candidate blocks, determining, among the current original candidate block and all the conversion candidate blocks corresponding to the current original candidate block, the image block whose similarity with the reference block corresponding to the current original candidate block reaches a predetermined degree, as one of the match blocks corresponding to the reference block;
Determining respectively the three-dimensional array formed by the match blocks corresponding to each of the reference blocks, so as to obtain the three-dimensional array corresponding to each reference block, wherein the array elements of each three-dimensional array are: the gray-scale values of the pixels forming the match blocks of the current three-dimensional array;
Performing respectively a predetermined three-dimensional filtering on each of the three-dimensional arrays, so as to obtain a three-dimensional array estimate of each three-dimensional array, wherein the array elements of each three-dimensional array estimate are: the estimated values of the gray-scale values of the pixels of the match blocks of the three-dimensional array corresponding to the current three-dimensional array estimate;
For each match block in each of the three-dimensional array estimates, if the current match block in the current three-dimensional array estimate is a conversion candidate block, further performing a secondary geometric transformation on the array elements corresponding to the current conversion candidate block, and updating the array elements corresponding to the current conversion candidate block in the current three-dimensional array estimate to: the array elements after the secondary geometric transformation; otherwise, not updating the three-dimensional array estimate,
Wherein each secondary geometric transformation is: the inverse geometric transformation of the corresponding first geometric transformation;
Estimating and determining, according to the estimated values of the gray-scale values of the pixels in all the three-dimensional array estimates respectively corresponding to all the reference blocks, and according to a predetermined evaluation method, the gray-scale value of each pixel in the noise image, so as to obtain a single treatment image.
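By way of illustration, the single-pass flow of claim 1 can be sketched end to end as follows. This is a minimal Python/NumPy sketch, not the claimed implementation: the particular set of first geometric transformations (rotations and flips), the block size N, the step length, the search range, the number K of match blocks kept per reference block, the use of hard thresholding in the three-dimensional Fourier domain as the predetermined three-dimensional filtering, and plain averaging as the predetermined evaluation method are all assumptions made here for concreteness.

```python
import numpy as np

def transforms(block):
    """First geometric transformations used in this sketch: three rotations and two flips.
    The claim only requires 'at least two kinds'; this particular set is an assumption."""
    return [np.rot90(block, 1), np.rot90(block, 2), np.rot90(block, 3),
            np.fliplr(block), np.flipud(block)]

def inverse_transform(block, idx):
    """Secondary geometric transformation: the inverse of transformation number idx."""
    inverse = [lambda b: np.rot90(b, -1), lambda b: np.rot90(b, -2), lambda b: np.rot90(b, -3),
               np.fliplr, np.flipud]
    return inverse[idx](block)

def denoise_once(noise, N=8, step=4, search=8, K=8, thr=60.0):
    """One pass of the claimed flow on a 2-D gray-scale image (simplified sketch)."""
    noise = np.asarray(noise, dtype=np.float64)
    H, W = noise.shape
    acc = np.zeros((H, W))        # weighted sum of estimated gray-scale values
    wgt = np.zeros((H, W))        # accumulated weights
    for y in range(0, H - N + 1, step):
        for x in range(0, W - N + 1, step):
            ref = noise[y:y+N, x:x+N]                         # reference block
            matches = []
            for cy in range(max(0, y - search), min(H - N, y + search) + 1, step):
                for cx in range(max(0, x - search), min(W - N, x + search) + 1, step):
                    cand = noise[cy:cy+N, cx:cx+N]            # original candidate block
                    # squared Euclidean distance; the ordering is the same as for the plain distance
                    best, best_idx, best_d = cand, -1, np.sum((cand - ref) ** 2)
                    for i, t in enumerate(transforms(cand)):  # conversion candidate blocks
                        d = np.sum((t - ref) ** 2)
                        if d < best_d:
                            best, best_idx, best_d = t, i, d
                    matches.append((best_d, best, best_idx, cy, cx))
            matches.sort(key=lambda m: m[0])                  # ascending distance to the reference block
            matches = matches[:K]                             # match blocks forming the 3-D array
            stack = np.stack([m[1] for m in matches])         # three-dimensional array
            spec = np.fft.fftn(stack)                         # predetermined three-dimensional filtering:
            spec[np.abs(spec) < thr] = 0.0                    # hard thresholding in the Fourier domain
            est = np.real(np.fft.ifftn(spec))                 # three-dimensional array estimate
            for k, (_, _, idx, cy, cx) in enumerate(matches):
                patch = inverse_transform(est[k], idx) if idx >= 0 else est[k]
                acc[cy:cy+N, cx:cx+N] += patch                # aggregation at the original position
                wgt[cy:cy+N, cx:cx+N] += 1.0
    wgt[wgt == 0] = 1.0
    return acc / wgt                                          # single treatment image
```

The step the claim turns on is visible near the end of the inner loop: any array element that came from a conversion candidate block is passed through the inverse (secondary) geometric transformation before it is aggregated back at the position of its original candidate block.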
2. the method that image according to claim 1 is processed, is characterized in that,
In step: after obtaining the single treatment image, also comprise:
According to described single treatment image and described noise image, obtain the secondary treating image.
3. the method that image according to claim 2 is processed, is characterized in that,
According to described single treatment image and described noise image, obtain the secondary treating image, specifically comprise:
Determine the coordinate position of each the described reference block in described noise image;
Respectively in described single treatment image, determine each second reference block, the coordinate position of each described the second reference block respectively with described noise image in the coordinate position of each described reference block corresponding;
In described single treatment image, determine respectively each second original candidates piece that each described second reference block is corresponding,
To each described the second original candidates piece, carry out respectively at least two kinds of first geometric transformations, the image block after each first geometric transformation is defined as respectively: each described second original candidates piece is each the second conversion candidate blocks of correspondence respectively,
To each described the second original candidates piece, respectively current described the second original candidates piece and current described the second original candidates piece corresponding all described second the conversion candidate blocks in, the similarity of wherein corresponding with current described the second original candidates piece described the second reference block is reached to the image block of predetermined extent, be defined as one of second match block that described the second reference block is corresponding;
In described single treatment image, determine each described second reference block coordinate position of each corresponding described the second match block respectively,
The coordinate position of each corresponding described the second match block according to each described second reference block difference in described single treatment image, in described noise image, determine respectively each described Secondary Match piece corresponding to each described reference block, wherein, to each described reference block, the coordinate position of the coordinate position of each described Secondary Match piece that current described reference block is corresponding each described the second match block that described second reference block corresponding with current described reference block is corresponding respectively is corresponding respectively;
To each described Secondary Match piece, judge respectively whether current described Secondary Match piece corresponding described second match block in described single treatment image is described the second conversion candidate blocks, if, current described Secondary Match piece is carried out to the first geometric transformation identical with the described first geometric transformation that obtains current described the second conversion candidate blocks, current described Secondary Match piece is updated to: the image block after current described first geometric transformation, otherwise do not upgrade current described Secondary Match piece;
Determine respectively each described reference block the second three-dimensional array of each corresponding described Secondary Match piece formation respectively, obtain each described reference block corresponding each described second three-dimensional array respectively, the array element of each described the second three-dimensional array is respectively: the gray-scale value of each pixel that forms each described Secondary Match piece of current described the second three-dimensional array;
Respectively each described second three-dimensional array is carried out to predetermined three-dimensional filtering, each second three-dimensional array that obtains respectively each described the second three-dimensional array estimates, the array element that each described second three-dimensional array is estimated is respectively: the estimated value of gray-scale value that forms each pixel of each described the second match block that current described the second three-dimensional array estimates;
Each described Secondary Match piece during each described second three-dimensional array is estimated, if the image block of current described Secondary Match piece for obtaining through described first geometric transformation in current described the second three-dimensional array estimation, described array element corresponding to current described Secondary Match piece carried out to secondary geometric transformation, described array element corresponding to current described Secondary Match piece during current described the second three-dimensional array is estimated is updated to: the array element after described secondary geometric transformation, otherwise do not upgrade the described array element in described three-dimensional array estimation
Wherein said secondary geometric transformation is respectively: with the geometric transformation of current described first geometric transformation contrary;
The estimated value of the gray-scale value of each described pixel in estimating according to all described the second three-dimensional array corresponding to all described reference blocks difference, according to predetermined evaluation method, the gray-scale value of each pixel in described noise image is determined in estimation, obtains the secondary treating image.
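The point the two-pass scheme of claims 2-3 turns on is that block matching (including the choice of geometric transformations) is carried out on the single treatment image, while the pixel data that is finally filtered and aggregated is still taken from the noise image at the matched coordinates. A minimal sketch of that coordinate-reuse step alone, in Python/NumPy (the geometric-transformation bookkeeping of claim 3 is omitted for brevity, and all numeric values are illustrative):

```python
import numpy as np

def second_pass_positions(single_pass_img, ref_y, ref_x, N=8, search=8):
    """Block matching is repeated on the single treatment image around the second reference
    block located at (ref_y, ref_x); the positions found, ordered by ascending distance, are
    returned. The Secondary Match blocks are then cut out of the original noise image at
    these positions rather than out of the single treatment image."""
    H, W = single_pass_img.shape
    ref = single_pass_img[ref_y:ref_y+N, ref_x:ref_x+N].astype(np.float64)
    scored = []
    for y in range(max(0, ref_y - search), min(H - N, ref_y + search) + 1):
        for x in range(max(0, ref_x - search), min(W - N, ref_x + search) + 1):
            cand = single_pass_img[y:y+N, x:x+N].astype(np.float64)
            scored.append((np.sum((cand - ref) ** 2), (y, x)))
    scored.sort(key=lambda s: s[0])
    return [pos for _, pos in scored]
```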
4. according to the method for claim 1 or 2 or 3 described images processing, it is characterized in that,
Decomposing the noise image into at least two reference blocks specifically comprises:
Taking out image blocks successively from the noise image with a predetermined size of N*N and a predetermined step length, and using each of the image blocks respectively as each of the reference blocks,
Wherein the size of each reference block is N*N, and N is a number of pixels.
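As an illustration of this decomposition, a minimal sketch assuming N = 8 pixels and a step of 4 pixels (both values are illustrative, not taken from the claim):

```python
import numpy as np

def reference_blocks(noise, N=8, step=4):
    """Take out N*N image blocks from the noise image successively with a predetermined step;
    each extracted block serves as a reference block (its top-left coordinates are kept so
    that estimates can later be aggregated back at the right position)."""
    H, W = noise.shape
    blocks = []
    for y in range(0, H - N + 1, step):
        for x in range(0, W - N + 1, step):
            blocks.append(((y, x), noise[y:y+N, x:x+N]))
    return blocks
```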
5. according to the method for claim 1 or 2 or 3 described images processing, it is characterized in that,
Determine respectively each original candidates piece corresponding to each described reference block, specifically,
To each described reference block, carry out respectively:
Determine the first neighborhood of a pixel in precalculated position in current described reference block,
Each second neighborhood by each pixel in described the first neighborhood is defined as respectively: each original candidates piece that current described reference block is corresponding, and wherein the image size of each described the second neighborhood is respectively described N*N;
And/or, in described single treatment image, determine respectively each second original candidates piece that each described second reference block is corresponding, specifically,
To each described the second reference block, carry out respectively:
Determine the 3rd neighborhood of a pixel in precalculated position in current described the second reference block,
By each neighbours territory of each pixel in described the 3rd neighborhood, be defined as respectively: each original candidates piece that current described the second reference block is corresponding, wherein the image size in each described neighbours territory is respectively described N*N.
6. the method that image according to claim 5 is processed, is characterized in that,
The image size of described the first neighborhood and/or the 3rd neighborhood is Ns*Ns;
The number that described Ns is pixel, described Ns is greater than described N.
7. the method that image according to claim 5 is processed, is characterized in that,
Determine the first neighborhood of a pixel in precalculated position in current described reference block, specifically:
The central point that the described pixel of take is described the first neighborhood, determine described the first neighborhood;
And/or,
Determine the 3rd neighborhood of a pixel in precalculated position in current described the second reference block, specifically:
The central point that the described pixel of take is described the 3rd neighborhood, determine described the 3rd neighborhood.
8. the method that image according to claim 5 is processed, is characterized in that,
Determine the first neighborhood of a pixel in precalculated position in current described reference block, specifically,
Top left corner apex pixel neighborhood of a point by described reference block, be defined as described the first neighborhood;
And/or,
Determine the 3rd neighborhood of a pixel in precalculated position in current described the second reference block, specifically,
By the top left corner apex pixel neighborhood of a point of described the second reference block, be defined as described the 3rd neighborhood.
9. the method that image according to claim 5 is processed, is characterized in that,
Each second neighborhood by each pixel in described the first neighborhood is defined as respectively: each original candidates piece that current described reference block is corresponding, specifically,
Each described second neighborhood that to put centered by each pixel in described the first neighborhood is defined as respectively: each described original candidates piece that current described reference block is corresponding;
And/or,
Each neighbours territory by each pixel in described the 3rd neighborhood is defined as respectively: each described second original candidates piece that current described the second reference block is corresponding, specifically,
Each the described neighbours territory that to put centered by each pixel in described the 3rd neighborhood is defined as respectively: each described second original candidates piece that current described the second reference block is corresponding.
10. the method that image according to claim 5 is processed, is characterized in that,
Each second neighborhood by each pixel in described the first neighborhood is defined as respectively: each original candidates piece that current described reference block is corresponding, specifically,
To take described the second neighborhood that each pixel is top left corner apex or summit, the lower left corner or summit, the upper right corner or summit, the lower right corner in described the first neighborhood, be defined as respectively: each described original candidates piece that current described reference block is corresponding;
And/or,
Each neighbours territory by each pixel in described the 3rd neighborhood is defined as respectively: each described second original candidates piece that current described the second reference block is corresponding, specifically,
To take the described neighbours territory that each pixel is top left corner apex or summit, the lower left corner or summit, the upper right corner or summit, the lower right corner in described the 3rd neighborhood, be defined as respectively: each described second original candidates piece that current described the second reference block is corresponding.
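A minimal sketch of one combination of the variants above, assuming the first neighborhood of claim 8 (around the top-left vertex pixel) and the top-left-vertex second neighborhoods of claim 10, with illustrative values Ns = 16 and N = 8:

```python
import numpy as np

def original_candidate_blocks(noise, ref_y, ref_x, N=8, Ns=16):
    """The first neighborhood of size Ns*Ns is taken around the top-left vertex pixel of the
    reference block, and for every pixel in it the N*N block having that pixel as its
    top-left vertex is taken as an original candidate block."""
    H, W = noise.shape
    half = Ns // 2
    candidates = []
    for y in range(max(0, ref_y - half), min(H - N, ref_y + half) + 1):
        for x in range(max(0, ref_x - half), min(W - N, ref_x + half) + 1):
            candidates.append(((y, x), noise[y:y+N, x:x+N]))
    return candidates
```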
11. the method that image according to claim 1 is processed, is characterized in that,
Each described first geometric transformation comprises following arbitrary or wherein any two kinds or two or more combination arbitrarily:
Translation, rotation, mirror image conversion, transposition, scaling.
12. the method that image according to claim 11 is processed, is characterized in that,
Each described first geometric transformation specifically is respectively: rotation;
To each described original candidates piece, carry out respectively at least two kinds of first geometric transformations, respectively the image block after each described first geometric transformation is defined as respectively: each described original candidates piece is the corresponding candidate blocks that respectively converts respectively, be specially,
To each described original candidates piece, carry out respectively: current described original candidates piece is rotated respectively to predetermined different angle, each image block obtained after each rotation is defined as respectively to each described conversion candidate blocks;
And/or,
To each described the second original candidates piece, carry out respectively at least two kinds of first geometric transformations, the image block after each first geometric transformation is defined as respectively: each described second original candidates piece is each the second conversion candidate blocks of correspondence respectively, is specially:
To each described the second original candidates piece, carry out respectively: current described the second original candidates piece is rotated respectively to predetermined different angle, each image block obtained after each rotation is defined as respectively to each described the second conversion candidate blocks.
13. the method that image according to claim 11 is processed, is characterized in that,
Each described first geometric transformation specifically comprises: rotation, vertical transposition, horizontal transposition, mirror image;
To each described original candidates piece, carry out respectively at least two kinds of first geometric transformations, the image block after each first geometric transformation is defined as respectively: the conversion candidate blocks that described original candidates piece is corresponding, be specially,
To each described original candidates piece, below carrying out respectively:
Current described original candidates piece is rotated to predetermined angle,
Respectively left, the current described original candidates piece of horizontal transposition to the right,
Make progress respectively, the current described original candidates piece of transposition downward vertically,
Two diagonal line of described original candidates piece of take respectively are the transposition axis, the current described original candidates piece of transposition respectively,
The axis of being scheduled to of take is image line, the current described original candidates piece of mirror image,
Each image block obtained after the image block that described rotation is obtained afterwards and each described horizontal transposition, and each image block obtained after each described vertical transposition, and each image block after described mirror image, be defined as respectively each described conversion candidate blocks;
And/or,
To each described the second original candidates piece, carry out respectively at least two kinds of first geometric transformations, the image block after each first geometric transformation is defined as respectively: the second conversion candidate blocks that described the second original candidates piece is corresponding is specially:
To each described the second original candidates piece, below carrying out respectively:
Current described the second original candidates piece is rotated respectively to predetermined angle,
Respectively left, current described the second original candidates piece of horizontal transposition to the right,
Make progress respectively, current described the second original candidates piece of transposition downward vertically,
Two diagonal line of described original candidates piece of take respectively are the transposition axis, current described the second original candidates piece of transposition respectively,
The axis of being scheduled to of take is image line, current described the second original candidates piece of mirror image,
Each image block obtained after the image block that described rotation is obtained afterwards and each described horizontal transposition, and each image block obtained after each described vertical transposition, and each image block after described mirror image, be defined as respectively each described the second conversion candidate blocks.
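By way of illustration, one possible way of generating the transformations listed in claims 11 to 13 with NumPy is sketched below. The mapping of "horizontal transposition", "vertical transposition", the diagonal transpositions and the "mirror image" onto these particular array operations is an assumption, and each transformation is paired with the inverse needed later for the secondary geometric transformation:

```python
import numpy as np

def conversion_candidate_blocks(cand):
    """One possible set of first geometric transformations in the spirit of claims 11-13:
    a rotation, a horizontal transposition (left-right flip, which here also serves as the
    mirror about a vertical axis), a vertical transposition (up-down flip), and
    transpositions about the two diagonals."""
    return [
        (np.rot90(cand, 1),   lambda b: np.rot90(b, -1)),   # rotation by a predetermined angle
        (np.fliplr(cand),     np.fliplr),                   # horizontal transposition / mirror
        (np.flipud(cand),     np.flipud),                   # vertical transposition
        (cand.T,              lambda b: b.T),               # transposition about the main diagonal
        (np.rot90(cand.T, 2), lambda b: np.rot90(b.T, 2)),  # transposition about the anti-diagonal
    ]
```

The flips and the diagonal transpositions are their own inverses; only the rotation needs a distinct inverse.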
14. the method according to claim 1 or 2 or 3 described images processing, is characterized in that,
To each described original candidates piece, respectively in current described original candidates piece and all described conversion candidate blocks corresponding to current described original candidates piece, the similarity of wherein corresponding with current described original candidates piece described reference block is reached to the image block of predetermined extent, be defined as one of match block that described reference block is corresponding, specifically
To each described original candidates piece, below carrying out respectively:
Calculating respectively the Euclidean distance between the current original candidate block and the reference block corresponding to the current original candidate block, denoted d_0,
Calculating respectively the Euclidean distance between each conversion candidate block corresponding to the current original candidate block and the reference block corresponding to the current original candidate block, the Euclidean distance between the i-th conversion candidate block and the reference block being denoted d_i, where i is any natural number equal to or less than n, and n is the number of conversion candidate blocks corresponding to the current original candidate block;
Determining, according to each d_i and d_0, the similarity between the current original candidate block, together with its conversion candidate blocks, and the reference block corresponding to the current original candidate block,
Wherein the image block whose similarity reaches the predetermined degree is determined as: one of the match blocks corresponding to the reference block;
And/or,
To each described the second original candidates piece, respectively current described the second original candidates piece and current described the second original candidates piece corresponding all described second the conversion candidate blocks in, the similarity of wherein corresponding with current described the second original candidates piece described the second reference block is reached to the image block of predetermined extent, be defined as one of second match block that described the second reference block is corresponding, specifically:
To each described the second original candidates piece, below carrying out respectively:
Calculating respectively the Euclidean distance between the current second original candidate block and the second reference block corresponding to the current second original candidate block, denoted d_0',
Calculating respectively the Euclidean distance between each second conversion candidate block corresponding to the current second original candidate block and the second reference block corresponding to the current second original candidate block, the Euclidean distance between the i-th second conversion candidate block and the second reference block being denoted d_i', where i is any natural number equal to or less than n, and n is the number of second conversion candidate blocks corresponding to the current second original candidate block;
Determining, according to each d_i' and d_0', the similarity between the current second original candidate block, together with its second conversion candidate blocks, and the second reference block corresponding to the current second original candidate block,
Wherein the image block whose similarity reaches the predetermined degree is determined as: one of the second match blocks of the second reference block.
15. the method that image according to claim 14 is processed, is characterized in that,
The similarity of wherein corresponding with current described original candidates piece described reference block is reached to the image block of predetermined extent, be defined as one of match block that described reference block is corresponding, specifically:
Letting d' = min{d_1, d_2, ..., d_n},
If d' - d_0 <= T_O1, the conversion candidate block corresponding to d' is determined as: one of the match blocks corresponding to the reference block,
Otherwise the original candidate block is determined as: one of the match blocks corresponding to the reference block;
And/or,
The similarity of wherein corresponding with current described the second original candidates piece described the second reference block is reached to the image block of predetermined extent, be defined as one of second match block that described the second reference block is corresponding, specifically:
Letting d'' = min{d_1', d_2', ..., d_n'},
If d'' - d_0' <= T_O2, the second conversion candidate block corresponding to d'' is determined as: one of the second match blocks of the second reference block,
Otherwise the second original candidate block is determined as: one of the second match blocks corresponding to the second reference block;
Wherein said T_O1 and T_O2 are predetermined positive numbers.
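A minimal sketch of this decision rule for the first pass (the threshold value passed as T_O1 is illustrative, and the plain Euclidean distance is used as in the claim):

```python
import numpy as np

def choose_match(ref_block, cand_block, conversion_cands, T_O1=2000.0):
    """d_0 is the Euclidean distance between the reference block and the original candidate
    block, d_1..d_n the distances to its conversion candidate blocks, and d' = min(d_i); the
    conversion candidate block achieving d' is taken as the match if d' - d_0 <= T_O1,
    otherwise the original candidate block is taken."""
    ref = ref_block.astype(np.float64)
    d0 = np.linalg.norm(ref - cand_block)
    if not conversion_cands:
        return cand_block, -1
    dists = [np.linalg.norm(ref - c) for c in conversion_cands]
    i_best = int(np.argmin(dists))
    if dists[i_best] - d0 <= T_O1:
        return conversion_cands[i_best], i_best   # a conversion candidate block is the match
    return cand_block, -1                         # the original candidate block is the match
```

Returning the index of the chosen transformation (or -1 for the untransformed block) is what later allows the secondary, inverse geometric transformation to be applied to the filtered array elements.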
16. the method according to claim 1 or 2 or 3 described images processing, is characterized in that,
Determine respectively the three-dimensional array of each described match block formation that each described reference block is corresponding, obtain each described reference block each described three-dimensional array of correspondence respectively, the array element of each described three-dimensional array is respectively: the gray-scale value of each pixel that forms each described match block of current described three-dimensional array, specifically
To each described reference block, carry out respectively:
According to current described reference block, the ascending order of the Euclidean distance of corresponding each described match block and current described reference block, determine each described match block putting in order in described three-dimensional array,
According to each described match block putting in order in described three-dimensional array, each gray-scale value of each pixel of each described match block is defined as: the array element of described three-dimensional array;
And/or,
Determine respectively each described reference block the second three-dimensional array of each corresponding described Secondary Match piece formation respectively, obtain each described reference block each described second three-dimensional array of correspondence respectively, the array element of each described the second three-dimensional array is respectively: the gray-scale value of each pixel that forms each described Secondary Match piece of current described the second three-dimensional array, specifically
To each described the second reference block, carry out respectively:
According to current described the second reference block, the ascending order of the Euclidean distance of corresponding each described the second match block and current described the second reference block, determine each described the second match block putting in order in described the second three-dimensional array,
According to each described the second match block putting in order in described the second three-dimensional array, each gray-scale value of each pixel of each described the second match block is defined as: the array element of described the second three-dimensional array.
17. the method that image according to claim 16 is processed, is characterized in that,
In described step: the ascending order of the Euclidean distance of corresponding each described match block and current described reference block according to current described reference block, determine each described match block putting in order in described three-dimensional array, also comprise:
In corresponding all described match block, choose the described match block of predetermined number as the described match block that forms described three-dimensional array at current described reference block.
And/or,
In described step: the ascending order of the Euclidean distance of corresponding each described the second match block and current described the second reference block according to current described the second reference block, determine each described the second match block putting in order in described the second three-dimensional array, also comprise:
In corresponding all described the second match block, choose described second match block of predetermined number as described the second match block that forms described three-dimensional array at current described the second reference block.
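A minimal sketch of this ordering and truncation step, assuming K = 8 match blocks are retained (the value of K is illustrative):

```python
import numpy as np

def build_3d_array(ref_block, match_blocks, K=8):
    """Order the match blocks by ascending Euclidean distance to the reference block, keep a
    predetermined number K of them, and stack their gray-scale values into a
    three-dimensional array of shape (K, N, N)."""
    ref = ref_block.astype(np.float64)
    ordered = sorted(match_blocks,
                     key=lambda b: np.linalg.norm(ref - b.astype(np.float64)))
    kept = ordered[:K]                               # predetermined number of match blocks
    return np.stack([b.astype(np.float64) for b in kept])
```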
18. the method that image according to claim 1 is processed, is characterized in that,
Respectively each described three-dimensional array is carried out to predetermined three-dimensional filtering, specifically:
Respectively each described three-dimensional array is carried out to three-dimensional hard-threshold filtering;
And/or,
Respectively each described second three-dimensional array is carried out to predetermined three-dimensional filtering, specifically:
Respectively each described second three-dimensional array is carried out to three-dimensional hard-threshold filtering.
19. the method that image according to claim 18 is processed, is characterized in that,
Respectively each described three-dimensional array is carried out to three-dimensional hard-threshold filtering, comprising:
Each described three-dimensional array is carried out respectively to three-dimensional Fourier transform, obtain respectively each three-dimensional plural array;
Three-dimensional Fourier transform coefficient in the plural array of each described three-dimensional is carried out to the hard-threshold computing: the described coefficient that mould is less than to predetermined threshold is set to 0, the described coefficient that mould is more than or equal to predetermined threshold is constant, obtains the plural array of each described three-dimensional after described hard-threshold computing;
To the plural array of each described three-dimensional after described hard-threshold computing, carry out respectively three-dimensional inverse Fourier transform, obtain each described three-dimensional array and estimate;
And/or,
Respectively each described second three-dimensional array being carried out to three-dimensional hard-threshold filtering comprises:
Each described second three-dimensional array is carried out respectively to three-dimensional Fourier transform, obtain respectively each the second three-dimensional plural array;
Three-dimensional Fourier transform coefficient in each described second three-dimensional plural array is carried out to the hard-threshold computing: the described coefficient that mould is less than to predetermined threshold is set to 0, the described coefficient that mould is more than or equal to predetermined threshold is constant, obtains each the described second three-dimensional plural array after described hard-threshold computing;
To each the described second three-dimensional plural array after described hard-threshold computing, carry out respectively three-dimensional inverse Fourier transform, obtain each described second three-dimensional array and estimate.
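A minimal sketch of the three-dimensional hard-threshold filtering described above, using NumPy's FFT routines; the threshold is passed in as a parameter since the claim only calls it a predetermined threshold:

```python
import numpy as np

def hard_threshold_3d(stack, threshold):
    """Apply a three-dimensional Fourier transform to the three-dimensional array, set to 0
    every complex coefficient whose modulus is below the predetermined threshold, keep the
    remaining coefficients unchanged, and apply the three-dimensional inverse Fourier
    transform to obtain the three-dimensional array estimate."""
    spec = np.fft.fftn(stack)                 # three-dimensional Fourier transform coefficients
    spec[np.abs(spec) < threshold] = 0.0      # hard-threshold operation on the modulus
    return np.real(np.fft.ifftn(spec))        # inverse transform; imaginary residue discarded
```

In practice such a threshold is commonly tied to the noise level of the image, but that choice lies outside the claim text.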
20. the method according to claim 1 or 2 or 3 described images processing, is characterized in that,
The estimated value of each described gray-scale value in estimating according to all described three-dimensional array corresponding to all reference blocks difference, according to predetermined evaluation method, the gray-scale value of each pixel in described noise image is determined in estimation; Specifically,
To each described pixel in described noise image, below carrying out respectively:
Determine each described reference block each the described three-dimensional array estimation of correspondence respectively that comprises current described pixel,
Determine respectively current described pixel each described gray-scale value estimated value of correspondence in each described three-dimensional array respectively, each described gray-scale value estimated value is carried out to the gray-scale value of the value of predetermined weighted mean computing acquisition as current described pixel in described noise image.
And/or,
According in described noise image, the estimated value of each described gray-scale value in all described second three-dimensional array estimation of all described reference blocks difference correspondences, according to predetermined evaluation method, the gray-scale value of each pixel in described noise image is determined in estimation, specifically:
To each described pixel in described noise image, below carrying out respectively:
Determine each described reference block each the described second three-dimensional array estimation of correspondence respectively that comprises current described pixel,
Determine respectively current described pixel each described gray-scale value estimated value of correspondence in each described second three-dimensional array respectively, each described gray-scale value estimated value is carried out to the gray-scale value of the value of predetermined weighted mean computing acquisition as current described pixel in described noise image.
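A minimal sketch of this aggregation step; the weighting scheme is left open by the claim, so uniform weights (plain averaging) serve as the default here, and the entry format is an assumption made for illustration:

```python
import numpy as np

def aggregate(image_shape, block_estimates):
    """A pixel of the noise image is generally covered by several match blocks belonging to
    different three-dimensional array estimates; its gray-scale value is taken as a weighted
    mean of all the estimated values covering it. 'block_estimates' is a list of
    (y, x, patch, weight) entries, one per match block, with (y, x) the block's top-left
    corner in the image."""
    acc = np.zeros(image_shape, dtype=np.float64)   # weighted sum of estimated gray-scale values
    wgt = np.zeros(image_shape, dtype=np.float64)   # sum of weights per pixel
    for y, x, patch, weight in block_estimates:
        ph, pw = patch.shape
        acc[y:y+ph, x:x+pw] += weight * patch
        wgt[y:y+ph, x:x+pw] += weight
    wgt[wgt == 0] = 1.0                             # avoid division by zero for uncovered pixels
    return acc / wgt
```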
21. an image processing equipment, is characterized in that, comprising:
The picture breakdown unit, for noise image being decomposed into at least two image blocks, be defined as respectively each reference block by each described image block;
Original candidates piece determining unit, be connected with described picture breakdown unit, for determining respectively each original candidates piece corresponding to each described reference block,
Conversion candidate blocks determining unit, with original candidates piece determining unit, be connected, be used for each described original candidates piece, carry out respectively at least two kinds of first geometric transformations, respectively the image block after each described first geometric transformation is defined as respectively: each described original candidates piece is the corresponding candidate blocks that respectively converts respectively
The match block determining unit, with original candidates piece determining unit and conversion candidate blocks determining unit, be connected respectively, be used for each described original candidates piece, respectively in current described original candidates piece and all described conversion candidate blocks corresponding to current described original candidates piece, the similarity of wherein corresponding with current described original candidates piece described reference block is reached to the image block of predetermined extent, be defined as one of match block that described reference block is corresponding;
The three-dimensional array determining unit, with described match block determining unit, be connected, determine respectively the three-dimensional array of each described match block formation that each described reference block is corresponding, obtain each described reference block corresponding each described three-dimensional array respectively, the array element of each described three-dimensional array is respectively: the gray-scale value of each pixel that forms each described match block of current described three-dimensional array;
The three-dimensional filtering unit, with described three-dimensional array determining unit, be connected, for respectively each described three-dimensional array being carried out to predetermined three-dimensional filtering, each three-dimensional array that obtains respectively each described three-dimensional array estimates, the array element that each described three-dimensional array is estimated is respectively: form the estimated value of gray-scale value of each pixel that current described three-dimensional array is estimated each described match block of corresponding described three-dimensional array;
How much inverse transformation block, with described three-dimensional filtering unit and match block determining unit, be connected respectively, for each described match block that each described three-dimensional array is estimated, if in current described three-dimensional array estimation, current described match block is described conversion candidate blocks, further array element corresponding to current described conversion candidate blocks carried out to secondary geometric transformation, array element corresponding to current described conversion candidate blocks during current described three-dimensional array is estimated is updated to: the array element after each described secondary geometric transformation; Otherwise do not upgrade described three-dimensional array, do not estimate, wherein each described secondary geometric transformation is respectively: with the geometric transformation of each described first geometric transformation contrary;
The image evaluation unit, with described how much inverse transformation block and described three-dimensional filtering unit, be connected respectively, for estimate the estimated value of the gray-scale value of each described pixel according to all described three-dimensional array corresponding to all described reference blocks difference, according to predetermined evaluation method, the gray-scale value of each pixel in described noise image is determined in estimation, obtains the single treatment image.
22. a computer system, is characterized in that, comprising: processor, storer and communication line, wherein said communication line is connected between described processor and described storer;
Wherein, described processor by described communication line, calls the code of storing in described storer, with for:
Noise image is decomposed into at least two image blocks, each described image block is defined as respectively to each reference block;
Determine respectively each original candidates piece corresponding to each described reference block,
To each described original candidates piece, carry out respectively at least two kinds of first geometric transformations, respectively the image block after each described first geometric transformation is defined as respectively: each described original candidates piece is the corresponding candidate blocks that respectively converts respectively,
To each described original candidates piece, respectively in current described original candidates piece and all described conversion candidate blocks corresponding to current described original candidates piece, the similarity of wherein corresponding with current described original candidates piece described reference block is reached to the image block of predetermined extent, be defined as one of match block that described reference block is corresponding;
Determine respectively the three-dimensional array of each described match block formation that each described reference block is corresponding, obtain each described reference block corresponding each described three-dimensional array respectively, the array element of each described three-dimensional array is respectively: the gray-scale value of each pixel that forms each described match block of current described three-dimensional array;
Respectively each described three-dimensional array is carried out to predetermined three-dimensional filtering, each three-dimensional array that obtains respectively each described three-dimensional array estimates, the array element that each described three-dimensional array is estimated is respectively: form the estimated value of gray-scale value of each pixel that current described three-dimensional array is estimated each described match block of corresponding described three-dimensional array;
Each described match block during each described three-dimensional array is estimated, if in current described three-dimensional array estimation, current described match block is described conversion candidate blocks, further array element corresponding to current described conversion candidate blocks carried out to secondary geometric transformation, array element corresponding to current described conversion candidate blocks during current described three-dimensional array is estimated is updated to: the array element after each described secondary geometric transformation; Otherwise do not upgrade described three-dimensional array, do not estimate,
Wherein each described secondary geometric transformation is respectively: with the geometric transformation of each described first geometric transformation contrary;
The estimated value of the gray-scale value of each described pixel in estimating according to all described three-dimensional array corresponding to all described reference blocks difference, according to predetermined evaluation method, the gray-scale value of each pixel in described noise image is determined in estimation, obtains the single treatment image.
CN201210326455.3A 2012-09-06 2012-09-06 The method of image processing and image processing equipment and computer system Expired - Fee Related CN102938141B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210326455.3A CN102938141B (en) 2012-09-06 2012-09-06 The method of image processing and image processing equipment and computer system


Publications (2)

Publication Number Publication Date
CN102938141A true CN102938141A (en) 2013-02-20
CN102938141B CN102938141B (en) 2015-09-09

Family

ID=47697034

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210326455.3A Expired - Fee Related CN102938141B (en) 2012-09-06 2012-09-06 The method of image processing and image processing equipment and computer system

Country Status (1)

Country Link
CN (1) CN102938141B (en)


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100020208A1 (en) * 2008-07-24 2010-01-28 Florida State University Research Foundation Systems and methods for training an active random field for real-time image denoising

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WENJIANG LIU; YUE ZHU; TAO LIU; MENGTIAN RONG: "Analysis and architecture design of aggregation in BM3D", ASIC (ASICON), 2011 IEEE 9th International Conference on, 28 October 2011 (2011-10-28) *
李政; 刘文江; 戎蒙恬; 刘太智: "Implementation and evaluation of the BM3D video denoising algorithm" (BM3D视频去噪算法实现与评估), Information Technology (信息技术), 25 April 2012 (2012-04-25) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111031313A (en) * 2017-03-02 2020-04-17 北方工业大学 Filtering method and device based on secondary reference block and secondary estimation group
CN111031313B (en) * 2017-03-02 2021-09-24 北方工业大学 Filtering method and device based on secondary reference block and secondary estimation group
CN118071661A (en) * 2023-12-12 2024-05-24 东莞力音电子有限公司 Horn coil microscope image enhancement method based on image analysis

Also Published As

Publication number Publication date
CN102938141B (en) 2015-09-09


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150909

Termination date: 20180906

CF01 Termination of patent right due to non-payment of annual fee