CN104484872A - Interference image edge extending method based on directions - Google Patents
Interference image edge extending method based on directions
- Publication number
- CN104484872A CN104484872A CN201410704980.3A CN201410704980A CN104484872A CN 104484872 A CN104484872 A CN 104484872A CN 201410704980 A CN201410704980 A CN 201410704980A CN 104484872 A CN104484872 A CN 104484872A
- Authority
- CN
- China
- Prior art keywords
- pixels
- sign
- image
- epsilon
- sigma
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
The invention relates to an edge extension method based on interference-image fringe directions. The method comprises the following steps: 1) determining the direction at each boundary point; 2) defining interior and exterior pixels according to the edge level-line principle and determining their positions; 3) using bilinear interpolation to establish the equations relating the known pixels and the pixels to be filled; 4) establishing, according to step 3, the equations for all pixels to be filled along the four boundary directions, expressing them in matrix form, and solving for the boundary pixel values to be filled by a matrix-solution method; 5) filling the obtained boundary pixel values into the image so as to carry out the image edge extension.
Description
Technical field
The invention belongs to the technical field of image processing and specifically relates to a novel interference image edge extension method based on image direction.
Background technology
Optical interference techniques provide contactless, high-accuracy measurement and are used in a wide range of research and application fields. The interference image is the recorded result of such techniques, and to improve accuracy and robustness it needs to be denoised at the preprocessing stage. Among the many interference-image denoising techniques, iterative and oriented spatial-domain techniques based on partial differential equations are highly efficient and widely used. However, if a traditional image-border padding such as zero padding or mirror padding is used, these denoising techniques introduce errors at the image boundary. Because a large number of iterations is required, the error propagates from the boundary into the interior of the image. To handle the interference-image boundary, a method based on the iterative Fourier transform has been proposed, which pads the image boundary by extrapolating the interference fringes outward. However, it requires the interference image to be narrow-band, which does not hold in all cases. Since an interference image has a flow-like fringe structure, padding the boundary along the fringe direction helps reduce the error. Therefore, an interference-image edge extension based on the fringe direction is proposed.
Summary of the invention
The present invention overcomes the shortcomings and defects of the prior art by proposing an image-border padding method based on the interference-fringe direction, comprising the following basic steps:
The first step: determine the edge direction θ(x, y) by the gradient direction method, using formula (1), where f_σ is the fringe pattern after Gaussian denoising, and f_xσ and f_yσ are the first-order partial derivatives of f_σ in the x and y directions, respectively. The direction θ(x, y) of a point (x, y) is the average of the directions in its neighbourhood [x−ε : x+ε, y−ε : y+ε], where ε denotes the size of the neighbourhood.
The second step: define the interior and exterior pixels according to the level-line principle of the edge and determine their positions. For a boundary pixel f(x, y), the interior pixel f(x_i, y_i) inside the image and the exterior pixel f(x_o, y_o) outside the image boundary that lie on the same level line can be expressed as

f(x_i, y_i) = f(x_o, y_o) = f(x, y)    (2)

x_i = x − γ, y_i = y − η    (3)

x_o = x + γ, y_o = y + η    (4)
In the above formulas, γ and η denote the horizontal and vertical offsets of the interior and exterior pixels from the boundary pixel, respectively. The exterior pixel satisfies the following property:

ω_x(x, y)γ + ω_y(x, y)η = 0    (6)

where a(x, y) denotes the image background intensity, b(x, y) the amplitude, φ(x, y) the phase, and ω_x(x, y) and ω_y(x, y) the first-order partial derivatives of the phase in the x and y directions, respectively. Since the direction θ(x, y) can also be defined as

θ(x, y) = atan[ω_x(x, y), −ω_y(x, y)]    (7)

γ and η can be expressed in terms of the direction as

γ = ±α cos θ(x, y),  η = ±α sin θ(x, y)    (8)

where α denotes the distance from the interior and exterior pixels to the boundary pixel. Since ω_x(x, y) and ω_y(x, y) are generally unknown, the direction θ(x, y) is obtained by the method of the first step. The signs ± of γ and η differ for each of the four borders according to their directions and can be determined from the following table:
The third step: use bilinear interpolation to establish the equations for the pixels to be filled. The bilinear interpolation of an exterior pixel can be expressed as

f(x+γ, y+η) ≈ (1−|γ|)(1−|η|) f(x, y) + |γ|(1−|η|) f[x+sign(γ), y] + (1−|γ|)|η| f[x, y+sign(η)] + |γ||η| f[x+sign(γ), y+sign(η)]    (9)

where sign(·) takes the value 1 or −1 when its argument is positive or negative, respectively. Depending on the boundary direction, either f[x+sign(γ), y] and f[x+sign(γ), y+sign(η)], or f[x, y+sign(η)] and f[x+sign(γ), y+sign(η)], are the pixels to be filled. The value f(x+γ, y+η) can be calculated according to formula (2).
The fourth step: establish, according to the third step, the systems of equations for all boundary pixels to be filled along each of the four boundary directions, express each system in matrix form, and solve for the boundary pixel values to be filled by a matrix-solution method;
The fifth step: fill the obtained boundary pixel values into the image to carry out the image edge extension.
The advantage of the present invention is that the existing information of the image is used to predict how the image varies, achieving more accurate edge padding. This existing information is concisely and elegantly represented in the form of the direction.
Accompanying drawing explanation
Fig. 1 shows the interior and exterior pixels defined in the present invention.
Fig. 2 shows the level line and the image direction involved in the present invention.
Fig. 3 is a schematic diagram of the bilinear interpolation used in the present invention.
Fig. 4 is a schematic comparison of the result of the method of the invention with the results of other methods.
Embodiment
In the following, the invention is further explained in detail with a specific implementation and with reference to the accompanying drawings. It should be understood, however, that the invention is not limited to this application and can be applied to many other types of image processing and to other purposes.
The edge extension method based on the interference-fringe direction of the present invention comprises the following basic steps:
Step 1: determine the edge direction by the gradient direction method. The interference image can be expressed as

f(x, y) = a(x, y) + b(x, y) cos[φ(x, y)] + n(x, y)    (10)

where f(x, y) denotes the pixel value at point (x, y), a(x, y) the image background intensity, b(x, y) the amplitude, φ(x, y) the phase, and n(x, y) additive noise. Only f(x, y) is known. The gradient method, formula (11), can estimate the direction accurately from the noisy fringe pattern, where f_σ is the fringe pattern after Gaussian denoising, and f_xσ and f_yσ are the first-order partial derivatives of f_σ in the x and y directions, respectively. The direction θ(x, y) of a point (x, y) is the average of the directions in its neighbourhood [x−ε : x+ε, y−ε : y+ε], where ε denotes the size of the neighbourhood.
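As an illustration of step 1, the following Python sketch estimates the fringe direction from the Gaussian-smoothed gradient and averages it over the ε-neighbourhood. It is a minimal sketch only: formula (11) itself is not reproduced above, so the atan2 argument convention, the vector-style averaging of the direction, and the parameter defaults are assumptions rather than the patent's exact formulation.

```python
# Illustrative sketch only (not the patent's formula (11)): fringe-direction
# estimation from the Gaussian-smoothed gradient, with neighbourhood averaging.
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def fringe_direction(f, sigma=2.0, eps=3):
    """Estimate theta(x, y) for a fringe pattern f.

    sigma : Gaussian smoothing width used to obtain f_sigma.
    eps   : half-size of the averaging neighbourhood [x-eps:x+eps, y-eps:y+eps].
    """
    f_sigma = gaussian_filter(np.asarray(f, dtype=float), sigma)  # Gaussian-denoised fringe pattern
    fy, fx = np.gradient(f_sigma)            # first-order partial derivatives (row = y, col = x)
    # Assumed convention, cf. theta = atan[w_x, -w_y]; the gradient-based estimate
    # carries a pi ambiguity that the phase-based definition does not (ignored here).
    theta = np.arctan2(fx, -fy)
    # Average the direction over the eps-neighbourhood.  Averaging the unit
    # vectors (cos, sin) avoids wrap-around problems at +/- pi.
    size = 2 * eps + 1
    c = uniform_filter(np.cos(theta), size)
    s = uniform_filter(np.sin(theta), size)
    return np.arctan2(s, c)
```

The returned θ(x, y), sampled at the boundary rows and columns, is what step 2 consumes.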
Step 2: define the interior and exterior pixels and calculate their positions. As shown in Fig. 1, the straight line in the figure is the level line passing through the boundary point. The square on the left represents the interior pixel inside the image, the middle square represents the boundary pixel, and the square on the right represents the exterior pixel outside the image boundary. The interior and exterior pixels are both sub-pixels. Because these three pixels lie on the same level line, they have the same pixel value. For a boundary pixel f(x, y), the interior pixel f(x_i, y_i) and the exterior pixel f(x_o, y_o) on the same level line can be expressed as

f(x_i, y_i) = f(x_o, y_o) = f(x, y)    (12)

x_i = x − γ, y_i = y − η    (13)

x_o = x + γ, y_o = y + η    (14)

where γ and η denote the horizontal and vertical offsets of the interior and exterior pixels from the boundary pixel, respectively.
As shown in Fig. 2, which depicts the level line at the boundary, a(x, y) and b(x, y) can be assumed constant and φ(x, y) linear in the neighbourhood of a boundary point. Because pixel values on the same level line are equal, the phase at the exterior pixel equals the phase at the boundary pixel, i.e. φ(x, y) + ω_x(x, y)γ + ω_y(x, y)η = φ(x, y), so the exterior pixel satisfies

ω_x(x, y)γ + ω_y(x, y)η = 0    (16)

where ω_x(x, y) and ω_y(x, y) denote the first-order partial derivatives of the phase in the x and y directions, respectively. Since the direction θ(x, y) can also be defined as

θ(x, y) = atan[ω_x(x, y), −ω_y(x, y)]    (17)

γ and η can be expressed in terms of the direction as

γ = ±α cos θ(x, y),  η = ±α sin θ(x, y)    (18)

where α denotes the distance from the interior and exterior pixels to the boundary pixel. Experimental data show that the filling result is better when α is 1, so in this embodiment α is set to 1. Since ω_x(x, y) and ω_y(x, y) are generally unknown, the direction θ(x, y) is obtained by the method of step 1 and formula (11). The signs ± of γ and η differ for each of the four borders according to their directions and can be determined from the following table:
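A minimal sketch of how the offsets γ and η of formula (18) might be evaluated for each border, with α = 1 as in this embodiment. The sign convention below (the ± is chosen so that (γ, η) points outward across the border being padded) and the border labels are illustrative assumptions, not the patent's sign table.

```python
import numpy as np

def gamma_eta(theta, border, alpha=1.0):
    """Offsets (gamma, eta) of the exterior pixel for one border.

    theta  : direction theta(x, y) at the boundary pixels of that border
    border : one of 'left', 'right', 'top', 'bottom'
             ('top' = row 0, 'bottom' = last row, y axis pointing down)
    Assumed convention: the +/- sign in gamma = +/- alpha*cos(theta),
    eta = +/- alpha*sin(theta) is picked so that the exterior pixel
    lies outside the image on the given border.
    """
    g = alpha * np.cos(theta)
    e = alpha * np.sin(theta)
    if border == 'right':
        s = np.where(g >= 0, 1.0, -1.0)   # need x + gamma > x
    elif border == 'left':
        s = np.where(g <= 0, 1.0, -1.0)   # need x + gamma < x
    elif border == 'bottom':
        s = np.where(e >= 0, 1.0, -1.0)   # need y + eta > y
    elif border == 'top':
        s = np.where(e <= 0, 1.0, -1.0)   # need y + eta < y
    else:
        raise ValueError(border)
    return s * g, s * e
```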
Step 3: use bilinear interpolation to determine the pixel values to be filled. As shown in Fig. 3, the column of squares on the left represents the image boundary pixels, the middle column represents the exterior pixels, and the column on the right represents the pixels to be filled. According to bilinear interpolation, an exterior pixel can be expressed as

f(x+γ, y+η) ≈ (1−|γ|)(1−|η|) f(x, y) + |γ|(1−|η|) f[x+sign(γ), y] + (1−|γ|)|η| f[x, y+sign(η)] + |γ||η| f[x+sign(γ), y+sign(η)]    (19)

where sign(·) takes the value 1 or −1 when its argument is positive or negative, respectively. The left-hand side of formula (19) is the exterior pixel, which equals the boundary pixel and the interior pixel on the same level line and is therefore known. The right-hand side comprises two known boundary pixels and two unknown pixels to be filled. Depending on the boundary direction, either f[x+sign(γ), y] and f[x+sign(γ), y+sign(η)], or f[x, y+sign(η)] and f[x+sign(γ), y+sign(η)], are the pixels to be filled. As shown in Fig. 3, an unknown pixel reappears in the equations of adjacent exterior pixels. Writing one equation for each exterior pixel yields n equations containing n+2 unknowns, where n is the length or the width of the image. By setting the corner pixels to be filled equal to the corner pixels of the original image, the number of equations equals the number of unknowns and the system can be solved.
Step 4: establish, according to step 3, the system of equations along each of the four boundary directions, express each system in matrix form, and solve for the boundary pixel values to be filled by a matrix-solution method.
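The following sketch illustrates steps 3 and 4 for a single border (the right-hand border): one equation of the form (19) is written per exterior pixel, the two corner unknowns are fixed to the original corner pixels, and the resulting linear system is solved. It assumes α = 1, image indexing img[row, col] with the y axis pointing down, and a clamped neighbour index at the two end rows; directions almost parallel to the border are not specially handled. The function name and these details are assumptions for illustration, not the patent's implementation.

```python
import numpy as np

def solve_right_border(img, theta_right, alpha=1.0):
    """Solve for the pixels to be filled one column to the right of the image.

    img         : 2-D array, img[row, col]
    theta_right : direction theta(x, y) at the right-border pixels, one value per row
    Returns an array of length n_rows + 2: the two corner pixels plus the
    filled pixels of the new right-hand column.
    """
    n = img.shape[0]
    x = img.shape[1] - 1                     # right-border column index
    # Offsets of the exterior pixel; sign chosen so gamma points to the right.
    g = alpha * np.cos(theta_right)
    e = alpha * np.sin(theta_right)
    s = np.where(g >= 0, 1.0, -1.0)
    g, e = s * g, s * e

    A = np.zeros((n + 2, n + 2))
    b = np.zeros(n + 2)
    # Corner constraints: filled corner pixels equal the original corner pixels.
    A[0, 0], b[0] = 1.0, img[0, x]
    A[n + 1, n + 1], b[n + 1] = 1.0, img[n - 1, x]

    for j in range(n):                       # one equation (19) per exterior pixel
        gj, ej = g[j], e[j]
        se = 1 if ej >= 0 else -1            # sign(eta)
        f_b = img[j, x]                      # boundary pixel f(x, y)
        # Neighbouring boundary pixel f[x, y+sign(eta)], clamped at the end rows.
        f_nb = img[min(max(j + se, 0), n - 1), x]
        # Exterior value f(x+gamma, y+eta) equals f(x, y) by formula (12);
        # move all known terms of (19) to the right-hand side.
        b[j + 1] = f_b - (1 - abs(gj)) * (1 - abs(ej)) * f_b - (1 - abs(gj)) * abs(ej) * f_nb
        A[j + 1, j + 1] += abs(gj) * (1 - abs(ej))        # unknown f[x+1, y]
        A[j + 1, j + 1 + se] += abs(gj) * abs(ej)         # unknown f[x+1, y+sign(eta)]
        # Note: rows with |gamma| close to 0 (fringe direction almost parallel to
        # the border) make this row nearly degenerate; not handled in this sketch.
    return np.linalg.solve(A, b)
```

The analogous systems for the left, top and bottom borders are built in the same way by swapping the roles of x and y and using the corresponding sign choice.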
Step 5: fill the obtained boundary pixel values into the image to carry out the image edge extension.
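To illustrate step 5, a small sketch that places four solved border arrays around the original image to form the one-pixel-extended image. The argument layout is an assumption, and the corner handling simply reuses the original corner pixels, following the convention used when building the equations.

```python
import numpy as np

def extend_image(img, top, bottom, left, right):
    """Place solved border values around img (one-pixel extension per side).

    top/bottom : length n_cols arrays of filled values above / below the image
    left/right : length n_rows arrays of filled values left / right of the image
    """
    n_rows, n_cols = img.shape
    out = np.zeros((n_rows + 2, n_cols + 2), dtype=float)
    out[1:-1, 1:-1] = img
    out[0, 1:-1], out[-1, 1:-1] = top, bottom
    out[1:-1, 0], out[1:-1, -1] = left, right
    # New corner pixels are set to the original corner pixels.
    out[0, 0], out[0, -1] = img[0, 0], img[0, -1]
    out[-1, 0], out[-1, -1] = img[-1, 0], img[-1, -1]
    return out
```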
As shown in Fig. 4, the first picture is the result of iterative denoising after the image has been padded with the method of the invention, while the latter two use zero padding and mirror padding, respectively. It can clearly be observed that the image padded with the method of the invention gives a better result after denoising.
The invention provides an approach to image edge extension; there are many specific ways and means of implementing this technical solution, and the above is only a preferred embodiment of the invention. It should be pointed out that those skilled in the art can make several improvements and modifications without departing from the principle of the invention, and such improvements and modifications should also be regarded as falling within the protection scope of the invention.
Claims (1)
1. An edge extension method based on the interference-fringe direction, which extends the border along the direction at the image boundary, comprising the following steps:
The first step: determine the edge direction θ(x, y) by the gradient direction method, using formula (1), where f_σ is the fringe pattern after Gaussian denoising, and f_xσ and f_yσ are the first-order partial derivatives of f_σ in the x and y directions, respectively. The direction θ(x, y) of a point (x, y) is the average of the directions in its neighbourhood [x−ε : x+ε, y−ε : y+ε], where ε denotes the size of the neighbourhood.
The second step: define the interior and exterior pixels according to the level-line principle of the edge and determine their positions. For a boundary pixel f(x, y), the interior pixel f(x_i, y_i) inside the image and the exterior pixel f(x_o, y_o) outside the image boundary that lie on the same level line can be expressed as

f(x_i, y_i) = f(x_o, y_o) = f(x, y)    (2)

x_i = x − γ, y_i = y − η    (3)

x_o = x + γ, y_o = y + η    (4)
In the above formulas, γ and η denote the horizontal and vertical offsets of the interior and exterior pixels from the boundary pixel, respectively. The exterior pixel satisfies the following property:

ω_x(x, y)γ + ω_y(x, y)η = 0    (6)

where a(x, y) denotes the image background intensity, b(x, y) the amplitude, φ(x, y) the phase, and ω_x(x, y) and ω_y(x, y) the first-order partial derivatives of the phase in the x and y directions, respectively.
Since the direction θ(x, y) can also be defined as

θ(x, y) = atan[ω_x(x, y), −ω_y(x, y)]    (7)

γ and η can be expressed in terms of the direction as

γ = ±α cos θ(x, y),  η = ±α sin θ(x, y)    (8)

where α denotes the distance from the interior and exterior pixels to the boundary pixel. Since ω_x(x, y) and ω_y(x, y) are generally unknown, the direction θ(x, y) is obtained by the method of the first step. The signs ± of γ and η differ for each of the four borders according to their directions and can be determined from the following table:
The third step: use bilinear interpolation to establish the equations for the pixels to be filled. The bilinear interpolation of an exterior pixel can be expressed as

f(x+γ, y+η) ≈ (1−|γ|)(1−|η|) f(x, y) + |γ|(1−|η|) f[x+sign(γ), y] + (1−|γ|)|η| f[x, y+sign(η)] + |γ||η| f[x+sign(γ), y+sign(η)]    (9)

where sign(·) takes the value 1 or −1 when its argument is positive or negative, respectively. Depending on the boundary direction, either f[x+sign(γ), y] and f[x+sign(γ), y+sign(η)], or f[x, y+sign(η)] and f[x+sign(γ), y+sign(η)], are the pixels to be filled. The value f(x+γ, y+η) can be calculated according to formula (2).
The fourth step: establish, according to the third step, the systems of equations for all boundary pixels to be filled along each of the four boundary directions, express each system in matrix form, and solve for the boundary pixel values to be filled by a matrix-solution method;
The fifth step: fill the obtained boundary pixel values into the image to carry out the image edge extension.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410704980.3A CN104484872A (en) | 2014-11-27 | 2014-11-27 | Interference image edge extending method based on directions |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410704980.3A CN104484872A (en) | 2014-11-27 | 2014-11-27 | Interference image edge extending method based on directions |
Publications (1)
Publication Number | Publication Date |
---|---|
CN104484872A (en) | 2015-04-01
Family
ID=52759412
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410704980.3A Pending CN104484872A (en) | 2014-11-27 | 2014-11-27 | Interference image edge extending method based on directions |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104484872A (en) |
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101706961A (en) * | 2009-11-10 | 2010-05-12 | 北京航空航天大学 | Image registration method and image registration device |
CN103208101A (en) * | 2013-03-28 | 2013-07-17 | 中国科学院对地观测与数字地球科学中心 | Local signal to noise ratio-based interferogram filtering method |
Non-Patent Citations (2)
Title |
---|
HAIXIA WANG et al.: "Oriented boundary padding for iterative and oriented fringe pattern denoising techniques", Signal Processing * |
HAIXIA WANG et al.: "Quality-guided orientation unwrapping for fringe direction estimation", Applied Optics * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105513073A (en) * | 2015-12-09 | 2016-04-20 | 浙江工业大学 | Direction-consistency-based discontinuous fringe image segmentation and boundary filling method |
CN111370002A (en) * | 2020-02-14 | 2020-07-03 | 平安科技(深圳)有限公司 | Method and device for acquiring voice training sample, computer equipment and storage medium |
CN112529013A (en) * | 2020-12-14 | 2021-03-19 | 北京集创北方科技股份有限公司 | Image recognition method, device, equipment and computer readable medium |
WO2024193305A1 (en) * | 2023-03-17 | 2024-09-26 | 山东云海国创云计算装备产业创新中心有限公司 | Image processing method, system, device, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | C06 | Publication | |
 | PB01 | Publication | |
 | C10 | Entry into substantive examination | |
 | SE01 | Entry into force of request for substantive examination | |
 | WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20150401 |