CN102542543A - Block similarity-based interactive image segmenting method - Google Patents


Publication number
CN102542543A
Authority
CN
China
Prior art keywords
pixel point
image
foreground
Prior art date
Legal status
Pending
Application number
CN2012100043120A
Other languages
Chinese (zh)
Inventor
钟桦
焦李成
王旖蒙
王桂婷
缑水平
王爽
田小林
Current Assignee
Xidian University
Original Assignee
Xidian University
Priority date
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN2012100043120A
Publication of CN102542543A
Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a block similarity-based interactive image segmentation method, which mainly solves the problems that existing methods cannot analyze feature similarity correctly and generalize poorly. The method comprises the following steps: (1) extracting the brightness information of the background and foreground marker blocks of an image, and performing similarity analysis on the image using its block similarity; (2) calculating, from the weights obtained by the similarity analysis, the geodesic distance from each pixel point in the image to the foreground and background marker points; (3) obtaining, from the geodesic distances, the probability that each pixel point belongs to the foreground or the background; and (4) dividing the image into foreground and background according to these probabilities. The method makes effective use of the feature information of the image's marker blocks and implements the similarity analysis with block information. Compared with existing methods, it analyzes the feature similarity of the image more accurately, produces more consistent segmentation results with less background interference, and can be used to segment natural images.

Description

Interactive image segmentation method based on block similarity
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to an image segmentation method which can be used for natural image segmentation in fields such as national defense and military situation monitoring, environmental change evaluation, astronomical imaging and medical imaging.
Background
Image segmentation is an image processing technique that divides an image into several meaningful regions or parts as needed, and is a central topic in image processing research. An image often contains different parts, such as objects, environment and background, some of which are of interest to the viewer. Dividing the image into several parts and describing it by the characteristics of the sub-images and the relationships among them is therefore very important for further analysis of the image.
Image segmentation can be divided into two types, interactive and non-interactive, according to whether a user participates in the segmentation process. Interactive image segmentation generally aims at separating image foreground from background, the interactive operation supplying the prior information required for this separation. Non-interactive segmentation does not involve the user: given an image, the algorithm completes the whole segmentation automatically, and the segmentation effect is poor. In 2007, Alexis Protiere and Guillermo Sapiro proposed Gabor feature-based interactive image segmentation: the image is Gabor filtered, window energy features of the filtered sub-bands are extracted, and Gaussian modeling of the sub-band energy features yields a weight probability matrix of each pixel point belonging to the foreground and the background. This matrix is then regarded as a directed weighted graph, with the image coordinate values as vertices and the corresponding entries of the weight probability matrix as edge weights, converting the segmentation problem into a shortest-path problem from every pixel point to the marked foreground and background pixel points; the image is divided into two classes according to the shortest paths obtained.
Interactive segmentation based on Gabor features considers the similarity of the Gabor feature information of each pixel point, and segments synthesized texture images accurately. Although it has a certain effect on natural images, for natural images whose texture information is not strong the Gabor features cannot reflect the similarity between pixel points well, so the segmentation result is inaccurate and the interference of the background is large.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides an interactive image segmentation method based on block similarity so as to accurately reflect the similar information among image pixel points, improve the consistency of segmentation results and reduce background interference.
In order to achieve the above object, the present invention comprises the steps of:
(1) assuming that an input image to be segmented obeys the Markov distribution, constructing a weight probability formula $p(x_i|x_j)$ between pixel points in the image to be segmented and the background and foreground:
$$p(x_i \mid x_j) = \frac{1}{Z_{x_i-x_j}} \exp\left(-\frac{\|x_i - x_j\|_2^2}{4\sigma_{x_i}^2}\right),$$
where $x_i$ is the pixel point to be processed, $x_j$ denotes the remaining pixel points in the similarity window centered at $x_i$, $Z_{x_i-x_j}$ is the normalization factor, $\|x_i - x_j\|_2^2$ is the Euclidean distance between pixel point $x_i$ and pixel point $x_j$, and $\sigma_{x_i}^2$ is the variance of the block centered at pixel point $x_i$;
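The weight of step (1) can be sketched in code. The sketch below is illustrative, not the patented implementation: it reads $\|x_i - x_j\|_2^2$ as the squared Euclidean distance between the $7 \times 7$ blocks around the two pixels (the "block similarity" of the title), drops the normalization factor $Z_{x_i-x_j}$, and the function names and the edge-padding choice are assumptions.

```python
import numpy as np

def patch(img, i, j, r=3):
    """Extract the (2r+1)x(2r+1) block centered at (i, j); 7x7 for r=3.
    The image is edge-padded so border pixels still get full blocks."""
    padded = np.pad(img, r, mode="edge")
    return padded[i:i + 2 * r + 1, j:j + 2 * r + 1]

def block_weight(img, p, q, r=3):
    """Unnormalized weight p(x_i | x_j) between pixels p and q:
    exp(-||B_p - B_q||^2 / (4 * var(B_p))), following the
    exp(-||x_i - x_j||_2^2 / (4 sigma_{x_i}^2)) form of step (1)."""
    bp, bq = patch(img, *p, r), patch(img, *q, r)
    d2 = float(np.sum((bp - bq) ** 2))   # squared Euclidean block distance
    var = float(bp.var()) + 1e-12        # block variance, guarded against 0
    return np.exp(-d2 / (4.0 * var))
```

Identical blocks give weight 1; dissimilar blocks give a weight decaying toward 0, scaled by the local block variance.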
(2) inputting foreground and background marker images of the image to be segmented, introducing the image block size into the weight probability formula obtained above, and carrying out block-information similarity analysis on the brightness information of the image to be segmented, to obtain the weight probability $pf(x_i)$ that the pixel point $x_i$ to be processed belongs to the foreground marker points and the weight probability $pb(x_i)$ that it belongs to the background marker points:
$$pf(x_i) = \frac{1}{Z_{x_i-x_j}} \exp\left(-\frac{df(x_i)}{4\sigma^2 N}\right), \qquad pb(x_i) = \frac{1}{Z_{x_i-x_j}} \exp\left(-\frac{db(x_i)}{4\sigma^2 N}\right),$$
where $x_i$ is the pixel point to be processed, $i \in \{1, 2, 3, \ldots, M\}$, $M$ is the size of the image to be segmented; $x_j$ is a foreground or background marker pixel point, $j \in \{1, 2, 3, \ldots, C\}$, $C$ is the number of marked foreground or background pixel points; $\sigma^2$ is the variance of the image block centered at $x_i$, the block size being $7 \times 7$; $df(x_i)$ is the shortest Euclidean distance from pixel point $x_i$ to the foreground marker blocks, $db(x_i)$ is the shortest Euclidean distance from $x_i$ to the background marker blocks, and $N$ is the block size;
(3) obtaining, by the formulas
$$PF(x_i) = \frac{pf(x_i)}{pf(x_i) + pb(x_i)}, \qquad PB(x_i) = \frac{pb(x_i)}{pf(x_i) + pb(x_i)},$$
the similarity probability $PF(x_i)$ that the pixel point $x_i$ to be processed belongs to the foreground and the similarity probability $PB(x_i)$ that it belongs to the background;
(4) taking $1 - PF(x_i)$ as the weight for the geodesic distance $d_f(x_i)$ to the foreground, and obtaining the geodesic distance $d_f(x_i)$ from the pixel point $x_i$ to be processed to the foreground by Dijkstra's shortest-path algorithm; likewise taking $1 - PB(x_i)$ as the weight for the geodesic distance $d_b(x_i)$ to the background, and obtaining the geodesic distance $d_b(x_i)$ from the pixel point $x_i$ to the background by Dijkstra's shortest-path algorithm;
(5) according to the geodesic distances $d_f(x_i)$, $d_b(x_i)$, obtaining the probability $PF_{x_i}$ that the pixel point to be processed belongs to the foreground and the probability $PB_{x_i}$ that it belongs to the background:
$$PB_{x_i} = \frac{d_b(x_i)}{d_b(x_i) + d_f(x_i)}, \qquad PF_{x_i} = \frac{d_f(x_i)}{d_b(x_i) + d_f(x_i)};$$
(6) comparing the probability $PF_{x_i}$ that the pixel point to be processed belongs to the foreground with the probability $PB_{x_i}$ that it belongs to the background: if $PF_{x_i} < PB_{x_i}$, the pixel point $x_i$ to be processed is judged to be a foreground pixel point; if $PB_{x_i} < PF_{x_i}$, the pixel point $x_i$ to be processed is judged to be a background pixel point;
(7) repeating steps (2) to (6) until all pixel points in the input image to be segmented have been processed, obtaining the final segmentation result of the image to be segmented.
Compared with the prior art, the method has the following advantages:
By using the Euclidean distance between blocks of image pixel points to reflect the feature information between them, the invention obtains a more accurate similarity probability matrix, better reflects the similarity information within the image, and grasps the overall information of the image comprehensively, so that the segmentation result suffers less background interference and mis-segmentation is less likely.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a natural image used in the testing of the present invention;
FIG. 3 is a diagram of foreground and background signatures of a natural image used in the test of the present invention;
FIG. 4 is a graph comparing the segmentation results of FIG. 2 using the present invention and a prior art method;
FIG. 5 is a graph of the difference between the segmentation result of FIG. 2 and an ideal template using the present invention and a prior art method.
Detailed Description
The following examples illustrate the invention in detail: the present embodiment is implemented on the premise of the technical solution of the present invention, and a detailed implementation manner and a process are given, but the scope of the present invention is not limited to the following embodiments.
Referring to fig. 1, the invention comprises the following steps:
Step 1: under the assumption that the input image to be segmented obeys the Markov distribution, constructing the weight probability formula $p(x_i|x_j)$.
(1a) In image denoising, according to the Bayesian estimation theory framework, the estimated value $\hat{x}$ of a pixel point is:
$$\hat{x} = \frac{\sum_{j=1}^{N_i} p(x_i \mid x_j)\, x_j}{\sum_{j=1}^{N_i} p(x_i \mid x_j)},$$
where $x_i$ is the pixel point to be estimated, $x_j$ denotes the remaining pixel points in the similarity window centered at $x_i$, $p(x_i|x_j)$ is the weight probability information of pixel points $x_i$ and $x_j$, and $N_i$ is the size of the search window centered at pixel point $x_i$;
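The estimator of (1a) is the familiar non-local-means form: a weight-normalized average over the search window. A minimal sketch, assuming the block-based weight of step (1) with the normalization factor $Z$ dropped (it cancels in the ratio); the function name, search radius parameter and edge handling are illustrative assumptions.

```python
import numpy as np

def bayes_estimate(img, i, j, search=5, r=3):
    """Estimate x_hat at (i, j) as the weighted mean of step (1a):
    x_hat = sum_j p(x_i|x_j) x_j / sum_j p(x_i|x_j), with the weights
    taken as exp(-||block_i - block_j||^2 / (4 sigma_i^2))."""
    padded = np.pad(img, r, mode="edge")
    def block(p, q):
        return padded[p:p + 2 * r + 1, q:q + 2 * r + 1]
    bi = block(i, j)
    sigma2 = float(bi.var()) + 1e-12     # variance of the block at x_i
    num = den = 0.0
    H, W = img.shape
    for p in range(max(0, i - search), min(H, i + search + 1)):
        for q in range(max(0, j - search), min(W, j + search + 1)):
            w = np.exp(-float(np.sum((bi - block(p, q)) ** 2)) / (4.0 * sigma2))
            num += w * img[p, q]         # weighted contribution of x_j
            den += w
    return num / den
```

The estimate is a convex combination of the window pixels, so it always lies between the window minimum and maximum.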
(1b) Supposing the pixel points are independent of each other and obey the Gaussian Markov distribution, let $p(x_i|x_j) = p(x_i - x_j)$; then, according to the Markov distribution model, there is:
$$p(x_i \mid x_j) = \frac{1}{Z_{x_i-x_j}} \exp\left(-\frac{\|x_i - x_j + \mu_{x_j} - \mu_{x_i}\|_2^2}{2(\sigma_{x_i}^2 + \sigma_{x_j}^2)}\right),$$
where $\mu_{x_i}$ is the mean of the block centered at $x_i$, $\sigma_{x_i}^2$ is the variance of the block centered at $x_i$, $\mu_{x_j}$ is the mean of the block centered at $x_j$, $\sigma_{x_j}^2$ is the variance of the block centered at $x_j$, and $Z_{x_i-x_j}$ is the normalization factor;
(1c) All pixel points obey the same Gaussian Markov distribution, so $\mu_{x_i} = \mu_{x_j}$ and $\sigma_{x_i}^2 = \sigma_{x_j}^2$; then $p(x_i|x_j)$ in (1b) becomes:
$$p(x_i \mid x_j) = \frac{1}{Z_{x_i-x_j}} \exp\left(-\frac{\|x_i - x_j\|_2^2}{4\sigma_{x_i}^2}\right),$$
where $\|x_i - x_j\|_2^2$ is the Euclidean distance between pixel point $x_i$ and pixel point $x_j$.
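A quick numeric check of the reduction from (1b) to (1c): a permuted copy of a block has exactly the same mean and variance, so with $\mu_{x_i} = \mu_{x_j}$ and $\sigma_{x_i}^2 = \sigma_{x_j}^2$ the exponent $\|x_i - x_j + \mu_{x_j} - \mu_{x_i}\|_2^2 / (2(\sigma_{x_i}^2 + \sigma_{x_j}^2))$ collapses to $\|x_i - x_j\|_2^2 / (4\sigma_{x_i}^2)$. The snippet below is a sketch on synthetic data.

```python
import numpy as np

rng = np.random.default_rng(0)
bi = rng.normal(5.0, 2.0, size=(7, 7))           # block around x_i
bj = rng.permutation(bi.ravel()).reshape(7, 7)   # same values reshuffled:
                                                 # identical mean and variance
mu_i, mu_j = bi.mean(), bj.mean()
var_i, var_j = bi.var(), bj.var()

# Exponent of the general Gaussian-Markov weight of step (1b):
e_general = float(np.sum((bi - bj + mu_j - mu_i) ** 2)) / (2.0 * (var_i + var_j))
# Exponent of the reduced weight of step (1c):
e_reduced = float(np.sum((bi - bj) ** 2)) / (4.0 * var_i)
```

The two exponents agree to floating-point accuracy, which is exactly the $2(\sigma^2 + \sigma^2) = 4\sigma^2$ denominator of (1c).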
Step 2: according to the weight probability $p(x_i|x_j)$, carrying out block-information similarity analysis on the input marked image to obtain the similarity probabilities between the pixel point to be processed and the foreground and background respectively.
(2a) Inputting a marked image, wherein the marked points are divided into a foreground type and a background type;
(2b) introducing the image block size into the weight probability formula and carrying out block-information similarity analysis on the brightness information of the image to be segmented, to obtain the similarity weight $pf(x_i)$ between the pixel point $x_i$ to be processed and the foreground marker points and the similarity weight $pb(x_i)$ between it and the background marker points:
$$pf(x_i) = \frac{1}{Z_{x_i-x_j}} \exp\left(-\frac{df(x_i)}{4\sigma^2 N}\right), \qquad pb(x_i) = \frac{1}{Z_{x_i-x_j}} \exp\left(-\frac{db(x_i)}{4\sigma^2 N}\right),$$
where $x_i$ is the pixel point to be processed, $i \in \{1, 2, 3, \ldots, M\}$, $M$ is the size of the image to be segmented; $x_j$ is a foreground or background marker pixel point, $j \in \{1, 2, 3, \ldots, C\}$, $C$ is the number of marked foreground or background pixel points; $\sigma^2$ is the variance of the image block centered at $x_i$, the block size being $7 \times 7$; $Z_{x_i-x_j}$ is the normalization factor; $df(x_i)$ is the shortest Euclidean distance from pixel point $x_i$ to the foreground marker blocks, $db(x_i)$ is the shortest Euclidean distance from $x_i$ to the background marker blocks, and $N$ is the block size;
(2d) obtaining, by the formulas
$$PF(x_i) = \frac{pf(x_i)}{pf(x_i) + pb(x_i)}, \qquad PB(x_i) = \frac{pb(x_i)}{pf(x_i) + pb(x_i)},$$
the similarity probability $PF(x_i)$ that the pixel point $x_i$ to be processed belongs to the foreground and the similarity probability $PB(x_i)$ that it belongs to the background;
Step 3: calculating the geodesic distance $d_f(x_i)$ from the pixel point $x_i$ to be processed to the foreground, taking $1 - PF(x_i)$ as the weight required for the geodesic distance calculation:
(3a) initializing the geodesic distance from the foreground marking point to the foreground to be 0, initializing the geodesic distance of the background to be infinite, and taking the foreground marking point as a sample point;
(3b) searching 8 connected neighborhood pixels of the current sample according to the 8 neighborhood matrix of the pixels, and finding out the pixel with the minimum weight value in the pixels;
(3c) adding the pixel point with the minimum weight to a path of the geodesic distance, and sequencing according to the size;
(3d) for other pixel points on the path, weight correction is carried out according to the weight of the newly added pixel point, and the smaller of the original weight and the corrected weight of the pixel point is taken as the updated weight probability of other pixel points on the path;
(3e) taking the newly added pixel points as new samples, and repeating the steps (3b) - (3d) until all the pixel points in the image are completely searched;
(3f) taking the updated weight probability of the pixel point $x_i$ to be processed as its geodesic distance $d_f(x_i)$ to the foreground;
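Steps (3a)-(3f) are Dijkstra's algorithm on the 8-connected pixel grid. Below is a compact sketch; the patent text describes the update with sorting, and the binary heap here is an equivalent standard realization, an assumption of this sketch, as are the function name and the convention that the cost of a step is the weight of the pixel stepped onto.

```python
import heapq
import numpy as np

def geodesic_distance(weight, seeds):
    """Dijkstra over the 8-connected pixel grid, as in steps (3a)-(3f):
    seed (marker) pixels start at distance 0, all others at infinity;
    the cost of stepping onto pixel (i, j) is weight[i, j]
    (e.g. 1 - PF(x_i) when computing d_f)."""
    H, W = weight.shape
    dist = np.full((H, W), np.inf)
    heap = []
    for s in seeds:
        dist[s] = 0.0
        heapq.heappush(heap, (0.0, s))
    while heap:
        d, (i, j) = heapq.heappop(heap)
        if d > dist[i, j]:
            continue                         # stale heap entry, skip
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if (di or dj) and 0 <= ni < H and 0 <= nj < W:
                    nd = d + weight[ni, nj]  # relax the 8-neighbour
                    if nd < dist[ni, nj]:
                        dist[ni, nj] = nd
                        heapq.heappush(heap, (nd, (ni, nj)))
    return dist
```

On a uniform weight map the geodesic distance reduces to the number of 8-connected steps from the nearest seed.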
Step 4: calculating the geodesic distance $d_b(x_i)$ from the pixel point $x_i$ to be processed to the background, taking $1 - PB(x_i)$ as the weight required for the geodesic distance calculation:
(4a) initializing the geodesic distance from the background mark point to the background to 0, initializing the geodesic distance from the foreground to infinity, and taking the background mark point as a sample point;
(4b) searching 8 connected neighborhood pixels of the current sample according to the 8 neighborhood matrix of the pixels, and finding out the pixel with the minimum weight value in the pixels;
(4c) adding the pixel point with the minimum weight to a path of the geodesic distance, and sorting according to size;
(4d) for the other pixel points on the path, carrying out weight correction according to the weight of the newly added pixel point, taking the smaller of a pixel point's original weight and its corrected weight as its updated weight probability;
(4e) taking the newly added pixel points as new samples, and repeating the steps (4b) - (4d) until all the pixel points in the image are completely searched;
(4f) taking the updated weight probability of the pixel point $x_i$ to be processed as its geodesic distance $d_b(x_i)$ to the background;
And 5: according to the geodesic distance df(xi),db(xi) Obtaining the foreground of the pixel point to be processed
Figure BDA0000129287130000061
And probability of belonging to the background
Figure BDA0000129287130000062
PB x i = d b ( x i ) d b ( x i ) + d f ( x i ) ,
PF x i = d f ( x i ) d b ( x i ) + d f ( x i ) .
Step 6: comparing the probability $PF_{x_i}$ that the pixel point to be processed belongs to the foreground with the probability $PB_{x_i}$ that it belongs to the background: if $PF_{x_i} < PB_{x_i}$, the pixel point $x_i$ to be processed is judged to be a foreground pixel point; if $PB_{x_i} < PF_{x_i}$, the pixel point $x_i$ to be processed is judged to be a background pixel point.
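Steps 5 and 6 combined, following the formulas exactly as printed ($PF_{x_i} = d_f/(d_b + d_f)$, $PB_{x_i} = d_b/(d_b + d_f)$), so that a pixel is labeled foreground when $PF_{x_i} < PB_{x_i}$, i.e. when it is geodesically closer to the foreground marks; the function name and the small $\varepsilon$ guard against a zero denominator are assumptions of this sketch.

```python
import numpy as np

def classify(d_f, d_b):
    """Steps 5-6: turn the geodesic distance maps into the probabilities
    PF = d_f / (d_b + d_f) and PB = d_b / (d_b + d_f), then label a pixel
    foreground when PF < PB (it is geodesically closer to the
    foreground marker points than to the background marker points)."""
    eps = 1e-12                         # guard against d_f + d_b == 0
    PF = d_f / (d_b + d_f + eps)
    PB = d_b / (d_b + d_f + eps)
    return PF < PB                      # boolean mask, True = foreground
```

The comparison $PF < PB$ is equivalent to $d_f < d_b$, so the rule simply assigns each pixel to the geodesically nearer class of markers.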
Step 7: repeating steps 2 to 6 until all pixel points in the input image to be segmented have been processed, obtaining the final segmentation result of the image to be segmented.
The effects of the present invention are further illustrated by the following simulations:
1. Simulation conditions:
The invention adopts the images shown in fig. 2 as test images, where fig. 2(a) is the test image bird to be segmented, fig. 2(b) is the test image cat to be segmented, and fig. 2(c) is the test image vase to be segmented; the software platform is MATLAB 7.0.
The foreground and background pixels are respectively marked on the test chart to be segmented shown in fig. 2(a), fig. 2(b) and fig. 2(c), fig. 3(a) is the marked chart of the test chart shown in fig. 2(a), fig. 3(b) is the marked chart of the test chart shown in fig. 2(b), and fig. 3(c) is the marked chart of the test chart shown in fig. 2(c), wherein the dotted line represents the foreground pixel, and the solid line represents the background pixel.
2. Simulation content
Simulation 1: a simulation experiment is performed on test chart 2(b) using the existing Gabor feature-based method and the method of the present invention. The results are shown in FIG. 4, where FIG. 4(a) is the segmentation result of the existing Gabor feature-based method and FIG. 4(b) is the segmentation result of the method of the present invention.
Simulation 2: the segmentation results of test chart 2(b) are compared with the ideal template chart for the existing Gabor feature-based method and the method of the present invention. The results are shown in FIG. 5, where FIG. 5(a) is the difference map between the existing Gabor feature-based segmentation result and the ideal template and FIG. 5(b) is the difference map between the segmentation result of the method of the present invention and the ideal template; in both, the white area represents wrongly-divided pixel points.
3. Simulation results:
as can be seen from fig. 4(a) and fig. 5(a), the method based on Gabor features has weak ability to obtain the image foreground object, is not accurate enough to grasp the whole image, has large interference on image background information, and may cause the object to be wrongly classified.
As can be seen from FIG. 4(b) and FIG. 5(b), the method of the present invention has the advantages of accurate grasp of the overall structure of the image, accurate analysis of the feature similarity, good consistency of the segmentation result and low background interference.
Taking the error rate as the evaluation criterion for the segmentation results, the methods shown in fig. 4(a) and fig. 4(b) are compared; the evaluation indexes are shown in Table 1.
Table 1: error rate (%) comparison of two segmentation methods
Method                     bird     cat      vase
Gabor features             5.1749   13.3302  21.15
Method of the invention    3.3386   7.4799   9.8532
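The error rate of Table 1 is the fraction of wrongly-divided pixel points with respect to the ideal template, i.e. the white areas of the difference maps in Fig. 5. A sketch of the metric, under the assumption that both masks are binary; the function name is illustrative.

```python
import numpy as np

def error_rate(segmentation, ideal):
    """Percentage of wrongly-divided pixel points: the fraction of
    positions where the binary segmentation disagrees with the ideal
    template (the white area of the difference maps in Fig. 5)."""
    seg = np.asarray(segmentation, dtype=bool)
    ref = np.asarray(ideal, dtype=bool)
    return 100.0 * np.mean(seg != ref)
```

For example, a 2x2 result that misclassifies one pixel against its template yields an error rate of 25%.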
As can be seen from Table 1, the similarity analysis of the traditional Gabor features cannot make good use of the similarity information between image pixel points, and the resulting geodesic distance weight probabilities are not accurate enough, so the segmentation result is not ideal. The method of the invention can effectively use the block information of the image to segment different images. In the experiments, the method is better than similarity analysis based on the traditional Gabor features both in grasping the overall information of the image and in reducing background interference.

Claims (4)

1. An interactive image segmentation method based on block similarity comprises the following steps:
(1) assuming that an input image to be segmented obeys the Markov distribution, constructing a weight probability formula $p(x_i|x_j)$ between pixel points in the image to be segmented and the background and foreground:
$$p(x_i \mid x_j) = \frac{1}{Z_{x_i-x_j}} \exp\left(-\frac{\|x_i - x_j\|_2^2}{4\sigma_{x_i}^2}\right),$$
where $x_i$ is the pixel point to be processed, $x_j$ denotes the remaining pixel points in the similarity window centered at $x_i$, $Z_{x_i-x_j}$ is the normalization factor, $\|x_i - x_j\|_2^2$ is the Euclidean distance between pixel point $x_i$ and pixel point $x_j$, and $\sigma_{x_i}^2$ is the variance of the block centered at pixel point $x_i$;
(2) inputting foreground and background marker images of the image to be segmented, introducing the image block size into the weight probability formula obtained above, and carrying out block-information similarity analysis on the brightness information of the image to be segmented, to obtain the weight probability $pf(x_i)$ that the pixel point $x_i$ to be processed belongs to the foreground marker points and the weight probability $pb(x_i)$ that it belongs to the background marker points:
$$pf(x_i) = \frac{1}{Z_{x_i-x_j}} \exp\left(-\frac{df(x_i)}{4\sigma^2 N}\right), \qquad pb(x_i) = \frac{1}{Z_{x_i-x_j}} \exp\left(-\frac{db(x_i)}{4\sigma^2 N}\right),$$
where $x_i$ is the pixel point to be processed, $i \in \{1, 2, 3, \ldots, M\}$, $M$ is the size of the image to be segmented; $x_j$ is a foreground or background marker pixel point, $j \in \{1, 2, 3, \ldots, C\}$, $C$ is the number of marked foreground or background pixel points; $\sigma^2$ is the variance of the image block centered at $x_i$, the block size being $7 \times 7$; $df(x_i)$ is the shortest Euclidean distance from pixel point $x_i$ to the foreground marker blocks, $db(x_i)$ is the shortest Euclidean distance from $x_i$ to the background marker blocks, and $N$ is the block size;
(3) obtaining, by the formulas
$$PF(x_i) = \frac{pf(x_i)}{pf(x_i) + pb(x_i)}, \qquad PB(x_i) = \frac{pb(x_i)}{pf(x_i) + pb(x_i)},$$
the similarity probability $PF(x_i)$ that the pixel point $x_i$ to be processed belongs to the foreground and the similarity probability $PB(x_i)$ that it belongs to the background;
(4) taking $1 - PF(x_i)$ as the weight for the geodesic distance $d_f(x_i)$ to the foreground, and obtaining the geodesic distance $d_f(x_i)$ from the pixel point $x_i$ to be processed to the foreground by Dijkstra's shortest-path algorithm; likewise taking $1 - PB(x_i)$ as the weight for the geodesic distance $d_b(x_i)$ to the background, and obtaining the geodesic distance $d_b(x_i)$ from the pixel point $x_i$ to the background by Dijkstra's shortest-path algorithm;
(5) according to the geodesic distances $d_f(x_i)$, $d_b(x_i)$, obtaining the probability $PF_{x_i}$ that the pixel point to be processed belongs to the foreground and the probability $PB_{x_i}$ that it belongs to the background:
$$PB_{x_i} = \frac{d_b(x_i)}{d_b(x_i) + d_f(x_i)}, \qquad PF_{x_i} = \frac{d_f(x_i)}{d_b(x_i) + d_f(x_i)};$$
(6) comparing the probability $PF_{x_i}$ that the pixel point to be processed belongs to the foreground with the probability $PB_{x_i}$ that it belongs to the background: if $PF_{x_i} < PB_{x_i}$, the pixel point $x_i$ to be processed is judged to be a foreground pixel point; if $PB_{x_i} < PF_{x_i}$, the pixel point $x_i$ to be processed is judged to be a background pixel point;
(7) Repeat steps (2) to (6) until all pixels in the input image to be segmented have been processed, obtaining the final segmentation result of the image to be segmented.
2. The block-similarity-based geodesic distance image segmentation method of claim 1, wherein the construction in step (1) of the weight probability formula between the pixels of the image to be segmented and the background and foreground is carried out as follows:
(2a) In image denoising, under the Bayesian estimation framework, the estimate $\hat{x}$ of a pixel is:

$$\hat{x}=\frac{\sum_{j=1}^{N_i} p(x_i \mid x_j)\, x_j}{\sum_{j=1}^{N_i} p(x_i \mid x_j)},$$

where $x_i$ is the pixel to be estimated, $x_j$ ranges over the remaining pixels in the similarity window centered at $x_i$, $p(x_i \mid x_j)$ is the weight probability between pixel $x_i$ and pixel $x_j$, and $N_i$ is the size of the search window centered at $x_i$;
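A minimal sketch of this weighted (non-local-means-style) Bayesian estimate; the Gaussian similarity kernel and the filtering parameter `h` are assumptions for illustration, not taken from the patent:

```python
import numpy as np

def bayes_estimate(center_patch, neighbor_patches, h=10.0):
    # x_hat = sum_j p(x_i|x_j) * x_j / sum_j p(x_i|x_j), with the weight
    # p(x_i|x_j) modeled here as exp(-||patch_i - patch_j||^2 / h^2)
    # and x_j taken as the center pixel of each neighbor patch.
    weights, values = [], []
    for p in neighbor_patches:
        d2 = float(np.sum((center_patch - p) ** 2))
        weights.append(np.exp(-d2 / h ** 2))
        values.append(float(p[p.shape[0] // 2, p.shape[1] // 2]))
    weights = np.asarray(weights)
    return float(np.dot(weights, values) / weights.sum())
```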
(2b) Assume each pixel in the image to be segmented is independent and obeys a Gaussian Markov distribution, and let $p(x_i \mid x_j)=p(x_i-x_j)$; then, according to the Markov distribution model:

$$p(x_i \mid x_j)=\frac{1}{Z_{x_i-x_j}}\exp\!\left(-\frac{\left\|x_i-x_j+\mu_{x_j}-\mu_{x_i}\right\|_2^2}{2\left(\sigma_{x_i}^2+\sigma_{x_j}^2\right)}\right),$$
where $\mu_{x_i}$ is the mean of the block centered at $x_i$, $\sigma_{x_i}^2$ is the variance of the block centered at $x_i$, $\mu_{x_j}$ is the mean of the block centered at $x_j$, $\sigma_{x_j}^2$ is the variance of the block centered at $x_j$, and $Z_{x_i-x_j}$ is a normalization factor;
(2c) All pixels obey the same Gaussian Markov distribution, so $\mu_{x_i}=\mu_{x_j}$ and $\sigma_{x_i}^2=\sigma_{x_j}^2$; then $p(x_i \mid x_j)$ in (2b) becomes:

$$p(x_i \mid x_j)=\frac{1}{Z_{x_i-x_j}}\exp\!\left(-\frac{\left\|x_i-x_j\right\|_2^2}{4\sigma_{x_i}^2}\right),$$
where $\left\|x_i-x_j\right\|_2^2$ is the squared Euclidean distance between pixel $x_i$ and pixel $x_j$.
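The simplified weight of (2c) maps directly to code; the function name and the guard against flat (zero-variance) patches are assumptions:

```python
import numpy as np

def block_weight(patch_i, patch_j, z=1.0):
    # p(x_i|x_j) = (1/Z) * exp(-||x_i - x_j||_2^2 / (4 * sigma_i^2)),
    # where sigma_i^2 is the variance of the block centered at x_i.
    sigma2 = float(patch_i.var())
    if sigma2 == 0.0:
        sigma2 = 1e-8  # avoid division by zero on flat patches (assumption)
    d2 = float(np.sum((patch_i - patch_j) ** 2))
    return (1.0 / z) * np.exp(-d2 / (4.0 * sigma2))
```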
3. The block-similarity-based geodesic distance image segmentation method of claim 1, wherein the Dijkstra shortest-path algorithm in step (4) obtains the geodesic distance $d_f(x_i)$ from the pixel to be processed $x_i$ to the foreground as follows:
(3a) Initialize the geodesic distance of the foreground marker points to 0 and the geodesic distance of the remaining (background) pixels to infinity, and take the foreground marker points as sample points;
(3b) Search the 8-connected neighborhood pixels of the current sample according to the 8-neighborhood matrix, and find the pixel with the minimum weight among them;
(3c) Add the pixel with the minimum weight to the geodesic-distance path, and sort by weight;
(3d) For the other pixels on the path, correct their weights according to the weight of the newly added pixel, taking the smaller of a pixel's original weight and its corrected weight as its updated weight probability;
(3e) Take the newly added pixel as the new sample, and repeat steps (3b)-(3d) until all pixels in the image have been searched;
(3f) Take the updated weight probability of the pixel to be processed $x_i$ as its geodesic distance $d_f(x_i)$ to the foreground.
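Steps (3a)-(3f) amount to Dijkstra's algorithm on the 8-connected pixel grid. A compact sketch using a binary heap, under the assumption that stepping into a pixel costs that pixel's weight (e.g. $1-PF(x_i)$):

```python
import heapq
import numpy as np

def geodesic_distance(weight, seeds):
    # Dijkstra over the 8-connected grid: seed pixels (marker points) start
    # at distance 0, all others at infinity; each move adds the weight of
    # the pixel being entered.
    h, w = weight.shape
    dist = np.full((h, w), np.inf)
    heap = []
    for r, c in seeds:
        dist[r, c] = 0.0
        heapq.heappush(heap, (0.0, r, c))
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > dist[r, c]:
            continue  # stale heap entry; a shorter path was already found
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr or dc) and 0 <= nr < h and 0 <= nc < w:
                    nd = d + weight[nr, nc]
                    if nd < dist[nr, nc]:
                        dist[nr, nc] = nd
                        heapq.heappush(heap, (nd, nr, nc))
    return dist
```

Calling it with the foreground marker points as seeds yields $d_f$; with the background marker points, $d_b$.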
4. The block-similarity-based geodesic distance image segmentation method of claim 1, wherein the Dijkstra shortest-path algorithm in step (4) obtains the geodesic distance $d_b(x_i)$ from the pixel to be processed $x_i$ to the background as follows:
(4a) Initialize the geodesic distance of the background marker points to 0 and the geodesic distance of the remaining (foreground) pixels to infinity, and take the background marker points as sample points;
(4b) Search the 8-connected neighborhood pixels of the current sample according to the 8-neighborhood matrix, and find the pixel with the minimum weight among them;
(4c) Add the pixel with the minimum weight to the geodesic-distance path, and sort by weight;
(4d) For the other pixels on the path, correct their weights according to the weight of the newly added pixel, taking the smaller of a pixel's original weight and its corrected weight as its updated weight probability;
(4e) Take the newly added pixel as the new sample, and repeat steps (4b)-(4d) until all pixels in the image have been searched;
(4f) Take the updated weight probability of the pixel to be processed $x_i$ as its geodesic distance $d_b(x_i)$ to the background.
CN2012100043120A 2012-01-06 2012-01-06 Block similarity-based interactive image segmenting method Pending CN102542543A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012100043120A CN102542543A (en) 2012-01-06 2012-01-06 Block similarity-based interactive image segmenting method

Publications (1)

Publication Number Publication Date
CN102542543A true CN102542543A (en) 2012-07-04


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103077499A (en) * 2013-01-09 2013-05-01 西安电子科技大学 SAR (Synthetic Aperture Radar) image pre-processing method based on similar block
CN104217423A (en) * 2013-06-03 2014-12-17 西门子公司 Automatic generation of selected image data set
CN104766065A (en) * 2015-04-14 2015-07-08 中国科学院自动化研究所 Robustness prospect detection method based on multi-view learning
CN106780506A (en) * 2016-11-21 2017-05-31 北京交通大学 A kind of interactive image segmentation method based on multi-source shortest path distance
CN107452003A (en) * 2017-06-30 2017-12-08 大圣科技股份有限公司 A kind of method and device of the image segmentation containing depth information
CN107527349A (en) * 2017-06-27 2017-12-29 广州城建职业学院 Image partition method based on network dynamics evolutionary strategy
CN109255321A (en) * 2018-09-03 2019-01-22 电子科技大学 A kind of visual pursuit classifier construction method of combination history and instant messages
CN109493363A (en) * 2018-09-11 2019-03-19 北京达佳互联信息技术有限公司 A kind of FIG pull handle method, apparatus and image processing equipment based on geodesic distance

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101299243A (en) * 2008-06-27 2008-11-05 西安电子科技大学 Method of image segmentation based on immune spectrum clustering
CN101727662A (en) * 2009-11-27 2010-06-09 西安电子科技大学 SAR image nonlocal mean value speckle filtering method
CN102298774A (en) * 2011-09-21 2011-12-28 西安电子科技大学 Non-local mean denoising method based on joint similarity


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhai Shujuan, "Research on Interactive Image Segmentation Models and Algorithms", China Master's Theses Full-text Database, Information Science and Technology *


Legal Events

Date Code Title Description
C06 / PB01: Publication (application publication date: 2012-07-04)
C10 / SE01: Entry into substantive examination (entry into force of request for substantive examination)
C02 / WD01: Deemed withdrawal of patent application after publication (patent law 2001)