CN104751434A - Method and apparatus for dividing object from image - Google Patents

Publication number: CN104751434A (application CN201310728768.6A)
Authority: CN (China)
Prior art keywords: node, linked list, traversal, boundary, pixel
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 王晓涛, 王强, 许宽宏, 郝志会, 郭萍, 张祐荣
Current assignee (the listed assignees may be inaccurate): Beijing Samsung Telecom R&D Center; Beijing Samsung Telecommunications Technology Research Co Ltd; Samsung Electronics Co Ltd
Original assignee: Beijing Samsung Telecommunications Technology Research Co Ltd; Samsung Electronics Co Ltd
Application filed by: Beijing Samsung Telecommunications Technology Research Co Ltd; Samsung Electronics Co Ltd
Priority to: CN201310728768.6A
Landscapes: Image Analysis (AREA)

Abstract

A method and apparatus for segmenting an object from an image are provided. The method includes: A) detecting the approximate position of an object in an image to obtain an initial boundary containing the object; B) converting the initial boundary into the form of a linked list; C) traversing the current linked list; D) determining whether the boundary represented by the linked list after the traversal has converged; and E) when the boundary represented by the linked list after the traversal has converged, using the converged boundary as the boundary of the object.

Description

Method and apparatus for segmenting an object from an image
Technical field
The present invention relates to the field of computer vision, and more particularly, to a method and apparatus for segmenting an object from an image.
Background art
Image segmentation is an important technique in computer vision, with significant applications in intelligent video surveillance, content-based image/video retrieval, image/video annotation, and assisted human-computer interaction. Recently, image segmentation techniques have gradually been applied in the medical imaging field to segment various anatomical structures from images, such as the diaphragm, epidermis, regional walls, organs, lesions, and tumors.
However, existing image segmentation techniques require a very large amount of computation and cannot achieve fast, real-time image segmentation. In addition, existing image segmentation usually requires a large number of training samples to train a classifier, and the trained classifier is applicable only to the segmentation of a specific object, so its applicability is low.
Summary of the invention
The object of the present invention is to provide a method and apparatus for segmenting an object from an image.
An aspect of the present invention provides a method of segmenting an object from an image, comprising: A) detecting the approximate position of an object in an image to obtain an initial boundary containing the object; B) converting the initial boundary into the form of a linked list; C) traversing the current linked list; D) determining whether the boundary represented by the linked list after the traversal has converged; E) when the boundary represented by the linked list after the traversal has converged, using the converged boundary as the boundary of the object.
Optionally, the linked list is updated during the traversal.
Optionally, when the boundary represented by the linked list after the traversal has not converged, the method returns to step C) to traverse the current linked list again.
Optionally, step C) comprises: traversing the nodes of the current linked list; determining whether a traversed node is a boundary point; deleting from the linked list nodes that are determined not to be boundary points, and adding new nodes at the tail of the linked list.
Optionally, in step C), the traversal is completed when the number of added new nodes reaches a predetermined value; when the number of added new nodes has not reached the predetermined value, the traversal is completed when the last added node has been traversed and is determined to be a boundary point.
Optionally, the step of adding a new node at the tail of the linked list comprises: adding, to the tail of the linked list as a new node, the pixel at a predetermined step length from the node that is not a boundary point.
Optionally, the traversal comprises an outward-evolving traversal and an inward-evolving traversal, wherein, in the outward-evolving traversal, the step of adding a new node at the tail of the linked list comprises: adding, to the tail of the linked list as a new node, the pixel at a predetermined step length from the non-boundary-point node along the direction leaving the object; and in the inward-evolving traversal, the step of adding a new node at the tail of the linked list comprises: adding, to the tail of the linked list as a new node, the pixel at a predetermined step length from the non-boundary-point node along the direction entering the object.
Optionally, step C) comprises: first performing the outward-evolving traversal and then performing the inward-evolving traversal based on the linked list after the outward-evolving traversal; or first performing the inward-evolving traversal and then performing the outward-evolving traversal based on the linked list after the inward-evolving traversal.
Optionally, step D) comprises: determining whether the difference between the region indicated by the boundary represented by the linked list after the current traversal and the region indicated by the boundary represented by the linked list after the previous completed traversal is less than a predetermined threshold, and determining that the boundary represented by the linked list after the traversal has converged when the difference is less than the predetermined threshold.
Optionally, the difference refers to the distance between the center of the region indicated by the boundary represented by the linked list after the current traversal and the center of the region indicated by the boundary represented by the linked list after the previous completed traversal.
Optionally, the method further comprises: removing redundant nodes from the linked list after the traversal is completed, so that the boundary represented by the linked list is a single-layer boundary, wherein the step of determining whether the boundary represented by the linked list after the traversal has converged comprises: determining whether the boundary represented by the linked list from which the redundant nodes have been removed has converged.
Optionally, the step of determining whether a traversed node of the linked list is a boundary point comprises: A1) determining a list-region feature value of the current linked list; A2) calculating a category attribute value of the traversed node according to the list-region feature value; A3) calculating a smoothness feature value and a gradient feature value of the traversed node; A4) determining whether the node is a boundary point based on the category attribute value, the smoothness feature value, and the gradient feature value of the node.
Optionally, the list-region feature value can be represented by the parameters c_1 and c_2^i, i ∈ [1, N], where c_1 is the mean of the feature values of the pixels inside the current boundary represented by the linked list, c_2^i is the mean of the feature values of the pixels in the i-th fan-shaped region of a region surrounding the boundary, and N is the number of fan-shaped regions into which the region is divided.
Optionally, the category attribute value φ_t^G(x) is calculated by the formula below:

\phi_t^G(x) = (I(x) - c_1)^2 - \sum_{i=1}^{N} \chi_i(x)\,(I(x) - c_2^i)^2

where I(x) denotes the feature value of the pixel x represented by a node of the linked list, and χ_i(x) denotes the indicator function: χ_i(x) = 1 when pixel x is located in the i-th fan-shaped region, and χ_i(x) = 0 otherwise.
Optionally, the smoothness feature value R(x) is calculated by the formula below:

R(x) = \frac{\sum_{y \in N_n^*(x)} \phi(y)}{\| N_n^*(x) \|}

where N_n^*(x) is the deleted neighborhood of the pixel x represented by the node, φ(y) is a level-set function with input y whose range is {-1, 0, 1}, and ‖N_n^*(x)‖ denotes the number of pixels in the deleted neighborhood of x.
Optionally, the gradient feature value g(x) is calculated by the formula below:

g(x) = 1 - \frac{1}{1 + |\nabla (G_\sigma * I)|^2}

where G_σ is a Gaussian kernel function with variance σ, I denotes the set of pixel values of the image, the symbol * denotes convolution, and the symbol ∇ denotes the gradient.
Optionally, whether a node is a boundary point is determined by the formulas below:

f_1(x) = \phi_t^G(x) + \lambda_1 R(x) + \lambda_2 g(x)
f_2(x) = \phi_t^G(x) + \lambda_1 R(x) - \lambda_2 g(x)

where φ_t^G(x) denotes the category attribute value, R(x) denotes the smoothness feature value, g(x) denotes the gradient feature value, and λ_1 and λ_2 are non-negative constants.
In the outward-evolving traversal, when f_1(x) > 0, the node is determined not to be a boundary point; when f_1(x) ≤ 0, the node is determined to be a boundary point.
In the inward-evolving traversal, when f_2(x) < 0, the node is determined not to be a boundary point; when f_2(x) ≥ 0, the node is determined to be a boundary point.
Optionally, step A1) is performed between steps B) and C).
Optionally, step A3) further comprises: calculating a shape feature value of the traversed node.
Optionally, the shape feature value c(x, z) is calculated by the formula below:

c(x, z) = 1 - \exp\left( -\frac{(d(x, z) - r_0(z))^2}{\varepsilon^2} \right)

where, when the image is a three-dimensional image, z denotes the coordinate along one axis of the pixel x represented by the node, r_0(z) is the average radius of the boundary on the z-plane, d(x, z) is the distance from pixel x to the center of the boundary on the z-plane, and ε is a constant.
Optionally, whether a node is a boundary point is determined by the formulas below:

f_1(x) = \phi_t^G(x) + \lambda_1 R(x) + \lambda_2 g(x) + \lambda_3 c(x, z)
f_2(x) = \phi_t^G(x) + \lambda_1 R(x) - \lambda_2 g(x) + \lambda_3 c(x, z)

where φ_t^G(x) denotes the category attribute value, R(x) denotes the smoothness feature value, g(x) denotes the gradient feature value, c(x, z) denotes the shape feature value, and λ_1, λ_2, and λ_3 are non-negative constants.
In the outward-evolving traversal, when f_1(x) > 0, the node is determined not to be a boundary point; when f_1(x) ≤ 0, the node is determined to be a boundary point.
Optionally, the image is a two-dimensional image or a three-dimensional image.
Another aspect of the present invention provides an apparatus for segmenting an object from an image, comprising: an initial detection unit, which detects the approximate position of an object in an image to obtain an initial boundary containing the object; a list conversion unit, which converts the initial boundary into the form of a linked list; a boundary evolution unit, which traverses the current linked list; a convergence detection unit, which determines whether the boundary represented by the linked list after the traversal has converged; and a boundary determination unit, which, when the boundary represented by the linked list after the traversal has converged, uses the converged boundary as the boundary of the object.
Optionally, the boundary evolution unit updates the linked list during the traversal.
Optionally, when the boundary represented by the linked list after the traversal has not converged, the boundary evolution unit traverses the current linked list again.
Optionally, the boundary evolution unit comprises: a traversal unit, which traverses the nodes of the current linked list; a boundary point detection unit, which determines whether a traversed node is a boundary point; and a node update unit, which deletes from the linked list nodes that are determined not to be boundary points and adds new nodes at the tail of the linked list.
Optionally, the boundary evolution unit completes the traversal when the number of added new nodes reaches a predetermined value; when the number of added new nodes has not reached the predetermined value, the boundary evolution unit completes the traversal when the last added node has been traversed and is determined to be a boundary point.
Optionally, the node update unit adds, to the tail of the linked list as a new node, the pixel at a predetermined step length from the node that is not a boundary point.
Optionally, the traversal comprises an outward-evolving traversal and an inward-evolving traversal, wherein, in the outward-evolving traversal, the pixel at a predetermined step length from the non-boundary-point node along the direction leaving the object is added to the tail of the linked list as a new node; and in the inward-evolving traversal, the pixel at a predetermined step length from the non-boundary-point node along the direction entering the object is added to the tail of the linked list as a new node.
Optionally, the traversal unit first performs the outward-evolving traversal and then performs the inward-evolving traversal based on the linked list after the outward-evolving traversal; or the traversal unit first performs the inward-evolving traversal and then performs the outward-evolving traversal based on the linked list after the inward-evolving traversal.
Optionally, the convergence detection unit determines whether the difference between the region indicated by the boundary represented by the linked list after the current traversal and the region indicated by the boundary represented by the linked list after the previous completed traversal is less than a predetermined threshold, and determines that the boundary represented by the linked list after the traversal has converged when the difference is less than the predetermined threshold.
Optionally, the difference refers to the distance between the center of the region indicated by the boundary represented by the linked list after the current traversal and the center of the region indicated by the boundary represented by the linked list after the previous completed traversal.
Optionally, the apparatus further comprises: a redundant node elimination unit, which removes redundant nodes from the linked list after the traversal is completed, so that the boundary represented by the linked list is a single-layer boundary, wherein the convergence detection unit determines whether the boundary represented by the linked list from which the redundant nodes have been removed has converged.
Optionally, the boundary point detection unit comprises: a list-region feature value calculation unit, which determines the list-region feature value of the current linked list; a first node feature value calculation unit, which calculates the category attribute value of the traversed node according to the list-region feature value; a second node feature value calculation unit, which calculates the smoothness feature value and the gradient feature value of the traversed node; and a boundary point determination unit, which determines whether the node is a boundary point based on the category attribute value, the smoothness feature value, and the gradient feature value of the node.
Optionally, the list-region feature value can be represented by the parameters c_1 and c_2^i, i ∈ [1, N], where c_1 is the mean of the feature values of the pixels inside the current boundary represented by the linked list, c_2^i is the mean of the feature values of the pixels in the i-th fan-shaped region of a region surrounding the boundary, and N is the number of fan-shaped regions into which the region is divided.
Optionally, the category attribute value φ_t^G(x) is calculated by the formula below:

\phi_t^G(x) = (I(x) - c_1)^2 - \sum_{i=1}^{N} \chi_i(x)\,(I(x) - c_2^i)^2

where I(x) denotes the feature value of the pixel x represented by a node of the linked list, and χ_i(x) denotes the indicator function: χ_i(x) = 1 when pixel x is located in the i-th fan-shaped region, and χ_i(x) = 0 otherwise.
Optionally, the smoothness feature value R(x) is calculated by the formula below:

R(x) = \frac{\sum_{y \in N_n^*(x)} \phi(y)}{\| N_n^*(x) \|}

where N_n^*(x) is the deleted neighborhood of the pixel x represented by the node, φ(y) is a level-set function with input y whose range is {-1, 0, 1}, and ‖N_n^*(x)‖ denotes the number of pixels in the deleted neighborhood of x.
Optionally, the gradient feature value g(x) is calculated by the formula below:

g(x) = 1 - \frac{1}{1 + |\nabla (G_\sigma * I)|^2}

where G_σ is a Gaussian kernel function with variance σ, I denotes the set of pixel values of the image, the symbol * denotes convolution, and the symbol ∇ denotes the gradient.
Optionally, whether a node is a boundary point is determined by the formulas below:

f_1(x) = \phi_t^G(x) + \lambda_1 R(x) + \lambda_2 g(x)
f_2(x) = \phi_t^G(x) + \lambda_1 R(x) - \lambda_2 g(x)

where φ_t^G(x) denotes the category attribute value, R(x) denotes the smoothness feature value, g(x) denotes the gradient feature value, and λ_1 and λ_2 are non-negative constants.
In the outward-evolving traversal, when f_1(x) > 0, the node is determined not to be a boundary point; when f_1(x) ≤ 0, the node is determined to be a boundary point.
In the inward-evolving traversal, when f_2(x) < 0, the node is determined not to be a boundary point; when f_2(x) ≥ 0, the node is determined to be a boundary point.
Optionally, the list-region feature value calculation unit determines the list-region feature value of the current linked list after the list conversion unit converts the initial boundary into the form of a linked list and before the boundary evolution unit traverses the current linked list.
Optionally, the second node feature value calculation unit also calculates the shape feature value of the traversed node.
Optionally, the shape feature value c(x, z) is calculated by the formula below:

c(x, z) = 1 - \exp\left( -\frac{(d(x, z) - r_0(z))^2}{\varepsilon^2} \right)

where, when the image is a three-dimensional image, z denotes the coordinate along one axis of the pixel x represented by the node, r_0(z) is the average radius of the boundary on the z-plane, d(x, z) is the distance from pixel x to the center of the boundary on the z-plane, and ε is a constant.
Optionally, whether a node is a boundary point is determined by the formulas below:

f_1(x) = \phi_t^G(x) + \lambda_1 R(x) + \lambda_2 g(x) + \lambda_3 c(x, z)
f_2(x) = \phi_t^G(x) + \lambda_1 R(x) - \lambda_2 g(x) + \lambda_3 c(x, z)

where φ_t^G(x) denotes the category attribute value, R(x) denotes the smoothness feature value, g(x) denotes the gradient feature value, c(x, z) denotes the shape feature value, and λ_1, λ_2, and λ_3 are non-negative constants.
In the outward-evolving traversal, when f_1(x) > 0, the node is determined not to be a boundary point; when f_1(x) ≤ 0, the node is determined to be a boundary point.
Optionally, the image is a two-dimensional image or a three-dimensional image.
According to the method and apparatus for segmenting an object from an image of the present invention, the spatial ordering of the nodes does not need to be maintained during the evolution of the boundary; instead, new nodes are added at the tail of the linked list after nodes are deleted, which saves the cost of ordered searching. Moreover, newly added nodes can be traversed within the current traversal of the linked list rather than in the next traversal, which reduces the amount of computation and improves the efficiency and speed of the evolution. In addition, the method and apparatus can segment an object with a low amount of computation, so the segmentation speed is increased substantially. Furthermore, the method and apparatus perform image segmentation in a data-driven manner and do not require training samples, and therefore have a wide range of applicability.
Other aspects and/or advantages of the present invention will be set forth in part in the description that follows; some will be apparent from the description, or may be learned through practice of the present invention.
Brief description of the drawings
The above and other objects, features, and advantages of the present invention will become apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a flowchart of a method of segmenting an object from an image according to an embodiment of the present invention;
Fig. 2 is a flowchart of a method of determining whether a node is a boundary point according to the present invention;
Fig. 3 is a schematic diagram of fan-shaped regions according to an embodiment of the present invention;
Fig. 4 is a block diagram of an apparatus for segmenting an object from an image according to an embodiment of the present invention;
Fig. 5 is a block diagram of a boundary evolution unit according to an embodiment of the present invention;
Fig. 6 is a block diagram of a boundary point detection unit according to an embodiment of the present invention.
Detailed description
Various example embodiments will now be described more fully with reference to the accompanying drawings, in which some exemplary embodiments are shown.
Fig. 1 is a flowchart of the method of segmenting an object from an image according to an embodiment of the present invention.
In step 101, the approximate position of an object is detected in an image to obtain an initial boundary containing the object.
Step 101 can be implemented using various existing object detection methods (for example, sliding-window methods, maximally stable extremal region detection, edge detection, Harris-affine detection, and so on). Step 101 can also be implemented by an object detection method with manual intervention (for example, manually inputting seed points to specify the position of the object). Depending on whether the image is a two-dimensional image or a three-dimensional image, the initial boundary can be a two-dimensional boundary (for example, a rectangle, a circle, or another closed two-dimensional shape) or a three-dimensional boundary (for example, the surface of a cube, a sphere, or another three-dimensional shape).
In step 102, the initial boundary obtained in step 101 is converted into the form of a linked list. In other words, the pixels on the initial boundary are represented in the form of a linked list. The linked list is an unordered list, so its nodes do not need to be sorted according to the positions of the pixels in the image, as they would be in an ordered list. Each node of the linked list represents one pixel and contains the coordinate information of that pixel. In addition, each node may also hold the indices of its adjacent previous and next nodes.
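As a concrete illustration, the unordered linked list of boundary pixels described above can be sketched as follows. This is a minimal Python sketch; the names `BoundaryNode` and `BoundaryList` are illustrative and not from the patent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BoundaryNode:
    """One boundary pixel: its coordinates plus links to the adjacent
    previous/next nodes in the list (list order, not image order)."""
    x: int
    y: int
    prev: Optional["BoundaryNode"] = None
    next: Optional["BoundaryNode"] = None

class BoundaryList:
    """Unordered doubly linked list of boundary pixels: nodes are never
    kept in spatial order, deletions happen anywhere, insertions only
    at the tail."""
    def __init__(self) -> None:
        self.head: Optional[BoundaryNode] = None
        self.tail: Optional[BoundaryNode] = None
        self.size = 0

    def append(self, x: int, y: int) -> BoundaryNode:
        node = BoundaryNode(x, y, prev=self.tail)
        if self.tail is None:
            self.head = node
        else:
            self.tail.next = node
        self.tail = node
        self.size += 1
        return node

    def remove(self, node: BoundaryNode) -> None:
        if node.prev is not None:
            node.prev.next = node.next
        else:
            self.head = node.next
        if node.next is not None:
            node.next.prev = node.prev
        else:
            self.tail = node.prev
        self.size -= 1
```

Because the list is unordered, deleting a non-boundary node and appending its replacement at the tail are both O(1) operations, with no search for the spatially correct insertion position.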
In step 103, the current linked list is traversed.
During the traversal, when a node is reached, it is determined whether the node is a boundary point. When the node is not a boundary point, the node is deleted from the linked list and a new node is added at the tail of the linked list; the pixel at a predetermined step length from the deleted node can be added to the tail of the linked list as the new node. When the node is determined to be a boundary point, it is retained in the linked list.
In this way, when new nodes are added to the linked list, the linked list is updated and the number of nodes to be traversed increases accordingly.
Whether a node is a boundary point can be determined by various existing boundary point determination techniques. For example, the prior art can determine whether a pixel is a boundary point according to certain features of the pixel in the image (for example, texture, gray level, or gradient). The boundary point determination scheme proposed by the present invention will be described later.
The traversal ends when all nodes of the linked list (including the newly added nodes) have been traversed and no new node needs to be added. In other words, the traversal is completed after the last added node has been traversed and no further new node needs to be added (that is, the last traversed node is determined to be a boundary point).
However, this termination condition may only be met after a very large number of new nodes have been added, which is unfavorable for increasing the segmentation speed. In another embodiment of determining when to end the traversal, the number of added new nodes is limited: when the number of added new nodes reaches a predetermined value, the traversal is completed; when the number of added new nodes has not reached the predetermined value, the traversal is completed when the last added node has been traversed and it is determined that no new node needs to be added.
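The traversal just described — deleting non-boundary nodes, appending replacements at the tail so they are visited in the same pass, and capping the number of additions — can be sketched as follows. This is a hypothetical Python sketch: representing the list as a plain Python list of pixels (with `None` marking deletions) and the function names are assumptions, not the patent's implementation:

```python
def evolve_once(nodes, is_boundary, propose_new, max_new=1000):
    """One traversal pass over the unordered node list.

    Non-boundary nodes are deleted (marked None) and a replacement
    pixel is appended at the tail, so newly added nodes are still
    visited within this same pass; the pass ends early once max_new
    nodes have been added."""
    added = 0
    i = 0
    while i < len(nodes):
        p = nodes[i]
        if p is not None and not is_boundary(p):
            nodes[i] = None               # delete the non-boundary node
            nodes.append(propose_new(p))  # append its replacement at the tail
            added += 1
            if added >= max_new:
                break                     # predetermined cap reached
        i += 1
    return [p for p in nodes if p is not None]
```

Because the loop bound `len(nodes)` is re-evaluated each iteration, pixels appended during the pass are examined before the pass ends, which is exactly why no second traversal is needed for them.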
In another embodiment of traversing the current linked list, the traversal comprises an outward-evolving traversal and an inward-evolving traversal. The outward-evolving traversal can be performed first, and then the inward-evolving traversal is performed based on the linked list after the outward-evolving traversal; or the inward-evolving traversal is performed first, and then the outward-evolving traversal is performed based on the linked list after the inward-evolving traversal.
In the outward-evolving traversal, it is determined whether a traversed node is a boundary point. When the node is not a boundary point, the node is deleted from the linked list, and the pixel at a predetermined step length from the node, along the direction leaving the object, is added to the tail of the linked list as a new node. When the node is a boundary point, it is retained in the linked list.
In the inward-evolving traversal, it is determined whether a traversed node is a boundary point. When the node is not a boundary point, the node is deleted from the linked list, and the pixel at a predetermined step length from the node, along the direction entering the object, is added to the tail of the linked list as a new node. When the node is a boundary point, it is retained in the linked list.
In other words, in the outward-evolving traversal, only pixels outside the currently determined boundary of the object are used as new nodes; in the inward-evolving traversal, only pixels inside the currently determined boundary of the object are used as new nodes.
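One plausible way to pick the "leaving"/"entering" pixel is to step along the ray from the region center through the deleted node. The patent does not specify how the direction is computed, so this Python sketch is only an assumed approximation:

```python
import math

def propose_new_pixel(p, center, step=1.0, outward=True):
    """Return the pixel `step` away from the deleted node p along the
    ray from the region center through p: away from the object for the
    outward-evolving pass, toward it for the inward-evolving pass.
    Using this ray to approximate the two directions is an assumption."""
    dx, dy = p[0] - center[0], p[1] - center[1]
    norm = math.hypot(dx, dy) or 1.0   # guard against p == center
    sign = 1.0 if outward else -1.0
    return (round(p[0] + sign * step * dx / norm),
            round(p[1] + sign * step * dy / norm))
```

For non-convex objects, a direction derived from the local boundary normal would be more faithful; the center-ray version is just the simplest choice that satisfies the outside/inside constraint stated above.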
In step 104, it is determined whether the boundary represented by the linked list after the traversal has converged. In other words, it is determined whether the result of this traversal has converged and reached the desired state.
Convergence can be determined by checking whether the difference between the region indicated by the boundary represented by the linked list after the current traversal and the region indicated by the boundary represented by the linked list after the previous completed traversal is less than a predetermined threshold. When the difference is less than the predetermined threshold, convergence is determined.
In one embodiment, the distance between the center of the region indicated by the boundary represented by the linked list after the current traversal and the center of the region indicated by the boundary represented by the linked list after the previous completed traversal is used as the difference. Other parameters that represent the difference are also feasible, for example, the boundary overlap ratio or boundary similarity.
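The center-distance convergence test of this embodiment can be sketched in Python as follows; approximating the region center by the centroid of the boundary pixels is an assumption, since the patent does not say how the center is computed:

```python
import math

def centroid(pixels):
    """Approximate the center of the region enclosed by the boundary
    by the mean of the boundary pixels (an assumption)."""
    n = len(pixels)
    return (sum(p[0] for p in pixels) / n,
            sum(p[1] for p in pixels) / n)

def has_converged(prev_pixels, cur_pixels, threshold=0.5):
    """Converged if the boundary center moved less than `threshold`
    between the previous and current completed traversals."""
    (px, py), (cx, cy) = centroid(prev_pixels), centroid(cur_pixels)
    return math.hypot(px - cx, py - cy) < threshold
```

The same interface could host the alternative difference measures mentioned above (boundary overlap ratio, boundary similarity) by swapping the body of `has_converged`.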
When the boundary represented by the linked list after the traversal has not converged, the method returns to step 103, where the traversal is performed again on the current linked list.
When the boundary represented by the linked list after the traversal has converged, in step 105, the converged boundary is used as the boundary of the object. Since the iteration has converged, there is no need to return to step 103 for further iterations.
In another embodiment, the method of segmenting an object from an image further comprises: removing redundant nodes from the linked list after the traversal is completed, so that the boundary represented by the linked list is a single-layer boundary. After a series of new-node additions, the resulting boundary may be thick: for a two-dimensional image, the boundary line may be more than one pixel wide; for a three-dimensional image, the boundary surface may have multiple layers. Such a boundary cannot express the boundary of the object accurately, so the redundant nodes need to be removed. Various existing redundancy removal techniques can be used, for example, removing redundant nodes by various filtering methods. Subsequently, in step 104, convergence can be determined for the boundary represented by the linked list from which the redundant nodes have been removed.
A method proposed by the present invention for determining whether a node is a boundary point is now introduced with reference to Fig. 2.
Fig. 2 is a flowchart of the method of determining whether a node is a boundary point according to the present invention.
In step 201, the list-region feature value of the current linked list is determined.
The list-region feature value represents the image characteristics in a neighborhood (for example, a narrow-band neighborhood) of the boundary represented by the linked list.
The list-region feature value can be represented by the parameters c_1 and c_2^i (i ∈ [1, N], where N is a natural number greater than 1). Here, c_1 is the mean of the feature values of the pixels inside the boundary, and c_2^i is the mean of the feature values of the pixels in the i-th fan-shaped region of a region surrounding the boundary (that is, a neighborhood of the boundary), the region being divided into N fan-shaped regions. It should be understood that i here is a natural number. The feature value of a pixel can be the value of any image characteristic of the pixel, for example, its gray value (pixel value) or texture value.
Fig. 3 is a schematic diagram of fan-shaped regions according to an embodiment of the present invention.
As shown in Fig. 3, the solid closed curve is the boundary, and the area between the boundary and the closed curve surrounding it (shown by the broken line) forms the region, which is divided into N fan-shaped regions (in this example, N is 8). Preferably, the region is divided according to the quadrants of a coordinate system centered at the center of the boundary.
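The fan-shaped partition and the region feature values c_1 and c_2^i can be sketched in Python as follows; dividing the surrounding band into equal angular sectors around the boundary center is an assumption consistent with Fig. 3, and all names are illustrative:

```python
import math

def fan_index(p, center, n_regions=8):
    """Index of the fan-shaped (equal angular sector) region of the
    band around the boundary that contains pixel p."""
    angle = math.atan2(p[1] - center[1], p[0] - center[0]) % (2 * math.pi)
    return int(angle // (2 * math.pi / n_regions)) % n_regions

def region_features(inside, band, center, values, n_regions=8):
    """c1: mean feature value (e.g. gray value) of the pixels inside
    the boundary; c2[i]: mean over the i-th fan region of the band."""
    c1 = sum(values[p] for p in inside) / len(inside)
    sums, counts = [0.0] * n_regions, [0] * n_regions
    for p in band:
        i = fan_index(p, center, n_regions)
        sums[i] += values[p]
        counts[i] += 1
    c2 = [s / c if c else 0.0 for s, c in zip(sums, counts)]
    return c1, c2
```

Splitting the exterior band into sectors lets each part of the boundary be compared against its local exterior statistics rather than a single global mean, which matters when the background is inhomogeneous.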
In step 202, the category attribute value of the traversed node is calculated according to the list-region feature value. The category attribute value reflects the probability that the node belongs to the interior region or the exterior region of the object to be segmented.
The category attribute value of a node is calculated by formula (1) below:

\phi_t^G(x) = (I(x) - c_1)^2 - \sum_{i=1}^{N} \chi_i(x)\,(I(x) - c_2^i)^2    (1)

where I(x) denotes the feature value of the pixel x represented by a node of the linked list, and χ_i(x) denotes the indicator function: χ_i(x) = 1 when pixel x is located in the i-th fan-shaped region, and χ_i(x) = 0 otherwise.
When time to represent that pixel x belongs to the probability of perimeter high, in other cases, the probability that pixel x should belong to interior zone is high.
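Because exactly one indicator $\chi_i(x)$ is non-zero for a given pixel, formula (1) reduces to comparing the pixel against $c_1$ and the mean of its own sector. A minimal sketch with illustrative names:

```python
def category_attribute(I_x, sector, c1, c2):
    """phi_t^G of formula (1) for a pixel with feature value I_x lying in
    the given sector: (I - c1)^2 - (I - c2[sector])^2.  A positive value
    suggests the pixel belongs to the exterior region, a negative value
    the interior."""
    return (I_x - c1) ** 2 - (I_x - c2[sector]) ** 2
```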
In step 203, the smoothness feature value and the gradient feature value of the node currently traversed are calculated.
The smoothness feature value of a node reflects the smoothness of the boundary represented by the linked list at the pixel represented by the node. The gradient feature value reflects the image gradient at that pixel. Any existing smoothness or gradient algorithm can be used to compute them. To make the various characteristics of a pixel easy to combine, the smoothness feature value and the gradient feature value should be normalized, such that a smaller smoothness feature value indicates a smoother boundary and a larger gradient feature value indicates a larger gradient.
Alternatively, the smoothness feature value and the gradient feature value can be calculated in the manner provided by the present invention, as follows.
Formula (2) below shows how to calculate the smoothness feature value $R(x)$:

$$R(x) = \frac{\sum_{y \in N_n^*(x)} \phi(y)}{\| N_n^*(x) \|} \qquad (2)$$

where $N_n^*(x)$ is the deleted neighborhood of the pixel $x$ represented by the node, $\phi(y)$ is a level set function with input $y$ whose range is $\{-1, 0, 1\}$, and $\| N_n^*(x) \|$ denotes the number of pixels in the deleted neighborhood of $x$.
Formula (3) below shows how to calculate the gradient feature value $g(x)$:

$$g(x) = 1 - \frac{1}{1 + |\nabla (G_\sigma * I)|^2} \qquad (3)$$

where $G_\sigma$ is a Gaussian kernel function with variance $\sigma$, $I$ denotes the set of pixel values of the image, the symbol $*$ denotes convolution, and $\nabla$ denotes the gradient.
The value of $g(x)$ is large near boundaries in the image and small in flat regions, which prevents the evolving boundary from crossing the true boundary during evolution.
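A one-dimensional sketch of formula (3), using a truncated discrete Gaussian and central differences; the function names, the truncation radius and the border clamping are illustrative assumptions:

```python
import math

def gaussian_kernel(sigma, radius=3):
    """Normalized 1-D Gaussian kernel truncated at the given radius."""
    ks = [math.exp(-(i * i) / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(ks)
    return [k / s for k in ks]

def gradient_feature(row, sigma=1.0):
    """g(x) = 1 - 1 / (1 + |grad(G_sigma * I)|^2) along one image row.
    Values approach 1 near edges and 0 in flat regions."""
    kern = gaussian_kernel(sigma)
    r = len(kern) // 2
    n = len(row)
    # Smooth with the Gaussian, clamping indices at the borders.
    smoothed = [
        sum(kern[j + r] * row[min(max(i + j, 0), n - 1)] for j in range(-r, r + 1))
        for i in range(n)
    ]
    g = []
    for i in range(n):
        left = smoothed[max(i - 1, 0)]
        right = smoothed[min(i + 1, n - 1)]
        grad = (right - left) / 2.0  # central difference
        g.append(1.0 - 1.0 / (1.0 + grad * grad))
    return g
```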
In step 204, whether the node is a boundary point is determined based on the category attribute value, the smoothness feature value and the gradient feature value of the node.
Specifically, whether the node is a boundary point is determined by formula (4) below:

$$f_1(x) = \phi_t^G(x) + \lambda_1 R(x) + \lambda_2 g(x)$$
$$f_2(x) = \phi_t^G(x) + \lambda_1 R(x) - \lambda_2 g(x) \qquad (4)$$

where $\lambda_1$ and $\lambda_2$ are non-negative constants.
In the traversal process of the outward evolution described above, whether a node is a boundary point is determined by $f_1(x)$ in formula (4): when $f_1(x) > 0$, the node is determined not to be a boundary point; when $f_1(x) \le 0$, the node is determined to be a boundary point.
In the traversal process of the inward evolution described above, whether a node is a boundary point is determined by $f_2(x)$ in formula (4): when $f_2(x) < 0$, the node is determined not to be a boundary point; when $f_2(x) \ge 0$, the node is determined to be a boundary point.
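The decision rule of formula (4) and the sign conventions above can be condensed into a small helper; the names and default weights are illustrative:

```python
def is_boundary_point(phi, R, g, outward, lam1=1.0, lam2=1.0):
    """Formula (4): f1 = phi + lam1*R + lam2*g governs the outward pass
    (boundary point iff f1 <= 0); f2 = phi + lam1*R - lam2*g governs the
    inward pass (boundary point iff f2 >= 0)."""
    if outward:
        return phi + lam1 * R + lam2 * g <= 0
    return phi + lam1 * R - lam2 * g >= 0
```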
In another embodiment, when the image is a three-dimensional image and the object to be segmented is roughly cylindrical, a shape feature value of the node can also be computed in step 203.
The shape feature value $c(x, z)$ is calculated by formula (5) below; a smaller $c(x, z)$ indicates higher circularity:

$$c(x, z) = 1 - \exp\!\left( -\frac{(d(x, z) - r_0(z))^2}{\varepsilon^2} \right) \qquad (5)$$

where $z$ denotes the coordinate, along one axis of the three-dimensional image, of the pixel $x$ represented by the node, $r_0(z)$ is the average radius of the boundary on the plane at $z$, $d(x, z)$ is the distance from pixel $x$ to the center of the boundary on that plane, and $\varepsilon$ is a constant.
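Formula (5) can be sketched directly; the arguments $d(x, z)$ and $r_0(z)$ are assumed to be precomputed, and the function name is illustrative:

```python
import math

def shape_feature(d_xz, r0_z, eps=1.0):
    """Formula (5): c = 1 - exp(-((d - r0)^2) / eps^2).  Equal to 0 when
    the pixel's distance to the slice centre matches the mean radius
    r0(z) (perfectly circular), approaching 1 as it deviates."""
    return 1.0 - math.exp(-((d_xz - r0_z) ** 2) / (eps ** 2))
```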
In that case, in step 204, whether the node is a boundary point is determined by formula (6) below:

$$f_1(x) = \phi_t^G(x) + \lambda_1 R(x) + \lambda_2 g(x) + \lambda_3 c(x, z)$$
$$f_2(x) = \phi_t^G(x) + \lambda_1 R(x) - \lambda_2 g(x) + \lambda_3 c(x, z) \qquad (6)$$

where $\lambda_1$, $\lambda_2$ and $\lambda_3$ are non-negative constants.
In the traversal process of the outward evolution described above, whether a node is a boundary point is determined by $f_1(x)$ in formula (6): when $f_1(x) > 0$, the node is determined not to be a boundary point; when $f_1(x) \le 0$, the node is determined to be a boundary point.
In the traversal process of the inward evolution described above, whether a node is a boundary point is determined by $f_2(x)$ in formula (6): when $f_2(x) < 0$, the node is determined not to be a boundary point; when $f_2(x) \ge 0$, the node is determined to be a boundary point.
In a preferred embodiment, step 201 of Fig. 2 is performed before step 103 of Fig. 1, and the other steps of Fig. 2 are performed during the traversal. The reason is that determining the region feature value of the current linked list requires a substantial amount of computation; performing it during the traversal would severely degrade segmentation speed, whereas performing step 201 before step 103 greatly improves segmentation speed with little impact on accuracy.
Fig. 4 illustrates a block diagram of an apparatus for segmenting an object from an image according to an embodiment of the present invention.
As shown in Fig. 4, the apparatus 400 for segmenting an object from an image comprises: an initial detection unit 410, a linked-list conversion unit 420, a boundary evolution unit 430, a convergence determination unit 440 and a boundary determination unit 450.
The initial detection unit 410 detects the rough position of the object in the image, to obtain an initial boundary containing the object.
The initial detection unit 410 can use any existing object detection technique (for example, sliding-window methods, maximally stable extremal region detection, edge detection, Harris-affine detection, etc.) to detect the rough position of the object in the image. Alternatively, the rough position can be detected by a manually assisted object detection method (for example, manually inputting seed points to specify the position of the object). Depending on whether the image is two-dimensional or three-dimensional, the initial boundary can be a two-dimensional boundary (for example, a box, a circle or another two-dimensional closed shape) or a three-dimensional boundary (for example, the surface of a cube, a sphere or another three-dimensional shape).
The linked-list conversion unit 420 converts the initial boundary into the form of a linked list; in other words, it represents the pixels on the initial boundary in the form of a linked list. The linked list is unordered, so its nodes need not be sorted by pixel position in the image as in an ordered list. Each node represents one pixel and contains the coordinate information of that pixel. In addition, each node may hold indices of its adjacent preceding and following nodes.
The boundary evolution unit 430 traverses the current linked list.
During the traversal, when a node is visited, the boundary evolution unit 430 determines whether the node is a boundary point. When the node is not a boundary point, the boundary evolution unit 430 deletes it from the linked list and adds a new node at the tail of the list; the pixel a predetermined step away from the deleted node can be added to the tail of the list as the new node. When the node is determined to be a boundary point, the boundary evolution unit 430 keeps it in the list.
In this way, whenever a new node is appended, the linked list is updated and the number of nodes still to be traversed increases accordingly.
Whether a node is a boundary point can be determined by any existing boundary-point determination technique; for example, the prior art can determine whether a pixel is a boundary point from certain characteristics of the pixel in the image (for example, texture, gray value, gradient). The scheme proposed by the present invention for determining whether a node is a boundary point will be described later.
The boundary evolution unit 430 finishes the traversal after all nodes of the linked list (including newly added nodes) have been visited and no further node needs to be added. In other words, the traversal is complete after the last added node has been visited and no new node needs to be added (that is, the last visited node is determined to be a boundary point).
However, the above termination condition may be satisfied only after a very large number of new nodes have been added, which is detrimental to segmentation speed. In another embodiment of determining the end of the traversal, the number of newly added nodes is limited: when the number of newly added nodes reaches a predetermined value, the boundary evolution unit 430 finishes the traversal; when it does not reach the predetermined value, the boundary evolution unit 430 finishes the traversal once the last added node has been visited and no new node needs to be added.
In another embodiment of traversing the current linked list, the traversal comprises an outward-evolution traversal and an inward-evolution traversal. The boundary evolution unit 430 may first perform the outward-evolution traversal and then perform the inward-evolution traversal based on the linked list after the outward-evolution traversal; or it may first perform the inward-evolution traversal and then perform the outward-evolution traversal based on the linked list after the inward-evolution traversal.
In the outward-evolution traversal, the boundary evolution unit 430 determines whether the visited node is a boundary point; when it is not, the unit deletes it from the linked list and adds, as a new node at the tail of the list, the pixel that lies a predetermined step away from the deleted node in the direction leaving the object. When the node is a boundary point, it is kept in the list.
In the inward-evolution traversal, the boundary evolution unit 430 determines whether the visited node is a boundary point; when it is not, the unit deletes it from the linked list and adds, as a new node at the tail of the list, the pixel that lies a predetermined step away from the deleted node in the direction entering the object. When the node is a boundary point, it is kept in the list.
In other words, in the outward-evolution traversal only pixels outside the currently determined object boundary are taken as new nodes, and in the inward-evolution traversal only pixels inside the currently determined object boundary are taken as new nodes.
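The node-update discipline described above (drop non-boundary nodes and append their replacements at the tail, so they are still visited within the same pass) can be sketched as follows. `is_boundary` and `next_candidate` stand in for the boundary test and the step-away pixel selection, and the `max_new` cap corresponds to the embodiment that limits the number of newly added nodes; all names are illustrative, not from the patent:

```python
from collections import deque

def evolve(nodes, is_boundary, next_candidate, max_new=10000):
    """One traversal pass over the unordered node list.  Every node is
    visited, including nodes appended during the pass; a non-boundary
    node is dropped and a candidate a fixed step away is appended at the
    tail, so no positional re-sorting is ever needed.  New-node creation
    stops once max_new nodes have been added."""
    queue = deque(nodes)
    result = deque()
    added = 0
    while queue:
        node = queue.popleft()
        if is_boundary(node):
            result.append(node)       # boundary point: keep it
        elif added < max_new:
            queue.append(next_candidate(node))  # replace at the tail
            added += 1
    return list(result)
```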
The convergence determination unit 440 determines whether the boundary represented by the linked list after the traversal has converged; in other words, whether the result of this traversal has reached the desired state.
The convergence determination unit 440 determines convergence by checking whether the difference between the region indicated by the boundary represented by the linked list after the current traversal and the region indicated by the boundary represented by the linked list after the previous traversal is smaller than a predetermined threshold; when the difference is smaller than the threshold, convergence is determined.
In one embodiment, the distance between the center of the region indicated by the boundary after the current traversal and the center of the region indicated by the boundary after the previous traversal is used as the difference. Other parameters expressing the difference between the two regions are also feasible, for example, the boundary overlap ratio or a boundary similarity measure.
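A minimal sketch of the center-distance convergence test, assuming the boundary is held as a list of (x, y) node coordinates; the names and the default threshold are illustrative:

```python
import math

def centroid(nodes):
    """Centre of the region traced by the boundary nodes."""
    xs = [x for x, _ in nodes]
    ys = [y for _, y in nodes]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def converged(prev_nodes, cur_nodes, threshold=0.5):
    """Declare convergence when the distance between the centres of two
    successive traversals' regions falls below the threshold; overlap
    ratio or a similarity measure could be substituted."""
    (px, py), (cx, cy) = centroid(prev_nodes), centroid(cur_nodes)
    return math.hypot(cx - px, cy - py) < threshold
```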
When the boundary represented by the linked list after the traversal has not converged, the boundary evolution unit 430 traverses the current linked list again, and the convergence determination unit 440 again determines whether the resulting boundary has converged. The boundary evolution unit 430 and the convergence determination unit 440 thus repeat the traversal and the convergence determination until the boundary converges.
The boundary determination unit 450 takes the converged boundary represented by the linked list after the traversal as the boundary of the object.
In another embodiment, the apparatus further comprises a redundant-node removal unit, which removes redundant nodes from the linked list after the traversal is completed so that the boundary represented by the linked list is a single-layer boundary. After a series of new-node insertions, the boundary that forms may be thick; for example, for a two-dimensional image the boundary curve may be more than one pixel wide, and for a three-dimensional image the boundary surface may have multiple layers. Such a boundary cannot express the boundary of the object accurately, so the redundant nodes need to be removed; any existing redundancy-removal technique may be used, for example various filtering methods. Subsequently, the convergence determination unit 440 can determine convergence for the boundary represented by the linked list with the redundant nodes removed.
Fig. 5 illustrates a block diagram of the boundary evolution unit according to an embodiment of the present invention.
The boundary evolution unit 430 comprises: a traversal unit 510, a boundary-point determination unit 520 and a node update unit 530.
The traversal unit 510 traverses the nodes.
The boundary-point determination unit 520 determines whether the visited node is a boundary point. It can do so by any existing boundary-point determination technique; for example, the prior art can determine whether a pixel is a boundary point from certain characteristics of the pixel in the image (for example, texture, gray value, gradient). A boundary-point determination unit 520 that implements the scheme proposed by the present invention will be described later.
The node update unit 530 deletes from the linked list the node determined not to be a boundary point, and adds the pixel a predetermined step away from the deleted node to the tail of the list as a new node.
Fig. 6 illustrates a block diagram of the boundary-point determination unit according to an embodiment of the present invention.
The boundary-point determination unit 520 comprises: a region feature value calculation unit 521, a first node feature value calculation unit 522, a second node feature value calculation unit 523 and a boundary point decision unit 524.
The region feature value calculation unit 521 determines the region feature value of the current linked list.
The region feature value represents the image characteristics in a neighborhood (for example, a narrow-band neighborhood) of the boundary represented by the linked list, and can be represented by the previously described parameters $c_1$ and $c_2^i$ ($i \in [1, N]$, where $N$ is a natural number greater than 1).
The first node feature value calculation unit 522 calculates the category attribute value of the visited node according to the region feature value, using the aforementioned formula (1).
The second node feature value calculation unit 523 calculates the smoothness feature value and the gradient feature value of the visited node.
Any existing smoothness or gradient algorithm can be used to compute them. To make the various characteristics of a pixel easy to combine, the values should be normalized, such that a smaller smoothness feature value indicates a smoother boundary and a larger gradient feature value indicates a larger gradient. Alternatively, the smoothness feature value can be calculated by the aforementioned formula (2), and the gradient feature value by the aforementioned formula (3).
The boundary point decision unit 524 determines whether the node is a boundary point based on the category attribute value, the smoothness feature value and the gradient feature value of the node, for example based on the aforementioned formula (4).
In the traversal process of the outward evolution described above, whether a node is a boundary point is determined by $f_1(x)$ in formula (4): when $f_1(x) > 0$, the node is determined not to be a boundary point; when $f_1(x) \le 0$, the node is determined to be a boundary point.
In the traversal process of the inward evolution described above, whether a node is a boundary point is determined by $f_2(x)$ in formula (4): when $f_2(x) < 0$, the node is determined not to be a boundary point; when $f_2(x) \ge 0$, the node is determined to be a boundary point.
In another embodiment, when the image is a three-dimensional image and the object to be segmented is roughly cylindrical, the second node feature value calculation unit 523 can also calculate the shape feature value of the node, by the aforementioned formula (5). In this case, the boundary point decision unit 524 determines whether the node is a boundary point based on the category attribute value, the smoothness feature value, the gradient feature value and the shape feature value of the node, by the aforementioned formula (6).
In the traversal process of the outward evolution described above, whether a node is a boundary point is determined by $f_1(x)$ in formula (6): when $f_1(x) > 0$, the node is determined not to be a boundary point; when $f_1(x) \le 0$, the node is determined to be a boundary point.
In the traversal process of the inward evolution described above, whether a node is a boundary point is determined by $f_2(x)$ in formula (6): when $f_2(x) < 0$, the node is determined not to be a boundary point; when $f_2(x) \ge 0$, the node is determined to be a boundary point.
In a preferred embodiment, the region feature value calculation unit 521 determines the region feature value of the current linked list after the linked-list conversion unit 420 converts the initial boundary into the form of a linked list and before the boundary evolution unit 430 traverses the current linked list. In this case, the region feature value calculation unit 521 need not be included in the boundary evolution unit 430, but can instead be an element of the apparatus 400 alongside the boundary evolution unit 430.
According to the method and apparatus for segmenting an object from an image of the present invention, the spatial ordering of the nodes need not be maintained during the evolution of the boundary; instead, a new node is appended at the tail of the linked list after a node is deleted, which saves sequential-search operations, and newly added nodes are visited within the current traversal of the list rather than in the next traversal, thereby reducing the amount of computation and improving evolution efficiency and speed. In addition, the method and apparatus can achieve object segmentation with a low amount of computation, so the segmentation speed is increased substantially. Moreover, the method and apparatus perform image segmentation in a data-driven manner, require no training samples, and therefore have a wide range of applicability.
In addition, it should be appreciated that the units in the apparatus for segmenting an object from an image according to the embodiments of the present invention can be implemented as hardware components. Those skilled in the art can, according to the processing performed by the defined units, implement the units using, for example, a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC).
In addition, the method of segmenting an object from an image according to the embodiments of the present invention may be implemented as computer code on a computer-readable recording medium. Those skilled in the art can realize the computer code according to the description of the above method; the above method of the present invention is realized when the computer code is executed on a computer.
Although the present invention has been particularly shown and described with reference to its exemplary embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention as defined by the claims.

Claims (44)

1. A method of segmenting an object from an image, comprising:
A) detecting the rough position of an object in an image, to obtain an initial boundary containing the object;
B) converting the initial boundary into the form of a linked list;
C) traversing the current linked list;
D) determining whether the boundary represented by the linked list after the traversal has converged;
E) when the boundary represented by the linked list after the traversal has converged, taking the converged boundary as the boundary of the object.
2. The method according to claim 1, wherein the linked list is updated during the traversal.
3. The method according to claim 1, wherein, when the boundary represented by the linked list after the traversal has not converged, the method returns to step C) to traverse the current linked list.
4. The method according to claim 1, wherein step C) comprises:
traversing the nodes of the current linked list;
determining whether the visited node is a boundary point;
deleting from the linked list the node determined not to be a boundary point, and adding a new node at the tail of the linked list.
5. The method according to claim 4, wherein, in step C), the traversal is completed when the number of newly added nodes reaches a predetermined value; when the number of newly added nodes does not reach the predetermined value, the traversal is completed when the last added node has been visited and is determined to be a boundary point.
6. The method according to claim 4, wherein the step of adding a new node at the tail of the linked list comprises: adding, as the new node, the pixel a predetermined step away from the node that is not a boundary point to the tail of the linked list.
7. The method according to claim 4 or 6, wherein the traversal comprises an outward-evolution traversal and an inward-evolution traversal,
wherein, in the outward-evolution traversal, the step of adding a new node at the tail of the linked list comprises: adding, as the new node, the pixel a predetermined step away from the node that is not a boundary point in the direction leaving the object to the tail of the linked list;
wherein, in the inward-evolution traversal, the step of adding a new node at the tail of the linked list comprises: adding, as the new node, the pixel a predetermined step away from the node that is not a boundary point in the direction entering the object to the tail of the linked list.
8. The method according to claim 7, wherein step C) comprises: first performing the outward-evolution traversal and then performing the inward-evolution traversal based on the linked list after the outward-evolution traversal; or first performing the inward-evolution traversal and then performing the outward-evolution traversal based on the linked list after the inward-evolution traversal.
9. The method according to claim 1, wherein step D) comprises: determining whether the difference between the region indicated by the boundary represented by the linked list after the current traversal and the region indicated by the boundary represented by the linked list after the previous traversal is smaller than a predetermined threshold, and determining that the boundary represented by the linked list after the traversal has converged when the difference is smaller than the predetermined threshold.
10. The method according to claim 9, wherein the difference refers to the distance between the center of the region indicated by the boundary represented by the linked list after the current traversal and the center of the region indicated by the boundary represented by the linked list after the previous traversal.
11. The method according to claim 1, further comprising: removing redundant nodes from the linked list after the traversal is completed, so that the boundary represented by the linked list is a single-layer boundary,
wherein the step of determining whether the boundary represented by the linked list after the traversal has converged comprises: determining whether the boundary represented by the linked list with the redundant nodes removed has converged.
12. The method according to claim 7, wherein the step of determining whether the visited node is a boundary point comprises:
A1) determining the region feature value of the current linked list;
A2) calculating the category attribute value of the visited node according to the region feature value;
A3) calculating the smoothness feature value and the gradient feature value of the visited node;
A4) determining whether the node is a boundary point based on the category attribute value, the smoothness feature value and the gradient feature value of the node.
13. The method according to claim 12, wherein the region feature value is represented by the parameters $c_1$ and $c_2^i$, $i \in [1, N]$,
wherein $c_1$ is the mean of the feature values of the pixels inside the boundary currently represented by the linked list, $c_2^i$ is the mean of the feature values of the pixels in the $i$-th sector region of a region surrounding the boundary, and $N$ is the number of sector regions into which the region is divided.
14. The method according to claim 13, wherein the category attribute value $\phi_t^G$ is calculated by the formula below:

$$\phi_t^G = (I(x) - c_1)^2 - \sum_{i=1}^{N} \chi_i(x)\,(I(x) - c_2^i)^2$$

wherein $I(x)$ denotes the feature value of the pixel $x$ represented by the node of the linked list, and $\chi_i(x)$ is an indicator function: $\chi_i(x) = 1$ when pixel $x$ lies in the $i$-th sector region, and $\chi_i(x) = 0$ otherwise.
15. The method according to claim 13, wherein the smoothness feature value $R(x)$ is calculated by the formula below:

$$R(x) = \frac{\sum_{y \in N_n^*(x)} \phi(y)}{\| N_n^*(x) \|}$$

wherein $N_n^*(x)$ is the deleted neighborhood of the pixel $x$ represented by the node, $\phi(y)$ is a level set function with input $y$ whose range is $\{-1, 0, 1\}$, and $\| N_n^*(x) \|$ denotes the number of pixels in the deleted neighborhood of $x$.
16. The method according to claim 13, wherein the gradient feature value $g(x)$ is calculated by the formula below:

$$g(x) = 1 - \frac{1}{1 + |\nabla (G_\sigma * I)|^2}$$

wherein $G_\sigma$ is a Gaussian kernel function with variance $\sigma$, $I$ denotes the set of pixel values of the image, the symbol $*$ denotes convolution, and $\nabla$ denotes the gradient.
17. The method according to claim 12, wherein whether the node is a boundary point is determined by the formulas below:

$$f_1(x) = \phi_t^G(x) + \lambda_1 R(x) + \lambda_2 g(x)$$
$$f_2(x) = \phi_t^G(x) + \lambda_1 R(x) - \lambda_2 g(x)$$

wherein $\phi_t^G(x)$ denotes the category attribute value, $R(x)$ denotes the smoothness feature value, $g(x)$ denotes the gradient feature value, and $\lambda_1$ and $\lambda_2$ are non-negative constants,
wherein, in the outward-evolution traversal, the node is determined not to be a boundary point when $f_1(x) > 0$, and determined to be a boundary point when $f_1(x) \le 0$;
in the inward-evolution traversal, the node is determined not to be a boundary point when $f_2(x) < 0$, and determined to be a boundary point when $f_2(x) \ge 0$.
18. The method according to claim 12, wherein step A1) is performed between steps B) and C).
19. The method according to claim 12, wherein step A3) further comprises: calculating the shape feature value of the visited node.
20. The method according to claim 19, wherein the shape feature value $c(x, z)$ is calculated by the formula below:

$$c(x, z) = 1 - \exp\!\left( -\frac{(d(x, z) - r_0(z))^2}{\varepsilon^2} \right)$$

wherein $z$ denotes the coordinate, along one axis of the three-dimensional image, of the pixel $x$ represented by the node, $r_0(z)$ is the average radius of the boundary on the plane at $z$, $d(x, z)$ is the distance from pixel $x$ to the center of the boundary on that plane, and $\varepsilon$ is a constant.
21. The method according to claim 19, wherein whether the node is a boundary point is determined by the formulas below:

$$f_1(x) = \phi_t^G(x) + \lambda_1 R(x) + \lambda_2 g(x) + \lambda_3 c(x, z)$$
$$f_2(x) = \phi_t^G(x) + \lambda_1 R(x) - \lambda_2 g(x) + \lambda_3 c(x, z)$$

wherein $\phi_t^G(x)$ denotes the category attribute value, $R(x)$ denotes the smoothness feature value, $g(x)$ denotes the gradient feature value, $c(x, z)$ denotes the shape feature value, and $\lambda_1$, $\lambda_2$ and $\lambda_3$ are non-negative constants,
wherein, in the outward-evolution traversal, the node is determined not to be a boundary point when $f_1(x) > 0$, and determined to be a boundary point when $f_1(x) \le 0$.
22. The method according to claim 1, wherein the image is a two-dimensional image or a three-dimensional image.
23. An apparatus for segmenting an object from an image, comprising:
an initial detection unit, which detects the rough position of an object in an image to obtain an initial boundary containing the object;
a linked-list conversion unit, which converts the initial boundary into the form of a linked list;
a boundary evolution unit, which traverses the current linked list;
a convergence determination unit, which determines whether the boundary represented by the linked list after the traversal has converged;
a boundary determination unit, which, when the boundary represented by the linked list after the traversal has converged, takes the converged boundary as the boundary of the object.
24. equipment according to claim 23, wherein, border evolution unit upgrades chained list in ergodic process.
25. equipment according to claim 23, wherein, when completing the border represented by the chained list after traversal and not restraining, evolution unit in border travels through current chained list.
26. equipment according to claim 23, wherein, border evolution unit comprises:
Traversal Unit, travels through the node of current chained list;
Boundary points detection unit, determines whether the node traversed is frontier point;
Node updates unit, being confirmed as from chain list deletion is not the node of frontier point, and adds new node at the end of chained list.
27. equipment according to claim 26, wherein, when the quantity of the new node added reaches predetermined value, border evolution unit completes traversal; When the quantity of the new node added does not reach predetermined value, and the last node added is traversed rear node determining that this finally adds when being frontier point, and border evolution unit completes traversal.
28. The apparatus according to claim 26, wherein the node update unit adds, as a new node at the end of the linked list, the pixel located a predetermined step length away from the node that is not a boundary point.
29. The apparatus according to claim 26 or 28, wherein the traversal comprises an outward-evolution traversal and an inward-evolution traversal,
wherein, in the inward-evolution traversal, the pixel located the predetermined step length from the node that is not a boundary point, along the direction leaving the object, is added as a new node at the end of the linked list;
and, in the outward-evolution traversal, the pixel located the predetermined step length from the node that is not a boundary point, along the direction entering the object, is added as a new node at the end of the linked list.
30. The apparatus according to claim 29, wherein the traversal unit first performs the outward-evolution traversal and then performs the inward-evolution traversal on the linked list resulting from the outward-evolution traversal; or the traversal unit first performs the inward-evolution traversal and then performs the outward-evolution traversal on the linked list resulting from the inward-evolution traversal.
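Claim 30's two orderings amount to composing the two single-direction passes of claim 29; a trivial sketch, with the pass functions themselves assumed given:

```python
def evolve_both_directions(chain, outward_pass, inward_pass, outward_first=True):
    """Run the outward- and inward-evolution traversals back to back, the
    second operating on the linked list produced by the first."""
    first, second = (outward_pass, inward_pass) if outward_first else (inward_pass, outward_pass)
    return second(first(chain))
```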
31. The apparatus according to claim 23, wherein the convergence detection unit determines whether the difference between the region indicated by the boundary represented by the linked list after the current completed traversal and the region indicated by the boundary represented by the linked list after the previous completed traversal is less than a predetermined threshold, and, when the difference is less than the predetermined threshold, determines that the boundary represented by the linked list after the traversal has converged.
32. The apparatus according to claim 31, wherein the difference refers to the distance between the center of the region indicated by the boundary represented by the linked list after the current completed traversal and the center of the region indicated by the boundary represented by the linked list after the previous completed traversal.
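Reading claims 31–32 together, convergence can be tested by comparing region centers; a sketch that uses the centroid of the boundary nodes as the region center (that choice of center is an assumption of this sketch):

```python
def has_converged(prev_chain, chain, threshold=1.0):
    """The 'difference' between successive boundaries is taken as the
    Euclidean distance between the centers of the regions they indicate;
    the boundary has converged when it drops below the threshold."""
    def center(nodes):
        return (sum(p[0] for p in nodes) / len(nodes),
                sum(p[1] for p in nodes) / len(nodes))
    (ax, ay), (bx, by) = center(prev_chain), center(chain)
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 < threshold
```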
33. The apparatus according to claim 23, further comprising: a redundant node elimination unit that removes redundant nodes from the linked list after the completed traversal, so that the boundary represented by the linked list is a single-layer boundary,
wherein the convergence detection unit determines whether the boundary represented by the linked list from which the redundant nodes have been removed has converged.
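Claim 33's redundant-node elimination can be sketched under a simplifying assumption: a node is redundant if it is a duplicate, or if all four of its axis neighbors are themselves nodes (i.e. it is buried inside a multi-layer band rather than lying on its outer layer):

```python
def remove_redundant_nodes(chain):
    """Thin a multi-layer boundary down to a single-layer one by dropping
    duplicate nodes and nodes fully surrounded by other nodes."""
    nodes = set(chain)
    seen, single_layer = set(), []
    for (x, y) in chain:
        if (x, y) in seen:
            continue                                   # duplicate node
        if {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)} <= nodes:
            continue                                   # buried in the band: redundant
        seen.add((x, y))
        single_layer.append((x, y))
    return single_layer
```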
34. The apparatus according to claim 29, wherein the boundary point detection unit comprises:
a linked-list region feature value calculation unit that determines a linked-list region feature value of the current linked list;
a first node feature value calculation unit that calculates a category attribute value of a traversed node of the linked list according to the linked-list region feature value;
a second node feature value calculation unit that calculates a smoothness feature value and a gradient feature value of the traversed node of the linked list;
a boundary point determination unit that determines whether the node is a boundary point based on the category attribute value, the smoothness feature value and the gradient feature value of the node.
35. The apparatus according to claim 34, wherein the linked-list region feature value is represented by the parameters $c_1$ and $c_2^i$, $i \in [1, N]$,
wherein $c_1$ is the mean of the feature values of the pixels inside the current boundary represented by the linked list, $c_2^i$ is the mean of the feature values of the pixels in the $i$-th fan-shaped region of the region surrounding the boundary, and $N$ is the number of fan-shaped regions into which that surrounding region is divided.
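The parameters of claim 35 can be estimated as plain means; sectoring the surrounding ring by polar angle about a center point is an assumption of this sketch, as are all names below:

```python
import math

def region_feature_values(image, inside, ring, n_sectors, center):
    """c1: mean feature value of the pixels inside the current boundary.
    c2[i]: mean over the i-th fan-shaped sector of the ring-shaped region
    surrounding the boundary (empty sectors fall back to c1).
    `image` maps (x, y) -> feature value."""
    c1 = sum(image[p] for p in inside) / len(inside)
    sectors = [[] for _ in range(n_sectors)]
    cx, cy = center
    for (x, y) in ring:
        angle = math.atan2(y - cy, x - cx) % (2 * math.pi)
        idx = min(int(angle / (2 * math.pi) * n_sectors), n_sectors - 1)
        sectors[idx].append(image[(x, y)])
    c2 = [sum(s) / len(s) if s else c1 for s in sectors]
    return c1, c2
```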
36. The apparatus according to claim 35, wherein the linked-list region feature value $\phi_t^G(x)$ is calculated by the following formula:

$$\phi_t^G(x) = (I(x) - c_1)^2 - \sum_{i=1}^{N} \chi_i(x)\,(I(x) - c_2^i)^2$$

wherein $I(x)$ denotes the feature value of the pixel $x$ represented by a node of the linked list, and $\chi_i(x)$ is an indicator function: $\chi_i(x) = 1$ when pixel $x$ is located in the $i$-th fan-shaped region, and $\chi_i(x) = 0$ otherwise.
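The per-node value of claim 36 is a direct translation of the formula; the list `chi` plays the role of the indicator functions $\chi_i$:

```python
def category_attribute(I_x, c1, c2, chi):
    """phi_t^G(x) = (I(x) - c1)^2 - sum_i chi_i(x) * (I(x) - c2_i)^2,
    where chi[i] is 1 iff pixel x lies in the i-th fan-shaped region."""
    return (I_x - c1) ** 2 - sum(
        chi[i] * (I_x - c2[i]) ** 2 for i in range(len(c2)))
```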
37. The apparatus according to claim 35, wherein the smoothness feature value $R(x)$ is calculated by the following formula:

$$R(x) = \frac{\sum_{y \in N_n^*(x)} \phi(y)}{\| N_n^*(x) \|}$$

wherein $N_n^*(x)$ is the deleted neighborhood of the pixel $x$ represented by the node, $\phi(y)$ is a level set function taking $y$ as input and whose range is $\{-1, 0, 1\}$, and $\| N_n^*(x) \|$ denotes the number of pixels in the deleted neighborhood of pixel $x$.
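Claim 37's $R(x)$ averages the level set function over the punctured neighborhood; a sketch using the 8-connected deleted neighborhood (the neighborhood size $n$ is left implicit in the claim, so 8-connectivity is an assumption here):

```python
def smoothness(phi, x, y):
    """Average a {-1, 0, 1}-valued level set function phi over the deleted
    (punctured) 8-neighborhood of pixel (x, y): the pixel itself is excluded.
    `phi` maps (x, y) -> level set value, defaulting to 0 off the grid."""
    nbrs = [(x + dx, y + dy)
            for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)]
    return sum(phi.get(p, 0) for p in nbrs) / len(nbrs)
```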
38. The apparatus according to claim 35, wherein the gradient feature value $g(x)$ is calculated by the following formula:

$$g(x) = 1 - \frac{1}{1 + |\nabla (G_\sigma * I)|^2}$$

wherein $G_\sigma$ is a Gaussian kernel function with variance $\sigma$, $I$ denotes the set of pixel values of the image, the symbol $*$ denotes convolution, and the symbol $\nabla$ denotes the gradient.
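Claim 38's $g(x)$ is near 0 in flat areas and approaches 1 at strong edges. A dependency-free sketch using central differences and omitting the Gaussian smoothing (i.e. taking $\sigma \to 0$, an assumption of this sketch):

```python
def gradient_feature(img, x, y):
    """g(x) = 1 - 1 / (1 + |grad|^2), with the gradient of the (unsmoothed)
    image approximated by central differences; img is a row-major 2-D list."""
    gx = (img[y][x + 1] - img[y][x - 1]) / 2.0
    gy = (img[y + 1][x] - img[y - 1][x]) / 2.0
    return 1.0 - 1.0 / (1.0 + gx * gx + gy * gy)
```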
39. The apparatus according to claim 34, wherein whether a node is a boundary point is determined by the following formulas:

$$f_1(x) = \phi_t^G(x) + \lambda_1 R(x) + \lambda_2 g(x)$$

$$f_2(x) = \phi_t^G(x) + \lambda_1 R(x) - \lambda_2 g(x)$$

wherein $\phi_t^G(x)$ denotes the linked-list region feature value, $R(x)$ denotes the smoothness feature value, $g(x)$ denotes the gradient feature value, and $\lambda_1$ and $\lambda_2$ are non-negative constants;

wherein, in the outward-evolution traversal, the node is determined not to be a boundary point when $f_1(x) > 0$, and is determined to be a boundary point when $f_1(x) \le 0$;

and, in the inward-evolution traversal, the node is determined not to be a boundary point when $f_2(x) < 0$, and is determined to be a boundary point when $f_2(x) \ge 0$.
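The decision rule of claim 39 in code form, with $\phi_t^G$, $R$ and $g$ assumed precomputed for the node:

```python
def is_boundary_point(phi_tG, R, g, lam1, lam2, outward):
    """Outward traversal: f1 = phi_tG + lam1*R + lam2*g, boundary iff f1 <= 0.
    Inward traversal:  f2 = phi_tG + lam1*R - lam2*g, boundary iff f2 >= 0."""
    if outward:
        return phi_tG + lam1 * R + lam2 * g <= 0
    return phi_tG + lam1 * R - lam2 * g >= 0
```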
40. The apparatus according to claim 34, wherein the linked-list region feature value calculation unit determines the linked-list region feature value of the current linked list after the linked-list conversion unit converts the initial boundary into the form of a linked list and before the boundary evolution unit traverses the current linked list.
41. The apparatus according to claim 34, wherein the second node feature value calculation unit further calculates a shape feature value of the traversed node of the linked list.
42. The apparatus according to claim 41, wherein the shape feature value $c(x, z)$ is calculated by the following formula:

$$c(x, z) = 1 - \exp\left( -\frac{(d(x, z) - r_0(z))^2}{\varepsilon^2} \right)$$

wherein, when the image is a three-dimensional image, $z$ denotes the coordinate of one axis of the pixel $x$ represented by the node, $r_0(z)$ is the average radius of the boundary on the $z$-plane, $d(x, z)$ is the distance from pixel $x$ to the center of the boundary on the $z$-plane, and $\varepsilon$ is a constant.
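Claim 42's shape term penalizes nodes that stray from the average radius of the boundary on their z-slice; a direct sketch, with $d(x,z)$ and $r_0(z)$ assumed precomputed:

```python
import math

def shape_feature(d_xz, r0_z, eps):
    """c(x, z) = 1 - exp(-(d(x, z) - r0(z))^2 / eps^2): near 0 when the node
    sits at the slice's average boundary radius, approaching 1 far from it."""
    return 1.0 - math.exp(-((d_xz - r0_z) ** 2) / eps ** 2)
```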
43. The apparatus according to claim 41, wherein whether a node is a boundary point is determined by the following formulas:

$$f_1(x) = \phi_t^G(x) + \lambda_1 R(x) + \lambda_2 g(x) + \lambda_3 c(x, z)$$

$$f_2(x) = \phi_t^G(x) + \lambda_1 R(x) - \lambda_2 g(x) + \lambda_3 c(x, z)$$

wherein $\phi_t^G(x)$ denotes the linked-list region feature value, $R(x)$ denotes the smoothness feature value, $g(x)$ denotes the gradient feature value, $c(x, z)$ denotes the shape feature value, and $\lambda_1$, $\lambda_2$ and $\lambda_3$ are non-negative constants;

wherein, in the outward-evolution traversal, the node is determined not to be a boundary point when $f_1(x) > 0$, and is determined to be a boundary point when $f_1(x) \le 0$;

and, in the inward-evolution traversal, the node is determined not to be a boundary point when $f_2(x) < 0$, and is determined to be a boundary point when $f_2(x) \ge 0$.
44. The apparatus according to claim 23, wherein the image is a two-dimensional image or a three-dimensional image.
CN201310728768.6A 2013-12-25 2013-12-25 Method and apparatus for dividing object from image Pending CN104751434A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310728768.6A CN104751434A (en) 2013-12-25 2013-12-25 Method and apparatus for dividing object from image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310728768.6A CN104751434A (en) 2013-12-25 2013-12-25 Method and apparatus for dividing object from image

Publications (1)

Publication Number Publication Date
CN104751434A true CN104751434A (en) 2015-07-01

Family

ID=53591055

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310728768.6A Pending CN104751434A (en) 2013-12-25 2013-12-25 Method and apparatus for dividing object from image

Country Status (1)

Country Link
CN (1) CN104751434A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105389813A (en) * 2015-10-30 2016-03-09 上海联影医疗科技有限公司 Medical image organ recognition method and segmentation method
CN105389813B (en) * 2015-10-30 2018-08-31 上海联影医疗科技有限公司 Method for recognizing and segmenting organs in medical images
CN108090483A (en) * 2016-11-21 2018-05-29 医渡云(北京)技术有限公司 Method and device for detecting breast lump candidate regions
CN108090483B (en) * 2016-11-21 2021-04-27 医渡云(北京)技术有限公司 Method and device for detecting breast tumor candidate regions

Similar Documents

Publication Publication Date Title
US10963632B2 (en) Method, apparatus, device for table extraction based on a richly formatted document and medium
US9619691B2 (en) Multi-view 3D object recognition from a point cloud and change detection
CN112184752A (en) Video target tracking method based on pyramid convolution
US20210183097A1 (en) Spare Part Identification Using a Locally Learned 3D Landmark Database
KR102219561B1 (en) Unsupervised stereo matching apparatus and method using confidential correspondence consistency
CN113129335B (en) Visual tracking algorithm and multi-template updating strategy based on twin network
CN111091101B (en) High-precision pedestrian detection method, system and device based on one-step method
Baheti et al. Federated Learning on Distributed Medical Records for Detection of Lung Nodules.
CN112241676A (en) Method for automatically identifying terrain sundries
CN106611030A (en) Object similarity comparison method and object search method based on video, and object similarity comparison system and object search system based on video
CN107993239A (en) A kind of method and apparatus for the depth order for calculating monocular image
CN115393601A (en) Three-dimensional target detection method based on point cloud data
Chen et al. Improved fast r-cnn with fusion of optical and 3d data for robust palm tree detection in high resolution uav images
CN111008630A (en) Target positioning method based on weak supervised learning
CN116523970B (en) Dynamic three-dimensional target tracking method and device based on secondary implicit matching
CN111008294B (en) Traffic image processing and image retrieval method and device
CN104751434A (en) Method and apparatus for dividing object from image
CN114972947B (en) Depth scene text detection method and device based on fuzzy semantic modeling
Zhu et al. (Retracted) Transfer learning-based YOLOv3 model for road dense object detection
CN114913519B (en) 3D target detection method and device, electronic equipment and storage medium
CN115527050A (en) Image feature matching method, computer device and readable storage medium
CN112133100B (en) Vehicle detection method based on R-CNN
CN114881850A (en) Point cloud super-resolution method and device, electronic equipment and storage medium
CN110610185B (en) Method, device and equipment for detecting salient object of image
CN112561956A (en) Video target tracking method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20150701
