CN1595432A - Digital image dividing method based on cluster learning equipment integration - Google Patents

Digital image dividing method based on cluster learning equipment integration Download PDF

Info

Publication number
CN1595432A
Authority
CN
China
Prior art keywords
learning device
clustering learning
pixel
clustering
control variable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN200410041172.XA
Other languages
Chinese (zh)
Other versions
CN1313964C (en)
Inventor
姜远
周志华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University
Original Assignee
Nanjing University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University filed Critical Nanjing University
Priority to CNB200410041172XA priority Critical patent/CN1313964C/en
Publication of CN1595432A publication Critical patent/CN1595432A/en
Application granted granted Critical
Publication of CN1313964C publication Critical patent/CN1313964C/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Links

Images

Landscapes

  • Image Analysis (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention relates to a digital image segmentation method based on an ensemble of clustering learners, comprising the following steps. Convert the image into a set of pixel vectors and use it to train multiple clustering learners. Combine the clustering results of the clustering learners to obtain a coarse segmentation result. Remove isolated points from the result and merge the regions with the fewest pixels into their largest neighboring regions. Then merge the regions with a small RGB mean into their nearest neighboring regions and finish. The advantage of the invention is that using multiple clustering learners improves the accuracy of digital image segmentation, which in turn helps improve the performance of the digital image processing apparatus when performing segmentation.

Description

Digital image segmentation method based on a clustering learner ensemble
1. Technical Field
The present invention relates to digital image processing apparatus, and in particular to a digital image segmentation method based on clustering learner ensemble techniques.
2. Background Art
With the popularization of digital imaging devices such as digital cameras, digital images have found widespread use in all walks of life. To extract and exploit the information contained in a digital image effectively, the image needs to be segmented. Image segmentation means separating the regions with different semantics in an image; these regions are mutually disjoint, and each region satisfies a consistency criterion. Effective digital image segmentation techniques lay the foundation for further digital image retrieval, recognition and the like. Clustering learners are classical machine learning techniques, but existing digital image segmentation techniques based on clustering learners all use only a single clustering learner, and their segmentation quality still leaves room for improvement.
3. Summary of the Invention
The purpose of this invention is to provide a digital image segmentation method that uses multiple clustering learners to improve segmentation accuracy and thereby helps improve the performance of a digital image processing apparatus when performing image segmentation.
To achieve the purpose of the present invention, the digital image segmentation method based on a clustering learner ensemble of the present invention comprises the following steps: 1. convert the image into a set of pixel vectors; 2. train a plurality of clustering learners on the pixel vector set, including registering the cluster labels used by each clustering learner; 3. combine the clustering results of the clustering learners to produce a coarse segmentation result; 4. remove isolated points from the coarse segmentation result; 5. merge regions with few pixels into their largest neighboring regions; 6. merge regions with a small RGB mean into their nearest neighboring regions; 7. finish.
Compared with the prior art, the notable advantage of the present invention is that multiple clustering learners are used to improve the accuracy of digital image segmentation, which in turn helps improve the performance of the digital image processing apparatus when segmenting digital images.
4. Brief Description of the Drawings
Fig. 1 is a workflow diagram of the digital image processing apparatus.
Fig. 2 is a flowchart of the method of the invention.
Fig. 3 is a flowchart of generating the clustering learner ensemble.
Fig. 4 is a flowchart of producing the coarse segmentation result.
5. Embodiments
The present invention is described in detail below with reference to the drawings and embodiments.
As shown in Fig. 1, the digital image processing apparatus obtains a digital image through a digital image input device, performs preprocessing such as smoothing and denoising, and then performs image segmentation; the segmented image can then undergo further processing such as retrieval and recognition.
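The preprocessing stage of Fig. 1 (smoothing and denoising before segmentation) can be illustrated by the following minimal Python sketch. It is not part of the patent: the choice of SciPy, of a Gaussian filter for smoothing and a median filter for denoising, and the filter parameters are all assumptions made for illustration.

```python
import numpy as np
from scipy import ndimage

def preprocess(image: np.ndarray) -> np.ndarray:
    """Smooth and denoise an RGB image (H x W x 3).

    The patent only requires preprocessing 'such as smoothing and denoising';
    Gaussian smoothing plus median filtering is one common, assumed choice.
    """
    img = image.astype(np.float64)
    # Gaussian smoothing, applied per color channel (sigma is an assumption).
    smoothed = np.stack(
        [ndimage.gaussian_filter(img[..., c], sigma=1.0) for c in range(3)],
        axis=-1,
    )
    # Median filtering to suppress impulse noise (3x3 window is an assumption).
    denoised = np.stack(
        [ndimage.median_filter(smoothed[..., c], size=3) for c in range(3)],
        axis=-1,
    )
    return denoised

if __name__ == "__main__":
    noisy = np.random.randint(0, 256, size=(64, 64, 3))
    print(preprocess(noisy).shape)  # (64, 64, 3)
```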
The method of the present invention is shown in Fig. 2. Step 10 is the initial action. Step 11 obtains from the user the number N of regions to be segmented (N is an integer greater than 1). Step 12 generates the clustering learner ensemble; this step is described in detail below with reference to Fig. 3. Step 14 uses the clustering learner ensemble to produce the coarse segmentation result; this step is described in detail below with reference to Fig. 4. The coarse segmentation result contains a number of regions and some isolated points. Step 16 removes these isolated points, i.e. merges each isolated point into the region that surrounds it. Step 17 counts the pixels contained in each region and merges the region with the fewest pixels into its largest neighboring region, repeating until 2N regions remain. Step 18 computes the RGB mean of each region (the mean values of the red, green and blue components in the image), finds the region with the smallest RGB mean, and merges it into the neighboring region whose RGB mean is closest to its own, repeating until N regions remain; at this point the segmentation result has been obtained. Step 19 is the end state of Fig. 2.
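The region-merging post-processing of steps 17 and 18 can be sketched as follows. This is not part of the patent text: it assumes the coarse segmentation is given as a 2-D integer label map, uses 4-connectivity to define neighboring regions, and implements the two merging loops under that reading.

```python
import numpy as np

def region_neighbors(labels: np.ndarray) -> dict:
    """Map each region label to the set of labels of its 4-adjacent neighbors."""
    nbrs = {int(r): set() for r in np.unique(labels)}
    for a, b in [(labels[:, :-1], labels[:, 1:]), (labels[:-1, :], labels[1:, :])]:
        diff = a != b
        for u, v in zip(a[diff].ravel(), b[diff].ravel()):
            nbrs[int(u)].add(int(v))
            nbrs[int(v)].add(int(u))
    return nbrs

def merge_smallest_into_largest_neighbor(labels: np.ndarray, target: int) -> np.ndarray:
    """Step 17 (assumed reading): merge the region with the fewest pixels into
    its largest neighboring region until `target` (= 2N) regions remain."""
    labels = labels.copy()
    while len(np.unique(labels)) > target:
        ids, counts = np.unique(labels, return_counts=True)
        smallest = int(ids[np.argmin(counts)])
        nbrs = region_neighbors(labels)[smallest]
        if not nbrs:
            break
        largest = max(nbrs, key=lambda r: int(np.sum(labels == r)))
        labels[labels == smallest] = largest
    return labels

def merge_darkest_into_closest_neighbor(labels: np.ndarray, image: np.ndarray,
                                        target: int) -> np.ndarray:
    """Step 18 (assumed reading): merge the region with the smallest RGB mean
    into the neighbor whose RGB mean is closest, until `target` (= N) remain."""
    labels = labels.copy()
    while len(np.unique(labels)) > target:
        means = {int(r): float(image[labels == r].mean()) for r in np.unique(labels)}
        darkest = min(means, key=means.get)
        nbrs = region_neighbors(labels)[darkest]
        if not nbrs:
            break
        closest = min(nbrs, key=lambda r: abs(means[r] - means[darkest]))
        labels[labels == darkest] = closest
    return labels
```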
Fig. 3 describes step 12 of Fig. 2 in detail; its purpose is to generate the clustering learner ensemble. Step 120 is the initial action. Step 121 obtains from the user the number M of clustering learners to use (M is an integer greater than 1); the clustering learners may be of any type, as long as they can perform the clustering task, for example the SOM neural networks introduced in machine learning textbooks. Step 122 represents each pixel in the image as a 5-dimensional pixel vector [R, G, B, x, y], where R, G and B are the red, green and blue component values of the pixel, and x and y are its horizontal and vertical coordinates. Step 123 sets the control variable i to 1. Step 124 checks whether i is not greater than M; if so, step 125 is executed, otherwise the flow goes to step 128. Step 125 sets the parameters of a clustering learner at random; for example, if SOM neural networks are used, the learning rate and distance threshold are set at random. Step 126 trains a clustering learner that clusters the pixel vectors into k classes, where k may be any positive integer larger than the 2N of Fig. 2; for convenience, denote these k classes by C_1^(i), C_2^(i), ..., C_k^(i), where C_j^(i) is the j-th class produced by the i-th clustering learner. Step 127 adds 1 to the control variable i and returns to step 124.
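A minimal sketch of steps 121 to 127 follows (not part of the patent): each pixel becomes a 5-dimensional vector [R, G, B, x, y], and M clustering learners are trained with randomly chosen parameters. The patent suggests SOM networks; here scikit-learn's KMeans stands in as a generic clustering learner, and the randomized parameters (random seeds, single initialization) are assumptions made for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

def image_to_pixel_vectors(image: np.ndarray) -> np.ndarray:
    """Step 122: represent each pixel as [R, G, B, x, y]."""
    h, w, _ = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    return np.column_stack([
        image.reshape(-1, 3).astype(float),  # R, G, B
        xs.ravel().astype(float),            # horizontal coordinate x
        ys.ravel().astype(float),            # vertical coordinate y
    ])

def train_ensemble(vectors: np.ndarray, M: int, k: int, rng: np.random.Generator):
    """Steps 123-127: train M clustering learners, each clustering the pixel
    vectors into k classes, with randomly set parameters (step 125).
    KMeans is an assumed stand-in for the SOM mentioned in the patent."""
    learners, assignments = [], []
    for _ in range(M):
        km = KMeans(n_clusters=k, n_init=1,
                    random_state=int(rng.integers(0, 2**31 - 1)))
        assignments.append(km.fit_predict(vectors))
        learners.append(km)
    return learners, np.stack(assignments)   # assignments: shape (M, n_pixels)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(32, 32, 3))
    vecs = image_to_pixel_vectors(img)
    _, A = train_ensemble(vecs, M=3, k=6, rng=rng)
    print(A.shape)  # (3, 1024)
```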
Because the clustering learners are trained independently, the cluster labels they use may differ; for example, the l-th class produced by the i-th clustering learner may not be the same class as the l-th class produced by the j-th clustering learner, so the cluster labels used by the clustering learners need to be registered. The remainder of Fig. 3 accomplishes this task. Step 128 of Fig. 3 sets the control variable j to 2. Step 129 checks whether j is not greater than M; if so, step 130 is executed, otherwise the flow goes to step 138, the end state of Fig. 3. Step 130 sets Ω to the set consisting of the cluster labels C_1^(j), C_2^(j), ..., C_k^(j) used by the j-th clustering learner. Step 131 sets the control variable l to 1. Step 132 checks whether l is not greater than k; if so, step 133 is executed, otherwise step 137 is executed, which adds 1 to the control variable j and returns to step 129. Step 133 finds in Ω the class C_u^(j) that shares the largest number of pixel vectors with the l-th class of the 1st clustering learner, i.e. C_l^(1). Step 134 renames C_u^(j) as C_l^(1). Step 135 removes C_u^(j) from Ω. Step 136 adds 1 to the control variable l and returns to step 132. The operations from step 132 to step 136 in effect register the cluster labels used by the j-th clustering learner to the cluster labels used by the 1st clustering learner.
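The label-registration loop of steps 128 to 136 admits the following sketch (not part of the patent). It reuses the (M, n_pixels) label-array layout assumed in the previous sketch and performs the greedy maximum-overlap matching against the 1st learner that the steps describe.

```python
import numpy as np

def register_labels(assignments: np.ndarray, k: int) -> np.ndarray:
    """Steps 128-136: align the cluster labels of learners 2..M to learner 1.

    assignments: (M, n_pixels) array of labels in 0..k-1, one row per learner.
    Returns a relabeled copy in which label l of every learner denotes the
    class matched to class l of the 1st learner.
    """
    M, _ = assignments.shape
    out = assignments.copy()
    ref = assignments[0]
    for j in range(1, M):                          # steps 128-129
        omega = set(range(k))                      # step 130: labels of learner j not yet matched
        lut = np.zeros(k, dtype=int)
        for l in range(k):                         # steps 131-132, 136
            # Step 133: label in omega sharing the most pixel vectors with class l of learner 1.
            u_best = max(omega,
                         key=lambda u: int(np.sum((ref == l) & (assignments[j] == u))))
            lut[u_best] = l                        # step 134: rename C_u^(j) to C_l^(1)
            omega.remove(u_best)                   # step 135
        out[j] = lut[assignments[j]]
    return out
```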
Fig. 4 describes step 14 of Fig. 2 in detail; its purpose is to use the clustering learner ensemble generated in Fig. 3 to produce the coarse segmentation result. Step 140 is the initial action. Step 141 sets the control variable i to 1. Step 142 checks whether i is not greater than the M of Fig. 3, i.e. the number of clustering learners in the ensemble; if so, step 143 is executed, otherwise the flow goes to step 146. Step 143 computes, according to the following formula, the average mutual information β_i of the i-th clustering learner in the ensemble; the larger the value of β_i, the less information the i-th clustering learner contains that is not already contained in the other clustering learners:
\beta_i = \frac{1}{M-1} \sum_{j=1,\, j \neq i}^{M} \Phi_{\mathrm{NMI}}\left(C^{(i)}, C^{(j)}\right)
where C^(i) and C^(j) are two clustering learners, and their normalized mutual information Φ_NMI(C^(i), C^(j)) is defined by the following formula:
\Phi_{\mathrm{NMI}}\left(C^{(i)}, C^{(j)}\right) = \frac{2}{n} \sum_{a=1}^{k} \sum_{b=1}^{k} n_{ab} \log_{k^2}\left(\frac{n_{ab}\, n}{n_a\, n_b}\right)
where k is the number of classes into which each clustering learner clusters the pixel vector set in step 126 of Fig. 3, and n is the number of pixel vectors; among these pixel vectors, n_a belong to C_a^(i), n_b belong to C_b^(j), and n_ab belong to both C_a^(i) and C_b^(j).
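For concreteness, the two formulas above can be transcribed into code as follows. This is a hedged sketch, not part of the patent; it assumes the registered (M, n_pixels) label rows from the earlier sketches, with labels in 0..k-1.

```python
import numpy as np

def phi_nmi(ci: np.ndarray, cj: np.ndarray, k: int) -> float:
    """Phi_NMI(C^(i), C^(j)) = (2/n) * sum_{a,b} n_ab * log_{k^2}(n_ab * n / (n_a * n_b))."""
    n = ci.size
    total = 0.0
    for a in range(k):
        in_a = (ci == a)
        n_a = int(in_a.sum())
        for b in range(k):
            in_b = (cj == b)
            n_b = int(in_b.sum())
            n_ab = int((in_a & in_b).sum())
            if n_ab > 0 and n_a > 0 and n_b > 0:
                # log base k^2, i.e. ln(x) / ln(k^2)
                total += n_ab * (np.log(n_ab * n / (n_a * n_b)) / np.log(k * k))
    return 2.0 * total / n

def average_mutual_information(assignments: np.ndarray, k: int) -> np.ndarray:
    """beta_i = (1/(M-1)) * sum_{j != i} Phi_NMI(C^(i), C^(j)) for each learner i."""
    M = assignments.shape[0]
    beta = np.zeros(M)
    for i in range(M):
        beta[i] = sum(phi_nmi(assignments[i], assignments[j], k)
                      for j in range(M) if j != i) / (M - 1)
    return beta
```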
Step 144 of Fig. 4 uses the average mutual information computed in step 143 to compute, according to the following formula, the weight w_i of the i-th clustering learner in the ensemble:
w_i = \frac{1}{\beta_i Z}
where Z is a normalization factor chosen so that:
w_i > 0 \;(i = 1, 2, \ldots, M) \quad \text{and} \quad \sum_{i=1}^{M} w_i = 1
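A short sketch of step 144 under the same assumptions: choosing Z as the sum of the reciprocals 1/β_i makes every weight positive and makes the weights sum to 1, satisfying the constraint above.

```python
import numpy as np

def ensemble_weights(beta: np.ndarray) -> np.ndarray:
    """w_i = 1 / (beta_i * Z), with Z = sum_j (1 / beta_j).
    Assumes every beta_i is strictly positive."""
    inv = 1.0 / beta
    return inv / inv.sum()
```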
Step 145 of Fig. 4 adds 1 to the control variable i and returns to step 142. Step 146 takes an unprocessed pixel vector from the pixel vector set. Step 147 sets up one counter for each class; these counters record the classes output by the clustering learners, where the classes correspond to the cluster result of the 1st clustering learner in Fig. 3, i.e. C_1^(1), C_2^(1), ..., C_k^(1). Step 148 resets all counters to zero. Step 149 sets the control variable j to 1. Step 150 obtains the result of the j-th clustering learner in the ensemble on the current pixel vector; for convenience, call this result C_x^(j). Step 151 adds w_j, i.e. the weight of the j-th clustering learner computed in step 144, to the counter of the class corresponding to C_x^(j). Step 152 adds 1 to the control variable j. Step 153 checks whether j is not greater than M, the number of clustering learners in the ensemble; if so, the flow returns to step 150, otherwise step 154 is executed. Step 154 compares the values of all counters, finds the counter with the largest value, and takes its corresponding class as the result class of the current pixel vector; if several counters share the maximum value, the most frequently occurring class among the classes corresponding to these counters is taken as the result class of the current pixel vector. Step 155 checks whether there are pixel vectors not yet processed by steps 146 to 154; if so, the flow returns to step 146, otherwise step 156 is executed. Step 156 groups pixels that are adjacent in the image and have the same result class into the same region. Step 157 is the end state of Fig. 4.
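Steps 146 to 156 amount to a per-pixel weighted vote over the registered labels followed by grouping adjacent same-class pixels into regions. The sketch below is not part of the patent: it vectorizes the counter loop, breaks ties simply by the lowest class index rather than by the rule of step 154, and uses SciPy connected-component labeling (4-connectivity) as an assumed stand-in for step 156.

```python
import numpy as np
from scipy import ndimage

def weighted_vote(assignments: np.ndarray, weights: np.ndarray, k: int) -> np.ndarray:
    """Steps 146-154: for each pixel vector, add w_j to the counter of the class
    chosen by learner j, then pick the class whose counter is largest."""
    M, n = assignments.shape
    counters = np.zeros((n, k))
    for j in range(M):
        counters[np.arange(n), assignments[j]] += weights[j]
    return counters.argmax(axis=1)          # result class per pixel

def group_regions(result_classes: np.ndarray, h: int, w: int) -> np.ndarray:
    """Step 156: pixels that are adjacent in the image and share the same result
    class are grouped into the same region."""
    class_map = result_classes.reshape(h, w)
    regions = np.zeros((h, w), dtype=int)
    next_id = 0
    for c in np.unique(class_map):
        comp, n_comp = ndimage.label(class_map == c)   # connected components of class c
        regions[comp > 0] = comp[comp > 0] + next_id
        next_id += n_comp
    return regions
```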

Claims (4)

1. A digital image segmentation method based on a clustering learner ensemble, in which a digital image is input to a digital image processing apparatus through a digital image input device, preprocessed, and then subjected to segmentation processing, characterized in that the segmentation processing comprises the following steps:
(1) convert the image into a set of pixel vectors;
(2) train a plurality of clustering learners on the pixel vector set;
(3) combine the clustering results of the clustering learners to produce a coarse segmentation result;
(4) remove isolated points from the coarse segmentation result;
(5) merge regions with few pixels into their largest neighboring regions;
(6) merge regions whose mean red, green and blue component values are small into their nearest neighboring regions;
(7) finish.
2. The digital image segmentation method based on a clustering learner ensemble according to claim 1, characterized in that the step of training a plurality of clustering learners on the pixel vector set comprises:
(1) step 121 obtains from the user the number M of clustering learners to use, where M is an integer greater than 1;
(2) step 122 represents each pixel in the image as a 5-dimensional pixel vector [R, G, B, x, y], where R, G and B are the red, green and blue component values of the pixel, and x and y are its horizontal and vertical coordinates;
(3) step 123 sets the control variable i to 1;
(4) step 124 checks whether i is not greater than M; if so, step (5) is executed, otherwise the flow goes to step (8);
(5) step 125 sets the parameters of a clustering learner at random;
(6) step 126 trains a clustering learner that clusters the pixel vectors into k classes, where k is an integer greater than 2N and N is the number of regions to be segmented obtained from the user, N being an integer greater than 1;
(7) step 127 adds 1 to the control variable i and then returns to step (4);
(8) step 128 sets the control variable j to 2;
(9) step 129 checks whether j is not greater than M; if so, step (10) is executed, otherwise the procedure ends;
(10) step 130 sets Ω to the set consisting of the cluster labels used by the j-th clustering learner;
(11) step 131 sets the control variable l to 1;
(12) step 132 checks whether l is not greater than k; if so, step (13) is executed, otherwise the control variable j is incremented by 1 and the flow returns to step (9);
(13) step 133 finds in Ω the class C_u^(j) that shares the largest number of pixel vectors with the l-th class of the 1st clustering learner, i.e. C_l^(1);
(14) step 134 renames C_u^(j) as C_l^(1);
(15) step 135 removes C_u^(j) from Ω;
(16) step 136 adds 1 to the control variable l and returns to step (12).
3. The digital image segmentation method based on a clustering learner ensemble according to claim 1, characterized in that the step of producing the coarse segmentation result comprises:
(1) set the control variable i to 1;
(2) check whether i is not greater than the number M of clustering learners; if so, execute step (3), otherwise go to step (6);
(3) compute the average mutual information of the i-th clustering learner in the ensemble according to the foregoing formula;
(4) using the average mutual information obtained in step (3), compute the weight w_i of the i-th clustering learner in the ensemble;
(5) add 1 to the control variable i and return to step (2);
(6) take an unprocessed pixel vector from the pixel vector set;
(7) set up one counter for each class;
(8) reset all counters to zero;
(9) set the control variable j to 1;
(10) obtain the result of the j-th clustering learner in the ensemble on the current pixel vector;
(11) add the weight w_j to the counter of the class corresponding to the above result;
(12) add 1 to the control variable j;
(13) check whether j is not greater than M; if so, return to step (10), otherwise execute step (14);
(14) compare the values of all counters, find the counter with the largest value, and take its corresponding class as the result class of the current pixel vector;
(15) check whether there are pixel vectors not yet processed by steps (6) to (14); if so, return to step (6), otherwise proceed to step (16);
(16) group pixels that are adjacent in the image and have the same result class into the same region;
(17) finish.
4. The digital image segmentation method based on a clustering learner ensemble according to claim 3, characterized in that in step (14), if several counters share the maximum value, the most frequently occurring class among the classes corresponding to these counters is taken as the result class of the current pixel vector.
CNB200410041172XA 2004-07-05 2004-07-05 Digital image dividing method based on cluster learning equipment integration Expired - Fee Related CN1313964C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB200410041172XA CN1313964C (en) 2004-07-05 2004-07-05 Digital image dividing method based on cluster learning equipment integration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB200410041172XA CN1313964C (en) 2004-07-05 2004-07-05 Digital image dividing method based on cluster learning equipment integration

Publications (2)

Publication Number Publication Date
CN1595432A true CN1595432A (en) 2005-03-16
CN1313964C CN1313964C (en) 2007-05-02

Family

ID=34664945

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB200410041172XA Expired - Fee Related CN1313964C (en) 2004-07-05 2004-07-05 Digital image dividing method based on cluster learning equipment integration

Country Status (1)

Country Link
CN (1) CN1313964C (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101188018B (en) * 2007-12-06 2010-08-25 北大方正集团有限公司 An automatic land return method and device in typeset
CN102737381A (en) * 2012-06-13 2012-10-17 西安电子科技大学 Image partitioning method based on mixed bipartite graph clustering integration
CN106296695A (en) * 2016-08-12 2017-01-04 西安理工大学 Adaptive threshold natural target image based on significance segmentation extraction algorithm

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030095707A1 (en) * 2001-11-19 2003-05-22 Koninklijke Philips Electronics N.V. Computer vision method and system for blob-based analysis using a probabilistic framework
CN1139898C (en) * 2002-03-25 2004-02-25 北京工业大学 Cornea focus image cutting method based on k-mean cluster and information amalgamation

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101188018B (en) * 2007-12-06 2010-08-25 北大方正集团有限公司 An automatic land return method and device in typeset
CN102737381A (en) * 2012-06-13 2012-10-17 西安电子科技大学 Image partitioning method based on mixed bipartite graph clustering integration
CN102737381B (en) * 2012-06-13 2014-10-29 西安电子科技大学 Image partitioning method based on mixed bipartite graph clustering integration
CN106296695A (en) * 2016-08-12 2017-01-04 西安理工大学 Adaptive threshold natural target image based on significance segmentation extraction algorithm
CN106296695B (en) * 2016-08-12 2019-05-24 西安理工大学 Adaptive threshold natural target image segmentation extraction algorithm based on conspicuousness

Also Published As

Publication number Publication date
CN1313964C (en) 2007-05-02

Similar Documents

Publication Publication Date Title
CN108537192B (en) Remote sensing image earth surface coverage classification method based on full convolution network
CN103886308B (en) A kind of pedestrian detection method of use converging channels feature and soft cascade grader
CN108052946A (en) A kind of high pressure cabinet switch automatic identifying method based on convolutional neural networks
CN104809187B (en) A kind of indoor scene semanteme marking method based on RGB D data
CN103020122B (en) A kind of transfer learning method based on semi-supervised clustering
CN106650615B (en) A kind of image processing method and terminal
CN106897739A (en) A kind of grid equipment sorting technique based on convolutional neural networks
CN108537134A (en) A kind of video semanteme scene cut and mask method
CN106447020B (en) A kind of intelligence method for counting colonies
CN107391772A (en) A kind of file classification method based on naive Bayesian
CN107392139A (en) A kind of method for detecting lane lines and terminal device based on Hough transformation
CN107590491A (en) A kind of image processing method and device
CN108182447A (en) A kind of adaptive particle filter method for tracking target based on deep learning
Gali et al. Genetic algorithm for content based image retrieval
CN100409248C (en) Process and device for detecting faces in a colour image
Claypo et al. Opinion mining for thai restaurant reviews using K-Means clustering and MRF feature selection
CN108549901A (en) A kind of iteratively faster object detection method based on deep learning
CN110490262A (en) Image processing model generation method, image processing method, device and electronic equipment
CN1313964C (en) Digital image dividing method based on cluster learning equipment integration
CN106991676A (en) A kind of super-pixel fusion method of local correlation
CN109271997B (en) Image texture classification method based on skip subdivision local mode
CN108960186B (en) Advertising machine user identification method based on human face
CN107908630A (en) Material picture color classification retrieving method
CN102034117B (en) Image classification method and apparatus
CN106340024A (en) Image segmentation method and application and computing device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20070502

Termination date: 20100705