CN101344928A - Method and apparatus for confirming image area and classifying image - Google Patents

Publication number
CN101344928A
CN101344928A · CN101344928B · application CNA200710136257XA
Authority
CN
China
Prior art keywords
image
pixel
value
confidence
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA200710136257XA
Other languages
Chinese (zh)
Other versions
CN101344928B (en)
Inventor
王健民
陈新武
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to CN 200710136257 priority Critical patent/CN101344928B/en
Publication of CN101344928A publication Critical patent/CN101344928A/en
Application granted granted Critical
Publication of CN101344928B publication Critical patent/CN101344928B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The disclosed invention relates to a method for determining an image region of an image. The method comprises the steps of: calculating a confidence value for a pixel in the image, determining the type of the pixel from the confidence value, and determining the image region formed by pixels of the same type. The method can significantly improve the recall ratio and obtain classification results better than those of existing methods.

Description

Method and apparatus for determining image regions and classifying images
Technical field
The present invention relates generally to methods and apparatus for determining image regions and image types, and for classifying images into a plurality of types.
Background technology
Classifying images into types with semantic meaning is one of the most challenging problems in content-based image retrieval and computer vision. Classifying an image by its features is practical, and further processing depends on the classification result. Image retrieval is closely related to classification: in an image retrieval system, features extracted from images are stored in a database together with the images, and images with similar features are later retrieved according to the input features. In this sense, a retrieval system can be regarded as an application of classification techniques. Much work has been done in this field, and various systems have been proposed in the literature in recent years. However, in current algorithms (for example, colour-histogram methods), features extracted from the entire image are used directly for classification, and the defect of a low recall ratio remains.
Therefore, there is a need for a method that can classify images and obtain better classification results than existing methods.
Summary of the invention
An object of the present invention is to provide a method and apparatus for determining image regions and classifying images into a plurality of types, thereby obtaining better classification results than the prior art.
To achieve the above object, according to one aspect of the present invention, a method for determining an image region of an image is provided, comprising the steps of: calculating confidence values of pixels in the image; determining the types of the pixels from the confidence values; and determining the image region formed by pixels of the same type.
According to another aspect of the present invention, a method for classifying an image into a plurality of types is provided, comprising the steps of: calculating confidence values of pixels in the image; determining an image feature from the confidence values and the positional information of the pixels; and classifying the image into a plurality of types by the image feature.
According to still another aspect of the present invention, a method for determining the type of an image is provided, comprising the steps of: calculating confidence values of pixels in the image; and determining the type of the image based on the confidence values of the pixels in the image.
According to still another aspect of the present invention, an apparatus for determining an image region of an image is provided, comprising a calculation module for calculating confidence values of pixels in the image, and a determination module for determining, from the confidence values, the types of the pixels and the image region formed by pixels of the same type.
According to another aspect of the present invention, an apparatus for classifying an image into a plurality of types is provided, comprising a calculation module for calculating confidence values of pixels in the image, a determination module for determining an image feature from the confidence values and the positional information of the pixels, and a classification module for classifying the image into a plurality of types by the image feature.
According to still another aspect of the present invention, an apparatus for determining the type of an image is provided, comprising a calculation module for calculating confidence values of pixels in the image, and a determination module for determining the type of the image based on those confidence values.
Compared with the prior art, the present invention can significantly improve the recall ratio of images and obtain better classification results than existing methods.
In addition, the classified image and its regions obtained by the present invention can also be used in image-enhancement processing to obtain a better visual effect.
Description of drawings
The above and other objects, features and advantages of the present invention will become apparent from the following detailed description of exemplary embodiments read in conjunction with the accompanying drawings.
Fig. 1 is a flowchart of a method for classifying an image into a plurality of types according to an embodiment of the invention;
Fig. 2 illustrates how some pixels are selected in an image according to an embodiment of the invention;
Fig. 3 illustrates cropping an image block around each selected pixel of Fig. 2 according to an embodiment of the invention;
Fig. 4 schematically illustrates the wavelet-transform process used to obtain wavelet features according to an embodiment of the invention;
Fig. 5 is a flowchart of the Adaboost algorithm used in an embodiment of the invention;
Fig. 6 is a flowchart of WeakLearn (weak learning) in the Adaboost algorithm of Fig. 5;
Fig. 7 is a schematic block diagram of an apparatus for classifying an image into a plurality of types according to an embodiment of the invention;
Fig. 8 is a schematic block diagram of an apparatus for determining an image region of an image according to an embodiment of the invention;
Fig. 9 is a schematic block diagram of an apparatus for determining the type of an image according to an embodiment of the invention; and
Fig. 10 shows an illustrative application according to an embodiment of the invention.
Embodiment
The present invention will now be described in detail with reference to the accompanying drawings.
Fig. 1 shows a flowchart of a method for classifying an image into a plurality of types according to an embodiment of the invention. In step 100, the image to be classified is input. In step 110, the image is reduced to a predetermined area, for example 19200 pixels: if the area of the image is greater than 19200, it is scaled down to 19200. The scaled result may be slightly inaccurate, but this is not important. The reduction algorithm can be a linear transformation, and after the reduction the ratio of image width to height is unchanged. If the area of the image is not greater than 19200, it is used directly by the subsequent processing. In step 120, the confidence values of pixels are calculated (this can also be done by the calculation modules 400 and 500 in Fig. 7 and Fig. 8, respectively); the details are described below.
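The size-reduction rule of step 110 can be sketched as follows. The function name `reduced_size` and the integer rounding are illustrative assumptions; the patent only specifies a linear reduction that keeps the width-to-height ratio constant and the area at most 19200.

```python
def reduced_size(width, height, max_area=19200):
    """Scale (width, height) down so the area is at most max_area while
    keeping the width:height ratio, as in step 110 of Fig. 1 (a sketch)."""
    area = width * height
    if area <= max_area:
        return width, height            # small image: used directly
    scale = (max_area / area) ** 0.5    # shrink both sides by one factor
    return max(1, int(width * scale)), max(1, int(height * scale))
```

For example, a 320x240 image (area 76800) is reduced by a factor of 0.5 per side, giving 160x120 with area exactly 19200.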
To calculate the confidence values, some pixels of the reduced image are first selected. For each selected pixel, a colour feature and a wavelet feature are calculated from the neighbouring block region of the pixel. These two types of feature together form one feature, from which the confidence value of the pixel is obtained. As shown in Fig. 2, only one pixel (indicated in grey) is selected out of every 9 neighbouring pixels; note, however, that the selection method is not limited to the one described here. As shown in Fig. 3, an image block 8 pixels wide and 8 pixels high is cropped around a selected pixel (indicated in black); again, the cropping method is not limited to the one described here. Selected pixels near the image boundary, for which there is not enough space to crop a block, are ignored. The colour feature is then calculated; it comprises the means, variances and covariances of the r, g and b components of the block pixels, expressed by the following equations:
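The 1-in-9 sampling of Fig. 2 and the 8x8 cropping of Fig. 3 can be sketched as below. The grid offset and the function names are assumptions, since the text notes that neither the selection nor the cropping method is limited to the one described.

```python
def select_pixels(height, width, step=3):
    """Pick one pixel out of every 3x3 neighbourhood (the grey pixels of
    Fig. 2); starting the grid at offset 1 is an assumption."""
    return [(y, x) for y in range(1, height, step)
                   for x in range(1, width, step)]

def block_bounds(y, x, height, width, size=8):
    """Bounds of the 8x8 block around (y, x) as in Fig. 3, or None when
    the pixel is too close to the border (such pixels are ignored)."""
    half = size // 2
    if y < half or x < half or y + half > height or x + half > width:
        return None
    return (y - half, y + half, x - half, x + half)

pts = select_pixels(16, 16)
blocks = [b for b in (block_bounds(y, x, 16, 16) for y, x in pts) if b]
```

On a 16x16 image this selects 25 grid points, of which only the 9 interior ones yield a full 8x8 block.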
1. Means of the block pixels:
$$f(1)=\frac{\sum_{i=1}^{N}r(i)}{N},\qquad f(2)=\frac{\sum_{i=1}^{N}g(i)}{N},\qquad f(3)=\frac{\sum_{i=1}^{N}b(i)}{N}$$
2. Variances of the block pixels:
$$f(4)=\frac{\sum_{i=1}^{N}(r(i)-f(1))^{2}}{N},\qquad f(5)=\frac{\sum_{i=1}^{N}(g(i)-f(2))^{2}}{N},\qquad f(6)=\frac{\sum_{i=1}^{N}(b(i)-f(3))^{2}}{N}$$
3. Covariances of the block pixels:
$$f(7)=\frac{\sum_{i=1}^{N}(r(i)-f(1))(g(i)-f(2))}{N}$$
$$f(8)=\frac{\sum_{i=1}^{N}(r(i)-f(1))(b(i)-f(3))}{N}$$
$$f(9)=\frac{\sum_{i=1}^{N}(g(i)-f(2))(b(i)-f(3))}{N}$$
where f(1), …, f(9) denote the features, N denotes the number of pixels in the block (preferably 64 in the present invention), the pixels of the block are indexed by i, and r(i), g(i) and b(i) denote the r, g and b components of pixel i, respectively.
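The nine colour features f(1)–f(9) follow directly from the equations above. This sketch uses invented names and plain Python lists of (r, g, b) tuples.

```python
def color_features(block):
    """block: list of (r, g, b) tuples, one per pixel (N = len(block)).
    Returns the 9-dim colour feature f(1)..f(9): per-channel means,
    variances, and the three pairwise covariances."""
    n = len(block)
    means = [sum(p[c] for p in block) / n for c in range(3)]          # f1-f3
    var = [sum((p[c] - means[c]) ** 2 for p in block) / n
           for c in range(3)]                                         # f4-f6
    cov = [sum((p[a] - means[a]) * (p[b] - means[b]) for p in block) / n
           for a, b in ((0, 1), (0, 2), (1, 2))]                      # f7-f9
    return means + var + cov
```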
Referring now to Fig. 4, it schematically shows the wavelet transform used to obtain features according to an embodiment of the invention. The method uses a 2-level wavelet transform (here using the Haar wavelet) to obtain the features. The process is briefly described as follows. First, the colour block is converted to a grey block image. Second, a 1-level wavelet transform is applied to the grey block image to obtain the level-1 wavelet result. The 1-level wavelet transform is then applied once more to the upper region of that result; combining the two transform results yields the six regions indicated by numerals in Fig. 4. After this, the wavelet feature and its variance are calculated over the wavelet results of regions 1, 2, …, 6. A 12-dimensional wavelet feature is extracted from regions 1, 2, …, 6 according to the following equation:
$$M(i)=\frac{\sum_{j=1}^{N}\operatorname{abs}(g(j))}{N},\qquad V(i)=\frac{\sum_{j=1}^{N}\bigl(\operatorname{abs}(g(j))-M(i)\bigr)^{2}}{N}$$
where i = 1, …, 6; g(j) is the grey value of pixel j in region i (j = 1, …, N); N is the number of pixels in region i; M(i) is then the mean absolute grey value of region i, and V(i) is its variance.
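A sketch of the 2-level Haar feature extraction. The per-quadrant averages/differences are standard unnormalised Haar steps, but which six subbands correspond to regions 1–6 of Fig. 4 is an assumption (the figure is not reproduced here); this version takes the three level-1 detail bands and the three level-2 detail bands of the low-pass quadrant.

```python
def haar_level(img):
    """One level of 2-D Haar transform on a square grey image (side
    a power of two). Returns LL, LH, HL, HH quadrants."""
    n = len(img) // 2
    LL = [[0.0] * n for _ in range(n)]; LH = [[0.0] * n for _ in range(n)]
    HL = [[0.0] * n for _ in range(n)]; HH = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            a, b = img[2 * i][2 * j], img[2 * i][2 * j + 1]
            c, d = img[2 * i + 1][2 * j], img[2 * i + 1][2 * j + 1]
            LL[i][j] = (a + b + c + d) / 4   # average (low-pass)
            LH[i][j] = (a - b + c - d) / 4   # horizontal detail
            HL[i][j] = (a + b - c - d) / 4   # vertical detail
            HH[i][j] = (a - b - c + d) / 4   # diagonal detail
    return LL, LH, HL, HH

def region_stats(band):
    """M(i) and V(i): mean and variance of absolute values in a region."""
    vals = [abs(v) for row in band for v in row]
    n = len(vals)
    m = sum(vals) / n
    return m, sum((x - m) ** 2 for x in vals) / n

def wavelet_features(gray8x8):
    """12-dim wavelet feature from an 8x8 grey block: (M, V) for six
    subbands of a 2-level Haar transform (the band-to-region mapping
    is an assumption)."""
    LL1, LH1, HL1, HH1 = haar_level(gray8x8)
    LL2, LH2, HL2, HH2 = haar_level(LL1)   # level 2 on the low-pass part
    feats = []
    for band in (LH2, HL2, HH2, LH1, HL1, HH1):
        feats.extend(region_stats(band))
    return feats
```

A constant grey block has no detail energy, so all twelve features are zero.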
The colour feature and the wavelet feature obtained by the above steps together form the feature of the pixel. Since the colour feature is 9-dimensional and the wavelet feature is 12-dimensional, the feature of the pixel is 21-dimensional and can be expressed as f(1), …, f(21). The value obtained by applying the discriminant H(x) (explained below) is called the weight value. The confidence value is then obtained by normalising the weight value to the range [0, 1] in the following steps:
1) let weight = weight / 10;
2) if weight < 0, let weight = 0; if weight > 1, let weight = 1;
3) the weight is now in the range [0, 1], and the confidence value is this weight.
The discriminant H(x) and its parameters are obtained by Adaboost. Referring now to Fig. 5, it shows the flowchart of the Adaboost algorithm used in an embodiment of the invention. Although the Adaboost algorithm is well known to those skilled in the art, a brief description of it will help in understanding the method.
In step 200, m training samples are input, i.e. $S=\{(x_1,y_1),(x_2,y_2),\ldots,(x_m,y_m)\}$ with labels $y_i\in Y=\{0,1\}$, where $x_i$ is an instance drawn from some space X and represented in some manner (typically a d-dimensional vector of attribute values). In training the pixel confidence values, $x_i$ is the pixel feature f(1), …, f(21) introduced in the confidence-value calculation, and $y_i$ is the class label associated with $x_i$, which can only be 0 for a negative sample (for example, a non-blue-sky pixel) or 1 for a positive sample (for example, a blue-sky pixel). Note, however, that the values of $y_i$ are not limited to those described here; those skilled in the art can also set them according to actual needs. In step 210, the iteration variable t is set to 1, and in step 220 $D_t(i)$ is initialised according to the following formula:
$$D_1(i)=1/m,\qquad i=1,2,\ldots,m \tag{3}$$
where $D_1$ is the distribution over the first-round training samples $S=\{(x_1,y_1),(x_2,y_2),\ldots,(x_m,y_m)\}$, initialised to the uniform distribution; in each later iteration $D_t$ is computed from the previous $D_{t-1}$. The flowchart then proceeds to the iterative part, consisting of the following numbered steps. 1) Call weak learning (described later), passing it the distribution $D_t$, and obtain a hypothesis $h_t\colon X\to[-1,1]$. $h_t$ outputs a real number from −1 to 1: the larger the value, the more likely the sample is positive; the smaller the value, the more likely it is negative (step 230). 2) Calculate the pseudo-loss of $h_t$:
$$\epsilon_t=\sum_{i=1}^{m}\frac{1-(y_i\times 2-1)\,h_t(x_i)}{2}\,D_t(i)$$
$\epsilon_t$ is a value that estimates how good the hypothesis $h_t$ is: if $\epsilon_t$ is zero, $h_t$ is perfect; conversely, the larger $\epsilon_t$ is, the worse the result. Note that this error is measured with respect to the distribution $D_t(i)$.
3) Set $\beta_t=\epsilon_t/(1-\epsilon_t)$, where $\beta_t$ is only a temporary variable for convenience and has no substantial meaning.
4) Update the distribution $D_t(i)$:
$$D_{t+1}(i)=\frac{D_t(i)}{Z_t}\,\beta_t^{\frac{1+(y_i\times 2-1)h_t(x_i)}{2}}$$
where $Z_t$ is a normalisation constant (chosen so that $D_{t+1}$ is a distribution). The weight of each example is multiplied by a certain factor, so that $D_{t+1}$ can be calculated from $D_t$, and the weights are then renormalised by dividing by the normalisation constant. In effect, "easy" samples classified correctly by many of the previous weak hypotheses receive lower weight, and "hard" samples that are often misclassified receive higher weight. The method therefore focuses the greatest weight on the samples that are hardest for the weak learner (step 240).
The algorithm continues for T rounds, and finally the weak hypotheses $h_1,h_2,\ldots,h_T$ are merged into a single final hypothesis:
$$H(x)=\sum_{i=1}^{T}\log\!\left(\frac{1}{\beta_i}\right)g\!\left((f(k_i)\times s_i-r_i)/q_i\right)$$
where g is a function whose definition was given by a figure in the original document (not reproduced here).
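The boosting loop of Fig. 5 can be sketched as follows. Since the function g is defined only by a figure, this sketch instead lets the weak learner return hypotheses already mapped into [−1, 1]; all names are invented, and the clamp on the pseudo-loss is an added guard against degenerate rounds.

```python
import math

def adaboost(samples, labels, weak_learn, rounds):
    """Boosting loop following Fig. 5: uniform initial distribution,
    pseudo-loss eps_t, beta_t = eps/(1 - eps), multiplicative
    reweighting, and a weighted-log-vote final hypothesis.
    labels are 0/1; weak_learn(samples, labels, D) must return a
    hypothesis h with h(x) in [-1, 1]."""
    m = len(samples)
    D = [1.0 / m] * m                        # step 220: uniform D_1
    hyps = []
    for _ in range(rounds):
        h = weak_learn(samples, labels, D)   # step 230: call WeakLearn
        eps = sum((1 - (labels[i] * 2 - 1) * h(samples[i])) / 2 * D[i]
                  for i in range(m))          # pseudo-loss w.r.t. D_t
        eps = min(max(eps, 1e-9), 1 - 1e-9)  # guard degenerate cases
        beta = eps / (1 - eps)
        hyps.append((math.log(1 / beta), h))
        # step 240: multiply weights by beta^((1 + y'h)/2), renormalise
        D = [D[i] * beta ** ((1 + (labels[i] * 2 - 1) * h(samples[i])) / 2)
             for i in range(m)]
        Z = sum(D)
        D = [d / Z for d in D]
    return lambda x: sum(a * h(x) for a, h in hyps)  # final H(x)

# toy usage: a fixed threshold stump on 1-D samples
stump = lambda S, y, D: (lambda x: 1 if x >= 2 else -1)
H = adaboost([0, 1, 2, 3], [0, 0, 1, 1], stump, rounds=3)
```

The sign of H(x) then separates positive from negative samples, as in the pixel-confidence training described above.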
Fig. 6 is the flowchart of the weak learning used in the Adaboost algorithm of Fig. 5, consisting of the numbered steps below.
1) Input
The weak learning algorithm receives the m training samples $S=\{(x_1,y_1),(x_2,y_2),\ldots,(x_m,y_m)\}$ and the distribution $D=\{D(1),D(2),\ldots,D(m)\}$ of round t of the Adaboost algorithm (step 300).
2) Iteration
Iterate over k = 1, 2, 3, …, d (d is the dimension of the pattern space X):
1. Obtain the k-th pattern $P_k=\{p_{k,1},p_{k,2},\ldots,p_{k,m}\}$, where $p_{k,i}=x_{i,k}$ and $x_i=(x_{i,1},x_{i,2},\ldots,x_{i,d})$ (step 320).
2. Sort the training samples by $P_k$, so that $p_{k,i}<p_{k,j}$ for any i < j (step 330).
3. Obtain the signed weights $w_k=\{w_{k,1},w_{k,2},\ldots,w_{k,m}\}$, $w_{k,i}=(y_i\times 2-1)\times D(i)$, so that $w_{k,i}$ is positive for a positive sample and negative for a negative sample.
4. Obtain the sum of the negative weights $S_k^0=\sum_{w_{k,i}<0}-w_{k,i}$ (a positive number).
5. Obtain the cumulative sums of the signed weights $c_k=\{c_{k,1},c_{k,2},\ldots,c_{k,m}\}$:
$$c_{k,i}=\sum_{j=1}^{i}w_{k,j}+S_k^0-0.5$$
6. Find the indices $i_1$ and $i_2$ such that $c_{k,i_1}=\max\{c_{k,1},c_{k,2},\ldots,c_{k,m-1}\}$ and $c_{k,i_2}=\min\{c_{k,1},c_{k,2},\ldots,c_{k,m-1}\}$.
7. If $c_{k,i_1}>-c_{k,i_2}$, let $s_k=1$ and $r_k=p_{k,i_1}/2+p_{k,i_1+1}/2$; otherwise let $s_k=-1$ and $r_k=-p_{k,i_2}/2-p_{k,i_2+1}/2$.
8. Obtain
$$q_k=\frac{1}{2}\left[\frac{\sum_{i=1}^{m}p_{k,i}^{2}\,D(i)}{\sum_{i=1}^{m}D(i)}-\left(\frac{\sum_{i=1}^{m}p_{k,i}\,D(i)}{\sum_{i=1}^{m}D(i)}\right)^{2}\right]+0.00001
9. Obtain the loss function
$$\epsilon_k=\sum_{i=1}^{m}\frac{1-(y_i\times 2-1)\,g\bigl((p_{k,i}\,s_k-r_k)/q_k\bigr)}{2}\,D_t(i)$$
where g is as defined previously.
3) After the iteration
After the iteration, select the best k, that is, $k_0=\arg\min_{k=1,2,\ldots,d}\epsilon_k$ (step 370).
4) Output
The final output hypothesis of the weak learning is
$$h(x_i)=g\bigl((x_{i,k_0}\times s_{k_0}-r_{k_0})/q_{k_0}\bigr)$$
where g is as defined previously (its definition was given by a figure in the original document). Supposing this weak learning hypothesis is obtained at round t, h serves as the $h_t$ used in the iterative step above.
Returning now to Fig. 1: in step 130, the type of each pixel is determined from its confidence value. Taking the distinction between blue-sky and non-blue-sky pixels as an example, if the confidence value of a pixel is greater than 0.00001 (merely an exemplary threshold), the pixel is considered a blue-sky pixel; otherwise it is considered a non-blue-sky pixel. Accordingly, in step 160 the blue-sky region is determined based on the previously determined blue-sky pixels. Note that using the present invention to determine blue-sky and non-blue-sky regions is only exemplary and not restrictive; it can be used to determine any type of region as desired. In step 140, the image feature is computed from the confidence values of the selected pixels. The image feature is 6-dimensional, and its calculation is as follows:
Let $sumw=\sum_{i=1}^{N}w(i)$ and let eps = 0.001.
If sumw < eps, let img(1) = −1, img(2) = −1, img(3) = −1, img(4) = −1, img(5) = −1, img(6) = −1. Under this condition, −1 is only a marker for the feature and could be another value; in the next step the algorithm checks this marker and makes its decisions accordingly.
Otherwise let:
$$img(1)=\frac{sumw}{N},\qquad img(2)=\frac{\sum_{i=1}^{N}w(i)\,x(i)}{sumw},\qquad img(3)=\frac{\sum_{i=1}^{N}w(i)\,y(i)}{sumw}$$
$$img(4)=\frac{\sum_{i=1}^{N}w(i)\,x(i)^{2}}{sumw}-\frac{\left(\sum_{i=1}^{N}w(i)\,x(i)\right)^{2}}{sumw^{2}}$$
$$img(5)=\frac{\sum_{i=1}^{N}w(i)\,y(i)^{2}}{sumw}-\frac{\left(\sum_{i=1}^{N}w(i)\,y(i)\right)^{2}}{sumw^{2}}$$
$$img(6)=\left(\frac{\sum_{i=1}^{N}w(i)\,x(i)\,y(i)}{sumw}-\frac{\left(\sum_{i=1}^{N}w(i)\,x(i)\right)\left(\sum_{i=1}^{N}w(i)\,y(i)\right)}{sumw^{2}}\right)\Big/\bigl((img(4)+eps)(img(5)+eps)\bigr)$$
For a selected pixel i, its confidence value is denoted w(i) and its coordinates are denoted x(i) and y(i), respectively; N is the number of selected pixels. The values img(1), …, img(6) above constitute the image feature. In step 150, the image is classified using the image feature, as illustrated by the following process:
1) If (img(1) < 0) or (img(1) = −1 and img(2) = −1 and img(3) = −1 and img(4) = −1 and img(5) = −1 and img(6) = −1), let res = −10; otherwise apply the discriminant H(x) (explained below) to the features img(1), …, img(6) and let res be the resulting value of H(x).
2) If res > 0, the image is classified as blue sky; otherwise it is classified as non-blue sky. Likewise, note that using the present invention to classify images into blue-sky and non-blue-sky images is only exemplary and not restrictive; it can be used to classify images into any desired types.
$$H(x)=\sum_{i=1}^{T}\log\!\left(\frac{1}{\beta_i}\right)g\!\left((img(k_i)\times s_i-r_i)/q_i\right)$$
where the discriminant H(x) and its parameters are obtained by Adaboost. The training process is the same as before, with only the following differences:
1. In the training set, the training samples for this part are images rather than pixels; blue-sky images are used as positive samples and non-blue-sky images as negative samples.
2. In the training samples, for the set S, $x_i$ represents the image feature img(1), …, img(6) introduced in the preceding part; for $y_i$, 0 represents a non-blue-sky image sample and 1 represents a blue-sky image sample.
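The 6-dimensional image feature of step 140, including the sumw < eps sentinel, can be sketched as follows; the function name is invented.

```python
def image_features(conf, xs, ys, eps=0.001):
    """6-dim image feature from per-pixel confidences w(i) and positions
    (x(i), y(i)), following the img(1)..img(6) formulas of step 140."""
    n = len(conf)
    sumw = sum(conf)
    if sumw < eps:
        return [-1.0] * 6               # marker: no confident pixels
    mx = sum(w * x for w, x in zip(conf, xs)) / sumw                 # img(2)
    my = sum(w * y for w, y in zip(conf, ys)) / sumw                 # img(3)
    vx = sum(w * x * x for w, x in zip(conf, xs)) / sumw - mx * mx   # img(4)
    vy = sum(w * y * y for w, y in zip(conf, ys)) / sumw - my * my   # img(5)
    cxy = (sum(w * x * y for w, x, y in zip(conf, xs, ys)) / sumw
           - mx * my) / ((vx + eps) * (vy + eps))                    # img(6)
    return [sumw / n, mx, my, vx, vy, cxy]
```

The result is the feature vector passed to the image-level discriminant H(x) in step 150.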
Fig. 7, Fig. 8 and Fig. 9 respectively show schematic block diagrams of apparatus for classifying an image and its corresponding regions according to embodiments of the invention. In Fig. 7, the calculation module 400, determination module 410 and classification module 420 can perform the processes of steps 120, 140 and 150 of Fig. 1, respectively. In Fig. 8, the calculation module 500 and determination module 510 can perform the processes of steps 120, 130 and 160 of Fig. 1. In Fig. 9, the calculation module 600 calculates the confidence value of each pixel in the image, and the determination module 610 determines the type of the image based on those confidence values.
Fig. 10 illustrates an exemplary application according to an embodiment of the present invention. In step 700 an image is input. In step 710, the image or its regions are classified and determined by the method of Fig. 1 or by the apparatus of Figs. 7, 8 and 9 according to the invention. In step 720, image-enhancement processing is applied to the image or its regions based on the classification result, and an improved image is finally obtained. For example, suppose the target type is blue sky and an input colour image is to be processed. First, the image is scaled by step 110 of Fig. 1. The confidence values of the image pixels are calculated by step 120 of Fig. 1 (such a value might be, for example, 0.1), and the types of the pixels are determined by step 130; some pixels are thereby classified as blue-sky pixels and the others as non-blue-sky pixels. In step 160 of Fig. 1, the region formed by the blue-sky pixels is classified as the blue-sky region, and the region formed by the non-blue-sky pixels as the non-blue-sky region. In step 140 of Fig. 1, the image feature is obtained; suppose its values are v1…v6. In step 150 of Fig. 1, the type of the image is determined from the image feature values v1…v6; for example, the image is determined to be a blue-sky image. Then, in step 720 of Fig. 10, image-enhancement processing for blue-sky images is applied to improve the input image (for example, the enhancement may make the blue sky in the image deeper).
Although specific embodiments of the present invention have been disclosed, those skilled in the art will appreciate that changes can be made to the specific embodiments without departing from the spirit and scope of the invention. Accordingly, the scope of the invention is not limited to the specific embodiments, and the appended claims are intended to cover any and all such applications, modifications and embodiments within the scope of the invention.

Claims (64)

1. A method for determining an image region of an image, comprising the steps of:
calculating confidence values of pixels in the image;
determining the types of the pixels from the confidence values; and
determining the image region formed by pixels of the same type.
2. The method according to claim 1, wherein in the calculating step, the confidence value of a pixel in the image is calculated using the pixels in a neighbouring region of that pixel.
3. The method according to claim 2, wherein the pixels in the image are every pixel in the image or pixels selected from the image.
4. The method according to claim 2 or 3, wherein the step of calculating the confidence value further comprises the steps of:
calculating a colour feature of the pixel in the image;
calculating a texture feature of the neighbouring region; and
calculating the confidence value of the pixel in the image based on the colour and texture features.
5. The method according to claim 4, wherein the step of calculating the colour feature further comprises calculating the colour feature of the pixels in the neighbouring region.
6. The method according to claim 4, wherein the texture feature is a wavelet feature.
7. The method according to claim 4, wherein the colour feature comprises at least one of a mean, a standard variance and a covariance of each colour component.
8. The method according to claim 6, wherein the wavelet feature comprises at least one of a mean and a standard variance of the grey values of each of a plurality of regions obtained by a wavelet transform of the image.
9. The method according to claim 1, wherein in the calculating step, the confidence value of the pixel in the image is calculated by applying a decision function to the colour and texture features.
10. The method according to claim 9, wherein the decision function is determined by the Adaboost algorithm.
11. A method for classifying an image into a plurality of types, comprising the steps of:
calculating confidence values of pixels in the image;
determining an image feature from the confidence values and positional information of the pixels in the image; and
classifying the image into a plurality of types by the image feature.
12. The method according to claim 11, wherein in the calculating step, the confidence value of a pixel in the image is calculated using the pixels in a neighbouring region of that pixel.
13. The method according to claim 12, wherein the pixels in the image are every pixel in the image or pixels selected from the image.
14. The method according to claim 12 or 13, wherein the step of calculating the confidence value further comprises the steps of:
calculating a colour feature of the pixel in the image;
calculating a texture feature of the neighbouring region; and
calculating the confidence value of the pixel in the image based on the colour and texture features.
15. The method according to claim 14, wherein the step of calculating the colour feature further comprises calculating the colour feature of the pixels in the neighbouring region.
16. The method according to claim 14, wherein the texture feature is a wavelet feature.
17. The method according to claim 14, wherein the colour feature comprises at least one of a mean, a standard variance and a covariance of each colour component.
18. The method according to claim 16, wherein the wavelet feature comprises at least one of a mean and a standard variance of the grey values of each of a plurality of regions obtained by a wavelet transform of the image.
19. The method according to claim 11, wherein in the calculating step, the confidence value of the pixel in the image is calculated by applying a decision function to the colour and texture features.
20. The method according to claim 19, wherein the decision function is determined by the Adaboost algorithm.
21. The method according to claim 11, wherein the step of classifying the type of the image by the image feature further comprises applying a decision function to the image feature.
22. The method according to claim 21, wherein the decision function is determined by the Adaboost algorithm.
23. A method for determining the type of an image, comprising the steps of:
calculating confidence values of pixels in the image;
determining the type of the image based on the confidence values of the pixels in the image.
24. The method according to claim 23, wherein in the calculating step, the confidence values are calculated using the pixels in a neighbouring region of a pixel in the image.
25. The method according to claim 24, wherein the pixels in the image are every pixel in the image or pixels selected from the image.
26. The method according to claim 24 or 25, wherein the step of calculating the confidence value further comprises the steps of:
calculating a colour feature of the pixel in the image;
calculating a texture feature of the neighbouring region; and
calculating the confidence value of the pixel in the image based on the colour and texture features.
27. The method according to claim 26, wherein the step of calculating the colour feature further comprises calculating the colour feature of the pixels in the neighbouring region.
28. The method according to claim 26, wherein the texture feature is a wavelet feature.
29. The method according to claim 26, wherein the colour feature comprises at least one of a mean, a standard variance and a covariance of each colour component.
30. The method according to claim 28, wherein the wavelet feature comprises at least one of a mean and a standard variance of the grey values of each of a plurality of regions obtained by a wavelet transform of the image.
31. The method according to claim 23, wherein in the calculating step, the confidence value of the pixel in the image is calculated by applying a decision function to the colour and texture features.
32. The method according to claim 31, wherein the decision function is determined by the Adaboost algorithm.
33. An apparatus for determining an image region of an image, comprising:
a computing module for calculating a confidence value of a pixel in the image; and
a determination module for determining, from the confidence value, the type of the pixel and the image region formed by pixels of the same type.
34. The apparatus according to claim 33, wherein the computing module uses pixels in a neighboring region of the pixel in the image to calculate the confidence value of the pixel in the image.
35. The apparatus according to claim 34, wherein the pixel in the image is each pixel in the image or a pixel selected from the image.
36. The apparatus according to claim 34 or 35, wherein calculating the confidence value by the computing module further comprises the steps of:
calculating a color feature of the pixel in the image;
calculating a texture feature of the neighboring region; and
calculating the confidence value of the pixel in the image based on the color and texture features.
37. The apparatus according to claim 36, wherein the step of calculating the color feature further comprises calculating the color features of the pixels in the neighboring region.
38. The apparatus according to claim 36, wherein the texture feature is a wavelet feature.
39. The apparatus according to claim 36, wherein the color feature comprises at least one of a mean value, a standard deviation value and a covariance value of each color component.
40. The apparatus according to claim 38, wherein the wavelet feature comprises at least one of a mean value and a standard deviation value of the gray-scale values of each of a plurality of regions obtained by a wavelet transform of the image.
41. The apparatus according to claim 33, wherein the computing module calculates the confidence value of the pixel in the image by applying a decision function to the color and texture features.
42. The apparatus according to claim 41, wherein the decision function is determined by the Adaboost algorithm.
43. An apparatus for classifying an image into one of a plurality of types, comprising:
a computing module for calculating confidence values of pixels in the image;
a determination module for determining an image feature from the confidence values and positional information of the pixels in the image; and
a classification module for classifying the image into one of the plurality of types by the image feature.
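Claim 43 combines per-pixel confidence values with positional information to form an image feature, but does not say how. One plausible (assumed, not the patent's own) realization is to pool the confidence map over a grid of blocks, so that pixel position enters the feature vector:

```python
def block_pooled_feature(conf_map, blocks=2):
    """Pool a per-pixel confidence map into a blocks x blocks grid of
    mean confidences, so pixel position contributes to the image
    feature. One illustrative reading of claim 43."""
    h, w = len(conf_map), len(conf_map[0])
    bh, bw = h // blocks, w // blocks
    feat = []
    for by in range(blocks):
        for bx in range(blocks):
            vals = [conf_map[y][x]
                    for y in range(by * bh, (by + 1) * bh)
                    for x in range(bx * bw, (bx + 1) * bw)]
            feat.append(sum(vals) / len(vals))
    return feat

conf_map = [[1.0, 1.0, 0.0, 0.0],
            [1.0, 1.0, 0.0, 0.0],
            [0.5, 0.5, 0.2, 0.2],
            [0.5, 0.5, 0.2, 0.2]]
feat = block_pooled_feature(conf_map)  # block-mean confidences, row-major
```

The pooled vector could then be passed to the decision function of claims 53–54.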
44. The apparatus according to claim 43, wherein the computing module uses pixels in a neighboring region of the pixel in the image to calculate the confidence value of the pixel in the image.
45. The apparatus according to claim 44, wherein the pixel in the image is each pixel in the image or a pixel selected from the image.
46. The apparatus according to claim 44 or 45, wherein calculating the confidence value by the computing module further comprises the steps of:
calculating a color feature of the pixel in the image;
calculating a texture feature of the neighboring region; and
calculating the confidence value of the pixel in the image based on the color and texture features.
47. The apparatus according to claim 46, wherein the step of calculating the color feature further comprises calculating the color features of the pixels in the neighboring region.
48. The apparatus according to claim 46, wherein the texture feature is a wavelet feature.
49. The apparatus according to claim 46, wherein the color feature comprises at least one of a mean value, a standard deviation value and a covariance value of each color component.
50. The apparatus according to claim 48, wherein the wavelet feature comprises at least one of a mean value and a standard deviation value of the gray-scale values of each of a plurality of regions obtained by a wavelet transform of the image.
51. The apparatus according to claim 43, wherein the computing module calculates the confidence value of the pixel in the image by applying a decision function to the color and texture features.
52. The apparatus according to claim 51, wherein the decision function is determined by the Adaboost algorithm.
53. The apparatus according to claim 43, wherein the classification module classifies the type of the image by applying a decision function to the image feature.
54. The apparatus according to claim 53, wherein the decision function is determined by the Adaboost algorithm.
55. An apparatus for determining a type of an image, comprising:
a computing module for calculating confidence values of pixels in the image; and
a determination module for determining the type of the image based on the confidence values of the pixels in the image.
56. The apparatus according to claim 55, wherein the computing module uses pixels in a neighboring region of the pixel in the image to calculate the confidence value of the pixel in the image.
57. The apparatus according to claim 56, wherein the pixel in the image is each pixel in the image or a pixel selected from the image.
58. The apparatus according to claim 56 or 57, wherein calculating the confidence value by the computing module further comprises the steps of:
calculating a color feature of the pixel in the image;
calculating a texture feature of the neighboring region; and
calculating the confidence value of the pixel in the image based on the color and texture features.
59. The apparatus according to claim 58, wherein the step of calculating the color feature further comprises calculating the color features of the pixels in the neighboring region.
60. The apparatus according to claim 58, wherein the texture feature is a wavelet feature.
61. The apparatus according to claim 58, wherein the color feature comprises at least one of a mean value, a standard deviation value and a covariance value of each color component.
62. The apparatus according to claim 60, wherein the wavelet feature comprises at least one of a mean value and a standard deviation value of the gray-scale values of each of a plurality of regions obtained by a wavelet transform of the image.
63. The apparatus according to claim 55, wherein the computing module calculates the confidence value of the pixel in the image by applying a decision function to the color and texture features.
64. The apparatus according to claim 63, wherein the decision function is determined by the Adaboost algorithm.
CN 200710136257 2007-07-12 2007-07-12 Method and apparatus for confirming image area and classifying image Expired - Fee Related CN101344928B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 200710136257 CN101344928B (en) 2007-07-12 2007-07-12 Method and apparatus for confirming image area and classifying image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 200710136257 CN101344928B (en) 2007-07-12 2007-07-12 Method and apparatus for confirming image area and classifying image

Publications (2)

Publication Number Publication Date
CN101344928A true CN101344928A (en) 2009-01-14
CN101344928B CN101344928B (en) 2013-04-17

Family

ID=40246929

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 200710136257 Expired - Fee Related CN101344928B (en) 2007-07-12 2007-07-12 Method and apparatus for confirming image area and classifying image

Country Status (1)

Country Link
CN (1) CN101344928B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101894297A * 2009-05-22 2010-11-24 Sony Corporation Discrimination device, discrimination method and computer program
CN102781210A * 2011-05-09 2012-11-14 Fuji Machine Mfg. Co., Ltd. Method for generating reference mark model template
CN106056549A * 2016-05-26 2016-10-26 Guangxi Normal University Hidden image restoration method based on pixel classification
CN106886783A * 2017-01-20 2017-06-23 Tsinghua University Image retrieval method and system based on regional features
CN106951916A * 2016-08-31 2017-07-14 Huizhou University Potato quality grading method based on a multiresolution algorithm and the Adaboost algorithm

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1317673C * 2004-03-18 2007-05-23 Primax Electronics Ltd. System and method for distinguishing words and graphics in an image using a neural network

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101894297A * 2009-05-22 2010-11-24 Sony Corporation Discrimination device, discrimination method and computer program
CN102781210A * 2011-05-09 2012-11-14 Fuji Machine Mfg. Co., Ltd. Method for generating reference mark model template
CN102781210B * 2011-05-09 2016-10-05 Fuji Machine Mfg. Co., Ltd. Method for generating reference mark model template
CN106056549A * 2016-05-26 2016-10-26 Guangxi Normal University Hidden image restoration method based on pixel classification
CN106056549B * 2016-05-26 2018-12-21 Guangxi Normal University Hidden image restoration method based on pixel classification
CN106951916A * 2016-08-31 2017-07-14 Huizhou University Potato quality grading method based on a multiresolution algorithm and the Adaboost algorithm
CN106886783A * 2017-01-20 2017-06-23 Tsinghua University Image retrieval method and system based on regional features

Also Published As

Publication number Publication date
CN101344928B (en) 2013-04-17

Similar Documents

Publication Publication Date Title
CN108830188B (en) Vehicle detection method based on deep learning
DE112016005059B4 (en) Subcategory-aware convolutional neural networks for object detection
CN105046196B Structured output method for front-vehicle information based on cascaded convolutional neural networks
CN103049763B (en) Context-constraint-based target identification method
CN112016605B (en) Target detection method based on corner alignment and boundary matching of bounding box
CN104680127A (en) Gesture identification method and gesture identification system
CN107610114A Cloud, snow and fog detection method for optical satellite remote sensing images based on support vector machines
CN111126224A (en) Vehicle detection method and classification recognition model training method
CN102968637A (en) Complicated background image and character division method
CN107392968B Image saliency detection method fusing a color contrast map and a color spatial distribution map
CN103544484A (en) Traffic sign identification method and system based on SURF
CN112036231B (en) Vehicle-mounted video-based lane line and pavement indication mark detection and identification method
CN104573685A (en) Natural scene text detecting method based on extraction of linear structures
CN104657980A Improved Meanshift-based multi-channel image segmentation algorithm
CN101344928B (en) Method and apparatus for confirming image area and classifying image
CN110929746A (en) Electronic file title positioning, extracting and classifying method based on deep neural network
CN104966047A (en) Method and device for identifying vehicle license
CN110689091A Weakly supervised fine-grained object classification method
DK2447884T3 (en) A method for the detection and recognition of an object in an image and an apparatus and a computer program therefor
CN112906550A (en) Static gesture recognition method based on watershed transformation
CN111126401A (en) License plate character recognition method based on context information
CN108664969A Landmark identification method based on conditional random field
CN110633635A (en) ROI-based traffic sign board real-time detection method and system
CN109213886B (en) Image retrieval method and system based on image segmentation and fuzzy pattern recognition
CN107368847B (en) Crop leaf disease identification method and system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130417

Termination date: 20170712

CF01 Termination of patent right due to non-payment of annual fee