CN104392240A - Parasite egg identification method based on multi-feature fusion


Info

Publication number
CN104392240A
Authority
CN
China
Prior art keywords
worm
pixel
ovum
image
refers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410587222.8A
Other languages
Chinese (zh)
Inventor
沈海默
陈韶红
陈家旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National Institute of Parasitic Diseases of Chinese Center for Disease Control and Prevention
Original Assignee
National Institute of Parasitic Diseases of Chinese Center for Disease Control and Prevention
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National Institute of Parasitic Diseases of Chinese Center for Disease Control and Prevention filed Critical National Institute of Parasitic Diseases of Chinese Center for Disease Control and Prevention
Priority to CN201410587222.8A priority Critical patent/CN104392240A/en
Publication of CN104392240A publication Critical patent/CN104392240A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/752 - Contour matching
    • G06V10/757 - Matching configurations of points or features

Abstract

Provided is a parasite egg identification method based on multi-feature fusion. The method comprises: an image preprocessing step, in which brightness normalization and Gaussian-filter-based sharpening are applied to image information acquired by micro-photographic equipment to obtain an image with sharpened egg edges; segmentation of the target picture with a mean-shift algorithm to obtain the regions judged to be eggs; binarization of each candidate edge region according to the established edge-region information of the parasite egg shapes to be identified, followed by target acquisition with a boundary-tracking algorithm along the boundary of the egg region to obtain the segmented egg image; extraction of specified feature values from the egg image and storage in a preset feature database; and a classification step in which the acquired feature values are compared against a general database with a relative-distance KNN (k=3) algorithm and the egg category is judged by the KNN classifier. Egg identification accuracy exceeds 90%, a relatively satisfactory result.

Description

A parasite egg recognition method based on multi-feature fusion
Technical field:
The invention belongs to the technical field of image recognition, and in particular relates to an egg recognition method, specifically a parasite egg recognition method based on multi-feature fusion.
Background technology:
Parasitic diseases remain one of the global public health problems, and microscopic examination of eggs is one of the key prevention techniques as well as a basic link in parasite morphological analysis and subsequent biological research. Unlike blood cell analysis, the identification of parasite eggs cannot be carried out by automatic instruments; for a long time it has relied on observation and discrimination by the human eye under the microscope. Distinguishing the different eggs among numerous parasite samples is tedious work and also requires special training of technicians. The current approach of manual smear preparation followed by naked-eye discrimination under the microscope is not only cumbersome; the identification error varies with the experience and condition of the reviewer, objectivity and accuracy are lacking, and the specimen images, data and results are not easy to store, reproduce and retrieve, so the approach cannot meet the demands of modern medical informatization. It is therefore necessary to use computer technology to assist the identification of parasite eggs.
In 1995, Kong Xiangwei of the Department of Electronics at Dalian University of Technology studied a microcomputer system for detecting and identifying worm eggs under the microscope, with a correct recognition rate close to 92.3%. In 1997, Zhao Yashu carried out automatic recognition research on 10 kinds of parasite egg images, extracting four features of the egg region (perimeter, area, circularity and density) for identification, with a correct recognition rate of 92%. In 2002, Fu Chengyan et al. of Zhongshan University developed a system for 7 kinds of adult fluke specimens, extracting 13 corresponding morphological features for classification, with a recognition accuracy of 89.04%; the images, however, needed to be preprocessed with Photoshop, AutoCAD and similar tools. In 2004, Guo Xiaomin used wavelet decomposition to extract wavelet transform coefficient features of egg images and selected a probabilistic neural network to classify the eggs. In 2005, Li Junfeng built a classifier using a hierarchical tree principle combined with minimum-distance classification, the Bayes criterion and artificial neural networks, reaching an accuracy of 94.91%. Also in 2005, Peng Shexin of Hunan University developed a parasite identification system with a recognition rate of up to 93.0%. In 2007, Luo Zeju, Song Lihong et al. proposed a new image feature extraction method and used SVM to automatically identify and classify nine kinds of parasite egg images, including schistosome eggs, with a recognition rate of 93.9%.
Abroad, in 1996 Sommer C. of the Copenhagen veterinary laboratory in Denmark used the amplitude of the Fourier transform for classification, with a correct recognition rate of 81.5%; in 1998 Sommer C. extracted size, texture and shape features of three kinds of bovine nematode egg images for classification and identification, raising the average correct recognition rate to 91.2%. In 1999, Yang Y.S. et al. of Seoul National University, South Korea, used a total of 52 human parasite egg images of 7 species and applied an artificial neural network to classify and identify 4 extracted morphological features, with a recognition accuracy of 86%. In 2001, Yang et al. increased the number of egg species and images, and the correct recognition rate obtained with the same method rose to 90.3%. In 2000, G. Theodoropoulos et al. of the National Technical University of Athens, Greece, performed digital image analysis on images of five kinds of nematode larvae parasitizing livestock and classified them with 7 effective characteristic parameters, with a correct recognition rate of 91.9%. In 2007, Jane S. Fraga et al. of the University of São Paulo, Brazil, used a Bayes classifier to identify parasitic infection in poultry, with a recognition rate of 85.75%. In the same year, S. Raviraja of Sudan used statistical methods to classify blood images infected with malaria pathogens.
Although researchers at home and abroad have attempted to use computers for the automatic recognition of parasite pathogens, automatic computer identification of parasite egg images still faces many difficulties, mainly in the following respects:
a) there are many parasite species, so it is difficult to find an image preprocessing method applicable to all eggs, and the eggs vary in form and color, making it very difficult to select features that distinguish the various eggs;
b) because of differences in image acquisition equipment and shooting environment, even images of the same egg may differ in the color of the background and of the egg itself, which also affects the recognition result;
c) a parasite egg itself takes different forms at different stages, sometimes with large differences; for example, a roundworm (Ascaris) egg with the protein coat removed looks obviously different in the unfertilized and fertilized stages.
Judging from the available research data, automatic recognition of parasite egg digital images has been studied for less than 15 years; outside China it is still far from clinical application or automatic instrument-based identification and carries a strong "pure research" flavor. The problems are mainly manifested in the following three aspects:
a) the number of species that can be identified is small, often limited to laboratory studies of a few kinds or one class of eggs or adults; such narrowly adapted systems have little clinical value;
b) much manual intervention is needed in the processing pipeline of the recognition systems: some systems require the characteristic parameters to be measured first with common software and entered into a database before classification, rather than being an integrated system; some systems need the target to be selected with the mouse or the starting point of boundary-tracking segmentation to be specified, which is far from the claimed "automatic" identification;
c) the extracted image features cannot accurately reflect the characteristics of the images, so the feature-value ranges of the different identification objects overlap considerably and very complex classification algorithms have to be adopted to improve the recognition rate; compared with the efficient operation of the five-part differential blood cell analyzers currently used in hospitals, even the fastest foreign system, with a processing time of 15 s, still needs improvement.
Therefore, it is necessary to develop an automatic egg image recognition system that can handle the classification and identification of the common human parasite eggs, integrates the various processing steps, and runs faster, so that it can be adapted to clinical application in future automatic instruments.
Summary of the invention:
The invention provides a parasite egg recognition method based on multi-feature fusion, which is intended to solve the technical problems of the prior art: low recognition rate of parasite eggs, few identifiable species and long identification time.
The parasite egg recognition method based on multi-feature fusion of the present invention comprises the following steps:
A) an image preprocessing step, in which the image information obtained by the micro-photographic equipment is subjected to brightness normalization, the normalized image is converted to grayscale to generate a normalized grayscale image, and the whole picture is then sharpened on the basis of Gaussian filtering to obtain an image with sharpened egg edges;
B) a step of finding eggs in the edge-sharpened image by mean shift, in which the target picture is segmented with a Mean-shift algorithm, the color feature vector of the image is obtained, the optimum target region is sought on the basis of the color feature vector, and the region judged to be an egg is obtained;
C) a step of target acquisition from the egg image based on the region determined to be an egg, in which each candidate edge region is binarized according to the established edge-region information of the parasite egg shapes to be identified, and a boundary-tracking algorithm acquires the target along the boundary of the egg region to obtain the segmented egg image;
D) a step of extracting specified feature values from the segmented egg image and storing them in a preset feature database;
E) a multi-algorithm classification and identification step, in which a relative-distance KNN (k=3) algorithm substitutes the obtained feature values into the general database and the egg category is judged by the KNN algorithm.
Further, in the step of finding eggs in the edge-sharpened image by mean shift, a Mean-shift algorithm is used to segment the target. The original image is first divided into an X × Y grid, giving X × Y grid points, and these points are merged: if the Euclidean distance between the color values of two points is smaller than a certain threshold (the threshold is determined from the mean colors of the brightest 5% and the darkest 5% of the image pixels), they are merged into one point. In this way m points are obtained as the initial point set; the picture is the set of n pixels on the X × Y grid, and each pixel can be expressed as an independent variable X_i, i = 1, ..., n. The mean shift M of a sample point is computed as

    M_{h,U}(x) = \frac{h^2}{d+2} \cdot \frac{\hat{\nabla} f_E(x)}{\hat{f}_U(x)}

An initial point is selected at the center of the picture, and the mean shift M_{h,U}(x) is computed in the window S_h(x) centered on this point. If this value is not smaller than a certain threshold, the window S_h(x) is translated by M_{h,U}(x) and the mean shift is recomputed in the new window to obtain a new center, until M_{h,U}(x) is smaller than the threshold; translation then stops and a local density maximum is obtained. The above steps are repeated to obtain the m points corresponding to local density maxima; these points are merged to give the centers of n clusters, i.e. the dominant colors of the original image. Each pixel of the original image is assigned to a cluster according to Euclidean distance, and the dominant-color information is represented as a one-dimensional histogram whose abscissa is the dominant colors and whose ordinate is the proportion of pixels belonging to each dominant color. The color feature vector of the image is thus obtained:

    Q = \{ (P_i, W_i) \mid i = 1, ..., n \}, \quad P_i = (L^*_i, a^*_i, b^*_i), \quad W_i \in (0, 1]

Here W_i is the proportion and P_i is the color value; unlike the traditional RGB component representation, the color values are expressed in Lab components, denoted L_i, a_i, b_i. The traditional EMD algorithm can then be used to find the optimum target region from the color feature vector Q; the general form of the EMD function is

    EMD(P, Q) = \frac{\min \sum_{i=1}^{m} \sum_{j=1}^{n} d(p_i, q_j) f_{ij}}{\sum_{i=1}^{m} \sum_{j=1}^{n} f_{ij}}

and the region whose EMD similarity to the expected center is highest is the target region.
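As a concrete illustration of this stage, the sketch below uses OpenCV in Python: pyrMeanShiftFiltering plus k-means stand in for the grid-point merging and cluster-center step described above, and the function names, the Lab color space and the cluster count are illustrative assumptions rather than the patent's own implementation.

```python
import cv2
import numpy as np

def dominant_color_signature(bgr, n_colors=8, spatial_radius=10, color_radius=20):
    """Illustrative stand-in for the mean-shift stage: smooth the image with
    mean-shift filtering, merge the converged colors into n_colors dominant
    colors, and return an EMD signature of rows (W_i, L_i, a_i, b_i)."""
    smoothed = cv2.pyrMeanShiftFiltering(bgr, spatial_radius, color_radius)
    lab = cv2.cvtColor(smoothed, cv2.COLOR_BGR2Lab).reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(lab, n_colors, None, criteria, 3,
                                    cv2.KMEANS_PP_CENTERS)
    weights = np.bincount(labels.ravel(), minlength=n_colors) / labels.size
    signature = np.hstack([weights.reshape(-1, 1), centers]).astype(np.float32)
    return signature[signature[:, 0] > 0]          # drop empty clusters

def emd_to_template(region_bgr, template_bgr):
    """Smaller EMD means the candidate region's colors are closer to the
    colors expected for an egg."""
    emd, _, _ = cv2.EMD(dominant_color_signature(region_bgr),
                        dominant_color_signature(template_bgr), cv2.DIST_L2)
    return emd
```

Under this sketch, each candidate region would be scored with emd_to_template against an expected egg color signature and the best-scoring region taken as the target.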
Further, in the step of target acquisition from the egg image based on the region determined to be an egg, the target region computed with the above EMD function is used: according to the established edge-region information of the parasite egg shapes to be identified, each candidate edge region is binarized and a boundary-tracking algorithm is used for target acquisition. When the algorithm starts, the pixels are scanned from top to bottom. Let the output sequence array be K. The first target pixel found when searching from the upper left is denoted k0; k0 is then the boundary pixel at the extreme upper-left corner of the region, i.e. the starting point of the search. The search direction is set to counterclockwise over the eight-neighborhood. k0 is marked as tracked and inserted as the first element of the sequence array. The next target pixel is searched counterclockwise and denoted k; if none is found, k0 is an isolated pixel region. If k equals the starting boundary pixel k0, the algorithm checks in order whether there are boundary pixels in other neighboring directions that have not yet been traced; if there are none, it returns to the starting point and terminates. The boundary pixels in the sequence K form a closed region that encloses the target area.
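A minimal sketch of such a boundary tracer, written as plain Moore-neighbor contour following on a binary mask; the direction order, the backtracking rule and the stop test are simplified assumptions of this illustration rather than the exact rules of the patent.

```python
import numpy as np

def trace_boundary(binary):
    """Moore-neighbor boundary tracing on a binary mask (1 = object).
    The raster scan for k0, the circular 8-neighborhood search and the
    closed output sequence K mirror the description above."""
    rows, cols = binary.shape
    start = None
    for r in range(rows):                       # top-to-bottom scan for the first target pixel k0
        for c in range(cols):
            if binary[r, c]:
                start = (r, c)
                break
        if start:
            break
    if start is None:
        return []                               # no object pixel at all
    # 8-neighborhood offsets visited in a fixed circular order (row, col).
    offsets = [(0, -1), (1, -1), (1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0), (-1, -1)]
    sequence = [start]                          # the sequence array K
    cur, search_from = start, 0
    for _ in range(4 * rows * cols):            # safety bound on the walk length
        for i in range(8):
            d = (search_from + i) % 8
            nr, nc = cur[0] + offsets[d][0], cur[1] + offsets[d][1]
            if 0 <= nr < rows and 0 <= nc < cols and binary[nr, nc]:
                sequence.append((nr, nc))
                cur = (nr, nc)
                search_from = (d + 6) % 8       # resume the next search near the backtrack direction
                break
        else:
            return sequence                     # isolated pixel: no neighbor found
        if cur == start:                        # returned to k0: closed contour
            return sequence
    return sequence
```

In practice an equivalent closed boundary can also be obtained with OpenCV's cv2.findContours.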
Further, in the step of extracting specified feature values from the segmented egg image and storing them in the preset feature database, the feature values of the egg image are obtained first (a sketch of this extraction is given after the list):
1) the minimum bounding rectangle of the edge region is obtained, and counting its pixels gives the length and the width, where length refers to the length of the object's bounding rectangle and width to its width;
2) counting the pixels of the target region and its surrounding area gives the area and perimeter; the ratio of this area to the minimum bounding rectangle is the ovality, which describes the area ratio of the object to its circumscribed ellipse;
3) area refers to the object area, and perimeter refers to the object perimeter;
4) the RGB components are obtained from the color configuration of the target region; converting the picture to grayscale gives the statistical histogram of the gray values, whose mean is the gray value; converting the target to HSL space gives the HSL components. Average gray (grey) is the mean color of the object after grayscale conversion; since the computer represents color as an RGB combination, the average red component (red), average green component (green) and average blue component (blue) are the mean values of the R, G and B parts respectively; average hue (color), average saturation (saturation) and average brightness (bright) are the mean values of the H, S and L parts after converting the RGB color model to the HSL color model; the statistical histogram of gray values is the vector obtained by counting the distribution of the gray values 0-255 in intervals; the gray standard deviation (greyscale) describes the variation of the local colors of the object; the color weight (weighted) is the ratio of the average hue, generated automatically from the pixel positions under the RGB representation, to the position coordinates, and this value is used only for error correction and does not take part in the computation;
5) after the feature values are obtained, they are entered into the preset file-based database and loaded in table form for the subsequent classification and identification algorithm.
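The sketch below shows how these feature values could be computed with OpenCV and NumPy; the feature names follow the list above, while the exact ovality formula, the choice of the HLS color space and the OpenCV 4 findContours signature are assumptions of this illustration.

```python
import cv2
import numpy as np

def egg_features(bgr, mask):
    """Feature-value extraction on the segmented egg region.
    `mask` is the binary region produced by the boundary-tracking step."""
    contours, _ = cv2.findContours(mask.astype(np.uint8), cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    cnt = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(cnt)          # minimum upright bounding rectangle
    area = cv2.contourArea(cnt)
    perimeter = cv2.arcLength(cnt, True)
    region = mask.astype(bool)
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    hls = cv2.cvtColor(bgr, cv2.COLOR_BGR2HLS)  # channels: hue, lightness, saturation
    b, g, r = cv2.split(bgr)
    return {
        "length": max(w, h), "width": min(w, h),
        "ovality": area / float(w * h),          # area ratio to the bounding rectangle
        "area": area, "perimeter": perimeter,
        "grey": float(gray[region].mean()),
        "red": float(r[region].mean()),
        "green": float(g[region].mean()),
        "blue": float(b[region].mean()),
        "color": float(hls[..., 0][region].mean()),       # average hue
        "bright": float(hls[..., 1][region].mean()),      # average lightness
        "saturation": float(hls[..., 2][region].mean()),  # average saturation
        "greyscale": float(gray[region].std()),           # gray standard deviation
    }
```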
Further, in the multi-algorithm classification and identification step, a KNN algorithm based on relative distance is adopted; its steps are as follows (a sketch is given after this paragraph):
First, to avoid the attribute value ranges affecting the distance computation, for the i-th attribute value X[i] of each sample in the feature-value database the maximum Max[i] and the minimum Min[i] are computed, and the normalization X[i] = (X[i] - Min[i]) / (Max[i] - Min[i]) is applied, so that after normalization each attribute of a sample has the value range [0, 1]. A data set D = {X1, ..., XL} is then built from the feature-value database, where X_i ∈ R^n, i = 1, ..., L. Suppose the samples fall into ClassNum classes, and let C_i denote the set of all samples of the i-th class, with C_i ∩ C_j = Ø (i, j = 1, ..., ClassNum); the sample set can then also be written D = C_1 ∪ C_2 ∪ ... ∪ C_ClassNum.
Let Dist be the distance between two egg samples. The data set D has m attributes and is structured as R(A_1, A_2, ..., A_m). For two samples X and Y of D, their distance is measured by

    Dist(X, Y) = \sqrt{ \sum_{i=1}^{m} (X.x_i - Y.y_i)^2 }

The average K-nearest-neighbor distance of the i-th class for a test sample is

    Avgdis(i) = \frac{ \sum_{j=1}^{k_i} Dist(X_j, Y) }{ k_i }, \quad X_j \in C_i, \; i = 1, ..., ClassNum

where k_i is the number of samples in C_i and Y is the nearest neighbor of X_j. The relative distance between a test sample X and a training sample Y is D = Dist(X, Y) / Avgdis(i), Y ∈ C_i.
With k = 3, it suffices to compute the distance from the test sample to every sample of the data set and to compare the 3 nearest neighbors of the test sample to determine its class. The classification result is expressed by a score: if the features of the input image are (f1, f2, ..., fn) and the features of a sample in the database are (x11, x12, ..., x1n), then score = s(f1, x11) * s(f2, x12) * ... * s(fn, x1n).
The function s is constructed as follows: let maxV = max(f1, x11) and minV = min(f1, x11); then diff = (maxV - minV) / maxV and s = X * (e^(-diff) - 1/e) + B, where e is the base of the natural logarithm and X = (A - B) * e / (e - 1), so that A is the maximum value: if diff = 0, s takes the maximum value A; if diff = 1, s takes the minimum value B.
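The following sketch implements the normalization, the relative-distance 3-NN rule and the per-attribute score s described above. It assumes `train` is an L×m array of normalized feature rows and `labels` an integer NumPy array of class ids; the rule that combines the three neighbors and the constants A and B are illustrative assumptions, since the text only fixes s at diff = 0 and diff = 1.

```python
import numpy as np

def normalize_columns(features):
    """Min-max normalization X[i] = (X[i] - Min[i]) / (Max[i] - Min[i]) per attribute."""
    mins, maxs = features.min(axis=0), features.max(axis=0)
    span = np.where(maxs > mins, maxs - mins, 1.0)
    return (features - mins) / span, mins, span

def attribute_score(f, x, A=1.0, B=0.1):
    """Per-attribute similarity: s = X * (e^(-diff) - 1/e) + B with
    X = (A - B) * e / (e - 1), so s = A when diff = 0 and s = B when diff = 1."""
    max_v, min_v = max(f, x), min(f, x)
    diff = 0.0 if max_v == 0 else (max_v - min_v) / max_v
    X = (A - B) * np.e / (np.e - 1.0)
    return X * (np.exp(-diff) - 1.0 / np.e) + B

def classify_knn3(sample, train, labels):
    """Relative-distance 3-NN: divide each of the 3 nearest neighbors' Euclidean
    distances by the average 3-NN distance of its own class and return the class
    with the smallest relative distance."""
    dists = np.sqrt(((train - sample) ** 2).sum(axis=1))
    best_rel, best_cls = np.inf, None
    for idx in np.argsort(dists)[:3]:
        cls = labels[idx]
        avgdis = np.sort(dists[labels == cls])[:3].mean()
        rel = dists[idx] / avgdis if avgdis > 0 else dists[idx]
        if rel < best_rel:
            best_rel, best_cls = rel, cls
    return best_cls
```

The overall score of the input against a database sample would then be the product of attribute_score over all attributes, as described above.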
Further, before the Mean-shift algorithm is used to segment the target, the color histogram of the picture needs to be precomputed.
Further, if breakpoints appear in the image because of problems such as excessive normalization, so that boundary points are lost and the region frame or the region pixel set cannot be built, a region-growing algorithm is used as a remedy. The concrete steps of the region-growing algorithm are: first, for each region to be segmented, find a seed pixel as the starting point of growth; then merge into this region the pixels in the neighborhood of the seed that have the same or similar properties as the seed; these new pixels are in turn used as new seeds and the process is continued until no further pixel satisfying the condition can be included (see the sketch below).
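A minimal region-growing sketch on a grayscale image, assuming 8-connectivity and a fixed gray-value tolerance relative to the seed; both choices are illustrative, since the text only requires "the same or similar properties".

```python
import numpy as np
from collections import deque

def region_grow(gray, seed, tolerance=10):
    """Grow a region from `seed` by absorbing 8-connected neighbors whose gray
    value is within `tolerance` of the seed, until no further pixel qualifies."""
    rows, cols = gray.shape
    region = np.zeros((rows, cols), dtype=bool)
    seed_val = int(gray[seed])
    queue = deque([seed])
    region[seed] = True
    while queue:
        r, c = queue.popleft()
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (0 <= nr < rows and 0 <= nc < cols and not region[nr, nc]
                        and abs(int(gray[nr, nc]) - seed_val) <= tolerance):
                    region[nr, nc] = True
                    queue.append((nr, nc))
    return region
```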
Further, the parasite is Clonorchis sinensis, Taenia tapeworm, whipworm, pinworm, roundworm, hookworm, fish tapeworm, Schistosoma japonicum, Fasciolopsis buski, lung fluke or Spirometra mansoni.
Further, the processing of the image to be identified is rotation-invariant and insensitive to illumination differences and to individual differences.
Specifically, the feature database refers to the feature values of the egg part of the image, such as length, width, ovality, area, perimeter, average gray (grey), average red component (red), average green component (green), average blue component (blue), average hue (color), average saturation (saturation), average brightness (bright), gray standard deviation (greyscale) and color weight (weighted); see the attached table for the database format.
Specifically, the general database refers to the overall database composed of the feature databases of parasites with known parameters.
The invention also provides a parasite egg recognition method based on multi-feature fusion, characterized by comprising the following steps:
A) an image preprocessing step, in which the image information obtained by the micro-photographic equipment is subjected to brightness normalization, the normalized image is converted to grayscale to generate a normalized grayscale image, and the whole picture is then sharpened on the basis of Gaussian filtering to obtain an image with sharpened egg edges;
B) a step of finding eggs with human-assisted identification, in which an enhanced Grab Cut method is used to segment the edge-sharpened egg image, the user provides a restriction box as manual assistance, the color feature vector of the egg image is obtained, the optimum target region is sought on the basis of the color feature vector, and the region judged to be an egg is obtained;
C) a step of target acquisition from the egg image based on the region determined to be an egg, in which each candidate edge region is binarized according to the established edge-region information of the parasite egg shapes to be identified, and a boundary-tracking algorithm acquires the target along the boundary of the egg region to obtain the segmented egg image;
D) a step of extracting specified feature values from the segmented egg image and storing them in a preset feature database;
E) a multi-algorithm classification and identification step, in which a relative-distance KNN (k=3) algorithm substitutes the obtained feature values into the general database composed of the feature databases of the 11 parasite species, and the egg category is judged by the KNN algorithm.
Further, in the step of finding eggs with human-assisted identification, an enhanced Grab Cut method is used to segment the edge-sharpened egg image. The user provides a restriction box as manual assistance, and the part outside the box is not processed. The user initializes the trimap T by setting the background region T_B; the foreground region T_F is set to empty and the unknown region T_U is set to the complement of the background region T_B. For all pixels of the background region the alpha value is set to 0, i.e. α = 0; for the pixels of the unknown region the alpha value is set to 1, i.e. α = 1. Gaussian mixture models (GMMs) for the foreground and the background are initialized from the two sets α = 0 and α = 1 respectively, and for each pixel n of the unknown region the GMM component is assigned by

    k_n = \arg\min_{k_n} D_n(\alpha_n, k_n, \theta, z_n),

the GMM parameters are estimated from the data of each pixel of the image by

    \theta = \arg\min_{\theta} U(\alpha, k, \theta, z),

and the initial segmentation is obtained with the energy-minimization formula

    \min_{k} E(\alpha, k, \theta, z).

This is repeated 3 times, followed by border optimization.
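As an illustration of this human-assisted step, the sketch below uses OpenCV's built-in GrabCut with rectangle initialization; the rectangle plays the role of the user's restriction box, and the function name, the three-iteration default and the foreground-mask extraction are assumptions of this sketch rather than the patent's own implementation.

```python
import cv2
import numpy as np

def assisted_egg_mask(bgr, rect, iterations=3):
    """Human-assisted segmentation via GrabCut: the user-drawn rectangle
    rect = (x, y, w, h) marks everything outside it as background, and the
    GMM/energy-minimization iterations are repeated `iterations` times."""
    mask = np.zeros(bgr.shape[:2], dtype=np.uint8)
    bgd_model = np.zeros((1, 65), np.float64)   # background GMM parameters
    fgd_model = np.zeros((1, 65), np.float64)   # foreground GMM parameters
    cv2.grabCut(bgr, mask, rect, bgd_model, fgd_model, iterations,
                cv2.GC_INIT_WITH_RECT)
    # Pixels labelled definite or probable foreground form the candidate egg region.
    return np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 1, 0).astype(np.uint8)
```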
Further, in the step of target acquisition from the egg image based on the region determined to be an egg, each candidate edge region is binarized according to the established edge-region information of the parasite egg shapes to be identified, and a boundary-tracking algorithm is used for target acquisition. When the algorithm starts, the pixels are scanned from top to bottom. Let the output sequence array be K. The first target pixel found when searching from the upper left is denoted k0; k0 is then the boundary pixel at the extreme upper-left corner of the region, i.e. the starting point of the search. The search direction is set to counterclockwise over the eight-neighborhood. k0 is marked as tracked and inserted as the first element of the sequence array. The next target pixel is searched counterclockwise and denoted k; if none is found, k0 is an isolated pixel region. If k equals the starting boundary pixel k0, the algorithm checks in order whether there are boundary pixels in other neighboring directions that have not yet been traced; if there are none, it returns to the starting point and terminates. The boundary pixels in the sequence K form a closed region that encloses the target area.
Further, in the step of extracting specified feature values from the acquired target and storing them in the feature database, the feature values of the egg image are obtained first:
1) the minimum bounding rectangle of the edge region is obtained, and counting its pixels gives the length and the width, where length refers to the length of the object's bounding rectangle and width to its width;
2) counting the pixels of the target region and its surrounding area gives the area and perimeter; the ratio of this area to the minimum bounding rectangle is the ovality, which describes the area ratio of the object to its circumscribed ellipse;
3) area refers to the object area, and perimeter refers to the object perimeter;
4) the RGB components are obtained from the color configuration of the target region; converting the picture to grayscale gives the statistical histogram of the gray values, whose mean is the gray value; converting the target to HSL space gives the HSL components. Average gray (grey) is the mean color of the object after grayscale conversion; since the computer represents color as an RGB combination, the average red component (red), average green component (green) and average blue component (blue) are the mean values of the R, G and B parts respectively; average hue (color), average saturation (saturation) and average brightness (bright) are the mean values of the H, S and L parts after converting the RGB color model to the HSL color model; the statistical histogram of gray values is the vector obtained by counting the distribution of the gray values 0-255 in intervals; the gray standard deviation (greyscale) describes the variation of the local colors of the object; the color weight (weighted) is the ratio of the average hue, generated automatically from the pixel positions under the RGB representation, to the position coordinates, and this value is used only for error correction and does not take part in the computation;
5) after the feature values are obtained, they are entered into the preset file-based database and loaded in table form for the subsequent classification and identification algorithm.
Further, in the multi-algorithm classification and identification step, a KNN algorithm based on relative distance is adopted; its steps are as follows:
First, to avoid the attribute value ranges affecting the distance computation, for the i-th attribute value X[i] of each sample in the feature-value database the maximum Max[i] and the minimum Min[i] are computed, and the normalization X[i] = (X[i] - Min[i]) / (Max[i] - Min[i]) is applied, so that after normalization each attribute of a sample has the value range [0, 1]. A data set D = {X1, ..., XL} is then built from the feature-value database, where X_i ∈ R^n, i = 1, ..., L. Suppose the samples fall into ClassNum classes, and let C_i denote the set of all samples of the i-th class, with C_i ∩ C_j = Ø (i, j = 1, ..., ClassNum); the sample set can then also be written D = C_1 ∪ C_2 ∪ ... ∪ C_ClassNum.
Let Dist be the distance between two egg samples. The data set D has m attributes and is structured as R(A_1, A_2, ..., A_m). For two samples X and Y of D, their distance is measured by

    Dist(X, Y) = \sqrt{ \sum_{i=1}^{m} (X.x_i - Y.y_i)^2 }

The average K-nearest-neighbor distance of the i-th class for a test sample is

    Avgdis(i) = \frac{ \sum_{j=1}^{k_i} Dist(X_j, Y) }{ k_i }, \quad X_j \in C_i, \; i = 1, ..., ClassNum

where k_i is the number of samples in C_i and Y is the nearest neighbor of X_j. The relative distance between a test sample X and a training sample Y is D = Dist(X, Y) / Avgdis(i), Y ∈ C_i.
With k = 3, it suffices to compute the distance from the test sample to every sample of the data set and to compare the 3 nearest neighbors of the test sample to determine its class. The classification result is expressed by a score: if the features of the input image are (f1, f2, ..., fn) and the features of a sample in the database are (x11, x12, ..., x1n), then score = s(f1, x11) * s(f2, x12) * ... * s(fn, x1n).
The function s is constructed as follows: let maxV = max(f1, x11) and minV = min(f1, x11); then diff = (maxV - minV) / maxV and s = X * (e^(-diff) - 1/e) + B, where e is the base of the natural logarithm and X = (A - B) * e / (e - 1), so that A is the maximum value: if diff = 0, s takes the maximum value A; if diff = 1, s takes the minimum value B.
Further, if breakpoints appear in the image because of problems such as excessive normalization, so that boundary points are lost, a region-growing algorithm is used as a remedy. The concrete steps of the region-growing algorithm are: first, for each region to be segmented, find a seed pixel as the starting point of growth; then merge into this region the pixels in the neighborhood of the seed that have the same or similar properties as the seed; these new pixels are in turn used as new seeds and the process is continued until no further pixel satisfying the condition can be included.
Further, the parasite is Clonorchis sinensis, Taenia tapeworm, whipworm, pinworm, roundworm, hookworm, fish tapeworm, Schistosoma japonicum, Fasciolopsis buski, lung fluke or Spirometra mansoni.
Further, the processing of the image to be identified is rotation-invariant and insensitive to illumination differences and to individual differences.
The invention provides a technique for automatically segmenting the egg from the image to be identified in order to obtain its feature values. Because of the irregular morphology and spatial orientation of parasite eggs and the presence of impurities in the samples, fully automatic identification is difficult, so the invention also provides a human-assisted means based on the Grab Cut digital matting method. Since the core difficulty of the invention is to segment the object automatically from the egg picture, the system is required to be rotation-invariant with respect to the input image (i.e. insensitive to shooting angle), to be robust to illumination differences and individual differences, and to be extensible to further egg species.
Human parasites can be divided into three major groups: single-celled protozoa, multicellular helminths and arthropods. The more than ten species common in China fall into four classes: nematodes, trematodes (flukes), cestodes (tapeworms) and acanthocephalans. The present invention takes as its predetermined identification objects the eggs of 11 human parasites common in China: Clonorchis sinensis, Taenia tapeworm, whipworm, pinworm, roundworm, hookworm, fish tapeworm, Schistosoma japonicum, Fasciolopsis buski, lung fluke and Spirometra mansoni.
The invention provides an egg classifier based on the KNN algorithm that classifies using the feature values. With the development of computer vision and advanced medical imaging equipment, a single image feature can hardly express the content of a medical image comprehensively and accurately, and multi-feature fusion has become the necessary approach for extracting effective features from medical images. While retaining the original information as far as possible, it overcomes the instability caused by the large volume of raw data, and the extracted fused features can be used effectively for image recognition. The identification of human parasites depends on the shape, size, inclusions, color and special structures of the eggs (such as shell thickness and characteristic shapes). The present invention converts these visual features into 15 quantitatively extracted classification features covering perimeter, area, circularity, color and texture.
Compared with the prior art, the technical effect of the present invention is obvious: (1) the system is insensitive to rotation of the picture (i.e. not sensitive to shooting angle) and has a certain tolerance to illumination differences and to individual differences between eggs; (2) in the mathematical feature extraction stage the system is closely combined with the biological characteristics of the eggs, so it is highly targeted, achieves a higher recognition rate and identifies more accurately; (3) the system provides both fully automatic and semi-automatic segmentation, where the semi-automatic segmentation algorithm adapts well to egg images with complex backgrounds, and the fully automatic segmentation has a high degree of automation, facilitating subsequent work such as automatic testing, statistics and analysis; (4) the current recognition accuracy of the system for the 11 kinds of eggs is above 90%, a relatively satisfactory result.
Brief description of the drawings:
Fig. 1 is the egg image recognition flow based on fully automatic segmentation.
Fig. 2 is the picture after edge sharpening.
Fig. 3 shows the target region and its bounding frame found with the mean-shift algorithm.
Fig. 4 shows the target extracted on the basis of the target region and the boundary-tracking algorithm.
Fig. 5 shows the feature-value database structure and an acquisition example.
Fig. 6 is the egg image recognition flow based on human assistance.
Fig. 7 is the picture after edge sharpening.
Fig. 8 shows the manual-box-assisted segmentation based on the Grab Cut method.
Fig. 9 shows the target extracted on the basis of human assistance and the boundary-tracking algorithm.
Fig. 10 shows the feature-value database structure and an acquisition example.
Embodiments:
The following embodiments use a photomicrograph of a smear containing a Schistosoma (blood fluke) egg.
Embodiment 1
A fully automatic parasite egg recognition method based on multi-feature fusion, comprising the following flow (see Fig. 1 for the flow chart):
A) an image preprocessing step: the image information obtained by the micro-photographic equipment is subjected to brightness normalization, the normalized image is converted to grayscale to generate a normalized grayscale image, and the whole picture is then sharpened on the basis of Gaussian filtering to obtain an image with sharpened egg edges;
B) a step of finding eggs in the edge-sharpened image by mean shift: the target picture is segmented with a Mean-shift algorithm, the color feature vector of the image is obtained, the optimum target region is sought on the basis of the color vector, and the region judged to be an egg is obtained;
C) a step of target acquisition from the egg image based on the region determined to be an egg: on the basis of the above shape segmentation information, i.e. the established edge-region information of the parasite egg shapes to be identified, each candidate edge region is binarized and a boundary-tracking algorithm acquires the target along the boundary of the egg region to obtain the segmented egg image;
D) a step of extracting specified feature values from the segmented egg image and storing them in a preset feature database;
E) a multi-algorithm classification and identification step: in this step of the present invention, a relative-distance KNN (k=3) algorithm substitutes the obtained feature values into the general database comprising the feature databases of the 11 parasite species, and the egg category is judged by the KNN algorithm;
Further, in the image preprocessing step, the image information obtained by the micro-photographic equipment is subjected to brightness normalization, the normalized image is converted to grayscale to generate a normalized grayscale image, and the whole picture is then sharpened on the basis of Gaussian filtering to obtain an image with sharpened egg edges (see Fig. 2 for an example);
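A possible concrete form of this preprocessing step, sketched in Python with OpenCV under the assumption that "sharpening based on Gaussian filtering" is realized as unsharp masking; the function name and the sigma/amount parameters are illustrative, not taken from the patent.

```python
import cv2

def preprocess_egg_image(path, sigma=2.0, amount=1.5):
    """Brightness normalization, grayscale conversion, and Gaussian-based
    sharpening (unsharp masking) of a micro-photographic image."""
    img = cv2.imread(path)
    # Brightness normalization: stretch the value channel to the full 0-255 range.
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    hsv[:, :, 2] = cv2.normalize(hsv[:, :, 2], None, 0, 255, cv2.NORM_MINMAX)
    normalized = cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)
    # Grayscale conversion of the normalized image.
    gray = cv2.cvtColor(normalized, cv2.COLOR_BGR2GRAY)
    # Sharpening based on Gaussian filtering: subtract a blurred copy (unsharp mask).
    blurred = cv2.GaussianBlur(gray, (0, 0), sigma)
    sharpened = cv2.addWeighted(gray, 1.0 + amount, blurred, -amount, 0)
    return normalized, gray, sharpened
```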
Further, in the step of applying mean shift to the edge-sharpened egg image, a Mean-shift algorithm is used to segment the target. The original image is first divided into an X × Y grid, giving X × Y grid points, and these points are merged: if the Euclidean distance between the color values of two points is smaller than a certain threshold (to ensure running speed, in the present invention this threshold is determined from the mean colors of the brightest 5% and the darkest 5% of the image pixels), they are merged into one point. In this way m points are obtained as the initial point set; the picture is the set of n pixels on the X × Y grid, and each pixel can be expressed as an independent variable X_i, i = 1, ..., n. The mean shift M of a sample point is computed as

    M_{h,U}(x) = \frac{h^2}{d+2} \cdot \frac{\hat{\nabla} f_E(x)}{\hat{f}_U(x)}

An initial point is selected at the center of the picture, and the mean shift M_{h,U}(x) is computed in the window S_h(x) centered on this point. If this value is not smaller than a certain threshold, the window S_h(x) is translated by M_{h,U}(x) and the mean shift is recomputed in the new window to obtain a new center, until M_{h,U}(x) is smaller than the threshold; translation then stops and a local density maximum is obtained. The above steps are repeated to obtain the m points corresponding to local density maxima; these points are merged to give the centers of n clusters, i.e. the dominant colors of the original image. Each pixel of the original image is assigned to a cluster according to Euclidean distance, and the dominant-color information is represented as a one-dimensional histogram whose abscissa is the dominant colors and whose ordinate is the proportion of pixels belonging to each dominant color. The color feature vector of this image is thus obtained:

    P = \{ (P_i, W_i) \mid i = 1, ..., n \}, \quad P_i = (L^*_i, a^*_i, b^*_i), \quad W_i \in (0, 1]

The optimum target region is sought on the basis of the color vectors P and Q with the formula

    EMD(P, Q) = \frac{\min \sum_{i=1}^{m} \sum_{j=1}^{n} d(p_i, q_j) f_{ij}}{\sum_{i=1}^{m} \sum_{j=1}^{n} f_{ij}}

and the region whose EMD similarity to the expected center is highest is the target region (see Fig. 3 for an example).
Traditionally, image segmentation comprises four main classes: parallel boundary segmentation, serial boundary segmentation, parallel region segmentation and serial region segmentation. Previous research has pointed out that parallel boundary segmentation is not suitable for images with uniform gray-level changes, and although serial region segmentation algorithms (mainly region growing and related criteria) are popular, their computation load is generally considered large, which does not meet the requirements of the present invention. Threshold segmentation needs a threshold to be set first; the pixels of the image are then compared with the threshold and divided into target and background to complete the segmentation. In parasite egg detection photos, however, because of contamination by impurities, the naturally formed gray-level difference between object and background cannot meet the requirement of segmentation.
The Mean-shift algorithm, on the other hand, lets every pixel find the local maximum of the density function along the shortest path in the complicated probability distribution of a parasite egg picture. Tests show that using the statistical robustness of the Mean-shift algorithm, its fast convergence along the density gradient direction, and the matching of the color histogram algorithm to the target shape can solve the problems of the changeable form of non-rigid targets and the high complexity of tracking.
In the present invention, mean shift refers to a method based on non-parametric kernel density estimation that uses the gradient method to iteratively compute the extreme points of the probability density function. The algorithm is parameter-free, performs fast pattern matching, and is an efficient target-tracking algorithm.
In the present invention, region growing refers to combining pixels with similar properties into a common region. The concrete steps are: first, for each region to be segmented, find a seed pixel as the starting point of growth; then merge into this region the pixels in the neighborhood of the seed that have the same or similar properties as the seed; these new pixels are in turn used as new seeds and the process is continued until no further pixel satisfying the condition can be included.
Further, in the step of target acquisition from the egg image based on the region determined to be an egg, on the basis of the above shape segmentation information, i.e. the established edge-region information of the parasite egg shapes to be identified, each candidate edge region is binarized and a boundary-tracking algorithm is used for target acquisition. When the algorithm starts, the pixels are scanned from top to bottom. Let the output sequence array be K. The first target pixel found when searching from the upper left is denoted k0; k0 is then the boundary pixel at the extreme upper-left corner of the region, i.e. the starting point of the search. The search direction is set to counterclockwise over the eight-neighborhood. k0 is marked as tracked and inserted as the first element of the sequence array. The next target pixel is searched counterclockwise and denoted k; if none is found, k0 is an isolated pixel region. If k equals the starting boundary pixel k0, the algorithm checks in order whether there are boundary pixels in other neighboring directions that have not yet been traced; if there are none, it returns to the starting point and terminates. The boundary pixels in the sequence K form a closed region that encloses the target area (see Fig. 4 for an example);
Further, in the step of extracting specified feature values from the acquired target and storing them in the feature database, the feature values of the egg image are obtained first (see the attached table):
1) the minimum bounding rectangle of the edge region is obtained, and counting its pixels gives the length and the width, where length refers to the length of the object's bounding rectangle and width to its width;
2) counting the pixels of the target region and its surrounding area gives the area and perimeter; the ratio of this area to the minimum bounding rectangle is the ovality, which describes the area ratio of the object to its circumscribed ellipse;
3) area refers to the object area, and perimeter refers to the object perimeter;
4) the RGB components are obtained from the color configuration of the target region; converting the picture to grayscale gives the statistical histogram of the gray values, whose mean is the gray value; converting the target to HSL space gives the HSL components. Average gray (grey) is the mean color of the object after grayscale conversion; since the computer represents color as an RGB combination, the average red component (red), average green component (green) and average blue component (blue) are the mean values of the R, G and B parts respectively; average hue (color), average saturation (saturation) and average brightness (bright) are the mean values of the H, S and L parts after converting the RGB color model to the HSL color model; the statistical histogram of gray values is the vector obtained by counting the distribution of the gray values 0-255 in intervals; the gray standard deviation (greyscale) describes the variation of the local colors of the object; the color weight (weighted) is the ratio of the average hue, generated automatically from the pixel positions under the RGB representation, to the position coordinates, and this value is used only for error correction and does not take part in the computation;
5) after the feature values are obtained, they are entered into the preset file-based database and loaded in table form for the subsequent classification and identification algorithm; the database structure is as shown in the table below.
Further, in the multi-algorithm classification and identification step, a KNN algorithm based on relative distance is adopted; the concrete steps of the KNN algorithm are as follows:
First, to avoid the attribute value ranges affecting the distance computation, for the i-th attribute value X[i] of each sample in the feature-value database the maximum Max[i] and the minimum Min[i] are computed, and the normalization X[i] = (X[i] - Min[i]) / (Max[i] - Min[i]) is applied, so that after normalization each attribute of a sample has the value range [0, 1]. A data set D = {X1, ..., XL} is then built from the feature-value database, where X_i ∈ R^n, i = 1, ..., L. Suppose the samples fall into ClassNum classes, and let C_i denote the set of all samples of the i-th class, with C_i ∩ C_j = Ø (i, j = 1, ..., ClassNum); the sample set can then also be written D = C_1 ∪ C_2 ∪ ... ∪ C_ClassNum.
Let Dist be the distance between two egg samples. The data set D has m attributes and is structured as R(A_1, A_2, ..., A_m). For two samples X and Y of D, their distance is measured by

    Dist(X, Y) = \sqrt{ \sum_{i=1}^{m} (X.x_i - Y.y_i)^2 }

The average K-nearest-neighbor distance of the i-th class for a test sample is

    Avgdis(i) = \frac{ \sum_{j=1}^{k_i} Dist(X_j, Y) }{ k_i }, \quad X_j \in C_i, \; i = 1, ..., ClassNum

where k_i is the number of samples in C_i and Y is the nearest neighbor of X_j. The relative distance between a test sample X and a training sample Y is D = Dist(X, Y) / Avgdis(i), Y ∈ C_i.
With k = 3, it suffices to compute the distance from the test sample to every sample of the data set and to compare the 3 nearest neighbors of the test sample to determine its class. The classification result is expressed by a score: if the features of the input image are (f1, f2, ..., fn) and the features of a sample in the database are (x11, x12, ..., x1n), then score = s(f1, x11) * s(f2, x12) * ... * s(fn, x1n).
The function s is constructed as follows: let maxV = max(f1, x11) and minV = min(f1, x11); then diff = (maxV - minV) / maxV and s = X * (e^(-diff) - 1/e) + B, where e is the base of the natural logarithm and X = (A - B) * e / (e - 1), so that A is the maximum value: if diff = 0, s takes the maximum value A; if diff = 1, s takes the minimum value B.
Further, before the Mean-shift algorithm is used to segment the target, the color histogram of the picture needs to be precomputed. Here the standard color-histogram calculation module provided with VC++ is used with its default parameters.
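The patent relies on the VC++ histogram module; purely as an equivalent illustration (the OpenCV calls and the bin count are assumptions), a per-channel color histogram could be precomputed as follows.

```python
import cv2

def precompute_color_histogram(bgr, bins=32):
    """Per-channel color histograms, L1-normalized so each sums to 1."""
    hists = [cv2.calcHist([bgr], [ch], None, [bins], [0, 256]) for ch in range(3)]
    return [cv2.normalize(h, None, 1.0, 0, cv2.NORM_L1) for h in hists]
```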
Further, if breakpoints appear in the image because of problems such as excessive normalization, so that boundary points are lost and the region frame or the region pixel set cannot be built, a region-growing algorithm is used as a remedy. The concrete steps of the region-growing algorithm are: first, for each region to be segmented, find a seed pixel as the starting point of growth; then merge into this region the pixels in the neighborhood of the seed that have the same or similar properties as the seed; these new pixels are in turn used as new seeds and the process is continued until no further pixel satisfying the condition can be included.
Embodiment 2
A human-assisted parasite egg recognition method based on multi-feature fusion, characterized by comprising the following flow (see Fig. 6 for the flow chart):
A) an image preprocessing step: the image information obtained by the micro-photographic equipment is subjected to brightness normalization, the normalized image is converted to grayscale to generate a normalized grayscale image, and the whole picture is then sharpened on the basis of Gaussian filtering to obtain an image with sharpened egg edges;
B) a step of finding eggs by human-assisted identification of the edge-sharpened image with an enhanced Grab Cut method: in the target segmentation step the enhanced Grab Cut method is adopted, i.e. the user provides a restriction box as manual assistance to divide foreground and background more accurately, the color feature vector of the image is obtained, the optimum target region is sought on the basis of the color vector, and the region judged to be an egg is obtained;
C) a step of target acquisition from the egg image based on the region determined to be an egg: on the basis of the above shape segmentation information, i.e. the established edge-region information of the parasite egg shapes to be identified, each candidate edge region is binarized and a boundary-tracking algorithm acquires the target along the boundary of the egg region to obtain the segmented egg image;
D) a step of extracting specified feature values from the segmented egg image and storing them in a preset feature database;
E) a multi-algorithm classification and identification step: in this step of the present invention, a relative-distance KNN (k=3) algorithm substitutes the obtained feature values into the general database, and the egg category is judged by the KNN algorithm;
Further, in the image preprocessing step, the image information obtained by the microphotography apparatus is subjected to brightness normalization, the normalized image is converted to grayscale to generate a normalized grayscale image, and then the whole picture is sharpened based on Gaussian filtering to obtain an image with sharpened egg edges (the use case diagram is shown in Fig. 7);
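A minimal OpenCV sketch of such a preprocessing chain is shown below; the kernel size, sigma and unsharp-mask weights are illustrative assumptions rather than values given in the patent, and for simplicity the brightness normalization is applied here to the grayscale image.

#include <opencv2/opencv.hpp>

// Grayscale conversion, brightness normalization, then Gaussian-based sharpening
// (unsharp masking). Parameter values are assumptions for illustration.
cv::Mat preprocess(const cv::Mat& colorImage) {
    cv::Mat grey, norm, blurred, sharpened;
    cv::cvtColor(colorImage, grey, cv::COLOR_BGR2GRAY);        // grayscale image
    cv::normalize(grey, norm, 0, 255, cv::NORM_MINMAX);        // brightness normalization
    cv::GaussianBlur(norm, blurred, cv::Size(5, 5), 1.5);      // Gaussian low-pass
    cv::addWeighted(norm, 1.5, blurred, -0.5, 0, sharpened);   // subtract blur: edge sharpening
    return sharpened;
}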
Further, in the step of human-assisted recognition that finds eggs in the edge-sharpened image using the enhanced GrabCut method: the user supplies a restriction box as manual assistance, and the part outside the box is not processed, so the method is faster and more accurate than the fully automatic mean-shift algorithm. The user initializes the trimap T by setting the background region TB; the foreground region TF is set to empty, and the unknown region TU is set to the complement of the background region TB. For all pixels of the background region the Alpha (transparency) value is set to 0, i.e. a = 0; for pixels of the unknown region the Alpha value is set to 1, i.e. a = 1. The Gaussian mixture models of foreground and background are initialized with these two sets a = 0 and a = 1 respectively, and Gaussian mixture model parameters are assigned for each pixel n of the unknown region:
$k_n = \arg\min_{k_n} D_n(a_n, k_n, \theta, z_n)$,
The Gaussian mixture model parameters are then estimated from the data of every pixel in the image:
$\theta = \arg\min_{\theta} U(a, k, \theta, z)$,
The initial segmentation is obtained with the energy minimization formula:
$\min_{k} E(a, k, \theta, z)$,
This is repeated 3 times, after which border optimization is carried out (the use case diagram is shown in Fig. 8);
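As an illustration of this rectangle-initialized, user-assisted segmentation, the sketch below uses OpenCV's grabCut rather than the patent's own implementation: pixels outside the user's restriction box are treated as background and the GMM/energy iteration is run three times.

#include <opencv2/opencv.hpp>

// Rect-initialized GrabCut: the operator supplies the restriction box, pixels
// outside it count as background, and the iteration runs 3 rounds as above.
cv::Mat segmentEgg(const cv::Mat& image, const cv::Rect& userBox) {
    cv::Mat mask, bgdModel, fgdModel;
    cv::grabCut(image, mask, userBox, bgdModel, fgdModel, 3, cv::GC_INIT_WITH_RECT);
    cv::Mat foreground = (mask == cv::GC_FGD) | (mask == cv::GC_PR_FGD);
    cv::Mat egg;
    image.copyTo(egg, foreground);               // keep definite and probable foreground pixels
    return egg;
}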
For pictures containing more impurities, the edge-tracking algorithm cannot guarantee a closed boundary, and it may also run out of control and depart from the image boundary; this is especially true for images with gentle grayscale changes, thin or multiple borders, and contours near the boundary. The invention therefore provides manual intervention as a supplementary means. The foreground and background extraction sets are initialized respectively as the unknown-region part and the background-region part of the trimap. The user interaction during initialization influences the final segmentation result. From the given initial information, K Gaussian mixture model components are created for the initial foreground matting region and for the background matting region respectively.
The enhanced GrabCut method in the present invention refers to three adaptations of the Graph Cuts method made for restriction-box parasite egg pictures: first, a Gaussian mixture model (GMM) replaces the histogram for describing the distribution of foreground and background pixels, lifting the process from grayscale images to color images; second, an iterative approach is used to estimate the Gaussian mixture model parameters and to carry out the energy minimization, instead of a single minimization; third, an incomplete labeling method reduces the user's workload in the interactive process, so the user only needs to mark the background region with a rectangular box.
Further, in the step of target acquisition on the egg image based on the region determined to be an egg: based on the above shape segmentation information, i.e. the established edge-region information of the parasite egg shapes to be identified, each candidate edge region is binarized and the boundary-tracking algorithm is used to acquire the target. When the algorithm starts, the pixels are scanned from top to bottom; let the sequence array be K. The first target pixel, searched from the upper left, is set to k0; pixel k0 is then the boundary pixel at the upper-left corner of the region, i.e. the starting point of the search. The search direction is set to counterclockwise over the eight-neighborhood; k0 is given a tracking mark and inserted as the first element of the sequence array. The next target pixel is searched counterclockwise and set to k; if none is found, k0 is an isolated pixel region. If k equals the starting boundary pixel k0, the algorithm continues to check in order whether boundary pixels in other nearby directions remain untraced; if not, it returns to the starting point and terminates. The boundary pixels in sequence K form a closed region that encloses the target area (the use case diagram is shown in Fig. 9);
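The following sketch illustrates such a counterclockwise eight-neighborhood boundary trace on a binary image stored as a 2D array; the data layout and the loop safeguards are assumptions made for illustration.

#include <utility>
#include <vector>

// Trace the boundary of the first (upper-left-most) target region in a binary
// image: find k0 by a top-to-bottom scan, then walk the 8-neighborhood
// counterclockwise, appending boundary pixels to the sequence array K until
// the trace returns to k0 (or k0 turns out to be an isolated pixel).
std::vector<std::pair<int, int>> traceBoundary(const std::vector<std::vector<int>>& bin) {
    const int dr[8] = {0, -1, -1, -1, 0, 1, 1, 1};   // counterclockwise neighbour offsets
    const int dc[8] = {1, 1, 0, -1, -1, -1, 0, 1};
    int rows = (int)bin.size(), cols = (int)bin[0].size();
    std::vector<std::pair<int, int>> K;              // sequence array K
    int r0 = -1, c0 = -1;
    for (int r = 0; r < rows && r0 < 0; ++r)         // scan from the top for the first target pixel
        for (int c = 0; c < cols; ++c)
            if (bin[r][c]) { r0 = r; c0 = c; break; }
    if (r0 < 0) return K;                            // no target pixel at all
    K.push_back({r0, c0});
    int r = r0, c = c0, dir = 0;
    do {
        bool found = false;
        for (int i = 0; i < 8 && !found; ++i) {      // search neighbours counterclockwise
            int d = (dir + i) % 8;
            int nr = r + dr[d], nc = c + dc[d];
            if (nr >= 0 && nr < rows && nc >= 0 && nc < cols && bin[nr][nc]) {
                r = nr; c = nc; dir = (d + 6) % 8;   // restart the next search two steps back
                K.push_back({r, c});
                found = true;
            }
        }
        if (!found) break;                           // k0 is an isolated pixel region
    } while ((r != r0 || c != c0) && (int)K.size() <= rows * cols);
    return K;                                        // boundary pixels enclosing the target area
}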
Further, in the step of extracting specified feature values from the acquired target and storing them in the feature database, the names of the feature values and the methods for obtaining them are first defined:
1) Obtain the minimum bounding rectangle of the edge region; counting its pixels gives length (length) and width (width), where length refers to the length of the object's bounding rectangle and width refers to its width;
2) Counting the pixels of the target region and of the region surrounding the target gives the area and the perimeter; the ratio of the area to the minimum bounding rectangle is the ovality (ovality), which refers to the ratio of the object's area to that of its circumscribed ellipse;
3) Area (area) refers to the object's area, and perimeter (perimeter) refers to the object's perimeter;
4) The RGB components are obtained from the color configuration information of the target region; converting the picture to grayscale gives the statistical histogram of gray values, whose mean is the gray value; transforming the target into HSV space gives the HSL components. Average gray (grey) refers to the color mean of the object after grayscale conversion. Since the computer represents color as an RGB combination, the average red component (red) refers to the mean of the R part, the average green component (green) refers to the mean of the G part, and the average blue component (blue) refers to the mean of the B part. Average hue (color) refers to the mean of the H component after converting the RGB color model to the HSL color model; average saturation (saturation) refers to the mean of the S component; average brightness (bright) refers to the mean of the L component. The statistical histogram of gray values refers to the vector obtained by counting the distribution of gray values 0~255 in stages. Gray standard deviation (greyscale) refers to the variation of the object's local colors. Color weight (weighted) refers to the ratio, generated automatically from pixel positions under the RGB representation, between the average hue and the position coordinates; this value is used only for error correction and does not take part in the computation;
5) After the feature values are obtained, they are written into the preset file-based database and loaded in table form by the subsequent classification and identification algorithms; an example of the database is shown in the table below.
The operation use case of feature value collection is shown in Figure 10; a sketch of the feature extraction follows below.
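For illustration, the following OpenCV sketch extracts a subset of the listed feature values (length, width, area, perimeter, mean gray, mean R/G/B, gray histogram, gray standard deviation) from a segmented egg contour; the exact feature set and any normalization used in the patent's database may differ.

#include <algorithm>
#include <opencv2/opencv.hpp>
#include <vector>

struct EggFeatures {
    double length, width, area, perimeter;
    double meanGrey, meanR, meanG, meanB, greyStd;
    cv::Mat greyHist;                                  // 256-bin histogram of gray values
};

// Extract a subset of the feature values described above from a segmented contour.
EggFeatures extractFeatures(const cv::Mat& bgr, const std::vector<cv::Point>& contour) {
    EggFeatures f{};
    cv::Rect box = cv::boundingRect(contour);          // minimum upright bounding rectangle
    f.length = std::max(box.width, box.height);
    f.width  = std::min(box.width, box.height);
    f.area = cv::contourArea(contour);
    f.perimeter = cv::arcLength(contour, true);

    cv::Mat mask = cv::Mat::zeros(bgr.size(), CV_8UC1);
    std::vector<std::vector<cv::Point>> contours{contour};
    cv::drawContours(mask, contours, -1, 255, cv::FILLED);

    cv::Scalar meanBGR = cv::mean(bgr, mask);          // per-channel color means inside the egg
    f.meanB = meanBGR[0]; f.meanG = meanBGR[1]; f.meanR = meanBGR[2];

    cv::Mat grey, m, sd;
    cv::cvtColor(bgr, grey, cv::COLOR_BGR2GRAY);
    cv::meanStdDev(grey, m, sd, mask);                 // mean gray and gray standard deviation
    f.meanGrey = m.at<double>(0);
    f.greyStd  = sd.at<double>(0);

    int channels[] = {0};
    int histSize = 256;
    float range[] = {0, 256};
    const float* ranges[] = {range};
    cv::calcHist(&grey, 1, channels, mask, f.greyHist, 1, &histSize, ranges);
    return f;
}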
Further, in the classification and identification step based on multiple algorithms, the relative-distance-based KNN algorithm is adopted; its steps are as follows:
First, to keep differing attribute ranges from distorting the sample-distance calculation, let X[i] be the i-th attribute value of each sample in the feature value database; compute the maximum Max[i] and the minimum Min[i] and normalize with the formula X[i] = (X[i] - Min[i]) / (Max[i] - Min[i]), so that after normalization every attribute of a sample lies in [0, 1]. The data set D = {X_1, ..., X_L} is then built from the feature value database, where $X_i \in R^n$, i = 1, ..., L. Suppose the samples fall into ClassNum classes; let $C_i$ denote the set of all samples of the i-th class, with $C_i \cap C_j = \varnothing$ (i, j = 1, ..., ClassNum), so the sample set can also be written as $D = C_1 \cup C_2 \cup \ldots \cup C_{ClassNum}$;
Let the distance between two egg samples be Dist. The data set D has m attributes and is structured as $R(A_1, A_2, \ldots, A_m)$; if X and Y are two samples in D, the distance metric of X and Y is:
$Dist(X, Y) = \sqrt{\sum_{i=1}^{m} (X.x_i - Y.y_i)^2}$
The average K-nearest-neighbor distance of the i-th class for the test sample is:
$Avgdis(i) = \frac{\sum_{j=1}^{k_i} Dist(X_j, Y)}{k_i}, \quad X_j \in C_i,\ i = 1, \ldots, ClassNum$
where $k_i$ is the number of samples in $C_i$ and Y is the nearest neighbor of $X_j$; the relative distance between the test sample X and a training sample Y is $D = Dist(X, Y) / Avgdis(i)$, $Y \in C_i$;
When N=3, it suffices to compute the distance from each sample of the data set to the test sample and select the 3 nearest neighbors of the test sample to determine its class. The classification result is expressed as a score: if the features of the input image are (f1, f2, ..., fn) and the features of a sample in the database are (x11, x12, ..., x1n), then score = s(f1, x11) * s(f2, x12) * ... * s(fn, x1n);
The function s is constructed here as follows: let maxV = max(f1, x11) and minV = min(f1, x11); then diff = (maxV - minV)/maxV and s = X * (pow(e, -diff) - 1/e) + B, where e is the natural constant and X = (A - B) * e/(e - 1) so that s can attain the maximum value A: if diff = 0, then s is the maximum A; if diff = 1, then s is the minimum B.
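A minimal sketch of this relative-distance KNN classification follows; feature vectors are assumed to be already min-max normalized, and the interpretation of Avgdis(i) as the average of the distances from the query to the k nearest samples of class i is an assumption made for illustration.

#include <algorithm>
#include <cmath>
#include <map>
#include <utility>
#include <vector>

struct Sample { std::vector<double> x; int label; };

// Euclidean distance between two (already normalized) feature vectors.
static double dist(const std::vector<double>& a, const std::vector<double>& b) {
    double d = 0;
    for (std::size_t i = 0; i < a.size(); ++i) d += (a[i] - b[i]) * (a[i] - b[i]);
    return std::sqrt(d);
}

// Relative-distance KNN with k = 3: divide each distance by the per-class
// average of the k smallest distances, then vote among the k nearest samples.
int classifyRelativeKnn(const std::vector<Sample>& train,
                        const std::vector<double>& query, int k = 3) {
    std::map<int, std::vector<double>> byClass;        // class label -> distances to query
    for (const auto& s : train) byClass[s.label].push_back(dist(query, s.x));
    std::map<int, double> avgdis;                      // Avgdis(i) per class
    for (auto& [label, ds] : byClass) {
        std::sort(ds.begin(), ds.end());
        int ki = std::min<int>(k, (int)ds.size());
        double sum = 0;
        for (int j = 0; j < ki; ++j) sum += ds[j];
        avgdis[label] = std::max(sum / ki, 1e-12);     // guard against division by zero
    }
    std::vector<std::pair<double, int>> rel;           // (relative distance, class label)
    for (const auto& s : train)
        rel.push_back({dist(query, s.x) / avgdis[s.label], s.label});
    std::sort(rel.begin(), rel.end());
    std::map<int, int> votes;
    for (int i = 0; i < k && i < (int)rel.size(); ++i) votes[rel[i].second]++;
    int best = -1, bestVotes = -1;
    for (const auto& [label, v] : votes)
        if (v > bestVotes) { best = label; bestVotes = v; }
    return best;
}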
Further, if problems such as over-normalization cause breakpoints in the image and boundary points are lost, the gap is repaired with the region-growing algorithm. The concrete steps of the region-growing algorithm are: for each region to be segmented, first select a seed pixel as the starting point of growth; then merge the pixels in the neighborhood of the seed pixel that have the same or similar properties into the seed pixel's region; treat these newly merged pixels as new seed pixels and repeat the process until no further pixels satisfying the condition can be included.
In the present invention the multiple features are combined by multiplication. Two eggs may be similar in most other respects while differing greatly in one particular feature (in actual microscopy the possibility of a match is often ruled out on the basis of a single feature); multiplication expresses such a large difference better and lets it drive the total score.

Claims (17)

1. A method for recognizing parasite eggs based on multi-feature fusion, comprising a process of obtaining an image of a parasite egg with a microphotography apparatus, characterized in that the process further comprises the following steps:
A) an image preprocessing step, in which the image information obtained by the microphotography apparatus is subjected to brightness normalization, the normalized image is converted to grayscale to generate a normalized grayscale image, and then the whole picture is sharpened based on Gaussian filtering to obtain an image with sharpened egg edges;
B) a step of finding eggs by mean shift on the edge-sharpened image, in which the Mean-shift algorithm is used to segment the target picture, the color feature vector of the above image is obtained, the optimal target region is planned and found based on the color feature vector, and the region judged to be an egg is obtained;
C) a step of target acquisition on the egg image based on the region determined to be an egg, in which, according to the established edge-region information of the parasite egg shapes to be identified, each candidate edge region is binarized, and a boundary-tracking algorithm is applied along the boundary of the egg region to acquire the target and obtain the segmented egg image;
D) a step of extracting specified feature values from the segmented egg image and storing them in a preset feature value database;
E) a classification and identification step, in which a relative-distance-based KNN (k=3) algorithm is adopted, the obtained feature values are substituted into the overall database, and the egg class is judged with the KNN algorithm.
2. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 1, characterized in that, in the step of finding eggs by mean shift on the edge-sharpened image, the Mean-shift algorithm is used to segment the target. In the Mean-shift segmentation process, the original image is first divided into an X × Y grid to obtain X × Y intersection points, and these points are merged: if the Euclidean distance between the color values of two points is less than a threshold, the threshold being the color mean of the 5% brightest and 5% darkest pixels of the image, the two points are combined into one, which yields m points as the initial point set. The picture as a whole contains n = X × Y pixels, each expressed as an independent variable $X_i$ {i = 1, ..., n}, and the mean shift M of a sample point is computed as:
$M_{h,U}(x) = \frac{h^2}{d+2} \cdot \frac{\nabla \bar{f}_E(x)}{\bar{f}_U(x)}$
An initial point at the center of the picture is selected and the mean shift $M_{h,U}(x)$ is computed in the window $S_h(x)$ centered on that point; if this value is not less than a threshold, the window $S_h(x)$ is translated by $M_{h,U}(x)$ and the mean shift is recomputed in the new window to obtain a new center value, until $M_{h,U}(x)$ is less than the threshold, translation stops, and a position of maximum local density is obtained. These steps are repeated to obtain the m points corresponding to maximum local density positions, which are merged to obtain the centers of n clusters, i.e. the dominant colors of the original image. Each pixel of the original image is then assigned to a cluster according to Euclidean distance, and the dominant color information is represented with a one-dimensional histogram whose abscissa represents each dominant color and whose ordinate represents the proportion of pixels belonging to it, which yields the color feature vector of the image:
$Q = \{(P_i, W_i)\ |\ i = 1, \ldots, n\}$, where $P_i = (L^*_i, a^*_i, b^*_i)$, $W_i \in (0, 1]$,
in which W is the proportion and $P_i$ is the color value, expressed in the L*a*b* component representation and written as $L_i$, $a_i$, $b_i$ respectively; the color feature vector Q is used with the EMD algorithm to plan the optimal target region, and the general form of the EMD function is
$EMD(P, Q) = \frac{\min \sum_{i=1}^{m} \sum_{j=1}^{n} d(p_i, q_j) f_{ij}}{\sum_{i=1}^{m} \sum_{j=1}^{n} f_{ij}}$
wherein the region with the highest EMD similarity to the expected center point is the target region.
3. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 1, characterized in that, in the step of target acquisition on the egg image based on the region determined to be an egg, each candidate edge region is binarized according to the established edge-region information of the parasite egg shapes to be identified, and the boundary-tracking algorithm is used to acquire the target. When the algorithm starts, the pixels are scanned from top to bottom; let the sequence array be K. The first target pixel, searched from the upper left, is set to k0; pixel k0 is then the boundary pixel at the upper-left corner of the region, i.e. the starting point of the search. The search direction is set to counterclockwise over the eight-neighborhood; k0 is given a tracking mark and inserted as the first element of the sequence array. The next target pixel is searched counterclockwise and set to k; if none is found, k0 is an isolated pixel region. If k equals the starting boundary pixel k0, the algorithm continues to check in order whether boundary pixels in other nearby directions remain untraced; if not, it returns to the starting point and terminates. The boundary pixels in sequence K form a closed region that encloses the target area.
4. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 1, characterized in that, in the step of extracting specified feature values from the segmented egg image and storing them in the preset feature database, the feature values of the egg image are first obtained:
1) the minimum bounding rectangle of the edge region is obtained; counting its pixels gives the length and width, where length refers to the length of the object's bounding rectangle and width refers to its width;
2) counting the pixels of the target region and of the region surrounding the target gives the area and the perimeter; the ratio of the area to the minimum bounding rectangle is the ovality, which refers to the ratio of the object's area to that of its circumscribed ellipse;
3) area refers to the object's area, and perimeter refers to the object's perimeter;
4) the RGB components are obtained from the color configuration information of the target region; converting the picture to grayscale gives the statistical histogram of gray values, whose mean is the gray value; transforming the target into HSV space gives the HSL components; average gray refers to the color mean of the object after grayscale conversion; since the computer represents color as an RGB combination, the average red component is the mean of the R part, the average green component is the mean of the G part, and the average blue component is the mean of the B part; average hue refers to the mean of the H component after converting the RGB color model to the HSL color model; average saturation refers to the mean of the S component; average brightness refers to the mean of the L component; the statistical histogram of gray values refers to the vector obtained by counting the distribution of gray values 0~255 in stages; gray standard deviation refers to the variation of the object's local colors; color weight refers to the ratio, generated automatically from pixel positions under the RGB representation, between the average hue and the position coordinates, and this value is used only for error correction and does not take part in the computation;
5) after the feature values are obtained, they are written into the preset file-based database and loaded in table form by the subsequent classification and identification algorithms.
5. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 1, characterized in that, in the classification and identification step, the relative-distance-based KNN algorithm is adopted, and the steps of said KNN algorithm are as follows:
let X[i] be the i-th attribute value of each sample in the feature value database; compute the maximum Max[i] and the minimum Min[i] and normalize with the formula X[i] = (X[i] - Min[i]) / (Max[i] - Min[i]), so that after normalization every attribute of a sample lies in [0, 1]; the data set D = {X_1, ..., X_L} is then built from the feature value database, where $X_i \in R^n$, i = 1, ..., L; suppose the samples fall into ClassNum classes; let $C_i$ denote the set of all samples of the i-th class, with $C_i \cap C_j = \varnothing$ (i, j = 1, ..., ClassNum), so the sample set can also be written as $D = C_1 \cup C_2 \cup \ldots \cup C_{ClassNum}$;
let the distance between two egg samples be Dist; the data set D has m attributes and is structured as $R(A_1, A_2, \ldots, A_m)$; if X and Y are two samples in D, the distance metric of X and Y is:
$Dist(X, Y) = \sqrt{\sum_{i=1}^{m} (X.x_i - Y.y_i)^2}$
the average K-nearest-neighbor distance of the i-th class for the test sample is:
$Avgdis(i) = \frac{\sum_{j=1}^{k_i} Dist(X_j, Y)}{k_i}, \quad X_j \in C_i,\ i = 1, \ldots, ClassNum$
where $k_i$ is the number of samples in $C_i$ and Y is the nearest neighbor of $X_j$; the relative distance between the test sample X and a training sample Y is $D = Dist(X, Y) / Avgdis(i)$, $Y \in C_i$;
when N=3, it suffices to compute the distance from each sample of the data set to the test sample and select the 3 nearest neighbors of the test sample to determine its class; the classification result is expressed as a score: if the features of the input image are (f1, f2, ..., fn) and the features of a sample in the database are (x11, x12, ..., x1n), then score = s(f1, x11) * s(f2, x12) * ... * s(fn, x1n);
the function s is constructed here as follows: let maxV = max(f1, x11) and minV = min(f1, x11); then diff = (maxV - minV)/maxV and s = X * (pow(e, -diff) - 1/e) + B, where e is the natural constant and X = (A - B) * e/(e - 1) so that s can attain the maximum value A: if diff = 0, then s is the maximum A; if diff = 1, then s is the minimum B.
6. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 1, characterized in that, before the Mean-shift algorithm is used to segment the target, the color histogram of the picture is precomputed.
7. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 1, characterized in that, if problems such as over-normalization cause breakpoints in the image, so that boundary points are lost and the region frame or region pixel set cannot be built, the gap is repaired with a region-growing algorithm, whose concrete steps are: for each region to be segmented, first select a seed pixel as the starting point of growth; then merge the pixels in the neighborhood of the seed pixel that have the same or similar properties into the seed pixel's region; treat these newly merged pixels as new seed pixels and repeat the process until no further pixels satisfying the condition can be included.
8. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 1, characterized in that said parasite is Clonorchis sinensis, Taenia tapeworm, whipworm, pinworm, roundworm, hookworm, fish tapeworm, Schistosoma japonicum, Fasciolopsis buski, lung fluke or Spirometra mansoni.
9. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 1, characterized in that the figure to be identified has rotational invariance and is insensitive to illumination and to individual variation.
10. A method for recognizing parasite eggs based on multi-feature fusion, characterized by comprising the following steps:
A) an image preprocessing step, in which the image information obtained by the microphotography apparatus is subjected to brightness normalization, the normalized image is converted to grayscale to generate a normalized grayscale image, and then the whole picture is sharpened based on Gaussian filtering to obtain an image with sharpened egg edges;
B) a step of finding eggs by human-assisted recognition, in which the enhanced GrabCut method is applied to segment the edge-sharpened egg image, the user supplies a restriction box as manual assistance, the color feature vector of the egg image is obtained, the optimal target region is planned and found based on the color feature vector, and the region judged to be an egg is obtained;
C) a step of target acquisition on the egg image based on the region determined to be an egg, in which, according to the established edge-region information of the parasite egg shapes to be identified, each candidate edge region is binarized, and a boundary-tracking algorithm is applied along the boundary of the egg region to acquire the target and obtain the segmented egg image;
D) a step of extracting specified feature values from the segmented egg image and storing them in a preset feature database;
E) a classification and identification step, in which a relative-distance-based KNN (k=3) algorithm is adopted, the obtained feature values are substituted into the overall database, and the egg class is judged with the KNN algorithm.
11. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 10, characterized in that:
A) in the step of finding eggs by human-assisted recognition, the enhanced GrabCut method is applied to segment the edge-sharpened egg image; the user supplies a restriction box as manual assistance and the part outside the box is not processed; the user initializes the trimap T by setting the background region TB, the foreground region TF is set to empty, and the unknown region TU is set to the complement of the background region TB; for all pixels of the background region the Alpha value is set to 0, i.e. a = 0; for pixels of the unknown region the Alpha value is set to 1, i.e. a = 1; the Gaussian mixture models of foreground and background are initialized with these two sets a = 0 and a = 1 respectively, and Gaussian mixture model parameters are assigned for each pixel n of the unknown region:
$k_n = \arg\min_{k_n} D_n(a_n, k_n, \theta, z_n)$,
the Gaussian mixture model parameters are then estimated from the data of every pixel in the image:
$\theta = \arg\min_{\theta} U(a, k, \theta, z)$,
the initial segmentation is obtained with the energy minimization formula:
$\min_{k} E(a, k, \theta, z)$,
this is repeated 3 times, after which border optimization is carried out.
12. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 10, characterized in that, in the step of target acquisition on the egg image based on the region determined to be an egg, each candidate edge region is binarized according to the established edge-region information of the parasite egg shapes to be identified, and the boundary-tracking algorithm is used to acquire the target. When the algorithm starts, the pixels are scanned from top to bottom; let the sequence array be K. The first target pixel, searched from the upper left, is set to k0; pixel k0 is then the boundary pixel at the upper-left corner of the region, i.e. the starting point of the search. The search direction is set to counterclockwise over the eight-neighborhood; k0 is given a tracking mark and inserted as the first element of the sequence array. The next target pixel is searched counterclockwise and set to k; if none is found, k0 is an isolated pixel region. If k equals the starting boundary pixel k0, the algorithm continues to check in order whether boundary pixels in other nearby directions remain untraced; if not, it returns to the starting point and terminates. The boundary pixels in sequence K form a closed region that encloses the target area.
13. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 10, characterized in that, in the step of extracting specified feature values from the acquired target and storing them in the feature database, the feature values of the egg image are first obtained:
1) the minimum bounding rectangle of the edge region is obtained; counting its pixels gives the length and width, where length refers to the length of the object's bounding rectangle and width refers to its width;
2) counting the pixels of the target region and of the region surrounding the target gives the area and the perimeter; the ratio of the area to the minimum bounding rectangle is the ovality, which refers to the ratio of the object's area to that of its circumscribed ellipse;
3) area refers to the object's area, and perimeter refers to the object's perimeter;
4) the RGB components are obtained from the color configuration information of the target region; converting the picture to grayscale gives the statistical histogram of gray values, whose mean is the gray value; transforming the target into HSV space gives the HSL components; average gray refers to the color mean of the object after grayscale conversion; since the computer represents color as an RGB combination, the average red component is the mean of the R part, the average green component is the mean of the G part, and the average blue component is the mean of the B part; average hue refers to the mean of the H component after converting the RGB color model to the HSL color model; average saturation refers to the mean of the S component; average brightness refers to the mean of the L component; the statistical histogram of gray values refers to the vector obtained by counting the distribution of gray values 0~255 in stages; gray standard deviation refers to the variation of the object's local colors; color weight refers to the ratio, generated automatically from pixel positions under the RGB representation, between the average hue and the position coordinates, and this value is used only for error correction and does not take part in the computation;
5) after the feature values are obtained, they are written into the preset file-based database and loaded in table form by the subsequent classification and identification algorithms.
14. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 10, characterized in that, in the classification and identification step, the relative-distance-based KNN algorithm is adopted, and the steps of the KNN algorithm are as follows:
First, to keep differing attribute ranges from distorting the sample-distance calculation, let X[i] be the i-th attribute value of each sample in the feature value database; compute the maximum Max[i] and the minimum Min[i] and normalize with the formula X[i] = (X[i] - Min[i]) / (Max[i] - Min[i]), so that after normalization every attribute of a sample lies in [0, 1]; the data set D = {X_1, ..., X_L} is then built from the feature value database, where $X_i \in R^n$, i = 1, ..., L; suppose the samples fall into ClassNum classes; let $C_i$ denote the set of all samples of the i-th class, with $C_i \cap C_j = \varnothing$ (i, j = 1, ..., ClassNum), so the sample set can also be written as $D = C_1 \cup C_2 \cup \ldots \cup C_{ClassNum}$;
let the distance between two egg samples be Dist; the data set D has m attributes and is structured as $R(A_1, A_2, \ldots, A_m)$; if X and Y are two samples in D, the distance metric of X and Y is:
$Dist(X, Y) = \sqrt{\sum_{i=1}^{m} (X.x_i - Y.y_i)^2}$
the average K-nearest-neighbor distance of the i-th class for the test sample is:
$Avgdis(i) = \frac{\sum_{j=1}^{k_i} Dist(X_j, Y)}{k_i}, \quad X_j \in C_i,\ i = 1, \ldots, ClassNum$
where $k_i$ is the number of samples in $C_i$ and Y is the nearest neighbor of $X_j$; the relative distance between the test sample X and a training sample Y is $D = Dist(X, Y) / Avgdis(i)$, $Y \in C_i$;
when N=3, it suffices to compute the distance from each sample of the data set to the test sample and select the 3 nearest neighbors of the test sample to determine its class; the classification result is expressed as a score: if the features of the input image are (f1, f2, ..., fn) and the features of a sample in the database are (x11, x12, ..., x1n), then score = s(f1, x11) * s(f2, x12) * ... * s(fn, x1n);
the function s is constructed here as follows: let maxV = max(f1, x11) and minV = min(f1, x11); then diff = (maxV - minV)/maxV and s = X * (pow(e, -diff) - 1/e) + B, where e is the natural constant and X = (A - B) * e/(e - 1) so that s can attain the maximum value A: if diff = 0, then s is the maximum A; if diff = 1, then s is the minimum B.
15. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 10, characterized in that, if problems such as over-normalization cause breakpoints in the image and boundary points are lost, the gap is repaired with a region-growing algorithm, whose concrete steps are: for each region to be segmented, first select a seed pixel as the starting point of growth; then merge the pixels in the neighborhood of the seed pixel that have the same or similar properties into the seed pixel's region; treat these newly merged pixels as new seed pixels and repeat the process until no further pixels satisfying the condition can be included.
16. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 10, characterized in that said parasite is Clonorchis sinensis, Taenia tapeworm, whipworm, pinworm, roundworm, hookworm, fish tapeworm, Schistosoma japonicum, Fasciolopsis buski, lung fluke or Spirometra mansoni.
17. The multi-feature-fusion-based parasite egg recognition method as claimed in claim 10, characterized in that the figure to be identified has rotational invariance and is insensitive to illumination and to individual variation.
CN201410587222.8A 2014-10-28 2014-10-28 Parasite egg identification method based on multi-feature fusion Pending CN104392240A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410587222.8A CN104392240A (en) 2014-10-28 2014-10-28 Parasite egg identification method based on multi-feature fusion


Publications (1)

Publication Number Publication Date
CN104392240A true CN104392240A (en) 2015-03-04

Family

ID=52610141


Country Status (1)

Country Link
CN (1) CN104392240A (en)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007038137A1 (en) * 2007-08-13 2009-02-19 Robert Bosch Gmbh Image processing method for driver assistance system for e.g. lane recognition, involves computing radial and/or tangential derivatives of brightness values and/or gray scale values of image for generating image information
CN102073872A (en) * 2011-01-20 2011-05-25 中国疾病预防控制中心寄生虫病预防控制所 Image-based method for identifying shape of parasite egg
CN104036523A (en) * 2014-06-18 2014-09-10 哈尔滨工程大学 Improved mean shift target tracking method based on surf features

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LIU E: "Research on an image retrieval system based on mean shift and EMD", China Master's Theses Full-text Database, Information Science and Technology *
CUI Xin: "Research on computer-aided detection of solitary pulmonary nodules and recognition algorithms for medical signs", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105243390A (en) * 2015-09-25 2016-01-13 河南科技学院 Insect image detection method and insect classification method
CN105243390B (en) * 2015-09-25 2018-09-25 河南科技学院 Insect image identification detection method and classification of insect method
CN105551027B (en) * 2015-12-08 2018-08-03 沈阳东软医疗系统有限公司 A kind of frontier tracing method and device
CN105551027A (en) * 2015-12-08 2016-05-04 沈阳东软医疗系统有限公司 Boundary tracking method and device
CN106203528B (en) * 2016-07-19 2019-07-09 华侨大学 It is a kind of that intelligent classification algorithm is drawn based on the 3D of Fusion Features and KNN
CN106203528A (en) * 2016-07-19 2016-12-07 华侨大学 A kind of feature based merges and the 3D of KNN draws intelligent classification algorithm
CN107220673A (en) * 2017-06-06 2017-09-29 滁州市天达汽车部件有限公司 A kind of bamboo cane method for sorting colors based on KNN algorithms
CN107220673B (en) * 2017-06-06 2020-05-01 安徽天达汽车制造有限公司 KNN algorithm-based bamboo strip color classification method
CN108596891A (en) * 2018-04-23 2018-09-28 中国计量大学 A kind of method of counting towards multiple types mixing silk cocoon
CN108376402A (en) * 2018-04-27 2018-08-07 安徽农业大学 Trialeurodes vaporariorum community growth state analysis device and method under a kind of off-line state
CN108805101A (en) * 2018-06-28 2018-11-13 陈静飞 A kind of recognition methods of the parasite egg based on deep learning
CN109325499A (en) * 2018-08-02 2019-02-12 浙江中农在线电子商务有限公司 Pest and disease damage recognition methods and device
CN109145848A (en) * 2018-08-30 2019-01-04 西京学院 A kind of wheat head method of counting
CN109359576A (en) * 2018-10-08 2019-02-19 北京理工大学 A kind of size of animal estimation method based on image local feature identification
CN109359576B (en) * 2018-10-08 2021-09-03 北京理工大学 Animal quantity estimation method based on image local feature recognition
CN110263608B (en) * 2019-01-25 2023-07-07 天津职业技术师范大学(中国职业培训指导教师进修中心) Automatic electronic component identification method based on image feature space variable threshold measurement
CN110263608A (en) * 2019-01-25 2019-09-20 天津职业技术师范大学(中国职业培训指导教师进修中心) Electronic component automatic identifying method based on image feature space variable threshold value metric
CN110020654A (en) * 2019-04-08 2019-07-16 中南大学 The recognition methods of foamed zones in expansion fire-proof layer of charcoal SEM image
CN110084821B (en) * 2019-04-17 2021-01-12 杭州晓图科技有限公司 Multi-instance interactive image segmentation method
CN110084821A (en) * 2019-04-17 2019-08-02 杭州晓图科技有限公司 A kind of more example interactive image segmentation methods
CN110136078A (en) * 2019-04-29 2019-08-16 天津大学 The semi-automatic reparation complementing method of single plant corn image leaf destruction
CN110321896A (en) * 2019-04-30 2019-10-11 深圳市四季宏胜科技有限公司 Blackhead recognition methods, device and computer readable storage medium
CN110807426A (en) * 2019-11-05 2020-02-18 北京罗玛壹科技有限公司 Parasite detection system and method based on deep learning
CN110807426B (en) * 2019-11-05 2023-11-21 苏州华文海智能科技有限公司 Deep learning-based parasite detection system and method
CN111507177A (en) * 2020-02-19 2020-08-07 广西云涌科技有限公司 Identification method and device for metering turnover cabinet
CN111795986A (en) * 2020-05-20 2020-10-20 万秋花 Virus clinical examination detection platform applying shape search
CN111612824A (en) * 2020-05-26 2020-09-01 天津市微卡科技有限公司 Consciousness tracking recognition algorithm for robot control
CN111582276A (en) * 2020-05-29 2020-08-25 北京语言大学 Parasite egg identification method and system based on multi-feature fusion
CN111582276B (en) * 2020-05-29 2023-09-29 北京语言大学 Recognition method and system for parasite eggs based on multi-feature fusion
CN111797706A (en) * 2020-06-11 2020-10-20 昭苏县西域马业有限责任公司 Image-based parasite egg shape recognition system and method
CN111753706A (en) * 2020-06-19 2020-10-09 西安工业大学 Complex table intersection point clustering extraction method based on image statistics
CN111753706B (en) * 2020-06-19 2024-02-02 西安工业大学 Complex table intersection point clustering extraction method based on image statistics
CN111815614A (en) * 2020-07-17 2020-10-23 中国人民解放军军事科学院军事医学研究院 Parasite detection method and system based on artificial intelligence and terminal equipment
CN114299494A (en) * 2022-01-20 2022-04-08 广东省农业科学院动物科学研究所 Method and system for detecting worm oval characteristics of aquatic product image
CN114299494B (en) * 2022-01-20 2022-07-22 广东省农业科学院动物科学研究所 Method and system for detecting worm-egg-shaped characteristics of aquatic product image
CN116758024A (en) * 2023-06-13 2023-09-15 山东省农业科学院 Peanut seed direction identification method
CN116758024B (en) * 2023-06-13 2024-02-23 山东省农业科学院 Peanut seed direction identification method

Similar Documents

Publication Publication Date Title
CN104392240A (en) Parasite egg identification method based on multi-feature fusion
CN109154978B (en) System and method for detecting plant diseases
CN107274386B (en) artificial intelligent auxiliary cervical cell fluid-based smear reading system
Li et al. Segmentation of white blood cell from acute lymphoblastic leukemia images using dual-threshold method
CN102426649B (en) Simple steel seal digital automatic identification method with high accuracy rate
Quelhas et al. Cell nuclei and cytoplasm joint segmentation using the sliding band filter
Song et al. A deep learning based framework for accurate segmentation of cervical cytoplasm and nuclei
Liao et al. Automatic segmentation for cell images based on bottleneck detection and ellipse fitting
CN102682305B (en) Automatic screening system and automatic screening method using thin-prep cytology test
CN108229362A (en) A kind of binocular recognition of face biopsy method based on access control system
CN107464249B (en) Sheep contactless body ruler measurement method
US20140270347A1 (en) Hierarchical image classification system
CN103971126A (en) Method and device for identifying traffic signs
CN105894490A (en) Fuzzy integration multiple classifier integration-based uterine neck cell image identification method and device
Sunardi et al. Identity analysis of egg based on digital and thermal imaging: Image processing and counting object concept
CN106529532A (en) License plate identification system based on integral feature channels and gray projection
Shahin et al. A novel white blood cells segmentation algorithm based on adaptive neutrosophic similarity score
CN109087330A (en) It is a kind of based on by slightly to the moving target detecting method of smart image segmentation
CN107103608A (en) A kind of conspicuousness detection method based on region candidate samples selection
CN102147867A (en) Method for identifying traditional Chinese painting images and calligraphy images based on subject
CN108629297A (en) A kind of remote sensing images cloud detection method of optic based on spatial domain natural scene statistics
CN108073940A (en) A kind of method of 3D object instance object detections in unstructured moving grids
Karaoglu et al. Con-text: text detection using background connectivity for fine-grained object classification
CN104866850B (en) A kind of optimization method of text image binaryzation
Mohammadpoor et al. An intelligent technique for grape fanleaf virus detection

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20150304

WD01 Invention patent application deemed withdrawn after publication