CN103530878A - Edge extraction method based on fusion strategy - Google Patents
Publication number
CN103530878A (application CN201310475874.8A)
Authority: CN (China)
Abstract
The invention discloses an edge extraction method based on a fusion strategy, belonging to computer-application fields such as image processing, pattern recognition and computer vision. The method comprises the following steps: inputting a grayscale image; integrating the extraction results of three typical edge extraction operators to obtain a voting weight reflecting how likely each pixel is to belong to an edge; analyzing the difference between the maximum and minimum luminance differences of each pixel and its neighborhood to obtain a difference weight describing the degree of luminance change; computing the deleted-neighborhood variance distribution and, based on the characteristic that an edge point has a relatively large luminance dispersion compared with the average four-neighborhood luminance dispersion of the whole image, obtaining an edge distribution weight for every pixel; fusing the three weight matrices to make the edge decision; and outputting the edge image. The disclosed edge extraction method based on a fusion strategy improves the accuracy of edge extraction, reasonably reduces the influence of noise, and provides preparatory information for subsequent processing such as further image analysis and feature point localization.
Description
Technical field
The present invention relates to the technical fields of digital image processing and computer vision, and specifically to an image processing method for edge extraction based on a fusion strategy.
Background technology
An edge is a place in an image where information such as gray level or structure changes abruptly to varying degrees; it is the end of one region and the beginning of another. Edges are an important basis for image segmentation and for image analysis and understanding tasks such as texture feature extraction and shape feature extraction, and likewise an important basis for research fields such as computer vision and pattern recognition.
In recent years, many edge extraction methods have been proposed. Classical edge extraction methods use the gradient information of the image, extracting edges through first-order filtering, second-order filtering or zero-crossing detection; examples include the Sobel, Prewitt, Roberts, LoG and Canny operators. These operators have long dominated image edge extraction, but because images differ, different operators yield somewhat different detection results on different images. There are also methods that use statistics and machine learning for edge extraction, such as fuzzy edge detection, logistic-regression-based detection, edge detection based on Markov random fields, and multi-scale morphology. However, no single method performs well across all images, so a more robust edge extraction method needs to be proposed.
Summary of the invention
The object of the present invention is to provide an edge extraction method based on a fusion strategy that addresses the shortcomings of existing single-detector edge extraction techniques: insufficient robustness, incomplete extracted edge information, susceptibility to noise interference, and unsatisfactory accuracy.
The present invention is realized as follows: based on the data-continuity criterion of a grayscale image, edge extraction is performed on the acquired grayscale image. The proposed edge extraction method based on a fusion strategy is characterized by comprising the following steps:
Step 1: input a grayscale image.
Step 2: compute the voting weight matrix describing each pixel's likelihood of being an edge:
Step 2-1: input the grayscale image;
Step 2-2: perform edge detection with the Sobel operator;
Step 2-3: perform edge detection with the Canny operator;
Step 2-4: perform edge detection with the LoG operator;
Step 2-5: carry out weighted-voting statistics of the three operators' detection results at corresponding pixels.
Here "at corresponding pixels" means that the statistics are taken over the 8-neighborhood of each detected pixel, separately for each of the three operators' results. The detected pixel, i.e. the center of the 8-neighborhood, is given weight 1; its four 4-neighborhood pixels are given weight 0.5; the remaining four corner pixels of the 8-neighborhood are given weight 0.25. A position in the 8-neighborhood that is an edge is assigned the value 1, and the decision is made by weighting: if the resulting weighted sum is greater than or equal to 2, the pixel, i.e. the center of the 8-neighborhood, is considered an edge point detected by this operator and its vote is set to 1; if the weighted sum is less than 2, it is not considered an edge point and its vote is set to 0. Summing the three operators' votes for each pixel in turn yields a weighted vote with value range [0, 3];
Step 3: compute the difference weight matrix describing each pixel's likelihood of being an edge:
Step 3-1: input the grayscale image;
Step 3-2: compute the luminance differences between each pixel and its four 4-neighborhood points;
Step 3-3: obtain the maximum and minimum of each pixel's four differences;
Step 3-4: normalize each pixel's maximum and minimum difference by substituting them into the logistic regression model.
The logistic regression model here is:
Y(x) = 1 / (1 + exp(a*x + b))
where x is the maximum or minimum luminance difference of a pixel, Y(x) is the corresponding normalized maximum or minimum luminance difference, and a and b are luminance-difference parameters: for the maximum luminance difference of each pixel, a = -0.1 and b = 2; for the minimum luminance difference of each pixel, a = -0.3 and b = 5;
Step 4: compute the edge distribution weight matrix describing each pixel's likelihood of being an edge:
Step 4-1: input the grayscale image;
Step 4-2: compute the luminance variance of the four decentered 4-neighborhood pixels of each pixel;
Step 4-3: average the variances of all pixels and multiply by the coefficient 0.8 to obtain a threshold;
Step 4-4: construct a matrix of the same size as the image, initialized to zero, as the initial edge distribution weight matrix;
Step 4-5: traverse all pixels of the image; if a pixel's deleted-neighborhood statistical variance is greater than or equal to the threshold, set the corresponding position of the edge distribution weight matrix to 1; otherwise keep the initial value 0, establishing the 0-1 weight matrix describing the edge distribution;
Step 5: fuse the three weight matrices:
Step 5-1: read the voting weight matrix data;
Step 5-2: read the difference weight matrix data;
Step 5-3: read the edge distribution weight matrix data;
Step 5-4: fuse the three weight matrices with the Hadamard product to obtain the fused weight matrix;
Step 5-5: compute the reference value, i.e. the threshold.
Here the sum of all elements of the fused weight matrix is divided by the sum of the elements of the edge distribution weight matrix and multiplied by the coefficient 0.7, giving the reference value used as the threshold;
Step 5-6: binarize the fused weight matrix: if a value of the fused matrix is greater than or equal to the reference value, set the corresponding position of the fused matrix to 1; if it is less than the reference value, set it to 0;
Step 5-7: output the fused matrix;
Step 6: output the edge image.
The beneficial effects of the invention are: excellent edge extraction, little loss of edge information, high accuracy, and a certain tolerance to noise. Using the idea of a fusion strategy, the invention makes full use of the proven effectiveness of classical edge extraction algorithms and of the direct description provided by the local structural characteristics of image edges. It adapts to different local image edges to the greatest extent, approaches the best extraction result of the individual edge algorithms, and improves both the final edge extraction result and its accuracy.
Description of the drawings
Fig. 1 compares the experimental results of the present invention with the three classical edge detection operators Sobel, Canny and LoG.
Fig. 2 is the flow chart of the proposed edge extraction method based on a fusion strategy.
Fig. 3 is the flow chart of the voting weight module of the present invention.
Fig. 4 is the flow chart of the difference weight module of the present invention.
Fig. 5 is the flow chart of the edge distribution weight module of the present invention.
Fig. 6 is the flow chart of the weight fusion decision module of the present invention.
Embodiment
To make the object, technical solution and advantages of the present invention clearer, the invention is described in more detail below in conjunction with specific embodiments and with reference to the accompanying drawings. As shown in Fig. 1, Figs. 1-1 and 2-1 are two different test images; Figs. 1-2 and 2-2 are the edge detection results of the method of the invention; Figs. 1-3 and 2-3 are the Sobel operator results; Figs. 1-4 and 2-4 are the Canny operator results; Figs. 1-5 and 2-5 are the LoG operator results. Comparing the result images shows that the edges extracted by the method of the invention are relatively complete, contain little redundant information and effectively highlight the edge information; the edge detection is effective.
Fig. 2 is the flow chart of the proposed edge extraction method based on a fusion strategy; the method specifically comprises the following steps.
Step 1: input a grayscale image.
The grayscale image is a single-channel image. It may be converted from a color image, or any single channel of a color image may be taken directly; each pixel's value ranges from 0 to 255.
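Where the starting point is a color image, the single-channel conversion can be sketched as follows (a minimal sketch; the ITU-R BT.601 luma weights are a common choice of mine, not mandated by the patent, which equally allows taking one channel directly):

```python
import numpy as np

def to_grayscale(rgb):
    """Convert an H x W x 3 uint8 RGB image to a single-channel image in [0, 255].

    Uses the common BT.601 luma weights; the patent also permits simply
    selecting one channel of the color image, e.g. rgb[..., 1].
    """
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return np.rint(luma).astype(np.uint8)
```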
Step 2: compute the voting weight matrix describing each pixel's likelihood of being an edge.
The voting weight module 102 processes the image read in and finally produces the voting weight matrix. Its input is the single-channel image data obtained in step 1. Edge extraction is performed separately by three typical edge extraction methods, and a voting algorithm is applied to the three resulting edge images: a weighted vote is tallied over each pixel and its eight neighborhood pixels, yielding the required voting weight matrix.
Step 3: compute the difference weight matrix describing each pixel's likelihood of being an edge.
The difference weight module 103 processes the image read in and produces the difference weight matrix. Its input is the single-channel image data obtained in step 1. The luminance difference between each pixel and each of its four 4-neighborhood pixels is computed, and the maximum and minimum of the four differences are obtained for each pixel. Each pixel's maximum and minimum differences are then substituted into their respective logistic regression models to obtain normalized values, and the two normalized values of each pixel are subtracted. The normalized maximum-minus-minimum neighborhood luminance differences of all pixels form the required difference weight matrix.
Step 4: compute the edge distribution weight matrix describing each pixel's likelihood of being an edge.
The edge distribution weight module 104 processes the image read in and finally produces the edge distribution weight matrix. Its input is the single-channel image data obtained in step 1. The variance of the four decentered 4-neighborhood pixels of each pixel is computed, then the mean of all pixels' variances is computed and multiplied by the coefficient 0.8 to serve as a threshold, and the decentered variance matrix is binarized. Specifically: the deleted-neighborhood variance of each pixel is compared with the threshold; if the variance at a position is greater than or equal to the reference value, the pixel's value is set to 1; if it is less than the reference value, the value is set to 0. The resulting matrix of 0s and 1s is the binary edge distribution weight matrix.
Step 5: fuse the three weight matrices.
The weight fusion decision module 105 fuses the three obtained weight matrices and finally produces the fused matrix. It applies the Hadamard product, i.e. the element-wise product of the voting weight matrix, the difference weight matrix and the edge distribution weight matrix, to obtain the fused matrix. The sums of all elements of the fused matrix and of the edge distribution weight matrix are computed; the former is divided by the latter and multiplied by the coefficient 0.7 to obtain a reference value, which serves as the decision value for edge points, and the fused matrix is binarized by point-by-point comparison.
Step 6: output the edge image.
The edge image output module 106 displays the binary fused matrix as a binary image on the display device; this is the final edge extraction result.
The main modules involved in the above steps are described in detail below.
1. Voting weight module 102
To make the extracted edges robust, the extraction results of existing typical edge extraction methods are combined by voting, and the poll is used to compute the weight of each pixel being an edge point. To give the scheme a certain error-correcting capability against the extraction deviations of the representative edge extraction algorithms, the proposed weight calculation not only counts each algorithm's edge decision at the pixel itself but also superimposes the weighted edge decisions of its neighborhood points, assigning different weights according to distance from the pixel under consideration. This fusion yields an edge-decision voting weight matrix that integrates the results of the representative edge extraction algorithms and serves as the basis for the subsequent fusion decision. As shown in Fig. 3, the voting weight matrix is obtained through the following steps:
Step 2-1: input the grayscale image.
Step 2-2: perform edge detection with the Sobel operator.
Step 2-3: perform edge detection with the Canny operator.
Step 2-4: perform edge detection with the LoG operator.
Step 2-5: carry out weighted-voting statistics of the three operators' detection results at corresponding pixels.
Here "at corresponding pixels" means that the statistics are taken over the 8-neighborhood of each detected pixel, separately for each of the three operators' results. The detected pixel, i.e. the center of the 8-neighborhood, is given weight 1; its four 4-neighborhood pixels are given weight 0.5; the remaining four corner pixels of the 8-neighborhood are given weight 0.25. A position in the 8-neighborhood that is an edge is assigned the value 1, and the decision is made by weighting: if the resulting weighted sum is greater than or equal to 2, the pixel, i.e. the center of the 8-neighborhood, is considered an edge point detected by this operator and its vote is set to 1; if the weighted sum is less than 2, it is not considered an edge point and its vote is set to 0. Summing the three operators' votes for each pixel in turn yields a weighted vote with value range [0, 3]. An example: suppose A is a local patch of the original image; detecting with the Sobel, Canny and LoG operators yields the matrices B_Sobel, B_Canny and B_LoG respectively.
Consider the decision for pixel A(3,3). For the Sobel result, its weighted sum = (0*0.25*4) + (0*0.5*2 + 1*0.5*2) + 1*1 = 2; since this is greater than or equal to 2, the vote is set to 1, so A(3,3) is an edge point in the Sobel result.
Similarly, the Canny weighted sum = (0*0.25*3 + 1*0.25*1) + (0*0.5*2 + 1*0.5*2) + 0*1 = 1.25, which is less than 2, so the vote is set to 0 and A(3,3) is not an edge point in the Canny result.
Likewise, in the LoG result the weighted sum is 1.75, which is less than 2, so the vote is set to 0 and A(3,3) is not an edge point in the LoG result.
Finally the three votes are added: 1 + 0 + 0 = 1, which is the final voting weight.
Step 2-6: pixels whose voting result is zero are given the weight 0.5, producing the final voting weight matrix with value range 0.5 to 3.
Setting the zero entries of the voting weight matrix to 0.5 prevents the subsequent fusion process from masking the edge decisions based on local structure, which would impair the multi-angle edge decision and lose possible edge points.
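Steps 2-1 through 2-6 can be sketched as follows (a hedged sketch: the function names are mine, the three binary edge maps are assumed to come from Sobel, Canny and LoG detectors, e.g. via an image-processing library, and border pixels are handled by zero padding, which the patent does not specify):

```python
import numpy as np

# 8-neighborhood weights: centre 1, 4-neighbours 0.5, corners 0.25 (step 2-5)
KERNEL = np.array([[0.25, 0.5, 0.25],
                   [0.5,  1.0, 0.5 ],
                   [0.25, 0.5, 0.25]])

def detector_vote(edge_map):
    """Binary edge map (0/1) -> per-pixel 0/1 vote via 8-neighborhood weighting."""
    h, w = edge_map.shape
    padded = np.pad(edge_map.astype(float), 1)  # zero padding at the border (assumption)
    score = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            score += KERNEL[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return (score >= 2.0).astype(float)          # weighted sum >= 2 counts as an edge vote

def voting_weight_matrix(edge_maps):
    """Sum the per-detector votes (range [0, 3]) and lift zero votes to 0.5 (step 2-6)."""
    votes = sum(detector_vote(m) for m in edge_maps)
    votes[votes == 0] = 0.5
    return votes
```

For a horizontal line detected identically by all three detectors, the line's interior pixels collect a weighted sum of exactly 2 per detector and thus three votes in total, while pixels far from the line fall back to 0.5.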
2. Difference weight module 103
Based on the structural characteristics of an image edge point and its neighborhood pixels, the invention proposes to express the weight of a pixel being an edge by the difference between the largest and the smallest luminance change between the pixel under consideration and its neighborhood points. Since this representation is susceptible to noise, the invention computes the difference weight matrix from normalized luminance differences, so that the weight values lie in the interval [0, 1] and are effectively balanced in the subsequent fusion of the weight matrices. On the one hand this retains the validity of judging edges from the luminance jumps described by the differences; on the other hand it avoids amplifying the influence of noise through excessive weights. As shown in Fig. 4, the difference weight matrix is obtained through the following steps:
Step 3-1: input the grayscale image.
Step 3-2: compute the luminance differences between each pixel and its four 4-neighborhood points.
Step 3-3: obtain the maximum and minimum of each pixel's four differences.
Step 3-4: normalize each pixel's maximum and minimum difference by substituting them into the logistic regression model.
Different parameters, chosen experimentally, are used for the maximum and the minimum luminance differences. The regression model below gives the normalized maximum and minimum luminance differences for each pixel.
The logistic regression model is:
Y(x) = 1 / (1 + exp(a*x + b))
where x is the maximum or minimum luminance difference of a pixel, Y(x) is the corresponding normalized maximum or minimum luminance difference, and a and b are luminance-difference parameters: for the maximum luminance difference of each pixel, a = -0.1 and b = 2; for the minimum luminance difference of each pixel, a = -0.3 and b = 5.
Step 3-5: subtract the normalized minimum value from the normalized maximum value of each pixel to obtain a difference with value range [0, 1]. The differences of all pixels form the difference weight matrix, which describes the weight of each pixel being an edge point in terms of the greatest degree of luminance change between the candidate edge point and its neighborhood.
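Steps 3-1 through 3-5 can be sketched as follows (a sketch under assumptions: absolute luminance differences and edge-replicating padding at the image border are my choices, since the patent text does not specify them; the parameters a = -0.1, b = 2 and a = -0.3, b = 5 follow the model above):

```python
import numpy as np

def logistic(x, a, b):
    # The patent's normalization model: Y(x) = 1 / (1 + exp(a*x + b))
    return 1.0 / (1.0 + np.exp(a * x + b))

def difference_weight_matrix(gray):
    g = gray.astype(float)
    h, w = g.shape
    p = np.pad(g, 1, mode='edge')            # replicate border pixels (assumption)
    diffs = np.abs(np.stack([
        p[0:h, 1:w + 1] - g,                 # up
        p[2:h + 2, 1:w + 1] - g,             # down
        p[1:h + 1, 0:w] - g,                 # left
        p[1:h + 1, 2:w + 2] - g,             # right
    ]))
    d_max = diffs.max(axis=0)
    d_min = diffs.min(axis=0)
    # Normalize each extremum with its own parameters, then subtract (step 3-5)
    return logistic(d_max, -0.1, 2.0) - logistic(d_min, -0.3, 5.0)
```

On a step image, pixels adjacent to the step receive a much larger weight than pixels in the flat regions, matching the intent of describing luminance jumps.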
3. Edge distribution weight module 104
As shown in Fig. 5, the edge distribution weight matrix is obtained through the following steps:
Step 4-1: input the grayscale image.
Step 4-2: compute the luminance variance of the four decentered 4-neighborhood pixels of each pixel.
Step 4-3: average the variances of all pixels and multiply by the coefficient 0.8 to obtain a threshold.
Step 4-4: construct a matrix of the same size as the image, initialized to zero, as the initial edge distribution weight matrix.
Step 4-5: traverse all pixels of the image; if a pixel's deleted-neighborhood statistical variance is greater than or equal to the threshold, set the corresponding position of the edge distribution weight matrix to 1; otherwise keep the initial value 0, establishing the 0-1 weight matrix describing the edge distribution. By computing and tallying the deleted-neighborhood variance, this step analyzes whether the luminance dispersion around a candidate point is large relative to the luminance dispersion of the whole image. Against the background of the overall luminance variation, a point whose neighborhood luminance deviates strongly from the mean, i.e. whose variance exceeds the weighted mean, has a high degree of luminance jump and is therefore likely to be an edge point. The threshold algorithm here makes the edge-point decision and establishes the edge-point distribution matrix based on deleted-neighborhood variance analysis, which serves as a basis for the subsequent fusion decision.
4. Weight fusion decision module 105
The fusion decision makes maximal use of the edge descriptions obtained from the different perspectives, yields a more complete edge detection result, and improves the correctness of edge extraction. As shown in Fig. 6, the fusion decision over the weight matrices proceeds through the following steps:
Step 5-1: read the voting weight matrix data.
Step 5-2: read the difference weight matrix data.
Step 5-3: read the edge distribution weight matrix data.
Step 5-4: fuse the three weight matrices with the Hadamard product to obtain the fused weight matrix.
Step 5-5: compute the reference value, i.e. the threshold.
The sum of all elements of the fused weight matrix is divided by the sum of the elements of the edge distribution weight matrix; since each element of the edge distribution weight matrix indicates, via the neighborhood variance analysis, whether the pixel is an edge candidate, this sum is the number of such candidates. The quotient is then multiplied by the coefficient 0.7 to obtain the reference value used as the threshold.
Step 5-6: binarize the fused weight matrix: if a value of the fused matrix is greater than or equal to the reference value, set the corresponding position of the fused matrix to 1; if it is less, set it to 0.
Step 5-7: output the fused matrix, obtaining a more robust edge extraction result based on the fusion of multiple edge extraction schemes.
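Steps 5-1 through 5-7 reduce to a few lines (a sketch; the three input matrices are assumed to come from the modules above, and the function name is mine):

```python
import numpy as np

def fuse_weights(vote_w, diff_w, dist_w, coef=0.7):
    """Hadamard-product fusion with the patent's data-dependent threshold."""
    fused = vote_w * diff_w * dist_w       # element-wise (Hadamard) product, step 5-4
    # Reference value: total fused weight per variance-selected edge candidate,
    # scaled by 0.7 (dist_w is 0/1, so its sum counts the candidates), step 5-5
    threshold = coef * fused.sum() / dist_w.sum()
    return (fused >= threshold).astype(np.uint8)   # step 5-6: binarize
```

Note that any pixel with a zero edge distribution weight is suppressed outright by the product, which is exactly why the voting matrix lifts its zeros to 0.5 rather than the distribution matrix.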
The specific embodiments described above further explain the object, technical solution and beneficial effects of the present invention. It should be understood that the foregoing are only specific embodiments of the invention and do not limit it; any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall be included within the scope of protection of the invention.
Claims (1)
1. An edge extraction method based on a fusion strategy, characterized by comprising the following steps:
Step 1: input a grayscale image.
Step 2: compute the voting weight matrix describing each pixel's likelihood of being an edge:
Step 2-1: input the grayscale image;
Step 2-2: perform edge detection with the Sobel operator;
Step 2-3: perform edge detection with the Canny operator;
Step 2-4: perform edge detection with the LoG operator;
Step 2-5: carry out weighted-voting statistics of the three operators' detection results at corresponding pixels;
here "at corresponding pixels" means that the statistics are taken over the 8-neighborhood of each detected pixel, separately for each of the three operators' results; the detected pixel, i.e. the center of the 8-neighborhood, is given weight 1, its four 4-neighborhood pixels are given weight 0.5, and the remaining four corner pixels of the 8-neighborhood are given weight 0.25; a position in the 8-neighborhood that is an edge is assigned the value 1, and the decision is made by weighting: if the resulting weighted sum is greater than or equal to 2, the pixel, i.e. the center of the 8-neighborhood, is considered an edge point detected by this operator and its vote is set to 1; if the weighted sum is less than 2, it is not considered an edge point and its vote is set to 0; summing the three operators' votes for each pixel in turn yields a weighted vote with value range [0, 3];
Step 3: compute the difference weight matrix describing each pixel's likelihood of being an edge:
Step 3-1: input the grayscale image;
Step 3-2: compute the luminance differences between each pixel and its four 4-neighborhood points;
Step 3-3: obtain the maximum and minimum of each pixel's four differences;
Step 3-4: normalize each pixel's maximum and minimum difference by substituting them into the logistic regression model;
the logistic regression model here is:
Y(x) = 1 / (1 + exp(a*x + b))
where x is the maximum or minimum luminance difference of a pixel, Y(x) is the corresponding normalized maximum or minimum luminance difference, and a and b are luminance-difference parameters: for the maximum luminance difference of each pixel, a = -0.1 and b = 2; for the minimum luminance difference of each pixel, a = -0.3 and b = 5;
Step 4: compute the edge distribution weight matrix describing each pixel's likelihood of being an edge:
Step 4-1: input the grayscale image;
Step 4-2: compute the luminance variance of the four decentered 4-neighborhood pixels of each pixel;
Step 4-3: average the variances of all pixels and multiply by the coefficient 0.8 to obtain a threshold;
Step 4-4: construct a matrix of the same size as the image, initialized to zero, as the initial edge distribution weight matrix;
Step 4-5: traverse all pixels of the image; if a pixel's deleted-neighborhood statistical variance is greater than or equal to the threshold, set the corresponding position of the edge distribution weight matrix to 1; otherwise keep the initial value 0, establishing the 0-1 weight matrix describing the edge distribution;
Step 5: fuse the three weight matrices:
Step 5-1: read the voting weight matrix data;
Step 5-2: read the difference weight matrix data;
Step 5-3: read the edge distribution weight matrix data;
Step 5-4: fuse the three weight matrices with the Hadamard product to obtain the fused weight matrix;
Step 5-5: compute the reference value, i.e. the threshold;
here the sum of all elements of the fused weight matrix is divided by the sum of the elements of the edge distribution weight matrix and multiplied by the coefficient 0.7, giving the reference value used as the threshold;
Step 5-6: binarize the fused weight matrix: if a value of the fused matrix is greater than or equal to the reference value, set the corresponding position of the fused matrix to 1; if it is less than the reference value, set it to 0;
Step 5-7: output the fused matrix;
Step 6: output the edge image.
Priority Applications (1)
CN201310475874.8A (CN103530878B, en), priority date 2013-10-12, filing date 2013-10-12: A kind of edge extracting method based on fusion strategy
Publications (2)
CN103530878A, published 2014-01-22
CN103530878B, published 2016-01-13
Family ID: 49932857
Family Applications (1)
CN201310475874.8A, priority date 2013-10-12, filing date 2013-10-12, status: Expired - Fee Related (CN103530878B)
Country Status (1)
CN: CN103530878B (en)
Citations (4)
Publication number  Priority date  Publication date  Assignee  Title 

CN101286233A (en) *  20080519  20081015  重庆邮电大学  Fuzzy edge detection method based on object cloud 
CN101968885A (en) *  20100925  20110209  西北工业大学  Method for detecting remote sensing image change based on edge and grayscale 
US20120308153A1 (en) *  20110603  20121206  Sunghyun Hwang  Device and method of removing noise in edge area 
JP2013165352A (en) *  20120209  20130822  Canon Inc  Imaging apparatus, control method of the same and program 

Non-Patent Citations (2)
Title 

JIA XIBIN et al.: "A novel edge detection in medical images by fusing of multi-model from different spatial structure clues", 2nd International Conference on Biomedical Engineering and Biotechnology (ICBEB)
ZHANG YIN et al.: "Design of color edge detection operators for text extraction under complex background", Journal of Software
Cited By (13)
Publication number  Priority date  Publication date  Assignee  Title 

CN105139384A (en) *  20150811  20151209  北京天诚盛业科技有限公司  Defective capsule detection method and apparatus 
CN105139384B (en) *  20150811  20171226  北京天诚盛业科技有限公司  The method and apparatus of defect capsule detection 
CN106504263A (en) *  20161104  20170315  辽宁工程技术大学  A kind of quick continuous boundary extracting method of image 
CN106504263B (en) *  20161104  20190712  辽宁工程技术大学  A kind of quick continuous boundary extracting method of image 
CN108734158B (en) *  20170414  20200519  成都唐源电气股份有限公司  Real-time train number identification method and device 
CN108734158A (en) *  20170414  20181102  成都唐源电气股份有限公司  A kind of real-time train number identification method and device 
CN107909555A (en) *  20171127  20180413  北京大恒图像视觉有限公司  A kind of gridding noise elimination method for keeping acutance 
CN107909555B (en) *  20171127  20200602  北京大恒图像视觉有限公司  Sharpness-keeping grid noise elimination method 
CN108513044B (en) *  20180416  20201113  深圳市华星光电技术有限公司  Image smoothing method, electronic device and computer readable storage medium 
CN108513044A (en) *  20180416  20180907  深圳市华星光电技术有限公司  Picture smooth treatment method, electronic device and computer readable storage medium 
CN108830870A (en) *  20180521  20181116  千寻位置网络有限公司  Satellite image high-precision field boundary extracting method based on multi-scale model study 
CN110298858A (en) *  20190701  20191001  北京奇艺世纪科技有限公司  A kind of image cropping method and device 
CN111300987A (en) *  20200227  20200619  深圳怡化电脑股份有限公司  Ink jet interval time determining method, device, computer equipment and storage medium 
Also Published As
Publication number  Publication date 

CN103530878B (en)  20160113 
Similar Documents
Publication  Publication Date  Title 

US20180068461A1 (en)  Posture estimating apparatus, posture estimating method and storing medium  
Rathod et al.  Image processing techniques for detection of leaf disease  
CN103942577B (en)  Based on the personal identification method for establishing sample database and composite character certainly in video monitoring  
DeMaeztu et al.  Linear stereo matching  
Lu et al.  Cross-based local multipoint filtering  
CN104408460B (en)  A kind of lane detection and tracking detection method  
CN103984961B (en)  A kind of image detecting method for being used to detect underbody foreign matter  
Miri et al.  Retinal image analysis using curvelet transform and multistructure elements morphology by reconstruction  
Khan et al.  An optimized method for segmentation and classification of apple diseases based on strong correlation and genetic algorithm based feature selection  
EP2888718B1 (en)  Methods and systems for automatic location of optic structures in an image of an eye, and for automatic retina cup-to-disc ratio computation  
CN103164858B (en)  Adhesion crowd based on superpixel and graph model is split and tracking  
CN107133948B (en)  Image blurring and noise evaluation method based on multitask convolution neural network  
US7224831B2 (en)  Method, apparatus and program for detecting an object  
US8055018B2 (en)  Object image detection method  
Khan  Fingerprint image enhancement and minutiae extraction  
CN102903122B (en)  Video object tracking method based on feature optical flow and online ensemble learning  
CN105046245B (en)  Video human face method of determination and evaluation  
CN105917353A (en)  Feature extraction and matching and template update for biometric authentication  
CN101271514B (en)  Image detection method and device for fast object detection and objective output  
US20140037159A1 (en)  Apparatus and method for analyzing lesions in medical image  
CN101567044B (en)  Method for detecting quality of human face image  
US20020146178A1 (en)  System and method for fingerprint image enhancement using partitioned least-squared filters  
CN103778636B (en)  A kind of feature construction method for nonreference picture quality appraisement  
Wang et al.  iVAT and aVAT: enhanced visual analysis for cluster tendency assessment  
US9286537B2 (en)  System and method for classifying a skin infection 
Legal Events
Date  Code  Title  Description 

PB01  Publication  
C06  Publication  
SE01  Entry into force of request for substantive examination  
C10  Entry into substantive examination  
GR01  Patent grant  
C14  Grant of patent or utility model  
CF01  Termination of patent right due to nonpayment of annual fee 
Granted publication date: 20160113 Termination date: 20191012 
