CN106815851B - Automatic reading method for grid-type circular oil level gauges based on vision measurement - Google Patents
Automatic reading method for grid-type circular oil level gauges based on vision measurement
- Publication number
- CN106815851B CN106815851B CN201710059293.4A CN201710059293A CN106815851B CN 106815851 B CN106815851 B CN 106815851B CN 201710059293 A CN201710059293 A CN 201710059293A CN 106815851 B CN106815851 B CN 106815851B
- Authority
- CN
- China
- Prior art keywords
- image
- pixel
- value
- grid
- indicate
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Image Analysis (AREA)
Abstract
The invention discloses a vision-measurement-based automatic reading method for grid-type circular oil level gauges, belonging to the field of intelligent instrument reading in industrial machine-vision applications. The method locates the grid with a maximally stable extremal regions (MSER) matching algorithm, determines the longest grid with a directed strong-edge search, and computes the percentage of remaining fuel oil from the length of the longest grid. Tests on multiple images show that, compared with other intelligent instrument-reading methods, the algorithm captures the principal feature of this class of gauge, the grid, which remains highly salient under interference such as image rotation and blur; the proposed method therefore has good robustness.
Description
Technical field
The invention is a vision-measurement-based automatic reading method for grid-type circular oil level gauges, used for accurately reading such gauges, and belongs to the field of machine vision.
Background technique
Automatic industrial instrument reading mainly uses pattern recognition to extract a specific instrument, its range, and its reading position, thereby obtaining the reading. Compared with manual reading it offers uninterrupted 24-hour operation, savings in manual labor, high stability, and high efficiency. Intelligent instrument reading therefore has high research value in industrial applications.
The performance of an automatic instrument-reading algorithm is mainly reflected in its ability to locate the instrument, extract the reading, and restore and correct the image. At present, legacy instruments come in many varieties: pointer-type instruments are common, and there are also instruments read from a liquid-level height or from the height of a float; different instruments have different measuring ranges and reading modes. During imaging, rotation, illumination changes, and occlusion are the main sources of interference and the main difficulties for image restoration.
Common automatic instrument-reading algorithms include the Hough transform, weighted averaging, mathematical morphology, edge detection, threshold segmentation, and active contours. Because too much background information enters the image during acquisition, compounded by uneven illumination and occlusion, key information such as the pointer is difficult to extract precisely from the meter reading. These methods can accomplish intelligent instrument reading to a certain extent, but their ability to handle complex backgrounds and diverse instruments is limited.
Summary of the invention
In view of the above shortcomings, the present invention provides a vision-measurement-based automatic reading method for grid-type circular oil level gauges, solving the problem in the prior art that complex backgrounds entering the image during acquisition, together with uneven illumination and occlusion, prevent accurate automatic instrument reading.
To achieve the above goals, the technical solution adopted by the invention is as follows:
A vision-measurement-based automatic reading method for grid-type circular oil level gauges, characterized by comprising the following steps:
Step 1: read in the initial image f(x, y), remove noise from it and enhance it, obtaining the denoised and enhanced image f_enhance(x, y);
Step 2: evaluate the clarity of the denoised and enhanced image f_enhance(x, y); if the clarity of f_enhance(x, y) is low, return to step 1 for further enhancement; otherwise take the high-clarity image f_enhance(x, y) as the instrument image f_2(x, y);
Step 3: invert the instrument image f_2(x, y) to obtain the mirror image f_invert(x, y), and extract the regions of interest of f_2(x, y) and f_invert(x, y) with a maximally stable extremal regions (MSER) matching algorithm, obtaining the segmented images f_MSER+(x, y) and f_MSER-(x, y) respectively;
Step 4: on the basis of the segmented images f_MSER+(x, y) and f_MSER-(x, y), first fuse them into a single result image f_bitand(x, y) with a bitwise AND operation, then apply morphological filtering to f_bitand(x, y) to extract the grid image f_3(x, y);
Step 5: segment the grid of f_3(x, y) into the foreground with optimal threshold segmentation, obtaining the grid-segmented image f_4(x, y);
Step 6: extract the grid with the greatest vertical extent in f_4(x, y), the longest grid L_ymax(x, y); the minimum ordinate of L_ymax(x, y) is the maximum of the range, and its maximum ordinate is the position of the liquid level;
Step 7: in the vertical direction, search for a strong edge from the maximum ordinate of L_ymax(x, y) down to the bottom of f_4(x, y); the strong edge found is the starting position of the range, from which the range size is obtained and the reading completed.
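The arithmetic that turns the three ordinates located by steps 6 and 7 into a reading can be sketched as follows. This is an illustrative sketch only, not part of the claimed method; the function name and the assumption of a linear scale between the two range marks are the author's additions.

```python
def oil_level_percentage(y_range_top, y_liquid, y_range_start):
    """Percentage of remaining oil from the three ordinates of steps 6-7.

    Image ordinates grow downward, so y_range_top (range maximum, the
    minimum ordinate of the longest grid) < y_liquid (liquid level, its
    maximum ordinate) <= y_range_start (the strong edge found below).
    Assumes the gauge scale is linear between the two range marks.
    """
    span = y_range_start - y_range_top       # full-scale span in pixels
    return 100.0 * (y_range_start - y_liquid) / span
```

For example, a liquid level halfway between the two range marks yields a reading of 50 percent.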
Further, the specific steps of step 1 are as follows:
Step 11: read the initial image f(x, y);
Step 12: convert the initial image f(x, y) into the gray-level image f_gray(x, y), where the gray value of each pixel of f_gray(x, y) is given by:
f_gray(x, y) = 0.299 R_f(x, y) + 0.587 G_f(x, y) + 0.114 B_f(x, y)
where f_gray(x, y) denotes the gray value of each pixel of the gray-level image, R_f(x, y), G_f(x, y), and B_f(x, y) denote the pixel values of the red, green, and blue channels of the initial image f(x, y), and (x, y) indexes the pixels of the initial image;
Step 13: apply bilateral filtering to the gray-level image f_gray(x, y) to obtain the denoised image f_denoise(x, y); the bilateral filtering formula is:
f_denoise(x, y) = Σ_(k,l) f_gray(k, l) ω(x, y, k, l) / Σ_(k,l) ω(x, y, k, l)
where f_gray(k, l) is the gray-level image f_gray(x, y), f_denoise(x, y) is the denoised image, (k, l) is a pixel coordinate of the gray-level image, (x, y) is a pixel coordinate of the denoised image, and ω(x, y, k, l) is the bilateral filtering weight, which is the product of a domain kernel and a range kernel:
d(x, y, k, l) = exp(−((x − k)² + (y − l)²) / (2σ_d²))
r(x, y, k, l) = exp(−‖f_gray(x, y) − f_gray(k, l)‖² / (2σ_r²))
ω(x, y, k, l) = d(x, y, k, l) · r(x, y, k, l)
where d(x, y, k, l) is the domain kernel with variance σ_d² and r(x, y, k, l) is the range kernel with variance σ_r²;
Step 14: apply histogram specification to the bilaterally filtered image f_denoise(x, y) to obtain the enhanced image f_enhance(x, y); the specific steps of histogram specification are as follows:
Step 141: compute the histogram p_r(r) of the denoised image f_denoise(x, y), where r denotes gray value, and use p_r(r) to form the histogram equalization transform
s_k = (L − 1)/(MN) · Σ_(j=0..k) n_j, for k = 0, 1, …, L − 1
where s_k is the equalized gray value corresponding to level k, L is the number of gray levels in the image (256 for an 8-bit image), MN is the total number of pixels in the equalized image, M its number of rows and N its number of columns, and n_j is the number of pixels with gray level j in the bilaterally filtered image f_denoise(x, y); s_k is then rounded to an integer in the range [0, L − 1];
Step 142: for q = 0, 1, 2, …, L − 1, compute all values of the transform function G(z_q):
G(z_q) = (L − 1) · Σ_(i=0..q) p_z(z_i)
where z_i denotes the mapped gray value i, p_z(z_i) is the fraction of pixels with value z_i in the specified (target) histogram, q indexes the summation from 0 to z_q, and G(z_q) denotes the equalized value of the specified histogram; the values G(z_q) are rounded to integers in the range [0, L − 1] and stored in a table;
Step 143: for each value s_k, use the stored values G(z_q) to find the corresponding z_q such that G(z_q) is closest to s_k, and store the mapping from s_k to z_q; when more than one z_q satisfies a given s_k, i.e. when the mapping is not unique, choose the smallest value;
Step 144: first apply the same equalization as in step 141 to the input denoised image f_denoise(x, y); then, using the mapping found in step 143, map each equalized pixel value s_k of f_denoise(x, y) to the corresponding value z_q in the histogram-specified image.
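Steps 13 and 14 above can be sketched in NumPy as follows. This is a minimal illustration, not the claimed implementation: the naive O(N·radius²) bilateral loop and the CDF-based histogram matching are standard textbook forms, and the parameter values are assumptions.

```python
import numpy as np

def bilateral_filter(img, sigma_d=2.0, sigma_r=30.0, radius=2):
    """Naive bilateral filter of step 13: each output pixel is a weighted
    mean whose weight is the product of the domain (spatial) kernel and
    the range (intensity) kernel.  Illustration only, O(N * radius^2)."""
    h, w = img.shape
    out = np.empty((h, w), dtype=float)
    for x in range(h):
        for y in range(w):
            ks = slice(max(x - radius, 0), min(x + radius + 1, h))
            ls = slice(max(y - radius, 0), min(y + radius + 1, w))
            patch = img[ks, ls].astype(float)
            kk, ll = np.mgrid[ks, ls]
            dom = np.exp(-((x - kk) ** 2 + (y - ll) ** 2) / (2 * sigma_d ** 2))
            ran = np.exp(-(float(img[x, y]) - patch) ** 2 / (2 * sigma_r ** 2))
            wgt = dom * ran
            out[x, y] = (patch * wgt).sum() / wgt.sum()
    return out

def match_histogram(src, ref, L=256):
    """Histogram specification of step 14: equalize the source histogram
    (the s_k transform) and invert the target transform G(z_q), keeping
    the smallest z_q when the mapping is not unique (step 143)."""
    s_cdf = np.cumsum(np.bincount(src.ravel(), minlength=L)) / src.size
    g_cdf = np.cumsum(np.bincount(ref.ravel(), minlength=L)) / ref.size
    mapping = np.clip(np.searchsorted(g_cdf, s_cdf), 0, L - 1)
    return mapping[src].astype(np.uint8)
```

In the patent's pipeline, `ref` would be the histogram of a known clear gauge image.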
Further, the specific steps of step 2 are as follows:
Step 21: evaluate the clarity of the denoised and enhanced image f_enhance(x, y); clarity is evaluated from the magnitude of the image gradient, computed as
‖∇f_enhance(x, y)‖₂ = sqrt((∂f_enhance/∂x)² + (∂f_enhance/∂y)²)
where ∇f_enhance(x, y) denotes the gradient vector of the denoised and enhanced image f_enhance(x, y), ‖·‖₂ denotes the 2-norm of a vector (here the magnitude of the gradient vector), ∂f_enhance/∂x is the partial derivative in the x direction, and ∂f_enhance/∂y is the partial derivative in the y direction;
Step 22: if the denoised and enhanced image f_enhance(x, y) is judged to have high clarity, mark it as the instrument image f_2(x, y); if it is judged to have low clarity, change the histogram mapping function of step 14 and return f_enhance(x, y) to step 14 for another histogram specification.
Further, the specific steps of the maximally stable extremal regions (MSER) matching algorithm of step 3 are as follows:
Step 31: invert the high-clarity instrument image f_2(x, y) to obtain the mirror image f_invert(x, y), where the pixel values of f_invert(x, y) are given by
f_invert(x, y) = 255 − f_2(x, y)
where f_2(x, y) denotes the pixel value of each pixel of the instrument image and f_invert(x, y) the pixel value of each pixel of the mirror image;
Step 32: sort all pixels of the instrument image f_2(x, y) and of the mirror image f_invert(x, y) by pixel value, respectively;
Step 33: following step 32, select from f_2(x, y) and from f_invert(x, y) the pixels with the minimum pixel value g_min as source points; merge any other source points within the four-neighborhood of a source point into the same connected component; each such connected component forms a node of a tree data structure; then iterate toward larger pixel values;
Step 34: consider, in f_2(x, y) and f_invert(x, y) respectively, the pixels whose current gray value is g: add each such pixel to the connected component adjacent to it in its four-neighborhood, and update the tree data structure of connected components built in step 33; when two or more connected components merge into one, the merged component is assigned a new node, which becomes the parent node of the original nodes;
Step 35: if a pixel with gray value g belongs to two or more connected components simultaneously, merge those components into one; then determine whether the current pixel value is the maximum pixel value g_max; if not, execute step 33; if it is, execute step 36;
Step 36: once all pixels of f_2(x, y) and f_invert(x, y) have been processed, a data structure of connected-component areas, a tree, is obtained; each update of g acts as a threshold on this tree, and each node of the tree can be viewed as an extremal region containing an extremum;
Step 37: let Q_i, …, Q_j be a sequence of mutually nested extremal regions, where g_min ≤ i < j ≤ g_max and Q_g ⊂ Q_(g+1) for g_min ≤ g < g_max; the extremal region Q_g is maximally stable when
q(g) = |Q_(g+Δ) \ Q_(g−Δ)| / |Q_g|
attains a local minimum at g, where |·| denotes the cardinality of a set (a measure of how many elements it contains) and Δ is a parameter of the model; finally, the instrument image f_2(x, y) yields the segmented image f_MSER+(x, y) containing the instrument's grid connected regions, and the mirror image f_invert(x, y) yields the segmented image f_MSER-(x, y) containing the instrument's grid connected regions.
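The stability criterion q(g) of step 37 can be illustrated with a tiny threshold-sweep sketch. This is not the union-find tree of steps 33-36 (a production MSER, e.g. OpenCV's, builds that incrementally); it simply measures, for one seed pixel, the area of the 4-connected extremal region at each threshold. The seed-based formulation is an assumption made for brevity.

```python
import numpy as np
from collections import deque

def region_area(img, seed, g):
    """Area |Q_g| of the extremal region (pixels <= g) that is
    4-connected to `seed` (steps 33-36 grow these regions as g rises)."""
    h, w = img.shape
    mask = img <= g
    if not mask[seed]:
        return 0
    seen = np.zeros((h, w), dtype=bool)
    seen[seed] = True
    q = deque([seed])
    while q:                       # breadth-first flood fill
        x, y = q.popleft()
        for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
            if 0 <= nx < h and 0 <= ny < w and mask[nx, ny] and not seen[nx, ny]:
                seen[nx, ny] = True
                q.append((nx, ny))
    return int(seen.sum())

def stability(img, seed, g, delta=2):
    """q(g) = |Q_(g+delta) \\ Q_(g-delta)| / |Q_g| of step 37; a local
    minimum of q(g) marks a maximally stable extremal region."""
    return (region_area(img, seed, g + delta)
            - region_area(img, seed, g - delta)) / region_area(img, seed, g)
```

A dark gauge grid on a bright dial gives q(g) ≈ 0 over a wide band of thresholds, which is why the grid survives as an MSER.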
Further, the specific steps of step 4 are as follows:
Step 41: perform a bitwise AND on the segmented images f_MSER+(x, y) and f_MSER-(x, y) to obtain the bitwise-AND result image f_bitand(x, y); the bitwise AND represents each pixel value of the two segmented images in binary and applies the AND operation to the pixel values at identical coordinates;
Step 42: apply a morphological closing to the result image f_bitand(x, y) to obtain the closed image f_close(x, y), i.e. connect the connected components of f_bitand(x, y) that contain the instrument grid, so that step 43 can easily extract the grid region; the closing is
f_close = f_bitand • B = (f_bitand ⊕ B) ⊖ B
where "•" denotes closing, ⊕ denotes morphological dilation, ⊖ denotes morphological erosion, and B denotes the structuring element; in general, B is taken as a cross-shaped structuring element; dilation and erosion are given by
A ⊕ B = { z | (B̂)_z ∩ A ≠ ∅ }
A ⊖ B = { z | (B)_z ⊆ A }
where B̂ denotes the reflection of B about its origin and (B)_z = { w | w = b + z, b ∈ B } denotes B with its origin translated to the point z;
Step 43: enclose each connected component of the closed image f_close(x, y) with a minimum bounding rectangle; using the area and aspect ratio of the minimum bounding rectangle of the grid-region connected component, the grid image f_3(x, y) can be extracted.
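Steps 41-43 can be sketched on boolean masks as follows. This is an illustrative sketch assuming the 3x3 cross structuring element named in step 42; a real pipeline would more likely call a library morphology routine, and the border handling here (erosion treats out-of-image pixels as foreground) is a simplification.

```python
import numpy as np

def dilate(mask):
    """Dilation by the 3x3 cross structuring element B of step 42."""
    out = mask.copy()
    out[1:, :] |= mask[:-1, :]
    out[:-1, :] |= mask[1:, :]
    out[:, 1:] |= mask[:, :-1]
    out[:, :-1] |= mask[:, 1:]
    return out

def erode(mask):
    """Erosion by the same cross element, via duality with dilation
    (out-of-image pixels count as foreground in this sketch)."""
    return ~dilate(~mask)

def close_cross(mask):
    """Morphological closing: (f_bitand dilate B) erode B (step 42)."""
    return erode(dilate(mask))

def fuse(mser_pos, mser_neg):
    """Step 41: bitwise AND of the two MSER segmentations."""
    return mser_pos & mser_neg

def bounding_rect(mask):
    """Minimum bounding rectangle of the nonzero pixels (step 43)."""
    ys, xs = np.nonzero(mask)
    return int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max())
```

Closing with the cross element bridges one-pixel gaps between grid fragments, which is exactly what step 42 relies on before the rectangles of step 43 are fitted.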
Further, the formula for the optimal threshold segmentation of step 5 is:
f_4(x, y) = 1 if f_3(x, y) > k, and 0 otherwise, where k maximizes σ²(k) = (m_G P_1(k) − m(k))² / (P_1(k)(1 − P_1(k)))
where f_4(x, y) is the grid-segmented image after optimal threshold segmentation, f_3(x, y) is the grid image, k is the adaptive optimal threshold, m(k) is the cumulative mean up to gray level k, m_G is the mean gray value of the whole image (the image here being the grid image f_3(x, y)), i.e. the global mean, p_i is the fraction of pixels with value i in the image (the grid image f_3(x, y)), and P_1(k) = Σ_(i=0..k) p_i; assuming the threshold T(k) = k and using it to divide the input image (the grid image f_3(x, y)) into two classes C_1 and C_2, P_1(k) is the probability that a pixel is assigned to class C_1, 1 − P_1(k) the probability that it is assigned to class C_2, and σ² is the between-class variance of C_1 and C_2.
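The between-class-variance criterion of step 5 (Otsu's method) can be sketched directly from the quantities defined above; the helper names are the author's.

```python
import numpy as np

def otsu_threshold(img, L=256):
    """Optimal threshold k of step 5, maximising the between-class
    variance sigma^2(k) = (m_G*P1(k) - m(k))^2 / (P1(k)*(1 - P1(k)))."""
    p = np.bincount(img.ravel(), minlength=L) / img.size   # p_i
    P1 = np.cumsum(p)                                      # P1(k)
    m = np.cumsum(np.arange(L) * p)                        # cumulative mean m(k)
    mG = m[-1]                                             # global mean m_G
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma2 = (mG * P1 - m) ** 2 / (P1 * (1 - P1))
    return int(np.nanargmax(sigma2))

def threshold(img, k):
    """f_4(x, y): pixels above k (the grid, assumed brighter) as foreground."""
    return img > k
```

On a bimodal grid image the maximizer k lands between the two modes, separating grid from background.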
Further, the specific steps of step 6 are as follows:
Step 61: represent each grid connected component in the thresholded image f_4(x, y) by its minimum bounding rectangle L_i(x, y), i ∈ {1, 2, …, N}, where N is the number of connected components in the grid-segmented image f_4(x, y);
Step 62: sort all minimum bounding rectangles L_i(x, y) by the length of their vertical side and find the longest bounding rectangle L_ymax(x, y); the minimum ordinate of L_ymax(x, y) is the maximum of the range, and its maximum ordinate is the position of the liquid level.
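Step 62 reduces to a maximum over vertical extents; a sketch, with the rectangle tuple layout assumed:

```python
def longest_grid(rects):
    """Step 62: among the minimum bounding rectangles L_i, given here as
    (y_min, x_min, y_max, x_max) tuples, pick the one with the greatest
    vertical extent; its y_min is the full-scale (range maximum) position
    and its y_max the liquid-level position."""
    best = max(rects, key=lambda r: r[2] - r[0])
    return best[0], best[2]
```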
Further, the specific steps of step 7 are as follows:
Step 71: convert the maximum ordinate y_gridmax and minimum ordinate y_gridmin of the longest grid bounding rectangle L_ymax(x, y) in the grid-segmented image f_4(x, y) into the corresponding maximum and minimum ordinates of L_ymax(x, y) in the denoised, enhanced, high-clarity instrument image f_2(x, y);
Step 72: starting from the maximum ordinate of the longest grid bounding rectangle L_ymax(x, y) in the instrument image f_2(x, y), take a 20 × 20 window and perform Canny edge detection inside it; if no strong edge is detected, move the window 20 pixel positions in the vertical direction and repeat the Canny edge detection until a strong edge is detected; that position is the starting position of the instrument range, and the loop stops; the specific steps of Canny edge detection are as follows:
Step 721: smooth the input image with a Gaussian filter to obtain the smoothed image:
f_s(x, y) = G(x, y) * f_20×20(x, y)
where f_s(x, y) is the smoothed image, f_20×20(x, y) is the subimage selected by the 20 × 20 window from the instrument image f_2(x, y), G(x, y) is the Gaussian function with variance σ², (x, y) are the pixel coordinates of the subimage f_20×20(x, y), and "*" denotes convolution;
Step 722: from the smoothed image, extract the gradient magnitude image and the gradient angle image:
M(x, y) = sqrt((∂f_s/∂x)² + (∂f_s/∂y)²)
α(x, y) = arctan((∂f_s/∂y) / (∂f_s/∂x))
where M(x, y) is the gradient magnitude image, α(x, y) the gradient angle image, ∂f_s/∂x the partial derivative of the smoothed image f_s(x, y) in the x direction, and ∂f_s/∂y its partial derivative in the y direction;
Step 723: apply non-maximum suppression to the gradient magnitude image M(x, y): first, let d_1, d_2, d_3, and d_4 denote the four basic edge directions, i.e. horizontal (0°), −45°, vertical (90°), and 45°; then find the d_k (k = 1, 2, 3, 4) closest to the gradient angle α(x, y); finally, if the value of M(x, y) is less than either of its two neighbors along the direction d_k, set g_N(x, y) = 0 (suppression), otherwise set g_N(x, y) = M(x, y), where g_N(x, y) is the image after non-maximum suppression and the subscript N denotes non-maximum suppression;
Step 724: detect the edges of the non-maximum-suppressed image g_N(x, y) with dual thresholding, i.e. apply two different thresholds to g_N(x, y) simultaneously:
g_NH(x, y) = g_N(x, y) ≥ T_H
g_NL(x, y) = g_N(x, y) ≥ T_L
where T_H is the high threshold, T_L the low threshold, g_NH(x, y) the image obtained by thresholding g_N(x, y) at T_H, and g_NL(x, y) the image obtained by thresholding g_N(x, y) at T_L; after thresholding, g_NH(x, y) has fewer nonzero elements than g_NL(x, y), and all nonzero pixels of g_NH(x, y) are contained in g_NL(x, y); then
g'_NL(x, y) = g_NL(x, y) − g_NH(x, y);
in this formula, all nonzero elements of the high-threshold image g_NH(x, y) are deleted from the low-threshold image g_NL(x, y); the nonzero pixels of g_NH(x, y) and g'_NL(x, y) can then be regarded as "strong" and "weak" edge pixels respectively;
Step 725: after thresholding, all strong edge pixels in the high-threshold image g_NH(x, y) are valid edge pixels and are marked; because the edges in g_NH(x, y) contain gaps, longer edges must be obtained for Canny edge detection to form the final output image, as follows:
(a) in the high-threshold image g_NH(x, y), locate the next unvisited edge pixel p clockwise through the 8-neighborhood of the currently located pixel;
(b) mark as valid edge pixels all weak pixels in g'_NL(x, y) that are connected to p by 8-connectivity;
(c) if, via 8-connectivity, all nonzero pixels of g_NH(x, y) have been visited, jump to step (d); otherwise return to step (a);
(d) set to zero all pixels of g'_NL(x, y) not marked as valid edge pixels; at this point the gaps in the edges of the high-threshold image g_NH(x, y) are filled, yielding longer edges;
(e) add all nonzero pixels of g'_NL(x, y) to g_NH(x, y), forming the final output image g(x, y) of Canny edge detection.
In conclusion by adopting the above-described technical solution, the beneficial effects of the present invention are:
One, MSER is utilized there are obvious and stable edge and gradient distribution for instrument in image and ambient background
Algorithm can steadily by image instrument and background distinguish, the present invention is compared with other intelligence instrument number reading methods, algorithm
Catch the main feature of such instrument --- grid, grid still have well when in face of the interference such as image rotation, fuzzy
Conspicuousness, so method proposed by the present invention has preferable robustness;
Two, good optimization has been obtained in MSER algorithm, realizes that the OpenCV program of MSER algorithm has operation very well
Efficiency and testing result, it is 0.488 second that the secondary size of MSER algorithm process one, which is 704 pixels × 576 pixels Instrument image time-consuming,.
Detailed description of the drawings
Fig. 1 is a flow diagram of the invention;
Fig. 2 is an initial image of the invention;
Fig. 3 is a denoised and enhanced image in the invention;
Fig. 4 is the gradient map of a denoised and enhanced image in the invention;
Fig. 5 is the segmented image f_MSER+(x, y) in the invention;
Fig. 6 is the segmented image f_MSER-(x, y) in the invention;
Fig. 7 is the bitwise-AND result image f_bitand(x, y) in the invention;
Fig. 8 is the extracted grid-region image in the invention;
Fig. 9 is the grid-segmented image f_4(x, y) in the invention, i.e. a schematic diagram of searching for the longest grid and determining the full-scale and liquid-level positions of the instrument;
Fig. 10 is a schematic diagram of determining the starting position of the instrument in the invention;
Fig. 11 is a schematic diagram of the final reading result in the invention.
Specific embodiments
In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further elaborated below with reference to the accompanying drawings and embodiments. It should be understood that the specific examples described here are only used to explain the present invention, not to limit it.
A vision-measurement-based automatic reading method for grid-type circular oil level gauges, characterized by comprising the following steps:
Step 11: read the initial image f(x, y);
Step 12: convert the initial image f(x, y) into the gray-level image f_gray(x, y), where the gray value of each pixel of f_gray(x, y) is given by:
f_gray(x, y) = 0.299 R_f(x, y) + 0.587 G_f(x, y) + 0.114 B_f(x, y)
where f_gray(x, y) denotes the gray value of each pixel of the gray-level image, R_f(x, y), G_f(x, y), and B_f(x, y) denote the pixel values of the red, green, and blue channels of the initial image f(x, y), and (x, y) indexes the pixels of the initial image;
Step 13: apply bilateral filtering to the gray-level image f_gray(x, y) to obtain the denoised image f_denoise(x, y); the bilateral filtering formula is:
f_denoise(x, y) = Σ_(k,l) f_gray(k, l) ω(x, y, k, l) / Σ_(k,l) ω(x, y, k, l)
where f_gray(k, l) is the gray-level image f_gray(x, y), f_denoise(x, y) is the denoised image, (k, l) is a pixel coordinate of the gray-level image, (x, y) is a pixel coordinate of the denoised image, and ω(x, y, k, l) is the bilateral filtering weight, which is the product of a domain kernel and a range kernel:
d(x, y, k, l) = exp(−((x − k)² + (y − l)²) / (2σ_d²))
r(x, y, k, l) = exp(−‖f_gray(x, y) − f_gray(k, l)‖² / (2σ_r²))
ω(x, y, k, l) = d(x, y, k, l) · r(x, y, k, l)
where d(x, y, k, l) is the domain kernel with variance σ_d² and r(x, y, k, l) is the range kernel with variance σ_r²;
Step 14: apply histogram specification to the bilaterally filtered image f_denoise(x, y) to obtain the enhanced image f_enhance(x, y); the specific steps of histogram specification are as follows:
Step 141: compute the histogram p_r(r) of the denoised image f_denoise(x, y), where r denotes gray value, and use p_r(r) to form the histogram equalization transform
s_k = (L − 1)/(MN) · Σ_(j=0..k) n_j, for k = 0, 1, …, L − 1
where s_k is the equalized gray value corresponding to level k, L is the number of gray levels in the image (256 for an 8-bit image), MN is the total number of pixels in the equalized image, M its number of rows and N its number of columns, and n_j is the number of pixels with gray level j in the bilaterally filtered image f_denoise(x, y); s_k is then rounded to an integer in the range [0, L − 1];
Step 142: for q = 0, 1, 2, …, L − 1, compute all values of the transform function G(z_q):
G(z_q) = (L − 1) · Σ_(i=0..q) p_z(z_i)
where z_i denotes the mapped gray value i, p_z(z_i) is the fraction of pixels with value z_i in the specified (target) histogram, q indexes the summation from 0 to z_q, and G(z_q) denotes the equalized value of the specified histogram; the values G(z_q) are rounded to integers in the range [0, L − 1] and stored in a table;
Step 143: for each value s_k, use the stored values G(z_q) to find the corresponding z_q such that G(z_q) is closest to s_k, and store the mapping from s_k to z_q; when more than one z_q satisfies a given s_k, i.e. when the mapping is not unique, choose the smallest value;
Step 144: first apply the same equalization as in step 141 to the input denoised image f_denoise(x, y); then, using the mapping found in step 143, map each equalized pixel value s_k of f_denoise(x, y) to the corresponding value z_q in the histogram-specified image. By mapping the histogram of the input image onto the histogram of a known clear image, the edges and other information of the image are enhanced.
Step 2: evaluate the clarity of the denoised and enhanced image f_enhance(x, y); if the clarity of f_enhance(x, y) is low, return to step 1 for further enhancement; otherwise take the high-clarity image f_enhance(x, y) as the instrument image f_2(x, y). The specific steps are as follows:
Step 21: evaluate the clarity of the denoised and enhanced image f_enhance(x, y); clarity is evaluated from the magnitude of the image gradient, computed as
‖∇f_enhance(x, y)‖₂ = sqrt((∂f_enhance/∂x)² + (∂f_enhance/∂y)²)
where ∇f_enhance(x, y) denotes the gradient vector of the denoised and enhanced image f_enhance(x, y), ‖·‖₂ denotes the 2-norm of a vector (here the magnitude of the gradient vector), ∂f_enhance/∂x is the partial derivative in the x direction, and ∂f_enhance/∂y is the partial derivative in the y direction;
Step 22: if the denoised and enhanced image f_enhance(x, y) is judged to have high clarity, mark it as the instrument image f_2(x, y); if it is judged to have low clarity, change the histogram mapping function of step 14 and return f_enhance(x, y) to step 14 for another histogram specification.
Step 3: by Instrument image f2(x, y) is negated, and obtains mirror image finvert(x, y), with maximum extreme value stability region
(Maximally Stable Extremal Regions, MSER) matching algorithm extracts Instrument image f2(x, y) and mirror image
finvertThe area-of-interest of (x, y) respectively obtains segmented image fMSER+(x, y) and segmented image fMSER-(x,y);Maximum extreme value
Specific step is as follows for stability region (MaximallyStableExtremalRegions, MSER) matching algorithm:
Step 31: by clear high Instrument image f2(x, y) is negated, and obtains mirror image finvert(x, y), wherein mirror image
Image finvertThe formula of the pixel value of (x, y) each pixel is as follows:
Wherein,Indicate Instrument image f2The pixel value of (x, y) each pixel,Indicate mirror image
As finvertThe pixel value of (x, y) each pixel;
Step 32: respectively to Instrument image f2(x, y) and mirror image finvertThe all pixels point of (x, y) is according to pixel
Value size is ranked up;
Step 33: according to step 32, respectively from Instrument image f2(x, y) and mirror image finvertChoosing in (x, y) has
Minimum pixel value gminPixel as source point, having the merger of other source points in four neighborhood of source point is the same connected component, should
Connected component constitutes a node of tree data structure, and then the pixel value of Xiang Geng great starts iteration;
Step 34: considering Instrument image f respectively2(x, y) and mirror image finvertCurrent grayvalue is the picture of g in (x, y)
The pixel that current grayvalue is g is added the connected component near four neighborhoods, and updates the connected component in step 33 by vegetarian refreshments
The tree data structure of composition: when two or more connected components are merged into one, the connected component being merged is divided
With a new node, and it is made into the father node of ancestor node;
Step 35: being g by gray value if the pixel that a gray value is g belongs to two or more connected components simultaneously
Pixel two or more connected component region merging techniques at one, then determine whether pixel value at this time is maximum picture
Plain value gmax, if not max pixel value gmax, execute step 33;If it is max pixel value gmax, execute step 36;
Step 36: Instrument image f2(x, y) and mirror image finvertThe all pixels point of (x, y) is processed, and is obtained
Data structure --- the fork tree of one connected component area when updating g each time, will function of the fork tree as a threshold value, fork sets
Each node can be seen as the extremal region comprising extreme value;
Step 37: let Qi, …, Qj be a sequence of mutually nested extremal regions, where gmin ≤ i < j ≤ gmax. An extremal region Qg, with gmin ≤ g < gmax, is maximally stable if the stability function
q(g) = |Qg+△ \ Qg−△| / |Qg|
has a local minimum at g, where | · | denotes the cardinality of a set, i.e. a measure of how many elements the set contains, and △ is a parameter of the model. Finally, the instrument image f2(x, y) yields the segmented image fMSER+(x, y) containing the instrument-grid connected regions, and the inverted image finvert(x, y) yields the segmented image fMSER-(x, y).
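The stability criterion of step 37 can be sketched in a minimal, single-seed form: sweep the gray levels, track the area of the 4-connected component (matching the four-neighborhood merging of steps 33–35) containing a seed pixel, and pick the level where q(g) = |Q(g+△) \ Q(g−△)| / |Q(g)| is smallest. The function names and the seed-based simplification are illustrative assumptions, not the patent's full component-tree implementation.

```python
from collections import deque

def component_area(img, seed, g):
    """Area of the 4-connected component of pixels <= g that contains seed."""
    h, w = len(img), len(img[0])
    if img[seed[0]][seed[1]] > g:
        return 0
    seen = {seed}
    q = deque([seed])
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # four-neighborhood
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in seen and img[ny][nx] <= g:
                seen.add((ny, nx))
                q.append((ny, nx))
    return len(seen)

def most_stable_level(img, seed, delta=1, g_min=0, g_max=255):
    """Gray level g minimising q(g) = |area(g+delta) - area(g-delta)| / area(g)
    over the nested regions Q(g) containing seed (simplified MSER criterion)."""
    best_g, best_q = None, float("inf")
    for g in range(g_min + delta, g_max - delta + 1):
        a = component_area(img, seed, g)
        if a == 0:
            continue
        q = (component_area(img, seed, g + delta)
             - component_area(img, seed, g - delta)) / a
        if q < best_q:
            best_q, best_g = q, g
    return best_g
```

Running both polarities (the image and its inversion, as in step 3) would recover dark-on-bright and bright-on-dark regions respectively.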
Step 4: on the basis of the segmented images fMSER+(x, y) and fMSER-(x, y), first fuse fMSER+(x, y) and fMSER-(x, y) into a single result image fbitand(x, y) with a bitwise AND operation, then apply morphological filtering to the result image fbitand(x, y) to extract the grating image f3(x, y); the specific steps are as follows:
Step 41: perform a bitwise AND of the segmented images fMSER+(x, y) and fMSER-(x, y) to obtain the AND result image fbitand(x, y); the bitwise AND operation represents each pixel value of the two images in binary and then applies the AND operation to the pixel values at identical coordinates of the two images;
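Step 41 reduces to a per-pixel AND once both segmented images are held as arrays of pixel values; a minimal sketch, with images as nested lists and an illustrative helper name:

```python
def bit_and(img_a, img_b):
    """Bitwise AND fusion of two equally sized images (step 41): each pixel
    value is treated as a binary number and ANDed with the value at the
    same coordinates in the other image."""
    return [[a & b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]
```

On binary masks (0/255 or 0/1) this keeps exactly the pixels that belong to regions found in both the image and its inversion.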
Step 42: apply a morphological closing to the AND result image fbitand(x, y) to obtain the closed image fclose(x, y), i.e. join the connected components of fbitand(x, y) that contain the instrument grids, so that step 43 can easily extract the grid region. The closing is:
fbitand • B = (fbitand ⊕ B) ⊖ B
where "•" denotes the closing, ⊕ denotes morphological dilation, ⊖ denotes morphological erosion, and B denotes the structuring element; in general, the structuring element B is taken as a cross. Dilation and erosion are given by:
A ⊕ B = {z | (B̂)z ∩ A ≠ ∅},  A ⊖ B = {z | (B)z ⊆ A}
where B̂ = {w | w = −b, b ∈ B} denotes the set of all elements of B reflected about the origin of B, and (B)z = {w | w = b + z, b ∈ B} denotes B with its origin translated to the point z;
Step 43: enclose each connected component of the closed image fclose(x, y) with a minimum bounding rectangle; using the area and aspect ratio of the minimum bounding rectangle of the grid-region connected component, the grating image f3(x, y) can be extracted.
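Step 42's closing rests on dilation followed by erosion with the cross-shaped structuring element B; a binary sketch, with illustrative helper names and out-of-bounds pixels treated as background:

```python
def dilate(img, se):
    """Binary dilation A (+) B: output is 1 where the reflected structuring
    element, translated to (y, x), hits the foreground."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = int(any(
                0 <= y - dy < h and 0 <= x - dx < w and img[y - dy][x - dx]
                for dy, dx in se))
    return out

def erode(img, se):
    """Binary erosion A (-) B: output is 1 where B translated to (y, x)
    fits entirely inside the foreground."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = int(all(
                0 <= y + dy < h and 0 <= x + dx < w and img[y + dy][x + dx]
                for dy, dx in se))
    return out

CROSS = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]  # cross-shaped structuring element B

def close_binary(img, se=CROSS):
    """Morphological closing (step 42): dilation followed by erosion."""
    return erode(dilate(img, se), se)
```

Closing a grid mask this way fills small holes and joins nearby fragments without shrinking the component, which is what lets step 43 fit one bounding rectangle per grid.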
Step 5: segment the grids of the grating image f3(x, y) into the foreground with optimal-threshold segmentation, obtaining the grid-segmented image f4(x, y). The optimal-threshold segmentation is:
f4(x, y) = 1 if f3(x, y) > k, and f4(x, y) = 0 otherwise, with k chosen to maximize the between-class variance
σ²(k) = [mG P1(k) − m(k)]² / {P1(k)[1 − P1(k)]}
where f4(x, y) represents the grid-segmented image after optimal-threshold segmentation, f3(x, y) represents the grating image, k is the adaptive optimal threshold, m(k) is the cumulative mean up to gray level k, mG is the average gray value of the whole image (here, the grating image f3(x, y)), i.e. the global threshold, and pi is the fraction of pixels in the image (the grating image f3(x, y)) whose value is i. Assuming the threshold T(k) = k partitions the input image (the grating image f3(x, y)) into two classes C1 and C2, P1(k) is the probability that a pixel is assigned to class C1, 1 − P1(k) is the probability that a pixel is assigned to class C2, and σ² is the between-class variance of C1 and C2.
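The optimal threshold of step 5 is the classical between-class-variance criterion; a sketch assuming integer gray levels in [0, 255]:

```python
def otsu_threshold(pixels, levels=256):
    """Optimal threshold k maximising the between-class variance
    sigma^2(k) = (mG*P1(k) - m(k))^2 / (P1(k)*(1 - P1(k)))  (step 5)."""
    n = len(pixels)
    hist = [0] * levels
    for v in pixels:
        hist[v] += 1
    p = [h / n for h in hist]                  # p_i: fraction of pixels with value i
    m_g = sum(i * p[i] for i in range(levels))  # global mean gray value
    best_k, best_var = 0, -1.0
    p1, m = 0.0, 0.0                            # cumulative probability and mean
    for k in range(levels):
        p1 += p[k]
        m += k * p[k]
        if 0.0 < p1 < 1.0:
            var = (m_g * p1 - m) ** 2 / (p1 * (1.0 - p1))
            if var > best_var:
                best_var, best_k = var, k
    return best_k

def segment(img, k):
    """f4(x, y): foreground (1) where f3(x, y) > k, background (0) otherwise."""
    return [[1 if v > k else 0 for v in row] for row in img]
```

With a bimodal grating image the threshold lands between the grid and background modes, so the grids come out as foreground.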
Step 6: extract the grid Lymax(x, y) with the longest vertical extent from the grid-segmented image f4(x, y); the minimum ordinate of the longest grid Lymax(x, y) is the maximum of the range, and the maximum ordinate is the position of the liquid level; the specific steps are as follows:
Step 61: represent each grid's connected component in the thresholded image f4(x, y) by its minimum bounding rectangle Li(x, y), i ∈ {1, 2, …, N}, where N is the number of connected components of the grid-segmented image f4(x, y);
Step 62: sort all minimum bounding rectangles Li(x, y) by ordinate length and find the longest bounding rectangle Lymax(x, y); the minimum ordinate of the longest bounding rectangle Lymax(x, y) is the maximum of the range, and the maximum ordinate is the position of the liquid level.
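Steps 61–62 amount to taking the bounding rectangle of each grid component and ranking the rectangles by height; a sketch assuming components are given as sets of (x, y) pixel coordinates:

```python
def bounding_rect(component):
    """Minimum bounding rectangle (x_min, y_min, x_max, y_max) of a pixel set."""
    xs = [x for x, y in component]
    ys = [y for x, y in component]
    return min(xs), min(ys), max(xs), max(ys)

def longest_grid(components):
    """Step 62: the bounding rectangle with the greatest ordinate length.
    Its minimum ordinate marks the maximum of the range, its maximum
    ordinate the position of the liquid level."""
    rects = [bounding_rect(c) for c in components]
    return max(rects, key=lambda r: r[3] - r[1])
```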
Step 7: in the longitudinal direction, search for a strong edge from the maximum ordinate of the longest grid Lymax(x, y) down to the bottom of the grid-segmented image f4(x, y); the strong edge found is the starting position of the range, from which the range size can be obtained and the reading completed; the specific steps are as follows:
Step 71: convert the maximum ordinate ygridmax and minimum ordinate ygridmin of the bounding rectangle Lymax(x, y) of the longest grid in the grid-segmented image f4(x, y) into the corresponding maximum and minimum ordinates of the bounding rectangle Lymax(x, y) of the longest grid in the denoised, enhanced, high-clarity instrument image f2(x, y);
Step 72: starting from the maximum ordinate of the longest grid's bounding rectangle Lymax(x, y) in the instrument image f2(x, y), take a 20 × 20 window and perform Canny edge detection within the window; if no strong edge is detected, move the window 20 pixel positions longitudinally and repeat the Canny edge detection until a strong edge is detected; that edge is the starting position of the instrument range, and the loop stops. The specific steps of the Canny edge detection are as follows:
Step 721: smooth the input image with a Gaussian filter to obtain the smoothed image:
fs(x, y) = G(x, y) * f20×20(x, y);
where fs(x, y) denotes the smoothed image, f20×20(x, y) denotes the sub-image of the instrument image f2(x, y) selected by the 20 × 20 window, G(x, y) denotes the Gaussian function, (x, y) denotes the pixel coordinates in the sub-image f20×20(x, y), σ² denotes the variance of the Gaussian function G(x, y), and "*" denotes convolution;
Step 722: from the smoothed image, compute the gradient magnitude image and the gradient angle image:
M(x, y) = [(∂fs/∂x)² + (∂fs/∂y)²]^(1/2),  α(x, y) = arctan[(∂fs/∂y)/(∂fs/∂x)]
where M(x, y) denotes the gradient magnitude image, α(x, y) denotes the gradient angle image, ∂fs/∂x denotes the partial derivative of the smoothed image fs(x, y) in the x direction, and ∂fs/∂y denotes the partial derivative of fs(x, y) in the y direction;
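A central-difference approximation — one of several possible discretizations, chosen here for brevity — of the magnitude and angle images of step 722:

```python
import math

def gradients(img):
    """Central-difference approximation of M(x, y) and alpha(x, y) (step 722)."""
    h, w = len(img), len(img[0])
    mag = [[0.0] * w for _ in range(h)]
    ang = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y][x + 1] - img[y][x - 1]) / 2.0    # partial derivative in x
            gy = (img[y + 1][x] - img[y - 1][x]) / 2.0    # partial derivative in y
            mag[y][x] = math.hypot(gx, gy)                # gradient magnitude
            ang[y][x] = math.degrees(math.atan2(gy, gx))  # gradient angle in degrees
    return mag, ang
```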
Step 723: apply non-maximum suppression to the gradient magnitude image M(x, y): first, let d1, d2, d3, and d4 denote the four basic edge directions, i.e. horizontal (0°), −45°, vertical (90°), and 45°; then find the dk (k = 1, 2, 3, 4) closest to the gradient angle image α(x, y); finally, if the value of the gradient magnitude image M(x, y) is less than one of its two neighbours along the direction dk, set gN(x, y) = 0 (suppression), otherwise set gN(x, y) = M(x, y). Here gN(x, y) is the image after non-maximum suppression, and the subscript N denotes non-maximum suppression;
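Step 723 can be sketched by quantising the gradient angle to the nearest of the four basic directions and comparing each pixel with its two neighbours along that direction (here the direction labels refer to the gradient, so a 0° gradient compares left/right neighbours across a vertical edge); the helper name is illustrative:

```python
def non_max_suppress(mag, ang):
    """Step 723: keep M(x, y) only where it is no smaller than both
    neighbours along the quantised gradient direction d_k."""
    neighbours = {0: ((0, -1), (0, 1)),    # horizontal gradient: compare left/right
                  90: ((-1, 0), (1, 0)),   # vertical gradient: compare up/down
                  45: ((-1, 1), (1, -1)),
                  -45: ((-1, -1), (1, 1))}
    h, w = len(mag), len(mag[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            a = ang[y][x] % 180.0          # fold the angle into [0, 180)
            if a >= 157.5 or a < 22.5:
                d = 0
            elif a < 67.5:
                d = 45
            elif a < 112.5:
                d = 90
            else:
                d = -45
            (dy1, dx1), (dy2, dx2) = neighbours[d]
            if mag[y][x] >= mag[y + dy1][x + dx1] and mag[y][x] >= mag[y + dy2][x + dx2]:
                out[y][x] = mag[y][x]      # local maximum along d_k: keep
            # otherwise suppressed: g_N(x, y) = 0
    return out
```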
Step 724: detect the edges of the non-maximum-suppressed image gN(x, y) with double thresholding, i.e. apply two different thresholds to gN(x, y) simultaneously:
gNH(x, y) = gN(x, y) ≥ TH,  gNL(x, y) = gN(x, y) ≥ TL
where TH denotes the high threshold, TL denotes the low threshold, gNH(x, y) denotes the image obtained by thresholding gN(x, y) at the high threshold TH, and gNL(x, y) denotes the image obtained by thresholding gN(x, y) at the low threshold TL. After thresholding, gNH(x, y) has fewer nonzero elements than gNL(x, y), and all nonzero pixels of gNH(x, y) are contained in gNL(x, y); then
g'NL(x, y) = gNL(x, y) − gNH(x, y);
in this formula, all nonzero elements of the high-threshold image gNH(x, y) are deleted from the low-threshold image gNL(x, y), so the nonzero pixels of gNH(x, y) and g'NL(x, y) can be regarded as "strong" and "weak" edge pixels, respectively;
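Step 724 in a minimal form, returning the strong map gNH and the weak map g'NL = gNL − gNH as binary masks (the function name is illustrative):

```python
def double_threshold(g_n, t_high, t_low):
    """Step 724: split g_N(x, y) into strong edges g_NH (>= T_H) and weak
    edges g'_NL (>= T_L but < T_H)."""
    g_nh = [[1 if v >= t_high else 0 for v in row] for row in g_n]
    g_nl = [[1 if v >= t_low else 0 for v in row] for row in g_n]
    # g'_NL: remove from g_NL every nonzero element of g_NH
    g_nl_weak = [[l - h for l, h in zip(rl, rh)] for rl, rh in zip(g_nl, g_nh)]
    return g_nh, g_nl_weak
```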
Step 725: after thresholding, all strong edge pixels of the high-threshold image gNH(x, y) are valid edge pixels and are marked as such. Because the edges in gNH(x, y) contain gaps, longer edges must be obtained for the Canny edge detection to form its final output image, as follows:
(a) in the high-threshold image gNH(x, y), locate the next unvisited edge pixel p by scanning the 8-neighborhood of the currently located pixel clockwise;
(b) mark as valid edge pixels all weak pixels in g'NL(x, y) that are connected to p by 8-connectivity;
(c) if, through the 8-connectivity linking, all nonzero pixels of gNH(x, y) have been visited, go to step (d); otherwise return to step (a);
(d) set to zero all pixels of g'NL(x, y) not marked as valid edge pixels; at this point the gaps in the edges of the high-threshold image gNH(x, y) are filled, yielding longer edges;
(e) add all nonzero pixels of g'NL(x, y) to gNH(x, y), giving the final output image g(x, y) of the Canny edge detection.
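Steps (a)–(e) are equivalent to a breadth-first traversal from the strong pixels through 8-connected weak pixels; a sketch with binary strong/weak maps and an illustrative function name:

```python
from collections import deque

def link_edges(strong, weak):
    """Steps (a)-(e): mark weak pixels 8-connected to a strong pixel as
    valid edges, discard the rest, and merge them into the final map g(x, y)."""
    h, w = len(strong), len(strong[0])
    out = [row[:] for row in strong]
    q = deque((y, x) for y in range(h) for x in range(w) if strong[y][x])
    while q:
        y, x = q.popleft()
        for dy in (-1, 0, 1):                # visit the 8-neighbourhood
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and weak[ny][nx] and not out[ny][nx]:
                    out[ny][nx] = 1          # weak pixel reachable from a strong edge
                    q.append((ny, nx))
    return out
```

Weak pixels with no strong neighbour anywhere along their chain never enter the queue, which is exactly the zeroing of step (d).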
The above presents an automatic reading method for grid circle oil level indicators. Exploiting the pronounced edges and gradient distribution between the instrument and the ambient background, the method extracts the grid feature points of the instrument in the image with the MSER method, then detects the instrument in a complex background from these feature points with bitwise AND and morphological operations, and finally determines the reading of the instrument from its longest grid. The algorithms involved include both mature, well-proven classical methods and highly integrated, strongly robust algorithms proposed in recent years, so the method is computationally efficient and easy to implement and port. Compared in particular with oil level indicators without grids, the salience of the grids makes the method simpler.
The foregoing is merely a description of preferred embodiments of the present invention and is not intended to limit the invention; any modifications, equivalent replacements, improvements, and the like made within the spirit and principles of the invention shall all be included in the protection scope of the invention.
Claims (8)
1. An automatic reading method for a grid circle oil level indicator based on visual measurement, characterized by comprising the following steps:
Step 1: read in the initial image f(x, y), remove noise from and enhance the initial image f(x, y), and obtain the denoised and enhanced image fenhance(x, y);
Step 2: evaluate the clarity of the denoised and enhanced image fenhance(x, y); if the image fenhance(x, y) is of low clarity, return it to step 1 for enhancement; otherwise let the high-clarity image fenhance(x, y) be the instrument image f2(x, y);
Step 3: invert the instrument image f2(x, y) to obtain the inverted image finvert(x, y), and extract the regions of interest of f2(x, y) and finvert(x, y) with the maximally stable extremal regions matching algorithm, obtaining the segmented images fMSER+(x, y) and fMSER-(x, y), respectively;
Step 4: on the basis of the segmented images fMSER+(x, y) and fMSER-(x, y), first fuse fMSER+(x, y) and fMSER-(x, y) into a single result image fbitand(x, y) with a bitwise AND operation, then apply morphological filtering to the result image fbitand(x, y) to extract the grating image f3(x, y);
Step 5: segment the grids of the grating image f3(x, y) into the foreground with optimal-threshold segmentation, obtaining the grid-segmented image f4(x, y);
Step 6: extract the grid Lymax(x, y) with the longest vertical extent from the grid-segmented image f4(x, y); the minimum ordinate of the longest grid Lymax(x, y) is the maximum of the range, and the maximum ordinate is the position of the liquid level;
Step 7: in the longitudinal direction, search for a strong edge from the maximum ordinate of Lymax(x, y) down to the bottom of the grid-segmented image f4(x, y); the strong edge found is the starting position of the range, from which the range size can be obtained and the reading completed.
2. The automatic reading method for a grid circle oil level indicator based on visual measurement according to claim 1, characterized in that the specific steps of step 1 are as follows:
Step 11: read the initial image f(x, y);
Step 12: convert the initial image f(x, y) into the gray image fgray(x, y), the gray value of each pixel of fgray(x, y) being computed as follows:
where fgray(x, y) denotes the gray value of each pixel of fgray(x, y); Rf(x, y), Gf(x, y), and Bf(x, y) denote the pixel values of each pixel of the red, green, and blue channels of the initial image f(x, y); and (x, y) denotes the coordinates of each pixel;
Step 13: apply bilateral filtering to fgray(x, y) to obtain the denoised image fdenoise(x, y); the bilateral filter is:
where fgray(k, l) is the gray image fgray(x, y), fdenoise(x, y) is the image after noise removal, (k, l) are the pixel coordinates of the gray image, (x, y) are the pixel coordinates of the denoised image, and ω(x, y, k, l) is the bilateral weighting coefficient, which depends on the product of the domain kernel d(x, y, k, l) = exp{−[(x − k)² + (y − l)²]/(2σd²)}, with variance σd², and the range kernel r(x, y, k, l) = exp{−‖fgray(x, y) − fgray(k, l)‖²/(2σr²)}, with variance σr²;
Step 14: apply histogram specification to the bilaterally filtered image fdenoise(x, y) to obtain the enhanced image fenhance(x, y); the specific steps of histogram specification are as follows:
Step 141: compute the histogram pr(r) of the denoised image fdenoise(x, y), where r denotes gray value, and use pr(r) to find the histogram-equalization transformation:
sk = [(L − 1)/(MN)] Σ_{j=0}^{k} nj
where sk is the equalized gray value for pixel value k, L is the number of gray levels in the image (256 for an 8-bit image), MN is the total number of pixels of the equalized image, M is the number of pixels in each row and N the number of pixels in each column of the equalized image, and nj is the number of pixels with gray level j in the bilaterally filtered image fdenoise(x, y); sk is then rounded to an integer in the range [0, L − 1];
Step 142: for q = 0, 1, 2, …, L − 1, compute all values of the transformation function G(zq) according to:
G(zq) = (L − 1) Σ_{i=0}^{q} pz(zi)
where zi denotes the gray value to which a pixel value of i is mapped, pz(zi) is the fraction of pixels of the specified histogram with value zi, q counts the pixel values accumulated from 0 to zq, and G(zq) is the equalized pixel value of the specified histogram; the values G(zq) are rounded to integers in the range [0, L − 1] and stored in a table;
Step 143: for each value sk, use the G(zq) values stored in step 142 to find the corresponding zq such that G(zq) is closest to sk, and store the mapping from sk to zq; when more than one zq satisfies a given sk, i.e. when the mapping is not unique, choose the smallest value;
Step 144: first equalize the input denoised image fdenoise(x, y) as in step 141, then use the mapping found in step 143 to map each equalized pixel value sk of fdenoise(x, y) to the corresponding zq value of the histogram-specified image.
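The equalization of step 141 and the specification mapping of steps 142–143 can be sketched as follows; the flat target histogram used in the test and the function names are illustrative assumptions:

```python
def equalize(pixels, levels=256):
    """Step 141: s_k = (L-1)/(MN) * sum_{j<=k} n_j, rounded into [0, L-1]."""
    mn = len(pixels)
    hist = [0] * levels
    for v in pixels:
        hist[v] += 1
    cum, s = 0, [0] * levels
    for k in range(levels):
        cum += hist[k]                       # running sum of n_j for j <= k
        s[k] = round((levels - 1) * cum / mn)
    return [s[v] for v in pixels]

def specify(pixels, target_hist, levels=256):
    """Steps 142-143: map each equalised value s_k to the z_q whose
    G(z_q) = (L-1) * sum_{i<=q} p_z(z_i) is closest (smallest q on ties)."""
    g = [0] * levels
    cum = 0.0
    for q in range(levels):
        cum += target_hist[q]
        g[q] = round((levels - 1) * cum)
    out = []
    for s in equalize(pixels, levels):
        out.append(min(range(levels), key=lambda q: (abs(g[q] - s), q)))
    return out
```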
3. The automatic reading method for a grid circle oil level indicator based on visual measurement according to claim 2, characterized in that the specific steps of step 2 are as follows:
Step 21: evaluate the clarity of the denoised and enhanced image fenhance(x, y); the clarity evaluation computes the magnitude of the image gradient as follows:
where grad fenhance(x, y) denotes the gradient vector of the denoised and enhanced image fenhance(x, y), ‖ ‖₂ denotes the 2-norm of a vector, here the magnitude of the gradient vector grad fenhance(x, y), and ∂fenhance/∂x and ∂fenhance/∂y denote the partial derivatives in the x and y directions;
Step 22: if the denoised and enhanced image fenhance(x, y) is judged to be of high clarity, label it as the instrument image f2(x, y); if it is judged to be of low clarity, change the histogram mapping function in step 14 and return fenhance(x, y) to step 14 for histogram specification.
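The gradient-magnitude clarity evaluation of step 21 can be sketched as a mean gradient magnitude over the image; the averaging and the central-difference derivatives are illustrative choices, not prescribed by the claim:

```python
import math

def sharpness(img):
    """Step 21 sketch: mean ||grad f_enhance(x, y)||_2 over interior pixels.
    Sharper images have steeper edges and so score higher."""
    h, w = len(img), len(img[0])
    total, count = 0.0, 0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y][x + 1] - img[y][x - 1]) / 2.0
            gy = (img[y + 1][x] - img[y - 1][x]) / 2.0
            total += math.hypot(gx, gy)
            count += 1
    return total / count if count else 0.0
```

Comparing the score against a fixed cutoff would give the high/low clarity decision of step 22.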
4. The automatic reading method for a grid circle oil level indicator based on visual measurement according to claim 1, characterized in that the specific steps of the maximally stable extremal regions (MSER) matching algorithm of step 3 are as follows:
Step 31: invert the high-clarity instrument image f2(x, y) to obtain the inverted image finvert(x, y), the pixel value of each pixel of finvert(x, y) being computed as follows:
where f2(x, y) denotes the pixel value of each pixel of the instrument image f2(x, y), and finvert(x, y) denotes the pixel value of each pixel of the inverted image finvert(x, y);
Step 32: sort all pixels of the instrument image f2(x, y) and of the inverted image finvert(x, y) by pixel value;
Step 33: following step 32, in each of f2(x, y) and finvert(x, y) select the pixels with the minimum pixel value gmin as source points, and merge any other source points within a source point's four-neighborhood into the same connected component; each such connected component forms one node of a tree data structure, and iteration then proceeds toward larger pixel values;
Step 34: in f2(x, y) and finvert(x, y) respectively, consider the pixels whose current gray value is g: add each such pixel to the connected component adjacent to it in its four-neighborhood, and update the tree data structure formed by the connected components in step 33; when two or more connected components are merged into one, the merged component is assigned a new node, which is made the parent of the original nodes;
Step 35: if a pixel with gray value g belongs to two or more connected components simultaneously, merge those connected-component regions into one; then check whether the current pixel value is the maximum pixel value gmax: if not, execute step 33; if it is, execute step 36;
Step 36: once all pixels of f2(x, y) and finvert(x, y) have been processed, a data structure — a branching tree — recording the connected-component area at each update of g is obtained; regarding the tree as a function of the threshold, each node of the tree can be viewed as an extremal region containing an extremum;
Step 37: let Qi, …, Qj be a sequence of mutually nested extremal regions, where gmin ≤ i < j ≤ gmax. An extremal region Qg, with gmin ≤ g < gmax, is maximally stable if the stability function
q(g) = |Qg+Δ \ Qg−Δ| / |Qg|
has a local minimum at g, where | · | denotes the cardinality of a set, i.e. a measure of how many elements the set contains, and Δ is a parameter of the model. Finally, the instrument image f2(x, y) yields the segmented image fMSER+(x, y) containing the instrument-grid connected regions, and the inverted image finvert(x, y) yields the segmented image fMSER-(x, y).
5. The automatic reading method for a grid circle oil level indicator based on visual measurement according to claim 1, characterized in that the specific steps of step 4 are as follows:
Step 41: perform a bitwise AND of the segmented images fMSER+(x, y) and fMSER-(x, y) to obtain the AND result image fbitand(x, y); the bitwise AND operation represents each pixel value of the two images in binary and then applies the AND operation to the pixel values at identical coordinates of the two images;
Step 42: apply a morphological closing to the AND result image fbitand(x, y) to obtain the closed image fclose(x, y), i.e. join the connected components of fbitand(x, y) that contain the instrument grids, so that step 43 can easily extract the grid region. The closing is:
fbitand • B = (fbitand ⊕ B) ⊖ B
where "•" denotes the closing, ⊕ denotes morphological dilation, ⊖ denotes morphological erosion, and B denotes the structuring element; in general, the structuring element B is taken as a cross. Dilation and erosion are given by:
A ⊕ B = {z | (B̂)z ∩ A ≠ ∅},  A ⊖ B = {z | (B)z ⊆ A}
where B̂ = {w | w = −b, b ∈ B} denotes the set of all elements of B reflected about the origin of B, and (B)z = {w | w = b + z, b ∈ B} denotes B with its origin translated to the point z;
Step 43: enclose each connected component of the closed image fclose(x, y) with a minimum bounding rectangle; using the area and aspect ratio of the minimum bounding rectangle of the grid-region connected component, the grating image f3(x, y) can be extracted.
6. The automatic reading method for a grid circle oil level indicator based on visual measurement according to claim 1, characterized in that the optimal-threshold segmentation of step 5 is:
f4(x, y) = 1 if f3(x, y) > k, and f4(x, y) = 0 otherwise, with k chosen to maximize the between-class variance
σ²(k) = [mG P1(k) − m(k)]² / {P1(k)[1 − P1(k)]}
where f4(x, y) represents the grid-segmented image after optimal-threshold segmentation, f3(x, y) represents the grating image, k is the adaptive optimal threshold, m(k) is the cumulative mean up to gray level k, mG is the average gray value of the whole image (here, the grating image f3(x, y)), i.e. the global threshold, and pi is the fraction of pixels in the image (the grating image f3(x, y)) whose value is i;
assuming the threshold T(k) = k partitions the input image (the grating image f3(x, y)) into two classes C1 and C2, P1(k) is the probability that a pixel is assigned to class C1, 1 − P1(k) is the probability that a pixel is assigned to class C2, and σ² is the between-class variance of C1 and C2.
7. The automatic reading method for a grid circle oil level indicator based on visual measurement according to claim 1, characterized in that the specific steps of step 6 are as follows:
Step 61: represent each grid's connected component in the thresholded image f4(x, y) by its minimum bounding rectangle Li(x, y), i ∈ {1, 2, …, N}, where N is the number of connected components of the grid-segmented image f4(x, y);
Step 62: sort all minimum bounding rectangles Li(x, y) by ordinate length and find the longest bounding rectangle Lymax(x, y); the minimum ordinate of the longest bounding rectangle Lymax(x, y) is the maximum of the range, and the maximum ordinate is the position of the liquid level.
8. The automatic reading method for a grid circle oil level indicator based on visual measurement according to claim 1, characterized in that the specific steps of step 7 are as follows:
Step 71: convert the maximum ordinate ygridmax and minimum ordinate ygridmin of the bounding rectangle Lymax(x, y) of the longest grid in the grid-segmented image f4(x, y) into the corresponding maximum and minimum ordinates of the bounding rectangle Lymax(x, y) of the longest grid in the denoised, enhanced, high-clarity instrument image f2(x, y);
Step 72: starting from the maximum ordinate of the longest grid's bounding rectangle Lymax(x, y) in the instrument image f2(x, y), take a 20 × 20 window and perform Canny edge detection within the window; if no strong edge is detected, move the window 20 pixel positions vertically and repeat the Canny edge detection until a strong edge is detected; that edge is the starting position of the instrument range, and the loop stops. The specific steps of the Canny edge detection are as follows:
Step 721: smooth the input image with a Gaussian filter to obtain the smoothed image:
fs(x, y) = G(x, y) * f20×20(x, y);
where fs(x, y) denotes the smoothed image, f20×20(x, y) denotes the sub-image of the instrument image f2(x, y) selected by the 20 × 20 window, G(x, y) denotes the Gaussian function, (x, y) denotes the pixel coordinates in the sub-image f20×20(x, y), σ² denotes the variance of the Gaussian function G(x, y), and "*" denotes convolution;
Step 722: from the smoothed image, compute the gradient magnitude image and the gradient angle image:
M(x, y) = [(∂fs/∂x)² + (∂fs/∂y)²]^(1/2),  α(x, y) = arctan[(∂fs/∂y)/(∂fs/∂x)]
where M(x, y) denotes the gradient magnitude image, α(x, y) denotes the gradient angle image, ∂fs/∂x denotes the partial derivative of the smoothed image fs(x, y) in the x direction, and ∂fs/∂y denotes the partial derivative of fs(x, y) in the y direction;
Step 723: apply non-maximum suppression to the gradient magnitude image M(x, y): first, let d1, d2, d3, and d4 denote the four basic edge directions, i.e. horizontal (0°), −45°, vertical (90°), and 45°; then find the dk (k = 1, 2, 3, 4) closest to the gradient angle image α(x, y); finally, if the value of the gradient magnitude image M(x, y) is less than one of its two neighbours along the direction dk, set gN(x, y) = 0 (suppression), otherwise set gN(x, y) = M(x, y). Here gN(x, y) is the image after non-maximum suppression, and the subscript N denotes non-maximum suppression;
Step 724: detect the edges of the non-maximum-suppressed image gN(x, y) with double thresholding, i.e. apply two different thresholds to gN(x, y) simultaneously:
gNH(x, y) = gN(x, y) ≥ TH,  gNL(x, y) = gN(x, y) ≥ TL
where TH denotes the high threshold, TL denotes the low threshold, gNH(x, y) denotes the image obtained by thresholding gN(x, y) at the high threshold TH, and gNL(x, y) denotes the image obtained by thresholding gN(x, y) at the low threshold TL. After thresholding, gNH(x, y) has fewer nonzero elements than gNL(x, y), and all nonzero pixels of gNH(x, y) are contained in gNL(x, y); then
g'NL(x, y) = gNL(x, y) − gNH(x, y);
in this formula, all nonzero elements of the high-threshold image gNH(x, y) are deleted from the low-threshold image gNL(x, y), so the nonzero pixels of gNH(x, y) and g'NL(x, y) can be regarded as "strong" and "weak" edge pixels, respectively;
Step 725: after thresholding, all strong edge pixels of the high-threshold image gNH(x, y) are valid edge pixels and are marked as such. Because the edges in gNH(x, y) contain gaps, longer edges must be obtained for the Canny edge detection to form its final output image, as follows:
(a) in the high-threshold image gNH(x, y), locate the next unvisited edge pixel p by scanning the 8-neighborhood of the currently located pixel clockwise;
(b) mark as valid edge pixels all weak pixels in g'NL(x, y) that are connected to p by 8-connectivity;
(c) if, through the 8-connectivity linking, all nonzero pixels of gNH(x, y) have been visited, go to step (d); otherwise return to step (a);
(d) set to zero all pixels of g'NL(x, y) not marked as valid edge pixels; at this point the gaps in the edges of the high-threshold image gNH(x, y) are filled, yielding longer edges;
(e) add all nonzero pixels of g'NL(x, y) to gNH(x, y), giving the final output image g(x, y) of the Canny edge detection.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710059293.4A CN106815851B (en) | 2017-01-24 | 2017-01-24 | A kind of grid circle oil level indicator automatic reading method of view-based access control model measurement |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106815851A CN106815851A (en) | 2017-06-09 |
CN106815851B true CN106815851B (en) | 2019-05-24 |
Family
ID=59112512
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710059293.4A Active CN106815851B (en) | 2017-01-24 | 2017-01-24 | A kind of grid circle oil level indicator automatic reading method of view-based access control model measurement |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106815851B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108985288B (en) * | 2018-07-17 | 2022-06-14 | 电子科技大学 | TGMSERs-based SAR image oil spill detection method |
CN112801098B (en) * | 2019-11-14 | 2023-01-10 | 临沂市拓普网络股份有限公司 | Contour technology-based mathematical symbol identification method |
CN111539312A (en) * | 2020-04-21 | 2020-08-14 | 罗嘉杰 | Method for extracting table from image |
CN113077398A (en) * | 2021-04-09 | 2021-07-06 | 上海申瑞继保电气有限公司 | Circuit breaker circular on-off indicator lamp image noise filtering method |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102750540A (en) * | 2012-06-12 | 2012-10-24 | 大连理工大学 | Morphological filtering enhancement-based maximally stable extremal region (MSER) video text detection method |
CN103871234A (en) * | 2012-12-10 | 2014-06-18 | 中兴通讯股份有限公司 | Grid mapping growth-based traffic network division method and configuration server |
CN104616280A (en) * | 2014-11-26 | 2015-05-13 | 西安电子科技大学 | Image registration method based on maximum stable extreme region and phase coherence |
CN106228161A (en) * | 2016-07-18 | 2016-12-14 | 电子科技大学 | A kind of pointer-type dial plate automatic reading method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012040367A1 (en) * | 2010-09-22 | 2012-03-29 | Dow Corning Corporation | Organosiloxane block copolymer |
- 2017-01-24: CN application CN201710059293.4A filed; granted as patent CN106815851B (status: Active)
Non-Patent Citations (1)
Title |
---|
"Applied Research on Image Text Localization Based on MSER"; Li Jianhong; China Master's Theses Full-text Database, Information Science and Technology; 2015-12-15 (No. 12); full text * |
Also Published As
Publication number | Publication date |
---|---|
CN106815851A (en) | 2017-06-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018107939A1 (en) | Edge completeness-based optimal identification method for image segmentation | |
CN107092871B (en) | Remote sensing image building detection method based on multiple dimensioned multiple features fusion | |
CN106815851B (en) | A kind of grid circle oil level indicator automatic reading method of view-based access control model measurement | |
CN105654501B (en) | Self-adaptive image segmentation method based on fuzzy threshold | |
CN110717489B (en) | Method, device and storage medium for identifying text region of OSD (on Screen display) | |
CN109146948B (en) | Crop growth phenotype parameter quantification and yield correlation analysis method based on vision | |
CN104008553B (en) | Crack detection method with image gradient information and watershed method conflated | |
CN111027446B (en) | Coastline automatic extraction method of high-resolution image | |
CN104361582B (en) | Method of detecting flood disaster changes through object-level high-resolution SAR (synthetic aperture radar) images | |
CN108229342B (en) | Automatic sea surface ship target detection method | |
CN104240204B (en) | Solar silicon wafer and battery piece counting method based on image processing | |
CN111091095B (en) | Method for detecting ship target in remote sensing image | |
CN108009529B (en) | Forest fire smoke video target detection method based on characteristic root and hydrodynamics | |
CN108830832A (en) | A kind of plastic barrel surface defects detection algorithm based on machine vision | |
CN112734729B (en) | Water gauge water level line image detection method and device suitable for night light supplement condition and storage medium | |
CN105138992A (en) | Coastline detection method based on regional active outline model | |
CN112734761A (en) | Industrial product image boundary contour extraction method | |
CN105809673A (en) | SURF (Speeded-Up Robust Features) algorithm and maximal similarity region merging based video foreground segmentation method | |
Yao et al. | Automatic extraction of road markings from mobile laser-point cloud using intensity data | |
CN106570878A (en) | Heavy oil microcosmic interface detection method based on gray scale difference | |
Ma et al. | Intelligent optimization of seam-line finding for orthophoto mosaicking with LiDAR point clouds | |
CN109543498A (en) | A kind of method for detecting lane lines based on multitask network | |
CN107292899A (en) | A kind of Corner Feature extracting method for two dimensional laser scanning instrument | |
Xiao et al. | Multiresolution-Based Rough Fuzzy Possibilistic C-Means Clustering Method for Land Cover Change Detection | |
Omidalizarandi et al. | Segmentation and classification of point clouds from dense aerial image matching |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||