CN1797470A - Quick method for picking up stepped edge in sub pixel level - Google Patents

Quick method for picking up stepped edge in sub pixel level

Info

Publication number
CN1797470A
CN1797470A CNA2004101025873A CN200410102587A
Authority
CN
China
Prior art keywords
point
gray
edge
value
step edge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2004101025873A
Other languages
Chinese (zh)
Other versions
CN100357974C (en)
Inventor
张广军 (Zhang Guangjun)
贺俊吉 (He Junji)
魏振忠 (Wei Zhenzhong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Beijing University of Aeronautics and Astronautics
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CNB2004101025873A
Publication of CN1797470A
Application granted
Publication of CN100357974C
Expired - Fee Related
Anticipated expiration

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention belongs to machine vision and related image processing techniques and relates to an improved image edge extraction method. The steps of the invention are: smoothing the original image; searching for the boundary point Ps with the steady-state grey-level maximum and the boundary point Pe with the steady-state grey-level minimum of a step edge; calculating the initial values a0, b0 and c0 of the edge model parameters; and calculating the optimal values of a, b and c to determine the step edge. The initial values of the model parameters are simple to select, and the step edge extraction is fast, robust and accurate.

Description

A fast method for extracting step edges at sub-pixel level
Technical field
The invention belongs to the field of machine vision and related image processing, and relates to an improved image edge extraction method.
Background technology
The most basic feature of an image is its edges. An edge is the set of pixels at which the image grey level changes in a step-like or roof-like manner. Edges lie between target and background, between targets, and between regions, and correspond to discontinuities in image brightness or in its first derivative; they therefore appear as step edges and line edges. A step edge appears as a clear difference between the pixel grey values on the two sides of the discontinuity: visually, the image passes from a bright scene to a dark background, or from a bright background to a dark scene. For machine vision and related image processing, pixel-level step edge extraction is often insufficient. The accuracy of step edge extraction directly affects the accuracy of machine vision and related image processing, so extracting step edges at sub-pixel level is necessary in many applications. At present, sub-pixel step edge extraction methods fall mainly into two classes. The first interpolates within a region around the pixel-level step edge, for example by curve or surface fitting, and takes the extremum of the first derivative of the fitted curve or surface as the exact edge position. To reach sub-pixel accuracy, the curve or surface model parameters are estimated by nonlinear optimization; because good initial values are difficult to choose, the optimization is slow and the accuracy is limited. The second class first differentiates the original image so that the step edge becomes a line edge, and then computes the first moment or the Hessian matrix over the pixel-level line edge region to obtain the sub-pixel edge. The common drawback of both classes is that the computation is cumbersome and image processing is slow, which does not meet the fast real-time requirements of machine vision and related image processing.
Summary of the invention
The objective of the invention is to overcome the shortcomings of existing methods by proposing a fast sub-pixel-level step edge extraction method, further improving the real-time performance of machine vision and related image processing.
The technical scheme of the invention is a fast sub-pixel-level step edge extraction method, characterized in that the step edge profile model is defined as:
y = a / (1 + e^{(x-b)/σ}) + c    [1]
The steps for extracting the sub-pixel-level step edge are as follows:
(1) Smooth the original image with a Gaussian convolution kernel;
(2) Scan the image from top to bottom and from left to right, row by row (or column by column). If the grey value of the current scan point has dropped by more than 30 relative to the previously scanned points, the current point is the steady-state grey-level maximum boundary point P_s of the step edge. Continue scanning after P_s; if the grey value of the current scan point changes by less than 5 relative to the previously scanned points, the current point is the steady-state grey-level minimum boundary point P_e of the step edge. Grey values range from 0 to 255. The grey value y_s and abscissa x_s of P_s and the grey value y_e and abscissa x_e of P_e therefore satisfy:
y_s = y_max, x_s = x_min
y_e = y_min, x_e = x_max    [5]
(3) Compute the initial values a_0, b_0 and c_0 of the edge model parameters according to:
a_0 = y_max - y_min = y_s - y_e    [6]
b_0 = (x_max + x_min)/2 = (x_s + x_e)/2    [7]
c_0 = y_min = y_e    [8]
(4) Compute the optimal values of a, b and c and determine the step edge:
1. Between the rows (or columns) containing P_s and P_e, search along the row (or column) for the points P_i(x_i, y_i), i = 1, 2, ..., m, whose grey values satisfy y_e ≤ y_i ≤ y_s, where m is the number of points found and m ≥ 3;
2. From formula [1], let
f(x, y) = y - a / (1 + e^{(x-b)/σ}) - c    [9]
and take as the optimization objective
min Σ_{i=1}^{m} f²(x_i, y_i)    [10]
3. Substitute the points P_i(x_i, y_i), i = 1, 2, ..., m, into formula [10] to obtain an overdetermined system of nonlinear equations in a, b and c. Solve for the optimal values of a, b and c with the Levenberg-Marquardt nonlinear optimization algorithm; the point x = b, y = a/2 + c is then the position of the step edge point.
The advantages of the invention are that the initial values of the model parameters are simple to choose and that the step edge extraction is fast, robust and accurate.
Description of drawings
Fig. 1 is a schematic diagram of the step edge profile model.
Fig. 2 is a schematic diagram of the step edge extraction simulation experiment of the method of the invention.
Fig. 3 is a schematic diagram of an edge extraction example using the method of the invention.
Embodiment
The invention is described in further detail below.
The step edge profile model function is shown in Fig. 1. According to the shape of a step edge, the step edge profile model is defined as:
y = a / (1 + e^{(x-b)/σ}) + c    [1]
where a, b and c are position parameters and σ represents the variance. The geometric meaning of the three parameters is: a is the height of the edge, i.e. the difference between the maximum and minimum steady-state values of the curve; b is the abscissa of the centre of symmetry of the edge; c is the base value of the edge, i.e. the minimum steady-state value of the curve. The basis for proposing this model is that the function is antisymmetric about the point x = b, a property that matches the profile of a step edge.
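To make the geometric meaning of the parameters concrete, the following minimal Python sketch (an illustration added here, not part of the patent; the function name edge_model and the sample parameter values are assumptions) evaluates the profile model of formula [1] and checks its steady-state values and centre of symmetry:

```python
import numpy as np

def edge_model(x, a, b, c, sigma):
    """Step edge profile model of formula [1]: y = a / (1 + exp((x - b) / sigma)) + c."""
    return a / (1.0 + np.exp((x - b) / sigma)) + c

# Illustrative parameter values (assumed, not taken from the patent).
a, b, c, sigma = 100.0, 1.5, 20.0, 0.8
x = np.linspace(-10.0, 10.0, 2001)
y = edge_model(x, a, b, c, sigma)

print(y[0])                           # ~a + c: bright steady-state value far to the left
print(y[-1])                          # ~c:     dark steady-state value far to the right
print(edge_model(b, a, b, c, sigma))  # a/2 + c: value at the centre of symmetry x = b
```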
Model parameter estimation.
The parameters a, b and c are estimated by nonlinear optimization. To guarantee that the nonlinear optimization converges to the global optimum, the choice of initial parameter values is a crucial step.
(1) Choice of initial values.
Because the step edge model parameters proposed above have a clear geometric meaning, choosing their initial values is simple.
First, a theoretical analysis: according to the model of formula [1],
y_max = lim_{x → -∞} y = a + c,  y_min = lim_{x → +∞} y = c    [2]
so the initial values of a and c are obtained as:
a_0 = y_max - y_min, c_0 = y_min    [3]
From the model it is clear that the point x = b, y = a/2 + c is the centre of symmetry of the step edge, so the initial value of b can be chosen as:
b_0 = (x_max + x_min)/2    [4]
where x_max and x_min are the abscissas of the steady-state grey-level minimum boundary point and of the steady-state grey-level maximum boundary point of the step edge, respectively.
According to the model of formula [1], the step edge point is defined as the point of maximum gradient of the step edge, i.e. the point where the first derivative is maximal and the second derivative crosses zero (y'' = 0); solving this condition places the edge point at x = b.
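For completeness, the derivation behind this statement (implicit in the original text) follows directly from formula [1]:
y'(x) = -(a/σ) · e^{(x-b)/σ} / (1 + e^{(x-b)/σ})²
y''(x) = -(a/σ²) · e^{(x-b)/σ} · (1 - e^{(x-b)/σ}) / (1 + e^{(x-b)/σ})³
so y'' = 0 exactly when e^{(x-b)/σ} = 1, i.e. at x = b, where |y'| is maximal and y(b) = a/2 + c.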
A real image is discretized by line scanning into a two-dimensional digital image in which the grey (brightness) value of each point is a number between 0 and 255. In practice, the steps for extracting the sub-pixel-level step edge are as follows (illustrative sketches follow the step list):
(1) Smooth the original image with a Gaussian convolution kernel (see Ma Songde and Zhang Zhengyou, Computer Vision: Computational Theory and Algorithmic Foundations, Beijing: Science Press, 1998).
(2) Scan the image from top to bottom (or from left to right) row by row (or column by column). If the grey value of the current scan point (range 0-255) shows a marked drop relative to the previously scanned points (drop > 30), the current point is the steady-state grey-level maximum boundary point P_s of the step edge. Continue scanning after P_s; if the grey value of the current scan point shows no obvious change relative to the previously scanned points (change < 5), the current point is the steady-state grey-level minimum boundary point P_e of the step edge. The grey value y_s and abscissa x_s of P_s and the grey value y_e and abscissa x_e of P_e therefore satisfy:
y_s = y_max, x_s = x_min
y_e = y_min, x_e = x_max    [5]
(3) Compute the initial values a_0, b_0 and c_0 of the edge model parameters according to:
a_0 = y_max - y_min = y_s - y_e    [6]
b_0 = (x_max + x_min)/2 = (x_s + x_e)/2    [7]
c_0 = y_min = y_e    [8]
(4) Compute the optimal values of a, b and c and determine the step edge:
1. Between the rows (or columns) containing P_s and P_e, search along the row (or column) for the points P_i(x_i, y_i), i = 1, 2, ..., m, whose grey values satisfy y_e ≤ y_i ≤ y_s, where m is the number of points found and m ≥ 3;
2. From formula [1], let
f(x, y) = y - a / (1 + e^{(x-b)/σ}) - c    [9]
and take as the optimization objective
min Σ_{i=1}^{m} f²(x_i, y_i)    [10]
3. Substitute the points P_i(x_i, y_i), i = 1, 2, ..., m, into formula [10] to obtain an overdetermined system of nonlinear equations in a, b and c. Solve for the optimal values of a, b and c with the Levenberg-Marquardt nonlinear optimization algorithm; the point x = b, y = a/2 + c is then the position of the step edge point.
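For illustration only, here is a minimal Python sketch of steps (1)-(3) on a single image row, assuming NumPy and SciPy are available. The thresholds 30 and 5 come from the text; the scan rule below (comparing against the last steady point and against the previous point) is one plausible reading of the description, and the function names and the synthetic test row are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def find_boundary_points(row, drop_thresh=30.0, flat_thresh=5.0):
    """Scan one image row left to right and return (s, e), the indices of the
    steady-state grey-level maximum boundary point P_s and the steady-state
    grey-level minimum boundary point P_e of a bright-to-dark step edge."""
    s = 0  # candidate P_s: last point still at the bright steady level
    for i in range(1, len(row)):
        if row[s] - row[i] <= drop_thresh:
            # No drop > 30 yet; advance P_s while the grey value stays steady.
            if abs(row[i] - row[s]) < flat_thresh:
                s = i
        else:
            # Grey value has dropped by more than 30 relative to P_s; P_s is fixed.
            # P_e is the first subsequent point whose grey value changes by less than 5.
            if abs(row[i] - row[i - 1]) < flat_thresh:
                return s, i
    raise ValueError("no step edge found in this row")

def initial_values(row, s, e):
    """Initial parameter values of formulas [6]-[8]."""
    y_s, y_e = float(row[s]), float(row[e])
    a0 = y_s - y_e           # [6] edge height
    b0 = 0.5 * (s + e)       # [7] abscissa of the centre of symmetry
    c0 = y_e                 # [8] base value
    return a0, b0, c0

# Synthetic test row (assumed): a bright-to-dark step with mild noise, then step (1).
x = np.arange(64, dtype=float)
row = 100.0 / (1.0 + np.exp((x - 30.0) / 1.2)) + 40.0
row = gaussian_filter1d(row + np.random.normal(0.0, 1.0, row.shape), sigma=1.0)

s, e = find_boundary_points(row)        # step (2)
a0, b0, c0 = initial_values(row, s, e)  # step (3)
print(s, e, a0, b0, c0)
```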
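Continuing the sketch above, step (4) can be illustrated with SciPy's Levenberg-Marquardt solver (scipy.optimize.least_squares with method='lm'). The residual follows formulas [9]-[10]; σ is treated as known and fixed here, which is an assumption, since the patent does not spell out how σ is handled.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_step_edge(row, s, e, a0, b0, c0, sigma=1.0):
    """Fit a, b, c of formula [1] to the points between P_s and P_e and return
    the sub-pixel edge position x = b together with its grey level a/2 + c."""
    # Points P_i(x_i, y_i) between P_s and P_e with grey values in [y_e, y_s].
    xs = np.arange(s, e + 1, dtype=float)
    ys = np.asarray(row[s:e + 1], dtype=float)
    lo, hi = min(row[s], row[e]), max(row[s], row[e])
    keep = (ys >= lo) & (ys <= hi)
    xs, ys = xs[keep], ys[keep]
    if xs.size < 3:
        raise ValueError("need at least m >= 3 points")

    def residuals(p):
        a, b, c = p
        return ys - a / (1.0 + np.exp((xs - b) / sigma)) - c   # formula [9]

    # Levenberg-Marquardt minimisation of the sum of squared residuals, formula [10].
    sol = least_squares(residuals, x0=[a0, b0, c0], method="lm")
    a, b, c = sol.x
    return b, 0.5 * a + c

# Usage, continuing from the previous sketch:
# b, y_b = fit_step_edge(row, s, e, a0, b0, c0, sigma=1.2)
# print("sub-pixel edge position:", b)
```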
Simulation experiment.
A simulation experiment was carried out on the edge extraction method based on the step edge profile model; the steps are as follows:
(1) Generate an ideal step signal whose ideal step edge point is B = 1.50, as in Fig. 2(a).
(2) Smooth this step signal with a Gaussian function so that it approximates a real-world step signal, as in Fig. 2(b).
(3) Generate Gaussian-distributed additive noise with a signal-to-noise ratio of about 37 dB and superimpose it on the realistic step signal to obtain the edge profile signal, as in Fig. 2(c).
(4) Estimate the parameters a, b and c with the parameter estimation method described above, where b is the step edge point, as in Fig. 2(d).
(5) Repeat steps (3) and (4) 1000 times to obtain 1000 values of b; their mean is 1.5056.
(6) Compute the deviations (b - B) between the 1000 extracted step edge points b and the true step edge point B; the standard deviation is 0.0393. According to the 3σ rule for measurement error, the positional accuracy of the step edge extracted with the method based on the step edge profile model is better than 0.12 pixel, i.e. sub-pixel accuracy. The average time to extract each step edge point b with this method is 55 ms. In addition, following the same steps (1)-(6), the positional accuracies of the step edge point obtained by the interpolation method and by the method of first differentiating and then computing the first moment or Hessian matrix are 0.31 pixel and 0.22 pixel respectively, with average times of 108 ms and 145 ms. The edge extraction method based on the step edge profile model is therefore significantly faster than the other existing methods.
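The simulation can be reproduced in outline with the Python sketch below (an illustration of the procedure, not the authors' original code). The signal length, smoothing width, fixed σ and random seed are assumptions, so the numbers it prints will only be of the same order as the 1.5056 mean and 0.0393 standard deviation reported above.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
B = 1.50                                   # ideal step edge position, step (1)
x = np.arange(-8.0, 11.0)                  # sample positions around the edge (assumed)
ideal = np.where(x < B, 100.0, 20.0)       # ideal bright-to-dark step
sigma_smooth = 1.0                         # assumed Gaussian smoothing width
real = gaussian_filter1d(ideal, sigma_smooth)   # realistic step signal, step (2)

def fit_b(signal):
    """Fit a, b, c of formula [1] (sigma held fixed, an assumption) and return b."""
    a0 = signal.max() - signal.min()
    b0 = 0.5 * (x[0] + x[-1])
    c0 = signal.min()
    res = lambda p: signal - p[0] / (1.0 + np.exp((x - p[1]) / sigma_smooth)) - p[2]
    return least_squares(res, x0=[a0, b0, c0], method="lm").x[1]

bs = []
for _ in range(1000):                      # steps (3)-(5)
    # Gaussian noise at roughly 37 dB signal-to-noise ratio (power ratio ~10^3.7).
    noise_std = np.sqrt(np.mean(real ** 2) / 10.0 ** (37.0 / 10.0))
    noisy = real + rng.normal(0.0, noise_std, real.shape)
    bs.append(fit_b(noisy))

bs = np.array(bs)
print("mean b:", bs.mean(), " std of (b - B):", (bs - B).std())   # step (6)
```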
Fig. 3 shows an example of edge extraction with this algorithm. Even by visual inspection it can be seen that the algorithm locates the edge accurately.

Claims (1)

1. A fast sub-pixel-level step edge extraction method, characterized in that the step edge profile model is defined as:
y = a / (1 + e^{(x-b)/σ}) + c    [1]
The steps for extracting the sub-pixel-level step edge are as follows:
(1) Smooth the original image with a Gaussian convolution kernel;
(2) Scan the image from top to bottom and from left to right, row by row (or column by column). If the grey value of the current scan point has dropped by more than 30 relative to the previously scanned points, the current point is the steady-state grey-level maximum boundary point P_s of the step edge. Continue scanning after P_s; if the grey value of the current scan point changes by less than 5 relative to the previously scanned points, the current point is the steady-state grey-level minimum boundary point P_e of the step edge. Grey values range from 0 to 255. The grey value y_s and abscissa x_s of P_s and the grey value y_e and abscissa x_e of P_e therefore satisfy:
y_s = y_max, x_s = x_min
y_e = y_min, x_e = x_max    [5]
(3) Compute the initial values a_0, b_0 and c_0 of the edge model parameters according to:
a_0 = y_max - y_min = y_s - y_e    [6]
b_0 = (x_max + x_min)/2 = (x_s + x_e)/2    [7]
c_0 = y_min = y_e    [8]
(4) Compute the optimal values of a, b and c and determine the step edge:
1. Between the rows (or columns) containing P_s and P_e, search along the row (or column) for the points P_i(x_i, y_i), i = 1, 2, ..., m, whose grey values satisfy y_e ≤ y_i ≤ y_s, where m is the number of points found and m ≥ 3;
2. From formula [1], let
f(x, y) = y - a / (1 + e^{(x-b)/σ}) - c    [9]
and take as the optimization objective
min Σ_{i=1}^{m} f²(x_i, y_i)    [10]
3. Substitute the points P_i(x_i, y_i), i = 1, 2, ..., m, into formula [10] to obtain an overdetermined system of nonlinear equations in a, b and c. Solve for the optimal values of a, b and c with the Levenberg-Marquardt nonlinear optimization algorithm; the point x = b, y = a/2 + c is then the position of the step edge point.
CNB2004101025873A 2004-12-28 2004-12-28 Quick method for picking up stepped edge in sub pixel level Expired - Fee Related CN100357974C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2004101025873A CN100357974C (en) 2004-12-28 2004-12-28 Quick method for picking up stepped edge in sub pixel level

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB2004101025873A CN100357974C (en) 2004-12-28 2004-12-28 Quick method for picking up stepped edge in sub pixel level

Publications (2)

Publication Number Publication Date
CN1797470A true CN1797470A (en) 2006-07-05
CN100357974C CN100357974C (en) 2007-12-26

Family

ID=36818479

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2004101025873A Expired - Fee Related CN100357974C (en) 2004-12-28 2004-12-28 Quick method for picking up stepped edge in sub pixel level

Country Status (1)

Country Link
CN (1) CN100357974C (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101196389B (en) * 2006-12-05 2011-01-05 鸿富锦精密工业(深圳)有限公司 Image measuring system and method
CN105225221A (en) * 2014-07-02 2016-01-06 中国科学院广州生物医药与健康研究院 Method for detecting image edge and system
CN105225221B (en) * 2014-07-02 2018-02-06 中国科学院广州生物医药与健康研究院 Method for detecting image edge and system
WO2018072208A1 (en) * 2016-10-21 2018-04-26 Abb Schweiz Ag Method, electronic device and system of picking an object from a container
CN113449264A (en) * 2020-03-27 2021-09-28 中国移动通信集团设计院有限公司 Method and device for monitoring waveform edge
CN113449264B (en) * 2020-03-27 2023-08-15 中国移动通信集团设计院有限公司 Waveform edge monitoring method and device
CN111882570A (en) * 2020-07-28 2020-11-03 浙江水晶光电科技股份有限公司 Edge positioning method and device, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN100357974C (en) 2007-12-26

Similar Documents

Publication Publication Date Title
CN109087274B (en) Electronic device defect detection method and device based on multi-dimensional fusion and semantic segmentation
CN112991347B (en) Three-dimensional-based train bolt looseness detection method
CN100351039C (en) Precisive measurement of static knife profile
CN103390280B (en) Based on the Fast Threshold dividing method of Gray Level-Gradient two-dimensional symmetric Tsallis cross entropy
CN101673338B (en) Fuzzy license plate identification method based on multi-angle projection
CN102324099B (en) Step edge detection method oriented to humanoid robot
CN105865344A (en) Workpiece dimension measuring method and device based on machine vision
CN110853081B (en) Ground and airborne LiDAR point cloud registration method based on single-tree segmentation
CN109708658B (en) Visual odometer method based on convolutional neural network
CN106204528A (en) A kind of size detecting method of part geometry quality
CN109685827B (en) Target detection and tracking method based on DSP
CN111462066A (en) Thread parameter detection method based on machine vision
CN101029823A (en) Method for tracking vehicle based on state and classification
CN1702684A (en) Strong noise image characteristic points automatic extraction method
CN106875430B (en) Single moving target tracking method and device based on fixed form under dynamic background
CN1306454C (en) Modified transfer function measuring method and system
CN107194402B (en) Parallel refined skeleton extraction method
CN1797470A (en) Quick method for picking up stepped edge in sub pixel level
TWI383690B (en) Method for image processing
CN105160644A (en) Method for positioning center of crisscross image in CCD image measurement system
CN1766928A (en) A kind of motion object center of gravity track extraction method based on the dynamic background sport video
CN105005991B (en) A kind of method for calculating atom barycenter displacement in high resolution scanning transmission image in batches
CN103632373B (en) A kind of flco detection method of three-frame difference high-order statistic combination OTSU algorithms
CN111429437B (en) Image non-reference definition quality detection method for target detection
CN1896682A (en) X-shaped angular-point sub-pixel extraction

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20071226

Termination date: 20111228