CN1763472A - Quick and high-precision method for extracting center of structured light stripe - Google Patents

Quick and high-precision method for extracting center of structured light stripe

Info

Publication number
CN1763472A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 200510123724
Other languages
Chinese (zh)
Other versions
CN100491899C (en)
Inventor
周富强 (Zhou Fuqiang)
张广军 (Zhang Guangjun)
江洁 (Jiang Jie)
胡坤 (Hu Kun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Beijing University of Aeronautics and Astronautics
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CNB2005101237246A priority Critical patent/CN100491899C/en
Publication of CN1763472A publication Critical patent/CN1763472A/en
Application granted granted Critical
Publication of CN100491899C publication Critical patent/CN100491899C/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)

Abstract

The invention relates to a method for extracting the center of a structured-light stripe, comprising the following steps: convolve the image z(x, y) with derivative-of-Gaussian kernels to obtain r_x, r_y, r_xx, r_xy, r_yy; exploit the separability of the Gaussian function to convert the two-dimensional convolutions of the image into one-dimensional convolutions; implement the one-dimensional Gaussian convolutions recursively; compute the eigenvalues and eigenvectors of the Hessian matrix to obtain the stripe normal direction (n_x, n_y); and solve for the sub-pixel coordinates of the stripe center from a second-order Taylor expansion. The invention reduces the amount of computation and thus enables fast extraction of the structured-light stripe center line.

Description

A fast, high-precision method for extracting the center of a structured-light stripe
Technical field
The invention belongs to the field of measurement technology and relates to an improved method for extracting the center of a structured-light stripe.
Background technology
In an active vision system based on the triangulation principle, the structured-light stripe is modulated by depth variations on the surface of the measured object and therefore appears distorted in the image. The degree of distortion encodes the relative position between the laser and the CCD camera as well as the depth of the object surface. To recover this information, the precise position of the stripe center must first be obtained from the image containing the stripe. Hence, in structured-light vision inspection, a key image-processing task is to locate the center of the structured-light stripe accurately. Common stripe center-line extraction methods include the gray-level threshold method, the extremum method, and the gradient threshold method (see: Yu Qifeng, Lu Hongwei, Liu Xiaolin, "Precision Measurement and Motion Measurement Based on Images", Beijing: Science Press, 2002). These methods are simple to implement but not very accurate. He Junji et al. (see: He Junji, Zhang Guangjun, "Research on Light-Stripe Image Processing Methods in Structured-Light 3D Vision Inspection", Journal of Beijing University of Aeronautics and Astronautics, 2003, 29(7): 593-597) fit a Gaussian or parabola to the stripe cross-section and obtain the sub-pixel position of the stripe center from its extremum; this method is only suitable for nearly straight stripes whose normal direction varies little across the image. Hu Bin et al. (Hu Bin, Li Dehua, Jin Gang, Hu Hanping, "Structured-Light Stripe Center Detection Method Based on Direction Templates", Computer Engineering and Applications, 2002, 11: 59-60)
proposed a stripe center detection method based on direction templates, which uses multiple direction templates to detect the sub-pixel position of stripe centers even where the normal direction changes sharply. It is robust to noise and can repair broken lines to some extent, but its computation is relatively complex. A stripe in the image can be regarded as a curvilinear structure, i.e., a line of a certain width. Steger (STEGER C, "An Unbiased Detector of Curvilinear Structures", IEEE Transactions on Pattern Analysis and Machine Intelligence, 1998, 20(2): 113-125) uses the Hessian matrix to obtain the normal direction of the stripe in the image and then finds the extremum along the normal to obtain the sub-pixel position of the stripe center line. The Steger method is accurate and robust and is widely used in character recognition, road recognition in aerial or satellite images, and the extraction of blood vessels and bones in medical images. Its main drawback is the large amount of computation, which makes it difficult to extract structured-light stripe center lines quickly and to meet demanding real-time requirements.
Summary of the invention
The object of the invention is to propose a fast method for extracting the center of a structured-light stripe, so as to better meet the real-time requirements of machine-vision inspection applications.
The technical solution of the invention is a fast method for extracting the center of a structured-light stripe, characterized by the following steps:
1. Convolve the image z(x, y) with derivative-of-Gaussian kernels to obtain r_x, r_y, r_xx, r_xy, r_yy, as in formula [1]:

r_x  = (∂g(x,y)/∂x) ⊗ z(x,y)
r_y  = (∂g(x,y)/∂y) ⊗ z(x,y)
r_xx = (∂²g(x,y)/∂x²) ⊗ z(x,y)
r_yy = (∂²g(x,y)/∂y²) ⊗ z(x,y)
r_xy = (∂²g(x,y)/∂x∂y) ⊗ z(x,y)                                   [1]
1.1 Exploit the separability of the Gaussian function to convert the two-dimensional convolutions of the image into one-dimensional convolutions. A two-dimensional Gaussian convolution of the image z(x, y) is equivalent to a Gaussian row convolution followed by a Gaussian column convolution. For an n × n two-dimensional Gaussian template, the two-dimensional convolutions of formula [1] cost 5n² multiply-adds per pixel, whereas the one-dimensional convolutions cost 10n multiply-adds. The two-dimensional convolutions of formula [1] are therefore converted into the pairs of one-dimensional convolutions of formula [2]:

r_x  = g(y) ⊗ ((dg(x)/dx) ⊗ z(x,y))
r_y  = (dg(y)/dy) ⊗ (g(x) ⊗ z(x,y))
r_xx = g(y) ⊗ ((d²g(x)/dx²) ⊗ z(x,y))
r_yy = (d²g(y)/dy²) ⊗ (g(x) ⊗ z(x,y))
r_xy = (dg(y)/dy) ⊗ ((dg(x)/dx) ⊗ z(x,y))                         [2]
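To make step 1.1 concrete, here is a minimal NumPy sketch (illustrative only; the kernel builder and helper names are not from the patent) that computes r_x by two 1-D passes and checks it against a direct 2-D convolution with the equivalent outer-product kernel:

```python
import numpy as np

def gauss_kernels(sigma, n):
    """1-D Gaussian g and its first derivative dg on an n-point grid."""
    x = np.arange(n) - n // 2
    g = np.exp(-x**2 / (2.0 * sigma**2))
    g /= g.sum()                        # normalize to unit DC gain
    dg = -x / sigma**2 * g              # d/dx of the Gaussian
    return g, dg

def conv2d_same(img, K):
    """Direct 'same'-size 2-D convolution (slow reference implementation)."""
    h, w = img.shape
    n = K.shape[0]
    p = n // 2
    padded = np.pad(img, p)
    Kf = K[::-1, ::-1]                  # flip the kernel: true convolution
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = (padded[i:i + n, j:j + n] * Kf).sum()
    return out

def separable_rx(img, g, dg):
    """r_x = g(y) ⊗ ((dg(x)/dx) ⊗ z): a 1-D pass along rows, then columns."""
    tmp = np.apply_along_axis(lambda r: np.convolve(r, dg, mode='same'), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, g, mode='same'), 0, tmp)

rng = np.random.default_rng(0)
img = rng.random((16, 16))
g, dg = gauss_kernels(sigma=1.0, n=7)
full = conv2d_same(img, np.outer(g, dg))   # 2-D kernel = outer product of g and dg
sep = separable_rx(img, g, dg)
```

On the interior of the image (at least n/2 pixels from every border, where the zero padding never enters) the two results agree exactly; this is the equivalence that formula [2] relies on.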
1.2 Implement the one-dimensional Gaussian convolutions recursively to eliminate redundant computation. The recursive implementation consists of a forward pass and a backward pass; the input of the backward pass is the output of the forward pass. Let I(n) be a row or column of the input image, let T(n), T′(n) and T″(n) be the forward-pass outputs of the Gaussian convolution, its first derivative and its second derivative, and let O(n), O′(n) and O″(n) be the corresponding backward-pass outputs. Then:

T(n)  = B·I(n) + Σ_{i=1}^{3} c_i·T(n−i)
O(n)  = B·T(n) + Σ_{i=1}^{3} c_i·O(n+i)
T′(n) = B·(I(n+1) − I(n−1))/2 + Σ_{i=1}^{3} c_i·T′(n−i)
O′(n) = B·T′(n) + Σ_{i=1}^{3} c_i·O′(n+i)
T″(n) = B·(I(n) − I(n−1)) + Σ_{i=1}^{3} c_i·T″(n−i)
O″(n) = B·(T″(n+1) − T″(n)) + Σ_{i=1}^{3} c_i·O″(n+i)             [3]
The coefficients c_1, c_2, c_3 and B are:

q = 0.98711·σ − 0.96330,                   σ > 2.5
q = 3.97156 − 4.14554·√(1 − 0.26891·σ),    0.5 ≤ σ ≤ 2.5

c_0 = 1.57825 + 2.44413·q + 1.4281·q² + 0.422205·q³
c_1 = (2.44413·q + 2.85619·q² + 1.26661·q³)/c_0
c_2 = −(1.4281·q² + 1.26661·q³)/c_0
c_3 = 0.422205·q³/c_0
B = 1 − (c_1 + c_2 + c_3);
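A runnable sketch of the smoothing branch of formula [3], with the coefficient formulas above (the function name is hypothetical, and zero boundary conditions are assumed outside the array):

```python
import math

def recursive_gauss(signal, sigma):
    """Recursive (IIR) approximation of 1-D Gaussian smoothing, per formula [3].

    Forward pass T, then backward pass O, with the q-dependent
    coefficients c1, c2, c3 and B from the text.
    """
    if sigma > 2.5:
        q = 0.98711 * sigma - 0.96330
    else:                                # 0.5 <= sigma <= 2.5
        q = 3.97156 - 4.14554 * math.sqrt(1.0 - 0.26891 * sigma)
    c0 = 1.57825 + 2.44413 * q + 1.4281 * q**2 + 0.422205 * q**3
    c1 = (2.44413 * q + 2.85619 * q**2 + 1.26661 * q**3) / c0
    c2 = -(1.4281 * q**2 + 1.26661 * q**3) / c0
    c3 = 0.422205 * q**3 / c0
    B = 1.0 - (c1 + c2 + c3)

    N = len(signal)
    T = [0.0] * N                        # forward recursion
    for n in range(N):
        T[n] = B * signal[n] + sum(
            c * T[n - i] for i, c in ((1, c1), (2, c2), (3, c3)) if n - i >= 0)
    O = [0.0] * N                        # backward recursion on T
    for n in range(N - 1, -1, -1):
        O[n] = B * T[n] + sum(
            c * O[n + i] for i, c in ((1, c1), (2, c2), (3, c3)) if n + i < N)
    return O
```

Note that the cost per sample is a fixed number of multiply-adds, independent of σ, unlike a direct convolution whose template grows with σ.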
2. Compute the eigenvalues and eigenvectors of the Hessian matrix to obtain the normal direction (n_x, n_y) of the stripe. The Hessian matrix H(x, y) is given by formula [4]; the normal direction (n_x, n_y) is given by the eigenvector of the Hessian matrix of the image z(x, y) corresponding to the eigenvalue of largest absolute value:

H(x,y) = [ ∂²g(x,y)/∂x²   ∂²g(x,y)/∂x∂y ]
         [ ∂²g(x,y)/∂x∂y  ∂²g(x,y)/∂y²  ] ⊗ z(x,y)
       = [ r_xx  r_xy ]
         [ r_xy  r_yy ]                                            [4]
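At a single pixel, step 2 reduces to a 2 × 2 symmetric eigenproblem; a minimal NumPy sketch (the helper name is hypothetical):

```python
import numpy as np

def stripe_normal(rxx, rxy, ryy):
    """Unit normal (nx, ny) of the stripe at one pixel, per formula [4]."""
    H = np.array([[rxx, rxy],
                  [rxy, ryy]])
    w, v = np.linalg.eigh(H)             # symmetric 2x2 eigendecomposition
    k = int(np.argmax(np.abs(w)))        # eigenvalue of largest absolute value
    nx, ny = v[:, k]                     # its eigenvector, already unit length
    return nx, ny
```

For a vertical bright stripe the intensity curves strongly along x and hardly at all along y, so the normal comes out along the x axis, as expected.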
3. Solve for the sub-pixel coordinates of the stripe center from a Taylor expansion. Let (n_x, n_y) be the unit vector of the stripe normal obtained from the Hessian matrix and take the current point (x_0, y_0) as the base point. Expanding the gray-level distribution on the stripe cross-section to second order, the gray level at a point (x_0 + t·n_x, y_0 + t·n_y) on the cross-section can be expressed as:

z(x_0 + t·n_x, y_0 + t·n_y) = z(x_0, y_0) + N·(r_x, r_y)ᵀ + N·H(x,y)·Nᵀ/2

where N = (t·n_x, t·n_y). Setting ∂z/∂t = 0 gives:

t = −(n_x·r_x + n_y·r_y) / (n_x²·r_xx + 2·n_x·n_y·r_xy + n_y²·r_yy)

The precise position of the stripe center point is then (x_0 + t·n_x, y_0 + t·n_y).
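The closing solve of step 3 is a one-line formula; a small sketch with a synthetic 1-D parabolic profile (all names and values are illustrative, not from the patent):

```python
def subpixel_center(x0, y0, nx, ny, rx, ry, rxx, rxy, ryy):
    """Sub-pixel stripe center (x0 + t*nx, y0 + t*ny) from the Taylor expansion."""
    t = -(nx * rx + ny * ry) / (nx * nx * rxx + 2.0 * nx * ny * rxy + ny * ny * ryy)
    return x0 + t * nx, y0 + t * ny

# Synthetic check: the profile z(x) = 1 - (x - 0.3)^2 peaks at x = 0.3.
# At x0 = 0 its derivatives are rx = 0.6 and rxx = -2, and the normal is (1, 0).
cx, cy = subpixel_center(0.0, 0.0, 1.0, 0.0, 0.6, 0.0, -2.0, 0.0, 0.0)
```

The recovered center is exactly the true peak of the parabola, which is the sense in which the method is sub-pixel accurate on an ideal profile.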
The advantage of the invention is that the proposed fast recursive convolution algorithm greatly reduces the amount of computation without sacrificing accuracy or robustness, enabling fast extraction of the structured-light stripe center line and laying the foundation for real-time application of the algorithm.
Description of drawings
Fig. 1 compares the amount of computation of three different implementations. In the figure, the upper dashed line is the convolution cost of the true two-dimensional Gaussian convolution; the lower dashed line is the cost when only the separable-convolution part of the method is used; the solid line is the cost of the full recursive-convolution implementation of the invention.
Fig. 2 compares the results of processing the same structured-light image with the convolution separation method and the recursive convolution method; Fig. 2a is the original image, Fig. 2b is the result of the convolution separation method, and Fig. 2c is the result of the method of the invention.
Fig. 3 is the flow chart of the structured-light stripe center extraction proposed by the invention.
Embodiment
The invention is described in further detail below. First, the basic principle of the Hessian-matrix-based method for extracting the structured-light stripe center is briefly introduced.
The basic idea of the sub-pixel structured-light stripe center extraction algorithm based on the Hessian matrix is: first use the Hessian matrix to determine the normal direction of the stripe, then use a Taylor expansion along the normal to obtain the sub-pixel position of the stripe center.
The normal direction of the stripe is the direction in which the absolute value of the second directional derivative of the image z(x, y) attains its maximum (where x and y are the horizontal and vertical image coordinates); it can be determined from the eigenvalues and eigenvectors of the Hessian matrix. The Hessian matrix can be expressed as:
H(x,y) = [ ∂²g(x,y)/∂x²   ∂²g(x,y)/∂x∂y ]
         [ ∂²g(x,y)/∂x∂y  ∂²g(x,y)/∂y²  ] ⊗ z(x,y)
       = [ r_xx  r_xy ]
         [ r_xy  r_yy ]                                            (a)
where g(x, y) is the two-dimensional Gaussian function and ⊗ denotes the convolution operation. The normal direction at a point (x_0, y_0) is given by the eigenvector of the Hessian matrix of the image z(x, y) at that point corresponding to the eigenvalue of largest absolute value.
Let (n_x, n_y) be the unit vector of the stripe normal obtained from the Hessian matrix and take the point (x_0, y_0) as the base point. Expanding the gray-level distribution on the stripe cross-section to second order, the gray level at a point (x_0 + t·n_x, y_0 + t·n_y) on the cross-section can be expressed as:

z(x_0 + t·n_x, y_0 + t·n_y) = z(x_0, y_0) + N·(r_x, r_y)ᵀ + N·H(x,y)·Nᵀ/2      (b)

where N = (t·n_x, t·n_y), and r_x, r_y are obtained by convolving the image z(x, y) with the corresponding derivative-of-Gaussian kernels:

r_x = (∂g(x,y)/∂x) ⊗ z(x,y)
r_y = (∂g(x,y)/∂y) ⊗ z(x,y)                                        (c)

Setting ∂z/∂t = 0 in formula (b) gives:

t = −(n_x·r_x + n_y·r_y) / (n_x²·r_xx + 2·n_x·n_y·r_xy + n_y²·r_yy)            (d)

The precise position of the stripe center point is then (x_0 + t·n_x, y_0 + t·n_y).
As the above description shows, computing the Hessian matrix requires at least five large-template two-dimensional Gaussian convolutions over the entire image. This makes the amount of computation very large and real-time extraction of the structured-light stripe center difficult. It is therefore necessary to reduce the amount of computation while preserving the extraction accuracy, so as to improve the speed of the algorithm.
According to the separability of the Gaussian function (see: Jia Yunde, "Machine Vision", Beijing: Science Press, 2000), the two-dimensional Gaussian convolution of the image z(x, y) is equivalent to a Gaussian row convolution followed by a Gaussian column convolution. For an n × n two-dimensional Gaussian template, one two-dimensional convolution costs n² multiply-adds per pixel, whereas the two one-dimensional convolutions cost 2n. The separability of the Gaussian convolution thus reduces the cost of the convolution, and the derivative-of-Gaussian kernels satisfy the same property. Using separability, the five two-dimensional convolutions of the algorithm become ten one-dimensional convolutions; two of these are identical, so with a suitable ordering of the computations only eight one-dimensional convolutions remain. The convolution cost per image point therefore drops from the 5n² multiply-adds of direct two-dimensional convolution to 8n multiply-adds.
The method of the invention may also be called a "fast recursive convolution algorithm". In fact, in each two-dimensional Gaussian convolution the template overlap between two adjacent pixels covers an n × (n − 2) region, so the computation over the image contains a large amount of redundancy. The convolution separation described above cannot remove this redundancy fundamentally, especially for large-template Gaussian convolutions, and the invention therefore introduces a recursive processing approach.
The convolution of an image with an operator f(k) is:

y(i) = Σ_{k=0}^{N−1} f(k)·x(i−k)                                   (e)

Its transfer function is F(z) = Σ_{n=0}^{N−1} f(n)·z⁻ⁿ, where i denotes the index of the image point and z⁻¹ is the delay operator of the Z-transform.
Taking the Z-transform of formula (e) gives Y(z) = F(z)·X(z), i.e. F(z) = Y(z)/X(z). Writing F(z) as an n-th order rational function:

F(z) = (Σ_{i=0}^{n−1} a_i·z⁻ⁱ) / (1 + Σ_{i=1}^{n} b_i·z⁻ⁱ) = Y(z)/X(z)

Y(z)·(1 + Σ_{i=1}^{n} b_i·z⁻ⁱ) = X(z)·(Σ_{i=0}^{n−1} a_i·z⁻ⁱ)      (f)

Taking the inverse Z-transform of formula (f) gives:

y(i) = Σ_{j=0}^{n−1} a_j·x(i−j) − Σ_{k=1}^{n} b_k·y(i−k)           (g)
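The difference equation above (the inverse-Z-transform result) is the general recursive filter that the Gaussian recursion instantiates; a minimal sketch (the function name is hypothetical) with a first-order smoothing example:

```python
def iir_apply(a, b, x):
    """y(i) = sum_j a[j]*x(i-j) - sum_k b[k]*y(i-k), the difference equation above.

    a: feed-forward taps a_0..a_{n-1}; b: feedback taps b_1..b_n.
    Samples before the start of x (or y) are taken as zero.
    """
    y = []
    for i in range(len(x)):
        acc = sum(a[j] * x[i - j] for j in range(len(a)) if i - j >= 0)
        acc -= sum(b[k - 1] * y[i - k] for k in range(1, len(b) + 1) if i - k >= 0)
        y.append(acc)
    return y

# First-order example: y(i) = 0.5*x(i) + 0.5*y(i-1), i.e. a = [0.5], b = [-0.5].
# A step input relaxes toward 1 because the filter has unit DC gain.
step = iir_apply([0.5], [-0.5], [1.0] * 25)
```

The point of the recursive form is that the cost per output sample depends only on the filter order n, not on the length N of the original convolution operator.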
Let the input image row or column be I(n), let T(n), T′(n) and T″(n) be the forward-pass outputs of the Gaussian convolution, its first derivative and its second derivative, and let O(n), O′(n) and O″(n) be the corresponding backward-pass outputs. Then:

T(n)  = B·I(n) + Σ_{i=1}^{3} c_i·T(n−i)
O(n)  = B·T(n) + Σ_{i=1}^{3} c_i·O(n+i)
T′(n) = B·(I(n+1) − I(n−1))/2 + Σ_{i=1}^{3} c_i·T′(n−i)
O′(n) = B·T′(n) + Σ_{i=1}^{3} c_i·O′(n+i)
T″(n) = B·(I(n) − I(n−1)) + Σ_{i=1}^{3} c_i·T″(n−i)
O″(n) = B·(T″(n+1) − T″(n)) + Σ_{i=1}^{3} c_i·O″(n+i)             [3]

where the coefficients c_1, c_2, c_3 and B are:

q = 0.98711·σ − 0.96330,                   σ > 2.5
q = 3.97156 − 4.14554·√(1 − 0.26891·σ),    0.5 ≤ σ ≤ 2.5

c_0 = 1.57825 + 2.44413·q + 1.4281·q² + 0.422205·q³
c_1 = (2.44413·q + 2.85619·q² + 1.26661·q³)/c_0
c_2 = −(1.4281·q² + 1.26661·q³)/c_0
c_3 = 0.422205·q³/c_0
B = 1 − (c_1 + c_2 + c_3)
According to the separability of the Gaussian function, the recursive convolution can be applied to the image successively along the row direction and the column direction; the result is consistent with a direct two-dimensional convolution of the image. The fast recursive-convolution extraction of the structured-light stripe center according to the invention is therefore summarized as follows:
1. Convolve the image z(x, y) with derivative-of-Gaussian kernels to obtain r_x, r_y, r_xx, r_xy, r_yy, as in formula [1]:

r_x  = (∂g(x,y)/∂x) ⊗ z(x,y)
r_y  = (∂g(x,y)/∂y) ⊗ z(x,y)
r_xx = (∂²g(x,y)/∂x²) ⊗ z(x,y)
r_yy = (∂²g(x,y)/∂y²) ⊗ z(x,y)
r_xy = (∂²g(x,y)/∂x∂y) ⊗ z(x,y)                                   [1]
1.1 Exploit the separability of the Gaussian function to convert the two-dimensional convolutions of the image into one-dimensional convolutions. A two-dimensional Gaussian convolution of the image z(x, y) is equivalent to a Gaussian row convolution followed by a Gaussian column convolution. For an n × n two-dimensional Gaussian template, the two-dimensional convolutions of formula [1] cost 5n² multiply-adds per pixel, whereas the one-dimensional convolutions cost 10n multiply-adds. The two-dimensional convolutions of formula [1] are therefore converted into the pairs of one-dimensional convolutions of formula [2]:

r_x  = g(y) ⊗ ((dg(x)/dx) ⊗ z(x,y))
r_y  = (dg(y)/dy) ⊗ (g(x) ⊗ z(x,y))
r_xx = g(y) ⊗ ((d²g(x)/dx²) ⊗ z(x,y))
r_yy = (d²g(y)/dy²) ⊗ (g(x) ⊗ z(x,y))
r_xy = (dg(y)/dy) ⊗ ((dg(x)/dx) ⊗ z(x,y))                         [2]
1.2 Implement the one-dimensional Gaussian convolutions recursively to eliminate redundant computation. The recursive implementation consists of a forward pass and a backward pass; the input of the backward pass is the output of the forward pass. Let I(n) be a row or column of the input image, let T(n), T′(n) and T″(n) be the forward-pass outputs of the Gaussian convolution, its first derivative and its second derivative, and let O(n), O′(n) and O″(n) be the corresponding backward-pass outputs. Then:

T(n)  = B·I(n) + Σ_{i=1}^{3} c_i·T(n−i)
O(n)  = B·T(n) + Σ_{i=1}^{3} c_i·O(n+i)
T′(n) = B·(I(n+1) − I(n−1))/2 + Σ_{i=1}^{3} c_i·T′(n−i)
O′(n) = B·T′(n) + Σ_{i=1}^{3} c_i·O′(n+i)
T″(n) = B·(I(n) − I(n−1)) + Σ_{i=1}^{3} c_i·T″(n−i)
O″(n) = B·(T″(n+1) − T″(n)) + Σ_{i=1}^{3} c_i·O″(n+i)             [3]
The coefficients c_1, c_2, c_3 and B are:

q = 0.98711·σ − 0.96330,                   σ > 2.5
q = 3.97156 − 4.14554·√(1 − 0.26891·σ),    0.5 ≤ σ ≤ 2.5

c_0 = 1.57825 + 2.44413·q + 1.4281·q² + 0.422205·q³
c_1 = (2.44413·q + 2.85619·q² + 1.26661·q³)/c_0
c_2 = −(1.4281·q² + 1.26661·q³)/c_0
c_3 = 0.422205·q³/c_0
B = 1 − (c_1 + c_2 + c_3);
2. Compute the eigenvalues and eigenvectors of the Hessian matrix to obtain the normal direction (n_x, n_y) of the stripe. The Hessian matrix H(x, y) is given by formula [4]; the normal direction (n_x, n_y) is given by the eigenvector of the Hessian matrix of the image z(x, y) corresponding to the eigenvalue of largest absolute value:

H(x,y) = [ ∂²g(x,y)/∂x²   ∂²g(x,y)/∂x∂y ]
         [ ∂²g(x,y)/∂x∂y  ∂²g(x,y)/∂y²  ] ⊗ z(x,y)
       = [ r_xx  r_xy ]
         [ r_xy  r_yy ]                                            [4]
3. Solve for the sub-pixel coordinates of the stripe center from a Taylor expansion. Let (n_x, n_y) be the unit vector of the stripe normal obtained from the Hessian matrix and take the current point (x_0, y_0) as the base point. Expanding the gray-level distribution on the stripe cross-section to second order, the gray level at a point (x_0 + t·n_x, y_0 + t·n_y) on the cross-section can be expressed as:

z(x_0 + t·n_x, y_0 + t·n_y) = z(x_0, y_0) + N·(r_x, r_y)ᵀ + N·H(x,y)·Nᵀ/2

where N = (t·n_x, t·n_y). Setting ∂z/∂t = 0 gives:

t = −(n_x·r_x + n_y·r_y) / (n_x²·r_xx + 2·n_x·n_y·r_xy + n_y²·r_yy)

The precise position of the stripe center point is then (x_0 + t·n_x, y_0 + t·n_y).
Embodiment
The performance of the original algorithm and the fast algorithm was compared experimentally in three respects: amount of computation, extraction accuracy, and robustness.
Comparison of the amount of computation.
Referring to Fig. 1, which contrasts the computation of the three implementations as a function of template size: suppose the image containing the structured-light stripe is 768 × 576, choose the Gaussian parameter σ = 2.5, and take the Gaussian template size n = [8σ] + 1, so for σ = 2.5, n = 21. The true two-dimensional Gaussian convolution costs 5n² × 768 × 576 = 975,421,440 multiply-adds; the separable convolution implementation costs 8n × 768 × 576 = 74,317,824 multiply-adds; the fast recursive convolution implementation costs 8 × 8 × 768 × 576 = 28,311,552 multiply-adds. Clearly the fast recursive method greatly reduces the amount of computation, and its cost is independent of the choice of the Gaussian parameter.
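The three totals quoted above can be checked with a few lines of arithmetic (sizes taken from the text):

```python
# Image and template sizes from the embodiment.
W, H = 768, 576
n = 21                                   # template size for sigma = 2.5: [8*sigma] + 1
pixels = W * H

full_2d   = 5 * n * n * pixels           # five n x n 2-D convolutions per pixel
separable = 8 * n * pixels               # eight shared 1-D convolutions per pixel
recursive = 8 * 8 * pixels               # eight recursive passes, ~8 mult-adds each
```

The recursive count contains no factor of n at all, which is why the solid line in Fig. 1 is flat as the template grows.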
On a PC with a 2.4 GHz Pentium processor, the convolution separation algorithm and the recursive convolution algorithm were implemented in Visual C++ 6.0 and applied to the same 768 × 576 image with σ = 4; the processing times were 650 ms and 200 ms respectively. Compared with the convolution separation algorithm, the fast recursive convolution algorithm of the invention thus greatly reduces the amount of computation and significantly improves the speed.
Comparison of extraction accuracy.
A real image was used to compare the extraction accuracy of the convolution separation algorithm and the fast recursive convolution algorithm of the invention; the results are shown in Table 1.
Table 1. Comparison of the extraction results of the two algorithms

Rounded coordinate    Convolution separation    Recursive fast algorithm    Error                  Distance    RMS
x/pixels   y/pixels   x/pixels    y/pixels      x/pixels    y/pixels        x/pixels   y/pixels    error       error
275        120        274.93      119.78        274.86      119.81          0.07       0.03        0.076       0.113
275        121        275.02      121.07        275.06      121.08          0.04       0.01        0.041
274        122        273.82      121.46        273.65      121.51          0.17       0.05        0.177
275        122        275.12      122.37        275.25      122.36          0.13       0.01        0.130
274        123        273.92      122.76        273.85      122.79          0.07       0.03        0.076
274        124        274.02      124.05        274.04      124.06          0.02       0.01        0.022
273        125        272.82      124.44        272.62      124.47          0.20       0.03        0.202
274        125        274.11      125.35        274.23      125.33          0.12       0.02        0.122
273        126        272.91      125.73        272.81      125.73          0.10       0.00        0.100
273        127        273.01      127.01        272.98      126.99          0.03       0.02        0.036

(The RMS error, 0.113 pixels, is computed over all ten distance errors.)
As Table 1 shows, the deviation between the fast recursive convolution algorithm of the invention and the convolution separation method is about 0.113 pixels (RMS). The fast recursive algorithm therefore speeds up the stripe extraction process while achieving essentially the same extraction accuracy as the convolution separation method.
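As a check, the RMS figure in Table 1 can be reproduced from the ten per-point distance errors:

```python
import math

# Distance errors from Table 1, in pixels.
errors = [0.076, 0.041, 0.177, 0.130, 0.076, 0.022, 0.202, 0.122, 0.100, 0.036]
rms = math.sqrt(sum(e * e for e in errors) / len(errors))   # root-mean-square
```

This evaluates to approximately 0.113 pixels, matching the RMS column of the table.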
Comparison of algorithm robustness.
Referring to Fig. 2: Fig. 2a is the original image, Fig. 2b is the result of the convolution separation method, and Fig. 2c is the result of the method of the invention. Comparing the results of the two algorithms on the same structured-light image shows that both methods are robust.

Claims (1)

1. A fast method for extracting the center of a structured-light stripe, characterized by the following steps:
1.1 Convolve the image z(x, y) with derivative-of-Gaussian kernels to obtain r_x, r_y, r_xx, r_xy, r_yy, as in formula [1]:

r_x  = (∂g(x,y)/∂x) ⊗ z(x,y)
r_y  = (∂g(x,y)/∂y) ⊗ z(x,y)
r_xx = (∂²g(x,y)/∂x²) ⊗ z(x,y)
r_yy = (∂²g(x,y)/∂y²) ⊗ z(x,y)
r_xy = (∂²g(x,y)/∂x∂y) ⊗ z(x,y)                                   [1]
1.1.1 Exploit the separability of the Gaussian function to convert the two-dimensional convolutions of the image into one-dimensional convolutions. A two-dimensional Gaussian convolution of the image z(x, y) is equivalent to a Gaussian row convolution followed by a Gaussian column convolution. For an n × n two-dimensional Gaussian template, the two-dimensional convolutions of formula [1] cost 5n² multiply-adds per pixel, whereas the one-dimensional convolutions cost 10n multiply-adds. The two-dimensional convolutions of formula [1] are therefore converted into the pairs of one-dimensional convolutions of formula [2]:

r_x  = g(y) ⊗ ((dg(x)/dx) ⊗ z(x,y))
r_y  = (dg(y)/dy) ⊗ (g(x) ⊗ z(x,y))
r_xx = g(y) ⊗ ((d²g(x)/dx²) ⊗ z(x,y))
r_yy = (d²g(y)/dy²) ⊗ (g(x) ⊗ z(x,y))
r_xy = (dg(y)/dy) ⊗ ((dg(x)/dx) ⊗ z(x,y))                         [2]
1.1.2 Implement the one-dimensional Gaussian convolutions recursively to eliminate redundant computation. The recursive implementation consists of a forward pass and a backward pass; the input of the backward pass is the output of the forward pass. Let I(n) be a row or column of the input image, let T(n), T′(n) and T″(n) be the forward-pass outputs of the Gaussian convolution, its first derivative and its second derivative, and let O(n), O′(n) and O″(n) be the corresponding backward-pass outputs. Then:

T(n)  = B·I(n) + Σ_{i=1}^{3} c_i·T(n−i)
O(n)  = B·T(n) + Σ_{i=1}^{3} c_i·O(n+i)
T′(n) = B·(I(n+1) − I(n−1))/2 + Σ_{i=1}^{3} c_i·T′(n−i)
O′(n) = B·T′(n) + Σ_{i=1}^{3} c_i·O′(n+i)
T″(n) = B·(I(n) − I(n−1)) + Σ_{i=1}^{3} c_i·T″(n−i)
O″(n) = B·(T″(n+1) − T″(n)) + Σ_{i=1}^{3} c_i·O″(n+i)             [3]
The coefficients c_1, c_2, c_3 and B are:

q = 0.98711·σ − 0.96330,                   σ > 2.5
q = 3.97156 − 4.14554·√(1 − 0.26891·σ),    0.5 ≤ σ ≤ 2.5

c_0 = 1.57825 + 2.44413·q + 1.4281·q² + 0.422205·q³
c_1 = (2.44413·q + 2.85619·q² + 1.26661·q³)/c_0
c_2 = −(1.4281·q² + 1.26661·q³)/c_0
c_3 = 0.422205·q³/c_0
B = 1 − (c_1 + c_2 + c_3);
1.2 Compute the eigenvalues and eigenvectors of the Hessian matrix to obtain the normal direction (n_x, n_y) of the stripe. The Hessian matrix H(x, y) is given by formula [4]; the normal direction (n_x, n_y) is given by the eigenvector of the Hessian matrix of the image z(x, y) corresponding to the eigenvalue of largest absolute value:

H(x,y) = [ ∂²g(x,y)/∂x²   ∂²g(x,y)/∂x∂y ]
         [ ∂²g(x,y)/∂x∂y  ∂²g(x,y)/∂y²  ] ⊗ z(x,y)
       = [ r_xx  r_xy ]
         [ r_xy  r_yy ]                                            [4]
1.3 Solve for the sub-pixel coordinates of the stripe center from a Taylor expansion. Let (n_x, n_y) be the unit vector of the stripe normal obtained from the Hessian matrix and take the current point (x_0, y_0) as the base point. Expanding the gray-level distribution on the stripe cross-section to second order, the gray level at a point (x_0 + t·n_x, y_0 + t·n_y) on the cross-section can be expressed as:

z(x_0 + t·n_x, y_0 + t·n_y) = z(x_0, y_0) + N·(r_x, r_y)ᵀ + N·H(x,y)·Nᵀ/2

where N = (t·n_x, t·n_y). Setting ∂z/∂t = 0 gives:

t = −(n_x·r_x + n_y·r_y) / (n_x²·r_xx + 2·n_x·n_y·r_xy + n_y²·r_yy)

The precise position of the stripe center point is then (x_0 + t·n_x, y_0 + t·n_y).
CNB2005101237246A 2005-11-22 2005-11-22 Quick and high-precision method for extracting center of structured light stripe Expired - Fee Related CN100491899C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2005101237246A CN100491899C (en) 2005-11-22 2005-11-22 Quick and high-precision method for extracting center of structured light stripe

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB2005101237246A CN100491899C (en) 2005-11-22 2005-11-22 Quick and high-precision method for extracting center of structured light stripe

Publications (2)

Publication Number Publication Date
CN1763472A true CN1763472A (en) 2006-04-26
CN100491899C CN100491899C (en) 2009-05-27

Family

ID=36747728

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2005101237246A Expired - Fee Related CN100491899C (en) 2005-11-22 2005-11-22 Quick and high-precision method for extracting center of structured light stripe

Country Status (1)

Country Link
CN (1) CN100491899C (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100405004C (en) * 2006-08-25 2008-07-23 Beihang University High-precision, fast extraction device and method for light-stripe image features
CN100371676C (en) * 2006-11-01 2008-02-27 Beihang University Method and device for fast, high-precision positioning of a light-spot image centroid
CN101504770B (en) * 2009-03-19 2010-04-21 Beihang University Structured-light stripe center extraction method
CN103411562A (en) * 2013-08-22 2013-11-27 University of Electronic Science and Technology of China Structured-light laser stripe center extraction method based on dynamic programming and mean shift
CN103411562B (en) * 2013-08-22 2016-01-13 University of Electronic Science and Technology of China Structured-light stripe center extraction method based on dynamic programming and mean shift
CN105005981A (en) * 2014-04-18 2015-10-28 Beihang University Multi-scale light-stripe center extraction method and apparatus
CN104217402A (en) * 2014-08-20 2014-12-17 Beijing QIYI Century Science and Technology Co., Ltd. Real-time Gaussian blur method and device for video on mobile devices
CN104657587A (en) * 2015-01-08 2015-05-27 Huazhong University of Science and Technology Method for extracting the center line of a laser stripe
CN104657587B (en) * 2015-01-08 2017-07-18 Huazhong University of Science and Technology Laser stripe center line extraction method
CN109102516A (en) * 2017-06-20 2018-12-28 Chengdu Aircraft Industrial (Group) Co., Ltd. Robust centerline extraction method for bent pipes
CN107687819A (en) * 2017-08-01 2018-02-13 Dalian University of Technology Fast, high-accuracy sub-pixel extraction method for light-stripe centers
CN109558819A (en) * 2018-11-20 2019-04-02 Beijing Institute of Remote Sensing Information Deep-network lightweighting method for remote-sensing target detection
CN110866924A (en) * 2019-09-24 2020-03-06 Chongqing University of Posts and Telecommunications Line-structured-light center line extraction method and storage medium
CN110866924B (en) * 2019-09-24 2023-04-07 Chongqing University of Posts and Telecommunications Line-structured-light center line extraction method and storage medium
CN113223074A (en) * 2021-05-06 2021-08-06 Harbin Engineering University Underwater laser stripe center extraction method

Also Published As

Publication number Publication date
CN100491899C (en) 2009-05-27

Similar Documents

Publication Publication Date Title
CN1763472A (en) Quick and high-precision method for extracting center of structured light stripe
CN109269430B (en) Multi-standing-tree breast height diameter passive measurement method based on deep extraction model
Chen et al. Pseudo-stereo for monocular 3d object detection in autonomous driving
Strecha et al. On benchmarking camera calibration and multi-view stereo for high resolution imagery
CN1282942C (en) Image processing method for appearance inspection
CN101059340A (en) Vehicle tread measurement method based on stereo vision and laser
CN109272524B (en) Small-scale point cloud noise denoising method based on threshold segmentation
CN101038678A Smooth symmetric surface reconstruction method based on a single image
Jeon et al. Reconstruction-based pairwise depth dataset for depth image enhancement using CNN
US20140210951A1 (en) Apparatus and method for reconstructing three-dimensional information
CN106204528A A dimension inspection method for the geometric quality of parts
CN109373912B (en) Binocular vision-based non-contact six-degree-of-freedom displacement measurement method
CN101030300A (en) Method for matching depth image
CN111080709A (en) Multispectral stereo camera self-calibration algorithm based on track feature registration
Wang et al. Three-dimensional reconstruction of dilute bubbly flow field with light-field images based on deep learning method
Kim et al. Performance analysis and validation of a stereo vision system
CN103700082B Image stitching method based on dual quaternion relative orientation
CN109215118B (en) Incremental motion structure recovery optimization method based on image sequence
CN1921562A Method for image noise reduction based on transform-domain mathematical morphology
CN111028280B #-shaped structured light camera system and method for performing scaled three-dimensional reconstruction of target
CN1173297C (en) Method for analysing digital image texture structure
US20140125844A1 (en) Image processing apparatus, image processing method, and non-transitory computer readable recording medium
CN114998448A (en) Method for calibrating multi-constraint binocular fisheye camera and positioning space point
CN108615221B (en) Light field angle super-resolution method and device based on shearing two-dimensional polar line plan
Barath P-HAF: Homography Estimation using Partial Local Affine Frames.
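The title patent (CN1763472A) is a Hessian-based stripe center extractor: per its abstract, the image z(x, y) is convolved with Gaussian derivatives to obtain rx, ry, rxx, rxy, ryy, the separability of the Gaussian turns each 2-D convolution into two 1-D passes, the eigenvector of the Hessian's largest-magnitude eigenvalue gives the stripe normal (nx, ny), and a Taylor expansion yields the sub-pixel center. Below is a minimal NumPy sketch of those steps for a bright stripe on a dark background; the recursive (IIR) Gaussian the patent uses for speed is replaced by plain FIR convolution, and the function names, threshold, and parameter values are illustrative rather than the patent's implementation.

```python
import numpy as np

def gaussian_kernels(sigma):
    """1-D Gaussian and its first/second derivatives, sampled on [-3*sigma, 3*sigma]."""
    r = int(np.ceil(3 * sigma))
    x = np.arange(-r, r + 1, dtype=float)
    g = np.exp(-x**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)
    g1 = -x / sigma**2 * g                       # dg/dx
    g2 = (x**2 / sigma**2 - 1) / sigma**2 * g    # d2g/dx2
    return g, g1, g2

def sep_conv(img, kx, ky):
    """Separable 2-D convolution: kx along x (columns), then ky along y (rows)."""
    tmp = np.apply_along_axis(np.convolve, 1, img, kx, mode='same')
    return np.apply_along_axis(np.convolve, 0, tmp, ky, mode='same')

def stripe_centers(img, sigma=2.0, thresh=0.3):
    """Sub-pixel stripe centers via Gaussian derivatives and Hessian analysis."""
    g, g1, g2 = gaussian_kernels(sigma)
    # The five partial derivatives of the smoothed image, each as two 1-D passes
    rx  = sep_conv(img, g1, g)
    ry  = sep_conv(img, g,  g1)
    rxx = sep_conv(img, g2, g)
    rxy = sep_conv(img, g1, g1)
    ryy = sep_conv(img, g,  g2)
    centers = []
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            if img[y, x] < thresh:               # only consider bright pixels
                continue
            H = np.array([[rxx[y, x], rxy[y, x]],
                          [rxy[y, x], ryy[y, x]]])
            w, v = np.linalg.eigh(H)
            nx, ny = v[:, np.argmax(np.abs(w))]  # stripe normal direction
            d2 = rxx[y, x]*nx*nx + 2*rxy[y, x]*nx*ny + ryy[y, x]*ny*ny
            if d2 == 0.0:
                continue
            # Taylor expansion along the normal: zero of the first derivative
            t = -(rx[y, x]*nx + ry[y, x]*ny) / d2
            if abs(t * nx) <= 0.5 and abs(t * ny) <= 0.5:
                centers.append((x + t * nx, y + t * ny))
    return centers
```

On a synthetic horizontal stripe the accepted points cluster on one sub-pixel center per column; the 0.5-pixel test keeps only the pixel whose corrected center lies inside it, which is how the extracted line stays one point thick.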

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20090527

Termination date: 20111122