CN105005981A - Light stripe center extraction method and apparatus based on multiple dimensions - Google Patents


Info

Publication number
CN105005981A
CN105005981A (publication); CN201410158714.5A (application); CN105005981B (granted publication)
Authority
CN
China
Prior art keywords
convolution
gaussian kernel
pixel
hessian matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410158714.5A
Other languages
Chinese (zh)
Other versions
CN105005981B (en)
Inventor
刘震
李凤娇
李小菁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201410158714.5A priority Critical patent/CN105005981B/en
Publication of CN105005981A publication Critical patent/CN105005981A/en
Application granted granted Critical
Publication of CN105005981B publication Critical patent/CN105005981B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a multi-scale light stripe center extraction method. The method comprises the following steps: performing noise-smoothing processing on an image to obtain a smoothed image; processing the smoothed image by a skeleton method to obtain initial center points of the light stripe and the corresponding initial normal directions; performing gray-scale Gaussian function fitting on the stripe cross-section along each initial normal direction to obtain the stripe width at each position of the light stripe; determining, according to the stripe width, a first convolution Gaussian kernel mean square deviation σ0 at each position; solving pixel-level candidate points of the stripe center by the Hessian matrix method using the first convolution Gaussian kernel corresponding to σ0; obtaining sub-pixel coordinates of the stripe center with each candidate point as a base point; and connecting the stripe center points to form the stripe center line. With the laser stripe center extraction method provided by the invention, the coordinates of the stripe center points are extracted by selecting an optimal Gaussian kernel mean square deviation at each position of the stripe; the precision is high, the versatility is good, and the anti-interference capability is strong.

Description

Multi-scale light stripe center extraction method and device
Technical Field
The present invention relates to the field of image processing, and in particular to a multi-scale light stripe center extraction method and device.
Background Art
In a line-structured-light measurement system, accurate extraction of the light stripe center is one of the key factors affecting the precision of the whole measuring system. Common stripe center-line extraction methods include:
Edge method and threshold method: the algorithms are simple and fast, but the precision is low.
Extremum method: Gaussian or parabolic fitting is carried out on the stripe cross-section, and the sub-pixel position of the stripe center is obtained from the extreme point of the fit; this method easily yields low extraction precision because of uneven stripe intensity distribution and noise.
Centroid (gravity) method: the centroid method can reduce the error caused by asymmetry of the stripe intensity distribution, but owing to factors such as the curved surface and structure of the measured object, the curvature of the stripe image varies considerably; to improve precision, the stripe normal direction must first be determined with a Hessian matrix, Sobel gradient operator, direction template or the like, which makes the algorithm complicated and computationally expensive.
Steger method: proposed by Dr. C. Steger of Germany, it uses the Hessian matrix to obtain the stripe normal direction in the image and then finds the extreme point along the normal to obtain the sub-pixel position of the stripe center line; it is widely used for its high precision and good robustness. However, for highly reflective laser stripe images whose stripe width changes drastically, it still cannot extract the stripe center with high precision, and the large number of convolution operations makes it slow.
Summary of the Invention
In view of this, the main purpose of the present invention is to provide a multi-scale light stripe center extraction method and device applicable to high-precision extraction of stripe center coordinates when the stripe width changes drastically under highly reflective conditions.
To achieve the above purpose, the technical solution of the present invention is realized as follows:
A first aspect of the present invention provides a multi-scale light stripe center extraction method, the method comprising:
carrying out noise-smoothing processing on an image to obtain a smoothed image;
processing the smoothed image by a skeleton method to obtain initial center points of the light stripe and the initial normal direction corresponding to each initial center point;
carrying out gray-scale Gaussian function fitting on the stripe cross-section along the initial normal direction at each initial center point to obtain the stripe width at each position of the stripe;
determining, according to the stripe width at each position, the first convolution Gaussian kernel mean square deviation σ0 at that position and the two-dimensional width N×N of the first convolution Gaussian kernel, and determining the first Gaussian convolution kernel according to σ0 and N×N;
computing by convolution, according to the first convolution Gaussian kernel, a first Hessian matrix for each pixel in the l×l neighborhood RON of each initial center point, l being a positive integer greater than 3;
finding pixel-level candidate points of the stripe center according to the first Hessian matrix;
obtaining, with each candidate point as a base point, the sub-pixel coordinates of the stripe center point;
connecting the stripe center points according to the sub-pixel coordinates to form the stripe center line.
Preferably,
after determining, according to the stripe width at each position, the convolution Gaussian kernel mean square deviation σ0 at each position, the method further comprises:
determining, according to σ−1 = σ0 − Δσ and σ+1 = σ0 + Δσ, the second convolution Gaussian kernel mean square deviation σ−1 and its two-dimensional width N−1×N−1, and the third convolution Gaussian kernel mean square deviation σ+1 and its two-dimensional width N+1×N+1 at each position; Δσ is a convolution Gaussian kernel mean square deviation correction factor;
computing by convolution, according to each second convolution Gaussian kernel, a second Hessian matrix for each pixel in the corresponding RON region;
computing by convolution, according to each third convolution Gaussian kernel, a third Hessian matrix for each pixel in the corresponding RON region;
before finding the pixel-level candidate points of the stripe center according to the first Hessian matrix, calculating C_m^{p_i} = σ_m²·|λ_m|_max for each pixel in the RON to form a set S;
S = { {C_{−1}^{p_1}, C_0^{p_1}, C_{+1}^{p_1}}, {C_{−1}^{p_2}, C_0^{p_2}, C_{+1}^{p_2}}, …, {C_{−1}^{p_I}, C_0^{p_I}, C_{+1}^{p_I}} };
taking the convolution Gaussian kernel mean square deviation corresponding to the maximum value in S as the best convolution Gaussian kernel mean square deviation σbest.
The finding of the pixel-level candidate points of the stripe center according to the first Hessian matrix is then:
computing the best Hessian matrix according to the convolution Gaussian kernel corresponding to σbest;
finding the pixel-level candidate points of the stripe center according to the best Hessian matrix;
wherein m takes the value −1, 0 or +1; |λ_m|_max is the maximum-magnitude eigenvalue of the Hessian matrix corresponding to σ_m; p_i is the i-th pixel in the RON region; I is the total number of pixels in the RON region; and i is a positive integer not greater than I.
Preferably, the two-dimensional widths of the first, second and third convolution Gaussian kernels are found according to the following formula:
N_m = 2·round(4·σ_m) + 1;
wherein round denotes rounding to the nearest integer.
Preferably, finding the first Hessian matrix, the second Hessian matrix and the third Hessian matrix comprises:
locating the l×l pixels in the RON region of each initial center point;
locating, centered on each pixel p_i of the RON region, a ROC region in the image, the size of the ROC region being equal to the two-dimensional width of the corresponding convolution Gaussian kernel;
carrying out single-pixel convolution of the first, second and third convolution Gaussian kernels with the corresponding ROC regions to obtain the corresponding first, second and third Hessian matrices.
Preferably, finding the candidate points comprises:
calculating C_m^{p_i} = σ_m²·|λ_m|_max with σ_m = σbest; and
choosing the pixels for which C_m^{p_i} is greater than a threshold value as candidate points.
A second aspect of the present invention provides a multi-scale light stripe center extraction device, the device comprising:
a smoothing unit, configured to carry out noise-smoothing processing on an image to obtain a smoothed image;
a first acquiring unit, configured to process the smoothed image by a skeleton method to obtain initial center points of the light stripe and the initial normal direction corresponding to each initial center point;
a second acquiring unit, configured to carry out gray-scale Gaussian function fitting on the stripe cross-section along the initial normal direction at each initial center point to obtain the stripe width at each position of the stripe;
a first determining unit, configured to determine, according to the stripe width at each position, the first convolution Gaussian kernel mean square deviation σ0 at that position and the two-dimensional width N×N of the first convolution Gaussian kernel, and to determine the first Gaussian convolution kernel according to σ0 and N×N;
a first finding unit, configured to compute by convolution, according to each first convolution Gaussian kernel, a first Hessian matrix for each pixel in the l×l neighborhood RON of each initial center point, l being a positive integer greater than 3;
a second finding unit, configured to find pixel-level candidate points of the stripe center according to the first Hessian matrix;
a third acquiring unit, configured to obtain, with each candidate point as a base point, the sub-pixel coordinates of the stripe image center point;
a connecting unit, configured to connect the stripe center points according to the sub-pixel coordinates to form the stripe center line.
Further,
the device further comprises a third finding unit and a fourth finding unit;
the first determining unit is further configured to determine, after determining the convolution Gaussian kernel mean square deviation σ0 at each position, the second convolution Gaussian kernel mean square deviation σ−1 and the third convolution Gaussian kernel mean square deviation σ+1 at each position according to σ−1 = σ0 − Δσ and σ+1 = σ0 + Δσ; Δσ is a convolution Gaussian kernel mean square deviation correction factor;
the first determining unit is further configured to obtain the two-dimensional width N−1×N−1 of the second convolution Gaussian kernel according to σ−1, and the two-dimensional width N+1×N+1 of the third convolution Gaussian kernel according to σ+1;
the first finding unit is further configured to compute by convolution, according to each second convolution Gaussian kernel, a second Hessian matrix for each pixel in the RON region of each initial center point, and, according to each third convolution Gaussian kernel, a third Hessian matrix for each pixel in the RON region of each initial center point;
the third finding unit is configured to calculate, before the pixel-level candidate points of the stripe center are found according to the first Hessian matrix, C_m^{p_i} = σ_m²·|λ_m|_max for each pixel in the RON to form a set S;
S = { {C_{−1}^{p_1}, C_0^{p_1}, C_{+1}^{p_1}}, {C_{−1}^{p_2}, C_0^{p_2}, C_{+1}^{p_2}}, …, {C_{−1}^{p_I}, C_0^{p_I}, C_{+1}^{p_I}} };
the fourth finding unit is configured to take the convolution Gaussian kernel mean square deviation corresponding to the maximum value in S as the best convolution Gaussian kernel mean square deviation σbest;
the first finding unit is further configured to compute the best Hessian matrix according to the Gaussian kernel corresponding to σbest;
the second finding unit is specifically configured to find the pixel-level candidate points of the stripe center according to the best Hessian matrix;
wherein m takes the value −1, 0 or +1; |λ_m|_max is the maximum-magnitude eigenvalue of the Hessian matrix corresponding to σ_m; p_i is the i-th pixel in the RON; I is the total number of pixels in the RON; and i is a positive integer not greater than I.
Further,
the first determining unit is specifically configured to find the two-dimensional widths of the first, second and third convolution Gaussian kernels according to the formula N_m = 2·round(4·σ_m) + 1;
wherein round denotes rounding to the nearest integer.
Further,
the first finding unit is specifically configured to locate the l×l pixels in the RON region of each initial center point; to locate, centered on each pixel, a ROC region in the image, the size of the ROC region being equal to the size of the corresponding convolution Gaussian kernel; and to carry out single-pixel convolution of the first, second and third convolution Gaussian kernels with the corresponding ROC regions to obtain the corresponding first, second and third Hessian matrices.
Further, the second finding unit is specifically configured to calculate C_m^{p_i} = σ_m²·|λ_m|_max with σ_m = σbest, and
to choose the pixels for which C_m^{p_i} is greater than a threshold value as candidate points.
The embodiments of the present invention provide a multi-scale light stripe center extraction method and device: the skeleton of the stripe region in the image is extracted to give initial center points; the Gaussian kernel mean square deviation corresponding to each center point is determined from the stripe width at that point; and the stripe center points are then determined from the position-dependent Gaussian kernel mean square deviations and connected to form the final stripe center line. Compared with existing methods, in which a single Gaussian kernel mean square deviation is applied to the entire image, the extraction precision is evidently higher and the error smaller.
Brief Description of the Drawings
Fig. 1 is a schematic flow chart of the multi-scale light stripe center extraction method described in embodiment one of the present invention;
Fig. 2 is an image from which the stripe center is to be extracted in embodiment one of the present invention;
Fig. 3 is the smoothed image obtained from the image of Fig. 2 by noise-smoothing processing;
Fig. 4 is a partial schematic diagram of the initial center points of the stripe region of Fig. 3;
Fig. 5 is a schematic diagram of the Gaussian function described in the embodiment;
Fig. 6 is a schematic diagram of the RON and ROC described in the embodiment;
Fig. 7 shows the stripe center line extracted from Fig. 2 by the method described in the embodiment of the present invention;
Fig. 8 is a schematic structural diagram of the device described in the embodiment of the present invention.
Detailed Description of the Embodiments
The technical solution of the present invention is further elaborated below with reference to the accompanying drawings and specific embodiments.
Embodiment one:
As shown in Fig. 1, the present embodiment provides a multi-scale light stripe center extraction method, the method comprising:
Step S110: carrying out noise-smoothing processing on an image to obtain a smoothed image;
Step S120: processing the smoothed image by a skeleton method to obtain initial center points of the light stripe and the initial normal direction corresponding to each initial center point;
Step S130: carrying out gray-scale Gaussian function fitting on the stripe cross-section along the initial normal direction at each initial center point to obtain the stripe width at each position of the stripe;
Step S140: determining, according to the stripe width at each position, the first convolution Gaussian kernel mean square deviation σ0 at that position and the two-dimensional width N×N of the first convolution Gaussian kernel, and determining the first Gaussian convolution kernel according to σ0 and N×N;
Step S150: computing by convolution, according to each first convolution Gaussian kernel, a first Hessian matrix for each pixel in the l×l neighborhood RON of each initial center point, l being a positive integer greater than 3;
Step S160: finding pixel-level candidate points of the stripe center according to the first Hessian matrix;
Step S170: obtaining, with each candidate point as a base point, the sub-pixel coordinates of the stripe center point;
Step S180: connecting the stripe center points according to the sub-pixel coordinates to form the stripe center line.
In step S110, Gaussian filtering is applied to the entire image to smooth the noise. When Gaussian filtering is carried out, the mean square deviation of the Gaussian function preferably takes a value in the range 1 to 10, for example 5 or 6. Fig. 2 is an original stripe image obtained under highly reflective conditions; the smoothed image shown in Fig. 3 is obtained after the processing of step S110.
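As a minimal sketch (not the patent's own implementation), the Gaussian smoothing of step S110 can be realized with a separable sampled Gaussian; the truncation at 4σ mirrors the kernel-width formula N = 2·round(4·σ)+1 used later in the text:

```python
import numpy as np

def gaussian_kernel_1d(sigma):
    """Sampled 1-D Gaussian, truncated at 4*sigma (width 2*round(4*sigma)+1)."""
    n = 2 * round(4 * sigma) + 1
    x = np.arange(n) - n // 2
    k = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    return k / k.sum()

def smooth(image, sigma):
    """Separable Gaussian smoothing: filter along rows, then along columns."""
    k = gaussian_kernel_1d(sigma)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, image)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

noisy = np.zeros((9, 9))
noisy[4, 4] = 1.0            # single bright pixel
out = smooth(noisy, 1.0)     # spreads the impulse into a 2-D Gaussian blob
```

Separability keeps the cost at two 1-D passes instead of one N×N convolution, which is why Gaussian smoothing of the whole image remains cheap even for large kernels.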
Step S120 obtains the initial center points of the stripe and the corresponding initial normal directions by the skeleton method, as shown concretely in Fig. 4, in which the normal direction corresponding to each initial center point is represented by an arrow starting from that initial center point.
Step S120 may specifically comprise:
Sub-step 1: processing the stripe image by a multi-threshold method according to the brightness variation of the stripe in the smoothed image. Sub-step 2: extracting connected components in the threshold regions of interest based on a conditional dilation algorithm. To facilitate subsequent image processing, step S120 may further comprise:
Sub-step 3: eliminating small interference regions and extended background interference regions according to area constraints and form-factor constraints.
A connected component satisfying a first preset condition is chosen, and the stripe region corresponding to that connected component is skeletonized to obtain the initial center points, as shown concretely in Fig. 4, in which each dot represents an initial center point. After the stripe region is skeletonized, each initial center point obtained is a concrete pixel.
Suppose (x_i, y_i), i = 1, 2, …, n denote the initial center points. For any initial center point (x_i, y_i), its 2k nearest initial center points are taken, k being a positive integer. For example, with k = 5, the 11 points (x_1, y_1), (x_2, y_2), …, (x_11, y_11) around (x_6, y_6) form the required point sequence. A straight line is fitted to these 2k+1 points by the least squares method, and the normalized normal direction of the line is taken as the stripe normal direction (n_xi, n_yi); the stripe normal direction corresponding to an initial center point is also called the initial stripe normal direction, to distinguish it from the normal direction corresponding to each center point of the finally extracted stripe.
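The least-squares normal estimate above can be sketched as follows; fitting the line through the eigenvectors of the 2×2 scatter matrix (total least squares) is a choice of this sketch, made because it also handles vertical stripe segments, and is not mandated by the text:

```python
import numpy as np

def normal_direction(points):
    """Fit a line to 2k+1 neighboring skeleton points (total least squares via
    the 2x2 scatter matrix) and return the unit normal of that line."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    scatter = centered.T @ centered
    vals, vecs = np.linalg.eigh(scatter)   # eigenvalues in ascending order
    n = vecs[:, 0]                         # smallest-variance axis = normal
    return n / np.linalg.norm(n)

# skeleton points along the 45-degree line y = x; the normal is +/-(1,-1)/sqrt(2)
pts = [(i, i) for i in range(-2, 3)]
n = normal_direction(pts)
```

The sign of the normal is arbitrary here; for connecting center points later, only the line through the point matters, not the orientation.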
In step S130, the Gaussian function used for the gray-scale Gaussian fitting may be as shown in Fig. 5, where A is the maximum value of the Gaussian function; the stripe width at the position is obtained under the constraint y ≥ 0.2A. The stripe widths are denoted w_i, i = 1, 2, …, n1, n1 being an integer greater than 1.
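A minimal sketch of the width estimate of step S130, assuming the cross-section profile is fitted to A·exp(−(x−μ)²/(2s²)) via a quadratic fit to the log-intensity; the y ≥ 0.2A constraint then gives the width w = 2·s·sqrt(2·ln 5), since exp(−d²/(2s²)) = 0.2 at the half-width d:

```python
import numpy as np

def stripe_width(profile):
    """Fit A*exp(-(x-mu)^2/(2*s^2)) to a cross-section gray profile by a
    quadratic fit to log-intensity, then return the width where y >= 0.2*A,
    i.e. w = 2*s*sqrt(2*ln 5)."""
    x = np.arange(len(profile), dtype=float)
    y = np.asarray(profile, dtype=float)
    keep = y > 1e-6                       # log is undefined at zero intensity
    c2, c1, c0 = np.polyfit(x[keep], np.log(y[keep]), 2)
    s2 = -1.0 / (2.0 * c2)                # variance of the fitted Gaussian
    return 2.0 * np.sqrt(2.0 * np.log(5.0) * s2)

# synthetic cross-section with s = 2.0 pixels
xs = np.arange(21)
prof = np.exp(-(xs - 10.0) ** 2 / (2.0 * 2.0 ** 2))
w = stripe_width(prof)
```

In practice the profile is sampled along the initial normal direction; a robust fit would also down-weight saturated pixels, which this sketch omits.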
In step S140, the first convolution Gaussian kernel mean square deviation σ0 at each position is determined from the stripe width at that position, concretely according to r = w_i/2 and the corresponding formula relating r to σ0.
In step S140, the size of the first convolution Gaussian kernel may be determined as follows: the two-dimensional width N×N of the first convolution Gaussian kernel is determined from the first convolution Gaussian kernel mean square deviation, in the present embodiment according to the formula N = 2·round(4·σ0) + 1. A convolution Gaussian kernel usually corresponds to a matrix; the matrix comprises rows and columns, the number of rows corresponding to one dimension and the number of columns to the other. In this embodiment the two-dimensional width of the first convolution Gaussian kernel is N×N, meaning that the number of rows equals the number of columns.
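The parameter choice of step S140 can be sketched as below. The relation σ0 = r/√3 with r = w_i/2 is an assumption of this sketch (it follows Steger's kernel-width criterion); the patent's own formula for σ0 is not reproduced legibly in this text, while N = 2·round(4·σ0)+1 is the formula given above:

```python
import numpy as np

def kernel_params(width):
    """First convolution-kernel parameters per step S140. sigma0 = r / sqrt(3)
    with r = width/2 is an ASSUMPTION (Steger's criterion), not the patent's
    verbatim formula; N = 2*round(4*sigma0)+1 is taken from the text."""
    r = width / 2.0
    sigma0 = r / np.sqrt(3.0)
    n = 2 * round(4 * sigma0) + 1     # two-dimensional width N, always odd
    return sigma0, n

sigma0, n = kernel_params(7.0)        # a 7-pixel-wide stripe
```

Because N grows with σ0, wide stripe sections automatically get larger convolution kernels than narrow ones, which is the point of the multi-scale scheme.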
In step S150, the RON region is as shown in Fig. 6. Fig. 6 shows the 5×5 RON region (region of neighborhood) formed around the pixel at an initial center point, and the ROC region (region of convolution) centered on the first pixel of the RON region; the shaded region in Fig. 6 is the pixel at the initial center point.
Step S170 may specifically comprise: checking whether the sub-pixel coordinates corresponding to each candidate point satisfy a second preset constraint condition, and taking the coordinates that satisfy the second preset constraint condition as the sub-pixel coordinates of the stripe center points.
Compared with existing methods, the method described in the present embodiment skeletonizes the stripe region of the image to obtain initial center points and extracts the stripe region with a first convolution Gaussian kernel mean square deviation obtained at each position, which greatly reduces the large extraction errors caused by applying the same convolution Gaussian kernel mean square deviation to the whole stripe region, and thus greatly improves the extraction precision.
Since the calculated first convolution Gaussian kernel mean square deviation may still deviate, for example through calculation error, the present embodiment makes the following further improvement to raise the extraction precision:
after the convolution Gaussian kernel mean square deviation σ0 at each position has been determined according to the stripe width at that position, the method further comprises:
determining, according to σ−1 = σ0 − Δσ and σ+1 = σ0 + Δσ, the second convolution Gaussian kernel mean square deviation σ−1 and the third convolution Gaussian kernel mean square deviation σ+1 at each position; Δσ is a convolution Gaussian kernel mean square deviation correction factor whose concrete value range may be determined according to parameters such as the composition of the stripe and the controllable precision of the equipment;
computing by convolution, according to each second convolution Gaussian kernel, a second Hessian matrix for each pixel in the corresponding RON, and, according to each third convolution Gaussian kernel, a third Hessian matrix for each pixel in the corresponding RON;
before the pixel-level candidate points of the stripe center are found according to the first convolution Gaussian kernel and the first Hessian matrix, calculating C_m^{p_i} = σ_m²·|λ_m|_max for each pixel in the RON to form a set S;
S = { {C_{−1}^{p_1}, C_0^{p_1}, C_{+1}^{p_1}}, {C_{−1}^{p_2}, C_0^{p_2}, C_{+1}^{p_2}}, …, {C_{−1}^{p_I}, C_0^{p_I}, C_{+1}^{p_I}} };
taking the convolution Gaussian kernel mean square deviation corresponding to the maximum value in S as the best convolution Gaussian kernel mean square deviation σbest;
wherein m takes the value −1, 0 or +1; |λ_m|_max is the maximum-magnitude eigenvalue of the Hessian matrix corresponding to σ_m; p_i is the i-th pixel in the RON; I is the total number of pixels in the RON; and i is a positive integer not greater than I.
The finding of the pixel-level candidate points of the stripe center according to N and the first Hessian matrix then becomes:
computing the best Hessian matrix according to the convolution Gaussian kernel corresponding to σbest;
finding the pixel-level candidate points of the stripe center according to the best Hessian matrix.
The second and third convolution Gaussian kernels may be obtained in the same way as the first convolution Gaussian kernel, specifically as follows:
the two-dimensional widths of the first, second and third convolution Gaussian kernels are found according to the following formula:
N_m = 2·round(4·σ_m) + 1;
wherein round denotes rounding to the nearest integer.
In the present embodiment, the best Gaussian kernel mean square deviation is obtained by correcting the deviation of the first Gaussian kernel mean square deviation, which further improves the precision.
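The scale selection can be sketched as follows, with the per-scale response C_m = σ_m²·|λ_m|_max taken from the text; the callable returning the maximum-magnitude Hessian eigenvalue at a given scale is a placeholder supplied by the caller, not part of the patent:

```python
import numpy as np

def best_sigma(sigma0, delta, max_abs_eig):
    """Pick the best kernel std among {sigma0-delta, sigma0, sigma0+delta}
    by maximizing C_m = sigma_m^2 * |lambda_m|_max. max_abs_eig(sigma) must
    return the largest-magnitude Hessian eigenvalue at that scale (a
    caller-supplied callable; an assumption of this sketch)."""
    sigmas = (sigma0 - delta, sigma0, sigma0 + delta)
    scores = [s * s * max_abs_eig(s) for s in sigmas]
    return sigmas[int(np.argmax(scores))]

# toy response model whose eigenvalue magnitude peaks at sigma = 2.0
resp = lambda s: np.exp(-(s - 2.0) ** 2)
chosen = best_sigma(1.8, 0.2, resp)   # drifts from 1.8 toward 2.0
```

The σ_m² factor is the usual scale normalization: without it, larger kernels would always report smaller second-derivative magnitudes and the comparison across scales would be biased.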
Further, the second and third Hessian matrices may be found in the same way as the first Hessian matrix, specifically as follows:
finding the first, second and third Hessian matrices comprises:
locating the l×l pixels in the RON of each initial center point;
locating, centered on each pixel, a ROC in the image, the size of the ROC being equal to the size of the corresponding convolution Gaussian kernel; the region enclosed by the dotted frame in Fig. 6 is the ROC region corresponding to pixel p_i; and carrying out single-pixel convolution of the first, second and third convolution Gaussian kernels with the corresponding ROC to obtain the corresponding first, second and third Hessian matrices.
The best Hessian matrix is one of the first, second and third Hessian matrices.
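Under the assumption that the "single pixel convolution" yields the Gaussian second-derivative responses at the center pixel of the ROC, the Hessian computation can be sketched with separable derivative kernels (the row/column naming below is this sketch's convention, not the patent's):

```python
import numpy as np

def gauss_derivs(sigma):
    """Sampled Gaussian and its 1st/2nd derivatives, truncated at 4*sigma."""
    n = 2 * round(4 * sigma) + 1
    x = np.arange(n) - n // 2
    g = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    g /= g.sum()
    dg = -x / sigma ** 2 * g
    ddg = (x ** 2 - sigma ** 2) / sigma ** 4 * g
    return g, dg, ddg

def hessian_at(image, row, col, sigma):
    """Hessian of the Gaussian-smoothed image at one pixel: the ROC patch
    around (row, col) is reduced with separable derivative kernels, one
    output value per kernel (the 'single pixel convolution' of the text)."""
    g, dg, ddg = gauss_derivs(sigma)
    h = len(g) // 2
    patch = image[row - h: row + h + 1, col - h: col + h + 1]
    r_rr = ddg @ patch @ g     # 2nd derivative across rows, smoothed in cols
    r_cc = g @ patch @ ddg     # 2nd derivative across cols, smoothed in rows
    r_rc = dg @ patch @ dg     # mixed derivative
    return np.array([[r_rr, r_rc], [r_rc, r_cc]])

# horizontal stripe: Gaussian profile across rows, constant along columns
rows = np.arange(41.0)
img = np.tile(np.exp(-(rows - 20.0) ** 2 / (2.0 * 2.0 ** 2))[:, None], (1, 41))
H = hessian_at(img, 20, 20, 2.0)   # strongly negative curvature across rows
```

At a stripe center the eigenvector of the maximum-magnitude (negative) eigenvalue points across the stripe, which is exactly the normal direction the Steger method exploits.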
In addition, a method for finding the candidate points is also provided below; finding the candidate points comprises:
calculating C_m^{p_i} = σ_m²·|λ_m|_max with σ_m = σbest; and
choosing the pixels for which C_m^{p_i} is greater than a threshold value as candidate points; the threshold value is usually predetermined.
The determination of the sub-pixel coordinates of the stripe center point in step S170 comprises the following steps:
With a candidate point (x_0, y_0) as the base point, a second-order Taylor expansion is applied to the gray distribution function on the stripe cross-section. Setting the first directional derivative along the normal direction (n_x, n_y) at the candidate point (x_0, y_0) to zero gives the sub-pixel coordinates of the stripe image feature as (p_x, p_y) = (x_0 + t·n_x, y_0 + t·n_y), where t = −(n_x·r_x + n_y·r_y)/(n_x²·r_xx + 2·n_x·n_y·r_xy + n_y²·r_yy). If (t·n_x, t·n_y) ∈ [−0.5, 0.5] × [−0.5, 0.5], i.e. the point at which the first derivative vanishes lies within the current pixel, then the point (p_x, p_y) is a stripe center point. This is the second preset constraint condition described in step S170.
Here r_x, r_y, r_xx, r_xy and r_yy are the partial derivatives obtained by convolving the stripe image to be extracted with the discrete Gaussian function and its derivatives; for the concrete convolution computation of the partial derivatives, reference may be made to the prior art.
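A minimal sketch of the sub-pixel step: the offset t below is the standard Steger formula obtained by zeroing the first directional derivative of the second-order Taylor expansion along (n_x, n_y), and the in-pixel test corresponds to the second preset constraint condition:

```python
def subpixel_center(x0, y0, nx, ny, rx, ry, rxx, rxy, ryy):
    """Steger-style sub-pixel refinement at candidate (x0, y0) with unit
    normal (nx, ny) and Gaussian-derivative responses rx..ryy:
      t = -(nx*rx + ny*ry) / (nx^2*rxx + 2*nx*ny*rxy + ny^2*ryy)
    The point is accepted only if the extremum lies inside the pixel,
    i.e. |t*nx| <= 0.5 and |t*ny| <= 0.5."""
    denom = nx * nx * rxx + 2.0 * nx * ny * rxy + ny * ny * ryy
    t = -(nx * rx + ny * ry) / denom
    px, py = x0 + t * nx, y0 + t * ny
    ok = abs(t * nx) <= 0.5 and abs(t * ny) <= 0.5
    return (px, py), ok

# 1-D check: for a profile r(x) = -(x - 0.3)^2, rx = -2*(x - 0.3); at x0 = 0
# the recovered extremum should be x = 0.3, inside the current pixel.
p, ok = subpixel_center(0.0, 0.0, 1.0, 0.0,
                        rx=0.6, ry=0.0, rxx=-2.0, rxy=0.0, ryy=0.0)
```

The denominator is the second directional derivative along the normal; for a bright stripe it is negative at a true center, so t moves the point toward the intensity maximum.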
When step S180 is performed, the stripe center line may be formed according to the existing Steger method: starting from the pixel with the maximum second derivative, suitable neighborhood points are added so that the stripe center points are connected into a line; during connection, the sub-pixel distance and the change of angle between a neighborhood point and the current stripe center point to be connected must be considered together.
The lines in the two stripes of Fig. 7 are the stripe center lines obtained by processing the image shown in Fig. 2 with the method described in the present embodiment.
In summary, the present embodiment provides a multi-scale light stripe center extraction method with small extraction error and high precision, particularly suitable for extracting the image center of a laser stripe whose width varies greatly under highly reflective conditions.
Embodiment two:
As shown in Figure 8, the present embodiment provides a kind of based on multiple dimensioned Light stripes center extraction device, and described device comprises:
Smooth unit 110, for carrying out noise smoothing process to image, obtains smoothed image;
First acquiring unit 120, for processing initial normal direction corresponding to the initial center point that obtains striation and corresponding initial center point described in each by Skeleton method to described smoothed image;
Second acquisition unit 130, for the initial normal direction along described striation initial center point, carries out gray scale Gaussian function fitting to striation xsect, obtains the striation width at striation each position place;
First determining unit 140, for the striation width according to each position place, determines the meansquaredeviationσ of the first convolution gaussian kernel at each position place 0and the first bidimensional width N*N of convolution gaussian kernel, and according to described σ 0and N*N determines the first Gaussian convolution core;
First asks for unit 150, for according to the first convolution gaussian kernel described in each, and a Hessian matrix of each pixel in the l*l contiguous range RON that convolution asks for initial center point place described in each; Described l be greater than 3 positive integer;
Second asks for unit 160, asks for the candidate point of optical losses Pixel-level for a described Hessian matrix;
3rd acquiring unit 170, for candidate point described in each for basic point, obtain the subpixel coordinates of optical strip image central point;
a connecting unit 180, configured to connect the stripe center points according to the sub-pixel coordinates to form the stripe center.

The smoothing unit 110, first acquiring unit 120, second acquiring unit 130, first determining unit 140, first computing unit 150, second computing unit 160, third acquiring unit 170 and connecting unit 180 may each be a functional unit; each functional unit may correspond, individually or in integrated form, to a processor with processing capability. The device of the present embodiment may therefore comprise a processor, a storage medium and one or more communication interfaces, any two of which transmit data over a bus. The communication interfaces exchange data with peripherals. Specifically, the processor may be a multi-core or central processing unit, a digital signal processor, a single-chip microcomputer, a programmable logic device, or another device with processing capability. The storage medium is preferably a non-transitory storage medium that retains data on power-down and stores software or programs internally; by running the software and programs, the processor realizes the functions of each of the above functional units.
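The per-position Hessian computation carried out by units 140 and 150 can be sketched as follows. This is an illustrative reconstruction, not the patent's reference implementation: it builds separable Gaussian derivative kernels (the kernel-width rule N = 2*round(4σ)+1 is taken from claim 3) and convolves them with the image to obtain the Hessian entries at every pixel.

```python
import numpy as np

def gaussian_kernel_1d(sigma):
    # Kernel width per the patent's rule N = 2*round(4*sigma) + 1 (claim 3).
    n = 2 * int(round(4 * sigma)) + 1
    x = np.arange(n) - n // 2
    g = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    return g / g.sum()

def hessian_at_scale(img, sigma):
    """Entries (h_rr, h_rc, h_cc) of the Hessian matrix of the
    Gaussian-smoothed image at every pixel, via separable convolution
    with Gaussian derivative kernels."""
    g = gaussian_kernel_1d(sigma)
    x = np.arange(len(g)) - len(g) // 2
    dg = -x / sigma ** 2 * g                       # first derivative of Gaussian
    ddg = (x ** 2 - sigma ** 2) / sigma ** 4 * g   # second derivative of Gaussian

    def conv_sep(a, k_row, k_col):
        # separable 2-D convolution, 'same' output size, zero padding
        out = np.apply_along_axis(lambda v: np.convolve(v, k_row, 'same'), 0, a)
        return np.apply_along_axis(lambda v: np.convolve(v, k_col, 'same'), 1, out)

    h_rr = conv_sep(img, ddg, g)   # d2/dr2
    h_rc = conv_sep(img, dg, dg)   # d2/(dr dc)
    h_cc = conv_sep(img, g, ddg)   # d2/dc2
    return h_rr, h_rc, h_cc
```

On a bright horizontal stripe, h_rr is strongly negative on the stripe center line, which is the ridge signature the Hessian method exploits.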
Further, the device also comprises a third computing unit and a fourth computing unit. The first determining unit is further configured, after determining the convolution Gaussian kernel mean square deviation σ₀ at each position, to determine the second convolution Gaussian kernel mean square deviation σ₋₁ = σ₀ - Δσ and the third convolution Gaussian kernel mean square deviation σ₊₁ = σ₀ + Δσ at each position, where Δσ is the convolution Gaussian kernel mean square deviation correction factor.
The first determining unit is further configured to obtain the two-dimensional width N₋₁*N₋₁ of the second convolution Gaussian kernel from σ₋₁, and the two-dimensional width N₊₁*N₊₁ of the third convolution Gaussian kernel from σ₊₁.
The first computing unit is further configured to compute by convolution, for each second convolution Gaussian kernel, the second Hessian matrix of each pixel in the RON region of the corresponding initial center point, and, for each third convolution Gaussian kernel, the third Hessian matrix of each pixel in the RON region of the corresponding initial center point.
The third computing unit is further configured, before the pixel-level candidate points of the stripe center are computed from the first Hessian matrices, to calculate according to the formula the values corresponding to each pixel in the RON and to form the set S;
wherein S = {{C₋₁^p1, C₀^p1, C₊₁^p1}, {C₋₁^p2, C₀^p2, C₊₁^p2}, …, {C₋₁^pi, C₀^pi, C₊₁^pi}, …, {C₋₁^pI, C₀^pI, C₊₁^pI}}.
The fourth computing unit is configured to take the maximum value in S; the convolution Gaussian kernel mean square deviation corresponding to this maximum is the best convolution Gaussian kernel mean square deviation σ_best.
The first computing unit is further configured to compute the best Hessian matrix from the Gaussian kernel corresponding to σ_best.
The second computing unit is specifically configured to compute the pixel-level candidate points of the stripe center from the best Hessian matrix.
Wherein m takes the value -1, 0 or +1; |λₘ|max is the eigenvalue of largest absolute value of the Hessian matrix corresponding to σₘ; pᵢ is the i-th pixel in the RON; I is the total number of pixels in the RON; and i is an integer less than I.
Through the provision of the third computing unit and the fourth computing unit, the present embodiment computes the best Hessian matrix associated with the best Gaussian kernel mean square deviation, further improving the precision of light stripe center extraction.
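The scale-selection step performed by the third and fourth computing units can be sketched as below. The patent's exact formula for the values Cₘ is not reproduced in this excerpt, so a common scale-normalized response, Cₘ = σₘ·|λₘ|max, is assumed here purely for illustration; for a 2x2 symmetric Hessian [[a, b], [b, c]] with eigenvalues t ± s, the largest-magnitude eigenvalue is |t| + s.

```python
import numpy as np

def hessian_max_abs_eig(h_rr, h_rc, h_cc):
    # Largest-magnitude eigenvalue of the symmetric 2x2 Hessian
    # [[h_rr, h_rc], [h_rc, h_cc]]: eigenvalues are t +/- s, and
    # max(|t + s|, |t - s|) = |t| + s for s >= 0.
    t = 0.5 * (h_rr + h_cc)
    s = np.hypot(0.5 * (h_rr - h_cc), h_rc)
    return np.abs(t) + s

def best_sigma(sigmas, abs_eigs):
    # sigmas: the three candidate scales (sigma0 - dsigma, sigma0, sigma0 + dsigma).
    # abs_eigs: the |lambda_m|max response of one pixel at each scale.
    # ASSUMPTION: C_m = sigma_m * |lambda_m|max; the patent's formula for C_m
    # is not given in this excerpt.
    responses = [s * e for s, e in zip(sigmas, abs_eigs)]
    return sigmas[int(np.argmax(responses))]
```

Whichever normalization the patent actually uses, the structure is the same: evaluate the response at the three candidate scales per pixel and keep the σ that maximizes it.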
Further, the first determining unit is specifically configured to compute the two-dimensional widths of the first, second and third convolution Gaussian kernels according to the formula Nₘ = 2*round(4*σₘ) + 1, where round denotes rounding to the nearest integer.
This refinement specifies how the device of the present embodiment calculates the two-dimensional width of a convolution Gaussian kernel.
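As a minimal sketch of the width rule, assuming "round" means conventional round-half-up (the machine translation "rounds up" is ambiguous on this point):

```python
import math

def kernel_width(sigma_m):
    # N_m = 2*round(4*sigma_m) + 1; floor(x + 0.5) implements round-half-up
    # and sidesteps Python's banker's rounding. The outer 2*...+1 keeps N odd,
    # so the kernel always has a centre pixel.
    return 2 * math.floor(4 * sigma_m + 0.5) + 1
```

The width grows linearly with σₘ, so a wide stripe position automatically gets a proportionally wider convolution kernel.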
Preferably, the first computing unit is specifically configured: to locate the l*l pixels in the RON region of each initial center point; to locate in the image, centered on each such pixel, a ROC region whose extent equals the size of the corresponding convolution Gaussian kernel; and to perform a single-pixel convolution of the first, second and third convolution Gaussian kernels with the corresponding ROC region to obtain the corresponding first, second and third Hessian matrices;
wherein the best Hessian matrix is one of the first Hessian matrix, the second Hessian matrix and the third Hessian matrix.
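The single-pixel convolution over a kernel-sized ROC window might look like the following sketch. It is illustrative only: border handling is omitted, and the pixel is assumed to lie at least half a kernel width from the image edge.

```python
import numpy as np

def hessian_entry_at_pixel(img, r, c, kernel):
    # Convolve one derivative kernel with the ROC region centred on (r, c).
    # The ROC window has exactly the kernel's size, so each Hessian entry is
    # evaluated only at the l*l RON pixels instead of over the whole image.
    half = kernel.shape[0] // 2
    roc = img[r - half:r + half + 1, c - half:c + half + 1]
    # correlating with the flipped kernel gives the convolution at this pixel
    return float(np.sum(roc * kernel[::-1, ::-1]))
```

Restricting the convolution to the RON pixels is what makes the multi-scale scheme affordable: the three Hessians are never computed for the full image.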
Further, the second computing unit is specifically configured to use the σₘ for which σₘ = σ_best,
and to select as candidate points the pixels whose corresponding value exceeds the threshold.
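The thresholding in the second computing unit can be sketched as below. The excerpt does not name the thresholded quantity, so the per-pixel response at σ_best (e.g. the largest-magnitude Hessian eigenvalue) is assumed here:

```python
import numpy as np

def candidate_points(response, threshold):
    # response: per-pixel response of the RON at sigma_best (ASSUMED quantity;
    # the source elides what is compared to the threshold).
    # Returns the (row, col) pixel-level candidate points.
    rows, cols = np.nonzero(response > threshold)
    return list(zip(rows.tolist(), cols.tolist()))
```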
The device of the present embodiment provides hardware support for the multi-scale light stripe center extraction method of Embodiment One and can implement any of the technical solutions of that embodiment; besides reducing the light stripe center extraction error and improving the extraction precision, it also has the advantages of a simple structure and easy implementation.
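The sub-pixel step of the third acquiring unit is not detailed in this excerpt. A classical Steger-style refinement, which methods of this family typically use, is sketched here under that assumption: the center is moved from the candidate pixel along the Hessian eigenvector n of the largest-magnitude eigenvalue to the point where the directional first derivative of the local second-order Taylor model vanishes.

```python
import numpy as np

def subpixel_center(grad_r, grad_c, h_rr, h_rc, h_cc, r, c):
    # ASSUMED Steger-style refinement: with gradient g and Hessian H at the
    # candidate pixel p, solve for the step t along the stripe normal n so
    # that the derivative along n vanishes:
    #   t = -(g . n) / (n^T H n),   sub-pixel centre = p + t * n.
    H = np.array([[h_rr, h_rc], [h_rc, h_cc]])
    w, v = np.linalg.eigh(H)
    n = v[:, int(np.argmax(np.abs(w)))]   # stripe normal direction
    t = -(grad_r * n[0] + grad_c * n[1]) / (n @ H @ n)
    return r + t * n[0], c + t * n[1]
```

The result is independent of the sign chosen for the eigenvector, since t flips sign together with n.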
In the several embodiments provided in this application, it should be understood that the disclosed device and method may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division into units is only a division by logical function, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the couplings, direct couplings or communication connections between the components shown or discussed may be through interfaces, and the indirect couplings or communication connections between devices or units may be electrical, mechanical or of other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, the functional units in the embodiments of the present invention may all be integrated into one processing module, or each unit may stand alone as a unit, or two or more units may be integrated into one unit; the integrated units may be implemented in the form of hardware, or in the form of hardware plus software functional units.
A person of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be carried out by hardware related to program instructions; the aforementioned program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments; the aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The above are merely specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto; any person skilled in the art can, within the technical scope disclosed by the present invention, easily conceive of changes or substitutions, all of which shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A multi-scale light stripe center extraction method, the method comprising:
performing noise smoothing on an image to obtain a smoothed image;
processing the smoothed image by a skeleton method to obtain initial center points of a light stripe and an initial normal direction corresponding to each initial center point;
fitting a gray-scale Gaussian function to the stripe cross-section along the initial normal direction at each initial center point, to obtain the stripe width at each position of the light stripe;
determining, from the stripe width at each position, the first convolution Gaussian kernel mean square deviation σ₀ at that position and the two-dimensional width N*N of the first convolution Gaussian kernel, and determining the first convolution Gaussian kernel from σ₀ and N*N;
computing by convolution, using the first convolution Gaussian kernel, the first Hessian matrix of each pixel in the l*l neighborhood RON of each initial center point, l being a positive integer greater than 3;
computing pixel-level candidate points of the stripe center from the first Hessian matrices;
obtaining, with each candidate point as a base point, the sub-pixel coordinates of the stripe center points; and
connecting the stripe center points according to the sub-pixel coordinates to form the stripe center.
2. The method according to claim 1, characterized in that,
after determining, from the stripe width at each position, the convolution Gaussian kernel mean square deviation σ₀ at that position, the method further comprises:
determining, from σ₋₁ = σ₀ - Δσ and σ₊₁ = σ₀ + Δσ, the second convolution Gaussian kernel mean square deviation σ₋₁ and the two-dimensional width N₋₁*N₋₁ of the second convolution Gaussian kernel, and the third convolution Gaussian kernel mean square deviation σ₊₁ and the two-dimensional width N₊₁*N₊₁ of the third convolution Gaussian kernel, at each position, Δσ being the convolution Gaussian kernel mean square deviation correction factor;
computing by convolution, using each second convolution Gaussian kernel, the second Hessian matrix of each pixel in the corresponding RON region;
computing by convolution, using each third convolution Gaussian kernel, the third Hessian matrix of each pixel in the corresponding RON region;
before computing the pixel-level candidate points of the stripe center from the first Hessian matrices, calculating according to the formula the values corresponding to each pixel in the RON and forming the set S,
wherein S = {{C₋₁^p1, C₀^p1, C₊₁^p1}, {C₋₁^p2, C₀^p2, C₊₁^p2}, …, {C₋₁^pi, C₀^pi, C₊₁^pi}, …, {C₋₁^pI, C₀^pI, C₊₁^pI}};
taking the maximum value in S, the convolution Gaussian kernel mean square deviation corresponding to this maximum being the best convolution Gaussian kernel mean square deviation σ_best;
the computing of the pixel-level candidate points of the stripe center from the first Hessian matrices being:
computing the best Hessian matrix from the convolution Gaussian kernel corresponding to σ_best; and
computing the pixel-level candidate points of the stripe center from the best Hessian matrix;
wherein m takes the value -1, 0 or +1; |λₘ|max is the eigenvalue of largest absolute value of the Hessian matrix corresponding to σₘ; pᵢ is the i-th pixel in the RON region; I is the total number of pixels in the RON region; and i is an integer less than I.
3. The method according to claim 2, characterized in that the two-dimensional widths of the first, second and third convolution Gaussian kernels are computed according to the formula
Nₘ = 2*round(4*σₘ) + 1,
wherein round denotes rounding to the nearest integer.
4. The method according to claim 2 or 3, characterized in that computing the first Hessian matrix, the second Hessian matrix and the third Hessian matrix comprises:
locating the l*l pixels in the RON region of each initial center point;
locating in the image a ROC region centered on each pixel pᵢ of the RON region, the extent of the ROC region being equal to the two-dimensional width of the corresponding convolution Gaussian kernel; and
performing a single-pixel convolution of the first, second and third convolution Gaussian kernels with the corresponding ROC region to obtain the corresponding first, second and third Hessian matrices.
5. The method according to claim 2, characterized in that computing the candidate points comprises:
using the σₘ for which σₘ = σ_best; and
selecting as candidate points the pixels whose corresponding value exceeds the threshold.
6. A multi-scale light stripe center extraction device, the device comprising:
a smoothing unit, configured to perform noise smoothing on an image to obtain a smoothed image;
a first acquiring unit, configured to process the smoothed image by a skeleton method to obtain initial center points of a light stripe and an initial normal direction corresponding to each initial center point;
a second acquiring unit, configured to fit a gray-scale Gaussian function to the stripe cross-section along the initial normal direction at each initial center point, obtaining the stripe width at each position of the light stripe;
a first determining unit, configured to determine, from the stripe width at each position, the mean square deviation σ₀ of the first convolution Gaussian kernel at that position and the two-dimensional width N*N of the first convolution Gaussian kernel, and to determine the first convolution Gaussian kernel from σ₀ and N*N;
a first computing unit, configured to compute by convolution, for each first convolution Gaussian kernel, the first Hessian matrix of each pixel in the l*l neighborhood RON of the corresponding initial center point, l being a positive integer greater than 3;
a second computing unit, configured to compute the pixel-level candidate points of the stripe center from the first Hessian matrices;
a third acquiring unit, configured to obtain, with each candidate point as a base point, the sub-pixel coordinates of the stripe center points; and
a connecting unit, configured to connect the stripe center points according to the sub-pixel coordinates to form the stripe center.
7. The device according to claim 6, characterized in that,
the device further comprises a third computing unit and a fourth computing unit;
the first determining unit is further configured, after determining the convolution Gaussian kernel mean square deviation σ₀ at each position, to determine the second convolution Gaussian kernel mean square deviation σ₋₁ = σ₀ - Δσ and the third convolution Gaussian kernel mean square deviation σ₊₁ = σ₀ + Δσ at each position, Δσ being the convolution Gaussian kernel mean square deviation correction factor;
the first determining unit is further configured to obtain the two-dimensional width N₋₁*N₋₁ of the second convolution Gaussian kernel from σ₋₁, and the two-dimensional width N₊₁*N₊₁ of the third convolution Gaussian kernel from σ₊₁;
the first computing unit is further configured to compute by convolution, for each second convolution Gaussian kernel, the second Hessian matrix of each pixel in the RON region of the corresponding initial center point, and, for each third convolution Gaussian kernel, the third Hessian matrix of each pixel in the RON region of the corresponding initial center point;
the third computing unit is configured, before the pixel-level candidate points of the stripe center are computed from the first Hessian matrices, to calculate according to the formula the values corresponding to each pixel in the RON and to form the set S,
wherein S = {{C₋₁^p1, C₀^p1, C₊₁^p1}, {C₋₁^p2, C₀^p2, C₊₁^p2}, …, {C₋₁^pi, C₀^pi, C₊₁^pi}, …, {C₋₁^pI, C₀^pI, C₊₁^pI}};
the fourth computing unit is configured to take the maximum value in S, the convolution Gaussian kernel mean square deviation corresponding to this maximum being the best convolution Gaussian kernel mean square deviation σ_best;
the first computing unit is further configured to compute the best Hessian matrix from the Gaussian kernel corresponding to σ_best;
the second computing unit is specifically configured to compute the pixel-level candidate points of the stripe center from the best Hessian matrix;
wherein m takes the value -1, 0 or +1; |λₘ|max is the eigenvalue of largest absolute value of the Hessian matrix corresponding to σₘ; pᵢ is the i-th pixel in the RON; I is the total number of pixels in the RON; and i is an integer less than I.
8. The device according to claim 7, characterized in that the first determining unit is specifically configured to compute the two-dimensional widths of the first, second and third convolution Gaussian kernels according to the formula Nₘ = 2*round(4*σₘ) + 1,
wherein round denotes rounding to the nearest integer.
9. The device according to claim 6 or 7, characterized in that,
the first computing unit is specifically configured: to locate the l*l pixels in the RON region of each initial center point; to locate in the image, centered on each such pixel, a ROC region whose extent equals the size of the corresponding convolution Gaussian kernel; and to perform a single-pixel convolution of the first, second and third convolution Gaussian kernels with the corresponding ROC region to obtain the corresponding first, second and third Hessian matrices.
10. The device according to claim 7, characterized in that the second computing unit is specifically configured to use the σₘ for which σₘ = σ_best,
and to select as candidate points the pixels whose corresponding value exceeds the threshold.
CN201410158714.5A 2014-04-18 Multi-scale light stripe center extraction method and device (granted as CN105005981B, status: Active)


Publications (2)

Publication Number Publication Date
CN105005981A (application publication) 2015-10-28
CN105005981B (granted publication) 2017-10-27


