CN104616325A - Rapid and high-precision method for extracting light strip center on large surface - Google Patents

Rapid and high-precision method for extracting light strip center on large surface

Info

Publication number
CN104616325A
CN104616325A (application CN201510034164.0A; granted publication CN104616325B)
Authority
CN
China
Prior art keywords
striation
image
center
straight line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510034164.0A
Other languages
Chinese (zh)
Other versions
CN104616325B (en)
Inventor
刘巍
高鹏
张洋
杨帆
李晓东
贾振元
高航
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian University of Technology filed Critical Dalian University of Technology
Priority to CN201510034164.0A priority Critical patent/CN104616325B/en
Publication of CN104616325A publication Critical patent/CN104616325A/en
Application granted granted Critical
Publication of CN104616325B publication Critical patent/CN104616325B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30168 Image quality inspection

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a fast, high-precision method for extracting line-structured light on the surface of large composite components, belonging to the technical field of computer vision measurement, and covers an image quality evaluation method and a method for extracting the center of a line-structured-light stripe. The method comprises the steps of: for line-structured light projected onto a large composite component surface, establishing a light-stripe evaluation standard based on Gaussian-similarity image quality evaluation and proposing a center-extraction criterion; obtaining the incidence angle of the structured-light source and the viewing angle of the camera from the illumination model formed by the laser structured light on the composite surface together with the spatial-information transformation relation; and extracting and correcting the geometric center of the structured light in the image according to these two angles, thereby obtaining the actual stripe center. The stripe image is evaluated with the Gaussian-similarity image quality evaluation method; the method achieves higher precision and fast computation, and satisfies the fast, high-precision extraction of structured-light stripe centers on the surfaces of large parts.

Description

A fast, high-precision method for extracting the light stripe center on large surfaces
Technical field
The invention belongs to the technical field of computer vision measurement and relates to an image quality evaluation method and a method for extracting the center of a line-structured-light stripe.
Background technology
With the marked increase in the manufacturing size of large equipment, high-accuracy measurement of the surface dimensions of its large parts and components, such as large aircraft empennages and large antennas, is a precondition and guarantee for high-quality assembly. Binocular vision measurement assisted by laser scanning is a common method for measuring large surfaces: a binocular camera pair captures the auxiliary laser stripes and reconstructs the stripe centers, realizing high-accuracy measurement of the measured object's surface profile. However, because the surface of a large part is big, the laser stripe travels over a large range on the measured surface and is simultaneously affected by multi-source factors such as on-site illumination and surface material characteristics, so traditional center-extraction methods can hardly achieve high-accuracy extraction of the structured-light stripe center in large-scale surface measurement. Therefore, fast and high-precision extraction of the structured-light stripe center on large component surfaces is of great significance for the vision measurement of large parts.
Existing structured-light stripe center extraction methods mainly include the traditional geometric-center method, the gray-level centroid method and curve fitting, as well as the more distinctive direction-template and Hessian-matrix methods. The traditional geometric-center method extracts the stripe center by detecting the stripe edges; it is fast but of low precision. The gray-level centroid method takes the gray centroid of each image row as the stripe center; its robustness and precision are limited. Curve fitting can reach higher precision, but the computation load is large and it is slow. On the basis of the traditional algorithms, scholars at home and abroad have further proposed the geometric direction-template and Hessian-matrix methods. In "Detection of structured light strip center based on direction template" (Computer Engineering and Applications, 2002, 11: 58-60), Hu Bin et al. propose detecting the structured-light stripe center with variable-direction templates: four templates are designed along the 0°, 90°, 45° and 135° directions, and the image is processed row by row with each of them; the method has some resistance to white noise and can repair broken stripes. The patent CN200510123724.6 of Zhou Fuqiang et al., "A fast, high-precision structured-light stripe center extraction method", determines the stripe normal direction by solving the Hessian matrix and obtains the sub-pixel stripe center via a Taylor expansion and recursion; it reduces computation while guaranteeing precision and robustness, achieving fast extraction of the stripe center to a certain extent.
Summary of the invention
The technical problem to be solved by the present invention is the fast, high-precision extraction of structured-light stripes on large component surfaces under complicated measurement environments. A Gaussian-similarity image quality evaluation standard is invented, together with a fast, high-precision stripe-center extraction method based on this evaluation. During stripe-center extraction, the stripe image is first evaluated for quality; for images scoring below a certain threshold, the stripe center is corrected, so that fast, high-precision extraction of the structured-light stripe center on large component surfaces is realized.
The technical solution adopted by the present invention is a fast, high-precision method for extracting line-structured light on large composite component surfaces, characterized in that, for line-structured light on a large composite component surface, the method establishes a stripe evaluation standard based on Gaussian-similarity image quality evaluation and proposes a center-extraction decision criterion; the incidence angle of the structured-light source and the viewing angle of the camera are obtained from the illumination model formed by the laser structured light on the composite surface and the spatial-information transformation relation; the geometric center of the structured light in the image is then extracted and corrected using these two angles, giving the actual stripe center. The concrete steps of the method are as follows:
Step 1: setting the center-extraction decision criterion based on the Gaussian-structural-similarity image quality evaluation standard
1) Image quality evaluation method based on Gaussian structural similarity
Exploiting the Gaussian profile of structured light and the structural similarity of images, an image quality evaluation method based on Gaussian structural similarity is proposed. For two images x and y to be compared, where x is the standard image and y is the image to be evaluated, the degree of structural similarity between the two images is measured in four aspects: luminance, contrast, structural similarity and Gaussian similarity;
Luminance comparison:
l(x,y) = \frac{2\mu_x\mu_y + C_1}{\mu_x^2 + \mu_y^2 + C_1} \quad (1)
where μ_x and μ_y are the mean luminances of the two images and C_1 is a small constant introduced to keep the denominator from being zero;
Contrast comparison:
c(x,y) = \frac{2\sigma_x\sigma_y + C_2}{\sigma_x^2 + \sigma_y^2 + C_2} \quad (2)
where σ_x and σ_y are the standard deviations of the two images and C_2 is a small constant introduced to keep the denominator from being zero;
Structure comparison:
s(x,y) = \frac{\sigma_{xy} + C_3}{\sigma_x\sigma_y + C_3} \quad (3)
where σ_xy is the covariance of the two images, characterizing the structural contrast, and C_3 is a small constant introduced to keep the denominator from being zero;
Based on the Gaussian profile of the structured-light stripe, the Gaussian similarity is defined as:
g(y) = 1 - \frac{1}{T}\sum_{u=1}^{T}\left[\left(\frac{1}{M}\sum_{i=1}^{M}x_{u,i} - g(s)\right)\Big/\left(\frac{1}{M}\sum_{i=1}^{M}x_{u,i}\right)\right]^2 \quad (4)
where x_{u,i} are the gray values of the image to be evaluated, T and M are the ranges of the two summations, and g(s) is the Gaussian distribution curve whose coefficients are obtained by Gaussian curve fitting; combining this with the structural-similarity-based image evaluation, the Gaussian-similarity image quality evaluation model can be established as:
GFSSIM(x,y) = [l(x,y)]^{\alpha}[c(x,y)]^{\beta}[s(x,y)]^{\gamma}[g(y)]^{\lambda} \quad (5)
where α, β, γ and λ are the adaptive weighting coefficients of luminance, contrast, structure comparison and Gaussian similarity in this evaluation model;
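As a minimal illustration of formulas (1)-(5), the comparison terms can be sketched in plain Python. The helper names, the default constants for C_1-C_3, and the convention that the fitted Gaussian values g(s) are supplied per column (rather than fitted here) are assumptions of this sketch, not part of the patent:

```python
import math

def _mean(vals):
    return sum(vals) / len(vals)

def _std(vals):
    m = _mean(vals)
    return math.sqrt(sum((v - m) ** 2 for v in vals) / len(vals))

def luminance(x, y, c1=1e-4):
    # Eq. (1): luminance comparison of flattened gray images x, y.
    mx, my = _mean(x), _mean(y)
    return (2 * mx * my + c1) / (mx ** 2 + my ** 2 + c1)

def contrast(x, y, c2=1e-4):
    # Eq. (2): contrast comparison via the standard deviations.
    sx, sy = _std(x), _std(y)
    return (2 * sx * sy + c2) / (sx ** 2 + sy ** 2 + c2)

def structure(x, y, c3=1e-4):
    # Eq. (3): structure comparison via the covariance.
    mx, my = _mean(x), _mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return (cov + c3) / (_std(x) * _std(y) + c3)

def gaussian_similarity(cols, gauss):
    # Eq. (4): cols is the image to be evaluated as a list of T columns,
    # gauss[u] is the fitted Gaussian value g(s) for column u (assumed given).
    total = 0.0
    for u, col in enumerate(cols):
        col_mean = _mean(col)
        total += ((col_mean - gauss[u]) / col_mean) ** 2
    return 1.0 - total / len(cols)

def gfssim(x_flat, y_flat, y_cols, gauss, alpha=1, beta=1, gamma=1, lam=1):
    # Eq. (5), with all four weights defaulting to 1 as in Embodiment 1.
    return (luminance(x_flat, y_flat) ** alpha
            * contrast(x_flat, y_flat) ** beta
            * structure(x_flat, y_flat) ** gamma
            * gaussian_similarity(y_cols, gauss) ** lam)
```

An image compared against itself with a perfectly matching Gaussian profile scores 1, which is the upper bound against which the threshold of the embodiment (0.998) is set.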
2) Setting the center-extraction decision criterion based on the standard image
The standard image is defined as the image captured when the incident laser line and the camera optical axis are collinear and perpendicular to the receiving plane; the stripe center of the standard image is the geometric center of the stripe and can be extracted directly with the geometric-center method. A Gaussian-similarity image quality threshold GFSSIM is established for the standard image. When the image to be evaluated is sufficiently similar to the standard image, i.e. its score exceeds GFSSIM, the stripe center of the image is taken to lie approximately at the geometric center of the stripe, and the geometric-center method can be used to extract it. When the stripe deviates considerably from the standard image, i.e. its score falls below GFSSIM, the stripe center no longer lies at the geometric center of the stripe; a larger offset is produced and the extracted center must be corrected;
Step 2: stripe-center extraction for the standard image
1) Extracting the two stripe edge lines with the Hough transform
In the image coordinate system, let the line equation be v = ku + b,
where (u, v) are pixel coordinates, k is the slope of the line and b its intercept; the parametric equation of the line is defined as:
\rho = u_i\cos\theta + v_i\sin\theta \quad (6)
where ρ is the distance from the origin to the line and θ is the angle between the line and the u-axis. Taking ρ-θ as the independent-dependent variable pair, each pixel (u_i, v_i) in the image coordinate system is substituted into formula (6), forming one curve per pixel; the curves intersect at (ρ, θ), from which the parameters of the line are obtained:
k = -\cot\theta \quad (7)
b = \rho/\sin\theta \quad (8)
The equations of the two stripe edges then follow:
y = x(-\cot\theta_l) + \rho_l/\sin\theta_l \quad (9)
y = x(-\cot\theta_r) + \rho_r/\sin\theta_r \quad (10)
where (ρ_l, θ_l) are the parameters of the left-edge line and (ρ_r, θ_r) those of the right-edge line. Since the stripe edge lines are nearly parallel, the stripe width D can be computed with the distance formula for two parallel lines:
k = -(\cot\theta_l + \cot\theta_r)/2 \quad (11)
D = \frac{|\rho_l/\sin\theta_l - \rho_r/\sin\theta_r|}{\sqrt{1+k^2}} \quad (12)
where k is the average slope, used for computing the distance between the two parallel lines;
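Formulas (7), (8), (11) and (12) translate directly into code. The sketch below assumes the edge lines have already been detected in Hough (ρ, θ) form; the function names are illustrative:

```python
import math

def line_from_hough(rho, theta):
    """Eqs. (7)-(8): slope-intercept form v = k*u + b from Hough (rho, theta)."""
    k = -1.0 / math.tan(theta)          # k = -cot(theta)
    b = rho / math.sin(theta)
    return k, b

def stripe_width(rho_l, theta_l, rho_r, theta_r):
    """Eqs. (11)-(12): distance between the two nearly parallel edge lines."""
    k = -(1.0 / math.tan(theta_l) + 1.0 / math.tan(theta_r)) / 2.0   # eq. (11)
    return (abs(rho_l / math.sin(theta_l) - rho_r / math.sin(theta_r))
            / math.sqrt(1.0 + k * k))                                # eq. (12)
```

For two horizontal edges (θ = π/2) at ρ = 10 and ρ = 14 pixels, the width is simply 4 pixels; for slanted edges the √(1+k²) term corrects the ρ difference to a perpendicular distance.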
2) Determining the stripe geometric center
The left and right camera images are evaluated with the Gaussian-similarity image evaluation standard; if the structured-light extraction threshold is met, the stripe center can be determined directly with the geometric-center method from the edge-line equations. The stripe center line is obtained by solving the two edge-line equations simultaneously for their angular bisector.
The slope k of the center line is obtained from the angular-bisector slope formula (13):
\frac{|k_l - k|}{1 + k_l k} = \frac{|k - k_r|}{1 + k k_r} \quad (13)
where k_l = -\cot\theta_l is the slope of the left edge line and k_r = -\cot\theta_r that of the right edge line. The stripe center line is then:
y - y_0 = k(x - x_0) \quad (14)
where (x_0, y_0) is the intersection of the left and right edge lines: x_0 = (b_r - b_l)/(k_l - k_r), y_0 = k_l x_0 + b_l, with b_r and b_l the intercepts of the right and left lines;
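Equation (13) has a convenient closed form: the bisector slope is the tangent of the mean of the two edge-line angles, which satisfies (13) because both sides then equal the tangent of half the angle between the edges. A hypothetical helper, with names chosen for this sketch:

```python
import math

def center_line(kl, bl, kr, br):
    """Stripe center line (eq. 14) as the angular bisector of the two edges.
    The bisector slope solves eq. (13); equivalently it is the tangent of
    the mean of the two edge-line angles."""
    k = math.tan((math.atan(kl) + math.atan(kr)) / 2.0)
    x0 = (br - bl) / (kl - kr)      # intersection of the two edge lines
    y0 = kl * x0 + bl
    return k, x0, y0                 # center line: y - y0 = k * (x - x0)
```

For edges y = x and y = -x + 2, the edges meet at (1, 1) and the bisector is horizontal (k = 0), as symmetry requires.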
Step 3: corrected stripe extraction method
The left and right camera images are evaluated with the Gaussian-similarity image evaluation standard; if the score is below the extraction threshold, the stripe center extracted by the geometric-center method must be corrected;
1) Multi-source factor coupling model
Based on the properties of the laser and of the composite surface, the reflection behavior of the laser on the composite surface is analyzed and a multi-source factor coupling model is established:
I - k_a I_{pa} = \frac{I' - k_a I'_{pa}}{d\cos i}\left(k_d\cos i + k_s(\cos(\theta - i))^h\right) \quad (15)
where I is the total structured-light intensity, k_a I_{pa} is the ambient light intensity, k_a the ambient illumination coefficient, k_d the diffuse-reflection coefficient, k_s the specular coefficient, i the angle between the incident stripe and the surface normal, θ the angle between the viewing direction and the surface normal, d the distance from the laser to the receiving plane, and h the reflectivity exponent;
2) Determining the structured-light stripe width
From the spatial relation between the stripe center in the real image and in the standard image, the following relation is established:
D = d[\tan(i+\alpha) - \tan(i-\alpha)]\sin\theta \quad (16)
where D is the actual stripe width, i the stripe incidence angle, θ the camera viewing angle, α half the horizontal emission angle of the laser projector, and d the distance from the laser to the receiving plane;
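The geometric relation (16) is a one-liner; angles in radians, names illustrative:

```python
import math

def stripe_width_model(d, i, alpha, theta):
    """Eq. (16): predicted stripe width from the projection geometry.
    d: laser-to-receiving-plane distance, i: incidence angle,
    alpha: half the laser emission angle, theta: camera viewing angle."""
    return d * (math.tan(i + alpha) - math.tan(i - alpha)) * math.sin(theta)
```

At normal incidence (i = 0) and θ = π/2 this reduces to D = 2 d tan α, the width of the laser fan at distance d.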
3) Solving for the correction angles
From formulas (15) and (16), when the standard laser stripe is known, the gray values and the spatial width of the stripe depend only on the incidence angle between the laser and the composite surface and on the viewing angle between the camera and the measured surface. After a laser stripe image is acquired, the actual stripe width D is computed via Hough edge extraction, and formulas (15) and (16) are solved simultaneously to back out the stripe incidence angle i and the camera viewing angle θ, which serve as the correction angles for the stripe center line;
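The patent solves (15) and (16) jointly for i and θ. As a sketch of only the geometric half of that step, under the assumption that θ is already known, the width relation (16) can be inverted for the incidence angle i by bisection, since the predicted width grows monotonically with i on [0, π/2 − α):

```python
import math

def solve_incidence(D, d, alpha, theta, lo=0.0, hi=1.4, tol=1e-10):
    """Back out the stripe incidence angle i from eq. (16) by bisection.
    Assumes the camera viewing angle theta is known; the bracket [lo, hi]
    must keep i + alpha below pi/2 so the width is monotonic in i."""
    def width(i):
        return d * (math.tan(i + alpha) - math.tan(i - alpha)) * math.sin(theta)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if width(mid) < D:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Round-tripping a synthetic width generated at a known incidence angle recovers that angle, which is the consistency check one would apply before trusting the joint solve of (15) and (16).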
4) Stripe-center correction
The geometric center of the laser stripe is rapidly extracted with the geometric-center method; with the stripe incidence angle i and camera viewing angle θ known, the stripe center line is corrected according to the variation law of the stripe center in Euclidean space. The offset of the actual stripe center relative to the ideal geometric center obtained after correction is:
\Delta = \left[\tan i - \tfrac{1}{2}\tan(i+\alpha) - \tfrac{1}{2}\tan(i-\alpha)\right]\cos\theta \quad (17)
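The correction (17) is likewise direct to compute. Note that at normal incidence (i = 0) the offset vanishes, consistent with the definition of the standard image; the sketch below takes Δ exactly as printed in (17):

```python
import math

def center_offset(i, alpha, theta):
    """Eq. (17): offset of the actual stripe center relative to the ideal
    geometric center (angles in radians)."""
    return (math.tan(i)
            - 0.5 * math.tan(i + alpha)
            - 0.5 * math.tan(i - alpha)) * math.cos(theta)
```

Because tan is convex on (0, π/2), the mean of tan(i ± α) exceeds tan i for i > 0, so the offset is nonzero whenever the stripe is inclined, which is exactly the case the correction step targets.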
The beneficial effect of the invention is that the stripe image is evaluated with the Gaussian-similarity image quality evaluation method, and the whole stripe-center extraction process is divided into three main steps: Gaussian image quality assessment, geometric-center extraction for standard images, and stripe-center correction. Whether a stripe image needs center correction is decided from its quality score, and the accurate actual stripe center is obtained with the center-extraction algorithm and the correction method. The method has higher precision and fast computation, and satisfies the fast, high-precision extraction of structured-light stripe centers on large component surfaces.
Description of the drawings
Fig. 1 is a schematic diagram of the structured-light stripe center offset. In the figure: i, stripe incidence angle; θ, camera viewing angle; Δ, offset of the actual stripe center relative to the geometric stripe center; d, distance from the laser to the receiving plane; α, laser emission half-angle; 1, surface of the measured object; 2, laser; 3, camera imaging plane; 4, camera lens; 5, solid line, position of the geometric stripe center; 6, dashed line, actual stripe center.
Fig. 2 is the flow chart of the fast, high-precision method for extracting the light stripe center on large surfaces.
Detailed description of embodiments
A specific embodiment of the invention is described in detail below with reference to the technical solution and the accompanying drawings. Fig. 1 is a schematic diagram of the structured-light stripe center offset. The measured surface is a 3.4 m × 0.6 m T800 composite panel; the stripe is projected onto the composite panel at a certain angle, and the camera focus is adjusted to capture a clear stripe image.
Embodiment 1. The invention uses two cameras, each fitted with a wide-angle lens, to capture one stripe image each. The camera model is a VieWorks VC-12MC-M/C 65, resolution 4096 × 3072, CMOS image sensor, frame rate up to 64.3 fps at full frame, weight 420 g. The wide-angle lens is an EF 16-35mm f/2.8L II USM with the following parameters: focal length f = 16-35 mm (25.5-52.5 mm APS equivalent), aperture F2.8, lens dimensions 82 × 106 mm. Shooting conditions: image size 4096 × 3072 pixels, focal length 17 mm, object distance 750 mm, field of view about 800 mm × 800 mm.
Fig. 2 is the flow chart of the fast, high-precision method for extracting the light stripe center on large surfaces. Following this workflow, the whole extraction is divided into three main steps: Gaussian image quality assessment, geometric-center extraction for standard images, and stripe-center correction.
Step 1: setting the structured-light correction threshold based on the Gaussian-structural-similarity image quality evaluation standard
1) Establishing the image quality evaluation model based on Gaussian structural similarity
In this example the Gaussian-structural-similarity image quality evaluation model is established according to formula (5), with α, β, γ and λ all set to 1.
2) Determining the correction threshold
For a typical large-aircraft composite component surface, the laser is offset in 2° steps and the camera captures one stripe image at each step, giving a stripe image sequence. Evaluating the sequence with the quality model of formula (5) yields the correction threshold GFSSIM = 0.998: after the laser is offset by 20°, the stripe center offset exceeds 0.15 mm, and the stripe center should be corrected.
Step 2: stripe-center extraction for the standard image
1) Extracting the two stripe edge lines with the Hough transform
First, according to the parametric line equation (6), the ρ-θ diagram is obtained with ρ-θ as the independent-dependent variable pair, where ρ is the distance from the origin to the line and θ the angle between the line and the u-axis. From the intersection of the ρ-θ curves, the slope and intercept of the line are computed with formulas (7) and (8), yielding the edge-line equations (9) and (10) of the stripe.
2) Determining the stripe geometric center with the geometric-center method
From the two edge-line equations obtained in the previous step, the slope k of the stripe center line is computed with the angular-bisector formula (13); solving equations (9) and (10) simultaneously gives a point (x_0, y_0) on the center line, from which the center-line equation (14) is obtained.
Step 3: corrected stripe extraction method
1) Computing the correction angles
Solving formulas (15) and (16) simultaneously backs out the stripe incidence angle i and the camera viewing angle θ. Apart from i and θ, all other parameters are known, and the actual stripe width D is obtained from formulas (11) and (12) of the edge-extraction step.
2) Stripe-center correction
Formula (17) gives the offset Δ of the actual stripe center relative to the ideal geometric center. On the basis of the ideal geometric center extracted with the geometric-center method, the geometric center in the image to the left of the laser is shifted right by Δ and that in the image to its right is shifted left by Δ, giving the actual stripe center.
The invention evaluates the stripe image with the Gaussian-similarity image quality evaluation method; for images meeting the stripe-extraction threshold, the stripe center is extracted directly with the Hough transform and the geometric-center method; for images not meeting the threshold, the stripe geometric center is extracted and then corrected to obtain the actual stripe center. The beneficial effect of the method is higher precision and fast computation, satisfying the fast, high-precision extraction of structured-light stripe centers on large component surfaces.

Claims (1)

1. A fast, high-precision method for extracting the light stripe center on large surfaces, characterized in that, for line-structured light on a large composite component surface, the method establishes a stripe evaluation standard based on Gaussian-similarity image quality evaluation and proposes a center-extraction decision criterion; the incidence angle of the structured-light source and the viewing angle of the camera are obtained from the illumination model formed by the laser structured light on the composite surface and the spatial-information transformation relation; the geometric center of the structured light in the image is then extracted and corrected using these two angles, giving the actual stripe center; the concrete steps of the method are as follows:
Step 1: setting the center-extraction decision criterion based on the Gaussian-structural-similarity image quality evaluation standard
1) Image quality evaluation method based on Gaussian structural similarity
Exploiting the Gaussian profile of structured light and the structural similarity of images, an image quality evaluation method based on Gaussian structural similarity is proposed; for two images x and y to be compared, where x is the standard image and y is the image to be evaluated, the degree of structural similarity between the two images is measured in four aspects: luminance, contrast, structural similarity and Gaussian similarity;
Luminance comparison:
l(x,y) = \frac{2\mu_x\mu_y + C_1}{\mu_x^2 + \mu_y^2 + C_1} \quad (1)
where μ_x and μ_y are the mean luminances of the two images and C_1 is a small constant introduced to keep the denominator from being zero;
Contrast comparison:
c(x,y) = \frac{2\sigma_x\sigma_y + C_2}{\sigma_x^2 + \sigma_y^2 + C_2} \quad (2)
where σ_x and σ_y are the standard deviations of the two images and C_2 is a small constant introduced to keep the denominator from being zero;
Structure comparison:
s(x,y) = \frac{\sigma_{xy} + C_3}{\sigma_x\sigma_y + C_3} \quad (3)
where σ_xy is the covariance of the two images, characterizing the structural contrast, and C_3 is a small constant introduced to keep the denominator from being zero;
Based on the Gaussian profile of the structured-light stripe, the Gaussian similarity is defined as:
g(y) = 1 - \frac{1}{T}\sum_{u=1}^{T}\left[\left(\frac{1}{M}\sum_{i=1}^{M}x_{u,i} - g(s)\right)\Big/\left(\frac{1}{M}\sum_{i=1}^{M}x_{u,i}\right)\right]^2 \quad (4)
where x_{u,i} are the gray values of the image to be evaluated, T and M are the ranges of the two summations, and g(s) is the Gaussian distribution curve whose coefficients are obtained by Gaussian curve fitting; combining this with the structural-similarity-based image evaluation, the Gaussian-similarity image quality evaluation model is established as:
GFSSIM(x,y) = [l(x,y)]^{\alpha}[c(x,y)]^{\beta}[s(x,y)]^{\gamma}[g(y)]^{\lambda} \quad (5)
where α, β, γ and λ are the adaptive weighting coefficients of luminance, contrast, structure comparison and Gaussian similarity in this evaluation model;
2) Setting the center-extraction decision criterion based on the standard image
The standard image is defined as the image captured when the incident laser line and the camera optical axis are collinear and perpendicular to the receiving plane; the stripe center of the standard image is the geometric center of the stripe and is extracted directly with the geometric-center method; a Gaussian-similarity image quality threshold GFSSIM is established for the standard image; when the image to be evaluated is sufficiently similar to the standard image, i.e. its score exceeds GFSSIM, the stripe center of the image is taken to lie approximately at the geometric center of the stripe and is extracted with the geometric-center method; when the stripe deviates considerably from the standard image, i.e. its score falls below GFSSIM, the stripe center is no longer at the geometric center, a larger offset is produced, and the extracted center must be corrected;
Step 2: stripe-center extraction for the standard image
1) Extracting the two stripe edge lines with the Hough transform
In the image coordinate system, let the line equation be v = ku + b, where (u, v) are pixel coordinates, k is the slope of the line and b its intercept; the parametric equation of the line is defined as:
\rho = u_i\cos\theta + v_i\sin\theta \quad (6)
where ρ is the distance from the origin to the line and θ is the angle between the line and the u-axis; taking ρ-θ as the independent-dependent variable pair, each pixel (u_i, v_i) in the image coordinate system is substituted into formula (6), forming one curve per pixel; the curves intersect at (ρ, θ), from which the parameters of the line are obtained:
k = -\cot\theta \quad (7)
b = \rho/\sin\theta \quad (8)
and the equations of the two stripe edges follow:
y = x(-\cot\theta_l) + \rho_l/\sin\theta_l \quad (9)
y = x(-\cot\theta_r) + \rho_r/\sin\theta_r \quad (10)
where (ρ_l, θ_l) are the parameters of the left-edge line and (ρ_r, θ_r) those of the right-edge line; since the stripe edge lines are nearly parallel, the stripe width D is computed with the distance formula for two parallel lines:
k = -(\cot\theta_l + \cot\theta_r)/2 \quad (11)
D = \frac{|\rho_l/\sin\theta_l - \rho_r/\sin\theta_r|}{\sqrt{1+k^2}} \quad (12)
where k is the average slope, used for computing the distance between the two parallel lines;
2) Determining the stripe geometric center
The left and right camera images are evaluated with the Gaussian-similarity image evaluation standard; if the structured-light extraction threshold is met, the stripe center is determined directly with the geometric-center method from the edge-line equations; the stripe center line is obtained by solving the two edge-line equations simultaneously for their angular bisector;
The slope k of the center line is obtained from the angular-bisector slope formula (13):
\frac{|k_l - k|}{1 + k_l k} = \frac{|k - k_r|}{1 + k k_r} \quad (13)
where k_l = -\cot\theta_l is the slope of the left edge line and k_r = -\cot\theta_r that of the right edge line; the stripe center line is then:
y - y_0 = k(x - x_0) \quad (14)
where (x_0, y_0) is the intersection of the left and right edge lines, x_0 = (b_r - b_l)/(k_l - k_r), y_0 = k_l x_0 + b_l, and b_r, b_l are the intercepts of the right and left lines;
Step 3: corrected stripe extraction method
The left and right camera images are evaluated with the Gaussian-similarity image evaluation standard; if the score is below the extraction threshold, the stripe center extracted by the geometric-center method must be corrected;
1) Multi-source factor coupling model
Based on the properties of the laser and of the composite surface, the reflection behavior of the laser on the composite surface is analyzed and a multi-source factor coupling model is established:
I - k_a I_{pa} = \frac{I' - k_a I'_{pa}}{d\cos i}\left(k_d\cos i + k_s(\cos(\theta - i))^h\right) \quad (15)
where I is the total structured-light intensity, k_a I_{pa} is the ambient light intensity, k_a the ambient illumination coefficient, k_d the diffuse-reflection coefficient, k_s the specular coefficient, i the angle between the incident stripe and the surface normal, θ the angle between the viewing direction and the surface normal, d the distance from the laser to the receiving plane, and h the reflectivity exponent;
2) Determining the structured-light stripe width
From the spatial relation between the stripe center in the real image and in the standard image, the following relation is established:
D = d[\tan(i+\alpha) - \tan(i-\alpha)]\sin\theta \quad (16)
where D is the actual stripe width, i the stripe incidence angle, θ the camera viewing angle, α half the horizontal emission angle of the laser projector, and d the distance from the laser to the receiving plane;
3) Solving for the correction angles
From formulas (15) and (16), when the standard laser stripe is known, the gray values and the spatial width of the stripe depend only on the incidence angle between the laser and the composite surface and on the viewing angle between the camera and the measured surface; after a laser stripe image is acquired, the actual stripe width D is computed via Hough edge extraction, and formulas (15) and (16) are solved simultaneously to back out the stripe incidence angle i and the camera viewing angle θ, which serve as the correction angles for the stripe center line;
4) Stripe-center correction
The geometric center of the laser stripe is rapidly extracted with the geometric-center method; with the stripe incidence angle i and camera viewing angle θ known, the stripe center line is corrected according to the variation law of the stripe center in Euclidean space, the offset of the actual stripe center relative to the ideal geometric center obtained after correction being:
\Delta = \left[\tan i - \tfrac{1}{2}\tan(i+\alpha) - \tfrac{1}{2}\tan(i-\alpha)\right]\cos\theta \quad (17)
CN201510034164.0A 2015-01-21 2015-01-21 A kind of large surfaces Light stripes center extraction method of quick high accuracy Active CN104616325B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510034164.0A CN104616325B (en) 2015-01-21 2015-01-21 A kind of large surfaces Light stripes center extraction method of quick high accuracy

Publications (2)

Publication Number Publication Date
CN104616325A true CN104616325A (en) 2015-05-13
CN104616325B CN104616325B (en) 2018-02-16

Family

ID=53150758

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510034164.0A Active CN104616325B (en) 2015-01-21 2015-01-21 A kind of large surfaces Light stripes center extraction method of quick high accuracy

Country Status (1)

Country Link
CN (1) CN104616325B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8224068B2 (en) * 2007-09-18 2012-07-17 University Of Kentucky Research Foundation (Ukrf) Lock and hold structured light illumination
CN101814185A (en) * 2010-04-14 2010-08-25 天津大学 Line structured light vision sensor calibration method for micro-size measurement
CN103411553A (en) * 2013-08-13 2013-11-27 天津大学 Fast calibration method of multiple line structured light visual sensor
CN103955927A (en) * 2014-04-26 2014-07-30 江南大学 Fillet weld automatic tracking method based on laser vision

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
LI QI et al.: "Statistical behavior analysis and precision optimization for the laser stripe center detector based on Steger's algorithm", OPTICS EXPRESS *
LIANG ZHIGUO et al.: "Sub-pixel feature extraction and edge detection in structured-light three-dimensional measurement", Journal of Mechanical Engineering *
HE JUNJI et al.: "Research on light-stripe image processing methods in structured-light three-dimensional vision inspection", Journal of Beijing University of Aeronautics and Astronautics *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105091783A (en) * 2015-05-30 2015-11-25 大连理工大学 Method for modeling light strips projected on cylinder surface based on sectional gray scale energy distribution
CN105091783B (en) * 2015-05-30 2017-08-22 大连理工大学 Periphery striation modeling method based on section gray scale Energy distribution
CN105141839B (en) * 2015-08-21 2018-01-26 大连理工大学 A kind of high-definition image acquisition methods based on aperture time control
CN105141839A (en) * 2015-08-21 2015-12-09 大连理工大学 Method for obtaining high-definition images based on aperture time control
CN105335988A (en) * 2015-09-25 2016-02-17 大连理工大学 Hierarchical processing based sub-pixel center extraction method
CN105335988B (en) * 2015-09-25 2017-12-26 大连理工大学 A kind of sub-pix center extraction method based on layered shaping
CN105931232A (en) * 2016-04-18 2016-09-07 南京航空航天大学 High-precision sub-pixel extraction method for centers of light bars of structured light
CN105931232B (en) * 2016-04-18 2019-02-19 南京航空航天大学 Light strip center of structured light high-precision sub-pixel extraction
CN106097430A (en) * 2016-06-28 2016-11-09 哈尔滨工程大学 A kind of laser stripe center line extraction method of many gaussian signals matching
CN106228542A (en) * 2016-07-13 2016-12-14 苏州光图智能科技有限公司 High-rate laser projection line peak detection method
CN107516324A (en) * 2017-07-20 2017-12-26 大连理工大学 A kind of object boundary extracting method based on the mutation of striation geometric properties
CN107516324B (en) * 2017-07-20 2019-12-17 大连理工大学 Target boundary extraction method based on geometric characteristic mutation of light bars
CN107687819A * 2017-08-01 2018-02-13 大连理工大学 A rapid and high-precision sub-pixel extraction method for light-stripe centers
CN107578420A (en) * 2017-08-21 2018-01-12 大连理工大学 A kind of adaptive striation carrying out image threshold segmentation method
CN107578420B (en) * 2017-08-21 2019-11-19 大连理工大学 A kind of adaptive striation carrying out image threshold segmentation method
CN110533675A (en) * 2019-08-26 2019-12-03 大连理工大学 A kind of laser stripe blocks noise filtering and compensation method
CN110533675B (en) * 2019-08-26 2021-01-19 大连理工大学 Laser stripe shielding noise filtering and compensating method
CN111260708A (en) * 2020-01-14 2020-06-09 华中科技大学鄂州工业技术研究院 Line structure optical center extraction method and system
CN111462214A (en) * 2020-03-19 2020-07-28 南京理工大学 Line structure light stripe central line extraction method based on Hough transformation
CN111721316A (en) * 2020-06-22 2020-09-29 重庆大学 High-performance lane line identification region-of-interest prediction method
CN116878382A (en) * 2023-08-10 2023-10-13 安徽大学 Remote high-speed surface measurement method based on structured light
CN116878382B (en) * 2023-08-10 2024-05-24 安徽大学 Remote high-speed surface measurement method based on structured light

Also Published As

Publication number Publication date
CN104616325B (en) 2018-02-16

Similar Documents

Publication Publication Date Title
CN104616325A (en) Rapid and high-precision method for extracting light strip center on large surface
CN105300316B Rapid light-stripe center extraction method based on the gray-scale centroid method
CN107945268B (en) A kind of high-precision three-dimensional method for reconstructing and system based on binary area-structure light
CN104930985B (en) Binocular vision 3 D topography measurement method based on space-time restriction
US8803943B2 (en) Formation apparatus using digital image correlation
CN104748683B (en) A kind of on-line automatic measurement apparatus of Digit Control Machine Tool workpiece and measuring method
CN104101331B (en) Based on the noncooperative target pose measurement of all-optical field camera
CN103438832B (en) Based on the 3-dimensional image measuring method of line-structured light
CN111414798A (en) Head posture detection method and system based on RGB-D image
CN105046743A (en) Super-high-resolution three dimensional reconstruction method based on global variation technology
CN105716539B (en) A kind of three-dimentioned shape measurement method of quick high accuracy
CN105205858A (en) Indoor scene three-dimensional reconstruction method based on single depth vision sensor
CN103884271A (en) Direct calibration method for line structured light vision sensor
CN111192235A (en) Image measuring method based on monocular vision model and perspective transformation
CN103292733B (en) A kind of corresponding point lookup method based on phase shift and trifocal tensor
CN104848801A (en) Line structure light vision sensor calibration method based on parallel bicylindrical target
CN110044374A (en) A kind of method and odometer of the monocular vision measurement mileage based on characteristics of image
CN106500625A (en) A kind of telecentricity stereo vision measuring apparatus and its method for being applied to the measurement of object dimensional pattern micron accuracies
CN103985121B (en) Method for optical calibration of underwater projector structure
Tian et al. An experimental evaluation method for the performance of a laser line scanning system with multiple sensors
Ren et al. Accurate three-dimensional shape and deformation measurement at microscale using digital image correlation
Li et al. Laser scanning based three dimensional measurement of vegetation canopy structure
CN109506629B (en) Method for calibrating rotation center of underwater nuclear fuel assembly detection device
CN103884294A (en) Wide view field type infrared light three-dimensional morphology measurement method and device thereof
CN102789644B (en) Novel camera calibration method based on two crossed straight lines

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant