CN112241964B - Light strip center extraction method for line structured light non-contact measurement - Google Patents

Light strip center extraction method for line structured light non-contact measurement

Publication number: CN112241964B (application CN202010999750.XA; earlier publication CN112241964A)
Authority: CN (China)
Other languages: Chinese (zh)
Legal status: Active (granted)
Inventors: 王太勇, 张凌雷, 冯志杰, 韩文灯
Original and current assignee: Tianjin University
Prior art keywords: pixel, points, point, image, neighborhood

Classifications

    • G06T7/00 Image analysis (G PHYSICS; G06 COMPUTING; G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL)
    • G06T7/10 Segmentation; Edge detection
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T5/70
    • G06T7/60 Analysis of geometric attributes
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image

Abstract

The invention discloses a light strip center extraction method for line structured light non-contact measurement. The method sequentially performs region-of-interest extraction, preliminary noise filtering, and adaptive threshold segmentation on an image to obtain an optimal binarized image; then sequentially applies median and Gaussian filtering to the image; coarsely extracts the light strip center with a skeleton thinning algorithm to obtain a single-pixel light strip centerline and numbers the center points; obtains, by a k-nearest-neighborhood method, N pixel points with nonzero gray level in the neighborhood of a center point, fits a line segment to the N pixel points, solves the unit normal vector of the center point by a principal component analysis algorithm, fits a Gaussian function to the gray values of the pixel points in the neighborhood of the center point along the unit normal vector direction, and takes the expected value of the fitted Gaussian function as the final coordinate of the light strip center point. The process is repeated until all points are traversed, yielding the precise position of the light strip center. The invention improves the extraction precision of the light strip center and is efficient.

Description

Light strip center extraction method for line structured light non-contact measurement
Technical Field
The invention relates to the field of non-contact measurement in the intelligent manufacturing industry, in particular to a light strip center extraction method for line structured light non-contact measurement.
Background
The line structured light vision three-dimensional measurement technique has the advantages of non-contact operation, good flexibility, high measurement speed, and high precision, and has wide application prospects in fields such as defect detection, reverse engineering, robotics, and cultural relic digitization.
Line structured light vision three-dimensional measurement is non-contact measurement based on the laser triangulation principle. A laser projects line structured light onto the surface of the measured object; the light strip deforms with the shape of the object, and the deformed light strip image is captured by a CCD camera. The center position coordinates encode the relative position between the laser and the CCD camera as well as the depth information of the measured surface, so the three-dimensional contour of the measured object can be solved back from a two-dimensional picture, realizing three-dimensional reconstruction of the measured surface.
In line structured light vision measurement systems, accurate extraction of the light strip center from the laser image is an important factor influencing the precision of the whole measuring system, and determining the normal direction of the light strip is a critical step.
Commonly used light strip center extraction algorithms include the contour centerline method, the gray threshold method, the gray gravity center (centroid) method, the Gaussian fitting method, and the Hessian matrix method. The contour centerline method and the gray threshold method have poor robustness and low detection precision; the gray gravity center method and the Gaussian fitting method do not take the normal direction of the light strip into account and are therefore suitable only for light strips whose normal direction changes little; the Hessian matrix method has poor real-time performance because of its large computational load.
Disclosure of Invention
The invention provides a light strip center extraction method for line structured light non-contact measurement to solve the technical problems in the prior art.
The technical scheme adopted by the invention to solve the technical problems in the prior art is as follows: a light strip center extraction method for line structured light non-contact measurement, which first performs region-of-interest extraction on the acquired image; performs preliminary noise filtering on the extracted region of interest; performs adaptive threshold segmentation on the image after preliminary noise filtering to obtain an optimal binarized image; sequentially applies median filtering and Gaussian filtering to the binarized image; then coarsely extracts the light strip center in the Gaussian-filtered image with a skeleton thinning algorithm and outputs the coordinate data of the coarsely extracted light strip center pixel points; takes the light strip center pixel points as central points and numbers them; denoting the point numbered k as the kth central point, obtains by the k-nearest-neighborhood method N pixel points with nonzero gray level in the neighborhood of the kth central point, fits a line segment to the N pixel points, obtains by a principal component analysis algorithm the unit normal vector of the kth central point perpendicular to the line segment, fits a Gaussian function to the gray values of the pixel points in the neighborhood of the kth central point along the unit normal vector direction, and takes the expected value of the fitted Gaussian function as the final coordinate of the light strip central point, thereby obtaining the light strip center coordinate at sub-pixel level; this process is repeated until all numbered central points are traversed, giving the accurate position of the light strip center.
Further, the method for coarsely extracting the light strip center in the Gaussian-filtered image by the skeleton thinning algorithm comprises the following steps:
Step A1: take the Gaussian-filtered image as the image to be thinned; in the image to be thinned, set the pixel value of white pixel points to 1 and the pixel value of black pixel points to 0; set the number of thinning iterations to u; initialize m = 0;
Step A2: copy the image to be thinned to generate a first temporary image; select a pixel point with pixel value 1 from the first temporary image and denote it pi;
Step A3: extract the square neighborhood points around point pi, letting pi1~pi8 be the eight square neighborhood points of pi, where pi1 lies at the upper left of pi and pi1~pi8 are adjacent in sequence in the clockwise direction; the pixel values of pi1~pi8 correspond in turn to si1~si8, each of which takes the value 1 or 0;
Step A4: scan the first temporary image; if the neighborhood pixel values of point pi satisfy the following three conditions simultaneously, delete pi from the image to be thinned, otherwise keep pi:
(1) 2 ≤ si1 + si2 + si3 + si4 + si5 + si6 + si7 + si8 ≤ 6;
(2) in the cyclic arrangement si1, si2, …, si8, the "0, 1" pattern appears on adjacent pixel values at most once;
(3) si2·si4·si6 = 0 and si4·si6·si8 = 0;
Step A5: determine whether every pixel point with value 1 in the first temporary image has been traversed; if not, select a not-yet-selected pixel point with value 1 from the first temporary image, denote it pi, and return to step A3; if so, go to step A6;
Step A6: copy the image to be thinned as processed in the previous step to generate a second temporary image; select a pixel point with pixel value 1 from the second temporary image and denote it pk;
Step A7: extract the square neighborhood points around point pk, letting pk1~pk8 be the eight square neighborhood points of pk, where pk1 lies at the upper left of pk and pk1~pk8 are adjacent in sequence in the clockwise direction; the pixel values of pk1~pk8 correspond in turn to sk1~sk8, each of which takes the value 1 or 0;
Step A8: scan the second temporary image; if the neighborhood pixel values of point pk satisfy the following three conditions simultaneously, delete pk from the image to be thinned, otherwise keep pk:
(1) 2 ≤ sk1 + sk2 + sk3 + sk4 + sk5 + sk6 + sk7 + sk8 ≤ 6;
(2) in the cyclic arrangement sk1, sk2, …, sk8, the "0, 1" pattern appears on adjacent pixel values at most once;
(3) sk2·sk4·sk8 = 0 and sk2·sk6·sk8 = 0;
Step A9: determine whether every pixel point with value 1 in the second temporary image has been traversed; if not, select a not-yet-selected pixel point with value 1 from the second temporary image, denote it pk, and return to step A7; if so, set m = m + 1 and go to step A10;
Step A10: determine whether m is smaller than u; if so, return to step A2; if not, take the processed image to be thinned as the final light strip center skeleton image.
Further, u is 10 to 20 times.
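The two sub-iterations of steps A2–A9 follow the classical two-pass (Zhang–Suen-style) thinning scheme. A minimal NumPy sketch, assuming a 0/1 integer image; the function names, neighbour helper, and default u are illustrative, not from the patent:

```python
import numpy as np

def neighbours(img, r, c):
    """Eight square-neighbourhood values in clockwise order from the
    upper-left, matching si1..si8 of step A3."""
    return [img[r-1, c-1], img[r-1, c], img[r-1, c+1], img[r, c+1],
            img[r+1, c+1], img[r+1, c], img[r+1, c-1], img[r, c-1]]

def transitions(s):
    """Count of '0, 1' patterns on adjacent values in the cyclic arrangement."""
    return sum((a, b) == (0, 1) for a, b in zip(s, s[1:] + s[:1]))

def thin(binary, u=15):
    """binary: 2-D array of 0/1; u: number of thinning iterations
    (the patent suggests 10 to 20)."""
    img = binary.copy()
    for _ in range(u):
        # two sub-iterations with different condition (3), as in steps A4 and A8
        for cond3 in (lambda s: s[1]*s[3]*s[5] == 0 and s[3]*s[5]*s[7] == 0,
                      lambda s: s[1]*s[3]*s[7] == 0 and s[1]*s[5]*s[7] == 0):
            tmp = img.copy()                     # first/second temporary image
            for r in range(1, img.shape[0] - 1):
                for c in range(1, img.shape[1] - 1):
                    if tmp[r, c] != 1:
                        continue
                    s = neighbours(tmp, r, c)
                    if 2 <= sum(s) <= 6 and transitions(s) <= 1 and cond3(s):
                        img[r, c] = 0            # delete the point in the image to be thinned
    return img
```

On a solid 3-pixel-wide strip this erodes the boundary from both sides and leaves a connected one-pixel centerline, since interior line pixels violate the transition condition and endpoints violate condition (1).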
Further, the method for obtaining, by the principal component analysis algorithm, the unit normal vector of the kth central point perpendicular to the line segment comprises the following steps:
Step B1: establish the following function Q(n), such that when Q(n) attains its minimum, n is the unit normal vector of the kth central point perpendicular to the line segment:

Q(n) = min Σ ((x_i − c)ᵀn)²,  i = 1…m

where x_i is the coordinate of the ith pixel point in the neighborhood of the kth central point, i = 1…m, m is the number of pixel points in the neighborhood of the kth central point, and c is the coordinate of the kth central point on the coarsely extracted light strip centerline;
Step B2: decentralize the data; setting y_i = x_i − c, the function Q(n) simplifies to:

Q(n) = min Σ (y_iᵀn)²,  i = 1…m

where y_i is the decentralized coordinate of the ith pixel point in the neighborhood of the kth central point, i = 1…m;
Step B3: let the coordinates of y_i be (a_i, b_i) and let the intermediate matrix be S, expressed as follows:

S = Σ y_i·y_iᵀ = | Σa_i²     Σa_i·b_i |
                 | Σa_i·b_i  Σb_i²    |

The function Q(n) then further simplifies to:

Q(n) = min nᵀSn

subject to the constraint nᵀn = 1. By the Lagrange multiplier method, solving for the unit normal vector n amounts to performing an eigendecomposition of the matrix S; the eigenvector corresponding to the smallest eigenvalue is taken as the desired unit normal vector n.
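Assuming the neighbourhood points and the centre are given as 2-D coordinates, steps B1–B3 reduce to a small eigenvalue problem; a sketch (the function name is illustrative):

```python
import numpy as np

def unit_normal(points, center):
    """Unit normal of the centerline at `center`, from the N nonzero-gray
    neighbourhood points: the eigenvector of S = sum(y_i y_i^T) for the
    smallest eigenvalue, with y_i = x_i - c (steps B1-B3)."""
    y = np.asarray(points, float) - np.asarray(center, float)  # decentralize
    S = y.T @ y                        # 2x2 intermediate matrix S
    w, v = np.linalg.eigh(S)           # eigenvalues in ascending order for symmetric S
    n = v[:, 0]                        # eigenvector of the smallest eigenvalue
    return n / np.linalg.norm(n)
```

For exactly collinear neighbourhood points the smallest eigenvalue is zero and the returned vector is the line's exact normal; noisy points perturb it only slightly.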
Further, the method of fitting a Gaussian function to the gray values of the pixel points is as follows:
Let f(x) be a Gaussian function with the expression:

f(x) = A · exp(−(x − μ)² / (2σ²))

where A is the height of the Gaussian curve, μ is the coordinate of the peak center of the Gaussian function, and σ² is the variance. Taking the logarithm of both sides converts the Gaussian function into a quadratic curve equation, denoted F(x):

ln f(x) = ln A − (x − μ)² / (2σ²)

F(x) = a0 + a1·x + a2·x²

where a0, a1, a2 are the coefficients of the quadratic curve equation:

a0 = ln A − μ²/(2σ²),  a1 = μ/σ²,  a2 = −1/(2σ²)

Extract z coordinate points in the neighborhood of the kth central point along the direction of its unit normal vector n as a data set, denoted M_j with j = 1…z; substituting these points into the equation F(x) yields a0, a1, a2 and hence the fitted Gaussian function.
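The coefficient relations above can be checked numerically; in the sketch below the peak parameters A, μ, σ are arbitrary illustrative values, not from the patent:

```python
import numpy as np

# illustrative peak parameters
A, mu, sigma = 200.0, 3.7, 1.4

# coefficients of F(x) = a0 + a1*x + a2*x^2 from the relations above
a0 = np.log(A) - mu**2 / (2 * sigma**2)
a1 = mu / sigma**2
a2 = -1.0 / (2 * sigma**2)

x = np.linspace(0.0, 8.0, 17)
f = A * np.exp(-(x - mu)**2 / (2 * sigma**2))   # Gaussian profile f(x)
F = a0 + a1 * x + a2 * x**2                     # quadratic curve in log space

# ln f(x) coincides with F(x), and the peak is recovered as -a1/(2*a2)
print(np.allclose(np.log(f), F), np.isclose(-a1 / (2 * a2), mu))  # prints: True True
```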
Further, the method of taking the expected value of the fitted Gaussian function as the final coordinate of the light strip central point is as follows:
Form the residual between each point of M_j and the corresponding point given by the fitted Gaussian function, and take the sum of the squared residuals:

D = Σ (a0 + a1·x_j + a2·x_j² − ln g_j)²,  j = 1…z

where D is the sum of the squared residuals, z is the number of data points in the data set, and x_j and g_j denote the coordinate and gray value of the jth data point;
Take the partial derivative of D with respect to each of a0, a1, a2 and set it to zero:

∂D/∂a0 = 0,  ∂D/∂a1 = 0,  ∂D/∂a2 = 0

Rearranging the terms and isolating a0, a1, a2 gives the following normal equations:

| z      Σx_j    Σx_j²  |   | a0 |   | Σ ln g_j      |
| Σx_j   Σx_j²   Σx_j³  | · | a1 | = | Σ x_j·ln g_j  |
| Σx_j²  Σx_j³   Σx_j⁴  |   | a2 |   | Σ x_j²·ln g_j |

Finally, solve this system for [a0 a1 a2] by the Householder transformation; the coordinate of the center of the Gaussian function peak is then

μ = −a1 / (2·a2)

and the corresponding coordinate in the image is the sub-pixel position of the light strip center.
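The fit and the peak recovery μ = −a1/(2·a2) can be sketched as follows; np.linalg.qr uses Householder reflections, loosely mirroring the Householder transformation mentioned above, and the sample profile is illustrative:

```python
import numpy as np

def subpixel_center(x, g):
    """Least-squares fit of a0 + a1*x + a2*x^2 to ln(gray values),
    solved via Householder-based QR; returns mu = -a1/(2*a2)."""
    x = np.asarray(x, float)
    b = np.log(np.asarray(g, float))
    V = np.vander(x, 3, increasing=True)   # columns: 1, x, x^2
    Q, R = np.linalg.qr(V)                 # QR factorization (Householder reflections)
    a0, a1, a2 = np.linalg.solve(R, Q.T @ b)
    return -a1 / (2.0 * a2)               # expected value of the fitted Gaussian

# sample: noiseless Gaussian gray profile along the normal direction
xs = np.arange(-3, 4, 1.0)
gs = 180.0 * np.exp(-(xs - 0.4)**2 / (2 * 1.2**2))
print(subpixel_center(xs, gs))             # a value close to 0.4
```

On noiseless data the log profile is exactly quadratic, so the recovered center matches the true peak to floating-point precision.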
Furthermore, before the region of interest extraction processing is performed on the acquired image, contrast and brightness adjustment is performed on the acquired image.
The invention has the following advantages and positive effects. Existing methods mostly assume that the pixel distribution across the laser line cross-section is symmetric and uniform; in actual acquisition, because of the incident angle, refraction and scattering caused by the uneven surface of the measured object, noise of the image acquisition equipment, and similar problems, the cross-section pixel distribution rarely satisfies such symmetry and uniformity. The light strip center extraction method for line structured light non-contact measurement overcomes these problems: the pixel gray distribution along the normal direction is more symmetrical, and the expected-value point obtained by Gaussian fitting is closer to the true light strip center point, so the extraction precision of the light strip center is improved. Moreover, no large amount of convolution operations is needed, which improves efficiency.
Drawings
FIG. 1 is a flow chart of the operation of the present invention.
Fig. 2 is an original image captured by a CCD camera.
Fig. 3 is an image with the contrast and brightness of the original image adjusted.
Fig. 4 is an image after optimal binarization of the image in fig. 3.
Fig. 5 is an image obtained by performing coarse extraction on the image in fig. 4 by using a skeleton thinning algorithm.
Fig. 6 is a diagram of the final effect of extracting the central point of the light bar by using the method of the present invention.
FIG. 7 is a layout diagram of square neighborhood points around pi points in a skeleton refinement algorithm.
Detailed Description
For further understanding of the contents, features and effects of the present invention, the following embodiments are enumerated in conjunction with the accompanying drawings, and the following detailed description is given:
Referring to figs. 1 to 7, a light strip center extraction method for line structured light non-contact measurement first extracts a region of interest from the acquired image; preliminary noise filtering is performed on the extracted region of interest; adaptive threshold segmentation is performed on the image after preliminary noise filtering to obtain an optimal binarized image; median filtering and Gaussian filtering are applied to the binarized image in sequence; the light strip center in the Gaussian-filtered image is then coarsely extracted by a skeleton thinning algorithm, and the coordinate data of the coarsely extracted light strip center pixel points are output; the light strip center pixel points are taken as central points and numbered; denoting the point numbered k as the kth central point, N pixel points with nonzero gray level in the neighborhood of the kth central point are obtained by the k-nearest-neighborhood method, a line segment is fitted to the N pixel points, the unit normal vector of the kth central point perpendicular to the line segment is obtained by a principal component analysis algorithm, a Gaussian function is fitted to the gray values of the pixel points in the neighborhood of the kth central point along the unit normal vector direction, and the expected value of the fitted Gaussian function is taken as the final coordinate of the light strip central point, thereby obtaining the light strip center coordinate at sub-pixel level; the process is repeated until all numbered central points are traversed, giving the accurate position of the light strip center.
Before the region of interest extraction processing is carried out on the collected image, the contrast and the brightness of the collected image can be adjusted.
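A rough sketch of the front of this pipeline (contrast/brightness adjustment, region-of-interest extraction, threshold segmentation) for an 8-bit grayscale input follows. Otsu's method stands in for the adaptive threshold segmentation, which the patent does not name, the gain/offset values are illustrative, and the median/Gaussian filtering stages are omitted here:

```python
import numpy as np

def otsu_threshold(img):
    """Global Otsu threshold on an 8-bit image (an assumed stand-in for the
    patent's 'self-adaptive threshold segmentation')."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # cumulative class probability
    mu = np.cumsum(p * np.arange(256))      # cumulative class mean mass
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    return int(np.nanargmax(sigma_b))       # threshold maximizing between-class variance

def preprocess(gray, roi, alpha=1.5, beta=10.0):
    """Contrast/brightness adjustment, ROI crop, and binarization.
    alpha (gain) and beta (offset) are illustrative defaults."""
    x, y, w, h = roi
    img = np.clip(alpha * gray.astype(float) + beta, 0, 255).astype(np.uint8)
    img = img[y:y+h, x:x+w]                 # region of interest
    t = otsu_threshold(img)
    return (img > t).astype(np.uint8)       # binarized image with values 0/1
```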
The method for coarsely extracting the light strip center in the Gaussian-filtered image by the skeleton thinning algorithm may comprise the following steps:
Step A1: take the Gaussian-filtered image as the image to be thinned; in the image to be thinned, set the pixel value of white pixel points to 1 and the pixel value of black pixel points to 0. The number of thinning iterations may be set to u; u may be from 10 to 20. A parameter m may be set to accumulate the number of thinning passes, initialized to m = 0.
Step A2: copy the image to be thinned to generate a first temporary image; a pixel point with pixel value 1 may be selected from the first temporary image and denoted pi.
Step A3: the square neighborhood points around point pi may be extracted, letting pi1~pi8 be the eight square neighborhood points of pi, where pi1 lies at the upper left of pi and pi1~pi8 are adjacent in sequence in the clockwise direction; the pixel values of pi1~pi8 correspond in turn to si1~si8, each of which takes the value 1 or 0.
Step A4: scan the first temporary image; if the neighborhood pixel values of point pi satisfy the following three conditions simultaneously, delete pi from the image to be thinned, otherwise keep pi:
(1) 2 ≤ si1 + si2 + si3 + si4 + si5 + si6 + si7 + si8 ≤ 6.
(2) In the cyclic arrangement si1, si2, …, si8, the "0, 1" pattern appears on adjacent pixel values at most once.
(3) si2·si4·si6 = 0 and si4·si6·si8 = 0.
Step A5: determine whether every pixel point with value 1 in the first temporary image has been traversed; if not, select a not-yet-selected pixel point with value 1 from the first temporary image, denote it pi, and return to step A3; if so, go to step A6.
Step A6: copy the image to be thinned as processed in the previous step to generate a second temporary image; a pixel point with pixel value 1 may be selected from the second temporary image and denoted pk.
Step A7: the square neighborhood points around point pk may be extracted, letting pk1~pk8 be the eight square neighborhood points of pk, where pk1 lies at the upper left of pk and pk1~pk8 are adjacent in sequence in the clockwise direction; the pixel values of pk1~pk8 correspond in turn to sk1~sk8, each of which takes the value 1 or 0.
Step A8: scan the second temporary image; if the neighborhood pixel values of point pk satisfy the following three conditions simultaneously, delete pk from the image to be thinned, otherwise keep pk.
(1) 2 ≤ sk1 + sk2 + sk3 + sk4 + sk5 + sk6 + sk7 + sk8 ≤ 6.
(2) In the cyclic arrangement sk1, sk2, …, sk8, the "0, 1" pattern appears on adjacent pixel values at most once; this condition ensures that connectivity is preserved after the current pixel point is deleted.
(3) sk2·sk4·sk8 = 0 and sk2·sk6·sk8 = 0.
Step A9: determine whether every pixel point with value 1 in the second temporary image has been traversed; if not, select a not-yet-selected pixel point with value 1 from the second temporary image, denote it pk, and return to step A7; if so, set m = m + 1 and go to step A10.
Step A10: determine whether m is smaller than u; if so, return to step A2; if not, take the processed image to be thinned as the final light strip center skeleton image.
Completing steps A2 through A9 constitutes one pass of the thinning algorithm; repeating steps A2 through A9 over multiple iterations yields the final light strip center skeleton image.
The light strip center skeleton image is the image obtained by coarse extraction of the light strip center; from it, the single-pixel light strip centerline is obtained, and the coordinate point information on the centerline can be output and numbered accordingly.
During numbering, the light strip center skeleton image is traversed line by line, and the pixel points with pixel value 1 are numbered and marked in sequence.
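The line-by-line numbering can be sketched as follows (the helper name is illustrative); np.nonzero scans in row-major order, which matches a line-by-line traversal:

```python
import numpy as np

def number_center_points(skeleton):
    """Traverse the skeleton image line by line and number the pixel points
    with value 1 in sequence; returns [(number, (row, col)), ...]."""
    rows, cols = np.nonzero(skeleton)   # row-major scan order
    return list(enumerate(zip(rows.tolist(), cols.tolist()), start=1))
```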
After numbering, the unit normal vector of the kth central point perpendicular to the line segment is obtained by the principal component analysis algorithm; the method may comprise the following steps:
Step B1: the following function Q(n) may be established, such that when Q(n) attains its minimum, n is the unit normal vector of the kth central point perpendicular to the line segment:

Q(n) = min Σ ((x_i − c)ᵀn)²,  i = 1…m

where x_i is the coordinate of the ith pixel point in the neighborhood of the kth central point, i = 1…m, m is the number of pixel points in the neighborhood of the kth central point, and c is the coordinate of the kth central point on the coarsely extracted light strip centerline;
Step B2: decentralize the data; setting y_i = x_i − c, the function Q(n) simplifies to:

Q(n) = min Σ (y_iᵀn)²,  i = 1…m

where y_i is the decentralized coordinate of the ith pixel point in the neighborhood of the kth central point, i = 1…m;
Step B3: the coordinates of y_i may be set to (a_i, b_i), and an intermediate matrix may be set to S, expressed as follows:

S = Σ y_i·y_iᵀ = | Σa_i²     Σa_i·b_i |
                 | Σa_i·b_i  Σb_i²    |

The function Q(n) then further simplifies to:

Q(n) = min nᵀSn

subject to the constraint nᵀn = 1. By the Lagrange multiplier method, solving for the unit normal vector n amounts to performing an eigendecomposition of the matrix S; the eigenvector corresponding to the smallest eigenvalue is taken as the desired unit normal vector n.
Further, a Gaussian function is fitted to the gray values of the pixel points in the neighborhood of the kth central point along the unit normal vector direction; the method of Gaussian fitting may be as follows:
Let f(x) be a Gaussian function with the expression:

f(x) = A · exp(−(x − μ)² / (2σ²))

where A is the height of the Gaussian curve, μ is the coordinate of the peak center of the Gaussian function, and σ² is the variance. Taking the logarithm of both sides converts the Gaussian function into a quadratic curve equation, denoted F(x):

ln f(x) = ln A − (x − μ)² / (2σ²)

F(x) = a0 + a1·x + a2·x²

where a0, a1, a2 are the coefficients of the quadratic curve equation:

a0 = ln A − μ²/(2σ²),  a1 = μ/σ²,  a2 = −1/(2σ²)

Extract z coordinate points in the neighborhood of the kth central point along the direction of its unit normal vector n as a data set, denoted M_j with j = 1…z; substituting these points into the equation F(x) yields a0, a1, a2 and hence the fitted Gaussian function.
The expected value of the fitted Gaussian function is taken as the final coordinate of the light strip central point, thereby obtaining the light strip center coordinate at sub-pixel level; the method may be as follows:
The residual between each point of M_j and the corresponding point given by the fitted Gaussian function may be formed, and the sum of the squared residuals taken:

D = Σ (a0 + a1·x_j + a2·x_j² − ln g_j)²,  j = 1…z

where D is the sum of the squared residuals, z is the number of data points in the data set, and x_j and g_j denote the coordinate and gray value of the jth data point;
The partial derivative of D with respect to each of a0, a1, a2 may be taken and set to zero:

∂D/∂a0 = 0,  ∂D/∂a1 = 0,  ∂D/∂a2 = 0

Rearranging the terms and isolating a0, a1, a2 gives the following normal equations:

| z      Σx_j    Σx_j²  |   | a0 |   | Σ ln g_j      |
| Σx_j   Σx_j²   Σx_j³  | · | a1 | = | Σ x_j·ln g_j  |
| Σx_j²  Σx_j³   Σx_j⁴  |   | a2 |   | Σ x_j²·ln g_j |

Finally, this system may be solved for [a0 a1 a2] by the Householder transformation; the coordinate of the center of the Gaussian function peak is then

μ = −a1 / (2·a2)

and the corresponding coordinate in the image is the sub-pixel position of the light strip center.
The above-described embodiments are intended only to illustrate the technical ideas and features of the present invention; their purpose is to enable those skilled in the art to understand and implement the contents of the invention. The invention is not limited to these embodiments: equivalent changes or modifications made within the spirit of the present invention shall fall within its scope.

Claims (7)

1. A light strip center extraction method for line structured light non-contact measurement, characterized in that region-of-interest extraction is first performed on the acquired image; preliminary noise filtering is performed on the extracted region of interest; adaptive threshold segmentation is performed on the image after preliminary noise filtering to obtain an optimal binarized image; median filtering and Gaussian filtering are applied to the binarized image in sequence; the light strip center in the Gaussian-filtered image is then coarsely extracted by a skeleton thinning algorithm, and the coordinate data of the coarsely extracted light strip center pixel points are output; the light strip center pixel points are taken as central points and numbered; denoting the point numbered k as the kth central point, N pixel points with nonzero gray level in the neighborhood of the kth central point are obtained by the k-nearest-neighborhood method, a line segment is fitted to the N pixel points, the unit normal vector of the kth central point perpendicular to the line segment is obtained by a principal component analysis algorithm, a Gaussian function is fitted to the gray values of the pixel points in the neighborhood of the kth central point along the unit normal vector direction, and the expected value of the fitted Gaussian function is taken as the final coordinate of the light strip central point, thereby obtaining the light strip center coordinate at sub-pixel level; the process is repeated until all numbered central points are traversed, giving the accurate position of the light strip center.
2. The method for extracting the light strip centers for line structured light non-contact measurement as claimed in claim 1, wherein the method for roughly extracting the light strip centers in the gaussian filtered image by using the skeleton thinning algorithm comprises the following steps:
Step A1: take the Gaussian-filtered image as the image to be thinned; in the image to be thinned, set the pixel value of each white pixel point to 1 and the pixel value of each black pixel point to 0; set the number of thinning iterations to u; initialize m = 0;
Step A2: copy the image to be thinned to generate a first temporary image; select a pixel point with pixel value 1 from the first temporary image and denote it as pi;
Step A3: extract the square neighborhood points around point pi; let pi1~pi8 be the eight square neighborhood points of point pi, where pi1 is located at the upper left of point pi and pi1~pi8 are adjacent in sequence in the clockwise direction; the pixel values of pi1~pi8 correspond in sequence to si1~si8, each of si1~si8 taking the value 1 or 0;
Step A4: scan the first temporary image; if the pixel values of the neighborhood points of a point pi simultaneously satisfy the following three conditions, delete point pi from the image to be thinned, otherwise keep point pi:
(1) 2 ≤ si1 + si2 + si3 + si4 + si5 + si6 + si7 + si8 ≤ 6;
(2) when the values of si1~si8 are arranged in sequence, the "0, 1" pattern appears at most once between adjacent pixel values in the arrangement;
(3) si2·si4·si6 = 0 and si4·si6·si8 = 0 are satisfied simultaneously;
Step A5: judge whether all pixel points with pixel value 1 in the first temporary image have been traversed; if not, select a not-yet-selected pixel point with pixel value 1 from the first temporary image, denote it as pi, and return to Step A3; if all have been traversed, go to Step A6;
Step A6: copy the image to be thinned processed in the previous step to generate a second temporary image; select a pixel point with pixel value 1 from the second temporary image and denote it as pk;
Step A7: extract the square neighborhood points around point pk; let pk1~pk8 be the eight square neighborhood points of point pk, where pk1 is located at the upper left of point pk and pk1~pk8 are adjacent in sequence in the clockwise direction; the pixel values of pk1~pk8 correspond in sequence to sk1~sk8, each of sk1~sk8 taking the value 1 or 0;
Step A8: scan the second temporary image; if the pixel values of the neighborhood points of a point pk simultaneously satisfy the following three conditions, delete point pk from the image to be thinned, otherwise keep point pk:
(1) 2 ≤ sk1 + sk2 + sk3 + sk4 + sk5 + sk6 + sk7 + sk8 ≤ 6;
(2) when the values of sk1~sk8 are arranged in sequence, the "0, 1" pattern appears at most once between adjacent pixel values in the arrangement;
(3) sk2·sk4·sk8 = 0 and sk2·sk6·sk8 = 0 are satisfied simultaneously;
Step A9: judge whether all pixel points with pixel value 1 in the second temporary image have been traversed; if not, select a not-yet-selected pixel point with pixel value 1 from the second temporary image, denote it as pk, and return to Step A7; if all have been traversed, let m = m + 1 and go to Step A10;
Step A10: judge whether m is smaller than u; if so, return to Step A2; if not, the processed image to be thinned is the final light strip center skeleton image.
3. The method of claim 2, wherein the number of thinning iterations u is 10 to 20.
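Steps A1–A10 follow the classical Zhang–Suen two-subiteration thinning scheme. The sketch below is an illustrative NumPy rendering under that reading: the neighbor ordering (s1 at the upper left, clockwise) follows step A3, while the helper names and the choice to skip border pixels are assumptions for illustration.

```python
import numpy as np

def neighbors(img, r, c):
    """Eight square neighborhood values s1..s8, clockwise from upper left (step A3)."""
    return [img[r-1, c-1], img[r-1, c], img[r-1, c+1], img[r, c+1],
            img[r+1, c+1], img[r+1, c], img[r+1, c-1], img[r, c-1]]

def thin(img, u):
    """Steps A1-A10: u thinning iterations, each with two sub-iterations (A2-A5, A6-A9)."""
    img = img.copy()
    for _ in range(u):                              # step A10 loop counter m < u
        for phase in (0, 1):                        # first / second temporary image
            tmp = img.copy()
            to_delete = []
            rows, cols = tmp.shape
            for r in range(1, rows - 1):            # border pixels skipped (assumption)
                for c in range(1, cols - 1):
                    if tmp[r, c] != 1:
                        continue
                    s = neighbors(tmp, r, c)
                    if not (2 <= sum(s) <= 6):      # condition (1)
                        continue
                    trans = sum(s[i] == 0 and s[(i + 1) % 8] == 1 for i in range(8))
                    if trans > 1:                   # condition (2): "0,1" pattern at most once
                        continue
                    s1, s2, s3, s4, s5, s6, s7, s8 = s
                    if phase == 0:                  # condition (3) of step A4
                        ok = s2 * s4 * s6 == 0 and s4 * s6 * s8 == 0
                    else:                           # condition (3) of step A8
                        ok = s2 * s4 * s8 == 0 and s2 * s6 * s8 == 0
                    if ok:
                        to_delete.append((r, c))
            for r, c in to_delete:                  # delete from the image to be thinned
                img[r, c] = 0
    return img
```

Deletions within one sub-iteration are decided against the frozen temporary image, which is why each phase copies the current image first.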
4. The method for extracting the light strip center for line structured light non-contact measurement as claimed in claim 1, wherein the method of obtaining the unit normal vector of the kth center point perpendicular to the fitted line segment by using the principal component analysis algorithm comprises the following steps:
Step B1: establish a function Q(n) as follows; when Q(n) takes its minimum value, n is the unit normal vector of the kth center point perpendicular to the line segment:
Q(n) = \min_n \sum_{i=1}^{m} \left( (x_i - c)^T n \right)^2
wherein x_i is the coordinate of the ith pixel point in the neighborhood of the kth center point, i = 1 … m, m is the number of pixel points in the neighborhood of the kth center point, and c is the coordinate of the kth center point on the coarsely extracted light strip center line;
Step B2: decentralize the data by setting y_i = x_i - c, so that the Q(n) function reduces to:
Q(n) = \min_n \sum_{i=1}^{m} \left( y_i^T n \right)^2
wherein y_i is the decentralized coordinate of the ith pixel point in the neighborhood of the kth center point, i = 1 … m;
Step B3: let the coordinates of y_i be (a_i, b_i), and let S be an intermediate matrix expressed as follows:
S = \sum_{i=1}^{m} \begin{bmatrix} a_i^2 & a_i b_i \\ a_i b_i & b_i^2 \end{bmatrix}
the Q(n) function is then further simplified to:
Q(n) = \min_n \; n^T S n
wherein the constraint of the equation is n^T n = 1; according to the Lagrange multiplier method, solving for the unit normal vector n amounts to performing an eigendecomposition of the matrix S and taking the eigenvector corresponding to the smallest eigenvalue as the solved unit normal vector n.
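Under the formulation of steps B1–B3, the unit normal is the eigenvector of the 2×2 scatter matrix S with the smallest eigenvalue. A minimal NumPy sketch follows; the function name and interface are assumptions for illustration.

```python
import numpy as np

def unit_normal(points, center):
    """Steps B1-B3: unit normal of the locally fitted line via PCA.

    points: (m, 2) array of neighborhood pixel coordinates x_i;
    center: coordinate c of the kth center point on the coarse center line.
    """
    y = np.asarray(points, float) - np.asarray(center, float)  # decentralize: y_i = x_i - c
    S = y.T @ y                            # S = sum_i y_i y_i^T  (2x2 scatter matrix)
    eigvals, eigvecs = np.linalg.eigh(S)   # eigh returns ascending eigenvalues, unit eigenvectors
    return eigvecs[:, 0]                   # eigenvector of the smallest eigenvalue
```

For points lying on a horizontal line the returned normal is vertical (up to sign), as expected.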
5. The method for extracting the light strip center for line structured light non-contact measurement as claimed in claim 1, wherein the method of performing Gaussian fitting on the gray values of the pixel points is as follows:
let f(x) be a Gaussian function, whose expression is:
f(x) = A \exp\left( -\frac{(x - \mu)^2}{2\sigma^2} \right)
wherein A is the height of the Gaussian curve, \mu is the coordinate of the Gaussian function peak center, and \sigma is the standard deviation; taking the logarithm of both sides converts the Gaussian function into a quadratic curve equation, expressed by F(x):
F(x) = \ln f(x) = \ln A - \frac{(x - \mu)^2}{2\sigma^2}
F(x) = a_0 + a_1 x + a_2 x^2
wherein a_0, a_1, a_2 are the coefficients of the quadratic curve equation,
a_0 = \ln A - \frac{\mu^2}{2\sigma^2}, \quad a_1 = \frac{\mu}{\sigma^2}, \quad a_2 = -\frac{1}{2\sigma^2};
z coordinate points in the neighborhood of the kth center point along the direction of its unit normal vector n are extracted as a data set, denoted M_j, j = 1 … z; substituting them into the equation F(x) yields a_0, a_1, a_2, and thereby the fitted Gaussian function.
6. The method of claim 5, wherein the expected value of the fitted Gaussian function is taken as the final coordinate of the light strip center point as follows:
let (x_j, g_j) be the coordinate and gray value of data point M_j; form the residual between \ln g_j and the corresponding value given by the fitted quadratic F(x_j) at each point, and take the sum of squared residuals:
D = \sum_{j=1}^{z} \left( a_0 + a_1 x_j + a_2 x_j^2 - \ln g_j \right)^2
wherein D is the sum of squared residuals and z is the number of data points in the data set;
take the partial derivatives of D with respect to a_0, a_1, a_2 respectively and set them to zero, i.e.
\frac{\partial D}{\partial a_0} = 0, \quad \frac{\partial D}{\partial a_1} = 0, \quad \frac{\partial D}{\partial a_2} = 0;
rearranging the terms and isolating a_0, a_1, a_2 gives the normal equations:
\begin{bmatrix} z & \sum_j x_j & \sum_j x_j^2 \\ \sum_j x_j & \sum_j x_j^2 & \sum_j x_j^3 \\ \sum_j x_j^2 & \sum_j x_j^3 & \sum_j x_j^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum_j \ln g_j \\ \sum_j x_j \ln g_j \\ \sum_j x_j^2 \ln g_j \end{bmatrix}
finally, the equations are solved for [a_0\ a_1\ a_2]^T by the Householder transformation, and the coordinate of the Gaussian function peak center is obtained as
\mu = -\frac{a_1}{2 a_2}
and the corresponding coordinate in the image is the sub-pixel position of the light strip center.
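Claims 5 and 6 amount to a linear least-squares fit of a parabola to the logarithms of the gray values, with the peak center recovered as mu = -a1/(2*a2). The sketch below is illustrative only: np.linalg.lstsq stands in for the explicit Householder-transformation solve of the normal equations, and the function name is an assumption.

```python
import numpy as np

def gaussian_peak(xs, grays):
    """Fit ln(gray) = a0 + a1*x + a2*x^2 and return the peak center mu = -a1/(2*a2).

    xs: sample positions along the unit normal; grays: gray values (must be > 0).
    """
    xs = np.asarray(xs, float)
    logs = np.log(np.asarray(grays, float))        # logarithm of the gray values
    # design matrix of the quadratic; lstsq stands in for the Householder solve
    X = np.stack([np.ones_like(xs), xs, xs**2], axis=1)
    (a0, a1, a2), *_ = np.linalg.lstsq(X, logs, rcond=None)
    return -a1 / (2.0 * a2)                        # coordinate of the Gaussian peak center
```

On noise-free Gaussian samples the fit recovers the true peak center to machine precision.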
7. The method as claimed in claim 1, wherein the contrast and brightness of the acquired image are adjusted before the region-of-interest extraction is performed on the acquired image.
CN202010999750.XA 2020-09-22 2020-09-22 Light strip center extraction method for line structured light non-contact measurement Active CN112241964B (en)


Publications (2)

Publication Number Publication Date
CN112241964A CN112241964A (en) 2021-01-19
CN112241964B true CN112241964B (en) 2022-12-27

Family

ID=74171646


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112884750B (en) * 2021-03-04 2022-03-25 湖州点彩智能科技有限公司 GPU-based plain color fabric crease extraction method
CN113324478A (en) * 2021-06-11 2021-08-31 重庆理工大学 Center extraction method of line structured light and three-dimensional measurement method of forge piece
CN114548189B (en) * 2022-04-25 2022-08-09 可孚医疗科技股份有限公司 Method and device for detecting prothrombin time and computer readable storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101804907A (en) * 2010-03-17 2010-08-18 燕山大学 Machine vision belt tearing detecting and protecting device
CN101986143A (en) * 2010-03-17 2011-03-16 燕山大学 Machine vision belt tear detection and protective device
CN103234475A (en) * 2012-11-27 2013-08-07 深圳华用科技有限公司 Sub-pixel surface morphology detecting method based on laser triangular measuring method
CN105005981A (en) * 2014-04-18 2015-10-28 北京航空航天大学 Light stripe center extraction method and apparatus based on multiple dimensions
CN109035213A (en) * 2018-07-05 2018-12-18 大连理工大学 Optical losses sub-pixel extraction based on striation section Energy distribution uniqueness
CN110443846A (en) * 2019-07-02 2019-11-12 苏州全视智能光电有限公司 A method of a cloud is quickly generated based on direction template high-precision

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Research on 3d measurement model by line structure light vision";Siyuan Liu;《EURASIP Journal on Image and Video Processing》;20180918;1-10页 *
"一种改进的高斯拟合法在光带中心提取中的应用";孙盼庆等;《电子设计工程》;20120731;179-181页 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant