CN110966937B - Large member three-dimensional configuration splicing method based on laser vision sensing - Google Patents

Large member three-dimensional configuration splicing method based on laser vision sensing

Info

Publication number
CN110966937B
Authority
CN
China
Prior art keywords
splicing
laser
profile
laser vision
filtering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911313082.4A
Other languages
Chinese (zh)
Other versions
CN110966937A (en)
Inventor
雷正龙
吴世博
王志敏
孙璐璐
郭亨通
黎炳蔚
陈彦宾
王智远
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN201911313082.4A priority Critical patent/CN110966937B/en
Publication of CN110966937A publication Critical patent/CN110966937A/en
Application granted granted Critical
Publication of CN110966937B publication Critical patent/CN110966937B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254Projection of a pattern, viewing through a pattern, e.g. moiré
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/26Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G01B11/27Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes for testing the alignment of axes
    • G01B11/272Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes for testing the alignment of axes using photoelectric detection means

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A large member three-dimensional configuration splicing method based on laser vision sensing relates to the field of image measurement and data processing, and aims to solve the problems of heavy workload, low efficiency, poor anti-interference capability, and the difficulty of online measurement in straightness measurement of large complex molded surfaces. The laser vision measuring method is based on the idea of the pitch method and measures the straightness of large complex molded surfaces by exploiting the short-range accuracy, stability, and real-time performance of the laser vision sensor. Because the profile of a large complex molded surface is large, it is divided into several partially overlapping sections, each as long as a laser stripe; each section is measured separately, data processing then splices and reconstructs the straightness curve, and the straightness of the profile is calculated from the reconstructed curve. The invention is applied to the field of image measurement.

Description

Large member three-dimensional configuration splicing method based on laser vision sensing
Technical Field
The invention relates to the field of image measurement and data processing, in particular to a three-dimensional configuration splicing method for a large member based on laser vision sensing.
Background
With the rapid development of intelligent manufacturing, the requirements of people on the accuracy, the measuring speed, the non-contact property and the like of three-dimensional measuring technology and equipment are continuously improved. The traditional measuring method needs a large amount of manual work and is difficult to meet the requirements of automatic and on-line measurement. For objects with complex shapes and large sizes of molded surfaces, the operability of the traditional measuring method is obviously reduced, and particularly the difficulty of online real-time measurement of large complex molded surfaces is extremely high.
For the forming precision of large-size profiles, accurate measurement of straightness is the most basic requirement. Straightness measurement methods commonly used in engineering practice fall into direct methods and indirect methods. A direct method selects a measuring reference line and obtains the straightness value by measuring the deviation of each point on the measured line relative to that baseline; examples are the light gap method and the dial indicator (tabulation) method. An indirect method adopts the idea of the pitch method: the measured line is divided into several pitches, a small-angle instrument measures the relative value of each section, and the straightness error is obtained through data processing; examples are the level method and the collimator method. The various straightness measurement methods are implemented as follows:
(1) the light gap method adopts the knife edge angle square to contact with the measured straight line or plane, and judges the error of the straightness according to the size of the light transmission gap.
(2) The dial indicator on the precision platform is contacted with the surface of the measured object, and the linearity error of the measured straight line relative to the platform is calculated by reading the dial indicator through continuously or discontinuously moving the gauge stand.
(3) The gradienter method uses a natural horizontal plane as a measuring datum line, and determines the slope change of the position of a measured point of a measured straight line according to the number of grids of bubble deviation.
(4) The collimator method utilizes the characteristics of high laser brightness and good directivity, takes the laser beam as a measuring reference, and determines the relative straightness deviation of each measuring point according to the beam offset.
The light gap method is simple to operate and fairly accurate, but it is easily influenced by illumination conditions and by the cross-section shape, width, and roughness of the contact surface, and it is only suitable for measuring the straightness errors of small planes, cylindrical surfaces, and conical surfaces. The dial indicator method has lower measurement efficiency, and the measurable size is restricted by the size of the platform. The level method is simple to operate, accurate, and stable, but its measurement process is inefficient, sensitive to temperature, and requires complex data processing. The collimator method offers long measurement distance and high accuracy, but its accuracy is easily affected by temperature, airflow, and vibration.
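The level and collimator methods above share the pitch method's data processing: per-pitch slope readings are accumulated into a height profile, a reference line is removed, and straightness is the peak-to-valley residual. A minimal sketch under assumed values (the readings, pitch length, and variable names are hypothetical, not measurement data from this document):

```python
import numpy as np

# Pitch method: the measured line is divided into pitches of length pitch_mm;
# a small-angle instrument yields one slope reading per pitch.
pitch_mm = 100.0
slopes_urad = np.array([12.0, 8.0, -5.0, -9.0, 3.0, 7.0])  # hypothetical readings

# Accumulate slope * pitch into relative heights (micrometres). This cumulative
# sum is exactly where measurement uncertainty accumulates pitch by pitch.
heights_um = np.concatenate(
    ([0.0], np.cumsum(slopes_urad * 1e-6 * pitch_mm * 1e3)))

# Remove the least-squares reference line; straightness error is peak-to-valley.
x_mm = np.arange(heights_um.size) * pitch_mm
k, b = np.polyfit(x_mm, heights_um, 1)
residual_um = heights_um - (k * x_mm + b)
straightness_um = residual_um.max() - residual_um.min()
```

The same detrend-and-peak-to-valley step applies to any straightness curve, including the spliced curve produced later in this document.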
For measuring the straightness of large-size complex profiles, a level meter and a collimator are often adopted. The straightness measuring methods of the level and the collimator are based on a pitch method, the accumulated error of the method is large, and the measurement uncertainty is accumulated successively. For large-size complex profiles, the larger the size is, the larger the accumulated error of straightness measurement is, and the measurement uncertainty is increased. It is noted that as the size of the target object increases, the attenuation of the light intensity of the collimator is more severe, and the uncertainty of the reading error observed by the human eye also increases. For a measuring instrument adopting a laser collimator design, the energy of laser light scattered, reflected and absorbed cannot be ignored due to the increase of the target size. The attenuation of laser light intensity causes the deviation of laser directivity, and measured data is unstable, and error deviation direction and deviation degree can not be measured accurately.
In recent years, three-dimensional measurement methods based on laser vision are developed rapidly, and laser vision sensors have the characteristics of high measurement precision, high measurement speed, strong anti-interference capability, easiness in combination with a motion system and the like, and are ideal choices for realizing linearity measurement of large-scale complex molded surfaces. Meanwhile, compared with other optical measurement methods, the laser vision measurement method has the advantages of real-time performance, accuracy, small accumulated error and the like, meets the requirements of high precision, high stability and real-time performance in industrial production, and has high engineering application value.
The invention patent with application No. 200910069220.9 proposes a method for measuring the diameter and straightness parameters of a seamless circular steel pipe, but the method requires calibrating a multi-line structured light sensor and determining multiple spatial section centres, making the processing complex. The invention patent with application No. 201310447721.2 provides an online non-contact measurement method for the straightness of shaft parts, which involves calibrating the CCD camera's complex intrinsic parameters, distortion coefficients, and structured-light plane parameters, and its measurement length is only 200 mm. The invention patent with application No. 201510209414.X proposes a method and a device for detecting single-side characteristics of a special-shaped guide rail based on laser line structured light; it can measure single-side straightness, but the measurement width is limited, the device has few degrees of freedom, and the measurement object is single.
Disclosure of Invention
The invention aims to solve the problems of heavy workload, low efficiency, poor anti-interference capability, and the difficulty of online measurement in straightness measurement of large complex molded surfaces, and provides a novel method: a large member three-dimensional configuration splicing method based on laser vision sensing. The laser vision measuring method is based on the idea of the pitch method and measures the straightness of large complex molded surfaces by exploiting the short-range accuracy, stability, and real-time performance of the laser vision sensor. Because the profile of a large complex molded surface is large, it is divided into several partially overlapping sections, each as long as a laser stripe; each section is measured separately, data processing then splices and reconstructs the straightness curve, and the straightness of the profile is calculated from the reconstructed curve.
The invention relates to a large member three-dimensional configuration splicing method based on laser vision sensing, which is carried out according to the following steps:
the method comprises the following steps: before measuring the profile of the molded surface, calibrating a laser vision sensor, and restoring a two-dimensional pixel image of a computer to earth coordinates;
step two: the laser vision sensor and the motion system are rigidly fixed, so that the sensor is always in a common working interval in the motion process;
step three: establishing communication between a laser vision sensor and a PC controller, setting sensor parameters and setting motion system parameters;
step four: scanning a profile image of a workpiece to be detected by an image scanning unit, obtaining an array of 512 pixel points by scanning each time, wherein the range of the horizontal coordinate is 0-511, the range of the vertical coordinate is 0-1023, and the point arrays are sequentially arranged according to the horizontal coordinate to obtain a profile two-dimensional image obtained by scanning;
step five: filtering the profile image acquired by the image acquisition card;
step six: splicing the laser stripe overlapping sections after filtering treatment by adopting a least square method, and specifically comprising the following implementation steps:
(1) selecting a splicing coincidence rate λ in the range 0–20%, and determining from λ the abscissa range I of the spliced section;
(2) calculating the squared loss function Δy′_i according to formula (1):
Δy′_i = Σ_{x∈I} (y′_{i,x} − y′_{i+1,x})² = Σ_{x∈I} (Δk_i·x + Δb_i)²   (1)
where y′_{i,x} and y′_{i+1,x} are the ordinates of the coincident points of the i-th and (i+1)-th section profiles, and Δk_i and Δb_i are the rotation coefficient and translation coefficient, respectively;
Δy′_i is treated as a function and solved by least squares: setting
∂Δy′_i/∂Δk_i = 0
and
∂Δy′_i/∂Δb_i = 0
makes Δy′_i minimal, and the rotation coefficient Δk_i and translation coefficient Δb_i of the i-th splice are then obtained from formula (1);
(3) adjusting the (i+1)-th section profile with the rotation coefficient Δk_i and translation coefficient Δb_i: taking the previous laser stripe as the reference, the following laser stripe is translated and rotated according to Δk_i and Δb_i and spliced onto the previous stripe;
(4) and (5) repeating the step (3) until the splicing is finished, and finishing the algorithm.
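The six steps above can be sketched in Python. This is a minimal illustration, not the patented implementation: `stitch_stripes` is a hypothetical helper, stripes are synthetic one-dimensional height arrays, and `np.polyfit` of degree 1 performs the least-squares fit that yields the rotation coefficient Δk_i and translation coefficient Δb_i at each joint:

```python
import numpy as np

def stitch_stripes(stripes, overlap_pts):
    """Splice 1-D laser stripe arrays whose neighbours share overlap_pts
    points. For each joint, the overlap difference is fitted by least
    squares as dk*x + db, and the following stripe is rotated (slope
    correction) and translated onto the previous one."""
    merged = np.asarray(stripes[0], dtype=float)
    for nxt in stripes[1:]:
        nxt = np.asarray(nxt, dtype=float)
        x = np.arange(overlap_pts, dtype=float)
        diff = merged[-overlap_pts:] - nxt[:overlap_pts]  # previous minus next
        dk, db = np.polyfit(x, diff, 1)                   # rotation, translation
        x_full = np.arange(nxt.size, dtype=float)
        adjusted = nxt + dk * x_full + db                 # rotate + translate
        merged = np.concatenate([merged, adjusted[overlap_pts:]])
    return merged
```

When a following stripe deviates from the true profile only by a linear term in its own coordinate, the degree-1 fit recovers that term exactly and the splice is seamless.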
The straightness splicing measurement method based on laser vision comprises the following steps:
(1) calibrating parameters of a laser vision sensor;
(2) the laser vision sensor is integrated with the motion system;
(3) carrying out laser stripe data filtering processing;
(4) and (5) splicing the straightness curves.
The hardware of the laser vision sensor system consists of a laser, a laser power supply, an image acquisition card, a CCD camera power supply, a controller, and so on. The invention is suitable for linear laser measuring equipment such as scanning optical vision sensors and structured-light vision sensors. The working principle of the sensor system is as follows: before use, the system is initialized, communication between the controller and the sensor is established, parameters such as laser power and stripe length are set, and the sensor is started. When a profile is measured, the laser beam in the sensor is projected onto the profile under test, and the CCD module receives the laser stripe information and outputs the surface height information of the measured profile. Each acquisition of laser stripe data is a one-dimensional array of 512 values; the abscissa range is 0–511, the value of each element is the ordinate, in the range 0–1023, and both coordinates are in pixels. The stripe information can be displayed on screen in real time, and the deformation of the stripes reflects the real-time fluctuation of the molded surface. The implementation flow of the system software is shown in fig. 1; the main steps include sensor initialization, switching the scanning light on and off, setting the laser stripe length, real-time display of laser stripes, real-time acquisition to memory, and switching the laser off.
Visual sensing converts the real three-dimensional contour information of a workpiece into a computer two-dimensional pixel image, and coordinate calibration restores the computer two-dimensional pixel image to earth coordinates. The conversion calibration between the sensor internal coordinate system and the earth coordinate system comprises calibration in the height and width directions. The laser vision sensor is calibrated by a field calibration method. The field calibration method does not establish a mathematical model of the camera and sensing system and solve for model parameters; instead, the correspondence between computer two-dimensional pixel coordinate points and earth coordinate points in the measuring plane is measured directly by experiment, and after mathematical fitting an index table linking the two kinds of coordinate points is established. During actual calibration, a two-dimensional pixel coordinate value is input and the corresponding earth coordinate value is looked up in the index table, completing the calibration process. The field calibration schematic of the laser vision sensor is shown in fig. 2: a 20 mm × 20 mm × 100 mm sample is fixed on a motor base, the common working range of the sensor is 130 mm–210 mm, and calibration is carried out in this range at 4 mm per step. Here Z_e and X_e are respectively the abscissa and ordinate values of the earth coordinate system, and Z_s and X_s are respectively the abscissa and ordinate values of the sensor coordinate system.
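The index-table lookup described for field calibration can be sketched as follows. All numeric pairs are illustrative only (not the patent's calibration data), and `pixel_to_earth` is an assumed helper name; the idea is to fit once, tabulate for every pixel, and reduce runtime calibration to a single array lookup:

```python
import numpy as np

# Hypothetical calibration pairs: sensor pixel coordinate Zs against the
# earth coordinate Ze (mm) measured while stepping a sample through the
# working range (the document steps 4 mm at a time over 130-210 mm).
zs_px = np.array([100.0, 300.0, 500.0, 700.0, 900.0])
ze_mm = np.array([210.0, 190.0, 170.0, 150.0, 130.0])  # illustrative values

# Fit once, then tabulate: the index table maps every vertical pixel
# coordinate 0-1023 to an earth coordinate.
coeffs = np.polyfit(zs_px, ze_mm, 3)
index_table = np.polyval(coeffs, np.arange(1024))

def pixel_to_earth(z_pixel):
    """Restore a vertical pixel coordinate (0-1023) to an earth coordinate."""
    return index_table[int(z_pixel)]
```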
The main aim of the filtering algorithm is to remove random noise points from the acquired data, improving its validity and reliability, and to provide high-quality, high-accuracy data for subsequent image processing operations. Filtering must accurately remove noise points while preserving as much detail of the original data as possible, avoiding image distortion and large errors. Many filtering algorithms exist; common ones are mean filtering, median filtering, Hamming filtering, quadratic filtering, and bell-shaped filtering. Because the profile contour is measured with a laser vision sensor, the filtered image requires no morphological dilation or erosion during image processing, and there is no edge extraction or centre-line extraction, so the contour measurement process is simple, fast, and easy to carry out.
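A median filter, one of the options named above, makes a compact illustration of the filtering step: isolated random noise points are replaced by the local median while genuine profile steps are largely preserved. The sketch below is a minimal assumed implementation (`median_filter_1d` is not from the document):

```python
import numpy as np

def median_filter_1d(profile, window=5):
    """Median-filter a 1-D stripe array (e.g. one 512-point acquisition).
    Edge samples are handled by replicating the boundary values."""
    half = window // 2
    padded = np.pad(np.asarray(profile, dtype=float), half, mode="edge")
    return np.array([np.median(padded[i:i + window])
                     for i in range(len(profile))])
```

For production use, `scipy.ndimage.median_filter` provides the same operation in optimized form.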
Because the length of the laser stripe projected to the target profile by the laser vision sensor is limited, the filtered stripe needs to be spliced to obtain a straightness curve. The laser vision sensor and the motion system are combined to realize segmented measurement, and a straightness splicing measurement schematic diagram based on laser vision is shown in fig. 3.
In the process of measuring the straightness, the straightness splicing measuring method based on the laser vision is adopted, and compared with other straightness measuring methods, the straightness splicing measuring method has the following advantages:
(1) the laser vision measurement process is non-contact, with good accessibility and strong anti-interference capability; the measurement accuracy is not affected by the cross-section shape or surface roughness of the measured object, nor by ambient temperature, humidity, or illumination conditions; and the measurement uncertainty does not increase with the size of the measured object;
(2) the measurement process is simple to operate, high in speed and high in efficiency, is easy to combine with a motion system, and is suitable for measuring large-scale complex molded surfaces;
(3) compared with other laser vision measuring methods, the measuring size of the method is not limited by the length of the laser striation, and the straightness of the large-size molded surface can be measured through multiple splicing;
(4) fitting the overlapped sections by adopting a least square method, wherein after fitting, the straightness accuracy splicing error is small, the splicing accuracy is high, and more profile size accuracy information can be extracted through subsequent modeling;
(5) by determining the proper splicing coincidence rate and considering single splicing error and splicing efficiency, the splicing accuracy can be ensured and the accumulated error can be reduced.
Drawings
FIG. 1 is a flow chart of a laser vision sensing system;
FIG. 2 is a schematic diagram of field calibration of a laser vision sensor;
FIG. 3 is a schematic diagram of laser vision based straightness stitching measurement;
FIG. 4 is a straightness stitching schematic;
FIG. 5 is a graph of the stitching effect of scanned images with different coincidence rates; panel a shows the 0% coincidence rate scanned image, panel b 4%, panel c 8%, panel d 12%, panel e 16%, and panel f 20%; in panels a to f, A is contour 1, B is slope 1, C is contour 2, and D is slope 2.
Detailed Description
The first embodiment is as follows: the three-dimensional configuration splicing method of the large member based on the laser vision sensing is carried out according to the following steps:
the method comprises the following steps: before measuring the profile of the molded surface, calibrating a laser vision sensor, and restoring a two-dimensional pixel image of a computer to earth coordinates;
step two: the laser vision sensor and the motion system are rigidly fixed, so that the sensor is always in a common working interval in the motion process;
step three: establishing communication between a laser vision sensor and a PC controller, setting sensor parameters and setting motion system parameters;
step four: scanning a profile image of a workpiece to be detected by an image scanning unit, obtaining an array of 512 pixel points by scanning each time, wherein the range of the horizontal coordinate is 0-511, the range of the vertical coordinate is 0-1023, and the point arrays are sequentially arranged according to the horizontal coordinate to obtain a profile two-dimensional image obtained by scanning;
step five: filtering the profile image acquired by the image acquisition card;
step six: splicing the laser stripe overlapping sections after filtering treatment by adopting a least square method, and specifically comprising the following implementation steps:
(1) selecting a splicing coincidence rate λ in the range 0–20%, and determining from λ the abscissa range I of the spliced section;
(2) calculating the squared loss function Δy′_i according to formula (1):
Δy′_i = Σ_{x∈I} (y′_{i,x} − y′_{i+1,x})² = Σ_{x∈I} (Δk_i·x + Δb_i)²   (1)
where y′_{i,x} and y′_{i+1,x} are the ordinates of the coincident points of the i-th and (i+1)-th section profiles, and Δk_i and Δb_i are the rotation coefficient and translation coefficient, respectively;
Δy′_i is treated as a function and solved by least squares: setting
∂Δy′_i/∂Δk_i = 0
and
∂Δy′_i/∂Δb_i = 0
makes Δy′_i minimal, and the rotation coefficient Δk_i and translation coefficient Δb_i of the i-th splice are then obtained from formula (1);
(3) adjusting the (i+1)-th section profile with the rotation coefficient Δk_i and translation coefficient Δb_i: taking the previous laser stripe as the reference, the following laser stripe is translated and rotated according to Δk_i and Δb_i and spliced onto the previous stripe;
(4) and (5) repeating the step (3) until the splicing is finished, and finishing the algorithm.
In step six of the present embodiment, the laser stripes must be spliced. In theory, as long as two smooth curves are continuous and differentiable at the coincident point, their smooth splicing is guaranteed, so under ideal conditions a single coincident point satisfies the splicing requirement of two smooth curves. In practice, to improve splicing accuracy, multiple points on an overlapped section are usually selected for splicing. Taking the splicing of two smooth curves as an example, a mathematical model of the straightness splicing algorithm is established; the splicing principle is shown in fig. 4.
Contour 1:
y′_1 = y_1 + k_1·x + b_1   (1)
Contour 2:
y′_2 = y_2 + k_2·x + b_2   (2)
In the formulas, y′_1 and y′_2 are measured values, y_1 and y_2 are the profile straightness error curves, k_1, k_2, b_1, and b_2 are splicing coefficients, and x is the abscissa of each measurement point during the measurement of each section profile.
In the overlapped section the measurement references are inconsistent, so the measured values differ at the overlap measurement points, while the actual profile of the guide rail is unchanged; hence the following relation holds:
y_1 = y_2 (x_2 ≤ x ≤ x_3)
Subtracting equation (2) from equation (1) yields:
Δy_m = Δk·x + Δb (x_2 ≤ x ≤ x_3)   (3)
In the formula, Δk and Δb are splicing error coefficients, with Δk = k_1 − k_2 and Δb = b_1 − b_2. Δk and Δb are solved by the least square method; then, taking either measured curve as the reference, the other can be unified to the same reference line, realizing the splicing of contour 1 and contour 2.
The second embodiment is as follows: this embodiment differs from the first embodiment in that the filtering algorithm is mean filtering, median filtering, Hamming filtering, quadratic filtering, or bell-shaped filtering.
The rest is the same as the first embodiment.
The third concrete implementation mode: this embodiment differs from the first embodiment in that the laser vision sensor is a linear laser sensor. The rest is the same as the first embodiment.
The fourth concrete implementation mode: this embodiment differs from the first embodiment in that the linear laser sensor is a scanning laser vision sensor or a structured light vision sensor. The rest is the same as the first embodiment.
The fifth concrete implementation mode: this embodiment differs from the first embodiment in that the motion system is a coordinate robot system, a multi-joint robot system, or a CNC system. The rest is the same as the first embodiment.
The beneficial effects of the present invention are demonstrated by the following examples:
example 1
In this embodiment, the scanning laser vision sensor is used to measure the straightness of the profile, and the number of splicing times is one. The wavelength of a laser light source of the scanning type laser vision sensor is 650-699 nm, the laser power is 2-30 mW, the common working range of the sensor is 130-210 mm, the length of a laser stripe is 0-80 mm, and the sampling frequency is 4 frames/second, namely 4 groups of 512-point arrays are obtained every second.
The scanning laser vision sensor is adopted to collect the flat plate profile, and the specific experimental method is as follows:
the method comprises the following steps: before measuring the profile of the molded surface, the laser vision sensor is calibrated, and the two-dimensional pixel image of the computer is restored to the earth coordinate. The calibration result of the laser vision sensor is Ze=-2×10-07Zs 3+0.0006Zs 2-0.6699Zs+419.45,Xe/Xs=-0.1397Zs+252.66;
Step two: the laser vision sensor and the motion system are rigidly fixed, so that the sensor is always in a common working interval of 130-210 mm in the motion process;
step three: setting an IP address of a PC controller, establishing communication between a laser vision sensor and the PC controller, setting a scanning offset of the sensor to be-2000, a scanning range value to be 0-500 and a sensor visual field to be 0-80 mm;
step four: scanning a profile image of a workpiece to be measured by an image scanning unit, obtaining a group of 512 point arrays by each scanning, wherein the range of the horizontal coordinate is 0-511, the range of the vertical coordinate is 0-1023, and the point arrays are sequentially arranged according to the horizontal coordinate to obtain a profile two-dimensional image obtained by the scanning;
step five: filtering the profile image acquired by the image acquisition card, wherein the filtering algorithm can adopt common algorithms such as mean filtering, median filtering, Hamming filtering, secondary filtering, bell-shaped filtering and the like and dynamic self-adaptive threshold filtering forms of various algorithms;
step six: and splicing the laser stripes after filtering, fitting the contour of the coincidence section by adopting a least square method with the former laser stripe as a reference, performing translation and rotation processing on the latter laser stripe according to the fitting coefficient, and splicing the later laser stripe with the former laser stripe.
In step six of this embodiment, the laser stripes must be spliced. In theory, as long as two smooth curves are continuous and differentiable at the coincident point, their smooth splicing is guaranteed, so under ideal conditions a single coincident point satisfies the splicing requirement of two smooth curves. In practice, to improve splicing accuracy, multiple points on an overlapped section are usually selected for splicing. Taking the splicing of two smooth curves as an example, a mathematical model of the straightness splicing algorithm is established; the splicing principle is shown in fig. 4.
Contour 1:
Figure BDA0002325053900000081
contour 2:
Figure BDA0002325053900000082
in the formula:
Figure BDA0002325053900000083
as a measured value, y1、y2Is a profile straightness error profile, k1、k2、b1、b2For the stitching factor, x is the abscissa of each measurement point during the measurement of the profile of each segment.
In the overlapping section, the measurement references are inconsistent, so the measured values differ at the overlap measurement points; the actual profile of the guide rail, however, is unchanged, so the following relations hold:

Δym = y′1 − y′2,  y1 = y2

Subtracting equation (2) from equation (1) then yields:

Δym = Δk·x + Δb  (x2 ≤ x ≤ x3)  (3)

In the formula, Δk and Δb are the splicing error coefficients, with Δk = k1 − k2 and Δb = b1 − b2. Δk and Δb can be obtained by the least-squares method; taking either measured curve as the reference, the other curve can then be unified to the same reference line, and contour 1 and contour 2 are thereby spliced.
Specifically, the filtered laser stripes are spliced in step six using a straightness splicing algorithm. The laser-vision-based straightness splicing algorithm is implemented in the following steps:
(1) selecting a splicing coincidence rate λ, where λ = 0-20%, and determining from λ the abscissa range I of the spliced section;
(2) calculating the squared loss function Δy′i according to formula (1):

Δy′i = Σx∈I (y′i,x − y′i+1,x)² = Σx∈I (Δki·x + Δbi)²  (1)

where y′i,x and y′i+1,x are the coordinates of the coincident points of the i-th and (i+1)-th section profiles, and Δki and Δbi are the rotation coefficient and the translation coefficient, respectively;
Δy′i is solved as a function by the least-squares method: from

∂Δy′i/∂Δki = 0

and

∂Δy′i/∂Δbi = 0

the value of Δy′i is minimised, and the rotation coefficient Δki and translation coefficient Δbi of the i-th splice are then obtained from formula (1);
(3) using the rotation coefficient Δki and the translation coefficient Δbi to adjust the profile of section i+1: taking the previous laser stripe as the reference, the subsequent laser stripe is translated and rotated according to Δki and Δbi and spliced onto the previous laser stripe;
(4) repeating step (3) until the splicing is complete; the algorithm then ends.
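Steps (1)-(4) above can be sketched as an iterative stitching loop. The function and variable names are hypothetical, and a fixed overlap sample count stands in for the coincidence rate λ:

```python
import numpy as np

def stitch_profiles(segments, overlap):
    """Hedged sketch of steps (1)-(4): align each segment to the previous one
    with the least-squares rotation/translation coefficients estimated on the
    overlap region, then concatenate.

    `segments` is a list of (x, y) arrays whose last/first `overlap` samples
    coincide physically; names are illustrative, not from the patent.
    """
    xs = [np.asarray(segments[0][0], float)]
    ys = [np.asarray(segments[0][1], float)]
    for x_next, y_next in segments[1:]:
        x_next = np.asarray(x_next, float)
        y_next = np.asarray(y_next, float).copy()
        # Overlap: tail of the stitched curve vs head of the next segment.
        x_ov = xs[-1][-overlap:]
        dy = ys[-1][-overlap:] - y_next[:overlap]
        dk, db = np.polyfit(x_ov, dy, 1)   # minimise Σ(Δk·x + Δb)², Eq. (1)
        y_next += dk * x_next + db         # rotate + translate segment i+1
        xs.append(x_next[overlap:])        # keep only the non-overlapping part
        ys.append(y_next[overlap:])
    return np.concatenate(xs), np.concatenate(ys)
```

Because each new segment is referenced to the already-stitched curve, the per-splice rotation and translation errors accumulate along the chain, which is the effect examined in the coincidence-rate experiments below.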
The two sections of contour were spliced with splicing coincidence rates of 0% (1 coincident point), 4%, 8%, 12%, 16% and 20%; different coincidence rates correspond to different overlap lengths, single-splice errors and accumulated splicing errors. The experiments at different overlap lengths verify the least-squares data-processing error of the splicing algorithm and show the influence of the coincidence rate on the measurement result; the experimental results are shown in fig. 5.
The straightness values corresponding to the straightness curves at the different coincidence rates are listed in Table 1. Splicing the straightness curves introduces a certain single-splice error, which accumulates as the number of splices grows. The table shows that at coincidence rates of 0% and 4% the single-splice error is large, so although fewer splices are needed the accumulated error is still considerable. As the coincidence rate increases, the single-splice error decreases slightly but the number of splices increases; hence, as small a coincidence rate as possible should be used while splicing accuracy is maintained. Considering the combined influence of the number of splices and the coincidence rate on the straightness error, a splicing coincidence rate of 8% is suitable for straightness measurement.
TABLE 1 straightness obtained at different splice coincidence rates
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Furthermore, it should be understood that although the present description refers to embodiments, not every embodiment contains only a single technical solution; this manner of description is merely for clarity, and those skilled in the art should take the description as a whole, the embodiments being combinable as appropriate to form other embodiments understood by those skilled in the art.
The present invention is not limited to the above description of the embodiments, and those skilled in the art should, in light of the present disclosure, appreciate that many changes and modifications can be made without departing from the spirit and scope of the invention.

Claims (5)

1. A three-dimensional configuration splicing method of a large member based on laser vision sensing is characterized by comprising the following steps:
before measuring the surface profile, calibrating the laser vision sensor and restoring the computer's two-dimensional pixel image to earth coordinates;
rigidly fixing the laser vision sensor to the motion system, ensuring that the sensor always remains within a common working interval during motion;
establishing communication between the laser vision sensor and a PC controller, and setting the sensor parameters and the motion-system parameters;
scanning the profile image of the surface of the workpiece to be measured with the image scanning unit, each scan yielding an array of 512 pixel points with abscissas in the range 0-511 and ordinates in the range 0-1023, the points being arranged in order of abscissa to form the two-dimensional profile image of that scan;
fifthly, filtering the profile contour image acquired by the image acquisition card;
step six, splicing the overlapping sections of the filtered laser stripes by the least-squares method, specifically comprising the following implementation steps:
(1) selecting a splicing coincidence rate λ, where λ = 0-20%, and determining from λ the abscissa range I of the spliced section;
(2) calculating the squared loss function Δy′i according to formula (1):

Δy′i = Σx∈I (y′i,x − y′i+1,x)² = Σx∈I (Δki·x + Δbi)²  (1)

where y′i,x and y′i+1,x are the coordinates of the coincident points of the i-th and (i+1)-th section profiles, and Δki and Δbi are respectively the rotation coefficient and the translation coefficient of the i-th splice;
Δy′i is solved as a function by the least-squares method: from

∂Δy′i/∂Δki = 0

and

∂Δy′i/∂Δbi = 0

the value of Δy′i is minimised, and the rotation coefficient Δki and translation coefficient Δbi of the i-th splice are then obtained from formula (1);
(3) using the rotation coefficient Δki and the translation coefficient Δbi of the i-th splice to adjust the profile of section i+1: taking the previous laser stripe as the reference, the subsequent laser stripe is translated and rotated according to Δki and Δbi and spliced onto the previous laser stripe;
(4) repeating step (3) until the splicing is complete, whereupon the algorithm ends; the working range of the common working interval is 130 mm to 210 mm.
2. The laser vision sensing-based large member three-dimensional configuration splicing method as claimed in claim 1, wherein the filtering process is performed by using a filtering algorithm, and the filtering algorithm is mean filtering, median filtering, hamming filtering, secondary filtering or bell-shaped filtering.
3. The laser vision sensing-based large member three-dimensional configuration splicing method as claimed in claim 1, wherein the laser vision sensor is a linear laser sensor.
4. The laser vision sensing-based large member three-dimensional configuration splicing method as claimed in claim 3, wherein the linear laser sensor is a scanning laser vision sensor or a structured light vision sensor.
5. The laser vision sensing-based large member three-dimensional configuration splicing method as claimed in claim 1, wherein the motion system is a coordinate robot system, a multi-joint robot system or a CNC system.
CN201911313082.4A 2019-12-18 2019-12-18 Large member three-dimensional configuration splicing method based on laser vision sensing Active CN110966937B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911313082.4A CN110966937B (en) 2019-12-18 2019-12-18 Large member three-dimensional configuration splicing method based on laser vision sensing


Publications (2)

Publication Number Publication Date
CN110966937A CN110966937A (en) 2020-04-07
CN110966937B true CN110966937B (en) 2021-03-09

Family

ID=70035163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911313082.4A Active CN110966937B (en) 2019-12-18 2019-12-18 Large member three-dimensional configuration splicing method based on laser vision sensing

Country Status (1)

Country Link
CN (1) CN110966937B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10018570B2 (en) * 2016-11-09 2018-07-10 The Boeing Company Combined surface inspection using multiple scanners
CN111970054B (en) * 2020-09-14 2023-07-21 长春理工大学 View field spliced wide-area rapid capturing laser communication terminal

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1483999A (en) * 2003-08-15 2004-03-24 清华大学 Method and system for measruing object two-dimensiond surface outline
US6714302B2 (en) * 1998-06-30 2004-03-30 Canon Kabushiki Kaisha Aligning method, aligner, and device manufacturing method
JP3578588B2 (en) * 1996-04-23 2004-10-20 松下電器産業株式会社 Electronic component mounting equipment
KR101432366B1 (en) * 2012-07-17 2014-08-22 한양대학교 에리카산학협력단 Five degree of freedom stitching method and system of three dimensonal profile data
KR101501512B1 (en) * 2013-11-11 2015-03-18 한국표준과학연구원 2D Phase-Matched Nonlinear Optical Structured Illumination Microscopy Apparatus and Method of The Same
CN106312397A (en) * 2016-10-12 2017-01-11 华南理工大学 Laser vision guided automatic welding track tracking system and method
CN107578464A (en) * 2017-06-30 2018-01-12 长沙湘计海盾科技有限公司 A kind of conveyor belt workpieces measuring three-dimensional profile method based on line laser structured light
CN107726975A (en) * 2017-09-20 2018-02-23 大连理工大学 A kind of error analysis method of view-based access control model stitching measure
CN107907048A (en) * 2017-06-30 2018-04-13 长沙湘计海盾科技有限公司 A kind of binocular stereo vision method for three-dimensional measurement based on line-structured light scanning
CN108959655A (en) * 2018-08-07 2018-12-07 南京大学 A kind of adaptive online recommended method towards dynamic environment
CN109238168A (en) * 2018-08-06 2019-01-18 大连理工大学 Large-scale metrology part surface three dimension shape high-precision measuring method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101571379B (en) * 2009-06-11 2010-12-08 天津大学 Method for measuring diameter and straightness accuracy parameters of seamless round steel pipe
CN103471531B (en) * 2013-09-27 2016-01-20 吉林大学 The online non-contact measurement method of axial workpiece linearity
CN105091778B (en) * 2015-04-28 2017-10-24 长春机械科学研究院有限公司 Obform guide rail one side characteristic detection method and device based on line-structured laser


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
High-precision calibration and measurement method based on stereo vision; Kong Yingqiao et al.; Journal of Computer Applications; 20170610; vol. 37, no. 6; pp. 1798-1802 *


Similar Documents

Publication Publication Date Title
CN101526336B (en) Calibration method of linear structured light three-dimensional visual sensor based on measuring blocks
CN108759699B (en) Method and system for measuring three-dimensional full-field deformation of masonry structure material with large field of view
Zexiao et al. Complete 3D measurement in reverse engineering using a multi-probe system
US8803943B2 (en) Formation apparatus using digital image correlation
CN102853786B (en) Apparatus and method for detecting flatness
CN104266608B (en) Field calibration device for visual sensor and calibration method
CN108311545B (en) Y-type rolling mill continuous rolling centering and hole pattern detection system and method
CN102506711B (en) Line laser vision three-dimensional rotate scanning method
CN110966937B (en) Large member three-dimensional configuration splicing method based on laser vision sensing
CN109443214B (en) Calibration method and device, measurement method and device for structured light three-dimensional vision
CN101813462A (en) Three-dimensional feature optical measuring system controlled by uniprocessor and measuring method
CN110672037A (en) Linear light source grating projection three-dimensional measurement system and method based on phase shift method
CN110763136B (en) High-precision three-dimensional workpiece size detection system and method
CN106092137B (en) The outdoor calibrator (-ter) unit and method of a kind of vehicle-mounted three-dimensional laser pavement detection system
CN108444413A (en) Ceramic wall and floor bricks flatness detecting device and method
CN111207670A (en) Line structured light calibration device and method
CN113554697A (en) Cabin section profile accurate measurement method based on line laser
CN100523720C (en) Optical non-contact three-dimensional measuring instrument
CN112017181A (en) Cylinder product surface detection method and related equipment thereof
CN114170321A (en) Camera self-calibration method and system based on distance measurement
CN108759668B (en) Tracking type three-dimensional scanning method and system in vibration environment
CN113804696A (en) Method for determining size and area of defect on surface of bar
CN107255458B (en) Resolving method of vertical projection grating measurement simulation system
JP4863006B2 (en) 3D shape measurement method
CN2914032Y (en) Optics non-contact type three-dimensional shaped measuring instrument

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant