CN112325796A - Large-scale workpiece profile measuring method based on auxiliary positioning multi-view point cloud splicing - Google Patents
- Publication number
- CN112325796A CN112325796A CN202011151689.XA CN202011151689A CN112325796A CN 112325796 A CN112325796 A CN 112325796A CN 202011151689 A CN202011151689 A CN 202011151689A CN 112325796 A CN112325796 A CN 112325796A
- Authority
- CN
- China
- Prior art keywords
- point cloud
- coordinate system
- scanning
- module
- profile
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
A large workpiece profile measuring method based on auxiliary-positioning multi-view point cloud splicing: the pose of a mobile robot in a global coordinate system is determined by a large-scene positioning system; a high-precision 3D scanning system is fixed on the robot's end manipulator; the scanned point cloud data are converted into the global coordinate system in real time through hand-eye calibration and a forward kinematics model; and profile point cloud splicing and contour reconstruction of a large part are achieved with a fine registration algorithm based on the iterative closest point. By combining a large-scene positioning system with a local high-precision scanning system, the invention overcomes the trade-off between measurement precision and measurement range; the measurement process is easy to automate, reducing the operational complexity of large-part profile measurement.
Description
Technical Field
The invention relates to a technology in the field of aviation component manufacturing, in particular to a large workpiece profile measuring method based on auxiliary positioning multi-view point cloud splicing.
Background
In the manufacturing and assembly of complex products such as aircraft, spacecraft, and ships, the manufacturing quality of certain key features is usually inspected with dedicated physical gauges. For some tank-bottom curved-surface components, the degree of fit is obtained with a profiling mold and a feeler gauge to evaluate the quality of the curved surface. Such methods have the following disadvantages. First, the physical gauge itself must be highly accurate and maintained over the long term, so manufacturing and maintenance costs are high. Second, the inspection workload is large, the operation is complex, and the degree of automation is low; the process requires considerable manual decision time, error judgments involve many uncertain factors, and efficiency is hard to improve. Finally, dedicated tooling and gauges are inflexible and poorly suited to flexible production with many features, many varieties, and small batches.
To evaluate the manufacturing quality of features such as points, lines, and surfaces during the manufacture and assembly of large components, the profile information of the mating end face must be obtained. Existing three-dimensional structured-light cameras and laser scanners achieve high-precision point cloud imaging over a small local range and capture curved surfaces well, but their measurement range is limited and they cannot directly acquire a complete point cloud image of a large profile. A handheld laser scanner on a flexible coordinate measuring arm extends the measurement range to some extent, but the extension is limited, the measurement process is tedious, the degree of automation is low, and it is still unsuitable for large surfaces. A three-dimensional scanner combined with markers can obtain the profile of a large part, but a large number of markers must be pasted onto the measured object to extract the complete profile point cloud, which is cumbersome and inefficient.
Disclosure of Invention
Aiming at the problems that complex profiles cannot be measured, that precision degrades as the measuring distance grows, and that specific targets are required in the existing manufacture and assembly of large fuselages, wing bodies, and cabin sections, the invention provides a large workpiece profile measuring method based on auxiliary-positioning multi-view point cloud splicing.
The invention is realized by the following technical scheme:
the invention relates to a large-scale workpiece profile measuring system based on auxiliary positioning multi-view point cloud splicing, which comprises: big scene orientation module based on many laser tracker or indoor positioning system (indoor GPS, iGPS), based on mobile robot's scanning operation module, close-range high accuracy 3D scanning module and the data processing module of point cloud coarse registration and accurate registration based on binocular structure light, wherein: the large scene positioning module carries out pose transformation processing on a robot base coordinate system and a coordinate system on a global space according to a target fixed on the mobile robot, records spatial position information of the robot when a point cloud data block is acquired in real time, and outputs an obtained transformation matrix to the data processing module; the scanning operation module carries out multi-view scanning on the molded surface of the part to be measured according to the preset track information, simultaneously records joint angle data of the mechanical arm corresponding to each frame of point cloud image, and outputs the joint angle data to the data processing module; the close-range high-precision 3D scanning module scans the molded surface of the part according to the shooting instruction and outputs the acquired point cloud image to the data processing module; and the data processing module determines the initial pose of each frame of point cloud in a world coordinate system through coordinate transformation, a hand-eye calibration result and a kinematic forward solution according to the space pose information of the mobile robot and the angle information of the mechanical arm joint corresponding to each frame of point cloud image, and completes accurate matching by using an iterative closest point algorithm after rough matching and finally outputs a complete model of the surface contour of the large part.
The invention further relates to a method for measuring a large workpiece surface based on the above system: the pose of the mobile robot in the global coordinate system is determined by the large-scene positioning system; the high-precision 3D scanning system is fixed on the robot's end manipulator; the scanned point cloud data are converted into the global coordinate system in real time through hand-eye calibration and the forward kinematic model; and profile point cloud splicing and contour reconstruction of the large part are achieved with the fine registration algorithm based on the iterative closest point.
The global coordinate system is determined during the initial layout of the laser trackers or iGPS and remains unchanged as long as the layout position is not changed.
The hand-eye calibration algorithm is as follows: when the 3D scanning system is mounted on the robot tool head, a calibration algorithm is required to obtain the coordinate transformation between the scanning system's measurement coordinate system and the robot tool coordinate system, specifically: the scanning system is rigidly attached to the end of the mechanical arm; during calibration the robot is placed in different poses to obtain the relation of the scanning system relative to a calibration object; the calibration equation AX = XB is then established by combining the robot poses, and the coordinate transformation matrix is obtained by solving this equation.
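A closed-form AX = XB solver in the Park-Martin style can illustrate the calibration equation above. This sketch (including all function names) is an assumption for illustration and is not taken from the patent:

```python
import numpy as np

def log_so3(R):
    """Rotation vector (axis times angle) of a rotation matrix."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-12:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta * w / (2.0 * np.sin(theta))

def solve_ax_xb(As, Bs):
    """Closed-form (Park-Martin style) solution of AX = XB.
    As: relative motions of the robot end (tool), Bs: the corresponding
    relative motions seen by the camera, each a 4x4 homogeneous matrix.
    Returns the 4x4 camera-to-tool transform X."""
    # rotation: M = sum beta_i alpha_i^T, R_X = (M^T M)^(-1/2) M^T
    M = np.zeros((3, 3))
    for A, B in zip(As, Bs):
        M += np.outer(log_so3(B[:3, :3]), log_so3(A[:3, :3]))
    w, v = np.linalg.eigh(M.T @ M)
    Rx = v @ np.diag(w ** -0.5) @ v.T @ M.T
    # translation: (R_A - I) t_X = R_X t_B - t_A, stacked least squares
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t_x = np.linalg.lstsq(C, d, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, t_x
    return X
```

At least two motion pairs with non-parallel rotation axes are needed for the rotation part to be well determined; in practice more poses are used and the least squares averages out noise.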
Step 3: collect local dense point clouds and perform auxiliary-positioning-based coarse point cloud registration, specifically:
3.1) according to the profile measurement requirements, sequentially acquire local dense point clouds of the large component's profile with the 3D vision sensor along a given track, while recording in real time the pose of the trolley and the rotation angle of each mechanical-arm joint as the point cloud data are acquired;
3.2) from the trolley pose recorded in step 3.1, the forward kinematic solution computed from the joint angles of step 3.1, and the hand-eye calibration result of step 2, compute the transformation of the 3D vision sensor's measurement coordinate system relative to the global coordinate system;
3.3) using the coordinate transformation computed in step 3.2, successively convert the point cloud data measured in step 3.1 into the global coordinate system, obtaining a point cloud model with a good initial pose.
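Steps 3.2 and 3.3 above chain three transforms and apply the result to the measured points; a minimal numpy sketch (the function and matrix names are illustrative assumptions, not from the patent) could look like:

```python
import numpy as np

def to_global(points_cam, T_world_base, T_base_tool, T_tool_cam):
    """Transform an Nx3 point cloud from the camera (sensor) frame into
    the global frame. T_world_base comes from the large-scene positioning
    system, T_base_tool from the robot forward kinematics, and T_tool_cam
    from hand-eye calibration; all are 4x4 homogeneous matrices."""
    T = T_world_base @ T_base_tool @ T_tool_cam
    pts_h = np.hstack([points_cam, np.ones((len(points_cam), 1))])  # homogeneous
    return (T @ pts_h.T).T[:, :3]
```

Applying this per acquired frame yields the coarse-registered point cloud blocks of step 3.3.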
Step 4: using the iterative closest point algorithm, complete fine matching of the multi-view point cloud data to obtain the profile reconstruction result of the large workpiece, specifically:
4.1) find the correspondence between two adjacent point clouds by the nearest-corresponding-point principle, specifically: build an octree structure over the point clouds and establish point-by-point correspondences between the two clouds with a nearest-neighbour search algorithm.
Since adjacent point clouds only partially overlap, the maximum search range during the search is set to 1/5 of the scale of the overlap region.
4.2) given the correspondences, solve the least-squares distance problem of the point pairs with a closed-form solution based on singular value decomposition, obtaining the optimal matching transform under the current correspondence between the two point clouds.
The least-squares distance problem is constructed and solved as follows:
First, construct the least-squares distance problem E(R, t) = \sum_i \| p_i - (R q_i + t) \|^2, where p_i \in P and q_i \in Q are the corresponding point pairs of the two point clouds found in step 4.1, and form the cross-covariance matrix W = \sum_i (p_i - \bar{p})(q_i - \bar{q})^T from the centroids \bar{p}, \bar{q} of the corresponding points.
Perform singular value decomposition of W: W = U \Sigma V^T; the rotation and translation of the optimal match under the current correspondences are R = U V^T, t = \bar{p} - R \bar{q}.
4.3) repeat steps 4.1 and 4.2 until the corresponding-point error computed from the least-squares distance problem is below a preset value or a preset maximum number of iterations is reached; convergence is then declared.
4.4) for all point clouds to be registered, perform the registration of steps 4.1 to 4.3 sequentially in the order of acquisition; the complete point cloud model of the target profile is finally obtained.
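Steps 4.1 to 4.3 can be sketched as a minimal ICP loop in numpy. Brute-force nearest-neighbour search stands in for the octree, and the implementation is an illustrative assumption rather than the patent's exact code:

```python
import numpy as np

def best_fit_transform(P, Q):
    """Step 4.2: closed-form SVD solution of the least-squares distance
    problem, giving R, t that minimise sum_i ||p_i - (R q_i + t)||^2."""
    p_bar, q_bar = P.mean(axis=0), Q.mean(axis=0)
    W = (P - p_bar).T @ (Q - q_bar)      # cross-covariance of the point pairs
    U, _, Vt = np.linalg.svd(W)
    R = U @ Vt
    if np.linalg.det(R) < 0:             # guard against a reflection
        U[:, -1] *= -1.0
        R = U @ Vt
    return R, p_bar - R @ q_bar

def icp(P, Q, max_iter=50, tol=1e-10):
    """Steps 4.1 to 4.3: iterative closest point aligning source cloud Q
    (Nx3) to target cloud P (Mx3). Correspondences here use brute-force
    nearest neighbours; an octree or k-d tree replaces this at scale."""
    R_tot, t_tot = np.eye(3), np.zeros(3)
    Qc = Q.copy()
    prev_err = np.inf
    for _ in range(max_iter):
        d2 = ((Qc[:, None, :] - P[None, :, :]) ** 2).sum(axis=2)
        idx = d2.argmin(axis=1)                # 4.1 nearest corresponding points
        R, t = best_fit_transform(P[idx], Qc)  # 4.2 closed-form best fit
        Qc = Qc @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
        err = np.sqrt(d2.min(axis=1)).mean()
        if abs(prev_err - err) < tol:          # 4.3 convergence test
            break
        prev_err = err
    return R_tot, t_tot
```

As in the patent, this assumes the coarse registration has already provided a good initial pose; ICP only refines a nearby alignment and is not a global matcher.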
Technical effects
The invention as a whole solves the problem that the prior art cannot measure the contour of a large component with high precision, flexibility, and efficiency: a conventional 3D vision sensor obtains a local dense point cloud but has a limited measurement range, while a laser tracker or iGPS has a large measurement range but yields only sparse feature-point data. Compared with the prior art, the initial pose of each local point cloud is obtained from the large-scene auxiliary positioning system, so the coarse registration does not depend on feature-point extraction; a complete point cloud model of a large workpiece profile can be obtained over a large range; the degree of flexibility is high and no physical markers need to be pasted on the measured object; the measurement procedure is simple; the resulting model can be compared with the theoretical model or with the profile of the mating part for accurate evaluation of manufacturing quality; and the measurement process can be highly automated, reducing operational complexity.
Drawings
FIG. 1 is a schematic diagram of the system of the present invention;
FIG. 2 is a schematic view of camera and mobile robot hand-eye calibration;
FIG. 3 is a schematic view of a large workpiece measurement process based on assisted positioning multi-view point cloud registration.
Detailed Description
As shown in fig. 1, the present embodiment relates to a large workpiece profile measuring system based on auxiliary positioning multi-view point cloud registration, which includes: the system comprises a large scene positioning module based on a multi-laser tracker or an iGPS, a scanning operation module based on a mobile robot, a close-range high-precision 3D scanning module based on binocular structured light and a data processing module for rough and fine point cloud registration.
The large scene positioning module comprises: coordinate measuring unit and position appearance positioning unit.
The coordinate measuring unit comprises a plurality of signal transmitter base stations, serving as metrology transmitters, and receivers; the base stations continuously transmit signals at different frequencies, the distance between each base station and a receiver is determined from the received signals, and together the base stations establish the global coordinate system. Within the working range, each receiver must at any time exchange information directly with at least two transmitters, so that its three-dimensional coordinates can be solved in real time by the triangulation principle.
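As an illustration of the range-based position solve described above, a minimal linearized trilateration sketch in numpy follows (real iGPS receivers typically work from measured angles rather than ranges; the function below, its name, and the use of four or more range measurements are illustrative assumptions):

```python
import numpy as np

def trilaterate(stations, dists):
    """Least-squares position fix of one receiver from its distances to
    four or more base stations at known positions (Nx3 array, N dists).
    Differencing each range equation ||p - s_j||^2 = d_j^2 against the
    first station cancels ||p||^2 and leaves a linear system in p."""
    s0, d0 = stations[0], dists[0]
    A = 2.0 * (stations[1:] - s0)
    b = (np.sum(stations[1:] ** 2, axis=1) - np.sum(s0 ** 2)
         - dists[1:] ** 2 + d0 ** 2)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p
```

With noisy ranges the same least-squares form averages the redundancy of extra base stations.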
The pose positioning unit comprises at least three receivers mounted on the mobile robot base, forming a receiver group; the pose of the mobile robot base relative to the global coordinate system is computed in real time from the global coordinates of three or more positioning points on the base. This pose transformation information is transmitted to the data processing module in real time as one of the inputs to point cloud coarse registration.
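The base-pose computation from three or more target points can be sketched as a standard SVD (Kabsch) point-set fit; this implementation and its names are an illustrative assumption, not the patent's own algorithm:

```python
import numpy as np

def base_pose_from_targets(targets_base, targets_world):
    """Pose (R, t) of the mobile-robot base in the global frame from
    three or more non-collinear targets: targets_base holds the fixed
    target coordinates in the base frame, targets_world their measured
    global coordinates, so that targets_world ~ R @ targets_base + t."""
    cb, cw = targets_base.mean(axis=0), targets_world.mean(axis=0)
    H = (targets_base - cb).T @ (targets_world - cw)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection
        Vt[-1] *= -1.0
        R = Vt.T @ U.T
    return R, cw - R @ cb
```

With more than three targets the fit is overdetermined, which damps the effect of individual receiver noise on the recovered pose.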
The scanning operation module comprises a bottom mobile platform and a mechanical-arm unit arranged on it, wherein: the bottom mobile platform carries the mechanical arm through the measurement scene, enlarging the reachable space at the end of the arm; the coordinate measuring unit is mounted on the bottom mobile platform to measure and record its pose during operation; the close-range high-precision 3D scanning module is rigidly attached to the end of the mechanical arm and is controlled according to preset trajectory information to perform the scanning operation, while the joint angle information during acquisition is transmitted in real time to the data processing module as one of the inputs to point cloud coarse registration.
The close-range high-precision 3D scanning module is realized by a three-dimensional sensor such as a structured-light camera; it collects the local contour information of the workpiece at a specific viewpoint and transmits the resulting point cloud image to the data processing module.
The data processing module comprises a point cloud coarse registration module and a fine registration module, wherein: the coarse registration module receives the point cloud image frames from the close-range high-precision 3D scanning module, the current mechanical-arm joint angles from the scanning operation module, and the current pose of the bottom mobile platform in the global coordinate system from the large-scene positioning module; it obtains the coarse registration of the point cloud in the global coordinate system through pose transformation and the forward kinematic solution and passes the result to the fine registration module, which computes the fine registration with the ICP algorithm and outputs it as the profile contour information of the large workpiece.
The embodiment relates to a large workpiece profile measuring method based on auxiliary positioning multi-view point cloud splicing of the system, which comprises the following steps:
Step 1: set up the large-scene auxiliary positioning system. Take the measurement of the wing-body mating end face of a certain aircraft type as an example. To ensure efficient, reliable, high-precision wing-body joining, the profile of the mating surface must be measured. Because the part is large, point cloud splicing is proposed to extend the measuring range of the high-precision 3D vision sensor and complete the scan of the whole profile.
First, the large-scene auxiliary positioning system is constructed. As shown in fig. 1, a large-scale measurement field is built in the measurement scene with a plurality of laser trackers or other auxiliary positioning instruments such as iGPS signal-transmitting base stations. After calibration, a unified coordinate system of the measurement field is established, so that the three-dimensional coordinates of a signal receiver or target can be acquired within a given range, providing the basis for positioning the mobile trolley in the subsequent steps.
Step 2: calibrate the relative pose of the 3D scanning system with respect to the mobile robot base. During measurement, the raw data collected by the 3D sensor are recorded in the sensor coordinate system CCS; to obtain point cloud information in the world coordinate system WCS, the transformation from the camera coordinate system to the world coordinate system must be determined.
As shown in fig. 2, with the hand-eye calibration method and the motion platform of the mobile robot held fixed, the relative pose T_tc between the camera coordinate system CCS and the tool coordinate system TCS at the end of the mechanical arm can be calibrated from the known relative pose of the platform and a fixed calibration frame. Once the camera is installed, this transformation matrix need not be recalibrated unless circumstances change.
Step 3: collect local dense point clouds and complete auxiliary-positioning-based point cloud coarse registration.
As shown in figure 1, for a given fuselage section and according to the profile measurement requirements, the 3D vision sensor sequentially acquires dense point clouds of the part profile along a specified track. For each acquired point cloud, the coordinates {p_i1, p_i2, p_i3} of the targets on the trolley in the large-scene coordinate system and the joint angles θ_i = {θ_i1, θ_i2, …, θ_i6} of the measuring arm are recorded in real time. Through forward kinematics and the hand-eye calibration result, the transformation T_mc from the camera measurement coordinate system CCS to the trolley motion-platform coordinate system MCS is obtained; combined with the transformation T_wm from the trolley to the global coordinate system WCS, this yields the transformation from the camera measurement coordinate system to the global coordinate system: T_wc = T_wm · T_mc.
because the relative position and attitude of the global coordinate system and the large-scale workpiece are kept unchanged in the measuring process, all the cloud blocks of the measuring points can obtain a relatively accurate profile point cloud profile after corresponding position and attitude transformation, and the method specifically comprises the following steps:
and 4, step 4: and (4) a fine registration process based on the iteration closest point. The large scene auxiliary positioning system provides a relatively accurate initial pose for the collected local point cloud, and in order to obtain an accurate model capable of representing the profile contour characteristics, an accurate registration process needs to be carried out on all point cloud blocks. Aiming at the dense point cloud model, under the condition of better initial pose, an accurate matching result can be obtained through a registration method based on the iteration closest point.
In this embodiment, the correspondence between two adjacent point clouds is first found by the nearest-corresponding-point principle; since adjacent point clouds only partially overlap, a maximum search range is also set, which can empirically be about 1/5 of the scale of the overlap region. Next, from the correspondences, the optimal matching transform under the current correspondence of the two point clouds is obtained with the closed-form solution based on singular value decomposition. All point clouds to be registered are finely registered sequentially in the order of acquisition, finally yielding the complete point cloud model of the target profile for subsequent manufacturing-quality evaluation or for verifying and guiding the assembly process.
In conclusion, addressing the conflict between high precision and large scene in large-workpiece profile measurement, the invention organically combines a high-precision 3D vision sensor with large-scene auxiliary positioning instruments and reconstructs the complete contour of a large profile by auxiliary-positioning multi-view point cloud splicing. Relying on a mobile robot and digital sensing, the system extends the effective measurement range while preserving measurement precision, is highly flexible, can be highly automated, and provides a simple, efficient, and feasible scheme for profile measurement of large components.
The foregoing embodiments may be modified in many different ways by those skilled in the art without departing from the spirit and scope of the invention, which is defined by the appended claims and all changes that come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
Claims (9)
1. A large workpiece profile measuring system based on auxiliary-positioning multi-view point cloud splicing, characterized by comprising: a large-scene positioning module based on multiple laser trackers or iGPS, a scanning operation module based on a mobile robot, a close-range high-precision 3D scanning module based on binocular structured light, and a data processing module for point cloud coarse and fine registration, wherein: the large-scene positioning module computes the pose transformation between the robot base coordinate system and the global coordinate system from targets fixed on the mobile robot, records the spatial position of the robot in real time as each point cloud data block is acquired, and outputs the resulting transformation matrix to the data processing module; the scanning operation module performs multi-view scanning of the profile of the part under measurement according to preset trajectory information, records the mechanical-arm joint angles corresponding to each point cloud frame, and outputs them to the data processing module; the close-range high-precision 3D scanning module scans the part profile on shooting instructions and outputs the acquired point cloud images to the data processing module; and the data processing module determines the initial pose of each point cloud frame in the world coordinate system through coordinate transformation, the hand-eye calibration result, and the forward kinematic solution, based on the spatial pose of the mobile robot and the joint angles corresponding to each frame, completes fine matching with the iterative closest point algorithm after coarse matching, and finally outputs a complete model of the surface contour of the large part.
2. The system of claim 1, wherein the large scene positioning module comprises: the system comprises a coordinate measuring unit and a pose positioning unit;
the coordinate measuring unit comprises a plurality of signal transmitter base stations, serving as metrology transmitters, and receivers, wherein the base stations continuously transmit signals at different frequencies while the receivers receive them and determine the distance from the transmitters;
the position and orientation positioning unit comprises at least three receivers arranged on a mobile robot base, the receivers form a receiver group, the position and orientation of the mobile robot base relative to the global coordinate system are calculated in real time according to coordinate information of more than three positioning points on the mobile robot base in the global coordinate system, and the position and orientation transformation information is transmitted to the data processing module in real time and is used as one of the input of point cloud coarse registration.
3. The large workpiece profile measurement system of claim 1, wherein the scanning operation module comprises a bottom mobile platform and a mechanical-arm unit arranged on it, wherein: the bottom mobile platform carries the mechanical arm through the measurement scene, enlarging the reachable space at the end of the arm; the coordinate measuring unit is mounted on the bottom mobile platform to measure and record its pose during operation; the close-range high-precision 3D scanning module is rigidly attached to the end of the mechanical arm and is controlled according to preset trajectory information to perform the scanning operation, while the joint angle information during acquisition is transmitted in real time to the data processing module as one of the inputs to point cloud coarse registration.
4. The system for measuring the profile of a large workpiece according to claim 1, wherein the close-range high-precision 3D scanning module is realized by a three-dimensional sensor such as a structured-light camera, which collects the local contour information of the workpiece at a specific viewpoint and transmits the resulting point cloud image to the data processing module.
5. The system of claim 1, wherein the data processing module comprises a point cloud coarse registration module and a fine registration module, wherein: the coarse registration module receives the point cloud image frames from the close-range high-precision 3D scanning module, the current mechanical-arm joint angles from the scanning operation module, and the current pose of the bottom mobile platform in the global coordinate system from the large-scene positioning module; it obtains the coarse registration of the point cloud in the global coordinate system through pose transformation and the forward kinematic solution and passes the result to the fine registration module, which computes the fine registration with the iterative closest point algorithm and outputs it as the profile contour information of the large workpiece.
6. A large workpiece profile measuring method based on the system of any one of claims 1 to 5, characterized in that the pose of the mobile robot in the global coordinate system is determined by the large-scene positioning system, the high-precision 3D scanning system is fixed on the robot's end manipulator, the scanned point cloud data are converted into the global coordinate system in real time through hand-eye calibration and the forward kinematic model, and profile point cloud splicing and contour reconstruction of the large part are achieved with the fine registration algorithm based on the iterative closest point.
7. The method for measuring the profile of the large workpiece according to claim 6, which comprises the following steps:
step 1, establishing an indoor large-scene auxiliary positioning system based on multiple laser trackers or iGPS (indoor GPS) with the aid of physical targets, specifically: mounting several physical targets on the base platform of the mobile measuring robot so that the positions of the target points in the global coordinate system are recorded in real time during profile measurement, and solving the relative pose of the mobile robot in the global coordinate system from the positions of three or more targets;
step 2, calibrating the relative pose of the 3D scanning system with respect to the mobile robot base, specifically: mounting a structured-light camera or other 3D vision sensor at the tool of the mobile robot, i.e., the coordinate system corresponding to the robot end joint, and computing the transformation of the vision sensor coordinate system relative to the tool coordinate system through a hand-eye calibration algorithm, so that point cloud data acquired by the camera are converted into the robot tool coordinate system in real time;
and step 3: the method comprises the following steps of collecting local dense point cloud and carrying out rough point cloud registration based on auxiliary positioning, specifically:
3.1) according to the measurement requirement of the molded surface, sequentially acquiring the molded surface local dense point cloud of the large-scale component by using a 3D vision sensor according to a given track, and recording the pose of the trolley and the rotation angle of each joint of the mechanical arm when point cloud data are acquired in real time;
3.2) calculating to obtain a measurement coordinate system of the 3D vision sensor, namely a transformation relation of the coordinate system of the vision sensor relative to a global coordinate system, based on the trolley positioning result obtained in the step 3.1, the joint angle calculation kinematics positive solution obtained in the step 3.1 and the hand-eye calibration result obtained in the step 2;
3.3) using the coordinate transformations computed in step 3.2, successively convert each point cloud measured in step 3.1 into the global coordinate system, yielding a point cloud model with a good initial pose.
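Steps 3.1–3.3 amount to composing three transforms — cart localization (world ← base), forward kinematics (base ← tool), and hand-eye calibration (tool ← camera) — and applying the product to each scan. A sketch with assumed 4×4 homogeneous matrices (variable names are illustrative):

```python
import numpy as np

def scan_to_global(points_cam, T_world_base, T_base_tool, T_tool_cam):
    """Coarse registration of one scan: chain the cart localization result,
    the forward-kinematics pose, and the hand-eye transform, then map the
    (N, 3) camera-frame scan into the global coordinate system."""
    T_world_cam = T_world_base @ T_base_tool @ T_tool_cam
    pts = np.asarray(points_cam, dtype=float)
    return pts @ T_world_cam[:3, :3].T + T_world_cam[:3, 3]
```

Because each scan carries its own `T_world_base` and `T_base_tool` snapshot, the converted clouds land in a common frame with only small residual misalignment, which step 4 then removes.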
Step 4: use the iterative closest point (ICP) algorithm to finely register the multi-view point cloud data and obtain the profile reconstruction of the large workpiece. Specifically:
4.1) find correspondences between two adjacent point clouds by the nearest-corresponding-point principle. Specifically: build an octree structure over the point clouds and establish point-by-point correspondences with a nearest-neighbour search algorithm;
4.2) given the correspondences, solve the least-squares point-pair distance problem with a closed-form solution based on singular value decomposition, obtaining the optimal alignment of the two clouds under the current correspondences;
4.3) repeat steps 4.1 and 4.2 until the corresponding-point error computed from the least-squares distance problem falls below a preset value or a preset maximum number of iterations is reached, at which point the registration is considered converged;
4.4) apply the registration of steps 4.1 to 4.3 to all point clouds to be registered, in acquisition order, finally yielding a complete point cloud model of the target profile.
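Steps 4.1–4.3 can be sketched as a minimal ICP loop. For brevity this sketch uses brute-force nearest-neighbour search in place of the octree of step 4.1, and the SVD closed-form solution of step 4.2; it is an illustration of the technique, not the patent's implementation:

```python
import numpy as np

def best_fit_transform(src, dst):
    """Closed-form least-squares rigid transform (step 4.2) via SVD."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                          # proper rotation, det = +1
    return R, cd - R @ cs

def icp(src, dst, max_iter=50, tol=1e-8):
    """Minimal ICP (steps 4.1-4.3): alternate nearest-neighbour matching and
    closed-form alignment until the error change drops below tol."""
    src = np.asarray(src, dtype=float).copy()
    dst = np.asarray(dst, dtype=float)
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(max_iter):
        # Step 4.1: nearest corresponding points (brute force stands in for octree)
        d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        idx = d2.argmin(axis=1)
        # Step 4.2: optimal alignment under current correspondences
        R, t = best_fit_transform(src, dst[idx])
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        # Step 4.3: convergence test on the corresponding-point error
        err = np.sqrt(d2[np.arange(len(src)), idx]).mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total, src
```

With the good initial pose from step 3, the correspondences are largely correct from the first iteration, which is exactly why the coarse registration stage matters: ICP alone only converges to the nearest local minimum.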
8. The method for measuring the profile of the large workpiece according to claim 7, wherein the hand-eye calibration algorithm comprises the following steps: with the 3D scanning system mounted on the robot tool head, a calibration algorithm is required to obtain the coordinate transformation between the scanning system's measurement coordinate system and the robot tool coordinate system. Specifically: the scanning system is rigidly attached to the end of the manipulator; during calibration the robot is driven to different poses, and at each pose the pose of the scanning system relative to a calibration object is obtained; combining these observations with the robot poses establishes the calibration equation AX = XB, and solving this equation yields the coordinate transformation matrix X.
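The AX = XB constraint can be illustrated with synthetic data: if X is the (unknown) hand-eye transform, A the relative motion of the tool between two robot poses, and B the relative motion the camera observes between the same two poses, then B = X⁻¹AX, so AX = XB holds. The sketch below only verifies this relation with an arbitrarily chosen X; it is not a solver (closed-form solvers such as Tsai–Lenz exist for recovering X from several (A, B) pairs):

```python
import numpy as np

def make_T(axis_angle, t):
    """Rodrigues axis-angle rotation plus translation as a 4x4 matrix."""
    w = np.asarray(axis_angle, dtype=float)
    th = np.linalg.norm(w)
    K = np.zeros((3, 3))
    if th > 0:
        k = w / th
        K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    R = np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * (K @ K)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Ground-truth hand-eye transform X (tool <- camera), chosen arbitrarily here
X = make_T([0.1, 0.2, 0.3], [0.05, 0.00, 0.10])
# A: relative tool motion between two robot poses (from the robot controller)
A = make_T([0.0, 0.5, 0.0], [0.20, 0.10, 0.00])
# B: the consistent relative motion seen by the camera
B = np.linalg.inv(X) @ A @ X
assert np.allclose(A @ X, X @ B)   # the AX = XB constraint holds
```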
9. The method for measuring the profile of the large workpiece according to claim 7, wherein the rough registration of the point clouds specifically comprises the following steps: for a given fuselage section, according to the profile measurement requirements, sequentially acquire local dense point clouds of the part profile with the 3D vision sensor along a specified trajectory, while recording in real time, for each acquired point cloud, the large-scene coordinates p_i = {p_i1, p_i2, p_i3} of the targets on the cart and the joint angles θ_i = {θ_i1, θ_i2, …, θ_i6} of the measuring arm. Through forward kinematics and the hand-eye calibration result, obtain the transformation T_CCS→MCS from the camera measurement coordinate system (CCS) to the cart motion-platform coordinate system (MCS), and the transformation T_MCS→WCS from the cart to the global coordinate system (WCS), thereby obtaining the transformation T_CCS→WCS = T_MCS→WCS · T_CCS→MCS from the camera measurement coordinate system to the global coordinate system. Specifically:
X_i = (p_i2 − p_i1) / ‖p_i2 − p_i1‖,  Y_i′ = (p_i3 − p_i1) / ‖p_i3 − p_i1‖,  Z_i = X_i × Y_i′,  Y_i = Z_i × X_i.
Because the relative pose between the global coordinate system and the large workpiece remains unchanged throughout the measurement, all measured point cloud patches, after their corresponding pose transformations, combine into a reasonably accurate profile point cloud.
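The cross-product relations above build an orthonormal cart frame from the three target points. A sketch of that construction (the choice of p_i1 as origin and the axis conventions are assumptions consistent with the equations, not stated explicitly in the patent):

```python
import numpy as np

def frame_from_targets(p1, p2, p3):
    """Build a 4x4 cart-frame transform from three non-collinear targets:
    X along p1->p2, auxiliary Y' along p1->p3, then Z = X x Y' and
    Y = Z x X, giving a right-handed orthonormal basis."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    X = p2 - p1
    X /= np.linalg.norm(X)
    Yp = p3 - p1
    Yp /= np.linalg.norm(Yp)
    Z = np.cross(X, Yp)
    Z /= np.linalg.norm(Z)
    Y = np.cross(Z, X)                      # re-orthogonalised Y axis
    T = np.eye(4)
    T[:3, :3] = np.column_stack([X, Y, Z])  # rotation: world <- cart
    T[:3, 3] = p1                           # origin placed at target 1
    return T
```

The second cross product is what guarantees orthogonality: Y_i′ generally is not perpendicular to X_i, but Z_i × X_i is.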
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011151689.XA CN112325796A (en) | 2020-10-26 | 2020-10-26 | Large-scale workpiece profile measuring method based on auxiliary positioning multi-view point cloud splicing |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112325796A true CN112325796A (en) | 2021-02-05 |
Family
ID=74311683
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113310427A (en) * | 2021-05-18 | 2021-08-27 | 杭州思看科技有限公司 | Three-dimensional scanning system and three-dimensional scanning method |
CN113375594A (en) * | 2021-06-08 | 2021-09-10 | 四川大学青岛研究院 | Aircraft skin profile digital detection method |
CN113554614A (en) * | 2021-07-21 | 2021-10-26 | 中国人民解放军陆军工程大学 | Pipeline measurement system pose calibration method for point cloud splicing |
CN113551612A (en) * | 2021-06-18 | 2021-10-26 | 中广核检测技术有限公司 | CRDM thermal casing wear loss measuring device |
CN113587845A (en) * | 2021-07-29 | 2021-11-02 | 中国科学院长春光学精密机械与物理研究所 | Large-aperture lens contour detection device and detection method |
CN113701626A (en) * | 2021-08-10 | 2021-11-26 | 哈尔滨岛田大鹏工业股份有限公司 | 3D machine vision detection method for automobile longitudinal beam |
CN113706454A (en) * | 2021-07-13 | 2021-11-26 | 广东泽亨智能科技有限公司 | Workpiece offset detection method based on registration and spraying device |
CN113894785A (en) * | 2021-10-27 | 2022-01-07 | 华中科技大学无锡研究院 | Control method, device and system for in-situ measurement and processing of blades of water turbine |
CN114279326A (en) * | 2021-12-22 | 2022-04-05 | 易思维(天津)科技有限公司 | Global positioning method of three-dimensional scanning equipment |
CN114812408A (en) * | 2022-04-07 | 2022-07-29 | 中车青岛四方车辆研究所有限公司 | Method and system for measuring height of stone sweeper from rail surface |
CN114993204A (en) * | 2022-06-07 | 2022-09-02 | 上海交通大学 | Large-size component profile dynamic measurement and auxiliary feature positioning method and system |
CN115026470A (en) * | 2022-06-10 | 2022-09-09 | 重庆大学 | Robot welding system and method for reinforcing mesh |
CN115416051A (en) * | 2022-08-31 | 2022-12-02 | 北京精密机电控制设备研究所 | Flexible multi-joint mechanical arm space pose measuring system and method |
CN116476070A (en) * | 2023-05-22 | 2023-07-25 | 北京航空航天大学 | Method for adjusting scanning measurement path of large-scale barrel part local characteristic robot |
CN116643290A (en) * | 2023-06-16 | 2023-08-25 | 山西建筑工程集团有限公司 | Metering method and system for double-platform motion compensation of irregular contour |
CN116929209A (en) * | 2023-07-11 | 2023-10-24 | 无锡多恩多自动化有限公司 | Detection equipment and detection method for rod-shaped materials |
CN117726673A (en) * | 2024-02-07 | 2024-03-19 | 法奥意威(苏州)机器人系统有限公司 | Weld joint position obtaining method and device and electronic equipment |
CN118080205A (en) * | 2024-04-24 | 2024-05-28 | 四川吉埃智能科技有限公司 | Automatic spraying method and system based on vision |
DE102023101972A1 (en) | 2023-01-27 | 2024-08-01 | Matthias Krebs | Sensor system and method for operating a sensor system |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101093160A (en) * | 2007-07-12 | 2007-12-26 | 上海交通大学 | Method for measuring geometric parameters of spatial circle based on technique of binocular stereoscopic vision |
US20110178753A1 (en) * | 2010-01-20 | 2011-07-21 | Faro Technologies, Inc. | Portable Articulated Arm Coordinate Measuring Machine and Integrated Environmental Recorder |
CN103438824A (en) * | 2013-08-06 | 2013-12-11 | 北京航空航天大学 | Large wallboard class part digitized quality detection method |
CN103925872A (en) * | 2013-12-23 | 2014-07-16 | 中国神华能源股份有限公司 | Laser scanning measurement device and method for acquiring spatial distribution of target objects |
CN104182095A (en) * | 2014-08-13 | 2014-12-03 | 长春理工大学 | Mobile self-positioning laser three-dimensional projection system |
CN106524904A (en) * | 2016-09-19 | 2017-03-22 | 天津大学 | Rapid mutual scanning directional method for multiple laser trackers |
CN106959080A (en) * | 2017-04-10 | 2017-07-18 | 上海交通大学 | A kind of large complicated carved components three-dimensional pattern optical measuring system and method |
CN108444383A (en) * | 2018-03-08 | 2018-08-24 | 大连理工大学 | The box-like process integral measurement method of view-based access control model laser group |
CN109916333A (en) * | 2019-04-04 | 2019-06-21 | 大连交通大学 | A kind of large scale target with high precision three-dimensional reconstruction system and method based on AGV |
CN111536901A (en) * | 2020-05-21 | 2020-08-14 | 南京航空航天大学 | Combined three-dimensional laser scanning measurement method based on field gantry and robot |
Non-Patent Citations (1)
Title |
---|
ZHAO Fuqun et al., "Improved probabilistic iterative closest point registration algorithm", Journal of Graphics (《图学学报》) *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112325796A (en) | Large-scale workpiece profile measuring method based on auxiliary positioning multi-view point cloud splicing | |
CN109990701B (en) | Mobile measurement system and method for large-scale complex curved surface three-dimensional shape robot | |
CN109373898B (en) | Complex part pose estimation system and method based on three-dimensional measurement point cloud | |
CN111156925B (en) | Three-dimensional measurement method for large component based on line structured light and industrial robot | |
Wang et al. | A mobile robotic measurement system for large-scale complex components based on optical scanning and visual tracking | |
CN109029257B (en) | Large-scale workpiece pose measurement system and method based on stereoscopic vision and structured light vision | |
CN102607457B (en) | Measuring device and measuring method for large three-dimensional morphology based on inertial navigation technology | |
CN108801142B (en) | Double-movement measuring robot system and method for super-large-size workpiece | |
CN104515478B (en) | A kind of automatic method for three-dimensional measurement of high-precision blade of aviation engine and system | |
CN102825602B (en) | PSD (Position Sensitive Detector)-based industrial robot self-calibration method and device | |
CN112833786B (en) | Cabin attitude and pose measuring and aligning system, control method and application | |
CN109341535B (en) | High-efficiency high-precision large satellite antenna assembly measurement system | |
CN111561868A (en) | Method for realizing non-contact measurement of antenna profile by utilizing optical tracking structure optical scanner | |
CN112648934B (en) | Automatic elbow geometric form detection method | |
CN115972093B (en) | Workpiece surface measuring method and device and wing wallboard soft mold polishing method | |
CN111811400B (en) | Combined positioning device and method based on AGV and laser tracker | |
CN112288823A (en) | Calibration method of standard cylinder curved surface point measuring equipment | |
CN114964213B (en) | Building engineering construction positioning system and method based on attitude sensing and visual scanning | |
CN114434036B (en) | Three-dimensional vision system for gantry robot welding of large ship structural member and operation method | |
CN114459345B (en) | Aircraft fuselage position and posture detection system and method based on visual space positioning | |
CN117893610B (en) | Aviation assembly robot gesture measurement system based on zoom monocular vision | |
CN204269086U (en) | The automatic three-dimension measuring system of a kind of high-precision blade of aviation engine | |
Peng et al. | Development of an integrated laser sensors based measurement system for large-scale components automated assembly application | |
CN112991270B (en) | Dynamic measurement method for deformation of large-size rotating body die | |
Wang et al. | Combined measurement based wing-fuselage assembly coordination via multiconstraint optimization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20210205 |