CN112284290B - Autonomous measurement method and system for aero-engine blade robot

Info

Publication number
CN112284290B
Authority
CN
China
Prior art keywords
robot
blade
measuring
point
measurement
Prior art date
Legal status
Active
Application number
CN202011122728.3A
Other languages
Chinese (zh)
Other versions
CN112284290A (en)
Inventor
王耀南
唐永鹏
缪志强
毛建旭
朱青
张辉
周显恩
江一鸣
彭伟星
刘学兵
林杰
Current Assignee
Hunan University
Original Assignee
Hunan University
Priority date
Filing date
Publication date
Application filed by Hunan University
Priority to CN202011122728.3A
Publication of CN112284290A
Application granted
Publication of CN112284290B
Legal status: Active

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/2433 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures for measuring outlines by shadow casting

Abstract

The invention discloses an autonomous measurement method and system for an aircraft engine blade robot. The method comprises the following steps: calibrating a servo rotary table, a structured light three-dimensional scanner and a robot; arranging measurement points according to an aircraft engine blade design model; planning the robot measurement path; measuring the three-dimensional topography of the aircraft engine blade; and processing the measurement data of the aircraft engine blade. The system comprises a hardware system and a software system. The hardware system comprises a servo rotary table, a blade measuring clamp, a structured light three-dimensional scanner, a robot control cabinet and an industrial computer; the software system comprises a measurement system calibration module, a measurement point layout module, a robot measurement path planning module, a point cloud data processing and visualization module, a three-dimensional model format conversion module, a human-computer interaction module and a measurement process control module. According to the invention, the measurement points are automatically arranged and the robot measurement path is generated from the product design model, so that autonomous measurement by the robot is realized.

Description

Autonomous measurement method and system for aero-engine blade robot
Technical Field
The invention relates to the technical field of robot three-dimensional measurement, in particular to an autonomous measurement method and system for an aircraft engine blade robot.
Background
The blade is a core part of an aircraft engine that directly determines its aerodynamic performance and service life. It is usually made of titanium alloy, high-temperature alloy or similar materials, and blade manufacturing accounts for about 30% of the total manufacturing workload of an aircraft engine. When a blade operates for a long time in a harsh high-temperature, high-pressure environment, its thermal barrier coating is prone to spalling, ablation, cracking and corrosion, and severe damage can cause the blade to fracture. Measuring the three-dimensional shape of the blade is key to evaluating and improving its machining quality, and acquiring blade-profile damage data is also of great significance for blade repair and remanufacturing.
Compared with the measurement of other parts, aero-engine blade measurement imposes stricter demands: higher measurement accuracy, higher measurement efficiency, more complex processing and analysis of the measurement data, more difficult comprehensive evaluation, and higher measurement reliability.
At present, traditional standard-template measurement still dominates the domestic inspection of aero-engine blades; it is inefficient, develops slowly, and severely restricts the integration of design, manufacturing and inspection. To meet the need for fast and efficient inspection, coordinate measuring machines are commonly used in developed western countries to inspect aero-engine blades. This method offers high accuracy, good repeatability and strong versatility, can accomplish comprehensive measurement of the aero-engine blade profile, and is currently one of the most accurate methods for blade-profile measurement; however, its inspection efficiency is low, its measurement path planning is complex, and the probe-radius compensation accuracy directly affects the measurement result.
In order to guarantee the measurement accuracy of aero-engine blades while improving measurement efficiency and the degree of automation, combining a structured light three-dimensional scanner with robot technology is an important way to solve this technical problem.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: overcoming the defects of the prior art, the invention provides an autonomous measurement method and system for an aircraft engine blade robot that improve the efficiency and degree of automation of aircraft engine blade measurement.
The technical solution of the invention is as follows: an autonomous measurement method for an aircraft engine blade robot, comprising the following steps:
S1: calibrating a servo rotary table, a structured light three-dimensional scanner and a robot;
S2: arranging measurement points according to an aircraft engine blade design model;
S3: planning the robot measurement path;
S4: measuring the three-dimensional topography of the aircraft engine blade;
S5: processing the measurement data of the aircraft engine blade.
Further, calibrating the servo rotary table, the structured light three-dimensional scanner and the robot comprises the following steps:
S11: fixing three precision standard balls on the servo rotary table, and ensuring that the three precision standard balls are within the working distance and field of view of the structured light three-dimensional scanner;
S12: keeping the robot still, rotating the servo rotary table by a fixed rotation angle α, scanning once with the structured light three-dimensional scanner to obtain measurement data, and calculating the average clustered sphere center P;
S13: continuing to keep the robot still, rotating the servo rotary table N times by the fixed rotation angle α so that the total rotation exceeds 360°, recording the average clustered sphere centers P_i, where i = 1, ..., N, and fitting the point set {P_i} by least squares to obtain the radius R_0 of the circular motion of the average clustered sphere centers P_i about the rotation axis of the servo rotary table;
S14: moving the robot to set its end-effector pose ξ_j while the servo rotary table rotates by a fixed rotation angle θ_i, scanning once with the structured light three-dimensional scanner, and calculating the observed average clustered sphere center P_ij, where i = 1, ..., N and j = 1, ..., M, ensuring that the accumulated rotation condition holds (the formula image is not reproduced) and that, when the robot end-effector is at pose ξ_j, the three precision standard balls are within the working distance and field of view of the structured light three-dimensional scanner;
S15: according to the observation result R_0 of step S13 and the observations {P_ij | i = 1, ..., N, j = 1, ..., M} of step S14, performing hand-eye calibration of the robot and the structured light three-dimensional scanner to obtain ^{hand}T_{camera}, and determining the pose transformation ^{base}T_{axis} between the rotation axis of the servo rotary table and the robot base coordinate system, where {hand} is the robot end-effector coordinate system, {base} is the robot base coordinate system, {camera} is the structured light three-dimensional scanner coordinate system, and {axis} is the coordinate system of the servo rotary table rotation axis.
Further, arranging the measurement points according to the aircraft engine blade design model comprises the following steps:
S21: importing the aircraft engine blade design model and performing Poisson disk sampling to generate the point cloud M = {^{blade}P_{model,i} | i = 1, ..., N_m}, where N_m is the number of points in the point cloud M;
S22: clustering the point cloud M to generate the Gaussian mixture model P(x) = Σ_{i=1}^{K} λ_i N(x | μ_i, Σ_i), where λ_i and K are the weights and the number of components of the Gaussian mixture model, and μ_i, Σ_i are the mean and covariance matrix of the Gaussian distribution N(x | μ_i, Σ_i);
S23: generating a set of candidate measurement points {V_i | i = 1, ..., K} according to μ_i, Σ_i and the working distance D of the structured light three-dimensional scanner, where V_i contains the position v_i and direction n_i of a candidate measurement point;
S24: selecting an optimal set of measurement points {D_i | i = 1, ..., n} from the candidate set {V_i | i = 1, ..., K}, where n ≤ K, while ensuring that the robot measurements cover the complete model; n is the number of finally arranged measurement points;
S25: letting D = {D_i | i = 1, ..., n} be the selected optimal set of measurement points, determining the optimal measurement point sequence D_sequence = {D_{s_1}, ..., D_{s_n}}, where s_i ∈ {1, ..., n}.
Further, planning the robot measurement path comprises the following steps:
S31: clamping and fixing the aircraft engine blade on the blade measuring clamp, and ensuring that the blade is within the working distance and field of view of the structured light three-dimensional scanner;
S32: scanning once with the structured light three-dimensional scanner to obtain measurement data, removing the noise in the measurement data, and segmenting out the blade-region point cloud S = {^{camera}P_{observed,j} | j = 1, ..., N_s}, where N_s is the number of points in the point cloud S;
S33: aligning the blade-region point cloud S with the aircraft engine blade design-model point cloud M through a point cloud registration algorithm, and solving the transformation matrix ^{model}T_{observed} so that corresponding point pairs ^{camera}P_{observed,j} and ^{blade}P_{model,i} satisfy ^{blade}P_{model,i} = ^{model}T_{observed} · ^{camera}P_{observed,j}, where {model} is the aircraft engine blade design model coordinate system, {observed} is the aircraft engine blade measurement data coordinate system, {blade} is the coordinate system of the aircraft engine blade to be measured, ^{camera}P_{observed,j} is a point in the measurement data of the structured light three-dimensional scanner, and ^{blade}P_{model,i} is a point in the aircraft engine blade design model;
S34: from the matrix equation ^{base}T_{axis} · ^{axis}T_{blade} = ^{base}T_{hand} · ^{hand}T_{camera} · (^{model}T_{observed})^{-1}, solving the transformation matrix between the coordinate systems {blade} and {axis}: ^{axis}T_{blade} = (^{base}T_{axis})^{-1} · ^{base}T_{hand} · ^{hand}T_{camera} · (^{model}T_{observed})^{-1};
S35: according to the optimal measurement point sequence D_sequence = {D_1, ..., D_k, ..., D_n} and the target measurement point D_k = {v_k, n_k} = {^{blade}v_k, ^{blade}n_k}, calculating the shortest straight-line path L for the robot to move to D_k;
S36: calculating the position and direction of the target measurement point in the robot base coordinate system (the defining formula images are not reproduced), where ^{axis}v_k and ^{axis}n_k are the position and direction of the measurement point in the rotation-axis coordinate system of the servo rotary table, ^{axis}P_{camera} is the current position of the structured light three-dimensional scanner in the rotation-axis coordinate system of the servo rotary table, ^{axis}T_{blade} is the transformation matrix between the coordinate systems {blade} and {axis}, r_1 = ^{axis}P_{camera} - (^{axis}P_{camera}^T · e_z) e_z, e_z = [0 0 1]^T, s_k = 1, ..., n, and R_z(θ) is the rotation matrix of a rotation of the rotary table by angle θ about its rotation axis z;
S37: calculating the translational motion of the structured light three-dimensional scanner from its current pose to the target measurement point (the formula image is not reproduced) and the rotation matrix of the minimum rotational motion R = cos θ I - (1 - cos θ) n n^T + sin θ n^, where θ = arccos(λ), n = ^{base}w / ||^{base}w||, n^ is the antisymmetric matrix of n = [n_x n_y n_z]^T, λ, ^{base}w and the target measurement point position of the structured light three-dimensional scanner are defined by formula images that are not reproduced, and ^{base}P_{camera}, ^{base}n_z are the current position and direction of the structured light three-dimensional scanner;
S38: calculating the target pose ξ* of the robot motion: computing w (the formula image is not reproduced), x = m_32 - m_23, y = m_13 - m_31, z = m_21 - m_12, and forming the quaternion q = w + xi + yj + zk; the target pose of the robot motion is assembled from the end-effector target position and the quaternion q (the formula image is not reproduced), where R* is the attitude of the robot end-effector at the target measurement point, R is a rotation matrix, ^{base}P_{hand} is the current position of the robot end-effector, and ^{base}R_{hand} is its current attitude; the robot motion trajectory is generated by linear interpolation in the robot controller inside the robot control cabinet (a rotation-matrix-to-quaternion conversion is sketched after step S39 below);
S39: calculating the rotation angle α of the servo rotary table motion and the target position of the servo rotary table (the formula images are not reproduced); the minimum time of the servo rotary table motion is T_amin = α/ω_max, where e_z = [0 0 1]^T, θ_a is the current position of the servo rotary table, sgn() is the sign function, ω_max is the maximum angular velocity of stable motion of the servo rotary table, r_1 is the radial vector of the measurement point relative to the rotation axis, and r_2 is the radial vector of the target measurement point relative to the rotation axis.
Further, calculating the shortest straight-line path L for the robot to move to D_k comprises the following steps:
S351: calculating the current position ^{axis}P_{camera} of the structured light three-dimensional scanner in the rotation-axis coordinate system of the servo rotary table (the formula image is not reproduced);
S352: calculating the description of the optimal measurement point D_k = {^{blade}v_k, ^{blade}n_k} in the {axis} coordinate system (the formula images are not reproduced);
S353: calculating the vertical distance by which the structured light three-dimensional scanner moves from its current position to the target measurement point (the formula image is not reproduced), where ^{axis}v_k is the position of the measurement point in the rotation-axis coordinate system of the servo rotary table and ^{axis}P_{camera} is the current position of the structured light three-dimensional scanner in the rotation-axis coordinate system of the servo rotary table;
S354: calculating the radial vector r_1 of the measurement point relative to the rotation axis and the radial vector r_2 of the target measurement point relative to the rotation axis (the formula image is not reproduced);
S355: calculating the shortest straight-line path of the robot moving to the target measurement point (the formula image is not reproduced); the shortest time of the robot linear motion is T_rmin = L/v_max, where v_max is the set maximum speed of the robot linear motion.
Further, measuring the three-dimensional topography of the aircraft engine blade comprises the following steps:
S41: controlling the robot and the servo rotary table to move to the target pose according to the planned robot measurement path;
S42: setting the robot motion speed v = L/T and the servo rotary table angular speed ω = α/T so that the two move synchronously, where T = max{T_amin, T_rmin}, T_rmin is the shortest time of the robot linear motion, T_amin is the shortest time of the servo rotary table motion, and α is the rotation angle of the servo rotary table motion;
S43: scanning once with the structured light three-dimensional scanner, transmitting the measurement data to the industrial computer for processing, and updating the target measurement point D_k;
S44: repeating steps S41-S43 until the robot has traversed all target measurement points.
Further, the method for processing the measurement data of the blade of the aircraft engine comprises the following steps:
s51: denoising point cloud data;
s52: simplifying the point cloud of the blades of the aircraft engine;
s53: segmenting point clouds in the blade area of the aircraft engine;
s54: splicing point clouds of blades of the aircraft engine;
s55: reconstructing a curved surface model of the blade of the aircraft engine;
s56: and analyzing and evaluating the dimension error of the profile of the blade of the aircraft engine.
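Steps S51-S54 can be prototyped with an off-the-shelf point cloud toolkit. The sketch below is a minimal illustration assuming the open-source Open3D library; the file names, voxel size and ICP distance threshold are placeholders, not values from the patent.

import numpy as np
import open3d as o3d

scan = o3d.io.read_point_cloud("blade_scan.ply")                            # measured patch (placeholder file)
scan, _ = scan.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)   # S51: denoise
scan = scan.voxel_down_sample(voxel_size=0.2)                               # S52: simplify
model = o3d.io.read_point_cloud("blade_model_sampled.ply")                  # sampled design model (placeholder file)
reg = o3d.pipelines.registration.registration_icp(
    scan, model, max_correspondence_distance=1.0, init=np.eye(4),
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
print(reg.transformation)                                                   # rigid transform usable for S54 splicing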
The invention also provides an autonomous measuring system of the blade robot of the aircraft engine, which comprises a hardware system and a software system, wherein:
the hardware system comprises a servo rotary workbench, a blade measuring clamp, a structured light three-dimensional scanner, a robot control cabinet and an industrial computer; wherein:
the servo rotary workbench is connected and communicated with the industrial computer through a field bus and is used for being matched with the robot to measure and executing a motion instruction issued by the industrial computer;
the blade measuring clamp is fixed on the servo rotary workbench through a connecting piece and is used for clamping and fixing the aero-engine blade to be measured;
the structured light three-dimensional scanner is fixed on a tail end flange of the robot through a connecting piece and used for collecting measurement data, and the measurement data are transmitted to an industrial computer through an industrial Ethernet for processing;
one end of the robot control cabinet is communicated with the robot through a field bus, and the other end of the robot control cabinet is communicated with the industrial computer through a network and used for controlling the motion of the robot so that the structured light three-dimensional scanner at the tail end of the robot moves according to a planned measuring path;
the software system includes: the system comprises a measuring system calibration module, a robot measuring point layout module, a robot measuring path planning module, a point cloud data processing and visualization module, a three-dimensional model format conversion module, a measuring system virtual simulation module, a measuring process control module and a human-computer interaction module respectively connected with the measuring system virtual simulation module, wherein:
the measuring system calibration module is used for calibrating the servo rotary workbench, the structured light three-dimensional scanner and the robot, and a calibration result is applied to the robot measuring point layout module;
the robot measuring point layout module generates an optimal measuring point according to the aero-engine blade design model and outputs the generated measuring point data to the robot measuring path planning module;
the robot measurement path planning module searches out a path with the shortest motion of the robot according to the generated optimal measurement point and outputs path data to the measurement process control module;
the measurement process control module is used for controlling the robot to move according to a planned path and controlling the servo rotary worktable and the robot to move to a target posture;
the point cloud data processing and visualization module is used for denoising, simplifying, dividing, splicing and reconstructing point cloud data acquired by the structured light three-dimensional scanner and analyzing and visualizing blade profile errors;
the three-dimensional model format conversion module is used for importing various aero-engine blade design models and exporting various three-dimensional models and can perform three-dimensional model format conversion;
the measurement system virtual simulation module is used for simulating the effect achieved by measurement in a real environment and the motion conditions of the servo rotary worktable and the robot, and improving the reliability of the measurement system and the control accuracy and response speed of the servo rotary worktable and the robot;
the man-machine interaction module monitors input, output and intermediate data of each module, provides a graphical user interface for operators to use, and manages the measurement process.
The invention has the following beneficial effects: the invention independently develops an autonomous measurement method and system for an aircraft engine blade robot, automatically generates measurement points and plans the robot measurement path from the aircraft engine blade design model, and thus automates the measurement; fixing the structured light three-dimensional scanner on the robot end flange through a connecting piece enables efficient large-range measurement; the cooperation of the servo rotary table and the robot also greatly reduces the measurement time. Meanwhile, the system is highly open and extensible, and can integrate functional modules such as network communication, motion control, point cloud data processing and visualization, profile dimension error analysis, and ASC/PLY/IGES export, to meet the requirements of commercial application and academic research.
Drawings
FIG. 1 is a general block diagram of an autonomous measurement method of an aircraft engine blade robot;
FIG. 2 is a schematic diagram of three precision standard balls fixed on a servo rotary worktable, wherein 1 is the servo rotary worktable, 2 is a blade measuring fixture, 8 is the precision standard ball, 4 is a structured light three-dimensional scanner, and 5 is a robot;
FIG. 3 is a flow chart of calculating an average cluster centroid;
FIG. 4 is a schematic diagram of the basic principle of Hough voting for detecting a spherical point cloud, where (a) is the case in which the points lie on a common sphere and (b) and (c) are cases in which the points cannot lie on a common sphere;
FIG. 5 is a schematic diagram of the basic idea of calibration of a robot measurement system, in which (a) is a distribution of observation points corresponding to a calibration result, (b) is a schematic diagram of minimum rotational motion between two coordinate systems, and (c) is a scattered distribution of observation points;
FIG. 6 is a schematic diagram of coordinate system configurations and transformation relationships between coordinate systems in a robotic measurement system, where (a) is the coordinate system configuration in the measurement system and (b) is the determination of whether an edge passes through or approaches a measurement object;
FIG. 7 is a flow chart of placement of measurement points according to an aircraft engine blade design model;
FIG. 8 is a feature of a projected contour of a Gaussian ellipsoid, where (a) is the feature where the projected contour contains 6 vertically-directed polygon edges and (b) is the feature where the projected contour contains 4 vertically-directed polygon edges;
FIG. 9 is a flow chart for calculating the gain of information observed from each candidate measurement point;
FIG. 10 is a flow chart of selecting an optimal set of measurement points from a set of candidate measurement points;
FIG. 11 is a flow chart for determining an optimal sequence of measurement points;
FIG. 12 is a flow chart for planning a measurement path for a robot;
FIG. 13 is a schematic diagram of a hardware system of an autonomous measuring system of an aircraft engine blade robot, wherein 1 is a servo rotary table, 2 is a blade measuring fixture, 3 is an aircraft engine blade, 4 is a structured light three-dimensional scanner, 5 is a robot, 6 is a robot control cabinet, and 7 is an industrial computer;
FIG. 14 is a software system architecture diagram of an aircraft engine blade robot autonomous measurement system.
Detailed Description
In order to make the technical solutions of the present invention more clear and definite, the present invention is further described in detail below with reference to the embodiments and the drawings, it should be noted that the embodiments and features of the embodiments of the present application can be combined with each other without conflict.
Referring to FIG. 1, this embodiment provides an autonomous measurement method for an aircraft engine blade robot that includes the following steps:
S1: calibrating the servo rotary table 1, the structured light three-dimensional scanner 4 and the robot 5;
S2: arranging measurement points according to the aircraft engine blade design model;
S3: planning the measurement path of the robot 5;
S4: measuring the three-dimensional topography of the aircraft engine blade;
S5: processing the measurement data of the aircraft engine blade.
Further, referring to FIG. 2, calibrating the servo rotary table 1, the structured light three-dimensional scanner 4 and the robot 5 includes the following steps:
S11: referring to FIG. 2, three precision standard balls 8 are fixed on the servo rotary table 1, ensuring that the three precision standard balls 8 are within the working distance and field of view of the structured light three-dimensional scanner 4;
S12: referring to FIG. 3, the robot 5 is kept still, the servo rotary table 1 rotates by a fixed rotation angle α, the structured light three-dimensional scanner 4 scans once to obtain measurement data, and the average clustered sphere center P is calculated;
calculating the average clustered sphere center P means detecting and segmenting the point cloud distributed on the standard balls by Hough voting, clustering the segmented point cloud into 3 classes with the K-means++ algorithm, each class corresponding to the spherical region of one standard ball, fitting spheres by the least-squares method to obtain the sphere center positions of the three precision standard balls 8, and computing the average clustered sphere center P to represent the observation result; the specific steps are as follows:
S1201: removing the noise points in the measurement data with a point cloud bilateral filtering algorithm to obtain the point cloud S = {^{camera}P_{observed,j} | j = 1, ..., N_s}, and calculating the normal of each point to obtain N = {^{camera}n_{observed,j} | j = 1, ..., N_s}, where N_s is the number of points in the point cloud S; defining an empty set C and initializing i = r = 1, where i, r ∈ {1, ..., N_s};
S1202: extracting the point p_r and its normal n_r from S, N and defining the Hough voting accumulator Acc_r;
S1203: extracting the point p_i and its normal n_i from S, N and calculating the point-pair features (the formula image is not reproduced), where d = p_i - p_r;
S1204: if C_1 > η r_s², or C_2 > 0, or C_3 < 0, or C_4 > cos ε_1, updating i ← i + 1 and returning to step S1203, where ε_1 is a very small positive value, η ∈ (0, 4] and r_s is the radius of the standard ball; the condition C_1 > η r_s² means that p_i lies outside the neighborhood of p_r of radius √η · r_s, the condition C_2 > 0 or C_3 < 0 means that p_i and p_r cannot lie on a common sphere, as shown in FIG. 4(b), and the condition C_4 > cos ε_1 means that the normals of p_i and p_r are parallel so that no common sphere is possible, as shown in FIG. 4(c);
S1205: computing r (the formula image is not reproduced); if ||r - r_s|| < ε_2, S_1 S_2 - C_2 C_3 > C_1 cos ε_3 and C_2 C_3 C_4 + S_1 S_2 C_4 + S_1 C_3 S_3 - C_2 S_2 S_3 > C_1 cos ε_4, the accumulator Acc_r is incremented by 1, where r_s is the radius of the standard ball and ε_2, ε_3, ε_4 are very small positive values;
S1206: if p_i has traversed all points in S and Acc_r > thresh_vote, adding p_r to the point set C, where thresh_vote is the minimum number of supporting votes; updating r ← r + 1, i ← 1, and returning to step S1202;
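A much-simplified Hough scheme for standard balls of known radius can illustrate the voting idea of steps S1201-S1206: each surface point casts votes for candidate sphere centers along its normal at distance r_s, and densely voted cells are kept. This NumPy sketch is only an illustration; it does not implement the point-pair features and relaxed constraints used in S1203-S1205.

import numpy as np

def vote_sphere_centers(points, normals, r_s, bin_size, min_votes):
    # Each point votes for two candidate centers, one on each side of the surface.
    candidates = np.vstack([points - r_s * normals, points + r_s * normals])
    keys = np.round(candidates / bin_size).astype(np.int64)
    cells, counts = np.unique(keys, axis=0, return_counts=True)
    return cells[counts >= min_votes] * bin_size   # approximate centers of well-supported spheres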
Hough voting detects the spherical point cloud as follows. Referring to FIG. 4(a), for a point p_r on the sphere and a point p_i in its vicinity, the point-pair feature constructed from p_i and p_r is F = (||d||, ∠(n_r, d), ∠(n_i, d), ∠(n_r, n_i)) (the auxiliary formula images are not reproduced). For points on a common sphere the constraints F_2 + F_3 = π and F_2 - F_3 = F_4 hold, and the relaxed constraints are |F_2 + F_3 - π| < ε_3 and |F_2 - F_3 - F_4| < ε_4, where ε_3, ε_4 are very small positive values;
using C_1, C_2, C_3, C_4, the relaxed constraints are described as S_1 S_2 - C_2 C_3 > C_1 cos ε_3 and C_2 C_3 C_4 + S_1 S_2 C_4 + S_1 C_3 S_3 - C_2 S_2 S_3 > C_1 cos ε_4, where C_1 = F_1², C_2 = F_1 cos F_2, C_3 = F_1 cos F_3, C_4 = cos F_4 (the corresponding S terms are given by a formula image that is not reproduced); if p_r receives enough supporting votes from points p_i in its neighborhood of radius √η · r_s, p_r is considered to belong to a spherical region, and ||r - r_s|| < ε_2 determines whether it belongs to the sphere of a standard ball; η can be taken as 2 to guarantee a sufficient number of votes while reducing the search region, and thresh_vote can be set to a small value and then adjusted to the actual situation; because the strict condition ||r - r_s|| < ε_2 is imposed, the value of thresh_vote can be relaxed;
S1207: randomly selecting a point from the point set C as the initial cluster center c_1;
S1208: calculating the distance D(x_i) of each point x_i from the existing cluster centers, then calculating the probability of each point being selected as the next cluster center, P(x_i) = D(x_i)² / Σ_j D(x_j)², and then selecting the next cluster center by the roulette-wheel method;
the roulette-wheel method operates as follows: calculate the cumulative probability distribution function P_k(x), with P_0 = 0, generate a random number between 0 and 1, and determine the interval [P_{k-1}(x), P_k(x)] into which it falls; the corresponding x_k is the next selected cluster center;
step S1208 is repeated until K cluster centers have been selected, here K = 3; when the distance between a point and several cluster centers is calculated, D(x) takes the minimum distance value;
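The K-means++ seeding with roulette-wheel selection described in steps S1207-S1208 can be sketched as follows (Python/NumPy); the D(x)² weighting is the standard K-means++ choice and is assumed for the probability formula whose image is not reproduced.

import numpy as np

def kmeans_pp_init(X, K, seed=0):
    rng = np.random.default_rng(seed)
    centers = [X[rng.integers(len(X))]]                   # S1207: random first center
    for _ in range(K - 1):
        d2 = np.min(((X[:, None, :] - np.asarray(centers)[None, :, :]) ** 2).sum(-1), axis=1)
        p = d2 / d2.sum()                                 # selection probability for each point
        cdf = np.cumsum(p)                                # cumulative distribution P_k(x)
        r = rng.random()                                  # random number in (0, 1)
        centers.append(X[np.searchsorted(cdf, r)])        # roulette-wheel pick
    return np.asarray(centers)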
S1209: calculating the distance between each point and each cluster center, assigning each point to its nearest cluster center, and calculating the mean of the points in each class as the new cluster center; if the class centers no longer change or the maximum number of iterations N_c is reached, executing step S1210, otherwise continuing with step S1209;
S1210: fitting the points in each class {C_k | k = 1, 2, 3} to a sphere by the least-squares method; the fitted sphere centers are P_1, P_2, P_3 and the radii are r_1, r_2, r_3, and the fitting error is defined by a formula image that is not reproduced;
the least-squares fit of the points in C_k to a sphere (a circle in space) proceeds as follows: construct the matrix A and the vector b (the formula image is not reproduced), solve Ax = b with the least-squares solution x = (A^T A)^{-1} A^T b; the center of the fitted sphere (circle) is (a b c)^T and the radius is obtained from the remaining component of x (the formula image is not reproduced), where n is the number of points in C_k;
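An algebraic least-squares sphere fit consistent with the form x = (A^T A)^{-1} A^T b can be sketched as follows (Python/NumPy); the exact A and b of the patent are in an unreproduced figure, so one standard formulation is assumed.

import numpy as np

def fit_sphere(P):
    # Sphere (p - c)·(p - c) = r^2 rewritten as 2 c·p + (r^2 - c·c) = p·p,
    # i.e. A x = b with A = [2p, 1], x = [c, r^2 - c·c], b = |p|^2.
    A = np.hstack([2.0 * P, np.ones((len(P), 1))])
    b = (P ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = x[:3]
    radius = np.sqrt(x[3] + center @ center)
    return center, radius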
S13: continuing to keep the robot 5 still, rotating the servo rotary table 1 N times by the fixed rotation angle α so that the total rotation covers more than 360°, recording the average clustered sphere centers P_i, where i = 1, ..., N, and fitting the point set {P_i} by the least-squares method to obtain the radius R_0 of the circular motion of the average clustered sphere centers P_i about the rotation axis of the servo rotary table; in order to generate enough observation points and increase the estimation accuracy of R_0, α may be taken as 10° and N as 36;
S14: moving the robot 5 to set its end-effector pose ξ_j while the servo rotary table 1 rotates by a fixed rotation angle θ_i; the structured light three-dimensional scanner 4 scans once and the observed average clustered sphere center P_ij is calculated, where i = 1, ..., N and j = 1, ..., M, ensuring that the accumulated rotation condition holds (the formula image is not reproduced) and that, when the end of the robot 5 is at pose ξ_j, the three precision standard balls 8 are within the working distance and field of view of the structured light three-dimensional scanner 4;
S15: according to the observation result R_0 of step S13 and the observations {P_ij | i = 1, ..., N, j = 1, ..., M} of step S14, hand-eye calibration of the robot 5 and the structured light three-dimensional scanner 4 is performed to obtain ^{hand}T_{camera}, and the pose transformation ^{base}T_{axis} between the rotation axis of the servo rotary table 1 and the base coordinate system of the robot 5 is determined, where {hand} is the end coordinate system of the robot 5, {base} is the base coordinate system of the robot 5, {camera} is the coordinate system of the structured light three-dimensional scanner 4, and {axis} is the coordinate system of the rotation axis of the servo rotary table 1; the relationships between the coordinate systems are shown in FIG. 6;
referring to FIG. 5(a), ^{hand}T_{camera} and ^{base}T_{axis} are calibrated simultaneously: first, one solution ^{hand}T_{camera} makes the observed {P_ij | i = 1, ..., N, j = 1, ..., M} move on a circle of radius R_0 about the rotation axis of the servo rotary table 1; from the circular motion of {P_ij}, the direction n of the rotation axis of the servo rotary table can be recovered, the origin of the coordinate system {axis} can be set to the circle center P_0 of the circular motion, the z-axis is set to n, and the x-axis and y-axis can be determined by the minimum rotational motion from the z-axis direction of the {base} coordinate system to the direction n, as shown in FIG. 5(b), which gives the other solution ^{base}T_{axis}; as shown in FIG. 5(c), a ^{hand}T_{camera} that deviates from the true solution causes {P_ij} to scatter around the circular motion, and the larger the accumulated error, the more scattered the distribution; the numerical optimal solution of the calibration problem is searched for with a hybrid genetic algorithm / particle swarm optimization algorithm, with the following specific steps:
S1501: ^{base}T_{hand} is obtained by reading the end-effector pose ξ_j = (x, y, z, α, β, γ) of the robot 5 from the robot controller, i.e. it is assembled from the translation t_xyz = [x y z]^T and the rotation matrix formed from the Euler angles (α, β, γ) (the formula images are not reproduced);
S1502: let ^{hand}T_{camera} = T(t_x, t_y, t_z, r_x, r_y, r_z); a particle swarm of size m is initialized in the six-dimensional search space t_x ∈ [x_min, x_max], t_y ∈ [y_min, y_max], t_z ∈ [z_min, z_max], r_x ∈ [α_min, α_max], r_y ∈ [β_min, β_max], r_z ∈ [γ_min, γ_max], and the initial position x_k = (x_k1, x_k2, x_k3, x_k4, x_k5, x_k6) and velocity v_k = (v_k1, v_k2, v_k3, v_k4, v_k5, v_k6) of the k-th particle are set, k = 1, ..., m, where x_min = x - Δx_1, x_max = x + Δx_2, y_min = y - Δy_1, y_max = y + Δy_2, z_min = z - Δz_1, z_max = z + Δz_2, α_min = α - Δα_1, α_max = α + Δα_2, β_min = β - Δβ_1, β_max = β + Δβ_2, γ_min = γ - Δγ_1, γ_max = γ + Δγ_2; suitable Δx_1, Δx_2, Δy_1, Δy_2, Δz_1, Δz_2, Δα_1, Δα_2, Δβ_1, Δβ_2, Δγ_1, Δγ_2 are selected according to the actual situation to limit the extent of the search space;
S1503: the fitness of each particle is calculated: first the translation t = [x_k1 x_k2 x_k3]^T and the rotation matrix R_xyz are computed (the formula image is not reproduced), from which the particle's ^{hand}T_{camera} is assembled; the transformed observations P_ij* are computed (the formula image is not reproduced); the point set {P_ij* | i = 1, ..., N, j = 1, ..., M} is fitted by least squares to a sphere with center P_0 and radius r_0, and the point set {^s P_l*} satisfying ||^s P_l* - P_0|| ∈ (r_0 - σ, r_0 + σ), l = 1, ..., S, is extracted from {P_ij*}, where S is the number of points in {^s P_l*} and σ is a very small positive value; then points ^s P_1*, ^s P_2* are extracted from {^s P_l*} and n = ^s P_1* × ^s P_2* is calculated; the fitness e_k of each particle is calculated (the formula image is not reproduced), and finally axis_k = (P_0, n, r_0) is updated;
S1504: the fitness of each particle is compared with that of the best position X_k* = (X_k1, X_k2, X_k3, X_k4, X_k5, X_k6) it has experienced; if e_k < e_k*, x_k is taken as the best position experienced by the current particle;
S1505: the fitness of each particle is compared with that of the globally best experienced position X_g = (X_g1, X_g2, X_g3, X_g4, X_g5, X_g6); if e_k < e_g, x_k is taken as the current global best position, and if r_0 ∈ (R_0 - ξ, R_0 + ξ), axis = axis_k is updated, where ξ is a very small positive value;
S1506: selection, crossover and mutation are applied to the particle swarm, the fitness of each particle is calculated, and X_k* and X_g are updated;
S1507: the velocity and position of each particle are updated: first the velocity v_ks(t+1) is computed (the formula image is not reproduced), where k = 1, ..., m, s = 1, ..., 6, v_ks ∈ [-v_smax, v_smax], v_smax is the maximum search speed in the s-th dimension, the constriction factor (the formula image is not reproduced) effectively controls and constrains the flight speed of the particles while enhancing the local search ability of the algorithm, C = c_1 + c_2 with C > 4, t is the current iteration number, t_max is the maximum number of iterations, the learning factors c_1, c_2 are non-negative constants, and r_1, r_2 are mutually independent pseudo-random numbers uniformly distributed on [0, 1]; then x_ks(t+1) = x_ks(t) + v_ks(t+1) is computed;
S1508: if the optimal solution has stagnated and no longer changes, or the maximum number of iterations has been reached, or a sufficiently good fitness has been obtained, the iteration is stopped and the optimal solution is output (the formula image is not reproduced); otherwise, return to step S1503;
S1509: the translation t* and the rotation matrix R* are calculated from the optimal solution, and ^{hand}T_{camera} is assembled from them (the formula images are not reproduced);
S1510: from axis = (P_0, n, r_0), the translation is calculated as t_axis = P_0 and the rotation as follows: w = e_z × n, θ = arccos(e_z^T · n), n_w = w/||w||, and by the Rodrigues formula R_axis = cos θ I - (1 - cos θ) n_w n_w^T + sin θ n_w^, where e_z = [0 0 1]^T and n_w^ is the antisymmetric matrix of n_w = [n_x n_y n_z]^T (the formula image is not reproduced);
S1511: ^{base}T_{axis} is calculated from R_axis and t_axis (the formula image is not reproduced).
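The particle swarm portion of steps S1502-S1508 can be sketched as below (Python/NumPy). The classic constriction-factor velocity update is assumed for the unreproduced formula images, and the genetic selection/crossover/mutation operators of the hybrid algorithm are omitted; the fitness would be the circularity error e_k of the transformed sphere centers from S1503.

import numpy as np

def pso_minimize(fitness, lo, hi, m=40, iters=200, c1=2.05, c2=2.05, seed=0):
    rng = np.random.default_rng(seed)
    C = c1 + c2                                            # requires C > 4
    chi = 2.0 / abs(2.0 - C - np.sqrt(C * C - 4.0 * C))    # constriction factor (assumed form)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    x = rng.uniform(lo, hi, size=(m, lo.size))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([fitness(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = chi * (v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x))
        x = np.clip(x + v, lo, hi)
        val = np.array([fitness(p) for p in x])
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g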
Further, referring to FIG. 7, arranging the measurement points according to the aircraft engine blade design model includes the following steps:
S21: importing the aircraft engine blade CAD design model and performing Poisson disk sampling to generate the point cloud M = {^{blade}P_{model,i} | i = 1, ..., N_m}, where N_m is the number of points in the point cloud M;
S22: clustering the point cloud M to generate the Gaussian mixture model P(x) = Σ_{i=1}^{K} λ_i N(x | μ_i, Σ_i), where λ_i and K are the weights and the number of components of the Gaussian mixture model, and μ_i, Σ_i are the mean and covariance matrix of the Gaussian distribution N(x | μ_i, Σ_i);
the method comprises the following steps that a candidate measuring point set is constructed according to Gaussian mixture model parameters of a point cloud M and the working distance D of a structured light three-dimensional scanner 4, the mixing number K is too small to generate enough candidate measuring points, a cavity area is generated in a measuring result, the mixing number K is too large, the calculating time is long, the storage resource consumption is high, it is very important to select a proper K value according to the size and the surface condition of an actual blade, and a larger K value is needed for measurement of the whole blade and the like; the method comprises the following specific steps:
first, a mixture is definedNumber of components K, initializing parameter of each component pijj,∑jCalculating a log-likelihood function:
Figure BDA0002732582630000135
wherein j is 1.. K;
then, E-step (expected calculation procedure) calculates the posterior probability:
Figure BDA0002732582630000136
wherein the hidden variable zikDenotes xiBelongs to the kth gaussian mixture component, j 1.., K;
next, M-step (maximization procedure) updates:
Figure BDA0002732582630000137
Figure BDA0002732582630000138
wherein
Figure BDA0002732582630000139
j=1,...,K;
Finally, a log-likelihood function is calculated
Figure BDA00027325826300001310
j 1, K, checking whether the parameter or lnP converges, and if not, returning to continue executing E-step and M-step;
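The EM procedure above is what standard Gaussian-mixture implementations run; the sketch below is one possible realization using scikit-learn (an assumption, not a library named by the patent), fitting K components to the sampled blade point cloud M given as an N_m x 3 array.

from sklearn.mixture import GaussianMixture

def fit_blade_gmm(M_points, K=200, seed=0):
    # E-step / M-step iterations on pi_j, mu_j, Sigma_j as described above;
    # K must be chosen for the blade size and surface condition.
    gmm = GaussianMixture(n_components=K, covariance_type="full", random_state=seed)
    gmm.fit(M_points)
    return gmm.weights_, gmm.means_, gmm.covariances_   # lambda_i, mu_i, Sigma_i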
S23: a set of candidate measurement points {V_i | i = 1, ..., K} is generated according to μ_i, Σ_i and the working distance D of the structured light three-dimensional scanner 4, where V_i contains the position v_i and direction n_i of a candidate measurement point;
first, the normal {n_i | i = 1, ..., N_m} of each point in the point cloud M is calculated;
then, the K_μ neighboring points of μ_j are searched in the point cloud M, {p_ij | ||p_ij - μ_j|| < ρ, i = 1, ..., K_μ, j = 1, ..., K}, where ρ is a set neighborhood radius;
next, S_k is calculated for k = 1, 2, 3 (the formula image is not reproduced), the k with the largest S_k is selected, and the reference direction u_sk is defined (the formula image is not reproduced); the direction of the candidate measurement point is calculated as n_sj = -u_sk, where sgn() is the sign function, u_k is the unit eigenvector of Σ_j, σ_k is the eigenvalue of Σ_j, and if an eigenvalue of Σ_j is 0, the corresponding eigenvector is obtained from the cross product of the other two eigenvectors;
finally, the position of the candidate measurement point is calculated as v_sj = μ_j - (D - η σ_k) n_sj, where D is the working distance of the structured light three-dimensional scanner 4, D ∈ (D_min, D_max), η σ_k < D - D_min, σ_k is the eigenvalue corresponding to the eigenvector u_k parallel to the reference direction u_sk, and η is an adjustment factor; the position of the measurement point is adjusted according to the curvature of the point cloud near μ_j, increasing the sampling rate where the curvature is large; D can be selected according to a formula image that is not reproduced, and η is selected such that D_min < D - η σ_k < D_max;
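One way to realize the candidate-point construction of S23 is sketched below (Python/NumPy). The eigen-decomposition of Sigma_j gives the eigenvalues sigma_k and eigenvectors u_k; the score S_k that picks the reference direction is given by an unreproduced figure, so a simple alignment score between u_k and the neighboring normals is assumed here.

import numpy as np

def candidate_viewpoint(mu_j, Sigma_j, neighbor_normals, D, eta=1.0):
    sigma, U = np.linalg.eigh(Sigma_j)                  # eigenvalues (ascending) and eigenvectors u_k
    scores = [np.abs(neighbor_normals @ U[:, k]).mean() for k in range(3)]
    k = int(np.argmax(scores))                          # assumed surrogate for the S_k criterion
    u_sk = U[:, k]
    if u_sk @ neighbor_normals.mean(axis=0) < 0:        # orient the reference direction outward
        u_sk = -u_sk
    n_sj = -u_sk                                        # scanner viewing direction
    v_sj = mu_j - (D - eta * sigma[k]) * n_sj           # v_sj = mu_j - (D - eta*sigma_k) n_sj
    return v_sj, n_sj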
S24: an optimal set of measurement points {D_i | i = 1, ..., n} is selected from the candidate measurement point set {V_i | i = 1, ..., K}, where n ≤ K, while ensuring that the measurements of the robot 5 cover the complete model; n is the number of finally arranged measurement points; the specific steps are as follows:
S241: referring to FIG. 9, the information gain observed from each candidate measurement point is calculated:
first, the point cloud distribution that may be observed at a candidate measurement point {v_sj, n_sj} is P_s(x) (the formula image is not reproduced), consisting of the Gaussian mixture components G_k that satisfy n_sj · u_sk < 0, where u_sk is the reference direction corresponding to G_k;
then, the Gaussian mixture components G_k in P_s(x) are sorted from small to large by the distance d_k = (μ_k - v_sj) · n_sj between μ_k and the candidate measurement point {v_sj, n_sj}, k = 1, ..., and the component G_k with the smallest d_k is added to the set Ω = {(k, p_k1, p_k2, p_k3, p_k4, p_k5, p_k6)}, where p_k1 = μ_k - 2σ_k1 u_k1, p_k2 = μ_k + 2σ_k1 u_k1, p_k3 = μ_k - 2σ_k2 u_k2, p_k4 = μ_k + 2σ_k2 u_k2, p_k5 = μ_k - 2σ_k3 u_k3, p_k6 = μ_k + 2σ_k3 u_k3; the points p_k1, ..., p_k6 determine the Gaussian ellipsoid corresponding to G_k, σ_k1, σ_k2, σ_k3 are the eigenvalues of Σ_k, u_k1, u_k2, u_k3 are the unit eigenvectors of Σ_k, and if an eigenvalue of Σ_k is 0, the corresponding eigenvector is obtained from the cross product of the other two eigenvectors;
subsequently, the Gaussian mixture components G_k with larger d_k are taken out of P_s(x) one by one; if, for all Gaussian components in the set Ω with parameters μ_Ω and d_Ω, the condition (the formula image is not reproduced) is satisfied, with d_k = (μ_k - v_sj) · n_sj and d_Ω = (μ_Ω - v_sj) · n_sj, G_k is added to the set Ω, otherwise it is added to the set Ω* = {(k, p_k1, p_k2, p_k3, p_k4, p_k5, p_k6)}; these steps are repeated until every Gaussian mixture component in P_s(x) has been assigned to the set Ω or the set Ω*;
next, the projection points, in the field of view of the structured light three-dimensional scanner 4, of the Gaussian ellipsoids corresponding to the Gaussian mixture components G_k in the sets Ω and Ω* are calculated, i.e. for a given element (k, p_k1, p_k2, p_k3, p_k4, p_k5, p_k6) the projections of p_k1, ..., p_k6 are computed (the formula images are not reproduced);
then, the projection contour features of the Gaussian ellipsoids corresponding to the Gaussian mixture components G_k in the sets Ω and Ω* are calculated; the projection contour feature (its form is defined by a formula image that is not reproduced) represents the projection contour, within the field of view of the structured light three-dimensional scanner, of the Gaussian ellipsoid corresponding to the Gaussian mixture component G_k observed from the candidate measurement point {v_sj, n_sj}; the projection contour is a convex polygon generated from the projection points, and the projection contour feature consists of the projection center and the vectors s_v pointing perpendicularly from the projection center to the edges of the convex polygon, the norm of s_v being the distance from the projection center to the polygon edge, see FIG. 8(a); in addition, the distribution of the projection points may be as in FIG. 8(b), in which case the constructed convex polygon encloses some of the points; the specific steps are as follows:
compute s_i for each projection point (the formula image is not reproduced), select the s_i with the largest ||s_i|| as the initial value, and add i to the list Circle = {}; select a j from List = {1, 2, 3, 4, 5, 6}, i ≠ j, and compute s_v (the formula image is not reproduced); select s_k from {s_1, s_2, s_3, s_4, s_5, s_6}; if all s_k satisfy s_k · s_v ≤ ||s_v||², k ≠ i ≠ j, remove j from List = {1, 2, 3, 4, 5, 6}, add i to Circle, add s_v to the projection contour feature set S_jk of the Gaussian mixture component G_k, and take s_j as the new initial value s_i; if the first and last elements of the list Circle are the same, stop the iteration, otherwise return and continue computing s_v;
to calculate the projection contour area of the Gaussian ellipsoid corresponding to a Gaussian mixture component G_k, take {a, b, c}, {a, b, d}, {a, b, e} and {a, b, f} in turn from Circle and compute the partial areas (the formula images are not reproduced); if Circle contains only 5 elements, only S_1 and S_2 need to be calculated;
for a Gaussian mixture component G_k in the set Ω*, the projection grid of the corresponding Gaussian ellipsoid is calculated: let the obtained Circle be {a, b, c, d, e, f, a}, take two adjacent elements from Circle in turn, and compute the grid points (the formula images are not reproduced), where ⌈ ⌉ denotes rounding up, i = 1, ..., n_v - 1, j = 1, ..., n_l - 1; the set constructed from these grid points is denoted by a symbol given in a formula image that is not reproduced;
after the above steps are completed, the Gaussian mixture components G_k in the set Ω* are processed next: the points in the projection grid of the Gaussian ellipsoid of G_k are used to evaluate whether the Gaussian ellipsoid is occluded and, if so, the degree to which it is covered; when all projection points of the Gaussian ellipsoid of G_k lie outside the projection contour region of the Gaussian ellipsoid of G_a, and all projection points of the Gaussian ellipsoid of G_a lie outside the projection contour region of the Gaussian ellipsoid of G_k, there is no occlusion between the Gaussian ellipsoid of G_k and the Gaussian ellipsoid of G_a; the occlusion test computes the projection of a point onto the contour directions (the formula images are not reproduced): if there exists a contour vector s_v, v = 1, ..., 6, for which this projection exceeds ||s_v||², the projected point lies outside the projection contour region of the Gaussian ellipsoid of G_k, otherwise it lies within that region; the projection point of the Gaussian ellipsoid of G_b is given by a formula image that is not reproduced; when the Gaussian ellipsoid of G_k has an occlusion problem, interpolation points are first calculated for the projection contour region of the Gaussian ellipsoid of G_k (the formula image is not reproduced), and the number of them falling within another projection contour region is then counted (the formula image is not reproduced);
The degree of coverage under occlusion is judged by the ratio of the number of interpolation points falling within a given Gaussian ellipsoid projection contour region to the total number of interpolation points; the specific steps are as follows:
let F_k = {G_kf | f = 1, ..., N_F} be the set of Gaussian mixture components in P_s(x) whose distance to the candidate measurement point {v_sj, n_sj} is smaller than d_k, let the number of interpolation points be given by the formula image that is not reproduced, and initialize the coverage counting variable cover = 0;
for each interpolation point, if one of the exclusion conditions with respect to the projection contour regions of the components G_kf, f = 1, ..., N_F, is satisfied (the condition formulas are given by images that are not reproduced), cover is left unchanged; otherwise cover is incremented by 1;
if the statistical coverage rate exceeds a set threshold η* (the formula image is not reproduced), the Gaussian mixture component G_k in the set Ω* is considered completely occluded and P_s(x) does not count G_k; otherwise P_s(x) must count the information gain of G_k, and when only partial occlusion exists the information gain of G_k must also be counted;
the point cloud distribution that may be observed at the candidate measurement point {v_sj, n_sj} is then P_sj*(x) (the formula image is not reproduced);
finally, the information gain Information_sj observed from the candidate measurement point {v_sj, n_sj} is calculated (the formula image is not reproduced), where σ_k is the eigenvalue of G_k corresponding to the eigenvector parallel to the reference direction u_sk;
S242: after the information gain observed from each candidate measurement point has been calculated, an optimal group of measurement points {D_i} is selected; referring to FIG. 10, the specific steps are:
from the candidate measurement point set {V_j}, the candidate measurement point {v_sj, n_sj} with the largest information gain Information_sj is found as the initial measurement point, {v_sj, n_sj} is added to the measurement point set D, and P_s(x) = P_sj*(x) is set;
from the candidate measurement point set {V_j}, the next candidate measurement point {v_si, n_si} is selected, ensuring that the two observations P_s(x) and P_si*(x) have Gaussian mixture components in common, and the contour area S_shared of all common Gaussian components G_shared and the quantity S_shared,sum are calculated (the formula image is not reproduced), where P_shared is the set of all common Gaussian components, S_shared takes the smaller projected contour area of the two observations, and η = 0 if G_shared has no occlusion problem, otherwise η = η_k;
the JS divergence between P_s(x) and P_si*(x) is estimated by calculating D_JS(P_si*(x) || P_s(x)) (the formula image is not reproduced); the KL divergence between two Gaussian mixture models can be estimated by the steps given in formula images that are not reproduced, where d is the dimension of the multivariate Gaussian distribution, here d = 3;
the {v_si, n_si} with S_shared,sum > S_shared,min and the largest D_JS(P_si*(x) || P_s(x)) is selected as the next measurement point, {v_si, n_si} is removed from the candidate measurement point set {V_j} and added to the measurement point set D, and P_s(x) = P_s(x) + P_si*(x) - P_shared(x) is updated, where S_shared,min is a set minimum amount of information shared by two adjacent observations and P_shared(x) is the mixture distribution of all Gaussian components in P_shared;
if D_JS(P_s(x) || P(x)) < γ, the measurement point set is output; otherwise the next candidate measurement point {v_si, n_si} is selected from the candidate measurement point set {V_j} and the subsequent steps are repeated until D_JS(P_s(x) || P(x)) < γ, where γ is a small positive value; the condition D_JS(P_s(x) || P(x)) < γ describes the difference between the point cloud distribution observed from all measurement points and the overall point cloud distribution: the smaller this difference, the better the point cloud distribution observed from all measurement points reflects the overall point cloud distribution;
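The stopping test D_JS(P_s(x) || P(x)) < γ needs a numerical estimate of the JS divergence between two Gaussian mixtures. The patent's closed-form estimator is in unreproduced figures; the sketch below uses a simple Monte Carlo stand-in (assuming fitted scikit-learn GaussianMixture objects) that serves the same stopping test.

import numpy as np

def js_divergence_mc(gmm_p, gmm_q, n_samples=2000):
    xp, _ = gmm_p.sample(n_samples)
    xq, _ = gmm_q.sample(n_samples)
    def log_m(x):   # log density of the midpoint mixture M = (P + Q) / 2
        return np.logaddexp(gmm_p.score_samples(x), gmm_q.score_samples(x)) - np.log(2.0)
    kl_pm = np.mean(gmm_p.score_samples(xp) - log_m(xp))
    kl_qm = np.mean(gmm_q.score_samples(xq) - log_m(xq))
    return 0.5 * (kl_pm + kl_qm)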
S25: letting D = {D_i | i = 1, ..., n} be the selected optimal set of measurement points, the optimal measurement point sequence D_sequence = {D_{s_1}, ..., D_{s_n}} is determined, where s_i ∈ {1, ..., n};
first, a graph with constraint relations is generated from the discrete points; then the graph is searched by a simulated annealing method for the shortest path traversing all measurement points; referring to FIG. 11, the specific steps are as follows:
S251: the three-dimensional discrete points in the set D are connected into a triangular mesh by Delaunay triangulation and the graph Graph = {D_i, d_ij} is constructed, i.e. the vertices (measurement points) of the triangular mesh are the vertices of Graph and the edges (distances between adjacent measurement points) of the triangular mesh are the edges of Graph; the center point p_c of each triangular patch is calculated (the formula image is not reproduced), where D_k1, D_k2, D_k3 are the vertices of the k-th triangular patch;
if a Graph edge passes through or comes too close to the measurement object, that edge should be removed to generate a reasonable Graph, and the triangular patch connected to that edge is deleted;
a cylindrical surface is used to enclose the measurement object; the spatial region of the measurement object, the center of the bottom face of the cylinder and the center of the top face are defined by formula images that are not reproduced; referring to FIG. 6(b), λ_v1, λ_v2, s_v1, s_v2 are calculated (the formula images are not reproduced) together with d_v1 = min{||v_i - p_c||, ||v_j - p_c||} and d_v2 = min{||v_i* - p_c*||, ||v_j* - p_c*||}; the edges satisfying the condition "λ_v1 > 0, λ_v2 > 0 and ||s_v1|| > d_min, ||s_v2|| > d_min" or "λ_v1 > 0, λ_v2 ≤ 0 and ||s_v1|| > d_min, d_v2 > d_min" or "λ_v1 ≤ 0, λ_v2 > 0 and d_v1 > d_min, ||s_v2|| > d_min" or "λ_v1 ≤ 0, λ_v2 ≤ 0 and d_v1 > d_min, d_v2 > d_min" are kept, where v_i*, v_j*, p_c* are the projections of the points v_i, v_j, p_c onto the XY plane and d_min is the set minimum distance between a Graph edge and the cylindrical surface enclosing the measurement object;
s252: initial annealing temperature TkK is 0 and a cooling factor α;
s253: according to Graph ═ Di,dij} generating a path
Figure BDA0002732582630000197
Let global optimal solution Pathbest=Pathi
S254: generating a random number j, j 1
Figure BDA0002732582630000198
And (3) calculating:
Figure BDA0002732582630000199
wherein
Figure BDA00027325826300001910
Exchanging the neighborhood structure according to the relationship of the edges in the graph to generate a new path, namely, points on two triangular patches with a common edge can generate a new path by exchanging the sequence of passing through two common vertexes;
s255: if P is presentkSatisfy | | Pk-P1| P < εk-P2If | < epsilon, where epsilon is a very small positive value, the exchange neighborhood structure generates a new path
Figure BDA00027325826300001911
Calculating path length variation
Figure BDA00027325826300001912
Otherwise, returning to the step S254;
s256: if according to the probability
Figure BDA0002732582630000201
Then Pathbest=PathjWherein random (0,1) is [0,1 ]]Random numbers within the interval;
S257: Perform the annealing operation T_{k+1} = αT_k, k ← k+1, Path_i = Path_j. If the convergence criterion is satisfied (Path_best is observed over several adjacent iterations and no longer changes, i.e. it has stagnated), end the annealing process and output the optimal solution Path_best; otherwise, return to step S254;
S258: Search for the longest edge in Path_best; the optimal sequence of measurement points is then D_sequence = {D_s1, ..., D_sk, ..., D_sn}.
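A minimal sketch of the simulated annealing search of steps S252–S258 follows (the parameter values and the segment-reversal neighborhood move are illustrative assumptions, not taken from this disclosure); the longest edge of the resulting tour can then be located, as in step S258, to fix the start and end of the measurement sequence:

```python
# Sketch (assumptions): simulated annealing over the pruned graph of S251 to find a
# short path visiting all measurement points (steps S252-S258).  Neighbourhood moves
# reverse a path segment; a pair of points without a graph edge is penalised rather
# than forbidden.  T0, alpha and the iteration counts are illustrative only.
import math
import random

def path_length(path, graph, penalty=1e6):
    total = 0.0
    for a, b in zip(path, path[1:]):
        total += graph.get((min(a, b), max(a, b)), penalty)
    return total

def anneal(graph, n_points, T0=100.0, alpha=0.95, iters_per_T=200, T_min=1e-3, seed=0):
    rng = random.Random(seed)
    path = list(range(n_points))
    rng.shuffle(path)
    best = path[:]
    T = T0
    while T > T_min:
        for _ in range(iters_per_T):
            i, j = sorted(rng.sample(range(n_points), 2))
            new_path = path[:i] + path[i:j + 1][::-1] + path[j + 1:]    # reverse a segment
            dL = path_length(new_path, graph) - path_length(path, graph)
            if dL < 0 or rng.random() < math.exp(-dL / T):              # Metropolis rule
                path = new_path
                if path_length(path, graph) < path_length(best, graph):
                    best = path[:]
        T *= alpha                                                      # annealing step S257
    return best
```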
Further, referring to fig. 12, planning the measurement path of the robot 5 comprises the following steps:
s31: clamping and fixing an aircraft engine blade 3 on a blade measuring clamp 2, and ensuring that the blade is within the working distance and the visual field range of a structured light three-dimensional scanner 4;
S32: The structured light three-dimensional scanner 4 scans once to obtain measurement data, the noise in the measurement data is removed, and the point cloud of the blade area is segmented out as S = {^camera P_observed,j | j = 1, ..., N_s}, where N_s is the number of points in the point cloud S;
S33: Align the point cloud S of the aero-engine blade area with the point cloud M of the aero-engine blade CAD design model by a point cloud registration algorithm and solve for the transformation matrix ^model T_observed, so that the corresponding point pairs ^camera P_observed,j and ^blade P_model,i satisfy the registration relation; here {model} is the CAD design model coordinate system of the aero-engine blade, {observed} is the measurement data coordinate system of the aero-engine blade, {blade} is the coordinate system of the aero-engine blade to be measured, ^camera P_observed,j is a point in the measurement data of the structured light three-dimensional scanner, and ^blade P_model,i is a point in the aero-engine blade design model;
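As an illustrative sketch (assuming NumPy; the disclosure only requires "a point cloud registration algorithm"), the closed-form rigid alignment used inside ICP-style registration can be written as follows; correspondence search and iteration are omitted:

```python
# Sketch (assumptions): the closed-form rigid alignment step used inside ICP-style
# registration (step S33).  Given corresponding points P (observed) and Q (model),
# it returns a 4x4 homogeneous transform mapping P onto Q in the least-squares
# sense, playing the role of model_T_observed under the assumption that the
# correspondences are already known.
import numpy as np

def rigid_transform(P, Q):
    """P, Q: (n, 3) arrays of corresponding points.  Returns a 4x4 transform."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                  # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```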
S34: From the matrix equation relating the calibration and registration results, solve for the transformation matrix between the coordinate systems {blade} and {axis}: ^axis T_blade = (^base T_axis)^{-1} · ^base T_hand · ^hand T_camera · (^model T_observed)^{-1};
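A minimal sketch of the matrix chain of step S34 (assuming NumPy and 4×4 homogeneous transforms named as in the text) is:

```python
# Sketch (assumptions): composing the calibration and registration results of step
# S34 to obtain axis_T_blade.  All inputs are 4x4 homogeneous transforms named as in
# the patent; their numeric values come from the calibration and registration steps.
import numpy as np

def axis_T_blade(base_T_axis, base_T_hand, hand_T_camera, model_T_observed):
    return (np.linalg.inv(base_T_axis)
            @ base_T_hand
            @ hand_T_camera
            @ np.linalg.inv(model_T_observed))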
S35: According to the optimal sequence of measurement points D_sequence = {D_1, ..., D_k, ..., D_n} and the target measurement point D_k = {v_k, n_k} = {^blade v_k, ^blade n_k}, compute the shortest straight path L along which the robot 5 moves to D_k. The specific steps are as follows:
S351: Compute the current position ^axis P_camera of the structured light three-dimensional scanner 4 in the rotary axis coordinate system of the servo rotary table 1;
S352: Compute the description of the optimal measurement point D_k in the {axis} coordinate system, i.e. transform ^blade v_k and ^blade n_k into ^axis v_k and ^axis n_k using ^axis T_blade;
S353: Compute the vertical distance by which the structured light three-dimensional scanner 4 moves from the current position to the target measurement point along e_z, where e_z = [0 0 1]^T, ^axis v_k is the position of the measurement point in the rotary axis coordinate system of the servo rotary table, and ^axis P_camera is the current position of the structured light three-dimensional scanner in the rotary axis coordinate system of the servo rotary table;
S354: Compute the radial vector r_1 of the measuring point relative to the rotation axis and the radial vector r_2 of the target measurement point relative to the rotation axis: r_1 = ^axis P_camera − (^axis P_camera^T · e_z) e_z and r_2 = ^axis v_k − (^axis v_k^T · e_z) e_z;
S355: calculating the shortest straight path of the robot 5 moving to the target measurement point:
Figure BDA0002732582630000217
the shortest time T of the linear motion of the robot 5rmin=L/vmaxWherein v ismaxThe set maximum linear motion speed of the robot 5;
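An illustrative sketch of steps S351–S355 follows (assuming NumPy; the closed-form expression for L is an assumption, since the exact formula is given only in the figures of the original):

```python
# Sketch (assumptions): steps S351-S355 in the rotary-axis frame {axis}.  The
# vertical offset h is taken along e_z = [0, 0, 1], r1/r2 are the radial vectors of
# the current scanner position and the target point, and the straight-line path is
# assumed to be L = sqrt(h**2 + (|r2| - |r1|)**2), i.e. the rotary table absorbs the
# angular part of the motion.
import numpy as np

def shortest_straight_path(axis_P_camera, axis_v_k, v_max):
    e_z = np.array([0.0, 0.0, 1.0])
    h = e_z @ (axis_v_k - axis_P_camera)                      # S353: vertical distance
    r1 = axis_P_camera - (axis_P_camera @ e_z) * e_z          # S354: radial vectors
    r2 = axis_v_k - (axis_v_k @ e_z) * e_z
    L = np.hypot(h, np.linalg.norm(r2) - np.linalg.norm(r1))  # S355 (assumed form)
    T_rmin = L / v_max                                        # shortest linear-motion time
    return L, T_rmin, r1, r2
```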
S36: Compute the position and direction of the target measurement point in the base coordinate system of the robot 5, where ^axis v_k and ^axis n_k are respectively the position and direction of the measurement point in the rotary axis coordinate system of the servo rotary table, ^axis P_camera is the current position of the structured light three-dimensional scanner 4 in the rotary axis coordinate system of the servo rotary table 1, ^axis T_blade is the transformation matrix between the coordinate systems {blade} and {axis}, r_1 = ^axis P_camera − (^axis P_camera^T · e_z) e_z, e_z = [0 0 1]^T, s_k = 1, ..., n, and R_z(θ) is the rotation matrix of the servo rotary table 1 rotated by the angle θ about the rotation axis direction z;
S37: Compute the translational motion of the structured light three-dimensional scanner 4 from its current pose to the target measurement point, and the rotation matrix of the minimum rotational motion R = cos θ · I + (1 − cos θ) · n n^T + sin θ · n^, where θ = arccos(λ), n = ^base w / ||^base w||, n^ is the antisymmetric matrix of n = [n_x n_y n_z]^T, the position of the target measurement point of the structured light three-dimensional scanner is given by the corresponding expression, and ^base P_camera and ^base n_z are respectively the current position and direction of the structured light three-dimensional scanner 4;
S38: Compute the target pose ξ* of the motion of the robot 5. With x = m_32 − m_23, y = m_13 − m_31, z = m_21 − m_12 and the quaternion q = w + xi + yj + zk, the target pose of the motion of the robot 5 is obtained, where the position of the end of the robot 5 at the target measurement point and R*, the attitude of the end of the robot 5 at the target measurement point, are determined from the rotation matrix R, ^base P_hand is the current position of the end of the robot 5, and ^base R_hand is the current attitude of the end of the robot 5; the motion trajectory of the robot 5 is generated by linear interpolation by the robot controller in the robot control cabinet 6;
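A minimal sketch of the quaternion packing of step S38 follows (assuming NumPy and the unnormalised components named above; only the generic w > 0 branch is handled):

```python
# Sketch (assumptions): step S38, converting the minimum-rotation matrix R into a
# quaternion q = w + xi + yj + zk using the components named in the patent
# (x = m32 - m23, etc., with 1-indexed matrix entries), then packing the target
# pose as position + unit quaternion.
import numpy as np

def pose_from_R_and_p(R, base_P_target):
    m = R
    w = 0.5 * np.sqrt(max(1.0 + m[0, 0] + m[1, 1] + m[2, 2], 0.0))
    x, y, z = m[2, 1] - m[1, 2], m[0, 2] - m[2, 0], m[1, 0] - m[0, 1]
    q = np.array([w, x / (4 * w), y / (4 * w), z / (4 * w)])
    q /= np.linalg.norm(q)                     # normalise to a unit quaternion
    return np.concatenate([base_P_target, q])  # [px, py, pz, qw, qx, qy, qz]
```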
S39: Compute the rotation angle α of the motion of the servo rotary table 1 and the target position of the rotation of the servo rotary table 1; the shortest time is T_amin = α / ω_max, where e_z is [0 0 1]^T, θ_a is the current position of the servo rotary table 1, sgn() is the sign function, ω_max is the maximum angular velocity at which the servo rotary table 1 moves steadily, r_1 is the radial vector of the measuring point relative to the rotation axis, and r_2 is the radial vector of the target measurement point relative to the rotation axis.
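An illustrative sketch of step S39 follows (assuming NumPy; the signed-angle expression is a reconstruction, since the exact formula is given only in the figures of the original):

```python
# Sketch (assumptions): step S39, the signed rotation angle of the servo rotary
# table computed from the radial vectors r1 (current) and r2 (target).  The sign
# is taken from the z-component of r1 x r2; theta_a is the current table position
# and omega_max its maximum steady angular velocity.
import numpy as np

def table_rotation(r1, r2, theta_a, omega_max):
    e_z = np.array([0.0, 0.0, 1.0])
    cos_a = np.clip(r1 @ r2 / (np.linalg.norm(r1) * np.linalg.norm(r2)), -1.0, 1.0)
    alpha = np.sign(np.cross(r1, r2) @ e_z) * np.arccos(cos_a)  # signed rotation angle
    theta_target = theta_a + alpha                              # target table position
    T_amin = abs(alpha) / omega_max                             # shortest rotation time
    return alpha, theta_target, T_amin
```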
Further, the method for measuring the three-dimensional appearance of the blade 3 of the aircraft engine comprises the following steps:
s41: controlling the robot 5 and the servo rotary worktable 1 to move to a target pose according to the planned measuring path of the robot 5;
s42: the movement velocity v of the robot 5 and the angular velocity ω of the servo rotary table 1 are adjusted to L/T and α/T, respectivelyMotion synchronization, where T ═ max { Tamin,TrminIn which T isrminIs the shortest time, T, of the linear motion of the robot 5aminThe shortest movement time of the servo rotary table 1 is defined as alpha, and the rotation angle of the servo rotary table 1 is defined as alpha;
S43: The structured light three-dimensional scanner 4 scans once, the measurement data is transmitted to the industrial computer 7 for processing, and the target measurement point D_k is then updated;
S44: steps S41-S43 are repeated until the robot 5 traverses all target measurement points.
Further, the method for processing the measurement data of the blade of the aircraft engine comprises the following steps:
s51: denoising point cloud data;
s52: simplifying the point cloud of the blades of the aircraft engine;
s53: segmenting point clouds in the blade area of the aircraft engine;
s54: splicing point clouds of blades of the aircraft engine;
s55: reconstructing a curved surface model of the blade of the aircraft engine;
s56: and analyzing and evaluating the dimension error of the profile of the blade of the aircraft engine.
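As an illustrative sketch of steps S51–S52 (assuming the Open3D library, which is not named in this disclosure; file names and parameter values are placeholders), denoising and simplification can be performed as follows; segmentation, stitching, surface reconstruction and error analysis (S53–S56) are not shown:

```python
# Sketch (assumptions): a minimal point-cloud processing chain for steps S51-S52
# using Open3D: statistical outlier removal for denoising and voxel down-sampling
# for simplification.
import open3d as o3d

pcd = o3d.io.read_point_cloud("blade_scan.ply")                 # raw scanner data
pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20,        # S51: denoise
                                        std_ratio=2.0)
pcd = pcd.voxel_down_sample(voxel_size=0.2)                     # S52: simplify (grid size assumed)
o3d.io.write_point_cloud("blade_scan_clean.ply", pcd)
```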
Referring to fig. 13 and 14, the invention further provides an autonomous measuring system for an aircraft engine blade robot, which comprises a hardware system and a software system, wherein:
the hardware system comprises a servo rotary worktable 1, a blade measuring clamp 2, a structured light three-dimensional scanner 4, a robot 5, a robot control cabinet 6 and an industrial computer 7; the servo rotary workbench 1 is connected and communicated with an industrial computer 7 through a field bus, and is used for being matched with the robot 5 to measure and executing a motion instruction issued by the industrial computer 7; the blade measuring clamp 2 is fixed on the servo rotary worktable 1 through a connecting piece and is used for clamping and fixing an aero-engine blade 3 to be measured; the structured light three-dimensional scanner 4 is fixed on a flange at the tail end of the robot 5 through a connecting piece and is used for collecting measurement data, and the measurement data are transmitted to an industrial computer 7 through an industrial Ethernet for processing; one end of the robot control cabinet 6 is communicated with the robot 5 through a field bus, and the other end of the robot control cabinet is communicated with the industrial computer 7 through a network, so that the robot 5 is controlled to move, and the structured light three-dimensional scanner 4 at the tail end of the robot 5 moves according to a planned measuring path;
further, the robot 5 communicates with the industrial computer 7 through a TCP/IP protocol or a field bus;
further, the structured light three-dimensional scanner 4 is connected with the industrial computer 7 through a GigE interface and a network cable;
further, the servo driver of the servo rotary table 1 communicates with the industrial computer 7 via an EtherCAT fieldbus, an RS232 serial port, or a CANopen bus;
the software system includes: the system comprises a measuring system calibration module, a robot measuring point layout module, a robot measuring path planning module, a point cloud data processing and visualization module, a three-dimensional model format conversion module, a measuring system virtual simulation module, a measuring process control module and a human-computer interaction module respectively connected with the measuring system virtual simulation module, wherein:
the measuring system calibration module is used for calibrating the servo rotary workbench 1, the structured light three-dimensional scanner 4 and the robot 5, and the calibration result is applied to the robot measuring point layout module;
the robot measuring point layout module generates an optimal measuring point according to the aero-engine blade design model and outputs the generated measuring point data to the robot measuring path planning module;
the robot measurement path planning module searches out a path with the shortest motion of the robot according to the generated optimal measurement point and outputs path data to the measurement process control module;
the measurement process control module is used for controlling the robot 5 to move according to a planned path and controlling the servo rotary worktable 1 and the robot 5 to move to a target posture;
the point cloud data processing and visualization module is used for denoising, simplifying, dividing, splicing and reconstructing point cloud data acquired by the structured light three-dimensional scanner 4 and analyzing and visualizing blade profile errors;
the three-dimensional model format conversion module is used for importing various aero-engine blade design models and exporting various three-dimensional models and can perform three-dimensional model format conversion;
the measurement system virtual simulation module is used for simulating the effect achieved by measurement in a real environment and the motion conditions of the servo rotary worktable and the robot, and improving the reliability of the measurement system and the control accuracy and response speed of the servo rotary worktable and the robot;
the man-machine interaction module monitors input, output and intermediate data of each module, provides a graphical user interface for operators to use, and manages the measurement process;
Further, the software system runs on a Windows operating system and uses ROS 2; the DDS publish/subscribe architecture and QoS (quality of service) policies are used to distribute data efficiently, flexibly and in real time, to meet the application requirements of distributed real-time communication, and to allow multiple robots to perform measurement tasks simultaneously.
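A minimal sketch of such a ROS 2 node (assuming the rclpy client library; topic name, message type and QoS settings are illustrative) is:

```python
# Sketch (assumptions): a minimal ROS 2 (rclpy) node publishing measurement status
# with an explicit DDS QoS profile, illustrating the publish/subscribe + QoS design
# described above.
import rclpy
from rclpy.node import Node
from rclpy.qos import QoSProfile, ReliabilityPolicy, HistoryPolicy
from std_msgs.msg import String

class MeasurementPublisher(Node):
    def __init__(self):
        super().__init__('measurement_status')
        qos = QoSProfile(history=HistoryPolicy.KEEP_LAST,
                         depth=10,
                         reliability=ReliabilityPolicy.RELIABLE)
        self.pub = self.create_publisher(String, 'blade_measurement/status', qos)
        self.timer = self.create_timer(1.0, self.tick)

    def tick(self):
        msg = String()
        msg.data = 'measuring'
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(MeasurementPublisher())

if __name__ == '__main__':
    main()
```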
The invention has the following beneficial effects: the invention independently develops an autonomous measurement method and system for an aero-engine blade robot, which automatically generates the measurement points and plans the robot measurement path from the aero-engine blade design model, thereby automating the measurement; with the structured light three-dimensional scanner fixed to the flange at the end of the robot by a connecting piece, large-range and efficient measurement is achieved; the cooperation of the servo rotary table and the robot also greatly reduces the measurement time. Meanwhile, the system is highly open and extensible, and can integrate functional modules such as network communication, motion control, point cloud data processing and visualization, profile dimension error analysis, and ASC/PLY/IGES output, so as to meet the needs of commercial application and academic research.
In the description above, numerous specific details are set forth in order to provide a thorough understanding of the present invention; however, the present invention may be practiced in other ways than those specifically described herein, and the embodiments described herein should therefore not be construed as limiting the scope of the present invention.
In conclusion, although the present invention has been described with reference to the preferred embodiments, various changes and modifications may be made by those skilled in the art, and such changes and modifications shall fall within the scope of the present invention provided they do not depart from it.

Claims (7)

1. The autonomous measuring method of the aero-engine blade robot is characterized by comprising the following steps:
s1: the calibration servo rotary worktable, the structured light three-dimensional scanner and the robot specifically comprise the following steps:
s11: fixing three precise standard balls on a servo rotary worktable, and ensuring that the three precise standard balls are within the working distance and the visual field range of the structured light three-dimensional scanner;
s12: keeping the robot still, rotating the servo rotary worktable at a stable rotation angle alpha, scanning once by the structured light three-dimensional scanner to obtain measurement data, and calculating an average clustering sphere center P;
S13: Keep the robot still, rotate the servo rotary table N times, each time by the steady rotation angle α, so that the total angle exceeds 360 degrees, record the average cluster centers P_i, where i = 1, ..., N, and fit the point set {P_i} by the least squares method to obtain the radius R_0 of the circular motion of the average cluster centers P_i about the rotation axis of the servo rotary table;
S14: Move the robot and set the end attitude of the robot to ξ_j; the servo rotary table then rotates by the steady rotation angle θ_i, the structured light three-dimensional scanner scans once, and the observed average cluster center P_ij is computed, where i = 1, ..., N and j = 1, ..., M, while ensuring that the condition in the corresponding formula holds and that, when the end of the robot is in attitude ξ_j, the three precision standard balls are within the working distance and field of view of the structured light three-dimensional scanner;
S15: According to the observation result R_0 of step S13 and the observation results {P_ij | i = 1, ..., N, j = 1, ..., M} of step S14, perform the hand-eye calibration ^hand T_camera of the robot and the structured light three-dimensional scanner and determine the pose transformation relation ^base T_axis between the rotation axis of the servo rotary table and the robot base coordinate system, where {hand} is the robot end coordinate system, {base} is the robot base coordinate system, {camera} is the structured light three-dimensional scanner coordinate system, and {axis} is the coordinate system of the rotation axis of the servo rotary table;
s2: arranging measurement points according to an aircraft engine blade design model;
s3: planning a path measured by the robot;
s4: measuring the three-dimensional appearance of the blade of the aero-engine;
s5: measurement data of the aircraft engine blade is processed.
2. The aircraft engine blade robot autonomous measuring method according to claim 1, wherein the arranging of the measuring points according to the aircraft engine blade design model comprises the following steps:
S21: Import the aero-engine blade design model and perform Poisson disc sampling to generate the point cloud M = {^blade P_model,i | i = 1, ..., N_m}, where N_m is the number of points in the point cloud M;
S22: Cluster the point cloud M to generate a Gaussian mixture model p(x) = Σ_{i=1}^{K} λ_i N(x; μ_i, Σ_i), where λ_i and K are the weights and the number of components of the Gaussian mixture model, and μ_i and Σ_i are respectively the mean and the covariance matrix of the Gaussian distribution N(x; μ_i, Σ_i);
S23: According to μ_i, Σ_i and the working distance D of the structured light three-dimensional scanner, generate a set of candidate measurement points {V_i | i = 1, ..., K}, where V_i contains the position v_i and direction n_i of the candidate measurement point;
S24: From the set of candidate measurement points {V_i | i = 1, ..., K}, select the set of best candidate measurement points {D_i | i = 1, ..., n}, where n ≤ K, ensuring that the robot obtains the complete model by measurement, where n is the number of the finally arranged measurement points;
S25: According to the selected set of best candidate measurement points D = {D_i | i = 1, ..., n}, determine the optimal sequence of measurement points D_sequence = {D_s1, ..., D_sk, ..., D_sn}, where s_i = 1, ..., n.
3. The aircraft engine blade robot autonomous measuring method according to claim 2, wherein the planning of the robot-measured path comprises the following steps:
s31: clamping and fixing the aero-engine blade on a blade measuring clamp, and ensuring that the blade is within the working distance and the visual field range of a structured light three-dimensional scanner;
S32: The structured light three-dimensional scanner scans once to obtain measurement data, the noise in the measurement data is removed, and the point cloud of the blade area is segmented out as S = {^camera P_observed,j | j = 1, ..., N_s}, where N_s is the number of points in the point cloud S;
S33: Align the point cloud S of the aero-engine blade area with the point cloud M of the aero-engine blade design model by a point cloud registration algorithm and solve for the transformation matrix ^model T_observed, ensuring that the corresponding point pairs ^camera P_observed,j and ^blade P_model,i satisfy the registration relation, where {model} is the aero-engine blade design model coordinate system, {observed} is the aero-engine blade measurement data coordinate system, {blade} is the coordinate system of the aero-engine blade to be measured, ^camera P_observed,j is a point in the measurement data of the structured light three-dimensional scanner, and ^blade P_model,i is a point in the aero-engine blade design model;
S34: From the matrix equation, solve for the transformation matrix between the coordinate systems {blade} and {axis}: ^axis T_blade = (^base T_axis)^{-1} · ^base T_hand · ^hand T_camera · (^model T_observed)^{-1};
S35: According to the optimal sequence of measurement points D_sequence = {D_1, ..., D_k, ..., D_n} and the target measurement point D_k = {v_k, n_k} = {^blade v_k, ^blade n_k}, compute the shortest straight path L along which the robot moves to D_k;
S36: Compute the position and direction of the target measurement point in the robot base coordinate system, where ^axis v_k and ^axis n_k are respectively the position and direction of the measurement point in the rotary axis coordinate system of the servo rotary table, ^axis P_camera is the current position of the structured light three-dimensional scanner in the rotary axis coordinate system of the servo rotary table, ^axis T_blade is the transformation matrix between the coordinate systems {blade} and {axis}, r_1 = ^axis P_camera − (^axis P_camera^T · e_z) e_z, and R_z(θ) is the rotation matrix of the rotary table rotated by the angle θ about the rotation axis z;
S37: Compute the translational motion of the structured light three-dimensional scanner from its current pose to the target measurement point, and the rotation matrix of the minimum rotational motion R = cos θ · I + (1 − cos θ) · n n^T + sin θ · n^, where θ = arccos(λ), n = ^base w / ||^base w||, n^ is the antisymmetric matrix of n = [n_x n_y n_z]^T, the position of the target measurement point of the structured light three-dimensional scanner is given by the corresponding expression, and ^base P_camera and ^base n_z are respectively the current position and direction of the structured light three-dimensional scanner;
S38: Compute the target pose ξ* of the robot motion. With x = m_32 − m_23, y = m_13 − m_31, z = m_21 − m_12 and the quaternion q = w + xi + yj + zk, the target pose of the robot motion is obtained, where the position of the robot end at the target measurement point and R*, the attitude of the robot end at the target measurement point, are determined from the rotation matrix R, ^base P_hand is the current position of the robot end, and ^base R_hand is the current attitude of the robot end; the motion trajectory of the robot is generated by linear interpolation by the robot controller in the robot control cabinet;
S39: Compute the rotation angle α of the motion of the servo rotary table and the target position of the servo rotary table; the shortest time of the motion of the servo rotary table is T_amin = α / ω_max, where e_z is [0 0 1]^T, θ_a is the current position of the servo rotary table, sgn() is the sign function, ω_max is the maximum angular velocity at which the servo rotary table moves steadily, r_1 is the radial vector of the measuring point relative to the rotation axis, and r_2 is the radial vector of the target measurement point relative to the rotation axis.
4. The aircraft engine blade robot autonomous measuring method according to claim 3, characterized in that computing the shortest straight path L along which the robot moves to D_k comprises the following steps:
S351: Compute the current position ^axis P_camera of the structured light three-dimensional scanner in the rotary axis coordinate system of the servo rotary table;
S352: Compute the description of the optimal measurement point D_k in the {axis} coordinate system, i.e. transform ^blade v_k and ^blade n_k into ^axis v_k and ^axis n_k using ^axis T_blade;
S353: Compute the vertical distance by which the structured light three-dimensional scanner moves from the current position to the target measurement point along e_z, where e_z = [0 0 1]^T, ^axis v_k is the position of the measurement point in the rotary axis coordinate system of the servo rotary table, and ^axis P_camera is the current position of the structured light three-dimensional scanner in the rotary axis coordinate system of the servo rotary table;
S354: Compute the radial vector r_1 of the measuring point relative to the rotation axis and the radial vector r_2 of the target measurement point relative to the rotation axis: r_1 = ^axis P_camera − (^axis P_camera^T · e_z) e_z and r_2 = ^axis v_k − (^axis v_k^T · e_z) e_z;
S355: Compute the shortest straight path L of the robot moving to the target measurement point; the shortest time of the linear motion of the robot is T_rmin = L / v_max, where v_max is the set maximum speed of the linear motion of the robot.
5. The aircraft engine blade robot autonomous measuring method according to claim 4, wherein the measuring of the three-dimensional topography of the aircraft engine blade comprises the following steps:
s41: controlling the robot and the servo rotary worktable to move to a target pose according to the planned robot measurement path;
S42: Adjust the motion speed of the robot to v = L/T and the angular velocity of the servo rotary table to ω = α/T so that the two move synchronously, where T = max{T_amin, T_rmin}, T_rmin is the shortest time of the linear motion of the robot, T_amin is the shortest time of the motion of the servo rotary table, and α is the rotation angle of the motion of the servo rotary table;
S43: The structured light three-dimensional scanner scans once, the measurement data is transmitted to the industrial computer for processing, and the target measurement point D_k is updated;
S44: steps S41-S43 are repeated until the robot traverses all target measurement points.
6. The aircraft engine blade robot autonomous measuring method according to claim 5, characterized in that the processing of the measurement data of the aircraft engine blade comprises the following steps:
s51: denoising point cloud data;
s52: simplifying the point cloud of the blades of the aircraft engine;
s53: segmenting point clouds in the blade area of the aircraft engine;
s54: splicing point clouds of blades of the aircraft engine;
s55: reconstructing a curved surface model of the blade of the aircraft engine;
s56: and analyzing and evaluating the dimension error of the profile of the blade of the aircraft engine.
7. An autonomous measuring system of an aircraft engine blade robot, characterized in that the system comprises a hardware system and a software system, wherein:
the hardware system comprises a servo rotary workbench, a blade measuring clamp, a structured light three-dimensional scanner, a robot control cabinet and an industrial computer; wherein:
the servo rotary workbench is connected and communicated with the industrial computer through a field bus and is used for being matched with the robot to measure and executing a motion instruction issued by the industrial computer;
the blade measuring clamp is fixed on the servo rotary workbench through a connecting piece and is used for clamping and fixing the aero-engine blade to be measured;
the structured light three-dimensional scanner is fixed on a tail end flange of the robot through a connecting piece and used for collecting measurement data, and the measurement data are transmitted to an industrial computer through an industrial Ethernet for processing;
one end of the robot control cabinet is communicated with the robot through a field bus, and the other end of the robot control cabinet is communicated with the industrial computer through a network and used for controlling the motion of the robot so that the structured light three-dimensional scanner at the tail end of the robot moves according to a planned measuring path;
the software system includes: the system comprises a measuring system calibration module, a robot measuring point layout module, a robot measuring path planning module, a point cloud data processing and visualization module, a three-dimensional model format conversion module, a measuring system virtual simulation module, a measuring process control module and a human-computer interaction module respectively connected with the measuring system virtual simulation module, wherein:
the measuring system calibration module is used for calibrating the servo rotary workbench, the structured light three-dimensional scanner and the robot, and a calibration result is applied to the robot measuring point layout module;
the robot measuring point layout module generates an optimal measuring point according to the aero-engine blade design model and outputs the generated measuring point data to the robot measuring path planning module;
the robot measurement path planning module searches out a path with the shortest motion of the robot according to the generated optimal measurement point and outputs path data to the measurement process control module;
the measurement process control module is used for controlling the robot to move according to a planned path and controlling the servo rotary worktable and the robot to move to a target posture;
the point cloud data processing and visualization module is used for denoising, simplifying, dividing, splicing and reconstructing point cloud data acquired by the structured light three-dimensional scanner and analyzing and visualizing blade profile errors;
the three-dimensional model format conversion module is used for importing various aero-engine blade design models and exporting various three-dimensional models and can perform three-dimensional model format conversion;
the measurement system virtual simulation module is used for simulating the effect achieved by measurement in a real environment and the motion conditions of the servo rotary worktable and the robot, and improving the reliability of the measurement system and the control accuracy and response speed of the servo rotary worktable and the robot;
the man-machine interaction module monitors input, output and intermediate data of each module, provides a graphical user interface for operators to use, and manages the measurement process;
the system is based on the aero-engine blade robot autonomous measuring method of any one of claims 1-6.
CN202011122728.3A 2020-10-20 2020-10-20 Autonomous measurement method and system for aero-engine blade robot Active CN112284290B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011122728.3A CN112284290B (en) 2020-10-20 2020-10-20 Autonomous measurement method and system for aero-engine blade robot

Publications (2)

Publication Number Publication Date
CN112284290A CN112284290A (en) 2021-01-29
CN112284290B true CN112284290B (en) 2021-09-28

Family

ID=74423112

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011122728.3A Active CN112284290B (en) 2020-10-20 2020-10-20 Autonomous measurement method and system for aero-engine blade robot

Country Status (1)

Country Link
CN (1) CN112284290B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113034608B (en) * 2021-03-11 2022-08-23 东北大学秦皇岛分校 Corneal surface morphology measuring device and method
CN113074658B (en) * 2021-03-19 2022-05-03 中国科学院自动化研究所 Intelligent detection workstation for repairing blade of aero-engine
CN113239500B (en) * 2021-07-12 2021-09-21 四川大学 Reference point neighborhood feature matching method based on covariance matrix
CN113686268A (en) * 2021-07-13 2021-11-23 北京航天计量测试技术研究所 Automatic measuring system and method for exhaust area of turbine guider
CN113834450A (en) * 2021-08-12 2021-12-24 北京航天计量测试技术研究所 Automatic measuring system and method for exhaust area of turbine guider
CN114200891B (en) * 2021-12-10 2023-09-22 上海交通大学 Model-free cylindrical casting inner cavity milling system and track planning method
CN114407006B (en) * 2021-12-16 2024-02-09 中国人民解放军空军工程大学 Control method for repairing and three-dimensional reconstructing aero-engine blade disc and application thereof
CN114459377A (en) * 2022-02-10 2022-05-10 中国航发沈阳发动机研究所 Device and method for measuring blade profile of aircraft engine

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104515478A (en) * 2014-12-11 2015-04-15 华中科技大学 Automatic three-dimensional measuring method and automatic three-dimensional measuring system for high-precision blade of aviation engine
CN104567679A (en) * 2015-01-08 2015-04-29 华中科技大学 Turbine blade visual inspection system
CN105180834A (en) * 2015-05-28 2015-12-23 华中科技大学 Blade air inlet and exhaust edge three-dimensional non-contact measuring device
EP3567340A1 (en) * 2018-05-09 2019-11-13 Siemens Gamesa Renewable Energy A/S Visual inspection arrangement
CN109990701A (en) * 2019-03-04 2019-07-09 华中科技大学 A kind of large complicated carved three-dimensional appearance robot traverse measurement system and method
CN111272099A (en) * 2020-03-31 2020-06-12 许斌 Surface structure light precision detection system for three-dimensional surface morphology of aero-engine blade

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Robotic three-dimensional measurement and data processing of aero turbine blades; Ma Qianli; China Master's Theses Full-text Database, Information Science and Technology Series; 2019-06-15 (No. 6); Section 1.4 (p. 5), Sections 2.1-2.6 (pp. 6-13), Sections 3.1-3.6 (pp. 14-22), Sections 4.1-4.8 (pp. 23-38), Sections 5.1-5.7 (pp. 39-57) *
Line-laser measurement and data processing method for cast turbine blades; Wu An; China Master's Theses Full-text Database, Basic Science Series; 2019-06-15 (No. 6); Section 1.4 (pp. 7-8), Sections 2.1-2.2 (pp. 9-12) *

Also Published As

Publication number Publication date
CN112284290A (en) 2021-01-29

Similar Documents

Publication Publication Date Title
CN112284290B (en) Autonomous measurement method and system for aero-engine blade robot
Elkott et al. Automatic sampling for CMM inspection planning of free-form surfaces
CN109541997A (en) It is a kind of towards the quick, intelligent programmed method of plane/almost plane workpiece spray robot
CN107972034B (en) Complex workpiece trajectory planning simulation system based on ROS platform
Zhou et al. Off-line programming system of industrial robot for spraying manufacturing optimization
CN108508848B (en) Interpolation data-based milling contour error evaluation method
Stojadinovic et al. Ants colony optimisation of a measuring path of prismatic parts on a CMM
CN110059879B (en) Automatic planning method for three-coordinate measurement of vehicle body
CN109683552B (en) Numerical control machining path generation method on complex point cloud model guided by base curve
Das et al. Scan registration with multi-scale k-means normal distributions transform
Hu et al. Automatic generation of efficient and interference-free five-axis scanning path for free-form surface inspection
CN113276130B (en) Free-form surface spraying path planning method and system based on point cloud slice
Zheng et al. A primitive-based 3D reconstruction method for remanufacturing
Li et al. A tracking-based numerical algorithm for efficiently constructing the feasible space of tool axis of a conical ball-end cutter in five-axis machining
Wang et al. A new point cloud slicing based path planning algorithm for robotic spray painting
CN113910001B (en) Numerical control machine tool space error identification method
CN113171913B (en) Spraying path generation method based on three-dimensional point cloud of seat furniture
Liu et al. Task allocation and coordinated motion planning for autonomous multi-robot optical inspection systems
CN110310322A (en) Method for detecting assembly surface of 10-micron-level high-precision device
Gao et al. Accessibility analysis in efficient inspection of closed blisk on 3-axis CMM with 2-axis probe head
CN109636077B (en) Variable node assembly path planning method based on dual local pose transformation
CN113436235B (en) Laser radar and visual point cloud initialization automatic registration method
CN115563574A (en) Multi-sensor air target point trace data fusion method based on comprehensive criterion
CN108595373A (en) It is a kind of without control DEM method for registering
Mi et al. Optimal build orientation based on material changes for FGM parts

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant