CN113902709B - Real-time surface flatness analysis method for guiding aircraft composite skin repair - Google Patents


Info

Publication number
CN113902709B
CN113902709B (application CN202111184947.9A)
Authority
CN
China
Prior art keywords
point cloud
composite material
area
repair
material skin
Prior art date
Legal status
Active
Application number
CN202111184947.9A
Other languages
Chinese (zh)
Other versions
CN113902709A (en)
Inventor
汪俊
闫号
黄安义
易程
Current Assignee
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority claimed from application CN202111184947.9A
Publication of CN113902709A
Application granted
Publication of CN113902709B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/0004 - Industrial image inspection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • G06N3/084 - Backpropagation, e.g. using gradient descent
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/11 - Region-based segmentation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10024 - Color image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10028 - Range image; Depth image; 3D point clouds
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20081 - Training; Learning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30108 - Industrial image inspection
    • G06T2207/30164 - Workpiece; Machine component

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Quality & Reliability (AREA)
  • Health & Medical Sciences (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a real-time surface flatness analysis method for guiding the repair of aircraft composite skin, belonging to the technical field of intelligent processing of composite materials. It is based on hardware devices comprising a structured light scanner, a projector, a visual sensor, a mechanical arm and a trolley. The analysis method comprises the following steps: S1, erecting the scanner and auxiliary equipment; S2, data acquisition; S3, point cloud segmentation of the composite skin repair area; S4, surface fitting of the non-repaired area; S5, obtaining the distance cloud map; S6, three-dimensional projection and visualization of the cloud map; S7, automatically analyzing the surface flatness of the repair area and judging whether the flatness requirement is met: if so, the guiding task ends; otherwise, a technician is guided to grind the uneven area according to the projection information of S6, and steps S2-S6 are repeated after the grinding is finished.

Description

Real-time surface flatness analysis method for guiding aircraft composite skin repair
Technical Field
The invention relates to the technical field of intelligent processing of composite materials, in particular to a real-time surface flatness analysis method for guiding repair of composite material skins of airplanes.
Background
With the continuous development of composite forming technology, advanced composite materials are used extensively in the aviation industry, effectively reducing total aircraft weight and improving aircraft performance. The amount of composite material used in modern large passenger aircraft such as the Boeing 787 has reached 50%, and in the Airbus A350 XWB it reaches 52%; the airframes of modern military helicopters and small aircraft likewise adopt all-composite structures for weight reduction. Although composites have good mechanical properties, their impact resistance is comparatively poor, and composite structures inevitably suffer some damage during manufacturing, installation and service. Taking Lufthansa's 2006 composite maintenance statistics as an example, of 1647 composite damage cases recorded across 243 aircraft, 1248 were caused by improper ground handling, 216 by lightning strikes and 184 by bird strikes.
With the development of composite repair techniques, it has been shown that, with appropriate methods, composite components can be effectively repaired and even restored to their original strength. Adhesive-bonded repair is a key technology for repairing composites and is mainly used in three situations: crack arrest, strength recovery and vibration damping; for strength recovery, the most popular approach is the scarf (miter-cut) repair, which offers high post-repair strength and preserves the aerodynamic profile of the original structure, giving it broad prospects in aircraft structural repair. Composite repair methods from the aerospace industry, which must meet stringent regulations and requirements, have also been applied in other fields such as the wind turbine industry and yacht building; although they have not yet been widely adopted in the broader marine or automotive industries, automotive body shops can already provide high-quality repairs of composite structural components.
Notably, after repair of a composite member the repaired area sits higher than the surrounding non-repaired area, which affects aerodynamic performance to some extent. To meet the aerodynamic-profile requirement, the existing practice is to grind the repair area manually; during grinding, an engineer can judge flatness only by touch, and corresponding evaluation indices and evaluation means are lacking. A method that can analyze the surface flatness of aircraft composite skin in real time, and thereby guide the grinding performed after repair, is therefore of great significance.
Disclosure of Invention
Aiming at the deficiencies of the prior art, the invention provides a real-time surface flatness analysis method for guiding aircraft composite skin repair, which can quantitatively compute the surface flatness of the composite skin during post-repair grinding and visually display it on the workpiece surface.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention discloses a real-time surface flatness analysis method for guiding aircraft composite skin repair, based on the following hardware devices: a structured light scanner, a projector, a visual sensor, a mechanical arm and a trolley; the analysis method comprises the following steps:
s1, erecting the structured light scanner and the remaining hardware devices;
s2, acquiring surface data of the composite material skin, wherein the surface data comprises point cloud data and visible light data;
s3, point cloud segmentation of the composite material skin repair area, namely, the segmentation of the composite material skin repair area and a peripheral non-repair area is realized by using a point cloud segmentation network with fused geometric features;
s4, performing surface fitting on the non-repaired area, namely obtaining point cloud of the composite material skin non-repaired area on the basis of the S3, and performing surface fitting to obtain surface parameters;
s5, obtaining a distance cloud picture, calculating the distance between the point cloud of the composite material skin repairing area segmented in the S3 and the fitted curved surface in the S4, and converting the distance cloud picture into the distance cloud picture;
s6, performing three-dimensional projection visualization on the cloud image, registering the distance cloud image acquired in the S5 with the visible light image of the composite material skin, and projecting the distance cloud image to the surface of the composite material skin by using a projector calibrated with a visual sensor;
s7, judging whether the flatness requirement is met, automatically analyzing the surface flatness of the repair area, judging whether the requirement is met, and if the requirement is met, ending the guide task; otherwise, guiding grinding to repair the uneven area according to the projection information in the S6, and repeating the S2-S6 after the repair is finished.
In order to further optimize the technical scheme, the following measures are also taken:
further, the dolly can remove, fix and go up and down, and structured light scanner possesses the SDK function, S1 includes: the trolley moves to a designated position and is fixed, the mechanical arm is arranged on the trolley and is lifted to a height which enables the tail end of the mechanical arm to reach the repairing area; the structure light scanner, the projector and the visual sensor are all installed at the tail end of the mechanical arm, and the projector, the visual sensor and the mechanical arm are calibrated in pose.
Further, the S2 specifically includes: the mechanical arm is used for driving the structured light scanner to move to the position of the repairing area and adjusting the structured light scanner to the optimal scanning posture, the structured light scanner scans and acquires the composite material skin repairing area and the surface point cloud data around the composite material skin repairing area at a fixed position, and the visual sensor acquires the composite material skin visible light image.
Further, the S3 specifically includes:
s3-1, reinforcing the geometric sensitivity invariance of the composite material skin repair area and the surrounding point cloud data obtained in the S2; the strengthening step comprises the following steps: firstly, calculating a principal axis of PCA, and aligning the input point cloud by aligning the direction of the principal axis of PCA with a Cartesian space so as to improve the direction consistency and geometric sensitivity invariance of each scanning datum;
s3-2, calculating the average curvature, normal and surface density of the composite material skin repair area and surrounding point clouds after PCA alignment and performing data splicing to obtain N x 8 geometrical feature fusion point clouds; wherein N is the number of the midpoint of the point cloud segmentation network input point cloud;
s3-3, constructing a point cloud segmentation network of the composite material skin repair area suitable for the geometrical feature fusion point cloud; the point cloud segmentation network is based on an improved PointNet network and is divided into a global feature perception module and a point-by-point segmentation module; the global feature perception module extracts the features of each point in the point cloud through two shared weight multilayer perceptrons, the input of each multilayer perceptron is a feature vector output and spliced by all the previous perceptrons, and after a series of feature extractions, a one-dimensional global feature is obtained through a maximum pooling layer; the input of the point-by-point segmentation module is to copy the one-dimensional global features for N times and splice the one-dimensional global features with the output of a first multilayer perceptron in the global feature perception module to obtain a feature vector, and then the final point semantic segmentation prediction is realized by using two multilayer perceptrons sharing weight values;
s3-4, inputting the geometrical characteristic fusion point cloud of the S3-2 into a point cloud segmentation network of a composite material skin repair area to obtain semantic segmentation results of the composite material skin repair area and a non-repair area, and then calculating a loss function between the predicted semantic segmentation result and the GroundTruth, wherein the loss function uses cross entropy, and the calculation formula is as follows:
Loss = -(1/(B·N)) · Σ_{b=1..B} Σ_{n=1..N} Σ_{c=1..C} y_{b,n,c} · log(ŷ_{b,n,c})
in the formula, y_{b,n,c} is the OneHot-encoded label, ŷ_{b,n,c} is the output of the point cloud segmentation network, B is the batch size, and C is the total number of semantic categories;
s3-5, continuously updating and optimizing the parameters of the composite material skin repair area segmentation network by using the back-propagation Adam optimization algorithm, so that the output of the segmentation network continuously approaches the GroundTruth; when the accuracy on the verification set is stable, the point cloud segmentation network training is completed;
and S3-6, using the trained point cloud segmentation network to realize the segmentation of the composite material skin repairing area and the non-repairing area.
Further, in S4, the fitting of the point cloud surface of the non-repaired area specifically includes: removing the middle repairing area according to the segmentation result obtained in the step S3, denoising surrounding area point cloud by using a Gaussian filter algorithm, and finally performing surface fitting by using a least square method, wherein a fitting function is expressed as:
f(X) = Σ_{i=1..n} λ_i(X) · p_i(X)
wherein λ_i(X) is the coefficient to be determined, p_i(X) is a basis function, X is a point coordinate, and n is the number of points; the fitted curved surface is taken as the ideal surface for subsequent operations.
Further, in S5, acquiring the distance cloud graph includes:
s5-1, calculating a directed distance, and calculating the directed distance from the point cloud of the segmented repairing area in the S3 to the fitting plane in the S4;
s5-2, forming a point cloud bounding box, wherein the depth direction of the bounding box is along the height direction of the curved surface and takes a fixed value, the length and width of the bounding box match the field of view of the scanner at the optimal scanning distance, and the plane formed by the length and width directions is perpendicular to the viewing direction;
s5-3, dividing the bounding box and counting the number of points, dividing the bounding box into a plurality of small regions in the length and width directions according to the set resolution of the generated image, and counting the number of points contained in each region;
s5-4, assigning a region, wherein if the region contains points, the value of the region is the average value of the directed distances of all the points, and if the region does not contain points, the value of the region is the average value of the values of the small surrounding boxes in the neighborhood containing the points;
s5-5, generating a two-dimensional image, and performing RGB three-channel gray value conversion on the scalar value of each area in the S5-4 to obtain a distance cloud graph, namely representing an error distance graph from each small area point set of the composite material skin repairing area to the peripheral non-composite material skin repairing area curved surface by using different colors;
further, the S6 specifically includes:
s6-1, acquiring feature descriptors, namely calculating the SURF descriptors of the visible light image acquired in S2 and of the two-dimensional distance cloud picture acquired in S5;
s6-2, performing feature matching, and acquiring corresponding pixel points of the two images by using a similarity measurement algorithm;
s6-3, testing and screening tuples, randomly obtaining three pairs of corresponding pixel points, respectively denoted (p1, p2, p3) and (q1, q2, q3), and calculating whether their relative position relations within the respective images meet the requirement, namely:
| ||p_i - p_j||_2 / ||q_i - q_j||_2 - 1 | < λ, for all 1 ≤ i < j ≤ 3
wherein λ is the threshold and ||·||_2 represents the Euclidean distance; if the position relation meets the threshold requirement, the screening is passed, otherwise it is not. The random-selection principle is: points selected fewer times are chosen preferentially, and points that have passed the screening are not selected again. The procedure ends when every failed point has been selected more than three times, or when the number of passed points reaches the threshold requirement;
s6-4, obtaining a transformation matrix, and calculating an optimal transformation matrix of the distance cloud picture and the visible light image by using a least square method according to the corresponding points; calculating a pose transformation matrix of the mechanical arm by using the vision sensor, the projector and the mechanical arm which are calibrated in advance;
and S6-5, pose transformation and projection, wherein the pose of the mechanical arm is adjusted according to the pose transformation matrix in the S6-4, and then the distance cloud picture is projected to the surface of the composite material by using a projector.
Compared with the prior art, the invention has the beneficial effects that:
the method can quantitatively analyze the surface flatness of the composite skin in real time during grinding and repairing, and visually projects and displays the surface of the composite through the projector, so that the interference of human subjectivity is reduced; the method has great significance for promoting aviation intelligent maintenance, improving the maintenance digitization level and reducing the labor intensity of workers.
Drawings
FIG. 1 is a diagram of the relative positions of the hardware devices of the method;
FIG. 2 is an overall flowchart of the method;
FIG. 3 shows the structure of the point cloud segmentation network;
FIG. 4 is a flowchart of the point cloud segmentation of the composite skin repair area;
FIG. 5 is a flowchart of obtaining the distance cloud map;
FIG. 6 is a flowchart of the three-dimensional projection visualization of the cloud map;
FIG. 7 is a sample image of a composite skin after repair;
FIG. 8 shows the point cloud data of the composite skin repair area and its surroundings obtained by scanning;
FIG. 9 shows the segmentation result of the composite skin repair area;
FIG. 10 shows the surface fitting result for the area surrounding the composite skin repair area;
FIG. 11 shows the generated distance cloud map.
Detailed Description
The present invention will now be described in further detail with reference to the accompanying drawings.
It should be noted that terms such as "upper", "lower", "left", "right", "front" and "back" used herein are for clarity of description only and are not intended to limit the implementable scope of the invention; changes or adjustments of the relative relationships they describe, without substantive change to the technical content, shall likewise be regarded as within the implementable scope of the invention.
Referring to fig. 1 to 11, the real-time surface flatness analysis method for guiding aircraft composite skin repair of this embodiment is based on the following hardware devices: the mechanical arm 1, the vision sensor 2, the structured light scanner 3, the projector 4 and the trolley 5, as shown in fig. 1. Referring to fig. 2, the analysis method includes the following steps:
s1, erecting the structured light scanner and the remaining hardware devices;
s2, collecting surface data of the composite material skin, wherein the surface data comprises point cloud data and visible light data;
s3, point cloud segmentation of the composite material skin repair area, namely, a point cloud segmentation network with fused geometric features is used for achieving segmentation of the composite material skin repair area and a surrounding non-repair area;
s4, performing surface fitting on the non-repaired area, namely obtaining point cloud of the non-repaired area of the composite material skin on the basis of the S3, and performing surface fitting to obtain surface parameters, wherein the surface fitting result is shown in FIG. 10;
s5, obtaining a distance cloud picture, calculating the distance between the point cloud of the skin repair area of the composite material, which is segmented in the S3, and the fitted curved surface in the S4, and converting the distance cloud picture into the distance cloud picture;
s6, performing three-dimensional projection visualization on the cloud image, registering the distance cloud image acquired in the S5 with the visible light image of the composite material skin, and projecting the distance cloud image to the surface of the composite material skin by using a projector calibrated with a visual sensor;
referring to fig. 1, in the present embodiment, the cart 5 can move, be fixed, and be lifted, the structured light scanner 3 has an SDK function, and S1 includes: the trolley 5 is moved to a designated position and fixed, the mechanical arm 1 is arranged on the trolley 5 and is lifted to a height which enables the tail end of the mechanical arm 1 to reach a repairing area; the structured light scanner 3, the projector 4 and the visual sensor 2 are all installed at the tail end of the mechanical arm 1, and the projector 4, the visual sensor 2 and the mechanical arm 1 are calibrated in pose.
S2 specifically comprises the following steps: the mechanical arm 1 is used for driving the structured light scanner 3 to move to the position of the repair area and adjusting the structured light scanner 3 to the optimal scanning pose, the structured light scanner 3 scans and acquires the repair area of the composite material skin and the surface point cloud data around the repair area at the fixed pose, as shown in figure 8, and the visual sensor 2 acquires the visible light image of the composite material skin, as shown in figure 7.
Referring to fig. 4, in the present embodiment, S3 specifically includes:
and S3-1, reinforcing the geometric sensitivity invariance of the composite material skin repair area and the surrounding point cloud data acquired in the S2. The principal axis of the PCA is first calculated, and then the input point cloud is aligned by aligning the PCA principal axis direction with the cartesian space. Specifically, we first align the first principal axis with the Z-axis and then the second principal axis with the X-axis, thereby improving the directional consistency and geometric sensitivity invariance of each scan data.
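The alignment in S3-1 can be sketched in a few lines of numpy; this is an illustrative sketch rather than the patent's implementation, and the sign handling of the axes is an assumption:

```python
import numpy as np

def pca_align(points: np.ndarray) -> np.ndarray:
    """Align a point cloud so its first PCA axis lies on Z and its second on X,
    improving directional consistency between scans (sketch of step S3-1)."""
    centered = points - points.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))  # ascending order
    axes = eigvecs[:, np.argsort(eigvals)[::-1]]           # 1st, 2nd, 3rd axis
    # Rows of R are the new X, Y, Z directions: 2nd axis -> X, 1st axis -> Z.
    R = np.stack([axes[:, 1], axes[:, 2], axes[:, 0]])
    if np.linalg.det(R) < 0:
        R[1] = -R[1]        # flip the remaining axis to keep a proper rotation
    return centered @ R.T
```

Since the transform is a rigid rotation about the centroid, point-to-point distances are preserved while the dominant variance direction is moved onto Z.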
S3-2, calculating the average curvature, the normal and the surface density of the composite material skin repairing area and the surrounding point cloud after PCA alignment, and performing data splicing to obtain N x 8 geometrical feature fusion point cloud; and N is the number of the points in the point cloud segmentation network input point cloud.
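A minimal numpy sketch of the N x 8 feature fusion follows; the normal, curvature and density estimators below are common k-NN/PCA proxies assumed for illustration, since the patent does not spell out the exact formulas:

```python
import numpy as np

def fuse_geometric_features(points: np.ndarray, k: int = 16) -> np.ndarray:
    """Augment each point with a normal (3), a curvature proxy (1) and a local
    density estimate (1), yielding an N x 8 array: xyz + normal + curvature
    + density (sketch of step S3-2)."""
    n = len(points)
    # Pairwise squared distances (fine for a sketch; use a KD-tree at scale).
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d2, axis=1)[:, 1:k + 1]        # k nearest neighbours
    feats = np.empty((n, 8))
    for i in range(n):
        nbrs = points[idx[i]]
        centered = nbrs - nbrs.mean(axis=0)
        w, v = np.linalg.eigh(np.cov(centered.T))   # ascending eigenvalues
        normal = v[:, 0]                            # smallest-variance axis
        curvature = w[0] / max(w.sum(), 1e-12)      # surface-variation proxy
        radius = np.sqrt(d2[i, idx[i][-1]])         # distance to k-th neighbour
        density = k / max(4 / 3 * np.pi * radius ** 3, 1e-12)
        feats[i] = np.concatenate([points[i], normal, [curvature, density]])
    return feats
```

On a perfectly planar patch the curvature proxy is zero and the estimated normals point along the plane normal, as expected.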
And S3-3, constructing a point cloud segmentation network structure of the composite material skin repair area suited to the geometric-feature-fusion point cloud. The network is based on an improved PointNet and can be divided into a global feature perception module and a point-by-point segmentation module. In the global feature perception module, the features of each point are extracted by two weight-shared multilayer perceptrons, the input of each perceptron being the concatenation of the outputs of all preceding perceptrons; after this series of feature extractions, a one-dimensional global feature (1 x 1096) is obtained through a max pooling layer. In the point-by-point segmentation module, the one-dimensional global feature is copied N times and concatenated with the output of the first multilayer perceptron of the global feature perception module to obtain an N x 1168 feature vector, and the final per-point semantic segmentation prediction is then realized by two further weight-shared multilayer perceptrons.
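The module structure can be sketched as a numpy forward pass with toy random weights; the hidden widths (72 and 1096) are assumptions chosen so that the stated 1 x 1096 global feature and N x 1168 concatenation come out:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def shared_mlp(x, w, b):
    # Weight-shared MLP layer: identical dense weights applied to every point.
    return relu(x @ w + b)

def segment(point_feats: np.ndarray, num_classes: int = 2) -> np.ndarray:
    """Sketch of the S3-3 network: densely concatenated shared MLPs, a
    max-pooled global feature, then per-point segmentation. Toy random
    weights stand in for trained parameters."""
    rng = np.random.default_rng(42)
    n, d = point_feats.shape                            # d = 8 fused features
    w1, b1 = rng.normal(size=(d, 72)) * 0.1, np.zeros(72)
    w2, b2 = rng.normal(size=(d + 72, 1096)) * 0.1, np.zeros(1096)
    w3, b3 = rng.normal(size=(1096 + 72, 128)) * 0.1, np.zeros(128)
    w4, b4 = rng.normal(size=(128, num_classes)) * 0.1, np.zeros(num_classes)
    h1 = shared_mlp(point_feats, w1, b1)                           # N x 72
    h2 = shared_mlp(np.concatenate([point_feats, h1], axis=1), w2, b2)  # N x 1096
    g = h2.max(axis=0)                                 # 1 x 1096 global feature
    per_point = np.concatenate([np.tile(g, (n, 1)), h1], axis=1)   # N x 1168
    return shared_mlp(per_point, w3, b3) @ w4 + b4     # N x num_classes scores
```

Because the global feature is a max over points, permuting the input points permutes the per-point predictions identically, which is the property PointNet-style networks rely on.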
S3-4, inputting the geometrical characteristic fusion point cloud of the S3-2 into a point cloud segmentation network of a composite material skin repair area to obtain semantic segmentation results of the composite material skin repair area and a non-repair area, and then calculating a loss function between the predicted semantic segmentation result and the GroundTruth, wherein the loss function uses cross entropy, and the calculation formula is as follows:
Loss = -(1/(B·N)) · Σ_{b=1..B} Σ_{n=1..N} Σ_{c=1..C} y_{b,n,c} · log(ŷ_{b,n,c})
in the formula, y_{b,n,c} is the OneHot-encoded label, ŷ_{b,n,c} is the output of the point cloud segmentation network, B is the batch size, and C is the total number of semantic categories.
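The cross-entropy above, applied to per-point class probabilities of shape B x N x C against OneHot labels, can be sketched as:

```python
import numpy as np

def cross_entropy(pred: np.ndarray, onehot: np.ndarray) -> float:
    """Mean cross-entropy between predicted per-point class probabilities
    (B x N x C) and OneHot labels, as described in step S3-4 (sketch)."""
    eps = 1e-12                     # guard against log(0)
    return float(-np.mean(np.sum(onehot * np.log(pred + eps), axis=-1)))
```

A perfect prediction gives a loss near zero, and a uniform prediction over C classes gives log(C).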
And S3-5, continuously updating and optimizing the parameters of the point cloud segmentation network of the composite material skin repair area by using the back-propagation Adam optimization algorithm so that the output continuously approaches the GroundTruth; when the accuracy on the verification set is stable, the point cloud segmentation network training is finished.
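One Adam parameter update, as used in S3-5 for training the segmentation network, can be sketched as follows (standard Adam with conventional hyperparameters; not the patent's training code):

```python
import numpy as np

def adam_step(theta, grad, state, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update of parameters theta given a backpropagated gradient.
    state = (m, v, t): moment estimates and step counter."""
    m, v, t = state
    t += 1
    m = b1 * m + (1 - b1) * grad                 # first-moment estimate
    v = b2 * v + (1 - b2) * grad ** 2            # second-moment estimate
    m_hat = m / (1 - b1 ** t)                    # bias correction
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, (m, v, t)
```

Iterating this step against the gradient of a loss drives the parameters toward a minimum, which is all S3-5 requires of the optimizer.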
And S3-6, using the trained point cloud segmentation network to realize the segmentation of the composite material skin repairing area and the non-repairing area, wherein the segmentation result is shown in figure 9.
In S4, fitting the point cloud curved surface of the non-repaired area specifically comprises the following steps: after the middle repairing area is removed, denoising is completed on the point cloud of the surrounding area by using a Gaussian filtering algorithm, surface fitting is performed by using a least square method, and a fitting function is expressed as:
f(X) = Σ_{i=1..n} λ_i(X) · p_i(X)
wherein λ_i(X) is the coefficient to be determined, p_i(X) is a basis function, X is a point coordinate, and n is the number of points. The fitted curved surface is taken as the ideal surface for subsequent operations.
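A least-squares surface fit consistent with S4 can be sketched with a quadratic polynomial basis; the basis choice is an assumption, since the patent leaves the p_i unspecified:

```python
import numpy as np

def fit_quadric(points: np.ndarray) -> np.ndarray:
    """Least-squares fit of z = f(x, y) over a quadratic polynomial basis
    (sketch of the S4 surface fitting of the non-repaired area)."""
    x, y, z = points.T
    # Basis p_i: 1, x, y, x^2, xy, y^2  ->  z = sum_i lambda_i * p_i(x, y)
    A = np.stack([np.ones_like(x), x, y, x * x, x * y, y * y], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

def eval_quadric(coeffs, x, y):
    return (coeffs[0] + coeffs[1] * x + coeffs[2] * y
            + coeffs[3] * x * x + coeffs[4] * x * y + coeffs[5] * y * y)
```

For noise-free samples of a quadratic surface the coefficients are recovered exactly up to numerical precision; with scanner noise the least-squares solution averages it out.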
Referring to fig. 5, in the present embodiment, in S5, the obtaining the distance cloud graph includes:
s5-1, calculating a directed distance, and calculating the directed distance from the point cloud of the repair area segmented in the S3 to the fitting plane in the S4;
s5-2, forming a point cloud bounding box, wherein the depth direction of the bounding box is along the height direction of the curved surface and takes a fixed value, the length and width of the bounding box match the field of view of the structured light scanner at the optimal scanning distance, and the plane formed by the length and width directions is perpendicular to the viewing direction;
s5-3, dividing the bounding box and counting the number of points, dividing the bounding box into a plurality of small regions in the length and width directions according to the set resolution of the generated image, and counting the number of points contained in each region;
and S5-4, assigning a region, wherein if the region contains points, the value of the region is the average value of the directional distances of all the points, and if the region does not contain points, the value of the region is the average value of the values of the neighborhood small bounding boxes containing the points at the periphery.
And S5-5, generating a two-dimensional image, and performing RGB three-channel gray value conversion on the scalar value of each area in the S5-4 to obtain a distance cloud picture, namely representing an error distance picture from each small area point set of the composite material skin repairing area to the peripheral non-composite material skin repairing area curved surface by using different colors, as shown in figure 11.
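Steps S5-2 to S5-5 amount to gridding the repair-area points, averaging the signed distances per cell, filling empty cells from occupied neighbours, and colour-coding the result. A numpy sketch follows; the blue-to-red colour mapping is an assumption, as the patent only says different colours are used:

```python
import numpy as np

def distance_cloud_map(points_xy, signed_dist, shape=(64, 64)):
    """Bin repair-area points into a length x width grid over their bounding
    rectangle, average signed distances per cell, fill empty cells from
    occupied 8-neighbours, and map the scalar field to an RGB image
    (sketch of steps S5-2 to S5-5)."""
    h, w = shape
    lo = points_xy.min(axis=0)
    span = np.maximum(points_xy.max(axis=0) - lo, 1e-12)
    ij = np.minimum(((points_xy - lo) / span * [h, w]).astype(int), [h - 1, w - 1])
    total, count = np.zeros(shape), np.zeros(shape)
    for (i, j), d in zip(ij, signed_dist):
        total[i, j] += d
        count[i, j] += 1
    grid = np.where(count > 0, total / np.maximum(count, 1), np.nan)
    # Fill empty cells with the mean of occupied neighbours (one pass).
    for i, j in zip(*np.where(count == 0)):
        nb = grid[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
        if np.any(~np.isnan(nb)):
            grid[i, j] = np.nanmean(nb)
    # Scalar -> RGB: normalise to [0, 1], then blue (low) to red (high).
    norm = (grid - np.nanmin(grid)) / max(np.nanmax(grid) - np.nanmin(grid), 1e-12)
    norm = np.nan_to_num(norm)
    rgb = np.stack([norm, np.zeros_like(norm), 1 - norm], axis=-1)
    return grid, (rgb * 255).astype(np.uint8)
```

The returned image is what would be registered against the visible-light image and projected back onto the skin in S6.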
Referring to fig. 6, in the present embodiment, the S6 specifically includes:
s6-1, acquiring feature descriptors, and calculating the SURF descriptor of the visible light image acquired in S2 and the SURF descriptor of the two-dimensional distance cloud image acquired in S5;
s6-2, performing feature matching, and acquiring corresponding pixel points of the two images by using a similarity measurement algorithm;
s6-3, tuple testing and screening, randomly obtaining three pairs of corresponding pixel points, and respectively marking as (p) 1 p 2 p 3 ),(q 1 q 2 q 3 ) And calculating whether the relative position relation of the images in the respective images meets the requirement, namely:
Figure GDA0003916935450000081
wherein λ is threshold, | · | | non-woven phosphor 2 Representing the euclidean distance. And if the position relation meets the threshold requirement, the screening is passed, otherwise, the screening is not passed. The random principle is as follows: preferentially selecting points with less selected times, and selecting the points which pass the screening no longer; the ending principle is that all points which fail to pass are selected for more than three times or the number of passed points reaches the threshold value requirement;
s6-4, obtaining a transformation matrix, and calculating an optimal transformation matrix of the distance cloud picture and the visible light image by using a least square method according to the corresponding points; calculating a pose transformation matrix of the mechanical arm 1 by using the vision sensor 2, the projector 4 and the mechanical arm 1 which are calibrated in advance;
s6-5, pose transformation and projection, wherein the pose of the mechanical arm 1 is adjusted according to the pose transformation matrix in the S6-4, and then the distance cloud picture is projected to the surface of the composite material skin by using the projector 4.
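The tuple test of S6-3 can be sketched as below. This is a minimal NumPy version; the exact inequality (agreement of pairwise distances to within λ) is an assumption inferred from the description above.

```python
import numpy as np

def tuple_test(p_triplet, q_triplet, lam):
    """S6-3 sketch: check that three matched pixel pairs preserve their
    relative geometry. The pairwise distances within each image must
    agree to within the threshold lam (Euclidean norm)."""
    for i in range(3):
        for j in range(i + 1, 3):
            dp = np.linalg.norm(p_triplet[i] - p_triplet[j])
            dq = np.linalg.norm(q_triplet[i] - q_triplet[j])
            if abs(dp - dq) >= lam:
                return False
    return True
```

A pure translation between the two images passes the test, while a scaling (which distorts relative positions) fails it, which is the behaviour the screening step relies on.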
The above is only a preferred embodiment of the present invention; the protection scope of the present invention is not limited to the above embodiment, and all technical solutions falling under the inventive concept belong to the protection scope of the present invention. It should be noted that modifications and refinements made by those skilled in the art without departing from the principle of the present invention shall also be regarded as within the protection scope of the present invention.

Claims (6)

1. A real-time surface flatness analysis method for guiding aircraft composite skin repair, based on the following hardware devices: a structured light scanner, a projector, a vision sensor, a mechanical arm and a trolley; characterized by comprising the following steps:
S1, setting up the structured light scanner and the other hardware devices;
S2, acquiring surface data of the composite skin, including point cloud data and visible light data;
S3, point cloud segmentation of the composite skin repair area: segmentation of the repair area from the surrounding non-repair area is achieved using a point cloud segmentation network with fused geometric features, comprising:
S3-1, strengthening the geometric invariance of the repair-area and surrounding point cloud data obtained in S2; the strengthening step comprises: first computing the PCA principal axes, then aligning the input point cloud by bringing the PCA principal axes into correspondence with the Cartesian axes, thereby improving the directional consistency and geometric invariance of each scan;
S3-2, computing the mean curvature, normals and surface density of the PCA-aligned repair-area and surrounding point cloud and performing data concatenation to obtain an N × 8 geometric-feature-fused point cloud, where N is the number of points in the input to the point cloud segmentation network;
S3-3, constructing a repair-area point cloud segmentation network suited to the geometric-feature-fused point cloud; the network is based on an improved PointNet and comprises a global feature perception module and a point-wise segmentation module; the global feature perception module extracts per-point features through two weight-sharing multilayer perceptrons, the input of each perceptron being the concatenation of the feature vectors output by all preceding perceptrons, and after this series of feature extractions a one-dimensional global feature is obtained through a max-pooling layer; the point-wise segmentation module copies the one-dimensional global feature N times, concatenates it with the output of the first multilayer perceptron of the global feature perception module to form a feature vector, and then produces the final per-point semantic segmentation prediction using two weight-sharing multilayer perceptrons;
S3-4, inputting the geometric-feature-fused point cloud of S3-2 into the repair-area point cloud segmentation network to obtain the semantic segmentation result for the repair and non-repair areas, and then computing a loss function between the predicted segmentation and the ground truth; the loss function is the cross entropy, computed as:
Loss = −(1/B) Σ_{b=1}^{B} Σ_{c=1}^{C} y_{b,c} log ŷ_{b,c}

where y_{b,c} is the one-hot encoded label, ŷ_{b,c} is the predicted probability of sample b belonging to class c, B is the batch size, and C is the total number of semantic classes;
S3-5, continuously updating and optimizing the parameters of the repair-area point cloud segmentation network with the back-propagation Adam optimization algorithm, so that the network output steadily approaches the ground truth; training of the point cloud segmentation network is complete when the accuracy on the validation set stabilizes;
S3-6, using the trained point cloud segmentation network to segment the composite skin repair area from the non-repair area;
S4, surface fitting of the non-repair area: on the basis of S3, the point cloud of the composite skin non-repair area is obtained and fitted with a surface to obtain the surface parameters;
S5, obtaining the distance cloud map: the distance between the repair-area point cloud segmented in S3 and the surface fitted in S4 is computed and converted into a distance cloud map;
S6, three-dimensional projection visualization of the cloud map: the distance cloud map acquired in S5 is registered with the visible light image of the composite skin, and the distance cloud map is projected onto the composite skin surface using the projector calibrated with the vision sensor;
S7, judging whether the flatness requirement is met: the surface flatness of the repair area is analyzed automatically; if the requirement is met, the guidance task ends; otherwise, grinding and repair of the uneven area are guided according to the projection information of S6, and steps S2 to S6 are repeated after the repair is finished.
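The PCA alignment of S3-1 and the feature concatenation of S3-2 can be sketched as follows. This is a minimal NumPy illustration in which the normals, curvature and density values are assumed to be supplied by upstream estimators; it is not the patent's implementation.

```python
import numpy as np

def pca_align(points):
    """S3-1 sketch: rotate the point cloud so that its PCA principal
    axes line up with the Cartesian axes, improving the orientation
    consistency of each scan."""
    centered = points - points.mean(axis=0)
    # right-singular vectors of the centered cloud are the principal axes
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt.T

def fuse_features(points, normals, curvature, density):
    """S3-2 sketch: concatenate xyz (3), normal (3), mean curvature (1)
    and surface density (1) into an N x 8 feature-fused point cloud."""
    return np.hstack([points,
                      normals,
                      curvature[:, None],
                      density[:, None]])
```

After `pca_align`, the covariance matrix of the cloud is diagonal, i.e. the dominant variance directions coincide with the coordinate axes regardless of the original scan orientation.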
2. The real-time surface flatness analysis method for guiding aircraft composite skin repair according to claim 1, wherein the trolley is capable of moving, being fixed in place and lifting, the structured light scanner provides an SDK, and S1 comprises: the trolley moves to a designated position and is fixed; the mechanical arm, mounted on the trolley, is lifted to a height at which its end can reach the repair area; the structured light scanner, the projector and the vision sensor are all mounted at the end of the mechanical arm, and pose calibration is performed among the projector, the vision sensor and the mechanical arm.
3. The real-time surface flatness analysis method for guiding aircraft composite skin repair according to claim 1, wherein S2 specifically comprises: the mechanical arm drives the structured light scanner to the repair area and adjusts it to the optimal scanning posture; the structured light scanner scans from a fixed position to acquire the point cloud data of the composite skin repair area and the surrounding surface, and the vision sensor acquires the visible light image of the composite skin.
4. The real-time surface flatness analysis method for guiding aircraft composite skin repair according to claim 1, wherein in S4 the point cloud surface fitting of the non-repair area specifically comprises: removing the central repair area according to the segmentation result obtained in S3, denoising the surrounding-area point cloud with a Gaussian filtering algorithm, and finally performing surface fitting by the least squares method, the fitting function being expressed as:

f(X) = Σ_{i=1}^{n} λ_i(X) p_i(X)

where λ_i(X) are the coefficients to be determined, p_i(X) are basis functions, X is a point coordinate, and n is the number of points in the point cloud; the fitted surface is taken as the ideal surface for subsequent operations.
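The least-squares fit of claim 4 can be sketched as below. This is a minimal NumPy version in which the quadratic monomial basis and the choice to fit z as a function of (x, y) are assumptions, since the patent does not specify the basis functions p_i(X).

```python
import numpy as np

def fit_surface(points):
    """Claim 4 sketch: least-squares fit of the non-repair-area cloud,
    returning coefficients lambda_i such that
    z ~ sum_i lambda_i * p_i(x, y)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    # basis functions p_i(X): 1, x, y, x^2, x*y, y^2 (an assumed basis)
    A = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

def eval_surface(coeffs, x, y):
    """Evaluate the fitted surface at a single (x, y) location."""
    basis = np.array([1.0, x, y, x * x, x * y, y * y])
    return float(basis @ coeffs)
```

The signed distances needed in S5-1 could then be approximated as the residual z − eval_surface(coeffs, x, y) at each repair-area point, under the same fit-in-z assumption.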
5. The real-time surface flatness analysis method for guiding aircraft composite skin repair according to claim 1, wherein in S5 obtaining the distance cloud map comprises:
S5-1, calculating signed distances: the signed distance from the repair-area point cloud segmented in S3 to the surface fitted in S4 is computed;
S5-2, constructing a point cloud bounding box: the depth direction of the bounding box runs along the height direction of the surface and has a fixed extent, its length and width equal the field of view of the structured light scanner at the optimal scanning distance, and the plane spanned by the length and width directions is perpendicular to the viewing direction;
S5-3, partitioning the bounding box and counting points: the bounding box is divided along its length and width into small regions according to the set resolution of the generated image, and the number of points contained in each region is counted;
S5-4, assigning a value to each region: if a region contains points, its value is the average signed distance of those points; if it contains no points, its value is the average of the values of the neighborhood cells that do contain points;
S5-5, generating a two-dimensional image: the scalar value of each region from S5-4 is converted to RGB three-channel gray values to obtain the distance cloud map, i.e. different colors represent the error distance from each small-region point set of the composite skin repair area to the surface of the surrounding non-repair area.
6. The real-time surface flatness analysis method for guiding aircraft composite skin repair according to claim 1, wherein S6 specifically comprises:
S6-1, acquiring feature descriptors: SURF descriptors are computed for the visible light image acquired in S1 and for the two-dimensional distance cloud map obtained in S5;
S6-2, feature matching: corresponding pixel points of the two images are obtained using a similarity measurement algorithm;
S6-3, tuple testing and screening: three pairs of corresponding pixel points are drawn at random, denoted (p_1, p_2, p_3) and (q_1, q_2, q_3), and it is checked whether their relative positional relation within the respective images meets the requirement, namely:

|‖p_i − p_j‖_2 − ‖q_i − q_j‖_2| < λ,  i ≠ j, i, j ∈ {1, 2, 3}

where λ is a threshold and ‖·‖_2 denotes the Euclidean distance; if the positional relation satisfies the threshold requirement, the screening is passed, otherwise it is not; the random selection principle is: points selected fewer times are chosen preferentially, and points that have passed the screening are not selected again; the termination principle is: every failing point has been selected more than three times, or the number of passing points reaches the threshold requirement;
S6-4, obtaining transformation matrices: the optimal transformation matrix between the distance cloud map and the visible light image is computed from the corresponding points by the least squares method, and the pose transformation matrix of the mechanical arm is computed from the pre-calibrated vision sensor, projector and mechanical arm;
S6-5, pose transformation and projection: the pose of the mechanical arm is adjusted according to the pose transformation matrix from S6-4, and the distance cloud map is then projected onto the composite skin surface using the projector.
CN202111184947.9A 2021-10-12 2021-10-12 Real-time surface flatness analysis method for guiding aircraft composite skin repair Active CN113902709B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111184947.9A CN113902709B (en) 2021-10-12 2021-10-12 Real-time surface flatness analysis method for guiding aircraft composite skin repair


Publications (2)

Publication Number Publication Date
CN113902709A CN113902709A (en) 2022-01-07
CN113902709B true CN113902709B (en) 2023-04-07

Family

ID=79191459

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111184947.9A Active CN113902709B (en) 2021-10-12 2021-10-12 Real-time surface flatness analysis method for guiding aircraft composite skin repair

Country Status (1)

Country Link
CN (1) CN113902709B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114406326A (en) * 2022-03-28 2022-04-29 西安兴航航空科技股份有限公司 Novel processing technology of aircraft skin
CN117788472B (en) * 2024-02-27 2024-05-14 南京航空航天大学 Method for judging corrosion degree of rivet on surface of aircraft skin based on DBSCAN algorithm

Citations (2)

Publication number Priority date Publication date Assignee Title
CN112710233A (en) * 2020-12-18 2021-04-27 南京航空航天大学 Large-scale aircraft skin point cloud obtaining equipment and method based on laser point cloud
CN113375594A (en) * 2021-06-08 2021-09-10 四川大学青岛研究院 Aircraft skin profile digital detection method

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
CN111274976B (en) * 2020-01-22 2020-09-18 清华大学 Lane detection method and system based on multi-level fusion of vision and laser radar
CN111524129B (en) * 2020-04-29 2021-05-18 南京航空航天大学 Aircraft skin butt joint gap calculation method based on end face extraction
CN112907528B (en) * 2021-02-09 2021-11-09 南京航空航天大学 Point cloud-to-image-based composite material laying wire surface defect detection and identification method



Similar Documents

Publication Publication Date Title
CN113902709B (en) Real-time surface flatness analysis method for guiding aircraft composite skin repair
Li et al. Automatic crack recognition for concrete bridges using a fully convolutional neural network and naive Bayes data fusion based on a visual detection system
CN113139453B (en) Orthoimage high-rise building base vector extraction method based on deep learning
CN111062915A (en) Real-time steel pipe defect detection method based on improved YOLOv3 model
CN110838112A (en) Insulator defect detection method based on Hough transform and YOLOv3 network
Wang et al. Measurement for cracks at the bottom of bridges based on tethered creeping unmanned aerial vehicle
CN107967685A (en) A kind of bridge pier and tower crack harmless quantitative detection method based on unmanned aerial vehicle remote sensing
CN111768417B (en) Railway wagon overrun detection method based on monocular vision 3D reconstruction technology
CN111241994A (en) Method for extracting remote sensing image rural highway desertification road section for deep learning
CN114973116A (en) Method and system for detecting foreign matters embedded into airport runway at night by self-attention feature
CN112241964B (en) Light strip center extraction method for line structured light non-contact measurement
CN115222884A (en) Space object analysis and modeling optimization method based on artificial intelligence
CN115294541A (en) Local feature enhanced Transformer road crack detection method
CN113673444B (en) Intersection multi-view target detection method and system based on angular point pooling
CN112561989B (en) Recognition method for hoisting object in construction scene
CN112197773B (en) Visual and laser positioning mapping method based on plane information
CN111798516A (en) Method for detecting running state quantity of bridge crane equipment and analyzing errors
CN111932635B (en) Image calibration method adopting combination of two-dimensional and three-dimensional vision processing
Zhongwei et al. Automatic segmentation and approximation of digitized data for reverse engineering
Ma et al. Surface roughness detection based on image analysis
CN115131688A (en) Unmanned aerial vehicle shooting point extraction method for inspection component
CN114511582A (en) Automatic ancient city battlement extraction method
Augustin et al. Detection of inclusion by using 3D laser scanner in composite prepreg manufacturing technique using convolutional neural networks
McAlorum et al. Automated concrete crack inspection with directional lighting platform
CN117788471B (en) YOLOv 5-based method for detecting and classifying aircraft skin defects

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant