CN112396604B - Multi-view-angle-based aircraft skin defect detection method - Google Patents


Info

Publication number
CN112396604B
CN112396604B (application CN202110078491.1A)
Authority
CN
China
Prior art keywords: detection, camera module, camera, aircraft skin, fixed
Prior art date
Legal status
Active
Application number
CN202110078491.1A
Other languages
Chinese (zh)
Other versions
CN112396604A
Inventor
曾向荣
钟志伟
刘衍
张政
Current Assignee
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date
Filing date
Publication date
Application filed by National University of Defense Technology
Priority to CN202110078491.1A
Publication of CN112396604A
Application granted
Publication of CN112396604B
Status: Active

Classifications

    • G06T 7/0002: Image analysis; inspection of images, e.g. flaw detection
    • G06T 7/40: Image analysis; analysis of texture
    • G06V 10/462: Salient features, e.g. scale-invariant feature transforms [SIFT]
    • G06F 18/22: Pattern recognition; matching criteria, e.g. proximity measures
    • G01N 21/01: Arrangements or apparatus for facilitating the optical investigation
    • G01N 21/8851: Scan or image signal processing for detecting defects
    • G01N 2021/0106: General arrangement of respective parts
    • G01N 2021/0112: Apparatus in one mechanical, optical or electronic block
    • G01N 2021/8887: Scan or image signal processing based on image processing techniques


Abstract

The invention discloses a multi-view aircraft skin defect detection method. The detection system comprises a support, a fixed camera module and a mobile camera module mounted on the support, and a processing module connected to both camera modules. The fixed camera module uses multi-channel signal acquisition with hub management to integrate its cameras into the processing module and performs coarse detection of the skin; the mobile camera module is managed independently over a wireless link, through which it is controlled and transmits images. The detection method is as follows: the fixed camera module first performs coarse detection of the aircraft skin to obtain the skin detection probability, and the camera resolution and focal length are chosen according to the target detection probability; the mobile camera module then moves along the support to perform fine detection of the skin, and finally an image matching algorithm is used for alignment detection, improving detection accuracy.

Description

Multi-view-angle-based aircraft skin defect detection method
Technical Field
The invention relates to the field of aircraft skin defect detection, and in particular to a multi-view aircraft skin defect detection method.
Background
The aircraft skin is the layer of aluminum-alloy sheet covering the surface of the aircraft; it forms the external shape of the aircraft, maintains its aerodynamic profile and transfers loads. Throughout a flight, from take-off to landing, the skin is subjected to alternating external loads and sustained pressure, and this long-term cyclic expansion and contraction readily produces small fatigue cracks on its surface. China has a vast territory with complex natural environments, so aircraft in different regions operate under very different conditions for long periods, and the skin and rivet joints corrode easily once exposed to corrosive agents. Under the combined action of fatigue damage and a corrosive environment, damage to the aircraft skin is inevitable. In flight, under external forces, an undetected fine defect that grows to a critical size can propagate rapidly, causing structural damage, possible in-flight break-up, and unpredictable disasters.
In machine vision, a camera replaces the human eye: the captured scene is converted into an image signal and transmitted to an image processing system, which makes judgments and measurements. Aircraft skin is generally inspected either by single-image scanning or by a mobile-robot scanning approach.
Patent CN202010058064.2 discloses a method for detecting and classifying aircraft skin surface defects. It mainly addresses the image processing problem and does not discuss the image acquisition method, which also affects detection accuracy.
Disclosure of Invention
To solve the above technical problems in the prior art, the invention first discloses a multi-view aircraft skin defect detection system comprising a support, a fixed camera module and a mobile camera module mounted on the support, and a processing module connected to the fixed camera module and the mobile camera module. The mobile camera module is managed independently over a wireless link, through which it is controlled and transmits images. The fixed camera module comprises several first cameras fixed symmetrically on the left and right sides of the support and a second camera fixed at the middle of the lower side of the support; the mobile camera module comprises a third camera movably mounted on the support.
As a further improvement of the above technical solution:
There are five first cameras on the left side of the support (and, by symmetry, five on the right); they cover the aircraft nose, fuselage, wings and empennage, while the second camera covers the middle section of the empennage. The first and second cameras are each assigned a label.
The invention also discloses a multi-view aircraft skin defect detection method comprising the following steps:
S1, the fixed camera module performs coarse detection of the aircraft skin to obtain the skin detection probability, and the camera resolution and focal length are selected according to the target detection probability: the lower the resolution at which a defective part of the skin is imaged, the lower the probability of detecting the defect; the higher the resolution of the defective part, the smaller the area covered by detection. Specifically, the probability depends on the number of pixels the skin defect occupies in the image:

$$P(N)=\frac{(N/N_{\min})^{E}}{1+(N/N_{\min})^{E}},\qquad E=2.7+0.7\,\frac{N}{N_{\min}}$$

where $P(N)$ is the detection probability of the aircraft skin, $N$ is the number of detected pixels, and $N_{\min}$ is the minimum detection resolution, which can be obtained from the aircraft skin identification algorithm. In a deep-learning-based target recognition algorithm, $N_{\min}$ is defined by small-target detection, i.e. a pixel extent of 32 × 32; when the resolution of the defect image is greater than 64, the probability of detecting the target defect is 94.5%.
S2, the mobile camera module moves along the support and performs fine detection of the aircraft skin.
Preferably, the fine detection comprises the following steps:
A1, when the fixed camera with label k (k being one of the eleven labels 1 to 11) detects a target, the detection probability of the defective part is analysed; when the detection probability is greater than T (the threshold T depends on the algorithm used, is generally chosen between 0.8 and 0.9, and can be set by the user), the operator visually judges whether the part is a skin defect; when the detection probability is less than T, the mobile camera module performs high-precision alignment detection of the region;
A2, according to the label of the fixed camera, when camera k detects a target the processing module controls the mobile camera module to move precisely above the suspected skin defect for coarse alignment;
A3, the suspected region is precisely aligned and inspected using an image matching algorithm.
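For illustration only, a minimal Python sketch of the coarse-to-fine dispatch in steps A1 and A2; the threshold value and the two callback functions are assumptions introduced for illustration and are not specified by the invention.

def dispatch_detection(camera_label, detection_prob, flag_for_operator,
                       move_mobile_camera, threshold=0.85):
    # A1: above the threshold T the operator visually confirms the defect;
    # below T the mobile camera module is sent for high-precision inspection.
    if detection_prob > threshold:
        flag_for_operator(camera_label)
    else:
        # A2: coarse alignment above the suspected defect, followed by
        # fine alignment via image matching (A3)
        move_mobile_camera(camera_label)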
Preferably, the image matching algorithm comprises the following steps:
B1, randomly draw 8 non-collinear samples from the data set and, within the RANSAC algorithm, compute the transformation matrix H, recorded as model M;
B2, compute the projection error between every datum in the data set and model M; if the error is below a threshold, add the datum to the inlier set I; if the error exceeds the threshold, update the data;
B3, if the number of iterations exceeds m, stop; otherwise increase the iteration count by 1 and repeat the above steps.
Preferably, the method for precisely moving the mobile camera module comprises the following steps:
C1, the homography matrix H obtained from image matching is decomposed by singular value decomposition, $H=U\Sigma V^{T}$, where $U$ and $V$ are orthogonal matrices and $\Sigma$ is the singular value matrix, separating the translation vector $t$ from the rotation component;
C2, from $U$, $\Sigma$ and $V$ the rotation matrix $R$ and the translation vector $t$ are obtained;
C3, the Euler angles are computed from the rotation matrix $R$, giving the angles about the three axes:

$$\theta_x=\operatorname{atan2}(r_{32},r_{33}),\qquad \theta_y=\operatorname{atan2}\!\left(-r_{31},\sqrt{r_{32}^{2}+r_{33}^{2}}\right),\qquad \theta_z=\operatorname{atan2}(r_{21},r_{11})$$

C4, the three-axis angles and the translation vector obtained by matching the mobile-camera image against the fixed-camera image are detected in real time, and the processing unit controls the mobile camera so that they are minimised: if the previous movement reduced the angle about a given axis, the next movement continues in that direction; otherwise the camera moves in the opposite direction.
Compared with the prior art, the invention has the following beneficial effects:
according to the method for detecting the defects of the aircraft skin based on the multiple visual angles, the detection system performs picture processing on the aircraft from multiple angles through the fixed camera, the aircraft skin is subjected to rough detection, the accuracy and reliability of the identification of the defects of the surface of the aircraft skin are improved by combining the mobile camera, and the identification coverage rate of the surface of the aircraft skin is enhanced. And moving the mobile camera by adopting an accurate moving method and simultaneously adopting a picture matching algorithm to process the picture. The identification accuracy of the surface defects of the aircraft skin is high, and the safety of the aircraft is guaranteed. The aircraft skin defect detection method based on multiple viewing angles realizes non-contact and nondestructive accurate measurement and is beneficial to realizing the accurate maintenance and management of the aircraft.
Drawings
FIG. 1 is a schematic structural view of the present invention;
FIG. 2 is an aircraft skin image matching diagram.
Reference numerals: 101, fixed camera; 102, mobile camera; 103, support; 107, image captured by the mobile camera; 108, image captured by the fixed camera; 110, feature points of the left and right images; 111, matching lines between the left and right image feature points; 109, position in the right image matched to the left image.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be understood that the terms "front", "back", "left", "right", "up", "down", and the like indicate orientations or positional relationships based on those shown in the drawings, and are used merely for convenience in describing the present invention and simplifying the description, but do not indicate or imply that the devices or elements indicated by the terms must have specific orientations, be constructed and operated in specific orientations, and therefore, should not be construed as limiting the present invention.
As shown in FIG. 1, the invention provides a multi-view aircraft skin defect detection system and a corresponding detection method, mainly addressing the low detection efficiency and complicated scanning of the single-image scanning and mobile-robot scanning approaches. The detection system consists of three main parts: a fixed camera module, a mobile camera module and a processing unit. The fixed camera module inspects the aircraft skin from fixed positions with fixed focal lengths, and the optimal camera arrangement is derived from the structural characteristics of the aircraft. The mobile camera 102 of the mobile camera module has a higher resolution and a longer focal length than the fixed cameras 101; it can translate along a guide rail on the support 103 and move in the rolling direction, and is used for high-definition skin inspection and positioning. The processing module integrates all cameras, controls the motion of the mobile camera module, and performs the coarse and fine detection of the skin images.
Specifically, the fixed camera module is divided into five fixed cameras 101 on each of the left and right sides plus one central fixed camera 101. The five symmetrically placed fixed cameras 101 on each side respectively cover the nose (1 camera), the fuselage (1), the wings (2) and the empennage (1) of the aircraft, and the central fixed camera covers the middle section of the empennage; each fixed camera is labelled. The algorithm for detecting the aircraft skin may be a deep-learning-based detection algorithm or an image detection algorithm using target texture features. The camera resolution and focal length are selected according to the target detection probability:
The lower the resolution at which a defective part of the skin is imaged, the lower the probability of detecting the defect; the higher the resolution of the defective part, the smaller the area covered by detection. Specifically, the probability depends on the number of pixels the skin defect occupies in the image:
$$P(N)=\frac{(N/N_{\min})^{E}}{1+(N/N_{\min})^{E}},\qquad E=2.7+0.7\,\frac{N}{N_{\min}}$$

where $P(N)$ is the detection probability of the aircraft skin, $N$ is the number of detected pixels, and $N_{\min}$ is the minimum detection resolution, which can be obtained from the aircraft skin identification algorithm; for example, in a deep-learning-based target recognition algorithm $N_{\min}$ is defined by small-target detection, i.e. a pixel extent of 32 × 32.
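A short Python sketch of this probability model; it uses the formula reconstructed above, and the function and argument names are illustrative only.

def detection_probability(n_pixels, n_min=32):
    # P(N): detection probability as a function of the pixel extent of the
    # defect (n_pixels) and the minimum detection resolution n_min.
    ratio = n_pixels / n_min
    e = 2.7 + 0.7 * ratio
    return ratio ** e / (1.0 + ratio ** e)

# With n_min = 32 (small-target detection) and a 64-pixel defect image,
# detection_probability(64) is about 0.945, i.e. the 94.5% quoted above.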
The mobile camera 102 has a higher resolution and a longer focal length than the fixed cameras 101; it can translate along the guide rail on the support 103 and move in the rolling direction, and is used for high-definition aircraft skin inspection and positioning. Detection with the mobile camera 102 proceeds as follows:
First, when the fixed camera 101 with label k detects a target, the detection probability of the defective part is analysed; when the detection probability is greater than 0.8, the operator visually judges whether the part is a skin defect; when the detection probability is less than 0.8, the mobile camera module performs high-precision alignment detection of the region.
Second, according to the label of the fixed camera 101, when camera k detects a target the mobile camera module moves quickly above the suspected skin defect for coarse alignment.
Third, the suspected region is precisely aligned and inspected using the image matching algorithm.
As shown in FIG. 2, in the aircraft skin image matching diagram the left side is the image 107 captured by the mobile camera and the right side is the image 108 captured by the fixed camera; the feature points 110 of the left and right images provide the matching lines 111 used for algorithm matching, and the box on the right marks the position 109 in the right image matched to the left image. The feature points used by the image matching algorithm may be SIFT (Scale-Invariant Feature Transform) descriptors, SURF (Speeded-Up Robust Features) descriptors, or similar; the matching lines 111 between the left and right image feature points and the matched position 109 of the left image in the right image are obtained with the random sample consensus (RANSAC) algorithm.
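A minimal OpenCV sketch of this feature extraction and matching step (SIFT keypoints with brute-force matching and Lowe's ratio test); the file names are placeholders, and availability of the SIFT implementation depends on the installed OpenCV build.

import cv2

img_mobile = cv2.imread("mobile_107.png", cv2.IMREAD_GRAYSCALE)  # image 107
img_fixed = cv2.imread("fixed_108.png", cv2.IMREAD_GRAYSCALE)    # image 108

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img_mobile, None)
kp2, des2 = sift.detectAndCompute(img_fixed, None)

# Brute-force matching with Lowe's ratio test to keep distinctive matches only
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = []
for pair in matcher.knnMatch(des1, des2, k=2):
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])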
The RANSAC algorithm finds an optimal 3 × 3 homography matrix H, i.e. the parameter matrix that is satisfied by the largest number of data points. Although the homography matrix has 9 entries, it is defined only up to scale and therefore has 8 degrees of freedom; each pair of matching points supplies two linear equations from its position information, so at least 4 groups of matching point pairs are required to solve for H.
$$\begin{bmatrix}x'\\ y'\\ 1\end{bmatrix}\sim H\begin{bmatrix}x\\ y\\ 1\end{bmatrix}=\begin{bmatrix}h_{11}&h_{12}&h_{13}\\ h_{21}&h_{22}&h_{23}\\ h_{31}&h_{32}&h_{33}\end{bmatrix}\begin{bmatrix}x\\ y\\ 1\end{bmatrix}$$
The RANSAC algorithm proceeds as follows:
Randomly draw 8 samples (which must not be collinear) from the data set, compute the transformation matrix H, and record it as model M;
Compute the projection error between every datum in the data set and model M; if the error is below the threshold, add the datum to the inlier set I; if the error exceeds the threshold, update the data;
If the number of iterations exceeds m, stop; otherwise increase the iteration count by 1 and repeat the above steps.
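Continuing the feature-matching sketch above, the following lines show one way to run this RANSAC homography estimation with OpenCV's built-in routine instead of a hand-written loop; the reprojection threshold and iteration limit are illustrative choices.

import numpy as np
import cv2

# kp1, kp2 and good come from the feature-matching sketch after FIG. 2
src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

# Model M: the 3x3 homography H; mask marks the inlier set I
H, mask = cv2.findHomography(src, dst, cv2.RANSAC,
                             ransacReprojThreshold=3.0, maxIters=2000)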
After the homography matrix H is obtained, SVD (Singular Value Decomposition) is used to obtain the rotation matrix and the translation vector, and the processing unit then controls the motion of the mobile camera. The specific steps are as follows:
① The homography H is decomposed by SVD, $H=U\Sigma V^{T}$, where $U$ and $V$ are orthogonal matrices and $\Sigma$ is the singular value matrix, separating the translation vector $t$ from the rotation component.
② From $U$, $\Sigma$ and $V$ the rotation matrix $R$ and the translation vector $t$ are obtained.
③ The Euler angles are computed from the rotation matrix $R$, giving the angles about the three axes:

$$\theta_x=\operatorname{atan2}(r_{32},r_{33}),\qquad \theta_y=\operatorname{atan2}\!\left(-r_{31},\sqrt{r_{32}^{2}+r_{33}^{2}}\right),\qquad \theta_z=\operatorname{atan2}(r_{21},r_{11})$$

④ The three-axis angles and the translation vector obtained by matching the mobile-camera image against the fixed-camera image are detected in real time, and the processing unit moves the mobile camera so that they are minimised: if the previous movement reduced the angle about a given axis, the next movement continues in that direction; otherwise the camera moves in the opposite direction.
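A hedged Python sketch of steps ① to ④, using OpenCV's homography decomposition (which performs the SVD-based analysis internally) together with the Euler-angle extraction given above; the camera intrinsic matrix K is an assumption that would come from calibration, and taking the first of the candidate decompositions is a simplification for illustration.

import numpy as np
import cv2

def euler_angles(R):
    # Angles about the x, y and z axes from the rotation matrix (ZYX convention)
    theta_x = np.arctan2(R[2, 1], R[2, 2])
    theta_y = np.arctan2(-R[2, 0], np.sqrt(R[2, 1] ** 2 + R[2, 2] ** 2))
    theta_z = np.arctan2(R[1, 0], R[0, 0])
    return theta_x, theta_y, theta_z

def camera_offsets(H, K):
    # Decompose the homography H between the mobile and fixed camera views into
    # candidate rotations and translations; the first solution is used here,
    # although a real system would disambiguate the candidates.
    _, rotations, translations, _ = cv2.decomposeHomographyMat(H, K)
    R, t = rotations[0], translations[0]
    return euler_angles(R), t

# The processing unit would then command the mobile camera so that the returned
# angles and translation shrink: keep moving in a direction while it reduces the
# offset, otherwise reverse the direction of motion.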
A processing module: all cameras are integrated into the processing unit, which controls the motion of the mobile camera module and performs the coarse and fine detection of the skin images. The fixed camera module is managed with one network cable per camera, and the fixed cameras are centrally managed through a hub so that they are integrated into one processing unit for coarse skin detection. The mobile camera module is managed independently over a wireless link: wireless signals drive its horizontal and rolling-direction movement along the guide rail of the support and carry its image transmission. The images may be transmitted compressed or uncompressed: when the operator needs to confirm whether a defect is present the image is transmitted uncompressed, and otherwise images can be transmitted with JPEG compression.
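As an illustration of these two transmission modes, a small Python sketch using OpenCV's JPEG encoder; the quality setting and the confirmation flag are assumptions for illustration.

import cv2

def encode_for_transmission(frame, operator_needs_confirmation, jpeg_quality=90):
    # Uncompressed when the operator must confirm a defect, JPEG otherwise.
    if operator_needs_confirmation:
        return frame.tobytes()                       # raw, lossless transfer
    ok, buf = cv2.imencode(".jpg", frame,
                           [cv2.IMWRITE_JPEG_QUALITY, jpeg_quality])
    return buf.tobytes() if ok else frame.tobytes()  # fall back to raw on failure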
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (4)

1. A method for detecting defects of an aircraft skin based on multiple viewing angles, using a detection system, characterized in that the detection system comprises a support, a fixed camera module and a mobile camera module mounted on the support, and a processing module connected with the fixed camera module and the mobile camera module, wherein the fixed camera module integrates its cameras into the processing module through multi-channel signal acquisition and hub management and performs coarse detection of the skin; the mobile camera module is managed independently in a wireless mode, being controlled and transmitting images over wireless signals; the fixed camera module comprises a plurality of first cameras fixed symmetrically on the left and right sides of the support and a second camera fixed at the middle of the lower side of the support, and the mobile camera module comprises a third camera movably mounted on the support;
the number of first cameras on the left side of the support is 5, respectively covering the nose (1 camera), the fuselage (1), the wings (2) and the empennage (1); the second camera is used for detecting the middle part of the empennage; and the first cameras and the second camera are provided with labels;
the detection method comprises the following steps:
s1, carrying out coarse detection on the aircraft skin by the fixed camera module to obtain the detection probability of the aircraft skin, wherein the resolution and the focal length of the camera are selected according to the target detection probability; probability of aircraft skin detectionP(N)Comprises the following steps:
Figure DEST_PATH_IMAGE002A
wherein
Figure DEST_PATH_IMAGE004A
In order to detect the number of points,
Figure DEST_PATH_IMAGE006AAA
for minimum detected resolution, it can be obtained from an aircraft skin identification algorithm
Figure DEST_PATH_IMAGE006AAAA
In a target recognition algorithm based on deep learning
Figure DEST_PATH_IMAGE006AAAAA
The method is defined as small target detection, the pixel value is 32 x 32, and when the resolution of a defect image is greater than 64, the probability of detecting a target defect is 94.5%;
S2, the mobile camera module moves along the support and performs fine detection of the aircraft skin.
2. The detection method according to claim 1, characterized in that: the fine detection comprises the following steps:
a1, when a fixed camera with a mark number of k detects a target, analyzing the detection probability of a defect part, and when the detection probability is more than T, visually judging whether the defect part is a skin defect part by an operator; when the detection probability is smaller than T, carrying out high-precision alignment detection on the detection area by using the mobile camera module;
a2, according to the label of the fixed camera, when the camera k detects a target, the mobile camera module is controlled by the processing module to accurately move above the suspected skin defect for coarse alignment;
and A3, carrying out accurate alignment detection on the suspected area according to an image matching algorithm.
3. The detection method according to claim 2, characterized in that the image matching algorithm comprises the following steps:
B1, randomly drawing 8 non-collinear samples from the data set and, within the RANSAC algorithm, computing the transformation matrix H, recorded as model M;
B2, computing the projection error between every datum in the data set and model M; if the error is below a threshold, adding the datum to the inlier set I; if the error exceeds the threshold, updating the data;
B3, if the number of iterations exceeds m, stopping; otherwise increasing the iteration count by 1 and repeating the above steps.
4. The detection method according to claim 3, characterized in that the method for precisely moving the mobile camera module comprises the following steps:
C1, the homography matrix H is decomposed by singular value decomposition, $H=U\Sigma V^{T}$, where $U$ and $V$ are orthogonal matrices and $\Sigma$ is the singular value matrix, separating the translation vector $t$ from the rotation component;
C2, from $U$, $\Sigma$ and $V$ the rotation matrix $R$ and the translation vector $t$ are obtained;
C3, the Euler angles are computed from the rotation matrix $R$, giving the angles about the three axes:

$$\theta_x=\operatorname{atan2}(r_{32},r_{33}),\qquad \theta_y=\operatorname{atan2}\!\left(-r_{31},\sqrt{r_{32}^{2}+r_{33}^{2}}\right),\qquad \theta_z=\operatorname{atan2}(r_{21},r_{11})$$

C4, the three-axis angles and translation vector obtained by matching the mobile-camera image against the fixed-camera image are detected in real time, and the processing unit controls the movement of the mobile camera accordingly.
CN202110078491.1A 2021-01-21 2021-01-21 Multi-view-angle-based aircraft skin defect detection method Active CN112396604B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110078491.1A CN112396604B (en) 2021-01-21 2021-01-21 Multi-view-angle-based aircraft skin defect detection method


Publications (2)

Publication Number Publication Date
CN112396604A 2021-02-23
CN112396604B 2021-03-30

Family

ID=74624967

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110078491.1A Active CN112396604B (en) 2021-01-21 2021-01-21 Multi-view-angle-based aircraft skin defect detection method

Country Status (1)

Country Link
CN (1) CN112396604B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114295678B (en) * 2021-12-07 2023-09-19 北京卫星制造厂有限公司 Detection equipment for satellite force bearing barrel

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7716987B2 (en) * 2006-07-31 2010-05-18 University Of Dayton Non-contact thermo-elastic property measurement and imaging system for quantitative nondestructive evaluation of materials
CN102542261B (en) * 2011-12-31 2014-03-26 华中科技大学 Two-dimensional computable target detection, recognition and identification performance predicting method
CN108447058B (en) * 2018-03-30 2020-07-14 北京理工大学 Image quality evaluation method and system
CN110174413B (en) * 2019-06-13 2021-08-31 中新红外科技(武汉)有限公司 Blade defect detection method and maintenance method
CN110654571B (en) * 2019-11-01 2023-10-20 西安航通测控技术有限责任公司 Nondestructive testing robot system and method for surface defects of aircraft skin
CN111340754B (en) * 2020-01-18 2023-08-25 中国人民解放军国防科技大学 Method for detecting and classifying surface defects of aircraft skin

Also Published As

Publication number Publication date
CN112396604A (en) 2021-02-23


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant