CN118196053A - Method and device for fusing surface images of aircraft panel - Google Patents


Info

Publication number
CN118196053A
CN118196053A (application CN202410355429.6A)
Authority
CN
China
Prior art keywords
image
calibration point
calibration
positioning model
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410355429.6A
Other languages
Chinese (zh)
Inventor
尹佳
王玮
吕小兵
郭中华
黄敏
王浩熠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AVIC Xian Aircraft Industry Group Co Ltd
Original Assignee
AVIC Xian Aircraft Industry Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by AVIC Xian Aircraft Industry Group Co Ltd filed Critical AVIC Xian Aircraft Industry Group Co Ltd
Priority to CN202410355429.6A priority Critical patent/CN118196053A/en
Publication of CN118196053A publication Critical patent/CN118196053A/en
Pending legal-status Critical Current


Landscapes

  • Image Processing (AREA)

Abstract

The application provides a method and a device for fusing surface images of an aircraft panel. The method comprises the following steps: step S1, acquiring an image of each shooting area of the aircraft panel, wherein the aircraft panel is divided in advance into a plurality of shooting areas by gridding, and each shooting area contains four calibration points of a specific color arranged at the grid corners; step S2, determining the coordinates of each calibration point on each image based on a calibration point identification positioning model; and step S3, splicing adjacent images in pairs, in the order of the shooting areas, according to the coordinates of the calibration points shared by the two images, to form a surface image of the entire panel. The application realizes image fusion for aircraft panels, avoids the risk of missed detections caused by the limited field of view of individual acquired images, and improves the overall intuitiveness of panel image inspection.

Description

Method and device for fusing surface images of aircraft panel
Technical Field
The application belongs to the technical field of aviation digital measurement, and particularly relates to a method and a device for fusing surface images of aircraft panels.
Background
As aviation intelligent manufacturing capability continues to improve, an overall surface image of an aircraft panel is required so that the quality of the inspected workpiece can be checked in real time. In general, the field of view of an image capture station is limited, and because of the high precision required of the captured pixels, panoramic photography of a large aircraft panel is not feasible.
Disclosure of Invention
To solve the above technical problem, the application provides a method and a device for fusing surface images of an aircraft panel, in which the panel is photographed region by region and the local pictures are then combined into a surface image of the whole panel.
In a first aspect of the present application, a method for fusing surface images of an aircraft panel comprises:
step S1, acquiring an image of each shooting area of the aircraft panel, wherein the aircraft panel is divided in advance into a plurality of shooting areas by gridding, and each shooting area contains four calibration points of a specific color arranged at the grid corners;
step S2, determining the coordinates of each calibration point on each image based on a calibration point identification positioning model; and
step S3, splicing adjacent images in pairs, in the order of the shooting areas, according to the coordinates of the calibration points shared by the two images, to form a surface image of the entire panel.
Preferably, in step S1, the calibration points of the specific color are black calibration dots, and the black calibration dots are preset on the aircraft panel by way of pasting.
Preferably, step S2 is preceded by the further steps of:
Step S201, constructing a training set by pasting calibration points of a specific color on a flat plate and labeling their coordinates;
Step S202, training a preselected deep neural network architecture with the training set to obtain a first calibration point identification positioning model, wherein the preselected deep neural network architecture has a plurality of network layers;
Step S203, selecting a set number of network layers as layers to be clipped, connecting the network layers immediately before and after each layer to be clipped, taking the L1 norm of the output feature F of each layer to be clipped as a channel-score penalty term of the loss function, performing sparsification training on the first calibration point identification positioning model, and removing the clipped layers to obtain a second calibration point identification positioning model, which is used as the final calibration point identification positioning model.
Preferably, after step S201, the images in the training set are further subjected to scale and/or angle transformation.
Preferably, step S3 further includes:
Step S31, determining adjacent edges of two adjacent images;
step S32, determining the coordinates (x1, y1) and (x2, y2) of the two calibration points on the adjacent edge of the first image, and the coordinates (x1', y1') and (x2', y2') of the two calibration points on the adjacent edge of the second image that physically coincide with the two calibration points of the first image;
step S33, determining the scaling factor r of the second image according to the formula
r = sqrt((x2 - x1)^2 + (y2 - y1)^2) / sqrt((x2' - x1')^2 + (y2' - y1')^2),
and determining the rotation angle θ of the second image according to the formula
θ = arctan((y2 - y1) / (x2 - x1)) - arctan((y2' - y1') / (x2' - x1'));
Step S34, correcting the second image according to the scaling and the rotation angle;
step S35, determining the coordinates of the calibration points on the adjacent edge of the corrected second image based on the calibration point identification positioning model;
step S36, cutting and splicing the images along the line connecting the coordinates of the two calibration points on the adjacent edges of the images.
In a second aspect of the present application, an aircraft panel surface image fusion apparatus comprises:
an image acquisition module, used for acquiring an image of each shooting area of the aircraft panel, wherein the aircraft panel is divided in advance into a plurality of shooting areas by gridding, and each shooting area contains four calibration points of a specific color arranged at the grid corners;
a calibration point coordinate identification module, used for determining the coordinates of each calibration point on each image based on the calibration point identification positioning model; and
an image splicing module, used for splicing adjacent images in pairs, in the order of the shooting areas, according to the coordinates of the calibration points shared by the two images, to form a surface image of the entire panel.
Preferably, in the image acquisition module, the calibration points of the specific color are black calibration dots, and the black calibration dots are preset on the aircraft panel in a pasting mode.
Preferably, the aircraft panel surface image fusion device further comprises a calibration point identification positioning model training module, the training module comprising:
a training set construction unit, used for constructing a training set by pasting calibration points of a specific color on a flat plate and labeling their coordinates;
a first calibration point identification positioning model training unit, used for training a preselected deep neural network architecture with the training set to obtain a first calibration point identification positioning model, wherein the preselected deep neural network architecture has a plurality of network layers;
a second calibration point identification positioning model training unit, used for selecting a set number of network layers as layers to be clipped, connecting the network layers immediately before and after each layer to be clipped, taking the L1 norm of the output feature F of each layer to be clipped as a channel-score penalty term of the loss function, performing sparsification training on the first calibration point identification positioning model, and removing the clipped layers to obtain a second calibration point identification positioning model, which is used as the final calibration point identification positioning model.
Preferably, the training module further comprises:
And the image enhancement unit is used for performing scale transformation and/or angle transformation on the images in the training set.
Preferably, the image stitching module includes:
an adjacent edge determining unit for determining adjacent edges of two adjacent images;
a shared calibration point determining unit, used for determining the coordinates (x1, y1) and (x2, y2) of the two calibration points on the adjacent edge of the first image, and the coordinates (x1', y1') and (x2', y2') of the two calibration points on the adjacent edge of the second image that physically coincide with the two calibration points of the first image;
a scaling and rotation angle calculating unit, used for determining the scaling factor r of the second image according to the formula
r = sqrt((x2 - x1)^2 + (y2 - y1)^2) / sqrt((x2' - x1')^2 + (y2' - y1')^2),
and the rotation angle θ of the second image according to the formula
θ = arctan((y2 - y1) / (x2 - x1)) - arctan((y2' - y1') / (x2' - x1'));
An image correction unit for correcting the second image according to the scaling and the rotation angle;
the calibration point re-identification unit is used for determining calibration point coordinates on adjacent edges on the corrected second image based on the calibration point identification positioning model;
a cutting and splicing unit, used for cutting and splicing the images along the line connecting the coordinates of the two calibration points on the adjacent edges of the images.
In a third aspect of the application, a computer device comprises a processor, a memory and a computer program stored on the memory and executable on the processor, the processor executing the computer program for implementing the aircraft panel surface image fusion method as defined in any one of the preceding claims.
In a fourth aspect of the application, a readable storage medium stores a computer program for implementing an aircraft panel surface image fusion method as described above when executed by a processor.
The deep-learning-based calibration point identification and image correction and splicing technique provided by the application enables rapid fusion of the images of all panel regions, provides the necessary source of data for subsequent digital and intelligent product quality inspection, avoids the risk of missed detections caused by the limited field of view of individual acquired images, and improves the overall intuitiveness of panel image inspection.
Drawings
FIG. 1 is a flow chart of a preferred embodiment of the aircraft panel surface image fusion method of the present application.
FIG. 2 is a schematic diagram of a network layer sparse training and compression design in accordance with the embodiment of FIG. 1.
Fig. 3 is a schematic diagram of a computer device suitable for use in implementing an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application become more apparent, the technical solutions in the embodiments of the present application will be described in more detail with reference to the accompanying drawings in the embodiments of the present application. In the drawings, the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The described embodiments are some, but not all, embodiments of the application. The embodiments described below by referring to the drawings are exemplary and intended to illustrate the present application and should not be construed as limiting the application. All other embodiments, based on the embodiments of the application, which are apparent to those of ordinary skill in the art without inventive faculty, are intended to be within the scope of the application. Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
According to a first aspect of the present application, as shown in fig. 1, a method for fusing an image of a surface of an aircraft panel mainly includes:
Step S1, acquiring an image of each shooting area of the aircraft panel, wherein the aircraft panel is divided in advance into a plurality of shooting areas by gridding, and each shooting area contains four calibration points of a specific color arranged at the grid corners.
In some alternative embodiments, the calibration points of the specific color are black calibration dots, and the black calibration dots are preset on the aircraft panel by pasting.
In step S1, black calibration dots are pasted on the panel part in advance, the panel is divided by grid marks into a plurality of shooting areas, and an industrial camera acquires an image of the workpiece surface within each grid, yielding a plurality of panel region images.
When the industrial camera acquires the image of the workpiece surface within each grid, the camera should face the grid center as squarely as possible, and the captured image may be slightly larger than the grid area; in this way a plurality of panel region images are obtained.
In addition, the calibration dots can generally be arranged in a matrix on the surface of the aircraft panel and serve as the basis for the grid division, with the area between adjacent calibration dots forming one grid; at least the four calibration dots of a grid should be captured in each shot. In an alternative embodiment, the aircraft panel may be divided into grids by scribing, with a plurality of calibration points covered by each grid.
Step S2, determining the coordinates of each calibration point on each image based on the calibration point identification positioning model.
The calibration point identification positioning model is used to quickly identify and locate the black calibration dots on the image of each panel region and to obtain their coordinates on each region image. To this end, the calibration point identification positioning model must be obtained in advance by training.
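By way of illustration only (the function and the model interface below are assumptions, not part of the application), the following Python sketch shows how a trained calibration point identification positioning model could be run over each region image to collect the dot-centre coordinates used in the later splicing steps:

```python
# A minimal sketch, assuming the trained detector returns axis-aligned boxes
# with confidence scores; all names here are illustrative.
def locate_calibration_points(model, region_images, score_thresh=0.5):
    """region_images: dict mapping region name -> image array."""
    coords = {}
    for name, img in region_images.items():
        detections = model(img)  # assumed: iterable of (x1, y1, x2, y2, score)
        coords[name] = [
            ((x1 + x2) / 2.0, (y1 + y2) / 2.0)      # centre of each detected dot
            for (x1, y1, x2, y2, score) in detections
            if score >= score_thresh
        ]
    return coords
```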
In some alternative embodiments, step S2 is preceded by further comprising:
Step S201, constructing a training set by pasting calibration points of a specific color on a flat plate and labeling their coordinates;
Step S202, training a preselected deep neural network architecture with the training set to obtain a first calibration point identification positioning model, wherein the preselected deep neural network architecture has a plurality of network layers;
Step S203, selecting a set number of network layers as layers to be clipped, connecting the network layers immediately before and after each layer to be clipped, taking the L1 norm of the output feature F of each layer to be clipped as a channel-score penalty term of the loss function, performing sparsification training on the first calibration point identification positioning model, and removing the clipped layers to obtain a second calibration point identification positioning model, which is used as the final calibration point identification positioning model.
In this embodiment, in step S201 a training set is constructed: after the calibration points on each image are located and labeled, a calibration point image database is built. Those skilled in the art will understand that a portion of this set may further be used as a validation set, a test set, etc. to verify and test the finally trained model.
After the calibration points on each black calibration dot image have been located and labeled, an XML-format label document is generated; the label document contains the name of the image and the coordinates of the calibration point targets in the image, and the calibration point image database comprises the black calibration dot images and their corresponding XML label documents.
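A minimal sketch of generating such a label document is given below; the application does not specify the exact XML schema, so the tag names used here are assumptions for illustration:

```python
import xml.etree.ElementTree as ET

def write_label_document(xml_path, image_name, calibration_points):
    """calibration_points: list of (x, y) pixel coordinates of the dots."""
    root = ET.Element("annotation")
    ET.SubElement(root, "filename").text = image_name
    for x, y in calibration_points:
        obj = ET.SubElement(root, "object")
        ET.SubElement(obj, "name").text = "calibration_dot"
        ET.SubElement(obj, "x").text = str(x)
        ET.SubElement(obj, "y").text = str(y)
    ET.ElementTree(root).write(xml_path, encoding="utf-8", xml_declaration=True)
```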
In some alternative embodiments, after step S201 the images in the training set are further subjected to scale and/or angle transformation. This serves to enlarge the training set and to increase the robustness of the model.
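One possible way to apply such scale and angle transformations while keeping the labeled calibration point coordinates consistent is sketched below (OpenCV is assumed; this is an illustrative routine, not the application's implementation):

```python
import cv2
import numpy as np

def augment(image, points, scale, angle_deg):
    """Rotate/scale an image about its centre and transform the labeled
    calibration point coordinates accordingly. points: N x 2 numpy array."""
    h, w = image.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle_deg, scale)  # 2x3 affine
    out = cv2.warpAffine(image, M, (w, h))
    pts = np.hstack([points, np.ones((len(points), 1))])  # homogeneous coordinates
    return out, (M @ pts.T).T                              # transformed point labels
```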
In step S202, the preselected deep neural network architecture is trained on the training set. It should be noted that, in the field of aircraft panel surface imaging, the requirements on calibration point detection speed and accuracy are high, so a suitable deep neural network architecture must be selected; a lightweight YOLO-series deep neural network architecture is preferably used here. The training procedure is conventional: the images in the training set are input into the deep neural network architecture, the output calibration point results are compared with the calibration point coordinates labeled in the training set, the difference is fed into the loss function and used to update the internal parameters of the architecture, and when the convergence condition or the accuracy requirement is met, the trained first calibration point identification positioning model is obtained.
As described above, in order to further increase the calibration point detection speed, the first calibration point identification positioning model is optimized in step S203: its network layers are clipped and the model is compressed through sparsification training, so as to reduce the computation of the final model. Specifically, a set number of network layers are selected from the first calibration point identification positioning model as layers to be clipped, such as the middle network layer in fig. 2; a shortcut connection is applied around each selected layer, and the L1 norm of the output feature F of each layer to be clipped is computed and introduced into the loss function used for sparsification training, so that the channel-score penalty term of the sparsification training loss function is
R(F) = Σ_l ||F_l||_1,
where the sum runs over the layers selected for clipping and L denotes the total number of network layers to be sparsified out of the model. For example, to clip the 10th of 16 network layers, the output of the 9th layer is fed into the 11th layer and the L1 norm ||F_10||_1 of the 10th layer's output feature is computed; to clip the 9th and 10th of 16 network layers, the output of the 8th layer is fed into the 11th layer and the L1 norms of the output features of the 9th and 10th layers are computed, i.e. R(F) = ||F_9||_1 + ||F_10||_1. This penalty term is added to the loss function of the new deep neural network architecture; after the sparsification training of the calibration point identification positioning model is completed, the sparse network layers are cut from the model, yielding the second calibration point identification positioning model.
It should be noted that, when clipping network layers, more layers can be cut step by step in a trial-clipping manner: one network layer is clipped first, and after training, if the accuracy requirement is still met, a further layer is clipped from the remaining layers, and so on until the accuracy requirement is no longer met. In this way as many network layers as possible can be removed, saving the greatest amount of computation.
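The shortcut connection and the channel-score penalty term described above can be sketched as follows (a PyTorch sketch under the assumption that each clipped layer preserves the shape of its input feature; names such as PrunableBlock and detection_loss are illustrative):

```python
import torch
import torch.nn as nn

class PrunableBlock(nn.Module):
    """Wraps a layer selected for clipping; the shortcut connects the layers
    before and after it so the network can learn to bypass it."""
    def __init__(self, layer: nn.Module):
        super().__init__()
        self.layer = layer
        self.last_l1 = torch.zeros(())   # ||F||_1 of the latest forward pass

    def forward(self, x):
        f = self.layer(x)
        self.last_l1 = f.abs().sum()     # L1 norm of the output feature F
        return x + f                     # shortcut around the layer to be clipped

def channel_score_penalty(blocks, weight=1e-4):
    """R(F): sum of the L1 norms over all layers selected for clipping."""
    return weight * sum(b.last_l1 for b in blocks)

# During sparsification training (detector and detection_loss are placeholders):
#   loss = detection_loss(detector(images), targets) + channel_score_penalty(blocks)
#   loss.backward(); optimizer.step()
# Layers whose output features are driven toward zero are then cut from the model.
```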
Step S3, splicing adjacent images in pairs, in the order of the shooting areas, according to the coordinates of the calibration points shared by the two images, to form a surface image of the entire panel.
In this step, the shooting area order can generally be expressed in matrix form, which determines which two images need to be stitched. For example, the image in the first row, first column and the image in the first row, second column are stitched left to right; the two calibration points on the right side of the former and the two calibration points on the left side of the latter are the shared calibration points (each pair corresponds to the same physical dot), so image stitching can be performed based on these two pairs of calibration point coordinates.
In step S3, all the images are grouped and spliced in pairs according to the shooting area order; the resulting spliced images are again grouped and spliced in pairs, and so on, until the complete surface image of the aircraft panel is finally formed.
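One straightforward realization of this pairwise order, assuming the region images are arranged in a row/column grid and that stitch_left_right and stitch_top_bottom implement the correction-and-splice procedure described below, is sketched here:

```python
def fuse_panel(grid_images, stitch_left_right, stitch_top_bottom):
    """grid_images[i][j] is the image of the shooting area in row i, column j."""
    row_strips = []
    for row in grid_images:
        strip = row[0]
        for img in row[1:]:
            strip = stitch_left_right(strip, img)   # splice via shared right/left dots
        row_strips.append(strip)
    panel = row_strips[0]
    for strip in row_strips[1:]:
        panel = stitch_top_bottom(panel, strip)     # splice via shared bottom/top dots
    return panel
```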
In some alternative embodiments, step S3 further comprises:
In step S31, adjacent edges of two adjacent images are determined, for example, a right edge of a left image and a left edge of a right image are adjacent edges, and a lower edge of an upper image and an upper edge of a lower image are adjacent edges.
Step S32, determining the coordinates (x1, y1) and (x2, y2) of the two calibration points on the adjacent edge of the first image, and the coordinates (x1', y1') and (x2', y2') of the two calibration points on the adjacent edge of the second image that physically coincide with the two calibration points of the first image.
Step S33, determining the scaling factor r of the second image according to the formula
r = sqrt((x2 - x1)^2 + (y2 - y1)^2) / sqrt((x2' - x1')^2 + (y2' - y1')^2),
and determining the rotation angle θ of the second image according to the formula
θ = arctan((y2 - y1) / (x2 - x1)) - arctan((y2' - y1') / (x2' - x1')).
Step S34, correcting the second image according to the scaling factor and the rotation angle.
Taking the splicing of a left image and a right image as an example, a pair of physical calibration points Z1 and Z2 is shared by the two images. The upper calibration point Z1 and the lower calibration point Z2 on the right side of the first image are taken as reference points. First, the distance between the reference points Z1 and Z2 and the distance between the corresponding upper and lower calibration points shared on the left side of the second image are calculated, and the second image is scaled by the ratio of the two distances to obtain a scaled second image. The scaled second image is then rotated, taking the line connecting the upper and lower calibration points on the right side of the first image as the reference direction, to obtain the corrected second image.
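A minimal sketch of this correction, computing r and θ from the two shared calibration points and warping the second image with OpenCV, is given below (the sign of the rotation passed to OpenCV depends on the image coordinate convention, so the routine is illustrative rather than definitive):

```python
import math
import cv2

def correct_second_image(img2, p1, p2, q1, q2):
    """p1, p2: reference calibration points Z1, Z2 in the first image;
    q1, q2: the physically identical points detected in the second image.
    All points are (x, y) pixel coordinates."""
    r = math.dist(p1, p2) / math.dist(q1, q2)             # scaling factor
    theta = math.degrees(
        math.atan2(p2[1] - p1[1], p2[0] - p1[0])
        - math.atan2(q2[1] - q1[1], q2[0] - q1[0]))       # rotation angle
    h, w = img2.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), -theta, r)
    return cv2.warpAffine(img2, M, (w, h)), r, theta
```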
Step S35, determining the coordinates of the calibration points on the adjacent edge of the corrected second image based on the calibration point identification positioning model.
Step S36, cutting and splicing the images along the line connecting the coordinates of the two calibration points on the adjacent edges of the images.
For example, the corrected second image is input into the calibration point identification positioning model to obtain the corrected coordinates of the calibration points Z1 and Z2. The first image is then cut along the line connecting Z1 and Z2 and its left part is retained; the corrected second image is cut along the line connecting its corresponding calibration points and its right part is retained; and the two retained parts are spliced together. By continuing to splice adjacent region images in this way, the surface image of the entire panel is formed.
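The cut-and-splice step for the left-right case can be sketched as follows, assuming the shared calibration points lie on an approximately vertical line after correction (illustrative names, not the application's code):

```python
import numpy as np

def splice_left_right(img1, z1, z2, img2_corr, z1c, z2c):
    """z1, z2: shared points on the right edge of img1;
    z1c, z2c: the corresponding re-detected points on the corrected img2_corr."""
    x_cut1 = int(round((z1[0] + z2[0]) / 2.0))    # cut position in the first image
    x_cut2 = int(round((z1c[0] + z2c[0]) / 2.0))  # cut position in the corrected second image
    left, right = img1[:, :x_cut1], img2_corr[:, x_cut2:]
    h = min(left.shape[0], right.shape[0])        # heights may differ slightly after warping
    return np.hstack([left[:h], right[:h]])
```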
The second aspect of the present application provides an aircraft panel surface image fusion device corresponding to the above method, mainly comprising:
an image acquisition module, used for acquiring an image of each shooting area of the aircraft panel, wherein the aircraft panel is divided in advance into a plurality of shooting areas by gridding, and each shooting area contains four calibration points of a specific color arranged at the grid corners;
a calibration point coordinate identification module, used for determining the coordinates of each calibration point on each image based on the calibration point identification positioning model; and
an image splicing module, used for splicing adjacent images in pairs, in the order of the shooting areas, according to the coordinates of the calibration points shared by the two images, to form a surface image of the entire panel.
In some alternative embodiments, in the image acquisition module, the calibration points of the specific color are black calibration dots, and the black calibration dots are preset on the aircraft panel by means of pasting.
In some alternative embodiments, the aircraft panel surface image fusion apparatus further includes a calibration point identification positioning model training module, the training module including:
a training set construction unit, used for constructing a training set by pasting calibration points of a specific color on a flat plate and labeling their coordinates;
a first calibration point identification positioning model training unit, used for training a preselected deep neural network architecture with the training set to obtain a first calibration point identification positioning model, wherein the preselected deep neural network architecture has a plurality of network layers;
a second calibration point identification positioning model training unit, used for selecting a set number of network layers as layers to be clipped, connecting the network layers immediately before and after each layer to be clipped, taking the L1 norm of the output feature F of each layer to be clipped as a channel-score penalty term of the loss function, performing sparsification training on the first calibration point identification positioning model, and removing the clipped layers to obtain a second calibration point identification positioning model, which is used as the final calibration point identification positioning model.
In some alternative embodiments, the training module further comprises:
And the image enhancement unit is used for performing scale transformation and/or angle transformation on the images in the training set.
In some alternative embodiments, the image stitching module includes:
an adjacent edge determining unit for determining adjacent edges of two adjacent images;
a shared calibration point determining unit, used for determining the coordinates (x1, y1) and (x2, y2) of the two calibration points on the adjacent edge of the first image, and the coordinates (x1', y1') and (x2', y2') of the two calibration points on the adjacent edge of the second image that physically coincide with the two calibration points of the first image;
a scaling and rotation angle calculating unit, used for determining the scaling factor r of the second image according to the formula
r = sqrt((x2 - x1)^2 + (y2 - y1)^2) / sqrt((x2' - x1')^2 + (y2' - y1')^2),
and the rotation angle θ of the second image according to the formula
θ = arctan((y2 - y1) / (x2 - x1)) - arctan((y2' - y1') / (x2' - x1'));
An image correction unit for correcting the second image according to the scaling and the rotation angle;
the calibration point re-identification unit is used for determining calibration point coordinates on adjacent edges on the corrected second image based on the calibration point identification positioning model;
a cutting and splicing unit, used for cutting and splicing the images along the line connecting the coordinates of the two calibration points on the adjacent edges of the images.
In a third aspect of the application, a computer device comprises a processor, a memory, and a computer program stored on the memory and executable on the processor, the processor executing the computer program for implementing a method of fusion of aircraft panel surface images.
In a fourth aspect of the application, a readable storage medium stores a computer program for implementing an aircraft panel surface image fusion method as described above when executed by a processor. The computer-readable storage medium may be contained in the apparatus described in the above embodiment; or may be present alone without being fitted into the device. The computer readable storage medium carries one or more programs which, when executed by the apparatus, process data as described above.
Referring now to FIG. 3, there is illustrated a schematic diagram of a computer device 400 suitable for use in implementing embodiments of the present application. The computer device shown in fig. 3 is only one example and should not be construed as limiting the functionality and scope of use of embodiments of the present application.
As shown in fig. 3, the computer device 400 includes a Central Processing Unit (CPU) 401, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 402 or a program loaded from a storage section 408 into a Random Access Memory (RAM) 403. In the RAM403, various programs and data required for the operation of the device 400 are also stored. The CPU401, ROM402, and RAM403 are connected to each other by a bus 404. An input/output (I/O) interface 405 is also connected to bus 404.
The following components are connected to the I/O interface 405: an input section 406 including a keyboard, a mouse, and the like; an output portion 407 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker, and the like; a storage section 408 including a hard disk or the like; and a communication section 409 including a network interface card such as a LAN card, a modem, or the like. The communication section 409 performs communication processing via a network such as the internet. The drive 410 is also connected to the I/O interface 405 as needed. A removable medium 411 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is installed on the drive 410 as needed, so that a computer program read therefrom is installed into the storage section 408 as needed.
In particular, according to embodiments of the present application, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network through the communication portion 409 and/or installed from the removable medium 411. The above-described functions defined in the method of the present application are performed when the computer program is executed by a Central Processing Unit (CPU) 401. The computer storage medium of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules or units described in the embodiments of the present application may be implemented by software, or may be implemented by hardware. The modules or units described may also be provided in a processor, the names of which do not in some cases constitute a limitation of the module or unit itself.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any changes or substitutions easily contemplated by those skilled in the art within the scope of the present application should be included in the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. An aircraft panel surface image fusion method, comprising:
step S1, acquiring an image of each shooting area of the aircraft panel, wherein the aircraft panel is divided in advance into a plurality of shooting areas by gridding, and each shooting area contains four calibration points of a specific color arranged at the grid corners;
step S2, determining the coordinates of each calibration point on each image based on a calibration point identification positioning model; and
step S3, splicing adjacent images in pairs, in the order of the shooting areas, according to the coordinates of the calibration points shared by the two images, to form a surface image of the entire panel.
2. The method for fusing an aircraft panel surface image according to claim 1, wherein in step S1, the calibration points of the specific color are black calibration dots, and the black calibration dots are preset on the aircraft panel by pasting.
3. The aircraft panel surface image fusion method of claim 1, further comprising, prior to step S2:
Step S201, constructing a training set by pasting calibration points of a specific color on a flat plate and labeling their coordinates;
Step S202, training a preselected deep neural network architecture with the training set to obtain a first calibration point identification positioning model, wherein the preselected deep neural network architecture has a plurality of network layers;
Step S203, selecting a set number of network layers as layers to be clipped, connecting the network layers immediately before and after each layer to be clipped, taking the L1 norm of the output feature F of each layer to be clipped as a channel-score penalty term of the loss function, performing sparsification training on the first calibration point identification positioning model, and removing the clipped layers to obtain a second calibration point identification positioning model, which is used as the final calibration point identification positioning model.
4. A method of aircraft panel surface image fusion according to claim 3, further comprising, after step S201, scaling and/or angle transforming the images in the training set.
5. The aircraft panel surface image fusion method of claim 1, wherein step S3 further comprises:
Step S31, determining adjacent edges of two adjacent images;
step S32, determining the coordinates (x1, y1) and (x2, y2) of the two calibration points on the adjacent edge of the first image, and the coordinates (x1', y1') and (x2', y2') of the two calibration points on the adjacent edge of the second image that physically coincide with the two calibration points of the first image;
step S33, determining the scaling factor r of the second image according to the formula
r = sqrt((x2 - x1)^2 + (y2 - y1)^2) / sqrt((x2' - x1')^2 + (y2' - y1')^2),
and determining the rotation angle θ of the second image according to the formula
θ = arctan((y2 - y1) / (x2 - x1)) - arctan((y2' - y1') / (x2' - x1'));
Step S34, correcting the second image according to the scaling and the rotation angle;
step S35, determining the coordinates of the calibration points on the adjacent edge of the corrected second image based on the calibration point identification positioning model;
step S36, cutting and splicing the images along the line connecting the coordinates of the two calibration points on the adjacent edges of the images.
6. An aircraft panel surface image fusion apparatus, comprising:
an image acquisition module, used for acquiring an image of each shooting area of the aircraft panel, wherein the aircraft panel is divided in advance into a plurality of shooting areas by gridding, and each shooting area contains four calibration points of a specific color arranged at the grid corners;
a calibration point coordinate identification module, used for determining the coordinates of each calibration point on each image based on the calibration point identification positioning model; and
an image splicing module, used for splicing adjacent images in pairs, in the order of the shooting areas, according to the coordinates of the calibration points shared by the two images, to form a surface image of the entire panel.
7. The aircraft panel surface image fusion apparatus of claim 6, wherein the specific color calibration points in the image acquisition module are black calibration dots that are pre-arranged on the aircraft panel by way of pasting.
8. The aircraft panel surface image fusion apparatus of claim 6, further comprising a calibration point identification positioning model training module comprising:
a training set construction unit, used for constructing a training set by pasting calibration points of a specific color on a flat plate and labeling their coordinates;
a first calibration point identification positioning model training unit, used for training a preselected deep neural network architecture with the training set to obtain a first calibration point identification positioning model, wherein the preselected deep neural network architecture has a plurality of network layers;
a second calibration point identification positioning model training unit, used for selecting a set number of network layers as layers to be clipped, connecting the network layers immediately before and after each layer to be clipped, taking the L1 norm of the output feature F of each layer to be clipped as a channel-score penalty term of the loss function, performing sparsification training on the first calibration point identification positioning model, and removing the clipped layers to obtain a second calibration point identification positioning model, which is used as the final calibration point identification positioning model.
9. The aircraft panel surface image fusion apparatus of claim 8, wherein the training module further comprises:
And the image enhancement unit is used for performing scale transformation and/or angle transformation on the images in the training set.
10. The aircraft panel surface image fusion apparatus of claim 6, wherein the image stitching module comprises:
an adjacent edge determining unit for determining adjacent edges of two adjacent images;
a shared calibration point determining unit, used for determining the coordinates (x1, y1) and (x2, y2) of the two calibration points on the adjacent edge of the first image, and the coordinates (x1', y1') and (x2', y2') of the two calibration points on the adjacent edge of the second image that physically coincide with the two calibration points of the first image;
a scaling and rotation angle calculating unit, used for determining the scaling factor r of the second image according to the formula
r = sqrt((x2 - x1)^2 + (y2 - y1)^2) / sqrt((x2' - x1')^2 + (y2' - y1')^2),
and the rotation angle θ of the second image according to the formula
θ = arctan((y2 - y1) / (x2 - x1)) - arctan((y2' - y1') / (x2' - x1'));
An image correction unit for correcting the second image according to the scaling and the rotation angle;
the calibration point re-identification unit is used for determining calibration point coordinates on adjacent edges on the corrected second image based on the calibration point identification positioning model;
a cutting and splicing unit, used for cutting and splicing the images along the line connecting the coordinates of the two calibration points on the adjacent edges of the images.
CN202410355429.6A 2024-03-27 2024-03-27 Method and device for fusing surface images of aircraft panel Pending CN118196053A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410355429.6A CN118196053A (en) 2024-03-27 2024-03-27 Method and device for fusing surface images of aircraft panel

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410355429.6A CN118196053A (en) 2024-03-27 2024-03-27 Method and device for fusing surface images of aircraft panel

Publications (1)

Publication Number Publication Date
CN118196053A true CN118196053A (en) 2024-06-14

Family

ID=91412026

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410355429.6A Pending CN118196053A (en) 2024-03-27 2024-03-27 Method and device for fusing surface images of aircraft panel

Country Status (1)

Country Link
CN (1) CN118196053A (en)


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination