CN115423746B - Image processing method for calculating skin hole site and aperture - Google Patents


Info

Publication number
CN115423746B
CN115423746B (application CN202210878722.1A)
Authority
CN
China
Prior art keywords
hole
edge
search
photo
equal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210878722.1A
Other languages
Chinese (zh)
Other versions
CN115423746A (en)
Inventor
李博
喻志勇
姜振喜
曾德标
宋戈
沈昕
李卫东
游莉萍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Aircraft Industrial Group Co Ltd
Original Assignee
Chengdu Aircraft Industrial Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Aircraft Industrial Group Co Ltd filed Critical Chengdu Aircraft Industrial Group Co Ltd
Priority to CN202210878722.1A priority Critical patent/CN115423746B/en
Publication of CN115423746A publication Critical patent/CN115423746A/en
Application granted granted Critical
Publication of CN115423746B publication Critical patent/CN115423746B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis → G06T 7/0002 Inspection of images, e.g. flaw detection → G06T 7/0004 Industrial image inspection
    • G06T 7/00 Image analysis → G06T 7/10 Segmentation; Edge detection → G06T 7/13 Edge detection
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement → G06T 2207/20 Special algorithmic details → G06T 2207/20084 Artificial neural networks [ANN]
    • G06T 2207/00 → G06T 2207/30 Subject of image; Context of image processing → G06T 2207/30108 Industrial image inspection → G06T 2207/30164 Workpiece; Machine component
    • G06T 2207/00 → G06T 2207/30 → G06T 2207/30168 Image quality inspection
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS → Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation → Y02P 90/30 Computing systems specially adapted for manufacturing


Abstract

The application belongs to the technical field of inspection and detection, and specifically relates to an image processing method for calculating skin hole positions and hole diameters, comprising the following steps. Step S1, design a connecting-hole recognition model by a deep learning method: shoot a sample photo set, make a label map set for it, and train a detection network. Step S2, roughly calculate the hole position and hole diameter: shoot a photo of the hole to be measured and compute a rough hole position and hole diameter from it. Step S3, accurately locate the hole-edge pixel point set: assign detection operators to hole edges in different directions, perform image convolution operations, and divide the search regions. Step S4, accurately calculate the hole position and hole diameter of the connecting hole. With this image processing method, the accurate edge pixels of a connecting hole can be found quickly in a skin photo, the hole position and hole diameter of the connecting hole calculated, and the machining quality of the composite skin part accurately inspected.

Description

Image processing method for calculating skin hole site and aperture
Technical Field
The application belongs to the technical field of inspection and detection, and specifically relates to an image processing method for calculating skin hole positions and hole diameters.
Background
With the rapid development of aircraft manufacturing technology, the requirements on aircraft range and flight safety keep rising, and more and more composite materials are used to manufacture aircraft profile parts. Composite materials can be processed into large thin-walled skin parts more than ten meters in size, and offer light weight, high strength, good toughness and other advantages that metal materials cannot match. A composite skin part is generally joined to the metal skeleton parts by rivets and bolts, so hundreds of connecting holes must be machined in it. Before a composite skin part is delivered, the hole positions and hole diameters of all connecting holes must be measured to ensure accurate assembly between the composite skin part and the metal parts.
Methods for inspecting the large number of connecting holes in a composite skin part fall mainly into contact probe detection and non-contact detection. Contact probe detection is the commonly used method at present, but it is inefficient and risks collision between the probe and the skin surface. Non-contact detection is a newer digital inspection technology that mainly uses machine vision to carry out measurement tasks on the inspected object, with the notable advantages of efficiency, safety and absence of contact. The image processing method is one of the key research topics of machine vision. Because images acquired in different scenes vary greatly, it is difficult to find an image processing method that adapts to every scene; in the field of high-precision detection in particular, a specific image processing algorithm is almost always developed for a specific scene.
The machining site of a composite skin part suffers from uneven illumination, dust, chips, liquids and other severe environmental factors, so high-quality photos cannot be taken. Existing general-purpose image processing methods, such as the Roberts, Sobel and Prewitt edge detection operators, cannot accurately calculate the hole position and hole diameter of a connecting hole, and often cannot even accurately identify its hole-edge pixels. This bottleneck in image processing greatly limits the application of non-contact detection to composite skin parts.
Disclosure of Invention
In view of the above problems, an object of the present application is to provide an image processing method for calculating skin hole positions and hole diameters.
the application is realized by the following technical scheme:
an image processing method for calculating skin hole sites and aperture diameters, comprising the steps of:
step S1: designing a connecting hole recognition model by adopting a deep learning method;
s101, shooting a sample photo set
Take a number of gray-scale photos of size M×M that contain skin connecting holes, forming a sample photo set.
S102, making a label atlas of a sample photo set
A white marked circle approximately coinciding with the connecting hole is drawn on the gray photo 1, and the center of the white marked circle is denoted (x0, y0). The white marked circle is copied onto a black canvas 3.1 of size M×M to form the label map of the gray photo; the center coordinates of the white marked circle in the label map are still (x0, y0).
For all other photos in the gray photo set in step S101, a label map is made for each gray photo by the method described above, so as to form a label photo set.
S103, training detection network
Down-sampling uses a VGG-16 network to extract features, up-sampling uses deconvolution, and finally every pixel in the image is classified. The number of training iterations (epochs) is 100, the corresponding test set contains 50 gray-scale photos, the input picture size of the model is M×M, and the loss function is the cross entropy
L = −[y·log(ŷ) + (1 − y)·log(1 − ŷ)]
where L is the loss, y is the true value, and ŷ is the predicted value.
After 100 iterations the loss decreases only slightly and the model has converged, so training can be stopped early to prevent overfitting; the weights of the trained model are used as the feature-recognition parameters for detecting the region where the connecting hole is located.
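As a minimal sketch of the cross-entropy loss named above (not the patent's actual training code; the function names and the clipping epsilon are my own additions):

```python
import math

def pixel_bce(y_true: float, y_pred: float, eps: float = 1e-12) -> float:
    """Binary cross entropy for one pixel: L = -[y*log(p) + (1-y)*log(1-p)].

    y_true is the label (0 = background, 1 = connecting hole) and y_pred is
    the network's predicted probability for that pixel.  Predictions are
    clipped away from 0 and 1 to keep log() finite.
    """
    p = min(max(y_pred, eps), 1.0 - eps)
    return -(y_true * math.log(p) + (1.0 - y_true) * math.log(1.0 - p))

def image_loss(labels, preds) -> float:
    """Mean pixel-wise cross entropy over a label map and a prediction map."""
    n = len(labels) * len(labels[0])
    return sum(pixel_bce(y, p)
               for row_y, row_p in zip(labels, preds)
               for y, p in zip(row_y, row_p)) / n
```

A confident correct prediction (y = 1, p close to 1) contributes a near-zero loss, while a confident wrong one contributes a large loss, which is what drives the pixel classifier toward the label maps of step S102.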
Step S2: rough calculation of hole site and aperture
S201, shooting a photo of the hole to be measured
Shoot a photo of the hole to be measured that contains a connecting hole and the hole edge. Establish a coordinate system on the photo of the hole to be measured according to the following rule: the origin of coordinates is at the lower-left corner of the image, the positive X axis points horizontally to the right, and the positive Y axis points vertically upward.
For the pixel at any position (x, y) in the photo of the hole to be measured, record its gray value as gray(x, y); the gray values of all pixels in the photo form the gray matrix grayA.
S202, roughly calculating hole sites and hole diameters
Initialize the feature-extraction parameters of the detection network with the model parameters obtained in step S103, normalize the photo to be identified (of any input resolution) to size M×M by cubic-spline interpolation, and set the recognition model to eval() evaluation mode. The output contains background and foreground: the foreground is the region where the connecting hole is located, and the background is everything else. Mapping the recognition result back to the original photo resolution in equal proportion roughly yields the hole-edge point set of the connecting hole.
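A sketch of how the recognized foreground mask could be turned into a rough hole-edge point set and mapped back to the original resolution as step S202 describes; the function name and the 4-neighbour boundary test are my own assumptions, not the patent's code:

```python
def mask_edge_points(mask, orig_w, orig_h):
    """mask: M x M list of lists with 1 = foreground (connecting hole), 0 = background.

    Returns the foreground pixels that touch the background (a rough hole-edge
    point set), scaled in equal proportion to the original photo resolution.
    """
    m = len(mask)
    sx, sy = orig_w / m, orig_h / m      # equal-proportion mapping factors
    points = []
    for r in range(m):
        for c in range(m):
            if mask[r][c] != 1:
                continue
            # a foreground pixel is an edge pixel if any 4-neighbour
            # is background or lies outside the mask
            neighbours = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(not (0 <= rr < m and 0 <= cc < m) or mask[rr][cc] == 0
                   for rr, cc in neighbours):
                points.append((c * sx, r * sy))
    return points
```

For a 3×3 block of foreground pixels, only the 8 border pixels of the block are reported; the interior pixel is not an edge point.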
The hole edge point set has N edge points with coordinates (xi, yi), i = 1…N. An initial circle is constructed from the hole edge point set; its center coordinates (xc, yc) and radius r0 are calculated as follows:
1) xc equals the average of the abscissas of all edge points, and yc equals the average of the ordinates:
xc = (1/N)·Σ xi,  yc = (1/N)·Σ yi
2) r0 equals half the difference between the maximum and the minimum abscissa, so that the spread of the extreme abscissas is treated as a diameter:
r0 = (max(xi) − min(xi)) / 2
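The two rules above can be sketched as follows; halving the max–min spread of the abscissas (treating that spread as a diameter) is my reading of the radius rule, since the original formula appears only as an image:

```python
def initial_circle(points):
    """points: list of (x, y) rough hole-edge coordinates.

    Center = mean of the edge-point coordinates; radius = half the
    difference between the largest and smallest abscissa.
    """
    n = len(points)
    xc = sum(x for x, _ in points) / n
    yc = sum(y for _, y in points) / n
    xs = [x for x, _ in points]
    r0 = (max(xs) - min(xs)) / 2.0
    return (xc, yc), r0
```

For four points at the compass extremes of a circle of radius 50 centered at (100, 100), the sketch recovers that center and radius exactly.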
step S3: accurate positioning hole edge pixel point set
S301, appointing detection operators for hole edges in different directions
The Kirsch edge detection operator, proposed by R. Kirsch, consists of eight 3×3 matrices. In the round-hole edge detection scenario here, detection operators are assigned to hole edges of different orientations as follows:
s302, image convolution operation
The gray matrix grayA is convolved with each of the detection operators K_E, K_W, K_N, K_S, K_NE, K_SE, K_NW and K_SW, yielding the east edge matrix edgeE, the west edge matrix edgeW, the south edge matrix edgeS, the north edge matrix edgeN, the northeast edge matrix edgeNE, the southeast edge matrix edgeSE, the northwest edge matrix edgeNW and the southwest edge matrix edgeSW, each with its corresponding edge image.
S303, dividing the search area
Connect the upper-left corner point and the lower-right corner point of the photo of the hole to be measured to form a third dashed line, and connect the upper-right corner point and the lower-left corner point to form a fourth dashed line. The third and fourth dashed lines divide the photo of the hole to be measured into four regions: east, west, south and north.
The hole edges of the east region include the east edge, the northeast edge and the southeast edge; the hole edges of the west region include the west edge, the northwest edge and the southwest edge; the hole edges of the south region include the south edge, the southeast edge and the southwest edge; the hole edges of the north region include the north edge, the northeast edge and the northwest edge.
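With the coordinate system of step S201 (origin at the lower-left corner), the two diagonals of an M×M photo are the lines x + y = M and y = x, so the four search regions can be told apart as sketched below; the function name and the tie-breaking for pixels exactly on a diagonal are my own choices:

```python
def search_region(x, y, m):
    """Classify pixel (x, y) of an m x m photo into one of the four regions
    cut out by the diagonals x + y = m (third dashed line) and y = x
    (fourth dashed line).  Pixels exactly on a diagonal are assigned
    arbitrarily to one of the neighbouring regions.
    """
    if x >= y:                          # on or below the main diagonal
        return "east" if x + y >= m else "south"
    return "north" if x + y >= m else "west"
```

For a 1024×1024 photo, a pixel near the right border falls in the east region, near the left border in the west region, and so on.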
S304, defining a search start circle and an accurate edge matrix
The search start circle is defined as follows: its center coordinates are equal to the center coordinates (xc, yc) of the initial circle, and its radius is smaller than the radius of the initial circle, so that the search start circle lies inside the connecting hole.
The first diameter line segment passes through the center and makes a 45° angle with the X axis; the second diameter line segment passes through the center and makes a −45° angle with the X axis. The two diameter line segments divide the circumference of the search start circle into four arcs: an east arc, a west arc, a south arc and a north arc, each containing nPixel pixel points.
The accurate edge matrix Edge is defined as follows: its dimension is M×M, and all of its elements are initialized to 1.
s305, searching the hole edge of the east region;
will be (17.1) on the eastern arcnPixelThe pixel points are arranged from top to bottom in sequence to form a pointCollection set
Hole edge search of east region will be inEach horizontal line is located, and the searching starting point isThe method comprises the following specific steps:
defining an accumulated variableiIs zero, defines an accumulated variablejIs zero;
(ⅱ)、is marked as (1)Will beIs recorded as the maximum value of (2)
(iii) ifGreater than gray thresholdthresholdOrder-makingEqual to zero, go to step (v); if it isLess than or equal to the gray thresholdthresholdLet the accumulated variablejThe value of (2) is increased by 1Moving one pixel to the right to position to the next pointTurning to step (iv);
(iv) will beThe maximum value among the three is recorded asTurning to step (iii);
(v) ifiGreater than or equal tonPixelTurning to step (vi); if it isiLess thannPixel,Let accumulated variableiAdding 1 to the value of (2), and turning to the step (ii);
(vi) ending the hole edge search in the eastern region.
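Steps (i)–(vi) can be condensed into one generic routine that also covers the west, south and north regions by changing the step direction. The pixel-wise maximum of the region's three edge matrices is precomputed, and the loop counters i and j of the patent are folded into the Python loops; this is a sketch under those assumptions, not the patent's code:

```python
def directional_search(edge_max, starts, step, threshold, Edge):
    """Search outward from each start point until the hole edge is reached.

    edge_max  : M x M list, the pixel-wise maximum of the region's three edge
                matrices (edgeE, edgeNE, edgeSE for the east region).
    starts    : the nPixel (x, y) start points on the region's arc, where x is
                the column index and y the row index of the matrices.
    step      : (dx, dy) per move, e.g. (+1, 0) for the east region and
                (-1, 0) for the west region; the sign of the vertical step for
                the south/north regions depends on the row-storage order.
    threshold : gray threshold; a value above it means the edge is reached.
    Edge      : M x M accurate edge matrix, initialized to all ones; found
                edge pixels are set to zero in place.
    """
    m = len(Edge)
    dx, dy = step
    for x, y in starts:
        while 0 <= x < m and 0 <= y < m:
            if edge_max[y][x] > threshold:   # step (iii): edge reached
                Edge[y][x] = 0
                break
            x += dx                           # move one pixel outward
            y += dy
    return Edge
```

On a synthetic image whose bright responses form a vertical line, two eastward searches both stop on that line and mark exactly those two pixels in Edge.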
S306, searching hole edge of western region
The nPixel pixel points on the west arc are arranged from top to bottom in sequence to form a point set; denote the i-th point Pi.
The hole edge search of the west region proceeds along the horizontal line through each Pi, with Pi as the search start point. The specific steps are:
(i) define an accumulation variable i equal to zero and an accumulation variable j equal to zero;
(ii) take the current start point Pi and record the maximum of the three west-region edge values edgeW, edgeNW and edgeSW at this pixel;
(iii) if this maximum is greater than the gray threshold threshold, set the corresponding element of Edge to zero and go to step (v); if it is less than or equal to threshold, increase j by 1, move one pixel to the left to the next point, and go to step (iv);
(iv) record the maximum of the three edge values at the new pixel and go to step (iii);
(v) if i is greater than or equal to nPixel, go to step (vi); if i is less than nPixel, increase i by 1 and go to step (ii);
(vi) end the hole edge search of the west region (4.7).
S307, hole edge search of southbound region
The nPixel pixel points on the south arc are arranged from left to right in sequence to form a point set; denote the i-th point Pi.
The hole edge search of the south region proceeds along the vertical line through each Pi, with Pi as the search start point. The specific steps are:
(i) define an accumulation variable i equal to zero and an accumulation variable j equal to zero;
(ii) take the current start point Pi and record the maximum of the three south-region edge values edgeS, edgeSE and edgeSW at this pixel;
(iii) if this maximum is greater than the gray threshold threshold, set the corresponding element of Edge to zero and go to step (v); if it is less than or equal to threshold, increase j by 1, move one pixel down to the next point, and go to step (iv);
(iv) record the maximum of the three edge values at the new pixel and go to step (iii);
(v) if i is greater than or equal to nPixel, go to step (vi); if i is less than nPixel, increase i by 1 and go to step (ii);
(vi) end the hole edge search of the south region (4.8).
S308, searching hole edge of north region
The nPixel pixel points on the north arc are arranged from left to right in sequence to form a point set; denote the i-th point Pi.
The hole edge search of the north region proceeds along the vertical line through each Pi, with Pi as the search start point. The specific steps are:
(i) define an accumulation variable i equal to zero and an accumulation variable j equal to zero;
(ii) take the current start point Pi and record the maximum of the three north-region edge values edgeN, edgeNE and edgeNW at this pixel;
(iii) if this maximum is greater than the gray threshold threshold, set the corresponding element of Edge to zero and go to step (v); if it is less than or equal to threshold, increase j by 1, move one pixel up to the next point, and go to step (iv);
(iv) record the maximum of the three edge values at the new pixel and go to step (iii);
(v) if i is greater than or equal to nPixel, go to step (vi); if i is less than nPixel, increase i by 1 and go to step (ii);
(vi) end the hole edge search of the north region.
Step S4: accurate calculation of the hole position and the aperture of the connecting hole
The accurate edge matrix Edge is a binary image containing only black and white. The black point set consists of the elements of Edge that are equal to zero, and the coordinates of those zero elements are the pixel coordinates of the hole edge. Fitting a circle to the black point set by the average value method yields a precise circle with its own center coordinates and radius.
The hole position coordinates of the connecting hole are the center coordinates of the precise circle, and the radius of the connecting hole is the radius of the precise circle.
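The patent does not spell out the "average value method" of circle fitting. One common reading, sketched below as an assumption, takes the centroid of the black points as the center and the mean distance to it as the radius:

```python
import math

def fit_circle_average(points):
    """Fit a circle to the zero-valued (black) pixels of the Edge matrix.

    Center (xe, ye) = centroid of the points; radius re = mean distance
    from the points to that center.  This is one plausible reading of the
    'average value method' named in the text, not a confirmed definition.
    """
    n = len(points)
    xe = sum(x for x, _ in points) / n
    ye = sum(y for _, y in points) / n
    re = sum(math.hypot(x - xe, y - ye) for x, y in points) / n
    return (xe, ye), re
```

For points lying exactly on a circle, the centroid and the mean distance recover the true center and radius.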
The application has the advantages that:
the image processing method capable of accurately calculating the hole positions and the pore diameters of the skin can quickly find out the accurate edge pixel points of the connecting holes from the skin photo, calculate the hole positions and the pore diameters of the connecting holes and accurately detect the processing quality of the composite skin part. The method can enable technicians to release from tedious development work of image processing algorithms, put limited effort into design and optimization of other important modules of the machine vision detection system, and promote popularization and application of non-contact detection technologies represented by machine vision in the field of aircraft manufacturing detection.
Drawings
Fig. 1 is a grayscale photograph.
Fig. 2 is a display view of a white marked circle.
Fig. 3 is a label view of fig. 1.
Fig. 4 is a photograph of a well to be measured.
Fig. 5 is a display of a set of hole edge points.
Fig. 6 is a display view of an initial circle.
Fig. 7 is an east edge image (one).
Fig. 8 is an east edge image (two).
Fig. 9 is a western edge image.
Fig. 10 is a southbound edge image.
Fig. 11 is a northbound edge image.
Fig. 12 is a northeast edge image.
Fig. 13 is a southeast edge image.
Fig. 14 is a northwest edge image.
Fig. 15 is a southwest edge image.
Fig. 16 is a search area division diagram.
Fig. 17 is a display diagram of the search initial circle.
Fig. 18 is a diagram of an aperture edge search of the east region.
FIG. 19 is a diagram of a hole edge search for western regions.
FIG. 20 is a schematic view of hole edge search in a southbound region.
FIG. 21 is a diagram of an aperture edge search for a northbound region.
FIG. 22 is a hole edge search flow chart for the east region.
FIG. 23 is a hole edge search flow chart for the western region.
FIG. 24 is a hole edge search flow chart for a southbound region.
FIG. 25 is a hole edge search flow chart for a northbound region.
Fig. 26 is a binary diagram of a precise edge matrix.
FIG. 27 is a plot of a set of pixels at the edge of a precision hole.
Fig. 28 is a display view of a perfect circle.
Fig. 29 is a flowchart of all steps.
In the accompanying drawings: 1-gray scale photograph; 1.1-a first connection hole; 2-white marked circles; 3-sample plot; 3.1-black canvas; 4-a photo of the hole to be measured; 4.1-hole edges; 4.2-a second connection hole; 4.3-coordinate system; 4.4-third dashed line; 4.5-fourth dashed line; 4.6-east region; 4.7-western region; 4.8-southbound regions; 4.9-northbound region; 5-a set of hole edge points; 6-initial circle; 7-east edge image; 7.1-east edges; 8-a first dashed line; 9-a second dashed line; 10-western edge image; 10.1-western edge; 11-southern edge image; 11.1-southbound edges; 12-northbound edge image; 12.1-northbound edge; 13-northeast edge image; 13.1-northeast edges; 14-southeast edge image; 14.1-southeast edges; 15-northwest edge image; 15.1-northwest edges; 16-southwest edge image; 16.1-southwest edge; 17-searching an initial circle; 17.1-east arc; 17.2-western arc; 17.3-south arc; 17.4-north arc; 18-a first diameter line segment; 19-a second diameter line segment; 20-a binary image of the accurate Edge matrix Edge; 20.1-black dot sets; 20.2-white point set; 21-exact circle.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are intended to explain the present application rather than to limit the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The application will be described in further detail with reference to the drawings and examples, but the application is not limited to the examples.
As shown in fig. 29, an image processing method for calculating skin hole sites and aperture diameters includes the steps of:
step S1: designing a connecting hole recognition model by adopting a deep learning method;
s101, shooting a sample photo set;
1000 grayscale photographs with 1024×1024 resolution and containing skin connecting holes were taken to form a sample photograph set.
S102, making a label atlas of a sample photo set;
Referring to figs. 1, 2 and 3, the method of producing a label map is described with gray photo 1 as an example. Referring to fig. 1, gray photo 1 contains a complete first connecting hole 1.1. Referring to fig. 2, a white marked circle 2 approximately coinciding with the first connecting hole 1.1 is drawn on gray photo 1; the center of white marked circle 2 is denoted (x0, y0). Referring to fig. 3, white marked circle 2 is copied onto a black canvas 3.1 with a resolution of 1024×1024 to form the label map 3 of gray photo 1; the center coordinates of white marked circle 2 in label map 3 are still (x0, y0).
For the remaining 999 photos of the gray photo set of step S101, a label map is made for each photo by the method above, forming the label atlas.
S103, training a detection network;
Down-sampling uses a VGG-16 network to extract features, up-sampling uses deconvolution, and finally every pixel in the image is classified. The number of training iterations (epochs) is 100, the corresponding test set contains 50 gray-scale photos, the input picture size of the model is 1024×1024, and the loss function is the cross entropy
L = −[y·log(ŷ) + (1 − y)·log(1 − ŷ)]
where L is the loss, y is the true value, and ŷ is the predicted value.
After 100 iterations the loss decreases only slightly and the model has converged, so training can be stopped early to prevent overfitting; the weights of the trained model are used as the feature-recognition parameters for detecting the region where the first connecting hole 1.1 is located.
Step S2: roughly calculating hole sites and pore diameters;
s201, shooting a photo of a hole to be detected;
referring to fig. 4, a photograph 4 of the hole under test with a resolution of 1024×1024 is taken, and the photograph 4 of the hole under test includes the second connection hole 4.2 and the hole edge 4.1. A coordinate system 4.3 is established on the photo 4 of the hole to be measured according to the following rule: the origin of coordinates is located at the lower left corner of the image, the positive X-axis direction is horizontally to the right, and the positive Y-axis direction is vertically upwards.
For the pixel at any position (x, y) in the photo 4 of the hole to be measured, record its gray value as gray(x, y); the gray values of all pixels in the photo form the gray matrix grayA.
S202, roughly calculating hole sites and pore diameters;
initializing feature extraction parameters of a detection network by using the model parameters obtained in the step S103, normalizing the photo to be identified with any input resolution to 1024×1024 dimensions by adopting a cubic spline interpolation mode, and setting the identification model as an eval () evaluation mode. And finally, the output result comprises a background and a foreground, the foreground is the area where the connecting hole is located, the background is other areas except the connecting hole, and the hole edge point set 5 of the connecting hole can be roughly obtained by mapping the identified result to the resolution of the original photo in equal proportion, as shown in fig. 5.
The hole edge point set 5 has 2392 edge points, with the coordinates of each edge point denoted (xi, yi). Referring to fig. 6, an initial circle 6 is constructed from the hole edge point set 5; its center coordinates (xc, yc) and radius r0 are calculated as follows:
1) xc equals the average of the abscissas of all 2392 edge points, and yc equals the average of the ordinates:
xc = (1/2392)·Σ xi,  yc = (1/2392)·Σ yi
2) r0 equals half the difference between the maximum and the minimum abscissa:
r0 = (max(xi) − min(xi)) / 2
the initial circle 6 has been relatively close to the hole edge 4.1 of the second connecting hole 4.2 and the initial circle 6 will be the input condition of step S304.
Step S3: accurately positioning a hole edge pixel point set;
s301, appointing detection operators for hole edges in different directions;
The Kirsch edge detection operator, proposed by R. Kirsch, is composed of eight 3×3 matrices, one per compass direction. In the round-hole edge detection scenario here, the detection operators are assigned to the hole edges of the different orientations as follows:
the eight detection operators described above are each only sensitive to the hole edge 4.1 in the respective direction. For example, referring to fig. 7, 8, the east operator is sensitive only to the east edge 7.1, but not to edges in other directions. In a subsequent step, the eight detection operators are properly combined, enabling an accurate detection of the hole edge 4.1.
S302, performing image convolution operation;
matrix gray scalegrayARespectively withK_E、K_W、K_N、K_S、K_NE、K_SE、K_NW、K_SWAnd carrying out convolution operation on the detection operator to obtain 8 edge matrixes. Each edge matrix represents an edge image, and the images represented by 8 edge matrices are shown in fig. 7-15. Gray scale matrixgrayAThe correspondence between the detection operator, the edge matrix, and the edge image is shown in table 1.
It should be noted that figs. 7 to 15 and the photo 4 of the hole to be measured all have a resolution of 1024×1024. To save space, the photo 4 of the hole to be measured is shown reduced, while figs. 7 to 15 are shown enlarged so that the hole edges are more clearly visible.
Table 1. Correspondence among the gray matrix grayA, the detection operators, the edge matrices and the edge images
Referring to fig. 7, the gray matrix grayA convolved with the east operator K_E yields the east edge matrix edgeE, whose image is the east edge image 7. Referring to fig. 8, to show the edge detection effect more intuitively, a first dashed line 8 is drawn from the center point to the upper-right corner of the east edge image 7, and a second dashed line 9 from the center point to the lower-right corner. It can clearly be seen that the east edge 7.1 between the first dashed line 8 and the second dashed line 9 is the most pronounced of the hole edges in all orientations, appearing as an edge curve with higher brightness.
Similar to fig. 8, fig. 9 to 15 each include two broken lines added to the edge image to more intuitively show the edge detection effect:
referring to fig. 9, it can be seen in the western edge image 10: the western edge 10.1 is most pronounced compared to the hole edges in other orientations, and appears as a segment of an edge curve with higher brightness.
Referring to fig. 10, it can be seen in the south edge image 11: the southerly edge 11.1 is most pronounced compared to the hole edges in other orientations, and appears as an edge curve with higher brightness.
Referring to fig. 11, it can be seen in the northbound edge image 12: the northbound edge 12.1 is most pronounced compared to the hole edges in other orientations, and appears as an edge curve with a higher brightness.
Referring to fig. 12, it can be seen in northeast edge image 13: the northeast edge 13.1 is most pronounced compared to the hole edges in other orientations, and appears as an edge curve with higher brightness.
Referring to fig. 13, it can be seen in southeast edge image 14: the southeast edge 14.1 is most pronounced compared to the hole edges in other orientations, and appears as an edge curve with higher brightness.
Referring to fig. 14, it can be seen in northwest edge image 15: northwest edge 15.1 is most pronounced compared to the other oriented aperture edges, and appears as a segment of an edge curve with higher brightness.
Referring to fig. 15, it can be seen in southwest edge image 16: the southwest edge 16.1 is most pronounced compared to the hole edges in other orientations, and appears as an edge curve with higher brightness.
The east edge 7.1, the west edge 10.1, the south edge 11.1, the north edge 12.1, the northeast edge 13.1, the southeast edge 14.1, the northwest edge 15.1, and the southwest edge 16.1 have higher brightness, indicating that the gray values of the pixels located on these edges are larger.
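The eight directional operators themselves are not reproduced in this text. The convolution of the gray matrix with one directional operator can be sketched as follows; the Sobel-style east kernel stands in for the patent's K_E (whose values are not given here), and the filtering is implemented as cross-correlation, the usual convention in image processing:

```python
import numpy as np

# Assumed east-facing kernel standing in for the patent's K_E; it responds
# strongly where gray values increase toward +X (a dark-to-bright transition).
K_E = np.array([[-1, 0, 1],
                [-2, 0, 2],
                [-1, 0, 1]])

def filter2d(gray, kernel):
    """'Same'-size directional filtering with zero padding, implemented as
    cross-correlation (common image-processing convention for 'convolution')."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(gray, ((ph, ph), (pw, pw)))
    out = np.zeros(gray.shape, dtype=float)
    for y in range(gray.shape[0]):
        for x in range(gray.shape[1]):
            out[y, x] = np.sum(padded[y:y + kh, x:x + kw] * kernel)
    return out

# Tiny synthetic photo: dark hole interior (gray 10) with bright skin (gray 200)
# to the east; the response peaks along the dark-to-bright boundary columns.
gray = np.full((9, 9), 10)
gray[:, 6:] = 200
edgeE = filter2d(gray, K_E)
```

The columns just west of the transition produce a strong positive response, which is why the east edge appears as a bright curve in the east edge image, and why a gray threshold can later separate edge pixels from the dark interior.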
In the next step, the hole photo 4 to be measured only needs to be divided into edge search areas, and edge pixel points are then searched outward from the inside of the connecting hole to obtain the precise position of the hole edge 4.1. Referring to fig. 7 to fig. 15, since the gray value inside the connecting hole is much smaller than the gray value at the hole edge, a gray threshold threshold = 150 is set: when the gray value at a pixel exceeds the gray threshold 150, that pixel has reached the hole edge 4.1.
S303, dividing a search area;
referring to fig. 16, the upper left corner and the lower right corner of the hole photo 4 to be measured are connected to form a third dotted line 4.4, the upper right corner and the lower left corner of the hole photo 4 to be measured are connected to form a fourth dotted line 4.5, and the third dotted line 4.4 and the fourth dotted line 4.5 divide the photo into four areas: east region 4.6, west region 4.7, south region 4.8, north region 4.9.
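The diagonal partition can be sketched as two sign checks against the diagonals; the coordinate convention is the one defined later in the text (origin at the lower-left corner of the image, X to the right, Y up), and the helper name is illustrative:

```python
# Hypothetical helper: classify a pixel of an M-by-M photo into one of the
# four search regions cut out by the two diagonals of the photo.
# Coordinate system per the text: origin at the lower-left corner, X right, Y up.
def region_of(x, y, M):
    on_main = y - x          # sign relative to the lower-left/upper-right diagonal
    on_anti = (x + y) - M    # sign relative to the upper-left/lower-right diagonal
    if on_main < 0 and on_anti > 0:
        return "east"        # right triangle of the photo
    if on_main > 0 and on_anti < 0:
        return "west"        # left triangle
    if on_main > 0 and on_anti > 0:
        return "north"       # top triangle
    if on_main < 0 and on_anti < 0:
        return "south"       # bottom triangle
    return "boundary"        # exactly on a diagonal
```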
Referring to fig. 8, 12, 13 and 16, the hole edges in the east region 4.6 comprise the east edge 7.1, the northeast edge 13.1 and the southeast edge 14.1.
Referring to fig. 9, 14, 15 and 16, the hole edges in the west region 4.7 comprise the west edge 10.1, the northwest edge 15.1 and the southwest edge 16.1.
Referring to fig. 10, 13, 15 and 16, the hole edges in the south region 4.8 comprise the south edge 11.1, the southeast edge 14.1 and the southwest edge 16.1.
Referring to fig. 11, 12, 14 and 16, the hole edges in the north region 4.9 comprise the north edge 12.1, the northeast edge 13.1 and the northwest edge 15.1.
S304, defining a search starting circle and an accurate edge matrix;
Step S202 obtains the initial circle 6, whose center coordinates are (501, 496) and whose radius is 389; step S303 details the hole edge distribution of the east region 4.6, the west region 4.7, the south region 4.8 and the north region 4.9. Thus, the precise pixel points of the hole edge 4.1 can be obtained by searching outward from the inside of the initial circle 6.
Referring to fig. 17 to fig. 21, the search start circle 17 is first defined as follows: (1) its center coordinates are equal to the center coordinates (501, 496) of the initial circle 6; (2) its radius is smaller than the radius of the initial circle 6, so that the search start circle 17 is located entirely in the inner region of the hole edge 4.1.
The first diameter line segment 18 passes through the center of the search start circle 17 at an included angle of 45° with the X axis, and the second diameter line segment 19 passes through the center at an included angle of -45° with the X axis. The first diameter line segment 18 and the second diameter line segment 19 divide the circumference of the search start circle 17 into four circular arcs: the east arc 17.1, the west arc 17.2, the south arc 17.3 and the north arc 17.4. Each of the four arcs contains the same number of pixel points, nPixel = 520.
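The construction of the four arcs can be sketched as follows; the radius and the total point count below are illustrative assumptions, since only the center (501, 496) and the ±45° diameter segments are fixed by the surrounding text:

```python
import math

def split_arcs(cx, cy, r, n_points):
    """Sample n_points on the search start circle and assign each to one of
    the four arcs cut off by the +45 and -45 degree diameter segments."""
    arcs = {"east": [], "north": [], "west": [], "south": []}
    for k in range(n_points):
        theta = 2.0 * math.pi * k / n_points      # angle from the +X axis
        deg = math.degrees(theta)
        pt = (cx + r * math.cos(theta), cy + r * math.sin(theta))
        if deg < 45 or deg >= 315:
            arcs["east"].append(pt)
        elif deg < 135:
            arcs["north"].append(pt)
        elif deg < 225:
            arcs["west"].append(pt)
        else:
            arcs["south"].append(pt)
    return arcs

# Center from the text; radius 350 and 4 x 520 = 2080 samples are assumptions.
arcs = split_arcs(501, 496, 350, 2080)
```

With 2080 evenly spaced samples, each quadrant arc receives about 520 points, matching the per-arc count used in the search steps below.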
The exact edge matrix Edge is defined as follows: (1) its matrix dimension is 1024 × 1024; (2) all of its elements are equal to 1.
S305, searching the hole edge of the east region;
The 520 pixel points on the east arc 17.1 are arranged in order from top to bottom to form a point set.
Referring to fig. 18 and fig. 22, the horizontal line through the first point of the point set is taken as an example to illustrate the hole edge search process for the east region 4.6:
(i) Define an accumulated variable j equal to zero; take the first point of the point set as the current search point, and record the maximum of the values of the east edge matrix edgeE, the northeast edge matrix edgeNE and the southeast edge matrix edgeSE at the current search point.
(ii) If the recorded maximum is greater than the gray threshold 150, go to step (iv); if it is less than or equal to 150, increase the accumulated variable j by 1, move the current search point one pixel to the right to the next point, and go to step (iii).
(iii) Record the maximum of the values of edgeE, edgeNE and edgeSE at the new search point, and go to step (ii).
(iv) Set the element of the exact edge matrix Edge at the current search point equal to zero and reset the accumulated variable j to zero; the search along this horizontal line ends.
When the recorded maximum is greater than the gray threshold 150, the current search point has reached a pixel point of the east edge 7.1. The hole edge search of the east region 4.6 is performed on the horizontal line through every point of the point set; the remaining 519 horizontal lines are searched by repeating steps (i) to (iv).
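The per-line search can be sketched as follows. The max-of-three rule over edgeE, edgeNE and edgeSE is an assumption drawn from the earlier statement that the east region contains the east, northeast and southeast edges; the matrices below are synthetic:

```python
import numpy as np

THRESHOLD = 150  # gray threshold from the text

def search_east_line(edge_e, edge_ne, edge_se, x0, y, edge_out):
    """March right from (x0, y); stop at the first pixel whose maximum
    response across the three east-facing edge matrices exceeds the
    threshold, and mark it as an exact edge pixel (value 0) in edge_out."""
    w = edge_e.shape[1]
    x = x0
    while x < w:
        response = max(edge_e[y, x], edge_ne[y, x], edge_se[y, x])
        if response > THRESHOLD:
            edge_out[y, x] = 0   # reached the hole edge on this line
            return x
        x += 1                   # move one pixel to the right
    return -1                    # no edge found on this line

# Tiny demo: a single strong edge response at column 7 of row 3.
h, w = 6, 10
edge_e = np.zeros((h, w))
edge_ne = np.zeros((h, w))
edge_se = np.zeros((h, w))
edge_e[3, 7] = 200.0
edge_out = np.ones((h, w))       # the exact edge matrix starts as all ones
hit = search_east_line(edge_e, edge_ne, edge_se, 2, 3, edge_out)
```

The west, south and north searches follow the same pattern with the step direction and the three edge matrices swapped accordingly.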
S306, searching the hole edge of the west region;
The 520 pixel points on the west arc 17.2 are arranged in order from top to bottom to form a point set.
Referring to fig. 19 and fig. 23, the horizontal line through the first point of the point set is taken as an example to illustrate the hole edge search process for the west region 4.7:
(i) Define an accumulated variable j equal to zero; take the first point of the point set as the current search point, and record the maximum of the values of the west edge matrix edgeW, the northwest edge matrix edgeNW and the southwest edge matrix edgeSW at the current search point.
(ii) If the recorded maximum is greater than the gray threshold 150, go to step (iv); if it is less than or equal to 150, increase the accumulated variable j by 1, move the current search point one pixel to the left to the next point, and go to step (iii).
(iii) Record the maximum of the values of edgeW, edgeNW and edgeSW at the new search point, and go to step (ii).
(iv) Set the element of the exact edge matrix Edge at the current search point equal to zero and reset the accumulated variable j to zero; the search along this horizontal line ends.
When the recorded maximum is greater than the gray threshold 150, the current search point has reached a pixel point of the west edge 10.1. The hole edge search of the west region 4.7 is performed on the horizontal line through every point of the point set; the remaining 519 horizontal lines are searched by repeating steps (i) to (iv).
S307, searching the hole edge of the south region;
The 520 pixel points on the south arc 17.3 are arranged in order from left to right to form a point set.
Referring to fig. 20 and fig. 24, the vertical line through the first point of the point set is taken as an example to illustrate the hole edge search process for the south region 4.8:
(i) Define an accumulated variable j equal to zero; take the first point of the point set as the current search point, and record the maximum of the values of the south edge matrix edgeS, the southeast edge matrix edgeSE and the southwest edge matrix edgeSW at the current search point.
(ii) If the recorded maximum is greater than the gray threshold 150, go to step (iv); if it is less than or equal to 150, increase the accumulated variable j by 1, move the current search point one pixel down to the next point, and go to step (iii).
(iii) Record the maximum of the values of edgeS, edgeSE and edgeSW at the new search point, and go to step (ii).
(iv) Set the element of the exact edge matrix Edge at the current search point equal to zero and reset the accumulated variable j to zero; the search along this vertical line ends.
When the recorded maximum is greater than the gray threshold 150, the current search point has reached a pixel point of the south edge 11.1. The hole edge search of the south region 4.8 is performed on the vertical line through every point of the point set; the remaining 519 vertical lines are searched by repeating steps (i) to (iv).
S308, searching the hole edge of the north region;
The 520 pixel points on the north arc 17.4 are arranged in order from left to right to form a point set.
Referring to fig. 21 and fig. 25, the vertical line through the first point of the point set is taken as an example to illustrate the hole edge search process for the north region 4.9:
(i) Define an accumulated variable j equal to zero; take the first point of the point set as the current search point, and record the maximum of the values of the north edge matrix edgeN, the northeast edge matrix edgeNE and the northwest edge matrix edgeNW at the current search point.
(ii) If the recorded maximum is greater than the gray threshold 150, go to step (iv); if it is less than or equal to 150, increase the accumulated variable j by 1, move the current search point one pixel up to the next point, and go to step (iii).
(iii) Record the maximum of the values of edgeN, edgeNE and edgeNW at the new search point, and go to step (ii).
(iv) Set the element of the exact edge matrix Edge at the current search point equal to zero and reset the accumulated variable j to zero; the search along this vertical line ends.
When the recorded maximum is greater than the gray threshold 150, the current search point has reached a pixel point of the north edge 12.1. The hole edge search of the north region 4.9 is performed on the vertical line through every point of the point set; the remaining 519 vertical lines are searched by repeating steps (i) to (iv).
Step S4: precisely calculating the hole position and the hole diameter of the connecting hole;
Through steps S304 to S308, part of the elements of the exact edge matrix Edge that were equal to 1 are reassigned to zero; the positions of the elements of Edge equal to zero are exactly the coordinates of the pixel points of the hole edge 4.1 in the hole photo 4 to be measured. Referring to fig. 26, the image of the exact edge matrix Edge is a binary image 20 with only the two colors black and white, where the black point set 20.1 corresponds to the elements of Edge equal to zero. Referring to fig. 27, to show the search result of the hole edge 4.1 more intuitively, the color of the black point set 20.1 is changed to white to form a white point set 20.2, and the white point set 20.2 is overlaid on the hole photo 4 to be measured; the white point set 20.2 fits closely against the hole edge 4.1, showing that the white point set 20.2 consists of the actual pixel points of the hole edge 4.1.
Referring to fig. 28, the black point set 20.1 of fig. 26 is fitted to a circle by the average value method to obtain an exact circle 21 whose center coordinates are (530, 531) and whose radius is 372. The exact circle 21 fits closely against the hole edge 4.1; accordingly, the hole position coordinates of the second connecting hole 4.2 are (530, 531) and the radius of the second connecting hole 4.2 is 372.
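The text names an average value method for the circle fit without spelling out its formulas; one plausible reading, with the center as the mean of the edge-point coordinates and the radius as the mean distance from the points to that center, can be sketched as:

```python
import math

def fit_circle_average(points):
    """Average-value circle fit (one plausible reading of the method named
    in the text): center = mean of the point coordinates, radius = mean
    distance from the points to that center."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    r = sum(math.hypot(p[0] - cx, p[1] - cy) for p in points) / n
    return (cx, cy), r

# Points sampled from a known circle recover its parameters; the values
# match the worked example in the text (center (530, 531), radius 372).
true_cx, true_cy, true_r = 530.0, 531.0, 372.0
pts = [(true_cx + true_r * math.cos(2 * math.pi * k / 360),
        true_cy + true_r * math.sin(2 * math.pi * k / 360))
       for k in range(360)]
(cx, cy), r = fit_circle_average(pts)
```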

Claims (9)

1. An image processing method for calculating skin hole sites and aperture diameters, characterized by: the method comprises the following steps:
step S1: designing a connecting hole recognition model by adopting a deep learning method: shooting a sample photo set, making a label graph set of the sample photo set, and training a detection network;
step S2: roughly calculating hole sites and pore diameters: shooting a photo of a hole to be detected, and roughly calculating a hole position and a hole diameter;
step S3: accurately positioning the hole edge pixel point set: assigning detection operators to the hole edges in different orientations, performing image convolution operations, and dividing search areas; defining a search start circle and an exact edge matrix, and performing an east-region hole edge search, a west-region hole edge search, a south-region hole edge search, a north-region hole edge search, a northeast-region hole edge search, a southeast-region hole edge search, a northwest-region hole edge search and a southwest-region hole edge search;
step S4: precisely calculating the hole position and the hole diameter of the connecting hole;
the step S3 specifically comprises the following steps:
s301, appointing detection operators for hole edges in different directions;
the detection operators assigned to the hole edges in different orientations are K_E, K_W, K_N, K_S, K_NE, K_SE, K_NW and K_SW, respectively;
s302, performing image convolution operation;
the gray matrix grayA is convolved with the detection operators K_E, K_W, K_N, K_S, K_NE, K_SE, K_NW and K_SW respectively to obtain the east edge matrix edgeE and its corresponding east edge image (7), the west edge matrix edgeW and its corresponding west edge image (10), the south edge matrix edgeS and its corresponding south edge image (11), the north edge matrix edgeN and its corresponding north edge image (12), the northeast edge matrix edgeNE and its corresponding northeast edge image (13), the southeast edge matrix edgeSE and its corresponding southeast edge image (14), the northwest edge matrix edgeNW and its corresponding northwest edge image (15), and the southwest edge matrix edgeSW and its corresponding southwest edge image (16);
s303, dividing a search area;
connecting an upper left corner point and a lower right corner point of the hole photo (4) to be detected to form a third dotted line (4.4), and connecting the upper right corner point and the lower left corner point of the hole photo (4) to be detected to form a fourth dotted line (4.5); the third dotted line (4.4) and the fourth dotted line (4.5) divide the hole photo (4) to be measured into four areas: an eastern region (4.6), a western region (4.7), a southbound region (4.8) and a northbound region (4.9);
s304, defining a search starting circle and an accurate edge matrix;
the search start circle (17) is defined as follows: its center coordinates are equal to the center coordinates of the initial circle (6), and its radius is smaller than the radius of the initial circle (6);
the first diameter line segment (18) passes through the center of the search start circle (17) at an included angle of 45° with the X axis, and the second diameter line segment (19) passes through the center at an included angle of -45° with the X axis; the first diameter line segment (18) and the second diameter line segment (19) divide the circumference of the search start circle (17) into four circular arcs: an east arc (17.1), a west arc (17.2), a south arc (17.3) and a north arc (17.4); each of the four arcs contains the same number, nPixel, of pixel points;
the exact edge matrix Edge is defined as follows: its matrix dimension is M × M; all of its elements are equal to 1;
S305, searching the hole edge of the east region;
the nPixel pixel points on the east arc (17.1) are arranged in order from top to bottom to form a point set; the hole edge search of the east region (4.6) is performed on the horizontal line through each point of the point set, with that point as the search starting point;
S306, searching the hole edge of the western region;
will be on the western arc (17.2)nPixelThe pixel points are arranged from top to bottom in sequence to form a point set
Order theThe pore edge search of the western region (4.7) will be at +.>On each horizontal line, and the search starting point is +.>
S307, searching the hole edge of the south area;
will be on the south arc (17.3)nPixelThe pixel points are sequentially arranged from left to right to form a point set
Order theThe pore edge search of the southbound region (4.8) will be at +.>On each vertical line, and the search starting point is +>
S308, searching the hole edge of the north region;
the nPixel pixel points on the north arc (17.4) are arranged in order from left to right to form a point set; the hole edge search of the north region (4.9) is performed on the vertical line through each point of the point set, with that point as the search starting point;
The step S4 specifically comprises the following steps:
the black point set (20.1) in the binary image (20) of the exact edge matrix Edge, i.e. the set of elements of Edge equal to zero, is fitted to a circle by the average value method to obtain the circle center coordinates and radius; the hole position coordinates of the second connecting hole (4.2) are the fitted center coordinates, and the radius of the second connecting hole (4.2) is the fitted radius.
2. An image processing method for calculating skin hole sites and aperture diameters according to claim 1, characterized in that:
the step S1 specifically comprises the following steps:
s101, shooting a sample photo set;
taking a plurality of gray-scale photos which comprise skin connecting holes and have the size of M multiplied by M to form a sample photo set;
s102, making a label atlas of a sample photo set;
covering a white marking circle (2) overlapped with the first connecting hole (1.1) on a black canvas (3.1) with the size of M multiplied by M to form a label graph (3) of the gray photo (1), and adopting the same method to manufacture a label graph for each gray photo aiming at all other photos of the sample photo set in the step S101 to form a label graph set;
s103, training a detection network;
down-sampling with a VGG-16 network to extract features, up-sampling by deconvolution, and classifying all pixel points in the image; the number of training iterations epoch is 100, 50 gray-scale photos are used as the corresponding test set, the size of the input picture of the model is M × M, and the loss function is the cross entropy L = -[y·log(ŷ) + (1 - y)·log(1 - ŷ)], where L represents the loss, y represents the true value and ŷ represents the predicted value;
after 100 iterations the loss of the model decreases only slightly and the whole model has converged, so training can be stopped early to prevent overfitting; the model weights from this training are used as the feature identification parameters for detecting the region where the first connecting hole (1.1) is located.
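The cross-entropy loss named above, in its standard two-class form (an assumption, since the formula image is not reproduced and the network classifies each pixel as foreground hole versus background), can be sketched as:

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Per-pixel binary cross entropy, L = -[y*log(p) + (1-y)*log(1-p)],
    averaged over the pixels; y is the true label (0 background, 1 hole)
    and p the predicted foreground probability."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1.0 - eps)   # clamp to avoid log(0)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# Four pixels: two hole pixels predicted at 0.9/0.8, two background at 0.1/0.2.
loss = binary_cross_entropy([1, 0, 1, 0], [0.9, 0.1, 0.8, 0.2])
```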
3. An image processing method for calculating skin hole sites and aperture diameters according to claim 2, characterized in that:
the step S2 specifically comprises the following steps:
s201, shooting a photo of a hole to be detected;
taking a hole photo (4) to be measured comprising a second connecting hole (4.2) and a hole edge (4.1), wherein the gray values of all pixels of the hole photo (4) to be measured form a gray matrix grayA;
S202, roughly calculating hole sites and pore diameters;
initializing the feature extraction parameters of the detection network with the model parameters obtained in step S103, normalizing the photo to be identified, of any input resolution, to the size M × M by cubic spline interpolation, and setting the identification model to eval() evaluation mode; the output result contains a background and a foreground, where the foreground is the area where the connecting hole is located and the background is everything else; the hole edge point set (5) of the connecting hole is roughly obtained by mapping the identified result back to the original photo resolution in equal proportion;
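The equal-proportion mapping of a point identified in the M × M model output back to the original photo resolution can be sketched as follows; the original resolution used in the demonstration is hypothetical:

```python
# Sketch: map a pixel coordinate found in the M-by-M normalized photo back
# to the original photo resolution by proportional (equal-ratio) scaling.
def map_to_original(x, y, M, orig_w, orig_h):
    return (round(x * orig_w / M), round(y * orig_h / M))

# A point at the center of a 1024x1024 model output maps to the center of
# a hypothetical 2048x1536 source photo.
pt = map_to_original(512, 512, 1024, 2048, 1536)
```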
the hole edge point set (5) has N edge points, the coordinates of the i-th edge point being (x_i, y_i); an initial circle (6) is constructed from the hole edge point set (5), and the center coordinates (x_0, y_0) and radius r_0 of the initial circle (6) are calculated as follows:
1) x_0 is equal to the average value of the abscissas x_i of all edge points, and y_0 is equal to the average value of the ordinates y_i of all edge points;
2) 2·r_0 is equal to the difference between the maximum value and the minimum value of the abscissas x_i;
that is: x_0 = (1/N)·Σx_i, y_0 = (1/N)·Σy_i, r_0 = (max x_i - min x_i)/2.
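The rough initial-circle computation can be sketched as follows, reconstructing the radius rule as half the max-minus-min spread of the abscissas (the formula image is not reproduced in the text, so this reading is an assumption):

```python
# Sketch of the rough initial-circle computation: center = mean of the edge
# point coordinates, radius = half the spread of the abscissas.
def initial_circle(points):
    n = len(points)
    x0 = sum(x for x, _ in points) / n
    y0 = sum(y for _, y in points) / n
    xs = [x for x, _ in points]
    r0 = (max(xs) - min(xs)) / 2
    return (x0, y0), r0

# Crude diamond of edge points around center (200, 50) with half-width 100.
pts = [(100, 50), (300, 50), (200, 150), (200, -50)]
c, r0 = initial_circle(pts)
```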
4. An image processing method for calculating skin hole sites and aperture diameters according to claim 2, wherein in step S102: the center coordinates of the white marking circle (2) are the same in the gray photo (1) and the label graph (3).
5. An image processing method for calculating skin hole sites and aperture diameters according to claim 3, characterized in that: in step S201: a coordinate system (4.3) is established on the photo (4) of the hole to be measured according to the following rule: the origin of coordinates is located at the lower left corner of the image, the positive X-axis direction is horizontally to the right, and the positive Y-axis direction is vertically upwards.
6. An image processing method for calculating skin hole sites and aperture diameters according to claim 1, characterized in that: in step S305:
(i) defining an accumulated variable i equal to zero and an accumulated variable j equal to zero;
(ii) taking the i-th point of the point set on the east arc (17.1) as the current search point and recording its coordinates; recording the maximum of the values of the east edge matrix edgeE, the northeast edge matrix edgeNE and the southeast edge matrix edgeSE at the current search point;
(iii) if the recorded maximum is greater than the gray threshold threshold, setting the element of the exact edge matrix Edge at the current search point equal to zero and turning to step (v); if the recorded maximum is less than or equal to the gray threshold threshold, increasing the accumulated variable j by 1, moving the current search point one pixel to the right to the next point, and turning to step (iv);
(iv) recording the maximum of the values of edgeE, edgeNE and edgeSE at the new search point and turning to step (iii);
(v) if i is greater than or equal to nPixel, turning to step (vi); if i is less than nPixel, increasing the accumulated variable i by 1 and turning to step (ii);
(vi) ending the hole edge search in the east region (4.6).
7. An image processing method for calculating skin hole sites and pore diameters according to claim 1, wherein step S306 comprises the following specific contents:
(i) defining an accumulated variable i equal to zero and an accumulated variable j equal to zero;
(ii) taking the i-th point of the point set on the west arc (17.2) as the current search point and recording its coordinates; recording the maximum of the values of the west edge matrix edgeW, the northwest edge matrix edgeNW and the southwest edge matrix edgeSW at the current search point;
(iii) if the recorded maximum is greater than the gray threshold threshold, setting the element of the exact edge matrix Edge at the current search point equal to zero and turning to step (v); if the recorded maximum is less than or equal to the gray threshold threshold, increasing the accumulated variable j by 1, moving the current search point one pixel to the left to the next point, and turning to step (iv);
(iv) recording the maximum of the values of edgeW, edgeNW and edgeSW at the new search point and turning to step (iii);
(v) if i is greater than or equal to nPixel, turning to step (vi); if i is less than nPixel, increasing the accumulated variable i by 1 and turning to step (ii);
(vi) ending the hole edge search in the west region (4.7).
8. The image processing method for calculating skin hole sites and pore diameters according to claim 1, wherein step S307 comprises the following specific contents:
(i) defining an accumulated variable i equal to zero and an accumulated variable j equal to zero;
(ii) taking the i-th point of the point set on the south arc (17.3) as the current search point and recording its coordinates; recording the maximum of the values of the south edge matrix edgeS, the southeast edge matrix edgeSE and the southwest edge matrix edgeSW at the current search point;
(iii) if the recorded maximum is greater than the gray threshold threshold, setting the element of the exact edge matrix Edge at the current search point equal to zero and turning to step (v); if the recorded maximum is less than or equal to the gray threshold threshold, increasing the accumulated variable j by 1, moving the current search point one pixel down to the next point, and turning to step (iv);
(iv) recording the maximum of the values of edgeS, edgeSE and edgeSW at the new search point and turning to step (iii);
(v) if i is greater than or equal to nPixel, turning to step (vi); if i is less than nPixel, increasing the accumulated variable i by 1 and turning to step (ii);
(vi) ending the hole edge search in the south region (4.8).
9. The image processing method for calculating skin hole sites and pore diameters according to claim 1, wherein step S308 comprises the following specific contents:
(i) defining an accumulated variable i equal to zero and an accumulated variable j equal to zero;
(ii) taking the i-th point of the point set on the north arc (17.4) as the current search point and recording its coordinates; recording the maximum of the values of the north edge matrix edgeN, the northeast edge matrix edgeNE and the northwest edge matrix edgeNW at the current search point;
(iii) if the recorded maximum is greater than the gray threshold threshold, setting the element of the exact edge matrix Edge at the current search point equal to zero and turning to step (v); if the recorded maximum is less than or equal to the gray threshold threshold, increasing the accumulated variable j by 1, moving the current search point one pixel up to the next point, and turning to step (iv);
(iv) recording the maximum of the values of edgeN, edgeNE and edgeNW at the new search point and turning to step (iii);
(v) if i is greater than or equal to nPixel, turning to step (vi); if i is less than nPixel, increasing the accumulated variable i by 1 and turning to step (ii);
(vi) ending the hole edge search in the north region (4.9).
CN202210878722.1A 2022-07-25 2022-07-25 Image processing method for calculating skin hole site and aperture Active CN115423746B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210878722.1A CN115423746B (en) 2022-07-25 2022-07-25 Image processing method for calculating skin hole site and aperture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210878722.1A CN115423746B (en) 2022-07-25 2022-07-25 Image processing method for calculating skin hole site and aperture

Publications (2)

Publication Number Publication Date
CN115423746A CN115423746A (en) 2022-12-02
CN115423746B true CN115423746B (en) 2023-10-10

Family

ID=84196232

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210878722.1A Active CN115423746B (en) 2022-07-25 2022-07-25 Image processing method for calculating skin hole site and aperture

Country Status (1)

Country Link
CN (1) CN115423746B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115230191A (en) * 2022-07-25 2022-10-25 成都飞机工业(集团)有限责任公司 Forming method of stealth box section part

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104359404A (en) * 2014-11-24 2015-02-18 南京航空航天大学 Quick visual detection method for plenty of guide holes of small sizes in airplane parts
CN110906875A (en) * 2019-11-26 2020-03-24 湖北工业大学 Visual processing method for aperture measurement
CN113420363A (en) * 2021-08-25 2021-09-21 成都飞机工业(集团)有限责任公司 Method for predicting matching of skin skeleton of aircraft component
CN114193231A (en) * 2022-02-16 2022-03-18 成都飞机工业(集团)有限责任公司 Bottom hole orifice measuring method for numerical control countersink
CN114219802A (en) * 2022-02-21 2022-03-22 成都飞机工业(集团)有限责任公司 Skin connecting hole position detection method based on image processing
CN114346759A (en) * 2022-03-10 2022-04-15 成都飞机工业(集团)有限责任公司 Device for hole online detection and hole finish machining and machining method thereof


Also Published As

Publication number Publication date
CN115423746A (en) 2022-12-02


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant