CN115423746A - Image processing method for calculating skin hole site and hole diameter

Info

Publication number
CN115423746A
CN115423746A (application CN202210878722.1A)
Authority
CN
China
Prior art keywords
hole
edge
equal
search
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210878722.1A
Other languages
Chinese (zh)
Other versions
CN115423746B (en)
Inventor
李博
喻志勇
姜振喜
曾德标
宋戈
沈昕
李卫东
游莉萍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Aircraft Industrial Group Co Ltd
Original Assignee
Chengdu Aircraft Industrial Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Aircraft Industrial Group Co Ltd filed Critical Chengdu Aircraft Industrial Group Co Ltd
Priority to CN202210878722.1A priority Critical patent/CN115423746B/en
Publication of CN115423746A publication Critical patent/CN115423746A/en
Application granted granted Critical
Publication of CN115423746B publication Critical patent/CN115423746B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30164 Workpiece; Machine component
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30168 Image quality inspection
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention belongs to the technical field of inspection and detection, and in particular relates to an image processing method for calculating skin hole positions and hole diameters, comprising the following steps. Step S1, design a connecting-hole identification model by a deep learning method: shoot a sample photo set, make a label photo set from it, and train a detection network. Step S2, roughly calculate the hole position and hole diameter: shoot a photo of the hole to be measured and roughly calculate its hole position and hole diameter. Step S3, accurately locate the hole-edge pixel point set: assign detection operators to the hole edges in different directions, perform image convolution, and divide the search area. Step S4, accurately calculate the hole position and hole diameter of the connecting hole. With the image processing method provided by this application, the accurate edge pixels of a connecting hole can be found quickly in a skin photo, the hole position and hole diameter of the connecting hole can be calculated, and the machining quality of the composite skin part can be accurately inspected.

Description

Image processing method for calculating skin hole site and hole diameter
Technical Field
The invention belongs to the technical field of inspection and detection, and particularly relates to an image processing method for calculating skin hole sites and skin hole diameters.
Background
With the rapid development of aircraft manufacturing technology, the requirements on aircraft range and flight safety keep rising, and more and more composite materials are used to manufacture aircraft profile parts. Composites can be processed into large thin-wall skin parts more than ten meters long, and they offer advantages in weight, strength and toughness that metal materials cannot match. Composite skin parts are generally connected to metal framework parts by rivets and bolts, so hundreds of connecting holes have to be drilled in a composite skin part. Before a composite skin part is delivered, the hole positions and hole diameters of all connecting holes must be measured to ensure that the composite skin part and the metal parts can be assembled accurately.
The methods for inspecting the large number of connecting holes on a composite skin part are mainly contact probe detection and non-contact detection. Contact probe detection is currently the usual method, but it is inefficient and carries the risk of the probe striking the skin surface. Non-contact detection is a newer digital inspection technology that mainly uses machine vision to carry out measurement tasks on the measured object, with the notable advantages of being efficient, safe and contact-free. One of the core research topics of machine vision is the image processing method. Because images acquired in different scenes differ greatly, it is difficult to find an image processing method that adapts to every scene; in high-precision detection in particular, a dedicated image processing algorithm has to be developed for each specific scene.
The machining site of a composite skin part suffers from many adverse environmental factors such as uneven illumination, dust, chips and liquids, so high-quality pictures cannot be taken there. Existing general-purpose image processing methods, such as the Roberts, Sobel and Prewitt edge-detection operators, cannot accurately calculate the hole position and hole diameter of a connecting hole, and often cannot even reliably identify the hole-edge pixels of the connecting hole. This bottleneck in image processing greatly limits the application of non-contact detection to composite skin parts.
Disclosure of Invention
In view of the above problems, the present invention provides an image processing method for calculating skin hole positions and hole diameters.
The invention is realized by the following technical solution:
an image processing method for calculating skin hole site and hole diameter comprises the following steps:
step S1: designing a connecting hole identification model by adopting a deep learning method;
s101, taking a sample photo album
Take a plurality of gray-scale photos that contain skin connecting holes and have the size M × M to form a sample photo set.
S102, making label atlas of sample photo album
Draw a white marking circle approximately coinciding with the connecting hole on gray-scale photo 1, and record the coordinates of the center of the white marking circle as (a, b). Copy the white marking circle onto a black canvas 3.1 of size M × M to form the label map of the gray-scale photo; the coordinates of the center of the white marking circle in the label map remain (a, b).
For all other photos in the gray-scale photo set of step S101, a label map is created for each gray-scale photo by the same method to form the label atlas.
S103, training detection network
A VGG-16 network is used for downsampling to extract features, deconvolution is then used for upsampling, and finally every pixel in the image is classified. The number of training iterations (epoch) is 100, the corresponding test set is 50 gray-scale photos, the input picture size of the model is M × M, and the loss function used is the cross entropy:
L = -[y·log(ŷ) + (1 - y)·log(1 - ŷ)]
where L represents the loss, y represents the true value and ŷ represents the predicted value.
After 100 iterations the loss function of the model decreases only slightly and the model as a whole has converged; stopping early prevents overfitting, and the model weights from this training are used as the feature-recognition parameters for detecting the region where the connecting hole is located.
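For readers who want to reproduce step S1, the following is a minimal PyTorch sketch of the kind of network described here (VGG-16 downsampling encoder, deconvolution upsampling, per-pixel classification with a cross-entropy loss). The class name HoleSegNet, the decoder channel sizes, the optimizer and the data-loader interface are illustrative assumptions, not details taken from the patent.

```python
import torch
import torch.nn as nn
from torchvision.models import vgg16

class HoleSegNet(nn.Module):
    """VGG-16 encoder (downsampling) followed by a transposed-convolution
    decoder (upsampling); every pixel is classified as hole or background."""
    def __init__(self):
        super().__init__()
        self.encoder = vgg16(weights=None).features        # 5 pooling stages -> 1/32 resolution
        channels = [(512, 256), (256, 128), (128, 64), (64, 32), (32, 16)]
        decoder_layers = []
        for c_in, c_out in channels:                        # 5 deconvolution stages back to full resolution
            decoder_layers += [nn.ConvTranspose2d(c_in, c_out, kernel_size=2, stride=2),
                               nn.ReLU(inplace=True)]
        decoder_layers.append(nn.Conv2d(16, 1, kernel_size=1))   # one logit per pixel
        self.decoder = nn.Sequential(*decoder_layers)

    def forward(self, x):                                   # x: B x 3 x M x M (gray photo replicated to 3 channels)
        return self.decoder(self.encoder(x))

def train_detector(model, loader, epochs=100, lr=1e-4, device="cpu"):
    """Training loop with per-pixel binary cross-entropy
    (the two classes are foreground = connecting hole and background)."""
    model.to(device).train()
    optimiser = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()                        # L = -[y*log(y_hat) + (1-y)*log(1-y_hat)]
    for _ in range(epochs):
        for image, label in loader:                         # label: B x 1 x M x M with values in {0, 1}
            optimiser.zero_grad()
            loss = loss_fn(model(image.to(device)), label.to(device))
            loss.backward()
            optimiser.step()
```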
Step S2: rough calculation of hole site and hole diameter
S201, shooting a picture of the hole to be measured
Take a photo of the hole to be measured; the photo contains a connecting hole and a hole edge. Establish a coordinate system on the photo of the hole to be measured according to the following rule: the origin of coordinates is located at the lower-left corner of the image, the positive X-axis direction points horizontally to the right, and the positive Y-axis direction points vertically upward.
For the pixel at any position in the photo of the hole to be measured, denote its gray value as g(x, y); the gray values of all the pixels in the photo form the gray matrix grayA.
S202, roughly calculating hole positions and hole diameters
Initialize the feature-extraction parameters of the detection network with the model parameters obtained in step S103, normalize the picture to be recognized (of arbitrary input resolution) to the size M × M by cubic spline interpolation, and set the recognition model to eval() evaluation mode. The final output consists of a background and a foreground: the foreground is the region where the connecting hole is located, and the background is everything outside the connecting hole. Mapping the recognized result back to the original picture resolution in equal proportion gives a rough hole-edge point set for the connecting hole.
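As a companion to the training sketch above, here is one possible way to run a trained model on a photo of arbitrary resolution and recover a coarse hole-edge point set. The `model` argument is assumed to be an instance of the hypothetical HoleSegNet above; OpenCV's bicubic resize stands in for the cubic-spline normalization, and the probability threshold of 0.5 and the contour-based boundary extraction are assumptions.

```python
import cv2
import numpy as np
import torch

def coarse_hole_edge_points(model, photo_gray, m=1024, prob_threshold=0.5):
    """Coarse hole localisation: resize the photo to M x M, run the segmentation
    model in eval() mode, then map the boundary of the foreground region back
    to the original resolution."""
    h, w = photo_gray.shape
    resized = cv2.resize(photo_gray, (m, m), interpolation=cv2.INTER_CUBIC)
    x = torch.from_numpy(resized).float().div(255.0).repeat(3, 1, 1).unsqueeze(0)  # 1 x 3 x M x M
    model.eval()
    with torch.no_grad():
        prob = torch.sigmoid(model(x))[0, 0].numpy()
    mask = (prob > prob_threshold).astype(np.uint8)
    # the boundary of the foreground region is the coarse hole-edge point set
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return np.empty((0, 2))
    edge = np.vstack([c.reshape(-1, 2) for c in contours]).astype(float)
    edge[:, 0] *= w / m                                     # map x back to the original width
    edge[:, 1] *= h / m                                     # map y back to the original height
    return edge
```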
The hole-edge point set has N edge points, and the coordinates of each edge point are (xi, yi). An initial circle is constructed from the hole-edge point set; the center coordinates (x0, y0) and radius r0 of the initial circle are calculated as follows:
1) x0 equals the average of the abscissas xi of all edge points, and y0 equals the average of the ordinates yi of all edge points;
2) r0 is obtained from the difference between the maximum value and the minimum value of the abscissas xi.
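A small NumPy sketch of the initial-circle construction described above. The center is the mean of the edge-point coordinates; the text defines the radius through the difference between the largest and smallest x coordinate, and halving that span to turn it into a radius is an assumption made here.

```python
import numpy as np

def initial_circle(edge_points):
    """Initial circle from the coarse hole-edge point set (N x 2 array of x, y)."""
    pts = np.asarray(edge_points, dtype=float)
    x0, y0 = pts[:, 0].mean(), pts[:, 1].mean()        # center = mean of the edge points
    r0 = (pts[:, 0].max() - pts[:, 0].min()) / 2.0     # assumption: half of the x-span
    return (x0, y0), r0
```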
and step S3: accurately positioning hole edge pixel point set
S301, appointing detection operators for hole edges in different directions
The Kirsch edge-detection operator, proposed by R. Kirsch, consists of eight 3 × 3 matrices. In the round-hole edge-detection scene, the detection operators assigned to the hole edges in the different directions are denoted K_E, K_W, K_N, K_S, K_NE, K_SE, K_NW and K_SW, one operator for each direction.
s302, image convolution operation
Convolve the gray matrix grayA with each of the detection operators K_E, K_W, K_N, K_S, K_NE, K_SE, K_NW and K_SW to obtain, respectively, the east edge matrix edgeE and its east edge image, the west edge matrix edgeW and its west edge image, the south edge matrix edgeS and its south edge image, the north edge matrix edgeN and its north edge image, the northeast edge matrix edgeNE and its northeast edge image, the southeast edge matrix edgeSE and its southeast edge image, the northwest edge matrix edgeNW and its northwest edge image, and the southwest edge matrix edgeSW and its southwest edge image.
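The convolution step can be sketched as follows. The patent gives its eight operator matrices as figures that are not reproduced here, so the standard Kirsch compass kernels are used instead as an assumption, as is the assignment of each kernel to a compass direction.

```python
import numpy as np
from scipy.ndimage import convolve

# Standard Kirsch compass kernels (assumed stand-ins for the patent's K_E ... K_SW).
KIRSCH = {
    "K_E":  np.array([[-3, -3, 5], [-3, 0, 5], [-3, -3, 5]]),
    "K_W":  np.array([[5, -3, -3], [5, 0, -3], [5, -3, -3]]),
    "K_N":  np.array([[5, 5, 5], [-3, 0, -3], [-3, -3, -3]]),
    "K_S":  np.array([[-3, -3, -3], [-3, 0, -3], [5, 5, 5]]),
    "K_NE": np.array([[-3, 5, 5], [-3, 0, 5], [-3, -3, -3]]),
    "K_SE": np.array([[-3, -3, -3], [-3, 0, 5], [-3, 5, 5]]),
    "K_NW": np.array([[5, 5, -3], [5, 0, -3], [-3, -3, -3]]),
    "K_SW": np.array([[-3, -3, -3], [5, 0, -3], [5, 5, -3]]),
}

def directional_edge_maps(gray_a):
    """Convolve the gray matrix grayA with every detection operator, producing
    edgeE, edgeW, ..., edgeSW as float arrays of the same size as grayA."""
    g = np.asarray(gray_a, dtype=float)
    return {name.replace("K_", "edge"): convolve(g, k, mode="nearest")
            for name, k in KIRSCH.items()}
```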
S303, dividing a search area
Connect the upper-left corner point and the lower-right corner point of the photo of the hole to be measured to form a third dotted line, and connect the upper-right corner point and the lower-left corner point to form a fourth dotted line. The third dotted line and the fourth dotted line divide the photo of the hole to be measured into four regions: an east region, a west region, a south region and a north region.
The hole edges of the east region include the east edge, the northeast edge and the southeast edge; the hole edges of the west region include the west edge, the northwest edge and the southwest edge; the hole edges of the south region include the south edge, the southeast edge and the southwest edge; the hole edges of the north region include the north edge, the northeast edge and the northwest edge.
S304, defining a search starting circle and an accurate edge matrix
The search start circle is defined as follows: its center coordinates (xs, ys) are equal to the center coordinates (x0, y0) of the initial circle, and its radius is rs. A first diameter line segment passes through (xs, ys) at an angle of 45 degrees to the X axis, and a second diameter line segment passes through (xs, ys) at an angle of -45 degrees to the X axis. The first diameter line segment and the second diameter line segment divide the circumference of the search start circle into four arcs: an east arc, a west arc, a south arc and a north arc. Each of the four arcs contains the same number of pixels, nPixel.
The precise edge matrix Edge is defined as follows: its dimensions are M × M, and all of its elements are equal to 1.
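One way to set up the search start circle, its four arcs and the precise edge matrix of step S304 is sketched below. How the arcs are sampled and how nPixel is computed is given in the original as a figure, so the quarter-circumference approximation and the angular ranges chosen for the east, north, west and south arcs are assumptions.

```python
import numpy as np

def build_search_start(cx, cy, r_s, m=1024):
    """Search start circle centred on the initial-circle center (cx, cy) with
    radius r_s (chosen so the circle lies entirely inside the hole); the +/-45
    degree diameters split it into east, north, west and south arcs, and Edge
    starts as an all-ones M x M matrix."""
    n_pixel = int(round(2 * np.pi * r_s / 4))            # pixels per quarter arc (approximation)
    theta = {
        "east":  np.linspace(-np.pi / 4,  np.pi / 4, n_pixel),
        "north": np.linspace( np.pi / 4,  3 * np.pi / 4, n_pixel),
        "west":  np.linspace( 3 * np.pi / 4, 5 * np.pi / 4, n_pixel),
        "south": np.linspace( 5 * np.pi / 4, 7 * np.pi / 4, n_pixel),
    }
    arcs = {k: np.stack([cx + r_s * np.cos(t), cy + r_s * np.sin(t)], axis=1).round().astype(int)
            for k, t in theta.items()}
    edge = np.ones((m, m), dtype=np.uint8)               # precise edge matrix, all elements = 1
    return arcs, edge, n_pixel
```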
s305, searching for hole edges of the east region;
will be east-ward rounded (17.1)nPixelThe pixel points are arranged from top to bottom in sequence to form a point set
Figure 542022DEST_PATH_IMAGE028
Figure 936094DEST_PATH_IMAGE029
The hole-edge search of the east region is carried out on the horizontal line passing through each point PEi of the east arc, with PEi as the starting point of the search. The specific steps are as follows:
(i) define an accumulation variable i equal to zero and an accumulation variable j equal to zero;
(ii) take the point PEi of the east arc as the current search point, and record the maximum of the values of edgeE, edgeNE and edgeSE at the current point as maxE;
(iii) if maxE is greater than the gray threshold (threshold), set the element of the precise edge matrix Edge at the current point equal to zero and go to step (v); if maxE is less than or equal to the gray threshold, add 1 to the accumulation variable j, i.e. move the current point one pixel to the right, and go to step (iv);
(iv) record the maximum of the values of edgeE, edgeNE and edgeSE at the new current point as maxE, and go to step (iii);
(v) if i is greater than or equal to nPixel, go to step (vi); if i is less than nPixel, add 1 to i and go to step (ii);
(vi) the hole-edge search of the east region is finished.
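The east-region search loop above can be written compactly as follows; the west, south and north searches mirror it with the step direction and the three edge matrices changed. The [row, column] = [y, x] array indexing and the default gray threshold of 150 (the value used in the embodiment below) are assumptions about how the matrices are stored.

```python
import numpy as np

def search_east(arc_points, edge_e, edge_ne, edge_se, edge, threshold=150):
    """For each point of the east arc, step one pixel to the right until the
    strongest of the three east-facing responses exceeds the gray threshold,
    then mark that pixel in the precise edge matrix (Edge element = 0)."""
    width = edge.shape[1]
    for x0, y0 in arc_points:                        # one horizontal search line per arc point
        x = x0
        while x < width:
            response = max(edge_e[y0, x], edge_ne[y0, x], edge_se[y0, x])
            if response > threshold:
                edge[y0, x] = 0                      # precise east-edge pixel found
                break
            x += 1                                   # move one pixel to the right
    return edge
```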
S306, searching hole edges of west-direction area
Arrange the nPixel pixels on the west arc (17.2) from top to bottom in sequence to form the point set PW1, PW2, …, PWnPixel.
The hole-edge search of the west region is carried out on the horizontal line passing through each point PWi of the west arc, with PWi as the starting point of the search. The specific steps are as follows:
(i) define an accumulation variable i equal to zero and an accumulation variable j equal to zero;
(ii) take the point PWi of the west arc as the current search point, and record the maximum of the values of edgeW, edgeNW and edgeSW at the current point as maxW;
(iii) if maxW is greater than the gray threshold, set the element of the precise edge matrix Edge at the current point equal to zero and go to step (v); if maxW is less than or equal to the gray threshold, add 1 to the accumulation variable j, i.e. move the current point one pixel to the left, and go to step (iv);
(iv) record the maximum of the values of edgeW, edgeNW and edgeSW at the new current point as maxW, and go to step (iii);
(v) if i is greater than or equal to nPixel, go to step (vi); if i is less than nPixel, add 1 to i and go to step (ii);
(vi) the hole-edge search of the west region (4.7) is finished.
S307, hole edge searching of south area
Arrange the nPixel pixels on the south arc (17.3) from left to right in sequence to form the point set PS1, PS2, …, PSnPixel.
The hole-edge search of the south region is carried out on the vertical line passing through each point PSi of the south arc, with PSi as the starting point of the search. The specific steps are as follows:
(i) define an accumulation variable i equal to zero and an accumulation variable j equal to zero;
(ii) take the point PSi of the south arc as the current search point, and record the maximum of the values of edgeS, edgeSE and edgeSW at the current point as maxS;
(iii) if maxS is greater than the gray threshold, set the element of the precise edge matrix Edge at the current point equal to zero and go to step (v); if maxS is less than or equal to the gray threshold, add 1 to the accumulation variable j, i.e. move the current point one pixel down, and go to step (iv);
(iv) record the maximum of the values of edgeS, edgeSE and edgeSW at the new current point as maxS, and go to step (iii);
(v) if i is greater than or equal to nPixel, go to step (vi); if i is less than nPixel, add 1 to i and go to step (ii);
(vi) the hole-edge search of the south region (4.8) is finished.
S308, hole edge searching of northbound region
Arrange the nPixel pixels on the north arc (17.4) from left to right in sequence to form the point set PN1, PN2, …, PNnPixel.
The hole-edge search of the north region is carried out on the vertical line passing through each point PNi of the north arc, with PNi as the starting point of the search. The specific steps are as follows:
(i) define an accumulation variable i equal to zero and an accumulation variable j equal to zero;
(ii) take the point PNi of the north arc as the current search point, and record the maximum of the values of edgeN, edgeNE and edgeNW at the current point as maxN;
(iii) if maxN is greater than the gray threshold, set the element of the precise edge matrix Edge at the current point equal to zero and go to step (v); if maxN is less than or equal to the gray threshold, add 1 to the accumulation variable j, i.e. move the current point one pixel up, and go to step (iv);
(iv) record the maximum of the values of edgeN, edgeNE and edgeNW at the new current point as maxN, and go to step (iii);
(v) if i is greater than or equal to nPixel, go to step (vi); if i is less than nPixel, add 1 to i and go to step (ii);
(vi) the hole-edge search of the north region is finished.
And step S4: accurately calculating hole site and hole diameter of connecting hole
The image of the precise edge matrix Edge is a binary image containing only black and white; the black point set corresponds to the elements of Edge that are equal to zero, and the coordinates of those zero elements are the pixel coordinates of the hole edge. Circle fitting is performed on the black point set by an average-value method to obtain a precise circle with center coordinates (xp, yp) and radius rp.
The hole-position coordinates of the connecting hole are (xp, yp), and the radius of the connecting hole is rp.
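A sketch of the final fitting step. The text only names an "average value method"; taking the center as the mean of the edge-pixel coordinates and the radius as the mean distance from that center is the interpretation assumed here. The returned center and radius correspond to the hole position and hole radius reported in step S4.

```python
import numpy as np

def fit_circle_average(edge_matrix):
    """Circle fit over the black point set, i.e. the elements of Edge equal to zero."""
    ys, xs = np.nonzero(edge_matrix == 0)            # coordinates of the hole-edge pixels
    xp, yp = xs.mean(), ys.mean()                    # center = average of the edge pixels
    rp = np.hypot(xs - xp, ys - yp).mean()           # radius = mean distance to the center (assumption)
    return (xp, yp), rp
```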
The application has the advantages that:
the application provides an image processing method capable of accurately calculating the hole position and the hole diameter of the skin, which can quickly find the accurate edge pixel point of the connecting hole from the skin picture, calculate the hole position and the hole diameter of the connecting hole and accurately detect the processing quality of the composite skin part. The method can relieve technicians from complex image processing algorithm development work, puts limited energy into the design and optimization of other important modules of the machine vision detection system, and promotes the popularization and application of non-contact detection technologies represented by machine vision in the field of aircraft manufacturing detection.
Drawings
Fig. 1 is a grayscale photograph.
Fig. 2 is a display view of white labeled circles.
Fig. 3 is a label view of fig. 1.
FIG. 4 is a photograph of a well to be tested.
Fig. 5 is a display of a set of hole edge points.
Fig. 6 is a display view of an initial circle.
Fig. 7 is an east edge image (one).
Fig. 8 is east edge image (two).
FIG. 9 is a west edge image.
Fig. 10 is a southbound edge image.
FIG. 11 is a northbound edge image.
Fig. 12 is a northeast edge image.
Fig. 13 is a southeast edge image.
FIG. 14 is a northwest edge image.
Fig. 15 is a southwestern edge image.
Fig. 16 is a search area division diagram.
Fig. 17 is a display diagram of searching for an initial circle.
Fig. 18 is a schematic diagram of hole edge search for the east region.
FIG. 19 is a schematic diagram of a hole edge search for the west region.
Fig. 20 is a schematic diagram of hole edge search for southbound regions.
Fig. 21 is a schematic view of hole edge search of the northbound region.
Fig. 22 is a hole edge search flowchart of the east region.
FIG. 23 is a hole edge search flow diagram for the west-facing region.
Fig. 24 is a hole edge search flow diagram for the southbound region.
Fig. 25 is a hole edge search flow diagram for the northbound region.
FIG. 26 is a binary map of the exact edge matrix.
FIG. 27 is a diagram of a pixel constellation at the edge of a precision hole.
Fig. 28 is a display view of a perfect circle.
Fig. 29 is a flowchart of the overall steps.
In the drawings: 1-grayscale photo; 1.1-a first connection hole; 2-white marking circles; 3-sample graph; 3.1-black canvas; 4-photo of the hole to be detected; 4.1-well edge; 4.2-second connection hole; 4.3-coordinate system; 4.4-third dotted line; 4.5-fourth dashed line; 4.6-east region; 4.7-west region; 4.8-southward region; 4.9-northbound region; 5-hole edge point set; 6-initial circle; 7-east edge image; 7.1-east edge; 8-first dotted line; 9-second dashed line; 10-west edge image; 10.1-west edge; 11-southbound edge images; 11.1-southerly edge; 12-northbound edge images; 12.1-northbound edge; 13-northeast edge image; 13.1-northeast edge; 14-southeast edge image; 14.1-southeast edge; 15-northwest edge image; 15.1-northwest edge; 16-southwesterly edge images; 16.1-southwesterly oriented edges; 17-search for initial circle; 17.1-east arc; 17.2-west circular arc; 17.3-south circular arc; 17.4-north arc; 18-a first diameter line segment; 19-a second diameter line segment; 20-a binary map of the exact Edge matrix Edge; 20.1-black set of dots; 20.2-white point set; 21-precise circle.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are for explaining the present invention and not for limiting the present invention. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
The present invention will be described in further detail with reference to the drawings and examples, but the present invention is not limited to the examples.
As shown in fig. 29, an image processing method for calculating skin hole site and hole diameter includes the following steps:
step S1: designing a connecting hole identification model by adopting a deep learning method;
s101, shooting a sample photo set;
1000 grayscale photographs were taken containing skin attachment holes and having a resolution of 1024 × 1024 to make up a sample photograph collection.
S102, making a label atlas of the sample photo album;
referring to fig. 1, 2, and 3, a method for creating a label map will be described by taking a grayscale photograph 1 as an example. Referring to fig. 1, a greyscale photograph 1 contains a complete first connection aperture 1.1. Referring to fig. 2, a white mark circle 2 approximately coinciding with the first connection hole 1.1 is drawn on the gray-scale photograph 1, and the coordinates of the center of the white mark circle 2 are recorded as
(a, b). Referring to fig. 3, the white marking circle 2 is copied onto a black canvas 3.1 whose resolution is still 1024 × 1024 to form the label map 3 of gray-scale photo 1; the coordinates of the center of the white marking circle 2 in label map 3 remain (a, b).
For all other photos in the gray-scale photo set of step S101, label maps are created for the remaining 999 sample photos by the same method to form the label atlas.
S103, training a detection network;
and (3) adopting a VGG-16 network to perform downsampling to extract features, then adopting a deconvolution mode to perform upsampling, and finally classifying all pixel points in the image. Number of iterations of trainingepochFor 100, the corresponding test set is 50 gray-scale pictures, the input picture size of the model is 1024 × 1024, and the loss function used is the cross entropy:
L = -[y·log(ŷ) + (1 - y)·log(1 - ŷ)]
where L represents the loss, y represents the true value and ŷ represents the predicted value.
After 100 iterations the loss function of the model decreases only slightly and the model as a whole has converged; stopping early prevents overfitting, and the weights of the trained model are used as the feature-recognition parameters for detecting the region where the first connection hole 1.1 is located.
Step S2: roughly calculating hole positions and hole diameters;
s201, shooting a picture of a hole to be detected;
referring to fig. 4, a photo 4 of the hole to be measured with a resolution of 1024 × 1024 is taken, and the photo 4 of the hole to be measured includes the second connection hole 4.2 and the hole edge 4.1. Establishing a coordinate system 4.3 on the picture 4 of the hole to be detected according to the following rules: the origin of coordinates is located in the lower left corner of the image, with the positive X-axis direction going horizontally to the right and the positive Y-axis direction going vertically upward.
For the pixel at any position in photo 4 of the hole to be measured, denote its gray value as g(x, y); the gray values of all the pixels in the photo form the gray matrix grayA.
S202, roughly calculating hole positions and hole diameters;
and initializing the feature extraction parameters of the detection network by using the model parameters obtained in the step S103, normalizing the picture to be recognized with any input resolution to 1024 x 1024 sizes by adopting a cubic spline interpolation mode, and setting the recognition model as an eval () evaluation mode. And finally, the output result comprises a background and a foreground, the foreground is the area where the connecting hole is located, the background is the other areas except the connecting hole, and the hole edge point set 5 of the connecting hole can be roughly obtained by mapping the identified result to the original picture resolution in an equal proportion, as shown in fig. 5.
The hole-edge point set 5 has 2392 edge points, and the coordinates of each edge point are (xi, yi). Referring to fig. 6, an initial circle 6 is constructed from the hole-edge point set 5; the center coordinates (x0, y0) and radius r0 of the initial circle 6 are calculated as follows:
1) x0 equals the average of the abscissas xi of all edge points, and y0 equals the average of the ordinates yi of all edge points;
2) r0 is obtained from the difference between the maximum value and the minimum value of the abscissas xi.
This gives center coordinates of (501, 496) and a radius of 389.
The initial circle 6 is already quite close to the hole edge 4.1 of the second connection hole 4.2, and the initial circle 6 is the input condition of step S304.
And step S3: accurately positioning a hole edge pixel point set;
s301, assigning detection operators for hole edges in different directions;
The Kirsch edge-detection operator, proposed by R. Kirsch, is composed of eight 3 × 3 matrices.
in the round hole edge detection scene, detection operators are assigned to the hole edges in different directions as follows:
Figure 993710DEST_PATH_IMAGE093
Figure 67846DEST_PATH_IMAGE094
Figure 829128DEST_PATH_IMAGE095
Figure 781166DEST_PATH_IMAGE096
Figure 209873DEST_PATH_IMAGE097
Figure 454910DEST_PATH_IMAGE098
Figure 969068DEST_PATH_IMAGE099
Figure 957752DEST_PATH_IMAGE100
the eight detection operators described above are each sensitive only to the hole edge 4.1 in the respective direction. For example, referring to fig. 7, 8, the east operator is sensitive only to east edges 7.1, and is not sensitive to edges in other directions. The eight detection operators are properly combined in the subsequent steps, so that the hole edge 4.1 can be accurately detected.
S302, performing image convolution operation;
will gray scale matrixgrayAAre respectively connected withK_E、K_W、K_N、K_S、K_NE、K_SE、K_NW、K_SWAnd carrying out convolution operation on the detection operator to obtain 8 edge matrixes. Each edge matrix represents an edge image, and 8 edge matrices represent images as shown in fig. 7-15. Gray matrixgrayAThe correspondence relationship among the detection operator, the edge matrix, and the edge image is shown in table 1.
It should be noted that the resolution of fig. 7 to 15 and the to-be-measured hole photograph 4 are 1024 × 1024. In order to save space, the photo 4 of the hole to be detected is a photo obtained by reducing the actual photo; fig. 7 to 15 are enlarged views of actual pictures to show the edge of the hole more clearly.
TABLE 1 grayscale matrixgrayATable of correspondence between detection operator, edge matrix and edge image
Figure 975387DEST_PATH_IMAGE101
Referring to fig. 7, the gray matrix grayA is convolved with the east operator K_E to obtain the east edge matrix edgeE, whose image is the east edge image 7. Referring to fig. 8, to show the edge-detection effect more intuitively, the center point and the upper-right corner point of the east edge image 7 are connected to form a first dotted line 8, and the center point and the lower-right corner point of the east edge image 7 are connected to form a second dotted line 9. It is evident that the east edge 7.1 between the first dotted line 8 and the second dotted line 9 is the most pronounced of the hole edges in all directions, appearing as a segment of edge curve with higher brightness.
Similar to fig. 8, fig. 9 to 15 each add two dotted lines to the edge image to more intuitively show the edge detection effect:
referring to fig. 9, visible in the west-oriented edge image 10 are: the west edge 10.1 is most pronounced compared to other orientations of the aperture edge, and exhibits a segment of the edge curve with a higher brightness.
Referring to fig. 10, visible in the southbound edge image 11 are: the south edge 11.1 is most pronounced compared to other orientations of the aperture edge, and exhibits a section of the edge curve with a higher brightness.
Referring to fig. 11, visible in the northbound edge image 12 are: the north edge 12.1 is most pronounced compared to other orientations of the aperture edge, and exhibits a section of the edge curve with a higher brightness.
Referring to fig. 12, it can be seen in the northeast edge image 13: the northeast edge 13.1 is most pronounced compared to other orientations of the aperture edge, and appears as a section of the edge curve with higher brightness.
Referring to fig. 13, visible in the southeast edge image 14 are: southeast edge 14.1 is most pronounced compared to other orientations of the aperture edge, and exhibits a section of the edge curve with higher brightness.
Referring to fig. 14, visible in the northwest edge image 15 are: the northwest edge 15.1 is most pronounced compared to the other hole edges, appearing as a segment of the edge curve with higher brightness.
Referring to fig. 15, visible in southwest edge image 16 is: southwest edge 16.1 is most pronounced compared to other orientations of the aperture edge, appearing as a segment of the edge curve with higher brightness.
The east edge 7.1, the west edge 10.1, the south edge 11.1, the north edge 12.1, the north-east edge 13.1, the south-east edge 14.1, the north-west edge 15.1, the south-west edge 16.1 have a higher brightness, indicating that the gray value of the pixels located on these edges is higher.
In the next step, the accurate position of the hole edge 4.1 is obtained by dividing photo 4 of the hole to be measured into edge-search regions and then searching for edge pixels from the inside of the connecting hole outward. Referring to figs. 7-15, since the gray values inside the connection hole are much smaller than the gray values at the hole edge, a gray threshold threshold = 150 is set; when the gray value at a pixel exceeds the gray threshold 150, that pixel has reached the hole edge 4.1.
S303, dividing a search area;
referring to fig. 16, the upper left corner point and the lower right corner point of the hole-to-be-detected picture 4 are connected to form a third dotted line 4.4, the upper right corner point and the lower left corner point of the hole-to-be-detected picture 4 are connected to form a fourth dotted line 4.5, and the third dotted line 4.4 and the fourth dotted line 4.5 divide the picture into four regions: east 4.6, west 4.7, south 4.8, north 4.9.
Referring to fig. 8, 12, 13 and 16, the aperture edges of the east region 4.6 include an east edge 7.1, a northeast edge 13.1 and a southeast edge 14.1.
Referring to fig. 9, 14, 15 and 16, the aperture edge of the west region 4.7 includes a west edge 10.1, a northwest edge 15.1, a southwest edge 16.1.
Referring to fig. 10, 13, 15 and 16, the aperture edge of southbound region 4.8 includes a southbound edge 11.1, a southeast edge 14.1, and a southwest edge 16.1.
Referring to fig. 11, 12, 14 and 16, the aperture edges of northbound region 4.9 include northbound edge 12.1, northeast edge 13.1, northwest edge 15.1.
S304, defining a search starting circle and an accurate edge matrix;
step S202 obtains an initial circle 6, the coordinates of the center of the initial circle 6 are (501, 496), and the radius of the initial circle 6 is 389; step S303 describes in detail the hole edge distribution of east 4.6, west 4.7, south 4.8, and north 4.9 areas. Thus, searching from the inside of the initial circle 6 outwards can obtain the exact pixel point of the hole edge 4.1.
Referring to figs. 17-21, the search start circle 17 is defined as follows: (1) its center coordinates (xs, ys) are equal to the center coordinates (501, 496) of the initial circle 6; (2) its radius is rs. The search start circle 17 lies entirely within the region inside the hole edge 4.1.
A first diameter line segment 18 passes through the center coordinates (xs, ys) at an angle of 45 degrees to the X axis, and a second diameter line segment 19 passes through (xs, ys) at an angle of -45 degrees to the X axis. The first diameter line segment 18 and the second diameter line segment 19 divide the circumference of the search start circle 17 into four arcs: an east arc 17.1, a west arc 17.2, a south arc 17.3 and a north arc 17.4. Each of the four arcs contains the same number of pixels, nPixel = 520.
The precise edge matrix Edge is defined as follows: (1) its dimensions are 1024 × 1024; (2) all of its elements are equal to 1.
s305, hole edge search of east region
The 520 pixels on the east arc 17.1 are arranged from top to bottom in sequence to form the point set PE1, PE2, …, PE520.
Referring to figs. 18 and 22, the hole-edge search process of the east region is illustrated taking the horizontal line through the first point PE1 as an example:
(i) define an accumulation variable j equal to zero; take PE1 as the current search point and record the maximum of the values of edgeE, edgeNE and edgeSE at the current point as maxE;
(ii) if maxE is greater than the gray threshold 150, go to step (iv); if maxE is less than or equal to 150, add 1 to the accumulation variable j, i.e. move the current point one pixel to the right, and go to step (iii);
(iii) record the maximum of the values of edgeE, edgeNE and edgeSE at the new current point as maxE, and go to step (ii);
(iv) set the element of the precise edge matrix Edge at the current point equal to zero; the search process for this horizontal line ends.
When maxE exceeds the gray threshold 150, the current search point has reached a pixel of the east edge 7.1. The hole-edge search of the east region is carried out on the horizontal line passing through each point of the east arc; the remaining 519 horizontal lines complete the hole-edge search of the east region by the same steps (i)-(iv).
S306, the hole edge of the west-direction area;
arranging 520 pixel points of 17.2 on the west arc from top to bottom in sequence to form a point set
Figure 663965DEST_PATH_IMAGE040
Figure 584517DEST_PATH_IMAGE111
Referring to figs. 19 and 23, the hole-edge search process of the west region is illustrated taking the horizontal line through the first point PW1 as an example:
(i) define an accumulation variable j equal to zero; take PW1 as the current search point and record the maximum of the values of edgeW, edgeNW and edgeSW at the current point as maxW;
(ii) if maxW is greater than the gray threshold 150, go to step (iv); if maxW is less than or equal to 150, add 1 to the accumulation variable j, i.e. move the current point one pixel to the left, and go to step (iii);
(iii) record the maximum of the values of edgeW, edgeNW and edgeSW at the new current point as maxW, and go to step (ii);
(iv) set the element of the precise edge matrix Edge at the current point equal to zero; the search process for this horizontal line ends.
When maxW exceeds the gray threshold 150, the current search point has reached a pixel of the west edge 10.1. The hole-edge search of the west region is carried out on the horizontal line passing through each point of the west arc; the remaining 519 horizontal lines complete the hole-edge search of the west region by the same steps (i)-(iv).
S307, hole edges of the southbound area;
The 520 pixels on the south arc 17.3 are arranged from left to right in sequence to form the point set PS1, PS2, …, PS520. Referring to figs. 20 and 24, the hole-edge search process of the south region is illustrated taking the vertical line through the first point PS1 as an example:
(i) define an accumulation variable j equal to zero; take PS1 as the current search point and record the maximum of the values of edgeS, edgeSE and edgeSW at the current point as maxS;
(ii) if maxS is greater than the gray threshold 150, go to step (iv); if maxS is less than or equal to 150, add 1 to the accumulation variable j, i.e. move the current point one pixel down, and go to step (iii);
(iii) record the maximum of the values of edgeS, edgeSE and edgeSW at the new current point as maxS, and go to step (ii);
(iv) set the element of the precise edge matrix Edge at the current point equal to zero; the search process for this vertical line ends.
When maxS exceeds the gray threshold 150, the current search point has reached a pixel of the south edge 11.1. The hole-edge search of the south region is carried out on the vertical line passing through each point of the south arc; the remaining 519 vertical lines complete the hole-edge search of the south region by the same steps (i)-(iv).
S308, hole edges of the northbound area;
The 520 pixels on the north arc 17.4 are arranged from left to right in sequence to form the point set PN1, PN2, …, PN520. Referring to figs. 21 and 25, the hole-edge search process of the north region is illustrated taking the vertical line through the first point PN1 as an example:
(i) define an accumulation variable j equal to zero; take PN1 as the current search point and record the maximum of the values of edgeN, edgeNE and edgeNW at the current point as maxN;
(ii) if maxN is greater than the gray threshold 150, go to step (iv); if maxN is less than or equal to 150, add 1 to the accumulation variable j, i.e. move the current point one pixel up, and go to step (iii);
(iii) record the maximum of the values of edgeN, edgeNE and edgeNW at the new current point as maxN, and go to step (ii);
(iv) set the element of the precise edge matrix Edge at the current point equal to zero; the search process for this vertical line ends.
When maxN exceeds the gray threshold 150, the current search point has reached a pixel of the north edge 12.1. The hole-edge search of the north region is carried out on the vertical line passing through each point of the north arc; the remaining 519 vertical lines complete the hole-edge search of the north region by the same steps (i)-(iv).
And step S4: accurately calculating the hole position and the hole diameter of the connecting hole;
through the steps S304 to S308, the edge matrix is accurateEdgeSome elements equal to 1 are reassigned to zero, and these elements equal to zero are inEdgeThe coordinates in (1) are the picture 4 of the pixel point at the edge 4.1 of the hole to be measuredCoordinates of (2). Referring to FIG. 26, an edge matrix is refinedEdgeThe image of (2) is a binary image 20 with only black and white colors, wherein the black point set 20.1 isEdgeEqual to zero. Referring to fig. 27, in order to more intuitively show the search result of the hole edge 4.1, the color of the black point set 20.1 is modified into white to form a white point set 20.2, and the white point set 20.2 is covered on the picture 4 of the hole to be detected, so that the white point set 20.2 is closely attached to the hole edge 4.1, which indicates that the white point set 20.2 is an actual pixel point of the hole edge 4.1.
Referring to fig. 28, the black point set 20.1 in fig. 26 is circle-fitted using an average method to obtain an exact circle 21 having center coordinates equal to (530, 531) and a radius equal to 372. The precise circle 21 is tightly attached to the hole edge 4.1, the hole position coordinates of the second connecting hole 4.2 are (530, 531), and the radius of the second connecting hole 4.2 is 372.

Claims (11)

1. An image processing method for calculating skin hole site and hole diameter is characterized in that: the method comprises the following steps:
step S1: designing a connecting hole identification model by adopting a deep learning method: shooting a sample photo set, making a label photo set of the sample photo set, and training a detection network;
step S2: roughly calculate the hole position and hole diameter: shooting a photo of the hole to be measured, and roughly calculating the hole position and hole diameter;
and step S3: accurately positioning a hole edge pixel point set: appointing a detection operator, performing image convolution operation and dividing a search area for the hole edges in different directions; defining a search starting circle and an accurate edge matrix, searching hole edges of an east area, searching hole edges of a west area, searching hole edges of a south area, searching hole edges of a north area, searching hole edges of a northeast area, searching hole edges of a southeast area, searching hole edges of a northwest area and searching hole edges of a southwest area;
and step S4: and accurately calculating the hole position and the hole diameter of the connecting hole.
2. The image processing method for calculating skin hole site and hole diameter according to claim 1, wherein:
the step S1 specifically comprises the following steps:
s101, shooting a sample photo set;
shooting a plurality of gray photos which contain skin connecting holes and have the size of M multiplied by M to form a sample photo set;
s102, making a label atlas of the sample photo album;
covering a white marking circle (2) coinciding with the first connecting hole (1.1) onto a black canvas (3.1) of size M × M to form a label map (3) of the gray-scale photo (1); for all other photos of the sample photo set in step S101, making a label map for each gray-scale photo by the same method to form a label atlas;
s103, training a detection network;
adopting a VGG-16 network for downsampling to extract features, then adopting deconvolution for upsampling, and finally classifying all pixels in the image; the number of training iterations (epoch) is 100, the corresponding test set is 50 gray-scale photos, the input picture size of the model is M × M, and the loss function used is the cross entropy:
L = -[y·log(ŷ) + (1 - y)·log(1 - ŷ)]
where L represents the loss, y represents the true value and ŷ represents the predicted value;
after 100 iterations the loss function of the model decreases only slightly and the whole model has converged; stopping early prevents overfitting, and the weight of the trained model is used as the feature-identification parameter for detecting the region where the first connecting hole (1.1) is located.
3. The image processing method for calculating skin hole site and hole diameter according to claim 2, characterized in that:
the step S2 specifically comprises the following steps:
s201, shooting a picture of a hole to be detected;
a photo (4) of the hole to be measured containing the second connecting hole (4.2) and the hole edge (4.1) is taken, and the gray values of all pixels of the photo (4) form a gray matrix grayA;
S202, roughly calculating hole positions and hole diameters;
initializing the feature-extraction parameters of the detection network with the model parameters obtained in step S103, normalizing the picture to be recognized of arbitrary input resolution to the size M × M by cubic spline interpolation, and setting the recognition model to eval() evaluation mode; the final output comprises a background and a foreground, the foreground being the region where the connecting hole is located and the background being the regions other than the connecting hole; the hole-edge point set (5) of the connecting hole is roughly obtained by mapping the recognized result to the original picture resolution in equal proportion;
the hole-edge point set (5) has N edge points, the coordinates of each edge point being
(xi, yi); an initial circle (6) is constructed from the hole-edge point set (5), and the center coordinates (x0, y0) and radius r0 of the initial circle (6) are calculated as follows:
1) x0 is equal to the average of the abscissas xi of all edge points, and y0 is equal to the average of the ordinates yi of all edge points;
2) r0 is obtained from the difference between the maximum value and the minimum value of the abscissas xi.
4. the image processing method for calculating skin hole sites and hole diameters according to claim 3, wherein:
the step S3 specifically includes:
s301, assigning detection operators for hole edges in different directions;
assigning detection operators to hole edges in different orientations as follows:
K_E, K_W, K_N, K_S, K_NE, K_SE, K_NW and K_SW, one operator for each of the eight directions;
s302, performing image convolution operation;
will gray scale matrixgrayAAre respectively connected withK_E、K_W、K_N、K_S、K_NE、K_SE、K_NW、K_SWCarrying out convolution operation on the detection operators to respectively obtain east edge matrixesedgeEAnd the corresponding east edge image (7) and west edge matrixedgeWAnd the corresponding west edge image (10) and south edge matrixedgeSAnd corresponding south edge image (11) and north edge matrixedgeNAnd its corresponding northbound edge image (12), northeast edge matrixedgeNEAnd corresponding northeast edge image (13) and southeast edge matrixedgeSEAnd corresponding southeast edge image (14) and northwest edge matrixedgeNWAnd the corresponding northwest edge image (15) and southwest edge matrixedgeSWAnd its corresponding southwestern edge image (16);
s303, dividing a search area;
connecting an upper left corner point and a lower right corner point of the to-be-detected hole picture (4) to form a third dotted line (4.4), and connecting an upper right corner point and a lower left corner point of the to-be-detected hole picture (4) to form a fourth dotted line (4.5); the third dotted line (4.4) and the fourth dotted line (4.5) divide the picture of the hole to be measured (4) into four regions: an east region (4.6), a west region (4.7), a south region (4.8) and a north region (4.9);
S304, defining a search start circle and an accurate edge matrix;
the search start circle (17) is defined as follows: its center coordinates are equal to the center coordinates (x_0, y_0) of the initial circle (6), and its radius r_s is given by the corresponding formula; the first diameter line segment (18) passes through the circle center and forms an included angle of 45 degrees with the X axis, and the second diameter line segment (19) passes through the circle center and forms an included angle of −45 degrees with the X axis; the first diameter line segment (18) and the second diameter line segment (19) divide the circumference of the search start circle (17) into four circular arcs: an east arc (17.1), a west arc (17.2), a south arc (17.3) and a north arc (17.4); the number of pixel points contained in each arc is equal to nPixel, given by the corresponding formula;
the accurate edge matrix Edge is defined as follows: its dimension is M × M and all of its elements are equal to 1;
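For illustration, a minimal Python sketch of this step follows; the radius r_s and the nPixel formula are given only as formulas in the original, so r_s is taken as a parameter, and the 45-degree arc boundaries are implemented under the claim-7 convention that the Y axis points upward.

# Sketch of step S304: rasterize the search start circle (17), split it into the
# four arcs by the +/-45 degree diameters, and initialize the accurate edge matrix.
import numpy as np

def start_circle_arcs(x0: float, y0: float, r_s: float, n_samples: int = 720) -> dict:
    """Return lists of (x, y) pixel points on the east/west/south/north arcs."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    xs = np.round(x0 + r_s * np.cos(theta)).astype(int)
    ys = np.round(y0 + r_s * np.sin(theta)).astype(int)
    arcs = {"E": [], "W": [], "S": [], "N": []}
    for x, y, t in zip(xs, ys, theta):
        deg = np.degrees(t)
        if deg < 45 or deg >= 315:
            arcs["E"].append((x, y))
        elif deg < 135:
            arcs["N"].append((x, y))
        elif deg < 225:
            arcs["W"].append((x, y))
        else:
            arcs["S"].append((x, y))
    # Drop duplicate rasterized pixels while keeping their order.
    return {k: list(dict.fromkeys(v)) for k, v in arcs.items()}

def make_edge_matrix(M: int) -> np.ndarray:
    # Accurate edge matrix Edge: dimension M x M, all elements equal to 1.
    return np.ones((M, M), dtype=np.uint8)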
S305, searching for hole edges in the east region;
the nPixel pixel points of the east arc (17.1) are arranged from top to bottom in sequence to form the east arc point set; the hole edge search of the east region (4.6) is carried out on the horizontal line through each point of this set, with that point as the starting point of the search;
S306, searching for hole edges in the west region;
the nPixel pixel points of the west arc (17.2) are arranged from top to bottom in sequence to form the west arc point set; the hole edge search of the west region (4.7) is carried out on the horizontal line through each point of this set, with that point as the starting point of the search;
S307, searching for hole edges in the south region;
the nPixel pixel points of the south arc (17.3) are arranged from left to right in sequence to form the south arc point set; the hole edge search of the south region (4.8) is carried out on the vertical line through each point of this set, with that point as the starting point of the search;
S308, searching for hole edges in the north region;
the nPixel pixel points of the north arc (17.4) are arranged from left to right in sequence to form the north arc point set; the hole edge search of the north region (4.9) is carried out on the vertical line through each point of this set, with that point as the starting point of the search.
5. The image processing method for calculating skin hole site and hole diameter according to claim 4, wherein:
the step S4 specifically comprises the following steps:
in the binary image (20) of the accurate edge matrix Edge, the black point set (20.1) consists of the elements of Edge whose value equals zero; circle fitting is performed on the black point set (20.1) by the average value method, giving fitted center coordinates (x_c, y_c) and a fitted radius r_c; the hole site coordinates of the second connecting hole (4.2) are (x_c, y_c), and the radius of the second connecting hole (4.2) is r_c.
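For illustration, a minimal Python sketch of this fitting step follows; the "average value method" is interpreted here as the coordinate mean for the center and the mean distance to that center for the radius, which is an assumption since the patent's exact formulas are not reproduced.

# Sketch of step S4: fit a circle to the black point set (20.1), i.e. the zero
# elements of the accurate edge matrix Edge.
import numpy as np

def fit_hole_circle(Edge: np.ndarray):
    ys, xs = np.nonzero(Edge == 0)          # row and column indices of the black points
    xc, yc = xs.mean(), ys.mean()           # fitted circle center (in array indices)
    rc = np.hypot(xs - xc, ys - yc).mean()  # fitted radius as the mean distance to the center
    # Conversion of the row index yc to the Y-up coordinate system of claim 7 is omitted here.
    return xc, yc, rc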
6. The image processing method for calculating skin hole site and hole diameter according to claim 2, wherein in step S102: the center of the white mark circle (2) has the same coordinates in the grayscale picture (1) and in the sample picture (3).
7. An image processing method for calculating skin hole sites and apertures according to claim 3, characterized in that: in step S201: establishing a coordinate system (4.3) on the picture (4) of the hole to be detected according to the following rules: the origin of coordinates is located in the lower left corner of the image, with the positive X-axis direction going horizontally to the right and the positive Y-axis direction going vertically upwards.
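Because image arrays are usually indexed (row, col) from the top-left corner while this coordinate system (4.3) places the origin at the lower-left with Y pointing up, a small conversion helper may clarify the convention; the function names are illustrative.

# Sketch of the coordinate system (4.3) of step S201: origin at the lower-left
# corner, X to the right, Y upward; image arrays index (row, col) from the top-left.
def rowcol_to_xy(row: int, col: int, height: int) -> tuple:
    """Convert array indices (row, col) to picture coordinates (x, y)."""
    return col, (height - 1) - row

def xy_to_rowcol(x: int, y: int, height: int) -> tuple:
    """Convert picture coordinates (x, y) back to array indices (row, col)."""
    return (height - 1) - y, x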
8. The image processing method for calculating skin hole site and hole diameter according to claim 4, wherein step S305 comprises the following specific contents:
(i) an accumulation variable i is defined with an initial value of zero, and an accumulation variable j is defined with an initial value of zero;
(ii) the current search point, lying on the horizontal line through the i-th start point of the east arc point set and offset j pixels to its right, is denoted (x, y); the maximum of the three corresponding directional edge-matrix values at (x, y) is recorded as maxE;
(iii) if maxE is greater than the gray threshold threshold, the element of the accurate edge matrix Edge at (x, y) is set equal to zero and the procedure goes to step (v); if maxE is less than or equal to the gray threshold threshold, 1 is added to the accumulation variable j, the search moves one pixel to the right to the next point, and the procedure goes to step (iv);
(iv) the maximum of the three corresponding directional edge-matrix values at the new point is recorded as maxE, and the procedure goes to step (iii);
(v) if i is greater than or equal to nPixel, the procedure goes to step (vi); if i is less than nPixel, 1 is added to the accumulation variable i and the procedure goes to step (ii);
(vi) the hole edge search of the east region (4.6) is finished.
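For illustration, a minimal Python sketch of this east-region search follows; it assumes the three compared values are the east-facing responses edgeE, edgeNE and edgeSE at the current pixel (an inference from step S302, not an explicit statement), that start_points holds the east-arc points in the claim-7 coordinate system, and that Edge is the all-ones matrix of step S304.

# Sketch of the east-region search of step S305 / claim 8.
def search_east(start_points, edgeE, edgeNE, edgeSE, Edge, threshold):
    """start_points: (x, y) east-arc points with the origin at the lower-left and
    Y pointing up; Edge is modified in place."""
    height, width = Edge.shape
    for (x0, y0) in start_points:        # one horizontal search line per arc point
        x = x0
        r = (height - 1) - y0            # convert the Y-up coordinate to an array row
        while 0 <= x < width:
            max_e = max(edgeE[r, x], edgeNE[r, x], edgeSE[r, x])
            if max_e > threshold:        # edge response found: mark the pixel black
                Edge[r, x] = 0
                break
            x += 1                       # otherwise step one pixel to the right
    return Edge

The west, south and north searches of claims 9 to 11 follow the same loop with the step direction and the three edge matrices changed accordingly.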
9. The image processing method for calculating skin hole site and hole diameter according to claim 4, wherein step S306 comprises the following specific contents:
(i) an accumulation variable i is defined with an initial value of zero, and an accumulation variable j is defined with an initial value of zero;
(ii) the current search point, lying on the horizontal line through the i-th start point of the west arc point set and offset j pixels to its left, is denoted (x, y); the maximum of the three corresponding directional edge-matrix values at (x, y) is recorded as maxW;
(iii) if maxW is greater than the gray threshold threshold, the element of the accurate edge matrix Edge at (x, y) is set equal to zero and the procedure goes to step (v); if maxW is less than or equal to the gray threshold threshold, 1 is added to the accumulation variable j, the search moves one pixel to the left to the next point, and the procedure goes to step (iv);
(iv) the maximum of the three corresponding directional edge-matrix values at the new point is recorded as maxW, and the procedure goes to step (iii);
(v) if i is greater than or equal to nPixel, the procedure goes to step (vi); if i is less than nPixel, 1 is added to the accumulation variable i and the procedure goes to step (ii);
(vi) the hole edge search of the west region (4.7) is finished.
10. The image processing method for calculating skin hole site and hole diameter according to claim 4, wherein step S307 comprises the following specific contents:
(i) an accumulation variable i is defined with an initial value of zero, and an accumulation variable j is defined with an initial value of zero;
(ii) the current search point, lying on the vertical line through the i-th start point of the south arc point set and offset j pixels below it, is denoted (x, y); the maximum of the three corresponding directional edge-matrix values at (x, y) is recorded as maxS;
(iii) if maxS is greater than the gray threshold threshold, the element of the accurate edge matrix Edge at (x, y) is set equal to zero and the procedure goes to step (v); if maxS is less than or equal to the gray threshold threshold, 1 is added to the accumulation variable j, the search moves one pixel down to the next point, and the procedure goes to step (iv);
(iv) the maximum of the three corresponding directional edge-matrix values at the new point is recorded as maxS, and the procedure goes to step (iii);
(v) if i is greater than or equal to nPixel, the procedure goes to step (vi); if i is less than nPixel, 1 is added to the accumulation variable i and the procedure goes to step (ii);
(vi) the hole edge search of the south region (4.8) is finished.
11. The image processing method for calculating skin hole site and hole diameter according to claim 4, wherein step S308 comprises the following specific contents:
(i) an accumulation variable i is defined with an initial value of zero, and an accumulation variable j is defined with an initial value of zero;
(ii) the current search point, lying on the vertical line through the i-th start point of the north arc point set and offset j pixels above it, is denoted (x, y); the maximum of the three corresponding directional edge-matrix values at (x, y) is recorded as maxN;
(iii) if maxN is greater than the gray threshold threshold, the element of the accurate edge matrix Edge at (x, y) is set equal to zero and the procedure goes to step (v); if maxN is less than or equal to the gray threshold threshold, 1 is added to the accumulation variable j, the search moves one pixel up to the next point, and the procedure goes to step (iv);
(iv) the maximum of the three corresponding directional edge-matrix values at the new point is recorded as maxN, and the procedure goes to step (iii);
(v) if i is greater than or equal to nPixel, the procedure goes to step (vi); if i is less than nPixel, 1 is added to the accumulation variable i and the procedure goes to step (ii);
(vi) the hole edge search of the north region (4.9) is finished.
CN202210878722.1A 2022-07-25 2022-07-25 Image processing method for calculating skin hole site and aperture Active CN115423746B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210878722.1A CN115423746B (en) 2022-07-25 2022-07-25 Image processing method for calculating skin hole site and aperture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210878722.1A CN115423746B (en) 2022-07-25 2022-07-25 Image processing method for calculating skin hole site and aperture

Publications (2)

Publication Number Publication Date
CN115423746A true CN115423746A (en) 2022-12-02
CN115423746B CN115423746B (en) 2023-10-10

Family

ID=84196232

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210878722.1A Active CN115423746B (en) 2022-07-25 2022-07-25 Image processing method for calculating skin hole site and aperture

Country Status (1)

Country Link
CN (1) CN115423746B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104359404A (en) * 2014-11-24 2015-02-18 南京航空航天大学 Quick visual detection method for plenty of guide holes of small sizes in airplane parts
CN110906875A (en) * 2019-11-26 2020-03-24 湖北工业大学 Visual processing method for aperture measurement
CN113420363A (en) * 2021-08-25 2021-09-21 成都飞机工业(集团)有限责任公司 Method for predicting matching of skin skeleton of aircraft component
CN114193231A (en) * 2022-02-16 2022-03-18 成都飞机工业(集团)有限责任公司 Bottom hole orifice measuring method for numerical control countersink
CN114219802A (en) * 2022-02-21 2022-03-22 成都飞机工业(集团)有限责任公司 Skin connecting hole position detection method based on image processing
CN114346759A (en) * 2022-03-10 2022-04-15 成都飞机工业(集团)有限责任公司 Device for hole online detection and hole finish machining and machining method thereof

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115230191A (en) * 2022-07-25 2022-10-25 成都飞机工业(集团)有限责任公司 Forming method of stealth box section part

Also Published As

Publication number Publication date
CN115423746B (en) 2023-10-10

Similar Documents

Publication Publication Date Title
CN113450307B (en) Product edge defect detection method
CN111243032B (en) Full-automatic detection method for checkerboard corner points
CN107610141B (en) Remote sensing image semantic segmentation method based on deep learning
CN108776140B (en) Machine vision-based printed matter flaw detection method and system
CN114897864B (en) Workpiece detection and defect judgment method based on digital-analog information
CN112308916B (en) Target pose recognition method based on image target
CN112330593A (en) Building surface crack detection method based on deep learning network
CN110706224B (en) Optical element weak scratch detection method, system and device based on dark field image
CN112164048B (en) Magnetic shoe surface defect automatic detection method and device based on deep learning
CN110648323B (en) Defect detection classification system and method thereof
CN111382658B (en) Road traffic sign detection method in natural environment based on image gray gradient consistency
CN113343976B (en) Anti-highlight interference engineering measurement mark extraction method based on color-edge fusion feature growth
CN110008833B (en) Target ship detection method based on optical remote sensing image
CN110648316A (en) Steel coil end face edge detection algorithm based on deep learning
CN112381062A (en) Target detection method and device based on convolutional neural network
CN115423746A (en) Image processing method for calculating skin hole site and hole diameter
CN109544513A (en) A kind of steel pipe end surface defect extraction knowledge method for distinguishing
CN113012096A (en) Display screen sub-pixel positioning and brightness extraction method, device and storage medium
CN115546795A (en) Automatic reading method of circular pointer instrument based on deep learning
CN113706607B (en) Subpixel positioning method, computer equipment and device based on circular array diagram
CN110031471B (en) Method, system and device for analyzing surface defect growth of large-caliber optical element
CN114219802B (en) Skin connecting hole position detection method based on image processing
CN111738936A (en) Image processing-based multi-plant rice spike length measuring method
CN111260955A (en) Parking space detection system and method adopting parking space frame lines and end points
CN113591548B (en) Target ring identification method and system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant