CN115423746A - Image processing method for calculating skin hole site and hole diameter - Google Patents
- Publication number
- CN115423746A (application number CN202210878722.1A)
- Authority
- CN
- China
- Prior art keywords
- hole
- edge
- equal
- search
- value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; Machine component
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Quality & Reliability (AREA)
- Image Analysis (AREA)
Abstract
The invention belongs to the technical field of inspection and detection, and particularly relates to an image processing method for calculating skin hole positions and hole diameters, comprising the following steps: step S1, designing a connecting hole recognition model by a deep learning method: shooting a sample photo set, making a label map set for it, and training a detection network; step S2, roughly calculating the hole position and hole diameter: shooting a photo of the hole to be measured and roughly calculating the hole position and hole diameter; step S3, precisely locating the hole edge pixel point set: assigning detection operators, performing image convolution, and dividing search regions for the hole edges in different directions; step S4, precisely calculating the hole position and hole diameter of the connecting hole. With the image processing method provided by this application, the precise edge pixels of a connecting hole can be quickly found in a skin photo, the hole position and hole diameter of the connecting hole calculated, and the machining quality of the composite skin part accurately inspected.
Description
Technical Field
The invention belongs to the technical field of inspection and detection, and particularly relates to an image processing method for calculating skin hole sites and skin hole diameters.
Background
With the rapid development of aircraft manufacturing technology, requirements on flight range and flight safety keep rising, and more and more composite materials are used to manufacture aircraft profile parts. Composite materials can be machined into large thin-walled skin parts more than ten meters long, and offer light weight, high strength, and good toughness that metal materials cannot match. Composite skin parts are generally joined to metal framework parts with rivets and bolts, so hundreds of connecting holes must be drilled in a composite skin part. Before a composite skin part is delivered, the hole positions and hole diameters of all connecting holes must be measured to ensure that the composite skin part and the metal parts can be assembled accurately.
Methods for inspecting the large number of connecting holes on a composite skin part fall into contact probe inspection and non-contact inspection. Contact probe inspection is currently the usual method, but it is inefficient and carries the risk of the probe striking the skin surface. Non-contact inspection is a newer digital inspection technology that mainly uses machine vision to carry out measurement tasks on the measured object, with the notable advantages of efficiency, safety, and the absence of contact. One of the core research topics of machine vision is the image processing method. Because images acquired in different scenes differ greatly, it is difficult to find an image processing method that adapts to every scene; in high-precision inspection in particular, a dedicated image processing algorithm is developed for each specific scene.
The machining site of a composite skin part suffers many harsh environmental factors, such as uneven illumination, dust, cutting chips, and liquids, so high-quality pictures cannot be taken. Existing general-purpose image processing methods, such as the Roberts, Sobel, and Prewitt edge detection operators, cannot accurately calculate the hole position and hole diameter of a connecting hole, and often cannot even accurately identify the hole edge pixels of the connecting hole. This bottleneck in image processing greatly limits the application of non-contact inspection to composite skin parts.
Disclosure of Invention
In view of the above problems, the present invention provides an image processing method for calculating skin hole positions and hole diameters.
the invention is realized by the following technical scheme:
an image processing method for calculating skin hole site and hole diameter comprises the following steps:
step S1: designing a connecting hole identification model by adopting a deep learning method;
S101, shooting a sample photo set
Take a number of M × M grayscale photos, each containing a skin connecting hole, to form the sample photo set.
S102, making the label map set of the sample photo set
Draw a white mark circle approximately coinciding with the connecting hole on grayscale photo 1, and record the coordinates of the center of the white mark circle. Copy the white mark circle onto an M × M black canvas 3.1 to form the label map of the grayscale photo; the coordinates of the center of the white mark circle in the label map remain unchanged.
For all the other photos in the grayscale photo set of step S101, a label map is created for each grayscale photo by the same method, forming the label map set.
S103, training detection network
Down-sample with a VGG-16 network to extract features, then up-sample by deconvolution, and finally classify every pixel in the image. The number of training iterations (epoch) is 100, the corresponding test set contains 50 grayscale photos, the input image size of the model is M × M, and the loss function used is the cross entropy:
After 100 iterations the loss function of the model barely decreases and the whole model has converged; stopping early prevents overfitting. The model weights obtained from this training are used as the feature-recognition parameters for detecting the region where the connecting hole lies.
Step S2: rough calculation of hole site and hole diameter
S201, shooting a picture of the hole to be measured
And taking a picture of the hole to be detected, wherein the picture of the hole to be detected comprises a connecting hole and a hole edge. Establishing a coordinate system on the photo of the hole to be detected according to the following rules: the origin of coordinates is located in the lower left corner of the image, with the positive X-axis direction going horizontally to the right and the positive Y-axis direction going vertically upward.
For a pixel at any position in the photo of the hole to be measured, record its gray value; the gray values of all the pixels in the photo form the gray matrix grayA.
S202, roughly calculating hole positions and hole diameters
Initialize the feature extraction parameters of the detection network with the model parameters obtained in step S103, normalize the picture to be recognized (of any input resolution) to M × M by cubic spline interpolation, and set the recognition model to eval() evaluation mode. The output contains a background and a foreground: the foreground is the region where the connecting hole lies and the background is everything else. Mapping the recognition result back to the original picture resolution in equal proportion gives a rough hole edge point set of the connecting hole.
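The patent does not spell out how the edge point set is read off the predicted foreground mask. One plausible way, shown as an illustrative sketch (`mask_edge_points` is an assumed helper, not the patent's procedure): a foreground pixel belongs to the rough edge if any of its 4-neighbours is background.

```python
import numpy as np

def mask_edge_points(mask):
    """Rough hole-edge point set from a binary foreground mask: keep the
    foreground pixels that have at least one background 4-neighbour."""
    m = np.pad(mask.astype(bool), 1, constant_values=False)
    core = m[1:-1, 1:-1]
    # True where all four neighbours are foreground (interior pixels).
    interior = m[:-2, 1:-1] & m[2:, 1:-1] & m[1:-1, :-2] & m[1:-1, 2:]
    ys, xs = np.nonzero(core & ~interior)
    return sorted(zip(xs.tolist(), ys.tolist()))
```

Applied to the network's foreground mask, this yields (x, y) coordinates that play the role of the hole edge point set used below.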
The hole edge point set contains N edge points. An initial circle is constructed from the hole edge point set; the center coordinates and radius of the initial circle are calculated as follows:
1) the center abscissa equals the average of the abscissas of all edge points, and the center ordinate equals the average of the ordinates of all edge points;
2) the radius equals the average distance from the center to the edge points.
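The averaging above can be sketched as follows. The original radius formula is an image that did not survive extraction, so taking the radius as the mean distance from the centre to the edge points is an assumption, consistent with the average value method named in step S4:

```python
import numpy as np

def initial_circle(edge_points):
    """Initial circle from the hole edge point set: centre = mean of the
    edge-point coordinates.  The radius is taken here as the mean distance
    from the centre to the edge points (an assumption; the patent's radius
    formula is not legible in the text)."""
    pts = np.asarray(edge_points, dtype=float)
    cx, cy = pts.mean(axis=0)
    radius = float(np.hypot(pts[:, 0] - cx, pts[:, 1] - cy).mean())
    return (float(cx), float(cy)), radius
```

For edge points lying exactly on a circle this recovers that circle; for a noisy point set it gives the rough circle used as the input of step S304.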
and step S3: accurately positioning hole edge pixel point set
S301, appointing detection operators for hole edges in different directions
The Kirsch edge detection operator, proposed by R. Kirsch, consists of eight 3 × 3 matrices. In the round-hole edge detection scene, detection operators are assigned to the hole edges in different directions as follows:
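The operator definitions referenced above appear only as images in the original. The eight standard Kirsch masks are well known; the compass labelling below follows a common convention and is an assumption, since the patent's own assignment of masks to directions is in a figure not reproduced here:

```python
import numpy as np

# The eight standard 3x3 Kirsch masks (each a rotation of the first).
# Compass naming is an assumed convention, not taken from the patent figure.
KIRSCH = {
    "K_N":  [[ 5,  5,  5], [-3, 0, -3], [-3, -3, -3]],
    "K_NW": [[ 5,  5, -3], [ 5, 0, -3], [-3, -3, -3]],
    "K_W":  [[ 5, -3, -3], [ 5, 0, -3], [ 5, -3, -3]],
    "K_SW": [[-3, -3, -3], [ 5, 0, -3], [ 5,  5, -3]],
    "K_S":  [[-3, -3, -3], [-3, 0, -3], [ 5,  5,  5]],
    "K_SE": [[-3, -3, -3], [-3, 0,  5], [-3,  5,  5]],
    "K_E":  [[-3, -3,  5], [-3, 0,  5], [-3, -3,  5]],
    "K_NE": [[-3,  5,  5], [-3, 0,  5], [-3, -3, -3]],
}
KIRSCH = {name: np.array(mask) for name, mask in KIRSCH.items()}

# Every Kirsch mask sums to zero, so flat image regions give zero response.
assert all(int(m.sum()) == 0 for m in KIRSCH.values())
```

Each mask responds most strongly to an intensity edge facing its direction, which is why the patent can assign one mask per hole-edge direction.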
s302, image convolution operation
Convolve the gray matrix grayA with each of the detection operators K_E, K_W, K_N, K_S, K_NE, K_SE, K_NW, and K_SW to obtain, respectively, the east edge matrix edgeE and its east edge image, the west edge matrix edgeW and its west edge image, the south edge matrix edgeS and its south edge image, the north edge matrix edgeN and its north edge image, the northeast edge matrix edgeNE and its northeast edge image, the southeast edge matrix edgeSE and its southeast edge image, the northwest edge matrix edgeNW and its northwest edge image, and the southwest edge matrix edgeSW and its southwest edge image.
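A minimal sketch of this convolution step, using a naive 'same'-size filter with zero padding applied in the usual image-processing sense (correlation, no kernel flip); the east mask `K_E` used as the example kernel is an assumed labelling:

```python
import numpy as np

def conv2d_same(img, kernel):
    """'Same'-size 2-D filtering with zero padding, applied in the usual
    image-processing sense (correlation, no kernel flip)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img.astype(float), ((ph, ph), (pw, pw)))
    out = np.zeros(img.shape, dtype=float)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = np.sum(padded[y:y + kh, x:x + kw] * kernel)
    return out

# An assumed east-direction Kirsch mask used as the example kernel.
K_E = np.array([[-3, -3, 5], [-3, 0, 5], [-3, -3, 5]])
```

Applying `conv2d_same(grayA, K_E)` produces an edge matrix in which pixels just left of a dark-to-bright vertical transition receive a strong positive response, matching the bright east edge described for fig. 7.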
S303, dividing a search area
Connect the upper-left and lower-right corner points of the photo of the hole to be measured to form a third dashed line, and connect the upper-right and lower-left corner points to form a fourth dashed line. The third and fourth dashed lines divide the photo of the hole to be measured into four regions: the east region, west region, south region, and north region.
The hole edges of the east region include the east edge, northeast edge, and southeast edge; the hole edges of the west region include the west edge, northwest edge, and southwest edge; the hole edges of the south region include the south edge, southeast edge, and southwest edge; the hole edges of the north region include the north edge, northeast edge, and northwest edge.
S304, defining a search starting circle and an accurate edge matrix
The search start circle is defined as follows: its center coordinates equal the center coordinates of the initial circle, and its radius is smaller than the radius of the initial circle, so that the search start circle lies entirely inside the hole.
The first diameter line segment passes through the center and forms a 45-degree angle with the X axis; the second diameter line segment passes through the center and forms a -45-degree angle with the X axis. The first and second diameter line segments divide the circumference of the search start circle into four arcs: the east arc, west arc, south arc, and north arc. Each arc contains nPixel pixel points.
The exact edge matrix Edge is defined as follows: its dimension is M × M, and all of its elements equal 1.
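The arc construction of S304 can be sketched as follows, splitting sampled circle points into the four arcs cut off by the +45 and -45 degree diameter lines. This is an illustrative sketch; the sampling count `n` and the half-open angle boundaries are assumptions:

```python
import math

def start_circle_arcs(cx, cy, r, n=360):
    """Sample n points on the search start circle and assign each to one of
    the four arcs cut off by the +45 and -45 degree diameter lines."""
    arcs = {"east": [], "north": [], "west": [], "south": []}
    for k in range(n):
        theta = 2.0 * math.pi * k / n
        point = (cx + r * math.cos(theta), cy + r * math.sin(theta))
        deg = math.degrees(theta)
        if deg < 45 or deg >= 315:
            arcs["east"].append(point)
        elif deg < 135:
            arcs["north"].append(point)
        elif deg < 225:
            arcs["west"].append(point)
        else:
            arcs["south"].append(point)
    return arcs
```

The points of each arc then serve as the nPixel search starting points of steps S305 to S308.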
S305, searching the hole edges of the east region
Arrange the nPixel pixel points on the east arc in order from top to bottom to form a point set.
The hole edge search of the east region is carried out on the horizontal line through each point of this set, with that point as the search starting point. The specific steps are:
(i) define accumulation variable i as zero and accumulation variable j as zero;
(ii) take the i-th point of the point set as the current point;
(iii) if the gray value of the current point is greater than the gray threshold threshold, set the element of the exact edge matrix Edge at the current point to zero and go to step (iv); if it is less than or equal to threshold, add 1 to j, move one pixel to the right to the next point, and repeat step (iii);
(iv) if i is greater than or equal to nPixel, go to step (v); if i is less than nPixel, add 1 to i and go to step (ii);
(v) the hole edge search of the east region is finished.
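The east-region search loop above can be sketched as follows. Array (row, col) indexing stands in for the photo's lower-left coordinate system, so `Edge[y, x]` and the start-point format are assumptions about layout rather than the patent's exact convention:

```python
import numpy as np

def search_east(grayA, start_points, threshold=150):
    """East-region search: from each start point on the east arc, step one
    pixel to the right until the gray value exceeds the threshold, then
    mark that pixel in the exact edge matrix Edge by setting it to zero."""
    height, width = grayA.shape
    Edge = np.ones_like(grayA, dtype=float)   # all elements initialised to 1
    for x, y in start_points:
        while x < width and grayA[y, x] <= threshold:
            x += 1                            # move one pixel to the right
        if x < width:
            Edge[y, x] = 0.0                  # hole-edge pixel found
    return Edge
```

The west, south, and north searches of S306 to S308 are the same loop with the step direction changed to left, down, and up respectively.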
S306, searching the hole edges of the west region
Arrange the nPixel pixel points on the west arc in order from top to bottom to form a point set.
The hole edge search of the west region is carried out on the horizontal line through each point of this set, with that point as the search starting point. The specific steps are:
(i) define accumulation variable i as zero and accumulation variable j as zero;
(ii) take the i-th point of the point set as the current point;
(iii) if the gray value of the current point is greater than the gray threshold threshold, set the element of the exact edge matrix Edge at the current point to zero and go to step (iv); if it is less than or equal to threshold, add 1 to j, move one pixel to the left to the next point, and repeat step (iii);
(iv) if i is greater than or equal to nPixel, go to step (v); if i is less than nPixel, add 1 to i and go to step (ii);
(v) the hole edge search of the west region is finished.
S307, searching the hole edges of the south region
Arrange the nPixel pixel points on the south arc in order from left to right to form a point set.
The hole edge search of the south region is carried out on the vertical line through each point of this set, with that point as the search starting point. The specific steps are:
(i) define accumulation variable i as zero and accumulation variable j as zero;
(ii) take the i-th point of the point set as the current point;
(iii) if the gray value of the current point is greater than the gray threshold threshold, set the element of the exact edge matrix Edge at the current point to zero and go to step (iv); if it is less than or equal to threshold, add 1 to j, move one pixel down to the next point, and repeat step (iii);
(iv) if i is greater than or equal to nPixel, go to step (v); if i is less than nPixel, add 1 to i and go to step (ii);
(v) the hole edge search of the south region is finished.
S308, searching the hole edges of the north region
Arrange the nPixel pixel points on the north arc in order from left to right to form a point set.
The hole edge search of the north region is carried out on the vertical line through each point of this set, with that point as the search starting point. The specific steps are:
(i) define accumulation variable i as zero and accumulation variable j as zero;
(ii) take the i-th point of the point set as the current point;
(iii) if the gray value of the current point is greater than the gray threshold threshold, set the element of the exact edge matrix Edge at the current point to zero and go to step (iv); if it is less than or equal to threshold, add 1 to j, move one pixel up to the next point, and repeat step (iii);
(iv) if i is greater than or equal to nPixel, go to step (v); if i is less than nPixel, add 1 to i and go to step (ii);
(v) the hole edge search of the north region is finished.
And step S4: accurately calculating hole site and hole diameter of connecting hole
The image of the exact edge matrix Edge is a binary image containing only black and white. The black point set corresponds to the elements of Edge that equal zero, and the coordinates of those zero elements are the pixel coordinates of the hole edge. Circle fitting is applied to the black point set by the average value method, yielding a precise circle defined by its center coordinates and radius.
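The average value circle fit of step S4 can be sketched as follows (centre = mean of the black-point coordinates, radius = mean distance to them; array indexing is used for the matrix, and `precise_circle` is an illustrative name):

```python
import numpy as np

def precise_circle(Edge):
    """Average value circle fit on the black point set of the exact edge
    matrix Edge (elements equal to zero mark the hole-edge pixels)."""
    ys, xs = np.nonzero(Edge == 0)
    cx, cy = xs.mean(), ys.mean()
    radius = float(np.hypot(xs - cx, ys - cy).mean())
    return float(cx), float(cy), radius
```

The returned center gives the hole position and twice the radius gives the hole diameter, both in pixel units; a camera calibration would convert them to physical units.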
The application has the advantages that:
the application provides an image processing method capable of accurately calculating the hole position and the hole diameter of the skin, which can quickly find the accurate edge pixel point of the connecting hole from the skin picture, calculate the hole position and the hole diameter of the connecting hole and accurately detect the processing quality of the composite skin part. The method can relieve technicians from complex image processing algorithm development work, puts limited energy into the design and optimization of other important modules of the machine vision detection system, and promotes the popularization and application of non-contact detection technologies represented by machine vision in the field of aircraft manufacturing detection.
Drawings
Fig. 1 is a grayscale photograph.
Fig. 2 is a display view of white labeled circles.
Fig. 3 is a label view of fig. 1.
FIG. 4 is a photograph of a well to be tested.
Fig. 5 is a display of a set of hole edge points.
Fig. 6 is a display view of an initial circle.
Fig. 7 is an east edge image (one).
Fig. 8 is east edge image (two).
FIG. 9 is a west edge image.
Fig. 10 is a southbound edge image.
FIG. 11 is a northbound edge image.
Fig. 12 is a northeast edge image.
Fig. 13 is a southeast edge image.
FIG. 14 is a northwest edge image.
Fig. 15 is a southwestern edge image.
Fig. 16 is a search area division diagram.
Fig. 17 is a display diagram of searching for an initial circle.
Fig. 18 is a schematic diagram of hole edge search for the east region.
FIG. 19 is a schematic diagram of a hole edge search for the west region.
Fig. 20 is a schematic diagram of hole edge search for southbound regions.
Fig. 21 is a schematic view of hole edge search of the northbound region.
Fig. 22 is a hole edge search flowchart of the east region.
FIG. 23 is a hole edge search flow diagram for the west-facing region.
Fig. 24 is a hole edge search flow diagram for the southbound region.
Fig. 25 is a hole edge search flow diagram for the northbound region.
FIG. 26 is a binary map of the exact edge matrix.
FIG. 27 is a diagram of a pixel constellation at the edge of a precision hole.
Fig. 28 is a display view of a perfect circle.
Fig. 29 is a flowchart of the overall steps.
In the drawings: 1-grayscale photo; 1.1-a first connection hole; 2-white marking circles; 3-sample graph; 3.1-black canvas; 4-photo of the hole to be detected; 4.1-well edge; 4.2-second connection hole; 4.3-coordinate system; 4.4-third dotted line; 4.5-fourth dashed line; 4.6-east region; 4.7-west region; 4.8-southward region; 4.9-northbound region; 5-hole edge point set; 6-initial circle; 7-east edge image; 7.1-east edge; 8-first dotted line; 9-second dashed line; 10-west edge image; 10.1-west edge; 11-southbound edge images; 11.1-southerly edge; 12-northbound edge images; 12.1-northbound edge; 13-northeast edge image; 13.1-northeast edge; 14-southeast edge image; 14.1-southeast edge; 15-northwest edge image; 15.1-northwest edge; 16-southwesterly edge images; 16.1-southwesterly oriented edges; 17-search for initial circle; 17.1-east arc; 17.2-west circular arc; 17.3-south circular arc; 17.4-north arc; 18-a first diameter line segment; 19-a second diameter line segment; 20-a binary map of the exact Edge matrix Edge; 20.1-black set of dots; 20.2-white point set; 21-precise circle.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are for explaining the present invention and not for limiting the present invention. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
The present invention will be described in further detail with reference to the drawings and examples, but the present invention is not limited to the examples.
As shown in fig. 29, an image processing method for calculating skin hole site and hole diameter includes the following steps:
step S1: designing a connecting hole identification model by adopting a deep learning method;
s101, shooting a sample photo set;
Take 1000 grayscale photos, each containing a skin connecting hole and having a resolution of 1024 × 1024, to form the sample photo set.
S102, making a label atlas of the sample photo album;
Referring to fig. 1, 2, and 3, the method for creating a label map is described taking grayscale photo 1 as an example. Referring to fig. 1, grayscale photo 1 contains a complete first connection hole 1.1. Referring to fig. 2, a white mark circle 2 approximately coinciding with the first connection hole 1.1 is drawn on grayscale photo 1, and the coordinates of its center are recorded. Referring to fig. 3, the white mark circle 2 is copied onto a black canvas 3.1 whose resolution is still 1024 × 1024 to form the label map 3 of grayscale photo 1; the coordinates of the center of the white mark circle 2 in label map 3 remain unchanged.
For all the other photos in the grayscale photo set of step S101, label maps are created for the remaining 999 sample photos by the same method, forming the label map set.
S103, training a detection network;
Down-sample with a VGG-16 network to extract features, then up-sample by deconvolution, and finally classify every pixel in the image. The number of training iterations (epoch) is 100, the corresponding test set contains 50 grayscale photos, the input image size of the model is 1024 × 1024, and the loss function used is the cross entropy:
After 100 iterations the loss function of the model barely decreases and the whole model has converged; stopping early prevents overfitting. The model weights obtained from this training are used as the feature-recognition parameters for detecting the region where the first connection hole 1.1 lies.
Step S2: roughly calculating hole positions and hole diameters;
s201, shooting a picture of a hole to be detected;
referring to fig. 4, a photo 4 of the hole to be measured with a resolution of 1024 × 1024 is taken, and the photo 4 of the hole to be measured includes the second connection hole 4.2 and the hole edge 4.1. Establishing a coordinate system 4.3 on the picture 4 of the hole to be detected according to the following rules: the origin of coordinates is located in the lower left corner of the image, with the positive X-axis direction going horizontally to the right and the positive Y-axis direction going vertically upward.
For a pixel at any position in the photo 4 of the hole to be measured, record its gray value; the gray values of all the pixels in the photo form the gray matrix grayA:
S202, roughly calculating hole positions and hole diameters;
Initialize the feature extraction parameters of the detection network with the model parameters obtained in step S103, normalize the picture to be recognized (of any input resolution) to 1024 × 1024 by cubic spline interpolation, and set the recognition model to eval() evaluation mode. The output contains a background and a foreground: the foreground is the region where the connecting hole lies and the background is everything else. Mapping the recognition result back to the original picture resolution in equal proportion gives the rough hole edge point set 5 of the connecting hole, as shown in fig. 5.
The hole edge point set 5 contains 2392 edge points. Referring to fig. 6, an initial circle 6 is constructed from the hole edge point set 5; the center coordinates and radius of the initial circle 6 are calculated as follows:
1) the center abscissa equals the average of the abscissas of all edge points, and the center ordinate equals the average of the ordinates of all edge points;
2) the radius equals the average distance from the center to the edge points.
the initial circle 6 is already relatively close to the hole edge 4.1 of the second connection hole 4.2, and the initial circle 6 will be the input condition of step S304.
And step S3: accurately positioning a hole edge pixel point set;
s301, assigning detection operators for hole edges in different directions;
The Kirsch edge detection operator, proposed by R. Kirsch, consists of eight 3 × 3 matrices, which are:
in the round hole edge detection scene, detection operators are assigned to the hole edges in different directions as follows:
the eight detection operators described above are each sensitive only to the hole edge 4.1 in the respective direction. For example, referring to fig. 7, 8, the east operator is sensitive only to east edges 7.1, and is not sensitive to edges in other directions. The eight detection operators are properly combined in the subsequent steps, so that the hole edge 4.1 can be accurately detected.
S302, performing image convolution operation;
will gray scale matrixgrayAAre respectively connected withK_E、K_W、K_N、K_S、K_NE、K_SE、K_NW、K_SWAnd carrying out convolution operation on the detection operator to obtain 8 edge matrixes. Each edge matrix represents an edge image, and 8 edge matrices represent images as shown in fig. 7-15. Gray matrixgrayAThe correspondence relationship among the detection operator, the edge matrix, and the edge image is shown in table 1.
It should be noted that the resolutions of figs. 7 to 15 and of the photo 4 of the hole to be measured are all 1024 × 1024. To save space, the photo 4 of the hole to be measured is shown reduced from the actual photo, while figs. 7 to 15 are enlarged from the actual pictures to show the hole edges more clearly.
TABLE 1. Correspondence among the gray matrix grayA, detection operators, edge matrices, and edge images
Referring to FIG. 7, the gray matrix grayA is convolved with the east operator K_E to obtain the east edge matrix edgeE, whose image is the east edge image 7. Referring to fig. 8, to show the edge detection effect more intuitively, the center point and upper-right corner point of the east edge image 7 are connected to form a first dashed line 8, and the center point and lower-right corner point are connected to form a second dashed line 9. It is evident that the east edge 7.1 between the first dashed line 8 and the second dashed line 9 is the most pronounced of the hole edges in all directions, showing a segment of edge curve with higher brightness.
Similar to fig. 8, fig. 9 to 15 each add two dotted lines to the edge image to more intuitively show the edge detection effect:
referring to fig. 9, visible in the west-oriented edge image 10 are: the west edge 10.1 is most pronounced compared to other orientations of the aperture edge, and exhibits a segment of the edge curve with a higher brightness.
Referring to fig. 10, visible in the southbound edge image 11 are: the south edge 11.1 is most pronounced compared to other orientations of the aperture edge, and exhibits a section of the edge curve with a higher brightness.
Referring to fig. 11, visible in the northbound edge image 12 are: the north edge 12.1 is most pronounced compared to other orientations of the aperture edge, and exhibits a section of the edge curve with a higher brightness.
Referring to fig. 12, it can be seen in the northeast edge image 13: the northeast edge 13.1 is most pronounced compared to other orientations of the aperture edge, and appears as a section of the edge curve with higher brightness.
Referring to fig. 13, visible in the southeast edge image 14 are: southeast edge 14.1 is most pronounced compared to other orientations of the aperture edge, and exhibits a section of the edge curve with higher brightness.
Referring to fig. 14, visible in the northwest edge image 15 are: the northwest edge 15.1 is most pronounced compared to the other hole edges, appearing as a segment of the edge curve with higher brightness.
Referring to fig. 15, visible in southwest edge image 16 is: southwest edge 16.1 is most pronounced compared to other orientations of the aperture edge, appearing as a segment of the edge curve with higher brightness.
The east edge 7.1, the west edge 10.1, the south edge 11.1, the north edge 12.1, the north-east edge 13.1, the south-east edge 14.1, the north-west edge 15.1, the south-west edge 16.1 have a higher brightness, indicating that the gray value of the pixels located on these edges is higher.
In the next step, the precise position of the hole edge 4.1 can be obtained by dividing the photo 4 of the hole to be measured into edge search regions and then searching for edge pixels from the inside of the connecting hole outward. Referring to fig. 7 to 15, the gray values inside the connecting hole are much smaller than those on the hole edge, so a gray threshold threshold = 150 is set: when the gray value of a pixel exceeds 150, that pixel has reached the hole edge 4.1.
S303, dividing a search area;
referring to fig. 16, the upper left corner point and the lower right corner point of the hole-to-be-detected picture 4 are connected to form a third dotted line 4.4, the upper right corner point and the lower left corner point of the hole-to-be-detected picture 4 are connected to form a fourth dotted line 4.5, and the third dotted line 4.4 and the fourth dotted line 4.5 divide the picture into four regions: east 4.6, west 4.7, south 4.8, north 4.9.
Referring to figs. 8, 12, 13 and 16, the hole edges of the east region 4.6 include the east edge 7.1, the northeast edge 13.1 and the southeast edge 14.1.
Referring to figs. 9, 14, 15 and 16, the hole edges of the west region 4.7 include the west edge 10.1, the northwest edge 15.1 and the southwest edge 16.1.
Referring to figs. 10, 13, 15 and 16, the hole edges of the south region 4.8 include the south edge 11.1, the southeast edge 14.1 and the southwest edge 16.1.
Referring to figs. 11, 12, 14 and 16, the hole edges of the north region 4.9 include the north edge 12.1, the northeast edge 13.1 and the northwest edge 15.1.
S304, defining a search start circle and an accurate edge matrix;
Step S202 obtained the initial circle 6, whose center coordinates are (501, 496) and whose radius is 389; step S303 described the hole-edge distribution of the east 4.6, west 4.7, south 4.8 and north 4.9 regions. Therefore, searching outward from inside the initial circle 6 yields the exact pixel points of the hole edge 4.1.
Referring to figs. 17 to 21, the search start circle 17 is first defined as follows: (1) its center coordinate is equal to the center coordinate (501, 496) of the initial circle 6; (2) its radius is smaller than the radius of the initial circle 6, so that the search start circle 17 lies completely within the region enclosed by the hole edge 4.1.
The first diameter line segment 18 passes through the center coordinate and forms an angle of 45 degrees with the X axis; the second diameter line segment 19 passes through the center coordinate and forms an angle of -45 degrees with the X axis. The first diameter line segment 18 and the second diameter line segment 19 divide the circumference of the search start circle 17 into four circular arcs: the east arc 17.1, the west arc 17.2, the south arc 17.3 and the north arc 17.4. The four arcs contain an equal number of pixel points, denoted nPixel.
The accurate edge matrix Edge is defined as follows: (1) its dimensions are 1024 × 1024; (2) all of its elements are equal to 1.
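A minimal numpy sketch of these definitions. The start radius of 195 and the angular sampling density are illustrative assumptions; the patent only requires the search start circle to lie entirely inside the hole edge:

```python
import numpy as np

M = 1024
cx, cy = 501, 496   # center of the initial circle from step S202
r_start = 195       # assumed radius, well inside the 389-pixel hole radius

# Accurate edge matrix Edge: M x M, all elements equal to 1;
# elements found to lie on the hole edge are later reset to zero.
Edge = np.ones((M, M), dtype=np.uint8)

# Sample the search start circle; the +/-45-degree diameter lines split
# the angles into the east, west, south and north arcs.
theta = np.linspace(0.0, 2.0 * np.pi, 2080, endpoint=False)
xs = np.rint(cx + r_start * np.cos(theta)).astype(int)
ys = np.rint(cy + r_start * np.sin(theta)).astype(int)
east_arc = [(x, y) for x, y, t in zip(xs, ys, theta)
            if t < np.pi / 4 or t > 7 * np.pi / 4]
```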
S305, hole edge search of the east region;
The 520 pixel points on the east arc 17.1 (nPixel = 520) are arranged from top to bottom in sequence to form a point set.
Referring to figs. 18 and 22, the hole edge search process of the east region is illustrated taking one horizontal line as an example:
(i) define an accumulation variable j with initial value zero; the search starts at a point of the east arc 17.1, and the maximum among the gray values of the east, northeast and southeast edge images at this point is recorded;
(ii) if the recorded maximum is greater than the gray threshold 150, go to step (iv); if it is less than or equal to 150, add 1 to the accumulation variable j, move one pixel to the right to the next point, and go to step (iii);
(iii) record again the maximum among the gray values of the three edge images at the new point, and return to step (ii);
(iv) a recorded maximum greater than the gray threshold 150 indicates that the current search point has reached a pixel point of the east edge 7.1. This search is carried out on the horizontal line passing through each point of the east arc; the remaining 519 horizontal lines complete the hole edge search of the east region by repeating steps (i) to (iv).
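Steps (i) to (iv) amount to a rightward scan along one horizontal line. The following sketch assumes numpy arrays for the three directional edge images and an (x = column, y = row) indexing convention; the function name is illustrative:

```python
import numpy as np

def search_east(edge_e, edge_ne, edge_se, start, threshold=150):
    """Scan rightward from `start` = (x, y) until the strongest of the three
    directional edge responses exceeds `threshold`. Returns the edge pixel
    (x, y), or None if the image border is reached first."""
    x, y = start
    width = edge_e.shape[1]
    j = 0  # accumulated number of steps (the patent's counter j)
    while x < width:
        g = max(edge_e[y, x], edge_ne[y, x], edge_se[y, x])
        if g > threshold:
            return (x, y)  # reached a pixel of the east edge
        x += 1             # move one pixel to the right
        j += 1
    return None
```

The remaining horizontal lines repeat the same scan, one from each point of the east arc.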
S306, hole edge search of the west region;
The 520 pixel points on the west arc 17.2 are arranged from top to bottom in sequence to form a point set.
Referring to figs. 19 and 23, the hole edge search process of the west region is illustrated taking one horizontal line as an example:
(i) define an accumulation variable j with initial value zero; the search starts at a point of the west arc 17.2, and the maximum among the gray values of the west, northwest and southwest edge images at this point is recorded;
(ii) if the recorded maximum is greater than the gray threshold 150, go to step (iv); if it is less than or equal to 150, add 1 to the accumulation variable j, move one pixel to the left to the next point, and go to step (iii);
(iii) record again the maximum among the gray values of the three edge images at the new point, and return to step (ii);
(iv) a recorded maximum greater than the gray threshold 150 indicates that the current search point has reached a pixel point of the west edge 10.1. This search is carried out on the horizontal line passing through each point of the west arc; the remaining 519 horizontal lines complete the hole edge search of the west region by repeating steps (i) to (iv).
S307, hole edge search of the south region;
The 520 pixel points on the south arc 17.3 are arranged from left to right in sequence to form a point set.
Referring to figs. 20 and 24, the hole edge search process of the south region is illustrated taking one vertical line as an example:
(i) define an accumulation variable j with initial value zero; the search starts at a point of the south arc 17.3, and the maximum among the gray values of the south, southeast and southwest edge images at this point is recorded;
(ii) if the recorded maximum is greater than the gray threshold 150, go to step (iv); if it is less than or equal to 150, add 1 to the accumulation variable j, move one pixel down to the next point, and go to step (iii);
(iii) record again the maximum among the gray values of the three edge images at the new point, and return to step (ii);
(iv) a recorded maximum greater than the gray threshold 150 indicates that the current search point has reached a pixel point of the south edge 11.1. This search is carried out on the vertical line passing through each point of the south arc; the remaining 519 vertical lines complete the hole edge search of the south region by repeating steps (i) to (iv).
S308, hole edge search of the north region;
The 520 pixel points on the north arc 17.4 are arranged from left to right in sequence to form a point set.
Referring to figs. 21 and 25, the hole edge search process of the north region is illustrated taking one vertical line as an example:
(i) define an accumulation variable j with initial value zero; the search starts at a point of the north arc 17.4, and the maximum among the gray values of the north, northeast and northwest edge images at this point is recorded;
(ii) if the recorded maximum is greater than the gray threshold 150, go to step (iv); if it is less than or equal to 150, add 1 to the accumulation variable j, move one pixel up to the next point, and go to step (iii);
(iii) record again the maximum among the gray values of the three edge images at the new point, and return to step (ii);
(iv) a recorded maximum greater than the gray threshold 150 indicates that the current search point has reached a pixel point of the north edge 12.1. This search is carried out on the vertical line passing through each point of the north arc; the remaining 519 vertical lines complete the hole edge search of the north region by repeating steps (i) to (iv).
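The four region searches of steps S305 to S308 differ only in their start points, their step direction, and the three edge images consulted, so they can share one routine. The sketch below is illustrative; the function name and the y-grows-downward convention are assumptions:

```python
import numpy as np

def search_region(edge_maps, start, step, threshold=150):
    """One scan line of the hole-edge search shared by S305-S308.
    edge_maps: the three directional edge images relevant to the region
    (e.g. east, northeast and southeast for the east region);
    step: (1, 0) east, (-1, 0) west, (0, 1) south, (0, -1) north,
    with y growing downward. Returns the first point whose strongest
    edge response exceeds `threshold`, or None at the image border."""
    x, y = start
    dx, dy = step
    h, w = edge_maps[0].shape
    while 0 <= x < w and 0 <= y < h:
        if max(m[y, x] for m in edge_maps) > threshold:
            return (x, y)
        x, y = x + dx, y + dy
    return None
```

Calling it with step (1, 0) and the east-side edge images reproduces the east search; (-1, 0) the west search; (0, 1) the south search; (0, -1) the north search.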
And step S4: accurately calculating the hole position and the hole diameter of the connecting hole;
Through steps S304 to S308, some elements of the accurate edge matrix Edge that were equal to 1 are reassigned to zero; the coordinates of these zero elements in Edge are the coordinates, in the hole-to-be-detected picture 4, of the pixel points on the hole edge 4.1. Referring to fig. 26, the image of the accurate edge matrix Edge is a binary image 20 containing only black and white, in which the black point set 20.1 corresponds to the elements of Edge equal to zero. Referring to fig. 27, to show the search result of the hole edge 4.1 more intuitively, the black point set 20.1 is recolored white to form the white point set 20.2, which is then overlaid on the picture 4 of the hole to be detected; the white point set 20.2 fits the hole edge 4.1 closely, indicating that the white point set 20.2 consists of actual pixel points of the hole edge 4.1.
Referring to fig. 28, the black point set 20.1 in fig. 26 is fitted to a circle using the average method, giving a precise circle 21 with center coordinates (530, 531) and radius 372. The precise circle 21 fits the hole edge 4.1 closely; therefore the hole position coordinates of the second connecting hole 4.2 are (530, 531), and the radius of the second connecting hole 4.2 is 372.
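The average-method fit can be sketched as follows, assuming numpy; the synthetic sample points stand in for the black point set 20.1 and are illustrative only:

```python
import numpy as np

def fit_circle_average(points):
    """Average-method circle fit: center = mean of the edge points,
    radius = mean distance from the points to that center."""
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)
    radius = np.linalg.norm(pts - center, axis=1).mean()
    return center, radius

# Points sampled on a circle of radius 372 around (530, 531)
# recover that circle exactly.
t = np.linspace(0, 2 * np.pi, 360, endpoint=False)
pts = np.stack([530 + 372 * np.cos(t), 531 + 372 * np.sin(t)], axis=1)
center, radius = fit_circle_average(pts)
```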
Claims (11)
1. An image processing method for calculating skin hole site and hole diameter is characterized in that: the method comprises the following steps:
step S1: designing a connecting hole identification model by adopting a deep learning method: shooting a sample photo set, making a label photo set of the sample photo set, and training a detection network;
step S2: roughly calculating the hole position and hole diameter: shooting a picture of the hole to be detected, and roughly calculating the hole position and hole diameter;
and step S3: accurately positioning a hole edge pixel point set: assigning detection operators, performing image convolution operation and dividing search areas for the hole edges in different orientations; defining a search start circle and an accurate edge matrix, searching hole edges of the east area, the west area, the south area, the north area, the northeast area, the southeast area, the northwest area and the southwest area;
and step S4: and accurately calculating the hole position and the hole diameter of the connecting hole.
2. The image processing method for calculating skin hole site and hole diameter according to claim 1, wherein:
the step S1 specifically comprises the following steps:
S101, shooting a sample photo set;
shooting a plurality of gray photos which contain skin connecting holes and have the size of M multiplied by M to form a sample photo set;
S102, making a label atlas of the sample photo set;
covering a white mark circle (2) which is overlapped with the first connecting hole (1.1) on a black canvas (3.1) with the size of M multiplied by M to form a label graph (3) of the gray-scale photo (1), and for all other photos of the sample photo set in the step S101, making a label graph for each gray-scale photo by adopting the same method to form a label graph set;
S103, training the detection network;
adopting a VGG-16 network for down-sampling to extract features, then up-sampling by deconvolution, and finally classifying all pixel points of the image; the number of training iterations epoch is 100, the corresponding test set is 50 gray-scale photos, the input picture size of the model is M × M, and the loss function used is the cross entropy L = -[y·log(ŷ) + (1 - y)·log(1 - ŷ)], where L represents the loss, y represents the true value, and ŷ represents the predicted value;
after 100 iterations, the loss function of the model decreases only slightly and the whole model has converged; stopping early prevents over-fitting, and the weights of the trained model are used as feature recognition parameters for detecting the region where the first connecting hole (1.1) is located.
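For the two-class (background/foreground) output described above, the per-pixel cross entropy can be sketched as follows, assuming numpy; the epsilon clamp is a standard numerical guard and not part of the patent:

```python
import numpy as np

def cross_entropy(y, y_hat, eps=1e-7):
    """Pixel-wise binary cross entropy between the label map y
    (0 = background, 1 = hole foreground) and the predicted
    probability map y_hat, averaged over all pixels."""
    y_hat = np.clip(y_hat, eps, 1.0 - eps)  # avoid log(0)
    return float(np.mean(-(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))))
```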
3. The image processing method for calculating skin hole site and hole diameter according to claim 2, characterized in that:
the step S2 specifically comprises the following steps:
S201, shooting a picture of the hole to be detected;
a picture (4) of the hole to be detected, containing the second connecting hole (4.2) and the hole edge (4.1), is shot, and the gray values of all pixel points of the picture (4) form a gray matrix grayA;
S202, roughly calculating hole positions and hole diameters;
initializing feature extraction parameters of the detection network by using the model parameters obtained in the step S103, normalizing the picture to be recognized with any input resolution to the size of M multiplied by M by adopting a cubic spline interpolation mode, and setting the recognition model as an eval () evaluation mode; the final output result comprises a background and a foreground, the foreground is the area where the connecting hole is located, the background is the other areas except the connecting hole, and the hole edge point set (5) of the connecting hole can be roughly obtained by mapping the identified result to the original picture resolution in equal proportion;
the hole edge point set (5) has N edge points, the coordinates of the i-th edge point being (x_i, y_i); the hole edge point set (5) constructs an initial circle (6), whose center coordinates (x_c, y_c) and radius r are calculated as follows:
1) x_c is equal to the average of the abscissas x_i of all edge points, and y_c is equal to the average of the ordinates y_i of all edge points; 2) r is equal to the average distance from the edge points to the center;
the formulas are: x_c = (1/N)·Σ x_i, y_c = (1/N)·Σ y_i, r = (1/N)·Σ √((x_i - x_c)² + (y_i - y_c)²);
4. the image processing method for calculating skin hole sites and hole diameters according to claim 3, wherein:
the step S3 specifically includes:
S301, assigning detection operators to hole edges in different orientations;
assigning detection operators to the hole edges in different orientations as follows:
S302, performing the image convolution operation;
the gray matrix grayA is convolved with the detection operators K_E, K_W, K_N, K_S, K_NE, K_SE, K_NW and K_SW respectively, giving the east edge matrix edgeE and its east edge image (7), the west edge matrix edgeW and its west edge image (10), the south edge matrix edgeS and its south edge image (11), the north edge matrix edgeN and its north edge image (12), the northeast edge matrix edgeNE and its northeast edge image (13), the southeast edge matrix edgeSE and its southeast edge image (14), the northwest edge matrix edgeNW and its northwest edge image (15), and the southwest edge matrix edgeSW and its southwest edge image (16);
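The coefficients of the patent's detection operators are defined in step S301 and are not reproduced here, so the Prewitt-style compass kernels below are stand-in assumptions; the sketch only illustrates how convolving grayA with a directional kernel emphasizes one orientation of the hole edge:

```python
import numpy as np

def conv2(img, k):
    """'Valid' 2-D correlation of img with a 3x3 kernel k (no padding)."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for r in range(h - 2):
        for c in range(w - 2):
            out[r, c] = np.sum(img[r:r + 3, c:c + 3] * k)
    return out

# Stand-in Prewitt-style compass kernels (the patent's actual operators
# K_E, K_S, K_SE, ... are assumed, not reproduced here).
K_E = np.array([[-1, 0, 1],
                [-1, 0, 1],
                [-1, 0, 1]])
K_S = K_E.T                  # rows grow downward, so the transpose points south
K_SE = np.array([[-1, -1, 0],
                 [-1,  0, 1],
                 [ 0,  1, 1]])

grayA = np.zeros((8, 8))
grayA[:, 4:] = 255           # dark hole interior on the left, bright edge right
edgeE = conv2(grayA, K_E)    # strongest response where brightness rises eastward
```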
S303, dividing a search area;
connecting an upper left corner point and a lower right corner point of the to-be-detected hole picture (4) to form a third dotted line (4.4), and connecting an upper right corner point and a lower left corner point of the to-be-detected hole picture (4) to form a fourth dotted line (4.5); the third dotted line (4.4) and the fourth dotted line (4.5) divide the picture of the hole to be measured (4) into four regions: an east region (4.6), a west region (4.7), a south region (4.8) and a north region (4.9);
S304, defining a search start circle and an accurate edge matrix;
the search start circle (17) is defined as follows: its center coordinate is equal to the center coordinate of the initial circle (6), and its radius is smaller than the radius of the initial circle (6);
the first diameter line segment (18) passes through the center coordinate and forms an angle of 45 degrees with the X axis, and the second diameter line segment (19) passes through the center coordinate and forms an angle of -45 degrees with the X axis; the first diameter line segment (18) and the second diameter line segment (19) divide the circumference of the search start circle (17) into four circular arcs: an east arc (17.1), a west arc (17.2), a south arc (17.3) and a north arc (17.4); the four arcs contain an equal number of pixel points, denoted nPixel;
the accurate edge matrix Edge is defined as follows: its dimensions are M × M, and all of its elements are equal to 1;
S305, searching for hole edges of the east region;
the nPixel pixel points on the east arc (17.1) are arranged from top to bottom in sequence to form a point set;
the hole edge search of the east region (4.6) is carried out on the horizontal line passing through each of these points, each search starting from the corresponding point of the east arc;
S306, searching hole edges of the west-direction area;
the nPixel pixel points on the west arc (17.2) are arranged from top to bottom in sequence to form a point set;
the hole edge search of the west region (4.7) is carried out on the horizontal line passing through each of these points, each search starting from the corresponding point of the west arc;
S307, searching hole edges of the southward region;
the nPixel pixel points on the south arc (17.3) are arranged from left to right in sequence to form a point set;
the hole edge search of the south region (4.8) is carried out on the vertical line passing through each of these points, each search starting from the corresponding point of the south arc;
S308, searching hole edges of the northbound region;
the nPixel pixel points on the north arc (17.4) are arranged from left to right in sequence to form a point set; the hole edge search of the north region (4.9) is carried out on the vertical line passing through each of these points, each search starting from the corresponding point of the north arc.
5. The image processing method for calculating skin hole site and hole diameter according to claim 4, wherein:
the step S4 specifically comprises the following steps:
in the image of the accurate edge matrix Edge, a binary image (20), the black point set (20.1) consists of the elements of Edge whose value equals zero; circle fitting is applied to the black point set (20.1) by the average method to obtain the center coordinates and radius of a circle; the hole position coordinates of the second connecting hole (4.2) are these center coordinates, and the radius of the second connecting hole (4.2) is this radius.
7. An image processing method for calculating skin hole sites and apertures according to claim 3, characterized in that: in step S201: establishing a coordinate system (4.3) on the picture (4) of the hole to be detected according to the following rules: the origin of coordinates is located in the lower left corner of the image, with the positive X-axis direction going horizontally to the right and the positive Y-axis direction going vertically upwards.
8. The image processing method for calculating skin hole site and hole diameter according to claim 4, wherein: in step S305:
(i) define an accumulation variable i with initial value zero, and an accumulation variable j with initial value zero;
(iii) if the recorded gray value is greater than the gray threshold, set the corresponding element of the accurate edge matrix Edge equal to zero and go to step (v); if it is less than or equal to the gray threshold, add 1 to the accumulation variable j, move one pixel to the right to the next point, and go to step (iv);
(v) if i is greater than or equal to nPixel, go to step (vi); if i is less than nPixel, add 1 to the accumulation variable i and go to step (ii);
(vi) the hole edge search of the east region (4.6) is finished.
9. The image processing method for calculating skin hole site and hole diameter according to claim 4, wherein the step S306 comprises the following specific contents:
(i) define an accumulation variable i with initial value zero, and an accumulation variable j with initial value zero;
(iii) if the recorded gray value is greater than the gray threshold, set the corresponding element of the accurate edge matrix Edge equal to zero and go to step (v); if it is less than or equal to the gray threshold, add 1 to the accumulation variable j, move one pixel to the left to the next point, and go to step (iv);
(v) if i is greater than or equal to nPixel, go to step (vi); if i is less than nPixel, add 1 to the accumulation variable i and go to step (ii);
(vi) the hole edge search of the west region (4.7) is finished.
10. The image processing method for calculating skin hole site and hole diameter according to claim 4, wherein step S307 comprises the following specific contents:
(i) define an accumulation variable i with initial value zero, and an accumulation variable j with initial value zero;
(iii) if the recorded gray value is greater than the gray threshold, set the corresponding element of the accurate edge matrix Edge equal to zero and go to step (v); if it is less than or equal to the gray threshold, add 1 to the accumulation variable j, move one pixel down to the next point, and go to step (iv);
(v) if i is greater than or equal to nPixel, go to step (vi); if i is less than nPixel, add 1 to the accumulation variable i and go to step (ii);
(vi) the hole edge search of the south region (4.8) is finished.
11. The image processing method for calculating skin hole site and hole diameter according to claim 4, wherein step S308 comprises the following specific contents:
(i) define an accumulation variable i with initial value zero, and an accumulation variable j with initial value zero;
(iii) if the recorded gray value is greater than the gray threshold, set the corresponding element of the accurate edge matrix Edge equal to zero and go to step (v); if it is less than or equal to the gray threshold, add 1 to the accumulation variable j, move one pixel up to the next point, and go to step (iv);
(v) if i is greater than or equal to nPixel, go to step (vi); if i is less than nPixel, add 1 to the accumulation variable i and go to step (ii);
(vi) the hole edge search of the north region (4.9) is finished.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210878722.1A CN115423746B (en) | 2022-07-25 | 2022-07-25 | Image processing method for calculating skin hole site and aperture |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115423746A true CN115423746A (en) | 2022-12-02 |
CN115423746B CN115423746B (en) | 2023-10-10 |
Family
ID=84196232
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210878722.1A Active CN115423746B (en) | 2022-07-25 | 2022-07-25 | Image processing method for calculating skin hole site and aperture |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115423746B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115230191A (en) * | 2022-07-25 | 2022-10-25 | 成都飞机工业(集团)有限责任公司 | Forming method of stealth box section part |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104359404A (en) * | 2014-11-24 | 2015-02-18 | 南京航空航天大学 | Quick visual detection method for plenty of guide holes of small sizes in airplane parts |
CN110906875A (en) * | 2019-11-26 | 2020-03-24 | 湖北工业大学 | Visual processing method for aperture measurement |
CN113420363A (en) * | 2021-08-25 | 2021-09-21 | 成都飞机工业(集团)有限责任公司 | Method for predicting matching of skin skeleton of aircraft component |
CN114193231A (en) * | 2022-02-16 | 2022-03-18 | 成都飞机工业(集团)有限责任公司 | Bottom hole orifice measuring method for numerical control countersink |
CN114219802A (en) * | 2022-02-21 | 2022-03-22 | 成都飞机工业(集团)有限责任公司 | Skin connecting hole position detection method based on image processing |
CN114346759A (en) * | 2022-03-10 | 2022-04-15 | 成都飞机工业(集团)有限责任公司 | Device for hole online detection and hole finish machining and machining method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||