CN117109471A - Part contour detection and comparison method - Google Patents

Part contour detection and comparison method

Info

Publication number
CN117109471A
Authority
CN
China
Prior art keywords
point set
point
contour
image
pair
Prior art date
Legal status
Pending
Application number
CN202310937992.XA
Other languages
Chinese (zh)
Inventor
应灿
杨芳臣
饶俊
罗浩良
Current Assignee
Lotes Co Ltd
Original Assignee
Lotes Co Ltd
Priority date
Filing date
Publication date
Application filed by Lotes Co Ltd filed Critical Lotes Co Ltd
Priority to CN202310937992.XA priority Critical patent/CN117109471A/en
Publication of CN117109471A publication Critical patent/CN117109471A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4038Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/16Image acquisition using multiple overlapping images; Image stitching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/32Indexing scheme for image data processing or generation, in general involving image mosaicing

Abstract

The invention provides a part contour detection and comparison method, comprising: providing a first part together with either a mating second part or a corresponding reference pattern, and acquiring images of the contours to be measured on the parts or of the reference pattern; acquiring a first point set of the first contour to be measured of the first part in the corresponding image, and acquiring a second point set of the second contour to be measured of the second part or of the reference pattern; matching the first point set with the second point set and overlaying them according to the matching result; determining, for each point in the first point set, the minimum-distance point in the second point set to obtain a set of point pairs; calculating the distance of each point pair in the set; judging whether the distance of each point pair lies within a predetermined tolerance range; and marking relevant information on the image according to the judgment result. The detection and comparison results are displayed visually in a single image, so that a technician can read them intuitively.

Description

Part contour detection and comparison method
[ field of technology ]
The invention relates to a part contour detection and comparison method.
[ background art ]
In the manufacturing industry, it is often necessary to measure the dimensions of a part contour, whether the part is being produced or already in use, and to determine whether the measured part satisfies a predetermined dimensional tolerance so as to verify its manufacturing accuracy. The traditional approach is manual measurement with hand-operated instruments, whose efficiency and precision are unstable. For irregularly shaped parts, fully verifying the manufacturing accuracy requires measuring every dimension of the profile, which is inefficient and impractical to do by hand. For different parts that are meant to mate with each other, it is generally only possible to check separately whether each part meets its own tolerance; a single measurement cannot directly show whether the parts under test will actually fit together smoothly.
[ invention ]
In view of the above background, the invention provides two part contour detection and comparison methods. The first detects and compares a first part and a second part that mate with each other: a single measurement determines whether the first contour to be measured on the first part, the second contour to be measured on the second part, and the gap between the corresponding contours all fall within predetermined tolerance ranges, and the results are displayed visually in one image, so that a technician can read the result intuitively and judge whether the first part and the second part can be fitted together smoothly. The second detects and compares a single part against its corresponding reference pattern: a single measurement determines whether the part under test conforms to the predetermined tolerance range relative to the reference pattern, and the results are likewise displayed visually in one image for intuitive reading.
In order to achieve the above purpose, the invention adopts the following technical means:
a method of part contour detection and comparison, comprising: providing a first part and a second part which are matched with each other, and acquiring a first image of a first contour to be detected on the first part and a second image of a second contour to be detected on the second part; step two, a first point set of a first contour to be detected is obtained in a first image, and a second point set of a second contour to be detected is obtained in a second image; step three, matching the first point set and the second point set, and overlapping and displaying the first point set and the second point set in the second image according to a matching result; determining the minimum distance point in the second point set corresponding to each point in the first point set to obtain a point pair set; step five, calculating the distance between each point pair in the point pair set; step six, judging whether the distance between each point pair in the point pair set is within a preset tolerance range; and step seven, marking relevant information on the second image according to the judging result.
Further, in step one, a plurality of partial images of a single first contour to be measured and/or a single second contour to be measured are acquired and then stitched together, so as to obtain a complete image containing the whole first contour to be measured and/or the whole second contour to be measured.
Further, when a single first part has a plurality of first contours to be measured, or a single second part has a plurality of second contours to be measured, or a plurality of first parts and/or second parts need to be measured at the same time, a plurality of partial images are acquired in step one, each containing one complete first contour to be measured or one complete second contour to be measured, and the partial images are then stitched together, so as to obtain a complete image containing a plurality of first contours to be measured or a plurality of second contours to be measured.
Further, before the first point set and the second point set are matched, a first coordinate system of the first image and a second coordinate system of the second image are established; in step three, the coordinate data of the first point set in the first coordinate system and the coordinate data of the second point set in the second coordinate system are acquired, and a coordinate transformation and overlay is then performed to match the first point set with the second point set.
Further, in step three, the first point set and the second point set are first coarsely matched by a minimum circumscribed rectangle method or an inertia ellipse method, and then finely matched by a least-squares method, so as to match the first point set with the second point set.
Further, in step seven, the relevant information includes the positions of the point pairs that deviate from the tolerance range, the distances of those point pairs, and whether each distance exceeds the upper limit or falls below the lower limit of the tolerance range.
Further, in step five, the pixel distance of each point pair is calculated and converted from pixel units to millimetres by multiplying by the pixel equivalent of the second image, so as to obtain the actual distance of each point pair.
This technical solution has the following technical effects: a single measurement determines whether the first contour to be measured on the first part, the second contour to be measured on the second part, and the gap between the corresponding contours fall within the predetermined tolerance ranges, and the results are displayed visually in one image, so that a technician can read the detection and comparison result intuitively and judge whether the first part and the second part can be fitted together smoothly. In addition, the detection and comparison process is automatic and covers the whole first point set forming the first contour to be measured and the whole second point set forming the second contour to be measured, so its efficiency and precision are higher and more stable than manual measurement.
A part contour detection and comparison method, comprising: step one, providing a single part, and acquiring an image of a contour to be measured on the part and a reference pattern corresponding to the contour to be measured; step two, acquiring a first point set of the contour to be measured in the image, and a second point set of the reference pattern; step three, matching the first point set with the second point set, and overlaying and displaying the first point set and the second point set in the reference pattern according to the matching result; step four, determining, for each point in the first point set, the minimum-distance point in the second point set, to obtain a set of point pairs; step five, calculating the distance of each point pair in the set; step six, judging whether the distance of each point pair lies within a predetermined tolerance range; and step seven, marking relevant information on the image according to the judgment result.
Further, in step one, the reference pattern is obtained from an image of a standard part sample or from the CAD drawing used to machine the part.
Further, in step one, a plurality of partial images of a single contour to be measured are acquired and then stitched together, so as to obtain a complete image containing the whole contour to be measured.
Further, when a single part has a plurality of contours to be measured, or a plurality of parts need to be measured at the same time, a plurality of partial images of the single part or of the plurality of parts are acquired in step one, each containing one complete contour to be measured, and the partial images are then stitched together, so as to obtain a complete image containing a plurality of contours to be measured.
Further, before the first point set and the second point set are matched, a first coordinate system of the image and a second coordinate system of the reference pattern are established; in step three, the coordinate data of the first point set in the first coordinate system and the coordinate data of the second point set in the second coordinate system are acquired, and a coordinate transformation and overlay is then performed to match the first point set with the second point set.
Further, in step three, the first point set and the second point set are first coarsely matched by a minimum circumscribed rectangle method or an inertia ellipse method, and then finely matched by a least-squares method, so as to match the first point set with the second point set.
Further, in step seven, the relevant information includes the positions of the point pairs that deviate from the tolerance range, the distances of those point pairs, and whether each distance exceeds the upper limit or falls below the lower limit of the tolerance range.
This technical solution has the following technical effects: a single measurement determines whether the single part under test conforms to the predetermined tolerance range relative to the reference pattern, and the results are displayed visually in one image, so that a technician can read the detection and comparison result intuitively. In addition, the detection and comparison process is automatic and covers the whole first point set forming the contour to be measured and the whole second point set forming the reference pattern, so its efficiency and precision are higher and more stable than manual measurement.
[ description of the drawings ]
FIG. 1 is a schematic perspective view of a first part and a second part mated to each other to be inspected;
FIG. 2 is a flow chart of the method of the present invention for detecting and comparing the first and second parts that are matched with each other;
FIG. 3 is a schematic diagram showing the result of the detection and comparison method in FIG. 2;
FIG. 4 is a flow chart of the method for detecting and comparing a single part with a reference pattern according to the present invention;
FIG. 5 is a schematic perspective view of a part to be inspected;
FIG. 6 is a schematic diagram showing the result of the detection and comparison method in FIG. 4;
FIG. 7 is an enlarged view of region L in FIG. 6.
Description of reference numerals:
FIGS. 1 to 3:
First part 1; first contour to be measured 11; first point set 12
Second part 2; second contour to be measured 21; second point set 22
FIGS. 5 to 7:
Part 3; contour to be measured 31; first point set 32
Reference pattern 4; second point set 41
[ detailed description of the invention ]
In order that the objects, structure, features and effects of the invention may be better understood, the invention is further described below with reference to the drawings and specific embodiments.
The part contour detection and comparison method of the invention can be used to detect whether the mating contours to be measured on different parts deviate from a predetermined tolerance range, and to give the specific positions and values of any deviation; it can also be used to detect whether the contour to be measured on a single part deviates from the tolerance range defined relative to a reference pattern, again giving the specific positions and values of the deviation. The term "contour" or "edge" as used herein is not limited to the outer edge line of a part; it refers to any visually identifiable line of interest (region of interest) on the part.
Referring to FIG. 1, the first situation is that one part is to be fitted into an opening in another part (for example, male and female dies of a stamping die that mate with each other). For convenience of the following description, the part fitted into the opening is taken as the first part 1, the lower edge of the first part 1 is the first contour to be measured 11, the part provided with the opening is taken as the second part 2, and the upper edge of the opening is the second contour to be measured 21; the invention, however, is not limited thereto.
Referring to FIG. 2, for the first situation the part contour detection and comparison method provided by the invention comprises: step one 100, providing a first part 1 and a second part 2 that mate with each other, and acquiring a first image of the first contour to be measured 11 on the first part 1 and a second image of the second contour to be measured 21 on the second part 2; step two 200, acquiring a first point set 12 of the first contour to be measured 11 in the first image, and a second point set 22 of the second contour to be measured 21 in the second image; step three 300, matching the first point set 12 with the second point set 22, and overlaying and displaying the first point set 12 and the second point set 22 in the second image according to the matching result; step four 400, determining, for each point in the first point set 12, the minimum-distance point in the second point set 22, to obtain a set of point pairs; step five 500, calculating the distance of each point pair in the set; step six 600, judging whether the distance of each point pair lies within a predetermined tolerance range; and step seven 700a, 700b, marking relevant information on the second image according to the judgment result.
In step one 100, the first image and the second image are generally acquired optically. Before step one 100 is carried out, the optical lens of the image-acquisition device usually needs to be corrected for nonlinear distortion so that distortion-free images can be obtained.
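As an illustration of this correction step, the sketch below undistorts a raw image with OpenCV, assuming the camera has already been calibrated (for example with cv2.calibrateCamera on a checkerboard); the file names, camera matrix and distortion coefficients are illustrative assumptions, not values from the patent.

```python
# Minimal sketch of nonlinear lens correction, assuming a prior calibration;
# the matrix, coefficients and file names below are purely illustrative.
import cv2
import numpy as np

camera_matrix = np.array([[2500.0, 0.0, 1024.0],
                          [0.0, 2500.0, 768.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.array([-0.12, 0.05, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

raw = cv2.imread("raw_part_image.png")
h, w = raw.shape[:2]

# Refine the camera matrix for this image size, then remove the distortion.
new_matrix, _roi = cv2.getOptimalNewCameraMatrix(camera_matrix, dist_coeffs,
                                                 (w, h), 1, (w, h))
undistorted = cv2.undistort(raw, camera_matrix, dist_coeffs, None, new_matrix)
cv2.imwrite("undistorted_part_image.png", undistorted)
```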
To obtain high-resolution images and thereby improve detection accuracy, a high-resolution image-acquisition device is required; its observable field of view is therefore small and may not cover the whole contour to be measured. Accordingly, when the first contour to be measured 11 and/or the second contour to be measured 21 inevitably extend beyond the field of view of the image-acquisition device, image stitching is performed in step one 100: several partial distortion-free images of the first contour to be measured 11 and/or the second contour to be measured 21 are acquired and then stitched together, so as to obtain a complete, field-of-view-spanning image containing the whole first contour to be measured 11 and/or the whole second contour to be measured 21. If the contours do not exceed the field of view, no stitching is necessary. The first image and/or the second image may each contain several regions of interest: for example, a single first part 1 may have several first contours to be measured 11 that are all stitched into the same first image, or a single first part 1 with only one first contour to be measured 11 may be placed side by side with other first parts 1 so that several first contours to be measured 11 appear in the same first image; the subsequent steps can then process several first contours to be measured 11 and/or several second contours to be measured 21 at the same time.
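One way to realise the stitching described here is OpenCV's high-level stitcher in SCANS mode, which suits flat, scanner-like views of a part; the sketch below is only an assumed implementation, and the file names are placeholders. When the images come from a motorised stage with known offsets, the partial views can instead be placed directly at their known positions.

```python
# Sketch: stitch several overlapping partial views into one image covering the
# whole contour.  File names are placeholders; SCANS mode assumes planar views.
import cv2

partial_views = [cv2.imread(name) for name in
                 ("view_1.png", "view_2.png", "view_3.png")]

stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
status, full_image = stitcher.stitch(partial_views)

if status == 0:  # 0 corresponds to cv2.Stitcher_OK
    cv2.imwrite("full_contour_image.png", full_image)
else:
    raise RuntimeError(f"stitching failed, status code {status}")
```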
In step two 200, an edge-contour point-set extraction operator is applied to the first contour to be measured 11 and the second contour to be measured 21 respectively, so as to obtain the first point set 12 of the first contour to be measured 11 and the second point set 22 of the second contour to be measured 21. The extraction operator yields pixel-level contour point sets, and different operators, such as the Lanser, Deriche or Canny operator, can be chosen for different edge characteristics. On the basis of the pixel-level extraction, sub-pixel contour extraction can additionally be performed, for example by moment methods (Zernike moments and the like), interpolation methods or fitting methods, to obtain the first point set 12 and the second point set 22 with higher precision. The obtained first point set 12 and second point set 22 can be retained directly on the corresponding first contour to be measured 11 and second contour to be measured 21.
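As a minimal, pixel-level illustration of this extraction step, the sketch below uses the Canny operator (one of the operators named above) with assumed thresholds; the sub-pixel refinement by moment, interpolation or fitting methods mentioned in the text is not shown.

```python
# Sketch: pixel-level contour point-set extraction with the Canny operator.
# Thresholds and file names are illustrative assumptions.
import cv2
import numpy as np

def extract_contour_points(gray, low_thresh=50, high_thresh=150):
    """Return an (N, 2) array of (x, y) edge-pixel coordinates."""
    edges = cv2.Canny(gray, low_thresh, high_thresh)
    ys, xs = np.nonzero(edges)
    return np.column_stack([xs, ys]).astype(np.float64)

first_image = cv2.imread("first_image.png", cv2.IMREAD_GRAYSCALE)
second_image = cv2.imread("second_image.png", cv2.IMREAD_GRAYSCALE)
first_point_set = extract_contour_points(first_image)     # point set 12
second_point_set = extract_contour_points(second_image)   # point set 22
```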
In step three 300, the first point set 12 and the second point set 22 are matched so that the figure enclosed by the first point set 12 overlaps the figure enclosed by the second point set 22 as far as possible, and the first point set 12 is mapped, according to the best matching result, into the second image in which the second point set 22 is retained, so that the second image displays the first point set 12 and the second point set 22 simultaneously and provides an intuitive comparison. Alternatively, the second point set 22 may be mapped into the first image in which the first point set 12 is retained; the invention is not limited in this respect.
The above matching can be performed with a datum-based (reference-based) matching method. Before the first point set 12 and the second point set 22 are matched, a first coordinate system of the first image and a second coordinate system of the second image are established; this may be done in step two 200. The first and second coordinate systems are established in essentially the same way, for example by finding straight lines, or the intersection point of two straight lines, in the first image and the second image. In step three 300, the coordinate data of the first point set 12 in the first coordinate system and the coordinate data of the second point set 22 in the second coordinate system are acquired, and a coordinate transformation and overlay is then performed to achieve the best match between the first point set 12 and the second point set 22.
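The sketch below illustrates the idea of such a datum-based alignment: assuming each image yields an origin (for example the intersection of two fitted straight lines) and an x-axis angle, the rigid transform from the first coordinate system to the second is applied to the first point set. The datum values and sample points are illustrative numbers, not data from the patent.

```python
# Sketch: map the first point set into the second image via two datum frames
# (origin + axis angle per image).  All numbers are illustrative assumptions.
import numpy as np

def map_between_frames(points, src_origin, src_angle, dst_origin, dst_angle):
    """Map points from the first image's pixel coordinates into the second
    image's pixel coordinates so that the two datum frames coincide."""
    d_theta = dst_angle - src_angle
    c, s = np.cos(d_theta), np.sin(d_theta)
    rotation = np.array([[c, -s], [s, c]])
    return (points - src_origin) @ rotation.T + dst_origin

# Illustrative datum features found in each image (pixels / radians).
origin_1, angle_1 = np.array([512.0, 640.0]), 0.015
origin_2, angle_2 = np.array([498.0, 655.0]), -0.004

first_point_set = np.array([[520.0, 700.0], [521.0, 701.0], [522.0, 702.5]])
first_in_second = map_between_frames(first_point_set, origin_1, angle_1,
                                     origin_2, angle_2)
```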
The matching can also be performed with a matching method that uses no datum. In step three 300, the first point set 12 and the second point set 22 are first coarsely matched by a minimum circumscribed rectangle method or an inertia ellipse method, and then finely matched by a least-squares method, so as to achieve the best match between the first point set 12 and the second point set 22.
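A common way to realise this coarse-then-fine scheme, sketched below under the assumption of OpenCV and SciPy, is to take the initial rotation and translation from the minimum-area bounding rectangles (cv2.minAreaRect) and to refine it with a few ICP-style least-squares iterations; this is one possible interpretation, not necessarily the patented implementation.

```python
# Sketch: coarse alignment via minimum-area rectangles, then least-squares
# (ICP-style) fine alignment of one (N, 2) point set onto another.
import cv2
import numpy as np
from scipy.spatial import cKDTree

def coarse_align(src, dst):
    """Roughly align src to dst using centroids and minAreaRect angles."""
    angle_src = cv2.minAreaRect(src.astype(np.float32))[2]
    angle_dst = cv2.minAreaRect(dst.astype(np.float32))[2]
    d_theta = np.deg2rad(angle_dst - angle_src)
    c, s = np.cos(d_theta), np.sin(d_theta)
    rotation = np.array([[c, -s], [s, c]])
    return (src - src.mean(axis=0)) @ rotation.T + dst.mean(axis=0)

def fine_align(src, dst, iterations=20):
    """Refine the alignment by iterating nearest-neighbour correspondences
    and the least-squares (SVD / Kabsch) rigid transform."""
    tree = cKDTree(dst)
    current = src.copy()
    for _ in range(iterations):
        _, idx = tree.query(current)
        matched = dst[idx]
        mu_c, mu_m = current.mean(axis=0), matched.mean(axis=0)
        u, _, vt = np.linalg.svd((current - mu_c).T @ (matched - mu_m))
        rotation = vt.T @ u.T
        if np.linalg.det(rotation) < 0:       # guard against reflections
            vt[-1, :] *= -1
            rotation = vt.T @ u.T
        current = (current - mu_c) @ rotation.T + mu_m
    return current
```

With the point sets from the extraction sketch, `fine_align(coarse_align(first_point_set, second_point_set), second_point_set)` would yield the first point set overlaid on the second.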
In step four 400, the minimum-distance point in the second point set 22 is determined for each point in the first point set 12; that is, for every point in the first point set 12, the closest point in the second point set 22 is found, yielding a set of point pairs.
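A k-d tree is one straightforward way to perform this nearest-point search; the sketch below uses SciPy's cKDTree on two small illustrative arrays standing in for the aligned first point set and the second point set.

```python
# Sketch: nearest point of the second set for every point of the first set.
import numpy as np
from scipy.spatial import cKDTree

first_aligned = np.array([[10.2, 4.9], [11.1, 5.0], [12.0, 5.2]])      # set 12
second_point_set = np.array([[10.0, 5.0], [11.0, 5.0],
                             [12.0, 5.0], [13.0, 5.0]])                # set 22

tree = cKDTree(second_point_set)
pair_distance_px, nearest_index = tree.query(first_aligned)
point_pairs = list(zip(first_aligned, second_point_set[nearest_index]))
```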
In step five 500, the distance of each point pair in the set is calculated. In the second coordinate system, the pixel distance of each point pair is calculated and converted from pixel units to millimetres by multiplying by the pixel equivalent of the second image, giving the actual Euclidean distance of each pair. In addition, one of the point sets is taken as a reference point set, and the point-pair distance is defined as positive or negative according to whether the connecting line of the pair lies outside or inside that reference point set. Referring to FIG. 3, the first point set 12 is taken as the reference point set in this schematic: if the minimum-distance point lies outside the first point set 12, the point-pair distance is "positive"; if it lies inside the first point set 12, the distance is "negative". All point-pair distances shown in FIG. 3 are "positive"; there are no "negative" distances. In other cases the definitions of "positive" and "negative" may be interchanged, or the second point set 22 may be chosen as the reference point set; the invention is not limited in this respect.
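The sketch below shows one way to carry out this step: the pixel distances are scaled by an assumed pixel equivalent, and the sign is taken from cv2.pointPolygonTest, which reports whether the matched point lies outside or inside the first point set treated as a closed polygon (in practice the points would first have to be ordered along the contour for this test to be meaningful). The pixel equivalent is an illustrative value.

```python
# Sketch: signed point-pair distances in millimetres.  PIXEL_EQUIVALENT_MM is
# an assumed value; the reference contour must be ordered to form a polygon.
import cv2
import numpy as np

PIXEL_EQUIVALENT_MM = 0.005   # assumed mm per pixel of the second image

def signed_distances_mm(first_aligned, second_point_set, nearest_index,
                        pair_distance_px):
    polygon = first_aligned.astype(np.float32).reshape(-1, 1, 2)
    signs = []
    for x, y in second_point_set[nearest_index]:
        # pointPolygonTest: > 0 inside, < 0 outside, 0 on the contour.
        inside = cv2.pointPolygonTest(polygon, (float(x), float(y)), False) > 0
        signs.append(-1.0 if inside else 1.0)   # inside the reference => negative
    return np.asarray(signs) * pair_distance_px * PIXEL_EQUIVALENT_MM
```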
In step six 600, it is judged whether the distance of each point pair in the set lies within a predetermined tolerance range, which is generally a numerical range. Whether a point-pair distance exceeds the upper limit or falls below the lower limit, any deviation from the predetermined tolerance range indicates that at least one of the first contour to be measured 11 and the second contour to be measured 21 is defective and does not meet the design requirement. In addition, referring to FIG. 3, the occurrence of a "negative" distance directly indicates that the first contour to be measured 11 and the second contour to be measured 21 cannot be fitted together.
In step seven 700a, 700b, relevant information is marked on the second image according to the judgment result. The relevant information includes, but is not limited to, the positions of the point pairs that deviate from the tolerance range, the distances of those point pairs, and whether each distance exceeds the upper limit or falls below the lower limit of the tolerance range. Specific marking methods include displaying numerical values, colour coding, the use of codes, and the like. Point pairs that do not deviate from the tolerance range may be left unmarked or may simply be visually distinguished from those that do.
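A possible rendering of this marking step is sketched below: point pairs within an assumed tolerance band are drawn in green, pairs outside it in red with their signed distance printed beside them. The tolerance limits are illustrative, and the function expects the arrays produced in the earlier sketches.

```python
# Sketch: mark point pairs on the overlay image; green = within the assumed
# tolerance band, red = outside it (with the signed distance written out).
import cv2

TOLERANCE_MM = (0.02, 0.15)   # assumed lower / upper limit of the gap

def mark_results(overlay_bgr, first_aligned, second_point_set, nearest_index,
                 distance_mm):
    for p1, j, d in zip(first_aligned, nearest_index, distance_mm):
        x1, y1 = int(p1[0]), int(p1[1])
        x2, y2 = int(second_point_set[j][0]), int(second_point_set[j][1])
        in_tolerance = TOLERANCE_MM[0] <= d <= TOLERANCE_MM[1]
        colour = (0, 200, 0) if in_tolerance else (0, 0, 255)
        cv2.line(overlay_bgr, (x1, y1), (x2, y2), colour, 1)
        if not in_tolerance:
            cv2.putText(overlay_bgr, f"{d:+.3f}", (x2, y2),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.35, colour, 1)
    return overlay_bgr
```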
Referring to FIG. 3, which is a schematic diagram of the detection and comparison result for the first situation, the tolerance ranges are visualised as dash-dot lines A, B, C and D and displayed in the second image to aid understanding of the invention; displaying the tolerance ranges is an optional feature, and the invention does not require the dash-dot lines A, B, C, D to be visualised before the point-pair distances are compared. Dash-dot lines A and D represent the upper limits of the predetermined tolerance ranges, and dash-dot lines B and C represent the lower limits: the second point set 22 should lie between dash-dot lines A and B, the first point set 12 should lie between dash-dot lines C and D, and the point-pair distance should be greater than the distance between dash-dot lines B and C and smaller than the distance between dash-dot lines A and D. In the schematic of FIG. 3, the second point set 22 in region I lies outside dash-dot line A, so the point-pair distances in region I exceed the upper limit of the predetermined tolerance range, while the second point set 22 in region J lies inside dash-dot line B, so the point-pair distances in region J fall below the lower limit of the predetermined tolerance range.
Dash-dot lines A and B delimit the predetermined tolerance range of the second contour to be measured 21, and dash-dot lines C and D delimit the predetermined tolerance range of the first contour to be measured 11. Even when both tolerance ranges are satisfied, the point-pair distance can still vary as the first point set 12 and the second point set 22 approach or recede from each other, so a further tolerance range (not shown) may be set on the point-pair distance itself and likewise judged and marked. By judging whether the point pairs conform to this point-pair-distance tolerance range, the similarity between the first contour to be measured 11 and the second contour to be measured 21 can be evaluated, that is, whether the two contours have similar extension directions (or undulations). For point pairs whose distance conforms to this tolerance range, the distance from the end point belonging to the first point set 12 to dash-dot line C is similar to the distance from the end point belonging to the second point set 22 to dash-dot line A.
Referring to FIG. 3, the straight line segments between the first point set 12 and the second point set 22 are the shortest-distance connecting lines of the point pairs; they show visually which points of the first point set 12 and the second point set 22 form each shortest-distance pair, and whether "negative" distances exist, and where, can be judged directly from these connecting lines. In FIG. 3 all connecting lines lie on the outer side of the first point set 12, so a technician reading the detection and comparison result of FIG. 3 can quickly determine that no "negative" distance is present.
For the first situation described above, the invention is applicable to: a single part containing a single contour to be measured, a single part containing several contours to be measured, several parts each containing a single contour to be measured, and several parts each containing several contours to be measured. For example, the first part 1 may be a single punch with only one first contour to be measured 11, and the second part 2 may be a die with several second contours to be measured 21 (punched holes); by applying the detection and comparison method of the invention it can be determined accurately which punched holes the tested punch can enter and which it cannot, guiding the technician, when assembling the punch and the die, to avoid pairing the punch with holes that cannot receive it.
Referring to FIGS. 4 to 6, for the second situation the part contour detection and comparison method provided by the invention comprises: step one 100, providing a part 3, and acquiring an image of the contour to be measured 31 of the part 3 and a reference pattern 4 corresponding to the contour to be measured 31; step two 200, acquiring a first point set 32 of the contour to be measured 31 in the image of the part 3, and a second point set 41 of the reference pattern 4; step three 300, matching the first point set 32 with the second point set 41, and overlaying and displaying the first point set 32 and the second point set 41 in the reference pattern 4 according to the matching result; step four 400, determining, for each point in the first point set 32, the minimum-distance point in the second point set 41, to obtain a set of point pairs; step five 500, calculating the distance of each point pair in the set; step six 600, judging whether the distance of each point pair lies within a predetermined tolerance range; and step seven 700a, 700b, marking relevant information on the image according to the judgment result.
The second situation differs from the first in steps one 100 to three 300. In step one 100, the reference pattern 4 is usually obtained from the CAD drawing used to machine the part 3; in a few cases the image of a standard sample of the part 3 (a so-called golden sample) may instead be used to obtain the reference pattern 4. The corresponding CAD drawing or golden-sample image is a long-term, stable reference object for the part 3 under test, not an object measured anew in each test. In step two 200, the second point set 41 can be obtained from the reference pattern 4 by means conventional in the industry, and the first point set 32 of the contour to be measured 31 is obtained from the image of the part 3 under test in the same way as in the first situation. In step three 300, the first point set 32 is mapped into the reference pattern 4 according to the matching result of the first point set 32 and the second point set 41, so that the first point set 32 and the second point set 41 are displayed overlaid in the reference pattern 4. Steps four 400 to seven 700a, 700b are essentially the same in both situations.
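As a simple illustration of deriving the second point set 41 from reference geometry, the sketch below samples points at a fixed pitch along straight-line segments assumed to have been exported from the CAD drawing; real drawings would also contain arcs and splines that need their own sampling, and the coordinates and pitch here are illustrative.

```python
# Sketch: sample a reference point set from CAD-style line segments at a fixed
# pitch.  Segment coordinates (in mm) and the pitch are illustrative values.
import numpy as np

def sample_segments(segments, pitch_mm=0.05):
    points = []
    for (x0, y0), (x1, y1) in segments:
        length = float(np.hypot(x1 - x0, y1 - y0))
        n = max(int(length / pitch_mm), 1)
        t = np.linspace(0.0, 1.0, n + 1)
        points.append(np.column_stack([x0 + t * (x1 - x0),
                                       y0 + t * (y1 - y0)]))
    return np.vstack(points)

reference_segments = [((0.0, 0.0), (10.0, 0.0)),
                      ((10.0, 0.0), (10.0, 6.0)),
                      ((10.0, 6.0), (0.0, 6.0)),
                      ((0.0, 6.0), (0.0, 0.0))]
second_point_set = sample_segments(reference_segments)   # reference point set 41
```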
As in the first situation, a high-resolution image-acquisition device is required in the second situation to obtain high-resolution images and improve detection accuracy, so the observable field of view is small and may not cover the whole contour to be measured 31. Accordingly, when the contour to be measured 31 inevitably exceeds the field of view of the image-acquisition device, image stitching is performed in step one 100: several partial images of the contour to be measured 31 are acquired and then stitched together, so as to obtain a complete, field-of-view-spanning image containing the whole contour to be measured 31. If the contour to be measured 31 does not exceed the field of view, no stitching is necessary. The image of the part 3 under test and/or the reference pattern 4 may each contain several regions of interest: for example, a single part 3 may have several contours to be measured 31 stitched into the same image, or a single part 3 may have only one contour to be measured 31, or the contours to be measured 31 of several parts 3 may be stitched into the same image; the subsequent steps can then process several contours to be measured 31 at the same time.
As in the first situation, the matching in the second situation can be performed with a datum-based (reference-based) matching method: before the first point set 32 and the second point set 41 are matched, a first coordinate system of the image of the part 3 under test and a second coordinate system of the reference pattern 4 are established, which may be done in step two 200. The two coordinate systems are established in essentially the same way, for example by finding straight lines, or the intersection point of two straight lines, in the image and in the reference pattern 4. In step three 300, the coordinate data of the first point set 32 in the first coordinate system and the coordinate data of the second point set 41 in the second coordinate system are acquired, and a coordinate transformation and overlay is then performed to achieve the best match between the first point set 32 and the second point set 41. The matching can also be performed with a matching method that uses no datum: in step three 300, the first point set 32 and the second point set 41 are first coarsely matched by a minimum circumscribed rectangle method or an inertia ellipse method, and then finely matched by a least-squares method, so as to achieve the best match between the first point set 32 and the second point set 41.
Referring to FIGS. 6 and 7, which show the detection and comparison result for the second situation and a partial enlargement of it, the predetermined tolerance range is not visualised here. Part of the second point set 41 in region L lies inside the first point set 32, producing "negative" distances, and the remainder lies outside the first point set 32, producing "positive" distances; in both cases the point pairs that fall outside the predetermined tolerance range are marked in step seven 700a, 700b of the invention.
The part contour detection and comparison method provided by the invention has the following beneficial effects:
(1) For the first situation, a single measurement determines whether the first contour to be measured on the first part, the second contour to be measured on the second part, and the gap between the corresponding contours fall within the predetermined tolerance ranges, and the results are displayed visually in one image, so that a technician can read the detection and comparison result intuitively and judge whether the first part and the second part can be fitted together smoothly. In addition, the detection and comparison process is automatic and covers the whole first point set forming the first contour to be measured and the whole second point set forming the second contour to be measured, so its efficiency and precision are higher and more stable than manual measurement.
(2) For the second situation, a single measurement determines whether the single part under test conforms to the predetermined tolerance range relative to the reference pattern, and the results are displayed visually in one image, so that a technician can read the detection and comparison result intuitively. In addition, the detection and comparison process is automatic and covers the whole first point set forming the contour to be measured and the whole second point set forming the reference pattern, so its efficiency and precision are higher and more stable than manual measurement.
(3) The part detection and comparison method of the invention is suitable for stitched images: several partial views of a contour to be measured can be stitched into one complete contour, so that the whole contour appears in the detection result at the same time and the technician can see intuitively where on the actual part a point pair exceeds the predetermined tolerance range. It is equally suitable for detecting and comparing several complete contours to be measured after they have been stitched together, which improves efficiency.
(4) Measuring equipment applying the part detection and comparison method of the invention was used to detect standard gauge blocks of 40 mm, 60 mm and 100 mm, and the results deviated from those of other advanced equipment in the industry by less than 1 micron in the transverse direction.
The above detailed description is merely an illustration of preferred embodiments of the invention and is not intended to limit the scope of the invention; all equivalent technical variations made using the contents of this specification and the drawings fall within the scope of the invention.

Claims (14)

1. A part contour detection and comparison method, characterized by comprising the following steps:
providing a first part and a second part which are matched with each other, and acquiring a first image of a first contour to be detected on the first part and a second image of a second contour to be detected on the second part;
step two, a first point set of a first contour to be detected is obtained in a first image, and a second point set of a second contour to be detected is obtained in a second image;
step three, matching the first point set and the second point set, and overlapping and displaying the first point set and the second point set in the second image according to a matching result;
determining the minimum distance point in the second point set corresponding to each point in the first point set to obtain a point pair set;
step five, calculating the distance between each point pair in the point pair set;
step six, judging whether the distance between each point pair in the point pair set is within a preset tolerance range;
and step seven, marking relevant information on the second image according to the judging result.
2. The part contour detection and comparison method of claim 1, wherein: in the first step, a plurality of partial images of a single first contour to be measured and/or a single second contour to be measured are acquired, and then the partial images are spliced, so that a complete image containing the whole first contour to be measured and/or the whole second contour to be measured is obtained.
3. The part contour detection and comparison method of claim 1, wherein: when a single first part is provided with a plurality of first contours to be tested, or a single second part is provided with a plurality of second contours to be tested, or a plurality of first parts and/or second parts need to be tested simultaneously, in step one, a plurality of partial images are collected, each partial image comprises a single complete first contour to be tested or a single second contour to be tested, and the partial images are spliced, so that a complete image comprising a plurality of first contours to be tested or a plurality of second contours to be tested is obtained.
4. The part contour detection and comparison method of claim 1, wherein: before the first point set and the second point set are matched, a first coordinate system of the first image and a second coordinate system of the second image are established, coordinate data of the first point set in the first coordinate system is obtained in the third step, coordinate data of the second point set in the second coordinate system is obtained, and then coordinate system transformation overlapping is carried out to achieve matching of the first point set and the second point set.
5. The part contour detection and comparison method of claim 1, wherein: in the third step, firstly, carrying out rough matching on the first point set and the second point set by a minimum circumscribed rectangle method or an inertia ellipse method; and then, carrying out fine matching on the first point set and the second point set by a least square method to realize the matching of the first point set and the second point set.
6. The part contour detection and comparison method of claim 1, wherein: in step seven, the related information includes the position of the pair of points that deviate from the tolerance range, the distance size of the pair of points that deviate from the tolerance range, and whether the distance size exceeds the upper limit or the lower limit with respect to the tolerance range.
7. The part contour detection and comparison method of claim 1, wherein: in the fifth step, the pixel distance between each point pair is calculated, and the distance is converted from a pixel unit to a millimeter unit by multiplying the pixel equivalent of the second image, so as to obtain the actual distance of each point pair.
8. A part contour detection and comparison method, characterized by comprising the following steps:
step one, providing a single part, and acquiring an image of a contour to be detected of the part and a reference image corresponding to the contour to be detected;
step two, acquiring a first point set of a contour to be detected in an image, and acquiring a second point set of a reference graph;
step three, matching the first point set and the second point set, and overlapping and displaying the first point set and the second point set in the reference graph according to a matching result;
determining the minimum distance point in the second point set corresponding to each point in the first point set to obtain a point pair set;
step five, calculating the distance between each point pair in the point pair set;
step six, judging whether the distance between each point pair in the point pair set is within a preset tolerance range;
and step seven, marking relevant information on the image according to the judging result.
9. The part contour detection and comparison method of claim 8, wherein: in step one, a reference pattern is obtained from an image of a standard part sample or CAD drawing used to machine the part.
10. The part contour detection and comparison method of claim 8, wherein: in the first step, a plurality of partial images of a single contour to be measured are acquired, and then the partial images are spliced, so that a complete image containing the whole contour to be measured is obtained.
11. The part contour detection and comparison method of claim 8, wherein: when a single part is provided with a plurality of contours to be tested or a plurality of parts need to be tested at the same time, in the first step, a plurality of partial images of the single part or the plurality of parts are collected, each partial image comprises a single complete contour to be tested, and the partial images are spliced, so that a complete image comprising the plurality of contours to be tested is obtained.
12. The part contour detection and comparison method of claim 8, wherein: before the first point set and the second point set are matched, in the third step, a first coordinate system of the image and a second coordinate system of the reference graph are established, coordinate data of the first point set in the first coordinate system are obtained, coordinate data of the second point set in the second coordinate system are obtained, and then coordinate system transformation overlapping is carried out to achieve matching of the first point set and the second point set.
13. The part contour detection and comparison method of claim 8, wherein: in the third step, firstly, carrying out rough matching on the first point set and the second point set by a minimum circumscribed rectangle method or an inertia ellipse method; and then, carrying out fine matching on the first point set and the second point set by a least square method to realize the matching of the first point set and the second point set.
14. The part contour detection and comparison method of claim 8, wherein: in step seven, the related information includes the position of the pair of points that deviate from the tolerance range, the distance size of the pair of points that deviate from the tolerance range, and whether the distance size exceeds the upper limit or the lower limit with respect to the tolerance range.
CN202310937992.XA 2023-07-27 2023-07-27 Part contour detection and comparison method Pending CN117109471A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310937992.XA CN117109471A (en) 2023-07-27 2023-07-27 Part contour detection and comparison method


Publications (1)

Publication Number Publication Date
CN117109471A (en) 2023-11-24

Family

ID=88806538

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310937992.XA Pending CN117109471A (en) 2023-07-27 2023-07-27 Part contour detection and comparison method

Country Status (1)

Country Link
CN (1) CN117109471A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination