CN112001917A - Machine vision-based geometric tolerance detection method for circular perforated part - Google Patents

Machine vision-based geometric tolerance detection method for circular perforated part

Info

Publication number
CN112001917A
CN112001917A (application CN202010920126.6A); granted as CN112001917B
Authority
CN
China
Prior art keywords
image
hole
center
calculating
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010920126.6A
Other languages
Chinese (zh)
Other versions
CN112001917B (en)
Inventor
丁尧
詹洪陈
刘家乐
袁杰
万凯
朱凯
陈颖
张莉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University Jinling College
Original Assignee
Nanjing University Jinling College
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University Jinling College
Priority to CN202010920126.6A
Publication of CN112001917A
Application granted
Publication of CN112001917B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/0006 Industrial image inspection using a design-rule based approach
    • G06T5/70 Denoising; Smoothing
    • G06T7/13 Edge detection
    • G06T7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30164 Workpiece; Machine component
    • G06T2207/30168 Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a machine vision-based form and position (geometric) tolerance detection method for a circular perforated part, intended to improve both the efficiency and the precision of part tolerance inspection. The method comprises the following steps: constructing a darkroom shadowless illumination system to illuminate the part to be measured during image acquisition; calibrating the camera twice, first to eliminate lens distortion and then to compute the correspondence between image pixels and actual physical dimensions; acquiring the part image, then binarizing and filtering it to remove random noise introduced during acquisition; extracting the connected domains of the image and taking the center of each connected domain as a circle-center coordinate to judge whether the holes are coaxial and co-circular; computing the circumscribed rectangle and the minimum circumscribed rectangle of each connected region to obtain two sets of side lengths, from which 5 radii are derived for roundness analysis of each hole; and extracting the contour of the binary image to obtain the part edge information, applying the Hough circle transform to the edges to obtain circle-center and radius data, comparing these with the previously obtained values, and calculating the geometric tolerance of the part.

Description

Machine vision-based geometric tolerance detection method for circular perforated part
Technical Field
The invention belongs to the field of industrial measurement and machine vision, and particularly relates to a form and position tolerance detection method for a circular perforated part based on machine vision.
Background
Errors are inevitably introduced when any part is machined, for example through tool wear. In industries such as machine manufacturing, machined parts therefore routinely undergo error measurement to verify that they meet design requirements and quality standards. The circular perforated part is a common annular part with a fixed structure and a ring of circular through holes, widely used in instruments and vehicles; excessive part tolerances degrade the performance of the whole assembly. For a part with high quality requirements, the form and position tolerances must therefore be controlled in addition to the dimensional accuracy. Form and position (geometric) tolerances, which comprise shape tolerances and position tolerances, define the maximum error a part may exhibit relative to its ideal design. For example, if the roundness of a round hole fails to meet the specified requirement after machining, this is a shape error; if the center of the hole deviates from its nominal position, this is a position error. Detection technology for part form and position tolerances continues to develop, and two main solutions currently exist: three-coordinate measurement and non-contact measurement. In three-coordinate measurement the part is placed in a measuring instrument whose probe moves along guide rails in three directions, and the part dimensions and form and position tolerances are calculated from the data returned by the probe. This method can measure every feature of the part accurately, but it is expensive, slow, requires manual operation and lacks measurement flexibility. Non-contact measurement obtains the part parameters by optical scanning without touching the part surface. Non-contact measurement based on digital image processing offers a simple setup, fast recognition and high detection efficiency, and can satisfy real-time on-line quality inspection requirements.
Disclosure of Invention
Purpose of the invention: to address the complexity and long duration of conventional form and position tolerance inspection of parts, the invention provides a method that uses machine vision to complete the dimensional measurement and the form and position tolerance analysis of a part. Its main advantages are a simple, easily arranged detection mechanism, a short detection cycle, suitability for on-line deployment and comprehensive detection content.
In order to solve the problems, the invention discloses a method for detecting form and position tolerance of a circular perforated part based on machine vision, which comprises the following steps:
step 1, constructing a darkroom shadowless illumination system: the system environment is a closed, light-shielded enclosure, and the illumination inside the system consists of an annular light source oriented in the same direction as the camera and a backlight source, providing the illumination environment for image acquisition of the part;
step 2, fixing the position of the camera, fixing the position of the platform, installing a fixed-focus lens, and, after focusing, calibrating the camera with a chessboard calibration plate to complete distortion correction and eliminate the influence of lens distortion on measurement;
step 3, calibrating the camera corrected in step 2 again with the chessboard calibration plate: obtaining pixel coordinates of the corner points by corner detection, calculating the pixel distance between two adjacent corner points, and relating it to the physical distance between the same corner points on the calibration plate to obtain the correspondence between image pixels and actual physical size;
step 4, placing the part in the image acquisition area and acquiring the part image;
step 5, binarizing and filtering the acquired part image to eliminate random noise introduced during acquisition;
step 6, extracting the part region, cropping away useless image information, extracting the connected domains of the image, and taking the center of the part's connected domain as the part center coordinate;
step 7, for each connected region in the image obtained in step 6, calculating the circumscribed rectangle to obtain its two side lengths and the minimum circumscribed rectangle to obtain its two side lengths, deriving 5 radii from the four side lengths, and judging the roundness of the hole: if all the radii lie within the tolerance band, the hole is judged to meet the form and position tolerance requirement;
step 8, calculating the center point of each hole from the circumscribed rectangles of all connected regions obtained in step 7, comparing it with the part center obtained in step 6, and judging whether the position of each hole meets the form and position tolerance requirement;
step 9, extracting the contour of the image obtained in step 6 to obtain the part edge information, applying the Hough circle transform to the edge information to obtain circle-center and radius information, and, if both the center and the radius lie within the tolerance band, judging that the hole meets the form and position tolerance requirement;
step 10, comparing the circle centers obtained in step 9 with the part center obtained in step 6, and judging whether the position of each hole meets the form and position tolerance requirement.
In step 1, the constructed darkroom shadowless illumination system is a closed, light-shielded environment that prevents uneven external ambient light from influencing the measurement. The illumination inside the system consists of an annular light source oriented in the same direction as the camera and a backlight source; both are parallel light sources fitted with diffuse-reflection plates, and together they provide the illumination environment for image acquisition of the part. The annular light source is mounted coaxially with the camera at the top of the system, and its color can be changed as required. The backlight source is embedded in the bottom object table, its upper surface coplanar with the table surface. When the annular light source is used for illumination, bottom liners of different colors can be placed on the object table to obtain a better image acquisition result.
Step 3 comprises: acquiring an image of the chessboard calibration plate and obtaining the pixel coordinates of the corner points by corner detection. Denote the pixel coordinates of two adjacent corner points as (x1, y1) and (x2, y2); if the two corner points are adjacent in the horizontal direction then y1 ≈ y2, and if they are adjacent in the vertical direction then x1 ≈ x2. The Euclidean distance between the two points is
d = sqrt((x1 - x2)^2 + (y1 - y2)^2).
Take n pairs of corner points adjacent in the horizontal or vertical direction, calculate the Euclidean distance d_i of each pair, and sum and average them to obtain the mean corner spacing in the image:
d_mean = (d_1 + d_2 + … + d_n) / n.
With the physical distance l between two adjacent corner points known from the real calibration plate, the physical size corresponding to one pixel is
d_pix = l / d_mean.
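By way of illustration, the following sketch shows how the pixel-to-millimeter scale d_pix of step 3 could be computed with OpenCV; the board geometry, window sizes and file name are assumptions for the example, not values taken from the patent.

```python
# Sketch of step 3 (assumed OpenCV implementation).
import cv2
import numpy as np

PATTERN = (11, 8)        # inner corners of a 12 x 9 checkerboard (assumed)
SQUARE_MM = 10.0         # physical size l of one square in mm (assumed)

gray = cv2.imread("calib_after_undistort.png", cv2.IMREAD_GRAYSCALE)
found, corners = cv2.findChessboardCorners(gray, PATTERN)
corners = cv2.cornerSubPix(
    gray, corners, (5, 5), (-1, -1),
    (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
corners = corners.reshape(PATTERN[1], PATTERN[0], 2)   # rows x cols x (x, y)

# Euclidean distances between horizontally adjacent corners ...
d_h = np.linalg.norm(corners[:, 1:] - corners[:, :-1], axis=2).ravel()
# ... and between vertically adjacent corners
d_v = np.linalg.norm(corners[1:, :] - corners[:-1, :], axis=2).ravel()

d_mean = np.concatenate([d_h, d_v]).mean()   # average corner spacing in pixels
d_pix = SQUARE_MM / d_mean                   # physical size of one pixel (mm/pixel)
print(f"d_pix = {d_pix:.5f} mm/pixel")
```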
Step 5 comprises: if the acquired image is a color image, convert it to a grayscale image using gray = 0.299R + 0.587G + 0.114B, where R, G and B are the red, green and blue components of a pixel of the color image before conversion and gray is the pixel value of the converted grayscale image; if the acquired image is already grayscale, no conversion is needed. For the grayscale image, determine a threshold by the maximum between-class variance method, set pixels greater than the threshold to 1 and all other pixels to 0; once all pixels have been converted, the binarization of the part image is complete. Then traverse every pixel of the binary image and select the 8 pixels surrounding it (if the pixel lies at the image border and fewer than 8 neighbors exist, pad the missing values with 0); sort the 9 values consisting of the traversed pixel and its neighbors in descending order and take the middle value as the new value of the traversed pixel, replacing its original value. This median filtering removes the noise in the image and completes the filtering of the binary image.
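A minimal sketch of step 5, assuming an OpenCV implementation; the file name is illustrative, and cv2.medianBlur reproduces the patent's 3 x 3 median traversal except that it replicates the border instead of zero-padding it.

```python
# Sketch of step 5 (assumed OpenCV implementation).
import cv2

img = cv2.imread("part.png")                      # illustrative file name
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)      # gray = 0.299R + 0.587G + 0.114B
# Otsu (maximum between-class variance) threshold; 255 is used instead of 1 for display
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
# 3 x 3 median filter: each pixel is replaced by the median of itself and its
# 8 neighbors, removing salt-and-pepper noise from the binary image
binary = cv2.medianBlur(binary, 3)
```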
In step 6, contour detection is performed on the binary image obtained in step 5 to obtain the outer edge of the part, and its circumscribed rectangle is calculated to obtain the top-left vertex coordinates (x_lt, y_lt) and the side lengths (m_p, n_p) of the circumscribed rectangle. A region with top-left vertex (x_lt + T, y_lt + T) and side lengths (m_p + 2T, n_p + 2T) is then taken as the part image area, where the parameter T = 10 × Δ/d_pix and Δ is the form and position tolerance of the part, and the useless image information outside this region is cropped away. From the calculated circumscribed rectangle, the point (x_lt + 0.5*m_p, y_lt + 0.5*n_p) is taken as the part center coordinate.
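The cropping and part-center computation of step 6 could be sketched as follows; the OpenCV calls are assumed, `binary`, `d_pix` and the tolerance `delta_mm` come from the previous steps and the part drawing, and the crop enlarges the bounding rectangle by the margin T on every side, which is one plausible reading of the cropping step.

```python
# Sketch of step 6 (assumed OpenCV calls and variable names).
import cv2

contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
outer = max(contours, key=cv2.contourArea)        # outer edge of the part
x_lt, y_lt, m_p, n_p = cv2.boundingRect(outer)    # top-left vertex and side lengths

T = int(10 * delta_mm / d_pix)                    # margin derived from the tolerance
y0, x0 = max(0, y_lt - T), max(0, x_lt - T)       # clamp to the image border
roi = binary[y0: y_lt + n_p + T, x0: x_lt + m_p + T]   # cropped part image area

x_c = x_lt + 0.5 * m_p                            # part center (original image coords)
y_c = y_lt + 0.5 * n_p
```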
In step 7, the two side lengths a and b are obtained from the circumscribed rectangle of the connected domain and the two side lengths c and d from its minimum circumscribed rectangle, and 5 radii are calculated as r_1 = a*d_pix/2, r_2 = b*d_pix/2, r_3 = c*d_pix/2, r_4 = d*d_pix/2 and r_5 = [(a + b + c + d)/4]*d_pix/2. If the first four radii all lie within the tolerance band, i.e. (r_1, r_2, r_3, r_4) ∈ [r_5 - Δ, r_5 + Δ], the part and the hole are judged to meet the geometric tolerance requirement.
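The five-radius roundness check of step 7 might look like the following sketch; the OpenCV calls are assumed, `d_pix` is in mm/pixel and `delta_mm` is the half-width Δ of the tolerance band.

```python
# Sketch of the step 7 roundness check (assumed OpenCV calls).
import cv2

def hole_radii(contour, d_pix):
    _, _, a, b = cv2.boundingRect(contour)            # axis-aligned bounding box
    (_, _), (c, d), _ = cv2.minAreaRect(contour)      # minimum (rotated) bounding box
    r1, r2 = a * d_pix / 2, b * d_pix / 2
    r3, r4 = c * d_pix / 2, d * d_pix / 2
    r5 = (a + b + c + d) / 4 * d_pix / 2              # averaged reference radius
    return (r1, r2, r3, r4), r5

def roundness_ok(contour, d_pix, delta_mm):
    radii, r5 = hole_radii(contour, d_pix)
    # every derived radius must stay inside [r5 - delta, r5 + delta]
    return all(r5 - delta_mm <= r <= r5 + delta_mm for r in radii)
```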
In step 8, the circumscribed rectangles of all connected domains are obtained. For each circumscribed rectangle with top-left vertex coordinates (x_ilt, y_ilt) and side lengths a_i and b_i, the center coordinate (x_ci, y_ci) of the connected domain is calculated as x_ci = x_ilt + 0.5*a_i and y_ci = y_ilt + 0.5*b_i. The obtained center coordinates are compared with the part center coordinate (x_c, y_c) obtained in step 6 to judge whether each hole meets the form and position tolerance requirement: if |x_ci - x_c| < Δ the hole is horizontally collinear with the part center; if |y_ci - y_c| < Δ it is vertically collinear; if both hold, the hole and the part are coaxial. The values (x_ci - x_c)*d_pix and (y_ci - y_c)*d_pix are calculated separately to judge whether the position of the hole is consistent with the drawing. The distance from the center of each hole to the part center is defined as
r_di = sqrt((x_ci - x_c)^2 + (y_ci - y_c)^2) * d_pix.
For n holes arranged in a circumferential array, the average distance from the hole centers to the part center is
r_d_mean = (r_d1 + r_d2 + … + r_dn) / n,
and when all r_di lie within the interval [r_d_mean - Δ, r_d_mean + Δ], the holes of the circumferential array are judged to meet the form and position tolerance requirement.
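A sketch of the step 8 position check under assumed names: `hole_boxes` holds the bounding boxes (x_ilt, y_ilt, a_i, b_i) of the hole connected domains, (x_c, y_c) is the part center from step 6, `d_pix` is in mm/pixel and `delta_mm` is the tolerance Δ.

```python
# Sketch of the step 8 circumferential-array check (assumed variable names).
import math

def hole_centers(hole_boxes):
    return [(x + 0.5 * a, y + 0.5 * b) for x, y, a, b in hole_boxes]

def circumferential_array_ok(hole_boxes, x_c, y_c, d_pix, delta_mm):
    centers = hole_centers(hole_boxes)
    # distance of each hole center to the part center, converted to mm
    r_d = [math.hypot(xc - x_c, yc - y_c) * d_pix for xc, yc in centers]
    r_mean = sum(r_d) / len(r_d)
    # every hole must sit inside [r_mean - delta, r_mean + delta]
    return all(abs(r - r_mean) <= delta_mm for r in r_d)
```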
In step 9, the search radius r for the Hough circles is derived from the side lengths of the circumscribed rectangle of the connected domain. For a single hole, given a part form and position tolerance band of ±Δ, the several radii r detected for that hole are sorted to obtain the maximum radius r_max and the minimum radius r_min; if the hole is deformed such that (r_max - r_min) > Δ, the roundness of the hole is judged not to meet the form and position tolerance requirement.
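The Hough-circle roundness check of step 9 could be sketched as below; the OpenCV call and parameter values are assumptions, `edges` is the 8-bit edge image from step 6 and `r_est` a pixel radius estimated from the bounding rectangle.

```python
# Sketch of step 9 (assumed OpenCV Hough-circle parameters).
import cv2
import numpy as np

circles = cv2.HoughCircles(
    edges, cv2.HOUGH_GRADIENT, dp=1, minDist=int(0.5 * r_est),
    param1=100, param2=20,
    minRadius=int(0.8 * r_est), maxRadius=int(1.2 * r_est))

def hole_roundness_ok(radii_px, d_pix, delta_mm):
    # several radii detected for the same hole: the spread must stay inside the band
    r_mm = np.sort(np.asarray(radii_px, dtype=float)) * d_pix
    return (r_mm[-1] - r_mm[0]) <= delta_mm

if circles is not None:
    print(hole_roundness_ok(circles[0, :, 2], d_pix, delta_mm))
```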
In step 10, the circle-center coordinates (x_ci, y_ci) obtained by the Hough transform of each hole edge in step 9 are compared with the part center (x_c, y_c) obtained in step 6 to judge whether each hole meets the form and position tolerance requirement: if |x_ci - x_c| < Δ the hole is horizontally collinear with the part center; if |y_ci - y_c| < Δ it is vertically collinear; if both hold, the hole and the part are coaxial. The values (x_ci - x_c)*d_pix and (y_ci - y_c)*d_pix are calculated separately to judge whether the position of the hole is consistent with the drawing. The distance from the center of each hole to the part center is defined as
r_di = sqrt((x_ci - x_c)^2 + (y_ci - y_c)^2) * d_pix.
For n holes arranged in a circumferential array, the average distance from the hole centers to the part center is
r_d_mean = (r_d1 + r_d2 + … + r_dn) / n,
and when all r_di lie within the interval [r_d_mean - Δ, r_d_mean + Δ], the holes of the circumferential array are judged to meet the form and position tolerance requirement.
Beneficial effects: compared with existing quality detection methods, this method acquires the part image with an industrial camera, obtains a binary image after preprocessing, and analyzes the binary image to obtain the outer dimensions of the part, the positions of the inner holes, the hole sizes and related information; the form and position tolerance analysis of the part is then completed by computing on this information. The system is simple in structure and efficient in detection, and can be deployed on the part production line to perform form and position tolerance inspection synchronously with production.
Drawings
The foregoing and/or other advantages of the invention will become further apparent from the following detailed description of the invention when taken in conjunction with the accompanying drawings.
FIG. 1 is a schematic diagram of the structure of the present invention in practice.
Fig. 2 is a first calibration diagram of an industrial camera.
Fig. 3 is a second calibration diagram of the industrial camera.
FIG. 4 is a captured raw image of a part.
Fig. 5 is an image of the part after binarization.
Fig. 6 is an image after the part image information is cut.
FIG. 7 is a drawing of a circumscribed rectangle and a minimum circumscribed rectangle of a part.
Fig. 8 shows the data calculated from Fig. 7.
Fig. 9 is a hough transform graph after extraction of the part edge.
Fig. 10 is a schematic diagram of geometric tolerance analysis under hough transform.
Detailed Description
The invention discloses a machine vision-based form and position tolerance detection method for a circular perforated part, which comprises the following steps of:
step 1, constructing a darkroom shadowless illumination system: the system environment is a closed, light-shielded enclosure, and the illumination inside the system consists of an annular light source oriented in the same direction as the camera and a backlight source, providing the illumination environment for image acquisition of the part;
step 2, fixing the position of the camera, fixing the position of the platform, installing a fixed-focus lens, and, after focusing, calibrating the camera with a chessboard calibration plate to complete distortion correction and eliminate the influence of lens distortion on measurement;
step 3, calibrating the camera corrected in step 2 again with the chessboard calibration plate: obtaining pixel coordinates of the corner points by corner detection, calculating the pixel distance between two adjacent corner points, and relating it to the physical distance between the same corner points on the calibration plate to obtain the correspondence between image pixels and actual physical size;
step 4, placing the part in the image acquisition area and acquiring the part image;
step 5, binarizing and filtering the acquired part image to eliminate random noise introduced during acquisition;
step 6, extracting the part region, cropping away useless image information, extracting the connected domains of the image, and taking the center of the part's connected domain as the part center coordinate;
step 7, for each connected region in the image obtained in step 6, calculating the circumscribed rectangle to obtain its two side lengths and the minimum circumscribed rectangle to obtain its two side lengths, deriving 5 radii from the four side lengths, and judging the roundness of the hole: if all the radii lie within the tolerance band, the hole is judged to meet the form and position tolerance requirement;
step 8, calculating the center point of each hole from the circumscribed rectangles of all connected regions obtained in step 7, comparing it with the part center obtained in step 6, and judging whether the position of each hole meets the form and position tolerance requirement;
step 9, extracting the contour of the image obtained in step 6 to obtain the part edge information, applying the Hough circle transform to the edge information to obtain circle-center and radius information, and, if both the center and the radius lie within the tolerance band, judging that the hole meets the form and position tolerance requirement;
step 10, comparing the circle centers obtained in step 9 with the part center obtained in step 6, and judging whether the position of each hole meets the form and position tolerance requirement.
In step 1, the constructed darkroom shadowless illumination system is a closed, light-shielded environment that prevents uneven external ambient light from influencing the measurement. The illumination inside the system consists of an annular light source oriented in the same direction as the camera and a backlight source; both are parallel light sources fitted with diffuse-reflection plates, and together they provide the illumination environment for image acquisition of the part. The annular light source is mounted coaxially with the camera at the top of the system, and its color can be changed as required. The backlight source is embedded in the bottom object table, its upper surface coplanar with the table surface. When the annular light source is used for illumination, bottom liners of different colors can be placed on the object table to obtain a better image acquisition result.
Step 3 comprises: acquiring an image of the chessboard calibration plate and obtaining the pixel coordinates of the corner points by corner detection. Denote the pixel coordinates of two adjacent corner points as (x1, y1) and (x2, y2); if the two corner points are adjacent in the horizontal direction then y1 ≈ y2, and if they are adjacent in the vertical direction then x1 ≈ x2. The Euclidean distance between the two points is
d = sqrt((x1 - x2)^2 + (y1 - y2)^2).
Take n pairs of corner points adjacent in the horizontal or vertical direction, calculate the Euclidean distance d_i of each pair, and sum and average them to obtain the mean corner spacing in the image:
d_mean = (d_1 + d_2 + … + d_n) / n.
With the physical distance l between two adjacent corner points known from the real calibration plate, the physical size corresponding to one pixel is
d_pix = l / d_mean.
In step 6, contour detection is performed on the binary image obtained in step 5 to obtain the outer edge of the part, and its circumscribed rectangle is calculated to obtain the top-left vertex coordinates (x_lt, y_lt) and the side lengths (m_p, n_p) of the circumscribed rectangle. A region with top-left vertex (x_lt + T, y_lt + T) and side lengths (m_p + 2T, n_p + 2T) is then taken as the part image area, where the parameter T = 10 × Δ/d_pix and Δ is the form and position tolerance of the part, and the useless image information outside this region is cropped away. From the calculated circumscribed rectangle, the point (x_lt + 0.5*m_p, y_lt + 0.5*n_p) is taken as the part center coordinate.
In step 7, the two side lengths a and b are obtained from the circumscribed rectangle of the connected domain and the two side lengths c and d from its minimum circumscribed rectangle, and 5 radii are calculated as r_1 = a*d_pix/2, r_2 = b*d_pix/2, r_3 = c*d_pix/2, r_4 = d*d_pix/2 and r_5 = [(a + b + c + d)/4]*d_pix/2. If the first four radii all lie within the tolerance band, i.e. (r_1, r_2, r_3, r_4) ∈ [r_5 - Δ, r_5 + Δ], the part and the hole are judged to meet the geometric tolerance requirement.
In step 8, the circumscribed rectangles of all connected domains are obtained. For each circumscribed rectangle with top-left vertex coordinates (x_ilt, y_ilt) and side lengths a_i and b_i, the center coordinate (x_ci, y_ci) of the connected domain is calculated as x_ci = x_ilt + 0.5*a_i and y_ci = y_ilt + 0.5*b_i. The obtained center coordinates are compared with the part center coordinate (x_c, y_c) obtained in step 6 to judge whether each hole meets the form and position tolerance requirement: if |x_ci - x_c| < Δ the hole is horizontally collinear with the part center; if |y_ci - y_c| < Δ it is vertically collinear; if both hold, the hole and the part are coaxial. The values (x_ci - x_c)*d_pix and (y_ci - y_c)*d_pix are calculated separately to judge whether the position of the hole is consistent with the drawing. The distance from the center of each hole to the part center is defined as
r_di = sqrt((x_ci - x_c)^2 + (y_ci - y_c)^2) * d_pix.
For n holes arranged in a circumferential array, the average distance from the hole centers to the part center is
r_d_mean = (r_d1 + r_d2 + … + r_dn) / n,
and when all r_di lie within the range r_d_mean ± Δ, the holes of the circumferential array are judged to meet the form and position tolerance requirement.
In step 9, the search radius r for the Hough circles is derived from the side lengths of the circumscribed rectangle of the connected domain. For a single hole, given a part form and position tolerance band of ±Δ, the several radii r detected for that hole are sorted to obtain the maximum radius r_max and the minimum radius r_min; if the hole is deformed such that (r_max - r_min) > Δ, the roundness of the hole is judged not to meet the form and position tolerance requirement.
In step 10, the circle-center coordinates (x_ci, y_ci) obtained by the Hough transform of each hole edge in step 9 are compared with the part center (x_c, y_c) obtained in step 6 to judge whether each hole meets the form and position tolerance requirement: if |x_ci - x_c| < Δ the hole is horizontally collinear with the part center; if |y_ci - y_c| < Δ it is vertically collinear; if both hold, the hole and the part are coaxial. The values (x_ci - x_c)*d_pix and (y_ci - y_c)*d_pix are calculated separately to judge whether the position of the hole is consistent with the drawing. The distance from the center of each hole to the part center is defined as
r_di = sqrt((x_ci - x_c)^2 + (y_ci - y_c)^2) * d_pix.
For n holes arranged in a circumferential array, the average distance from the hole centers to the part center is
r_d_mean = (r_d1 + r_d2 + … + r_dn) / n,
and when all r_di lie within the interval [r_d_mean - Δ, r_d_mean + Δ], the holes of the circumferential array are judged to meet the form and position tolerance requirement.
Examples
The invention provides a machine vision-based form and position tolerance detection method for a circular perforated part, which comprises the following steps of:
Step 1: the darkroom environment shown in Fig. 1 is built (1: darkroom; 2: ring-shaped light source; 3: industrial camera; 4: part to be measured; 5: backlight source). The ring-shaped light source at the top is first used for illumination, and a high-precision 12 × 9 aluminum checkerboard calibration plate is photographed; each checkerboard square measures 10 mm × 10 mm.
Step 2: distortion correction of the camera is performed using Zhang's calibration method; the corrected image is shown in Fig. 2.
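For reference, a sketch of Zhang's calibration and distortion correction with OpenCV; the number of views, file names and board geometry are illustrative assumptions, not values prescribed by the patent.

```python
# Sketch of steps 1-2: Zhang's calibration from several checkerboard views.
import cv2
import numpy as np
import glob

PATTERN = (11, 8)                               # inner corners of the 12 x 9 board
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * 10.0  # 10 mm squares

obj_pts, img_pts = [], []
for name in glob.glob("calib_*.png"):           # illustrative file names
    gray = cv2.imread(name, cv2.IMREAD_GRAYSCALE)
    ok, corners = cv2.findChessboardCorners(gray, PATTERN)
    if ok:
        obj_pts.append(objp)
        img_pts.append(corners)

# intrinsic matrix K and distortion coefficients, then undistort a part image
_, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, gray.shape[::-1], None, None)
undistorted = cv2.undistort(cv2.imread("part.png"), K, dist)
```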
Step 3: after correction, the calibration plate is photographed again, the corner points and their coordinates are detected, and the relationship between pixels and physical size is calculated; the calibration result is shown in Fig. 3.
Step 4: with backlight illumination, the part to be measured is placed on the object table and photographed; the captured image is shown in Fig. 4.
Step 5: the image in Fig. 4 is binarized to obtain the binary image shown in Fig. 5.
Step 6: the part image is cropped according to the form and position tolerance band of the part, reducing its size, to obtain the image shown in Fig. 6.
Step 7: the circumscribed rectangle and the minimum circumscribed rectangle of the image obtained in step 6 are calculated (Fig. 7), the form and position tolerances are calculated and analyzed, and it is checked whether the radius of each round hole lies within the tolerance zone; the analysis results are shown in Fig. 8.
Step 8: on the basis of step 7, the center coordinates of the holes are calculated and it is analyzed whether the holes are co-circular; the analysis results are shown in Fig. 8.
Step 9: the contour of the binary image obtained in step 6 is extracted to obtain the edge image shown in Fig. 9, the Hough transform is applied to the edge information, the hole radii are extracted, and it is analyzed whether the radius of each round hole lies within the tolerance zone; the analysis results are shown in Fig. 10.
Step 10: on the basis of step 9, the center coordinates of the holes are calculated and it is analyzed whether the holes are co-circular; the analysis results are shown in Fig. 10.
The present invention provides a machine vision-based method for detecting the form and position tolerances of a circular perforated part. There are many specific ways to implement this technical scheme, and the above description is only a preferred embodiment of the invention. It should be noted that a person skilled in the art can make several improvements and modifications without departing from the principle of the invention, and such improvements and modifications should also be regarded as falling within the protection scope of the invention. All components not specified in this embodiment can be realized with the prior art.

Claims (9)

1. A form and position tolerance detection method for a circular perforated part based on machine vision is characterized by comprising the following steps:
step 1, constructing a darkroom shadowless lighting system, wherein the environment of the system is a closed shading environment;
step 2, fixing the position of the camera, fixing the position of the platform, installing a fixed-focus lens, and calibrating the camera with a chessboard calibration plate after focusing to complete distortion correction;
step 3, calibrating the camera calibrated in the step 2 by using the chessboard calibration board again, obtaining pixel coordinates of angular points through angular point detection, calculating the pixel distance between two adjacent angular points, and performing correlation calculation with the physical size of the same angular point on the chessboard calibration board to obtain the corresponding relation between image pixels and the actual physical size;
step 4, placing the part in an image acquisition area, and acquiring image information of the part;
step 5, carrying out binarization and filtering on the collected part image;
step 6, extracting a part area, cutting useless image information, extracting an image connected domain, and calculating the center of the connected domain as a part center coordinate;
step 7, calculating the circumscribed rectangle of each connected region in the image obtained in step 6 to obtain its two side lengths, calculating the minimum circumscribed rectangle of the connected region to obtain its two side lengths, calculating 5 radii from the four obtained side lengths, and judging the roundness of the hole: if all the radii lie within the tolerance band, judging that the hole meets the requirements of form and position tolerance;
step 8, calculating the center point of each hole in the part from the circumscribed rectangles of all the connected regions obtained in step 7, comparing it with the part center obtained in step 6, and judging whether the position of each hole in the part meets the requirement of form and position tolerance;
step 9, extracting the contour of the image obtained in the step 6 to obtain edge information of the part, then carrying out Hough circle transformation on the edge information to obtain the center and radius information of the circle, and if the data of the center and the radius are both in a tolerance band, judging that the hole meets the requirements of form and position tolerance;
and 10, comparing the circle center obtained in the step 9 with the part center obtained in the step 6, and judging whether the position of each hole in the part meets the requirement of form and position tolerance.
2. The method according to claim 1, wherein in step 1, the in-system illumination is composed of a self-contained annular light source and a backlight source which are in the same direction as the camera, the two groups of light sources are parallel light sources, and the two groups of light sources are provided with diffuse reflection plates and used for providing an illumination environment for image acquisition of the part; the annular light source and the camera are coaxially arranged at the top of the system, and the color of the light source can be changed as required; the backlight source is embedded in the bottom object placing table, and the upper surface of the backlight source is coplanar with the plane of the bottom object placing table; when the annular light source is adopted for illumination, the bottom lining with different colors can be replaced on the bottom object placing table.
3. The method of claim 2, wherein step 3 comprises: acquiring an image of the chessboard calibration plate and obtaining the pixel coordinates of the corner points by corner detection; denoting the pixel coordinates of two adjacent corner points as (x1, y1) and (x2, y2), where y1 ≈ y2 if the two corner points are adjacent in the horizontal direction and x1 ≈ x2 if they are adjacent in the vertical direction; calculating the Euclidean distance between the two points as
d = sqrt((x1 - x2)^2 + (y1 - y2)^2);
taking n pairs of corner points adjacent in the horizontal or vertical direction, calculating the Euclidean distance d_i of each pair, and summing and averaging them to obtain the mean corner spacing in the image
d_mean = (d_1 + d_2 + … + d_n) / n;
and obtaining the physical distance l between two adjacent corner points from the real calibration plate and calculating the physical size corresponding to one pixel as
d_pix = l / d_mean.
4. The method of claim 3, wherein step 5 comprises: if the acquired image is a color image, converting it to a grayscale image using gray = 0.299R + 0.587G + 0.114B, where R, G and B are the red, green and blue components of a pixel of the color image before conversion and gray is the pixel value of the converted grayscale image, and performing no conversion if the acquired image is already grayscale; determining a threshold for the grayscale image by the maximum between-class variance method, setting pixels greater than the threshold to 1 and all other pixels to 0, the binarization of the part image being complete once all pixels have been converted; and traversing every pixel of the binary image, selecting the 8 pixels surrounding it, padding with 0 if the pixel lies at the image border and fewer than 8 neighbors exist, sorting the 9 values consisting of the traversed pixel and its neighbors in descending order, and taking the middle value as the new value of the traversed pixel in place of its original value, thereby filtering the noise in the image and completing the filtering of the binary image.
5. The method according to claim 4, wherein in step 6, contour detection is performed on the binary image obtained in step 5 to obtain the outer edge of the part, and its circumscribed rectangle is calculated to obtain the top-left vertex coordinates (x_lt, y_lt) and the side lengths (m_p, n_p); a region with top-left vertex (x_lt + T, y_lt + T) and side lengths (m_p + 2T, n_p + 2T) is taken as the part image area, where the parameter T = 10 × Δ/d_pix and Δ is the form and position tolerance of the part, and the useless image information outside this region is cropped away; and the point (x_lt + 0.5*m_p, y_lt + 0.5*n_p) calculated from the circumscribed rectangle is taken as the part center coordinate.
6. The method according to claim 5, wherein in step 7, the two side lengths a and b are obtained from the circumscribed rectangle of the connected domain and the two side lengths c and d from its minimum circumscribed rectangle, and 5 radii are calculated as r_1 = a*d_pix/2, r_2 = b*d_pix/2, r_3 = c*d_pix/2, r_4 = d*d_pix/2 and r_5 = [(a + b + c + d)/4]*d_pix/2; if the first four radii all lie within the tolerance band, i.e. (r_1, r_2, r_3, r_4) ∈ [r_5 - Δ, r_5 + Δ], the part and the hole are judged to meet the geometric tolerance requirement.
7. The method of claim 6, wherein in step 8, the circumscribed rectangles of all connected domains are obtained; for each circumscribed rectangle with top-left vertex coordinates (x_ilt, y_ilt) and side lengths a_i and b_i, the center coordinate (x_ci, y_ci) of the connected domain is calculated as x_ci = x_ilt + 0.5*a_i and y_ci = y_ilt + 0.5*b_i; the obtained center coordinates are compared with the part center coordinate (x_c, y_c) obtained in step 6 to judge whether each hole meets the form and position tolerance requirement: if |x_ci - x_c| < Δ the hole is horizontally collinear with the part center, if |y_ci - y_c| < Δ it is vertically collinear, and if both hold the hole and the part are coaxial; the values (x_ci - x_c)*d_pix and (y_ci - y_c)*d_pix are calculated separately to judge whether the position of the hole is consistent with the drawing; the distance from the center of each hole to the part center is defined as
r_di = sqrt((x_ci - x_c)^2 + (y_ci - y_c)^2) * d_pix;
for n holes arranged in a circumferential array, the average distance from the hole centers to the part center is
r_d_mean = (r_d1 + r_d2 + … + r_dn) / n,
and when all r_di lie within the interval [r_d_mean - Δ, r_d_mean + Δ], the holes of the circumferential array are judged to meet the form and position tolerance requirement.
8. The method as claimed in claim 7, wherein in step 9, the search radius r for the Hough circles is derived from the side lengths of the circumscribed rectangle of the connected domain; for a single hole, given a part form and position tolerance band of ±Δ, the detected radii r are sorted to obtain the maximum radius r_max and the minimum radius r_min, and if the hole is deformed such that (r_max - r_min) > Δ, the roundness of the hole is judged not to meet the form and position tolerance requirement.
9. The method according to claim 8, wherein in step 10, the circle-center coordinates (x_ci, y_ci) obtained by the Hough transform of each hole edge in step 9 are compared with the part center (x_c, y_c) obtained in step 6 to judge whether each hole meets the form and position tolerance requirement: if |x_ci - x_c| < Δ the hole is horizontally collinear with the part center, if |y_ci - y_c| < Δ it is vertically collinear, and if both hold the hole and the part are coaxial; the values (x_ci - x_c)*d_pix and (y_ci - y_c)*d_pix are calculated separately to judge whether the position of the hole is consistent with the drawing; the distance from the center of each hole to the part center is defined as
r_di = sqrt((x_ci - x_c)^2 + (y_ci - y_c)^2) * d_pix;
for n holes arranged in a circumferential array, the average distance from the hole centers to the part center is
r_d_mean = (r_d1 + r_d2 + … + r_dn) / n,
and when all r_di lie within the interval [r_d_mean - Δ, r_d_mean + Δ], the holes of the circumferential array are judged to meet the form and position tolerance requirement.
CN202010920126.6A 2020-09-04 2020-09-04 Circular perforated part form and position tolerance detection method based on machine vision Active CN112001917B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010920126.6A CN112001917B (en) 2020-09-04 2020-09-04 Circular perforated part form and position tolerance detection method based on machine vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010920126.6A CN112001917B (en) 2020-09-04 2020-09-04 Circular perforated part form and position tolerance detection method based on machine vision

Publications (2)

Publication Number Publication Date
CN112001917A (en) 2020-11-27
CN112001917B CN112001917B (en) 2024-08-09

Family

ID=73469861

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010920126.6A Active CN112001917B (en) 2020-09-04 2020-09-04 Circular perforated part form and position tolerance detection method based on machine vision

Country Status (1)

Country Link
CN (1) CN112001917B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112529869A (en) * 2020-12-11 2021-03-19 中国航空工业集团公司金城南京机电液压工程研究中心 Valve sleeve throttling square hole detection method
CN113379688A (en) * 2021-05-28 2021-09-10 慕贝尔汽车部件(太仓)有限公司 Stabilizer bar hole deviation detection method and system based on image recognition
CN113592955A (en) * 2021-07-27 2021-11-02 中国科学院西安光学精密机械研究所 Circular workpiece plane coordinate high-precision positioning method based on machine vision
CN113781481A (en) * 2021-11-11 2021-12-10 滨州学院 Method and device for non-contact measurement of shape and size of object and electronic equipment
CN113899318A (en) * 2021-09-09 2022-01-07 信利光电股份有限公司 Device and method for detecting edge-to-edge distance of frame glue
CN114663427A (en) * 2022-04-25 2022-06-24 北京与子成科技有限公司 Boiler part size detection method based on image processing
WO2023188553A1 (en) * 2022-03-29 2023-10-05 株式会社島津製作所 Atomic absorption spectrophotometer

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010061201A (en) * 2008-09-01 2010-03-18 Nec Saitama Ltd Alignment mark image recognition device and alignment mark image recognition method
JP5222430B1 (en) * 2012-10-19 2013-06-26 株式会社イノテック Dimension measuring apparatus, dimension measuring method and program for dimension measuring apparatus
CN103471531A (en) * 2013-09-27 2013-12-25 吉林大学 On-line non-contact measurement method for straightness of axis parts
CN104897062A (en) * 2015-06-26 2015-09-09 北方工业大学 Visual measurement method and device for shape and position deviation of part non-coplanar parallel holes
WO2015154487A1 (en) * 2014-04-09 2015-10-15 华南理工大学 Grouped holes verticality detection system and method based on visual measurement
CN107101582A (en) * 2017-07-03 2017-08-29 吉林大学 Axial workpiece run-out error On-line Measuring Method based on structure light vision
CN109540084A (en) * 2018-10-25 2019-03-29 北京航天控制仪器研究所 The measurement method and device of part 3 d pose in a kind of supernatant liquid
KR20190051463A (en) * 2017-11-07 2019-05-15 현대모비스 주식회사 Apparatus and method for detecting checkerboard corner point for camera calibration
CN109785316A (en) * 2019-01-22 2019-05-21 湖南大学 A kind of apparent defect inspection method of chip
CN109974582A (en) * 2019-04-04 2019-07-05 江南大学 A kind of the conductor diameters non-contact vision detection device and method of automotive wire bundle

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010061201A (en) * 2008-09-01 2010-03-18 Nec Saitama Ltd Alignment mark image recognition device and alignment mark image recognition method
JP5222430B1 (en) * 2012-10-19 2013-06-26 株式会社イノテック Dimension measuring apparatus, dimension measuring method and program for dimension measuring apparatus
CN103471531A (en) * 2013-09-27 2013-12-25 吉林大学 On-line non-contact measurement method for straightness of axis parts
WO2015154487A1 (en) * 2014-04-09 2015-10-15 华南理工大学 Grouped holes verticality detection system and method based on visual measurement
CN104897062A (en) * 2015-06-26 2015-09-09 北方工业大学 Visual measurement method and device for shape and position deviation of part non-coplanar parallel holes
CN107101582A (en) * 2017-07-03 2017-08-29 吉林大学 Axial workpiece run-out error On-line Measuring Method based on structure light vision
KR20190051463A (en) * 2017-11-07 2019-05-15 현대모비스 주식회사 Apparatus and method for detecting checkerboard corner point for camera calibration
CN109540084A (en) * 2018-10-25 2019-03-29 北京航天控制仪器研究所 The measurement method and device of part 3 d pose in a kind of supernatant liquid
CN109785316A (en) * 2019-01-22 2019-05-21 湖南大学 A kind of apparent defect inspection method of chip
CN109974582A (en) * 2019-04-04 2019-07-05 江南大学 A kind of the conductor diameters non-contact vision detection device and method of automotive wire bundle

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
董芳凯; 郑智贞; 袁少飞; 白云鑫; 张余升: "Research on a machine-vision-based coaxiality measurement system for valve parts" (基于机器视觉的阀门零件同轴度测量系统研究), 组合机床与自动化加工技术 (Modular Machine Tool & Automatic Manufacturing Technique), no. 11, 20 November 2017 (2017-11-20) *
谢红; 廖志杰; 邢廷文: "A non-contact dimensional inspection method for circular-hole parts" (一种非接触式的圆孔形零件尺寸检测), 电子设计工程 (Electronic Design Engineering), no. 19, 5 October 2016 (2016-10-05) *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112529869A (en) * 2020-12-11 2021-03-19 中国航空工业集团公司金城南京机电液压工程研究中心 Valve sleeve throttling square hole detection method
CN112529869B (en) * 2020-12-11 2023-07-21 中国航空工业集团公司金城南京机电液压工程研究中心 Valve sleeve throttling square hole detection method
CN113379688A (en) * 2021-05-28 2021-09-10 慕贝尔汽车部件(太仓)有限公司 Stabilizer bar hole deviation detection method and system based on image recognition
CN113379688B (en) * 2021-05-28 2023-12-08 慕贝尔汽车部件(太仓)有限公司 Stabilizer bar hole deviation detection method and system based on image recognition
CN113592955A (en) * 2021-07-27 2021-11-02 中国科学院西安光学精密机械研究所 Circular workpiece plane coordinate high-precision positioning method based on machine vision
CN113592955B (en) * 2021-07-27 2024-04-09 中国科学院西安光学精密机械研究所 Round workpiece plane coordinate high-precision positioning method based on machine vision
CN113899318A (en) * 2021-09-09 2022-01-07 信利光电股份有限公司 Device and method for detecting edge-to-edge distance of frame glue
CN113781481A (en) * 2021-11-11 2021-12-10 滨州学院 Method and device for non-contact measurement of shape and size of object and electronic equipment
WO2023188553A1 (en) * 2022-03-29 2023-10-05 株式会社島津製作所 Atomic absorption spectrophotometer
CN114663427A (en) * 2022-04-25 2022-06-24 北京与子成科技有限公司 Boiler part size detection method based on image processing

Also Published As

Publication number Publication date
CN112001917B (en) 2024-08-09

Similar Documents

Publication Publication Date Title
CN112001917B (en) Circular perforated part form and position tolerance detection method based on machine vision
CN109141232B (en) Online detection method for disc castings based on machine vision
CN108921176B (en) Pointer instrument positioning and identifying method based on machine vision
CN103499297B (en) A kind of high-precision measuring method based on CCD
CN105160652A (en) Handset casing testing apparatus and method based on computer vision
CN102288613B (en) Surface defect detecting method for fusing grey and depth information
CN105783723B (en) Precision die surface processing accuracy detection device and method based on machine vision
CN102135236B (en) Automatic non-destructive testing method for internal wall of binocular vision pipeline
CN102589435B (en) Efficient and accurate detection method of laser beam center under noise environment
CN113570605B (en) Defect detection method and system based on liquid crystal display panel
CN102032875B (en) Image-processing-based cable sheath thickness measuring method
CN109900711A (en) Workpiece, defect detection method based on machine vision
CN105335963A (en) Edge defect detection method and apparatus
CN103345755A (en) Chessboard angular point sub-pixel extraction method based on Harris operator
CN107796826B (en) Micro duplicate gear broken tooth defect detection method based on tooth center distance curve analysis
CN112686920A (en) Visual measurement method and system for geometric dimension parameters of circular part
CN114280075B (en) Online visual detection system and detection method for surface defects of pipe parts
CN113970560B (en) Defect three-dimensional detection method based on multi-sensor fusion
CN111815575B (en) Bearing steel ball part detection method based on machine vision
Du et al. A method of dimension measurement for spur gear based on machine vision
CN114688969A (en) Optical lens size detection device and method
CN116519640A (en) Method for measuring surface glossiness of silica gel key based on machine vision system
CN115575407A (en) Detection method applied to track and tunnel
CN114034471A (en) Method for measuring laser light path profile
CN113793321A (en) Casting surface defect dynamic detection method and device based on machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant