CN113222955A - Gear size parameter automatic measurement method based on machine vision - Google Patents


Info

Publication number
CN113222955A
Authority
CN
China
Prior art keywords
gear
image
radius
edge
circle
Prior art date
Legal status
Pending
Application number
CN202110565114.0A
Other languages
Chinese (zh)
Inventor
余建波
周俊杰
Current Assignee
Tongji University
Original Assignee
Tongji University
Priority date
Filing date
Publication date
Application filed by Tongji University filed Critical Tongji University
Priority to CN202110565114.0A priority Critical patent/CN113222955A/en
Publication of CN113222955A publication Critical patent/CN113222955A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T5/70
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T7/66 Analysis of geometric attributes of image moments or centre of gravity
    • G06T7/70 Determining position or orientation of objects or cameras

Abstract

The invention belongs to the technical field of machine vision measurement and provides an automatic, machine-vision-based method for measuring gear dimensional parameters. The method is highly automated and reaches micron-level measurement precision; it completes the measurement of the basic gear parameters quickly and accurately, can be used for real-time industrial measurement of gear products, and realizes fully automatic, intelligent measurement.

Description

Gear size parameter automatic measurement method based on machine vision
Technical Field
The invention belongs to the technical field of machine vision measurement, and particularly relates to a gear size parameter automatic measurement method based on machine vision.
Background
Gears are among the most widely used parts in industrial products, and their accuracy has a crucial influence on the smoothness, precision, and life of gear-based motion transmission systems. Quality control has therefore become a major issue in the inspection of gear manufacturing processes. At present, gear parameters are mainly measured by traditional contact methods: manual measurement uses instruments such as vernier calipers, gear-tooth micrometers, and contourgraphs. Contact measurement requires path planning and probe-radius compensation for different gear surfaces, and the workpiece surface may even be damaged during measurement. In the actual industrial field, gear parameters are generally obtained with intelligent inspection instruments and matching software, such as CNC gear measurement centers, on-line gear measurement and sorting machines, gear meshing testers, and coordinate measuring machines (CMMs). These approaches require manual operation, involve high labor intensity and demanding technical skill, are strongly affected by human factors, are not efficient, and rely on expensive inspection equipment with high maintenance costs. A non-contact, high-precision, high-efficiency, low-cost vision-based gear parameter measurement method is therefore urgently needed.
Disclosure of Invention
The present invention has been made to solve the above problems, and an object of the present invention is to provide a method for automatically measuring gear dimensional parameters based on machine vision.
The invention provides a method for automatically measuring gear dimensional parameters based on machine vision, characterized by comprising the following steps: step S1, acquiring an image of the gear with an image acquisition device; step S2, preprocessing the gear image to remove noise, obtaining a preprocessed gear image; step S3, performing threshold segmentation on the preprocessed gear image, obtaining a gear binary image through morphological operations, and performing coarse localization by Canny edge detection and fine localization by the Zernike-moment sub-pixel edge method on the gear binary image to obtain a gear edge image; step S4, determining the gear center o of the gear from the gear edge image by the centroid method, then calculating, with a boundary tracking algorithm and least-squares fitting, the numbers of pixels occupied by the addendum circle radius r_a and the root circle radius r_f in the gear edge image; step S5, completing the pixel-equivalent calibration with a calibration plate, and multiplying the pixel equivalent by the numbers of pixels occupied by the addendum circle radius r_a and the root circle radius r_f to obtain the gear's addendum circle radius r_a and root circle radius r_f; and step S6, masking the gear binary image with a circular mask template to obtain a gear-tooth image containing only the gear teeth, then labeling the connected domains of the tooth image and counting them to obtain the number of teeth z, wherein, in step S3, the threshold segmentation of the preprocessed gear image uses a Fisher-criterion segmentation algorithm based on Laplacian edge information.
In the method for automatically measuring the gear dimensional parameters based on machine vision, the method can further have the following characteristics: in step S4, the gear center o of the gear, determined by the centroid method, serves as the reference for measuring the gear dimensional parameters, and the centroid method proceeds as follows. Step S4-1-1: input the gear edge image f(x, y) and set an initial circle center (x_0, y_0). Step S4-1-2: scan the gear edge image line by line, recording the number n of pixels with gray value 1 and the coordinates (x_i, y_i) of each such pixel. Step S4-1-3: calculate the gear center o:
x_o = (1/n) · Σ_{i=1}^{n} x_i,  y_o = (1/n) · Σ_{i=1}^{n} y_i
In the method for automatically measuring the gear dimensional parameters based on machine vision, the method can further have the following characteristics: in step S4, the measurement flow for the addendum circle radius r_a and the root circle radius r_f of the gear is as follows. Step S4-2-1: input the gear edge image and determine the gear center o of the gear. Step S4-2-2: select a starting point on the edge of the gear edge image, calculate and record the distance from each pixel point on the contour to the gear center o using a boundary tracking algorithm, store the image coordinates corresponding to the maximum distances in array A, and store the image coordinates corresponding to the minimum distances in array B. Step S4-2-3: fit circles to the coordinates in arrays A and B by least squares to obtain the numbers of pixels occupied by the addendum circle radius r_a and the root circle radius r_f of the gear in the gear edge image.
In the method for automatically measuring the gear dimensional parameters based on machine vision, the method can further have the following characteristics: in step S6, the radius of the circular mask template is greater than the root circle radius r_f and smaller than the addendum circle radius r_a.
In the method for automatically measuring the gear dimension parameters based on the machine vision, the method can further have the following characteristics: in step S2, the preprocessing includes piecewise linear gray-scale transformation and adaptive median filtering; piecewise linear gray scale transformation is used to enhance the gear region; an adaptive median filtering method is used to remove noise and smooth the image.
In the method for automatically measuring the gear dimensional parameters based on machine vision, the method can further have the following characteristics: in step S3, the morphological operations include erosion and dilation, which deburr the preprocessed gear image so that the image is refined.
In the method for automatically measuring the gear dimension parameters based on the machine vision, the method can further have the following characteristics: in step S1, the image capturing device includes a CMOS industrial area-array camera, a three-color spherical light source, a camera holder, and a computer.
In the method for automatically measuring the gear dimensional parameters based on machine vision, the method can further have the following characteristics: in step S5, the calibration plate is a checkerboard calibration plate, and the pixel equivalent is calibrated as follows: focus on and shoot the calibration plate with the CMOS industrial area-array camera to obtain the number n of pixels occupied by each cell of the calibration plate; with the actual size of each cell being l, the pixel equivalent is k = l/n.
Action and Effect of the invention
According to the machine-vision-based automatic gear measurement method, the Fisher-criterion segmentation algorithm based on Laplacian edge information strengthens the edge influence of the region of interest while taking the whole image into account, so the target region in the image can be highlighted, which provides the conditions for accurately extracting the gear edge later. Because the invention uses the Zernike-moment sub-pixel edge method with an adaptively determined step-gray threshold condition, manual selection of the step-gray threshold is avoided: the gear edge can be located automatically and precisely, labor cost is reduced, efficiency is improved, and the error introduced by subjective manual threshold selection is eliminated. The invention also provides an algorithm that determines the gear center by the centroid method, computes the addendum and root circle radii with a boundary tracking algorithm and least-squares fitting, and extracts the tooth connected domains with a mask, labeling and counting them to obtain the number of teeth; this algorithm measures the gear parameters quickly and accurately and can be applied to production-line inspection.
In conclusion, the method is non-contact, avoids damaging parts and extends their service life, and is simple to operate, low in cost, highly automated and efficient, micron-level in measurement accuracy, and highly stable.
Drawings
FIG. 1 is a flow chart of the machine-vision-based automatic gear dimensional parameter measurement method of the present invention; and
FIG. 2 is an image processing result of the measurement process of the automatic gear dimension parameter measurement method based on machine vision according to the present invention.
Detailed Description
In order to make the technical means, the creation features, the achievement purposes and the effects of the invention easy to understand, the following describes a method for automatically measuring the gear dimension parameters based on machine vision in detail with reference to the embodiments and the accompanying drawings.
< example >
Fig. 1 is a flowchart of the automatic measurement method of gear dimension parameters based on machine vision according to the present embodiment, and fig. 2 is an image processing result of the measurement process of the automatic measurement method of gear dimension parameters based on machine vision according to the present embodiment.
As shown in fig. 1 and 2, a method for automatically measuring gear dimension parameters based on machine vision includes the following steps:
and step S1, acquiring an image of the gear by using the image acquisition device.
In this step, the image acquisition device comprises a CMOS industrial area-array camera, a three-color spherical light source, a camera support and a computer. The embodiment is described only by the image capturing device comprising a CMOS industrial area-array camera, a three-color spherical light source, a camera holder and a computer, but other image capturing devices can achieve the same technical effect.
The specific process of the step is as follows: the CMOS industrial area-array camera is arranged at a set position, namely right above the gear, the illumination intensity and the tone of a three-color spherical light source are adjusted, the focal length of a lens is adjusted to enable imaging to be clear, and the collected gear image is selected and stored in a computer connected with the camera.
And step S2, preprocessing the gear image to remove noise and reduce interference, and obtaining a preprocessed gear image.
In this step, the preprocessing includes piecewise linear gray-scale transformation and adaptive median filtering: the piecewise linear gray-scale transformation enhances the gear region, and the adaptive median filtering removes noise and smooths the image.
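As an illustration of this preprocessing stage, the sketch below implements both operations in plain NumPy. The function names, breakpoints, and window sizes are my own choices for demonstration, not taken from the patent; a production system would normally use an optimized library such as OpenCV.

```python
import numpy as np

def piecewise_linear(img, r1, s1, r2, s2):
    """Piecewise linear gray transform: [0,r1]->[0,s1], [r1,r2]->[s1,s2], [r2,255]->[s2,255]."""
    f = img.astype(np.float64)
    out = np.empty_like(f)
    lo, mid, hi = f < r1, (f >= r1) & (f <= r2), f > r2
    out[lo] = f[lo] * s1 / r1
    out[mid] = s1 + (f[mid] - r1) * (s2 - s1) / (r2 - r1)
    out[hi] = s2 + (f[hi] - r2) * (255 - s2) / (255 - r2)
    return np.clip(out, 0, 255).astype(np.uint8)

def adaptive_median(img, max_win=7):
    """Adaptive median filter: grow the window until the median is not an impulse;
    replace the pixel only if it is itself an impulse (min or max of the window)."""
    pad = max_win // 2
    p = np.pad(img, pad, mode='edge')
    out = img.copy()
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            for win in range(3, max_win + 1, 2):
                r = win // 2
                region = p[y + pad - r:y + pad + r + 1, x + pad - r:x + pad + r + 1]
                zmin, zmed, zmax = region.min(), int(np.median(region)), region.max()
                if zmin < zmed < zmax:          # median is not an impulse
                    if not (zmin < img[y, x] < zmax):
                        out[y, x] = zmed        # pixel is an impulse: replace it
                    break
            else:                               # window reached maximum size
                out[y, x] = zmed
    return out
```

For a gear image, the breakpoints (r1, s1) and (r2, s2) would be chosen to stretch the gray range occupied by the gear region.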
And step S3, performing threshold segmentation on the preprocessed gear image, obtaining a gear binary image through morphological operations, and performing coarse localization by Canny edge detection and fine localization by the Zernike-moment sub-pixel edge method on the gear binary image to obtain the gear edge image.
In this step, a Fisher-criterion segmentation algorithm based on Laplacian edge information performs the threshold segmentation: the preprocessed gear image is divided into a background part and a target part, and the Fisher evaluation function selects the optimal threshold that maximizes the separation between the two classes of pixels. Selecting a threshold t divides an image with L gray levels into two classes, target C_0 and background C_1, where C_0 consists of all pixels with gray values in [0, t] and C_1 of all pixels with gray values in [t+1, L−1]. The Fisher evaluation function is then:
J(t) = w_0 · w_1 · (u_0 − u_1)^2 / (w_0 · σ_0^2 + w_1 · σ_1^2)
In the formula, w_0 and w_1 are the proportions of classes C_0 and C_1, u_0 and u_1 their mean gray values, and σ_0^2 and σ_1^2 their variances. When J reaches its maximum, the two classes of pixels are maximally separated, and the corresponding t is the selected optimal threshold.
The Fisher criterion segmentation algorithm based on Laplacian edge information comprises the following steps:
Step 1: Input the image f(x, y) and compute the absolute value of its Laplacian to obtain an edge image.
Step 2: Specify a high-value threshold T and threshold the edge image from Step 1 with T, retaining only the larger Laplacian values, to generate a binary marker image g_T(x, y).
Step 3: Multiply the original image f(x, y) by the marker image g_T(x, y), selecting from f(x, y) only the pixels corresponding to strong edges (larger gray values), to obtain the image f_T(x, y) = f(x, y) · g_T(x, y).
Step 4: Compute the histogram of the non-zero pixels of f_T(x, y) and apply the Fisher criterion to this histogram to select the threshold that segments the image.
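The four steps above can be sketched as follows. This is a minimal NumPy illustration under stated assumptions: the high-value threshold T is taken as a quantile of the Laplacian magnitude, and the Fisher evaluation function is assumed to be the weighted form J(t) = w0·w1·(u0 − u1)^2 / (w0·σ0^2 + w1·σ1^2); the patent does not publish its exact code.

```python
import numpy as np

def laplacian_abs(img):
    """|Laplacian| via the 4-neighbour stencil (edge image, Step 1)."""
    f = img.astype(np.float64)
    p = np.pad(f, 1, mode='edge')
    lap = p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4 * f
    return np.abs(lap)

def fisher_threshold(values, levels=256):
    """Choose t maximizing J(t) = w0*w1*(u0-u1)^2 / (w0*s0^2 + w1*s1^2)."""
    hist = np.bincount(values.ravel(), minlength=levels).astype(np.float64)
    prob = hist / hist.sum()
    g = np.arange(levels)
    best_t, best_j = 0, -1.0
    for t in range(1, levels - 1):
        w0, w1 = prob[:t + 1].sum(), prob[t + 1:].sum()
        if w0 == 0 or w1 == 0:
            continue
        u0 = (g[:t + 1] * prob[:t + 1]).sum() / w0
        u1 = (g[t + 1:] * prob[t + 1:]).sum() / w1
        s0 = ((g[:t + 1] - u0) ** 2 * prob[:t + 1]).sum() / w0
        s1 = ((g[t + 1:] - u1) ** 2 * prob[t + 1:]).sum() / w1
        denom = w0 * s0 + w1 * s1
        j = w0 * w1 * (u0 - u1) ** 2 / denom if denom > 0 else np.inf
        if j > best_j:
            best_j, best_t = j, t
    return best_t

def laplace_fisher_segment(img, high_quantile=0.9):
    """Steps 1-4: keep pixels near strong Laplacian edges, Fisher-threshold them."""
    edge = laplacian_abs(img)
    T = np.quantile(edge, high_quantile)   # high-value threshold T (Step 2)
    marker = edge >= T                     # marker image g_T
    strong = img[marker]                   # pixels selected by f * g_T (Step 3)
    t = fisher_threshold(strong)           # Step 4
    return (img > t).astype(np.uint8)
```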
In this step, the morphological operations include erosion and dilation; they deburr the preprocessed gear image, i.e., remove holes from it, so that the image is refined.
In this step, Canny edge detection performs the coarse edge localization; the calculation proceeds as follows:
Step 1: Smooth the image with a Gaussian filter to suppress noise:
g(x, y) = G(x, y) * f(x, y),  G(x, y) = (1/(2πσ^2)) · exp(−(x^2 + y^2)/(2σ^2))
Step 2: Calculate the gradient magnitude M(i, j) and gradient direction θ(i, j) of each pixel (i, j) from the first-order finite-difference partial derivatives f_x and f_y:
M(i, j) = sqrt( f_x(i, j)^2 + f_y(i, j)^2 )
θ(i, j) = arctan( f_y(i, j) / f_x(i, j) )
Step 3: Apply non-maximum suppression to the gradient magnitudes to thin the edges.
Step 4: Set dual thresholds T_1 and T_2 (T_1 < T_2) to detect and connect edges. For each pixel (i, j) of the edge image, if its gradient magnitude exceeds T_2 it is accepted as an edge point; if it is below T_1 it is rejected; if it lies between the two thresholds, the decision is made from the connectivity of neighboring edge points.
In this step, the Zernike-moment sub-pixel edge method uses an adaptive step-gray threshold: the between-class variance of the pixel step-gray values of the gear image's target and background regions is taken as the objective function, the step-gray value of each pixel is taken as the maximum difference between that pixel's gray value and those of its eight neighbors, and, following the principle of the Otsu threshold method, the step-gray value at which the between-class variance is maximal is adopted as the adaptively selected threshold.
The Zernike-moment sub-pixel edge detection algorithm locates edges by computing three Zernike moments of different orders, denoted Z_00, Z_11, and Z_20, whose integral kernels are V_00 = 1, V_11 = x + iy, and V_20 = 2x^2 + 2y^2 − 1; under a rotation by α they satisfy Z_00 = Z′_00, Z_11 = Z′_11 · e^{iα}, and Z_20 = Z′_20. From the definition of the Zernike moments and these rotation relations, Z_00, Z_11, and Z_20 can be computed and the four parameters of the ideal step edge model solved simultaneously, where h is the gray value of the image background, t is the step gray value between the target and the background, L is the true object edge, d is the distance from the origin to the edge line L, and α is the angle between the perpendicular d and the x axis:
α = arctan( Im[Z_11] / Re[Z_11] )
d = Z_20 / Z′_11
t = 3 · Z′_11 / ( 2 · (1 − d^2)^{3/2} )
h = ( Z_00 − t·π/2 + t·arcsin(d) + t·d·√(1 − d^2) ) / π
the algorithm may use convolution of the template with the gray scale to calculate the Zernike moments, taking into account the template magnification effect, with an N × N size template covering the template center N when convolved while moving over the image2In this case, the radius of the unit circle is N/2, so that the vertical distance d needs to be enlarged by N/2 times to obtain the sub-pixel edge point (x)s,ys) Of co-ordinatesAnd (3) correcting the formula:
Figure BDA0003080673970000091
the edge can be refined by optimizing the threshold condition of the distance from the origin to the edge line, and the method for adaptively determining the threshold value of the step gray level is provided by artificially selecting the threshold value of the step gray level. The threshold condition for the Zernike algorithm to locate the sub-pixel edge points is as follows:
t ≥ t_z and d ≤ d_z
In the formula, t_z is the threshold on the step gray value and d_z is the threshold on the origin-to-edge-line distance. The length 2·d_z should be less than one pixel; conventional algorithms take d_z = √2/2. Considering the template effect, the threshold condition on the distance from the origin to the edge line becomes d ≤ 2·d_z/N. The value of t_z is usually chosen from manual experience: if the chosen t_z is too small, many false edges appear in the detection result; if it is too large, false edges are reduced but information of the true edge is lost. A better result can be obtained by repeatedly adjusting t_z, but this is inefficient and its precision is hard to guarantee.
In an image, the gray value changes stepwise at edges, where the step gray value is also large; in the background and target regions the gray value varies smoothly and the step gray value is small. The step gray value therefore follows the same trend as the gray value itself. Using the principle of the Otsu method, the between-class variance of the pixel step-gray values of the gear image's target and background regions is taken as the index, the step gray value of each pixel is taken as the maximum difference between that pixel's gray value and those of its eight neighbors, and the t_z at which the between-class variance is maximal is taken as the step-gray threshold.
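The adaptive selection of t_z can be sketched as follows. This is a NumPy sketch: the eight-neighborhood maximum-difference definition of the step gray value follows the text, while the Otsu scan is the standard formulation; the function names are my own.

```python
import numpy as np

def step_gray(img):
    """Step gray value of each pixel: max absolute difference to its 8 neighbours."""
    f = img.astype(np.int32)
    p = np.pad(f, 1, mode='edge')
    diffs = [np.abs(p[1 + dy:1 + dy + f.shape[0], 1 + dx:1 + dx + f.shape[1]] - f)
             for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)]
    return np.max(diffs, axis=0)

def otsu(values, levels=256):
    """Otsu scan: choose the value maximizing between-class variance w0*w1*(u0-u1)^2."""
    hist = np.bincount(values.ravel(), minlength=levels).astype(np.float64)
    prob = hist / hist.sum()
    g = np.arange(levels)
    best_t, best_v = 0, -1.0
    for t in range(levels - 1):
        w0, w1 = prob[:t + 1].sum(), prob[t + 1:].sum()
        if w0 == 0 or w1 == 0:
            continue
        u0 = (g[:t + 1] * prob[:t + 1]).sum() / w0
        u1 = (g[t + 1:] * prob[t + 1:]).sum() / w1
        v = w0 * w1 * (u0 - u1) ** 2
        if v > best_v:
            best_v, best_t = v, t
    return best_t

def adaptive_step_threshold(img):
    """t_z chosen automatically as the Otsu threshold of the step-gray image."""
    return otsu(step_gray(img))
```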
Step S4, determining the gear center o of the gear from the gear edge image by the centroid method, then calculating, with a boundary tracking algorithm and least-squares fitting, the numbers of pixels occupied by the addendum circle radius r_a and the root circle radius r_f in the gear edge image.
In this step, the centroid method determines the gear center o, which serves as the reference for measuring the gear parameters; it proceeds as follows:
Step S4-1-1: Input the gear edge image f(x, y) and set an initial circle center (x_0, y_0).
Step S4-1-2: Scan the gear edge image line by line, recording the number n of pixels with gray value 1 and the coordinates (x_i, y_i) of each such pixel.
Step S4-1-3: Calculate the gear center o:
x_o = (1/n) · Σ_{i=1}^{n} x_i,  y_o = (1/n) · Σ_{i=1}^{n} y_i
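The centroid computation of steps S4-1-1 to S4-1-3 reduces to averaging the coordinates of the edge pixels; a minimal NumPy sketch (my own helper, not the patent's code):

```python
import numpy as np

def gear_center(edge_img):
    """Centroid of the edge pixels (gray value 1): steps S4-1-1 to S4-1-3."""
    ys, xs = np.nonzero(edge_img)        # coordinates (x_i, y_i) of the n edge pixels
    n = xs.size
    return xs.sum() / n, ys.sum() / n    # (x_o, y_o)
```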
In this step, the flow for measuring the numbers of pixels occupied by the addendum circle radius r_a and the root circle radius r_f in the gear edge image is as follows:
Step S4-2-1: Input the gear edge image f(x, y) and determine the gear center o of the gear.
Step S4-2-2: Select a starting point on the gear edge, calculate and record the distance from each pixel point on the contour to the gear center o using a boundary tracking algorithm, store the image coordinates corresponding to the maximum distances in array A, and store the image coordinates corresponding to the minimum distances in array B.
Step S4-2-3: Fit circles to the coordinates in arrays A and B by least squares to obtain the numbers of pixels occupied by the addendum circle radius r_a and the root circle radius r_f in the gear edge image.
Step S5, completing the pixel-equivalent calibration with a checkerboard calibration plate, and multiplying the pixel equivalent by the numbers of pixels occupied by the addendum circle radius and the root circle radius to obtain the gear's addendum circle radius r_a and root circle radius r_f.
In this step, the specific pixel-equivalent calibration process is: focus on and shoot the calibration plate with the CMOS industrial area-array camera to obtain the number n of pixels occupied by each cell of the calibration plate; with the actual size of each cell being l, the pixel equivalent is k = l/n.
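Assuming the checkerboard's inner-corner image coordinates have already been detected (for example with OpenCV's cv2.findChessboardCorners), the pixel equivalent k = l/n can be computed from the mean cell spacing; pixel_equivalent below is a hypothetical helper for illustration:

```python
import numpy as np

def pixel_equivalent(corners, cell_size_mm):
    """k = l / n: actual cell size divided by the mean cell width in pixels.
    `corners` is an (rows, cols, 2) array of inner-corner image coordinates."""
    dx = np.linalg.norm(np.diff(corners, axis=1), axis=-1)  # horizontal spacings
    dy = np.linalg.norm(np.diff(corners, axis=0), axis=-1)  # vertical spacings
    n = np.concatenate([dx.ravel(), dy.ravel()]).mean()     # pixels per cell
    return cell_size_mm / n
```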
And step S6, masking the gear binary image with a circular mask template to obtain a gear-tooth image containing only the gear teeth, then labeling the connected domains of the tooth image and counting them to obtain the number of teeth z.
In this step, the circular mask template has radius (r_a + r_f)/2 and is centered at the point o. This embodiment takes the radius as (r_a + r_f)/2, but any radius greater than the root circle radius r_f and smaller than the addendum circle radius r_a achieves the same technical effect. The procedure for determining the module m and the reference circle radius r is as follows:
From the measured addendum circle radius r_a and root circle radius r_f, the gear module m can be found (for a standard gear, the addendum m plus the dedendum 1.25·m gives r_a − r_f = 2.25·m):
m = (r_a − r_f) / 2.25
Because the gear module is standardized, the standard value with the smallest difference from the calculated result is taken as the module of the measured gear; the reference circle radius is then calculated as:
r = m·z / 2.
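The tooth counting of step S6 and the module lookup can be sketched as follows. This is an illustrative NumPy sketch: the connected-domain labeling is a plain 4-connected flood fill, STANDARD_MODULES is an abbreviated subset of the standard module series, and the relation m = 2·r_a/(z + 2) assumes a standard gear with addendum coefficient 1; none of this is the patent's exact procedure.

```python
import numpy as np

def count_teeth(binary, center, r_mask):
    """Mask out everything inside the circle of radius r_mask around `center`,
    then count 4-connected components among the remaining (tooth) pixels."""
    h, w = binary.shape
    yy, xx = np.mgrid[0:h, 0:w]
    teeth = binary.astype(bool) & ((xx - center[0]) ** 2 + (yy - center[1]) ** 2 > r_mask ** 2)
    labels = np.zeros((h, w), dtype=int)
    nlab = 0
    for y in range(h):
        for x in range(w):
            if teeth[y, x] and labels[y, x] == 0:
                nlab += 1                      # new connected domain found
                stack = [(y, x)]
                labels[y, x] = nlab
                while stack:                   # flood fill the domain
                    cy, cx = stack.pop()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and teeth[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = nlab
                            stack.append((ny, nx))
    return nlab

STANDARD_MODULES = [1, 1.25, 1.5, 2, 2.5, 3, 4, 5, 6, 8, 10]  # abbreviated subset

def module_and_pitch_radius(ra_mm, z):
    """m = 2*r_a/(z+2) for a standard gear, snapped to the nearest standard module."""
    m_raw = 2 * ra_mm / (z + 2)
    m = min(STANDARD_MODULES, key=lambda s: abs(s - m_raw))
    return m, m * z / 2
```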
To sum up, the addendum circle radius r_a, the root circle radius r_f, the number of teeth z, the module m, and the reference circle radius r are all the gear dimensional parameters required by the machine-vision-based automatic measurement method, where the module m and the reference circle radius r are calculated from the addendum circle radius r_a, the root circle radius r_f, and the number of teeth z.
Effects and effects of the embodiments
According to the machine-vision-based automatic gear measurement method of this embodiment, the Fisher-criterion segmentation algorithm based on Laplacian edge information strengthens the edge influence of the region of interest while taking the whole image into account, so the target region in the image can be highlighted, which provides the conditions for accurately extracting the gear edge later. Because this embodiment uses the Zernike-moment sub-pixel edge method with an adaptively determined step-gray threshold condition, manual selection of the step-gray threshold is avoided: the gear edge can be located automatically and precisely, labor cost is reduced, efficiency is improved, and the error introduced by subjective manual threshold selection is eliminated. This embodiment also provides an algorithm that determines the gear center by the centroid method, computes the addendum and root circle radii with a boundary tracking algorithm and least-squares fitting, and extracts the tooth connected domains with a mask, labeling and counting them to obtain the number of teeth; this algorithm measures the gear parameters quickly and accurately and can be applied to production-line inspection.
In conclusion, the method is non-contact, avoids damaging parts and extends their service life, and is simple to operate, low in cost, highly automated and efficient, micron-level in measurement accuracy, and highly stable.
Furthermore, the gear image is preprocessed by the embodiment, so that noise is removed, interference is reduced, the image quality of the gear image is improved, and the accuracy of a measurement result is improved.
Furthermore, the radius of the circular mask template used in this embodiment is greater than the root circle radius r_f and smaller than the addendum circle radius r_a, so the tooth image obtained after masking contains only the gear teeth, which makes counting the teeth convenient and improves measurement efficiency.
The above embodiments are preferred examples of the present invention, and are not intended to limit the scope of the present invention.

Claims (8)

1. A gear size parameter automatic measurement method based on machine vision, characterized by comprising the following steps:
step S1, acquiring an image of the gear with an image acquisition device;
step S2, preprocessing the gear image to remove noise, obtaining a preprocessed gear image;
step S3, performing threshold segmentation on the preprocessed gear image, obtaining a gear binary image through morphological operations, and performing coarse localization by Canny edge detection and fine localization by the Zernike-moment sub-pixel edge method on the gear binary image to obtain a gear edge image;
step S4, determining the gear center o of the gear from the gear edge image by a centroid method, and then calculating, with a boundary tracking algorithm and least-squares fitting, the numbers of pixels occupied by the addendum circle radius r_a and the root circle radius r_f in the gear edge image;
step S5, completing the pixel-equivalent calibration with a calibration plate, and multiplying the pixel equivalent by the number of pixels occupied by the addendum circle radius r_a and by the number of pixels occupied by the root circle radius r_f, respectively, to obtain the addendum circle radius r_a and the root circle radius r_f of the gear;
step S6, masking the gear binary image with a circular mask template to obtain a gear-tooth image containing only the gear teeth, then labeling the connected domains of the tooth image and counting them to obtain the number of teeth z,
wherein, in step S3, the threshold segmentation of the preprocessed gear image uses a Fisher-criterion segmentation algorithm based on Laplacian edge information.
2. The machine-vision-based gear dimensional parameter automatic measurement method of claim 1, wherein:
in step S4, determining a gear center O of the gear by using a centroid method as a reference for measuring a gear size parameter, wherein the process of determining the gear center O of the gear by using the centroid method is as follows:
step S4-1-1: inputting the edge image f(x, y) of the gear, and setting an initial circle center (x_0, y_0);
step S4-1-2: scanning the gear edge image line by line, and recording the number n of pixels with gray value 1 and the coordinates (x_i, y_i) of each such pixel;
Step S4-1-3: calculating the gear center O:
Figure FDA0003080673960000021
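The centroid computation of steps S4-1-1 to S4-1-3 can be sketched as follows; the rasterized test circle and the function name are illustrative.

```python
import numpy as np

def gear_center(edge):
    """Centroid method: collect the n pixels of gray value 1 in the binary
    edge image and average their coordinates to get the gear center O."""
    ys, xs = np.nonzero(edge)              # coordinates (x_i, y_i) of edge pixels
    n = xs.size
    return xs.sum() / n, ys.sum() / n      # (x_O, y_O)

# discrete circle of radius 20 centred at (x, y) = (50, 40)
theta = np.linspace(0, 2 * np.pi, 720, endpoint=False)
edge = np.zeros((100, 100), dtype=np.uint8)
edge[np.round(40 + 20 * np.sin(theta)).astype(int),
     np.round(50 + 20 * np.cos(theta)).astype(int)] = 1
x0, y0 = gear_center(edge)
```

Because the edge pixels of a circle are symmetric about its centre, the centroid recovers the centre to sub-pixel accuracy.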
3. The machine-vision-based gear dimensional parameter automatic measurement method of claim 2, wherein:
wherein, in step S4, the process of measuring the addendum circle radius r_a and the root circle radius r_f of the gear is as follows:
step S4-2-1: inputting the gear edge image, and determining the gear center O of the gear;
step S4-2-2: selecting a starting point on the edge of the gear edge image, traversing the contour with a boundary tracking algorithm while computing the distance from each contour pixel to the gear center O, and recording the image coordinates at which this distance reaches a local maximum to array A and those at which it reaches a local minimum to array B;
step S4-2-3: fitting circles to the coordinates in array A and array B respectively by the least-squares method, to obtain the numbers of pixels occupied in the gear edge image by the addendum circle radius r_a and the root circle radius r_f of the gear.
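The least-squares circle fit of step S4-2-3 can be sketched with the algebraic (Kåsa) formulation, one common way to fit a circle to the coordinates stored in arrays A and B; the noisy sample data are synthetic.

```python
import numpy as np

def fit_circle(xs, ys):
    """Algebraic (Kasa) least-squares circle fit: solve the linear system
    x^2 + y^2 = 2a x + 2b y + c for centre (a, b), then recover the
    radius from r^2 = c + a^2 + b^2."""
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    b = xs ** 2 + ys ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx ** 2 + cy ** 2)
    return cx, cy, r

# noisy samples of a circle with centre (3, -2) and radius 5
rng = np.random.default_rng(1)
t = rng.uniform(0, 2 * np.pi, 200)
xs = 3 + 5 * np.cos(t) + rng.normal(0, 0.01, t.size)
ys = -2 + 5 * np.sin(t) + rng.normal(0, 0.01, t.size)
cx, cy, r = fit_circle(xs, ys)
```

The Kåsa fit is linear and fast; geometric (orthogonal-distance) fitting gives slightly better results on short arcs but needs iteration.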
4. The machine-vision-based gear dimensional parameter automatic measurement method of claim 1, wherein:
wherein, in step S6, the radius of the circular mask template is larger than the root circle radius r_f and smaller than the addendum circle radius r_a.
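Step S6 can be illustrated on a synthetic 12-tooth gear: masking away everything inside a circle whose radius lies between r_f and r_a removes the gear body and leaves one isolated blob per tooth, which a minimal connected-domain labelling then counts. All shapes, radii, and names below are illustrative, not taken from the patent.

```python
import numpy as np
from collections import deque

def count_components(mask):
    """Count 4-connected foreground components in a binary image
    (a minimal stand-in for connected-domain labelling)."""
    seen = np.zeros(mask.shape, dtype=bool)
    h, w = mask.shape
    count = 0
    for sy, sx in zip(*np.nonzero(mask)):
        if seen[sy, sx]:
            continue
        count += 1                       # new component found
        q = deque([(sy, sx)])
        seen[sy, sx] = True
        while q:                         # breadth-first flood fill
            y, x = q.popleft()
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                    seen[ny, nx] = True
                    q.append((ny, nx))
    return count

# synthetic binary gear: tip radius r_a = 80, root radius r_f = 60, 12 teeth
h = w = 201
yy, xx = np.mgrid[0:h, 0:w].astype(float)
r = np.hypot(xx - 100, yy - 100)
phi = np.arctan2(yy - 100, xx - 100)
profile = np.where(np.cos(12 * phi) > 0, 80.0, 60.0)
gear = (r <= profile).astype(np.uint8)

# circular mask of radius 70 (between r_f and r_a): keep only the teeth
teeth_only = gear * (r > 70)
z = count_components(teeth_only)        # number of teeth
```

Choosing the mask radius strictly between r_f and r_a is what guarantees the teeth separate into disjoint components, matching the condition of this claim.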
5. The machine-vision-based gear dimensional parameter automatic measurement method of claim 1, wherein:
in step S2, the preprocessing includes piecewise linear gray-scale transformation and adaptive median filtering;
the piecewise linear gray scale transformation is used for enhancing a gear region;
the adaptive median filtering method is used to remove noise and smooth images.
6. The machine-vision-based gear dimensional parameter automatic measurement method of claim 1, wherein:
wherein, in step S3, the morphological operation includes erosion and dilation for deburring the segmented gear image so that the gear contour is refined.
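Erosion followed by dilation (a morphological opening) is the standard way such burrs are removed; a minimal binary version with a square structuring element, on an illustrative synthetic blob:

```python
import numpy as np

def erode(img, k=3):
    """Binary erosion with a k x k square structuring element: a pixel
    survives only if its whole k x k neighbourhood is foreground."""
    p = k // 2
    padded = np.pad(img, p, constant_values=0)
    out = np.ones_like(img)
    for dy in range(k):
        for dx in range(k):
            out &= padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def dilate(img, k=3):
    """Binary dilation: a pixel fires if any pixel in its k x k
    neighbourhood is foreground."""
    p = k // 2
    padded = np.pad(img, p, constant_values=0)
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out |= padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

# opening (erode then dilate) removes a 1-pixel burr from a 6x6 blob
img = np.zeros((10, 10), dtype=np.uint8)
img[2:8, 2:8] = 1       # solid blob
img[1, 4] = 1           # burr on the edge
opened = dilate(erode(img))
```

The blob itself survives the opening unchanged while the burr, thinner than the structuring element, is eliminated.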
7. The machine-vision-based gear dimensional parameter automatic measurement method of claim 1, wherein:
in step S1, the image capturing device includes a CMOS industrial area-array camera, a three-color spherical light source, a camera holder, and a computer.
8. The machine-vision-based gear dimensional parameter automatic measurement method of claim 7, wherein:
in step S5, the calibration board is a checkerboard calibration board, and the specific process of calibrating the pixel equivalent is as follows: focusing and shooting the calibration plate through the CMOS industrial area-array camera to obtain the number n of pixels occupied by each small cell of the calibration plate, wherein the actual size of each small cell of the calibration plate is l, and the pixel equivalent k is l/n.
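The pixel-equivalent arithmetic of this claim, with hypothetical numbers (the cell size l and pixel count n below are illustrative, not taken from the patent):

```python
# Pixel-equivalent calibration: each checkerboard cell has actual size
# l (here 2.0 mm, hypothetical) and spans n pixels (here 40) in the
# focused image, so one pixel corresponds to k = l / n millimetres.
l_mm = 2.0
n_px = 40
k = l_mm / n_px          # pixel equivalent, mm per pixel

# converting a fitted tip-circle radius from pixels to millimetres (step S5)
r_a_px = 512.7           # radius in pixels from the least-squares fit
r_a_mm = k * r_a_px
```

The same factor k converts the root-circle radius, so both physical radii come from one calibration shot.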
CN202110565114.0A 2021-05-24 2021-05-24 Gear size parameter automatic measurement method based on machine vision Pending CN113222955A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110565114.0A CN113222955A (en) 2021-05-24 2021-05-24 Gear size parameter automatic measurement method based on machine vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110565114.0A CN113222955A (en) 2021-05-24 2021-05-24 Gear size parameter automatic measurement method based on machine vision

Publications (1)

Publication Number Publication Date
CN113222955A true CN113222955A (en) 2021-08-06

Family

ID=77098099

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110565114.0A Pending CN113222955A (en) 2021-05-24 2021-05-24 Gear size parameter automatic measurement method based on machine vision

Country Status (1)

Country Link
CN (1) CN113222955A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114494045A (en) * 2022-01-10 2022-05-13 南京工大数控科技有限公司 Large-scale straight gear geometric parameter measuring system and method based on machine vision
CN114972338A (en) * 2022-07-26 2022-08-30 武汉工程大学 Machine vision measurement method for fault of running gear of high-speed rail motor train unit
CN115035107A (en) * 2022-08-10 2022-09-09 山东正阳机械股份有限公司 Axle gear working error detection method based on image processing

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106017350A (en) * 2016-07-05 2016-10-12 沈阳工业大学 Machine-vision-based rapid detection device and detection method for medium and small module gears
CN110793462A (en) * 2019-11-15 2020-02-14 中北大学 Nylon gear reference circle measuring method based on vision technology
CN111578838A (en) * 2020-05-25 2020-08-25 安徽工业大学 Gear size visual measurement device and measurement method
CN111750789A (en) * 2020-06-08 2020-10-09 北京工业大学 Tooth pitch deviation and tooth profile deviation evaluation method in small module gear vision measurement


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
JUNJIE ZHOU 等: "Chisel edge wear measurement of high-speed steel twist drills based on machine vision", 《COMPUTERS IN INDUSTRY》 *
YANG, Yuntao et al.: "Automatic measurement system of gear geometric parameters based on machine vision inspection technology", Metrology & Measurement Technique *
OUYANG, Hui et al.: "Research on circle feature extraction algorithms for high-resolution remote sensing imagery", Urban Geotechnical Investigation & Surveying *
WANG, Ning et al.: "Research on vision measurement method of total gear profile deviation", Journal of Mechanical Transmission *
GUO, Jin et al.: "Sub-pixel center location algorithm in machine vision calibration", Transducer and Microsystem Technologies *
GAO, Zhiqiang et al.: "Research on machine-vision-based nylon gear inspection", Modular Machine Tool & Automatic Manufacturing Technique *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114494045A (en) * 2022-01-10 2022-05-13 南京工大数控科技有限公司 Large-scale straight gear geometric parameter measuring system and method based on machine vision
CN114494045B (en) * 2022-01-10 2024-04-16 南京工大数控科技有限公司 Large spur gear geometric parameter measurement system and method based on machine vision
CN114972338A (en) * 2022-07-26 2022-08-30 武汉工程大学 Machine vision measurement method for fault of running gear of high-speed rail motor train unit
CN115035107A (en) * 2022-08-10 2022-09-09 山东正阳机械股份有限公司 Axle gear working error detection method based on image processing
CN115035107B (en) * 2022-08-10 2022-11-08 山东正阳机械股份有限公司 Axle gear working error detection method based on image processing

Similar Documents

Publication Publication Date Title
CN109141232B (en) Online detection method for disc castings based on machine vision
CN113222955A (en) Gear size parameter automatic measurement method based on machine vision
CN107341802B (en) Corner sub-pixel positioning method based on curvature and gray scale compounding
CN108007388A (en) A kind of turntable angle high precision online measuring method based on machine vision
CN111062940B (en) Screw positioning and identifying method based on machine vision
CN104112269A (en) Solar cell laser-marking parameter detection method based on machine vision and system thereof
CN111583114B (en) Automatic measuring device and measuring method for pipeline threads
CN112686920A (en) Visual measurement method and system for geometric dimension parameters of circular part
CN109000583B (en) System and method for efficient surface measurement using laser displacement sensors
CN114897864A (en) Workpiece detection and defect judgment method based on digital-analog information
CN113324478A (en) Center extraction method of line structured light and three-dimensional measurement method of forge piece
CN111354047B (en) Computer vision-based camera module positioning method and system
CN115096206A (en) Part size high-precision measurement method based on machine vision
CN112907556A (en) Automatic measuring method for abrasion loss of rotary cutter based on machine vision
CN108802051B (en) System and method for detecting bubble and crease defects of linear circuit of flexible IC substrate
CN115861217A (en) System and method for detecting defects of circuit board of backlight plate based on vision
CN113554616A (en) Online measurement guiding method and system based on numerical control machine tool
CN111815580A (en) Image edge identification method and small module gear module detection method
CN116880353A (en) Machine tool setting method based on two-point gap
CN111815575A (en) Bearing steel ball part detection method based on machine vision
Lee et al. Development of an On-Machine External Thread Measurement System for CNC Lathes Using Eye-in-Hand Machine Vision with Morphology Technology.
CN114485433A (en) Three-dimensional measurement system, method and device based on pseudo-random speckles
CN106841231B (en) Visual precision measurement system and method for tiny parts
Yu et al. A Machine vision method for non-contact Tool Wear Inspection
CN113538399A (en) Method for obtaining accurate contour of workpiece, machine tool and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210806