CN114092403A - Grinding wheel wear detection method and system based on machine vision - Google Patents
- Publication number
- CN114092403A (application CN202111240656.7A)
- Authority
- CN
- China
- Prior art keywords
- image
- grinding wheel
- gray
- edge
- value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T7/0004—Industrial image inspection
- G06T5/70
- G06T7/13—Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T2207/10024—Color image
- G06T2207/20032—Median filtering
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; Machine component
Abstract
The invention relates to a grinding wheel wear detection method and system based on machine vision, the method comprising the following steps: calibrating the internal and external parameters of a camera; placing the workpiece within the camera's field of view and capturing an image of it; converting the captured image into a grayscale image; converting the grayscale image into a binary image using the Otsu threshold segmentation method; removing image noise with median filtering; performing sub-pixel edge detection using Zernike moments; determining a region of interest; fitting a curve to the detected edge; and judging the wear condition of the grinding wheel. The invention can detect the wear condition of the grinding wheel quickly and efficiently, determine whether the wheel needs dressing, and improve grinding quality.
Description
Technical Field
The invention belongs to the technical field of intelligent grinding wheel wear detection, and particularly relates to a grinding wheel wear detection method and system based on machine vision.
Background
Grinding achieves high machining precision and low surface roughness, so it is widely used in the machinery manufacturing industry for machining high-precision workpieces. In grinding, the workpiece is cut by a grinding wheel rotating at high speed. The grinding wheel is a consumable: as machining proceeds, its abrasive grains continuously fall off, changing the shape of the wheel, reducing the machining precision and surface quality of the workpiece, and lowering machining efficiency. Detecting the wear condition of the grinding wheel is therefore of significant engineering value for improving part quality.
At present, grinding wheel wear is generally assessed from worker experience, judged from the sound produced during machining and the surface quality of the workpiece. This requires highly experienced workers, is strongly subjective, and increases labor cost. Other existing methods detect wear from signals such as acoustic emission, current, and force, but these are qualitative rather than quantitative, and therefore suffer from inaccurate detection.
Disclosure of Invention
The invention provides a grinding wheel wear detection method and system based on machine vision, addressing the problem that wear of the grinding wheel during grinding changes its shape and reduces the machining precision and quality of the workpiece. The invention offers high detection precision and speed, and realizes accurate detection of grinding wheel wear.
The invention adopts the following technical scheme:
a grinding wheel wear detection method based on machine vision comprises the following steps:
(1) calibrating internal and external parameters of a camera;
(2) placing the workpiece in the visual field range of a camera, and shooting an image of the workpiece;
(3) converting the collected image into a gray image;
(4) converting the image into a binary image by using an Otsu threshold segmentation method;
(5) removing noise existing in the image by using median filtering;
(6) using Zernike moments to perform sub-pixel edge detection;
(7) Determining a region of interest;
(8) performing curve fitting on the detected edge;
(9) and judging the abrasion condition of the grinding wheel.
Preferably, in step (3), the captured RGB image is converted into a grayscale image by weighted averaging, with the R, G, B components of each pixel weighted as: Gray = 0.2989·R + 0.5870·G + 0.1140·B.
Preferably, in step (5), the pixels in the neighborhood are sorted by gray level using median filtering, and the median is taken as the output pixel, so as to remove the noise existing in the image.
Preferably, step (7) selects the curve of the workpiece root as the region of interest.
Preferably, in step (8), the pixel coordinates of the root curve are extracted and curve fitting is performed using the least square method.
Preferably, in step (9), the wear condition of the grinding wheel is judged from the maximum curvature of the fitted curve. The smaller the curvature, the more severely the wheel is worn; when the curvature falls below a set threshold, the wheel needs dressing.
Preferably, step (4) uses an Otsu threshold segmentation method, comprising the steps of:
Let the total number of image pixels be N and the gray-scale range be [0, L−1]. The number of pixels with gray level i is N_i, and its probability is p_i = N_i/N;
the pixels in the image are divided into two classes by a gray threshold T: the number of pixels with gray values in [0, T] is N_min, and the number of pixels with gray values in [T+1, L−1] is N_max;
the between-class variance is calculated as: σ² = p_min·(avg_total − avg_min)² + p_max·(avg_total − avg_max)², where p_min = N_min/N and p_max = N_max/N, avg_min and avg_max are the mean gray levels of the two classes, and avg_total is the mean gray level of the whole image;
let T take each value in [0, L−1] in turn; the value of T that maximizes σ² is the optimal threshold of Otsu segmentation.
Preferably, step (6), sub-pixel edge detection using Zernike moments, comprises the following steps:
the Zernike moment of the image is Z_nm = ΣΣ f(x, y)·V*_nm(ρ, θ), where V*_nm(ρ, θ) is the complex conjugate of the Zernike polynomial V_nm(ρ, θ) and f(x, y) is the gray value of point (x, y) in the image;
an ideal step-edge model is established, where θ denotes the angle of the edge relative to the x axis, t the background gray level, r the step amplitude, (x, y) the coordinates of the circle center, and l the perpendicular distance from the center to the edge; after rotating the edge by −θ, the Zernike moments of the original and rotated images satisfy: Z′00 = Z00, Z′11 = Z11·e^(−jθ), Z′20 = Z20;
the parameters of the edge model are then calculated as: θ = arctan(Im(Z11)/Re(Z11)), l = Z20/Z′11, r = 3·Z′11/(2·(1 − l²)^(3/2)), from which the sub-pixel edge coordinates follow as (x_s, y_s) = (x, y) + (N·l/2)·(cos θ, sin θ), with N the side length of the detection template.
the invention also discloses a grinding wheel wear detection system based on machine vision, which comprises the following modules:
a camera calibration module: calibrating internal and external parameters of a camera;
an image acquisition module: shooting an image of a workpiece placed in a camera view range;
an image graying module: converting the collected image into a gray image;
otsu threshold segmentation module: converting the image into a binary image by using an Otsu threshold segmentation method;
a filtering module: removing noise existing in the image by using median filtering;
a sub-pixel edge detection module: using Zernike moments to perform sub-pixel edge detection;
a region of interest determination module: determining a region of interest;
a curve fitting module: performing curve fitting on the detected edge;
and a wear judging module: and judging the abrasion condition of the grinding wheel.
Preferably, the image graying module converts the captured RGB image into a grayscale image by weighted averaging, with the R, G, B components of each pixel weighted as: Gray = 0.2989·R + 0.5870·G + 0.1140·B.
Preferably, the filtering module sorts the pixels in the neighborhood by gray level using median filtering and takes the median as the output pixel, removing noise from the image.
Preferably, the region of interest determination module: and selecting the curve of the root of the workpiece as the region of interest.
Preferably, the curve fitting module: and extracting pixel coordinates of the root curve, and performing curve fitting by using a least square method.
Preferably, the wear judging module judges the wear condition of the grinding wheel from the maximum curvature of the fitted curve. The smaller the curvature, the more severely the wheel is worn; when the curvature falls below a set threshold, the wheel needs dressing.
Preferably, the Otsu threshold segmentation module, using an Otsu threshold segmentation method, comprises the following steps:
Let the total number of image pixels be N and the gray-scale range be [0, L−1]. The number of pixels with gray level i is N_i, and its probability is p_i = N_i/N;
the pixels in the image are divided into two classes by a gray threshold T: the number of pixels with gray values in [0, T] is N_min, and the number of pixels with gray values in [T+1, L−1] is N_max;
the between-class variance is calculated as: σ² = p_min·(avg_total − avg_min)² + p_max·(avg_total − avg_max)², where p_min = N_min/N and p_max = N_max/N, avg_min and avg_max are the mean gray levels of the two classes, and avg_total is the mean gray level of the whole image;
let T take each value in [0, L−1] in turn; the value of T that maximizes σ² is the optimal threshold of Otsu segmentation.
Preferably, the sub-pixel edge detection module, using zernike moment sub-pixel edge detection, comprises the following steps:
the Zernike moment of the image is Z_nm = ΣΣ f(x, y)·V*_nm(ρ, θ), where V*_nm(ρ, θ) is the complex conjugate of the Zernike polynomial V_nm(ρ, θ) and f(x, y) is the gray value of point (x, y) in the image;
an ideal step-edge model is established, where θ denotes the angle of the edge relative to the x axis, t the background gray level, r the step amplitude, (x, y) the coordinates of the circle center, and l the perpendicular distance from the center to the edge; after rotating the edge by −θ, the Zernike moments of the original and rotated images satisfy: Z′00 = Z00, Z′11 = Z11·e^(−jθ), Z′20 = Z20;
the parameters of the edge model are then calculated as: θ = arctan(Im(Z11)/Re(Z11)), l = Z20/Z′11, r = 3·Z′11/(2·(1 − l²)^(3/2)), from which the sub-pixel edge coordinates follow as (x_s, y_s) = (x, y) + (N·l/2)·(cos θ, sin θ), with N the side length of the detection template.
the machine vision adopted by the invention is a detection technology utilizing an optical principle, and the detection method adopting the machine vision can carry out quantitative analysis on a detection object, so that the problem that workers judge the grinding wheel to be highly subjective in wear by experience is avoided, and the specific grinding wheel wear condition can be obtained. The detection technical scheme of the invention has the advantages of simple equipment, low cost, high detection speed and stable and reliable work.
The invention provides a grinding wheel wear detection method and a grinding wheel wear detection system based on machine vision, which have the following beneficial effects:
(1) the grinding wheel does not need to be detached in detection, the abrasion condition of the grinding wheel is judged by detecting a workpiece, and the detection is convenient.
(2) The shape change caused by abrasion of the grinding wheel can be quantitatively detected.
(3) The operator can be reminded in time whether the grinding wheel needs dressing.
(4) The detection equipment is simple, the cost is low, and the detection speed is high.
Drawings
FIG. 1 is a schematic view of a work piece being processed.
FIG. 2 is a schematic view of the corner wear of the grinding wheel.
FIG. 3 is a detection flow chart.
Fig. 4 is a drawing of the edge extraction of a workpiece.
FIG. 5 shows the workpiece root region to be inspected.
FIG. 6 is a graph of a root curve fit.
Fig. 7 is a block diagram of the system of the present invention.
Detailed Description
The details and embodiments of the present invention are further described below with reference to the accompanying drawings.
As shown in figs. 1 and 2, during grinding the workpiece is held in a fixed position while the grinding wheel rotates at high speed and machines it along the feed direction shown. As machining progresses, the corner of the grinding wheel wears, leaving a curved section at the root of the machined workpiece. This curve therefore needs to be detected precisely.
Referring to fig. 3, the method for detecting abrasion of a grinding wheel based on machine vision in the embodiment includes the following steps:
(1) The camera captures 15 checkerboard images in different poses, and the internal, external, and distortion parameters of the camera are calibrated using MATLAB software.
(2) The workpiece is placed within the camera's field of view, and an image of the workpiece to be inspected is captured.
(3) The captured image is first converted to grayscale. In this embodiment, the RGB image is converted by weighted averaging, with the R, G, B components of each pixel weighted as: Gray = 0.2989·R + 0.5870·G + 0.1140·B.
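As an illustrative sketch (the function and array names below are not from the patent), this weighted-average conversion can be written in Python with NumPy:

```python
import numpy as np

def rgb_to_gray(rgb):
    """Weighted-average grayscale conversion: Gray = 0.2989*R + 0.5870*G + 0.1140*B."""
    weights = np.array([0.2989, 0.5870, 0.1140])
    # The matrix product sums the weighted R, G, B channels for every pixel.
    return (rgb.astype(np.float64) @ weights).round().astype(np.uint8)

# A 1x2 RGB test image: one pure-red pixel and one pure-white pixel.
img = np.array([[[255, 0, 0], [255, 255, 255]]], dtype=np.uint8)
gray = rgb_to_gray(img)  # red -> 76, white -> 255
```

These are the ITU-R BT.601 luma weights; OpenCV's `cv2.cvtColor(img, cv2.COLOR_RGB2GRAY)` applies the same coefficients.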
(4) The optimal threshold is calculated using the Otsu segmentation method and the grayscale image is converted into a binary image. The concrete steps are as follows:
Let the total number of image pixels be N and the gray-scale range be [0, L−1]. The number of pixels with gray level i is N_i, and its probability is p_i = N_i/N.
The pixels in the image are divided into two classes by a gray threshold T: the number of pixels with gray values in [0, T] is N_min, and the number of pixels with gray values in [T+1, L−1] is N_max.
The between-class variance is calculated as: σ² = p_min·(avg_total − avg_min)² + p_max·(avg_total − avg_max)², where p_min = N_min/N and p_max = N_max/N, avg_min and avg_max are the mean gray levels of the two classes, and avg_total is the mean gray level of the whole image.
Let T take each value in [0, L−1] in turn; the value of T that maximizes σ² is the optimal threshold of Otsu segmentation.
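The exhaustive threshold search can be sketched in NumPy as follows; this is a minimal version that assumes 8-bit input, with illustrative names:

```python
import numpy as np

def otsu_threshold(gray, L=256):
    """Return the threshold T in [0, L-1] that maximizes the between-class
    variance sigma^2 = p_min*(avg_total-avg_min)^2 + p_max*(avg_total-avg_max)^2."""
    levels = np.arange(L)
    p = np.bincount(gray.ravel(), minlength=L) / gray.size  # p_i = N_i / N
    avg_total = np.sum(levels * p)
    best_T, best_var = 0, -1.0
    for T in range(L - 1):
        p_min = p[:T + 1].sum()
        p_max = 1.0 - p_min
        if p_min == 0.0 or p_max == 0.0:
            continue  # one class is empty; variance undefined
        avg_min = np.sum(levels[:T + 1] * p[:T + 1]) / p_min
        avg_max = np.sum(levels[T + 1:] * p[T + 1:]) / p_max
        var = p_min * (avg_total - avg_min) ** 2 + p_max * (avg_total - avg_max) ** 2
        if var > best_var:
            best_var, best_T = var, T
    return best_T

# Two-level test image: a dark half (gray 50) and a bright half (gray 200).
img = np.concatenate([np.full(100, 50, np.uint8), np.full(100, 200, np.uint8)])
T = otsu_threshold(img)
binary = (img > T).astype(np.uint8) * 255
```

In practice, `cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)` performs the same search in one call.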
(5) Before edge extraction, the image is denoised to reduce noise and improve the edge extraction result. Median filtering is used: for each pixel, the gray values of the pixel and its surrounding neighbors are sorted, and the middle value of the sorted sequence becomes the pixel's new value. This method preserves edge features without introducing edge blurring.
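A minimal sketch of that sort-the-neighborhood-and-take-the-middle operation (pure NumPy, names illustrative; production code would typically use `cv2.medianBlur` or `scipy.ndimage.median_filter`):

```python
import numpy as np

def median_filter(img, k=3):
    """Replace each pixel with the median of its k x k neighborhood,
    i.e. sort the neighborhood gray values and take the middle one."""
    pad = k // 2
    padded = np.pad(img, pad, mode='edge')  # replicate the border pixels
    out = np.empty_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = np.median(padded[y:y + k, x:x + k])
    return out

# A single salt-noise pixel in an otherwise black image is removed.
noisy = np.zeros((5, 5), np.uint8)
noisy[2, 2] = 255
clean = median_filter(noisy)  # the 255 outlier is replaced by 0
```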
(6) A Zernike moment sub-pixel edge detection algorithm is used to extract the workpiece edge in the image; the extraction result is shown in fig. 4. The sub-pixel edge coordinates are calculated as follows:
The Zernike moment of the image is Z_nm = ΣΣ f(x, y)·V*_nm(ρ, θ), where V*_nm(ρ, θ) is the complex conjugate of the Zernike polynomial V_nm(ρ, θ) and f(x, y) is the gray value of point (x, y) in the image.
An ideal step-edge model is established, where θ denotes the angle of the edge relative to the x axis, t the background gray level, r the step amplitude, (x, y) the coordinates of the circle center, and l the perpendicular distance from the center to the edge. After rotating the edge by −θ, the Zernike moments of the original and rotated images satisfy: Z′00 = Z00, Z′11 = Z11·e^(−jθ), Z′20 = Z20.
The parameters of the edge model are then calculated as: θ = arctan(Im(Z11)/Re(Z11)), l = Z20/Z′11, r = 3·Z′11/(2·(1 − l²)^(3/2)), from which the sub-pixel edge coordinates follow as (x_s, y_s) = (x, y) + (N·l/2)·(cos θ, sin θ), with N the side length of the detection template.
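The closed-form parameter recovery step can be sketched as below. The formulas used are the standard Ghosal-Mehrotra solution, chosen because it is consistent with the rotation relations Z′11 = Z11·e^(−jθ) and Z′20 = Z20 quoted above; the patent's own formula images are not reproduced in this text, so this is an assumption, and the numeric moment values in the demo are purely illustrative:

```python
import numpy as np

def zernike_edge_params(Z11, Z20):
    """Recover step-edge parameters from the Zernike moments Z11 (complex)
    and Z20 (real): edge angle theta, normalized distance l from the
    template center to the edge, and step amplitude r."""
    theta = np.arctan2(Z11.imag, Z11.real)             # edge orientation
    Z11_rot = (Z11 * np.exp(-1j * theta)).real         # rotated moment Z'11
    l = Z20 / Z11_rot                                  # distance on the unit circle
    r = 3.0 * Z11_rot / (2.0 * (1.0 - l ** 2) ** 1.5)  # step amplitude
    return theta, l, r

def subpixel_point(cx, cy, theta, l, n):
    """Map the unit-circle distance l back to pixel coordinates for an
    n x n detection template centered at (cx, cy)."""
    return cx + n * l / 2.0 * np.cos(theta), cy + n * l / 2.0 * np.sin(theta)

# Illustrative moment values (not measured from a real image):
theta, l, r = zernike_edge_params(0.3 + 0.0j, 0.15)
xs, ys = subpixel_point(10.0, 20.0, theta, l, n=7)  # edge point at (11.75, 20.0)
```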
(7) A region of interest (ROI) is determined. The area to be inspected in this embodiment is circled in fig. 5; it is the position where the grinding wheel stops machining, so the curve produced by corner wear of the wheel is left there.
(8) Fig. 6(a) shows the curve pixels at the workpiece root. The pixel coordinates of the root curve are extracted and fitted using the least squares method; the fitting result is shown in fig. 6(b).
(9) The maximum curvature K of the fitted curve is calculated. When K is greater than the set threshold, the grinding wheel does not need dressing; when K is smaller than the set threshold, the system prompts that the grinding wheel needs dressing.
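Steps (8) and (9) can be sketched together as follows; the polynomial degree, sampling density, and names are illustrative, and the dressing threshold is a value that must be calibrated for the specific wheel:

```python
import numpy as np

def max_curvature(xs, ys, deg=2):
    """Least-squares polynomial fit of the extracted root-curve pixels,
    then the maximum curvature K = |y''| / (1 + y'^2)^(3/2) along the fit."""
    p = np.poly1d(np.polyfit(xs, ys, deg))
    d1, d2 = p.deriv(1), p.deriv(2)
    t = np.linspace(xs.min(), xs.max(), 201)
    K = np.abs(d2(t)) / (1.0 + d1(t) ** 2) ** 1.5
    return K.max()

def needs_dressing(K, threshold):
    # A smaller curvature means a more rounded (worn) wheel corner,
    # so dressing is required when K falls below the threshold.
    return K < threshold

# Synthetic root curve y = 0.5*x^2, whose true maximum curvature is 1 at x = 0.
xs = np.linspace(-1.0, 1.0, 51)
K = max_curvature(xs, 0.5 * xs ** 2)
```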
As shown in fig. 7, the grinding wheel wear detection system based on machine vision of the present embodiment includes the following modules:
a camera calibration module: and calibrating the internal and external parameters of the camera.
An image acquisition module: an image of a workpiece positioned within a field of view of a camera is captured.
An image graying module: converting the captured RGB image into a grayscale image by weighted averaging, with the R, G, B components of each pixel weighted as: Gray = 0.2989·R + 0.5870·G + 0.1140·B.
An Otsu threshold segmentation module: converting the grayscale image into a binary image using the Otsu threshold segmentation method, through the following steps:
Let the total number of image pixels be N and the gray-scale range be [0, L−1]. The number of pixels with gray level i is N_i, and its probability is p_i = N_i/N;
the pixels in the image are divided into two classes by a gray threshold T: the number of pixels with gray values in [0, T] is N_min, and the number of pixels with gray values in [T+1, L−1] is N_max;
the between-class variance is calculated as: σ² = p_min·(avg_total − avg_min)² + p_max·(avg_total − avg_max)², where p_min = N_min/N and p_max = N_max/N, avg_min and avg_max are the mean gray levels of the two classes, and avg_total is the mean gray level of the whole image;
let T take each value in [0, L−1] in turn; the value of T that maximizes σ² is the optimal threshold of Otsu segmentation.
A filtering module: sorting the pixels in the neighborhood by gray level using median filtering and taking the median as the output pixel, removing noise from the image.
A sub-pixel edge detection module: performing sub-pixel edge detection using Zernike moments, through the following steps:
The Zernike moment of the image is Z_nm = ΣΣ f(x, y)·V*_nm(ρ, θ), where V*_nm(ρ, θ) is the complex conjugate of the Zernike polynomial V_nm(ρ, θ) and f(x, y) is the gray value of point (x, y) in the image;
an ideal step-edge model is established, where θ denotes the angle of the edge relative to the x axis, t the background gray level, r the step amplitude, (x, y) the coordinates of the circle center, and l the perpendicular distance from the center to the edge; after rotating the edge by −θ, the Zernike moments of the original and rotated images satisfy: Z′00 = Z00, Z′11 = Z11·e^(−jθ), Z′20 = Z20;
the parameters of the edge model are then calculated as: θ = arctan(Im(Z11)/Re(Z11)), l = Z20/Z′11, r = 3·Z′11/(2·(1 − l²)^(3/2)), from which the sub-pixel edge coordinates follow as (x_s, y_s) = (x, y) + (N·l/2)·(cos θ, sin θ), with N the side length of the detection template.
a region of interest determination module: and selecting the curve of the root of the workpiece as the region of interest.
A curve fitting module: and extracting pixel coordinates of the root curve, and performing curve fitting by using a least square method.
A wear judging module: judging the wear condition of the grinding wheel from the maximum curvature of the fitted curve. The smaller the curvature, the more severely the wheel is worn; when the curvature falls below a set threshold, the wheel needs dressing.
The invention is a non-contact, indirect method for detecting grinding wheel wear: a camera measures the curvature of the root curve of a machined workpiece, from which the wear condition of the wheel and whether it needs dressing are judged, reducing workpiece machining errors caused by wheel wear. First, the internal and external parameters of the camera are calibrated and an image of the workpiece is captured. Second, the captured image is converted to grayscale; on the grayscale image, the Otsu method splits the gray values into two classes at the optimal threshold and converts the image to binary, separating the detection target from the background, and noise is then filtered out to obtain an image suitable for edge detection. Third, Zernike moment sub-pixel edge detection yields edge localization at sub-pixel precision. A region of interest (ROI) is then determined, giving an image of the root region to be inspected. Finally, a curve is fitted to the detected edge pixels at the root, the curvature of the curve is obtained, and the wear condition of the grinding wheel is judged. The invention can detect the wear condition of the grinding wheel quickly and efficiently, determine whether the wheel needs dressing, and improve grinding quality.
It should be understood by those skilled in the art that the above embodiments are only used for illustrating the present invention and are not to be taken as limiting the present invention, and the changes and modifications of the above embodiments are within the scope of the present invention.
Claims (10)
1. The grinding wheel wear detection method based on machine vision is characterized by comprising the following steps of:
(1) calibrating internal and external parameters of a camera;
(2) placing the workpiece in the visual field range of a camera, and shooting an image of the workpiece;
(3) converting the collected image into a gray image;
(4) converting the image into a binary image by using an Otsu threshold segmentation method;
(5) removing noise existing in the image by using median filtering;
(6) using Zernike moments to perform sub-pixel edge detection;
(7) determining a region of interest;
(8) performing curve fitting on the detected edge;
(9) and judging the abrasion condition of the grinding wheel.
2. The machine vision-based grinding wheel wear detection method according to claim 1, characterized in that: in step (3), the captured RGB image is converted into a grayscale image by weighted averaging, with the R, G, B components of each pixel weighted as: Gray = 0.2989·R + 0.5870·G + 0.1140·B.
3. The machine vision-based grinding wheel wear detection method according to claim 1, characterized in that: and (4) using an Otsu threshold segmentation method, comprising the following steps of:
Let the total number of image pixels be N and the gray-scale range be [0, L−1], where L−1 is the maximum gray value of the image. The number of pixels with gray level i is n_i, and its probability p_i is: p_i = n_i/N;
the pixels in the image are divided into two classes by a gray threshold T: the number of pixels with gray values in [0, T] is N_min, and the number of pixels with gray values in [T+1, L−1] is N_max;
the probability p_min of gray values not exceeding the threshold is: p_min = Σ_{i=0..T} p_i;
the probability p_max of gray values exceeding the threshold is: p_max = Σ_{i=T+1..L−1} p_i;
the mean gray value avg_min of the class below the threshold is avg_min = (Σ_{i=0..T} i·p_i)/p_min, the mean gray value avg_max of the class above the threshold is avg_max = (Σ_{i=T+1..L−1} i·p_i)/p_max, and the overall mean gray value is avg_total = Σ_{i=0..L−1} i·p_i;
the between-class variance σ² is calculated as: σ² = p_min·(avg_total − avg_min)² + p_max·(avg_total − avg_max)²;
let T take each value in [0, L−1] in turn; the value of T that maximizes σ² is the optimal threshold of Otsu segmentation.
4. The machine vision-based grinding wheel wear detection method according to claim 1, characterized in that: and (5) sorting the pixels in the neighborhood according to the gray level by using median filtering, taking the middle value as an output pixel, and removing the noise in the image.
5. The machine vision-based grinding wheel wear detection method according to claim 1, characterized in that: step (6), using zernike moment sub-pixel edge detection, comprising the steps of:
ρ is the radial distance and θ the angular coordinate on the unit circle; the Zernike moment of the image is Z_nm = ΣΣ f(x, y)·V*_nm(ρ, θ), where V*_nm(ρ, θ) is the complex conjugate of the Zernike polynomial V_nm(ρ, θ) and f(x, y) is the gray value of point (x, y) in the image;
an ideal step-edge model is established, where θ denotes the angle of the edge relative to the x axis, t the background gray level, r the step amplitude, (x, y) the coordinates of the circle center, and l the perpendicular distance from the center to the edge; after rotating the edge by −θ, the Zernike moments of the original and rotated images satisfy: Z′00 = Z00, Z′11 = Z11·e^(−jθ), Z′20 = Z20;
the parameters of the edge model are then calculated as: θ = arctan(Im(Z11)/Re(Z11)), l = Z20/Z′11, r = 3·Z′11/(2·(1 − l²)^(3/2)), from which the sub-pixel edge coordinates follow as (x_s, y_s) = (x, y) + (N·l/2)·(cos θ, sin θ), with N the side length of the detection template.
6. The machine vision-based grinding wheel wear detection method according to any one of claims 1 to 5, characterized in that: and (7) selecting the curve of the root of the workpiece as the region of interest.
7. The machine vision-based grinding wheel wear detection method according to claim 6, characterized in that: step (8), extracting pixel coordinates of the root curve, and performing curve fitting by using a least square method; and/or step (9), judging the abrasion condition of the grinding wheel according to the maximum curvature value of the curve obtained by fitting.
8. Grinding wheel wear detection system based on machine vision is characterized by comprising the following modules:
a camera calibration module: calibrating internal and external parameters of a camera;
an image acquisition module: shooting an image of a workpiece placed in a camera view range;
an image graying module: converting the collected image into a gray image;
otsu threshold segmentation module: converting the image into a binary image by using an Otsu threshold segmentation method;
a filtering module: removing noise existing in the image by using median filtering;
a sub-pixel edge detection module: using Zernike moments to perform sub-pixel edge detection;
a region of interest determination module: determining a region of interest;
a curve fitting module: performing curve fitting on the detected edge;
and a wear judging module: and judging the abrasion condition of the grinding wheel.
9. The machine vision based wheel wear detection system of claim 8, wherein: an Otsu threshold segmentation module, which uses an Otsu threshold segmentation method, comprises the following steps:
Let the total number of image pixels be N and the gray-scale range be [0, L−1]. The number of pixels with gray level i is N_i, and its probability is p_i = N_i/N;
the pixels in the image are divided into two classes by a gray threshold T: the number of pixels with gray values in [0, T] is N_min, and the number of pixels with gray values in [T+1, L−1] is N_max;
the between-class variance is calculated as: σ² = p_min·(avg_total − avg_min)² + p_max·(avg_total − avg_max)², where p_min = N_min/N and p_max = N_max/N, avg_min and avg_max are the mean gray levels of the two classes, and avg_total is the mean gray level of the whole image;
let T take each value in [0, L−1] in turn; the value of T that maximizes σ² is the optimal threshold of Otsu segmentation.
10. The machine vision based grinding wheel wear detection system of claim 8 or 9, wherein: a sub-pixel edge detection module using zernike moment sub-pixel edge detection, comprising the steps of:
The Zernike moment of the image is Z_nm = ΣΣ f(x, y)·V*_nm(ρ, θ), where V*_nm(ρ, θ) is the complex conjugate of the Zernike polynomial V_nm(ρ, θ) and f(x, y) is the gray value of point (x, y) in the image;
an ideal step-edge model is established, where θ denotes the angle of the edge relative to the x axis, t the background gray level, r the step amplitude, (x, y) the coordinates of the circle center, and l the perpendicular distance from the center to the edge; after rotating the edge by −θ, the Zernike moments of the original and rotated images satisfy: Z′00 = Z00, Z′11 = Z11·e^(−jθ), Z′20 = Z20;
the parameters of the edge model are then calculated as: θ = arctan(Im(Z11)/Re(Z11)), l = Z20/Z′11, r = 3·Z′11/(2·(1 − l²)^(3/2)), from which the sub-pixel edge coordinates follow as (x_s, y_s) = (x, y) + (N·l/2)·(cos θ, sin θ), with N the side length of the detection template.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111240656.7A CN114092403A (en) | 2021-10-25 | 2021-10-25 | Grinding wheel wear detection method and system based on machine vision |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114092403A true CN114092403A (en) | 2022-02-25 |
Family
ID=80297561
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111240656.7A Pending CN114092403A (en) | 2021-10-25 | 2021-10-25 | Grinding wheel wear detection method and system based on machine vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114092403A (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114972227A (en) * | 2022-05-17 | 2022-08-30 | 大连交通大学 | Method for measuring porosity of grinding wheel |
CN114972227B (en) * | 2022-05-17 | 2024-04-16 | 大连交通大学 | Grinding wheel porosity identification method |
CN114663433A (en) * | 2022-05-25 | 2022-06-24 | 山东科技大学 | Method and device for detecting running state of roller cage shoe, computer equipment and medium |
CN115035120A (en) * | 2022-08-12 | 2022-09-09 | 山东迪格重工机械有限公司 | Machine tool control method and system based on Internet of things |
CN115035120B (en) * | 2022-08-12 | 2022-11-04 | 山东迪格重工机械有限公司 | Machine tool control method and system based on Internet of things |
CN115100197A (en) * | 2022-08-24 | 2022-09-23 | 启东市群鹤机械设备有限公司 | Method for detecting surface burn of workpiece grinding |
CN115100210A (en) * | 2022-08-29 | 2022-09-23 | 山东艾克赛尔机械制造有限公司 | Anti-counterfeiting identification method based on automobile parts |
CN115100210B (en) * | 2022-08-29 | 2022-11-18 | 山东艾克赛尔机械制造有限公司 | Anti-counterfeiting identification method based on automobile parts |
CN115431101A (en) * | 2022-10-18 | 2022-12-06 | 南通钜德智能科技有限公司 | Method and system for detecting state of numerical control machine tool |
CN115526890A (en) * | 2022-11-25 | 2022-12-27 | 深圳市腾泰博科技有限公司 | Method for identifying fault factors of record player head |
CN117689677A (en) * | 2024-02-01 | 2024-03-12 | 山东大学日照研究院 | Grinding wheel abrasion state identification method, system, equipment and medium |
CN117689677B (en) * | 2024-02-01 | 2024-04-16 | 山东大学日照研究院 | Grinding wheel abrasion state identification method, system, equipment and medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114092403A (en) | Grinding wheel wear detection method and system based on machine vision | |
CN117173189B (en) | Visual inspection system for polishing effect of aluminum alloy surface | |
CN107341802B (en) | Corner sub-pixel positioning method based on curvature and gray scale compounding | |
CN112529858A (en) | Welding seam image processing method based on machine vision | |
CN109682839B (en) | Online detection method for surface defects of metal arc-shaped workpiece | |
CN107490582B (en) | Assembly line workpiece detection system | |
CN114022440B (en) | Detection method and detection device for preventing repeated cutting of wafer and dicing saw | |
CN111311618A (en) | Circular arc workpiece matching and positioning method based on high-precision geometric primitive extraction | |
CN112258444A (en) | Elevator steel wire rope detection method | |
CN106780437B (en) | A kind of quick QFN chip plastic packaging image obtains and amplification method | |
CN116051460A (en) | Online detection method for abrasion of brazing diamond grinding head based on machine vision | |
CN111354009B (en) | Method for extracting shape of laser additive manufacturing molten pool | |
CN115953397A (en) | Method and equipment for monitoring process preparation flow of conical bearing retainer | |
CN115619845A (en) | Self-adaptive scanning document image inclination angle detection method | |
CN117314925B (en) | Metal workpiece surface defect detection method based on computer vision | |
CN115409787A (en) | Method for detecting defects of small pluggable transceiver module base | |
CN114820612A (en) | Roller surface defect detection method and system based on machine vision | |
CN113781413B (en) | Electrolytic capacitor positioning method based on Hough gradient method | |
CN110728286A (en) | Abrasive belt grinding material removal rate identification method based on spark image | |
CN113298775B (en) | Self-priming pump double-sided metal impeller appearance defect detection method, system and medium | |
CN116958714B (en) | Automatic identification method for wafer back damage defect | |
CN107492093B (en) | bearing abnormity detection method based on image processing | |
CN112465741B (en) | Defect detection method and device for suspension spring and valve spring and storage medium | |
CN114187286A (en) | Wood plate surface machining quality control method based on machine vision | |
CN106447683A (en) | Feature extraction algorithm of circles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||