CN117437233A - Gear defect detection method and system based on image processing

Publication number: CN117437233A (application CN202311766342.XA)
Other versions: CN117437233B (granted); other language: Chinese (zh)
Authority: CN (China)
Prior art keywords: pixel points, boundary line, boundary, curve, area
Legal status: Granted; Active
Inventors: 刘平珍, 张光帅, 苗栋, 刘世浩, 王烁, 闫家昌
Original/Current Assignee: Shandong Runtong Gear Group Co., Ltd.
Priority/filing date: 2023-12-21
Publication dates: CN117437233A 2024-01-23; CN117437233B 2024-03-26

Classifications

    • G06T 7/0008 — Industrial image inspection checking presence/absence (image analysis; inspection of images, e.g. flaw detection)
    • G06T 7/11 — Region-based segmentation
    • G06T 7/12 — Edge-based segmentation
    • G06T 7/194 — Segmentation involving foreground-background segmentation
    • G06T 2207/10004 — Still image; Photographic image
    • G06T 2207/30164 — Workpiece; Machine component (industrial image inspection)
    • Y02P 90/30 — Computing systems specially adapted for manufacturing


Abstract

The invention relates to the technical field of image processing, in particular to a gear defect detection method and system based on image processing. The method comprises the steps of obtaining an initial gear image and deriving a single-tooth image from it; identifying normal surface pixel points, wear surface pixel points and boundary pixel points in the single-tooth image; acquiring the approximate region of the wear surface; fitting a first boundary line curve and a second boundary line curve; obtaining a first boundary line reference line and a second boundary line reference line; obtaining a defect evaluation region; and acquiring the maximum reference wear area and calculating the gear wear degree by combining the Euclidean distance between the first boundary line reference line and the first boundary line curve, the Euclidean distance between the second boundary line reference line and the second boundary line curve, and the area of the defect evaluation region. The method can rapidly and accurately detect gear wear defects and quantify the degree of wear.

Description

Gear defect detection method and system based on image processing
Technical Field
The present invention relates generally to the field of image processing technology. More particularly, the invention relates to a gear defect detection method and system based on image processing.
Background
Mechanical gears are highly prone to wear, pitting and other defects during operation. For example, during gear transmission there is relative sliding between the meshing surfaces of the gear teeth, and under load this sliding wears the tooth surfaces. Tooth surface wear distorts the tooth profile and makes the transmission unstable, and it also thins the teeth and reduces their strength. In addition, the meshing surfaces are repeatedly subjected to contact pressure during operation; when the pressure is too high, fine fatigue cracks appear on the tooth surface, and as the gear continues to run the cracks propagate along the surface layer until small pieces of metal flake off, forming pits. A defective gear causes the transmission mechanism in which it is installed to fail, so gear defect detection is very important.
Two kinds of gear defect detection methods exist in the prior art. The first detects gear defects with optical devices, but this is costly, and the light source in an actual scene may not meet the requirements, which affects detection efficiency. The second captures an image of the gear and processes the image to detect defects automatically, but the existing image processing methods suffer from low accuracy. For example, Chinese patent publication No. CN113658133B discloses a gear surface defect detection method and system based on image processing, in which a boundary line curve is fitted from boundary pixel points. When the wear surface is very narrow, that method fits the boundary line curve accurately; but when the wear surface is relatively wide and irregular in shape, the fitted boundary line curve cannot accurately represent the boundary between the upper surface and the front surface of the gear tooth, so the obtained upper surface boundary line and front surface boundary line contain errors, and the calculated wear degree is therefore inaccurate.
Disclosure of Invention
In order to solve one or more of the above technical problems, the present invention proposes to obtain wear surface pixels according to an initial gear image, determine whether a gear has a tooth surface wear defect according to the number of wear surface pixels, and further calculate the gear wear degree. To this end, the present invention provides solutions in various aspects as follows.
In a first aspect, the present invention provides a gear defect detection method based on image processing, which is characterized by comprising:
acquiring an initial gear image, and performing segmentation processing on the initial gear image to acquire a single-tooth image;
acquiring the gray value of each pixel point of the single-tooth image; classifying pixel points whose gray values fall within a first reference gray value and its preset gray neighborhood range, and whose number is larger than a first threshold, as normal surface pixel points; classifying pixel points whose gray values fall within a second reference gray value and its preset gray neighborhood range as wear surface pixel points; and classifying the other pixel points in the single-tooth image as boundary pixel points; the wear surface pixel points are the pixel points on the surface of the gear wear area;
in response to the number of wear surface pixel points being larger than a preset threshold, determining that the gear has a tooth surface wear defect, and acquiring the approximate region of the wear surface from the wear surface pixel points;
marking the boundary pixel points above the approximate region of the wear surface as first region boundary pixel points and fitting a first boundary line curve from them, and marking the boundary pixel points below the approximate region of the wear surface as second region boundary pixel points and fitting a second boundary line curve from them; fitting the first boundary line curve includes: fitting the first region boundary pixel points to obtain a first fitting curve; if the dispersion between the first fitting curve and the first region boundary pixel points is smaller than a preset dispersion threshold, marking the first fitting curve as the first boundary line curve; otherwise, on the side of the first fitting curve with fewer pixel points, finding the pixel point farthest from the curve and recording its distance as the maximum distance, keeping only the pixel points on that side whose distance to the first fitting curve is smaller than two thirds of the maximum distance, keeping all pixel points on the side with more pixel points so as to obtain new first region boundary pixel points, and fitting the new first region boundary pixel points to obtain a new first fitting curve;
fitting boundary pixel points on the first boundary line curve to obtain a first boundary line reference line; fitting boundary pixel points on the second boundary line curve to obtain a second boundary line reference line; acquiring an upper surface boundary line according to the first boundary line curve, acquiring a front surface boundary line according to the second boundary line curve, and taking a region between the upper surface boundary line and the front surface boundary line as a defect evaluation region;
taking one half of the sum of the single-tooth upper surface area and the single-tooth front surface area as the maximum reference wear area, and calculating the gear wear degree by combining the Euclidean distance between the first boundary line reference line and the first boundary line curve, the Euclidean distance between the second boundary line reference line and the second boundary line curve, and the area of the defect evaluation region; the calculation expression takes as inputs the natural constant, the area of the defect evaluation region, the maximum reference wear area, and the two Euclidean distances, and outputs the gear wear degree.
In another embodiment, the method for calculating the euclidean distance between the first boundary line reference line and the first boundary line curve includes: and calculating Euclidean distances between all pixel points on the first boundary line curve and the first boundary line reference line, calculating an average value of the Euclidean distances corresponding to all pixel points on the first boundary line curve, and taking the average value as the Euclidean distance between the first boundary line reference line and the first boundary line curve.
In another embodiment, the method for obtaining the first reference gray value is: shooting normal surfaces of different single teeth by using a camera so as to obtain all pixel points of the normal surfaces of each single tooth, solving a mean value of the gray values of all the obtained pixel points, and taking the mean value as a first reference gray value.
In another embodiment, the method further comprises: acquiring a single-tooth upper surface area and a single-tooth front surface area according to the normal surface pixel points, and identifying the number of boundary pixel points in the single-tooth upper surface area and the single-tooth front surface area; and in response to the fact that the number of the boundary pixel points in the single-tooth upper surface area or the single-tooth front surface area is larger than a preset fourth threshold value and the boundary pixel points enclose a closed graph, judging that the gear has the spot pit defect.
In another embodiment, a cubic polynomial fit is used in fitting the first boundary line curve and the second boundary line curve.
In a second aspect, the present invention provides an image processing-based gear defect detection system comprising a processor and a memory, the memory storing computer program instructions which, when executed by the processor, implement the image processing-based gear defect detection method of the present invention.
The technical effects of the invention are as follows: when obtaining the boundary line curves, an upper and a lower boundary line curve are fitted from the classified boundary pixel points, the first region boundary pixel points and the second region boundary pixel points being fitted separately; pixel points on the side of the first fitting curve with fewer pixel points are selectively deleted to obtain new first region boundary pixel points, so that the first fitting curve obtained in the next fitting shifts slightly toward the side of the current fitting curve with more pixel points, and a boundary line curve that better matches the real boundary is obtained. This provides better support for the subsequent wear degree calculation and thus improves its accuracy.
Further, the spot pit defect of the gear can be accurately identified by checking the number of boundary pixel points in the single-tooth upper surface area or the single-tooth front surface area and whether those boundary pixel points enclose a closed figure.
Drawings
The above, as well as additional purposes, features, and advantages of exemplary embodiments of the present invention will become readily apparent from the following detailed description when read in conjunction with the accompanying drawings. In the drawings, embodiments of the invention are illustrated by way of example and not by way of limitation, and like reference numerals refer to similar or corresponding parts and in which:
fig. 1 is a flowchart schematically showing a gear defect detection method based on image processing in an embodiment of the present invention;
FIG. 2 is a schematic diagram schematically illustrating tooth surface wear;
FIG. 3 is a flow chart schematically illustrating a method of fitting a first boundary line curve in an embodiment of the invention;
FIG. 4 is a flowchart schematically showing a pit defect detection method of an embodiment of the present invention;
fig. 5 is an effect comparison chart one schematically showing the gear defect detection method based on image processing in the embodiment of the present invention;
fig. 6 is an effect comparison chart two schematically showing the gear defect detection method based on image processing in the embodiment of the present invention;
fig. 7 is an effect comparison chart three schematically showing the gear defect detection method based on image processing in the embodiment of the present invention;
fig. 8 is an effect comparison chart four schematically showing the gear defect detection method based on image processing in the embodiment of the present invention.
Fig. 9 is a schematic diagram schematically illustrating a gear defect detection system based on image processing in an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Gear defect detection method embodiment based on image processing:
as shown in fig. 1, the gear defect detection method based on image processing of the present invention includes:
s101, acquiring a single-tooth image of a gear to be detected; specifically, an initial gear image is acquired, and a single-tooth image is acquired by dividing the initial gear image.
The gear to be detected may be photographed with a camera at an obliquely downward viewing angle to acquire the initial gear image. The initial gear image is segmented by first removing the background information to obtain a gear image, then dividing the gear image into single-tooth regions, and thus obtaining the single-tooth images. The obliquely downward viewing angle may be 45 degrees. The gear to be detected is placed so that its end face is perpendicular to the horizontal plane.
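The patent does not specify which segmentation algorithm is used in S101; a minimal sketch of one possible background-removal and single-tooth extraction pipeline, assuming OpenCV with Otsu thresholding and contour-based cropping (the threshold choice, minimum region size and function names are illustrative assumptions, not the patent's method), could look like this:

```python
import cv2
import numpy as np

def extract_single_tooth_images(initial_gear_image: np.ndarray) -> list:
    """Hypothetical sketch of S101: remove background, then crop tooth regions.

    The patent only states that the background is removed and the gear image is
    divided into single-tooth regions; the Otsu threshold and the contour-based
    cropping below are illustrative assumptions. For a full gear image, a more
    involved angular partition around the gear centre would be needed.
    """
    gray = cv2.cvtColor(initial_gear_image, cv2.COLOR_BGR2GRAY)
    # Separate the gear from the (assumed darker) background.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    gear_only = cv2.bitwise_and(gray, gray, mask=mask)

    # Treat each sufficiently large external contour as one tooth-region candidate.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    single_tooth_images = []
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)
        if w * h > 500:  # assumed minimum tooth-region size in pixels
            single_tooth_images.append(gear_only[y:y + h, x:x + w])
    return single_tooth_images
```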
S102, acquiring normal surface pixel points, wear surface pixel points and boundary pixel points in the single-tooth image; specifically, the gray value of each pixel point of the single-tooth image is obtained; pixel points whose gray values fall within a first reference gray value and its preset gray neighborhood range, and whose number is larger than a first threshold, are classified as normal surface pixel points; pixel points whose gray values fall within a second reference gray value and its preset gray neighborhood range are classified as wear surface pixel points; and the other pixel points in the single-tooth image are classified as boundary pixel points. The wear surface pixel points are the pixel points on the surface of the gear wear area.
The preset gray scale neighborhood range may be 4, 6 or 8.
For example, under normal conditions the normal surface of a single tooth is darker while a worn region is brighter and shinier; moreover, even when the tooth surface is worn, the area of the wear surface is clearly smaller than that of the normal surface, so the number of pixel points corresponding to the wear surface is smaller than the number corresponding to the normal surface. Therefore, the gray value corresponding to normal surface pixel points is set as the first reference gray value, the gray value corresponding to wear surface pixel points is set as the second reference gray value, and combining the reference gray values with the number of pixel points at each gray value allows wear surface pixel points and normal surface pixel points to be distinguished more accurately. In general, a single-tooth image contains only wear surface pixel points, normal surface pixel points and boundary pixel points, so the pixel points remaining after the wear surface and normal surface pixel points are selected are all boundary pixel points. The first reference gray value is obtained as follows: photograph the normal surfaces of different single teeth with a camera to obtain all pixel points of each normal surface, average the gray values of all obtained pixel points, and take the mean as the first reference gray value. The second reference gray value is obtained in the same way and is not described again here.
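As an illustration of step S102, a minimal sketch of the pixel classification with NumPy, assuming the reference gray values have already been measured as described above (the variable names, the default neighborhood range and the default count threshold are illustrative assumptions), might be:

```python
import numpy as np

def classify_pixels(tooth_gray: np.ndarray,
                    ref_normal: int, ref_wear: int,
                    neighborhood: int = 6, count_threshold: int = 200):
    """Hypothetical sketch of S102: label normal, wear and boundary pixels.

    `neighborhood` is the preset gray neighborhood range (e.g. 4, 6 or 8) and
    `count_threshold` is the first threshold on the pixel count; both default
    values here are assumptions for illustration.
    """
    near_normal = np.abs(tooth_gray.astype(int) - ref_normal) <= neighborhood
    near_wear = np.abs(tooth_gray.astype(int) - ref_wear) <= neighborhood

    # Accept the normal-surface class only if enough pixels fall near the
    # first reference gray value, as required by the first threshold.
    normal_mask = near_normal if np.count_nonzero(near_normal) > count_threshold \
        else np.zeros_like(near_normal)
    wear_mask = near_wear & ~normal_mask
    boundary_mask = ~(normal_mask | wear_mask)   # everything else is boundary
    return normal_mask, wear_mask, boundary_mask
```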
S103, if the gear has a tooth surface wear defect, acquiring the approximate region of the wear surface; specifically, in response to the number of wear surface pixel points being greater than a preset threshold, it is determined that a tooth surface wear defect exists in the gear, and the approximate region of the wear surface is obtained from the wear surface pixel points.
For example, reflections of light may change the gray values of some normal surface pixel points so that they are wrongly identified as wear surface pixel points. To avoid this interference, the identified wear surface pixel points are accepted as genuine only when their number is larger than the preset threshold, which makes the judgment of whether the gear to be detected has a tooth surface wear defect more reliable.
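The patent does not detail how the approximate region of the wear surface is formed from the wear surface pixel points; a minimal sketch assuming a morphological closing of the wear mask followed by keeping the largest connected component (both assumptions, using OpenCV) could be:

```python
import cv2
import numpy as np

def approximate_wear_region(wear_mask: np.ndarray, count_threshold: int = 500):
    """Hypothetical sketch of S103: decide wear defect and return its region."""
    if np.count_nonzero(wear_mask) <= count_threshold:
        return None  # no tooth surface wear defect detected

    # Close small gaps so scattered wear pixels form one approximate region.
    mask_u8 = wear_mask.astype(np.uint8) * 255
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    closed = cv2.morphologyEx(mask_u8, cv2.MORPH_CLOSE, kernel)

    # Keep the largest connected component as the approximate wear region.
    num, labels, stats, _ = cv2.connectedComponentsWithStats(closed)
    if num <= 1:
        return None
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    return labels == largest
```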
S104, acquiring a first boundary line curve and a second boundary line curve; specifically, the boundary pixel points above the approximate region of the wear surface are marked as first region boundary pixel points and a first boundary line curve is fitted from them, and the boundary pixel points below the approximate region of the wear surface are marked as second region boundary pixel points and a second boundary line curve is fitted from them.
During image processing, some pixel points are wrongly identified as boundary pixel points due to noise; fitting the first region boundary pixel points and the second region boundary pixel points separately yields more accurate boundary lines and suppresses this noise. By fitting the first boundary line curve and the second boundary line curve, the boundary between the upper surface and the front surface of the single tooth can be determined more accurately. Fitting methods such as the least squares method, polynomial fitting or non-parametric fitting may be adopted.
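As one concrete choice consistent with the polynomial fitting mentioned later in the description, a least-squares cubic fit of a boundary curve through boundary pixel coordinates could be sketched as follows (NumPy; the (x, y) column orientation and the cubic default degree are assumptions for illustration):

```python
import numpy as np

def fit_boundary_curve(boundary_points: np.ndarray, degree: int = 3):
    """Hypothetical sketch: least-squares polynomial fit y = p(x) through
    boundary pixel points given as an (N, 2) array of (x, y) coordinates."""
    x, y = boundary_points[:, 0], boundary_points[:, 1]
    coeffs = np.polyfit(x, y, degree)   # cubic fit by default
    curve = np.poly1d(coeffs)
    residuals = y - curve(x)            # reused later for the dispersion check
    return curve, residuals
```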
S105, acquiring a first boundary line reference line, a second boundary line reference line and a defect evaluation region; specifically, the boundary pixel points on the first boundary line curve are fitted to obtain a first boundary line reference line; the boundary pixel points on the second boundary line curve are fitted to obtain a second boundary line reference line; an upper surface boundary line is obtained from the first boundary line curve and a front surface boundary line from the second boundary line curve; and the region between the upper surface boundary line and the front surface boundary line is taken as the defect evaluation region.
For example, as shown in fig. 2, the wear surface of a single tooth is usually narrower at one end and wider at the other. The more serious the tooth surface wear, i.e. the larger the wear surface area, the larger the curvature of the boundary line curve and the more it deviates from a straight line. Therefore, a boundary line reference line is fitted, and the degree of wear of the single tooth is evaluated from how close the boundary line reference line is to the boundary line curve.
The upper surface boundary line is obtained as follows: starting from the first boundary line curve, search upward pixel by pixel until a normal surface pixel point is reached; the last boundary pixel point before the normal surface pixel point is taken as an upper edge point of the first boundary line region; the upper edge points of the whole first boundary line region are then mirrored about the first boundary line reference line, used as the symmetry axis, which gives the upper surface boundary line.
The front surface boundary line is obtained as follows: starting from the second boundary line curve, search downward pixel by pixel until a normal surface pixel point is reached; the last boundary pixel point before the normal surface pixel point is taken as a lower edge point of the second boundary line region; the lower edge points of the whole second boundary line region are taken as the front surface boundary line.
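A minimal sketch of the upward search and mirroring used for the upper surface boundary line, assuming image row indices increase downward, that the reference line has been fitted as y = a·x + b, and that the classification masks from S102 are available (all of these, and the vertical mirroring used as a simplification of reflection about a slanted line, are assumptions for illustration):

```python
import numpy as np

def upper_surface_boundary(curve, ref_line, normal_mask, boundary_mask, width):
    """Hypothetical sketch of S105: find upper edge points above the first
    boundary line curve, then mirror them about the reference line y = a*x + b."""
    a, b = ref_line                  # slope and intercept of the reference line
    upper_boundary = {}
    for x in range(width):
        y = int(round(curve(x)))
        edge_y = None
        # Search upward (decreasing row index) until a normal-surface pixel.
        while y >= 0 and not normal_mask[y, x]:
            if boundary_mask[y, x]:
                edge_y = y           # last boundary pixel seen before the normal surface
            y -= 1
        if edge_y is not None:
            ref_y = a * x + b
            # Vertical mirroring about the reference line (simplifying assumption).
            upper_boundary[x] = 2 * ref_y - edge_y
    return upper_boundary
```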
S106, calculating the gear wear degree; specifically, one half of the sum of the single-tooth upper surface area and the single-tooth front surface area is taken as the maximum reference wear area, and the gear wear degree is calculated by combining the Euclidean distance between the first boundary line reference line and the first boundary line curve, the Euclidean distance between the second boundary line reference line and the second boundary line curve, and the area of the defect evaluation region; the calculation expression takes as inputs the natural constant, the area of the defect evaluation region, the maximum reference wear area, and the two Euclidean distances, and outputs the gear wear degree.
As can be seen from the above embodiments, the first boundary line curve and the second boundary line curve need to be obtained by polynomial fitting, and in an alternative embodiment, the fitting methods of the first boundary line curve and the second boundary line curve are the same. As shown in fig. 3, in an embodiment, a method for fitting a first boundary line curve is described as an example, where fitting the first boundary line curve includes:
s301, fitting the boundary pixel points of the first area to obtain a first fitting curve; specifically, fitting is performed on the boundary pixel points of the first area to obtain a first fitting curve.
Polynomial fits, such as a cubic or quartic polynomial fit, are used in fitting both the first and the second boundary line curve.
S302, judging whether the first fitting curve meets the requirement; if so, marking it as the first boundary line curve, otherwise re-fitting; specifically, if the dispersion between the first fitting curve and the first region boundary pixel points is smaller than a preset dispersion threshold, the first fitting curve is marked as the first boundary line curve; otherwise, on the side of the first fitting curve with fewer pixel points, the pixel point farthest from the curve is found and its distance is recorded as the maximum distance; only the pixel points on that side whose distance to the first fitting curve is smaller than two thirds of the maximum distance are kept, while all pixel points on the side with more pixel points are kept, giving new first region boundary pixel points; the new first region boundary pixel points are then fitted to obtain a new first fitting curve.
The smaller the dispersion between the first fitting curve and the first region boundary pixel points, the closer the fitted curve is to the real boundary line. When the dispersion is too large, pixel points on the side of the fitting curve with fewer pixel points are selectively deleted to obtain new first region boundary pixel points, so that the first fitting curve obtained in the next fitting shifts toward the side of the current fitting curve that has more pixel points.
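A minimal sketch of this iterative trimmed re-fitting (steps S301–S302), assuming the dispersion is measured as the standard deviation of the vertical residuals and that vertical distances stand in for point-to-curve distances (both assumptions; the threshold and iteration-limit values are also illustrative):

```python
import numpy as np

def fit_first_boundary_line(points: np.ndarray, dispersion_threshold: float = 2.0,
                            degree: int = 3, max_iterations: int = 20):
    """Hypothetical sketch of S301-S302: fit, check dispersion, trim, re-fit."""
    curve = None
    for _ in range(max_iterations):
        coeffs = np.polyfit(points[:, 0], points[:, 1], degree)
        curve = np.poly1d(coeffs)
        residuals = points[:, 1] - curve(points[:, 0])

        # Assumed dispersion measure: standard deviation of vertical residuals.
        if residuals.std() < dispersion_threshold:
            return curve                       # accepted as the boundary line curve

        # Side of the curve with fewer pixel points (by vertical residual sign).
        above, below = residuals > 0, residuals < 0
        minority = above if np.count_nonzero(above) < np.count_nonzero(below) else below
        if not np.any(minority):
            return curve

        distances = np.abs(residuals)          # vertical distance as an approximation
        max_distance = distances[minority].max()

        # Keep minority-side points closer than 2/3 of the maximum distance,
        # keep every point on the majority side, then re-fit.
        keep = ~minority | (distances < (2.0 / 3.0) * max_distance)
        points = points[keep]
        if len(points) <= degree:
            return curve
    return curve
```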
As can be seen from the above embodiments, the gear wear degree is calculated according to the euclidean distance between the first boundary line reference line and the first boundary line curve and the euclidean distance between the second boundary line reference line and the second boundary line curve, and in one embodiment, the method for calculating the euclidean distance between the first boundary line reference line and the first boundary line curve includes: and calculating Euclidean distances between all pixel points on the first boundary line curve and the first boundary line reference line, calculating an average value of the Euclidean distances corresponding to all pixel points on the first boundary line curve, and taking the average value as the Euclidean distance between the first boundary line reference line and the first boundary line curve.
The method for calculating the euclidean distance between the second boundary line reference line and the second boundary line curve is the same as the method for calculating the euclidean distance between the first boundary line reference line and the first boundary line curve, and will not be described herein.
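A short sketch of this averaged point-to-line Euclidean distance, assuming the reference line is represented as a·x + b·y + c = 0 and the boundary line curve is sampled at the given x positions (both representations are assumptions):

```python
import numpy as np

def mean_curve_to_line_distance(curve, a: float, b: float, c: float, xs: np.ndarray):
    """Hypothetical sketch: average Euclidean distance from points on the
    boundary line curve to the reference line a*x + b*y + c = 0."""
    ys = curve(xs)
    distances = np.abs(a * xs + b * ys + c) / np.hypot(a, b)
    return distances.mean()
```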
The gear defect detection method based on image processing in the above embodiment can detect a tooth surface wear defect; in order to also detect pit defects on the tooth surface, as shown in fig. 4, in one embodiment the method further includes:
s401, identifying the number of boundary pixel points in a single-tooth upper surface area and a single-tooth front surface area; specifically, a single-tooth upper surface area and a single-tooth front surface area are obtained according to normal surface pixel points, and then the number of boundary pixel points in the single-tooth upper surface area and the single-tooth front surface area is identified.
In general, if the gear is not worn, there is a boundary line between the upper surface of the single tooth and the front surface of the single tooth, and the boundary line between the upper surface of the single tooth and the front surface of the single tooth can be obtained through the boundary pixel points, so that the upper surface area of the single tooth and the front surface area of the single tooth are obtained. If the gear is worn, a wear surface exists between the upper surface of the single tooth and the front surface of the single tooth, and the area of the wear surface is determined through the pixel points of the wear surface, so that the area of the upper surface of the single tooth and the area of the front surface of the single tooth can be determined. The pixel position of each boundary pixel point can be used for judging which boundary pixel points are positioned in the single-tooth upper surface area or the single-tooth front surface area.
S402, judging whether the gear has a spot pit defect according to the number of identified boundary pixel points and the figure they enclose; specifically, in response to the number of boundary pixel points in the single-tooth upper surface area or the single-tooth front surface area being greater than a preset fourth threshold and those boundary pixel points enclosing a closed figure, it is determined that the gear has a spot pit defect.
If the spot pits exist on the upper surface or the front surface of the single tooth, boundary pixel points exist on the upper surface or the front surface of the single tooth and form a closed pattern, so that the spot pit defect on the tooth surface can be judged under the condition that the number of the boundary pixel points in the upper surface area or the front surface area of the single tooth is larger than a preset fourth threshold value and the boundary pixel points form the closed pattern.
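A minimal sketch of this pit check on one tooth surface region, assuming OpenCV contour detection is used to decide whether the boundary pixel points enclose a closed figure (the threshold value, the minimum enclosed area and the contour-based closedness test are assumptions):

```python
import cv2
import numpy as np

def has_pit_defect(boundary_mask: np.ndarray, surface_mask: np.ndarray,
                   fourth_threshold: int = 50) -> bool:
    """Hypothetical sketch of S401-S402: spot pit defect on one surface region."""
    boundary_in_surface = boundary_mask & surface_mask
    if np.count_nonzero(boundary_in_surface) <= fourth_threshold:
        return False

    # Treat any contour enclosing a non-trivial area as a closed figure.
    mask_u8 = boundary_in_surface.astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask_u8, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return any(cv2.contourArea(contour) > 1.0 for contour in contours)
```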
Fig. 5 to fig. 8 compare the effects of the prior-art gear surface defect detection method and the image processing-based gear defect detection method of the present invention; the black lines in the figures are the identified boundary pixel points. The boundary line curve obtained by the prior-art method is curve a1 in fig. 5, and the boundary line reference line it obtains is straight line b1 in fig. 6; the first boundary line curve obtained by the image processing-based gear defect detection method of the present invention is curve a2 in fig. 7, and the first boundary line reference line it obtains is straight line b2 in fig. 8. As can be seen from fig. 5 and fig. 7, when the wear area is too wide, the boundary line curve fitted by the prior-art method lies farther from the upper boundary of the wear area, so the upper surface boundary line and front surface boundary line obtained from it contain errors, which is detrimental to the subsequent assessment of the gear wear degree; the first boundary line curve obtained by the method of the present invention lies closer to the upper boundary of the wear area, so the upper surface boundary line and front surface boundary line obtained later are more accurate, which in turn makes the gear wear degree evaluation more accurate.
Image processing-based gear defect detection system embodiments:
the invention also provides a gear defect detection system based on image processing. As shown in fig. 9, the image processing-based gear defect detection system includes a processor and a memory storing computer program instructions that when executed by the processor implement the image processing-based gear defect detection method according to the first aspect of the present invention.
The image processing-based gear defect detection system further comprises other components, such as a communication bus and a communication interface, whose arrangement and function are well known to those skilled in the art and are therefore not described in detail here.
In the context of this patent, the foregoing memory may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. For example, the computer readable storage medium may be any suitable magnetic or magneto-optical storage medium, such as resistive random access memory (RRAM), dynamic random access memory (DRAM), static random access memory (SRAM), enhanced dynamic random access memory (EDRAM), high-bandwidth memory (HBM), hybrid memory cube (HMC), etc., or any other medium that may be used to store the desired information and that may be accessed by an application, a module, or both. Any such computer storage media may be part of, accessible by, or connectable to the device. Any of the applications or modules described herein may be implemented using computer-readable/executable instructions that may be stored or otherwise maintained by such computer-readable media.
In the description of this specification, "a plurality", "a number of" or "several" means at least two, for example two, three or more, unless explicitly defined otherwise.
While various embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Many modifications, changes, and substitutions will now occur to those skilled in the art without departing from the spirit and scope of the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention.

Claims (6)

1. A gear defect detection method based on image processing, comprising:
acquiring an initial gear image, and performing segmentation processing on the initial gear image to acquire a single-tooth image;
acquiring the gray value of each pixel point of the single-tooth image; classifying pixel points whose gray values fall within a first reference gray value and its preset gray neighborhood range, and whose number is larger than a first threshold, as normal surface pixel points; classifying pixel points whose gray values fall within a second reference gray value and its preset gray neighborhood range as wear surface pixel points; and classifying the other pixel points in the single-tooth image as boundary pixel points; the wear surface pixel points are the pixel points on the surface of the gear wear area;
in response to the number of wear surface pixel points being larger than a preset threshold, determining that the gear has a tooth surface wear defect, and acquiring the approximate region of the wear surface from the wear surface pixel points;
marking the boundary pixel points above the approximate region of the wear surface as first region boundary pixel points and fitting a first boundary line curve from them, and marking the boundary pixel points below the approximate region of the wear surface as second region boundary pixel points and fitting a second boundary line curve from them; fitting the first boundary line curve includes: fitting the first region boundary pixel points to obtain a first fitting curve; if the dispersion between the first fitting curve and the first region boundary pixel points is smaller than a preset dispersion threshold, marking the first fitting curve as the first boundary line curve; otherwise, on the side of the first fitting curve with fewer pixel points, finding the pixel point farthest from the curve and recording its distance as the maximum distance, keeping only the pixel points on that side whose distance to the first fitting curve is smaller than two thirds of the maximum distance, keeping all pixel points on the side with more pixel points so as to obtain new first region boundary pixel points, and fitting the new first region boundary pixel points to obtain a new first fitting curve;
fitting boundary pixel points on the first boundary line curve to obtain a first boundary line reference line; fitting boundary pixel points on the second boundary line curve to obtain a second boundary line reference line; acquiring an upper surface boundary line according to the first boundary line curve, acquiring a front surface boundary line according to the second boundary line curve, and taking a region between the upper surface boundary line and the front surface boundary line as a defect evaluation region;
taking one half of the sum of the single-tooth upper surface area and the single-tooth front surface area as the maximum reference wear area, and calculating the gear wear degree by combining the Euclidean distance between the first boundary line reference line and the first boundary line curve, the Euclidean distance between the second boundary line reference line and the second boundary line curve, and the area of the defect evaluation region; wherein the calculation expression takes as inputs the natural constant, the area of the defect evaluation region, the maximum reference wear area, and the two Euclidean distances, and outputs the gear wear degree.
2. The image processing-based gear defect detection method according to claim 1, wherein the calculation method of the euclidean distance between the first boundary line reference line and the first boundary line curve includes: and calculating Euclidean distances between all pixel points on the first boundary line curve and the first boundary line reference line, calculating an average value of the Euclidean distances corresponding to all pixel points on the first boundary line curve, and taking the average value as the Euclidean distance between the first boundary line reference line and the first boundary line curve.
3. The image processing-based gear defect detection method according to claim 1, wherein the first reference gray value acquisition method is: shooting normal surfaces of different single teeth by using a camera so as to obtain all pixel points of the normal surfaces of each single tooth, solving a mean value of the gray values of all the obtained pixel points, and taking the mean value as a first reference gray value.
4. The image processing-based gear defect detection method according to claim 1, further comprising:
acquiring a single-tooth upper surface area and a single-tooth front surface area according to the normal surface pixel points, and identifying the number of boundary pixel points in the single-tooth upper surface area and the single-tooth front surface area;
and in response to the fact that the number of the boundary pixel points in the single-tooth upper surface area or the single-tooth front surface area is larger than a preset fourth threshold value and the boundary pixel points enclose a closed graph, judging that the gear has the spot pit defect.
5. The image processing-based gear defect detection method according to any one of claims 1 to 4, wherein a cubic polynomial fit is used in fitting the first boundary line curve and the second boundary line curve.
6. An image processing-based gear defect detection system, comprising a processor and a memory, wherein the memory stores computer program instructions that, when executed by the processor, implement the image processing-based gear defect detection method of any one of claims 1-5.

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013140050A (en) * 2011-12-28 2013-07-18 Sumitomo Chemical Co Ltd Defect inspection device and defect inspection method
US20170270647A1 (en) * 2014-12-09 2017-09-21 SZ DJI Technology Co., Ltd. Image processing method, device and photographic apparatus
CN107966450A (en) * 2017-12-12 2018-04-27 宁波格劳博机器人有限公司 Battery pole piece coating defect detects separation system
CN109142366A (en) * 2018-06-13 2019-01-04 广东拓斯达科技股份有限公司 Spherical housing defect inspection method, device and computer readable storage medium
CN109325930A (en) * 2018-09-12 2019-02-12 苏州优纳科技有限公司 Detection method, device and the detection device of boundary defect
WO2020133046A1 (en) * 2018-12-27 2020-07-02 深圳配天智能技术研究院有限公司 Defect detection method and device
CN113393569A (en) * 2021-06-09 2021-09-14 昆山一麦自动化科技有限公司 Fitting method based on distance priority strategy and application thereof
CN113658133A (en) * 2021-08-16 2021-11-16 江苏鑫丰源机电有限公司 Gear surface defect detection method and system based on image processing
CN114913177A (en) * 2022-07-19 2022-08-16 山东聊城富锋汽车部件有限公司 Automobile part defect detection method based on Hough circle
CN115049670A (en) * 2022-08-16 2022-09-13 南通兴拓精密机械有限公司 Tooth profile defect detection method based on gear
CN115641348A (en) * 2022-10-17 2023-01-24 沈阳化工大学 Method for determining pupil edge of eye based on user-defined area factor
CN116152749A (en) * 2023-04-20 2023-05-23 青岛义龙包装机械有限公司 Intelligent gear wear monitoring method based on digital twin
WO2023134789A1 (en) * 2022-10-25 2023-07-20 苏州德斯米尔智能科技有限公司 Automatic inspection method for belt-type conveying device
CN116721106A (en) * 2023-08-11 2023-09-08 山东明达圣昌铝业集团有限公司 Profile flaw visual detection method based on image processing


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DONGXU LIU 等: "Bidimensional local characteristic-scale decomposition and its application in gear surface defect detection", 《MEAS. SCI. TECHNOL》, 9 November 2023 (2023-11-09) *
罗玮; 张荣福; 郁浩; 邬奇: "Research on billet defect detection based on image processing" (基于图像处理的钢坯缺陷检测研究), Software Guide (软件导刊), 26 August 2016 (2016-08-26) *
豆永坤: "Research on geometric shape detection of mechanical parts based on machine vision" (基于机器视觉的机械零件几何外形检测研究), CNKI (知网), vol. 2018, no. 9, 15 September 2018 (2018-09-15) *



Legal Events

    • PB01 Publication
    • SE01 Entry into force of request for substantive examination
    • GR01 Patent grant