CN115239661A - Mechanical part burr detection method and system based on image processing - Google Patents

Mechanical part burr detection method and system based on image processing

Info

Publication number
CN115239661A
Authority
CN
China
Prior art keywords
edge pixel
edge
pixel point
burr
ideal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210851986.8A
Other languages
Chinese (zh)
Inventor
牧笛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Henan University of Animal Husbandry and Economy
Original Assignee
Henan University of Animal Husbandry and Economy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Henan University of Animal Husbandry and Economy filed Critical Henan University of Animal Husbandry and Economy
Priority to CN202210851986.8A priority Critical patent/CN115239661A/en
Publication of CN115239661A publication Critical patent/CN115239661A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 7/001 Industrial image inspection using an image reference approach
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/761 Proximity, similarity or dissimilarity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30164 Workpiece; Machine component
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V 2201/06 Recognition of objects for industrial automation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of artificial intelligence, and in particular to a method and a system for detecting burrs of mechanical parts based on image processing. The method comprises the following steps: acquiring a surface image of a part to be detected, and determining therefrom an actual edge image of the part; acquiring the number and gray levels of the edge pixel points in the preset neighborhood of each edge pixel point in the actual edge image, and obtaining therefrom a feature descriptor for each edge pixel point, so as to determine the non-burr edge pixel points among the edge pixel points and their corresponding feature descriptors, and finally the probability of each edge pixel point being a burr; and according to the non-burr edge pixel points and the burr probability of each edge pixel point, acquiring an ideal burr-free edge image of the part to be detected, and determining therefrom the burr detection result of the part. The invention not only improves the efficiency of mechanical part burr detection, but also helps to improve the accuracy of burr detection and to save labor cost.

Description

Mechanical part burr detection method and system based on image processing
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a method and a system for detecting burrs of a mechanical part based on image processing.
Background
With the rapid industrial development in China in recent years, the demand for mechanical parts has grown ever larger. During the machining of mechanical parts, however, burrs often appear at the part edges, that is, uneven flash is left at the edges of cold-cut, hot-sawn or flame-cut steel products. In general, a part is allowed to carry burrs up to a certain height, but some mechanical parts, such as welded pipes, must be smoothed and deburred. To guarantee the machining quality of mechanical parts, burr detection therefore needs to be performed on them.
In the traditional burr detection mode, a worker estimates on site, or watches images captured by a remote camera, to judge whether burrs exist and where they are located. This detection mode not only takes a long time and incurs high labor cost, but its efficiency is also strongly affected by the worker's proficiency and fatigue, and very small burrs or burrs in hidden positions are easily missed. In addition, part burrs can also be detected with infrared rays, but this detection method requires complex and expensive equipment and places high demands on the environmental conditions, so that the cost of the part burr detection process becomes excessive.
Disclosure of Invention
In order to solve the technical problem of poor burr detection accuracy for mechanical parts, the invention aims to provide a method and a system for detecting burrs of mechanical parts based on image processing.
In order to solve the technical problem, the invention provides a mechanical part burr detection method based on image processing, which comprises the following steps:
acquiring a surface image of a part to be detected, and further determining an actual edge image of the part to be detected;
acquiring the number and gray levels of the edge pixel points in the preset neighborhood of each edge pixel point in the actual edge image of the part to be detected, and acquiring a feature descriptor of each edge pixel point in the actual edge image according to the number and gray levels of the edge pixel points in its preset neighborhood;
determining each non-burr edge pixel point among the edge pixel points in the actual edge image and its corresponding feature descriptor according to the feature descriptors of the edge pixel points in the actual edge image, and further determining the probability of each edge pixel point in the actual edge image being a burr;
acquiring an ideal burr-free edge image of the part to be detected according to the non-burr edge pixel points and the probability of each edge pixel point being a burr;
and determining the burr detection result of the part to be detected according to the actual edge image of the part to be detected and the ideal edge image of the part to be detected without burrs.
Further, the step of acquiring the ideal burr-free edge image of the part to be detected comprises:
establishing a coordinate system according to an actual edge image of a part to be detected, and acquiring actual coordinates of each edge pixel point of the actual edge image;
clustering each non-burr edge pixel point according to the actual coordinate of each non-burr edge pixel point so as to obtain each dense area;
acquiring ideal coordinates corresponding to each burr pixel point in each dense area according to each non-burr edge pixel point in each dense area and the actual abscissa corresponding to each burr pixel point in each dense area;
acquiring ideal edges corresponding to the dense areas according to the actual coordinates of the non-burr edge pixel points in the dense areas and the ideal coordinates of the burr pixel points;
and determining an ideal burr-free edge image of the part to be detected according to the ideal edges corresponding to the dense areas, and the actual coordinates and burr probabilities of the edge pixel points that are not in the dense areas.
Further, the step of determining the ideal edge image of the part to be detected without burrs comprises the following steps:
determining the nearest edge pixel point not in any dense area at either end of the ideal edge corresponding to each dense area, and determining the predicted ordinate of this nearest edge pixel point according to the ideal edge and the actual abscissa of the nearest edge pixel point;
determining the ideal ordinate of the nearest edge pixel point not in any dense area according to its predicted ordinate, its actual ordinate and its probability of being a burr;
and updating the ideal edge corresponding to the dense area according to the actual abscissa and ideal ordinate of the nearest edge pixel point not in any dense area to obtain the updated ideal edge corresponding to the dense area, determining the nearest edge pixel point not in any dense area at either end of the updated ideal edge, and repeating the above steps until the ideal ordinates of all edge pixel points not in any dense area are determined.
Further, the calculation formula for determining the ideal ordinate of the nearest edge pixel point not in any dense area is:

Y_k = p(x_k, y_k) · ŷ_k + (1 − p(x_k, y_k)) · y_k

wherein Y_k is the ideal ordinate corresponding to the nearest edge pixel point k not in any dense area at either end of the ideal edge corresponding to the dense area, p(x_k, y_k) is the probability that this edge pixel point k is a burr, ŷ_k is its predicted ordinate, y_k is its actual ordinate, and x_k is its actual abscissa.
Further, the step of obtaining the feature descriptors of the edge pixel points comprises:
calculating the gray gradient of each edge pixel point in the preset neighborhood of each edge pixel point, and determining the gray gradient change characteristic of each edge pixel point according to the gray gradients of the edge pixel points in its preset neighborhood;
and acquiring the feature descriptor of each edge pixel point according to the number of edge pixel points in its preset neighborhood and its gray gradient change characteristic.
Further, the step of calculating the gray gradient change characteristic of each edge pixel point comprises:
acquiring the gradient unit vectors of every two adjacent edge pixel points in the preset neighborhood of the edge pixel point, and further acquiring the cosine similarity of the gradient unit vectors of every two adjacent edge pixel points in the preset neighborhood of the edge pixel point;
determining the change sequence of the gradient unit vectors of the edge pixel points in the preset neighborhood according to the cosine similarities of the gradient unit vectors of every two adjacent edge pixel points in the preset neighborhood of the edge pixel point, and further acquiring the autocorrelation matrix of the edge pixel point;
and determining the gray gradient change characteristics of the edge pixel points according to the autocorrelation matrix of the edge pixel points.
Further, the step of determining the probability of burrs existing in each edge pixel point comprises the following steps:
and calculating the similarity between the feature descriptors of the edge pixel points and the feature descriptors of the non-burr edge pixel points, and taking the similarity as the probability that the corresponding edge pixel points are burrs.
Further, the calculation formula for calculating the probability of each edge pixel point being a burr is:

p(x, y) = 1 − exp(−(|N_(x,y) − N_0| + |T_(x,y) − T_0|))

wherein p(x, y) is the probability that the edge pixel point (x, y) is a burr, N_(x,y) is the number of edge pixel points in the preset neighborhood of the edge pixel point (x, y), N_0 is the number of edge pixel points in the preset neighborhood of a non-burr edge pixel point, T_(x,y) is the gray gradient change characteristic of the edge pixel point (x, y), and T_0 is the gray gradient change characteristic of a non-burr edge pixel point.
Further, the step of obtaining the burr detection result of the part to be detected comprises:
and performing an exclusive-OR operation on the actual edge image of the part to be detected and the ideal burr-free edge image of the part to be detected, acquiring the comparison image corresponding to the two edge images, and acquiring the burr areas of the part to be detected according to the comparison image.
The invention further provides a mechanical part burr detection system based on image processing, which comprises a processor and a memory, the processor being configured to process instructions stored in the memory so as to implement the above mechanical part burr detection method based on image processing.
The invention has the following beneficial effects:
the method comprises the steps of firstly obtaining an actual edge image of a part to be detected, and obtaining a feature descriptor of each edge pixel point according to the number of each edge pixel point in the field of each edge pixel point in the actual edge image of the part to be detected and the gray level change feature of each edge pixel point. Through each non-burr edge pixel point in each edge pixel point of probability that is the burr of each edge pixel point can be accurate determination, and then acquire the ideal coordinate of each non-burr edge pixel point in ideal edge image, can acquire the ideal edge image that does not have the burr from this, through the contrast of the ideal edge image and the actual edge image of waiting to detect the part, can accurately acquire the burr testing result of waiting to detect the part different positions, the accuracy that the burr detected has been improved, the efficiency that the burr detected has also been improved simultaneously.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below; it is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a method for detecting burrs of a mechanical part based on image processing according to the present invention.
Detailed Description
To further explain the technical means and effects of the present invention adopted to achieve the predetermined objects, the following detailed description of the embodiments, structures, features and effects of the technical solutions according to the present invention will be given with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The embodiment provides a method for detecting burrs of a mechanical part based on image processing, which comprises the following steps:
(1) Acquiring a surface image of the part to be detected, and further determining an actual edge image of the part to be detected.
In this embodiment, a camera is used to obtain a surface image of the part to be detected, where the camera is an RGB camera and the surface image is an RGB image of the part surface; the surface image of the part to be detected is subjected to graying processing. The grayed surface image is then processed with the Canny edge detection operator to obtain the actual edge image I_by of the part to be detected; that is, the edge region of the RGB image is the ROI region for burr detection. The edge information of the acquired actual edge image contains the burr features of the part, which is beneficial to subsequently and accurately identifying the burr areas and positions of the part to be detected. Both the graying processing and the Canny edge detection operator are prior art, are not within the protection scope of the present invention, and will not be described in detail herein.
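The following is a minimal sketch of this step (not part of the original patent text), assuming OpenCV is available; the function name and the Canny thresholds are illustrative choices, since the patent does not specify them:

```python
import cv2

# Sketch of step (1): graying processing followed by Canny edge detection.
# The thresholds 50/150 are assumptions, not values given by the patent.
def actual_edge_image(path: str):
    rgb = cv2.imread(path)                        # surface image of the part
    gray = cv2.cvtColor(rgb, cv2.COLOR_BGR2GRAY)  # graying processing
    edges = cv2.Canny(gray, 50, 150)              # actual edge image I_by (0/255)
    return (edges > 0).astype("uint8")            # binary edge map (0/1)
```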
(2) Obtaining the number and gray levels of the edge pixel points in the preset neighborhood of each edge pixel point in the actual edge image of the part to be detected, and obtaining a feature descriptor of each edge pixel point in the actual edge image according to the number and gray levels of the edge pixel points in its preset neighborhood.
(2-1) Acquiring the number and gray levels of the edge pixel points in the preset neighborhood of each edge pixel point in the actual edge image of the part to be detected, wherein the steps comprise:
in this embodiment, the actual coordinates (x, y) of any edge pixel point in the actual edge image of the part to be detected are obtained, where x is an abscissa and y is an ordinate, the preset domain range of the edge pixel point (x, y) in the abscissa direction is [ x- α, x + α ], the preset domain range in the ordinate direction is [ y- α, y + α ], and α is a preset parameter, and the value of the preset parameter α in this embodiment is 2. Of course, in other embodiments, the implementer sets the magnitude of the parameter α according to the specific situation.
According to the size of the field range of the edge pixel point (x, y) and each edge pixel point in the field range of the edge pixel point (x, y), the length of the edge in the preset field of the edge pixel point (x, y), namely the number N of each edge pixel point in the preset field of the edge pixel point (x, y), is obtained (x,y) And obtaining the gray value of each edge pixel point in the preset field of the edge pixel point (x, y).
It should be noted that, as can be known from the priori knowledge, if the edge pixel (x, y) is a burr pixel, the number of edge pixels of the burr edge pixel is obviously greater than that of the non-burr edge pixels within the field range, so N is (x,y) The burr characteristics at the edge pixel point (x, y) can be reflected.
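A sketch of this counting step, under the assumption that `edges` is the binary edge map from step (1); whether the center pixel itself is counted is not specified by the patent, so that choice is flagged in a comment:

```python
import numpy as np

# Sketch of step (2-1): N_(x,y) as the count of edge pixels in the
# (2*alpha+1)^2 window; alpha = 2 follows the embodiment. Whether the
# center pixel itself is counted is not specified; here it is included.
def neighborhood_edge_count(edges: np.ndarray, x: int, y: int, alpha: int = 2) -> int:
    h, w = edges.shape
    win = edges[max(y - alpha, 0):min(y + alpha + 1, h),
                max(x - alpha, 0):min(x + alpha + 1, w)]
    return int(win.sum())
```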
(2-2) According to the number and gray levels of the edge pixel points in the preset neighborhood of each edge pixel point, acquiring a feature descriptor of each edge pixel point in the actual edge image, wherein the steps comprise:
(2-2-1) calculating the gray gradient of each edge pixel point in the preset neighborhood of each edge pixel point, and determining the gray gradient change characteristic of each edge pixel point according to the gray gradients of the edge pixel points in its preset neighborhood.
It should be noted that, since the burr area of a part is an irregular area, the gradient direction distribution of the edge pixel points in the preset neighborhood of an edge pixel point in a burr area is more disordered than in a non-burr area of the part. Therefore, the gray gradient change characteristic T_(x,y) of each edge pixel point can reflect the burr feature of the edge pixel point (x, y), and the steps of acquiring the gray gradient change characteristic T_(x,y) of each edge pixel point comprise:
(2-2-1-1) obtaining the gradient unit vectors of every two adjacent edge pixel points in the preset neighborhood of the edge pixel point, and further obtaining the cosine similarity of the gradient unit vectors of every two adjacent edge pixel points in the preset neighborhood of the edge pixel point.
In this embodiment, the gray gradient direction of each edge pixel point within the preset neighborhood range of the edge pixel point (x, y) is first obtained, and the edge pixel points are then numbered in a certain order; in this embodiment the number of an edge pixel point is denoted l. The gray gradient direction of the edge pixel point numbered l within the preset neighborhood range of the edge pixel point (x, y) is represented by the unit vector d_l, and that of the adjacent edge pixel point numbered l + 1 by the unit vector d_(l+1). Since d_l and d_(l+1) are unit vectors, the cosine similarity of the gradient unit vectors of every two adjacent edge pixel points in the preset neighborhood of the edge pixel point (x, y) is calculated as:

Sim_(l,l+1) = d_l · d_(l+1)

wherein Sim_(l,l+1) is the cosine similarity between the edge pixel point numbered l and the adjacent edge pixel point numbered l + 1 within the preset neighborhood range of the edge pixel point (x, y), d_l is the gray gradient unit vector of the edge pixel point numbered l within the preset neighborhood range of the edge pixel point (x, y), and d_(l+1) is the gray gradient unit vector of the edge pixel point numbered l + 1 within the preset neighborhood range of the edge pixel point (x, y).

According to the same method, the cosine similarity of every two adjacent edge pixel points within the preset neighborhood range of the edge pixel point (x, y) is obtained.
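One possible implementation of this step, assuming Sobel as the gradient operator and that `pts` already lists the neighborhood's edge pixel points in their numbered order (both assumptions, as the patent fixes neither):

```python
import numpy as np
import cv2

# Sketch of step (2-2-1-1): gradient unit vectors of adjacent numbered
# edge pixels and their cosine similarities. `gray` is the grayed image
# from step (1); `pts` holds (x, y) pairs in the chosen numbering order.
def cosine_similarity_sequence(gray: np.ndarray, pts: list[tuple[int, int]]) -> np.ndarray:
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    sims = []
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):       # adjacent pairs l, l+1
        v0 = np.array([gx[y0, x0], gy[y0, x0]])
        v1 = np.array([gx[y1, x1], gy[y1, x1]])
        v0 = v0 / (np.linalg.norm(v0) + 1e-12)          # gradient unit vector d_l
        v1 = v1 / (np.linalg.norm(v1) + 1e-12)          # gradient unit vector d_(l+1)
        sims.append(float(v0 @ v1))                     # Sim_(l,l+1) = d_l · d_(l+1)
    return np.array(sims)                               # length N_(x,y) - 1
```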
(2-2-1-2) determining the change sequence of the gradient unit vectors of the edge pixel points in the preset neighborhood according to the cosine similarities of the gradient unit vectors of every two adjacent edge pixel points in the preset neighborhood of the edge pixel point, and further acquiring the autocorrelation matrix of the edge pixel point.
In this embodiment, according to the cosine similarities of all adjacent pairs of edge pixel points within the preset neighborhood range of the edge pixel point (x, y) obtained in step (2-2-1-1), a gray gradient direction change sequence of 1 row and N_(x,y) − 1 columns is obtained, from which an autocorrelation matrix Z of size (N_(x,y) − 1) × (N_(x,y) − 1) is constructed, wherein the value in row u, column v of the autocorrelation matrix Z is calculated as:

Z_(u,v) = exp(−|Sim_u − Sim_v|)

wherein Z_(u,v) is the value in row u, column v of the autocorrelation matrix Z, Sim_u is the u-th cosine similarity in the gray gradient direction change sequence, and Sim_v is the v-th cosine similarity in the gray gradient direction change sequence.
(2-2-1-3) determining the gray gradient change characteristics of the edge pixel points according to the autocorrelation matrix of the edge pixel points.
In this embodiment, the gray gradient change characteristic of the edge pixel point (x, y) is calculated from the autocorrelation matrix Z of the edge pixel point (x, y) as:

T_(x,y) = ||Z||_1 / (N_(x,y) − 1)^2

wherein T_(x,y) is the gray gradient change characteristic within the preset neighborhood of the edge pixel point (x, y), ||Z||_1 is the L1 norm of the matrix Z (that is, the sum of the absolute values of its elements), and N_(x,y) is the number of edge pixel points in the preset neighborhood of the edge pixel point (x, y), so that (N_(x,y) − 1)^2 is the number of elements of Z and T_(x,y) is the average value of its elements.
It should be noted that, if the gray gradient of the edge pixel point (x, y) changes smoothly, the degrees of gray gradient change of all adjacent edge pixel points in the preset neighborhood of the edge pixel point (x, y) are close to one another, the values in the autocorrelation matrix Z are close to 1, and the gray gradient change characteristic T_(x,y) of the edge pixel point (x, y) is accordingly close to 1.
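A sketch combining steps (2-2-1-2) and (2-2-1-3), assuming `sims` is the cosine similarity sequence from the previous sketch; the normalization of the L1 norm by (N_(x,y) − 1)^2 is our reading of the reconstructed formula:

```python
import numpy as np

# Sketch of steps (2-2-1-2) and (2-2-1-3): build the autocorrelation
# matrix Z and average its entries to get T_(x,y).
def gray_gradient_change_feature(sims: np.ndarray) -> float:
    if sims.size == 0:
        return 1.0                                       # treat a lone pixel as smooth
    z = np.exp(-np.abs(sims[:, None] - sims[None, :]))   # Z_(u,v) = exp(-|Sim_u - Sim_v|)
    return float(z.sum() / z.size)                       # T_(x,y) = ||Z||_1 / (N-1)^2
```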
(2-2-2) acquiring the feature descriptor of each edge pixel point according to the number of edge pixel points in its preset neighborhood and its gray gradient change characteristic.
In this embodiment, according to the number N_(x,y) of edge pixel points in the preset neighborhood of the edge pixel point (x, y) and the gray gradient change characteristic T_(x,y) of the edge pixel point, the feature descriptor of the edge pixel point (x, y), namely the vector (N_(x,y), T_(x,y)), is obtained.
In addition, it should be noted that the feature descriptor of the edge pixel point (x, y) has now been obtained, and the feature descriptor of every edge pixel point in the actual edge image can be obtained according to step (2-2).
(3) According to the feature descriptors of the edge pixel points in the actual edge image, determining each non-burr edge pixel point among the edge pixel points in the actual edge image and its corresponding feature descriptor, and further determining the probability of each edge pixel point in the actual edge image being a burr.
(3-1) determining each non-burr edge pixel point and the corresponding feature descriptor in each edge pixel point in the actual edge image according to the feature descriptor of each edge pixel point in the actual edge image, wherein the steps comprise:
in this embodiment, the number of times of appearance of the feature descriptor corresponding to each edge pixel point in the actual edge image is counted first, and the three-dimensional curved surface is drawn, and the height value of the three-dimensional curved surface can reflect the number of times of appearance of the feature descriptor corresponding to each edge pixel point. Under the condition that no major error occurs in the machining process, most part edge regions are free of burrs, so that the feature descriptors corresponding to the maximum height value of the three-dimensional curved surface are determined to be the feature descriptors of all non-burr edge pixel points in all edge pixel points in the actual edge image, and then all non-burr edge pixel points in all edge pixel points in the actual edge image are determined.
And (3-2) calculating the similarity between the feature descriptors of the edge pixel points and the feature descriptors of the non-burr edge pixel points, and taking the similarity as the probability that the corresponding edge pixel points are burrs.
In this embodiment, the probability p that the edge pixel point (x, y) is a burr is obtained from the comparison between the feature descriptor of the edge pixel point and the feature descriptor of the non-burr edge pixel points; the calculation formula of the probability p is:

p(x, y) = 1 − exp(−(|N_(x,y) − N_0| + |T_(x,y) − T_0|))

wherein p(x, y) is the probability that the edge pixel point (x, y) is a burr, N_(x,y) is the number of edge pixel points in the preset neighborhood of the edge pixel point (x, y), N_0 is the number of edge pixel points in the preset neighborhood of a non-burr edge pixel point, T_(x,y) is the gray gradient change characteristic of the edge pixel point (x, y), and T_0 is the gray gradient change characteristic of a non-burr edge pixel point. When the feature descriptor of an edge pixel point coincides with the non-burr descriptor (N_0, T_0), p(x, y) is 0.
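A direct transcription of the reconstructed probability formula; since the original formula is rendered as an image in the source, the exact combination of terms is an assumption, kept consistent with p = 0 at the non-burr descriptor:

```python
import numpy as np

# Sketch of step (3-2): burr probability from descriptor deviation.
# The formula's exact form is a reconstruction, not the patent's verbatim one.
def burr_probability(n_xy: float, t_xy: float, n0: float, t0: float) -> float:
    return float(1.0 - np.exp(-(abs(n_xy - n0) + abs(t_xy - t0))))
```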
(4) Acquiring an ideal burr-free edge image of the part to be detected according to the non-burr edge pixel points and the probability of each edge pixel point being a burr, which comprises the following steps:
and (4-1) establishing a coordinate system according to the actual edge image of the part to be detected, and acquiring the actual coordinates of each edge pixel point of the actual edge image.
A coordinate system is established on the actual edge image of the part to be detected, so as to obtain the actual coordinates of each pixel point on the actual edge image, namely the actual coordinates of each edge pixel point of the actual edge image. The obtained actual abscissas of the edge pixel points facilitate the subsequent determination of the ideal ordinate of each burr edge pixel point.
And (4-2) clustering the non-burr edge pixel points according to the actual coordinates of the non-burr edge pixel points, so as to obtain the dense area of the edge pixel points.
In this embodiment, each non-burr edge pixel point is analyzed by using the DBSCAN density clustering algorithm, that is, each non-burr edge pixel point is clustered by using the DBSCAN density clustering algorithm, so as to obtain each cluster. The DBSCAN density clustering algorithm is prior art and is not within the scope of the present invention, and is not described in detail herein.
In this embodiment, a rectangular frame with a size of 3 × 3 is set and slid within each cluster until all clusters have been traversed; at each sliding position, the density of the non-burr edge pixel points within the rectangular frame is calculated as:

ρ = Num / 9

wherein ρ is the density of non-burr edge pixel points within the rectangular frame at the current sliding position, Num is the number of non-burr edge pixel points within the rectangular frame, namely the edge pixel points whose burr probability p is 0, and the value 9 is the area of the rectangular frame.
It should be noted that, the greater the number of non-burr edge pixel points within the rectangular frame, the greater its density. The density index ρ of the rectangular frame at each sliding position is obtained according to the above density calculation formula, and it is judged whether ρ is greater than the set density threshold ρ_0; if so, the edge pixel points within the rectangular frame form a dense area. In this embodiment the density threshold ρ_0 is set to 4.
According to step (4-2), the dense areas of edge pixel points can be screened out from the clusters.
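A sketch of step (4-2), assuming scikit-learn's DBSCAN; eps and min_samples are illustrative, and because the patent sets ρ_0 = 4 while ρ = Num / 9 cannot exceed 1, the density test is read here as the count test Num > 4:

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Sketch of step (4-2): DBSCAN clustering of non-burr edge points, then a
# sliding 3x3 frame to keep the dense areas. eps/min_samples are assumptions;
# the density test rho > rho_0 is read as Num > 4 (see the note above).
def dense_regions(non_burr_pts: np.ndarray) -> list[np.ndarray]:
    labels = DBSCAN(eps=3, min_samples=4).fit_predict(non_burr_pts)
    regions = []
    for lab in set(labels) - {-1}:                       # -1 marks DBSCAN noise
        cluster = non_burr_pts[labels == lab]
        keep = []
        for cx, cy in cluster:                           # slide the 3x3 frame
            num = np.sum((np.abs(cluster[:, 0] - cx) <= 1) &
                         (np.abs(cluster[:, 1] - cy) <= 1))
            if num > 4:                                  # density test rho > rho_0
                keep.append((cx, cy))
        if keep:
            regions.append(np.array(keep))
    return regions
```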
And (4-3) acquiring the ideal coordinates corresponding to each burr pixel point in each dense area according to the non-burr edge pixel points in each dense area and the actual abscissas corresponding to the burr pixel points in each dense area.
In this embodiment, an interpolation algorithm is used to obtain the coordinates of the burr edge pixel points lying between the non-burr edge pixel points in each dense area; that is, with the actual coordinates of the non-burr edge pixel points in each dense area and the actual abscissas of the burr pixel points known, the ideal ordinate corresponding to each burr pixel point in each dense area is determined, so as to obtain the ideal coordinates corresponding to each burr pixel point in each dense area. The interpolation algorithm is prior art, is not within the protection scope of the present invention, and will not be described in detail herein.
It should be noted that the ideal coordinates corresponding to each burr pixel point in each dense area refer to coordinates on an ideal edge image where no burr exists in the part to be detected. If the edge pixel point has a burr, the actual coordinate of the edge pixel point on the actual edge image is different from the ideal coordinate of the ideal edge image without the burr, so that the ideal coordinate corresponding to each burr pixel point in each dense area needs to be obtained.
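A sketch of step (4-3) using linear interpolation; the patent names "the interpolation algorithm" without fixing its kind, so the linear choice is an assumption:

```python
import numpy as np

# Sketch of step (4-3): ideal ordinates of burr pixels interpolated from
# the non-burr points of the same dense area, indexed by actual abscissa.
def interpolated_ordinates(non_burr: np.ndarray, burr_x: np.ndarray) -> np.ndarray:
    order = np.argsort(non_burr[:, 0])                 # sort by abscissa
    xs, ys = non_burr[order, 0], non_burr[order, 1]
    return np.interp(burr_x, xs, ys)                   # ideal ordinate per burr abscissa
```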
And (4-4) acquiring ideal edges corresponding to the dense areas according to the actual coordinates of the non-burr edge pixel points and the ideal coordinates of the burr pixel points in the dense areas.
The actual coordinates of the non-burr edge pixel points in each dense area and the ideal coordinates of the burr pixel points are used to perform polynomial fitting on the edge pixel points in the dense area. To ensure that the fitted polynomial conforms to the true trend of the edge, the highest power n of the polynomial is set to 5 in this embodiment according to prior knowledge, and the polynomial y = f(x) is constructed as:

f(x) = Σ_{j=0}^{n} w_j · x^j

wherein n is the highest power of the polynomial, j is the power of each term, and w_j is the coefficient of the term of power j.
In this step, the obtained actual coordinates of the non-burr edge pixel points and the ideal coordinates of the burr pixel points in each dense area are taken as the polynomial fitting data, and the final fitting result, namely the ideal edge corresponding to each dense area, is obtained by the least square method. The process of fitting the coordinates of discrete points by the least square method is prior art, is not within the protection scope of the present invention, and will not be described in detail here.
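A sketch of the fitting step, using numpy's least-squares polynomial fit with the embodiment's highest power n = 5:

```python
import numpy as np

# Sketch of step (4-4): degree-5 least-squares fit over a dense area's
# points; np.polyfit performs the least-squares step, np.poly1d wraps f(x).
def ideal_edge(region_pts: np.ndarray, n: int = 5):
    w = np.polyfit(region_pts[:, 0], region_pts[:, 1], deg=n)  # polynomial coefficients
    return np.poly1d(w)                                        # ideal edge y = f(x)
```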
And (4-5) determining an ideal burr-free edge image of the part to be detected according to the ideal edges corresponding to the dense areas, and the actual coordinates and burr probabilities of the edge pixel points that are not in the dense areas.
(4-5-1) determining the nearest edge pixel point not in any dense area at either end of the ideal edge corresponding to each dense area, and determining the predicted ordinate of this nearest edge pixel point according to the ideal edge and the actual abscissa of the nearest edge pixel point.
First, for the ideal edge corresponding to each dense area, the edge pixel points that are not in any dense area and are closest to the two ends of the ideal edge are obtained; that is, each end of each ideal edge is matched with the nearest edge pixel point that is not in any dense area, and these two points are recorded as the first target edge point and the second target edge point of the ideal edge. Secondly, the actual abscissas of the first target edge point and the second target edge point corresponding to the ideal edge of each dense area are respectively substituted into the fitted polynomial corresponding to the ideal edge, so as to obtain the predicted ordinates of the first target edge point and the second target edge point.
(4-5-2) determining the ideal ordinate of the nearest edge pixel point not in any dense area according to its predicted ordinate, its actual ordinate and its probability of being a burr, wherein the calculation formula is:

Y_k = p(x_k, y_k) · ŷ_k + (1 − p(x_k, y_k)) · y_k

wherein Y_k is the ideal ordinate of the nearest edge pixel point k not in any dense area at either end of the ideal edge corresponding to the dense area, p(x_k, y_k) is the probability that this edge pixel point k is a burr, ŷ_k = f(x_k) is its predicted ordinate obtained from the fitted polynomial, y_k is its actual ordinate, and x_k is its actual abscissa.
It should be noted that the probability that the nearest edge pixel point k not in any dense area, at either end of the ideal edge corresponding to the dense area, is a burr is used as the weight between the predicted ordinate and the actual ordinate, so as to ensure that the ideal ordinate of this edge pixel point k is more accurate. For example, when p(x_k, y_k) is 0, the actual ordinate of the edge pixel point k is directly taken as its ideal ordinate; when p(x_k, y_k) is 1, the predicted ordinate of the edge pixel point k is directly taken as its ideal ordinate.
Through step (4-5-2) in the above manner, the ideal ordinates of the first target edge point and the second target edge point corresponding to the ideal edge of each dense area can be determined.
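A direct sketch of the weighting formula of step (4-5-2), assuming `f` is the fitted polynomial of the ideal edge:

```python
# Sketch of step (4-5-2): Y_k blends the predicted and actual ordinates,
# weighted by the burr probability p_k of the target edge point.
def blended_ordinate(f, x_k: float, y_k: float, p_k: float) -> float:
    y_pred = f(x_k)                          # predicted ordinate from the fitted edge
    return p_k * y_pred + (1.0 - p_k) * y_k  # ideal ordinate Y_k
```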
(4-5-3) updating the ideal edge corresponding to the dense area according to the actual abscissa and the ideal ordinate of the nearest edge pixel point not in each dense area to obtain the updated ideal edge corresponding to the dense area, determining the nearest edge pixel point not in each dense area at any end of the updated ideal edge corresponding to the dense area, and repeating the steps until the ideal ordinates of all the edge pixel points not in each dense area are determined.
The ideal edge corresponding to the dense area associated with the edge pixel point k is updated according to the obtained actual abscissa and ideal ordinate of the edge pixel point k not in any dense area; that is, the edge pixel point k is added to the ideal edge corresponding to the dense area, so that the ideal edge extends to the edge pixel point k. In this way, each ideal edge corresponding to a dense area can be extended to the actual abscissa and ideal ordinate positions of its first target edge point and second target edge point, thereby realizing the updating process of each ideal edge.
According to the updated ideal edge corresponding to each dense area, the nearest edge pixel points not in any dense area at either end of the updated ideal edge are determined again; that is, steps (4-5-1) to (4-5-3) are repeated until the ideal ordinates of all edge pixel points not in any dense area are obtained, namely until each ideal edge has been extended to the actual abscissa and ideal ordinate positions of all edge pixel points not in any dense area. The ideal burr-free edge image of the part to be detected is finally obtained.
(5) And determining the burr area of the part to be detected according to the actual edge image of the part to be detected and the ideal edge image of the part to be detected without burrs.
An exclusive-OR operation is performed on the actual edge image of the part to be detected and the ideal burr-free edge image of the part to be detected, the comparison image corresponding to the two edge images is acquired, the burr areas of the part to be detected are acquired according to the comparison image, and the specific positions of the different burr areas are determined.
In this embodiment, the ideal burr-free edge image of the part to be detected obtained in step (4) is recorded as Î_by. An exclusive-OR operation (0 where the two images are the same, 1 where they differ) is performed on the actual edge image I_by and the ideal edge image Î_by to obtain the comparison image. If there is no burr in the actual edge image I_by, the edge pixel points at all positions in the comparison image are 0; if there are burrs in the actual edge image I_by, the corresponding edge pixel points in the comparison image are 1, and the areas where the edge pixel points are 1 are the burr edge areas.
This embodiment uses connected domain analysis to extract the closed edges of all burrs, so that the burr detection results at different positions of the part to be detected can be obtained quickly and accurately; the detection results include whether burrs exist in the part to be detected and the specific positions of the burr areas. Connected domain analysis is prior art, is not within the protection scope of the present invention, and will not be described in detail herein.
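A sketch of step (5), assuming OpenCV and that `actual` and `ideal` are binary edge maps of equal size; reporting bounding boxes is one convenient way to give the burr positions:

```python
import cv2
import numpy as np

# Sketch of step (5): XOR comparison image, then connected domain analysis
# to locate the burr areas. Returns one (x, y, w, h) box per burr area.
def burr_regions(actual: np.ndarray, ideal: np.ndarray):
    diff = cv2.bitwise_xor(actual, ideal)              # comparison image: 1 marks burr edges
    n, labels, stats, _ = cv2.connectedComponentsWithStats(diff, connectivity=8)
    return [tuple(stats[i, :4]) for i in range(1, n)]  # label 0 is the background
```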
This embodiment further provides a mechanical part burr detection system based on image processing, comprising a processor and a memory, the processor being configured to process instructions stored in the memory so as to implement the above mechanical part burr detection method based on image processing; since the method has been described above, it is not explained in detail herein.
It should be noted that: the sequence of the above embodiments of the present invention is only for description, and does not represent the advantages or disadvantages of the embodiments. And specific embodiments thereof have been described above. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and should not be taken as limiting the scope of the present invention, which is intended to cover any modifications, equivalents, improvements, etc. within the spirit and scope of the present invention.

Claims (10)

1. A mechanical part burr detection method based on image processing is characterized by comprising the following steps:
acquiring a surface image of a part to be detected, and further determining an actual edge image of the part to be detected;
acquiring the number and gray levels of the edge pixel points in the preset neighborhood of each edge pixel point in the actual edge image of the part to be detected, and acquiring a feature descriptor of each edge pixel point in the actual edge image according to the number and gray levels of the edge pixel points in its preset neighborhood;
determining each non-burr edge pixel point among the edge pixel points in the actual edge image and its corresponding feature descriptor according to the feature descriptors of the edge pixel points in the actual edge image, and further determining the probability of each edge pixel point in the actual edge image being a burr;
acquiring an ideal burr-free edge image of the part to be detected according to the non-burr edge pixel points and the probability of each edge pixel point being a burr;
and determining the burr detection result of the part to be detected according to the actual edge image of the part to be detected and the ideal edge image of the part to be detected without burrs.
2. The image processing-based mechanical part burr detection method according to claim 1, wherein the step of obtaining an ideal edge image of the part to be detected without burrs comprises:
establishing a coordinate system according to an actual edge image of a part to be detected, and acquiring actual coordinates of each edge pixel point of the actual edge image;
clustering each non-burr edge pixel point according to the actual coordinate of each non-burr edge pixel point so as to obtain each dense area;
acquiring ideal coordinates corresponding to each burr pixel point in each dense area according to each non-burr edge pixel point in each dense area and the actual abscissa corresponding to each burr pixel point in each dense area;
acquiring ideal edges corresponding to each dense area according to actual coordinates of each non-burr edge pixel point in each dense area and ideal coordinates of each burr pixel point;
and determining an ideal burr-free edge image of the part to be detected according to the ideal edges corresponding to the dense areas, and the actual coordinates and burr probabilities of the edge pixel points that are not in the dense areas.
3. The image processing-based mechanical part burr detection method according to claim 2, wherein the step of determining an ideal edge image of the part to be detected without burrs comprises:
determining the nearest edge pixel point not in any dense area at either end of the ideal edge corresponding to each dense area, and determining the predicted ordinate of this nearest edge pixel point according to the ideal edge and the actual abscissa of the nearest edge pixel point;
determining the ideal ordinate of the nearest edge pixel point not in any dense area according to its predicted ordinate, its actual ordinate and its probability of being a burr;
and updating the ideal edge corresponding to the dense area according to the actual abscissa and ideal ordinate of the nearest edge pixel point not in any dense area to obtain the updated ideal edge corresponding to the dense area, determining the nearest edge pixel point not in any dense area at either end of the updated ideal edge, and repeating the above steps until the ideal ordinates of all edge pixel points not in any dense area are determined.
4. The image processing-based mechanical part burr detection method according to claim 3, wherein the calculation formula for determining the ideal ordinate of the nearest edge pixel point not in any dense area is:

Y_k = p(x_k, y_k) · ŷ_k + (1 − p(x_k, y_k)) · y_k

wherein Y_k is the ideal ordinate corresponding to the nearest edge pixel point k not in any dense area at either end of the ideal edge corresponding to the dense area, p(x_k, y_k) is the probability that this edge pixel point k is a burr, ŷ_k is its predicted ordinate, y_k is its actual ordinate, and x_k is its actual abscissa.
5. The image processing-based mechanical part burr detection method according to claim 1, wherein the step of obtaining the feature descriptor of each edge pixel point comprises:
calculating the gray gradient of each edge pixel point in the preset neighborhood of each edge pixel point, and determining the gray gradient change characteristic of each edge pixel point according to the gray gradients of the edge pixel points in its preset neighborhood;
and acquiring the feature descriptor of each edge pixel point according to the number of edge pixel points in its preset neighborhood and its gray gradient change characteristic.
6. The image processing-based mechanical part burr detection method according to claim 5, wherein the step of calculating the gray gradient change characteristic of each edge pixel point comprises:
acquiring the gradient unit vectors of every two adjacent edge pixel points in the preset neighborhood of the edge pixel point, and further acquiring the cosine similarity of the gradient unit vectors of every two adjacent edge pixel points in the preset neighborhood of the edge pixel point;
determining the change sequence of the gradient unit vectors of the edge pixel points in the preset neighborhood according to the cosine similarities of the gradient unit vectors of every two adjacent edge pixel points in the preset neighborhood of the edge pixel point, and further acquiring the autocorrelation matrix of the edge pixel point;
and determining the gray gradient change characteristics of the edge pixel points according to the autocorrelation matrix of the edge pixel points.
7. The image processing-based mechanical part burr detection method according to claim 5, wherein the step of determining the probability of burrs existing at each edge pixel point comprises:
and calculating the similarity between the feature descriptors of the edge pixel points and the feature descriptors of the non-burr edge pixel points, and taking the similarity as the probability that the corresponding edge pixel points are burrs.
8. The image processing-based mechanical part burr detection method according to claim 7, wherein the calculation formula for calculating the probability of each edge pixel point being a burr is:

p(x, y) = 1 − exp(−(|N_(x,y) − N_0| + |T_(x,y) − T_0|))

wherein p(x, y) is the probability that the edge pixel point (x, y) is a burr, N_(x,y) is the number of edge pixel points in the preset neighborhood of the edge pixel point (x, y), N_0 is the number of edge pixel points in the preset neighborhood of a non-burr edge pixel point, T_(x,y) is the gray gradient change characteristic of the edge pixel point (x, y), and T_0 is the gray gradient change characteristic of a non-burr edge pixel point.
9. The image processing-based mechanical part burr detection method according to claim 1, wherein the step of obtaining the burr detection result of the part to be detected comprises:
and performing an exclusive-OR operation on the actual edge image of the part to be detected and the ideal burr-free edge image of the part to be detected, acquiring the comparison image corresponding to the two edge images, and acquiring the burr areas of the part to be detected according to the comparison image.
10. An image processing-based machine part burr detection system, comprising a processor and a memory, the processor being configured to process instructions stored in the memory to implement the image processing-based machine part burr detection method according to any one of claims 1 to 9.
CN202210851986.8A 2022-07-19 2022-07-19 Mechanical part burr detection method and system based on image processing Pending CN115239661A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210851986.8A CN115239661A (en) 2022-07-19 2022-07-19 Mechanical part burr detection method and system based on image processing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210851986.8A CN115239661A (en) 2022-07-19 2022-07-19 Mechanical part burr detection method and system based on image processing

Publications (1)

Publication Number Publication Date
CN115239661A true CN115239661A (en) 2022-10-25

Family

ID=83672642

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210851986.8A Pending CN115239661A (en) 2022-07-19 2022-07-19 Mechanical part burr detection method and system based on image processing

Country Status (1)

Country Link
CN (1) CN115239661A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116740057A (en) * 2023-08-11 2023-09-12 深圳市鹏基精密工业有限公司 Cylindrical workpiece burr online detection method and system
CN116740057B (en) * 2023-08-11 2023-12-01 深圳市鹏基精密工业有限公司 Cylindrical workpiece burr online detection method and system
CN117325211A (en) * 2023-12-01 2024-01-02 江苏中科云控智能工业装备有限公司 Deburring robot pose monitoring system and method based on Internet of things
CN117325211B (en) * 2023-12-01 2024-02-09 江苏中科云控智能工业装备有限公司 Deburring robot pose monitoring system and method based on Internet of things
CN117921197A (en) * 2024-03-21 2024-04-26 上海强华实业股份有限公司 Method, system, equipment and medium for manufacturing special-shaped groove plate by laser precision cutting
CN117921197B (en) * 2024-03-21 2024-06-07 上海强华实业股份有限公司 Method, system, equipment and medium for manufacturing special-shaped groove plate by laser precision cutting

Similar Documents

Publication Publication Date Title
CN115239661A (en) Mechanical part burr detection method and system based on image processing
CN107014294B (en) Contact net geometric parameter detection method and system based on infrared image
US8553980B2 (en) Method and apparatus extracting feature points and image based localization method using extracted feature points
CN109118473B (en) Angular point detection method based on neural network, storage medium and image processing system
CN108229475B (en) Vehicle tracking method, system, computer device and readable storage medium
US9767383B2 (en) Method and apparatus for detecting incorrect associations between keypoints of a first image and keypoints of a second image
CN110546651B (en) Method, system and computer readable medium for identifying objects
CN108898132B (en) Terahertz image dangerous article identification method based on shape context description
CN111161222B (en) Printing roller defect detection method based on visual saliency
CN116091504B (en) Connecting pipe connector quality detection method based on image processing
CN111080661A (en) Image-based line detection method and device and electronic equipment
CN111667470B (en) Industrial pipeline flaw detection inner wall detection method based on digital image
CN114612345B (en) Light source detection method based on image processing
CN110084830B (en) Video moving object detection and tracking method
CN116137036B (en) Gene detection data intelligent processing system based on machine learning
CN116740072B (en) Road surface defect detection method and system based on machine vision
CN113837198B (en) Improved self-adaptive threshold Canny edge detection method based on three-dimensional block matching
CN111080631A (en) Fault positioning method and system for detecting floor defects of spliced images
CN107610174B (en) Robust depth information-based plane detection method and system
CN111462056A (en) Workpiece surface defect detection method, device, equipment and storage medium
CN114943744A (en) Edge detection method based on local Otsu thresholding
CN107808165B (en) Infrared image matching method based on SUSAN corner detection
CN113643290B (en) Straw counting method and device based on image processing and storage medium
CN111473944B (en) PIV data correction method and device for observing complex wall surface in flow field
CN111768436B (en) Improved image feature block registration method based on fast-RCNN

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination