CN115861313B - Abrasion detection method for grinding head - Google Patents

Abrasion detection method for grinding head Download PDF

Info

Publication number
CN115861313B
CN115861313B CN202310162114.5A CN202310162114A CN115861313B CN 115861313 B CN115861313 B CN 115861313B CN 202310162114 A CN202310162114 A CN 202310162114A CN 115861313 B CN115861313 B CN 115861313B
Authority
CN
China
Prior art keywords
abrasion
region
area
gray
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310162114.5A
Other languages
Chinese (zh)
Other versions
CN115861313A (en
Inventor
罗强
黄大路
柯善政
罗永念
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongguan Chuncao Grinding Technology Co ltd
Original Assignee
Dongguan Chuncao Grinding Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongguan Chuncao Grinding Technology Co ltd filed Critical Dongguan Chuncao Grinding Technology Co ltd
Priority to CN202310162114.5A priority Critical patent/CN115861313B/en
Publication of CN115861313A publication Critical patent/CN115861313A/en
Application granted granted Critical
Publication of CN115861313B publication Critical patent/CN115861313B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Image Analysis (AREA)
  • Constituent Portions Of Griding Lathes, Driving, Sensing And Control (AREA)

Abstract

The invention discloses a wear detection method for a grinding head, belonging to the technical field of image processing; the method comprises the following steps: acquiring a grinding head surface image; taking the pixel point corresponding to the maximum comprehensive significance as a wear seed point; acquiring a currently growing abrasion area; acquiring a wear area growing next time; and analogically, acquiring a wear area until the grown wear area is not changed any more; and judging the abrasion degree of the grinding head according to the ratio of the number of the pixel points in the abrasion area to the total number of the pixel points in the grinding head surface image. According to the invention, through the distribution condition in the abrasion scene, the self-adaptive change range of the merging threshold is determined, and the abrasion area of the surface of the grinding head is accurately extracted.

Description

Abrasion detection method for grinding head
Technical Field
The invention relates to the technical field of image processing, in particular to a wear detection method for a grinding head.
Background
The grinding head is a small grinding tool, the appearance is in a handle shape, the grinding head can be divided into a plurality of types according to different materials, different types of grinding heads act on different positions, for example, a rubber grinding head is commonly used for polishing a die, and a abrasive cloth grinding head is commonly used for polishing the inner wall of the aperture.
In the grinding process, when the phenomena of excessively high hardness of grinding materials and the like occur, the grinding head can generate mechanical abrasion to a certain extent in the grinding process, under the condition of serious abrasion, the grinding head can be passivated quickly to lose grinding capability, the grinding head can not be used continuously, the mechanical abrasion not only reduces the performance of the grinding head as a grinding tool, but also can directly influence the quality and the production efficiency of a product once the abrasion of the grinding head is not detected in time, so that the abrasion detection of the grinding head is a key joint in the mould production process. However, under the current production conditions, the grinding head is usually detected by on-site detection of workers, and the detection mode has a plurality of limiting conditions and the reliability of detection results is not high.
In order to achieve the above object, in the process of acquiring the wear area by using the area growing method, the selection of seed points is relatively random, the judging condition of area combination is fixed, and the method is suitable for relatively regular area identification, but for the respectively uneven wear areas, the wear areas on the grinding head are difficult to accurately identify, so that the wear condition of the grinding head is difficult to judge, and therefore, an efficient and rapid wear detection method for industrial production is needed for detecting the grinding head.
Disclosure of Invention
In order to solve the deficiencies in the background art; the invention provides a wear detection method for a grinding head, which is characterized in that a seed point of a wear area with higher confidence is obtained as a growth starting point of the wear area through judging the corresponding wear degree between pixel points in a gray level chart of a surface ROI of the grinding head, and the growth rule of the wear area is set according to the wear condition, so that the influence of a grinding head interference area is effectively avoided, the segmentation result of the wear area of the surface of the grinding head is accurately extracted, and the detection precision of the grinding head wear is improved.
The invention aims to provide a wear detection method for a grinding head, which comprises the following steps of:
acquiring a grinding head surface image; acquiring an interested area containing abrasion in the grinding head surface image; graying treatment is carried out on the region of interest to obtain a gray scale map of the region of interest;
acquiring the comprehensive significance of each pixel point according to the gray value of each pixel point in the gray map of the region of interest, the gray average value of the line and the gray average value of the column; taking the pixel point corresponding to the maximum comprehensive significance as a wear seed point;
combining the pixel points of which the difference value of the comprehensive significance in the abrasion seed points and the adjacent areas is smaller than the initial first combining threshold value to obtain an initial abrasion area;
taking the deviation degree of the comprehensive significance of each pixel point in the initial wear area relative to the comprehensive significance of the wear seed point as a merging coefficient of the initial wear area; acquiring a current first merging threshold range according to the comprehensive significance mean value and the merging coefficient of the pixel points in the initial wear area;
combining the pixel points, the difference value of which is smaller than the current first combining threshold range, of the comprehensive significance of the abrasion seed points and the adjacent difference value of the initial abrasion areas with the initial abrasion areas to obtain the abrasion areas growing currently;
similarly, acquiring a next first merging threshold range according to the currently grown abrasion region; combining the pixel points, of which the comprehensive saliency difference value is smaller than the range of the next first combining threshold value, adjacent to the abrasion seed point and the abrasion area growing at present with the abrasion area growing at present to obtain the abrasion area growing at next time; and analogically, acquiring a wear area until the grown wear area is not changed any more;
and judging the abrasion degree of the grinding head according to the ratio of the number of the pixel points in the abrasion area to the total number of the pixel points in the grinding head surface image.
In an embodiment, the process of acquiring the wear area further includes:
taking the pixel point corresponding to the minimum comprehensive significance as a non-abrasion seed point; acquiring a non-abrasion area according to the step of acquiring the non-abrasion seed points through area growth;
the worn region and the non-worn region are worn regions obtained when the worn region and the non-worn region grow from the worn seed point and the non-worn seed point, respectively, until the worn region and the non-worn region grown are no longer changed, and the worn region and the non-worn region do not overlap.
In one embodiment, the non-worn area is obtained by:
combining the non-abrasion seed points with the pixel points in the neighborhood of which the difference value of the comprehensive significance is smaller than a second combining threshold value to obtain an initial non-abrasion region;
taking the deviation degree of the comprehensive significance of each pixel point in the initial non-abrasion area relative to the comprehensive significance of the non-abrasion seed point as a merging coefficient of the initial non-abrasion area; acquiring a current second merging threshold range according to the comprehensive saliency mean value and the merging coefficient of the pixel points in the initial non-abrasion area;
combining the pixel points, the difference value of which is smaller than the current second combining threshold range, of the comprehensive significance of the non-abrasion seed points and the adjacent difference value of the initial non-abrasion areas with the initial non-abrasion areas to obtain the currently growing non-abrasion areas;
similarly, a second merging threshold range is obtained next time according to the currently grown non-abrasion area; combining the pixel points, of which the comprehensive saliency difference value between the non-abrasion seed points and the non-abrasion areas growing at present is smaller than the second combining threshold range next, with the non-abrasion areas growing at present to obtain non-abrasion areas growing at next time; and so on until the growing non-worn area is no longer changed, the non-worn area is acquired.
In an embodiment, an absolute value of a difference between the integrated saliency of the non-worn seed point and a mean value of the integrated saliency of the pixel points in the gray scale map of the region of interest is used as the second merging threshold.
In an embodiment, the integrated saliency of each pixel is obtained according to the following steps:
according to the difference between the gray average value of each pixel point in the gray map of the region of interest and the gray average value of the gray map of the region of interest and the difference between the gray value of each pixel point and the gray average value of the gray map of the region of interest, the significance of each pixel point in the gray map of the region of interest in the horizontal direction is obtained;
according to the difference between the gray average value of each pixel point in the gray map of the region of interest and the gray average value of the gray map of the region of interest and the difference between the gray value of each pixel point and the gray average value of the gray map of the region of interest, the significance of each pixel point in the gray map of the region of interest in the vertical direction is obtained;
and acquiring the comprehensive saliency of each pixel point according to the saliency of each pixel point in the gray scale map of the region of interest in the horizontal direction and the saliency of each pixel point in the vertical direction.
In an embodiment, when there are a plurality of pixels corresponding to the obtained maximum integrated saliency, a pixel corresponding to the maximum saliency in the horizontal direction and/or the maximum saliency in the vertical direction is selected from the plurality of pixels corresponding to the maximum integrated saliency as the wear seed point.
In an embodiment, the difference between the combined saliency of the wear seed points and the combined saliency of the non-wear seed points is taken as a first combining threshold.
The beneficial effects of the invention are as follows: according to the abrasion detection method for the grinding head, provided by the invention, the seed points of the abrasion area with higher confidence coefficient are obtained to serve as the growth starting points of the abrasion area through judging the corresponding abrasion degree between the pixel points in the ROI gray level chart of the surface of the grinding head, and the growth rules of the abrasion area are set according to the abrasion condition, so that the influence of the interference area of the grinding head is effectively avoided, the segmentation result of the abrasion area of the surface of the grinding head is accurately extracted, and the abrasion detection precision of the grinding head is improved.
According to the invention, the seed points are selected according to the distribution condition of the areas in the scene, the scene characteristics are set, the seed points in the scene are more reliable, the determination of the merging coefficients is used as the parameters of the subsequent merging threshold values, the self-adaptive change range of the merging threshold values is determined according to the distribution condition in the scene, and the accurate extraction of the abrasion areas of the grinding head surface is realized.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart illustrating the general steps of an embodiment of a wear detection method for a grinding head according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The following examples illustrate the invention by way of wear detection for a silicon carbide sintered T-shaped grinding head.
The invention provides a wear detection method for a grinding head, which is shown in fig. 1, and comprises the following steps:
s1, acquiring a grinding head surface image; acquiring an interested area containing abrasion in the grinding head surface image; graying treatment is carried out on the region of interest to obtain a gray scale map of the region of interest;
in the embodiment, an industrial camera is installed above the working position of the grinding head, images of all surfaces ground by the grinding head are obtained, and the obtained images are RGB images; there is a lot of noise in the grinding head wear image obtained by the image acquisition device, and these interference points affect the extraction of the subsequent wear area. Therefore, before the grinding head abrasion area is obtained, the obtained image is denoising processed by using the median filtering denoising technology in the embodiment, the median filtering denoising is a known technology, and the detailed description of the specific process is omitted. Taking the abrasion region as a saliency region in the grinding head surface image, acquiring a region of interest (ROI) which possibly is the abrasion region in the grinding head surface image by using a saliency detection algorithm, wherein the saliency detection is a known technology, and the detailed process is not repeated.
It should be noted that, the reason for the abrasion area of the grinding head is mechanical abrasion, so that the abrasion surface tends to be rough, the abrasion area is difficult to be accurately segmented from the grinding head surface image, the ROI area primarily confirms the abrasion approximate area, and the calculation amount is reduced. In order to reduce the complexity of subsequent image processing, the region of interest is subjected to graying processing to obtain a region of interest (ROI) gray scale map. The grinding head abrasion image mainly comprises an abrasion region and a background region, and the ROI gray scale image approximately shows the range of the abrasion region, but still contains a large number of background pixel points, and the background pixel points hardly contain useful abrasion information. The reflection phenomenon of the grinding head region enables an interference region to exist in the obtained ROI gray scale image, and the brightness characteristics of the interference region are similar to those of the pixel points in the abrasion region. But the edges of the interference area are more regular, in sharp contrast to the worn areas caused by mechanical wear.
S2, acquiring abrasion seed points and non-abrasion seed points;
acquiring the comprehensive significance of each pixel point according to the gray value of each pixel point in the gray map of the region of interest, the gray average value of the line and the gray average value of the column; taking the pixel point corresponding to the maximum comprehensive significance as a wear seed point;
it should be noted that, the region growth starts from the seed point, and adjacent pixels having similar gray values, texture information, color features, and the like to the seed point are combined into one region. Therefore, in this embodiment, through the distribution characteristics among the image pixels in the grinding head ROI gray map, the pixels located in the wear area are selected as the first type of area growth start points, namely the wear seed points, the pixels located in the background area are selected as the second type of area seed points, the remaining pixels in the ROI gray map and the two types of seed points are respectively subjected to similarity measurement, and the area growth is performed according to the measurement result, so that the type of each pixel in the ROI gray map can be accurately distinguished by the area growth result satisfying the two measurements, and a more accurate wear area segmentation result is obtained. In addition, the growth rules are important parameters which need to be determined by the area growth algorithm, the growth rules determine the measurement rules and the merging modes of the seed pixel points to the next pixel point, and different growth rules are designed aiming at the characteristics of the seed points in the abrasion area and the seed points in the non-abrasion area.
In this embodiment, since the abrasion is caused by the abrasion of the grinding head in the process of grinding materials, the abrasion region distribution on the surface of the grinding head is observed, the characteristics of irregular shape and irregular distribution of the abrasion region are not difficult to find, the characteristics are reflected in the ROI gray scale map, the gray scale value changes in the horizontal direction and the vertical direction in the abrasion region are relatively intense, the abrasion significance is constructed here and used for representing the gray scale change characteristics among pixels in the ROI gray scale map, and the comprehensive significance of each pixel is obtained according to the following steps:
according to the difference between the gray average value of each pixel point in the gray map of the region of interest and the gray average value of the gray map of the region of interest and the difference between the gray value of each pixel point and the gray average value of the gray map of the region of interest, the significance of each pixel point in the gray map of the region of interest in the horizontal direction is obtained; the significance calculation formula of each pixel point in the horizontal direction is as follows:
Figure SMS_1
in the method, in the process of the invention,
Figure SMS_3
is the significance of pixel i in the horizontal direction; n is the number of pixels in the row where pixel point i is located;
Figure SMS_6
Is the gray value average value of the ROI gray scale map;
Figure SMS_8
Is the gray value of pixel i;
Figure SMS_4
Figure SMS_5
Is to calculate->
Figure SMS_7
Weight parameter of time, < ->
Figure SMS_9
Experience values of 0.7 +.>
Figure SMS_2
The empirical values are respectively 0.3;
according to the difference between the gray average value of each pixel point in the gray map of the region of interest and the gray average value of the gray map of the region of interest and the difference between the gray value of each pixel point and the gray average value of the gray map of the region of interest, the significance of each pixel point in the gray map of the region of interest in the vertical direction is obtained; the saliency calculation formula of each pixel point in the vertical direction is as follows:
Figure SMS_10
in the method, in the process of the invention,
Figure SMS_11
is the saliency of pixel i in the vertical direction; m is the number of pixels in the column where pixel point i is located;
Figure SMS_12
Figure SMS_13
Is to calculate->
Figure SMS_14
Weight parameters at the time;
Figure SMS_15
Experience values of 0.7 +.>
Figure SMS_16
The empirical values are respectively 0.3;
wherein,,
Figure SMS_17
Figure SMS_18
and->
Figure SMS_19
Figure SMS_20
The effect of the weights is to amplify the difference, say 10 and 9 differ by 1, which is a relatively small difference value compared to 10 and 9, but at the same time by 10, the difference becomes 10, which is more pronounced.
Acquiring the comprehensive saliency of each pixel point according to the saliency of each pixel point in the gray scale map of the region of interest in the horizontal direction and the saliency of each pixel point in the vertical direction; the comprehensive saliency calculation formula of each pixel point is as follows:
Figure SMS_21
wherein, the formula is the comprehensive significance of the pixel point i;
Figure SMS_22
is the significance of pixel i in the horizontal direction;
Figure SMS_23
Is the saliency of pixel i in the vertical direction; the integrated significance reflects the gray scale change condition of the pixel point in the ROI gray scale map, < +.>
Figure SMS_24
The larger the pixel gray value of the row where the pixel point i is located is, the more obvious the difference between the gray average value of the pixel gray value of the row where the pixel point i is located and the gray average value of the whole ROI gray image is, and the more obvious the difference between the pixel gray value of the column where the pixel point i is located and the gray average value of the whole ROI gray image is, the more likely the pixel point i is the pixel point in the abrasion region. It should be noted that, the integrated saliency of each pixel point is comprehensively represented by multiplying the saliency in the horizontal direction and the saliency in the vertical direction, because the difference of the abrasion saliency can be better represented by accumulating the saliency in two different directions by multiplication, the larger the accumulated value is, the more obvious the difference is, and the saliency of each pixel point in the gray scale image of interest can be represented.
Taking the pixel point corresponding to the maximum comprehensive significance as a wear seed point; when a plurality of pixels corresponding to the maximum comprehensive saliency are obtained, selecting the pixel corresponding to the maximum saliency in the horizontal direction and/or the pixel corresponding to the maximum saliency in the vertical direction from the pixels corresponding to the maximum comprehensive saliency as a wear seed point.
Further, taking the pixel point corresponding to the minimum comprehensive significance as a non-abrasion seed point; similarly, when a plurality of pixels corresponding to the minimum integrated saliency are obtained, selecting the pixel corresponding to the minimum saliency in the horizontal direction and/or the pixel corresponding to the minimum saliency in the vertical direction from the pixels corresponding to the minimum integrated saliency as a wear seed point.
In this embodiment, the entire ROI gray map is traversed, the integrated saliency of each pixel point is calculated, the maximum saliency in the horizontal direction is selected with the maximum integrated saliency as a precondition, the intersection point of the maximum saliency in the vertical direction is the seed point of the wear region, that is, the wear seed point, the minimum saliency in the horizontal direction is selected with the minimum integrated saliency as a precondition, the intersection point of the minimum saliency in the vertical direction is selected as the seed point of the background region, that is, the non-wear seed point.
S3, acquiring a wear area and a non-wear area;
combining the pixel points of which the difference value of the comprehensive significance in the abrasion seed points and the adjacent areas is smaller than the initial first combining threshold value to obtain an initial abrasion area;
taking the deviation degree of the comprehensive significance of each pixel point in the initial wear area relative to the comprehensive significance of the wear seed point as a merging coefficient of the initial wear area; acquiring a current first merging threshold range according to the comprehensive significance mean value and the merging coefficient of the pixel points in the initial wear area;
combining the pixel points, the difference value of which is smaller than the current first combining threshold range, of the comprehensive significance of the abrasion seed points and the adjacent difference value of the initial abrasion areas with the initial abrasion areas to obtain the abrasion areas growing currently;
similarly, acquiring a next first merging threshold range according to the currently grown abrasion region; combining the pixel points, of which the comprehensive saliency difference value is smaller than the range of the next first combining threshold value, adjacent to the abrasion seed point and the abrasion area growing at present with the abrasion area growing at present to obtain the abrasion area growing at next time; and analogically, acquiring a wear area until the grown wear area is not changed any more;
wherein, in the process of obtaining the wearing area, further comprising:
taking the pixel point corresponding to the minimum comprehensive significance as a non-abrasion seed point; acquiring a non-abrasion area according to the step of acquiring the non-abrasion seed points through area growth;
the worn area and the unworn area are worn areas obtained when the worn area and the unworn area are not changed and the worn area and the unworn area are not overlapped, and the worn area and the unworn area are grown simultaneously from the worn seed point and the unworn seed point respectively.
The non-worn area is obtained by the following steps:
combining the non-abrasion seed points with the pixel points in the neighborhood of which the difference value of the comprehensive significance is smaller than a second combining threshold value to obtain an initial non-abrasion region;
taking the deviation degree of the comprehensive significance of each pixel point in the initial non-abrasion area relative to the comprehensive significance of the non-abrasion seed point as a merging coefficient of the initial non-abrasion area; acquiring a current second merging threshold range according to the comprehensive saliency mean value and the merging coefficient of the pixel points in the initial non-abrasion area; and taking the absolute value of the difference value between the comprehensive significance of the non-abrasion seed points and the average value of the comprehensive significance of the pixel points in the gray level image of the region of interest as a second merging threshold value.
Combining the pixel points, the difference value of which is smaller than the current second combining threshold range, of the comprehensive significance of the non-abrasion seed points and the adjacent difference value of the initial non-abrasion areas with the initial non-abrasion areas to obtain the currently growing non-abrasion areas;
similarly, a second merging threshold range is obtained next time according to the currently grown non-abrasion area; combining the pixel points, of which the comprehensive saliency difference value is smaller than the range of the next second combining threshold value, of the non-abrasion seed points adjacent to the non-abrasion area growing at present with the non-abrasion area growing at present to obtain the non-abrasion area growing at next time; and so on until the growing non-worn area is no longer changed, the non-worn area is acquired.
In this embodiment, the worn area and the unworn area are obtained as follows:
step 1, the difference value of the comprehensive saliency between the selected abrasion seed point j and the eight surrounding areas is smaller than a first merging threshold value
Figure SMS_25
Is combined into an initial wear area +.>
Figure SMS_26
The method comprises the steps of carrying out a first treatment on the surface of the Wherein the difference between the combined significance of the wear seed points and the combined significance of the non-wear seed points is taken as a first combining threshold +.>
Figure SMS_27
The method comprises the steps of carrying out a first treatment on the surface of the It should be noted that, the feature value corresponding to the pixel point in the wear area should be close to the wear seed point and far from the non-wear seed point, and the difference between the integrated saliency of the pixel point in the wear area and the integrated saliency of the wear seed point is considered to be smaller than the difference between the integrated saliency of the wear seed point and the integrated saliency of the non-wear seed point, so the difference between the integrated saliency of the wear seed point and the integrated saliency of the non-wear seed point is taken as the first combining threshold.
Step 2, setting the combination coefficient of the abrasion region, wherein the combination coefficient of the abrasion region has the function of judging the condition of the next growth combination of the region according to the existing growth result in the ROI gray level diagram, and calculating the combination coefficient of the abrasion region corresponding to the kth abrasion region because the abrasion conditions of different positions of the grinding head are different
Figure SMS_28
The calculation formula is as follows:
Figure SMS_29
wherein X is the number of pixels in the wear area obtained by combining the kth growth,
Figure SMS_30
is the comprehensive significance corresponding to the x-th pixel point;
Figure SMS_31
Is the comprehensive significance corresponding to the abrasion seed points;
Figure SMS_32
Is the combining coefficient of the wear area obtained by the kth growth combining; when k=1, then->
Figure SMS_33
Is the merging coefficient of the initial wear region. It should be noted that, the pixel points in the current abrasion area have a certain degree of similarity, and meanwhile, there is a slight difference, because the abrasion degrees of the different grinding head positions are different, the difference reflects the difference between the corresponding comprehensive saliency of the pixel points, so that the abrasion degree of the current area is represented by the distribution condition of the comprehensive saliency of the pixel points in the area, and the greater the difference of the comprehensive saliency of the pixel points in the area is, the area merging coefficient is>
Figure SMS_34
The larger the abrasion in the current abrasion zone is, the more serious the abrasion in the current abrasion zone is, further, in the embodiment, the abrasion zone combination coefficient of the current zone is taken as one of judging factors of the next zone growth.
Step 3, for each region growth, calculating a comprehensive saliency mean value of pixel points in the merging region and a wear region merging coefficient obtaining threshold value according to the previous growth
Figure SMS_35
Is based on the wear area combination coefficient +.>
Figure SMS_36
And the already grown wear zone->
Figure SMS_37
Calculate the wear area of the next growth result +.>
Figure SMS_38
Calculate the combined threshold range for growth k+1st time +.>
Figure SMS_39
Figure SMS_40
In the method, in the process of the invention,
Figure SMS_41
is the integrated saliency mean value of the pixel points in the wear area of the kth growth result, +.>
Figure SMS_42
The combination coefficient of the wear areas corresponding to the kth wear area is obtained, c is an influence factor, and the magnitude of c is 0.1. It should be noted that, since the pixel points with similar features are to be merged in the region growing, that is, for the current existing region, determining whether the next adjacent pixel points can be merged needs to evaluate the degree of similarity between the two, that is, whether the region growing criterion is satisfied. Merging is performed if the degree of similarity meets the criterion of region growth, otherwise not merging. In this embodiment, the similarity of the pixels is evaluated by the integrated saliency, and for the worn area, the integrated saliency corresponding to the pixels in the area is relatively close, that is, the adjacent pixels are similar to the average value of the integrated saliency of the current area within a certain threshold range. Further, since the wear degree of the grinding heads is different at different positions, the threshold range needs to be adjusted adaptively with the change of the merging area. So that firstly, the region merging coefficient is obtained by the difference of the comprehensive significance between the pixel point and the seed point in the current existing region>
Figure SMS_43
Secondly based on->
Figure SMS_44
Obtaining a threshold range from the mean value of the default significance of the current wearing region, and obtaining a region merging coefficient +.>
Figure SMS_45
The larger the abrasion area is, the more uneven the abrasion degree is, the area combination coefficient is +.>
Figure SMS_46
The larger the threshold range +.>
Figure SMS_47
The larger the range of the adjacent pixels is, the larger the probability of being combined, because the uneven degree of wear on the grinding head indicates that the larger the degree of wear is, the higher the intensity used in the grinding process is, and the larger the corresponding worn area should be.
Step 4, further calculating the worn area
Figure SMS_48
Wear significance mean ∈>
Figure SMS_49
And the corresponding wear zone combination coefficient->
Figure SMS_50
Step 5, taking the non-abrasion seed point p as a starting point, growing a non-abrasion region according to the principles of the steps 2-4, and enabling the comprehensive significance difference between the seed point and the eight surrounding areas to be smaller than a second merging threshold value
Figure SMS_51
Is combined into an initial non-wearing region +.>
Figure SMS_52
Calculating threshold value of non-wearing area +.>
Figure SMS_53
: wherein, the absolute value of the difference value between the comprehensive significance of the non-abrasion seed points and the average value of the comprehensive significance of the pixel points in the gray level image of the region of interest is taken as a second merging threshold value +.>
Figure SMS_54
The method comprises the steps of carrying out a first treatment on the surface of the It should be noted that the integrated significance of the pixel points in the non-worn area should be close to the integrated significance of the non-worn seed points, and on the other hand, the defect exists on the grinding head, so that the whole grinding head area pair is causedThe corresponding integrated saliency average value is larger than the integrated saliency of the non-abrasion seed point, and the difference value between the integrated saliency of the non-abrasion seed point p and the integrated saliency of the inner pixel points in the surrounding 8 neighborhood is smaller than the absolute value of the difference value between the integrated saliency of the non-abrasion seed point and the average value of the integrated saliency of the pixel points in the gray map of the region of interest, so the absolute value of the difference value between the integrated saliency of the non-abrasion seed point and the average value of the integrated saliency of the pixel points in the gray map of the region of interest is taken as a second combining threshold value.
Further, the region merging coefficients of the non-worn region are constructed in the same manner
Figure SMS_55
Threshold->
Figure SMS_56
And (3) production merging is performed on the non-worn area.
The growth rules for the worn seed point j and the unworn seed point p obtained thus far include a first merge threshold for the worn area metric
Figure SMS_57
Wear zone merger coefficient->
Figure SMS_58
Non-wearing region measures a second merge threshold +.>
Figure SMS_59
And region merging coefficient of non-worn region +.>
Figure SMS_60
And 6, repeating the steps 2, 3, 4 and 5 until the growth result of the abrasion region is not changed any more, and no overlapping region exists between the abrasion region and the non-abrasion region, so as to obtain a final abrasion region.
And S4, judging the abrasion degree of the grinding head according to the ratio of the number of the pixel points in the abrasion area to the total number of the pixel points in the grinding head surface image.
In this embodiment, in a grinding operation site, a plurality of grinding head surface images with different use durations are collected, respective ROI gray images are obtained by using a neural network, and a wear area in each grinding head surface image is obtained according to step 3. However, for industrial production, the grinding head is discarded without slight abrasion, so that huge economic loss is caused, and the abrasion of the grinding head is divided into two conditions of continuous use and immediate stop use;
constructing the wear zone duty ratio according to the result of the step 3 adaptive zone growth
Figure SMS_61
The abrasion area ratio is used for representing the abrasion degree of the surface of the grinding head in the grinding process, and the abrasion area ratio in any grinding head detection result is +.>
Figure SMS_62
Equal to the ratio of the number of pixels in the wear area to the total number of pixels in the image of the grinding head surface. If the proportion of the worn area in the grinding head detection result
Figure SMS_63
The threshold empirical value is 0.2 when the abrasion loss is smaller than the threshold value, the abrasion loss is considered to be abrasion loss condition which can be continuously used, if the abrasion loss area ratio in the abrasion loss detection result of the abrasion loss is +.>
Figure SMS_64
Above the threshold, the grater is considered to be in a condition of immediate out of service.
According to the abrasion detection method for the grinding head, provided by the invention, the abrasion seed points with higher confidence coefficient are obtained to serve as the growth starting points of the abrasion areas through judging the corresponding abrasion degree between the pixel points in the ROI gray level chart of the surface of the grinding head, and the growth rules of the abrasion areas are set according to the abrasion conditions, so that the influence of the interference areas of the grinding head is effectively avoided, the segmentation result of the abrasion areas of the surface of the grinding head is accurately extracted, and the abrasion detection precision of the grinding head is improved.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.

Claims (6)

1. A wear detection method for a grinding head, comprising the steps of:
acquiring a grinding head surface image; acquiring an interested area containing abrasion in the grinding head surface image; graying treatment is carried out on the region of interest to obtain a gray scale map of the region of interest;
acquiring the comprehensive significance of each pixel point according to the gray value of each pixel point in the gray map of the region of interest, the gray average value of the line and the gray average value of the column; taking the pixel point corresponding to the maximum comprehensive significance as a wear seed point;
combining the pixel points of which the difference value of the comprehensive significance in the abrasion seed points and the adjacent areas is smaller than the initial first combining threshold value to obtain an initial abrasion area;
taking the deviation degree of the comprehensive significance of each pixel point in the initial wear area relative to the comprehensive significance of the wear seed point as a merging coefficient of the initial wear area; acquiring a current first merging threshold range according to the comprehensive significance mean value and the merging coefficient of the pixel points in the initial wear area;
combining the pixel points, the difference value of which is smaller than the current first combining threshold range, of the comprehensive significance of the abrasion seed points and the adjacent difference value of the initial abrasion areas with the initial abrasion areas to obtain the abrasion areas growing currently;
similarly, acquiring a next first merging threshold range according to the currently grown abrasion region; combining the pixel points, of which the comprehensive saliency difference value is smaller than the range of the next first combining threshold value, adjacent to the abrasion seed point and the abrasion area growing at present with the abrasion area growing at present to obtain the abrasion area growing at next time; and analogically, acquiring a wear area until the grown wear area is not changed any more;
judging the abrasion degree of the grinding head according to the ratio of the number of the pixel points in the abrasion area to the total number of the pixel points in the grinding head surface image;
the comprehensive significance of each pixel point is obtained according to the following steps:
according to the difference between the gray average value of each pixel point in the gray map of the region of interest and the gray average value of the gray map of the region of interest and the difference between the gray value of each pixel point and the gray average value of the gray map of the region of interest, the significance of each pixel point in the gray map of the region of interest in the horizontal direction is obtained;
according to the difference between the gray average value of each pixel point in the gray map of the region of interest and the gray average value of the gray map of the region of interest and the difference between the gray value of each pixel point and the gray average value of the gray map of the region of interest, the significance of each pixel point in the gray map of the region of interest in the vertical direction is obtained;
acquiring the comprehensive saliency of each pixel point according to the saliency of each pixel point in the gray scale map of the region of interest in the horizontal direction and the saliency of each pixel point in the vertical direction;
the calculation formula of the significance of each pixel point in the gray level map of the region of interest in the horizontal direction is as follows:
Figure QLYQS_1
in the method, in the process of the invention,
Figure QLYQS_3
is the significance of pixel i in the horizontal direction; n is the number of pixels in the row where pixel point i is located;
Figure QLYQS_7
Is the gray value average value of the gray map of the region of interest;
Figure QLYQS_9
Is the gray value of pixel i;
Figure QLYQS_4
Figure QLYQS_5
Is to calculate->
Figure QLYQS_6
Weight parameter of time, < ->
Figure QLYQS_8
Experience values of 0.7 +.>
Figure QLYQS_2
The empirical values are respectively 0.3;
the calculation formula of the significance of each pixel point in the gray level map of the region of interest in the vertical direction is as follows:
Figure QLYQS_10
in the method, in the process of the invention,
Figure QLYQS_11
is the saliency of pixel i in the vertical direction; m is the number of pixels in the column where pixel point i is located;
Figure QLYQS_12
Figure QLYQS_13
Is to calculate->
Figure QLYQS_14
Weight parameters at the time;
Figure QLYQS_15
Experience values of 0.7 +.>
Figure QLYQS_16
The empirical values are respectively 0.3;
the comprehensive significance calculation formula of each pixel point is as follows:
Figure QLYQS_17
in the method, in the process of the invention,
Figure QLYQS_18
is the integrated saliency of pixel i.
2. The wear detection method for a grinding head according to claim 1, wherein in the process of acquiring the wear area, further comprising:
taking the pixel point corresponding to the minimum comprehensive significance as a non-abrasion seed point; acquiring a non-abrasion area according to the step of acquiring the non-abrasion seed points through area growth;
the worn region and the non-worn region are worn regions obtained when the worn region and the non-worn region grow from the worn seed point and the non-worn seed point, respectively, until the worn region and the non-worn region grown are no longer changed, and the worn region and the non-worn region do not overlap.
3. The wear detection method for a grinding head according to claim 2, wherein the non-worn area is obtained by:
combining the non-abrasion seed points with the pixel points in the neighborhood of which the difference value of the comprehensive significance is smaller than a second combining threshold value to obtain an initial non-abrasion region;
taking the deviation degree of the comprehensive significance of each pixel point in the initial non-abrasion area relative to the comprehensive significance of the non-abrasion seed point as a merging coefficient of the initial non-abrasion area; acquiring a current second merging threshold range according to the comprehensive saliency mean value and the merging coefficient of the pixel points in the initial non-abrasion area;
combining the pixel points, the difference value of which is smaller than the current second combining threshold range, of the comprehensive significance of the non-abrasion seed points and the adjacent difference value of the initial non-abrasion areas with the initial non-abrasion areas to obtain the currently growing non-abrasion areas;
similarly, a second merging threshold range is obtained next time according to the currently grown non-abrasion area; combining the pixel points, of which the comprehensive saliency difference value between the non-abrasion seed points and the non-abrasion areas growing at present is smaller than the second combining threshold range next, with the non-abrasion areas growing at present to obtain non-abrasion areas growing at next time; and so on until the growing non-worn area is no longer changed, the non-worn area is acquired.
4. A wear detection method for a grinding head according to claim 3, characterized in that an absolute value of a difference between the integrated significance of the non-wear seed points and the average of the integrated significance of the pixel points in the gray scale map of the region of interest is taken as the second combining threshold.
5. The abrasion detection method for a grinding head according to claim 1, wherein when there are a plurality of pixels corresponding to the obtained maximum integrated saliency, a pixel corresponding to the maximum saliency in the horizontal direction and/or the maximum saliency in the vertical direction is selected as an abrasion seed point from among the plurality of pixels corresponding to the maximum integrated saliency.
6. The wear detection method for a grinding head according to claim 1, wherein a difference between the integrated significance of the wear seed points and the integrated significance of the non-wear seed points is taken as a first combining threshold.
CN202310162114.5A 2023-02-24 2023-02-24 Abrasion detection method for grinding head Active CN115861313B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310162114.5A CN115861313B (en) 2023-02-24 2023-02-24 Abrasion detection method for grinding head

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310162114.5A CN115861313B (en) 2023-02-24 2023-02-24 Abrasion detection method for grinding head

Publications (2)

Publication Number Publication Date
CN115861313A CN115861313A (en) 2023-03-28
CN115861313B true CN115861313B (en) 2023-05-09

Family

ID=85658876

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310162114.5A Active CN115861313B (en) 2023-02-24 2023-02-24 Abrasion detection method for grinding head

Country Status (1)

Country Link
CN (1) CN115861313B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116030058B (en) * 2023-03-29 2023-06-06 无锡斯达新能源科技股份有限公司 Quality evaluation method for surface roughness of polishing pad
CN117011297B (en) * 2023-10-07 2024-02-02 惠州市凯默金属制品有限公司 Aluminum alloy automobile accessory die defect detection method based on image processing

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105069778A (en) * 2015-07-16 2015-11-18 西安工程大学 Industrial product surface defect detection method constructed based on target characteristic saliency map
CN111830906A (en) * 2020-07-27 2020-10-27 上海威研精密科技有限公司 On-machine monitoring system for failure state of rotary cutter and detection method thereof

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6293505B2 (en) * 2014-02-03 2018-03-14 株式会社タカコ Tool inspection method and tool inspection apparatus
CN110728667B (en) * 2019-10-08 2023-05-02 南京航空航天大学 Automatic accurate measuring method for cutter abrasion loss based on gray image probability

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105069778A (en) * 2015-07-16 2015-11-18 西安工程大学 Industrial product surface defect detection method constructed based on target characteristic saliency map
CN111830906A (en) * 2020-07-27 2020-10-27 上海威研精密科技有限公司 On-machine monitoring system for failure state of rotary cutter and detection method thereof

Also Published As

Publication number Publication date
CN115861313A (en) 2023-03-28

Similar Documents

Publication Publication Date Title
CN115861313B (en) Abrasion detection method for grinding head
CN117173189B (en) Visual inspection system for polishing effect of aluminum alloy surface
CN115829883B (en) Surface image denoising method for special-shaped metal structural member
CN110263192B (en) Abrasive particle morphology database creation method for generating countermeasure network based on conditions
CN107369136B (en) Visual detection method for surface cracks of polycrystalline diamond compact
CN109255757B (en) Method for segmenting fruit stem region of grape bunch naturally placed by machine vision
CN115908411B (en) Concrete curing quality analysis method based on visual detection
CN115984271A (en) Metal burr identification method based on angular point detection
CN104792792A (en) Stepwise-refinement pavement crack detection method
CN116110053B (en) Container surface information detection method based on image recognition
CN116758077B (en) Online detection method and system for surface flatness of surfboard
CN116823824B (en) Underground belt conveyor dust fall detecting system based on machine vision
CN116402810B (en) Image processing-based lubricating oil anti-abrasive particle quality detection method
CN116703251B (en) Rubber ring production quality detection method based on artificial intelligence
CN114943736A (en) Production quality detection method and system for automobile radiating fins
CN110749598A (en) Silkworm cocoon surface defect detection method integrating color, shape and texture characteristics
CN117745724B (en) Stone polishing processing defect region segmentation method based on visual analysis
CN116703898A (en) Quality detection method for end face of precision mechanical bearing
CN112629409A (en) Method for extracting line structure light stripe center
CN117291985B (en) Image positioning method for part punching
CN115994870B (en) Image processing method for enhancing denoising
CN116029941A (en) Visual image enhancement processing method for construction waste
CN116342586A (en) Road surface quality detection method based on machine vision
CN117541605A (en) Rapid segmentation method for rusted image area of steel structure
CN117237354B (en) Visual detection method for defects of textile clothes

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A Wear Detection Method for Grinding Heads

Effective date of registration: 20230927

Granted publication date: 20230509

Pledgee: China Co. truction Bank Corp Dongguan branch

Pledgor: DONGGUAN CHUNCAO GRINDING TECHNOLOGY Co.,Ltd.

Registration number: Y2023980059320