CN116930195A - Intelligent CAM system for hardware processing and surface defect detection method and device - Google Patents

Intelligent CAM system for hardware processing and surface defect detection method and device

Info

Publication number
CN116930195A
Authority
CN
China
Prior art keywords
image
workpiece
images
average
hardware
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311195766.5A
Other languages
Chinese (zh)
Other versions
CN116930195B (en)
Inventor
李焕明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Hengxintong Intelligent Precision Technology Co ltd
Original Assignee
Shenzhen Hengxintong Intelligent Precision Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Hengxintong Intelligent Precision Technology Co ltd
Priority to CN202311195766.5A
Publication of CN116930195A
Application granted
Publication of CN116930195B
Current legal status: Active

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The invention relates to the technical field of hardware processing systems, and in particular to an intelligent CAM system for hardware processing and a surface defect detection method and device. The system comprises a surface defect detection device for detecting surface defects of a hardware workpiece having rotational symmetry. The method comprises the steps of irradiating the hardware workpiece with strip-shaped light; shooting a workpiece image of the hardware workpiece, rotating the hardware workpiece by a certain angle, shooting again, and repeating these steps to obtain a plurality of workpiece images; calculating a first average image of the workpiece images; comparing each workpiece image with the first average image one by one to determine the maximum difference image; removing the maximum difference image and calculating a second average image of the remaining images; and comparing the maximum difference image with the second average image to judge whether the hardware workpiece has a surface defect. Detection of surface defects is thus realized through comparison of images of the hardware workpiece itself.

Description

Intelligent CAM system for hardware processing and surface defect detection method and device
Technical Field
The invention relates to the technical field of hardware processing systems, in particular to an intelligent CAM system for hardware processing, and a surface defect detection method and device.
Background
An intelligent CAM system is a system that optimizes hardware manufacturing processes through computer-aided design and manufacturing techniques. CAM is an abbreviation for computer-aided manufacturing, which uses computer software and hardware to automate and control the manufacturing process, thereby improving production efficiency and quality.
The intelligent CAM system is widely applied in hardware processing. It can automatically generate machining paths and tool paths according to the design drawing and realize automatic numerical control machining. The intelligent CAM system may also optimize cutting parameters, including cutting speed, feed speed, and depth of cut, based on material properties and machining requirements to ensure machining quality and workpiece accuracy. Besides automatic processing, some intelligent CAM systems also have the function of detecting surface defects of hardware workpieces, so that surface defects are detected while the hardware workpieces are processed, unqualified hardware workpieces with obvious surface defects are screened out, and the overall quality of the products is guaranteed.
In the prior art, visual recognition technology is often used to detect surface defects on a hardware workpiece: a preset standard image of a defect-free hardware workpiece is compared with an image of the machined hardware workpiece to recognize surface defects of the machined workpiece. However, this approach requires that a standard image of a defect-free hardware workpiece be prepared in advance, which also means that a defect-free hardware workpiece must be machined in advance. For a business that machines only a small number of different hardware workpieces over a long period, with a large production volume per workpiece type, the resulting unit cost per machined workpiece is acceptable because the cost is spread over the production volume. However, if many different hardware workpiece types are machined, or the production volume per type is small, the unit cost increases significantly; in particular, for customized machining business, where the number of parts produced per workpiece type is small, the cost generated by this approach becomes unacceptable. Therefore, there is a need for an intelligent CAM system for hardware processing, and a method and apparatus for detecting surface defects of a hardware workpiece, which can detect surface defects without presetting standard images of a defect-free hardware workpiece.
The information disclosed in the background section of the application is only for enhancement of understanding of the general background of the application and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
Disclosure of Invention
An embodiment of the present application provides an intelligent CAM system for hardware processing, the system including a surface defect detecting device for detecting a surface defect of a hardware workpiece having rotational symmetry, the hardware workpiece having a rotation axis and a rotation angle α, surface profiles of the hardware workpiece before and after the hardware workpiece rotates around the rotation axis through the rotation angle α being coincident, the surface defect detecting device including:
the light source module is used for emitting strip-shaped light to irradiate the hardware workpiece, wherein the strip-shaped light completely covers the hardware workpiece in the length direction and does not completely cover the hardware workpiece in the width direction;
the shooting module is used for shooting a workpiece image of the hardware workpiece, and shooting a workpiece image of the hardware workpiece again after an interval of a preset time t, until a preset number of times m is reached, so as to obtain m workpiece images, wherein m is more than or equal to 3;
the workpiece rotating module is used for rotating the hardware workpiece around the rotation axis by the rotation angle alpha after the workpiece image of the hardware workpiece is shot for the first time, and rotating the hardware workpiece around the rotation axis by the rotation angle alpha again after an interval of the preset time t, until a preset number of times m-1 is reached;
The preprocessing module is used for preprocessing the photographed m workpiece images to obtain m preprocessed images;
the first average image generation module is used for calculating and generating a first average image of the m preprocessed images;
the maximum difference judging module is used for comparing the m preprocessed images with the first average image respectively and judging the maximum difference image in the m preprocessed images according to the difference degree of the m preprocessed images and the first average image;
the second average image generation module is used for calculating and generating a second average image of the remaining m-1 preprocessed images after the maximum difference image is removed from the m preprocessed images;
and the workpiece qualification judging module is used for comparing the characteristics of the maximum difference image with the second average image, judging that the hardware workpiece has surface defects if the degree of difference between the maximum difference image and the second average image is outside a preset range, and judging that the hardware workpiece has no surface defects if the degree of difference between the maximum difference image and the second average image is within the preset range.
In some embodiments of the present invention, the "preprocessing the captured m workpiece images" specifically includes the following operations performed on each workpiece image separately:
Removing the dark part of the workpiece image and reserving the bright part of the workpiece image to obtain a bright part image, wherein the bright part is the part of the hardware workpiece in the workpiece image that is irradiated by the strip light, and the dark part is the part of the hardware workpiece in the workpiece image that is not irradiated by the strip light;
and carrying out noise reduction processing on the bright part image.
In some embodiments of the present invention, the "preprocessing the captured m workpiece images" specifically further includes:
if the bright part image is a gray level image, performing contrast enhancement processing on the bright part image subjected to noise reduction processing to obtain a preprocessing image;
and if the bright part image is a color image, converting the bright part image subjected to noise reduction into a gray level image, and then performing contrast enhancement processing to obtain a preprocessed image.
In some embodiments of the invention, the contrast enhancement process is performed according to the following formula:
$$g'(x,y)=\begin{cases}c, & g(x,y)<a\\ c+\dfrac{g(x,y)-a}{b-a}\,(d-c), & a\le g(x,y)\le b\\ d, & g(x,y)>b\end{cases}$$
wherein g(x, y) is the gray value of the pixel at coordinates (x, y) in the gray image of the bright part, g'(x, y) is the gray value of the pixel at coordinates (x, y) in the preprocessed image, g_max is the maximum gray value among the pixels of the gray image, [a, b] is the main gray range of the gray image, namely the range within which the gray values of k% of its pixels lie, and [c, d] is the preset gray expansion range, namely the range within which the gray values of k% of the pixels of the preprocessed image are to lie.
In some embodiments of the present invention, the "calculating and generating the first average image of the m preprocessed images" specifically includes:
respectively calculating gray value average values of corresponding coordinate pixel points of the m preprocessed images;
and generating the first average image according to the gray value mean value of each coordinate pixel point.
In some embodiments of the present invention, the "comparing the m preprocessed images with the first average image respectively, and determining the largest difference image among the m preprocessed images according to the degree of difference between the m preprocessed images and the first average image" specifically includes:
respectively calculating the pixel point gray value average value of each pretreatment image in the m pretreatment images;
calculating the pixel point gray value average value of the first average image;
calculating the difference value between the pixel gray value average value of the m preprocessed images and the pixel gray value average value of the first average image one by one;
And judging the preprocessed image with the largest absolute value of the difference value as the largest difference image.
In some embodiments of the present invention, the "comparing the m preprocessed images with the first average image respectively, and determining the largest difference image among the m preprocessed images according to the degree of difference between the m preprocessed images and the first average image" specifically includes:
respectively subtracting the gray value of the pixel point of the corresponding coordinate of the first average image from the gray value of the pixel point of each preprocessed image in the m preprocessed images, taking the absolute value of the gray value difference value of the gray value of the pixel point of the corresponding coordinate of the first average image, and generating a difference image according to the absolute value;
performing binarization processing on the differential image, and setting the gray value of a certain pixel of the differential image to 255 if the gray value of the certain pixel is larger than a preset threshold value; if the gray value of a certain pixel of the differential image is smaller than a preset threshold value, setting the gray value of the certain pixel to be 0;
and respectively calculating the number of pixel points with the gray value of 255 in the binarized differential image corresponding to each preprocessed image, and judging the corresponding preprocessed image with the largest number as the largest differential image.
In some embodiments of the present invention, the "comparing the m preprocessed images with the first average image respectively, and determining the largest difference image among the m preprocessed images according to the degree of difference between the m preprocessed images and the first average image" specifically includes:
Respectively subtracting the gray value of the pixel point of the corresponding coordinate of the first average image from the gray value of the pixel point of each preprocessed image in the m preprocessed images, taking the absolute value of the gray value difference value of the gray value of the pixel point of the corresponding coordinate of the first average image, and generating a difference image according to the absolute value;
and respectively calculating the pixel point gray value average value of the difference image corresponding to each preprocessed image, and judging the corresponding preprocessed image with the largest pixel point gray value average value as the largest difference image.
The invention also provides a surface defect detection method for detecting surface defects of a hardware workpiece with rotational symmetry, wherein the hardware workpiece has a rotation axis and a rotation angle alpha, and the surface contours of the hardware workpiece before and after it rotates around the rotation axis by the rotation angle alpha coincide. The method comprises the following steps:
emitting a strip light to irradiate the hardware workpiece, wherein the strip light completely covers the hardware workpiece in the length direction and does not completely cover the hardware workpiece in the width direction;
shooting a workpiece image of the hardware workpiece, and shooting a workpiece image of the hardware workpiece again after an interval of a preset time t, until a preset number of times m is reached, so as to obtain m workpiece images, wherein m is more than or equal to 3;
after the workpiece image of the hardware workpiece is photographed for the first time, rotating the hardware workpiece around the rotation axis by the rotation angle alpha, and after an interval of the preset time t, rotating the hardware workpiece around the rotation axis by the rotation angle alpha again, until a preset number of times m-1 is reached;
preprocessing the photographed m workpiece images to obtain m preprocessed images;
calculating and generating a first average image of the m preprocessed images;
comparing the m preprocessed images with the first average image respectively, and judging the largest difference image in the m preprocessed images according to the degree of difference between the m preprocessed images and the first average image;
after the maximum difference image is removed from the m preprocessed images, calculating and generating a second average image of the remaining m-1 preprocessed images;
and comparing the characteristics of the maximum difference image with the second average image, judging that the hardware workpiece has surface defects if the degree of difference between the maximum difference image and the second average image is outside a preset range, and judging that the hardware workpiece has no surface defects if the degree of difference between the maximum difference image and the second average image is within the preset range.
The present invention also provides a surface defect detecting apparatus comprising:
a processor;
A memory for storing processor-executable instructions;
wherein the processor is configured to invoke the instructions stored in the memory to perform the method according to the above embodiments.
The invention also provides a computer readable storage medium having stored thereon computer program instructions which when executed by a processor implement the method according to any of the embodiments described above.
According to the embodiments of the invention, an intelligent CAM system for hardware processing and a surface defect detection method and device are provided, wherein the system comprises a surface defect detection device for detecting surface defects of a hardware workpiece having rotational symmetry. The method comprises the steps of irradiating the hardware workpiece with strip-shaped light; shooting a workpiece image of the hardware workpiece, rotating the hardware workpiece by a certain angle, shooting again, and repeating these steps to obtain a plurality of workpiece images; calculating a first average image of the workpiece images; comparing each workpiece image with the first average image one by one to determine the maximum difference image; removing the maximum difference image and calculating a second average image of the remaining images; and comparing the maximum difference image with the second average image to judge whether the hardware workpiece has a surface defect. Detection of surface defects is thus realized through comparison of images of the hardware workpiece itself.
Drawings
FIG. 1 schematically illustrates a block diagram of an intelligent CAM system for hardware processing according to an embodiment of the invention;
fig. 2 schematically shows a flow chart of a surface defect detection method according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein.
It should be understood that, in various embodiments of the present invention, the sequence numbers of the processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
It should be understood that in the present invention, "comprising" and "having" and any variations thereof are intended to cover non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in the present invention, "plurality" means two or more. "And/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "Comprising A, B and C" and "comprising A, B, C" mean that all three of A, B and C are included; "comprising A, B or C" means that one of A, B and C is included; and "comprising A, B and/or C" means that any one, any two, or all three of A, B and C are included.
It should be understood that in the present invention, "B corresponding to A", "A corresponding to B", or "B corresponds to A" means that B is associated with A, and B can be determined from A. Determining B from A does not mean determining B from A alone; B may also be determined from A and/or other information. A matches B when the similarity between A and B is greater than or equal to a preset threshold.
As used herein, "if" may be interpreted as "when", "upon", "in response to determining", or "in response to detecting", depending on the context.
The technical scheme of the invention is described in detail below by specific examples. The following embodiments may be combined with each other, and some embodiments may not be repeated for the same or similar concepts or processes.
Fig. 1 schematically shows a block diagram of an intelligent CAM system for hardware machining according to an embodiment of the present invention, which includes a surface defect detecting device for detecting a surface defect of a hardware workpiece having rotational symmetry, the hardware workpiece having a rotation axis and a rotation angle α, and surface contours of the hardware workpiece before and after the rotation of the hardware workpiece around the rotation axis through the rotation angle α being coincident.
Rotational symmetry means that an object, after a certain angular rotation, still has a shape profile that coincides with the original shape profile. That is, the object has a certain central axis, called the rotation axis, around which its shape remains unchanged in space, and a certain angle α, called the rotation angle. Common hardware workpieces with rotational symmetry include gears, shafts, bearings, bolts, and the like.
The surface defect detection device for the intelligent CAM system for hardware processing comprises:
the light source module 101 is configured to emit a strip light to the hardware workpiece, where the strip light completely covers the hardware workpiece in a length direction and does not completely cover the hardware workpiece in a width direction.
The main function of this module is to emit a strip of light onto the hardware workpiece. The strip light completely covers the hardware workpiece in the length direction, so that after the hardware workpiece rotates multiple times, every part of the hardware workpiece appears under the strip light and is shot by the shooting module, and no part is omitted. The strip light does not completely cover the hardware workpiece in the width direction, so that the part that appears under the strip light and is shot by the shooting module each time is only one portion of the hardware workpiece, which facilitates the subsequent comparison of images of different portions of the hardware workpiece.
The shooting module 102 is used for shooting a workpiece image of the hardware workpiece, and shooting a workpiece image of the hardware workpiece again after an interval of a preset time t, until a preset number of times m is reached, so as to obtain m workpiece images, wherein m is more than or equal to 3.
After the workpiece image is shot for the first time, it is shot again after an interval of time t; the interval is used to wait for the hardware workpiece to rotate. The preset time t can be set according to the specific situation, such as the rotating speed of the hardware workpiece. After the number of shots reaches the preset number m, m workpiece images are obtained and shooting ends. The specific value of m can be selected according to the actual situation, but it should not be too small; otherwise the subsequent first average image and second average image will not represent the characteristics of the hardware workpiece well. The smaller the value of m, the larger the influence of a workpiece image containing a surface defect on the first average image, which is unfavorable for the subsequent comparison of the workpiece images with the first average image and the second average image; however, m should not be too large either, otherwise the surface defect detection will take too long. The specific value of m can be determined according to the number of rotations required for the hardware workpiece to complete exactly one full turn, as sketched below.
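As a purely illustrative example (the rotation angle value below is assumed, not taken from the patent), m can be derived from the rotation angle α as follows:

```python
import math

alpha_deg = 45.0                    # assumed rotation angle in degrees
m = math.ceil(360.0 / alpha_deg)    # shots needed to cover one full turn (8 here)
```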
The workpiece rotating module 103 is configured to rotate the hardware workpiece around the rotation axis by the rotation angle α after the workpiece image of the hardware workpiece is captured for the first time, and to rotate the hardware workpiece around the rotation axis by the rotation angle α again after an interval of the preset time t, until a preset number of times m-1 is reached.
After each shooting of the workpiece image, the hardware workpiece is rotated around the rotation axis by the rotation angle α, and the hardware workpiece is rotated again after an interval of the preset time t. The interval of the preset time t is to wait for the shooting module 102 to photograph the workpiece image. The number of rotations is determined by the number of shots; since no rotation is required before the first shot, the number of rotations is m-1.
And the preprocessing module 104 is used for preprocessing the m photographed workpiece images to obtain m preprocessed images.
The function of the module is to pre-process the photographed m workpiece images, such as denoising, enhancing, zooming, cutting and the like, so as to improve the accuracy of subsequent processing.
A first average image generation module 105 for calculating and generating a first average image of the m preprocessed images;
the module carries out average processing on the m preprocessed images to generate a first average image, wherein the first average image reflects the overall condition of the hardware workpiece and is used for subsequent comparison with the workpiece image one by one. The averaging process may be to average parameters such as gray scale, color, etc. of the m preprocessed images.
And the maximum difference judging module 106 is configured to compare the m preprocessed images with the first average image, and judge the maximum difference image in the m preprocessed images according to the difference degree of the m preprocessed images and the first average image.
This module compares the preprocessed images with the first average image to find the image with the largest difference. Because the first average image is the average of the m preprocessed images, even though a flaw of the hardware workpiece may be present in one of the m preprocessed images, the flaw is averaged out, and its image features are obviously weakened in the first average image. For example, a flaw may appear as a darker color patch, but after the averaging process the flaw is no longer particularly prominent in the first average image; therefore, by comparing the m preprocessed images with the first average image one by one, the preprocessed image containing the darker color patch can be judged as the maximum difference image.
A second average image generating module 107, configured to calculate and generate a second average image of the remaining m-1 preprocessed images after removing the maximum difference image from the m preprocessed images.
After the maximum difference image is removed, the remaining preprocessed images are averaged to generate a second average image. This further eliminates the effect that the maximum difference image, which may contain the flawed portion, has on the average image, so that the second average image is closer to the characteristics of a flawless hardware workpiece.
And the workpiece qualification judging module 108 is configured to compare the features of the maximum difference image with the second average image, judge that the hardware workpiece has a surface defect if the difference degree of the maximum difference image and the second average image is out of a preset range, and judge that the hardware workpiece has no surface defect if the difference between the maximum difference image and the second average image is in the preset range.
The features of the maximum difference image are compared with the second average image. Since the maximum difference image has already been removed when the second average image is calculated, the second average image can be approximately regarded as a standard image of a flawless hardware workpiece. If the degree of difference between the maximum difference image and the second average image is outside an acceptable range, namely outside the preset range, the hardware workpiece can be regarded as having a surface defect; otherwise, it is judged that the hardware workpiece has no surface defect. The preset range can be set according to the specific features compared between the maximum difference image and the second average image: for example, whether there is an obvious color-patch difference between them. If a defect appears as a darker color patch, the preset range can be defined in terms of the size or darkness of the patch, and the specific values can be set according to the surface quality requirements of the hardware workpiece.
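As an illustration of this leave-one-out comparison, the following Python sketch removes the maximum difference image, builds the second average image from the remaining images, and uses the mean absolute gray-value deviation as one possible measure of the degree of difference; the function name, the threshold value, and the choice of measure are assumptions for illustration, not the patent's prescribed implementation.

```python
import numpy as np

def judge_workpiece(preprocessed, max_diff_idx, diff_threshold=12.0):
    """Leave-one-out qualification sketch.

    preprocessed   : list of m grayscale images (2-D arrays of equal shape)
    max_diff_idx   : index of the maximum difference image found earlier
    diff_threshold : assumed boundary of the preset range
    Returns True if a surface defect is judged to exist.
    """
    stack = np.stack(preprocessed).astype(np.float64)
    max_diff_img = stack[max_diff_idx]

    # Second average image: mean of the remaining m-1 preprocessed images.
    second_avg = np.delete(stack, max_diff_idx, axis=0).mean(axis=0)

    # One possible degree of difference: mean absolute gray-value deviation.
    degree = np.abs(max_diff_img - second_avg).mean()

    return degree > diff_threshold    # outside the preset range -> defect
```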
According to the intelligent CAM system for hardware processing of the embodiment of the invention, strip-shaped light is emitted onto the hardware workpiece and a workpiece image of the hardware workpiece is shot; the hardware workpiece is then controlled to rotate by a certain angle and shot again, and these steps are repeated to obtain a plurality of workpiece images. A first average image of the workpiece images is then calculated, each workpiece image is compared with the first average image one by one to determine the maximum difference image, the maximum difference image is removed, a second average image of the remaining images is calculated, and the maximum difference image is compared with the second average image to judge whether the hardware workpiece has a surface defect. In this way, detection and identification of surface defects of a hardware workpiece with rotational symmetry are realized through comparison of images of the hardware workpiece itself.
In some embodiments of the present invention, the "preprocessing the captured m workpiece images" specifically includes the following operations performed on each workpiece image separately:
removing the dark part of the workpiece image and reserving the bright part of the workpiece image to obtain a bright part image, wherein the bright part is the part of the hardware workpiece in the workpiece image that is irradiated by the strip light, and the dark part is the part of the hardware workpiece in the workpiece image that is not irradiated by the strip light;
And carrying out noise reduction processing on the bright part image.
The bright part shows more details of the hardware workpiece, and removing the dark part better limits the comparison range, so that subsequent average-image calculation and image comparison are performed only on the bright part and a better effect is obtained. In an actual working environment, the image is inevitably affected by dust or other foreign matter, so noise reduction processing is required for the bright part image.
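A minimal sketch of this preprocessing step follows, assuming an 8-bit image, a simple gray-level threshold to separate the strip-lit bright part from the unlit dark part, and a median filter for noise reduction; the threshold value and kernel size are illustrative, not specified by the patent.

```python
import cv2
import numpy as np

def extract_bright_part(workpiece_img, bright_threshold=60, ksize=3):
    """Keep only the strip-lit portion of the workpiece image and denoise it.

    workpiece_img    : BGR or grayscale image as a numpy array
    bright_threshold : assumed gray level separating lit from unlit regions
    ksize            : median-filter kernel size used for noise reduction
    """
    if workpiece_img.ndim == 3:                        # color image -> gray
        gray = cv2.cvtColor(workpiece_img, cv2.COLOR_BGR2GRAY)
    else:
        gray = workpiece_img.copy()

    # Remove the dark part: pixels below the threshold are set to 0,
    # pixels at or above it (the strip-lit part) keep their gray value.
    bright = np.where(gray >= bright_threshold, gray, 0).astype(np.uint8)

    # Noise reduction against dust and other foreign matter.
    return cv2.medianBlur(bright, ksize)
```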
In some embodiments of the present invention, the "preprocessing the captured m workpiece images" specifically further includes:
if the bright part image is a gray level image, performing contrast enhancement processing on the bright part image subjected to noise reduction processing to obtain a preprocessing image;
and if the bright part image is a color image, converting the bright part image subjected to noise reduction into a gray level image, and then performing contrast enhancement processing to obtain a preprocessed image.
Using gray images simplifies the subsequent average-image calculation and image comparison. Some shooting modules can capture gray images directly, so if the workpiece image captured by the shooting module is a gray image, the bright part image is naturally also a gray image and contrast enhancement can be applied directly after noise reduction; if the workpiece image captured by the shooting module is a color image, an additional step of converting the bright part image into a gray image is needed. The contrast enhancement process further highlights the differences of various features in the bright part image, which benefits the subsequent comparison with the preprocessed images and makes surface defects easier to identify.
In some embodiments of the invention, the contrast enhancement process is performed according to the following formula:
$$g'(x,y)=\begin{cases}c, & g(x,y)<a\\ c+\dfrac{g(x,y)-a}{b-a}\,(d-c), & a\le g(x,y)\le b\\ d, & g(x,y)>b\end{cases}$$
wherein g(x, y) is the gray value of the pixel at coordinates (x, y) in the gray image of the bright part, g'(x, y) is the gray value of the pixel at coordinates (x, y) in the preprocessed image, g_max is the maximum gray value among the pixels of the gray image, [a, b] is the main gray range of the gray image, namely the range within which the gray values of k% of its pixels lie, and [c, d] is the preset gray expansion range, namely the range within which the gray values of k% of the pixels of the preprocessed image are to lie.
The formula expands the gray range only of the main pixels, that is, the k% of pixels whose gray values fall in the main gray range of the bright part image, rather than of all pixels, because a small number of noise points or bright spots are inevitably present in the bright part image, and treating them all alike would easily distort the image. The formula therefore expands the gray range of the main pixels to enhance contrast, while assigning the boundary values of the gray expansion range to pixels whose gray values are too high or too low, thereby reducing the image distortion caused by the contrast enhancement. The value of k can be determined according to the specific condition of the workpiece image: if the workpiece image contains little noise or few bright spots and the pixel gray values are concentrated, k can be set larger; otherwise it can be set smaller, as determined by the specific situation.
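The stretch described above can be sketched as follows; the symmetric percentile choice of the main gray range [a, b] and the numeric expansion range [c, d] are assumptions made for illustration, since the patent leaves k and the expansion range to be preset.

```python
import numpy as np

def stretch_contrast(bright_gray, k=95.0, expand_range=(10, 245)):
    """Piecewise-linear contrast enhancement sketch.

    bright_gray  : denoised bright-part grayscale image (uint8)
    k            : percentage of pixels whose gray values define the main range
    expand_range : preset gray expansion range [c, d] (assumed values)
    """
    g = bright_gray.astype(np.float64)
    tail = (100.0 - k) / 2.0                            # split the excluded pixels evenly
    a, b = np.percentile(g, [tail, 100.0 - tail])       # main gray range [a, b]
    c, d = expand_range

    out = c + (g - a) / max(b - a, 1e-6) * (d - c)      # stretch the main pixels
    out = np.clip(out, c, d)                            # clamp outliers to c or d
    return out.astype(np.uint8)
```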
In some embodiments of the present invention, the "calculating and generating the first average image of the m preprocessed images" specifically includes:
respectively calculating gray value average values of corresponding coordinate pixel points of the m preprocessed images;
and generating the first average image according to the gray value mean value of each coordinate pixel point.
The step of calculating a first average image of the preprocessed images is mainly used for obtaining an image representing the average state of all preprocessed images. Specifically, this process mainly includes the following two steps:
and respectively calculating gray value average values for corresponding coordinate pixel points of the m preprocessed images. The purpose of this step is to calculate the average number of gray values for the pixels of each preprocessed image at the same coordinate position. For example, if we have 3 images, we will find the pixels at the same coordinate location on each image. Then we add the gray values of these pixels and then divide by the number of images (3 in this example) to get the average gray value for this coordinate position. For example, assume that we have three images, and at the (100 ) th pixel, the gray values of the three images are 60, 70, 80, respectively. The average gray value at this position is (60+70+80)/3=70.
Then, the first average image is generated according to the gray value average of each coordinate pixel. After the average gray values of all coordinate positions have been calculated, the first average image can be generated: specifically, the average gray value at each coordinate position is taken as the gray value at that position in the first average image. In this way we obtain a new image, the first average image, that represents the average state of all the preprocessed images.
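A direct numpy sketch of this pixel-wise averaging, including the 60/70/80 example from the text:

```python
import numpy as np

def first_average_image(preprocessed_images):
    """Pixel-wise mean of the m preprocessed grayscale images.

    preprocessed_images : list of 2-D uint8 arrays with identical shape
    Returns the first average image as a uint8 array.
    """
    stack = np.stack(preprocessed_images).astype(np.float64)   # shape (m, H, W)
    mean_img = stack.mean(axis=0)          # average gray value per coordinate
    return np.round(mean_img).astype(np.uint8)

# Worked example matching the text: gray values 60, 70, 80 at a pixel -> 70.
imgs = [np.full((2, 2), v, dtype=np.uint8) for v in (60, 70, 80)]
assert first_average_image(imgs)[0, 0] == 70
```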
In some embodiments of the present invention, the "comparing the m preprocessed images with the first average image respectively, and determining the largest difference image among the m preprocessed images according to the degree of difference between the m preprocessed images and the first average image" specifically includes:
respectively calculating the pixel point gray value average value of each pretreatment image in the m pretreatment images;
calculating the pixel point gray value average value of the first average image;
calculating the difference value between the pixel gray value average value of the m preprocessed images and the pixel gray value average value of the first average image one by one;
and judging the preprocessed image with the largest absolute value of the difference value as the largest difference image.
First, an average gray value of each preprocessed image is calculated. For example, if there are 3 preprocessed images, the sum of the gray values of all pixels of each image is calculated and then divided by the number of pixels to obtain the average gray value of each image. For example, the sum of gray values of all pixels of the first image is 30000 and the number of pixels is 500, so that the average gray value of the first image is 30000/500=60.
Next, an average gray value of the first average image is calculated. The calculation method is the same as the previous step, except that the gray value of the first average image is calculated this time.
Next, the difference between each of the preprocessed images and the first average image is calculated. Specifically, the average gray value of the first average image is subtracted from the average gray value of each preprocessed image to obtain a difference value. For example, if the average gray value of the first preprocessed image is 60 and the average gray value of the first average image is 50, the difference is 60-50=10.
And finally, finding out the preprocessed image with the largest difference from the first average image according to the difference. And comparing the difference values between all the preprocessed images and the first average image, wherein the image with the largest absolute value of the difference value is the largest difference image to be found.
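A compact sketch of this mean-gray-value comparison (the function name is illustrative):

```python
import numpy as np

def max_difference_by_mean_gray(preprocessed_images, first_avg):
    """Pick the maximum difference image by comparing average gray values.

    preprocessed_images : list of m grayscale images (2-D arrays)
    first_avg           : first average image (2-D array)
    Returns the index of the maximum difference image.
    """
    avg_mean = float(first_avg.mean())                  # e.g. 50 in the example above
    diffs = [abs(float(img.mean()) - avg_mean)          # e.g. |60 - 50| = 10
             for img in preprocessed_images]
    return int(np.argmax(diffs))                        # largest absolute difference
```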
In some embodiments of the present invention, the "comparing the m preprocessed images with the first average image respectively, and determining the largest difference image among the m preprocessed images according to the degree of difference between the m preprocessed images and the first average image" specifically includes:
respectively subtracting the gray value of the pixel point of the corresponding coordinate of the first average image from the gray value of the pixel point of each preprocessed image in the m preprocessed images, taking the absolute value of the gray value difference value of the gray value of the pixel point of the corresponding coordinate of the first average image, and generating a difference image according to the absolute value;
Performing binarization processing on the differential image, and setting the gray value of a certain pixel of the differential image to 255 if the gray value of the certain pixel is larger than a preset threshold value; if the gray value of a certain pixel of the differential image is smaller than a preset threshold value, setting the gray value of the certain pixel to be 0;
and respectively calculating the number of pixel points with the gray value of 255 in the binarized differential image corresponding to each preprocessed image, and judging the corresponding preprocessed image with the largest number as the largest differential image.
Since surface defects are often embodied as lighter or darker colored patches in the workpiece image, the determination of surface defects can be made by preprocessing the differential image of the image and the first average image. Further, in order to further highlight the surface defects, the differential image is subjected to binarization processing. And then judging the maximum difference image according to the number of the pixels with the gray level of 255 in the binarized difference image. Methods for generating differential images and binarizing the images are well known in the prior art and are not described in detail herein.
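A sketch of this differencing-and-binarization variant; the binarization threshold is an assumed value.

```python
import numpy as np

def max_difference_by_binarized_count(preprocessed_images, first_avg, threshold=30):
    """Pick the maximum difference image by counting 255-valued pixels
    in the binarized difference images.

    threshold : assumed preset gray-value threshold for binarization
    """
    avg = first_avg.astype(np.int16)
    counts = []
    for img in preprocessed_images:
        diff = np.abs(img.astype(np.int16) - avg)       # difference image
        binary = np.where(diff > threshold, 255, 0)     # binarization
        counts.append(int(np.count_nonzero(binary == 255)))
    return int(np.argmax(counts))                       # most defect-like pixels
```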
In some embodiments of the present invention, the "comparing the m preprocessed images with the first average image respectively, and determining the largest difference image among the m preprocessed images according to the degree of difference between the m preprocessed images and the first average image" specifically includes:
Respectively subtracting the gray value of the pixel point of the corresponding coordinate of the first average image from the gray value of the pixel point of each preprocessed image in the m preprocessed images, taking the absolute value of the gray value difference value of the gray value of the pixel point of the corresponding coordinate of the first average image, and generating a difference image according to the absolute value;
and respectively calculating the pixel point gray value average value of the difference image corresponding to each preprocessed image, and judging the corresponding preprocessed image with the largest pixel point gray value average value as the largest difference image.
Since surface defects are often embodied as lighter or darker colored patches in the workpiece image, the determination of surface defects can be made by preprocessing the differential image of the image and the first average image. And judging the maximum difference image according to the pixel gray value average value of the difference image. Methods for generating differential images are well known in the prior art and are not described in detail herein.
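A sketch of this third variant, which ranks the difference images by their mean gray value:

```python
import numpy as np

def max_difference_by_diff_mean(preprocessed_images, first_avg):
    """Pick the maximum difference image by the mean gray value of each
    difference image."""
    avg = first_avg.astype(np.float64)
    means = [np.abs(img.astype(np.float64) - avg).mean()
             for img in preprocessed_images]
    return int(np.argmax(means))                        # largest mean difference
```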
An intelligent CAM system for hardware processing according to an embodiment of the invention includes a surface defect detection device for detecting surface defects of a hardware workpiece having rotational symmetry. Strip-shaped light is irradiated onto the hardware workpiece and a workpiece image is shot; the hardware workpiece is rotated by a certain angle and shot again, and these steps are repeated to obtain a plurality of workpiece images. A first average image of the workpiece images is then calculated, each workpiece image is compared with the first average image one by one to determine the maximum difference image, the maximum difference image is removed, a second average image of the remaining images is calculated, and the maximum difference image is compared with the second average image to judge whether the hardware workpiece has a surface defect. Detection of surface defects is thus realized through comparison of images of the hardware workpiece itself. Furthermore, the embodiments of the invention also provide specific implementation means, thereby achieving a better detection effect for surface defects of the hardware workpiece.
Fig. 2 schematically shows a flow chart of a surface defect detection method according to an embodiment of the present invention, for detecting a surface defect of a hardware workpiece having rotational symmetry, the hardware workpiece having a rotation axis and a rotation angle α, and surface profiles of the hardware workpiece before and after the rotation of the hardware workpiece around the rotation axis by the rotation angle α being coincident.
Rotational symmetry means that an object, after a certain angular rotation, still has a shape profile that coincides with the original shape profile. That is, the object has a certain central axis, called the rotation axis, around which its shape remains unchanged in space, and a certain angle α, called the rotation angle. Common hardware workpieces with rotational symmetry include gears, shafts, bearings, bolts, and the like.
The surface defect detection method comprises the following steps:
step S201, emitting a strip light to irradiate the hardware workpiece, wherein the strip light completely covers the hardware workpiece in the length direction and does not completely cover the hardware workpiece in the width direction.
A strip of light is emitted to irradiate the hardware workpiece. The strip light completely covers the hardware workpiece in the length direction, so that after the hardware workpiece rotates multiple times, every part of the hardware workpiece appears under the strip light and is photographed, and no part is omitted. The strip light does not completely cover the hardware workpiece in the width direction, so that the part that appears under the strip light and is photographed each time is only one portion of the hardware workpiece, which facilitates the subsequent comparison of images of different portions of the hardware workpiece.
Step S202, shooting a workpiece image of the hardware workpiece, and shooting a workpiece image of the hardware workpiece again after an interval of a preset time t, until a preset number of times m is reached, so as to obtain m workpiece images, wherein m is more than or equal to 3.
After the workpiece image is shot for the first time, it is shot again after an interval of time t; the interval is used to wait for the hardware workpiece to rotate. The preset time t can be set according to the specific situation, such as the rotating speed of the hardware workpiece. After the number of shots reaches the preset number m, m workpiece images are obtained and shooting ends. The specific value of m can be selected according to the actual situation, but it should not be too small; otherwise the subsequent first average image and second average image will not represent the characteristics of the hardware workpiece well. The smaller the value of m, the larger the influence of a workpiece image containing a surface defect on the first average image, which is unfavorable for the subsequent comparison of the workpiece images with the first average image and the second average image; however, m should not be too large either, otherwise the surface defect detection will take too long. The specific value of m can be determined according to the number of rotations required for the hardware workpiece to complete exactly one full turn.
Step S203, after photographing the workpiece image of the hardware workpiece for the first time, rotating the hardware workpiece around the rotation axis by the rotation angle α, and after a preset time t, rotating the hardware workpiece around the rotation axis again by the rotation angle α until a predetermined number of times m-1 is reached.
After each shooting of the workpiece image, the hardware workpiece is rotated around the rotation shaft by a rotation angle alpha, and the hardware workpiece is rotated again after a preset interval of time t. The interval of the preset time t is to wait for the step S202 to take the image of the workpiece. The number of rotations is determined according to the number of shots, and since no rotation is required before the first shot, the number of rotations is m-1.
Step S204, preprocessing the m photographed workpiece images to obtain m preprocessed images.
The function of this step is to pre-process the photographed m workpiece images, such as denoising, enhancement, scaling, cropping, etc., to improve the accuracy of subsequent processing.
Step S205, calculating and generating a first average image of the m preprocessed images;
the effect of the step is to average the m preprocessed images to generate a first average image, wherein the first average image reflects the overall condition of the hardware workpiece and is used for subsequent comparison with the workpiece image one by one. The averaging process may be to average parameters such as gray scale, color, etc. of the m preprocessed images.
And S206, comparing the m preprocessed images with the first average image respectively, and judging the largest difference image in the m preprocessed images according to the difference degree of the m preprocessed images and the first average image.
This step compares the preprocessed images with the first average image to find the image with the largest difference. Because the first average image is the average of the m preprocessed images, even though a flaw of the hardware workpiece may be present in one of the m preprocessed images, the flaw is averaged out, and its image features are obviously weakened in the first average image. For example, a flaw may appear as a darker color patch, but after the averaging process the flaw is no longer particularly prominent in the first average image; therefore, by comparing the m preprocessed images with the first average image one by one, the preprocessed image containing the darker color patch can be judged as the maximum difference image.
Step S207, after the maximum difference image is removed from the m preprocessed images, calculating and generating a second average image of the remaining m-1 preprocessed images.
After the maximum difference image is removed, the remaining preprocessed images are averaged to generate a second average image. This further eliminates the effect that the maximum difference image, which may contain the flawed portion, has on the average image, so that the second average image is closer to the characteristics of a flawless hardware workpiece.
And step S208, comparing the characteristics of the maximum difference image with the second average image, judging that the hardware workpiece has surface defects if the difference degree of the maximum difference image and the second average image is out of a preset range, and judging that the hardware workpiece has no surface defects if the difference degree of the maximum difference image and the second average image is in the preset range.
The features of the maximum difference image are compared with the second average image. Since the maximum difference image has already been removed when the second average image is calculated, the second average image can be approximately regarded as a standard image of a flawless hardware workpiece. If the degree of difference between the maximum difference image and the second average image is outside an acceptable range, namely outside the preset range, the hardware workpiece can be regarded as having a surface defect; otherwise, it is judged that the hardware workpiece has no surface defect. The preset range can be set according to the specific features compared between the maximum difference image and the second average image: for example, whether there is an obvious color-patch difference between them. If a defect appears as a darker color patch, the preset range can be defined in terms of the size or darkness of the patch, and the specific values can be set according to the surface quality requirements of the hardware workpiece.
According to the surface defect detection method of the embodiment of the invention, strip-shaped light is emitted onto the hardware workpiece and a workpiece image of the hardware workpiece is shot; the hardware workpiece is then controlled to rotate by a certain angle and shot again, and these steps are repeated to obtain a plurality of workpiece images. A first average image of the workpiece images is then calculated, each workpiece image is compared with the first average image one by one to determine the maximum difference image, the maximum difference image is removed, a second average image of the remaining images is calculated, and the maximum difference image is compared with the second average image to judge whether the hardware workpiece has a surface defect. In this way, detection and identification of surface defects of a hardware workpiece with rotational symmetry are realized through comparison of images of the hardware workpiece itself.
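Tying steps S201 to S208 together, a hypothetical end-to-end sketch might look as follows. It reuses the helper sketches shown earlier (extract_bright_part, stretch_contrast, first_average_image, max_difference_by_mean_gray, judge_workpiece), and capture_image() and rotate_workpiece() stand in for the camera and the rotation mechanism; none of these names come from the patent, and all parameter values are assumed.

```python
import time

def detect_surface_defect(capture_image, rotate_workpiece,
                          m=8, alpha_deg=45.0, t_sec=0.5, diff_threshold=12.0):
    """End-to-end sketch of steps S201-S208 under assumed hardware callbacks.

    capture_image    : callable returning one workpiece image (numpy array)
    rotate_workpiece : callable rotating the workpiece by alpha_deg degrees
    Returns True if a surface defect is judged to exist.
    """
    # S202 / S203: photograph, rotate by alpha, wait t, and repeat until m images.
    workpiece_images = []
    for i in range(m):
        workpiece_images.append(capture_image())
        if i < m - 1:                                   # m-1 rotations in total
            rotate_workpiece(alpha_deg)
            time.sleep(t_sec)

    # S204: preprocessing (bright-part extraction, denoising, contrast stretch).
    pre = [stretch_contrast(extract_bright_part(img)) for img in workpiece_images]

    # S205: first average image.
    first_avg = first_average_image(pre)

    # S206: maximum difference image (mean-gray-value variant shown here).
    idx = max_difference_by_mean_gray(pre, first_avg)

    # S207 / S208: second average image and qualification judgment.
    return judge_workpiece(pre, idx, diff_threshold)
```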
In some embodiments of the present invention, the "preprocessing the captured m workpiece images" specifically includes the following operations performed on each workpiece image separately:
removing the dark part of the workpiece image and reserving the bright part of the workpiece image to obtain a bright part image, wherein the bright part is the part of the hardware workpiece in the workpiece image that is irradiated by the strip light, and the dark part is the part of the hardware workpiece in the workpiece image that is not irradiated by the strip light;
And carrying out noise reduction processing on the bright part image.
The bright part shows more details of the hardware workpiece, and removing the dark part better limits the comparison range, so that subsequent average-image calculation and image comparison are performed only on the bright part and a better effect is obtained. In an actual working environment, the image is inevitably affected by dust or other foreign matter, so noise reduction processing is required for the bright part image.
In some embodiments of the present invention, the "preprocessing the captured m workpiece images" specifically further includes:
if the bright part image is a gray level image, performing contrast enhancement processing on the bright part image subjected to noise reduction processing to obtain a preprocessed image;
and if the bright part image is a color image, converting the bright part image subjected to noise reduction into a gray level image, and then performing contrast enhancement processing to obtain a preprocessed image.
Using gray-scale images simplifies the subsequent calculation of the average image and the image comparison. Some shooting modules can capture gray-scale images directly; if the workpiece image captured by the shooting module is already a gray-scale image, the bright part image is naturally a gray-scale image as well, and contrast enhancement can be applied directly to the noise-reduced bright part image. If the workpiece image captured by the shooting module is a color image, a step of converting the bright part image into a gray-scale image must be added. The contrast enhancement processing further highlights differences between features in the bright part image, which benefits the subsequent comparison of the preprocessed images and makes surface defects easier to identify.
In some embodiments of the invention, the contrast enhancement process is performed according to the following formula:
wherein the contrast enhancement maps the gray value g(x, y) of the pixel at coordinates (x, y) in the gray-scale bright part image to the gray value g'(x, y) of the pixel at the same coordinates in the preprocessed image:

g'(x, y) = c, if g(x, y) < a;
g'(x, y) = c + (g(x, y) - a) × (d - c) / (b - a), if a ≤ g(x, y) ≤ b;
g'(x, y) = d, if g(x, y) > b;

where gmax is the maximum gray value among the pixel points of the gray-scale image, [a, b] is the main gray range of the gray-scale image, namely the range within which the gray values of k% of the pixels of the gray-scale image lie, and [c, d] is the preset gray expansion range within [0, gmax], namely the range within which the gray values of k% of the pixels of the preprocessed image lie.
The formula expands the gray range of only the pixels whose gray values account for k% of the bright part image, namely the main pixels, rather than all pixels, because a small amount of noise or bright spots is inevitably present in the bright part image, and stretching these indiscriminately easily causes image distortion. The formula therefore expands the gray range of the main pixels to enhance contrast, and assigns the boundary values of the gray expansion range to pixels whose gray values are too high or too low, which reduces the image distortion caused by the contrast enhancement processing. The value of k can be determined from the specific condition of the workpiece image: if the workpiece image contains little noise and few bright spots and the pixel gray values are concentrated, k can be set larger; otherwise it can be set smaller, as determined case by case.
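Under the definitions above, the stretch can be sketched as follows. The percentile-based way of obtaining the main gray range [a, b] from k% of the pixels, and the default expansion range of [0, 255], are assumptions made for the sake of a runnable example rather than values given in this disclosure.

```python
import numpy as np

def contrast_stretch(gray, k=95, expand_lo=0, expand_hi=255):
    """Expand the main gray range [a, b] onto the preset range [c, d]."""
    g = gray.astype(np.float64)
    # Main gray range: the central k% of pixel gray values (assumed percentile form).
    a, b = np.percentile(g, [(100 - k) / 2, 100 - (100 - k) / 2])
    c, d = float(expand_lo), float(expand_hi)
    stretched = c + (g - a) * (d - c) / max(b - a, 1e-6)
    # Pixels outside the main range receive the boundary values of [c, d].
    return np.clip(stretched, c, d).astype(np.uint8)
```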
In some embodiments of the present invention, the "calculating and generating the first average image of the m preprocessed images" specifically includes:
respectively calculating gray value average values of corresponding coordinate pixel points of the m preprocessed images;
and generating the first average image according to the gray value mean value of each coordinate pixel point.
The step of calculating a first average image of the preprocessed images is mainly used for obtaining an image representing the average state of all preprocessed images. Specifically, this process mainly includes the following two steps:
and respectively calculating gray value average values for corresponding coordinate pixel points of the m preprocessed images. The purpose of this step is to calculate the average number of gray values for the pixels of each preprocessed image at the same coordinate position. For example, if we have 3 images, we will find the pixels at the same coordinate location on each image. Then we add the gray values of these pixels and then divide by the number of images (3 in this example) to get the average gray value for this coordinate position. For example, assume that we have three images, and at the (100 ) th pixel, the gray values of the three images are 60, 70, 80, respectively. The average gray value at this position is (60+70+80)/3=70.
And generating the first average image according to the gray value mean of each coordinate pixel point. After the average gray values of all coordinate positions have been calculated, the first average image can be generated: the average gray value of each coordinate position is taken as the gray value of that position in the first average image. In this way we obtain a new image, the first average image, which represents the average state of all the preprocessed images.
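In array terms the whole step is a single pixel-wise mean over the image stack; the sketch below assumes the m preprocessed images are NumPy arrays of identical size, and the function name is illustrative.

```python
import numpy as np

def first_average_image(preprocessed):
    """Pixel-wise mean of the m preprocessed images (same height and width)."""
    stack = np.stack([img.astype(np.float64) for img in preprocessed])
    return stack.mean(axis=0)   # gray-value mean at every coordinate
```

For the worked example above, the mean of 60, 70 and 80 at that coordinate is indeed 70.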
In some embodiments of the present invention, the "comparing the m preprocessed images with the first average image, and determining the largest difference image among the m preprocessed images according to the difference degree between the m preprocessed images" specifically includes:
respectively calculating the pixel point gray value average value of each preprocessed image in the m preprocessed images;
calculating the pixel point gray value average value of the first average image;
calculating the difference value between the pixel gray value average value of the m preprocessed images and the pixel gray value average value of the first average image one by one;
and judging the preprocessed image with the largest absolute value of the difference value as the largest difference image.
First, an average gray value of each preprocessed image is calculated. For example, if there are 3 preprocessed images, the sum of the gray values of all pixels of each image is calculated and then divided by the number of pixels to obtain the average gray value of each image. For example, the sum of gray values of all pixels of the first image is 30000 and the number of pixels is 500, so that the average gray value of the first image is 30000/500=60.
Next, an average gray value of the first average image is calculated. The calculation method is the same as the previous step, except that the gray value of the first average image is calculated this time.
Next, the difference between each of the preprocessed images and the first average image is calculated. Specifically, the average gray value of the first average image is subtracted from the average gray value of each preprocessed image to obtain a difference value. For example, if the average gray value of the first preprocessed image is 60 and the average gray value of the first average image is 50, the difference is 60-50=10.
And finally, finding out the preprocessed image with the largest difference from the first average image according to the difference. And comparing the difference values between all the preprocessed images and the first average image, wherein the image with the largest absolute value of the difference value is the largest difference image to be found.
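A sketch of this first variant, assuming NumPy arrays: each preprocessed image is reduced to its mean gray value, which is compared with the mean gray value of the first average image, and the image with the largest absolute difference is selected. The function name is illustrative.

```python
import numpy as np

def max_difference_by_mean(preprocessed, first_avg):
    """Variant 1: compare per-image mean gray values against the first average image."""
    avg_mean = first_avg.mean()
    diffs = [abs(img.astype(np.float64).mean() - avg_mean) for img in preprocessed]
    return int(np.argmax(diffs))   # index of the maximum difference image
```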
In some embodiments of the present invention, the "comparing the m preprocessed images with the first average image, and determining the largest difference image among the m preprocessed images according to the difference degree between the m preprocessed images" specifically includes:
respectively subtracting, for each of the m preprocessed images, the gray value of the pixel at the corresponding coordinates of the first average image from the gray value of each pixel point of the preprocessed image, taking the absolute value of the gray value difference, and generating a difference image according to the absolute values;
Performing binarization processing on the differential image, and setting the gray value of a certain pixel of the differential image to 255 if the gray value of the certain pixel is larger than a preset threshold value; if the gray value of a certain pixel of the differential image is smaller than a preset threshold value, setting the gray value of the certain pixel to be 0;
and respectively calculating the number of pixel points with the gray value of 255 in the binarized differential image corresponding to each preprocessed image, and judging the corresponding preprocessed image with the largest number as the largest differential image.
Since surface defects are often embodied as lighter or darker colored patches in the workpiece image, the determination of surface defects can be made by preprocessing the differential image of the image and the first average image. Further, in order to further highlight the surface defects, the differential image is subjected to binarization processing. And then judging the maximum difference image according to the number of the pixels with the gray level of 255 in the binarized difference image. Methods for generating differential images and binarizing the images are well known in the prior art and are not described in detail herein.
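A sketch of this binarization-based variant, assuming OpenCV: the threshold argument corresponds to the preset threshold mentioned above, and its default value here is a hypothetical placeholder.

```python
import cv2
import numpy as np

def max_difference_by_binarization(preprocessed, first_avg, threshold=30):
    """Variant 2: count pixels set to 255 in each binarized difference image."""
    counts = []
    for img in preprocessed:
        diff = cv2.absdiff(img.astype(np.uint8), first_avg.astype(np.uint8))
        _, binary = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
        counts.append(int(np.count_nonzero(binary == 255)))
    return int(np.argmax(counts))   # index of the maximum difference image
```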
In some embodiments of the present invention, the "comparing the m preprocessed images with the first average image, and determining the largest difference image among the m preprocessed images according to the difference degree between the m preprocessed images" specifically includes:
Respectively subtracting, for each of the m preprocessed images, the gray value of the pixel at the corresponding coordinates of the first average image from the gray value of each pixel point of the preprocessed image, taking the absolute value of the gray value difference, and generating a difference image according to the absolute values;
and respectively calculating the pixel point gray value average value of the difference image corresponding to each preprocessed image, and judging the corresponding preprocessed image with the largest pixel point gray value average value as the largest difference image.
Since surface defects are often embodied as lighter or darker colored patches in the workpiece image, the determination of surface defects can be made by preprocessing the differential image of the image and the first average image. And judging the maximum difference image according to the pixel gray value average value of the difference image. Methods for generating differential images are well known in the prior art and are not described in detail herein.
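A sketch of this third variant, assuming NumPy arrays: each preprocessed image is ranked by the mean gray value of its difference image with the first average image rather than by a pixel count. The function name is illustrative.

```python
import numpy as np

def max_difference_by_diff_mean(preprocessed, first_avg):
    """Variant 3: mean gray value of each |preprocessed - first average| difference image."""
    diff_means = [np.abs(img.astype(np.float64) - first_avg).mean() for img in preprocessed]
    return int(np.argmax(diff_means))   # index of the maximum difference image
```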
The embodiment of the invention also provides a surface defect detection device, which comprises:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to invoke the instructions stored in the memory to perform the method according to any of the embodiments described above.
Embodiments of the present invention also provide a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the method according to any of the embodiments described above.
The present invention may be a method, apparatus, system, and/or computer program product. The computer program product may include a computer readable storage medium having computer readable program instructions embodied thereon for performing various aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: portable computer disks, hard disks, Random Access Memory (RAM), Read-Only Memory (ROM), Erasable Programmable Read-Only Memory (EPROM or flash memory), Static Random Access Memory (SRAM), portable Compact Disk Read-Only Memory (CD-ROM), Digital Versatile Disks (DVD), memory sticks, floppy disks, and mechanical encoding devices such as punch cards or raised structures in grooves having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media, as used herein, are not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (e.g., optical pulses through fiber optic cables), or electrical signals transmitted through wires.
The computer readable program instructions described herein may be downloaded from a computer readable storage medium to a respective computing/processing device or to an external computer or external storage device over a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmissions, wireless transmissions, routers, firewalls, switches, gateway computers and/or edge servers. The network interface card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in the respective computing/processing device.
Computer program instructions for carrying out operations of the present invention may be assembly instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including object oriented programming languages such as Smalltalk, C++ or the like, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer readable program instructions may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing electronic circuitry, such as programmable logic circuitry, Field Programmable Gate Arrays (FPGAs), or Programmable Logic Arrays (PLAs), with state information of the computer readable program instructions, so that the electronic circuitry can execute the computer readable program instructions.
Various aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Note that all features disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic set of equivalent or similar features. Where the words "further", "preferably", "still further" or "more preferably" are used, the description that follows them is given on the basis of the foregoing embodiment, and the content after the word, combined with the foregoing embodiment, constitutes a complete further embodiment. Several such further or preferred arrangements following the same embodiment may be combined with one another arbitrarily.
It will be appreciated by persons skilled in the art that the embodiments of the invention described above and shown in the drawings are by way of example only and are not limiting. The objects of the present invention have been fully and effectively achieved. The functional and structural principles of the present invention have been shown and described in the examples and embodiments of the invention may be modified or practiced without departing from the principles described.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (10)

1. An intelligent CAM system for hardware machining, the system comprising a surface defect detection device for detecting a surface defect of a hardware workpiece having rotational symmetry, the hardware workpiece having a rotation axis and a rotation angle α, surface contours of the hardware workpiece before and after rotation of the hardware workpiece around the rotation axis through the rotation angle α being coincident, characterized in that the surface defect detection device comprises:
the light source module is used for emitting strip-shaped light to irradiate the hardware workpiece, wherein the strip-shaped light completely covers the hardware workpiece in the length direction and does not completely cover the hardware workpiece in the width direction;
The shooting module is used for shooting the workpiece images of the hardware workpiece, and shooting the workpiece images of the hardware workpiece again after a preset time t is set, until a preset number m is reached, so as to obtain m workpiece images, wherein m is more than or equal to 3;
the workpiece rotating module is used for rotating the hardware workpiece around the rotating shaft by the rotating angle alpha after shooting the workpiece image of the hardware workpiece for the first time, and rotating the hardware workpiece around the rotating shaft by the rotating angle alpha again after a preset time t is spaced until the preset times m-1 are reached;
the preprocessing module is used for preprocessing the photographed m workpiece images to obtain m preprocessed images;
the first average image generation module is used for calculating and generating a first average image of the m preprocessed images;
the maximum difference judging module is used for comparing the m preprocessed images with the first average image respectively and judging the maximum difference image in the m preprocessed images according to the difference degree of the m preprocessed images and the first average image;
the second average image generation module is used for calculating and generating a second average image of the remaining m-1 preprocessed images after the maximum difference image is removed from the m preprocessed images;
And the workpiece qualification judging module is used for comparing the characteristics of the maximum difference image with the second average image, judging that the hardware workpiece has surface defects if the difference degree of the maximum difference image and the second average image is out of a preset range, and judging that the hardware workpiece has no surface defects if the difference degree of the maximum difference image and the second average image is within the preset range.
2. The intelligent CAM system for hardware processing according to claim 1, wherein the preprocessing of the captured m workpiece images specifically comprises the following operations for each workpiece image:
removing dark parts of the workpiece images, retaining bright parts of the workpiece images, and obtaining bright part images, wherein the bright parts are the parts of the hardware workpiece in the workpiece images that are irradiated by the strip light, and the dark parts are the parts of the hardware workpiece in the workpiece images that are not irradiated by the strip light;
and carrying out noise reduction processing on the bright part image.
3. The intelligent CAM system for hardware processing according to claim 2, wherein the preprocessing the captured m workpiece images specifically further comprises:
if the bright part image is a gray level image, performing contrast enhancement processing on the bright part image subjected to noise reduction processing to obtain a preprocessed image;
And if the bright part image is a color image, converting the bright part image subjected to noise reduction into a gray level image, and then performing contrast enhancement processing to obtain a preprocessed image.
4. The intelligent CAM system for hardware manufacturing of claim 3, wherein the contrast enhancement process is performed according to the following formula:
wherein the contrast enhancement processing maps the gray value g(x, y) of the pixel at coordinates (x, y) in the gray-scale bright part image to the gray value g'(x, y) of the pixel at the same coordinates in the preprocessed image as follows:

g'(x, y) = c, if g(x, y) < a;
g'(x, y) = c + (g(x, y) - a) × (d - c) / (b - a), if a ≤ g(x, y) ≤ b;
g'(x, y) = d, if g(x, y) > b;

wherein gmax is the maximum gray value among the pixel points of the gray-scale image, [a, b] is the main gray range of the gray-scale image, the main gray range being the range within which the gray values of k% of the pixels of the gray-scale image lie, and [c, d] is a preset gray expansion range within [0, gmax], the gray expansion range being the range within which the gray values of k% of the pixels of the preprocessed image lie.
5. The intelligent CAM system for hardware manufacturing according to claim 4, wherein the calculating and generating the first average image of the m preprocessed images specifically comprises:
respectively calculating gray value average values of corresponding coordinate pixel points of the m preprocessed images;
And generating the first average image according to the gray value mean value of each coordinate pixel point.
6. The intelligent CAM system for hardware processing according to claim 5, wherein the comparing the m preprocessed images with the first average image respectively, and determining the largest difference image of the m preprocessed images according to the degree of difference between the m preprocessed images and the first average image specifically comprises:
respectively calculating the pixel point gray value average value of each preprocessed image in the m preprocessed images;
calculating the pixel point gray value average value of the first average image;
calculating the difference value between the pixel gray value average value of the m preprocessed images and the pixel gray value average value of the first average image one by one;
and judging the preprocessed image with the largest absolute value of the difference value as the largest difference image.
7. The intelligent CAM system for hardware processing according to claim 5, wherein the comparing the m preprocessed images with the first average image respectively, and determining the largest difference image of the m preprocessed images according to the degree of difference between the m preprocessed images and the first average image specifically comprises:
respectively subtracting the gray value of the pixel point of the corresponding coordinate of the first average image from the gray value of the pixel point of each preprocessed image in the m preprocessed images, taking the absolute value of the gray value difference value of the gray value of the pixel point of the corresponding coordinate of the first average image, and generating a difference image according to the absolute value;
Performing binarization processing on the differential image, and setting the gray value of a certain pixel of the differential image to 255 if the gray value of the certain pixel is larger than a preset threshold value; if the gray value of a certain pixel of the differential image is smaller than a preset threshold value, setting the gray value of the certain pixel to be 0;
and respectively calculating the number of pixel points with the gray value of 255 in the binarized differential image corresponding to each preprocessed image, and judging the corresponding preprocessed image with the largest number as the largest differential image.
8. The intelligent CAM system for hardware processing according to claim 5, wherein the comparing the m preprocessed images with the first average image respectively, and determining the largest difference image of the m preprocessed images according to the degree of difference between the m preprocessed images and the first average image specifically comprises:
respectively subtracting the gray value of the pixel point of the corresponding coordinate of the first average image from the gray value of the pixel point of each preprocessed image in the m preprocessed images, taking the absolute value of the gray value difference value of the gray value of the pixel point of the corresponding coordinate of the first average image, and generating a difference image according to the absolute value;
and respectively calculating the pixel point gray value average value of the difference image corresponding to each preprocessed image, and judging the corresponding preprocessed image with the largest pixel point gray value average value as the largest difference image.
9. A surface defect detection method for detecting a surface defect of a hardware workpiece having rotational symmetry, the hardware workpiece having a rotation axis and a rotation angle α, surface contours of the hardware workpiece before and after the rotation of the hardware workpiece around the rotation axis through the rotation angle α being coincident, the method comprising:
emitting a strip light to irradiate the hardware workpiece, wherein the strip light completely covers the hardware workpiece in the length direction and does not completely cover the hardware workpiece in the width direction;
shooting workpiece images of the hardware workpiece, and shooting the workpiece images of the hardware workpiece again after a preset time t is set, until a preset number m is reached, so as to obtain m workpiece images, wherein m is more than or equal to 3;
after the workpiece image of the hardware workpiece is photographed for the first time, the hardware workpiece is rotated around the rotation shaft by the rotation angle alpha, and after a preset time t is spaced, the hardware workpiece is rotated around the rotation shaft again by the rotation angle alpha until the preset times m-1 are reached;
preprocessing the photographed m workpiece images to obtain m preprocessed images;
calculating and generating a first average image of the m preprocessed images;
comparing the m preprocessed images with the first average image respectively, and judging the largest difference image in the m preprocessed images according to the degree of difference between the m preprocessed images and the first average image;
after the maximum difference image is removed from the m preprocessed images, calculating and generating a second average image of the remaining m-1 preprocessed images;
and comparing the characteristics of the maximum difference image with the second average image, judging that the hardware workpiece has surface defects if the difference degree of the maximum difference image and the second average image is out of a preset range, and judging that the hardware workpiece has no surface defects if the difference degree of the maximum difference image and the second average image is within the preset range.
10. A surface defect inspection apparatus, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to invoke the instructions stored in the memory to perform the method of claim 9.
CN202311195766.5A 2023-09-18 2023-09-18 Intelligent CAM system for hardware processing and surface defect detection method and device Active CN116930195B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311195766.5A CN116930195B (en) 2023-09-18 2023-09-18 Intelligent CAM system for hardware processing and surface defect detection method and device

Publications (2)

Publication Number Publication Date
CN116930195A true CN116930195A (en) 2023-10-24
CN116930195B CN116930195B (en) 2023-11-17

Family

ID=88375768

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311195766.5A Active CN116930195B (en) 2023-09-18 2023-09-18 Intelligent CAM system for hardware processing and surface defect detection method and device

Country Status (1)

Country Link
CN (1) CN116930195B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004037415A (en) * 2002-07-08 2004-02-05 Mitsubishi Nuclear Fuel Co Ltd Appearance inspection device and appearance inspection method
CN113344865A (en) * 2021-05-21 2021-09-03 深圳中科精工科技有限公司 Method, device, equipment and medium for detecting surface defects of smooth object
CN116380915A (en) * 2023-04-04 2023-07-04 深圳熙卓科技有限公司 Method, device, medium and electronic equipment for detecting surface defects of identity card

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117589690A (en) * 2024-01-18 2024-02-23 常州宝捷冲片有限公司 Visual inspection system and working method
CN117589690B (en) * 2024-01-18 2024-03-19 常州宝捷冲片有限公司 Visual inspection system and working method

Also Published As

Publication number Publication date
CN116930195B (en) 2023-11-17

Similar Documents

Publication Publication Date Title
CN116930195B (en) Intelligent CAM system for hardware processing and surface defect detection method and device
US11747284B2 (en) Apparatus for optimizing inspection of exterior of target object and method thereof
CN111275659B (en) Weld image processing method and device, terminal equipment and storage medium
JP2009097922A (en) Visual inspection method and apparatus
KR20130072073A (en) Apparatus and method for extracting edge in image
KR102012318B1 (en) Apparatus for welding quality total inspection using image sensor and method thereof
CN110930376A (en) Method and system for detecting welding spot burrs based on machine vision
CN110774055B (en) Cutter breakage monitoring method and system based on image edge detection
CN113066088A (en) Detection method, detection device and storage medium in industrial detection
CN113066051A (en) Groove defect detection method, computing equipment and readable storage medium
CN112288680B (en) Automatic defect area extraction method and system for automobile hub X-ray image
CN113298775A (en) Self-priming pump double-sided metal impeller appearance defect detection method, system and medium
CN109211919B (en) Method and device for identifying magnetic tile defect area
CN115018829A (en) Glass flaw positioning method and device
US11644427B2 (en) Automatic detection method and automatic detection system for detecting crack on wafer edges
CN115456969A (en) Method for detecting appearance defect, electronic device and storage medium
CN115511902A (en) Angular point feature extraction method and system
CN113763491A (en) Visual detection method for tobacco shred barrel residues
CN113469988A (en) Defect identification method
CN110766707B (en) Cavitation bubble image processing method based on multi-operator fusion edge detection technology
CN112200805A (en) Industrial product image target extraction and defect judgment method
CN115266759B (en) Explosive column debonding defect online automatic identification method based on shearing speckle interference
CN117218097B (en) Method and device for detecting surface defects of shaft sleeve type silk screen gasket part
CN110009612B (en) Sub-pixel precision-based mobile phone lens window glass image segmentation method
Park et al. An improved algorithm for laser point detection based on otsu thresholding method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant