CN111833350A - Machine vision detection method and system - Google Patents


Info

Publication number
CN111833350A
CN111833350A (application CN202010867013.4A, granted as CN111833350B)
Authority
CN
China
Prior art keywords
value
pixel
row
image
detection area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010867013.4A
Other languages
Chinese (zh)
Other versions
CN111833350B (en)
Inventor
郑李明
于涛
崔兵兵
黄帆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Yuanjue Information And Technology Co
Original Assignee
Nanjing Yuanjue Information And Technology Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Yuanjue Information And Technology Co filed Critical Nanjing Yuanjue Information And Technology Co
Priority to CN202010867013.4A priority Critical patent/CN111833350B/en
Publication of CN111833350A publication Critical patent/CN111833350A/en
Application granted granted Critical
Publication of CN111833350B publication Critical patent/CN111833350B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • G06T7/001Industrial image inspection using an image reference approach
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20048Transform domain processing
    • G06T2207/20061Hough transform
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30164Workpiece; Machine component
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Abstract

The invention discloses a machine vision detection method, characterized by comprising the following steps: acquiring an image of a detection area of a workpiece with a complex surface; processing the workpiece detection area image with a plurality of combinations of masks, algorithms and parameters to obtain a plurality of processed detection area images; combining the processed detection area images into a detection result detection area image; and detecting defects in the detection result detection area image to judge whether the workpiece surfaces corresponding to the plurality of masks are defective.

Description

Machine vision detection method and system
Technical Field
The invention belongs to the field of machine vision, and relates to a technology for detecting surface defects of a complex workpiece by utilizing machine vision.
Background
With the development of the machinery industry in China, and in particular the mass production of metal workpieces, defects such as cracks, pits, scratches, holes and corrosion can arise during manufacturing and processing. These defects may degrade the mechanical properties of the workpiece, such as its corrosion resistance, fatigue strength and wear resistance. A workpiece with a complex surface, in particular, contains non-planar curved surfaces, so its outer edges, chamfers (lead angles) and critical joint areas are difficult to inspect. Quality checking must then be performed by manual sampling after manufacturing, which lacks the consistency and efficiency of full inspection; manual full inspection, in turn, incurs heavy labor and time costs.
To increase inspection throughput and efficiency and to reduce the labor and time costs of inspection, the machinery industry needs a technology that automatically detects surface defects of complex workpieces using machine vision and that can provide a consistent standard of quality inspection.
Disclosure of Invention
To overcome the deficiencies of the prior art, the invention provides a machine vision detection method and system: an automated machine vision inspection system and method for workpieces with complex surfaces that provides a consistent quality-inspection standard, increases inspection throughput and efficiency, and reduces inspection labor and time costs.
To achieve this purpose, the invention adopts the following technical solutions:
According to an aspect of the present application, there is provided a machine vision detection method, comprising: acquiring an image of a detection area of a workpiece with a complex surface; processing the workpiece detection area image with a plurality of combinations of masks, algorithms and parameters to obtain a plurality of processed detection area images; combining the processed detection area images into a detection result detection area image; and detecting defects in the detection result detection area image to judge whether the workpiece surfaces corresponding to the plurality of masks are defective.
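The overall flow of this method can be sketched as follows. This is a hedged illustration only: the function name `inspect`, the callable-algorithm interface, and the rule that any non-zero processed pixel marks a defect are assumptions for the sketch, not terms of the patent.

```python
import numpy as np

def inspect(image, configs):
    """Run each (mask, algorithm, parameters) combination on the workpiece
    detection area image and merge the processed results.

    `configs` is a list of (mask, algorithm, params) tuples, where
    `algorithm` is a callable returning a processed detection area image
    of the same shape (hypothetical interface)."""
    processed = [algo(image * mask, params) for mask, algo, params in configs]
    # Assumed merge rule: a pixel of the detection result image is marked
    # defective if any processed image flags it (non-zero value).
    result = np.zeros_like(image, dtype=bool)
    for p in processed:
        result |= (p != 0)
    return result
```

A trivial thresholding callable can stand in for one of the claimed algorithms when exercising this skeleton.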
Further, in order to calculate surface defects from the mean and standard deviation of the non-zero values in a region, a first algorithm of the plurality of algorithms comprises the steps of: masking the workpiece detection area image with the mask corresponding to the first algorithm; copying the non-zero pixel values of the masked workpiece detection area image to a working detection area image; calculating a corresponding mean u and standard deviation s for one or more first regions of the working detection area image; calculating the upper threshold of each first region from the forward threshold, the mean u and the standard deviation s corresponding to that region; and traversing the pixels of each first region of the masked workpiece detection area image, and when the value of a pixel is greater than the upper threshold of its first region, setting the corresponding pixel of the processed detection area image to the difference between the pixel value and the mean u, and otherwise to a second value. A pixel whose value exceeds the upper threshold of its first region may belong to a crack in the workpiece detection area.
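A minimal sketch of this upper-threshold step for a single region follows. The parameter name `k_pos` for the "forward threshold" and the form upper = u + k_pos * s are assumptions; the patent only states that the upper threshold is computed from the forward threshold, u and s.

```python
import numpy as np

def upper_threshold_scan(masked, k_pos, second_value=0.0):
    """Flag bright defects (crack candidates) in one region of a masked image.

    Statistics use only the non-zero pixels, as claimed. Pixels above
    u + k_pos * s are set to (pixel - u); all others get `second_value`."""
    out = np.full(masked.shape, second_value, dtype=float)
    nonzero = masked[masked != 0]
    if nonzero.size == 0:          # region entirely masked out
        return out
    u, s = nonzero.mean(), nonzero.std()
    upper = u + k_pos * s
    bright = masked > upper        # masked-out zeros never exceed upper here
    out[bright] = masked[bright] - u
    return out
```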
Further, in order to preserve defects below the lower threshold, the first algorithm further comprises the steps of: calculating the lower threshold of each first region from the negative threshold, the mean u and the standard deviation s corresponding to that region; and traversing the pixels of each first region of the masked workpiece detection area image, and setting the corresponding pixel of the processed detection area image to a first value when the value of the pixel is smaller than the lower threshold of its first region. A pixel whose value is below the lower threshold of its first region may belong to a pit or black-skin (mill scale) portion of the workpiece detection area.
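The companion lower-threshold step can be sketched in the same way. Again, `k_neg` for the "negative threshold", the form lower = u - k_neg * s, and the value 255 for the "first value" are illustrative assumptions.

```python
import numpy as np

def lower_threshold_scan(masked, k_neg, first_value=255.0):
    """Flag dark defects (pit or black-skin candidates) below u - k_neg * s.

    Masked-out pixels (value 0) are excluded from both the statistics and
    the comparison, so background is never flagged."""
    out = np.zeros_like(masked, dtype=float)
    nonzero = masked[masked != 0]
    if nonzero.size == 0:
        return out
    u, s = nonzero.mean(), nonzero.std()
    lower = u - k_neg * s
    out[(masked != 0) & (masked < lower)] = first_value
    return out
```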
Further, to provide a global scan algorithm, a row scan algorithm or a column scan algorithm, the first region is one of: the whole of the working detection area image; a row of the working detection area image that contains at least one pixel with a non-zero value; and a column of the working detection area image that contains at least one pixel with a non-zero value.
Further, in order to provide a row-column scan algorithm, a row-column scan algorithm of the plurality of algorithms comprises the steps of: masking the workpiece detection area image with the mask corresponding to the row-column scan algorithm; copying the non-zero pixel values of the masked workpiece detection area image to a working detection area image; calculating a corresponding row mean u and row standard deviation s for one or more rows of the working detection area image; calculating a corresponding column mean u and column standard deviation s for one or more columns of the working detection area image; calculating the row upper threshold of each row from the forward threshold, the row mean u and the row standard deviation s corresponding to that row; calculating the column upper threshold of each column from the forward threshold, the column mean u and the column standard deviation s corresponding to that column; and traversing each pixel of the masked workpiece detection area image, computing a row-column upper threshold from the row upper threshold and the column upper threshold corresponding to the pixel, and when the pixel value is greater than its row-column upper threshold, setting the corresponding pixel of the processed detection area image to the difference between the pixel value and the row mean or the column mean, and otherwise to a second value. A pixel whose value exceeds its row-column upper threshold may belong to a crack in the workpiece detection area.
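A sketch of this row-column scan follows. The patent does not specify how the row and column upper thresholds are combined into a single row-column threshold; the version below takes the stricter of the two (a pixel must exceed both), and stores the distance from the row mean, both of which are assumptions.

```python
import numpy as np

def row_col_scan(masked, k_pos, second_value=0.0):
    """Row-column scan: flag a pixel as a crack candidate only if it exceeds
    both its row upper threshold and its column upper threshold.

    Per-row and per-column statistics use non-zero pixels only; rows or
    columns with no non-zero pixel are skipped (threshold = +inf)."""
    h, w = masked.shape
    out = np.full((h, w), second_value, dtype=float)
    row_u = np.zeros(h)
    row_hi = np.full(h, np.inf)
    col_hi = np.full(w, np.inf)
    for i in range(h):
        nz = masked[i][masked[i] != 0]
        if nz.size:
            row_u[i] = nz.mean()
            row_hi[i] = nz.mean() + k_pos * nz.std()
    for j in range(w):
        nz = masked[:, j][masked[:, j] != 0]
        if nz.size:
            col_hi[j] = nz.mean() + k_pos * nz.std()
    over = (masked > row_hi[:, None]) & (masked > col_hi[None, :])
    # Flagged pixels store their distance from the row mean (assumed choice).
    out[over] = (masked - row_u[:, None])[over]
    return out
```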
Further, in order to preserve defects below the lower threshold in the row-column scan algorithm, the row-column scan algorithm further comprises the steps of: calculating the row lower threshold of each row from the negative threshold, the row mean u and the row standard deviation s corresponding to that row; calculating the column lower threshold of each column from the negative threshold, the column mean u and the column standard deviation s corresponding to that column; and traversing each pixel of the masked workpiece detection area image, computing a row-column lower threshold from the row lower threshold and the column lower threshold corresponding to the pixel, and setting the corresponding pixel of the processed detection area image to a first value when the pixel value is smaller than its row-column lower threshold. A pixel whose value is below its row-column lower threshold may belong to a pit or black-skin (mill scale) portion of the workpiece detection area.
Further, in order to inspect different orientations of the workpiece with different combinations of masks, algorithms and parameters, a second algorithm of the plurality of algorithms comprises the steps of: masking the workpiece detection area image with the mask corresponding to the second algorithm; copying the non-zero pixel values of the masked workpiece detection area image to a working detection area image; calculating a corresponding mean u and standard deviation s for one or more second regions of the working detection area image; calculating the upper threshold of each second region from the forward threshold, the mean u and the standard deviation s corresponding to that region; and traversing the pixels of each second region of the masked workpiece detection area image, and when the value of a pixel is greater than the upper threshold of its second region, setting the corresponding pixel of the processed detection area image to the difference between the pixel value and the upper threshold, and otherwise to the second value; wherein the first region and the second region are each one of the following, the first region being different from the second region: the whole of the working detection area image; a row of the working detection area image that contains at least one pixel with a non-zero value; and a column of the working detection area image that contains at least one pixel with a non-zero value. A pixel whose value exceeds the upper threshold of its second region may belong to a crack in the workpiece detection area.
Further, in order to preserve defects below the lower threshold, the second algorithm further comprises the steps of: calculating the lower threshold of each second region from the negative threshold, the mean u and the standard deviation s corresponding to that region; and traversing the pixels of each second region of the masked workpiece detection area image, and setting the corresponding pixel of the processed detection area image to a first value when the value of the pixel is smaller than the lower threshold of its second region. A pixel whose value is below the lower threshold of its second region may belong to a pit or black-skin (mill scale) portion of the workpiece detection area.
Further, in order to evaluate defects in the overlap between adjacent masks, when the mask corresponding to the first algorithm overlaps the mask corresponding to the second algorithm, the step of combining the first detection area image processed by the first algorithm with the second detection area image processed by the second algorithm further comprises performing a logic operation on each pair of corresponding pixels in the overlap between the first and second detection area images: when either pixel of the pair holds the first value or a difference value, the corresponding pixel of the detection result detection area image is set to a defect value, and otherwise to a non-defect value.
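This combining rule amounts to a per-pixel logical OR. The sketch below assumes that "holds the first value or a difference value" is equivalent to "is non-zero" (since non-flagged pixels carry the second value, taken here as 0), and uses 255/0 as illustrative defect and non-defect values.

```python
import numpy as np

def combine_overlap(img1, img2, defect_value=255, nondefect_value=0):
    """Merge two processed detection area images over their mask overlap.

    A result pixel becomes `defect_value` if either input pixel is flagged
    (non-zero), and `nondefect_value` otherwise."""
    defect = (img1 != 0) | (img2 != 0)
    return np.where(defect, defect_value, nondefect_value)
```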
Further, in order to determine the type of a surface defect, the machine vision detection method further comprises: performing a Hough transform on the detection result detection area image to obtain a Hough detection area image; setting a minimum bounding rectangle for each defect region in the Hough detection area image; and judging from the minimum bounding rectangle whether the defect region is a crack; wherein the step of judging whether the workpiece surfaces corresponding to the plurality of masks are defective further comprises judging the workpiece surface to be defective when the Hough detection area image contains a crack.
Further, in order to filter out isolated points that would interfere with the Hough transform, the machine vision detection method further comprises: before the Hough transform step, when the value of a pixel of the detection result detection area image is the defect value and the values of the pixels adjacent to it are all non-defect values, setting the value of that pixel to the non-defect value.
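The isolated-point filter can be sketched as follows; reading "the pixels adjacent to it" as the 8-connected neighbourhood is an assumption, as are the 255/0 defect and non-defect values.

```python
import numpy as np

def remove_isolated(result, defect_value=255, nondefect_value=0):
    """Clear defect pixels whose 8 neighbours are all non-defect, so that
    isolated specks do not disturb the subsequent Hough transform.

    Border pixels are handled by padding with the non-defect value."""
    padded = np.pad(result, 1, constant_values=nondefect_value)
    out = result.copy()
    h, w = result.shape
    for i in range(h):
        for j in range(w):
            if result[i, j] == defect_value:
                nbhd = padded[i:i + 3, j:j + 3].copy()
                nbhd[1, 1] = nondefect_value  # exclude the centre pixel
                if not (nbhd == defect_value).any():
                    out[i, j] = nondefect_value
    return out
```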
Further, in order to filter out noise, the machine vision detection method further comprises: not setting the minimum bounding rectangle of a defect region when the defect region satisfies one of the following conditions: the contour area of the defect region is smaller than one threshold; and the length of the defect region is smaller than another threshold.
Further, in order to determine whether a defect is a long, narrow crack, the step of judging whether the defect region is a crack further comprises: calculating a ratio for the minimum bounding rectangle of the defect region; and judging the defect region to be a crack when the ratio is greater than a threshold, wherein the ratio is one of the following: the difference between the area of the minimum bounding rectangle and the area of the defect region, divided by the area of the minimum bounding rectangle; and the length of the long side of the minimum bounding rectangle divided by the length of its short side.
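Both claimed ratios can be sketched in one small helper. The threshold values and mode names below are illustrative, not taken from the patent.

```python
def is_crack(rect_w, rect_h, defect_area, ratio_threshold=3.0, mode="aspect"):
    """Decide whether a defect region is an elongated crack from its minimum
    bounding rectangle, using either of the two claimed ratios.

    'aspect': long side over short side (narrow shapes score high).
    'fill':   (rectangle area - defect area) / rectangle area, i.e. how
              poorly the defect fills its bounding rectangle."""
    long_side, short_side = max(rect_w, rect_h), min(rect_w, rect_h)
    if mode == "aspect":
        ratio = long_side / short_side
    else:  # "fill"
        rect_area = rect_w * rect_h
        ratio = (rect_area - defect_area) / rect_area
    return ratio > ratio_threshold
```

In practice the rectangle would come from a rotated minimum-area bounding box of the defect contour, so a thin diagonal crack still yields a narrow rectangle.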
Further, in order to make cracks easier to detect, the machine vision detection method further comprises: before the acquiring step, scattering magnetic powder over the workpiece detection area.
According to another aspect of the present application, there is provided a machine vision detection system, comprising: a camera module for photographing a workpiece to be inspected; and a calculator module for executing software to implement the following steps: causing the camera module to photograph the workpiece to be inspected; acquiring an image of a detection area of the workpiece, which has a complex surface, from the camera module; processing the workpiece detection area image with a plurality of combinations of masks, algorithms and parameters to obtain a plurality of processed detection area images; combining the processed detection area images into a detection result detection area image; and detecting defects in the detection result detection area image to judge whether the workpiece surfaces corresponding to the plurality of masks are defective.
Further, in order to calculate surface defects from the mean and standard deviation of the non-zero values in a region, a first algorithm of the plurality of algorithms comprises the steps of: masking the workpiece detection area image with the mask corresponding to the first algorithm; copying the non-zero pixel values of the masked workpiece detection area image to a working detection area image; calculating a corresponding mean u and standard deviation s for one or more first regions of the working detection area image; calculating the upper threshold of each first region from the forward threshold, the mean u and the standard deviation s corresponding to that region; and traversing the pixels of each first region of the masked workpiece detection area image, and when the value of a pixel is greater than the upper threshold of its first region, setting the corresponding pixel of the processed detection area image to the difference between the pixel value and the upper threshold, and otherwise to a second value. A pixel whose value exceeds the upper threshold of its first region may belong to a crack in the workpiece detection area.
Further, in order to preserve defects below the lower threshold, the first algorithm further comprises the steps of: calculating the lower threshold of each first region from the negative threshold, the mean u and the standard deviation s corresponding to that region; and traversing the pixels of each first region of the masked workpiece detection area image, and setting the corresponding pixel of the processed detection area image to a first value when the value of the pixel is smaller than the lower threshold of its first region. A pixel whose value is below the lower threshold of its first region may belong to a pit or black-skin (mill scale) portion of the workpiece detection area.
Further, to provide a global scan algorithm, a row scan algorithm or a column scan algorithm, the first region is one of: the whole of the working detection area image; a row of the working detection area image that contains at least one pixel with a non-zero value; and a column of the working detection area image that contains at least one pixel with a non-zero value.
Further, in order to provide a row-column scan algorithm, a row-column scan algorithm of the plurality of algorithms comprises the steps of: masking the workpiece detection area image with the mask corresponding to the row-column scan algorithm; copying the non-zero pixel values of the masked workpiece detection area image to a working detection area image; calculating a corresponding row mean u and row standard deviation s for one or more rows of the working detection area image; calculating a corresponding column mean u and column standard deviation s for one or more columns of the working detection area image; calculating the row upper threshold of each row from the forward threshold, the row mean u and the row standard deviation s corresponding to that row; calculating the column upper threshold of each column from the forward threshold, the column mean u and the column standard deviation s corresponding to that column; and traversing each pixel of the masked workpiece detection area image, computing a row-column upper threshold from the row upper threshold and the column upper threshold corresponding to the pixel, and when the pixel value is greater than its row-column upper threshold, setting the corresponding pixel of the processed detection area image to the difference between the pixel value and the row-column upper threshold, and otherwise to a second value. A pixel whose value exceeds its row-column upper threshold may belong to a crack in the workpiece detection area.
Further, in order to preserve defects below the lower threshold in the row-column scan algorithm, the row-column scan algorithm further comprises the steps of: calculating the row lower threshold of each row from the negative threshold, the row mean u and the row standard deviation s corresponding to that row; calculating the column lower threshold of each column from the negative threshold, the column mean u and the column standard deviation s corresponding to that column; and traversing each pixel of the masked workpiece detection area image, computing a row-column lower threshold from the row lower threshold and the column lower threshold corresponding to the pixel, and setting the corresponding pixel of the processed detection area image to a first value when the pixel value is smaller than its row-column lower threshold. A pixel whose value is below its row-column lower threshold may belong to a pit or black-skin (mill scale) portion of the workpiece detection area.
Further, in order to inspect different orientations of the workpiece with different combinations of masks, algorithms and parameters, a second algorithm of the plurality of algorithms comprises the steps of: masking the workpiece detection area image with the mask corresponding to the second algorithm; copying the non-zero pixel values of the masked workpiece detection area image to a working detection area image; calculating a corresponding mean u and standard deviation s for one or more second regions of the working detection area image; calculating the upper threshold of each second region from the forward threshold, the mean u and the standard deviation s corresponding to that region; and traversing the pixels of each second region of the masked workpiece detection area image, and when the value of a pixel is greater than the upper threshold of its second region, setting the corresponding pixel of the processed detection area image to the difference between the pixel value and the upper threshold, and otherwise to the second value; wherein the first region and the second region are each one of the following, the first region being different from the second region: the whole of the working detection area image; a row of the working detection area image that contains at least one pixel with a non-zero value; and a column of the working detection area image that contains at least one pixel with a non-zero value. A pixel whose value exceeds the upper threshold of its second region may belong to a crack in the workpiece detection area.
Further, in order to preserve defects below the lower threshold, the second algorithm further comprises the steps of: calculating the lower threshold of each second region from the negative threshold, the mean u and the standard deviation s corresponding to that region; and traversing the pixels of each second region of the masked workpiece detection area image, and setting the corresponding pixel of the processed detection area image to a first value when the value of the pixel is smaller than the lower threshold of its second region. A pixel whose value is below the lower threshold of its second region may belong to a pit or black-skin (mill scale) portion of the workpiece detection area.
Further, in order to evaluate defects in the overlap between adjacent masks, when the mask corresponding to the first algorithm overlaps the mask corresponding to the second algorithm, the step of combining the first detection area image processed by the first algorithm with the second detection area image processed by the second algorithm further comprises performing a logic operation on each pair of corresponding pixels in the overlap between the first and second detection area images: when either pixel of the pair holds the first value or a difference value, the corresponding pixel of the detection result detection area image is set to a defect value, and otherwise to a non-defect value.
Further, in order to determine the type of a surface defect, the calculator module is further configured to: perform a Hough transform on the detection result detection area image to obtain a Hough detection area image; set a minimum bounding rectangle for each defect region in the Hough detection area image; and judge from the minimum bounding rectangle whether the defect region is a crack; wherein the step of judging whether the workpiece surfaces corresponding to the plurality of masks are defective further comprises judging the workpiece surface to be defective when the Hough detection area image contains a crack.
Further, in order to filter out isolated points that would interfere with the Hough transform, the calculator module is further configured to: before the Hough transform step, when the value of a pixel of the detection result detection area image is the defect value and the values of the pixels adjacent to it are all non-defect values, set the value of that pixel to the non-defect value.
Further, to reduce noise, the calculator module is further configured to: not set the minimum bounding rectangle of a defect region when the defect region satisfies one of the following conditions: the contour area of the defect region is smaller than one threshold; and the length of the defect region is smaller than another threshold.
Further, in order to determine whether a defect is a long, narrow crack, the step of judging whether the defect region is a crack further comprises: calculating a ratio for the minimum bounding rectangle of the defect region; and judging the defect region to be a crack when the ratio is greater than a threshold, wherein the ratio is one of the following: the difference between the area of the minimum bounding rectangle and the area of the defect region, divided by the area of the minimum bounding rectangle; and the length of the long side of the minimum bounding rectangle divided by the length of its short side.
Further, in order to inspect large workpieces or large numbers of workpieces, the machine vision detection system further comprises one or any combination of the following modules: a robot arm module for carrying the camera module; an illumination module for illuminating the workpiece to be inspected; and a conveyor belt module for carrying the workpiece to be inspected and moving it to the corresponding photographing position of the camera module.
Further, in order to accelerate the detection of workpiece surface defects with programmable hardware, the calculator module further comprises a parallel acceleration calculation module composed of field-programmable gate arrays (FPGAs) for executing the calculation steps of the software.
Further, in order to make cracks easier to detect, magnetic powder has been scattered over the workpiece detection area before the workpiece to be inspected is photographed.
By adopting the above scheme, the invention achieves the following beneficial effects: different thresholds can be set for different regions of a workpiece with a complex surface according to differences in illumination and the like, so that automated machine vision inspection is performed effectively, a consistent quality-inspection standard is provided, inspection throughput and efficiency are increased, and inspection labor and time costs are reduced.
Drawings
Fig. 1A is a block schematic diagram of a machine vision inspection system according to an embodiment of the present application.
Fig. 1B is a block schematic diagram of a machine vision inspection system according to another embodiment of the present application.
FIG. 2 is a block schematic diagram of a calculator module according to an embodiment of the present application.
FIG. 3 is a block diagram of an acceleration calculation module according to an embodiment of the present application.
Fig. 4 is a block diagram illustrating data stored in a memory module according to an embodiment of the present application.
Fig. 5 is a flowchart illustrating a machine vision inspection method according to an embodiment of the present application.
Fig. 6 is a flow chart diagram of a scanning algorithm according to an embodiment of the present application.
Fig. 7 is a flow chart diagram of a scanning algorithm according to an embodiment of the present application.
Fig. 8 is a flowchart illustrating a method for determining a surface defect type according to an embodiment of the present disclosure.
Fig. 9 is a diagram of steps of a machine vision inspection method according to an embodiment of the present application after execution.
Fig. 10 is a workpiece diagram and a crack recognition diagram of a machine vision inspection method according to an embodiment of the present application.
FIG. 11 is a schematic view of a mask according to an embodiment of the present application.
FIG. 12A is a diagram of a workpiece according to an embodiment of the application.
Fig. 12B and 12C are views of a mask of the workpiece of fig. 12A, respectively.
Fig. 13A is a diagram of a workpiece according to an embodiment of the application.
Fig. 13B and 13C are views of the mask of the workpiece of fig. 13A, respectively.
Fig. 14A to 14D are diagrams after steps of a machine vision inspection method according to an embodiment of the present application are performed.
FIG. 15A is a diagram of a claw pole workpiece according to an embodiment of the present application.
Fig. 15B is a schematic illustration of three masks for one jaw of the claw-pole workpiece shown in fig. 15A.
FIG. 15C is a schematic illustration of a surface defect of the workpiece shown in FIG. 15A.
Fig. 15D is a schematic view of the detection area of the workpiece shown in fig. 15C.
Fig. 15E is a schematic view of the detection result of the workpiece shown in fig. 15C.
FIG. 16A is a schematic illustration of a surface imperfection of the workpiece shown in FIG. 15A.
Fig. 16B is a schematic view of the detection area of the workpiece shown in fig. 16A.
Fig. 16C is a schematic view of the detection result of the workpiece shown in fig. 16A.
FIG. 17A is a schematic illustration of a surface defect of the workpiece shown in FIG. 15A.
Fig. 17B is a schematic view of the detection area of the workpiece shown in fig. 17A.
Fig. 17C is a schematic diagram of the detection result of the workpiece shown in fig. 17A.
FIG. 18A is a schematic illustration of a surface defect of the workpiece shown in FIG. 15A.
Fig. 18B is a schematic view of the detection area of the workpiece shown in fig. 18A.
Fig. 18C is a schematic view of the detection result of the workpiece shown in fig. 18A.
FIG. 19A is a schematic illustration of a surface imperfection of the workpiece shown in FIG. 15A.
Fig. 19B is a schematic view of the detection area of the workpiece shown in fig. 19A.
Fig. 19C is a schematic view of the detection result of the workpiece shown in fig. 19A.
FIG. 20A is a schematic illustration of a surface defect of the workpiece shown in FIG. 15A.
Fig. 20B is a schematic view of the detection area of the workpiece shown in fig. 20A.
Fig. 20C is a schematic view of the detection result of the workpiece shown in fig. 20A.
FIG. 21A is a schematic illustration of the workpiece shown in FIG. 15A without surface defects.
Fig. 21B is a schematic view of the detection area of the workpiece shown in fig. 21A.
Fig. 21C is a schematic view of the detection result of the workpiece shown in fig. 21A.
FIG. 22 is a diagram of a cage workpiece according to an embodiment of the present application.
FIG. 23A is a schematic illustration of a surface imperfection of the workpiece shown in FIG. 22.
Fig. 23B is a schematic view of the detection area of the workpiece shown in fig. 23A.
Fig. 23C is a schematic diagram of the detection result of the workpiece shown in fig. 23A.
FIG. 24A is a schematic illustration of a surface imperfection of the workpiece shown in FIG. 22.
Fig. 24B is a schematic view of the detection area of the workpiece shown in fig. 24A.
Fig. 24C is a schematic view of the detection result of the workpiece shown in fig. 24A.
FIG. 25A is a schematic illustration of a surface imperfection of the workpiece shown in FIG. 22.
Fig. 25B is a schematic view of the detection region of the workpiece shown in fig. 25A.
Fig. 25C is a schematic view of the detection result of the workpiece shown in fig. 25A.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
The description of the present application primarily contains several sections, first describing the machine vision inspection system, i.e., the hardware section. Next, a machine vision inspection method, that is, a software portion, which can be implemented in the machine vision inspection system is described. Further, several results obtained by using the above-mentioned system and method are used to illustrate the performance of the system and method provided in the present application. Finally, various embodiments and variations thereof provided herein are described.
Please refer to fig. 1A, which is a block diagram illustrating a machine vision inspection system 100 according to an embodiment of the present application. The machine vision inspection system 100 is an electro-mechanical system for automated inspection of one or more surface defects of a workpiece 199 and may be composed of the following modules. Because the system may need to be installed in a factory environment, the various modules of the machine vision inspection system 100 can be adapted to industrial-environment specifications, such as dust-proof, waterproof, drop-resistant, and temperature-tolerant design requirements.
The calculator module 110 is the core of the machine vision inspection system 100 and is used to control the entire machine vision inspection system 100. In one embodiment, the calculator module 110 can be an industrial-level control host, but the application does not require that the calculator module 110 necessarily be an industrial-level control host. For example, the calculator module 110 may include a central processing unit that executes the x86 or x64 instruction set, or may include a central processing unit that executes other types of instruction sets. The calculator module 110 may be used to connect the following modules and to execute an operating system and an application program that implement the machine vision inspection method provided in the present application. One of ordinary skill in the art, having common general knowledge of the structure and organization of computers, will understand that the calculator module 110 may have variations.
The machine vision inspection system 100 may include one or more camera modules 120, each camera module 120 having a lens group and an electro-optic element for capturing a surface of a workpiece 199. When a single workpiece 199 has multiple faces to be detected, multiple camera modules 120 can be used to shoot multiple faces simultaneously, so as to save the detection time.
The electro-optic components of the camera module 120 may be monochrome or may sense multiple color channels; the application does not limit the light-sensing spectral band of the camera module 120. For example, in surface inspection, fluorescent magnetic powder may be applied to the surface of the workpiece to highlight surface defects. Since the magnetic lines of force are exposed at the portion of the workpiece having a surface defect, more magnetic powder collects there. When the light-sensing band of the camera module 120 includes the ultraviolet band, defects where magnetic powder gathers can be detected more effectively. In another example, when a heated workpiece cools, the portions with surface defects do not cool at the same rate as other portions. When the light-sensing band of the camera module 120 includes the infrared band, surface defects whose degree of cooling differs from that of other portions can be detected more effectively.
The machine vision inspection system 100 may include an optional illumination module 130 for cooperating with the camera module 120 described above. Since the light source may need to be turned on or off during the shooting of each camera module 120, the calculator module 110 may control parameters such as the on/off, brightness, color temperature, and frequency band of each lighting module 130.
The machine vision inspection system 100 may include a conveyor module 140 for moving a plurality of workpieces 199 to be inspected in a pipelined manner. Because it takes time to detect each workpiece, the calculator module 110 can control the speed, distance, direction, etc. of movement of the conveyor module 140. In one embodiment, when a defective workpiece 199 is detected, the calculator module 110 can also cause the conveyor module 140 to convey the workpiece 199 to a collection area for an abnormal workpiece. Since the conveyor belt module 140 is already a module frequently used in industry, the working principle thereof will not be described in detail in this application.
It may also be desirable for an inspector to monitor the machine vision inspection system 100 while performing an inspection task. Accordingly, a display module 150 may be included for displaying photographic original images, inspection reports, system configuration, and the like. The calculator module 110 may also prompt the inspection personnel with a visual or audible message via the alarm module 160 when a defective workpiece 199 is detected.
Please refer to fig. 1B, which is a block diagram illustrating a machine vision inspection system 100 according to another embodiment of the present application. The components shown in FIG. 1A may be adapted for use in the embodiment of FIG. 1B. The embodiment shown in fig. 1A can be applied to smaller workpieces. However, for complex large workpieces, such as the crankshaft of a motor vehicle, there are too many surfaces to be inspected for the fixed-position camera module 120 to inspect them all. Thus, in the embodiment shown in FIG. 1B, movable robot modules are added to help inspect multiple surfaces of the workpiece 199. In another embodiment, the camera module 120 and/or the illumination module 130 carried by a robot module can be freely moved to a preferred position, so that different surface areas of a complex part receive more uniform light angles and camera lens angles. Therefore, the images of different surface areas can fall into a better shooting range. Since the robot arm module is already a module frequently used in industry, its working principle will not be described in detail in this application. The number and type of the robot arms are not limited in the present application.
In the embodiment shown in fig. 1B, the robot module 170A carries one camera module 120 and one illumination module 130. The robot module 170B carries one camera module 120. The robot module 170C carries one illumination module 130. The illumination module 130 mounted on the robot module 170C can be used to capture an image in cooperation with the camera module 120 mounted on the robot module 170B. The present application does not require that all robot modules in the same machine vision inspection system 100 be of the same type; different types of robot modules may be included. One robot module may also carry one or more camera modules 120 or illumination modules 130; the present application is not limited to a robot module carrying only one camera module 120 or one illumination module 130.
The calculator module 110 may control the movement of each robot module so that the camera module 120 or the illumination module 130 carried by it may be moved into the appropriate position. The timing of the operation of the camera module 120 or the illumination module 130 is then controlled to complete the detection of a surface defect of one of the faces of the workpiece 199.
Although all of the camera modules 120 and illumination modules 130 are mounted on robot modules in the embodiment shown in fig. 1B, a fixed camera module 120 or illumination module 130 may also be included. In addition, the workpiece 199 may itself be mounted on one of the robot modules, so that the workpiece 199 is moved during the inspection; the application does not require that only robot modules carrying a camera module 120 or an illumination module 130 can move. It will be appreciated by those skilled in the art that whether the workpiece 199 or the camera module 120 is in motion, multiple faces of the same workpiece 199 may be imaged using the same camera module 120.
Please refer to fig. 2, which is a block diagram illustrating a calculator module 110 according to an embodiment of the present disclosure. As previously mentioned, in some embodiments, the calculator module 110 can be an industrial-level control host. The calculator module 110 may include a central processor module 210, a bus module 220, and one or more acceleration calculation modules 230, as well as a memory module 215 coupled to the central processor module 210 and a memory module 340 coupled to the acceleration calculation modules 230. For example, the bus module 220 may be an industry standard bus such as SCSI, iSCSI, PCI, PCI-Express, I2C, or USB, or a proprietary bus. The bus module 220 may also be used to connect the modules shown in FIGS. 1A and 1B. For clarity, FIG. 2 does not show the connection relationship between the bus module 220 or the CPU module 210 and the modules shown in FIGS. 1A and 1B. One of ordinary skill in the art, having common general knowledge of the structure and organization of computers, would understand the variations that the bus module 220 may have.
To increase the speed of detecting surface defects, the embodiment shown in FIG. 2 includes two acceleration calculation modules 230; the calculator module 110 may also include only a single acceleration calculation module 230. The acceleration calculation module 230 may be used to process the image data, so that the central processor module 210 is not overwhelmed when its processing speed is insufficient for high-speed computation. In the embodiment of fig. 2, the central processor module 210 is mainly used for controlling and coordinating operations of the modules shown in fig. 1A and 1B rather than for processing image data, but the present application does not necessarily require an acceleration calculation module 230. The processing speed of central processors continues to increase in accordance with Moore's law, and the central processor module 210 may include multiple processing cores to facilitate parallel processing. In some embodiments, the central processor module 210 may therefore be used to process image data without using the acceleration calculation module 230.
As shown in the embodiment of fig. 2, an acceleration calculation module 230 may be directly connected to one or more camera modules 120. In some embodiments, the image captured by the camera module 120 can be transmitted directly to the acceleration calculation module 230 through the bus module 220, or can be transmitted to each acceleration calculation module 230 through the central processor module 210. When the camera module 120 is directly connected to a fixed acceleration calculation module 230, the design of the overall system is relatively simple. However, when the performance of that acceleration calculation module 230 is fully occupied, the camera module 120 cannot transfer its image data to the remaining idle acceleration calculation modules 230 for processing. Conversely, when the camera module 120 can dynamically correspond to different acceleration calculation modules 230, the complexity of the overall system increases, but when one acceleration calculation module 230 is fully occupied, the camera module 120 can forward its image data to the remaining idle acceleration calculation modules 230 for processing.
Please refer to fig. 3, which is a block diagram illustrating an acceleration calculation module 230 according to an embodiment of the present application. The acceleration calculation module 230 includes a processor module 310, a hardware acceleration processing module 320, an external memory module 330, and a memory module 340. The processor module 310 may be an embedded processor, such as a microprocessor executing the ARM instruction set or the 8051 instruction set. The processor module 310 may be used to execute a real-time operating system, coupled to the bus module 220, for scheduling and controlling the operations of the acceleration calculation module 230.
The hardware acceleration processing module 320 may be a hardware processing module specifically designed for certain operations. For example, it may include circuits provided by a Field Programmable Gate Array (FPGA). These circuits may be used specifically for performing certain operations of certain algorithms. Using the hardware acceleration processing module 320, such operations can be accomplished in a shorter time than executing the same operations in software on a processor.
The external memory module 330 may be used to upgrade or modify the program executed by the processor module 310. In one embodiment, the external memory module 330 may include a pluggable memory such as an SD memory card for updating the programs executed by the processor module 310 and the data and configuration required for processing, such as the configuration parameters of the mask and algorithm to be described later.
The processor module 310 and the hardware acceleration processing module 320 can use the memory module 340 to store images captured by the camera module 120, images temporarily stored in processing work, and programs and data used for implementing algorithms.
Please refer to fig. 4, which is a block diagram illustrating data stored in the memory module 340 according to an embodiment of the present application. Since the present application limits neither the capacity of the memory module 340 nor the resolution and size of the image captured by the camera module 120, the embodiment shown in fig. 4 is only an example. The memory module 340 may include a plurality of memory blocks; eight memory blocks are illustrated on the left side of fig. 4. As shown on the right side of fig. 4, each memory block may be subdivided into a plurality of regions according to the image resolution and size. Each region may include a plurality of storage areas of the detection-area image size, for storing images captured by the camera module 120, intermediate images produced during processing, and the programs and data used to implement the algorithms, i.e., the masks.
Please refer to fig. 5, which is a flowchart illustrating a machine vision inspection method 500 according to an embodiment of the present disclosure. The machine vision inspection method 500 may be performed by the calculator module 110 shown in fig. 1A or fig. 1B. Some steps or portions of steps may be implemented by the central processor module 210 or the acceleration calculation module 230. Some steps or portions of steps may also be performed by the hardware acceleration processing module 320.
Optional step 505: when the workpiece itself is magnetic or ferromagnetic, magnetic powder may be sprayed on the surface of the magnetic workpiece. Whether a workpiece cracks depends mainly on the machining process; in some processes the workpiece does not crack. For example, the cutting process typically does not produce cracks. Therefore, step 505 need not be performed for every workpiece that undergoes the machine vision inspection method 500. When the surface of the workpiece to be inspected may develop cracks during machining, the cracks form discontinuities on the surface, so the magnetic lines of force on or near the surface of the workpiece are locally distorted at those discontinuities and generate a leakage magnetic field, which attracts the magnetic powder scattered on the surface of the workpiece. Even if the workpiece itself has no magnetism or only weak magnetism, very fine magnetic powder particles easily accumulate in the cracks. After illumination, these magnetic particles stand out in the image of the inspection area, with brightness or measured values higher than those of the surrounding area, so that cracks can be obtained after processing the image of the inspection area with the algorithms described below.
In one embodiment, step 505 may be performed using a magnetic-particle spraying machine: for example, the workpiece is energized or passed through a magnetic field to magnetize it, and the magnetic powder is then sprayed uniformly on the surface of the magnetic workpiece to be inspected. The magnetizing and powder-spraying steps can be adjusted according to the characteristics of the workpiece surface. In another embodiment, the magnetic particles may be of a lighter color or may be fluorescent. After step 505 is performed, the flow may continue to step 508 or step 510.
Optional step 508: a mask, algorithm, and configuration parameters are received. If no masks, algorithms, and configuration parameters are preset, the masks, algorithms, and configuration parameters applied by the machine vision inspection method 500, the scanning algorithm 600, and the scanning algorithm 700, and the parameters applied by the surface defect type determination method 800, may be received in this step. For example, the types of parameters include: (1) mask-related parameters, such as those mentioned in step 530, scanning algorithm 600, or scanning algorithm 700, e.g., positive and negative thresholds; (2) the area threshold parameter or length threshold parameter used in the noise reduction step of step 830; and (3) morphological parameters, such as the area ratio threshold parameter or aspect ratio threshold parameter used in step 860, the latter two types being used by the surface defect type determination method 800. Although only a few of the above parameters are mentioned in the present application, it will be understood by those skilled in the art that the present application does not exhaustively list all parameter values to be configured for each algorithm; more configuration parameters may be needed, especially when variations of the algorithms or derived algorithms are used, and they may likewise be received in this step. In one embodiment, step 508 may receive only the configuration parameters associated with an algorithm, without receiving masks and algorithms. In another embodiment, step 508 may receive only an algorithm that already contains its configuration parameters, so no additional configuration parameters need be received. In addition, the present application does not limit the position of step 508 in the execution sequence; the mask, algorithm, and configuration parameters need only be received before step 530.
Step 510: a video stream of the workpiece is captured. The camera module 120 may be caused to photograph the surface of the workpiece 199 to be inspected. Since the calculation time of the subsequent processing is not fixed, the camera module 120 can be made to capture the video stream continuously, and a detection area image is extracted from the video stream whenever processing capacity is available.
Step 520: an image of a workpiece inspection area having a complex surface is acquired. This step may also include controlling the illumination module 130 and/or moving the robot module, so that the brightness and position of the workpiece detection area image obtained from the video stream are appropriate and match the subsequent combination of mask, algorithm, and parameters. Experience shows that if the illumination can be adjusted so that the surface area to be inspected is lit uniformly and high-brightness reflections are avoided, the probability of detecting defects is improved. In one embodiment, the detection area image may include a color of one dimension. For example, in the embodiment shown in FIG. 4, each pixel value is 8 bits deep and may represent a gray scale value of 0-255. If the captured image of the detection region includes colors of multiple dimensions, such as the three dimensions of red, blue, and green, grayscale conversion can be performed first to convert the multi-dimensional values into single-dimension grayscale values. For example, the gray value is the sum of a first multiple of the red value, a second multiple of the blue value, and a third multiple of the green value, where the first, second, and third multiples are adjustable parameter values.
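The weighted-sum grayscale conversion described above can be sketched as follows. The patent leaves the three multiples as adjustable parameters; the ITU-R BT.601 luma weights are used here only as a plausible default, and the function names are illustrative.

```python
def to_gray(r, g, b, wr=0.299, wg=0.587, wb=0.114):
    """Weighted sum of the three color channels, clamped to the 8-bit range.
    The weight defaults are an assumption (BT.601 luma); the patent treats
    the first/second/third multiples as adjustable parameters."""
    return min(255, int(round(wr * r + wg * g + wb * b)))

def gray_image(rgb_rows):
    """Convert a row-major list of (r, g, b) pixels to 0-255 gray values."""
    return [[to_gray(*px) for px in row] for row in rgb_rows]

# Pure red, green, and blue pixels map to different gray levels:
print(gray_image([[(255, 0, 0), (0, 255, 0), (0, 0, 255)]]))  # [[76, 150, 29]]
```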
Step 530: the image of the workpiece detection area is calculated using a combination of mask, algorithm, and configuration parameters. The mask, algorithm, and configuration parameter combination may be received in optional step 508. Because the surface of the workpiece is complex, if the calculation is applied to the entire workpiece in the inspection area image, the turning line between two adjacent continuous surfaces of the complex surface may be misjudged as a defect. Thus, the complex surface of the workpiece can first be decomposed into individual continuous surfaces, and the calculation is then performed for each continuous surface. Since different continuous surfaces yield different images, a best-performing algorithm and corresponding mask can be assigned to each continuous surface for processing. The application does not prohibit adjacent continuous surfaces from using the same algorithm.
In the present application, the algorithm used in step 530 can be selected from at least four algorithms: a global scan mode, a row scan mode, a column scan mode, and a row-column scan mode. Each of the four algorithms has its own advantages and disadvantages, and no single algorithm is suitable for all workpieces. Thus, the most appropriate algorithm and mask can be selected based on the characteristics of the workpiece, such as surface roughness and the angle of the illumination source.
Empirically, the following conclusions can be drawn: 1) for a workpiece with good surface uniformity and few detection interference items, global scanning is preferably adopted for surface flaw detection; 2) for a workpiece whose surface has good uniformity in the horizontal direction and whose flaws have a certain inclination angle, row scanning is preferably adopted for surface flaw detection; 3) for a workpiece whose surface has good uniformity in the vertical direction and whose scratches have a small inclination angle, column scanning is preferably adopted for surface scratch detection; 4) for workpieces with poor surface uniformity and more detection interference items, row-column scanning is preferably adopted for surface flaw detection. In short, according to the uniformity (e.g. roughness) of the workpiece surface and the characteristics (e.g. inclination angle, area size, scratch type) of the surface flaws, the corresponding flaw detection mode and corresponding mask should be selected and produced to obtain a better detection effect. These four algorithms will be described later. It will be appreciated by those skilled in the art that other kinds of algorithms, such as high-pass filtering, may also be used in addition to these four algorithms or variations thereof; any algorithm can be used together with a corresponding mask as long as it effectively detects surface defects.
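As a rough illustration of the row-scan idea, the sketch below flags pixels that deviate from their row's statistics by more than a positive or negative threshold. This is an assumption about how a row scan might work, not the patent's exact formula: the row-mean baseline and the `pos_thresh`/`neg_thresh` defaults are placeholders.

```python
def row_scan(image, pos_thresh=40, neg_thresh=40):
    """Illustrative row-scan sketch: within each row, flag unmasked
    (non-zero) pixels whose gray value deviates from the mean of the
    row's non-zero pixels by more than a positive or negative threshold.
    Returns a binary map (1 = suspected flaw, 0 = background)."""
    result = []
    for row in image:
        valid = [v for v in row if v != 0]          # masked pixels are zero
        mean = sum(valid) / len(valid) if valid else 0
        result.append([
            1 if v != 0 and (v - mean > pos_thresh or mean - v > neg_thresh)
            else 0
            for v in row
        ])
    return result

img = [
    [0, 120, 118, 250, 121, 0],   # one bright outlier in a uniform row
    [0, 115, 117, 119, 116, 0],   # uniform row: nothing flagged
]
print(row_scan(img))  # [[0, 0, 0, 1, 0, 0], [0, 0, 0, 0, 0, 0]]
```

A column scan would apply the same test down each column, and a row-column scan would combine both passes.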
The mask is a shield for covering the portions not to be processed. That is, the workpiece can be divided by masks into different regions to be processed. Thus, the mask may be a binary map: in the region to be calculated, a first value is set, and in the portion outside the region, a second value is set. Generally, one mask processes only one region. For example, the first value may be 1 and the second value may be 0. Therefore, after each pixel value of the detection area image is multiplied by the corresponding pixel value of the mask, or combined with it by an AND logical operation, the portion not to be processed becomes 0, and the portion to be processed retains its original value. Therefore, in the following description of the algorithms, some parts process only non-zero values; the parts with zero values are not processed, which reflects the shading effect of the mask. After a defect is found in the region to be calculated by the mask, the position of the defect on the surface of the workpiece can be located according to the mask, so that the workpiece can be repaired after the defect is detected.
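The multiplication described above can be sketched directly; with a 0/1 binary mask, pixel-wise multiplication and logical AND give the same result. The function name `apply_mask` is illustrative.

```python
def apply_mask(src, mask):
    """Multiply each pixel by the binary mask value: pixels outside the
    region to be calculated (mask = 0, the second value) become 0, and
    pixels inside (mask = 1, the first value) keep their original value."""
    return [[p * m for p, m in zip(src_row, mask_row)]
            for src_row, mask_row in zip(src, mask)]

src  = [[ 12,  80, 200],
        [  7, 150,  33]]
mask = [[  0,   1,   1],   # 1 = region to process, 0 = shielded
        [  0,   1,   0]]
print(apply_mask(src, mask))   # [[0, 80, 200], [0, 150, 0]]
```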
When two adjacent continuous surfaces of the workpiece need to be inspected and the two continuous surfaces are separated by a turning line, a first set of mask and algorithm can be configured to calculate the first continuous surface and a second set of mask and algorithm to calculate the second continuous surface. In order to avoid the discontinuity caused by using different algorithms on the two sides of the turning line, the region to be calculated of the first set of masks may include the turning line and its neighboring area, and the region to be calculated of the second set of masks may also include the turning line and its neighboring area. In other words, the first set of mask and algorithm calculates the turning line and its vicinity, and the second set of mask and algorithm also calculates the turning line and its vicinity.
In one embodiment, an image of the surface defects is obtained after algorithmic processing. For example, the processed image may be a binary image: a surface defect comprises a contiguous region whose pixel values may be a first value, for example 1, while the pixel values of non-defect areas may be a second value, for example 0. The image may include one or more surface defects. Accordingly, parameters such as the area, length, and height of each surface defect can be determined.
In another embodiment, the depth of the pixel values of the image may exceed one bit.
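The patent does not specify how the contiguous defect regions are measured; a common approach, shown here as a sketch, is connected-component labeling by flood fill. The function name and 4-connectivity choice are assumptions.

```python
from collections import deque

def defect_areas(binary):
    """Label 4-connected regions of 1-pixels in a binary defect image and
    return the pixel area of each region -- one way to obtain per-defect
    parameters such as area from the binary result image."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    areas = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 1 and not seen[y][x]:
                area, queue = 0, deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    area += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                areas.append(area)
    return areas

img = [[1, 1, 0, 0],
       [0, 1, 0, 1],
       [0, 0, 0, 1]]
print(defect_areas(img))   # [3, 2] -- two defects of 3 and 2 pixels
```

Length and height of each defect could be gathered in the same traversal by tracking the minimum and maximum row and column indices of each region.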
Step 540: judge whether imaging of the workpiece is finished. When all image capturing of all faces of the workpiece to be inspected is completed, the flow may proceed to step 550. Otherwise, the process may return to step 520 to continue processing the surfaces to be inspected.
Step 550: each calculation result is combined to find surface defects. When all processing operations for one face of the workpiece are finished, one or more calculation results corresponding to one or more groups of masks and algorithms are obtained. When one face of the workpiece involves a plurality of masks, the processing results of the masks and algorithms, that is, the binary images, may be combined by an OR logical operation. As mentioned above, when two continuous surfaces corresponding to two masks share a turning line, the turning line and its neighboring area are processed twice, once by each of the two algorithms corresponding to the two masks. If a surface defect is found in the turning line and its neighboring area in either processing pass, that surface defect is preserved when the logical operation is performed on the two images.
When combining two images, the region to be considered is the union of the detection regions of the two masks. When a region to be detected is marked with the first value, any pixel having the first value in either processing result belongs to the combined region. After combination, one or more surface defects can be found in the union of the regions to be inspected by the two masks. Flow may then proceed to optional step 560 or directly to step 570.
Optional step 560: the combined image or binary image may be subjected to a Hough transform to determine the morphology of the surface defects. The Hough transform and its derivative algorithms are feature extraction methods widely applied in image analysis, machine vision, and digital image processing. The algorithm addresses the problem that a surface defect in an image may be incomplete, with missing pixel points or noise interference, so that its shape cannot be fitted directly; the Hough transform groups the defect pixels into a set of straight lines, circles, ellipses, or other shapes. Once the shape of a surface defect is found by the Hough transform, the type of the found surface defect can be judged.
Step 570: judge whether the workpiece is qualified according to the found defects. In some embodiments, a qualified workpiece must have no detectable defects. In other embodiments, the manufacturer may accept a workpiece having certain types of surface defects.
Optional step 580: when the workpiece is determined to be unqualified, the alarm module 160 may be made to send an alarm, or the conveyor belt module 140 may be made to kick out the unqualified workpiece.
Next, the four algorithms previously mentioned in step 530 are introduced. Before any of the four algorithms is executed, the original detection area image srcImage of the workpiece is shielded with a mask to obtain a working detection area image temp. Then, in an optional step, pixel values below a certain threshold are regarded as noise; for example, only points with a pixel value greater than 2 are retained in the working detection area image temp. In other words, every pixel of the working detection area image temp is either 0 or greater than 2. When the depth of each pixel value of the masked detection area image exceeds one bit, a third value may appear at the jagged portions of the mask edge. For example, when the depth of the pixel value is eight bits, the first value is 255, the second value is 0, and the pixel value at the jagged portion of the mask edge may be a third value such as 1. This step therefore removes the noise at the edge portion of the mask.
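This masking and edge-denoising step can be sketched in plain Python (a minimal illustration with nested lists; the threshold of 2 follows the description above, while the function name and everything else are our assumptions):

```python
def apply_mask_and_denoise(src, mask, noise_threshold=2):
    """Build the working image temp: keep src pixels only where the
    mask is set, and clear faint mask-edge (jagged) pixels whose
    value does not exceed the noise threshold."""
    rows, cols = len(src), len(src[0])
    temp = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            v = src[i][j] if mask[i][j] else 0
            temp[i][j] = v if v > noise_threshold else 0
    return temp

src = [[5, 1, 9],
       [3, 2, 7]]
mask = [[1, 1, 0],
        [1, 1, 1]]
temp = apply_mask_and_denoise(src, mask)  # [[5, 0, 0], [3, 0, 7]]
```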
The first algorithm is a global scanning mode, which calculates the mean u, variance v, and standard deviation s of the working detection area image temp. The calculation formula of the mean value is as follows:
u = \frac{1}{t} \sum_{i=1}^{m} \sum_{j=1}^{n} a_{ij}, \quad a_{ij} \neq 0 \qquad (1-1)
Equation 1-1 is the calculation equation of the global mean u, where m and n are the number of rows and columns of the working detection area image temp, respectively, a_{ij} is the pixel value of the ith row and jth column when it is not 0, and t is the number of non-zero pixel values. In other words, only the mean of the non-zero pixel values is calculated; pixel values regarded as noise do not participate in the calculation.
v = \frac{1}{t} \sum_{i=1}^{m} \sum_{j=1}^{n} (a_{ij} - u)^2, \quad a_{ij} \neq 0 \qquad (1-2)
Formula 1-2 is the calculation formula of the global variance v, the pixel variance over the global (non-zero) pixel points of the image, which is the square of the global pixel standard deviation s. Only the variance of non-zero pixel values is calculated; pixel values regarded as noise do not participate in the calculation.
s = \sqrt{v} \qquad (1-3)
From equations 1-2, equations 1-3 can be derived, which are the calculation of the global standard deviation s.
Next, a global forward threshold k_{o\_up} and a global negative threshold k_{o\_down} are obtained, and a global upper threshold and a global lower threshold are then calculated from the global mean u and the standard deviation s. The global upper threshold is the sum u + k_{o\_up} \cdot s, and the global lower threshold is the difference u - k_{o\_down} \cdot s. The global upper threshold is therefore necessarily greater than the global lower threshold.
When a pixel value of the original detection area image srcImage is smaller than the global lower threshold, the pixel value at the corresponding position in the processing result detection area image resImage is set to the first value. When a pixel value of the original detection area image srcImage lies between the global upper threshold and the global lower threshold, the pixel value at the corresponding position in the processing result detection area image resImage is set to the second value. When a pixel value of the original detection area image srcImage is greater than the global upper threshold, the pixel value at the corresponding position in the processing result detection area image resImage is set to the difference between the original pixel value and the mean value u.
The number of rows and columns of the processing result detection area image resImage is the same as that of the working detection area image, and the expression of each pixel value is as follows:
r_{ij} = \begin{cases} a_{ij} - u, & a_{ij} > u + k_{o\_up} \cdot s \\ 0, & u - k_{o\_down} \cdot s \le a_{ij} \le u + k_{o\_up} \cdot s \\ 255, & a_{ij} < u - k_{o\_down} \cdot s \end{cases} \qquad (1-4)
In formula 1-4, r_{ij} is the value of the ith row, jth column pixel of the processing result detection area image resImage. In formula 1-4, the first value is 255, the second value is 0, and the difference value is neither the first value 255 nor the second value 0. Accordingly, each pixel value of the processing result detection area image resImage occupies eight bits. The storage bits occupied by each pixel value are not limited in this application.
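The global scanning mode can be sketched in plain Python (a minimal illustration on nested lists; for simplicity the statistics and the comparison both use the working image, whereas the text compares against the original image srcImage, and all names are ours):

```python
import math

def global_scan(temp, k_o_up, k_o_down):
    """Global scan mode (formulas 1-1 to 1-4): statistics over the
    non-zero pixels; pixels above u + k_o_up*s keep their deviation
    from the mean, pixels below u - k_o_down*s become 255 (first
    value), everything else becomes 0 (second value)."""
    vals = [p for row in temp for p in row if p != 0]
    t = len(vals)
    u = sum(vals) / t                                   # formula 1-1
    s = math.sqrt(sum((p - u) ** 2 for p in vals) / t)  # formulas 1-2, 1-3
    upper, lower = u + k_o_up * s, u - k_o_down * s
    res = [[0] * len(temp[0]) for _ in temp]
    for i, row in enumerate(temp):
        for j, a in enumerate(row):
            if a == 0:                     # masked/noise pixel: skip
                continue
            if a > upper:
                res[i][j] = round(a - u)   # difference value
            elif a < lower:
                res[i][j] = 255            # first value
    return res

res = global_scan([[10, 10, 10],
                   [10, 100, 10],
                   [10, 1, 10]], 1.0, 0.5)   # u = 19, s ≈ 28.77
```

Here the bright pixel 100 exceeds the upper threshold and keeps its deviation from the mean, while the dark pixel 1 falls below the lower threshold and becomes 255.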
The second algorithm is a row scanning mode. For each row of the working detection area image temp, the mean u, variance v, and standard deviation s of the ith row of the working detection area image temp are calculated. The calculation formulas are shown below.
u = \frac{1}{t} \sum_{j=1}^{n} a_{ij}, \quad a_{ij} \neq 0 \qquad (2-1)
Formula 2-1 is the calculation formula of the mean u of the ith row of pixel points, where m and n are the number of rows and columns of the image pixel points, respectively, a_{ij} is the pixel value of the ith row, jth column pixel point when it is a non-zero pixel point, and t is the number of non-zero pixel points in the ith row of the image; u is then the average pixel value of the ith row of pixel points of the image. Note that a term is added to the calculation only when the pixel value is non-zero.
v = \frac{1}{t} \sum_{j=1}^{n} (a_{ij} - u)^2, \quad a_{ij} \neq 0 \qquad (2-2)
Formula 2-2 is the calculation formula of the variance v of the ith row: v is the pixel variance of the ith row of pixel points of the image, the square of the pixel standard deviation s of the ith row. Note that a term is added to the calculation only when the pixel value is non-zero.
s = \sqrt{v} \qquad (2-3)
Equations 2-3 are equations for calculating the standard deviation s in row i. Note that the calculation is only added when the pixel value is non-zero.
Next, a row forward threshold k_{row\_up} and a row negative threshold k_{row\_down} are obtained, and the row upper threshold and the row lower threshold are calculated from the row mean u and standard deviation s. The row upper threshold is the sum u + k_{row\_up} \cdot s, and the row lower threshold is the difference u - k_{row\_down} \cdot s. The row upper threshold is therefore necessarily greater than the row lower threshold.
When a pixel value of the original detection area image srcImage is smaller than the row lower threshold, the pixel value at the corresponding position in the processing result detection area image is set to the first value. When a pixel value of the original detection area image srcImage lies between the row upper threshold and the row lower threshold, the pixel value at the corresponding position in the processing result detection area image is set to the second value. When a pixel value of the original detection area image srcImage is greater than the row upper threshold, the pixel value at the corresponding position in the processing result detection area image resImage is set to the difference between the original pixel value and the row mean value u.
The number of rows and columns of the processing result detection area image resImage is the same as that of the working detection area image, and the expression of each pixel point value in the ith row is as follows:
r_{ij} = \begin{cases} a_{ij} - u, & a_{ij} > u + k_{row\_up} \cdot s \\ 0, & u - k_{row\_down} \cdot s \le a_{ij} \le u + k_{row\_up} \cdot s \\ 255, & a_{ij} < u - k_{row\_down} \cdot s \end{cases} \qquad (2-4)
In formula 2-4, r_{ij} is the value of the ith row, jth column pixel in the processing result detection area image resImage. Note that a term is added to the calculation only when the pixel value is non-zero. In formula 2-4, the first value is 255, the second value is 0, and the difference value is neither the first value 255 nor the second value 0. Accordingly, each pixel value of the processing result detection area image resImage occupies eight bits. The storage bits occupied by each pixel value are not limited in this application.
Then, the above process is repeated for the next line until the entire processing result detection area image resImage is obtained.
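The row scanning mode differs from the global mode only in that statistics and thresholds are computed per row; a minimal pure-Python sketch (our names; statistics and comparison both on the working image for simplicity):

```python
import math

def row_scan(temp, k_row_up, k_row_down):
    """Row scan mode (formulas 2-1 to 2-4): for each row, compute the
    mean and standard deviation of its non-zero pixels, then threshold
    that row with u +/- k*s."""
    res = [[0] * len(temp[0]) for _ in temp]
    for i, row in enumerate(temp):
        vals = [p for p in row if p != 0]
        if not vals:                  # fully masked row: skip entirely
            continue
        u = sum(vals) / len(vals)
        s = math.sqrt(sum((p - u) ** 2 for p in vals) / len(vals))
        upper, lower = u + k_row_up * s, u - k_row_down * s
        for j, a in enumerate(row):
            if a == 0:
                continue
            if a > upper:
                res[i][j] = round(a - u)   # difference value
            elif a < lower:
                res[i][j] = 255            # first value
    return res

res = row_scan([[10, 10, 46, 10],
                [20, 20, 20, 20]], 1.0, 1.0)
```

In the first row (u = 19, s ≈ 15.59) the pixel 46 exceeds the row upper threshold; the uniform second row produces no defects.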
The third algorithm is a column scanning mode. For each column of the working detection area image temp, the mean u, variance v, and standard deviation s of the jth column of the working detection area image temp are calculated. The calculation formulas are shown below.
u = \frac{1}{t} \sum_{i=1}^{m} a_{ij}, \quad a_{ij} \neq 0 \qquad (3-1)
Formula 3-1 is the calculation formula of the mean u of the jth column of pixel points, where m and n are the number of rows and columns of the image pixel points, respectively, a_{ij} is the pixel value of the ith row, jth column pixel point when it is a non-zero pixel point, and t is the number of non-zero pixel points in the jth column of the image; u is then the average pixel value of the jth column of pixel points of the image. Note that a term is added to the calculation only when the pixel value is non-zero.
v = \frac{1}{t} \sum_{i=1}^{m} (a_{ij} - u)^2, \quad a_{ij} \neq 0 \qquad (3-2)
The formula 3-2 is a calculation formula of the jth column variance v, wherein v is the pixel variance of the jth column pixel point of the image, namely the square of the jth column standard deviation s. Note that the calculation is only added when the pixel value is non-zero.
s = \sqrt{v} \qquad (3-3)
The formula 3-3 is a formula for calculating the standard deviation s in the jth column. Note that the calculation is only added when the pixel value is non-zero.
Then, a column forward threshold k_{c\_up} and a column negative threshold k_{c\_down} are obtained, and the column upper threshold and the column lower threshold are calculated from the column mean u and standard deviation s. The column upper threshold is the sum u + k_{c\_up} \cdot s, and the column lower threshold is the difference u - k_{c\_down} \cdot s. The column upper threshold is therefore necessarily greater than the column lower threshold.
When a pixel value of the original detection area image srcImage is smaller than the column lower threshold, the pixel value at the corresponding position in the processing result detection area image is set to the first value. When a pixel value of the original detection area image srcImage lies between the column upper threshold and the column lower threshold, the pixel value at the corresponding position in the processing result detection area image is set to the second value. When a pixel value of the original detection area image srcImage is greater than the column upper threshold, the pixel value at the corresponding position in the processing result detection area image resImage is set to the difference between the original pixel value and the column mean value u.
The number of rows and columns of the processing result detection area image resImage is the same as that of the working detection area image, and the expression of each pixel value is as follows:
r_{ij} = \begin{cases} a_{ij} - u, & a_{ij} > u + k_{c\_up} \cdot s \\ 0, & u - k_{c\_down} \cdot s \le a_{ij} \le u + k_{c\_up} \cdot s \\ 255, & a_{ij} < u - k_{c\_down} \cdot s \end{cases} \qquad (3-4)
In formula 3-4, r_{ij} is the value of the ith row, jth column pixel in the processing result detection area image resImage. Note that a term is added to the calculation only when the pixel value is non-zero. In formula 3-4, the first value is 255, the second value is 0, and the difference value is neither the first value 255 nor the second value 0. Accordingly, each pixel value of the processing result detection area image resImage occupies eight bits. The storage bits occupied by each pixel value are not limited in this application.
Then, the above process is repeated for the next column until the entire processing result detection area image resImage is obtained.
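The column scanning mode is the row mode applied along the other axis; a minimal pure-Python sketch (our names) that iterates over columns via transposition:

```python
import math

def column_scan(temp, k_c_up, k_c_down):
    """Column scan mode (formulas 3-1 to 3-4): per-column statistics
    over the non-zero pixels, thresholds u +/- k*s per column."""
    res = [[0] * len(temp[0]) for _ in temp]
    for j, col in enumerate(zip(*temp)):    # iterate over columns
        vals = [p for p in col if p != 0]
        if not vals:                        # fully masked column: skip
            continue
        u = sum(vals) / len(vals)
        s = math.sqrt(sum((p - u) ** 2 for p in vals) / len(vals))
        upper, lower = u + k_c_up * s, u - k_c_down * s
        for i, a in enumerate(col):
            if a == 0:
                continue
            if a > upper:
                res[i][j] = round(a - u)   # difference value
            elif a < lower:
                res[i][j] = 255            # first value
    return res

res = column_scan([[10, 20],
                   [10, 20],
                   [46, 20],
                   [10, 20]], 1.0, 1.0)
```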
The fourth algorithm is a row-column scanning mode. In this mode, the row upper threshold and row lower threshold of each row are obtained as in the row scanning mode, and the column upper threshold and column lower threshold of each column are obtained as in the column scanning mode. In other words, each pixel value has a corresponding row upper threshold, row lower threshold, column upper threshold, and column lower threshold. In one embodiment, the row upper threshold and the column upper threshold are compared and the smaller of the two is taken as the row-column upper threshold, while the row lower threshold and the column lower threshold are compared and the larger of the two is taken as the row-column lower threshold. In another embodiment, the larger of the row upper threshold and the column upper threshold is taken as the row-column upper threshold, and the smaller of the row lower threshold and the column lower threshold is taken as the row-column lower threshold. In other embodiments, the row-column upper threshold may be an average or weighted average of the row upper threshold and the column upper threshold, and the row-column lower threshold may be an average or weighted average of the row lower threshold and the column lower threshold. In short, the row-column upper threshold is calculated from the row upper threshold and the column upper threshold, and the row-column lower threshold is calculated from the row lower threshold and the column lower threshold.
When a pixel value of the original detection area image srcImage is smaller than the row-column lower threshold, the pixel value at the corresponding position in the processing result detection area image is set to the first value. When a pixel value of the original detection area image srcImage lies between the row-column upper threshold and the row-column lower threshold, the pixel value at the corresponding position in the processing result detection area image is set to the second value. When a pixel value of the original detection area image srcImage is greater than the row-column upper threshold, the pixel value at the corresponding position in the processing result detection area image resImage is set to the difference between the original pixel value and the row mean value or the column mean value.
r_{ij} = \begin{cases} a_{ij} - u, & a_{ij} > T_{rc\_up} \\ 0, & T_{rc\_down} \le a_{ij} \le T_{rc\_up} \\ 255, & a_{ij} < T_{rc\_down} \end{cases} \qquad (4-7)

where T_{rc\_up} denotes the row-column upper threshold, T_{rc\_down} denotes the row-column lower threshold, and u is the row mean or the column mean.
In formula 4-7 above, r_{ij} is the value of the ith row, jth column pixel in the processing result detection area image resImage, and a_{ij} is the value of the ith row, jth column pixel in the original detection area image srcImage. Note that a term is added to the calculation only when the pixel value is non-zero.
In the four algorithms described above, if the three negative thresholds k_{o\_down}, k_{row\_down}, and k_{c\_down} are set to 0, the product of the standard deviation s and the corresponding negative threshold need not be subtracted from the mean u, and naturally the comparison of pixel values against the lower threshold is not required, which reduces the amount of calculation. In addition, only non-zero pixel values enter the comparison, i.e., the areas shielded by the mask are not processed. In particular, in the row scanning mode and the column scanning mode, when the pixel values of an entire row or an entire column are all 0, no calculation is required for that row or column.
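The row-column mode can be sketched by combining per-row and per-column statistics; a minimal pure-Python sketch of the first described embodiment (smaller upper threshold, larger lower threshold; the difference value here uses the row mean, and all names are ours):

```python
import math

def _stats(vals):
    u = sum(vals) / len(vals)
    return u, math.sqrt(sum((p - u) ** 2 for p in vals) / len(vals))

def row_column_scan(temp, k_up, k_down):
    """Row-column scan mode: each pixel is compared against a
    row-column upper threshold (the smaller of its row and column
    upper thresholds) and a row-column lower threshold (the larger
    of its row and column lower thresholds)."""
    row_vals = [[p for p in r if p != 0] for r in temp]
    col_vals = [[p for p in c if p != 0] for c in zip(*temp)]
    row_stats = [_stats(r) if r else None for r in row_vals]
    col_stats = [_stats(c) if c else None for c in col_vals]
    res = [[0] * len(temp[0]) for _ in temp]
    for i, row in enumerate(temp):
        for j, a in enumerate(row):
            if a == 0:     # masked pixel: no calculation needed
                continue
            (ur, sr), (uc, sc) = row_stats[i], col_stats[j]
            upper = min(ur + k_up * sr, uc + k_up * sc)
            lower = max(ur - k_down * sr, uc - k_down * sc)
            if a > upper:
                res[i][j] = round(a - ur)  # difference from the row mean
            elif a < lower:
                res[i][j] = 255
    return res

res = row_column_scan([[10, 10, 10],
                       [10, 60, 10],
                       [10, 10, 10]], 1.0, 1.0)
```

The bright pixel 60 exceeds both its row and column upper thresholds and keeps its deviation from the row mean; the uniform background yields no defects.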
In the above four algorithms, in one embodiment, when a pixel value of the processing result detection area image resImage is neither the first value (255) nor the second value (0), the corresponding original pixel value exceeded the upper threshold, and the pixel may be regarded as a defect. In one embodiment, when a pixel value of the processing result detection area image resImage is the first value (255), the corresponding original pixel value was smaller than the lower threshold, and the pixel may also be regarded as a defect. In other words, defects exist at the positions of pixel values greater than the upper threshold and/or smaller than the lower threshold. If the processing result detection area image is converted into a binary image, the pixel values marking defects, i.e., the difference values and/or the first values, can be set to a defect value, and the remaining pixel values can be set to a non-defect value.
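Converting a processing result image into the binary defect map described here is a one-liner in Python (a sketch; 255 is used as the defect value and 0 as the non-defect value):

```python
def to_binary(res):
    """Binarize a processing result image: any pixel that is not the
    second value 0 (either a difference value above the upper
    threshold, or the first value 255 marking a pixel below the
    lower threshold) becomes the defect value 255."""
    return [[255 if p != 0 else 0 for p in row] for row in res]

binary = to_binary([[0, 81, 255],
                    [0, 0, 33]])  # [[0, 255, 255], [0, 0, 255]]
```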
The positive and negative thresholds used during detection influence how deep a surface defect pit must be before it is detected. Therefore, different positive and negative thresholds can be set for the different algorithms according to how the complex surface reflects light, which improves the probability of detecting defects on a complex surface.
As mentioned in the aforementioned optional step 505, when the workpiece itself is magnetic or ferromagnetic, magnetic powder can be scattered on the surface of the magnetic workpiece. The pixel values of crack regions that adsorb the magnetic powder are generally higher than the upper threshold when processed by the algorithms, so cracks on the surface of the workpiece can be detected.
For pits or black-skin areas that the light source cannot illuminate, the corresponding pixel values are usually lower than the lower threshold, so pits or black-skin areas on the surface of the workpiece can be detected. By setting an upper threshold and a lower threshold on a near-normally distributed gray-scale image, the four algorithms provided by the present application can detect abnormal parts at both the upper and the lower end of the distribution.
Please refer to fig. 6, which is a flowchart illustrating a scanning algorithm 600 according to an embodiment of the present application. The scanning algorithm 600 may be part of step 530 shown in FIG. 5 and executed by the calculator module 110 shown in FIG. 1A or FIG. 1B. Some steps or portions of certain steps may be performed by the central processor module 210 or the accelerated computing module 230. Some steps or portions of certain steps may also be performed by the hardware accelerated processing module 320. The scanning algorithm 600 may be one of the previously mentioned global scan mode, row scan mode, or column scan mode. The scanning algorithm 600 may refer to the previous description regarding step 530.
Step 610: and shielding the original detection area image by using a mask.
Optional step 620: clear the pixel values corresponding to the jagged portions of the mask edges. For example, values not greater than the threshold 2 are cleared.
Step 630: and copying the non-zero pixel value to the working detection area image. The work detection area image is the work detection area image temp as described previously.
Step 640: calculate the mean u and standard deviation s for one region of the working detection area image. As mentioned above, when the region is all non-zero pixel values of the working detection area image temp, the scanning algorithm 600 is in the global scanning mode. When the region is the non-zero pixel values of one row of the working detection area image, the scanning algorithm 600 is in the row scanning mode. When the region is the non-zero pixel values of one column of the working detection area image, the scanning algorithm 600 is in the column scanning mode.
Step 650: calculate the upper threshold from the forward threshold, mean u, and standard deviation s corresponding to the region. When the scanning algorithm 600 is in the global scanning mode, the forward threshold is the global forward threshold k_{o\_up}, and the upper threshold is the global mean u plus the product of the standard deviation s and k_{o\_up}. When the scanning algorithm 600 is in the row scanning mode, the forward threshold is the row forward threshold k_{row\_up}, and the upper threshold is the row mean u plus the product of the standard deviation s and k_{row\_up}. When the scanning algorithm 600 is in the column scanning mode, the forward threshold is the column forward threshold k_{c\_up}, and the upper threshold is the column mean u plus the product of the standard deviation s and k_{c\_up}.
Optional step 655: calculate the lower threshold from the negative threshold, mean u, and standard deviation s corresponding to the region. When the scanning algorithm 600 is in the global scanning mode, the negative threshold is the global negative threshold k_{o\_down}, and the lower threshold is the global mean u minus the product of the standard deviation s and k_{o\_down}. When the scanning algorithm 600 is in the row scanning mode, the negative threshold is the row negative threshold k_{row\_down}, and the lower threshold is the row mean u minus the product of the standard deviation s and k_{row\_down}. When the scanning algorithm 600 is in the column scanning mode, the negative threshold is the column negative threshold k_{c\_down}, and the lower threshold is the column mean u minus the product of the standard deviation s and k_{c\_down}. As previously stated, step 655 is optional.
Step 660: and when the pixel value in the area corresponding to the original detection area image is higher than the upper limit threshold value, setting the corresponding pixel value of the processed detection area image as the difference value between the pixel value and the average value. In the aforementioned optional step 505, it is mentioned that when the workpiece itself has magnetism or ferromagnetism, the magnetic powder can be scattered on the surface of the magnetic workpiece. The crack region of the adsorbed magnetic powder is generally higher than the upper threshold value after being detected by the algorithm, so that the crack on the surface of the workpiece can be detected.
Optional step 665: and when the pixel value in the area corresponding to the original detection area image is lower than the lower threshold, setting the corresponding pixel value of the processed detection area image as a first value. The first value may be 255 or 1. When step 655 is not performed, step 665 need not be performed. For the pit or black skin area which cannot be irradiated by the light, the pit or black skin area on the surface of the workpiece can be detected because the corresponding pixel value is usually lower than the lower threshold.
Step 670: and setting the rest corresponding pixel values in the region of the processed detection region image as second values. In other words, the remaining pixel values in the region may be set to the second value except for the pixel values set to the difference and the first value in steps 660 and 665. The second value may be 0.
Step 680: and judging whether all the areas of the image of the work detection area are processed. When the scanning algorithm is a row scanning mode or a column scanning mode, the above-described processing must be performed for each row or each column. When all the regions have been processed, the process ends. Otherwise, flow returns to step 640 for processing for a new region.
Please refer to fig. 7, which is a flowchart illustrating a scanning algorithm 700 according to an embodiment of the present application. The scanning algorithm 700 may be part of step 530 shown in FIG. 5 and executed by the calculator module 110 shown in FIG. 1A or FIG. 1B. Some steps or portions of certain steps may be performed by the central processor module 210 or the accelerated computing module 230. Some steps or portions of certain steps may also be performed by the hardware accelerated processing module 320. The scanning algorithm 700 may be the previously mentioned row and column scanning mode. The first three steps 610-630 can be referred to the description of the embodiment shown in FIG. 6.
Step 740: calculate the mean u and the standard deviation s for each row and each column of the working detection area image. This step is the same as step 640 of FIG. 6, but is performed for every row and every column.

Step 750: calculate the row upper threshold and the row lower threshold from the row positive threshold, row negative threshold, mean u, and standard deviation s corresponding to each row.
Step 755: and calculating an upper limit threshold and a lower limit threshold of the column according to the positive threshold, the negative threshold, the mean u and the standard deviation s of the column corresponding to each column.
Step 760: and calculating the upper limit threshold of the rows and the columns according to the upper limit threshold of the rows and the upper limit threshold of the columns corresponding to the pixel values of the original detection area image.
Step 765: and calculating a row-column lower limit threshold according to the row lower limit threshold and the column lower limit threshold corresponding to the pixel value of the original detection area image.
Step 770: and setting the corresponding pixel value of the processed detection area image as a second value. The second value may be 0.
Step 780: and when the pixel value corresponding to the working detection area image is higher than the row-column upper limit threshold value, setting the pixel value corresponding to the processed detection area image as the difference value between the pixel value and the row mean value or the column mean value. In the aforementioned optional step 505, it is mentioned that when the workpiece itself has magnetism or ferromagnetism, the magnetic powder can be scattered on the surface of the magnetic workpiece. The crack region of the adsorbed magnetic powder is generally higher than the upper threshold value after being detected by the algorithm, so that the crack on the surface of the workpiece can be detected.
Step 785: and when the pixel value corresponding to the working detection area image is lower than the row-column lower limit threshold, setting the pixel value corresponding to the processed detection area image as a first value. The first value may be 255 or 1. For the pit or black skin area which cannot be irradiated by the light, the pit or black skin area on the surface of the workpiece can be detected because the corresponding pixel value is usually lower than the lower threshold.
Step 790: and judging whether all the pixel values are processed. When all pixel values have been processed, the process ends. Otherwise, flow returns to step 760 for processing of the new pixel value.
Please refer to fig. 8, which is a flowchart illustrating a method 800 for determining a surface defect type according to an embodiment of the present application. The method 800 for determining surface defect type may be an embodiment of step 560 shown in fig. 5, and is used to determine whether the surface defect calculated in step 550 is a crack or a non-crack to be searched, so as to further filter out the qualified or unqualified target workpiece. The method 800 for determining the type of surface defect can be performed by the calculator module 110 shown in fig. 1A or fig. 1B. Some steps or portions of certain steps may be performed by the central processor module 210 or the accelerated computing module 230. Some steps or portions of certain steps may also be performed by the hardware accelerated processing module 320.
Step 810: filter out isolated-point noise. When the left and right neighbors, the upper and lower neighbors, or all four neighbors of a pixel have non-defect values while the pixel itself has a defect value, the pixel value of that pixel is set to the non-defect value.
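One reading of this rule, clearing a defect pixel whose four neighbors are all non-defect, can be sketched in plain Python (a minimal illustration; treating out-of-image neighbors as non-defect is our assumption):

```python
def remove_isolated_points(img):
    """Clear defect pixels (255) whose up, down, left and right
    neighbours are all non-defect (0)."""
    rows, cols = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(rows):
        for j in range(cols):
            if img[i][j] == 0:
                continue
            neighbours = [
                img[i - 1][j] if i > 0 else 0,          # up
                img[i + 1][j] if i < rows - 1 else 0,   # down
                img[i][j - 1] if j > 0 else 0,          # left
                img[i][j + 1] if j < cols - 1 else 0,   # right
            ]
            if all(n == 0 for n in neighbours):
                out[i][j] = 0          # isolated point: clear it
    return out

isolated = [[0, 0, 0],
            [0, 255, 0],
            [0, 0, 0]]
pair = [[0, 0, 0, 0],
        [0, 255, 255, 0],
        [0, 0, 0, 0]]
```

The single isolated pixel is cleared, while the two-pixel defect survives because each member has a defect neighbor.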
Step 820: perform the Hough transform. For example, the open-source OpenCV function library includes Hough transform functions. Those skilled in the art will understand that in this step the Hough transform or one of its variants can be applied to generate the Hough image of the Hough detection region. The original detection area image and the Hough detection region image have the same size.
Optional step 830: noise reduction. The contour area of each connected region in the Hough image of the Hough detection region is found, and regions whose contour area is smaller than one threshold, or whose contour length is smaller than another threshold, are removed. In other words, a defect region that is too small can be regarded as noise and ignored. The contour area threshold or length threshold may be received in optional step 508.
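In OpenCV this is typically done with findContours and contourArea; the same idea can be sketched dependency-free in Python by dropping 4-connected regions whose pixel count falls below a threshold (pixel count is used here as a stand-in for the contour area, and all names are ours):

```python
def filter_small_regions(img, min_area):
    """Remove connected regions (4-connectivity) with fewer than
    min_area defect pixels, treating them as noise."""
    rows, cols = len(img), len(img[0])
    out = [row[:] for row in img]
    seen = [[False] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            if img[i][j] == 0 or seen[i][j]:
                continue
            # flood fill to collect one connected region
            stack, region = [(i, j)], []
            seen[i][j] = True
            while stack:
                y, x = stack.pop()
                region.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and img[ny][nx] != 0 and not seen[ny][nx]):
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            if len(region) < min_area:
                for y, x in region:    # too small: regard as noise
                    out[y][x] = 0
    return out

img = [[255, 255, 255, 0],
       [0, 0, 0, 0],
       [0, 0, 255, 0]]
cleaned = filter_small_regions(img, 2)
```

The three-pixel region survives, while the single-pixel region is treated as noise and removed.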
Step 840: the minimum bounding rectangle for each region is set. I.e., find a minimum box to frame each connected region.
Step 850: calculate the ratio of each region from its minimum bounding rectangle. Let the length and width of the minimum bounding rectangle of the ith connected region be x_i and y_i, and let the area inside the contour of the ith connected region be area_i. In one embodiment, the ratio may be the aspect ratio x_i / y_i. In another embodiment, the ratio may be the area ratio (x_i \cdot y_i) / area_i. The larger the ratio, the more likely the connected region is a crack.
Step 860: determine whether the ratio is greater than a threshold. An aspect ratio threshold aspect_ratio or an area ratio threshold area_ratio is set. If the connected region and its contour satisfy

x_i / y_i > aspect\_ratio, \quad \text{or} \quad (x_i \cdot y_i) / area_i > area\_ratio,

the connected region is considered a crack trace.
In other words, if the long side of the minimum bounding rectangle is far larger than the short side, the region can be judged to be a crack; or, if the area of the minimum bounding rectangle is far larger than the area of the connected region, the region can likewise be judged to be a crack. When the determination in step 860 is yes, the flow proceeds to step 870; otherwise the flow proceeds to step 880. The aspect ratio threshold aspect_ratio or area ratio threshold area_ratio may be received in optional step 508.
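The two crack tests can be sketched as a small Python predicate (our names; normalizing by the long and short side is our assumption, so the test is orientation-independent):

```python
def is_crack(x, y, area, aspect_threshold, area_threshold):
    """Step 860 crack test: the bounding rectangle is very elongated,
    or the rectangle's area dominates the region's contour area."""
    long_side, short_side = max(x, y), min(x, y)
    aspect_ratio = long_side / short_side   # x_i / y_i
    area_ratio = (x * y) / area             # (x_i * y_i) / area_i
    return aspect_ratio > aspect_threshold or area_ratio > area_threshold
```

For a thin 100x5 rectangle enclosing a 300-pixel region, the aspect test fires; a compact 10x10 region with contour area 90 passes neither test.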
Step 870: the region is judged to be a crack.

Step 880: the region is judged to be a non-crack; it may be a blemish or dust.
Step 890: determine whether all regions have been processed. If not, the flow returns to step 840; otherwise the flow ends.
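The ratio test of steps 850 to 880 can be sketched as follows. The exact ratio formulas are reconstructed from the description (long side far longer than short side; rectangle area far larger than region area) and are therefore an assumption; the thresholds 0.8 and 0.9 follow the embodiment of figs. 14A to 14D:

```python
def is_crack(rect_w, rect_h, contour_area,
             aspect_ratio_th=0.8, area_ratio_th=0.9):
    """Steps 850-880 (sketch): judge one connected region.

    rect_w, rect_h  - sides x_i, y_i of the region's minimum bounding rectangle
    contour_area    - area_i enclosed by the region's contour
    Both ratios approach 1 for long, narrow regions, i.e. crack-like shapes.
    """
    long_side, short_side = max(rect_w, rect_h), min(rect_w, rect_h)
    aspect = (long_side - short_side) / long_side
    area_ratio = (rect_w * rect_h - contour_area) / (rect_w * rect_h)
    return aspect > aspect_ratio_th or area_ratio > area_ratio_th
```

For example, a 100 x 5 bounding rectangle gives an aspect ratio of 0.95, above the 0.8 threshold, so the region is judged a crack; a 20 x 18 region filling most of its rectangle fails both tests.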
Please refer to fig. 9, which illustrates the results after the steps of a machine vision inspection method 500 according to an embodiment of the present application have been executed. From left to right, fig. 9 shows the original image, three original image regions masked by the upper, middle, and lower masks, the three binary images calculated in step 530, the surface defect map after the Hough transform, and the result map combining the three regions. As can be seen from fig. 9, a crack that can only be found by careful naked-eye inspection clearly appears as a discontinuous crack in the middle of the workpiece after flaw detection by the machine vision inspection method 500 provided by the present application.
Please refer to fig. 10, which illustrates a workpiece diagram and a crack recognition diagram of a machine vision inspection method according to an embodiment of the present application. Fig. 10 omits the intermediate step results shown in fig. 9. It can likewise be seen from fig. 10 that a crack that can only be found by careful naked-eye inspection clearly appears as a discontinuous crack in the middle of the workpiece after inspection by the machine vision inspection method 500 provided in the present application.
Please refer to fig. 11, which is a schematic diagram of a mask according to an embodiment of the present application. One mask mainly contains two pieces of information: the cutting boundary of the detected part and the keying information. Note that the same type of workpiece may have multiple masks, not just one.
Please refer to fig. 12A, which is a diagram illustrating a workpiece according to an embodiment of the present application. Please refer to fig. 12B and 12C, which are diagrams of a mask of the workpiece of fig. 12A, respectively. The two masks divide the workpiece shown in fig. 12A into a middle, and left and right sides.
Please refer to fig. 13A, which is a diagram illustrating a workpiece according to an embodiment of the present application. Please refer to fig. 13B and 13C, which are diagrams of a mask of the workpiece of fig. 13A, respectively. As with fig. 12B and 12C, the two masks divide the workpiece shown in fig. 13A into a middle, and left and right sides.
Please refer to fig. 14A to 14D, which are diagrams illustrating steps of a machine vision inspection method according to an embodiment of the present disclosure after being executed. Fig. 14A is an original. Fig. 14B is a diagram of the result after step 530 has been performed. Fig. 14C is a diagram of the result after step 820 has been performed. FIG. 14D is a diagram of the results after steps 560 or 890 have been performed.
In the embodiment of figs. 14A to 14D, the algorithm is the row-column scanning mode. The row forward threshold k_row_up, the row negative threshold k_row_down, the column forward threshold k_c_up, and the column negative threshold k_c_down are all 2.5. In this embodiment, the scanning algorithm 700 is performed. In step 830, the minimum area threshold is set to 80. In step 860, the aspect ratio threshold aspect_ratio is set to 0.8 and the area ratio threshold area_ratio is set to 0.9.
Please refer to fig. 15A, which is a diagram of a claw pole workpiece according to an embodiment of the present application. In the following embodiments, the claw pole workpiece shown in fig. 15A is used. The following experiments verify whether the system and method provided by the present application are suitable for detecting surface defects (e.g., pits, scale) of a forged claw pole workpiece. In the experiments, 6 claw pole samples with surface defects are selected for surface defect detection tests, so as to verify the correctness, feasibility, and practicability of the visual inspection algorithm applied to claw pole workpiece surface defect detection.
Please refer to fig. 15B, which is a schematic diagram of three masks of one of the claws of the claw-pole workpiece shown in fig. 15A. As can be seen in fig. 15B, each jaw surface of the claw-pole workpiece has three continuous faces separated by two turning lines. Therefore, in the following experiments, the surface of the claw was inspected using three masks. The three masks may overlap at portions of the two turning lines to avoid filtering out surface defects near the turning lines.
Claw pole workpiece experiment one: surface defect detection experiment of the No. 1 claw pole sample (pit at the outer circle)
Please refer to fig. 15C, which is a schematic diagram illustrating the surface defect of the workpiece shown in fig. 15A. The surface defect of this workpiece is a pit at the outer circle, located in the box shown in the figure.
Please refer to fig. 15D, which is a schematic diagram of the detection area of the workpiece shown in fig. 15C. This figure shows the result after masking.
Please refer to fig. 15E, which is a diagram illustrating the detection result of the workpiece shown in fig. 15C. The pit can be clearly seen.
Test conclusion: as shown in figs. 15C to 15E, the detection effect graphs all display the deep pit print at the outer circle of the claw pole, indicating that the present application is applicable to outer circle defect detection of forged claw pole workpieces.
Claw pole workpiece experiment two: surface defect detection experiment of the No. 2 claw pole sample (surface defect at the magnetic chamfer)
Test conclusion: as shown in figs. 16A to 16C, the detection effect graphs all show the deep pit print of the surface defect at the magnetic chamfer of the claw pole, indicating that the present application is applicable to surface defect detection at the magnetic chamfer of forged claw pole workpieces.
Claw pole workpiece experiment three: surface defect detection experiment of the No. 3 claw pole sample (pit at the claw bend)
Test conclusion: as shown in figs. 17A to 17C, the detection effect graphs show the deep pit print at the claw bend of the claw pole, indicating that the present application is applicable to surface defect detection at the claw bend of forged claw pole workpieces.
Claw pole workpiece experiment four: surface defect detection experiment of the No. 4 claw pole sample (pits at the outer circle and the magnetic chamfer)
Test conclusion: as shown in figs. 18A to 18C, the detection effect graphs show the deep pit prints at the outer circle and the magnetic chamfer of the claw pole, indicating that the present application is applicable to surface defect detection at the outer circle and the magnetic chamfer of forged claw pole workpieces.
Claw pole workpiece experiment five: surface defect detection experiment of the No. 5 claw pole sample (pit at the junction of the outer circle and the magnetic chamfer)
Test conclusion: as shown in figs. 18A to 18C, the detection effect graphs show the deep pit print at the junction of the outer circle and the magnetic chamfer of the claw pole, and do not show the shallow pit print at the magnetic chamfer, which means that the present application is not disturbed by shallow surface pits and is applicable to surface defect detection at the junction of the outer circle and the magnetic chamfer of forged claw pole workpieces.
Claw pole workpiece experiment six: surface defect detection experiment of the No. 6 claw pole sample (pits at the magnetic chamfer)
Test conclusion: as shown in figs. 19A to 19C, the detection effect graphs show the deep pit print at the magnetic chamfer of the claw pole, indicating that the present application is applicable to surface defect detection at the magnetic chamfer of forged claw pole workpieces.
Claw pole workpiece experiment seven: surface defect detection experiment of the No. 7 claw pole sample (deep pit at the claw bend)
Test conclusion: as shown in figs. 20A to 20C, the detection effect graphs show the deep pit print at the claw bend of the claw pole, indicating that the present application is applicable to surface defect detection at the claw bend of forged claw pole workpieces.
Claw pole workpiece experiment eight: surface defect detection experiment of the No. 8 claw pole sample (intact surface)
Test conclusion: as shown in figs. 21A to 21C, the detection effect graph of the intact surface of the claw pole sample does not show any abnormal print, i.e. no false detection occurs, indicating that the present application is applicable to surface defect detection of forged claw pole workpieces and that its stability is good.
From the surface defect detection experiment results of the claw pole workpieces, the following conclusions can be drawn:
The detection effect graphs display the deep pit prints at the outer circle, the magnetic chamfer, and the junction of the outer circle and the magnetic chamfer of the claw pole, indicating that the present application is applicable to surface defect detection at the outer circle, the magnetic chamfer, and the junction of the two on forged claw pole workpieces.
For the intact surface or shallow pits of the claw pole, the detection effect graphs do not show any abnormal print, i.e. no false detection occurs, indicating that the present application is applicable to surface defect detection of forged claw pole workpieces and has good stability.
Please refer to fig. 22, which is a diagram illustrating a ball cage workpiece according to an embodiment of the present application. In the following embodiments, the ball cage workpiece shown in fig. 22 is used. The following experiments verify whether the system and method provided by the present application are suitable for detecting surface defects (e.g., cracks, pits) of a ball cage workpiece. Surface pictures of ball cage samples are collected for surface defect detection tests, so as to verify the correctness, feasibility, and practicability of the visual inspection algorithm applied to ball cage workpiece surface defect detection.
Ball cage experiment one a: detection experiment for surface defects of ball cage sample with surface defects
Please refer to fig. 23A, which is a schematic diagram illustrating the surface defect of the ball cage workpiece shown in fig. 22. The surface defect of this workpiece is a crack at the upper edge of the inner side of the ball cage, located in the box shown in the figure.
Please refer to fig. 23B, which is a schematic diagram of the detection area of the workpiece shown in fig. 23A. This figure is a graph of the results after masking.
Please refer to fig. 23C, which is a diagram illustrating a detection result of the workpiece shown in fig. 23A. Multiple cracks can be clearly seen.
Ball cage experiment one B: detection experiment for surface defects of ball cage sample with surface defects
Please refer to fig. 24A, which is a schematic diagram illustrating a surface defect of the ball cage workpiece shown in fig. 22. The surface defect of this workpiece is a crack at the lower edge of the inner side of the ball cage, located in the box shown in the figure.
Please refer to fig. 24B, which is a schematic diagram illustrating the detection area of the workpiece shown in fig. 24A. This figure is a graph of the results after masking.
Please refer to fig. 24C, which is a diagram illustrating a detection result of the workpiece shown in fig. 24A. Multiple cracks can be clearly seen.
Test conclusion: as shown in figs. 23A to 24C, the detection effect graphs of the machine vision flaw detection algorithm display the crack traces on the surface of the ball cage samples, i.e. the machine vision flaw detection algorithm can detect the surface defects of the ball cage samples.
Ball cage experiment two: surface defect detection experiment of a ball cage sample without surface defects
Please refer to fig. 25A, which is a schematic diagram of the surface of the ball cage workpiece shown in fig. 22. The workpiece has no surface defects.
Please refer to fig. 25B, which is a schematic diagram illustrating the detection area of the workpiece shown in fig. 25A. This figure is a graph of the results after masking.
Please refer to fig. 25C, which is a diagram illustrating the detection result of the workpiece shown in fig. 25A. It can be clearly seen that there are no surface defects.
Test conclusion: as shown in figs. 25A to 25C, the detection effect map of the intact surface of the ball cage sample does not show any abnormal print, i.e. no false detection occurs, which indicates that the machine vision flaw detection algorithm can be used for surface defect detection of ball cage workpieces and that its stability is good.
According to ball cage experiments one A and one B, the detection effect graphs of the machine vision flaw detection algorithm display the crack traces on the surface of the ball cage samples, i.e. the machine vision flaw detection algorithm can detect the surface defects of the ball cage samples.
According to ball cage experiment two, the detection effect graph of the intact surface of the ball cage sample does not show any abnormal print, i.e. no false detection occurs, which indicates that the machine vision flaw detection algorithm can be used for surface defect detection of ball cage workpieces and that its stability is good.
According to an aspect of the present application, there is provided a machine vision inspection method, including: acquiring a workpiece detection area image of a workpiece with a complex surface; calculating the workpiece detection area image with a plurality of combinations of masks, algorithms, and parameters to respectively obtain a plurality of processed detection area images; combining the processed detection area images to obtain a detection result detection area image; and judging, according to defects in the detection result detection area image, whether the workpiece surfaces corresponding to the plurality of masks have defects.
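The flow just summarized can be sketched end to end as follows; the function names, the mask-then-algorithm call shape, and merging by pixel-wise maximum are illustrative assumptions, not the source's implementation:

```python
import numpy as np

def inspect(workpiece_image, combos, defect_value=255):
    """Top-level flow (sketch): run each (mask, algorithm, params) combination
    on the workpiece detection area image, merge the processed images with a
    pixel-wise OR, and flag the part if any defect pixel survives.
    """
    result = np.zeros(workpiece_image.shape, dtype=np.uint8)
    for mask, algorithm, params in combos:
        masked = np.where(mask > 0, workpiece_image, 0)   # apply this combo's mask
        processed = algorithm(masked, **params)
        flagged = np.where(processed != 0, defect_value, 0).astype(np.uint8)
        result = np.maximum(result, flagged)              # pixel-wise OR of defects
    return result, bool((result != 0).any())
```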
Further, in order to calculate the surface defects by using the mean and standard deviation of the non-zero values in a certain area, a first algorithm among the plurality of algorithms comprises the steps of: shielding the image of the workpiece detection area by using the mask corresponding to the first algorithm; copying the non-zero pixel value of the shielded workpiece detection area image to a working detection area image; calculating a corresponding mean u and standard deviation s for one or more first regions of the working detection region image; respectively calculating an upper limit threshold corresponding to each first area according to the forward threshold corresponding to each first area, the mean value u and the standard deviation s; and traversing the pixels of each first region of the shielded workpiece detection region image, and when the value of the pixel is greater than the upper limit threshold corresponding to the first region, setting the value of the corresponding pixel of the processed detection region image as the difference value between the pixel and the mean value u, otherwise, setting the pixel as a second value. When the value of the pixel is greater than the upper threshold corresponding to the first region, the pixel may be a crack portion of the workpiece detection region.
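A minimal sketch of this first algorithm with the whole image as the single first region; the 2.5-sigma forward threshold reuses the value quoted for the figs. 14A to 14D embodiment, and the zero second value is an assumption:

```python
import numpy as np

def first_algorithm(image, mask, k_up=2.5, second_value=0):
    """First algorithm (sketch), global first region.

    Non-zero pixels of the masked image form the working detection area;
    a pixel above the upper limit threshold u + k_up * s is a possible
    crack and is written out as its difference from the mean u, while
    every other pixel becomes second_value.
    """
    masked = np.where(mask > 0, image, 0).astype(np.float64)
    work = masked[masked > 0]            # working detection area pixels
    u, s = work.mean(), work.std()       # mean u and standard deviation s
    upper = u + k_up * s                 # upper limit threshold
    out = np.full(masked.shape, float(second_value))
    hit = masked > upper
    out[hit] = masked[hit] - u           # difference between pixel and mean u
    return out
```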
Further, in order to preserve the defects below the lower threshold, the first algorithm of the plurality of algorithms further comprises the steps of: respectively calculating a lower limit threshold corresponding to each first area according to the negative threshold corresponding to each first area, the mean value u and the standard deviation s; and traversing the pixel of each first area of the shielded workpiece detection area image, and setting the value of the pixel corresponding to the processed detection area image as a first value when the value of the pixel is smaller than the lower threshold corresponding to the first area. When the value of the pixel is smaller than the lower threshold corresponding to the first area, the pixel may be a pit or a black skin portion of the workpiece detection area.
Further, to provide a global scanning algorithm, a row scanning algorithm, or a column scanning algorithm, the first region is one of: the whole of the working detection area image; a row of the working detection area image, the row comprising at least one pixel with a non-zero value; and a column of the working detection area image, the column comprising at least one pixel with a non-zero value.
Further, in order to provide a row-column scanning algorithm, a row-column scanning algorithm of the plurality of algorithms comprises the steps of: shielding the workpiece detection area image with the mask corresponding to the row-column scanning algorithm; copying the non-zero pixel values of the shielded workpiece detection area image to a working detection area image; calculating a corresponding row mean u and row standard deviation s for one or more rows of the working detection area image; calculating a corresponding column mean u and column standard deviation s for one or more columns of the working detection area image; respectively calculating the row upper limit threshold corresponding to each row according to the forward threshold, the row mean u, and the row standard deviation s corresponding to that row; respectively calculating the column upper limit threshold corresponding to each column according to the forward threshold, the column mean u, and the column standard deviation s corresponding to that column; and traversing each pixel of the shielded workpiece detection area image, calculating a row-column upper limit threshold from the row upper limit threshold and the column upper limit threshold corresponding to the pixel, and, when the value of the pixel is greater than the corresponding row-column upper limit threshold, setting the value of the corresponding pixel of the processed detection area image to the difference between the pixel value and the row mean or the column mean, otherwise setting it to a second value. When the value of a pixel is greater than the corresponding row-column upper limit threshold, the pixel may belong to a crack portion of the workpiece detection area.
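The row-column scanning steps above can be sketched as follows. Per-row and per-column statistics are taken over non-zero pixels only; the source does not say how the row and column upper limit thresholds merge into one row-column threshold, nor whether the row or the column mean is subtracted, so taking the larger threshold and subtracting the row mean are both assumptions of this sketch:

```python
import numpy as np

def row_column_scan(masked, k_row_up=2.5, k_col_up=2.5, second_value=0):
    """Row-column scanning algorithm (sketch) on a masked detection image."""
    m = masked.astype(np.float64)
    rows, cols = m.shape
    row_u = np.zeros(rows)
    row_up = np.full(rows, np.inf)       # rows with no working pixels never fire
    for r in range(rows):
        nz = m[r][m[r] > 0]              # non-zero working pixels of this row
        if nz.size:
            row_u[r] = nz.mean()
            row_up[r] = nz.mean() + k_row_up * nz.std()
    col_up = np.full(cols, np.inf)
    for c in range(cols):
        nz = m[:, c][m[:, c] > 0]        # non-zero working pixels of this column
        if nz.size:
            col_up[c] = nz.mean() + k_col_up * nz.std()
    out = np.full(m.shape, float(second_value))
    for r in range(rows):
        for c in range(cols):
            # assumed combination: the larger of the two upper limit thresholds
            if m[r, c] > max(row_up[r], col_up[c]):
                out[r, c] = m[r, c] - row_u[r]   # difference from the row mean
    return out
```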
Further, in order to preserve the defects below the lower limit threshold in the row-column scanning algorithm, the row-column scanning algorithm further comprises the steps of: respectively calculating the row lower limit threshold corresponding to each row according to the negative threshold, the row mean u, and the row standard deviation s corresponding to that row; respectively calculating the column lower limit threshold corresponding to each column according to the negative threshold, the column mean u, and the column standard deviation s corresponding to that column; and traversing each pixel of the shielded workpiece detection area image, calculating a row-column lower limit threshold from the row lower limit threshold and the column lower limit threshold corresponding to the pixel, and, when the value of the pixel is smaller than the corresponding row-column lower limit threshold, setting the value of the corresponding pixel of the processed detection area image to a first value. When the value of a pixel is smaller than the corresponding row-column lower limit threshold, the pixel may belong to a pit or black skin portion of the workpiece detection area.
Further, in order to detect one orientation of the workpiece with a plurality of combinations of masks, algorithms, and parameters, a second algorithm of the plurality of algorithms comprises the steps of: shielding the workpiece detection area image with the mask corresponding to the second algorithm; copying the non-zero pixel values of the shielded workpiece detection area image to a working detection area image; calculating a corresponding mean u and standard deviation s for one or more second regions of the working detection area image; respectively calculating an upper limit threshold corresponding to each second region according to the forward threshold corresponding to that second region, the mean u, and the standard deviation s; and traversing the pixels of each second region of the shielded workpiece detection area image, and, when the value of the pixel is greater than the upper limit threshold corresponding to the second region, setting the value of the corresponding pixel of the processed detection area image to the difference between the pixel value and the upper limit threshold, otherwise setting it to the second value, wherein the first region and the second region are each one of the following, and the first region is different from the second region: the whole of the working detection area image; a row of the working detection area image, the row comprising at least one pixel with a non-zero value; and a column of the working detection area image, the column comprising at least one pixel with a non-zero value. When the value of a pixel is greater than the upper limit threshold corresponding to the second region, the pixel may belong to a crack portion of the workpiece detection area.
Further, in order to preserve the defects below the lower threshold, the second algorithm of the plurality of algorithms further comprises the steps of: respectively calculating a lower limit threshold corresponding to each second area according to the negative threshold corresponding to each second area, the mean value u and the standard deviation s; and traversing the pixels of each second area of the shielded workpiece detection area image, and setting the value of the corresponding pixel of the processed detection area image as a first value when the value of the pixel is smaller than the lower threshold corresponding to the second area. When the value of the pixel is smaller than the lower threshold corresponding to the second area, the pixel may be a pit or a black skin portion of the workpiece detection area.
Further, in order to calculate the defects of the overlapping portion between adjacent masks, when the mask corresponding to the first algorithm overlaps the mask corresponding to the second algorithm, the step of combining the first detection area image processed by the first algorithm and the second detection area image processed by the second algorithm further includes performing a logic operation on each pair of corresponding pixels in the overlapping portion of the first detection area image and the second detection area image: when the value of either of the two pixels is the first value or a difference value, the value of the corresponding pixel of the detection result detection area image is set to a defect value, otherwise it is set to a non-defect value.
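This logic operation on the overlapping portion can be sketched as a pixel-wise OR; treating any non-zero pixel of a processed image as "the first value or a difference value" is an assumption of this sketch:

```python
import numpy as np

def combine_overlap(img_a, img_b, defect_value=255, non_defect_value=0):
    """Combine two processed detection area images where their masks overlap:
    if either corresponding pixel flags a defect (a non-zero first value or
    difference value), the result pixel takes the defect value, otherwise
    the non-defect value.
    """
    hit = (img_a != non_defect_value) | (img_b != non_defect_value)
    return np.where(hit, defect_value, non_defect_value)
```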
Further, in order to determine the type of the surface defect, the machine vision inspection method further includes: performing a Hough transform on the detection result detection area image to obtain a Hough detection area image; setting a minimum circumscribed rectangle corresponding to each defect region in the Hough detection area image; and judging whether each defect region is a crack according to its minimum circumscribed rectangle, wherein the step of judging whether the workpiece surfaces corresponding to the plurality of masks have defects further comprises judging that the workpiece surfaces have defects when the Hough detection area image contains a crack.
Further, in order to filter out the isolated points to avoid interfering with the hough transform, the machine vision detection method further includes: before the hough transform step, when the value of a certain pixel of the detection result detection area image is a defect value and the values of a plurality of pixels adjacent to the pixel are non-defect values, the value of the pixel is set as the non-defect value.
Further, in order to filter out noise, the machine vision detection method further includes: the minimum bounding rectangle of the defective region is not set when the defective region satisfies one of the following conditions: the outline area of the defect region is smaller than a threshold value; and the length of the defective region is less than another threshold.
Further, in order to determine whether the defect is a long and narrow crack, the step of judging whether the defect region is a crack further includes: calculating the ratio of the minimum circumscribed rectangle corresponding to the defect region; and judging the defect region to be a crack when the ratio is greater than a threshold, wherein the ratio is one of the following: the ratio of the difference between the minimum circumscribed rectangle area and the defect region area to the minimum circumscribed rectangle area; and the ratio of the difference between the long side and the short side of the minimum circumscribed rectangle to the long side.
Further, in order to detect the crack more easily, the machine vision detection method further includes: before the obtaining step, magnetic powder is scattered on the workpiece detection area.
According to an aspect of the present application, there is provided a machine vision inspection system, comprising: a camera module for shooting a workpiece to be detected; and a calculator module for executing software to implement the following steps: making the camera module shoot the workpiece to be detected; acquiring from the camera module a workpiece detection area image of a workpiece with a complex surface; calculating the workpiece detection area image with a plurality of combinations of masks, algorithms, and parameters to respectively obtain a plurality of processed detection area images; combining the processed detection area images to obtain a detection result detection area image; and judging, according to defects in the detection result detection area image, whether the workpiece surfaces corresponding to the plurality of masks have defects.
Further, in order to calculate the surface defects by using the mean and standard deviation of the non-zero values in a certain area, a first algorithm among the plurality of algorithms comprises the steps of: shielding the image of the workpiece detection area by using the mask corresponding to the first algorithm; copying the non-zero pixel value of the shielded workpiece detection area image to a working detection area image; calculating a corresponding mean u and standard deviation s for one or more first regions of the working detection region image; respectively calculating an upper limit threshold corresponding to each first area according to the forward threshold corresponding to each first area, the mean value u and the standard deviation s; and traversing the pixels of each first region of the shielded workpiece detection region image, and when the value of the pixel is greater than the upper threshold corresponding to the first region, setting the value of the pixel corresponding to the processed detection region image as the difference value between the pixel and the upper threshold, otherwise, setting the pixel as a second value. When the value of the pixel is greater than the upper threshold corresponding to the first region, the pixel may be a crack portion of the workpiece detection region.
Further, in order to preserve the defects below the lower threshold, the first algorithm of the plurality of algorithms further comprises the steps of: respectively calculating a lower limit threshold corresponding to each first area according to the negative threshold corresponding to each first area, the mean value u and the standard deviation s; and traversing the pixel of each first area of the shielded workpiece detection area image, and setting the value of the pixel corresponding to the processed detection area image as a first value when the value of the pixel is smaller than the lower threshold corresponding to the first area. When the value of the pixel is smaller than the lower threshold corresponding to the first area, the pixel may be a pit or a black skin portion of the workpiece detection area.
Further, to provide a global scanning algorithm, a row scanning algorithm, or a column scanning algorithm, the first region is one of: the whole of the working detection area image; a row of the working detection area image, the row comprising at least one pixel with a non-zero value; and a column of the working detection area image, the column comprising at least one pixel with a non-zero value.
Further, in order to provide a row-column scanning algorithm, a row-column scanning algorithm of the plurality of algorithms comprises the steps of: shielding the workpiece detection area image with the mask corresponding to the row-column scanning algorithm; copying the non-zero pixel values of the shielded workpiece detection area image to a working detection area image; calculating a corresponding row mean u and row standard deviation s for one or more rows of the working detection area image; calculating a corresponding column mean u and column standard deviation s for one or more columns of the working detection area image; respectively calculating the row upper limit threshold corresponding to each row according to the forward threshold, the row mean u, and the row standard deviation s corresponding to that row; respectively calculating the column upper limit threshold corresponding to each column according to the forward threshold, the column mean u, and the column standard deviation s corresponding to that column; and traversing each pixel of the shielded workpiece detection area image, calculating a row-column upper limit threshold from the row upper limit threshold and the column upper limit threshold corresponding to the pixel, and, when the value of the pixel is greater than the corresponding row-column upper limit threshold, setting the value of the corresponding pixel of the processed detection area image to the difference between the pixel and the row-column upper limit threshold, otherwise setting it to a second value. When the value of a pixel is greater than the corresponding row-column upper limit threshold, the pixel may belong to a crack portion of the workpiece detection area.
Further, in order to preserve the defects below the lower limit threshold in the row-column scanning algorithm, the row-column scanning algorithm further comprises the steps of: respectively calculating the row lower limit threshold corresponding to each row according to the negative threshold, the row mean u, and the row standard deviation s corresponding to that row; respectively calculating the column lower limit threshold corresponding to each column according to the negative threshold, the column mean u, and the column standard deviation s corresponding to that column; and traversing each pixel of the shielded workpiece detection area image, calculating a row-column lower limit threshold from the row lower limit threshold and the column lower limit threshold corresponding to the pixel, and, when the value of the pixel is smaller than the corresponding row-column lower limit threshold, setting the value of the corresponding pixel of the processed detection area image to a first value. When the value of a pixel is smaller than the corresponding row-column lower limit threshold, the pixel may belong to a pit or black skin portion of the workpiece detection area.
Further, in order to inspect a further region of the workpiece by using the plurality of mask, algorithm and parameter combinations, a second algorithm of the plurality of algorithms comprises the steps of: shielding the workpiece detection area image by using the mask corresponding to the second algorithm; copying the non-zero pixel values of the shielded workpiece detection area image to a working detection area image; calculating a corresponding mean value u and standard deviation s for one or more second areas of the working detection area image; respectively calculating an upper threshold corresponding to each second area according to the forward threshold corresponding to that second area, the mean u and the standard deviation s; and traversing the pixels of each second area of the shielded workpiece detection area image, and, when the value of a pixel is greater than the upper threshold corresponding to the second area, setting the value of the corresponding pixel of the processed detection area image to the difference between the pixel value and the upper threshold, otherwise setting it to the second value, wherein the first area and the second area are each one of the following, and the first area is different from the second area: the whole of the working detection area image; a row of the working detection area image, the row comprising at least one pixel of non-zero pixel value; and a column of the working detection area image, the column comprising at least one pixel of non-zero pixel value. When the value of a pixel is greater than the upper threshold corresponding to the second area, the pixel may belong to a crack portion of the workpiece detection area.
Further, in order to preserve defects below the lower threshold, the second algorithm of the plurality of algorithms further comprises the steps of: respectively calculating a lower threshold corresponding to each second area according to the negative threshold corresponding to that second area, the mean value u and the standard deviation s; and traversing the pixels of each second area of the shielded workpiece detection area image, and setting the value of the corresponding pixel of the processed detection area image to a first value when the value of the pixel is smaller than the lower threshold corresponding to the second area. When the value of a pixel is smaller than the lower threshold corresponding to the second area, the pixel may belong to a pit or black-skin portion of the workpiece detection area.
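Taking the whole working detection area image as the single region, the per-region upper- and lower-threshold passes can be sketched together. The function name and the coefficients `k_pos`/`k_neg` (standing in for the unspecified forward and negative thresholds) are assumptions; note that claim 1 subtracts the mean u rather than the upper threshold, and this sketch follows the upper-threshold variant of the description.

```python
import numpy as np

def global_region_pass(masked, k_pos=2.0, k_neg=2.0,
                       first_value=255.0, second_value=0.0):
    img = masked.astype(np.float64)
    valid = img != 0  # non-zero pixels of the shielded image
    out = np.full(img.shape, second_value)
    vals = img[valid]
    if vals.size == 0:
        return out
    u, s = vals.mean(), vals.std()  # mean u and standard deviation s
    upper, lower = u + k_pos * s, u - k_neg * s
    bright = valid & (img > upper)   # candidate crack pixels
    out[bright] = img[bright] - upper
    dark = valid & (img < lower)     # candidate pit / black-skin pixels
    out[dark] = first_value
    return out
```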
Further, in order to evaluate defects in the overlapping portion between adjacent masks, when the mask corresponding to the first algorithm partially overlaps the mask corresponding to the second algorithm, the step of combining the first detection area image processed by the first algorithm with the second detection area image processed by the second algorithm further comprises performing a logic operation on each pair of corresponding pixels in the overlapping portion of the first detection area image and the second detection area image: when the value of either pixel of the pair is the first value or a difference value, the value of the corresponding pixel of the detection result detection area image is set to a defect value; otherwise it is set to a non-defect value.
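The logic operation above amounts to an OR over the two processed images: a pixel carrying either the first value or a difference value in either image is marked as a defect. A minimal sketch, with the defect and non-defect values (255 and 0) assumed:

```python
import numpy as np

def combine_results(first_img, second_img, defect_value=255, non_defect=0):
    # A pixel of either processed image counts as flagged when its value
    # is not the non-defect value, i.e. it carries the first value or a
    # positive difference value.
    flagged = (first_img != non_defect) | (second_img != non_defect)
    return np.where(flagged, defect_value, non_defect)
```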
Further, in order to determine the type of a surface defect, the calculator module is further configured to: perform a Hough transform on the detection result detection area image to obtain a Hough detection area image; set a minimum circumscribed rectangle for each defect area in the Hough detection area image; and judge whether a defect area is a crack according to its minimum circumscribed rectangle, wherein the step of judging whether the surfaces of the workpiece corresponding to the plurality of masks have defects further comprises determining that the surfaces have defects when the Hough detection area image contains a crack.
Further, in order to filter out isolated points that would interfere with the Hough transform, the calculator module is further configured to: before the Hough transform step, when the value of a pixel of the detection result detection area image is the defect value and the values of the plurality of pixels adjacent to that pixel are all the non-defect value, set the value of that pixel to the non-defect value.
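The isolated-point filter can be sketched as below. The text only says "a plurality of adjacent pixels", so the 8-neighbourhood used here is an assumption, as are the function name and the 255/0 defect values.

```python
import numpy as np

def remove_isolated_points(result, defect=255, non_defect=0):
    # Clear defect pixels whose entire 8-neighbourhood is non-defect,
    # so isolated noise points do not disturb the Hough transform.
    out = result.copy()
    h, w = result.shape
    for r in range(h):
        for c in range(w):
            if result[r, c] != defect:
                continue
            r0, r1 = max(r - 1, 0), min(r + 2, h)
            c0, c1 = max(c - 1, 0), min(c + 2, w)
            neigh = result[r0:r1, c0:c1]
            # the pixel itself is the only defect in its neighbourhood
            if (neigh == defect).sum() == 1:
                out[r, c] = non_defect
    return out
```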
Further, to reduce noise, the calculator module is further configured to: skip setting the minimum circumscribed rectangle of a defect region when the defect region satisfies one of the following conditions: the contour area of the defect region is smaller than a threshold; and the length of the defect region is smaller than another threshold.
Further, in order to determine whether a defect is a long, narrow crack, the step of judging whether the defect area is a crack further comprises: calculating a ratio for the minimum circumscribed rectangle corresponding to the defect area; and judging the defect area to be a crack when the ratio is greater than a threshold, wherein the ratio is one of the following: the ratio of the difference between the minimum circumscribed rectangle area and the area of the defect area to the minimum circumscribed rectangle area; and the ratio of the length of the long side of the minimum circumscribed rectangle to the length of its short side.
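Both candidate ratios can be sketched in a few lines. The threshold value 3.0, the function name, and returning the second ratio alongside the elongation test are illustrative assumptions; the patent leaves the threshold unspecified.

```python
def is_crack(rect_w, rect_h, region_area, ratio_threshold=3.0):
    # Elongation: long side over short side of the minimum bounding
    # rectangle; a long, narrow rectangle suggests a crack.
    long_side = max(rect_w, rect_h)
    short_side = max(min(rect_w, rect_h), 1e-9)  # avoid division by zero
    elongation = long_side / short_side
    # Alternative ratio from the text: the fraction of the rectangle
    # NOT covered by the defect region.
    rect_area = rect_w * rect_h
    fill_deficit = (rect_area - region_area) / rect_area if rect_area else 0.0
    return elongation > ratio_threshold, fill_deficit
```

A 50x5 rectangle is ten times longer than it is wide and is judged a crack; a 10x10 rectangle is not.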
Further, in order to inspect large workpieces or a large number of workpieces, the machine vision inspection system further comprises one of, or any combination of, the following modules: the mechanical arm module is used for bearing the camera module; the illumination module is used for illuminating the workpiece to be detected; and the conveyor belt module is used for carrying the workpiece to be detected and moving it to the corresponding shooting position of the camera module.
Further, in order to accelerate the detection of workpiece surface defects by using programmable hardware, the calculator module further comprises a parallel acceleration computation module composed of field programmable gate arrays (FPGAs), which is used for executing the computation steps of the software.
Further, in order to make cracks easier to detect, magnetic powder is scattered over the workpiece detection area before the workpiece to be detected is photographed.
Due to the adoption of the above scheme, the invention has the following beneficial effects: different thresholds can be set for different areas of a workpiece with a complex surface according to differences in lighting and the like, so that automatic machine vision detection is carried out effectively; a consistent standard for quality inspection is provided; the inspection throughput and efficiency are improved; and the labor and time costs of inspection are reduced.
The above description is only a preferred embodiment of the present invention, and the scope of the present invention is not limited thereto; any equivalent substitution or change made by a person skilled in the art according to the technical solutions and inventive concepts of the present invention shall fall within the scope of the present invention.

Claims (26)

1. A machine vision inspection method, comprising:
acquiring an image of a workpiece detection area with a complex surface;
calculating the workpiece detection area image by using the combination of a plurality of masks, algorithms and parameters to respectively obtain a plurality of processed detection area images;
combining the processed detection area images to obtain a detection result detection area image; and
determining, according to the detection result detection area image, whether the surfaces of the workpiece corresponding to the plurality of masks have defects, wherein
a first algorithm of the plurality of algorithms comprises the steps of:
shielding the image of the non-detection area of the workpiece by using the mask corresponding to the first algorithm;
copying the non-zero pixel value of the shielded workpiece detection area image to a working detection area image;
calculating a corresponding mean value u and standard deviation s for one or more first regions of the image of the working detection region;
respectively calculating an upper limit threshold corresponding to each first area according to the forward threshold corresponding to each first area, the mean value u and the standard deviation s; and
traversing the pixel of each first region of the shielded workpiece detection region image, and when the value of the pixel is greater than the upper threshold corresponding to the first region, setting the value of the pixel corresponding to the processed detection region image as the difference value between the pixel and the mean value u, otherwise, setting the pixel as a second value.
2. The machine-vision inspection method of claim 1, wherein the first algorithm of the plurality of algorithms further comprises the steps of:
respectively calculating a lower limit threshold corresponding to each first area according to the negative threshold corresponding to each first area, the mean value u and the standard deviation s; and
traversing the pixel of each first area of the shielded workpiece detection area image, and setting the value of the pixel corresponding to the processed detection area image as a first value when the value of the pixel is smaller than the lower threshold corresponding to the first area.
3. The machine-vision inspection method of claim 1, wherein the first area is one of:
the whole of the working detection area image;
a row of the working detection area image, the row comprising at least one pixel of non-zero pixel value; and
a column of the working detection area image, the column including at least one pixel of non-zero pixel value.
4. The machine-vision inspection method of claim 1, wherein a row-column scanning algorithm among the plurality of algorithms comprises the steps of:
shielding the image of the workpiece detection area by using the mask corresponding to the row-column scanning algorithm;
copying the non-zero pixel value of the workpiece detection area image which is shielded by the mask and corresponds to the row-column scanning algorithm to a working detection area image corresponding to the row-column scanning algorithm;
calculating a corresponding row mean value u and a corresponding row standard deviation s for one or more rows of the working detection area image corresponding to the row-column scanning algorithm;
calculating a corresponding column mean value u and a corresponding column standard deviation s for one or more columns of the working detection area image corresponding to the row-column scanning algorithm;
respectively calculating the row upper threshold corresponding to each row according to the forward threshold, the row mean u and the row standard deviation s corresponding to that row;
respectively calculating the column upper limit threshold corresponding to each column according to the forward threshold corresponding to each column, the column mean u and the column standard deviation s; and
traversing each pixel of the workpiece detection area image shielded by the mask corresponding to the row-column scanning algorithm, calculating a row-column upper limit threshold according to the row upper limit threshold and the column upper limit threshold corresponding to the pixel, setting the value of the pixel corresponding to the processed detection area image as the difference value between the pixel value and the row-column upper limit threshold when the pixel value is greater than the row-column upper limit threshold, and otherwise, setting the pixel value as a second value.
5. The machine-vision inspection method of claim 4, wherein the row-column scanning algorithm further comprises the steps of:
respectively calculating the row lower threshold corresponding to each row according to the negative threshold corresponding to that row, the row mean u and the row standard deviation s;
respectively calculating the column lower threshold corresponding to each column according to the negative threshold corresponding to that column, the column mean u and the column standard deviation s; and
traversing each pixel of the workpiece detection area image shielded by the mask corresponding to the row-column scanning algorithm, calculating a row-column lower threshold according to the row lower threshold and the column lower threshold corresponding to the pixel, and setting the value of the corresponding pixel of the processed detection area image to a first value when the value of the pixel is smaller than the corresponding row-column lower threshold.
6. The machine-vision inspection method of claim 1, wherein a second algorithm of the plurality of algorithms comprises the steps of:
shielding the image of the workpiece detection area by using the mask corresponding to the second algorithm;
copying the non-zero pixel value of the shielded workpiece detection area image to a working detection area image;
calculating a corresponding mean u and standard deviation s for one or more second regions of the working detection region image;
respectively calculating an upper limit threshold corresponding to each second area according to the forward threshold corresponding to each second area, the mean u and the standard deviation s; and
traversing the pixel of each second region of the masked workpiece detection region image, setting the value of the corresponding pixel of the processed detection region image as the difference value between the pixel value and the mean value u when the pixel value is greater than the upper threshold value corresponding to the second region, otherwise setting the pixel value as the second value,
wherein the first region and the second region are one of the following, and the first region is different from the second region:
the whole of the working detection area image;
a row of the working detection area image, the row comprising at least one pixel of non-zero pixel value; and
a column of the working detection area image, the column including at least one pixel of non-zero pixel value.
7. The machine-vision inspection method of claim 6, wherein the second algorithm of the plurality of algorithms further comprises the steps of:
respectively calculating a lower limit threshold corresponding to each second area according to the negative threshold corresponding to each second area, the mean value u and the standard deviation s; and
traversing the pixel of each second area of the shielded workpiece detection area image, and setting the value of the pixel corresponding to the processed detection area image as a first value when the value of the pixel is smaller than the lower threshold corresponding to the second area.
8. The machine-vision inspection method of claim 6, wherein, when the mask corresponding to the first algorithm partially overlaps the mask corresponding to the second algorithm, the step of combining the first detection area image processed by the first algorithm with the second detection area image processed by the second algorithm further comprises performing a logic operation on each pair of corresponding pixels in the overlapping portion of the first detection area image and the second detection area image, and when the value of either pixel of the pair is the first value or a difference value, setting the value of the corresponding pixel of the detection result detection area image to a defect value, and otherwise to a non-defect value.
9. The machine-vision inspection method of claim 1, further comprising:
carrying out Hough transformation on the detection result detection area image to obtain a Hough detection area image;
setting a minimum circumscribed rectangle corresponding to each defective area in the image of the Hough detection area; and
judging whether the defect area is a crack or not according to the minimum circumscribed rectangle,
the step of determining whether the surfaces of the workpiece corresponding to the plurality of masks have defects further comprises determining that the surfaces of the workpiece corresponding to the plurality of masks have defects when the image of the Hough inspection area includes cracks.
10. The machine-vision inspection method of claim 9, further comprising:
before the Hough transform step, when the value of a certain pixel of the detection result detection area image is a defect value and the values of the plurality of pixels adjacent to that pixel are non-defect values, setting the value of the pixel to the non-defect value.
11. The machine-vision inspection method of claim 9, further comprising: the minimum bounding rectangle of the defective region is not set when the defective region satisfies one of the following conditions:
the outline area of the defect region is smaller than a threshold value; and
the length of the defective region is less than another threshold.
12. The machine vision inspection method of claim 9, wherein the step of determining whether the defect region is a crack further comprises:
calculating the proportion of the minimum circumscribed rectangle corresponding to the defect area; and
when the ratio is greater than the threshold value, judging the defect area to be a crack,
wherein the ratio is one of the following:
the ratio of the difference between the minimum circumscribed rectangle area and the area of the defect area to the minimum circumscribed rectangle area; and
the ratio of the length of the long side of the minimum circumscribed rectangle to the length of its short side.
13. A machine vision inspection system, comprising:
the camera module is used for shooting a workpiece to be detected; and
a calculator module for executing software to perform the steps of:
making the camera module shoot a workpiece to be detected;
acquiring an image of a workpiece detection area with a complex surface from the camera module;
calculating the workpiece detection area image by using the combination of a plurality of masks, algorithms and parameters to respectively obtain a plurality of processed detection area images;
combining the processed detection area images to obtain a detection result detection area image; and
determining, according to the detection result detection area image, whether the surfaces of the workpiece corresponding to the plurality of masks have defects, wherein
a first algorithm of the plurality of algorithms comprises the steps of:
shielding the image of the workpiece detection area by using the mask corresponding to the first algorithm;
copying the non-zero pixel value of the shielded workpiece detection area image to a working detection area image;
calculating a corresponding mean u and standard deviation s for one or more first regions of the working detection region image;
respectively calculating an upper limit threshold corresponding to each first area according to the forward threshold corresponding to each first area, the mean value u and the standard deviation s; and
traversing the pixel of each first region of the shielded workpiece detection region image, and when the value of the pixel is greater than the upper threshold corresponding to the first region, setting the value of the pixel corresponding to the processed detection region image as the difference value between the pixel and the mean value u, otherwise, setting the pixel as a second value.
14. The machine-vision inspection system of claim 13, wherein the first one of the plurality of algorithms further comprises:
respectively calculating a lower limit threshold corresponding to each first area according to the negative threshold corresponding to each first area, the mean value u and the standard deviation s; and
traversing the pixel of each first area of the shielded workpiece detection area image, and setting the value of the pixel corresponding to the processed detection area image as a first value when the value of the pixel is smaller than the lower threshold corresponding to the first area.
15. The machine-vision inspection system of claim 13, wherein the first region is one of:
the whole of the working detection area image;
a row of the working detection area image, the row comprising at least one pixel of non-zero pixel value; and
a column of the working detection area image, the column including at least one pixel of non-zero pixel value.
16. The machine-vision inspection system of claim 13, wherein a row-column scanning algorithm among the plurality of algorithms comprises the steps of:
shielding the image of the workpiece detection area by using the mask corresponding to the row-column scanning algorithm;
copying the non-zero pixel value of the workpiece detection area image which is shielded by the mask and corresponds to the row-column scanning algorithm to a working detection area image corresponding to the row-column scanning algorithm;
calculating a corresponding row mean value u and a corresponding row standard deviation s for one or more rows of the working detection area image corresponding to the row-column scanning algorithm;
calculating a corresponding column mean value u and a corresponding column standard deviation s for one or more columns of the working detection area image corresponding to the row-column scanning algorithm;
respectively calculating the row upper threshold corresponding to each row according to the forward threshold, the row mean u and the row standard deviation s corresponding to that row;
respectively calculating the column upper limit threshold corresponding to each column according to the forward threshold corresponding to each column, the column mean u and the column standard deviation s; and
traversing each pixel of the workpiece detection area image shielded by the mask corresponding to the row-column scanning algorithm, calculating a row-column upper limit threshold according to the row upper limit threshold and the column upper limit threshold corresponding to the pixel, setting the value of the pixel corresponding to the processed detection area image as the difference value between the pixel value and the row-column upper limit threshold when the pixel value is greater than the row-column upper limit threshold, and otherwise, setting the pixel value as a second value.
17. The machine-vision inspection system of claim 16, wherein the row-column scanning algorithm further comprises the steps of:
respectively calculating the row lower threshold corresponding to each row according to the negative threshold corresponding to that row, the row mean u and the row standard deviation s;
respectively calculating the column lower threshold corresponding to each column according to the negative threshold corresponding to that column, the column mean u and the column standard deviation s; and
traversing each pixel of the workpiece detection area image shielded by the mask corresponding to the row-column scanning algorithm, calculating a row-column lower threshold according to the row lower threshold and the column lower threshold corresponding to the pixel, and setting the value of the corresponding pixel of the processed detection area image to a first value when the value of the pixel is smaller than the corresponding row-column lower threshold.
18. The machine-vision inspection system of claim 13, wherein a second algorithm of the plurality of algorithms comprises the steps of:
shielding the image of the workpiece detection area by using the mask corresponding to the second algorithm;
copying the non-zero pixel value of the shielded workpiece detection area image to a working detection area image;
calculating a corresponding mean u and standard deviation s for one or more second regions of the working detection region image;
respectively calculating an upper limit threshold corresponding to each second area according to the forward threshold corresponding to each second area, the mean u and the standard deviation s; and
traversing the pixel of each second region of the masked workpiece detection region image, setting the value of the corresponding pixel of the processed detection region image as the difference value between the pixel value and the upper threshold value when the pixel value is greater than the upper threshold value corresponding to the second region, otherwise setting the pixel value as the second value,
wherein the first region and the second region are one of the following, and the first region is different from the second region:
the whole of the working detection area image;
a row of the working detection area image, the row comprising at least one pixel of non-zero pixel value; and
a column of the working detection area image, the column including at least one pixel of non-zero pixel value.
19. The machine-vision inspection system of claim 18, wherein the second algorithm of the plurality of algorithms further comprises the steps of:
respectively calculating a lower limit threshold corresponding to each second area according to the negative threshold corresponding to each second area, the mean value u and the standard deviation s; and
traversing the pixel of each second area of the shielded workpiece detection area image, and setting the value of the pixel corresponding to the processed detection area image as a first value when the value of the pixel is smaller than the lower threshold corresponding to the second area.
20. The machine-vision inspection system of claim 18, wherein, when the mask corresponding to the first algorithm partially overlaps the mask corresponding to the second algorithm, the step of combining the first detection area image processed by the first algorithm with the second detection area image processed by the second algorithm further comprises performing a logic operation on each pair of corresponding pixels in the overlapping portion of the first detection area image and the second detection area image, and when the value of either pixel of the pair is the first value or a difference value, setting the value of the corresponding pixel of the detection result detection area image to a defect value, and otherwise to a non-defect value.
21. The machine-vision inspection system of claim 13, wherein the calculator module is further configured to:
carrying out Hough transformation on the detection result detection area image to obtain a Hough detection area image;
setting a minimum circumscribed rectangle corresponding to each defective area in the image of the Hough detection area; and
judging whether the defect area is a crack or not according to the minimum circumscribed rectangle,
the step of determining whether the surfaces of the workpiece corresponding to the plurality of masks have defects further comprises determining that the surfaces of the workpiece corresponding to the plurality of masks have defects when the image of the Hough inspection area includes cracks.
22. The machine-vision inspection system of claim 21, wherein the calculator module is further configured to:
before the Hough transform step, when the value of a certain pixel of the detection result detection area image is a defect value and the values of the plurality of pixels adjacent to that pixel are non-defect values, setting the value of the pixel to the non-defect value.
23. The machine-vision inspection system of claim 21, wherein the calculator module is further configured to: the minimum bounding rectangle of the defective region is not set when the defective region satisfies one of the following conditions:
the outline area of the defect region is smaller than a threshold value; and
the length of the defective region is less than another threshold.
24. The machine vision inspection system of claim 21, wherein the step of determining whether the defect region is a crack further comprises:
calculating the proportion of the minimum circumscribed rectangle corresponding to the defect area; and
when the ratio is greater than the threshold value, judging the defect area to be a crack,
wherein the ratio is one of the following:
the ratio of the difference between the minimum circumscribed rectangle area and the area of the defect area to the minimum circumscribed rectangle area; and
the ratio of the length of the long side of the minimum circumscribed rectangle to the length of its short side.
25. The machine vision inspection system of claim 13, further comprising one or any combination of the following modules:
the mechanical arm module is used for bearing the camera module;
the illumination module is used for illuminating the workpiece to be detected; and
and the conveyor belt module is used for moving and bearing the workpiece to be detected to the corresponding shooting position of the camera module.
26. The machine-vision inspection system of claim 13, wherein the calculator module further comprises a parallel accelerated computation module of field programmable gate arrays for performing the computing steps of the software.
CN202010867013.4A 2020-08-26 2020-08-26 Machine vision detection method and system Active CN111833350B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010867013.4A CN111833350B (en) 2020-08-26 2020-08-26 Machine vision detection method and system

Publications (2)

Publication Number Publication Date
CN111833350A true CN111833350A (en) 2020-10-27
CN111833350B CN111833350B (en) 2023-06-06

Family

ID=72918915

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010867013.4A Active CN111833350B (en) 2020-08-26 2020-08-26 Machine vision detection method and system

Country Status (1)

Country Link
CN (1) CN111833350B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113554648A (en) * 2021-09-18 2021-10-26 四川太平洋药业有限责任公司 Production line detection method
CN113724258A (en) * 2021-11-02 2021-11-30 山东中都机器有限公司 Conveyor belt tearing detection method and system based on image processing

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060233434A1 (en) * 2005-04-15 2006-10-19 Akira Hamamatsu Method and apparatus for inspection
JP2012159376A (en) * 2011-01-31 2012-08-23 Jfe Steel Corp Surface defect detector and surface defect detection method
CN103914827A (en) * 2013-09-06 2014-07-09 贵州大学 Method for visual inspection of shortages of automobile sealing strip profile
CN109447989A (en) * 2019-01-08 2019-03-08 哈尔滨理工大学 Defect detecting device and method based on motor copper bar burr growth district


Also Published As

Publication number Publication date
CN111833350B (en) 2023-06-06

Similar Documents

Publication Publication Date Title
CN109141232B (en) Online detection method for disc castings based on machine vision
CN109772733B (en) Defect bamboo block detecting and sorting device and method based on vision
CN109872300B (en) Visual saliency detection method for appearance defects of friction plate
CN116758061B (en) Casting surface defect detection method based on computer vision
WO2017141611A1 (en) Defect detection apparatus, defect detection method, and program
CN109900711A (en) Workpiece, defect detection method based on machine vision
CN107490582B (en) Assembly line workpiece detection system
CN111968100B (en) Machine vision detection method and system
CN108257171A (en) Car radar assembling aperture detection method based on light vision
CN111539927B (en) Detection method of automobile plastic assembly fastening buckle missing detection device
CN113077437B (en) Workpiece quality detection method and system
CN114881915A (en) Symmetry-based mobile phone glass cover plate window area defect detection method
CN113177924A (en) Industrial production line product flaw detection method
CN106529551B (en) Intelligent recognition counting detection method for round-like objects in packaging industry
CN111833350A (en) Machine vision detection method and system
CN114280075A (en) Online visual inspection system and method for surface defects of pipe parts
CN114719749A (en) Metal surface crack detection and real size measurement method and system based on machine vision
CN115931898A (en) Visual detection method and device for surface defects of ceramic substrate and storage medium
CN114897881A (en) Crystal grain defect detection method based on edge characteristics
Tang et al. Surface inspection system of steel strip based on machine vision
Hashmi et al. Computer-vision based visual inspection and crack detection of railroad tracks
JP2017166957A (en) Defect detection device, defect detection method and program
CN110516725B (en) Machine vision-based wood board stripe spacing and color detection method
CN115753791A (en) Defect detection method, device and system based on machine vision
JP7469740B2 (en) Belt inspection system and belt inspection program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant