CN114119535A - Laser cleaning effect on-line monitoring method based on visual detection - Google Patents


Info

Publication number
CN114119535A
Authority
CN
China
Prior art keywords: image, gray, unwashed, coating, processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111401432.XA
Other languages
Chinese (zh)
Inventor
范鑫
孙涛
邓阳俊
艾克南
郭太行
谢道秀
Current Assignee
Shanghai Hangyi High Tech Development Research Institute Co ltd
Original Assignee
Shanghai Hangyi High Tech Development Research Institute Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Hangyi High Tech Development Research Institute Co ltd
Priority to CN202111401432.XA
Publication of CN114119535A
Legal status: Pending


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/20 - Image enhancement or restoration by the use of local operators
    • G06T5/70
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/11 - Region-based segmentation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30108 - Industrial image inspection
    • G06T2207/30164 - Workpiece; Machine component

Abstract

The invention relates to a laser cleaning effect on-line monitoring method based on visual detection. The method receives a cleaning completion signal and acquires an RGB color image of the current surface of the workpiece being cleaned; performs graying processing on the acquired RGB color image to generate a grayscale image; identifies and divides the current grayscale image of the workpiece into a plurality of sub-region images according to the gray-level gradient; calculates the average gray value of each sub-region image; compares the average gray values of all sub-region images one by one with a preset workpiece gray standard value table; generates an atlas comprising all unwashed sub-region images according to the comparison result; performs identification analysis on each unwashed sub-region image to generate a fast scan path over the whole unwashed sub-region atlas; and feeds the fast scan path back to the laser cleaning device to perform the cleaning operation again, repeating until no unwashed region atlas is generated. The cleaning effect can thus be evaluated quickly and accurately, improving working efficiency.

Description

Laser cleaning effect on-line monitoring method based on visual detection
Technical Field
The invention relates to a laser cleaning technology, in particular to a laser cleaning effect online monitoring method based on visual detection.
Background
Laser cleaning is a process of irradiating the surface of a workpiece with laser light so that contaminants, rust and the like absorb the laser energy and are instantly stripped off or evaporated, and are thereby effectively removed from the base material. Compared with traditional cleaning modes such as mechanical friction cleaning and chemical corrosion cleaning, laser cleaning is a green and efficient cleaning mode with obvious advantages, so it has gradually gained acceptance in industrial production and domestic applications.
In the laser cleaning process, different cleaning parameters lead to different quality of the cleaned product, so timely confirmation is needed. Currently, the results of laser cleaning are mostly checked and evaluated manually: production personnel inspect the cleaned workpieces one by one, which is time-consuming, labor-intensive and inefficient.
Disclosure of Invention
Aiming at the problem of automatically monitoring the whole laser cleaning process on line, an on-line monitoring method for the laser cleaning effect based on visual detection is provided.
The technical scheme of the invention is as follows: a laser cleaning effect online monitoring method based on visual detection comprises the following steps:
receiving a cleaning completion signal, and acquiring an RGB color image of the current surface of the workpiece to be cleaned;
carrying out graying processing on the obtained RGB color image to generate a gray image;
identifying and dividing the current grayscale image of the workpiece into a plurality of sub-region images according to the gray-level gradient;
respectively calculating the average gray value of each subregion image;
comparing the average gray values of all the subarea images with a preset cleaning workpiece gray standard value table one by one; generating an atlas comprising all the images of the unwashed subareas according to the comparison result;
performing identification analysis on each unwashed subregion image to generate a rapid scanning path of the whole unwashed subregion image set;
feeding back the rapid scanning path to a laser cleaning device to carry out cleaning operation again;
repeating the above steps until no unwashed region atlas is generated.
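The loop described by these steps can be sketched as follows (a minimal Python sketch; every function name here is an assumed placeholder for the corresponding step of the method, supplied by the caller, not part of the disclosure):

```python
def monitor(acquire_rgb, to_gray, segment, avg_gray, compare, plan_path, clean):
    """Run the on-line monitoring loop until no unwashed region atlas is produced."""
    while True:
        rgb = acquire_rgb()                 # triggered by a cleaning-completion signal
        subregions = segment(to_gray(rgb))  # split the grayscale image by gradient
        # keep only the sub-regions whose average gray value matches the
        # current (i.e. not yet removed) coating
        unwashed = [r for r in subregions if compare(avg_gray(r)) == "unwashed"]
        if not unwashed:                    # no unwashed atlas generated: done
            return
        clean(plan_path(unwashed))          # re-clean along the fast scan path
```

The loop terminates exactly when a full pass produces no unwashed sub-region, matching the "until no unwashed region atlas is generated" condition above.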
Further, the method for generating the gray standard value table of the workpiece to be cleaned comprises the following steps:
Coating numbering is carried out according to the structure and position sequence of the coating of the workpiece to be cleaned;
acquiring coating standard RGB images corresponding to the coating numbers one by one;
sequentially carrying out noise reduction processing and graying processing on each coating standard RGB image to obtain a standard grayscale image; for each standard gray image, identifying gray values of all pixels, performing weighted average processing on the gray values of all the pixels, and calculating to generate a gray standard value;
and recording the corresponding relation between the coating number and the corresponding gray standard value to generate a gray standard value table.
Further, comparing the average gray values of the sub-region images one by one with the preset workpiece gray standard value table includes comparing with the gray standard value of the coating currently to be cleaned, and comparing with the gray standard values of all coatings that remain uncleaned after the current cleaning pass.
Further, the method of comparing the average gray values of the sub-region images one by one with the preset workpiece gray standard value table comprises: judging whether the coating is cleaned, not cleaned, or ablated according to which coating's gray standard value the comparison result falls within the allowable deviation range of, wherein an allowable deviation value is preset for the gray standard value corresponding to each coating.
Further, generating the fast scan path over the whole unwashed sub-region atlas comprises the following steps:
for each subarea image in the unwashed area image set, identifying the gray value of each pixel in the subarea image;
comparing the gray value of each pixel with the gray standard value of the coating to be cleaned after the cleaning;
counting the number of unwashed pixels in each subregion image according to the comparison result;
sequencing all the subarea images in the unwashed area image set according to the number of the unwashed pixels;
and generating the fast scanning path according to the sequencing result.
Further, when the coating is judged to be ablated, the treatment method comprises: generating a processing interruption signal and an alarm signal, and feeding the processing interruption signal back to the laser cleaning device to stop it; meanwhile, the alarm signal is fed back to the alarm device, so that the alarm device prompts the staff to replace the ablated workpiece and check the laser cleaning device.
The system for realizing the laser cleaning effect on-line monitoring method based on visual detection comprises:
an information acquisition module, used for receiving the cleaning completion signal and the RGB color image;
a processing module, used for generating an image acquisition signal; for performing graying processing on the RGB color image to generate a grayscale image; for processing the real-time grayscale image into sub-region images; for calculating the average gray value; for processing and generating the unwashed region atlas; and for generating the fast scan path;
a storage module, used for storing the processing method and the various types of information and data generated by the processing module;
and a signal sending module, used for sending the image acquisition signal and the fast scan path.
The invention has the beneficial effects that: the laser cleaning effect on-line monitoring method based on visual detection realizes quick and accurate evaluation on the cleaning effect, thereby improving the working efficiency.
Drawings
FIG. 1 is a schematic diagram illustrating a process of generating a gray scale standard value table according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an online monitoring process according to an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a fast scan path generation process for an unwashed region atlas according to an embodiment of the invention;
fig. 4 is a block diagram of a laser cleaning effect online monitoring system in an embodiment of the present invention.
Reference numerals: 110. a signal acquisition module; 120. a processing module; 130. a storage module; 140. and a signal sending module.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments. The present embodiment is implemented on the premise of the technical solution of the present invention, and a detailed implementation manner and a specific operation process are given, but the scope of the present invention is not limited to the following embodiments.
The embodiment of the application discloses a laser cleaning effect on-line monitoring method based on visual detection, which comprises a step S1 of establishing a standard gray value table before cleaning starts and an on-line monitoring step S2 performed after each cleaning pass. The method is explained with reference to FIG. 1 and FIG. 2.
As shown in fig. 1, a schematic flow chart for generating a gray scale standard value table specifically includes the following steps:
s10: a coating number table is obtained.
Specifically, the coating number table is input by the operator according to the coating structure of the workpiece to be cleaned. The table records a number for each coating, arranged in the positional order of the coatings on the workpiece, and the total count of coating numbers equals the total number of coatings on the workpiece. For example, a coating assigned number 1 is the outermost layer of the uncleaned workpiece and will therefore be cleaned first, while a coating assigned number 4 is the fourth layer of the uncleaned workpiece from the outside in.
S11: and acquiring standard RGB images corresponding to the coating numbers one by one.
Specifically, the worker places standard parts carrying all the coatings of the workpiece to be cleaned within the field of view of a preset image acquisition device, which in this embodiment is a CCD line-scan camera. The CCD line-scan camera scans each coating of the standard part in sequence under a standard light source with a wavelength in the range of 600 nm to 800 nm, thereby obtaining an RGB standard image of each coating. Before each scan, the worker inputs the corresponding coating number. After acquiring the RGB standard image fed back in real time by the CCD line-scan camera, the intelligent terminal stores the RGB standard image together with its coating number.
S12: and establishing a two-dimensional rectangular coordinate system, and identifying the coordinate of each pixel point.
Specifically, the standard RGB image is a rectangular chart, and for each standard RGB image, two adjacent rectangular sides of the standard RGB image are taken as a coordinate axis X and a coordinate axis Y, and an intersection point of the coordinate axis X and the coordinate axis Y is taken as an origin, and coordinates (X, Y) of each pixel are identified.
S13: and carrying out noise reduction processing on the standard RGB image.
Specifically, for each standard RGB image, a gaussian filter is used to eliminate image noise, and the two-dimensional gaussian function expression of the gaussian filter used in this embodiment is:
G(x, y) = (1 / (2πδ²)) · exp(−(x² + y²) / (2δ²))
where (x, y) is the coordinate of any one of the pixels identified and generated in S12, and δ is the standard deviation of the normal distribution.
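As an illustration, a discrete kernel sampled from this two-dimensional Gaussian and applied by direct convolution might look as follows (a minimal NumPy sketch; the kernel size, the function names and the edge-padding choice are assumptions, not part of the disclosure):

```python
import numpy as np

def gaussian_kernel(size=5, delta=1.0):
    """Sample G(x, y) = exp(-(x^2 + y^2) / (2*delta^2)) / (2*pi*delta^2)
    on a size x size grid centred at the origin, normalised to sum to 1."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    g = np.exp(-(x**2 + y**2) / (2 * delta**2)) / (2 * np.pi * delta**2)
    return g / g.sum()

def denoise(gray, size=5, delta=1.0):
    """Smooth a 2-D image by direct convolution with the Gaussian kernel,
    using edge padding so the output keeps the input shape."""
    k = gaussian_kernel(size, delta)
    half = size // 2
    padded = np.pad(gray.astype(float), half, mode="edge")
    out = np.zeros_like(gray, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += k[dy, dx] * padded[dy:dy + gray.shape[0], dx:dx + gray.shape[1]]
    return out
```

Normalising the sampled kernel keeps the overall brightness of the image unchanged, which matters here because the later steps compare absolute gray values against a standard table.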
S14: and generating a standard gray-scale image.
Specifically, each standard RGB image subjected to noise reduction processing is subjected to graying processing, so that a standard grayscale image of each coating is obtained, and the standard grayscale images correspond to coating numbers one to one.
S15: and calculating to generate a gray standard value.
Specifically, for each standard grayscale image, the gray value A_n of each pixel is identified, where A_n is obtained from the grayscale conversion function A_n = ((R_n × 299) + (G_n × 587) + (B_n × 114)) / 1000, in which R_n, G_n and B_n are respectively the R, G, B tristimulus values of that pixel.
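This integer-weighted conversion can be sketched as follows (a minimal NumPy sketch; the function name `to_gray` and the uint8 H × W × 3 image layout are assumptions):

```python
import numpy as np

def to_gray(rgb):
    """Convert an H x W x 3 uint8 RGB image to grayscale using
    A_n = ((R_n * 299) + (G_n * 587) + (B_n * 114)) / 1000."""
    r = rgb[..., 0].astype(np.int64)
    g = rgb[..., 1].astype(np.int64)
    b = rgb[..., 2].astype(np.int64)
    return ((r * 299 + g * 587 + b * 114) // 1000).astype(np.uint8)
```

Since 299 + 587 + 114 = 1000, a pure-white pixel maps exactly to 255 and a pure-black pixel to 0, so the conversion preserves the full dynamic range.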
The gray values A_n of all pixels in the standard grayscale image are subjected to weighted average processing to obtain the gray standard value A_si of the coating corresponding to that image. The specific calculation formula is:

A_si = (Σ_n f_n · A_n) / N, where N = Σ_n f_n

where i denotes the coating number, f_n is the number of pixels whose gray value is A_n, N is the total number of pixels in the standard grayscale image, and A_si is the gray standard value corresponding to the coating numbered i; for example, A_s3 is the gray standard value corresponding to the third coating from the outside in on the uncleaned workpiece. It should be noted that the gray values corresponding to different coatings differ markedly, and performing weighted averaging over the pixel gray values A_n eliminates interference from occasional recognition errors, so the gray standard value A_si has good reliability.
S16: and generating a gray standard value table.
Specifically, the coating number table obtained in S10 and the gray standard values A_si generated in S15 are combined into a gray standard value table, which records the correspondence between each coating number i and its gray standard value A_si.
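Steps S15 and S16 together can be sketched as follows (a minimal NumPy sketch; the function names and the dict representation of the table are assumptions). Note that with pixel-count weights f_n, the weighted average reduces to the arithmetic mean of all pixel gray values:

```python
import numpy as np

def gray_standard_value(gray):
    """Weighted average of the pixel gray values: each distinct value A_n
    is weighted by its pixel count f_n, i.e. A_si = sum(f_n * A_n) / N."""
    values, counts = np.unique(gray, return_counts=True)
    return float((values.astype(float) * counts).sum() / counts.sum())

def build_standard_table(standard_images):
    """Map each coating number i to its gray standard value A_si,
    given a dict {coating_number: standard_grayscale_image}."""
    return {i: gray_standard_value(img) for i, img in standard_images.items()}
```

A dict keyed by coating number mirrors the table's "coating number i to gray standard value A_si" correspondence directly.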
After receiving the cleaning completion signal, the process proceeds to S2.
As shown in the online monitoring flow diagram of fig. 2, S2 specifically includes the following steps:
s20: and generating and feeding back an image acquisition signal so as to acquire an RGB color image of the current surface of the workpiece.
Specifically, before the workpiece is cleaned, a worker inputs a current coating number k corresponding to the current coating, and in other embodiments, the current coating number k may also be automatically identified by the cooperation of the image acquisition device. After the first cleaning operation of any coating is completed, the preset laser cleaning device can feed back a cleaning completion signal. After receiving the cleaning completion signal, the terminal generates and feeds back an image acquisition signal to the preset image acquisition device, so that the preset image acquisition device acquires the RGB color image of the current surface of the workpiece, and the specific acquisition mode is similar to S11, which is not described herein again.
S21: and establishing a two-dimensional rectangular coordinate system, and identifying the coordinate of each pixel point.
Specifically, similar to S12, the RGB color image is a rectangular chart, two adjacent rectangular sides of the RGB color image are taken as a coordinate axis X and a coordinate axis Y, and an intersection of the coordinate axis X and the coordinate axis Y is taken as an origin, and coordinates (X, Y) of each pixel are identified.
S22: and performing noise reduction processing on the RGB color image.
The specific processing procedure refers to S13, and is not described here.
S23: the processing generates a grayscale image.
Specifically, the RGB color image subjected to noise reduction is subjected to graying processing, thereby obtaining the corresponding grayscale image. The coordinates of each pixel in the grayscale image match the coordinates identified in S21. In S14 and S23, converting the RGB image into a grayscale map mainly reduces the workload of subsequent data processing, thereby improving processing efficiency.
S24: and calculating to generate a gradient image.
Specifically, the gray-scale image generated in S23 is calculated by the Sobel operator or Laplacian operator to generate a corresponding gradient image.
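The Sobel operator mentioned here can be sketched as follows (a minimal NumPy sketch; the 3 × 3 kernels are the standard Sobel kernels, but the helper names and edge padding are assumptions):

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def convolve3(img, kernel):
    """Apply a 3x3 kernel (as correlation) with edge padding."""
    p = np.pad(img.astype(float), 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def sobel_gradient(gray):
    """Gradient magnitude sqrt(Gx^2 + Gy^2) of a 2-D grayscale image."""
    gx = convolve3(gray, SOBEL_X)
    gy = convolve3(gray, SOBEL_Y)
    return np.hypot(gx, gy)
```

The gradient image is near zero inside a uniform coating region and large at coating boundaries, which is what makes it a suitable input for the segmentation in S25.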
S25: and calculating to generate a subarea image.
Specifically, the gradient image generated by the processing in S24 is segmented by using a watershed algorithm, so that the gradient image is segmented into a plurality of sub-region images, and the surface of the workpiece is segmented according to the shape and color of the surface of the workpiece, thereby facilitating the subsequent further processing.
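The disclosure uses the watershed algorithm for this segmentation. As a simplified stand-in illustrating the same idea of splitting the gradient image into sub-regions bounded by high-gradient edges, the sketch below labels 4-connected regions of low gradient (the threshold and the flood-fill labelling are assumptions, not the patent's exact algorithm):

```python
from collections import deque
import numpy as np

def label_low_gradient_regions(grad, threshold):
    """Assign a positive label to each 4-connected region whose gradient is
    below `threshold`; high-gradient (edge) pixels keep label 0."""
    h, w = grad.shape
    labels = np.zeros((h, w), dtype=int)
    current = 0
    for sy in range(h):
        for sx in range(w):
            if grad[sy, sx] < threshold and labels[sy, sx] == 0:
                current += 1                     # start a new sub-region
                queue = deque([(sy, sx)])
                labels[sy, sx] = current
                while queue:                     # breadth-first flood fill
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and grad[ny, nx] < threshold and labels[ny, nx] == 0:
                            labels[ny, nx] = current
                            queue.append((ny, nx))
    return labels
```

Each positive label then corresponds to one sub-region image whose average gray value is computed in S26.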
S26: the processing generates an atlas of uncleaned regions.
Specifically, referring to fig. 2, S26 includes:
s261: and calculating the average gray value corresponding to each subregion image.
Specifically, for each sub-region image, referring to S15, the gray values of all pixels in the sub-region image are identified and subjected to weighted average calculation, thereby obtaining the average gray value Ā of the sub-region image.
S262: the average gray value Ā of each sub-region image is compared with the gray standard value table.
Specifically, the current coating number k input in S20 is first identified; then, for each sub-region image, the average gray value Ā corresponding to that sub-region image is compared with all the gray standard values in the gray standard value table. On one hand, processing each sub-region image separately gives higher accuracy than processing the whole grayscale image; on the other hand, it also helps to locate the specific position of the unwashed area on the workpiece, thereby facilitating subsequent further processing.
If the average gray value Ā of a sub-region image is identified as satisfying condition one:

A_sk − Δ ≤ Ā ≤ A_sk + Δ

where Δ is the preset allowable deviation, an unwashed label is added to that sub-region image. Here A_sk is the gray standard value of the coating currently to be cleaned; when the average gray value of any sub-region image is close to A_sk, the area corresponding to that sub-region image has not been cleaned and the current coating still remains.
If the average gray value Ā is identified as satisfying condition two:

A_s(k+1) − Δ ≤ Ā ≤ A_s(k+1) + Δ

no treatment is done. Here A_s(k+1) is the gray standard value of the coating to be cleaned next, i.e. the coating that should be exposed after the current cleaning pass; since this coating is exposed once the current coating is cleaned away, when the average gray value of any sub-region image is close to A_s(k+1), the area corresponding to that sub-region image has been cleaned.
If the average gray value Ā is identified as satisfying condition three:

A_s(k+b) − Δ ≤ Ā ≤ A_s(k+b) + Δ

the process proceeds to S263. Here b ≥ 2 and k + b does not exceed the total number of coatings on the workpiece; A_s(k+b) is the gray standard value of any coating of the workpiece that has not yet been cleaned, other than the coating that should now be exposed. When the average gray value of any sub-region image is close to A_s(k+b), it indicates that the coating that should be exposed has been ablated.
When the comparison of all sub-region images is completed:

if there exists an average gray value Ā satisfying condition one and no average gray value satisfying condition three, the process proceeds to S264;

if the average gray values Ā of all sub-region images satisfy condition two, the process proceeds directly to S28;

if any average gray value Ā satisfies condition three, the process proceeds to S263.
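The three-way comparison of S262 can be sketched as follows (a minimal Python sketch; the function name, the list representation of the table, and the "unknown" fallback for values matching no condition are assumptions):

```python
def classify_subregion(avg_gray, standards, k, delta):
    """Compare a sub-region's average gray value against the standard table.
    standards[i - 1] is A_si for coating number i; k is the current coating.
    Returns 'unwashed' if A_sk - delta <= avg <= A_sk + delta   (condition one),
            'cleaned'  if it matches the next coating A_s(k+1)  (condition two),
            'ablated'  if it matches a deeper coating A_s(k+b)  (condition three),
            'unknown'  otherwise."""
    if abs(avg_gray - standards[k - 1]) <= delta:
        return "unwashed"
    if k < len(standards) and abs(avg_gray - standards[k]) <= delta:
        return "cleaned"
    for b in range(2, len(standards) - k + 1):     # b >= 2, k + b <= total coatings
        if abs(avg_gray - standards[k + b - 1]) <= delta:
            return "ablated"
    return "unknown"
```

Any sub-region classified 'ablated' triggers S263 (interruption and alarm); only when no sub-region is 'unwashed' or 'ablated' does the flow reach S28.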
S263: and simultaneously generating a processing interrupt signal and an alarm signal.
Specifically, a processing interruption signal and an alarm signal are generated, and the processing interruption signal is fed back to the preset laser cleaning device so that it stops running; meanwhile, the alarm signal is fed back to the preset alarm device, which prompts the staff to replace the ablated workpiece in time and check the laser cleaning device, verifying whether there is a problem with the cleaning settings or damaged parts in the laser cleaning device.
S264: an unwashed region atlas is generated.
Specifically, in order to perform orderly supplementary cleaning on the surface of the unwashed workpiece, the intelligent terminal also generates an unwashed region atlas, wherein the unwashed region atlas is a set of all sub-region images with unwashed labels.
S27: and establishing a fast scanning path corresponding to the unwashed region atlas.
Specifically, referring to fig. 3, S27 includes:
s271: a counter is allocated.
Specifically, a Counter j is set for each sub-region image in the unwashed region image set, the value range of j is [1, c ], c is the number of sub-region images in the unwashed region image set, and the initial value of each Counter j is 0.
S272: and counting the number of unwashed pixels in each subregion image.
Specifically, for each sub-region image in the unwashed region atlas, the gray value of each pixel is identified and compared with A_sk. When the gray value of a pixel lies between A_sk − Δ and A_sk + Δ, that pixel is marked as an unwashed pixel and the value of the Counter j corresponding to the sub-region image is incremented by one. The final value of Counter j is the total number of unwashed pixels in the corresponding sub-region image.
S273: all the sub-region images in the unwashed region image set are sorted by the value of the counter, and position numbers are assigned.
Specifically, all sub-region images in the unwashed region atlas are sorted according to the final value of the Counter j corresponding to each sub-region image, and a position number P_j is assigned to each sub-region image according to the sorting result, where j ranges over [1, c] and c is the number of sub-region images in the unwashed atlas. In this embodiment, position numbers P_j are assigned in descending order of the final value of Counter j, so that P_1 corresponds to the sub-region image with the most unwashed pixels and P_c to the sub-region image with the fewest.
S274: a fast scan path is generated.
A fast scan path is generated according to the position numbers P_j. The fast scan path records the cleaning priority of the area corresponding to each sub-region image: the smaller the value of j in P_j, the higher the cleaning priority of the corresponding area. Meanwhile, the coordinates of all unwashed pixels in each sub-region image are recorded in the fast scan path, where the pixel coordinates are those from S23.
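Steps S272 to S274 can be sketched together as follows (a minimal NumPy sketch; representing each sub-region by a boolean pixel mask, and the function name, are assumptions):

```python
import numpy as np

def build_fast_scan_path(regions, gray, a_sk, delta):
    """For each unwashed sub-region (name -> boolean pixel mask over `gray`),
    count the pixels whose gray value lies in [a_sk - delta, a_sk + delta],
    then order the regions by that count, largest first.  Returns a list of
    (region_name, unwashed_pixel_coordinates) pairs - the fast scan path."""
    path = []
    for name, mask in regions.items():
        unwashed = mask & (np.abs(gray.astype(float) - a_sk) <= delta)
        coords = list(zip(*np.nonzero(unwashed)))   # (row, col) per unwashed pixel
        path.append((name, coords))
    # descending unwashed-pixel count, so index 0 plays the role of P_1
    path.sort(key=lambda item: len(item[1]), reverse=True)
    return path
```

Sorting by unwashed-pixel count puts the dirtiest area first, matching the priority rule that P_1 corresponds to the sub-region with the most unwashed pixels.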
S275: the fast scan path is fed back.
Specifically, the fast scan path is fed back to the preset laser cleaning device so that it performs the following operations: the laser cleaning device first moves the laser cleaning head to the area numbered P_1, then moves the head along the workpiece area corresponding to that sub-region image, emitting light at each unwashed pixel according to its coordinates, thereby cleaning the unwashed positions.
After the preset laser cleaning device finishes cleaning, a cleaning completion signal is generated again and fed back; after the cleaning completion signal is received, the process returns to S20.
S28: and generating and feeding back a cleaning end signal.
Specifically, a cleaning end signal is generated and fed back to a preset prompting device, and the prompting device can be a prompting lamp, a broadcast loudspeaker and the like. In this embodiment, the prompting device is a prompting lamp, and the prompting lamp is lighted after receiving the cleaning end signal, so as to remind the staff.
Based on the above method, the embodiment of the present application further discloses an online laser cleaning effect monitoring system based on visual inspection, and referring to fig. 4, the online laser cleaning effect monitoring system based on visual inspection includes:
an information acquisition module 110, configured to receive a cleaning completion signal and an RGB color image;
a processing module 120, used for generating an image acquisition signal; for performing graying processing on the RGB color image to generate a grayscale image; for processing the real-time grayscale image into sub-region images; for calculating the average gray value; for processing and generating the unwashed region atlas; and for generating the fast scan path;
a storage module 130, configured to store preset process rules and processing methods, and to store various types of information and data generated by the processing module;
a signal sending module 140 for sending the image acquisition signal and the fast scan path.
The embodiment of the application also discloses an intelligent terminal, which comprises a memory and a processor, wherein the memory is stored with a computer program which can be loaded by the processor and can execute the online laser cleaning effect monitoring method based on the visual inspection technology.
The embodiment of the present application further discloses a computer-readable storage medium, which stores a computer program that can be loaded by a processor and execute the above-mentioned laser cleaning effect online monitoring method based on the visual inspection technology, and the computer-readable storage medium includes, for example: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (7)

1. A laser cleaning effect on-line monitoring method based on visual detection is characterized by comprising the following steps:
receiving a cleaning completion signal, and acquiring an RGB color image of the current surface of the workpiece to be cleaned;
carrying out graying processing on the obtained RGB color image to generate a gray image;
identifying and dividing the current gray level image of the cleaning workpiece into a plurality of subarea images according to gray level gradient;
respectively calculating the average gray value of each subregion image;
comparing the average gray values of all the subarea images with a preset cleaning workpiece gray standard value table one by one; generating an atlas comprising all the images of the unwashed subareas according to the comparison result;
performing identification analysis on each unwashed subregion image to generate a rapid scanning path of the whole unwashed subregion image set;
feeding back the rapid scanning path to a laser cleaning device to carry out cleaning operation again;
repeating the above steps until no unwashed region atlas is generated.
2. The laser cleaning effect online monitoring method based on visual detection according to claim 1, wherein the method for generating the gray standard value table of the workpiece comprises:
numbering the coatings according to the structure and position order of the coatings of the workpiece to be cleaned;
acquiring a standard coating RGB image corresponding to each coating number;
sequentially performing noise reduction and graying on each standard coating RGB image to obtain a standard gray image; for each standard gray image, identifying the gray values of all pixels, performing weighted averaging on those gray values, and calculating a gray standard value;
and recording the correspondence between each coating number and its gray standard value to generate the gray standard value table.
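The gray-standard-value computation of claim 2 (weighted averaging of pixel gray values) can be illustrated as follows. The function name, the uniform default weighting, and the optional per-pixel weight array are assumptions, since the patent does not specify the weighting scheme.

```python
import numpy as np

def gray_standard(img, weights=None):
    """Gray standard value of one standard gray image: a weighted average
    of all pixel gray values (uniform weights by default)."""
    img = np.asarray(img, dtype=float)
    if weights is None:
        return float(img.mean())
    w = np.asarray(weights, dtype=float)
    return float((img * w).sum() / w.sum())

def gray_standard_table(coating_images):
    """Map coating number -> gray standard value, from a dict of
    coating number -> denoised standard gray image."""
    return {num: gray_standard(img) for num, img in coating_images.items()}
```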
3. The laser cleaning effect online monitoring method based on visual detection according to claim 2, wherein comparing the average gray value of each sub-region image, one by one, with the preset gray standard value table of the workpiece comprises comparing it with the gray standard value of the coating that should currently be cleaned, and comparing it with the gray standard values of all coatings that should remain uncleaned after the cleaning is completed.
4. The laser cleaning effect online monitoring method based on visual detection according to claim 3, wherein comparing the average gray value of each sub-region image, one by one, with the preset gray standard value table of the workpiece comprises: judging whether the coating is cleaned, uncleaned, or ablated according to whether the comparison value falls within the allowable deviation range of the corresponding coating gray standard value, wherein an allowable deviation value is preset for the gray standard value of each coating.
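The three-way decision of claim 4 can be sketched as follows. The classification logic (average gray matches the target coating's standard → cleaned; matches the standard of a coating that should have been removed → uncleaned; matches neither → ablated) is one plausible reading of claims 3 and 4, and the function name and tolerance are assumptions.

```python
def classify_region(avg_gray, target_std, removed_stds, tol=10.0):
    """Classify one sub-region by its average gray value.
    target_std:   gray standard of the coating that should be exposed
    removed_stds: gray standards of the coatings that should be removed"""
    if abs(avg_gray - target_std) <= tol:
        return "cleaned"
    if any(abs(avg_gray - s) <= tol for s in removed_stds):
        return "uncleaned"
    return "ablated"  # outside every allowed deviation range
```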
5. The laser cleaning effect online monitoring method based on visual detection according to claim 1 or 4, wherein the fast scanning path for the whole uncleaned sub-region atlas is generated by the following steps:
for each sub-region image in the uncleaned region atlas, identifying the gray value of each pixel in the sub-region image;
comparing the gray value of each pixel with the gray standard value of the coating that should be exposed after cleaning;
counting the number of uncleaned pixels in each sub-region image according to the comparison results;
sorting all sub-region images in the uncleaned region atlas according to the number of uncleaned pixels;
and generating the fast scanning path according to the sorting result.
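The path generation of claim 5 can be sketched as follows. Sorting in descending order of uncleaned pixel count (worst regions first) is an assumption, as the patent does not state the sort direction; the function name and tolerance are likewise illustrative.

```python
import numpy as np

def fast_scan_path(regions, target_std, tol=10.0):
    """Order sub-regions for rescanning by their count of uncleaned pixels
    (descending). regions: dict of region id -> 2-D gray image array."""
    counts = {
        rid: int((np.abs(np.asarray(img, dtype=float) - target_std) > tol).sum())
        for rid, img in regions.items()
    }
    return sorted(counts, key=counts.get, reverse=True)
```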
6. The laser cleaning effect online monitoring method based on visual detection according to claim 4, wherein, when the coating is judged to be ablated, the processing method comprises: generating a processing interruption signal and an alarm signal, and feeding the processing interruption signal back to the laser cleaning device to stop the laser cleaning device; meanwhile, the alarm signal is fed back to an alarm device, so that the alarm device prompts workers to replace the ablated workpiece and inspect the laser cleaning device.
7. A system for implementing the laser cleaning effect online monitoring method based on visual detection, characterized by comprising:
an information acquisition module for receiving the cleaning completion signal and the RGB color image;
a processing module for generating an image acquisition signal; for performing graying processing on the RGB color image to generate a gray image; for processing the real-time gray image into sub-region images; for calculating the average gray values; for processing and generating the uncleaned region atlas; and for generating the fast scanning path;
a storage module for storing the processing method and the various types of information and data generated by the processing module;
and a signal sending module for sending the image acquisition signal and the fast scanning path.
CN202111401432.XA 2021-11-24 2021-11-24 Laser cleaning effect on-line monitoring method based on visual detection Pending CN114119535A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111401432.XA CN114119535A (en) 2021-11-24 2021-11-24 Laser cleaning effect on-line monitoring method based on visual detection


Publications (1)

Publication Number Publication Date
CN114119535A true CN114119535A (en) 2022-03-01

Family

ID=80440894

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111401432.XA Pending CN114119535A (en) 2021-11-24 2021-11-24 Laser cleaning effect on-line monitoring method based on visual detection

Country Status (1)

Country Link
CN (1) CN114119535A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4574393A (en) * 1983-04-14 1986-03-04 Blackwell George F Gray scale image processor
CN103063167A (en) * 2012-12-28 2013-04-24 江苏大学 Method for judging laser cleaning effect automatically
CN107610125A (en) * 2017-10-16 2018-01-19 云南电网有限责任公司临沧供电局 A kind of long distance laser derusting monitoring in real time and feedback method, apparatus and system
CN111783795A (en) * 2020-06-10 2020-10-16 恒通西交智能机器(广东)有限公司 Method, apparatus, device and medium for converting image into laser scanning path


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115760685A (en) * 2022-09-23 2023-03-07 北京珞安科技有限责任公司 Hidden threat sensing system and method for industrial production
CN115760685B (en) * 2022-09-23 2023-07-28 北京珞安科技有限责任公司 Hidden threat perception system and method for industrial production
CN116400588A (en) * 2023-06-07 2023-07-07 烟台金丝猴食品科技有限公司 Automatic positioning and cleaning method and equipment for bread mold residues
CN116400588B (en) * 2023-06-07 2023-08-15 烟台金丝猴食品科技有限公司 Automatic positioning and cleaning method and equipment for bread mold residues
CN116672115A (en) * 2023-07-04 2023-09-01 深圳市宇华智界科技有限公司 Control system and method of dental floss cleaning machine with handle and cleaning machine

Similar Documents

Publication Publication Date Title
CN114119535A (en) Laser cleaning effect on-line monitoring method based on visual detection
CN110390669B (en) Method for detecting cracks in bridge image
CN113570605B (en) Defect detection method and system based on liquid crystal display panel
CN109840900B (en) Fault online detection system and detection method applied to intelligent manufacturing workshop
JP4150390B2 (en) Appearance inspection method and appearance inspection apparatus
JP2000057349A (en) Method for sorting defect, device therefor and method for generating data for instruction
JP5852919B2 (en) Crack detection method
JP2014006222A (en) Method and apparatus for detecting change of concrete surface
CN116559183B (en) Method and system for improving defect judging efficiency
CN110596120A (en) Glass boundary defect detection method, device, terminal and storage medium
US10726535B2 (en) Automatically generating image datasets for use in image recognition and detection
JP2014228357A (en) Crack detecting method
CN115493843B (en) Quality monitoring method and equipment based on bearing retainer
CN109693140A (en) A kind of intelligent flexible production line and its working method
CN115937101A (en) Quality detection method, device, equipment and storage medium
CN112070762A (en) Mura defect detection method and device for liquid crystal panel, storage medium and terminal
CN108346138A (en) A kind of detection method of surface flaw and system based on image procossing
CN114693529A (en) Image splicing method, device, equipment and storage medium
CN116978834B (en) Intelligent monitoring and early warning system for wafer production
CN111105413B (en) Intelligent spark plug appearance defect detection system
JP2021189238A (en) Belt inspection system and belt inspection program
JP2021189239A (en) Belt inspection system and belt inspection program
CN115035071A (en) Visual detection method for black spot defect of PAD light guide plate
CN115100110A (en) Defect detection method, device and equipment for polarized lens and readable storage medium
CN110211113B (en) Detection algorithm and calculation equipment for groove abnormity

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220301