CN114998328A - Workpiece spraying defect detection method and system based on machine vision and readable storage medium - Google Patents
- Publication number
- CN114998328A (application CN202210889416.8A)
- Authority
- CN
- China
- Prior art keywords
- image
- workpiece
- point cloud
- dimensional
- defect detection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
- G01N2021/8887—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20024—Filtering details
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20064—Wavelet transform [DWT]
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
The invention discloses a workpiece spraying defect detection method and system based on machine vision, and a readable storage medium. The method comprises the following steps: obtaining a three-dimensional image of a workpiece and preprocessing the three-dimensional image; extracting three-dimensional point cloud data from the preprocessed three-dimensional image, standardizing the three-dimensional point cloud data, and extracting point cloud data characteristics; inputting the point cloud data characteristics into a defect detection model to generate defect parameter information; comparing the defect parameter information with preset information to obtain a deviation rate; and judging whether the deviation rate is greater than a preset threshold value. If it is greater, compensation information is generated and the spraying parameters are optimized through the compensation information; if it is smaller, the defect parameter information is displayed in a preset manner. The three-dimensional point cloud data represents the workpiece spraying form better, so the method achieves multi-dimensional analysis of the data and improves defect detection precision.
Description
Technical Field
The invention relates to the technical field of industrial vision detection, in particular to a workpiece spraying defect detection method and system based on machine vision and a readable storage medium.
Background
Machine vision is a rapidly developing branch of artificial intelligence in which machines replace human eyes for measurement and judgment. A machine vision system converts a photographed target into an image signal through a machine vision product (an image capture device, available in CMOS (complementary metal-oxide-semiconductor) and CCD (charge-coupled device) types) and transmits the signal to a dedicated image processing system. That system obtains the form information of the target, converts it into digital signals according to pixel distribution, brightness, color, and other information, performs various operations on these signals to extract the features of the target, and then controls on-site equipment according to the judgment result. Detecting workpiece spraying defects through machine vision therefore has important value and significance.
Traditional defect detection scans the workpiece with infrared light and then judges spraying defects, with relatively poor precision. In addition, traditional defect detection judges workpiece spraying defects only from two-dimensional photographs of the workpiece surface; the data analysis is not accurate enough, the defect judgment easily deviates from the actual result, and the judgment of defects is affected.
Disclosure of Invention
The invention aims to provide a workpiece spraying defect detection method and system based on machine vision and a readable storage medium, which are simple to operate, high in efficiency and good in universality.
In order to achieve the purpose of the invention, the technical scheme adopted by the invention is as follows: a workpiece spraying defect detection method based on machine vision comprises the following steps:
acquiring a three-dimensional image of a workpiece, and preprocessing the three-dimensional image;
extracting three-dimensional point cloud data from the preprocessed three-dimensional image, standardizing the three-dimensional point cloud data, and extracting point cloud data characteristics;
inputting the point cloud data characteristics into a defect detection model to generate defect parameter information;
comparing the defect parameter information with preset information to obtain a deviation rate;
judging whether the deviation rate is greater than a preset threshold value or not;
if the deviation rate is greater than the preset threshold value, generating compensation information, and optimizing the spraying parameters through the compensation information;
and if the deviation rate is smaller than the preset threshold value, displaying the defect parameter information in a preset manner.
Preferably, acquiring a three-dimensional image of the workpiece, and preprocessing the three-dimensional image, specifically includes:
acquiring an original left image and an original right image of a workpiece;
calculating the parallax between the original left image and the original right image according to a binocular camera calibration principle, and calibrating a binocular camera;
acquiring a new left image and a new right image of the workpiece through the calibrated binocular camera;
and forming a binocular stereoscopic vision image according to the new left image and the new right image.
Preferably, the method further comprises the following steps:
establishing a three-dimensional space coordinate system, and calculating a three-dimensional coordinate of the workpiece in the three-dimensional space according to the spatial geometrical relationship;
and segmenting the image into a plurality of regions with the same size according to the three-dimensional coordinates, and filtering the segmented image.
Preferably, the image filtering processing method is as follows:
carrying out wavelet transform multi-scale decomposition on the image;
removing noise in the image by using the scale coefficient;
the image is reconstructed by inverse wavelet transform.
Preferably, the inverse wavelet transform formula is as follows:

$$f(t) = \frac{1}{C}\int_{0}^{\infty}\!\!\int_{-\infty}^{\infty} W(s,k)\,\psi_{s,k}(t)\,\mathrm{d}k\,\frac{\mathrm{d}s}{s^{2}},\qquad \psi_{s,k}(t)=\frac{1}{\sqrt{s}}\,\psi\!\left(\frac{t-k}{s}\right)$$

where $f(t)$ represents the inverse wavelet transform function, $\psi$ represents the wavelet function, $s$ represents the scale factor, $k$ represents the translation factor, and $C$ represents the correction factor.
Preferably, the method for calculating the three-dimensional coordinates of a point Q in space is as follows: the left image and the right image are captured by two cameras whose projection centers are separated by a baseline distance denoted $p$, and the coordinate correction coefficient (the focal length) is denoted $f$. Assuming that the left image and the right image lie in the same horizontal plane, the Y coordinate of the point Q is the same in both images, from which it can be known that:

$$x_l = f\,\frac{X}{Z},\qquad x_r = f\,\frac{X-p}{Z},\qquad y_l = y_r = f\,\frac{Y}{Z}$$

where the left image coordinate is $(x_l, y_l)$, the right image coordinate is $(x_r, y_r)$, and the parallax of the left and right images is $d = x_l - x_r$.

From this, the three-dimensional coordinates of the point Q in space are calculated as

$$X = \frac{p\,x_l}{d},\qquad Y = \frac{p\,y_l}{d},\qquad Z = \frac{p\,f}{d}.$$
Preferably, the three-dimensional point cloud data processing method comprises the following steps:
analyzing each point in a statistical filtering mode, and calculating the distance from the point to an adjacent point;
calculating a mean value m and a standard deviation n to obtain a threshold value range w;
removing point cloud data outside the threshold range;
wherein the threshold range w is (m − a·n, m + a·n), where a represents a constant.
Preferably, the method for extracting the point cloud data features is as follows:
calculating a normal vector according to the geometrical characteristics of the point cloud;
sorting the normal vectors, and judging the offset angle of the normal vectors by using the calculated normal vectors;
judging whether the offset angle is larger than a preset angle or not;
if the offset angle is greater than the preset angle, adjusting the normal vector parameters, extracting the edge characteristics of the point cloud, and performing boundary division on the point cloud data.
The invention also claims a computer device, which comprises a memory, a processor and a computer program stored in the memory and capable of running on the processor, wherein the processor executes the computer program to execute the instructions of the workpiece spraying defect detection method based on machine vision.
The invention also claims a computer readable storage medium storing a computer program which, when executed by a processor, executes instructions of the above-described machine vision-based workpiece spray defect detection method.
Due to the application of the technical scheme, compared with the prior art, the invention has the following advantages:
According to the method and the device, by obtaining the three-dimensional image and the three-dimensional point cloud data of the workpiece, the workpiece spraying form can be displayed better through the three-dimensional point cloud data, multi-dimensional analysis of the data is achieved, the defect detection precision is improved, and the defect detection result is closer to the actual value.
Drawings
FIG. 1 is a flow chart of a workpiece spray defect detection method based on machine vision according to the present invention;
FIG. 2 is a flow chart of a three-dimensional image preprocessing method of the present invention;
FIG. 3 is a flow chart of a method for extracting point cloud features according to the present invention.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present specification, the technical solutions in the embodiments of the present specification will be clearly and completely described below with reference to the drawings in the embodiments of the present specification, and it is obvious that the described embodiments are only a part of the embodiments of the present specification, and not all of the embodiments. All other embodiments obtained by a person skilled in the art based on the embodiments in the present specification without any inventive step should fall within the scope of protection of the present specification.
Example 1
As shown in FIG. 1, the invention discloses a workpiece spraying defect detection method based on machine vision, which comprises the following steps:
S102, acquiring a three-dimensional image of a workpiece, and preprocessing the three-dimensional image;
S104, extracting three-dimensional point cloud data from the preprocessed three-dimensional image, standardizing the three-dimensional point cloud data, and extracting point cloud data characteristics;
S106, inputting the point cloud data characteristics into a defect detection model to generate defect parameter information;
S108, comparing the defect parameter information with preset information to obtain a deviation rate;
S110, judging whether the deviation rate is greater than a preset threshold value;
S112, if the deviation rate is greater than the preset threshold value, generating compensation information, and optimizing the spraying parameters through the compensation information;
and S114, if the deviation rate is smaller than the preset threshold value, displaying the defect parameter information in a preset manner.
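The threshold decision in steps S108 to S114 can be sketched as follows. This is a minimal illustration under the assumption that the defect parameter information reduces to a single scalar; the function name `check_defect` and the default threshold are hypothetical, not taken from the patent.

```python
def check_defect(measured: float, preset: float, threshold: float = 0.05) -> dict:
    """Compare a measured defect parameter with its preset value (S108),
    judge the deviation rate against a preset threshold (S110), and either
    emit compensation information (S112) or mark the value for display (S114)."""
    deviation_rate = abs(measured - preset) / preset
    if deviation_rate > threshold:
        # Compensation information used to optimize the spraying parameters.
        return {"compensate": True, "correction": preset - measured}
    return {"compensate": False, "display": measured}
```

For instance, a measured defect parameter of 1.2 against a preset of 1.0 gives a deviation rate of 0.2, which exceeds the threshold and triggers compensation.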
Furthermore, the representation forms of the three-dimensional image mainly include the voxel, mesh, and point cloud models; point cloud data, i.e., a point cloud, is obtained by a line-scanning device and appears in space like a cloud of scattered points.
Fig. 2 shows a flow chart of a three-dimensional image preprocessing method.
Preferably, acquiring a three-dimensional image of the workpiece, and preprocessing the three-dimensional image, specifically includes:
S202, acquiring an original left image and an original right image of the workpiece;
S204, calculating the parallax between the original left image and the original right image according to the binocular camera calibration principle, and calibrating the binocular camera;
S206, acquiring a new left image and a new right image of the workpiece through the calibrated binocular camera;
and S208, forming a binocular stereoscopic vision image according to the new left image and the new right image.
Furthermore, binocular stereo vision fuses the images obtained by two cameras and observes the differences between them, so that an obvious sense of depth is obtained and a correspondence between features is established; the difference between the mapping points of the same spatial physical point in different images is called the parallax.
Preferably, the method further comprises the following steps: establishing a three-dimensional space coordinate system, and calculating the three-dimensional coordinates of the workpiece in the three-dimensional space according to the spatial geometrical relationship;
and segmenting the image into a plurality of regions with the same size according to the three-dimensional coordinates, and filtering the segmented image.
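The equal-size region segmentation described above can be sketched as a uniform grid split; the function name and the assumption that the image dimensions divide evenly are illustrative.

```python
def split_into_regions(image, rows, cols):
    """Split a 2-D image (a list of pixel rows) into rows*cols equally
    sized regions, as a prelude to per-region filtering. Remainder
    pixels (when dimensions do not divide evenly) are dropped."""
    h, w = len(image), len(image[0])
    rh, rw = h // rows, w // cols
    regions = []
    for i in range(rows):
        for j in range(cols):
            region = [row[j * rw:(j + 1) * rw] for row in image[i * rh:(i + 1) * rh]]
            regions.append(region)
    return regions
```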
FIG. 3 shows a flow chart of a method of extracting point cloud data.
Further, the method for extracting the point cloud data features comprises the following steps:
S302, calculating a normal vector according to the geometrical characteristics of the point cloud;
S304, sorting the normal vectors, and judging the offset angle of each normal vector using the calculated normal vectors;
S306, judging whether the offset angle is greater than a preset angle;
and S308, if the offset angle is greater than the preset angle, adjusting the normal vector parameters, extracting the edge characteristics of the point cloud, and performing boundary division on the point cloud data.
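Steps S302 to S308 can be sketched as follows, using the cross product of vectors to two neighbouring points as one common way to estimate a normal; the function names and the 30-degree preset angle are illustrative assumptions, not values from the patent.

```python
import math

def normal(p0, p1, p2):
    """Estimate a unit normal from a point and two neighbours (S302),
    via the cross product of the two in-plane vectors."""
    u = [b - a for a, b in zip(p0, p1)]
    v = [b - a for a, b in zip(p0, p2)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    mag = math.sqrt(sum(c * c for c in n))
    return [c / mag for c in n]

def offset_angle(n1, n2):
    """Offset angle in degrees between two normal vectors (S304)."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(n1, n2))))
    return math.degrees(math.acos(dot))

def is_edge(n1, n2, preset_deg=30.0):
    """Flag an edge point where the offset angle exceeds the preset angle (S306/S308)."""
    return offset_angle(n1, n2) > preset_deg
```

Adjacent normals on a flat sprayed surface give an offset angle near zero, while a sharp change in normal direction marks an edge of the point cloud.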
Preferably, the image filtering processing method is as follows:
carrying out wavelet transformation multi-scale decomposition on the image;
removing noise in the image by using the scale coefficient;
the image is reconstructed by inverse wavelet transform.
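The decompose / threshold / reconstruct pipeline above can be sketched with a one-dimensional Haar wavelet; the patent's actual wavelet basis and threshold rule are not specified, so the hard threshold and the `denoise` interface here are illustrative (the two-dimensional image case applies the same transform along rows and columns).

```python
def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform."""
    s = 2 ** -0.5
    approx = [(a + b) * s for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) * s for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of one Haar level: interleave reconstructed samples."""
    s = 2 ** -0.5
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) * s, (a - d) * s]
    return out

def denoise(x, levels=2, thresh=0.5):
    """Multi-scale decompose, hard-threshold the detail (noise)
    coefficients, then reconstruct by the inverse transform."""
    approx, details = list(x), []
    for _ in range(levels):
        approx, d = haar_dwt(approx)
        details.append([0.0 if abs(c) < thresh else c for c in d])
    for d in reversed(details):
        approx = haar_idwt(approx, d)
    return approx
```

With `thresh=0.0` the forward and inverse transforms round-trip exactly, which is a quick sanity check; raising the threshold suppresses small detail (noise) coefficients before reconstruction.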
Preferably, the inverse wavelet transform is formulated as follows:

$$f(t) = \frac{1}{C}\int_{0}^{\infty}\!\!\int_{-\infty}^{\infty} W(s,k)\,\psi_{s,k}(t)\,\mathrm{d}k\,\frac{\mathrm{d}s}{s^{2}},\qquad \psi_{s,k}(t)=\frac{1}{\sqrt{s}}\,\psi\!\left(\frac{t-k}{s}\right)$$

where $f(t)$ represents the inverse wavelet transform function, $\psi$ represents the wavelet function, $s$ represents the scale factor, $k$ represents the translation factor, and $C$ represents the correction factor.
Further, the method for calculating the three-dimensional coordinates of a point Q in space is as follows: the left image and the right image are captured by two cameras whose projection centers are separated by a baseline distance denoted $p$, and the coordinate correction coefficient (the focal length) is denoted $f$. Assuming that the left image and the right image lie in the same horizontal plane, the Y coordinate of the point Q is the same in both images, from which it can be seen that:

$$x_l = f\,\frac{X}{Z},\qquad x_r = f\,\frac{X-p}{Z},\qquad y_l = y_r = f\,\frac{Y}{Z}$$

where the left image coordinate is $(x_l, y_l)$, the right image coordinate is $(x_r, y_r)$, and the parallax of the left and right images is $d = x_l - x_r$.

From this, the three-dimensional coordinates of the point Q in space are calculated as

$$X = \frac{p\,x_l}{d},\qquad Y = \frac{p\,y_l}{d},\qquad Z = \frac{p\,f}{d}.$$
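The triangulation above can be sketched as follows, with p the baseline and f the focal length as defined above; the function name and the sample values in the note below are illustrative.

```python
def triangulate(x_l, y_l, x_r, p, f):
    """Recover the 3-D coordinates of a point Q from a rectified stereo pair.
    p: baseline between the camera projection centers, f: focal length.
    Assumes the left and right images share the same horizontal plane,
    so y_l equals y_r."""
    d = x_l - x_r  # parallax of the left and right images
    if d == 0:
        raise ValueError("zero parallax: point at infinity")
    return p * x_l / d, p * y_l / d, p * f / d
```

For example, with x_l = 100, y_l = 50, x_r = 80, p = 0.12, and f = 800, the parallax is 20 and the recovered depth Z is 4.8.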
Furthermore, the parallax (in pixels) of the same workpiece in the binocular camera is related to the camera's baseline, focal length, and pixel size. The shorter the baseline and the smaller the focal length, the larger the matching range and the closer the detectable depth; the longer the baseline and the larger the focal length, the farther the detectable depth. The binocular camera therefore needs to be calibrated before the left and right images are acquired.
Preferably, the three-dimensional point cloud data processing method comprises the following steps:
analyzing each point in a statistical filtering mode, and calculating the distance from the point to an adjacent point;
calculating a mean value m and a standard deviation n to obtain a threshold value range w;
removing point cloud data outside the threshold range;
wherein the threshold range w is (m − a·n, m + a·n), where a represents a constant.
Further, the constant a ranges from 1 to 1.5; preferably, a = 1.25.
Furthermore, a certain amount of noise exists in the point cloud, so the point cloud needs to be denoised. Point cloud denoising detects and removes noise, or points of no interest, through a filtering principle. Common point cloud filtering methods mainly include the voxel method and the moving least squares method; the latter eliminates discrete points through surface fitting, where discrete points are points unrelated to the feature representation of the workpiece or object.
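The statistical filtering described above (mean m, standard deviation n, threshold range w = (m − a·n, m + a·n)) can be sketched in pure Python; the brute-force k-nearest-neighbour search and the parameter defaults are illustrative simplifications.

```python
import math

def statistical_filter(points, k=8, a=1.25):
    """Statistical outlier removal: for each point, compute the mean
    distance to its k nearest neighbours; keep only points whose mean
    neighbour distance lies inside (m - a*n, m + a*n), where m and n
    are the mean and standard deviation over the whole cloud."""
    mean_d = []
    for i, p in enumerate(points):
        ds = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        neigh = ds[:k]
        mean_d.append(sum(neigh) / len(neigh))
    m = sum(mean_d) / len(mean_d)
    n = math.sqrt(sum((d - m) ** 2 for d in mean_d) / len(mean_d))
    lo, hi = m - a * n, m + a * n
    return [p for p, d in zip(points, mean_d) if lo < d < hi]
```

A tight cluster of points survives the filter, while an isolated far point's mean neighbour distance falls outside the threshold range and is removed.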
Example 2
The present disclosure also provides a computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the computer program to perform the instructions of the workpiece spray coating defect detection method based on machine vision described in the above embodiments.
The computer device may include one or more processors, such as one or more central processing units (CPUs) or graphics processing units (GPUs), each of which may implement one or more hardware threads. The computer device may also comprise any memory for storing any kind of information, such as code, settings, and data; in a particular embodiment, a computer program stored in the memory and runnable on the processor may, when executed by the processor, perform the instructions of the method of any of the above embodiments. For example, and without limitation, the memory may include any one or combination of the following: any type of RAM, any type of ROM, flash memory devices, hard disks, optical disks, and so on. More generally, any memory may use any technology to store information, may provide volatile or non-volatile retention of information, and may represent a fixed or removable component of the computer device. When the processor executes the associated instructions stored in any memory or combination of memories, the computer device can perform any of the operations of those instructions. The computer device also includes one or more drive mechanisms for interacting with any memory, such as a hard disk drive mechanism or an optical disk drive mechanism.
The present disclosure also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments, or may exist separately without being assembled into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the method described in Embodiment 1 or 2 above. Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology; the information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium which can be used to store information accessible by a computer device. As defined herein, computer-readable media does not include transitory computer-readable media (transitory media) such as modulated data signals and carrier waves.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the system embodiment, since it is substantially similar to the method embodiment, the description is relatively simple, and reference may be made to the partial description of the method embodiment for relevant points. In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of an embodiment of the specification. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Moreover, various embodiments or examples and features of various embodiments or examples described in this specification can be combined and combined by one skilled in the art without being mutually inconsistent.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.
Claims (10)
1. A workpiece spraying defect detection method based on machine vision is characterized by comprising the following steps,
acquiring a three-dimensional image of the workpiece, preprocessing the three-dimensional image,
extracting three-dimensional point cloud data from the preprocessed three-dimensional image, standardizing the three-dimensional point cloud data, and extracting point cloud data characteristics;
inputting the point cloud data characteristics into a defect detection model to generate defect parameter information;
comparing the defect parameter information with preset information to obtain a deviation rate;
judging whether the deviation rate is greater than a preset threshold value or not;
if the deviation rate is greater than the preset threshold value, generating compensation information, and optimizing the spraying parameters through the compensation information;
and if the deviation rate is smaller than the preset threshold value, displaying the defect parameter information in a preset manner.
2. The workpiece spraying defect detection method based on machine vision as claimed in claim 1, wherein a three-dimensional image of the workpiece is obtained, and the preprocessing of the three-dimensional image specifically comprises:
acquiring an original left image and an original right image of a workpiece;
calculating the parallax between the original left image and the original right image according to a binocular camera calibration principle, and calibrating a binocular camera;
acquiring a new left image and a new right image of the workpiece through the calibrated binocular camera;
and forming a binocular stereoscopic vision image according to the new left image and the new right image.
3. The machine vision-based workpiece spray defect detection method of claim 2, further comprising:
establishing a three-dimensional space coordinate system, and calculating the three-dimensional coordinates of the workpiece in the three-dimensional space according to the spatial geometrical relationship;
and segmenting the image into a plurality of regions with the same size according to the three-dimensional coordinates, and filtering the segmented image.
4. The workpiece spraying defect detection method based on the machine vision as claimed in claim 3, characterized in that the image filtering processing method comprises the following steps:
carrying out wavelet transform multi-scale decomposition on the image;
removing noise in the image by using the scale coefficient;
the image is reconstructed by inverse wavelet transform.
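The decompose / threshold / reconstruct steps above can be illustrated with a one-level Haar wavelet on a 1-D signal. This is a minimal sketch, not the patent's multi-scale scheme: the wavelet, level count, and hard-thresholding rule are assumptions.

```python
import numpy as np

def haar_denoise(signal, thresh):
    """One-level Haar wavelet denoising sketch: decompose, hard-threshold
    the detail (high-frequency) coefficients, then reconstruct via the
    inverse transform. Signal length is assumed even."""
    s = np.asarray(signal, dtype=float)
    a = (s[0::2] + s[1::2]) / np.sqrt(2)   # approximation (scale) coefficients
    d = (s[0::2] - s[1::2]) / np.sqrt(2)   # detail coefficients
    d[np.abs(d) < thresh] = 0.0            # suppress small details (noise)
    out = np.empty_like(s)
    out[0::2] = (a + d) / np.sqrt(2)       # inverse Haar transform
    out[1::2] = (a - d) / np.sqrt(2)
    return out
```

With the threshold set to zero the transform is perfectly invertible, which is the property the claim's reconstruction step relies on.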
5. The workpiece spraying defect detection method based on machine vision as claimed in claim 4, characterized in that the inverse wavelet transform formula is as follows:
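The formula itself appears only as an image in the original publication and is not reproduced in this text. For reference (an assumption, not the patent's exact expression), the standard inverse discrete wavelet transform reconstructs a signal from its scale and detail coefficients as:

```latex
f(t) = \sum_{k} c_{J,k}\,\phi_{J,k}(t) + \sum_{j=1}^{J} \sum_{k} d_{j,k}\,\psi_{j,k}(t)
```

where $\phi$ is the scaling function, $\psi$ the wavelet, $c_{J,k}$ the scale (approximation) coefficients at the coarsest level $J$, and $d_{j,k}$ the detail coefficients at level $j$.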
6. The method for detecting the workpiece spraying defects based on machine vision as claimed in claim 2, wherein the method for calculating the three-dimensional coordinates of a point Q in space is as follows: the left image and the right image are shot by two cameras respectively, the distance between the projection centers of the two cameras (the baseline) is recorded as p, and the coordinate correction coefficient is recorded as f. Assuming that the left image and the right image lie in the same horizontal plane, the Y coordinate of the point Q is the same in both images, and thus it can be known that:
x_l = fX/Z, x_r = f(X − p)/Z, y_l = y_r = fY/Z
In the formulas, the left image coordinate is (x_l, y_l), the right image coordinate is (x_r, y_r), and the parallax of the left and right images is d = x_l − x_r.
From this, the three-dimensional coordinates of the point Q in space are calculated as X = p·x_l/d, Y = p·y_l/d, Z = p·f/d.
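Claim 6's derivation follows the standard parallel binocular model; the sketch below implements that triangulation, taking p as the baseline and interpreting the coordinate correction coefficient as the focal length f (an interpretation, since the patent's formula images are not reproduced in the text).

```python
def triangulate(x_l, y_l, x_r, p, f):
    """Recover the 3-D coordinates of a point from its left/right image
    coordinates under the standard parallel binocular model; p is the
    baseline between the camera projection centres and f the focal
    length (interpreted as the coordinate correction coefficient)."""
    d = x_l - x_r          # parallax (disparity) between the two images
    X = p * x_l / d
    Y = p * y_l / d
    Z = p * f / d
    return X, Y, Z
```

For example, a point at (X, Y, Z) projects to x_l = fX/Z in the left image and x_r = f(X − p)/Z in the right image, and `triangulate` inverts that projection exactly.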
7. The workpiece spraying defect detection method based on machine vision as claimed in claim 1, characterized in that the three-dimensional point cloud data processing method is as follows:
analyzing each point in a statistical filtering mode, and calculating the distance from the point to an adjacent point;
calculating a mean value m and a standard deviation n to obtain a threshold value range w;
removing point cloud data outside the threshold range;
wherein the threshold range w is (m − a·n, m + a·n), where a represents a constant.
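The statistical filtering of claim 7 can be sketched directly from the claim: compute each point's mean distance to its neighbours, then keep only points whose value falls inside (m − a·n, m + a·n). The neighbour count k and the brute-force distance computation are simplifying assumptions.

```python
import numpy as np

def statistical_filter(points, k=4, a=1.0):
    """Statistical outlier removal sketch: for each point, compute the
    mean distance to its k nearest neighbours; keep only points whose
    mean distance lies inside (m - a*n, m + a*n), where m and n are the
    mean and standard deviation over the whole cloud."""
    pts = np.asarray(points, dtype=float)
    # Pairwise distances (O(N^2); fine for a sketch, use a KD-tree at scale)
    diff = pts[:, None, :] - pts[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    # Mean distance to the k nearest neighbours (excluding the point itself)
    knn = np.sort(dist, axis=1)[:, 1:k + 1].mean(axis=1)
    m, n = knn.mean(), knn.std()
    mask = (knn > m - a * n) & (knn < m + a * n)
    return pts[mask]
```

Libraries such as Open3D expose the same operation as `remove_statistical_outlier`, with the constant a playing the role of the standard-deviation ratio.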
8. The workpiece spraying defect detection method based on machine vision as claimed in claim 3, characterized in that the method for extracting the point cloud data features is as follows:
calculating a normal vector according to the geometrical characteristics of the point cloud;
sorting the normal vectors, and judging the offset angle of the normal vectors by using the calculated normal vectors;
judging whether the offset angle is larger than a preset angle or not;
if the offset angle is larger than the preset angle, adjusting the normal vector parameters, extracting the edge characteristics of the point cloud, and performing boundary division on the point cloud data.
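Normal estimation from local point-cloud geometry, as in claim 8, is typically done by PCA on a neighbourhood: the covariance eigenvector with the smallest eigenvalue is normal to the best-fit plane. A minimal sketch under that assumption; the reference axis used for the offset angle is illustrative.

```python
import numpy as np

def estimate_normal(points):
    """Estimate the normal of a local point-cloud patch by PCA: the
    eigenvector of the covariance matrix with the smallest eigenvalue
    is normal to the best-fit plane through the points."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    cov = centered.T @ centered / len(pts)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    normal = eigvecs[:, 0]                   # smallest-eigenvalue direction
    return normal / np.linalg.norm(normal)

def offset_angle(normal, reference=np.array([0.0, 0.0, 1.0])):
    """Angle (radians) between an estimated normal and a reference axis,
    a stand-in for the claim's normal-vector offset angle."""
    c = abs(np.dot(normal, reference))
    return np.arccos(np.clip(c, -1.0, 1.0))
```

Points whose offset angle exceeds the preset angle mark sharp orientation changes, which is what makes them candidates for edge extraction and boundary division.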
9. A machine vision based workpiece spray defect detection system comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor executes the computer program to perform the instructions of the machine vision based workpiece spray defect detection method according to any one of claims 1 to 8.
10. A computer-readable storage medium storing a computer program, wherein the computer program when executed by a processor executes instructions of the machine vision based workpiece spray defect detection method of any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210889416.8A CN114998328A (en) | 2022-07-27 | 2022-07-27 | Workpiece spraying defect detection method and system based on machine vision and readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114998328A true CN114998328A (en) | 2022-09-02 |
Family
ID=83021998
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210889416.8A Pending CN114998328A (en) | 2022-07-27 | 2022-07-27 | Workpiece spraying defect detection method and system based on machine vision and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114998328A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115561249A (en) * | 2022-11-09 | 2023-01-03 | 松乐智能装备(深圳)有限公司 | Intelligent monitoring method and system for spraying equipment |
CN115932864A (en) * | 2023-02-24 | 2023-04-07 | 深圳市博铭维技术股份有限公司 | Pipeline defect detection method and pipeline defect detection device |
CN116124081A (en) * | 2023-04-18 | 2023-05-16 | 菲特(天津)检测技术有限公司 | Non-contact workpiece detection method and device, electronic equipment and medium |
CN116990692A (en) * | 2023-09-28 | 2023-11-03 | 深圳康普盾科技股份有限公司 | Lithium battery health condition assessment and residual life prediction method and system |
CN118347942A (en) * | 2024-06-17 | 2024-07-16 | 华羿微电子股份有限公司 | Method, equipment and storage medium for detecting appearance of semiconductor product after rib cutting molding |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102566291A (en) * | 2010-12-29 | 2012-07-11 | 中芯国际集成电路制造(上海)有限公司 | Test system for projection mask |
CN113888531A (en) * | 2021-11-02 | 2022-01-04 | 中南大学 | Concrete surface defect detection method and device, electronic equipment and storage medium |
CN114549519A (en) * | 2022-04-08 | 2022-05-27 | 苏州天成涂装系统股份有限公司 | Visual detection method and system for automobile spraying production line and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2022-09-02