CN116721067B - Impregnated paper impregnation quality detection method based on machine vision - Google Patents


Info

Publication number: CN116721067B (application CN202310623590.2A)
Authority: CN (China)
Prior art keywords: pixel point, paper image, impregnated paper, value, difference
Legal status: Active
Application number: CN202310623590.2A
Other languages: Chinese (zh)
Other versions: CN116721067A (en)
Inventor: 王杰 (Wang Jie)
Current Assignee: Suqian Kaida Environmental Protection Equipment Manufacturing Co., Ltd.
Original Assignee: Suqian Kaida Environmental Protection Equipment Manufacturing Co., Ltd.
Application filed by Suqian Kaida Environmental Protection Equipment Manufacturing Co., Ltd.
Priority to CN202310623590.2A
Publication of application CN116721067A and of granted patent CN116721067B

Classifications

    • G06T7/0004: Industrial image inspection
    • G06T7/10: Segmentation; edge detection
    • G06T7/136: Segmentation; edge detection involving thresholding
    • G06V10/26: Segmentation of patterns in the image field
    • G06V10/75: Organisation of the matching processes
    • G06V10/761: Proximity, similarity or dissimilarity measures
    • G06V10/763: Clustering, non-hierarchical techniques
    • G06T2207/30108: Industrial image inspection
    • G06T2207/30124: Fabrics; textile; paper
    • Y02P90/30: Computing systems specially adapted for manufacturing


Abstract

The invention relates to the field of artificial intelligence, and in particular to a machine vision-based method for detecting the impregnation quality of impregnated paper. The method performs key point matching with the SIFT algorithm to obtain the correspondence between the pixel points of the two images; computes a distance sequence for each pixel point in the base paper image and in the impregnated paper image and subtracts them to obtain a distance difference sequence, from which a displacement change value is derived; takes the difference between the gray values of the base paper image and the impregnated paper image as the gray value difference, from which a gray change value is derived; computes the warp of each pixel point from the displacement change value and the gray change value; computes the impregnation uniformity; and detects the impregnation quality from the impregnation uniformity. By analyzing the images with machine vision and detecting the impregnation quality of the impregnation process from the position changes and gray level changes of the pixel points before and after impregnation, the invention improves detection efficiency and avoids the unguaranteed accuracy and the randomness of manual spot inspection.

Description

Impregnated paper impregnation quality detection method based on machine vision
Technical Field
The invention relates to the field of artificial intelligence, in particular to a machine vision-based impregnated paper impregnation quality detection method.
Background
Impregnated paper is a high-quality decorative paper with characteristics such as wear resistance, rich styles, impact resistance, deformation resistance, stain resistance, flame retardance, moisture resistance, environmental friendliness, fastness, simple installation and easy maintenance. Manufacturing impregnated paper requires steps such as impregnation and drying. If some links in the impregnation process are not tightly controlled, the impregnation quality of the impregnated paper is likely to be poor, affecting its usability, so the impregnation quality of the produced impregnated paper needs to be detected.
At present, the impregnation quality of impregnated paper is generally detected by manual spot inspection, which has high labor cost, cannot guarantee accuracy, and, being random, carries a high risk of missed inspection; on the other hand, when existing image processing methods are used for detection, it is difficult to eliminate the interference of the pattern on the impregnated paper with the detection result.
Disclosure of Invention
The invention provides a machine vision-based impregnated paper impregnation quality detection method, aiming to solve the following problems: the existing manual method for detecting impregnation quality has high labor cost and cannot guarantee accuracy, and its random spot inspection gives a high probability of missed detection; meanwhile, when existing image processing methods are used for detection, the patterns on the impregnated paper interfere with the detection result.
The machine vision-based impregnated paper impregnation quality detection method provided by the invention adopts the following technical scheme, comprising:
acquiring a base paper image and an impregnated paper image;
acquiring the matched key point pairs of the base paper image and the impregnated paper image, and obtaining, for each pixel point in the impregnated paper image, the corresponding pixel point in the base paper image;
calculating the Euclidean distance between each pixel point and each key point in the impregnated paper image to obtain the impregnated paper distance sequence of each pixel point in the impregnated paper image;
obtaining the base paper distance sequence of each corresponding pixel point in the base paper image from the Euclidean distance between each pixel point and each key point in the base paper image;
subtracting the base paper distance sequence of each corresponding pixel point in the base paper image from the impregnated paper distance sequence of each pixel point in the impregnated paper image to obtain the distance difference sequence of each pixel point, and taking the value with the largest absolute value in that sequence as the distance difference of the pixel point;
taking the absolute value of the difference between the distance difference of each pixel point and the standard distance difference as the displacement change value;
calculating the gray difference weight of each pixel point from its displacement change value;
taking the difference between the gray value of each pixel point in the impregnated paper image and the gray value of the corresponding pixel point in the base paper image as the gray value difference of the pixel point, and taking the difference between the gray value difference and the standard gray difference as the gray change value;
calculating the warp of each pixel point from its gray change value and displacement change value;
performing sliding window processing on the impregnated paper image to obtain a plurality of regions, determining the warp of each region from the warp of the pixel points within it, and calculating the impregnation uniformity;
and detecting the impregnation quality according to the impregnation uniformity.
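The distance-difference steps above can be sketched as follows; this is an illustrative reconstruction (the function name and the use of plain Python lists are assumptions, not from the patent):

```python
def displacement_variation(dip_seq, base_seq, std_diff):
    """Per-pixel sketch of the claimed steps: subtract the base paper
    distance sequence from the impregnated paper distance sequence,
    take the entry with the largest absolute value as the pixel's
    distance difference, and return the displacement change value
    |distance difference - standard distance difference|."""
    diffs = [a - b for a, b in zip(dip_seq, base_seq)]
    dist_diff = max(diffs, key=abs)          # largest-magnitude entry
    return abs(dist_diff - std_diff)
```

For example, sequences [5, 7, 9] and [4, 5, 6] give differences [1, 2, 3]; with a standard distance difference of 1, the displacement change value is |3 - 1| = 2.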
The corresponding pixel points in the base paper image for the pixel points in the impregnated paper image are obtained as follows:
key points on the base paper image and the impregnated paper image are matched using the SIFT algorithm; for each non-key point in the base paper image, the Euclidean distance to each key point is calculated, giving the key point distance sequence of that non-key point; the 5 smallest values in the sequence are normalized and the other values are reset to 0, giving the position vector of the pixel point; traversing all non-key points in the base paper image in this way gives the position vector of each non-key point on the base paper image;
the position vector of each non-key point in the impregnated paper image is obtained by the same method;
the cosine similarity between the position vector of a pixel point in the impregnated paper image and the position vector of each non-key point in the base paper image is calculated, and the pixel point in the base paper image with the maximum cosine similarity is selected as its corresponding pixel point; traversing all pixel points in the impregnated paper image gives the corresponding pixel point in the base paper image for every pixel point in the impregnated paper image.
The standard distance difference is calculated as follows:
mean shift clustering is performed on the distance difference sequences to obtain a plurality of categories, and the density of each category is calculated as
ρ = n / S
wherein: ρ is the density of the category, n is the number of data points in the category, and S is the variance of the data in the category;
after the density of each category is calculated, the mean of the data in the category with the maximum density is extracted and taken as the standard distance difference.
The weight of each gray difference is calculated as
w_u = f(q_u) / Σ_{j=1}^{l} f(q_j)
wherein: w_u is the weight of the u-th gray difference; f(q) is the displacement change function, evaluated at the u-th displacement change value q_u; u is the sequence number of the gray change value, which is also the sequence number of the displacement change value; l is the number of gray change values, which is also the number of displacement change values, namely the number of pixels of the impregnated paper image.
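A hedged sketch of the weight normalization w_u = f(q_u) / Σ_j f(q_j); the exact expression of the displacement change function f did not survive in the source, so exp() is used here purely as a stand-in monotone choice:

```python
import math

def gray_weights(q):
    """Weights for the gray differences, normalized over all pixels so
    they sum to 1.  f = exp is an assumed placeholder for the patent's
    displacement change function."""
    f = [math.exp(v) for v in q]
    s = sum(f)
    return [v / s for v in f]
```

With equal displacement change values the weights are equal; larger displacement changes receive larger weight under this stand-in.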
The warp of each pixel point is calculated as
h_u = q_u · tanh(g_u)
wherein: h_u is the warp of the pixel point in the impregnated paper image, q_u is the displacement change value of the pixel point, g_u is the gray change value of the pixel point, and u is the sequence number of the pixel point.
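A one-line sketch of the per-pixel warp, assuming the h = q · tanh(g) form implied by the hyperbolic-tangent description in embodiment 2:

```python
import math

def warp_degree(q, g):
    """h_u = q_u * tanh(g_u): the displacement change scaled by the
    gray change compressed into (-1, 1).  The sign of g decides upward
    (positive) vs downward (negative) warp; a small q gives a small
    warp regardless of g."""
    return q * math.tanh(g)
```

So a pixel with no displacement change has zero warp, and a large positive gray change saturates toward +q (a bulge) while a large negative one saturates toward -q (a depression).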
Further, the impregnation uniformity is calculated as follows:
sliding window processing is performed on the impregnated paper image to obtain t regions, and the impregnation uniformity is calculated from the warp of all pixel points and the warp of the regions they belong to as
P = exp( -(1/l) · Σ_{u=1}^{l} ( |h_u| + |h_u - H_{t(u)}| ) )
wherein: P is the impregnation uniformity, l is the number of pixels of the impregnated paper image, h_u is the warp of the u-th pixel point in the impregnated paper image, H_{t(u)} is the warp of the region to which the u-th pixel point belongs, and u is the sequence number of the pixel point.
The warp of each region is obtained as follows:
for each of the t regions obtained by the sliding window processing, the warp of every pixel point in the region is acquired, and the minimum warp among the pixel points of the region is taken as the warp of the region.
The impregnation quality is detected from the impregnation uniformity as follows:
the impregnation uniformity is compared with a set threshold; if it is greater than or equal to the threshold, the impregnation quality is judged qualified, and if it is less than the threshold, the impregnation quality is judged unqualified.
The beneficial effects of the invention are as follows: the images are analyzed with machine vision; whether the impregnated paper warps is judged from the position changes and gray level changes of the pixel points before and after impregnation; sliding window processing is performed on the impregnated paper image, the impregnation uniformity is calculated from the warp of each region, and the impregnation quality of the impregnation process is detected from the impregnation uniformity. This improves detection efficiency and the accuracy of the detection result, while avoiding the unguaranteed accuracy and the randomness of manual spot inspection.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, it being obvious that the drawings in the description below are only some embodiments of the invention, and that other drawings can be obtained according to these drawings without inventive faculty for a person skilled in the art.
FIG. 1 is a system flow diagram of the present invention;
FIG. 2 is a flow chart of the method of embodiment 1 of the present invention;
fig. 3 is a flow chart of the method of embodiment 2 of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Example 1
An embodiment of a machine vision-based impregnated paper impregnation quality detection method of the invention is shown in fig. 1 to 2.
S101, acquiring a base paper image and an impregnated paper image
When products on the production line are photographed, background outside the product is inevitably captured, so the captured images need to be processed; the base paper image and the impregnated paper image are obtained by semantic segmentation, which avoids interference of background pixels with the subsequent analysis.
S102, performing key point matching by using SIFT algorithm
The base paper becomes impregnated paper after impregnation. If the impregnation is uneven, the glue content differs across regions of the impregnated paper, and after the drying procedure the amount of moisture volatilized differs between regions, so shrinkage is uneven and the impregnated paper warps. Image matching is therefore needed to put the pixel points of the base paper image in correspondence with those of the impregnated paper image for analysis.
And performing key point matching on the base paper image and the impregnated paper image through a SIFT algorithm, and then performing further analysis.
S103, obtaining corresponding pixel points corresponding to each pixel point in the impregnated paper image in the base paper image
When the SIFT algorithm performs feature point matching, only a part of pixel points with obvious features, such as corner points, edge points, bright points in a dark area and dark points in a bright area, can be matched, so that corresponding pixel points corresponding to all the pixel points in the impregnated paper image in the base paper image need to be obtained according to the corresponding relation of the key points.
Using the position changes and gray changes of the corresponding pixel points makes the detected impregnation quality more accurate.
S104, acquiring a distance sequence of the base paper image and a distance sequence of the impregnated paper image
And calculating the distance between each pixel point and each key point to respectively obtain the distance sequences of each pixel point in the base paper image and the impregnated paper image, and judging whether the positions of the pixel points before and after impregnation change according to the distance relation of each pixel point in the base paper image and the impregnated paper image so as to judge whether warping exists.
S105, performing difference on the base paper image distance sequence and the impregnated paper image distance sequence of each pixel point to obtain a distance difference sequence
If the warping portion exists in the impregnated paper, the pixel points of the warping portion are displaced compared with the positions of the corresponding pixel points in the base paper, a distance difference sequence is obtained through distance change before and after the impregnation of each pixel point, and the displacement condition of the pixel points is further analyzed through the distance difference before and after the impregnation.
S106, calculating a standard distance difference value
Because the base paper image and the impregnated paper image are shot in different places, the sizes of the two images may differ, so the distance from every pixel point in the impregnated paper image to the key points differs from the distance from the corresponding pixel point in the base paper image to the key points by a roughly constant amount; this common difference is the standard distance difference.
And correcting the displacement of the pixel point through the standard distance difference value, so that the calculated warping degree is more accurate.
S107, calculating the displacement change condition to obtain a displacement change value
And correcting the distance difference sequence according to the standard distance difference value to obtain the displacement change condition of each pixel point, and counting the displacement change condition of each pixel point to obtain a displacement change value.
S108, calculating gray value difference values of corresponding pixel points in the base paper image and the impregnated paper image
The pixel points of a warped portion of the impregnated paper are affected by light reflection, so their brightness differs from that of the non-warped portions; whether a pixel point is warped is therefore further analyzed through the gray difference between the base paper and the impregnated paper.
S109, calculating standard gray scale difference value
Because the photographing environments differ, the overall brightness of the impregnated paper differs from that of the base paper; this difference is called the standard gray difference. The gray difference is corrected with the standard gray difference to obtain the gray change of the pixel points before and after impregnation.
S110, calculating the gray level change condition to obtain a gray level change value
The change of the gray level shows the brightness change of the warping part of the impregnated paper, the gray level difference value is corrected according to the standard gray level difference value to obtain the gray level change condition of each pixel point, and the gray level change condition of each pixel point is counted to obtain the gray level change value.
S111, calculating the warping degree of each pixel point according to the displacement change value and the gray change value
The warping degree reflects the warping degree of the impregnated paper, and the impregnating uniformity of the whole impregnated paper is analyzed by calculating the warping degree of each pixel point, so that the impregnating quality is detected.
S112, calculating impregnation uniformity according to pixel point warping degree of each region
The impregnated paper image is subjected to sliding window treatment to obtain a plurality of areas, and the impregnation uniformity obtained according to the warping degree of each area can better reflect the overall quality of the impregnated paper.
S113, detecting the dipping quality according to the dipping uniformity
When all warps are close to 0, the impregnated paper is not warped and the impregnation uniformity is high; when the warp within a region differs greatly from 0, or the warps differ greatly between regions, the warping is serious and the impregnation uniformity is low. Detecting the quality of the impregnated paper through the impregnation uniformity makes the detection result more accurate.
Example 2
An embodiment of the machine vision-based impregnated paper impregnation quality detection method of the invention is shown in figures 1 and 3.
S201, acquiring a base paper image and an impregnated paper image
When products on the production line are photographed, background outside the product is inevitably captured, so the captured images need to be processed; the base paper image and the impregnated paper image are obtained by semantic segmentation, which avoids interference of background pixels with the subsequent analysis.
This embodiment uses DNN semantic segmentation to identify and segment the targets in the image:
a. the data set used is a set of product images on the conveyor belt, acquired from a top-down view;
b. the pixels to be segmented fall into 2 classes, so the labeling process for the training set labels is: a pixel whose single-channel semantic label is 0 belongs to the background class, and a pixel whose label is 1 belongs to the base paper or the impregnated paper;
c. the task of the network is classification, so the loss function used is the cross entropy loss function.
The image processing process is realized through DNN semantic segmentation, and the base paper image and the impregnated paper image are obtained.
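The labeling scheme and loss in steps a to c can be illustrated with a per-pixel binary cross entropy over a flat mask (a pure-Python stand-in; the patent does not specify the DNN architecture):

```python
import math

def pixel_cross_entropy(pred, label):
    """Per-pixel binary cross entropy over a flattened mask: label 1 =
    base/impregnated paper, label 0 = background, matching the
    annotation scheme.  `pred` holds the predicted probability of
    class 1 for each pixel."""
    eps = 1e-12   # avoids log(0) for saturated predictions
    terms = [l * math.log(p + eps) + (1 - l) * math.log(1 - p + eps)
             for p, l in zip(pred, label)]
    return -sum(terms) / len(terms)
```

A perfect prediction gives a loss near 0, while an uninformative 0.5 prediction gives ln 2 per pixel.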
S202, performing key point matching by using SIFT algorithm
The base paper becomes impregnated paper after impregnation. If the impregnation is uneven, the glue content differs across regions of the impregnated paper, and after the drying procedure the amount of moisture volatilized differs between regions, so shrinkage is uneven and the impregnated paper warps. Image matching is therefore needed to put the pixel points of the base paper image in correspondence with those of the impregnated paper image for analysis.
The base paper image A becomes the impregnated paper image A' after impregnation. To measure the change of the paper before and after impregnation, the pixel points on A need to be put in correspondence with the pixel points on A'. Key point matching is performed on images A and A' using the SIFT algorithm, giving key points {B_1, B_2, …, B_k} on A and the one-to-one corresponding key points {B'_1, B'_2, …, B'_k} on A': k matched key points on A and k corresponding key points on A'.
Combining the matched key point pairs, the non-key points on images A and A' are matched. For a non-key pixel point, the Euclidean distance from that pixel to each key point on image A is calculated, giving the key point distance sequence of the i-th pixel. When analyzing this sequence, note that if the impregnated paper is impregnated unevenly, warping may occur and displace pixel points; the pixels of a single warped region are displaced as a whole, so the relative positions of the pixels within the region are unchanged, i.e., the pixels have local position invariance. Therefore the 5 smallest distances in the key point distance sequence of the i-th pixel are normalized and the remaining distance values are set to 0, yielding a vector [d_i1, d_i2, …, d_ik], called the position vector of the i-th pixel point and used to represent its position. The position vector of each non-key point on the base paper image A is calculated in this way.
Similarly, the position vector of each non-key point on the impregnated paper image A' is obtained.
S203, obtaining corresponding pixel points corresponding to each pixel point in the impregnated paper image in the base paper image
When the SIFT algorithm is used for matching the characteristic points, only a part of pixel points with obvious characteristics, such as corner points, edge points, bright points in a dark area and dark points in a bright area, can be matched, so that each pixel point in the impregnated paper image and the corresponding pixel point in the base paper image need to be determined according to the corresponding relation of the key points.
The cosine similarity between the position vector of the i-th pixel point in the impregnated paper image A' and the position vector of each non-key point in the base paper image A is calculated, and the pixel point in A with the maximum cosine similarity is taken as the pixel corresponding to the i-th pixel point in A'. After obtaining the corresponding pixel in A for every non-key pixel in A', and combining the matched key point pairs, the correspondence between the pixel points of A' and A is obtained: all pixel points {C'_1, C'_2, …, C'_l} in the impregnated paper image A' correspond one-to-one with the pixel points {C_1, C_2, …, C_l} in the base paper image A.
Using the position changes and gray changes of the corresponding pixel points makes the detected impregnation quality more accurate.
S204, calculating the warping degree of each pixel point according to the displacement change value and the gray change value of each pixel point
The warping degree reflects the warping degree of the impregnated paper, and the impregnating uniformity of the whole impregnated paper is analyzed by calculating the warping degree of each pixel point, so that the impregnating quality is detected.
The warp of each pixel point is calculated as
h_u = q_u · tanh(g_u)
wherein: h_u is the warp of the pixel point in the impregnated paper image, q_u is the displacement change value of the pixel point, g_u is the gray change value of the pixel point, and u is the sequence number of the pixel point.
tanh() is the hyperbolic tangent function, used to compress the gray change value g_u to between -1 and 1. The gray value of a pixel point is affected by noise, so the warp gray difference also contains noise, which introduces error into the warp; compressing g_u to between -1 and 1 reduces the proportion of the gray difference in the warp calculation and thus reduces the error. When the displacement difference of a pixel point is large and the warp gray difference is positive, the region containing the pixel warps upwards, i.e., bulges; when the displacement difference is large and the warp gray difference is negative, the region warps downwards, i.e., is concave; and when the displacement difference of the pixel point is small, the warp of its region is small.
S205, calculating displacement change condition to obtain displacement change value
And correcting the distance difference sequence according to the standard distance difference value to obtain the displacement variation value of each pixel point.
If a warped portion exists in the impregnated paper image A′, the pixel points of the warped portion are displaced relative to the positions of the corresponding pixel points in the base paper image A, and the distance from a pixel point of the warped portion to the key point B′ differs from the distance of the corresponding pixel point in the base paper image A to the key point B by a displacement difference in addition to the standard distance difference.
And carrying out difference on each value in the distance difference sequence and the standard distance difference value, and correcting each distance difference in the distance difference sequence to obtain a displacement variation value of each pixel point.
S206, acquiring a distance sequence of the base paper image and a distance sequence of the impregnated paper image
And calculating the distance between each pixel point and each key point to respectively obtain a distance sequence of each pixel point in the base paper image and the impregnated paper image, and judging whether the positions of the pixel points before and after impregnation change according to the distance relation of the corresponding pixel point pairs so as to judge whether warping exists.
For the pixel points {C1, C2, …, Cl} in the base paper image, the Euclidean distances from a given pixel point to all key points are calculated to obtain the base paper distance sequence of that pixel point in the base paper image A; traversing all pixel points in the base paper image in this way gives the base paper distance sequence of each pixel point. For the pixel points {C′1, C′2, …, C′l} in the impregnated paper image, the Euclidean distances from a given pixel point to all key points are calculated to obtain the impregnated paper distance sequence of that pixel point in the impregnated paper image A′; traversing all pixel points in the impregnated paper image in this way gives the impregnated paper distance sequence of each pixel point.
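The distance sequences are a plain pairwise Euclidean distance computation; a vectorized sketch (the function name is mine):

```python
import numpy as np

def distance_sequences(pixels, keypoints):
    """Euclidean distance from every pixel point to every key point.

    pixels: (n, 2) array, keypoints: (m, 2) array.
    Returns an (n, m) array; row u is the distance sequence of pixel u.
    """
    diff = pixels[:, None, :] - keypoints[None, :, :]
    return np.linalg.norm(diff, axis=2)
```

Applied once to the base paper image and once to the impregnated paper image, this yields the two distance sequences used in S207.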
In this embodiment, 10 key points are selected, and the 10 key points on the base paper image and on the impregnated paper image are traversed according to the above method, respectively.
S207, performing difference on the base paper image distance sequence and the impregnated paper image distance sequence to obtain a distance difference sequence
If the warping portion exists in the impregnated paper, the pixel points of the warping portion are displaced compared with the positions of the corresponding pixel points in the base paper, a distance difference sequence is obtained through distance change before and after the impregnation of each pixel point, and the displacement condition of the pixel points is further analyzed through the distance difference before and after the impregnation.
If the warping portion exists in the impregnated paper image A ', the pixel point of the warping portion is displaced compared with the position of the corresponding pixel point in the base paper image A, and the displacement difference exists between the distance from the pixel point of the warping portion to the key point B' and the distance from the corresponding pixel point in the base paper image A to the key point B.
10 base paper distance sequences and 10 impregnated paper distance sequences of each pixel point can be obtained through S206, so that 10 distance differences can be obtained for each pixel point, the distance difference with the largest absolute value is taken as the distance difference of the pixel point pair, and the distance differences of all the pixel points are counted to obtain a distance difference sequence.
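S207's difference-taking and S205's correction can be sketched together. The functions and names are illustrative; the subtraction order (impregnated minus base, per claim 1) and the maximum-absolute-value selection follow the text:

```python
import numpy as np

def distance_difference(seq_dip, seq_base):
    """S207: subtract the base-paper distance sequence from the
    impregnated-paper one and keep, for each pixel point, the
    difference with the largest absolute value."""
    d = seq_dip - seq_base
    idx = np.argmax(np.abs(d), axis=1)
    return d[np.arange(d.shape[0]), idx]

def displacement_variation(dist_diff, standard_diff):
    """S205 / claim 1: displacement variation value =
    |distance difference - standard distance difference|."""
    return np.abs(dist_diff - standard_diff)
```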
S208, calculating standard distance difference
Because the base paper image and the impregnated paper image are shot in different places, the sizes of the shot images may differ. As a result, the distance from every pixel point in the impregnated paper image to the key points differs from the distance of the corresponding pixel point in the base paper image to the key points, and this difference is essentially consistent across pixel points; this difference is the standard distance difference value.
Performing mean shift clustering on the distance difference sequences to obtain m categories, counting the data amount in each category, calculating the variance of the data in the category, and calculating the density of the category according to the data amount of each category and the data variance in the category, wherein the calculation formula is as follows:
wherein: ρ is the density of each category, n is the amount of data in that category, and S is the variance of the data in that category.
The greater the density ρ of a category, the more data the category contains and the more concentrated that data is. For the impregnated paper image, the non-warped area contains more pixel points than the warped areas, so among the differences between the distances from pixel points to the key point B′ and the distances from the corresponding pixel points in the base paper image A to the key point B, the category with the maximum density ρ is taken; the mean of the data contained in that category is calculated and used as the standard distance difference value.
And correcting the displacement of the pixel point through the standard distance difference value, so that the calculated warping degree is more accurate.
S209, calculating the gray level change condition to obtain a gray level change value
The change of gray level shows the brightness change of the warped portion of the impregnated paper. The gray difference value is corrected according to the standard gray difference value to obtain the gray change condition of each pixel point, yielding the gray change value of each pixel point.
And (3) carrying out difference on each gray level difference value and the standard gray level difference value to obtain a gray level change value of each pixel point.
S210, calculating gray value difference values of corresponding pixel points in the base paper image and the impregnated paper image
The pixel points of the warped portion of the impregnated paper are affected by light reflection, so their brightness differs from that of the pixel points of the non-warped portion; whether a pixel point warps is further analyzed through the gray difference between the base paper and the impregnated paper.
The gray values of the pixel points {C1, C2, …, Cl} in the base paper image A and of the corresponding pixel points {C′1, C′2, …, C′l} in the impregnated paper image A′ are obtained, and the gray values of all pixel point pairs are differenced to obtain the gray value difference values.
S211, calculating standard gray scale difference value
Because of different photographing environments, the overall brightness of the impregnated paper is different from the overall brightness of the base paper, the difference is called a standard gray scale difference value, and the gray scale difference value is corrected according to the standard gray scale difference value to obtain gray scale change conditions of pixel points before and after impregnation.
The pixel points of the warped portion of the impregnated paper are affected by reflection of light, so their brightness differs from that of the pixel points of the non-warped portion. If the area where a pixel point is located is not warped, the gray difference obtained in S210 only contains the standard gray difference; if the area is warped, the gray difference obtained in S210 contains not only the gray difference caused by warping but also the standard gray difference.
The displacement variation value can reflect to a certain extent whether each position is warped, so a weight w is set for each gray difference value in combination with the displacement variation value.
The weight of each gray scale difference is calculated as follows:
wherein: w is the weight of the gray scale difference value; f (q) is a displacement variation function; u is the sequence number of the gray level change value and is also the sequence number of the displacement change value; l is the number of gray level variation values and also the number of displacement variation values, namely the number of pixels of the impregnated paper image;
wherein, the expression of the displacement change function is:
wherein: q is the displacement variation value, and q_u is the u-th displacement variation value.
The gray difference values of the pixel points are weighted and summed; the result is the standard gray difference value.
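The weight formula image is absent from the text. Under the stated intent — pixels with small displacement (likely non-warped) should dominate the estimate of the photographing-environment offset — one plausible choice, which is an assumption on my part, is f(q) = exp(−q) with weights normalized to sum to 1:

```python
import numpy as np

def standard_gray_difference(gray_diffs, disp_vars):
    """Weighted mean of the per-pixel gray differences (S211 sketch).

    Assumption: f(q) = exp(-q) down-weights displaced (likely warped)
    pixels; w_u = f(q_u) / sum_v f(q_v) so the weights sum to 1.
    """
    f = np.exp(-np.asarray(disp_vars, float))
    w = f / f.sum()
    return float(np.sum(w * np.asarray(gray_diffs, float)))
```

When all displacement variations are equal the estimate reduces to the plain mean; pixels with large displacement contribute almost nothing.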
S212, calculating impregnation uniformity according to pixel point warping degree of each region
The impregnated paper image is subjected to sliding window treatment to obtain a plurality of areas, and the impregnation uniformity obtained according to the warping degree of each area can better reflect the overall quality of the impregnated paper.
Sliding-window processing with a 3×3 window is performed on the impregnated paper image to obtain t regions, and the impregnation uniformity is calculated from the warp degree of all pixel points and the warp degree of the regions to which they belong; the calculation formula is as follows:
wherein: p is the impregnation uniformity, l is the number of pixel points of the impregnated paper image, h_u is the warp degree of the u-th pixel point in the impregnated paper image, H_t(u) is the warp degree of the region to which the u-th pixel point belongs, and u is the serial number of the pixel point.
The method for obtaining the warpage of each region comprises the following steps:
For the t regions obtained by sliding-window processing, the warp degrees of the pixel points of each region are acquired respectively; the warp degree of the pixel point with the minimum warp degree in each region is extracted and taken as the warp degree of that region. Namely: the minimum values {H_1-min, H_2-min, …} of the pixel point warp degrees in the regions are obtained, wherein H_1-min denotes the warp degree of the 1st region.
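A sketch of S212 follows. The windowing and the "region warp = minimum pixel warp" rule are from the text; since the uniformity formula image is missing, the score p = exp(−mean(|h_u| + |h_u − H_region(u)|)) is an assumed form that matches the stated behavior (p → 1 when every warp is near 0, smaller when warps are large or differ within a region):

```python
import numpy as np

def impregnation_uniformity(warp_map, win=3):
    """Split the warp map into win x win regions, take each region's
    minimum warp as the region warp, then score overall uniformity."""
    h, w = warp_map.shape
    region_warp = np.empty_like(warp_map)
    for i in range(0, h, win):
        for j in range(0, w, win):
            block = warp_map[i:i + win, j:j + win]
            # region warp = warp of the pixel with minimum warp (S212)
            region_warp[i:i + win, j:j + win] = block.min()
    # Assumed scoring, since the formula image is not in the text.
    score = np.mean(np.abs(warp_map) + np.abs(warp_map - region_warp))
    return float(np.exp(-score))
```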
S213, detecting the dipping quality according to the dipping uniformity
When all the warping degrees are close to 0, the impregnated paper is not warped, and the impregnation uniformity is high; when the difference between the warpage in the region and 0 is large or the difference between the warpage in the region is large, the warpage is serious, and the impregnation uniformity is small. The quality of the impregnated paper is detected through the impregnation uniformity, and the detection result is more accurate.
According to the impregnation uniformity, the impregnation quality is evaluated:
if p is more than or equal to alpha, the impregnation is uniform, and the impregnation quality is qualified;
if p < alpha, the impregnation is uneven, and the impregnation quality is unqualified;
where α is a set threshold, given manually, the empirical value is 0.9.
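The final decision in S213 is a simple threshold comparison, shown here with the text's empirical α = 0.9:

```python
def impregnation_quality(p, alpha=0.9):
    """S213: impregnation is uniform and qualified when p >= alpha,
    otherwise non-uniform and unqualified (alpha = 0.9 per the text)."""
    return "qualified" if p >= alpha else "unqualified"
```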
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.

Claims (6)

1. A method for detecting the impregnation quality of impregnated paper, characterized in that the method comprises the following steps:
acquiring a base paper image and an impregnated paper image;
acquiring matched key point pairs of the base paper image and the impregnated paper image; obtaining, for each pixel point in the impregnated paper image, the corresponding pixel point in the base paper image;
calculating Euclidean distance between each pixel point and each key point in the impregnated paper image to obtain an impregnated paper distance sequence of each pixel point in the impregnated paper image;
obtaining a base paper distance sequence of each corresponding pixel point in the base paper image according to Euclidean distance between each pixel point and each key point in the obtained base paper image;
performing difference between the impregnated paper distance sequence of each pixel point in the impregnated paper image and the base paper distance sequence of each corresponding pixel point in the base paper image to obtain a distance difference sequence of each pixel point, and taking the difference with the largest absolute value in the difference sequence obtained for each pixel point as the distance difference of that pixel point;
taking the absolute value of the difference between the distance difference of each pixel point and the standard distance difference as a displacement variation value;
calculating gray scale difference weight of each pixel point according to the displacement change value of the pixel point;
obtaining the gray value difference of each pixel point by differencing the gray value of each pixel point in the impregnated paper image and the gray value of the corresponding pixel point in the base paper image, and taking the difference between the gray value difference of each pixel point and the standard gray value difference as the gray change value;
calculating the warping degree of each pixel point by using the gray level change value and the displacement change value of each pixel point;
carrying out sliding-window processing on the impregnated paper image to obtain a plurality of regions, and calculating the impregnation uniformity according to the warp degree determined for each region, wherein the warp degree of each region is determined by the warp degrees of the pixel points in that region;
detecting the impregnation quality according to the impregnation uniformity;
the standard distance difference value calculating method comprises the following steps:
performing mean shift clustering on the distance difference sequences to obtain a plurality of categories, and calculating the density of each category, wherein the calculation formula is as follows:
wherein: ρ is the density of each category, n is the amount of data for that category, and S is the variance of the data in that category;
calculating to obtain the density of each category, extracting the data average value of the category corresponding to the maximum density, and taking the average value as a standard distance difference value;
the calculation formula of the weight of each gray level difference is as follows:
wherein: w is the gray difference weight; f(q) is a displacement variation function; u is the sequence number of the gray change value and also the sequence number of the displacement variation value; l is the number of gray change values and also the number of displacement variation values, namely the number of pixel points of the impregnated paper image;
wherein, the expression of the displacement change function is:
wherein: q is the displacement variation value, and q_u is the u-th displacement variation value;
and carrying out weighted summation on the gray difference value corresponding to each pixel point according to the gray difference weight to obtain a standard gray value difference value.
2. The impregnation quality detection method of impregnated paper according to claim 1, wherein: the method for obtaining the corresponding pixel points corresponding to the pixel points in the impregnated paper image in the base paper image comprises the following steps:
matching key points on the base paper image and the impregnated paper image by using a SIFT algorithm, calculating Euclidean distance from each non-key point in the base paper image to each key point to obtain a key point distance sequence of each non-key point, carrying out normalization processing on the minimum 5 values in the key point distance sequence, resetting other values to 0 to obtain a position vector of a pixel point, and traversing all the non-key points in the base paper image by using the method to obtain the position vector of each non-key point on the base paper image;
the same method obtains the position vector of each non-key point in the impregnated paper image;
and calculating cosine similarity between a certain pixel point position vector in the impregnated paper image and each non-key point position vector in the base paper image, selecting a pixel point in the base paper image with the maximum cosine similarity to correspond to the pixel point, and traversing all pixel points in the impregnated paper image to obtain corresponding pixel points corresponding to all pixel points in the impregnated paper image in the base paper image.
3. The impregnation quality detection method of impregnated paper according to claim 1, wherein: the calculation formula of the warping degree of each pixel point is as follows:
wherein: h is the warp of the pixel point in the impregnated paper image, q is the displacement variation value of the pixel point, g is the gray scale variation value of the pixel point, and u is the serial number of the pixel point.
4. The impregnation quality detection method of impregnated paper according to claim 1, wherein: the method for calculating the impregnation uniformity comprises the following steps:
carrying out sliding window processing on the impregnated paper image to obtain t areas, and calculating impregnation uniformity according to the warping degree of all pixel points and the warping degree of the area to which the pixel points belong, wherein the calculation formula is as follows:
wherein: p is the impregnation uniformity, l is the number of pixel points of the impregnated paper image, h_u is the warp degree of the u-th pixel point in the impregnated paper image, H_t(u) is the warp degree of the region to which the u-th pixel point belongs, u is the serial number of the pixel point, exp is an exponential function with the natural constant as base, max is a maximum value selection function, and min is a minimum value selection function.
5. The method for detecting the impregnation quality of impregnated paper according to claim 4, wherein: the method for obtaining the warpage of each region comprises the following steps:
and respectively acquiring the warp of the pixel points of each region for the t regions obtained by the sliding window processing, extracting the warp of the pixel point with the minimum warp of the pixel points of each region, and taking the warp as the warp of the region.
6. The impregnation quality detection method of impregnated paper according to claim 1, wherein: the method for detecting the impregnation quality according to the impregnation uniformity comprises the following steps:
comparing the soaking uniformity with a set threshold, if the soaking uniformity is greater than or equal to the set threshold, determining that the soaking quality detection result is qualified, and if the soaking uniformity is less than the set threshold, determining that the soaking quality detection result is unqualified.
CN202310623590.2A 2023-05-29 2023-05-29 Impregnated paper impregnation quality detection method based on machine vision Active CN116721067B (en)

Publications (2)

Publication Number Publication Date
CN116721067A CN116721067A (en) 2023-09-08
CN116721067B true CN116721067B (en) 2024-04-12



