CN114219794B - Method and system for evaluating surface quality of shaving board based on machine vision - Google Patents
- Publication number
- CN114219794B CN114219794B CN202111554577.3A CN202111554577A CN114219794B CN 114219794 B CN114219794 B CN 114219794B CN 202111554577 A CN202111554577 A CN 202111554577A CN 114219794 B CN114219794 B CN 114219794B
- Authority
- CN
- China
- Prior art keywords
- point
- connecting line
- gray
- pixel
- shaving
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
Abstract
The invention relates to the technical field of machine vision, and in particular to a method and a system for evaluating the surface quality of a shaving board based on machine vision. The method comprises the following steps: matching pixel points based on the surface image of the shaving board to obtain a plurality of point pairs, the two pixel points of each pair lying respectively on a pair of opposite edges of a shaving; acquiring the connecting-line direction, connecting-line gray value and connecting-line distance of each point pair, where the connecting-line gray value is the mean gray value of the pixels that the pair's connecting line passes through; sliding a window over the surface image and, for each window, grouping the point pairs in the window by connecting-line direction and connecting-line gray value, so that the pairs in each group belong to the same shaving, then calculating the rough-planing degree of the pixels in the window from the number of pairs in each group and their connecting-line distances; and identifying and locating large-shaving defects based on the rough-planing degree of each pixel in the surface image. The defect detection result of the invention is not easily disturbed by the complex texture of the shaving board surface.
Description
Technical Field
The invention relates to the field of machine vision, in particular to a method and a system for evaluating the surface quality of a shaving board based on machine vision.
Background
At present, the shaving board is the main substrate for producing furniture panels; finished panels are made by secondary surface processing such as overlaying with melamine-impregnated paper. To reduce cost, the grammage of the melamine-impregnated paper keeps being lowered, and its covering power drops with it, so the show-through phenomenon easily occurs when the shaving board substrate is overlaid: coarse shavings of the substrate remain visible through the paper, which spoils the visual effect and lowers the appearance quality of the product.
Besides the reduced covering power of the paper, the surface quality of the shaving board substrate itself, i.e. the presence of coarse shavings on its surface, is another important factor affecting overlay quality. Evaluating the substrate quality in time before the overlaying process therefore effectively reduces the appearance defect rate of finished products. Since the surface quality defects of the substrate are mainly caused by large shavings on the surface, the large shavings need to be detected.
In the prior art, large shavings are detected mainly by manual visual inspection, artificial neural networks, or traditional machine vision. Manual visual inspection is inefficient and error-prone; artificial neural networks require a large amount of training data, which is costly to acquire; and traditional machine vision generally segments defects by a mean-variance method, which fails on shaving board substrates with complex textures and shavings of varied colour.
Disclosure of Invention
In order to solve the above technical problems, the present invention provides a method and a system for evaluating the surface quality of a shaving board based on machine vision, adopting the following technical scheme:
in a first aspect, an embodiment of the present invention provides a method for evaluating the surface quality of a shaving board based on machine vision, which includes the following specific steps:
matching pixel points based on the surface image of the shaving board to obtain a plurality of point pairs, wherein the two pixel points of each pair lie respectively on a pair of opposite edges of a shaving; acquiring the connecting-line direction, connecting-line gray value and connecting-line distance of each point pair; taking the mean gray value of the pixels that the pair's connecting line passes through as the connecting-line gray value;
sliding a window over the surface image of the shaving board; for each window, grouping the point pairs in the window according to their connecting-line direction and connecting-line gray value, so that the pairs in each group belong to the same shaving, and calculating the rough-planing degree of the pixels in the window from the number of pairs in each group and their connecting-line distances;
and identifying and locating large-shaving defects based on the rough-planing degree of each pixel in the surface image of the shaving board.
Further, for each window, the point pairs in the window are grouped according to the connecting-line direction and connecting-line gray value of each pair, specifically:
for each window, acquiring the connecting-line direction level and connecting-line gray-value level corresponding to each pair's connecting-line direction and gray value; constructing a gray-direction co-occurrence matrix for grouping the point pairs, with the direction level and the gray-value level as its two dimensions; the element at row α, column β of the matrix is the ratio p(α, β) of the number of pairs in the window whose direction level is α and gray-value level is β to the total number of pairs in the window.
Further, the rough-planing degree of the pixels in the window is calculated from the number of point pairs in each group and their connecting-line distances, specifically:
for each window, obtaining a distance index d(α, β) from the mean connecting-line distance of the pairs whose direction level is α and gray-value level is β in the window's gray-direction co-occurrence matrix;
R denotes the rough-planing degree of the pixels in the window, and A and B denote the number of connecting-line direction levels and connecting-line gray-value levels, respectively.
Further, if the mean connecting-line distance is smaller than the distance threshold, the distance index is the ratio of the mean to the threshold; otherwise the distance index is a preset value.
Further, if a pixel corresponds to several rough-planing degrees, the large-shaving defects are identified and located based on the mean of these rough-planing degrees.
Further, the obtaining of the point pair specifically includes:
for each pixel point in the surface image of the shaving board, dividing its neighborhood into two sub-regions by the line through the pixel perpendicular to its gradient direction; the mean gray value of the sub-region with the smaller gray variance is the pixel's gray index value, and the direction from the pixel toward that sub-region along the gradient line is the shaving-interior direction of the pixel;
classifying the pixels of the surface image by gray index value; for each category, taking each pixel in the category in turn as the target pixel, the pixels in the category whose shaving-interior direction is opposite to that of the target constituting the target's candidate matching point set; if the candidate matching set of a point in this set does not include the target pixel, removing that point, the remaining points each forming a candidate point pair with the target pixel; the candidate pairs of all pixels in the category forming a candidate pair set, from which point pairs are selected based on the connecting-line distance, wherein no pixel appears in more than one selected pair.
Further, selecting a point pair from the candidate point pair set based on the connecting line distance specifically includes:
sorting the point pairs in the candidate pair set in ascending order of connecting-line distance; the pair with the smallest distance is the selected pair; removing from the candidate set every pair that contains a pixel of the selected pair; taking the pair with the smallest distance among the remaining pairs as the next selected pair; and iterating the selection until no pair remains in the candidate set.
In a second aspect, another embodiment of the invention provides a machine vision-based particle board surface quality assessment system, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the machine vision-based particle board surface quality assessment method.
The embodiments of the invention have at least the following beneficial effects: the rough-planing degree of the pixels in each window is calculated from the number of point pairs in each group within the window and their connecting-line distances; large shavings are then located based on the rough-planing degree of each pixel in the surface image of the shaving board. The defect detection result of the invention is not easily disturbed by the complex texture of the shaving board surface, so large-shaving defects can be identified and located even on boards with complex surface texture.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention; other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a flowchart illustrating steps according to an embodiment of the present invention.
Detailed Description
To further illustrate the technical means adopted by the present invention to achieve its predetermined objects and their effects, the following detailed description of the embodiments, structures, features and effects of the method and system for evaluating the surface quality of a shaving board based on machine vision according to the present invention is given with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following application scenarios are taken as examples to illustrate the present invention:
Application scenario: in furniture-panel production, the shaving board is used as the substrate and undergoes secondary surface processing such as overlaying; if the surface of the substrate has defects (mainly large-shaving defects) before the overlaying process, overlay defects such as show-through easily occur after overlaying. The surface quality of the substrate therefore needs to be inspected before overlaying, so as to reduce the defect rate of finished overlaid products. In the embodiment of the invention, a camera is arranged before the feed port of the overlaying line to acquire images of the substrate surface; the acquired images are processed to obtain a surface quality evaluation of the current substrate, and, if the surface quality is unqualified, the approximate defect regions on the substrate surface are marked.
The specific scheme of the method and the system for evaluating the surface quality of the shaving board based on the machine vision provided by the invention is specifically described below by combining the attached drawings.
Referring to fig. 1, a flow chart illustrating steps of a method for evaluating a surface quality of a particleboard according to an embodiment of the present invention, the method comprising the steps of:
matching pixel points based on the surface image of the shaving board to obtain a plurality of point pairs, wherein the two pixel points of each pair lie respectively on a pair of opposite edges of a shaving; acquiring the connecting-line direction, connecting-line gray value and connecting-line distance of each point pair; the mean gray value of the pixels that the pair's connecting line passes through is the connecting-line gray value;
sliding a window over the surface image of the shaving board; for each window, grouping the point pairs in the window according to their connecting-line direction and connecting-line gray value, so that the pairs in each group belong to the same shaving, and calculating the rough-planing degree of the pixels in the window from the number of pairs in each group and their connecting-line distances;
and identifying and locating large-shaving defects based on the rough-planing degree of each pixel in the surface image of the shaving board.
The following steps are specifically developed:
step S1, carrying out pixel point matching based on a surface image of a shaving board to obtain a plurality of point pairs, wherein pixel points in the point pairs are respectively positioned on a group of opposite sides of the shaving board; acquiring the connection direction, connection gray value and connection distance of each point pair; and the gray average value of the point pair connecting line passing through the pixels is the gray value of the connecting line.
(a) The gradient direction of each pixel point in the surface image of the shaving board is acquired. Specifically, the surface image is converted to a gray-scale map and the gradient direction of each pixel is computed from it; in this embodiment the gradient direction is preferably obtained with the Sobel operator.
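As a sketch of step (a), the per-pixel gradient direction can be computed with 3x3 Sobel kernels. The patent does not give an implementation; the function name and the edge-padding choice below are assumptions for illustration.

```python
import numpy as np

def sobel_gradient_direction(gray):
    """Per-pixel gradient direction (radians) of a grayscale image
    using 3x3 Sobel kernels; the border is handled by edge padding."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T  # vertical Sobel kernel
    padded = np.pad(gray.astype(float), 1, mode="edge")
    h, w = gray.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + 3, j:j + 3]
            gx[i, j] = (patch * kx).sum()  # horizontal gradient
            gy[i, j] = (patch * ky).sum()  # vertical gradient
    return np.arctan2(gy, gx)
```

For a vertical gray-value step, the returned direction is 0 (pointing along the columns toward the brighter side), which matches the intuition that the gradient at a shaving edge points across the edge.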
(b) For a pixel point in the surface image, if it is an edge point of some shaving, the straight line along its gradient direction passes through both the interior and the exterior of that shaving; by prior knowledge, the pixels inside a shaving have uniform gray values, while the points outside it lie inside other, arbitrary shavings. A single shaving is produced by cutting wood, and is characteristically a slender veneer strip with two irregular ends and long sides that are roughly parallel to each other. Combining these shaving characteristics, pixel matching is therefore performed based on the gradient direction of the pixels to obtain a plurality of point pairs, specifically:
For each pixel point in the surface image, the neighborhood of the pixel is divided into two sub-regions by the line through the pixel perpendicular to its gradient direction; in this embodiment the neighborhood is preferably the 8-neighborhood. The mean gray value of the sub-region with the smaller gray variance is the pixel's gray index value, and the direction from the pixel toward that sub-region along the gradient line is the shaving-interior direction of the pixel. The pixels of the surface image are then classified by gray index value; specifically, pixels whose gray index values are equal, or lie within a preset index-value range of one another, are put into one category. For each category, each pixel in the category is taken in turn as the target pixel, and the pixels in the category whose shaving-interior direction is opposite, or approximately opposite, to that of the target form the target's candidate matching point set; if the candidate matching set of a point in this set does not include the target pixel, that point is removed, and each remaining point forms a candidate point pair with the target pixel. The candidate pairs of all pixels in the category form a candidate pair set, from which point pairs are selected based on connecting-line distance, such that no pixel appears in more than one selected pair.
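The 8-neighborhood split can be sketched as follows: neighbors are assigned to one of two sub-regions by projecting their offset onto the gradient direction, and the lower-variance side is taken as the shaving interior. This is an illustrative reading of the embodiment; the function name, the tie-breaking of neighbors lying exactly on the perpendicular, and the assumption that (i, j) is an interior pixel are all choices made here, not given by the patent.

```python
import numpy as np

def gray_index_and_inner_sign(gray, i, j, theta):
    """Split the 8-neighborhood of pixel (i, j) into two sub-regions by the
    perpendicular of the gradient direction theta; return the gray index
    value (mean gray of the lower-variance side, taken as the shaving
    interior) and the inner-direction sign (+1 along +gradient, -1 against).
    Assumes (i, j) is not on the image border."""
    cos_t, sin_t = np.cos(theta), np.sin(theta)
    pos, neg = [], []
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == 0 and dj == 0:
                continue
            # Project the neighbor offset onto the gradient direction
            # (columns ~ x axis, rows ~ y axis, matching arctan2(gy, gx)).
            proj = dj * cos_t + di * sin_t
            (pos if proj > 0 else neg).append(float(gray[i + di, j + dj]))
    if np.var(pos) <= np.var(neg):
        return float(np.mean(pos)), +1
    return float(np.mean(neg)), -1
```

With a uniform region on one side of the pixel and noisy values on the other, the function returns the uniform side's mean as the gray index and a sign pointing toward it.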
Specifically, the point-pair selection proceeds as follows: the point pairs in the candidate pair set are sorted in ascending order of connecting-line distance; the pair with the smallest distance is the selected pair; every pair in the candidate set that contains a pixel of the selected pair is removed (the selected pair itself included); the pair with the smallest distance among the remaining pairs is taken as the next selected pair; and the selection is iterated until no pair remains in the candidate set.
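The iterative selection above is a greedy one-to-one matching and can be sketched compactly; pixels are assumed here to be (row, col) tuples, and the function name is illustrative.

```python
from math import hypot

def select_point_pairs(candidate_pairs):
    """Greedy one-to-one selection: repeatedly take the candidate pair with
    the smallest connecting-line (Euclidean) distance, then discard every
    remaining candidate that shares a pixel with an already selected pair."""
    def line_dist(pair):
        (r0, c0), (r1, c1) = pair
        return hypot(r1 - r0, c1 - c0)

    chosen, used = [], set()
    for p, q in sorted(candidate_pairs, key=line_dist):
        if p in used or q in used:
            continue  # this pair shares a pixel with a selected pair
        chosen.append((p, q))
        used.update((p, q))
    return chosen
```

Iterating over the pre-sorted list and skipping pairs with used pixels is equivalent to the patent's "remove, then take the minimum of the remainder" loop, since removal never changes the relative order of the surviving pairs.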
If a pixel point in the surface image is an edge point of a shaving, the sub-region with the smaller gray variance at that pixel is the interior of the shaving to which the pixel belongs; furthermore, edge points of the same shaving have equal or similar gray index values, so after the pixels of the surface image are classified by gray index value, each resulting category contains the edge pixels of some shaving. In addition, if a pixel is an edge point of a shaving, its shaving-interior direction is perpendicular to the edge it lies on and points into the interior of the shaving.
It should be noted that the point-pair selection described above is performed per category: each category corresponds to one candidate pair set, pairs are selected within each set, and all selected pairs together form the final plurality of point pairs. Because the shaving-interior direction is taken into account when forming the pairs, the two pixel points of a pair lie respectively on a pair of opposite edges of the same shaving.
(c) The connecting-line direction, connecting-line gray value and connecting-line distance of each point pair are acquired, specifically: the direction of a pair's connecting line is represented by the angle between the connecting line and a preset direction, which may be the column direction or the row direction of the image; the mean gray value of the pixels that the connecting line passes through is the connecting-line gray value; and the Euclidean distance between the two pixel points of the pair is the connecting-line distance.
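A sketch of step (c), with the row axis chosen as the preset direction and the connecting line rasterised by linear interpolation (the patent does not specify how pixels along the line are sampled, so this sampling scheme and the function name are assumptions):

```python
import numpy as np

def pair_features(gray, p, q):
    """Connecting-line direction (degrees vs. the row axis, folded into
    [0, 180)), connecting-line gray value (mean gray along the segment)
    and connecting-line distance (Euclidean) for a point pair p, q
    given as (row, col) coordinates."""
    (r0, c0), (r1, c1) = p, q
    # sample one point per pixel step along the longer axis
    n = int(max(abs(r1 - r0), abs(c1 - c0))) + 1
    rows = np.linspace(r0, r1, n).round().astype(int)
    cols = np.linspace(c0, c1, n).round().astype(int)
    direction = np.degrees(np.arctan2(c1 - c0, r1 - r0)) % 180.0
    line_gray = float(gray[rows, cols].mean())
    distance = float(np.hypot(r1 - r0, c1 - c0))
    return direction, line_gray, distance
```

The modulo 180 makes the direction independent of which endpoint of the pair is listed first, which is natural since a pair is unordered.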
Step S2: a window is slid over the surface image of the shaving board; for each window, the point pairs in the window are grouped according to their connecting-line direction and connecting-line gray value, so that the pairs in each group belong to the same shaving, and the rough-planing degree of the pixels in the window is calculated from the number of pairs in each group and their connecting-line distances.
For each window, the point pairs in the window are grouped according to their connecting-line direction and connecting-line gray value, such that after grouping the pairs in each group belong to the same shaving. Specifically: for each window, the connecting-line direction level and connecting-line gray-value level corresponding to each pair's connecting-line direction and gray value are obtained; a gray-direction co-occurrence matrix for grouping the point pairs is constructed with the direction level and the gray-value level as its two dimensions; the element at row α, column β of the matrix is the ratio p(α, β) of n(α, β), the number of pairs in the window whose direction level is α and gray-value level is β, to the total number of pairs in the window. Grouping based on the window's matrix then means: the n(α, β) pairs whose direction level is α and gray-value level is β form one group, and the pairs in each group belong to the same shaving; note that the value of n(α, β) may be 0.
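The construction of the gray-direction co-occurrence matrix for one window can be sketched as follows. The uniform quantisation of directions into [0, 180) and gray values into [0, 255] is an assumption for illustration; the patent only says the two quantities are divided into levels.

```python
import numpy as np

def cooccurrence(pairs_features, A=10, B=10, max_gray=255.0):
    """Build the A x B gray-direction co-occurrence matrix for one window:
    quantise each pair's line direction (degrees in [0, 180)) into A levels
    and its line gray value into B levels, count pairs per (alpha, beta)
    cell, then divide by the total pair count to get the ratios p."""
    n = np.zeros((A, B))
    for direction, line_gray, _dist in pairs_features:
        a = min(int(direction / 180.0 * A), A - 1)
        b = min(int(line_gray / max_gray * B), B - 1)
        n[a, b] += 1
    return n / max(len(pairs_features), 1)  # p(alpha, beta) ratios
```

Pairs falling into the same cell (same direction level and gray-value level) form one group, so the matrix simultaneously encodes the grouping and the ratios p(α, β).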
The rough-planing degree of the pixels in the window is calculated from the number of point pairs in each group and their connecting-line distances. Specifically: for each window, a distance index d(α, β) is obtained from the mean connecting-line distance of the pairs whose direction level is α and gray-value level is β in the window's gray-direction co-occurrence matrix; if the mean connecting-line distance is smaller than the distance threshold, the distance index is the ratio of the mean to the threshold, otherwise the distance index is a preset value. The distance threshold is set manually according to the board model or the process parameters of the production line; since large shavings are wider than normal shavings, this embodiment sets the threshold to twice the expected shaving width. The preset value in this embodiment is preferably 1.
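The distance index rule reduces to a one-line function (the function name and default preset are chosen here to match the embodiment's stated values):

```python
def distance_index(mean_dist, dist_threshold, preset=1.0):
    """d(alpha, beta): ratio of a group's mean connecting-line distance to
    the distance threshold when the mean is below the threshold, otherwise
    the preset value (1 in the embodiment).  The threshold is set to twice
    the expected shaving width for the current board type."""
    return mean_dist / dist_threshold if mean_dist < dist_threshold else preset
```

So groups of narrow shavings get a small index, while any group whose pairs span at least the threshold width saturates at the preset value.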
R denotes the rough-planing degree of the pixels in the window, and A and B denote the number of connecting-line direction levels and connecting-line gray-value levels, respectively. In this embodiment the connecting-line direction and the connecting-line gray value are each quantised into 10 levels, numbered 1 to 10, i.e. A = B = 10, so the gray-direction co-occurrence matrix has 10 rows and 10 columns. Note that, for each window, the rough-planing degree obtained from the window's gray-direction co-occurrence matrix is assigned to every pixel in that window, i.e. all pixels in a window share the same rough-planing degree.
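The explicit formula for R is not reproduced in this text (it was evidently a figure in the original patent). A plausible reconstruction, consistent with R depending only on p(α, β), d(α, β), A and B, is a weighted sum over all cells of the matrix; treat this strictly as an assumed reading:

```python
import numpy as np

def rough_planing_degree(p, d):
    """Assumed reconstruction of R: sum p(alpha, beta) * d(alpha, beta)
    over all A x B cells of the gray-direction co-occurrence matrix, so
    windows dominated by wide, frequent shaving groups score close to 1."""
    return float((np.asarray(p) * np.asarray(d)).sum())
```

Under this reading, since the ratios p sum to 1 and each d lies in (0, 1], R also lies in (0, 1], which is consistent with the later normalisation and the 0.7 segmentation threshold.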
In this embodiment the window size is 16 × 16; each window corresponds to one gray-direction co-occurrence matrix, from which the rough-planing degree of the pixels in that window is obtained. Where windows overlap, a pixel corresponds to several rough-planing degrees; the mean of these values is taken as the pixel's final rough-planing degree. Each pixel thus has one rough-planing degree; the values are normalised, and the normalised value is the pixel's rough-planing feature value, so that every pixel of the surface image has a rough-planing feature value, yielding a rough-planing feature map the same size as the surface image.
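Mapping per-window scores back to a per-pixel feature map can be sketched as an accumulate-and-average step followed by min-max normalisation; the window step size and the min-max choice of normalisation are assumptions, since the embodiment states neither.

```python
import numpy as np

def rough_feature_map(win_scores, shape, win=16):
    """Accumulate per-window rough-planing degrees back onto pixels:
    each pixel takes the mean of the scores of all (possibly overlapping)
    win x win windows covering it, then the map is min-max normalised to
    [0, 1].  win_scores maps a window's top-left (i, j) to its score."""
    acc = np.zeros(shape)
    cnt = np.zeros(shape)
    for (i, j), score in win_scores.items():
        acc[i:i + win, j:j + win] += score
        cnt[i:i + win, j:j + win] += 1
    fmap = np.divide(acc, cnt, out=np.zeros(shape), where=cnt > 0)
    lo, hi = fmap.min(), fmap.max()
    return (fmap - lo) / (hi - lo) if hi > lo else fmap
```

Pixels covered by a single window keep that window's score; pixels in the overlap of two windows get the mean of both, exactly as the embodiment describes.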
Step S3: large-shaving defects are identified and located based on the rough-planing degree of each pixel in the surface image of the shaving board.
The large-shaving defects are identified and located based on the rough-planing feature map. In one embodiment, the feature map is processed by a neural network to identify and locate the defects. In another embodiment, the feature map is threshold-segmented: points whose value exceeds the segmentation threshold are set to 1 and all other points to 0, giving a defect-segmentation binary map in which the points valued 1 are large-shaving defect points; the segmentation threshold is preferably 0.7. The proportion of points valued 1 in the binary map gives the area ratio of large-shaving defects, and the board is graded by this ratio: for a ratio in [0, 0.2] the board is a high-quality board; in (0.2, 0.3] it is a qualified board; in (0.3, 1] it is a defective board. Locating the large shavings specifically means obtaining the outer bounding box of the defect points valued 1 in the binary map; the region inside the bounding box is the approximate area of the large-shaving defect.
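The threshold-segmentation variant of step S3 can be sketched directly; the grade labels and the reading of the ratio intervals as right-closed are taken from the embodiment text, while the function name is illustrative.

```python
import numpy as np

def grade_board(feature_map, seg_threshold=0.7):
    """Threshold the rough-planing feature map, compute the area ratio of
    defect pixels, and grade the board: high quality for a ratio in
    [0, 0.2], qualified in (0.2, 0.3], defective in (0.3, 1]."""
    binary = (feature_map > seg_threshold).astype(np.uint8)
    ratio = float(binary.mean())  # proportion of points valued 1
    if ratio <= 0.2:
        grade = "high quality"
    elif ratio <= 0.3:
        grade = "qualified"
    else:
        grade = "defective"
    return binary, ratio, grade
```

The bounding box of the defect region can then be read off the binary map, e.g. from the min/max row and column indices where `binary == 1`.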
Based on the same inventive concept as the above-described method embodiments, an embodiment of the invention provides a machine vision based particle board surface quality evaluation system, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, performs the steps of a machine vision based particle board surface quality evaluation method.
It should be noted that: the precedence order of the above embodiments of the present invention is only for description, and does not represent the merits of the embodiments. And specific embodiments thereof have been described above. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and should not be taken as limiting the scope of the present invention, which is intended to cover any modifications, equivalents, improvements, etc. within the spirit and scope of the present invention.
Claims (5)
1. A machine vision based particle board surface quality assessment method, comprising:
matching pixel points based on the surface image of the shaving board to obtain a plurality of point pairs, wherein the two pixel points of each pair lie respectively on a pair of opposite edges of a shaving; acquiring the connecting-line direction, connecting-line gray value and connecting-line distance of each point pair; the mean gray value of the pixels that the pair's connecting line passes through is the connecting-line gray value;
sliding a window on the surface image of the shaving board, grouping point pairs in the window according to the connecting direction and the connecting gray value of the point pairs for each window, wherein the point pairs in each group belong to the same shavings, and calculating the rough planing degree corresponding to pixels in the window according to the number of the point pairs in each group and the connecting distance;
identifying and positioning the large shaving defects based on the rough shaving degree corresponding to each pixel in the surface image of the shaving board;
wherein obtaining the point pairs specifically comprises:
for each pixel point in the surface image of the shaving board, dividing its neighborhood into two sub-regions by the line through the pixel perpendicular to its gradient direction, wherein the mean gray value of the sub-region with the smaller gray variance is the gray index value of the pixel, and the direction from the pixel point toward the sub-region with the smaller gray variance, along the line in the gradient direction, is the shaving-interior direction of the pixel point;
classifying the pixels in the surface image of the shaving board based on the gray index value; for each category, taking each pixel in the category in turn as a target pixel, wherein the pixels of the category whose shaving-interior direction is opposite to that of the target pixel constitute the candidate matching point set of the target pixel; removing any point in that set whose own candidate matching point set does not include the target pixel point, each remaining point forming a candidate point pair with the target pixel point; the candidate point pairs of all pixel points in the category constitute a candidate point pair set, from which point pairs are selected based on connecting-line distance such that no pixel point appears in more than one selected point pair;
wherein, for each window, grouping the point pairs in the window according to their connecting-line direction and connecting-line gray value specifically comprises:
for each window, obtaining the connecting-line direction grade and the connecting-line gray value grade corresponding to the connecting-line direction and the connecting-line gray value of each point pair in the window, and constructing, with the connecting-line direction grade and the connecting-line gray value grade as its two dimensions, a gray-direction co-occurrence matrix for grouping the point pairs, wherein the element in row α, column β of the gray-direction co-occurrence matrix is the proportion p(α, β) of point pairs in the window whose connecting-line direction grade is α and whose connecting-line gray value grade is β;
wherein calculating the rough-planing degree corresponding to the pixels in the window from the number of point pairs in each group and their connecting-line distances specifically comprises:
for each window, obtaining a distance index d(α, β) from the mean connecting-line distance of the point pairs whose connecting-line direction grade is α and whose connecting-line gray value grade is β in the gray-direction co-occurrence matrix of the window;
wherein R represents the rough-planing degree corresponding to the pixels in the window, and A and B represent the number of connecting-line direction grades and the number of connecting-line gray value grades, respectively.
2. The method of claim 1, wherein the distance index is the ratio of the mean connecting-line distance to a distance threshold if the mean connecting-line distance is less than the distance threshold, and a predetermined value otherwise.
3. The method of claim 2, wherein, if a pixel corresponds to a plurality of rough-planing degrees, the identification and location of large-shaving defects is performed based on the average of those rough-planing degrees.
4. The method of claim 1, wherein selecting point pairs from the candidate point pair set based on connecting-line distance specifically comprises:
sorting the point pairs in the candidate point pair set in ascending order of connecting-line distance; taking the point pair with the smallest connecting-line distance as a selected point pair; removing from the candidate point pair set all point pairs that contain a pixel point of the selected point pair; taking the point pair with the smallest connecting-line distance among the remaining point pairs as the next selected point pair; and iterating until no point pairs remain in the candidate point pair set.
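A minimal sketch of this iterative selection, assuming point pairs are given as (point, point, distance) tuples with hashable points such as (row, col) coordinates (a hypothetical data layout; the claim fixes only the ascending-distance order and the exclusivity of pixel points):

```python
def select_point_pairs(candidate_pairs):
    """Greedy selection: repeatedly take the shortest remaining pair
    whose pixel points are not yet part of any selected pair.

    candidate_pairs: iterable of (point_a, point_b, distance) tuples.
    """
    selected = []
    used = set()
    # Ascending sort by connecting-line distance.
    for pa, pb, dist in sorted(candidate_pairs, key=lambda t: t[2]):
        if pa in used or pb in used:
            continue  # a pixel of this pair already belongs to a selected pair
        selected.append((pa, pb, dist))
        used.add(pa)
        used.add(pb)
    return selected
```

Because the list is sorted once up front, skipping pairs with used pixels is equivalent to the claim's "remove, then take the minimum of the remainder" iteration.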
5. A machine-vision-based particle board surface quality assessment system comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, carries out the steps of the method according to any one of claims 1 to 4.
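For illustration, the window sweep of claim 1 and the per-pixel averaging of claim 3 can be sketched as follows; the window size, step, defect threshold, and the attribution of a point pair to the window containing its first point are all assumptions not fixed by the claims:

```python
import numpy as np

def defect_map(pairs_xy, image_shape, roughness_fn, win=64, step=32, thresh=0.5):
    """Slide a window over the image, score the point pairs inside each
    window, average overlapping scores per pixel, and threshold the
    result into a large-shaving defect mask.

    pairs_xy: list of ((x1, y1), (x2, y2)) point pairs in image coordinates.
    roughness_fn: maps the list of pairs in a window to a rough-planing degree
    (e.g. a function like cooccurrence_roughness wrapped for this input).
    """
    h, w = image_shape
    acc = np.zeros((h, w))   # summed rough-planing degrees per pixel
    cnt = np.zeros((h, w))   # number of windows covering each pixel
    for y0 in range(0, h - win + 1, step):
        for x0 in range(0, w - win + 1, step):
            inside = [p for p in pairs_xy
                      if x0 <= p[0][0] < x0 + win and y0 <= p[0][1] < y0 + win]
            r = roughness_fn(inside)
            acc[y0:y0 + win, x0:x0 + win] += r
            cnt[y0:y0 + win, x0:x0 + win] += 1
    # Claim 3: pixels covered by several windows take the mean degree.
    mean_r = np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
    return mean_r > thresh   # boolean mask locating large-shaving defects
```

Passing `len` as `roughness_fn` (pair count as a stand-in score) is a quick way to exercise the sweep logic before plugging in the full co-occurrence computation.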
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111554577.3A CN114219794B (en) | 2021-12-17 | 2021-12-17 | Method and system for evaluating surface quality of shaving board based on machine vision |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114219794A CN114219794A (en) | 2022-03-22 |
CN114219794B true CN114219794B (en) | 2023-01-20 |
Family
ID=80703786
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111554577.3A Active CN114219794B (en) | 2021-12-17 | 2021-12-17 | Method and system for evaluating surface quality of shaving board based on machine vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114219794B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114723705B (en) * | 2022-03-31 | 2023-08-22 | 深圳市启灵图像科技有限公司 | Cloth flaw detection method based on image processing |
CN114708224B (en) * | 2022-03-31 | 2023-06-23 | 吴江市双泽纺织有限公司 | Textile texture quality assessment method and system based on artificial intelligence |
CN114943739B (en) * | 2022-07-26 | 2022-10-21 | 山东三微新材料有限公司 | Aluminum pipe quality detection method |
CN115115646B (en) * | 2022-08-30 | 2022-11-18 | 启东市固德防水布有限公司 | Waterproof cloth coating quality evaluation method based on image processing |
CN116721067B (en) * | 2023-05-29 | 2024-04-12 | 宿迁凯达环保设备制造有限公司 | Impregnated paper impregnation quality detection method based on machine vision |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104636706A (en) * | 2015-03-04 | 2015-05-20 | 深圳市金准生物医学工程有限公司 | Complicated background bar code image automatic partitioning method based on gradient direction consistency |
CN106503704A (en) * | 2016-10-21 | 2017-03-15 | 河南大学 | Circular traffic sign localization method in a kind of natural scene |
CN107256406A (en) * | 2017-04-19 | 2017-10-17 | 深圳清华大学研究院 | Overlapping fibers image partition method, device, storage medium and computer equipment |
CN108830279A (en) * | 2018-04-03 | 2018-11-16 | 南昌奇眸科技有限公司 | A kind of image characteristics extraction and matching process |
CN109215020A (en) * | 2018-08-30 | 2019-01-15 | 国网黑龙江省电力有限公司佳木斯供电公司 | Ultra-high-tension power transmission line fault recognition method based on computer vision |
CN109242870A (en) * | 2018-07-13 | 2019-01-18 | 上海大学 | A kind of sea horizon detection method divided based on image with textural characteristics |
CN111709386A (en) * | 2020-06-22 | 2020-09-25 | 中国科学院空天信息创新研究院 | Method and system for classifying bottom materials of underwater shallow stratum profile image |
CN112508826A (en) * | 2020-11-16 | 2021-03-16 | 哈尔滨工业大学(深圳) | Printed matter defect detection method based on feature registration and gradient shape matching fusion |
CN112630222A (en) * | 2020-11-24 | 2021-04-09 | 河海大学常州校区 | Mobile phone cover plate glass defect detection method based on machine vision |
CN112819844A (en) * | 2021-01-29 | 2021-05-18 | 山东建筑大学 | Image edge detection method and device |
CN112837287A (en) * | 2021-01-29 | 2021-05-25 | 山东建筑大学 | Method and device for extracting defect area of board surface |
CN112862773A (en) * | 2021-01-29 | 2021-05-28 | 山东建筑大学 | Shaving board image defect classification detection method and device |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111650205B (en) * | 2020-05-11 | 2021-12-07 | 东风汽车集团有限公司 | Part surface defect detection method and system based on structured light image matching |
CN113538429B (en) * | 2021-09-16 | 2021-11-26 | 海门市创睿机械有限公司 | Mechanical part surface defect detection method based on image processing |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114219794B (en) | Method and system for evaluating surface quality of shaving board based on machine vision | |
CN110349126B (en) | Convolutional neural network-based marked steel plate surface defect detection method | |
CN104794491B (en) | Based on the fuzzy clustering Surface Defects in Steel Plate detection method presorted | |
CN109377485B (en) | Machine vision detection method for instant noodle packaging defects | |
CN104990925B (en) | One kind is based on gradient multi thresholds optimization defect inspection method | |
CN108629775B (en) | Thermal state high-speed wire rod surface image processing method | |
CN109444169B (en) | Bearing defect detection method and system | |
CN112651968B (en) | Wood board deformation and pit detection method based on depth information | |
CN115082683A (en) | Injection molding defect detection method based on image processing | |
CN104933720B (en) | A kind of SOP elements positioning of view-based access control model and defect inspection method | |
CN115311292A (en) | Strip steel surface defect detection method and system based on image processing | |
CN115082419A (en) | Blow-molded luggage production defect detection method | |
CN111915704A (en) | Apple hierarchical identification method based on deep learning | |
CN103593670A (en) | Copper sheet and strip surface defect detection method based on-line sequential extreme learning machine | |
CN109540925B (en) | Complex ceramic tile surface defect detection method based on difference method and local variance measurement operator | |
CN116645367B (en) | Steel plate cutting quality detection method for high-end manufacturing | |
CN115115612B (en) | Surface defect detection method and system for mechanical parts | |
CN108985337A (en) | A kind of product surface scratch detection method based on picture depth study | |
CN116735612B (en) | Welding defect detection method for precise electronic components | |
CN109886960A (en) | The method of glass edge defects detection based on machine vision | |
CN115294144B (en) | Method for identifying surface defects of furniture composite board | |
CN108647706A (en) | Article identification classification based on machine vision and flaw detection method | |
CN111161237A (en) | Fruit and vegetable surface quality detection method, storage medium and sorting device thereof | |
CN111753794A (en) | Fruit quality classification method and device, electronic equipment and readable storage medium | |
CN114820625A (en) | Automobile top block defect detection method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||