CN117237385A - Textile transfer printing pattern extraction method and system based on image segmentation - Google Patents


Publication number
CN117237385A
CN117237385A
Authority
CN
China
Prior art keywords
pixel points
gray
row
pixel
illumination influence
Prior art date
Legal status
Granted
Application number
CN202311524548.1A
Other languages
Chinese (zh)
Other versions
CN117237385B (en)
Inventor
吴超
朱杰
朱玲燕
王尧
丁菊炎
Current Assignee
Jiangsu Longda Textile Technology Co ltd
Original Assignee
Jiangsu Longda Textile Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Jiangsu Longda Textile Technology Co ltd filed Critical Jiangsu Longda Textile Technology Co ltd
Priority to CN202311524548.1A
Publication of CN117237385A
Application granted
Publication of CN117237385B
Legal status: Active
Anticipated expiration

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P — CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 — Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 — Computing systems specially adapted for manufacturing

Landscapes

  • Image Processing (AREA)

Abstract

The invention relates to the technical field of image segmentation, in particular to a method and a system for extracting a textile transfer printing pattern based on image segmentation. The method comprises the following steps: classifying the pixel points according to the gray difference of each row of pixel points in the gray image of the textile to obtain the pixel points of each category, and adjusting the gray value of the pixel point of the category with the maximum gray average value based on the gray values of the pixel points of other categories except the category with the maximum gray average value to obtain a target image; obtaining illumination influence indexes corresponding to each pixel point according to gray level distribution of each row of pixel points and each column of pixel points in the window corresponding to each pixel point; based on the illumination influence index corresponding to each pixel point in the target image and different illumination influence index thresholds, dividing the pixel points in the target image to obtain an optimal illumination influence index threshold, and further dividing the gray level image to obtain a textile transfer printing pattern. The invention improves the extraction precision of the textile transfer printing pattern.

Description

Textile transfer printing pattern extraction method and system based on image segmentation
Technical Field
The invention relates to the technical field of image segmentation, in particular to a method and a system for extracting a textile transfer printing pattern based on image segmentation.
Background
In the process of textile transfer printing pattern extraction, the acquired surface image of a textile needs to be segmented to obtain the printed region. When iterative threshold segmentation is applied to the surface image of a plain weave textile affected by illumination, the existing method has the following defect: because light affects different regions to different degrees, the overall gray values of different regions differ, so a single segmentation threshold obtained by iterative threshold segmentation of the whole image cannot completely segment the printed part, and the extraction effect of the textile transfer printing pattern is poor.
Disclosure of Invention
In order to solve the problem of the poor extraction effect of the existing method when extracting textile transfer printing patterns from surface images of textiles, the invention aims to provide an image-segmentation-based textile transfer printing pattern extraction method and system, and the adopted technical scheme is as follows:
in a first aspect, the present invention provides an image-segmentation-based textile transfer printing pattern extraction method, comprising the steps of:
acquiring a gray image of a textile;
classifying the pixel points in the gray image according to the gray difference of each row of pixel points in the gray image to obtain the pixel points of each category, and adjusting the gray value of the pixel point of the category with the maximum gray average value based on the gray values of the pixel points of other categories except the category with the maximum gray average value to obtain the target image of the textile; respectively taking each pixel point in the target image as a center to construct a window corresponding to each pixel point; obtaining illumination influence indexes corresponding to each pixel point according to the gray level distribution of each row of pixel points and the gray level distribution of each column of pixel points in the window corresponding to each pixel point;
dividing the pixel points in the target image based on the illumination influence indexes and different illumination influence index thresholds corresponding to each pixel point in the target image, and determining the preference degree corresponding to each illumination influence index threshold;
obtaining an optimal illumination influence index threshold based on the preference degree; and dividing the gray image based on the optimal illumination influence index threshold to obtain the textile transfer printing pattern;
the obtaining the illumination influence index corresponding to each pixel point according to the gray level distribution of each row of pixel points and the gray level distribution of each column of pixel points in the window corresponding to each pixel point comprises the following steps:
for the pixel point of the ith row and the jth column in the target image:
respectively carrying out curve fitting on each row of pixel points based on the gray value of each row of pixel points in the window corresponding to the pixel points in the ith row and the jth column to obtain a fitting curve corresponding to each row; respectively carrying out curve fitting on each column of pixel points based on the gray value of each column of pixel points in the window corresponding to the ith row and the jth column of pixel points to obtain a fitting curve corresponding to each column;
obtaining fitting errors corresponding to each fitting curve;
counting the number of pixel points, of which the gray values between the first extreme point and the second extreme point are larger than the average gray value of all pixel points on the fitting curve, of any fitting curve, marking the number as a first number corresponding to the fitting curve, and marking the difference value between the number of the pixel points on the fitting curve and the first number as a second number corresponding to the fitting curve;
obtaining illumination influence indexes corresponding to pixel points of an ith row and a jth column in the target image according to the fitting error corresponding to each fitting curve, the first quantity and the second quantity corresponding to each fitting curve;
calculating illumination influence indexes corresponding to pixel points of an ith row and a jth column in the target image by adopting the following formula:
E(i,j) = (1/N) · Σ_{n=1}^{N} e_n · (a_n / b_n) + (1/M) · Σ_{m=1}^{M} e′_m · (a′_m / b′_m)

wherein E(i,j) represents the illumination influence index corresponding to the pixel point of the ith row and the jth column in the target image; N represents the number of rows of the window corresponding to the pixel point of the ith row and the jth column, and M represents the number of columns of that window; e_n represents the fitting error corresponding to the fitting curve of the nth row of the window, and e′_m represents the fitting error corresponding to the fitting curve of the mth column of the window; a_n and b_n represent the first number and the second number corresponding to the fitting curve of the nth row of the window; a′_m and b′_m represent the first number and the second number corresponding to the fitting curve of the mth column of the window.
In a second aspect, the present invention provides an image-segmentation-based textile transfer printing pattern extraction system, which comprises a memory and a processor, wherein the processor executes a computer program stored in the memory to realize the image-segmentation-based textile transfer printing pattern extraction method described above.
Preferably, the classifying the pixel points in the gray image according to the gray difference of each row of pixel points in the gray image to obtain each class of pixel points includes:
respectively calculating the variance of the gray value of each row of pixel points in the gray image, and marking the variance as the gray variance of the corresponding row;
taking pixel points in a row with gray variance larger than a preset variance threshold as first-class pixel points;
for any row of first class pixels: clustering the row of pixel points based on the gray value of the row of pixel points to obtain at least two clusters, wherein the pixel points in each cluster are used as the pixel points of the same category.
Preferably, the adjusting the gray values of the pixel points of the class with the largest gray average value based on the gray values of the pixel points of the other classes except the class with the largest gray average value, so as to obtain the target image of the textile, comprises the following steps:
for any row of first class pixels: the gray values of the pixel points in the cluster with the largest gray average value are replaced by the gray average value of the pixel points in the cluster with the smallest gray average value;
the image after the replacement is recorded as a target image of the textile.
Preferably, the dividing the pixel points in the target image based on the illumination influence indexes and different illumination influence index thresholds corresponding to each pixel point in the target image, and determining the preference degree corresponding to each illumination influence index threshold includes:
for any illumination impact index threshold:
marking the pixel points with the illumination influence indexes larger than the threshold value of the illumination influence indexes in the target image as third-class pixel points;
clustering all the third-class pixel points based on illumination influence indexes corresponding to the third-class pixel points to obtain at least two clustering clusters; acquiring the minimum circumscribed rectangle of the cluster with the largest area; and marking each connected domain formed by the areas of the target image other than the minimum circumscribed rectangle as a reference area;
and obtaining the preference degree corresponding to the illumination influence index threshold according to the difference between the pixel points in the reference area and the illumination influence index corresponding to the pixel points in the minimum circumscribed rectangle.
Preferably, the obtaining the preferred degree corresponding to the threshold value of the illumination influence index according to the difference between the pixel point in the reference area and the illumination influence index corresponding to the pixel point in the minimum circumscribed rectangle includes:
based on illumination influence indexes corresponding to each reference area and the pixel points in the minimum bounding rectangle, calculating the maximum inter-class variance between each reference area and the minimum bounding rectangle respectively, and taking the maximum inter-class variance as a characteristic value of each reference area;
and taking the average characteristic value of all the reference areas as the preference degree corresponding to the illumination influence index threshold value.
Preferably, the obtaining the optimal illumination influence index threshold based on the preference degree includes:
and determining the illumination influence index threshold corresponding to the maximum preference degree as the optimal illumination influence index threshold.
Preferably, a K-means clustering algorithm is adopted to cluster all the third-class pixel points.
The invention has the following beneficial effects:
according to the method, firstly, the pixel points in the gray level image of the textile are classified according to the gray level difference of each row of pixel points in the gray level image to obtain the pixel points of each class, the gray level value of the pixel point of the class with the maximum gray level average value is adjusted based on the gray level value of the pixel points of the other class except the class with the maximum gray level average value, the target image of the textile is obtained, then the optimal light effect index threshold is determined according to the gray level distribution of each row of pixel points and the gray level distribution of each column of pixel points in the window corresponding to each pixel point in the target image of the textile, the light effect index corresponding to each pixel point is determined, the light effect index is used for representing the influence degree of each pixel point under light, namely, the light effect is estimated by utilizing the characteristics of the target image of the textile under light, the light effect index is further combined with different light effect index threshold values, the dividing result of each pixel point in the target image is evaluated, the optimal light effect index threshold is determined, the light effect index transfer effect of the image is removed, therefore, the light effect transfer pattern transfer printing based on the optimal light effect index threshold is removed, and the light effect transfer pattern transfer printing pattern is obtained on the textile image is improved, and the light effect transfer pattern is obtained, and the light effect transfer pattern is improved.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions and advantages of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are only some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a method for extracting a pattern of transfer printing on a textile according to an embodiment of the invention.
Detailed Description
In order to further describe the technical means and effects adopted by the invention to achieve the intended aim, the image-segmentation-based textile transfer printing pattern extraction method and system according to the invention are described in detail below with reference to the accompanying drawings and the preferred embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes the scheme of the image-segmentation-based textile transfer printing pattern extraction method and system provided by the invention with reference to the accompanying drawings.
Textile transfer printing pattern extraction method embodiment based on image segmentation:
the specific scene aimed at by this embodiment is: according to the embodiment, the gray level image of the textile is collected, the gray level features presented in the collected gray level image of the textile are analyzed, different thresholds are set, the optimal segmentation threshold is determined based on segmentation results of the different thresholds, the gray level image of the textile is segmented by the optimal segmentation threshold to obtain the textile transfer printing pattern, and the extraction precision of the textile transfer printing pattern is improved.
The embodiment provides an image-segmentation-based textile transfer printing pattern extraction method, as shown in Fig. 1, which comprises the following steps:
and S1, acquiring a gray image of the textile.
In the embodiment, the plain weave fabric is firstly laid on the collection table, a high-resolution camera is arranged right above the collection table, the high-resolution camera is opposite to the fabric to collect the surface image of the plain weave fabric, and the surface image of the plain weave fabric collected in the embodiment is an RGB image. The plain weave textiles in this example were all textiles with a printed pattern. And carrying out graying treatment on the acquired surface image of the plain weave textile, and marking the image after graying treatment as a gray image of the textile. The graying process of the image is the prior art, and will not be repeated here.
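The graying treatment can be sketched as follows. This is a minimal illustration using the common ITU-R BT.601 luminance weights; the embodiment only says "graying treatment" without specifying the weighting, so these weights are an assumption:

```python
import numpy as np

def to_gray(rgb):
    """Convert an RGB image of shape (H, W, 3) to grayscale.

    Uses the standard BT.601 luminance weights (an assumption: the
    patent does not specify which graying method is used)."""
    rgb = np.asarray(rgb, dtype=np.float64)
    return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
```

Any standard graying method gives a single-channel image suitable for the row-wise analysis in the following steps.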
Thus, the embodiment acquires the gray scale image of the textile.
Step S2, classifying the pixel points in the gray level image according to the gray level difference of each row of pixel points in the gray level image to obtain the pixel points of each category, and adjusting the gray level value of the pixel point of the category with the maximum gray level average value based on the gray level value of the pixel points of other categories except the category with the maximum gray level average value to obtain the target image of the textile; respectively taking each pixel point in the target image as a center to construct a window corresponding to each pixel point; and obtaining the illumination influence index corresponding to each pixel according to the gray distribution of each row of pixel and the gray distribution of each column of pixel in the window corresponding to each pixel.
A plain weave textile is woven from transverse warps and stitches that are staggered one by one, so its surface shows a regular transverse concave-convex fluctuation: the warps show a convex characteristic, the stitches show a concave characteristic, and adjacent warps also show a longitudinal concave-convex characteristic. The warp parts and the stitch parts in the gray image of the textile therefore show a periodic alternation of brightness and darkness. At the same time, illumination of different degrees influences this periodic behaviour differently, so if a uniform threshold is used to carry out iterative threshold segmentation on the gray image of the textile, the textile transfer printing pattern cannot be completely segmented. Based on this, the embodiment evaluates the illumination influence on the pixel points by utilizing the periodic characteristics of the warps and stitches under illumination, divides the image into areas according to that influence, and then segments the complete textile transfer printing pattern.
The color of the textile printing area image is vivid, namely the gray value is large, the influence of illumination on image segmentation is in units of blocks and is clearer, and the condition that one block is alternately distributed in brightness and darkness does not occur, so that the gray value of the pixels of the printing part image is replaced when the illumination influence of the pixels of the background part in the image is evaluated.
Specifically, the gray value change characteristics of the pixel rows in the concave part between warps differ from those of the pixel rows containing warps and stitches. To distinguish the two, the variance of the gray values of each row of pixel points in the gray image of the textile is calculated and recorded as the gray variance of that row, so that each row of pixel points corresponds to one gray variance. The pixel points in rows whose gray variance is larger than a preset variance threshold are taken as first-class pixel points, and the pixel points in rows whose gray variance is smaller than or equal to the preset variance threshold are taken as second-class pixel points. The first-class pixel points are the pixel points in rows containing warps and stitches, and the second-class pixel points are the pixel points in the concave rows between warps. The preset variance threshold in this embodiment is 0.35; in a specific application, the practitioner can set it according to the specific situation. The gray values of the pixel points between warps are stable and do not change obviously under the influence of illumination, so the rows are distinguished by the variance of their gray values: the smaller the variance, the more likely the row lies between warps; the larger the variance, the more likely the row contains warps and stitches.
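The row classification above can be sketched as below. Note that the embodiment's threshold of 0.35 implicitly assumes gray values normalised to [0, 1]; that normalisation, and the function name, are assumptions here:

```python
import numpy as np

def split_rows_by_variance(gray, var_threshold=0.35):
    """Return a boolean mask over image rows: True marks first-class
    rows (gray variance above the threshold, i.e. rows crossing warps
    and stitches), False marks second-class rows between warps.

    The 0.35 default follows the embodiment and assumes gray values
    normalised to [0, 1]."""
    row_var = np.var(np.asarray(gray, dtype=np.float64), axis=1)
    return row_var > var_threshold
```

Only the first-class rows are passed to the clustering and replacement step that follows.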
For any row of first class pixels: based on the gray value of the row of pixel points, clustering the row of pixel points by adopting a DBSCAN clustering algorithm to obtain a plurality of clusters, taking the pixel points in each cluster as the pixel points in the same category, respectively calculating the gray average value of the pixel points in each cluster, and completely replacing the gray value of the pixel point in the cluster with the largest gray average value with the gray average value of the pixel point in the cluster with the smallest gray average value. In the embodiment, when a DBSCAN clustering algorithm is adopted for clustering, the clustering radius is 10, the minimum cluster number is 5, and in specific application, an implementer can set according to specific conditions; the DBSCAN clustering algorithm is prior art and will not be described in detail here. By adopting the method, the gray values of the pixel points in the cluster with the largest gray average value in each row of the first-class pixel points are replaced respectively, and the image after all replacement is recorded as the target image of the textile.
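For one first-class row, the replacement step might look like the sketch below. A simple one-dimensional gap-based grouping stands in for the DBSCAN call (clustering radius 10, minimum cluster size 5, as in the embodiment); a real implementation could use sklearn.cluster.DBSCAN instead:

```python
import numpy as np

def adjust_row(row, eps=10, min_pts=5):
    """Group the pixels of one first-class row by gray value (a 1-D
    stand-in for the embodiment's DBSCAN step), then overwrite the
    gray values of the brightest cluster (largest mean: the print
    region) with the mean gray of the darkest cluster."""
    row = np.asarray(row, dtype=np.float64).copy()
    order = np.argsort(row)
    labels = np.full(row.size, -1)
    lab = 0
    start = 0
    for k in range(1, row.size + 1):
        # close a group when the sorted gray values jump by more than eps
        if k == row.size or row[order[k]] - row[order[k - 1]] > eps:
            if k - start >= min_pts:          # dense enough -> a cluster
                labels[order[start:k]] = lab
                lab += 1
            start = k
    if lab < 2:
        return row                            # fewer than two clusters: leave row as-is
    means = [row[labels == c].mean() for c in range(lab)]
    hi, lo = int(np.argmax(means)), int(np.argmin(means))
    row[labels == hi] = means[lo]             # suppress the bright print pixels
    return row
```

Applying this to every first-class row yields the target image in which the bright print pixels no longer disturb the background illumination analysis.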
Because the warp thread is in a convex state compared with the stitch, the gray value of the pixel point at the warp thread is higher under illumination; the pixel points at the stitch and between the warp threads are concave, and the corresponding gray value is lower under illumination. The warp and stitch of the plain weave textile presents a periodic arrangement rule under different illumination influences, the gray value of the pixel points also presents a regular change, and the pixel changes in the period under different illumination influences are different. Based on this, the illumination effect of the pixels can be analyzed by utilizing the transverse and longitudinal periodic characteristics in the target image of the textile.
For the periodicity of the pixel rows of the warps and stitches in the transverse direction, the larger the illumination influence, the more continuous pixel points with higher gray values and the fewer continuous pixel points with lower gray values in the corresponding period, and the transition in gray between warp and stitch is rapid, with an obvious alternation of light and dark; the smaller the illumination influence, the fewer continuous pixel points with higher gray values and the more continuous pixel points with lower gray values in the corresponding period, and the transition in gray between warp and stitch is slower. For the periodicity of the pixel columns along the longitudinal warps, the larger the illumination influence, the more stable the gray change and the smaller the difference between the gray value at the warps and the gray value between the warps. Therefore, the embodiment uses the transverse and longitudinal gray variation and gray appearance in the local range of each pixel point to characterize the illumination influence on that pixel point.
In the embodiment, a window of a preset size is constructed with each pixel point in the target image of the textile as the center and used as the window corresponding to that pixel point; in a specific application, the implementer may set the window size according to the specific circumstances.
For the pixel point of the ith row and the jth column in the target image:
respectively carrying out curve fitting on each row of pixel points based on the gray value of each row of pixel points in the window corresponding to the pixel points in the ith row and the jth column to obtain a fitting curve corresponding to each row; respectively carrying out curve fitting on each column of pixel points based on the gray value of each column of pixel points in the window corresponding to the ith row and the jth column of pixel points to obtain a fitting curve corresponding to each column; and obtaining fitting errors corresponding to each fitting curve. Counting the number of pixel points, of which the gray values between the first extreme point and the second extreme point are larger than the average gray value of all pixel points on the fitting curve, of any fitting curve, marking the number as a first number corresponding to the fitting curve, and marking the difference value between the number of the pixel points on the fitting curve and the first number as a second number corresponding to the fitting curve; and obtaining illumination influence indexes corresponding to the pixel points of the ith row and the jth column in the target image according to the fitting error corresponding to each fitting curve, the first quantity and the second quantity corresponding to each fitting curve. The specific calculation formula of the illumination influence index corresponding to the pixel point of the ith row and the jth column in the target image is as follows:
E(i,j) = (1/N) · Σ_{n=1}^{N} e_n · (a_n / b_n) + (1/M) · Σ_{m=1}^{M} e′_m · (a′_m / b′_m)

wherein E(i,j) represents the illumination influence index corresponding to the pixel point of the ith row and the jth column in the target image; N represents the number of rows of the window corresponding to the pixel point of the ith row and the jth column, and M represents the number of columns of that window; e_n represents the fitting error corresponding to the fitting curve of the nth row of the window, and e′_m represents the fitting error corresponding to the fitting curve of the mth column of the window; a_n and b_n represent the first number and the second number corresponding to the fitting curve of the nth row of the window; a′_m and b′_m represent the first number and the second number corresponding to the fitting curve of the mth column of the window.
The larger the fitting error, the more severe the gray value change of the pixel points; the larger the ratio of the first number to the second number, the more pixel points with larger gray values there are, and the larger the illumination influence. The first summation term characterizes the illumination influence in the transverse direction, and the second summation term characterizes the illumination influence in the longitudinal direction. When the fitting errors are larger and the ratios of the first number to the second number are larger, the illumination influence index corresponding to the pixel point of the ith row and the jth column is larger. The curve fitting, the acquisition of extreme points and the acquisition of fitting errors are all prior art and are not repeated here.
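The formula cannot be rendered faithfully from the source, so the sketch below implements a reconstruction consistent with the symbol descriptions: per-row and per-column polynomial fits inside the window, each fitting error weighted by the ratio of the first number to the second number, averaged over rows plus averaged over columns. The polynomial degree and the fallbacks for degenerate cases are assumptions:

```python
import numpy as np

def curve_stats(values, deg=4):
    """Fit a polynomial to one row/column gray profile and return
    (fitting error, first number, second number) as described in the
    embodiment. Degree 4 is an assumption (it presumes windows wider
    than 5 pixels), as is falling back to the whole curve when fewer
    than two extreme points exist."""
    y = np.asarray(values, dtype=np.float64)
    x = np.arange(y.size)
    coef = np.polyfit(x, y, min(deg, y.size - 1))
    fit = np.polyval(coef, x)
    err = float(np.mean((y - fit) ** 2))               # fitting error (MSE)
    d = np.diff(fit)
    ext = np.where(np.sign(d[:-1]) != np.sign(d[1:]))[0] + 1   # extreme points
    seg = fit[ext[0]:ext[1] + 1] if ext.size >= 2 else fit
    a = int(np.sum(seg > fit.mean()))                  # first number
    b = max(int(fit.size - a), 1)                      # second number (guard /0)
    return err, a, b

def illumination_index(window):
    """Illumination influence index of the window's centre pixel:
    E = (1/N)·sum_n e_n·(a_n/b_n) + (1/M)·sum_m e'_m·(a'_m/b'_m),
    a reconstruction of the patent's unrendered formula."""
    w = np.asarray(window, dtype=np.float64)
    rows = [curve_stats(r) for r in w]
    cols = [curve_stats(c) for c in w.T]
    return (sum(e * a / b for e, a, b in rows) / len(rows)
            + sum(e * a / b for e, a, b in cols) / len(cols))
```

A flat window yields an index near zero (no gray variation means no measurable illumination effect), while windows with sharp, skewed light-dark transitions score higher.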
By adopting the method, the illumination influence index corresponding to each pixel point in the target image of the textile can be obtained.
Step S3, dividing the pixel points in the target image based on the illumination influence indexes and different illumination influence index thresholds corresponding to each pixel point in the target image, and determining the preference degree corresponding to each illumination influence index threshold.
In the process of dividing the target image of the textile, the threshold is continuously adjusted and the target image is divided multiple times based on the size of the illumination influence index, so that the image is divided into a plurality of areas. After each division, the maximum inter-class variance is calculated, and the optimal illumination influence index threshold is determined based on it: the larger the maximum inter-class variance, the more obvious the difference between the two peaks and the better the segmentation effect.
In the embodiment, a plurality of illumination influence index thresholds are set, the smallest illumination influence index threshold is the minimum value of illumination influence indexes corresponding to all pixel points in a target image of a textile, the difference between adjacent illumination influence index thresholds is 0.2, the largest illumination influence index threshold is smaller than the maximum value of illumination influence indexes corresponding to all pixel points in the target image of the textile, and the difference between the largest illumination influence index threshold and the maximum value of illumination influence indexes corresponding to all pixel points in the target image of the textile is smaller than 0.2; in a specific application, the practitioner may set up according to the specific circumstances.
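The threshold sweep described above can be generated as follows, a direct sketch of the stated rule (start at the minimum index, step by 0.2, stop below the maximum):

```python
import numpy as np

def candidate_thresholds(indices, step=0.2):
    """Candidate illumination-influence-index thresholds: the smallest
    equals the minimum index, adjacent thresholds differ by `step`,
    and all stay strictly below the maximum index (so the gap between
    the largest threshold and the maximum is below one step)."""
    lo, hi = float(np.min(indices)), float(np.max(indices))
    t, out = lo, []
    while t < hi:
        out.append(round(t, 10))   # round away accumulated float drift
        t += step
    return out
```

Each candidate threshold is then scored by the preference degree computed in the next step.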
Because of the spatial continuity of illumination, pixels with larger illumination influence indexes in the target image of the textile lie close together; at the same time, local errors may separate pixels with larger illumination influence indexes during division, and some points with larger illumination influence indexes exist inside the illumination-affected area, making the determination of the illumination area inaccurate. In this embodiment, the pixel points in the target image are therefore divided based on the illumination influence index corresponding to each pixel point and on different illumination influence index thresholds, and the preference degree corresponding to each illumination influence index threshold is determined.
Specifically, for any illumination impact index threshold:
marking the pixel points whose illumination influence indexes are larger than the illumination influence index threshold in the target image as third-class pixel points; clustering all the third-class pixel points by the K-means clustering algorithm based on their illumination influence indexes to obtain a plurality of cluster clusters, wherein the number of clusters is set to 5, so that all the third-class pixel points are clustered into 5 clusters (in a specific application, the practitioner may set this according to the specific situation); acquiring the minimum circumscribed rectangle of the cluster with the largest area; marking each connected domain formed by the areas of the target image other than the minimum circumscribed rectangle as a reference area; based on the illumination influence indexes corresponding to the pixel points in each reference area and in the minimum circumscribed rectangle, calculating the maximum inter-class variance between each reference area and the minimum circumscribed rectangle, and taking it as the characteristic value of that reference area; and taking the average characteristic value of all the reference areas as the preference degree corresponding to the illumination influence index threshold. The calculation of the maximum inter-class variance is prior art and is not repeated here.
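The clustering and bounding-rectangle step can be illustrated with a simple one-dimensional k-means over the illumination influence indexes (k = 5 as in the embodiment). The initialization, iteration count and helper names here are assumptions; the patent does not fix the k-means details:

```python
import numpy as np

def min_bounding_rect_of_largest_cluster(coords, indices, k=5, iters=20, seed=0):
    """Cluster third-class pixels by illumination influence index and
    return the minimum bounding rectangle of the largest cluster.

    coords:  (N, 2) array of (row, col) pixel positions
    indices: (N,) array of illumination influence indexes
    """
    rng = np.random.default_rng(seed)
    centers = rng.choice(indices, size=k, replace=False)  # initial 1-D centers
    for _ in range(iters):
        # assign each pixel to the nearest center by index distance
        labels = np.argmin(np.abs(indices[:, None] - centers[None, :]), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = indices[labels == c].mean()
    largest = np.bincount(labels, minlength=k).argmax()   # cluster with most pixels
    pts = coords[labels == largest]
    (r0, c0), (r1, c1) = pts.min(axis=0), pts.max(axis=0)
    return r0, c0, r1, c1  # top-left and bottom-right of the rectangle
```

The rectangle returned here plays the role of the "minimum circumscribed rectangle of the cluster with the largest area" in the step above.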
By adopting the method, the preference degree corresponding to each illumination influence index threshold value can be obtained.
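The characteristic value and preference degree can be sketched as an Otsu-style search for the maximum between-class variance over the pooled indexes of a reference area and the minimum bounding rectangle; this reading of "maximum inter-class variance" is an assumption:

```python
import numpy as np

def max_between_class_variance(values_a, values_b):
    """Maximum between-class variance over the pooled index values
    of two regions: sweep every candidate split point and keep the
    largest Otsu criterion value w0 * w1 * (mu0 - mu1)^2."""
    pooled = np.sort(np.concatenate([values_a, values_b]))
    best = 0.0
    for t in pooled[1:-1]:
        lo, hi = pooled[pooled < t], pooled[pooled >= t]
        if lo.size == 0 or hi.size == 0:
            continue
        w0, w1 = lo.size / pooled.size, hi.size / pooled.size
        best = max(best, w0 * w1 * (lo.mean() - hi.mean()) ** 2)
    return best

def preference_degree(feature_values):
    """Preference degree = average characteristic value over all
    reference areas, as described in the embodiment."""
    return float(np.mean(feature_values))
```

Well-separated index distributions give a large characteristic value, so thresholds that cleanly isolate the illuminated region receive a high preference degree.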
Step S4, obtaining an optimal illumination influence index threshold value based on the preference degree; dividing the gray level image based on the optimal illumination influence index threshold value to obtain a textile transfer printing pattern.
In this embodiment, the preference degree corresponding to each illumination influence index threshold is obtained in step S3. The greater the preference degree, the more suitable the corresponding illumination influence index threshold is as the segmentation threshold of the image, so the illumination influence index threshold corresponding to the greatest preference degree is determined as the optimal illumination influence index threshold.
The dividing effect obtained with the optimal illumination influence index threshold is the best, so the optimal illumination influence index threshold is used as the segmentation threshold of the gray image of the textile. The gray image is then divided, and the area formed by the pixel points whose illumination influence indexes are larger than the optimal illumination influence index threshold is taken as the textile transfer printing pattern area.
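Step S4 can be put together as a short sketch (the names are illustrative, not from the patent):

```python
import numpy as np

def extract_pattern_mask(index_map, thresholds, preferences):
    """Pick the optimal threshold (the one with the largest preference
    degree) and segment the per-pixel index map with it, yielding a
    boolean mask of the transfer printing pattern region."""
    best = thresholds[int(np.argmax(preferences))]  # optimal threshold
    return index_map > best                         # pattern region mask
```

The mask marks exactly the pixels whose illumination influence index exceeds the optimal threshold.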
According to this embodiment, the pixel points in the gray image of the textile are first classified according to the gray difference of each row of pixel points to obtain the pixel points of each category, and the gray values of the pixel points of the category with the largest gray average value are adjusted based on the gray values of the pixel points of the other categories, obtaining the target image of the textile. Then, according to the gray distribution of each row and each column of pixel points in the window corresponding to each pixel point in the target image, the illumination influence index corresponding to each pixel point is determined; this index characterizes the degree to which each pixel point is affected by illumination, that is, it evaluates the features of the target image of the textile under illumination. The illumination influence indexes are further combined with different illumination influence index thresholds to divide the pixel points in the target image, the division result under each illumination influence index threshold is evaluated, and the optimal illumination influence index threshold is determined. Finally, the image is segmented based on the optimal illumination influence index threshold, the interference of illumination is eliminated, and the textile transfer printing pattern is thereby extracted more accurately.
Pattern cut based textile transfer printing pattern extraction system embodiment:
the textile transfer printing pattern extraction system based on pattern cutting of this embodiment comprises a memory and a processor, wherein the processor executes a computer program stored in the memory to implement the pattern-cutting-based textile transfer printing pattern extraction method described above.
Since the pattern-cutting-based textile transfer printing pattern extraction method has already been described in the method embodiment above, it is not repeated here.
It should be noted that the foregoing description of the preferred embodiments of the present invention is not intended to be limiting; any modifications, equivalent replacements, improvements, etc. made within the principles of the present invention are intended to be included within the protection scope of the present invention.

Claims (8)

1. The pattern-cutting-based textile transfer printing pattern extraction method is characterized by comprising the following steps of:
acquiring a gray image of a textile;
classifying the pixel points in the gray image according to the gray difference of each row of pixel points in the gray image to obtain the pixel points of each category, and adjusting the gray value of the pixel point of the category with the maximum gray average value based on the gray values of the pixel points of other categories except the category with the maximum gray average value to obtain the target image of the textile; respectively taking each pixel point in the target image as a center to construct a window corresponding to each pixel point; obtaining illumination influence indexes corresponding to each pixel point according to the gray level distribution of each row of pixel points and the gray level distribution of each column of pixel points in the window corresponding to each pixel point;
dividing the pixel points in the target image based on the illumination influence indexes and different illumination influence index thresholds corresponding to each pixel point in the target image, and determining the preference degree corresponding to each illumination influence index threshold;
obtaining an optimal illumination impact index threshold based on the degree of preference; dividing the gray level image based on an optimal illumination influence index threshold value to obtain a textile transfer printing pattern;
the obtaining the illumination influence index corresponding to each pixel point according to the gray level distribution of each row of pixel points and the gray level distribution of each column of pixel points in the window corresponding to each pixel point comprises the following steps:
for the pixel point of the ith row and the jth column in the target image:
respectively carrying out curve fitting on each row of pixel points based on the gray value of each row of pixel points in the window corresponding to the pixel points in the ith row and the jth column to obtain a fitting curve corresponding to each row; respectively carrying out curve fitting on each column of pixel points based on the gray value of each column of pixel points in the window corresponding to the ith row and the jth column of pixel points to obtain a fitting curve corresponding to each column;
obtaining fitting errors corresponding to each fitting curve;
counting the number of pixel points, of which the gray values between the first extreme point and the second extreme point are larger than the average gray value of all pixel points on the fitting curve, of any fitting curve, marking the number as a first number corresponding to the fitting curve, and marking the difference value between the number of the pixel points on the fitting curve and the first number as a second number corresponding to the fitting curve;
obtaining illumination influence indexes corresponding to pixel points of an ith row and a jth column in the target image according to the fitting error corresponding to each fitting curve, the first quantity and the second quantity corresponding to each fitting curve;
calculating illumination influence indexes corresponding to pixel points of an ith row and a jth column in the target image by adopting the following formula:

Y(i,j) = Σ(n=1..N) E(n) × a(n)/b(n) + Σ(m=1..M) F(m) × c(m)/d(m)

wherein Y(i,j) represents the illumination influence index corresponding to the pixel point of the ith row and the jth column in the target image; N represents the number of rows of the window corresponding to the pixel point of the ith row and the jth column, and M represents the number of columns of that window; E(n) represents the fitting error corresponding to the fitting curve of the nth row of the window, and F(m) represents the fitting error corresponding to the fitting curve of the mth column of the window; a(n) and b(n) respectively represent the first number and the second number corresponding to the fitting curve of the nth row of the window; c(m) and d(m) respectively represent the first number and the second number corresponding to the fitting curve of the mth column of the window.
2. The method for extracting pattern-based textile transfer printing patterns according to claim 1, wherein classifying the pixels in the gray scale image according to the gray scale difference of each row of pixels in the gray scale image to obtain each class of pixels comprises:
respectively calculating the variance of the gray value of each row of pixel points in the gray image, and marking the variance as the gray variance of the corresponding row;
taking pixel points in a row with gray variance larger than a preset variance threshold as first-class pixel points;
for any row of first class pixels: clustering the row of pixel points based on the gray value of the row of pixel points to obtain at least two clusters, wherein the pixel points in each cluster are used as the pixel points of the same category.
3. The pattern-based textile transfer printing pattern extraction method of claim 2, wherein the step of adjusting the gray value of the pixel of the class having the largest gray average value based on the gray values of the pixels of the other classes except the class having the largest gray average value, to obtain the target image of the textile comprises the steps of:
for any row of first class pixels: the gray values of the pixel points in the cluster with the largest gray average value are replaced by the gray average value of the pixel points in the cluster with the smallest gray average value;
the image after the replacement is recorded as a target image of the textile.
4. The method for extracting pattern-based textile transfer printing patterns according to claim 1, wherein the dividing the pixels in the target image based on the illumination influence indexes and different illumination influence index thresholds corresponding to each pixel in the target image, respectively, and determining the preference degree corresponding to each illumination influence index threshold comprises:
for any illumination impact index threshold:
marking the pixel points with the illumination influence indexes larger than the threshold value of the illumination influence indexes in the target image as third-class pixel points;
clustering all the third-class pixel points based on illumination influence indexes corresponding to the third-class pixel points to obtain at least two clustering clusters; acquiring the minimum circumscribed rectangle of the cluster with the largest area; the connected domain formed by other areas except the minimum circumscribed rectangle in the target image is marked as a reference area;
and obtaining the preference degree corresponding to the illumination influence index threshold according to the difference between the pixel points in the reference area and the illumination influence index corresponding to the pixel points in the minimum circumscribed rectangle.
5. The pattern-cut-based textile transfer printing pattern extraction method of claim 4, wherein obtaining the preference degree corresponding to the illumination influence index threshold according to the difference between the illumination influence indexes corresponding to the pixel points in the reference area and the pixel points in the minimum bounding rectangle comprises:
based on illumination influence indexes corresponding to each reference area and the pixel points in the minimum bounding rectangle, calculating the maximum inter-class variance between each reference area and the minimum bounding rectangle respectively, and taking the maximum inter-class variance as a characteristic value of each reference area;
and taking the average characteristic value of all the reference areas as the preference degree corresponding to the illumination influence index threshold value.
6. The graphics-based textile transfer print extraction method of claim 1, wherein said obtaining an optimal illumination impact index threshold based on said preference level comprises:
and determining the illumination influence index threshold corresponding to the maximum preference degree as the optimal illumination influence index threshold.
7. The pattern-cutting-based textile transfer printing pattern extraction method as claimed in claim 4, wherein the K-means clustering algorithm is adopted to cluster all third-class pixels.
8. A cut-based textile transfer print pattern extraction system comprising a memory and a processor, wherein the processor executes a computer program stored in the memory to implement the cut-based textile transfer print pattern extraction method of any one of claims 1-7.
CN202311524548.1A 2023-11-16 2023-11-16 Textile transfer printing pattern extraction method and system based on pattern cutting Active CN117237385B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311524548.1A CN117237385B (en) 2023-11-16 2023-11-16 Textile transfer printing pattern extraction method and system based on pattern cutting


Publications (2)

Publication Number Publication Date
CN117237385A true CN117237385A (en) 2023-12-15
CN117237385B CN117237385B (en) 2024-01-26


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10740916B2 (en) * 2018-05-18 2020-08-11 Quanta Computer Inc. Method and device for improving efficiency of reconstructing three-dimensional model
CN115311310A (en) * 2022-10-10 2022-11-08 江苏欧罗曼家纺有限公司 Method for extracting printed patterns of textiles through graph cutting
CN116645367A (en) * 2023-07-27 2023-08-25 山东昌啸商贸有限公司 Steel plate cutting quality detection method for high-end manufacturing




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant