CN116188376A - Digital printing stepping channel detection method based on twin neural network - Google Patents

Digital printing stepping channel detection method based on twin neural network Download PDF

Info

Publication number
CN116188376A
Authority
CN
China
Prior art keywords
detection
images
image
twin neural
digital printing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211677258.6A
Other languages
Chinese (zh)
Inventor
顾梦奇
许毅杰
李晓鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Lianshitai Electronic Information Technology Co ltd
Original Assignee
Suzhou Lianshitai Electronic Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Lianshitai Electronic Information Technology Co ltd filed Critical Suzhou Lianshitai Electronic Information Technology Co ltd
Priority to CN202211677258.6A priority Critical patent/CN116188376A/en
Publication of CN116188376A publication Critical patent/CN116188376A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • G06T7/001Industrial image inspection using an image reference approach
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4038Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Abstract

The invention discloses a digital printing stepping channel detection method based on a twin neural network, relating to the technical field of defect detection, which comprises the following steps: collecting detection images and template images of a digital printing machine, and labeling the images to construct an image set; establishing a twin neural detection network model, and training it with the image set and a loss function to adjust the model parameters; and installing the parameter-adjusted twin neural detection network model and the detection code in the detection environment for actual detection. The method overcomes interference from similar patterns and cloth textures, accurately detects the stepping channel defects of digital printing products with various patterns, saves labor cost, and achieves real-time detection.

Description

Digital printing stepping channel detection method based on twin neural network
Technical Field
The invention belongs to the technical field of defect detection, and particularly relates to a digital printing stepping channel defect detection method based on a twin neural network.
Background
With the development of high-speed computing, digital printing technology has matured; it offers high speed, high resolution and finer product patterns, and has brought a major technical revolution to the textile industry. During digital printing, various defects can arise in the product because of equipment failures or cloth flaws. To control material cost, the production process must be monitored automatically and in real time to guarantee product quality; otherwise, a large number of defective products can cause huge economic losses. Because traditional manual visual spot checks are prone to missed and false detections, online defect detection based on collected images is indispensable.
In the field of digital printing defect detection, traditional algorithm research has mainly targeted cloth with a single color or texture, and these algorithms are neither efficient nor accurate when facing printed products with rich colors and complex patterns. Conventional algorithms for detecting stepping channels include the LBP (local binary pattern) algorithm, histogram-based analysis algorithms and the like; they perform poorly in detecting stepping channels and printing defects in complex patterns, especially when the defect appearance is similar to the printed pattern or the cloth.
Because the printing machine inevitably produces mechanical errors, the distance of each step varies and the printed product exhibits stepping channel defects. The specific manifestations are as follows: when the stepping distance is smaller than the correct value, the two printed blocks overlap and a dark stepping channel appears in their junction area; when the stepping distance is larger than the correct value, a gap is left and a light stepping channel appears as a white band in the junction area. Since such defects have an actual physical size of 1 mm or less and may be confused with the printed pattern or the cloth texture, they are difficult to detect with conventional image processing methods.
There are currently few algorithms for stepping channel detection, and even fewer based on deep learning. Compared with traditional histogram-similarity and background-difference detection algorithms, a stepping channel detection method based on deep learning has stronger feature learning capability and can detect possible stepping channel defects more efficiently and accurately.
Since it is necessary to determine whether a linear feature in the image to be detected is part of the inherent printed pattern or a stepping channel defect, a special detection method is required to realize stepping channel detection.
Disclosure of Invention
The invention aims to provide a digital printing stepping channel detection method based on a twin neural network, which can realize accurate real-time detection of digital printing stepping channel defects.
In order to achieve the above object, an embodiment of the present invention provides a digital printing stepping channel detection method based on a twin neural network, including the following steps:
collecting detection images and template images of a digital printing machine, and performing image labeling to construct an image set;
establishing a twin neural detection network model, and training the twin neural detection network model by utilizing the image set and the loss function so as to adjust model parameters;
and installing the parameter-adjusted twin neural detection network model and the detection code in a detection environment for actual detection.
Preferably, the collecting detection images and template images of a digital printing machine and performing image labeling to construct an image set includes:
collecting detection images by using a camera arranged above a guide rail of the digital printing machine, wherein the detection images comprise defect-free images, dark stepping channel images and light stepping channel images;
stitching the defect-free images within one cycle by using a stitching algorithm to obtain a correct printing pattern template, obtaining the matching image corresponding to the detection image in the printing pattern template by using a template matching algorithm, and taking the detection image and the matching image as a group of images;
and arranging each group of images into a uniform format and labeling them by class to establish a labeled image set.
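A minimal PyTorch sketch of how such a labeled image set of (detection image, matching image) pairs could be organized for training is given below; the CSV index file, its column names and the class indices (0 = defect-free, 1 = dark stepping channel, 2 = light stepping channel) are illustrative assumptions, not specified by the invention.

```python
import csv
from PIL import Image
from torch.utils.data import Dataset

class StepChannelDataset(Dataset):
    """Each sample is one group of images: a detection patch, its matching patch
    from the printing pattern template, and a class label
    (0 = defect-free, 1 = dark stepping channel, 2 = light stepping channel)."""
    def __init__(self, index_csv, transform=None):
        with open(index_csv, newline="") as f:
            # assumed columns: detection_path, matching_path, label
            self.rows = list(csv.DictReader(f))
        self.transform = transform

    def __len__(self):
        return len(self.rows)

    def __getitem__(self, i):
        row = self.rows[i]
        detect = Image.open(row["detection_path"]).convert("RGB")
        match = Image.open(row["matching_path"]).convert("RGB")
        if self.transform is not None:
            detect, match = self.transform(detect), self.transform(match)
        return detect, match, int(row["label"])
```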
Preferably, the twin neural detection network model comprises two ResNet-18 networks; after feature maps of the detection image and the matching image are respectively extracted by the ResNet-18 networks, the two feature maps are concatenated and then passed through an average pooling layer, a convolution layer and fully connected layers to obtain the classification prediction probability of the detection image.
Preferably, the body of each ResNet-18 network is composed of residual blocks; each residual block contains convolution layers, each convolution layer is followed by a batch normalization layer and a ReLU activation function, a cross-layer data transfer route (skip connection) is designed, and the input is added directly before the ReLU activation function.
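A minimal PyTorch sketch of the residual block structure described above follows; it is an illustrative reconstruction of the standard ResNet basic block (the stride-1, equal-channel variant, with the downsampling blocks omitted), not code taken from the invention.

```python
import torch.nn as nn

class ResidualBlock(nn.Module):
    """ResNet-18 style basic block: two 3x3 convolutions, each followed by batch
    normalization, with a cross-layer skip connection adding the input back to
    the output before the final ReLU activation."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + x              # skip connection: add input before activation
        return self.relu(out)
```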
Preferably, the training the established twin neural detection network model using the image set and the loss function to adjust model parameters includes:
after preprocessing and brightness correction are carried out on the detection image and the matching image, cutting out n 224 x 224 pictures to be detected according to the position where the stepping channel may appear, sending the filtered pictures to be detected into the twin neural detection network model, and calculating and outputting the probabilities that the detection image belongs to the defect-free, dark stepping channel and light stepping channel classes;
and calculating a loss function according to the probability and the image label, and adjusting model parameters according to the loss function.
Preferably, the loss function employs a cross entropy loss function.
Preferably, the installing the parameter-adjusted twin neural detection network model and the detection code in the detection environment for actual detection includes:
(a) Carrying out PyTorch-to-ONNX model conversion on the trained twin neural detection network model, generating a TensorRT inference engine, and deploying the TensorRT inference engine on an Xavier platform based on the C++ language;
(b) Collecting a certain number of pictures and stitching them to obtain a template image;
(c) Each time the digital printing machine steps, photographing to obtain a detection image, preprocessing and secondarily matching the detection image to obtain an accurate detection image and matching image, cutting out n images from the detection image and the matching image, and normalizing them to meet the input requirement of the model;
(d) Inputting the cut images into the TensorRT inference engine converted from the model to infer detection results, and voting over the detection results; if the number of defect votes exceeds a first threshold value, judging that the detection image is a suspected stepping channel defect and sending it into an anomaly queue; after several suspected images have appeared and the number of images in the anomaly queue exceeds a second threshold value, judging that a stepping channel defect exists and reporting the detection result.
Preferably, the first threshold value range is [4,8].
Preferably, the second threshold value range is [3,6].
Compared with the prior art, the beneficial effects of the invention at least include the following:
the digital printing step channel detection method based on the twin neural network provided by the invention can realize real-time detection of digital printing step channel defects, is based on the twin neural network, trains by collecting images of actual printing products, carries out parameter adjustment according to labeling information and loss function results, and obtains a trained model, and the digital printing step channel defects can be accurately detected in real time. The method overcomes the interference of similar patterns and cloth textures, and can accurately detect the step defects of digital printing products with various patterns.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of a digital printing stepping channel detection method based on a twin neural network;
FIG. 2 is a schematic structural diagram of a twin neural detection network model provided by the invention;
FIG. 3 is a training process diagram of a twin neural detection network model provided by the invention;
FIG. 4 is a flowchart of the stepping channel defect detection operation according to the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the detailed description is presented by way of example only and is not intended to limit the scope of the invention.
Because it is necessary to judge whether a linear feature in the image to be detected is part of the inherent printed pattern or a stepping channel defect, the pattern to be detected and a correct matching pattern for reference are fed into the neural network together to improve detection accuracy. For such a difference comparison, a twin (Siamese) neural network with a two-branch structure is more suitable than a traditional classification network. The two branch networks of the twin neural network share their weights and extract feature information in the same domain from the two compared images, and the similarity of the two images is then obtained, so as to judge whether the currently detected image may contain a stepping channel defect. Against this background, the invention provides a digital printing stepping channel detection method based on a twin neural network.
Fig. 1 is a flow chart of a digital printing stepping channel detection method based on a twin neural network. As shown in fig. 1, the digital printing stepping channel detection method based on the twin neural network provided by the embodiment includes the following steps:
Step 1, collecting detection images and template images of a digital printing machine, and performing image labeling to construct an image set.
In the embodiment, the digital printing machine prints pictures of periodic patterns block by block, and a camera arranged above the guide rail of the digital printing machine collects detection images: every time the printing machine finishes one printed block, the detection system collects one detection image. The resolution of the detection image is 3000 x 4096 pixels, and the corresponding physical field of view is 31.6 cm x 43.1 cm. Before the method is implemented, a stitching algorithm stitches the defect-free images within one cycle to obtain a correct printing pattern template; the first cycle can be assumed to be correct, i.e., all the defect-free patterns in the first cycle are stitched to obtain the correct printing pattern template. A template matching algorithm is then used to obtain, from the printing pattern template, the correct matching image corresponding to each detection image, and the detection image and the matching image are stored as one group of data. For classification training, two hundred images of 30 patterns are acquired: detection images of dark stepping channels and light stepping channels are produced by setting the motor parameters of the digital printing machine and serve as detection images with stepping channel defects, and defect-free detection images are also acquired. The detection images and the matching images are arranged into a unified format and labeled by class, and a data set for model training is established, comprising a training set, a test set and a verification set.
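A minimal OpenCV sketch of the template construction and matching described above follows; the simple vertical concatenation used for stitching, the grayscale conversion and the function names are illustrative assumptions, not the exact algorithms of the invention.

```python
import cv2
import numpy as np

def build_template(defect_free_images):
    """Stitch the defect-free images of one printing cycle into a correct
    printing pattern template (assumed here to be a simple vertical concatenation
    of equally wide images)."""
    return np.vstack(defect_free_images)

def find_matching_image(template, detection_image):
    """Locate the region of the stitched template that corresponds to the
    detection image via normalized cross-correlation template matching."""
    gray_t = cv2.cvtColor(template, cv2.COLOR_BGR2GRAY)
    gray_d = cv2.cvtColor(detection_image, cv2.COLOR_BGR2GRAY)
    result = cv2.matchTemplate(gray_t, gray_d, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)   # best-match top-left corner
    x, y = max_loc
    h, w = detection_image.shape[:2]
    return template[y:y + h, x:x + w]          # matching image, same size as detection image
```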
Step 2, establishing a twin neural detection network model, and training the established twin neural detection network model with the image set and the loss function to adjust the model parameters.
In an embodiment, as shown in fig. 2, the constructed twin neural detection network model includes two ResNet-18 networks. The body of the ResNet-18 network structure is composed of common residual blocks; each residual block includes two 3 x 3 convolution layers, each convolution layer is followed by a batch normalization layer and a ReLU activation function, a cross-layer data transfer route (skip connection) is designed, and the input is added directly before the activation function. In the overall twin neural detection network model, the detection image and the matching image are each input into a ResNet-18 network, in which a 7 x 7 convolution layer and a 3 x 3 max pooling layer are followed by 4 residual blocks; finally, after the feature maps output by the two ResNet-18 networks are concatenated, the classification prediction probability of the detection image is obtained through an average pooling layer, a 1 x 1 convolution layer and two fully connected layers for feature extraction and classification.
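A minimal PyTorch sketch of a twin detection network of the kind described above follows: two weight-sharing ResNet-18 feature extractors, feature-map concatenation, an average pooling layer, a 1 x 1 convolution and two fully connected layers producing three class scores (defect-free, dark stepping channel, light stepping channel). It reuses the torchvision ResNet-18 backbone for brevity, and the intermediate channel widths are illustrative assumptions; it is a reconstruction under those assumptions, not the exact network of the invention.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

class TwinDetectionNet(nn.Module):
    """Two ResNet-18 branches with shared weights; the 512-channel feature maps of
    the detection patch and the matching patch are concatenated and classified into
    3 classes: defect-free, dark stepping channel, light stepping channel."""
    def __init__(self, num_classes=3):
        super().__init__()
        backbone = resnet18(weights=None)
        # keep everything up to (and excluding) the average pooling / fc head
        self.features = nn.Sequential(*list(backbone.children())[:-2])
        self.avgpool = nn.AdaptiveAvgPool2d(1)
        self.conv1x1 = nn.Conv2d(1024, 256, kernel_size=1)   # fuse the two branches
        self.fc = nn.Sequential(nn.Linear(256, 64), nn.ReLU(), nn.Linear(64, num_classes))

    def forward(self, detect_img, match_img):
        f1 = self.features(detect_img)          # (N, 512, 7, 7) for 224x224 input
        f2 = self.features(match_img)           # shared weights: same module applied twice
        fused = torch.cat([f1, f2], dim=1)      # concatenate along channels -> (N, 1024, 7, 7)
        pooled = self.avgpool(fused)            # (N, 1024, 1, 1)
        pooled = self.conv1x1(pooled)           # (N, 256, 1, 1)
        return self.fc(pooled.flatten(1))       # logits; softmax gives class probabilities
```

As a usage sketch, `torch.softmax(model(detect_patch, match_patch), dim=1)` would give the per-class probabilities for one 224 x 224 patch pair.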
In an embodiment, the training of the twin neural detection network model is performed on a Linux server, using the Python programming language and the PyTorch deep learning framework. As shown in fig. 3, the specific training process includes:
the detection image and the matching image are preprocessed, and firstly, brightness correction is carried out on the detection image and the matching image, so that interference of the light-emitting environment on the colors of the detection image and the matching image is avoided. Then cutting the detection image and the matching image according to the position of the step, wherein the step defect possibly occurs at the joint position of two printing blocks when the step is performed, so that the ordinate of the step defect in the image is a fixed position, n 224 x 224 pictures to be detected are cut on the ordinate of the detection image and the ordinate of the matching image respectively, the pictures to be detected are respectively sent into a model after being filtered, the pictures are respectively sent into the model after being subjected to convolution layer, batch normalization layer, maximum pooling layer and 4 residual blocks, the two feature images are spliced, and the probability that the images respectively belong to defect-free, dark step and light step is obtained through an average pooling layer, a convolution layer and two full connection layers, and the model is subjected to parameter adjustment through a cross entropy loss function, so that a trained twin neural detection network model is obtained.
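The loss formula itself is embedded as an image in the original filing; for reference, the standard multi-class cross-entropy loss named above has the form (an assumed standard statement, not a reproduction of the embedded image):

$$L = -\frac{1}{N}\sum_{i=1}^{N}\sum_{c=1}^{3} y_{i,c}\,\log p_{i,c}$$

where $y_{i,c}$ is the one-hot label of sample $i$ over the three classes (defect-free, dark stepping channel, light stepping channel) and $p_{i,c}$ is the corresponding predicted probability.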
Step 3, installing the parameter-adjusted twin neural detection network model and the detection code in the detection environment for actual detection.
In the embodiment, as shown in fig. 4, PyTorch-to-ONNX model conversion is performed on the trained twin neural detection network model, and a TensorRT inference engine is generated and deployed on an Xavier platform based on the C++ language. The specific detection flow is as follows: a certain number of pictures are collected and stitched into the template image. After the template image has been stitched, every time the digital printing machine steps, the digital camera photographs and acquires a 3000 x 4096 detection image. The detection image is preprocessed and the region of the stepping channel is cropped; the cropping position changes with the stepping length, and the cropped length is slightly larger than the stepping length. Secondary matching is then performed to obtain an accurate detection image and matching image. From the detection image and the matching image, n 224 x 224 images are cut out and normalized to meet the input requirement of the model. The images are input into the TensorRT inference engine converted from the model to infer the detection results. Voting is carried out over the detection results of the images: if the number of defect votes for an image exceeds a first threshold value, the image is judged to be a suspected stepping channel defect and sent into an anomaly queue; after several suspected images have appeared and the number of images in the anomaly queue exceeds a second threshold value, a stepping channel defect is judged to exist and the detection result is reported.
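A minimal sketch of the PyTorch-to-ONNX export step that precedes TensorRT engine generation follows; the file names, input shapes and the subsequent trtexec invocation are illustrative assumptions, and the model class comes from the architecture sketch above.

```python
import torch

model = TwinDetectionNet()   # hypothetical: trained model from the sketch above
model.load_state_dict(torch.load("twin_step_net.pth", map_location="cpu"))
model.eval()

dummy_detect = torch.randn(1, 3, 224, 224)
dummy_match = torch.randn(1, 3, 224, 224)
torch.onnx.export(
    model, (dummy_detect, dummy_match), "twin_step_net.onnx",
    input_names=["detect_patch", "match_patch"], output_names=["logits"],
    opset_version=13,
)
# One common way to then build the TensorRT engine on the Xavier:
#   trtexec --onnx=twin_step_net.onnx --saveEngine=twin_step_net.trt --fp16
```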
In an embodiment, the first threshold value is 6, and the second threshold value is 4.
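A minimal Python sketch of the two-stage voting logic described above follows, with the first threshold set to 6 and the second to 4 as in this embodiment; the per-patch class predictions are assumed to come from the TensorRT engine, and the queue reset on a clean image is an assumption the filing does not state.

```python
from collections import deque

FIRST_THRESHOLD = 6    # patch votes per image needed to suspect a stepping channel
SECOND_THRESHOLD = 4   # suspected images in the anomaly queue before reporting

anomaly_queue = deque()

def judge_image(patch_classes):
    """patch_classes: per-patch predictions for one detection image
    (0 = defect-free, 1 = dark stepping channel, 2 = light stepping channel)."""
    votes = sum(1 for c in patch_classes if c != 0)
    if votes > FIRST_THRESHOLD:                 # suspected stepping channel defect
        anomaly_queue.append(votes)
        if len(anomaly_queue) > SECOND_THRESHOLD:
            anomaly_queue.clear()
            return "stepping channel defect: report"
    else:
        anomaly_queue.clear()                   # assumption: a clean image resets the queue
    return "no report"
```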
The foregoing is a detailed description of the preferred embodiments and advantages of the invention. It should be understood that the foregoing description is merely illustrative of the presently preferred embodiments of the invention and is not intended to limit it; any changes, additions, substitutions and equivalents made within the principles of the invention are intended to be included within its scope.

Claims (9)

1. The digital printing stepping channel detection method based on the twin neural network is characterized by comprising the following steps of:
collecting detection images and template images of a digital printing machine, and performing image labeling to construct an image set;
establishing a twin neural detection network model, and training the twin neural detection network model by utilizing the image set and the loss function so as to adjust model parameters;
and installing the parameter-adjusted twin neural detection network model and the detection code in a detection environment for actual detection.
2. The digital printing stepping channel detection method based on a twin neural network according to claim 1, wherein the collecting detection images and template images of a digital printing machine and performing image labeling to construct an image set comprises:
collecting detection images by using a camera arranged above a guide rail of the digital printing machine, wherein the detection images comprise defect-free images, dark stepping channel images and light stepping channel images;
stitching the defect-free images within one cycle by using a stitching algorithm to obtain a correct printing pattern template, obtaining the matching image corresponding to the detection image in the printing pattern template by using a template matching algorithm, and taking the detection image and the matching image as a group of images;
and arranging each group of images into a uniform format and labeling them by class to establish a labeled image set.
3. The digital printing stepping channel detection method based on a twin neural network according to claim 1, wherein the twin neural detection network model comprises two ResNet-18 networks; after feature maps are extracted from the detection image and the matching image by the respective ResNet-18 networks, the two feature maps are concatenated and then passed through an average pooling layer, a convolution layer and fully connected layers to obtain the classification prediction probability of the detection image.
4. The digital printing stepping channel detection method based on a twin neural network according to claim 3, wherein the body of each ResNet-18 network consists of residual blocks; each residual block comprises convolution layers, each convolution layer is followed by a batch normalization layer and a ReLU activation function, a cross-layer data transfer route (skip connection) is designed, and the input is added directly before the ReLU activation function.
5. The digital printing stepping channel detection method based on a twin neural network according to claim 1, wherein the training the established twin neural detection network model using the image set and the loss function to adjust model parameters comprises:
after preprocessing and brightness correction are carried out on the detection image and the corresponding template image, cutting n 224 x 224 pictures to be detected from the detection image and the template image according to the position where the stepping channel appears, sending the filtered pictures into the twin neural detection network model, and calculating and outputting the probabilities that the detection image belongs to the defect-free, dark stepping channel and light stepping channel classes;
and calculating a loss function according to the probability and the image label, and adjusting model parameters according to the loss function.
6. The digital printing stepping channel detection method based on a twin neural network according to claim 4, wherein the loss function is a cross-entropy loss function.
7. The digital printing stepping channel detection method based on the twin neural network according to claim 1, wherein the installing the twin neural detection network model with the parameter adjusted and the detection code in the detection environment for actual detection comprises:
(a) Carrying out PyTorch-to-ONNX model conversion on the trained twin neural detection network model, generating a TensorRT inference engine, and deploying the TensorRT inference engine on an Xavier platform based on the C++ language;
(b) Collecting a certain number of pictures and stitching them to obtain a template image;
(c) Each time the digital printing machine steps, photographing to obtain a detection image, preprocessing and secondarily matching the detection image to obtain an accurate detection image and matching image, cutting out n images from the detection image and the matching image, and normalizing them to meet the input requirement of the model;
(d) Inputting the cut images into the TensorRT inference engine converted from the model to infer detection results, and voting over the detection results; if the number of defect votes exceeds a first threshold value, judging that the detection image is a suspected stepping channel defect and sending it into an anomaly queue; after several suspected images have appeared and the number of images in the anomaly queue exceeds a second threshold value, judging that a stepping channel defect exists and reporting the detection result.
8. The digital printing stepping channel detection method based on a twin neural network according to claim 7, wherein the first threshold value range is [4,8].
9. The digital printing stepping channel detection method based on a twin neural network according to claim 7, wherein the second threshold value range is [3,6].
CN202211677258.6A 2022-12-26 2022-12-26 Digital printing stepping channel detection method based on twin neural network Pending CN116188376A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211677258.6A CN116188376A (en) 2022-12-26 2022-12-26 Digital printing stepping channel detection method based on twin neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211677258.6A CN116188376A (en) 2022-12-26 2022-12-26 Digital printing stepping channel detection method based on twin neural network

Publications (1)

Publication Number Publication Date
CN116188376A true CN116188376A (en) 2023-05-30

Family

ID=86437535

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211677258.6A Pending CN116188376A (en) 2022-12-26 2022-12-26 Digital printing stepping channel detection method based on twin neural network

Country Status (1)

Country Link
CN (1) CN116188376A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117115167A (en) * 2023-10-24 2023-11-24 诺比侃人工智能科技(成都)股份有限公司 Coiled steel displacement judging method and system based on feature detection
CN117115167B (en) * 2023-10-24 2023-12-29 诺比侃人工智能科技(成都)股份有限公司 Coiled steel displacement judging method and system based on feature detection

Similar Documents

Publication Publication Date Title
CN109724984B (en) Defect detection and identification device and method based on deep learning algorithm
KR102254773B1 (en) Automatic decision and classification system for each defects of building components using image information, and method for the same
KR102168724B1 (en) Method And Apparatus for Discriminating Normal and Abnormal by using Vision Inspection
CN107944504B (en) Board recognition and machine learning method and device for board recognition and electronic equipment
CN110490842B (en) Strip steel surface defect detection method based on deep learning
CN102157024B (en) System and method for on-line secondary detection checking of checking data of large-sheet checking machine
CN111242896A (en) Color printing label defect detection and quality rating method
CN106355579A (en) Defect detecting method of cigarette carton surface wrinkles
CN115375614A (en) System and method for sorting products manufactured by a manufacturing process
CN104483320A (en) Digitized defect detection device and detection method of industrial denitration catalyst
US11790517B2 (en) Subtle defect detection method based on coarse-to-fine strategy
CN116188376A (en) Digital printing stepping channel detection method based on twin neural network
CN113158969A (en) Apple appearance defect identification system and method
CN114445365A (en) Banknote printing quality inspection method based on deep learning algorithm
CN112381175A (en) Circuit board identification and analysis method based on image processing
CN114359235A (en) Wood surface defect detection method based on improved YOLOv5l network
CN110728269B (en) High-speed rail contact net support pole number plate identification method based on C2 detection data
CN112508911A (en) Rail joint touch net suspension support component crack detection system based on inspection robot and detection method thereof
CN113139572B (en) Image-based train air spring fault detection method
CN216525503U (en) Carbon fiber prepreg surface defect on-line measuring device based on machine vision
WO2023080587A1 (en) Deep learning-based mlcc stacked alignment inspection system and method
CN114662594B (en) Target feature recognition analysis system
CN111028250A (en) Real-time intelligent cloth inspecting method and system
CN117252840B (en) Photovoltaic array defect elimination evaluation method and device and computer equipment
KR102666787B1 (en) Method, apparatus and program for inspecting defect

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination