CN110376211B - Wet-process-gummed synthetic leather hemming on-line detection device and method - Google Patents


Info

Publication number
CN110376211B
CN110376211B CN201910726981.0A CN201910726981A CN110376211B CN 110376211 B CN110376211 B CN 110376211B CN 201910726981 A CN201910726981 A CN 201910726981A CN 110376211 B CN110376211 B CN 110376211B
Authority
CN
China
Prior art keywords
detection
image
value
edge
detection system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910726981.0A
Other languages
Chinese (zh)
Other versions
CN110376211A (en)
Inventor
林建宇
潘凌锋
陈浙泊
颜文俊
林斌
郑军
蒋婷婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Research Institute of Zhejiang University Taizhou
Original Assignee
Research Institute of Zhejiang University Taizhou
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Research Institute of Zhejiang University Taizhou filed Critical Research Institute of Zhejiang University Taizhou
Priority to CN202110784407.8A priority Critical patent/CN113552134B/en
Priority to CN202110793717.6A priority patent/CN113567447A/en
Priority to CN201910726981.0A priority patent/CN110376211B/en
Publication of CN110376211A publication Critical patent/CN110376211A/en
Application granted granted Critical
Publication of CN110376211B publication Critical patent/CN110376211B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8806Specially adapted optical and illumination features
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/03Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring coordinates of points
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N2021/8411Application to online plant, process monitoring
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8854Grading and classifying of flaws
    • G01N2021/8858Flaw counting
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8854Grading and classifying of flaws
    • G01N2021/8861Determining coordinates of flaws
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8854Grading and classifying of flaws
    • G01N2021/8874Taking dimensions of defect into account
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8854Grading and classifying of flaws
    • G01N2021/888Marking defects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)

Abstract

A device and a method for on-line detection of curled edges (hemming) of wet-process-glued synthetic leather comprise support rods, a support chassis, a first cross rod, a second cross rod, a backlight source, short rods, support rods, a front light source, an image acquisition module, an alarm lamp, an industrial personal computer and an electrical control cabinet. The support rods are mounted on the support chassis, the support chassis stands on the ground, and the support rods are erected on the left and right sides of the detection point; the first cross rod is mounted between the support rods; the backlight source is mounted on the first cross rod; the short rod is mounted on the support rod, and a support rod is arranged between the short rod and the support rod; the second cross rod is mounted between the short rods; the image acquisition module, the front light source and the alarm lamp are mounted on the second cross rod; the industrial personal computer and the electrical control cabinet are mounted on the support rod and are electrically connected with the image acquisition module and the front light source. The invention replaces manual inspection with a machine, avoiding missed inspections and late discovery of problems, and thus ensures the quality of the produced cloth.

Description

Wet-process-gummed synthetic leather hemming on-line detection device and method
Technical Field
The invention relates to synthetic leather detection, and in particular to a device and a method for on-line detection of curled edges (hemming) of wet-process-glued synthetic leather.
Background
On an automatic wet-process gluing line for synthetic leather, the glued cloth may develop curled edges, and the curled-edge width may exceed a threshold value. A curled-edge width over the threshold directly degrades the quality of the synthetic leather, and in severe cases finished products tens of metres long or even longer are scrapped. The main cause of an over-threshold curled-edge width is that the cloth is stretched by the front and rear rollers while being transported on the rollers; the side edges of the cloth sometimes turn over toward the front face, the turned-over area sticks to the glued area because the glue has not yet dried, and the stuck region becomes too wide.
At present, however, manufacturers monitor the operation of the line by manual inspection. Each production line is staffed by an inspector, and the edge condition of the cloth can be checked at three positions along the line. When the inspector finds that the curled-edge width exceeds the threshold, the line is paused. If the problem is not serious, the curl is cut off with a blade, a thin flat clamp is clipped at the joint between the over-threshold section and the normal section of the cloth, and the equipment is restarted; placing the clamp greatly reduces the probability that the nearby cloth edge curls beyond the threshold again. If the problem is serious, the folded cloth is pulled firmly back to its original shape, a thin flat clamp is clipped at the joint between the over-threshold section and the normal section, and the equipment is restarted; the defective piece of cloth is cut out during the final inspection of the finished product.
The above manual-inspection approach has two disadvantages: (1) the monitored section of the line is long, about 100 m, and the three monitoring points are more than 30 m apart, so manual inspection misses defects, i.e. a curled-edge width over the threshold is sometimes not discovered until the finished product is inspected at the end of the line; (2) even when a defect is found, it is often found late, by which time the over-threshold curled section has already grown too long, and with a front-side curl the curl width keeps growing the longer it runs. Both defects increase the production cost and reduce the quality of the synthetic leather.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a device and a method for on-line detection of curled edges of wet-process-glued synthetic leather that are simple in structure and convenient to use.
The device for on-line detection of curled edges of wet-process-glued synthetic leather is characterised by comprising support rods, a support chassis, a first cross rod, a second cross rod, a backlight source, short rods, support rods, a front light source, an image acquisition module, an alarm lamp, an industrial personal computer and an electrical control cabinet; the support rods are mounted on the support chassis, the support chassis stands on the ground, and the support rods are erected on the left and right sides of the detection point; the first cross rod is mounted between the support rods; the backlight source is mounted on the first cross rod; the short rod is mounted on the support rod, and a support rod is arranged between the short rod and the support rod; the second cross rod is mounted between the short rods; the image acquisition module, the front light source and the alarm lamp are mounted on the second cross rod; the industrial personal computer and the electrical control cabinet are mounted on the support rod and are electrically connected with the image acquisition module and the front light source.
Furthermore, the first cross rod is parallel to the ground and located above the cloth; the illumination surface of the backlight source is parallel to the cloth, and the backlight source stays on while the device is running; the short rod is perpendicular to the support rod; the support rod, the short rod and the connecting portion of the support rod form a right-angled triangle; the second cross rod is located below the cloth and parallel to the ground.
Furthermore, the image acquisition module comprises an area-array camera, a lens, a protective cover and a fixing piece; the shooting surface of the image acquisition module is parallel to the cloth; the front light sources are arranged on the left side and the right side of the image acquisition module, and the irradiation surfaces of the front light sources are parallel to the cloth.
Further, a detection method based on the device of claim 1, specifically comprising the following steps:
101) opening a detection system and entering a starting interface; the detection system receives an initialization instruction and enters an initialization process; the initialization process comprises the steps that an industrial personal computer controls an image acquisition module to continuously acquire images; ending the initialization flow and entering an initialization interface;
201) the detection system receives a setting instruction, enters a parameter setting interface, and inputs parameters on the parameter setting interface; returning to an initialization interface after the parameter setting is successful;
301) the detection system receives a detection instruction and enters a detection process, wherein the detection process comprises out-of-bounds detection, detection of whether burrs exist on the edges of the cloth and edge distance detection;
302) firstly, carrying out-of-bound detection on a collected frame image; if the frame image is judged to be out of bounds, the detection system is converted from a normal state to an out of bounds state, the frame image is stored and displayed, the alarm lamp sends out an alarm signal, the detection process of the frame image is finished, and the detection system starts to carry out of bounds detection on the next frame image; if the frame image is judged not to be out of bounds, detecting whether burrs exist at the edge of the cloth of the frame image;
303) if the result of step 302) is that the frame image is not out of bounds, cloth-edge burr detection is performed on the frame image; if burrs are detected at the cloth edge, the detection system judges that the frame image is normal, displays the frame image, and the detection process for this frame ends, after which the detection system starts out-of-bounds detection on the next frame image; if no burrs are detected, the detection system judges that the frame image shows a curled edge, switches from the normal state to the curled state, displays the frame image, and the detection process for this frame ends, after which the detection system proceeds directly to edge-distance detection on the next frame image;
304) if the result of step 303) is that there are no burrs, the detection system is in the curled state and proceeds directly to edge-distance detection on the next frame image; if the result of the edge-distance detection is that the edge distance does not exceed the threshold, the detection system judges that the frame image is normal, displays the frame image, and the detection process for this frame ends, after which the detection system performs out-of-bounds detection on the next frame image; if the result is that the edge distance exceeds the threshold, the detection system judges that the frame image shows a curled-edge width over the threshold, switches to the curled-edge-width-over-threshold state, stores and displays the frame image, and the alarm lamp emits an alarm signal; the detection process for this frame ends, and the detection system performs edge-distance detection directly on the next frame image (this per-frame dispatch of steps 302) to 304) is sketched in code after step 501) below);
401) the detection system receives a detection-flow end instruction, ends the detection flow and returns to the initialization interface; it receives an abnormality-record instruction and enters the abnormality-record interface, on which the images stored during detection and the abnormality records can be viewed;
501) the detection system receives the exit instruction and is closed.
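As an illustration only, the per-frame dispatch of steps 302) to 304) can be summarised as a small state machine. The following Python sketch is not the patented implementation; the three detector functions it receives are placeholders for the out-of-bounds, burr and edge-distance checks described below.

```python
# Minimal sketch of the per-frame dispatch in steps 302)-304).
# out_of_bounds(), has_burrs() and edge_over() are placeholders for the
# detectors described later in this document.

NORMAL, OUT_OF_BOUNDS, CURLED, WIDTH_OVER = (
    "normal", "out_of_bounds", "curled", "width_over_threshold")

def process_frame(frame, state, out_of_bounds, has_burrs, edge_over, alarm, store):
    """Return the detection-system state after processing one frame."""
    if state in (NORMAL, OUT_OF_BOUNDS):
        if out_of_bounds(frame):                 # step 302): out-of-bounds check
            store(frame)
            alarm()
            return OUT_OF_BOUNDS
        # step 303): burrs at the edge mean the edge is normal, no burrs mean a curl
        return NORMAL if has_burrs(frame) else CURLED
    # CURLED / WIDTH_OVER: go straight to edge-distance detection, step 304)
    if edge_over(frame):
        store(frame)
        alarm()
        return WIDTH_OVER
    return NORMAL
```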
Further, the out-of-bounds detection comprises: the image acquisition module acquires a 24-bit BMP image of the set pixel size; one frame of image is selected, converted into a gray-scale image and mean-filtered; the mean-filtered image is divided into N equal parts, and the first row of one of the N regions is taken as the parameter i; the detection system judges whether the value of the parameter i is less than the total number of rows;
if the value of i is less than the total number of rows, it judges whether the currently acquired image is a left image or a right image; if it is the left image, the parameter j is set to column 1 of row i, the pixel value of row i, column 0 is set to 255, the count of times that the absolute difference between the pixel values of column 0 and column j of row i exceeds the set threshold is initialised to 0, and the count of times that it does not exceed the set threshold is initialised to 0; the set threshold is 127; if it is the right image, the parameter j is set to column 1 of row i, the pixel value of row i, column 0 is set to 0, and the two counts are likewise initialised to 0;
if the value of j is less than (the total number of columns − 1), the absolute difference between the pixel values of column 0 and column j of row i is calculated, and it is judged whether this absolute difference is greater than the set threshold;
if the absolute difference between the pixel values of column 0 and column j of row i is greater than the set threshold, it is judged whether the exceed count is greater than 0; if the exceed count is not greater than 0, the value of j is saved and the exceed count is increased by 1; if the exceed count is greater than 0, the exceed count is increased by 1 and it is judged whether it is greater than a set value; if yes, the value of i is saved, and if not, j is increased by 1, i.e. j moves to the next column;
if the absolute difference between the pixel values of column 0 and column j of row i is not greater than the set threshold, it is judged whether the exceed count is greater than 0; if the exceed count is not greater than 0, j is increased by 1; if the exceed count is greater than 0, the not-exceed count is increased by 1 and it is judged whether it is greater than a set value; if not, j is increased by 1; if yes, the saved j value of row i is deleted, and both the not-exceed count and the exceed count are reset to zero;
if the value of i is equal to the total number of rows, it is judged whether the number of saved rows is greater than 2; if yes, the saved j column values of the rows are collected, the maximum value is removed, and the average of the rest is the located cloth-edge column value, after which the procedure ends; if not, the frame is judged to be out of bounds and the procedure ends.
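By way of illustration, the edge-column location and out-of-bounds decision described above can be sketched as follows. The sketch assumes a single-channel gray-scale frame and uses the values given in the text (N = 5 regions, pixel-difference threshold 127); the two count limits, written here as need_hits and max_misses, stand for the unspecified "set values" and are assumptions.

```python
import cv2
import numpy as np

def locate_edge_column(gray, is_left, n_regions=5, diff_thresh=127,
                       need_hits=8, max_misses=4):
    """Return (edge_column, out_of_bounds) for one gray-scale frame."""
    gray = cv2.blur(gray, (3, 3))                      # 3x3 mean filtering
    rows, cols = gray.shape
    saved_cols = []
    for i in range(0, rows, rows // n_regions):        # first row of each of N regions
        ref = 255 if is_left else 0                    # forced value of column 0
        hits = misses = 0
        saved_j = None
        for j in range(1, cols - 1):
            if abs(int(gray[i, j]) - ref) > diff_thresh:
                if hits == 0:
                    saved_j = j                        # remember the first crossing column
                hits += 1
                if hits > need_hits:                   # enough evidence for this row
                    saved_cols.append(saved_j)
                    break
            elif hits > 0:
                misses += 1
                if misses > max_misses:                # too many misses: discard and restart
                    saved_j, hits, misses = None, 0, 0
    if len(saved_cols) > 2:
        saved_cols.remove(max(saved_cols))             # drop the largest, average the rest
        return int(np.mean(saved_cols)), False
    return None, True                                  # fewer than 3 rows located: out of bounds
```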
Further, the detection of whether burrs exist at the cloth edge comprises: the detection system extracts a region to be detected, denoted ROI, according to the located cloth-edge column value; the ROI is then converted into a gray-scale image, and Sobel edge detection and thresholding are performed on it; all edge-contour pixel points are extracted, the contour with the longest perimeter is found, and the pixel points on that contour form a contour point set; the contour point set is traversed to find the maximum distance between any two points; it is judged whether the longest perimeter is greater than a set multiple of this maximum distance; if yes, burrs are present and the burr detection ends; if not, there are no burrs and the burr detection ends;
the image thresholding determines the optimal gray-value threshold k* of the image by the maximum between-class variance method; the image is then thresholded with k* − 1: pixels whose gray value is greater than k* − 1 are set to 255, and the gray values of all other pixels are set to 0.
The optimal threshold k* obtained by the maximum between-class variance method is
k* = argmax over 1 ≤ k < 256 of σB²(k),
where k is the assumed gray-value threshold and σB²(k) is the between-class variance.
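As a non-limiting sketch, the burr decision described above can be written as follows, assuming `binary` is the ROI after Sobel edge detection and thresholding with k* − 1 (white edge pixels on a black background). The default multiple 2.1 is the value given later in the embodiment; evaluating the maximum point distance on the convex hull is an implementation shortcut, since the farthest pair of any point set lies on its hull.

```python
import cv2
import numpy as np

def edge_has_burrs(binary, multiple=2.1):
    """True if the longest edge contour is 'jagged' enough to indicate burrs."""
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return False
    longest = max(contours, key=lambda c: cv2.arcLength(c, True))
    perimeter = cv2.arcLength(longest, True)            # longest contour perimeter
    hull = cv2.convexHull(longest).reshape(-1, 2).astype(np.float64)
    gaps = np.linalg.norm(hull[:, None, :] - hull[None, :, :], axis=2)
    max_distance = gaps.max()                           # maximum distance between two contour points
    return perimeter > multiple * max_distance          # long, jagged contour -> burrs present
```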
Further, the Sobel edge detection procedure is as follows:
(a) let the matrix A represent the matrix of pixel values of the ROI region;
(b) take the derivative Gx in the horizontal direction (x) and the derivative Gy in the vertical direction (y) respectively;
horizontal direction: A is convolved with a kernel of size 3, as shown in equation (1):
Gx = [ −1  0  +1
       −2  0  +2
       −1  0  +1 ] * A    (1)
vertical direction: A is convolved with a kernel of size 3, as in equation (2):
Gy = [ −1  −2  −1
        0   0   0
       +1  +2  +1 ] * A    (2)
the horizontal and vertical gradient values of each pixel of the image are combined by the following equation (3) to give the gradient value G of that pixel:
G = √(Gx² + Gy²)    (3)
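As an illustration, equations (1) to (3) can be reproduced with the explicit kernels as follows; A is assumed to be the gray-scale pixel matrix of the ROI. Note that cv2.filter2D computes correlation rather than true convolution, which with these kernels only flips the sign of Gx and Gy and does not affect the combined magnitude G.

```python
import cv2
import numpy as np

def sobel_gradient(A):
    """Gradient magnitude of equations (1)-(3) for a gray-scale matrix A."""
    kx = np.array([[-1, 0, +1],
                   [-2, 0, +2],
                   [-1, 0, +1]], dtype=np.float64)      # kernel of equation (1)
    ky = np.array([[-1, -2, -1],
                   [ 0,  0,  0],
                   [+1, +2, +1]], dtype=np.float64)     # kernel of equation (2)
    a = A.astype(np.float64)
    gx = cv2.filter2D(a, -1, kx)                        # horizontal derivative Gx
    gy = cv2.filter2D(a, -1, ky)                        # vertical derivative Gy
    return np.sqrt(gx ** 2 + gy ** 2)                   # equation (3): G = sqrt(Gx^2 + Gy^2)
```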
further, the edge distance detection comprises the steps that the image acquisition module acquires a frame of image and sends the frame of image to the detection system; the detection system converts the image into a gray scale image and performs gray scale value detection; the gray value detection specifically comprises the steps of adjusting the exposure of a primary camera according to a gray setting threshold and judging whether the average gray value of an image is adjusted to the setting threshold or not; if not, the acquisition module is switched back to acquire one frame of image, and adjustment and judgment are continued; if so, then performing bilateral filtering on the image; performing Sober edge detection and thresholding on the image, and extracting image edge pixel points; carrying out Hough line transformation by using the extracted image edge pixel points, finding all straight line segments in the image, storing the straight line segments with the slope larger than the set slope, and judging whether the number of the stored straight line segments is larger than 1; if the number of the stored straight line segments is not more than 1, turning to the detection result that the edge distance does not exceed the threshold value, and finishing the detection; if the number of the stored straight line segments is more than 1, then judging whether the image processed this time is a left image; if the image is the left image, comparing the abscissa values of the midpoints of the straight-line segments, and finding out two straight-line segments with the minimum and the second smallest abscissa values of the midpoints; if the image is not the left image, the image is indicated to be the right image, the abscissa values of the midpoints of the straight-line segments are compared, and two straight-line segments with the largest and next largest midpoint abscissa values are found;
calculating the distance between the two straight line segments and judging whether the distance is greater than a set edge distance threshold value; if so, turning to the detection result that the edge distance exceeds the threshold value, and finishing the detection; if not, the detection result is that the edge distance does not exceed the threshold, and the detection is finished.
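As a non-limiting sketch, the straight-line part of this check can be written as follows, assuming `edges` is the binary edge image obtained after bilateral filtering, Sobel edge detection and thresholding. The Hough parameters, the slope limit and the pixel distance threshold are illustrative assumptions, and the distance between the two nearly vertical segments is approximated by the gap between their midpoint abscissas.

```python
import cv2
import numpy as np

def edge_distance_over_threshold(edges, is_left, min_slope=2.0, dist_thresh_px=40):
    """True if the gap between the two located cloth-edge lines exceeds the threshold."""
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=100, maxLineGap=10)
    if lines is None:
        return False
    mid_x = []
    for x1, y1, x2, y2 in lines[:, 0]:
        slope = abs((y2 - y1) / (x2 - x1)) if x2 != x1 else float("inf")
        if slope > min_slope:                       # keep only steep (near-vertical) segments
            mid_x.append((x1 + x2) / 2.0)
    if len(mid_x) <= 1:
        return False                                # at most one edge found: not over threshold
    mid_x.sort()
    a, b = (mid_x[0], mid_x[1]) if is_left else (mid_x[-1], mid_x[-2])
    return abs(a - b) > dist_thresh_px              # gap between the two cloth-edge lines
```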
Further, the idea behind finding straight-line segments with the Hough line transform is as follows:
first, a polar coordinate system is used to represent a straight line; the expression of the line is given by equation (4), where r denotes the polar radius and θ the polar angle:
y = (−cos θ / sin θ)·x + r / sin θ    (4)
which reduces to equation (5):
r = x cos θ + y sin θ    (5)
next, for a given point (x0, y0), the family of straight lines passing through that point can be written collectively as
rθ = x0 cos θ + y0 sin θ    (6)
which means that each pair (rθ, θ) represents one straight line passing through the point (x0, y0);
for a given fixed point (x0, y0), rθ varies as θ varies; plotting all the straight lines passing through (x0, y0) in the θ–r plane yields a sinusoidal curve;
the above operation is performed for every edge pixel point in the image; if the number of curves intersecting at one point of the θ–r plane exceeds a certain threshold, the parameter pair represented by that intersection point can be considered a straight line in the original image, which is then recovered by substituting back into equation (4).
Further, the abnormal record comprises an out-of-range information record and a crimp width super-threshold information record; when the detection system is in an out-of-bounds state and a normal state, only burr detection and out-of-bounds detection are carried out; when the detection system is in a curling state and the curling width exceeds a threshold value, only edge distance detection is carried out; when the detection process carries out-of-bound detection in a normal state, the counting and judging functions of out-of-bound and burr-free detection times are added; when the detection process detects whether burrs exist or not in an out-of-bound state, counting and judging functions for the number of detection times of burrs and burrs are added; when the detection process carries out edge distance detection in a curling state, the counting and judging functions of the detection times that the edge distance exceeds the threshold value and the edge distance does not exceed the threshold value are added; and when the edge distance is detected in the state that the width of the turned edge exceeds the threshold value, the detection process increases the counting and judging functions of the number of times that the edge distance does not exceed the threshold value.
The beneficial effects of the invention are as follows:
the invention designs a device for online detection of the hemming of the synthetic leather glued by a wet method, which adopts an online detection method based on machine vision to monitor the left side and the right side of the cloth in the production process of the wet gluing of the synthetic leather in real time. The device is arranged at three positions of the edge of the cloth to be checked on a production line, so that real-time monitoring is carried out, monitoring data are transmitted to a workshop monitoring room in real time, when the condition that the width of the turned edge exceeds a threshold value is detected, the device stores and displays the image, an alarm automatically gives an alarm, the abnormal state of the production line is fed back to a control room in time, and real-time monitoring and rapid processing are realized; in other cases the device can also save the image for viewing at any time.
The invention replaces manual work by a machine, thereby avoiding the condition that the manual inspection possibly has missing inspection or the problem is not found in time, and ensuring the quality of the produced cloth.
Drawings
FIG. 1 is a schematic structural view of the present invention;
FIG. 2 is a flow chart of the operation of the human-computer interaction interface of the present invention;
FIG. 3 is a general flow chart of the detection of the present invention;
FIG. 4 is a flow chart of out-of-bounds detection and edge location according to the present invention;
FIG. 5 is a flow chart of burr detection according to the present invention;
FIG. 6 is a flowchart illustrating the edge distance detection of the present invention;
FIG. 7 is a flowchart illustrating the gray level detection process of the present invention.
Description of reference numerals: support rod 1, support chassis 2, first cross rod 3, backlight source 4, short rod 5, second cross rod 6, image acquisition module 7, front light source 8, alarm lamp 9, industrial personal computer 10, electrical control cabinet 11 and support rod 12.
Detailed Description
The invention will be further described with reference to the drawings and specific examples.
The first embodiment is as follows:
As shown in Figure 1, the device for on-line detection of curled edges of wet-process-glued synthetic leather comprises support rods 1, a support chassis 2, a first cross rod 3, a backlight source 4, short rods 5, a second cross rod 6, an image acquisition module 7, a front light source 8, an alarm lamp 9, an industrial personal computer 10, an electrical control cabinet 11 and support rods 12. There are two support rods 1, erected respectively on the left and right sides of the detection point, and the support rods 1 are mounted on the support chassis 2. The support chassis 2 stands on the ground. The support rods 1 and the support chassis 2 form the load-bearing structure of the whole detection device. The first cross rod 3 is mounted between the support rods 1, is parallel to the ground and located above the cloth, and carries the backlight source 4. There are two backlight sources 4; the illumination surface of the backlight source 4 is parallel to the lower surface of the cloth, and the backlight source 4 stays on throughout the detection process. The short rod 5 is mounted on the support rod 1 and is perpendicular to the front end face of the support rod 1, the front end face being the side from which the cloth enters the detection device. A support rod 12 is arranged between the short rod 5 and the support rod 1, fixed to the lower end face of the short rod 5 and the front end face of the support rod 1. On each side of the detection point, the support rod 12, the support rod 1 and the short rod 5 together form a right-angled triangle, which makes the short rod 5 structurally firmer and increases its load-bearing capacity. The second cross rod 6 is mounted between the short rods 5, is parallel to the ground and located below the cloth. The image acquisition module 7, the front light source 8 and the alarm lamp 9 are mounted on the second cross rod 6. The image acquisition module 7 comprises an area-array camera, a lens, a protective cover and a fixing piece. The shooting surface of the image acquisition module is parallel to the cloth. There are two image acquisition modules 7, which acquire images of the two side edges of the cloth respectively. The front light sources 8 are arranged on the left and right sides of the image acquisition module 7. The front light source 8 is switched on only when the detection system is in the curled state or the curled-edge-width-over-threshold state, and is switched off in all other states. The illumination surface of the front light source 8 is parallel to the cloth surface. The alarm lamp 9 serves as a warning device: it emits an alarm signal when the detection system is in the out-of-bounds state or the curled-edge-width-over-threshold state, the specific alarm signal being an audible alarm together with the lamp changing from its normal green to red. The industrial personal computer 10 and the electrical control cabinet 11 are mounted on the front end face of the support rod 1, below the short rod 5. The industrial personal computer 10 is electrically connected with the image acquisition module 7 and the front light source 8.
The images acquired by the image acquisition module 7 are transmitted to the industrial personal computer 10, and the industrial personal computer 10 can also switch the front light source 8 on and off. The electrical control cabinet 11 converts the 220 V mains input into the voltages and currents required by the components of the detection device and supplies them to those components.
As shown in figure 2, the method for detecting the edge curl of the synthetic leather glued by the wet method on line depends on the detection device. The detection method comprises a detection system; and the detection system enters a starting interface after being started, and the starting interface comprises an initialization button and an exit button. And an exit button of the starting-up interface can close the detection system. And clicking the initialization button to send an initialization instruction, receiving the initialization instruction by the detection system, and entering an initialization process. The initialization process comprises the steps that the industrial personal computer 10 sends signals, and the image acquisition module 7 receives the signals sent by the industrial personal computer 10 and starts to acquire images. And entering an initialization interface after the initialization process is finished. The images acquired by the image acquisition module 7 can be displayed in real time on the initialization interface.
The initialization interface comprises a setting button, a detection button, an abnormal recording button and an exit button. Clicking the setting button to send a setting instruction, receiving the setting instruction by the detection system, and entering a parameter setting interface, wherein the parameter setting interface comprises a confirmation button and a cancel button, and if the confirmation button is clicked, the parameter setting is successful and the initialization interface is returned, and if the cancel button is clicked, the parameter is not successfully set and the initialization interface is returned. The parameter setting is specifically manual setting. Clicking the detection button, sending a detection instruction, receiving the detection instruction by the detection system, entering a detection interface and starting a detection process. The detection interface includes a stop button. Clicking a stop button to send a stop instruction, receiving the stop instruction by a detection system, ending the detection process, and returning to an initialization interface; otherwise, the detection process is continuously carried out. And clicking the abnormal recording button to send an abnormal recording instruction, receiving the abnormal recording instruction by the detection system, entering an abnormal recording interface, and checking the image and the abnormal record stored in the detection process on the abnormal recording interface. The abnormal records comprise a record of out-of-bounds information of the cloth and a record of the edge curl width exceeding a threshold value. The abnormal recording interface comprises a return button, and the abnormal recording interface can be quitted and returned to the initialization interface by clicking the return button; an exit button of the initialization interface can turn off the detection system. The initial state of the detection system is a normal state.
As shown in fig. 3, the detection process is specifically that the detection system first performs boundary detection on a frame of image acquired by the acquisition module 7. If the detection system judges that the frame image is out of bounds, the detection system is converted from a normal state to an out of bounds state, the detection system stores the frame image, displays the frame image on a display interface and marks the frame image as out of bounds, meanwhile, an alarm lamp 9 sends out an alarm signal, the detection process of the frame image is finished, and the detection system starts out of bounds detection on the next frame image; if the frame image is judged not to be out of bounds, detecting whether the edge of the cloth has burrs or not on the frame image, if the state of the detection system is out of bounds at the moment, converting the frame image into a normal state, and if not, keeping the frame image unchanged.
Detecting whether burrs exist on the edges of the cloth, if the detection result is the burrs, judging that the frame image is normal by the detection system, displaying the frame image and marking the frame image as normal, ending the detection process of the frame image, and starting the detection process of taking out-of-bounds detection as the start for the next frame image by the detection system; if the detection result is no burr, the detection system judges that the frame image is curled edge, the detection system is switched from a normal state to a curled edge state, the frame image is displayed on a display interface and is marked as curled edge, the detection process of the frame image is finished, the detection system starts to detect the next frame image, and the edge distance detection is directly carried out on the next frame image.
Detecting the edge distance, wherein if the edge distance does not exceed the threshold value, the detection system judges that the frame image is normal, displays the frame image on a display interface and marks the frame image as normal, the detection process of the frame image is finished, the detection system is converted into a normal state, and the detection system starts a detection process which starts to perform out-of-bounds detection on the next frame image; if the detection result is that the edge distance exceeds the threshold value, the detection system judges that the frame image is the curling width exceeding threshold value, if the detection system is in the curling state, the detection system is converted into the curling width exceeding threshold value state, otherwise, the detection system keeps the curling width exceeding threshold value state, the detection system stores the frame image, the frame image is displayed on a display interface and is marked as the edge distance exceeding threshold value, meanwhile, an alarm lamp 9 sends out an alarm signal, the detection process of the frame image is finished, the detection system starts to detect the next frame image, and the edge distance detection is directly carried out on the next frame image.
In order to reduce the misjudgment of out-of-bounds and burr-free when the detection system is in a normal state, the counting and judging functions of the detection times of the two judging processes are added in the detection process. In order to reduce the false judgment of burrs and no burrs when the detection system is in an out-of-bounds state, the counting and judging functions of the detection times of the two judging processes are added in the detection process. In order to reduce the misjudgment that the distance exceeds the threshold value and the distance does not exceed the threshold value when the detection system is in the curling state, the counting and judging functions of the detection times of the two judging processes are added in the detection process. In order to reduce the misjudgment that the distance does not exceed the threshold value when the detection system is in the state that the hemming width exceeds the threshold value, the counting and judging function of the detection times of the judging process is added in the detection process. When the detection system is in a normal state, if the detection result is that the continuous occurrence frequency of the out-of-bounds condition exceeds a set threshold value, the detection system enters the out-of-bounds state; and if the detection result is that the continuous occurrence frequency of the situation without burrs exceeds a set threshold value, the detection system enters a crimping state. When the detection system is in an out-of-range state, if the detection result is that the continuous occurrence frequency of the burrs exceeds a set threshold value, the detection system enters a normal state; and if the detection result is that the continuous occurrence frequency of the situation without burrs exceeds a set threshold value, the detection system enters a crimping state. When the detection system is in a curling state, if the detection result is that the continuous occurrence frequency of the condition that the edge distance does not exceed the threshold value exceeds the set threshold value, the detection system enters a normal state; and if the detection result is that the continuous occurrence frequency of the condition that the edge distance exceeds the threshold value exceeds the set threshold value, the detection system enters a state that the edge curling width exceeds the threshold value. When the detection system is in the state that the width of the turned edge exceeds the threshold value, if the detection result shows that the continuous occurrence frequency of the condition that the edge distance does not exceed the threshold value exceeds the set threshold value, the detection system enters a normal state.
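By way of illustration, the "consecutive occurrence" counting described above, in which a new state is adopted only after its detection result has occurred consecutively more than a set number of times, can be captured by a small helper such as the following; the default count threshold is an assumption.

```python
class Debounce:
    """Adopt a candidate state only after it has occurred consecutively often enough."""

    def __init__(self, times_threshold=3):
        self.times_threshold = times_threshold
        self.candidate = None
        self.count = 0

    def update(self, current_state, candidate_state):
        """Return the state the detection system should be in after this frame."""
        if candidate_state == current_state:
            self.candidate, self.count = None, 0
            return current_state
        if candidate_state == self.candidate:
            self.count += 1
        else:
            self.candidate, self.count = candidate_state, 1
        if self.count > self.times_threshold:
            self.candidate, self.count = None, 0
            return candidate_state          # enough consecutive results: switch state
        return current_state                # otherwise keep the current state
```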
The out-of-bounds detection flow is shown in Fig. 4. First, the image acquisition module 7 acquires an image of the cloth production line; the image is a 24-bit BMP image of the set pixel size. The detection system selects one frame from the acquired images, converts it into a gray-scale image and applies mean filtering. The conversion to a gray-scale image consists of converting the 24-bit image into an 8-bit 3-channel image and then converting that into a single-channel 256-level image. The mean filtering applies a neighbourhood-averaging filter to the gray-scale image, which makes the overall appearance of the image smoother. Specifically, neighbourhood averaging selects a target pixel, combines it with the 8 surrounding pixels to form a filter template, and replaces the gray value of the target pixel with the average gray value of all 9 pixels in the template. Next, the filtered gray-scale image is divided into N equal parts, and the first row of each of the N regions is used as the parameter i, so the parameter i takes the values 0, (total rows/N), ..., (total rows − total rows/N); in this embodiment N is set to 5. The detection system judges whether the value of the parameter i is less than the total number of rows; if so, it judges whether the currently acquired image is a left image or a right image, the left image being the image acquired by the image acquisition module mounted on the left and the right image the one acquired by the module mounted on the right. If it is the left image, the parameter j is set to column 1 of row i, the pixel value of row i, column 0 is set to 255, the count of times that the absolute difference between the pixel values of column 0 and column j of row i exceeds the set threshold is initialised to 0, and the count of times that it does not exceed the set threshold is initialised to 0; the set threshold is 127 in this embodiment. If it is the right image, the parameter j is set to column 1 of row i, the pixel value of row i, column 0 is set to 0, and the two counts are likewise initialised to 0. If j is smaller than (the total number of columns − 1), the absolute difference between the pixel values of column 0 and column j of row i is calculated and it is judged whether this absolute difference is greater than the set threshold.
If the absolute difference between the pixel values of column 0 and column j of row i is greater than the set threshold, it is judged whether the exceed count is greater than 0. If the exceed count is not greater than 0, the value of j is saved and the exceed count is increased by 1; if the exceed count is greater than 0, the exceed count is increased by 1 and it is judged whether it is greater than the set value; if yes, the value of i is saved, and if not, j is increased by 1, i.e. j moves on to the next column (after j is increased by 1 for the first time, j refers to column 2 of row i, and so on).
If the absolute difference between the pixel values of column 0 and column j of row i is not greater than the set threshold, it is judged whether the exceed count is greater than 0. If the exceed count is not greater than 0, j is increased by 1; if the exceed count is greater than 0, the not-exceed count is increased by 1 and it is judged whether it is greater than the set value; if not, j is increased by 1; if yes, the saved j value of row i is deleted, and both the not-exceed count and the exceed count are reset to zero.
If the value of i is equal to the total number of rows, it is judged whether the number of saved rows is greater than 2. If yes, the saved j column values of the rows are collected, the maximum value is removed, and the average of the rest is the located cloth-edge column value, after which the procedure ends; if not, the frame is judged to be out of bounds and the procedure ends.
The flow of detecting whether there are burrs at the cloth edge is shown in Fig. 5. The detection system extracts a region to be detected (ROI) according to the located cloth-edge column value. Specifically, if the column value is smaller than 1/20 of the total number of columns of the image, the ROI is a rectangular region whose upper-left starting point is the origin of the image with coordinates (0, 0), whose length is the total number of rows of the image and whose width is (the total number of columns of the image) − (the located column value) + (1/20 of the total number of columns of the image). If the column value is greater than 19/20 of the total number of columns of the image, the ROI is a rectangular region whose upper-left starting point has coordinates ((the located column value) − (1/20 of the total number of columns of the image), 0), whose length is the total number of rows of the image and whose width is (the total number of columns of the image) − (the located column value) + (1/20 of the total number of columns of the image). If the column value is not less than 1/20 and not greater than 19/20 of the total number of columns of the image, the ROI is a rectangular region whose upper-left starting point has coordinates ((the located column value) − (1/20 of the total number of columns of the image), 0), whose length is the total number of rows of the image and whose width is 1/10 of the total number of columns of the image.
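As an illustration, the three ROI rules above can be written out as follows. The coordinate convention (x is the column of the upper-left corner) and the final clamping of the rectangle to the image bounds are assumptions added for safety and are not stated in the text.

```python
def roi_rect(located_col, total_rows, total_cols):
    """Return (x, y, width, height) of the ROI for a located edge column."""
    margin = total_cols // 20                     # 1/20 of the total number of columns
    if located_col < margin:
        x, w = 0, total_cols - located_col + margin
    elif located_col > 19 * total_cols // 20:
        x, w = located_col - margin, total_cols - located_col + margin
    else:
        x, w = located_col - margin, total_cols // 10
    w = min(w, total_cols - x)                    # keep the rectangle inside the image (assumption)
    return x, 0, w, total_rows
```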
The ROI is converted into a gray-scale image, and then Sobel edge detection is performed on the ROI.
The Sobel edge detection procedure is as follows:
(1) let the matrix A represent the matrix of pixel values of the ROI region;
(2) take the derivative Gx in the horizontal direction (x) and the derivative Gy in the vertical direction (y) respectively;
horizontal direction: A is convolved with a kernel of size 3, as shown in equation (1):
Gx = [ −1  0  +1
       −2  0  +2
       −1  0  +1 ] * A    (1)
vertical direction: A is convolved with a kernel of size 3, as in equation (2):
Gy = [ −1  −2  −1
        0   0   0
       +1  +2  +1 ] * A    (2)
the horizontal and vertical gradient values of each pixel of the image are combined by the following equation (3) to give the gradient value G of that pixel:
G = √(Gx² + Gy²)    (3)
Image thresholding is then applied to the ROI after edge detection. The thresholding uses the maximum between-class variance method to determine the optimal gray-value threshold k* of the image. The principle of the maximum between-class variance method is that the larger the between-class variance between the background and the target, the larger the difference between the two parts making up the image, which means the probability of misclassifying the two parts is smallest; in this way the optimal gray-value threshold k* for thresholding the image is obtained. Once k* has been found, the image is thresholded with k* − 1: pixels whose gray value is greater than k* − 1 are set to 255, and the gray values of all other pixels are set to 0. The thresholded gray-scale image becomes a black-and-white image in which a gray value of 0 represents black and a gray value of 255 represents white.
Then all edge-contour pixel points are extracted, the contour with the longest perimeter is found, and the pixel points on that contour form a contour point set. The contour point set is traversed to obtain the maximum distance between any two points. It is judged whether the longest perimeter is greater than a set multiple of this maximum distance (in this embodiment the set multiple is 2.1). If yes, burrs are present and the burr detection ends; if not, there are no burrs and the burr detection ends.
The maximum between-class variance method is implemented along the following lines:

The gray image to be detected has 256 gray levels [1, 2, ..., 256]. The number of pixel points with gray value i is n_i, and the total number of pixel points is

N = n_1 + n_2 + \cdots + n_{256} = \sum_{i=1}^{256} n_i.

Using the normalized gray-value histogram and regarding it as the probability distribution of the image, the probability p_i that a random pixel point has gray value i is

p_i = \frac{n_i}{N}, \quad p_i \ge 0, \quad \sum_{i=1}^{256} p_i = 1.  (4)

Now suppose a threshold with gray value k divides the pixel points into two classes C_0 and C_1: C_0 contains the pixel points with gray values in [1, 2, ..., k], and C_1 contains those with gray values in [k+1, ..., 256]. Let \omega_0 be the probability that a random pixel point belongs to C_0, \omega_1 the probability that it belongs to C_1, \mu_0 the average gray value of the pixel points in C_0, and \mu_1 the average gray value of the pixel points in C_1. They are computed as in equations (5) to (8):

\omega_0 = \sum_{i=1}^{k} p_i = \omega(k),  (5)

\omega_1 = \sum_{i=k+1}^{256} p_i = 1 - \omega(k),  (6)

\mu_0 = \sum_{i=1}^{k} \frac{i\,p_i}{\omega_0} = \frac{\mu(k)}{\omega(k)},  (7)

\mu_1 = \sum_{i=k+1}^{256} \frac{i\,p_i}{\omega_1} = \frac{\mu_T - \mu(k)}{1 - \omega(k)},  (8)

where \omega(k) is the probability that the gray value of a random pixel point belongs to [1, 2, ..., k], and \mu(k) is the first-order cumulative moment of the histogram up to gray level k:

\omega(k) = \sum_{i=1}^{k} p_i,  (9)

\mu(k) = \sum_{i=1}^{k} i\,p_i,  (10)

while \mu_T represents the average gray value of the entire image,

\mu_T = \mu(256) = \sum_{i=1}^{256} i\,p_i.  (11)

For any choice of k,

\omega_0\mu_0 + \omega_1\mu_1 = \mu_T, \quad \omega_0 + \omega_1 = 1.  (12)

The quantity \eta in the following formula (13) is selected as the criterion that measures how "good" (separable) a given threshold k is:

\eta = \frac{\sigma_B^2}{\sigma_T^2},  (13)

where

\sigma_B^2 = \omega_0(\mu_0 - \mu_T)^2 + \omega_1(\mu_1 - \mu_T)^2  (14)

represents the between-class variance and

\sigma_T^2 = \sum_{i=1}^{256} (i - \mu_T)^2 p_i  (15)

represents the total variance of the gray values. Again, from equation (12) it follows that

\sigma_B^2 = \omega_0\,\omega_1\,(\mu_1 - \mu_0)^2.  (16)

Different values of k are then searched sequentially and, using formulas (9) and (10), the optimal threshold k* is the one that maximizes \eta. Substituting formulas (4) and (11) into formulas (14) and (15), it is easily seen that the total variance of the gray values \sigma_T^2 does not change with k, so maximizing \eta is equivalent to maximizing \sigma_B^2:

\eta(k) = \frac{\sigma_B^2(k)}{\sigma_T^2},  (17)

\sigma_B^2(k) = \frac{[\mu_T\,\omega(k) - \mu(k)]^2}{\omega(k)\,[1 - \omega(k)]}.  (18)

Therefore the optimal threshold k* is obtained from equation (18) as

\sigma_B^2(k^*) = \max_{1 \le k \le 256} \sigma_B^2(k).
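For reference, the search over k described above can be written directly from formulas (4), (9)-(11) and (18). The sketch below uses NumPy, indexes gray levels 0-255 (the text indexes 1-256), and the function name is illustrative.

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Return the gray level k* that maximizes the between-class variance of formula (18)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)   # n_i
    p = hist / hist.sum()                      # p_i, formula (4)
    levels = np.arange(256, dtype=np.float64)  # gray levels (0..255 here, 1..256 in the text)
    omega = np.cumsum(p)                       # omega(k), formula (9)
    mu = np.cumsum(levels * p)                 # mu(k),    formula (10)
    mu_t = mu[-1]                              # mu_T,     formula (11)
    denom = omega * (1.0 - omega)
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = np.where(denom > 0.0, (mu_t * omega - mu) ** 2 / denom, 0.0)  # formula (18)
    return int(np.argmax(sigma_b2))            # k*: the argmax of the between-class variance
```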
The edge distance detection process is shown in fig. 6. The image acquisition module 7 acquires a frame of image and sends it to the detection system, which converts it into a gray image and performs gray value detection. The gray value detection specifically comprises adjusting the camera exposure according to the gray setting threshold and judging whether the average gray value of the image has been adjusted into the set threshold range. If not, the image acquisition module 7 acquires another frame of image and the adjustment and judgment continue; if so, the average gray value of the image lies within the set threshold range, and bilateral filtering is then applied to the image so that the boundary between the glued and non-glued areas of the cloth edge becomes clearer. Sobel edge detection and thresholding are performed on the image to extract the image edge pixel points. Using the extracted edge pixel points, a Hough line transform finds all straight-line segments in the image, and the straight-line segments whose slope is greater than the set slope are saved. It is then judged whether the number of saved straight-line segments is greater than 1. If the number of saved straight-line segments is not greater than 1, the detection result is that the edge distance does not exceed the threshold, and the detection ends; if the number of saved straight-line segments is greater than 1, it is then judged whether the currently processed image is a left image. If it is the left image, the abscissa values of the midpoints of the straight-line segments are compared and the two segments with the smallest and second-smallest midpoint abscissa values are found; if it is not the left image, it is the right image, the midpoint abscissa values are compared and the two segments with the largest and second-largest midpoint abscissa values are found. The distance between these two straight-line segments is calculated and it is judged whether it is greater than the set edge distance threshold. If so, the detection result is that the edge distance exceeds the threshold, and the detection ends; if not, the detection result is that the edge distance does not exceed the threshold, and the detection ends.
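A minimal sketch of this pipeline under the assumption of OpenCV: bilateral filtering, Sobel-plus-threshold edge extraction, a probabilistic Hough transform for the segments, slope filtering and the leftmost/rightmost midpoint selection. The parameter values (slope_min, dist_threshold, the Hough settings) and the use of the midpoint-abscissa difference as the segment distance are illustrative assumptions, not values from the patent.

```python
import cv2
import numpy as np

def edge_distance_exceeds(gray: np.ndarray, is_left_image: bool,
                          slope_min: float = 5.0, dist_threshold: float = 80.0) -> bool:
    """Edge distance check: near-vertical Hough segments, the two closest to the cloth
    edge, and their separation compared with a threshold."""
    smoothed = cv2.bilateralFilter(gray, 9, 75, 75)            # sharpen glued / non-glued boundary
    gx = cv2.Sobel(smoothed, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(smoothed, cv2.CV_64F, 0, 1, ksize=3)
    mag = cv2.convertScaleAbs(np.sqrt(gx ** 2 + gy ** 2))
    _, edges = cv2.threshold(mag, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                               minLineLength=60, maxLineGap=10)
    if segments is None:
        return False                                            # no segments: margin not exceeded
    steep = []                                                  # (midpoint x, segment) pairs
    for x1, y1, x2, y2 in segments[:, 0]:
        slope = abs((y2 - y1) / (x2 - x1)) if x2 != x1 else float("inf")
        if slope > slope_min:                                   # keep segments steeper than the set slope
            steep.append(((x1 + x2) / 2.0, (x1, y1, x2, y2)))
    if len(steep) <= 1:
        return False
    steep.sort(key=lambda s: s[0])                              # sort by midpoint abscissa
    pair = steep[:2] if is_left_image else steep[-2:]           # leftmost two / rightmost two
    distance = abs(pair[0][0] - pair[1][0])                     # midpoint-abscissa separation (assumption)
    return distance > dist_threshold
```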
The Hough line transform finds straight-line segments along the following lines:

Firstly, a polar coordinate system is used to represent a straight line. The expression of the straight line is given by formula (19), where r denotes the polar radius and θ the polar angle:

y = -\frac{\cos\theta}{\sin\theta}\,x + \frac{r}{\sin\theta}.  (19)
simplifying to obtain:
r=x cosθ+y sinθ (20)
Next, for a given point (x_0, y_0), the family of straight lines passing through that point can be written collectively as

r_\theta = x_0\cos\theta + y_0\sin\theta.  (21)

This means that each pair (r_\theta, \theta) represents one straight line passing through the point (x_0, y_0).

For a given fixed point (x_0, y_0), r_\theta changes as θ changes; plotting, in the polar-radius-versus-polar-angle plane, all the straight lines passing through (x_0, y_0) yields a sinusoidal curve. This plane, with θ on the horizontal axis and r on the vertical axis, is the plane θ-r referred to below.

The same operation can be carried out for every point in the image; if the curves obtained for two different points intersect in the plane θ-r, the two points lie on the same straight line.

What the Hough line transform needs to do is track the intersections, in the plane θ-r, of the curves corresponding to the points of the image; if the number of curves passing through one intersection point exceeds a certain threshold, the parameter pair represented by that intersection can be considered a straight line in the original image. The straight line is then recovered by back-substitution into formula (19).
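The accumulator ("voting") view of the preceding paragraphs can be sketched directly. The grid resolution (1-degree theta steps, 1-pixel radius bins) and the vote threshold below are illustrative assumptions.

```python
import numpy as np

def hough_lines(edge_points, img_diag: int, votes_needed: int = 100, n_theta: int = 180):
    """Accumulator voting in the plane theta-r: every edge point votes along its sinusoid
    r = x*cos(theta) + y*sin(theta); accumulator cells with enough votes are lines."""
    thetas = np.deg2rad(np.arange(n_theta))                        # theta sampled in 1-degree steps
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    acc = np.zeros((2 * img_diag + 1, n_theta), dtype=np.int32)    # rows index r shifted by +img_diag
    for x, y in edge_points:
        r = np.round(x * cos_t + y * sin_t).astype(int)            # formula (20) for every sampled theta
        acc[r + img_diag, np.arange(n_theta)] += 1                 # one vote per (r, theta) cell
    peaks = np.argwhere(acc >= votes_needed)                       # cells where enough sinusoids intersect
    return [(int(r) - img_diag, float(thetas[t])) for r, t in peaks]   # detected (r, theta) lines
```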
The gray value detection flow is shown in fig. 7. A region is selected in the gray image according to the located cloth-edge column value. The selected region is a rectangle whose upper-left corner is at the image coordinate ((total image columns - located column value)/10, 0), whose length is the total number of image rows and whose width is (total image columns - located column value)/10. All pixel points of the region are traversed to compute the average gray value, which is then checked against the set threshold range (70 to 80 in this embodiment). If the average gray value of the selected region lies within the set threshold range, the gray value is judged suitable and the detection ends; if it does not, the gray value is judged too large or too small and the detection ends.
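A sketch of this check, taking the rectangle geometry literally from the text above and using the 70-80 window of this embodiment. Whether the rectangle should instead be anchored at the located edge column is not clear from the translation, so the slicing here is an assumption, as are the function and parameter names.

```python
import numpy as np

def gray_value_ok(gray: np.ndarray, located_col: int, low: float = 70.0, high: float = 80.0) -> bool:
    """Mean-gray check on the rectangle described above: upper-left corner at
    ((cols - located_col)/10, 0), full image height, width (cols - located_col)/10."""
    rows, cols = gray.shape
    width = max(1, (cols - located_col) // 10)
    x0 = width                                   # x of the upper-left corner, taken literally from the text
    region = gray[0:rows, x0:x0 + width]         # traverse all pixel points of the region
    mean_gray = float(region.mean())
    return low <= mean_gray <= high              # True means the gray value is suitable
```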
The above description is only one embodiment of the present invention and is not intended to limit the present invention in any way; simple modifications, equivalent changes and variations may be made without departing from the technical solution of the present invention, and the scope of the present invention is defined by the appended claims.

Claims (7)

1. The method for online detection of the wet-process glued synthetic leather hemming is characterized by comprising the following steps:
101) opening a detection system and entering a starting interface; the detection system receives an initialization instruction and enters an initialization process; the initialization process comprises the steps that an industrial personal computer controls an image acquisition module to continuously acquire images; ending the initialization flow and entering an initialization interface;
201) the detection system receives a setting instruction, enters a parameter setting interface, and inputs parameters on the parameter setting interface; returning to an initialization interface after the parameter setting is successful;
301) the detection system receives a detection instruction and enters a detection process, wherein the detection process comprises out-of-bounds detection, detection of whether burrs exist on the edges of the cloth and edge distance detection;
302) firstly, carrying out-of-bound detection on a collected frame image; if the frame image is judged to be out of bounds, the detection system is converted from a normal state to an out of bounds state, the frame image is stored and displayed, the alarm lamp sends out an alarm signal, the detection process of the frame image is finished, and the detection system starts to carry out of bounds detection on the next frame image; if the frame image is judged not to be out of bounds, detecting whether burrs exist at the edge of the cloth of the frame image;
303) if the detection result in step 302) is that the frame image is not out of bounds, performing cloth-edge burr detection on the frame image; if the result of the cloth-edge burr detection is that burrs are present, the detection system judges that the frame image is normal, displays the frame image, the detection process for the frame image ends, and the detection system starts out-of-bounds detection on the next frame image; if the result is that no burrs are present, the detection system judges that the frame image is curled, the detection system switches from the normal state to the curled state, displays the frame image, the detection process for the frame image ends, and the detection system proceeds to perform edge distance detection directly on the next frame image;
304) if the detection result in the step 303) is no burr, the detection system enters a curling state, and the detection system starts to directly detect the edge distance of the next frame of image; if the detection result of the edge distance detection is that the edge distance does not exceed the threshold, the detection system judges that the frame image is normal, displays the frame image, and finishes the detection process of the frame image, and the detection system performs out-of-bounds detection on the next frame image; if the detection result is that the edge distance exceeds the threshold value, the detection system judges that the frame image is in a state that the edge width exceeds the threshold value, the detection system converts the frame image into a state that the edge width exceeds the threshold value, the frame image is stored and displayed, the alarm lamp sends out an alarm signal, the detection process of the frame image is finished, and the detection system directly performs edge distance detection on the next frame image;
401) the detection system receives a detection flow ending instruction, ends the detection flow, returns to an initialization interface, receives an abnormal recording instruction and enters an abnormal recording interface; the abnormal record interface can view images and abnormal records stored in the detection process of the images;
501) the detection system receives the exit instruction and is closed;
the detection device used by the detection method comprises support rods, a support chassis, a first cross rod, a second cross rod, a backlight source, short rods, stay rods, a front light source, an image acquisition module, an alarm lamp, an industrial personal computer and an electrical control cabinet; the support rods are arranged on the support chassis, the support chassis is placed on the ground, and the support rods stand on the left and right sides of the detection point; the first cross rod is arranged between the support rods; the backlight source is arranged on the first cross rod; the short rods are arranged on the support rods, and a stay rod is arranged between each short rod and the corresponding support rod; the second cross rod is arranged between the short rods; the image acquisition module, the front light source and the alarm lamp are arranged on the second cross rod; the industrial personal computer and the electrical control cabinet are arranged on the support rods and are electrically connected with the image acquisition module and the front light source.
2. The method for on-line detection of wet-process glued synthetic leather hemming according to claim 1, wherein the out-of-bounds detection comprises: the image acquisition module acquiring an image in 24-bit BMP format with a set pixel size; selecting one frame of image, converting it into a gray image and performing mean filtering; dividing the mean-filtered image into N equal parts, and taking the first row of one of the N equal parts as the parameter i; the detection system judging whether the value of the parameter i is less than the total number of rows;
if the value of i is less than the total number of rows, judging whether the currently acquired image is a left image or a right image; if it is the left image, setting the parameter j to the 1st column of the ith row and setting the pixel value of the ith row, 0th column to 255; initializing to 0 the count of times that the absolute difference between the pixel values of the 0th column and the jth column of the ith row exceeds the set threshold, and initializing to 0 the count of times that it does not exceed the set threshold; the set threshold is 127; if it is the right image, setting the parameter j to the 1st column of the ith row and setting the pixel value of the ith row, 0th column to 0;
initializing to 0 the count of times that the absolute difference between the pixel values of the 0th column and the jth column of the ith row exceeds the set threshold, and initializing to 0 the count of times that it does not exceed the set threshold;
if the value of j is less than the total number of columns minus 1, computing the absolute difference between the pixel values of the 0th column and the jth column of the ith row, and judging whether this absolute value is greater than the set threshold;
if the absolute difference between the pixel values of the 0th column and the jth column of the ith row is greater than the set threshold, judging whether the count of times the absolute value exceeds the set threshold is greater than 0; if that count is not greater than 0, saving the value of j and incrementing the over-threshold count by 1; if that count is greater than 0, incrementing the over-threshold count by 1 and judging whether it is greater than a set value; if yes, saving the value of i, and if not, incrementing j by 1; incrementing j by 1 means moving to the next column;
if the absolute difference between the pixel values of the 0th column and the jth column of the ith row is not greater than the set threshold, judging whether the count of times the absolute value exceeds the set threshold is greater than 0; if that count is not greater than 0, incrementing j by 1;
if that count is greater than 0, incrementing by 1 the count of times the absolute value does not exceed the set threshold, judging whether this not-over-threshold count is greater than a set value, and if not, incrementing j by 1; if yes, deleting the saved j value of the ith row, clearing the count of times the absolute value does not exceed the set threshold, and clearing the count of times the absolute value exceeds the set threshold;
if the value of i is equal to the total number of rows, judging whether the number of saved rows is greater than 2; if yes, collecting the saved j column value of each row i, removing the maximum value, and computing the average of the remaining values, which is the located cloth-edge column value, and ending the process; if not, determining that the image is out of bounds and ending the process.
3. The method for on-line detection of wet-process glued synthetic leather hemming according to claim 1, wherein the detection of whether there is a burr at the edge of the cloth comprises: the detection system extracting a region to be detected according to the located cloth-edge column value, the region to be detected being denoted ROI; then converting the ROI into a gray image and performing Sobel edge detection and thresholding on the ROI; extracting all edge-contour pixel points, finding the contour with the longest perimeter, the pixel points on that contour forming a contour point set; traversing the contour point set to find the maximum distance between two of its points; and judging whether the longest perimeter is greater than a set multiple of the maximum distance;
if so, determining that a burr is present and finishing the burr detection; if not, determining that no burr exists and finishing the burr detection;
the image thresholding uses the maximum between-class variance method to determine the optimal threshold k* of the image gray value; the image is then thresholded at k* - 1: the gray value of pixel points greater than k* - 1 is set to 255 and the gray value of the remaining pixel points is set to 0;
the optimal threshold k* obtained by the maximum between-class variance method satisfies
\sigma_B^2(k^*) = \max_{1 \le k \le 256} \sigma_B^2(k),
where k is the assumed gray-value threshold and
\sigma_B^2(k) = \frac{[\mu_T\,\omega(k) - \mu(k)]^2}{\omega(k)\,[1 - \omega(k)]}
is the between-class variance.
4. The on-line detection method for the wet-process glued synthetic leather hemming is characterized in that the Sobel edge detection process is as follows:
(a) let the matrix A represent the matrix of pixel values of the ROI region;
(b) compute the derivative G_x in the horizontal direction (x) and the derivative G_y in the vertical direction (y) respectively;
horizontal direction: A is convolved with a kernel of size 3, as shown in formula (1):
G_x = \begin{bmatrix} -1 & 0 & +1 \\ -2 & 0 & +2 \\ -1 & 0 & +1 \end{bmatrix} * A  (1)
vertical direction: A is convolved with a kernel of size 3, as in formula (2):
G_y = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ +1 & +2 & +1 \end{bmatrix} * A  (2)
the derivative values of each pixel of the image in the horizontal direction and the vertical direction are combined by the following formula (3) to calculate the gradient value G of the pixel:
G = \sqrt{G_x^2 + G_y^2}.  (3)
5. The method for on-line detection of wet-process glued synthetic leather hemming according to claim 1, wherein the edge distance detection comprises: the image acquisition module acquiring a frame of image and sending it to the detection system; the detection system converting the image into a gray image and performing gray value detection; the gray value detection specifically comprising adjusting the camera exposure according to a gray setting threshold and judging whether the average gray value of the image has been adjusted to the set threshold; if not, returning to the acquisition module to acquire another frame of image and continuing the adjustment and judgment; if so, performing bilateral filtering on the image; performing Sobel edge detection and thresholding on the image and extracting the image edge pixel points; performing a Hough line transform with the extracted image edge pixel points, finding all straight-line segments in the image, saving the straight-line segments whose slope is greater than the set slope, and judging whether the number of saved straight-line segments is greater than 1; if the number of saved straight-line segments is not greater than 1, going to the detection result that the edge distance does not exceed the threshold and ending the detection; if the number of saved straight-line segments is greater than 1, then judging whether the currently processed image is a left image; if it is the left image, comparing the abscissa values of the midpoints of the straight-line segments and finding the two straight-line segments whose midpoint abscissa values are the smallest and second smallest; if it is not the left image, it is the right image, comparing the abscissa values of the midpoints of the straight-line segments and finding the two straight-line segments whose midpoint abscissa values are the largest and second largest;
calculating the distance between the two straight line segments and judging whether the distance is greater than a set edge distance threshold value; if so, turning to the detection result that the edge distance exceeds the threshold value, and finishing the detection; if not, the detection result is that the edge distance does not exceed the threshold, and the detection is finished.
6. The method for on-line detection of the wet-process glued synthetic leather hemming according to claim 5, wherein the Hough line transform finds straight-line segments along the following lines:
firstly, a polar coordinate system is used to represent a straight line; the expression of the straight line is given by formula (4), where r denotes the polar radius and θ the polar angle:
y = -\frac{\cos\theta}{\sin\theta}\,x + \frac{r}{\sin\theta}  (4)
which simplifies to formula (5):
r = x\cos\theta + y\sin\theta  (5)
next, for a given point (x_0, y_0), the family of straight lines passing through that point can be written collectively as:
r_\theta = x_0\cos\theta + y_0\sin\theta  (6)
this means that each pair (r_\theta, \theta) represents one straight line passing through the point (x_0, y_0);
for a given fixed point (x_0, y_0), the polar radius r_\theta changes as the polar angle θ changes; plotting in the plane θ-r all the straight lines passing through (x_0, y_0) yields a sinusoidal curve; the plane θ-r is the polar-radius-versus-polar-angle plane;
the above operation is performed for all edge pixel points in the image; if the number of curves intersecting at one point of the plane θ-r exceeds a certain threshold, the parameter pair represented by that intersection can be considered a straight line in the original image; the straight line is then obtained by back-substitution into formula (4).
7. The method for on-line detection of the wet-process glued synthetic leather hemming according to claim 1, wherein the abnormal record comprises an out-of-bounds information record and a hemming width super-threshold information record; when the detection system is in an out-of-bounds state and a normal state, only burr detection and out-of-bounds detection are carried out; when the detection system is in a curling state and the curling width exceeds a threshold value, only edge distance detection is carried out; when the detection process carries out-of-bound detection in a normal state, the counting and judging functions of out-of-bound and burr-free detection times are added; when the detection process detects whether burrs exist or not in an out-of-bound state, counting and judging functions for the number of detection times of burrs and burrs are added; when the detection process carries out edge distance detection in a curling state, the counting and judging functions of the detection times that the edge distance exceeds the threshold value and the edge distance does not exceed the threshold value are added; and when the edge distance is detected in the state that the width of the turned edge exceeds the threshold value, the detection process increases the counting and judging functions of the number of times that the edge distance does not exceed the threshold value.
CN201910726981.0A 2019-08-07 2019-08-07 Wet-process-gummed synthetic leather hemming on-line detection device and method Active CN110376211B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202110784407.8A CN113552134B (en) 2019-08-07 2019-08-07 Wet-process gluing synthetic leather hemming detection method
CN202110793717.6A CN113567447A (en) 2019-08-07 2019-08-07 Synthetic leather hemming online detection method
CN201910726981.0A CN110376211B (en) 2019-08-07 2019-08-07 Wet-process-gummed synthetic leather hemming on-line detection device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910726981.0A CN110376211B (en) 2019-08-07 2019-08-07 Wet-process-gummed synthetic leather hemming on-line detection device and method

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN202110784407.8A Division CN113552134B (en) 2019-08-07 2019-08-07 Wet-process gluing synthetic leather hemming detection method
CN202110793717.6A Division CN113567447A (en) 2019-08-07 2019-08-07 Synthetic leather hemming online detection method

Publications (2)

Publication Number Publication Date
CN110376211A CN110376211A (en) 2019-10-25
CN110376211B true CN110376211B (en) 2021-07-27

Family

ID=68258458

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201910726981.0A Active CN110376211B (en) 2019-08-07 2019-08-07 Wet-process-gummed synthetic leather hemming on-line detection device and method
CN202110793717.6A Pending CN113567447A (en) 2019-08-07 2019-08-07 Synthetic leather hemming online detection method
CN202110784407.8A Active CN113552134B (en) 2019-08-07 2019-08-07 Wet-process gluing synthetic leather hemming detection method

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN202110793717.6A Pending CN113567447A (en) 2019-08-07 2019-08-07 Synthetic leather hemming online detection method
CN202110784407.8A Active CN113552134B (en) 2019-08-07 2019-08-07 Wet-process gluing synthetic leather hemming detection method

Country Status (1)

Country Link
CN (3) CN110376211B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110376211B (en) * 2019-08-07 2021-07-27 浙江大学台州研究院 Wet-process-gummed synthetic leather hemming on-line detection device and method
CN114264661B (en) * 2021-12-06 2024-05-31 浙江大学台州研究院 Definition self-adaptive coiled material detection method, device and system
CN114486903B (en) * 2021-12-06 2024-05-14 浙江大学台州研究院 Gray-scale self-adaptive coiled material detection system, device and algorithm
CN116148265A (en) * 2023-02-14 2023-05-23 浙江迈沐智能科技有限公司 Flaw analysis method and system based on synthetic leather high-quality image acquisition

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BE621843A (en) * 1961-09-05
GB9024936D0 (en) * 1990-11-16 1991-01-02 Leicester Polytechnic Methods and apparatus for fabric inspection
JP2896326B2 (en) * 1995-05-02 1999-05-31 鐘紡株式会社 Evaluation method of crimp standing property of strong twist crimped fabric
CN101498672B (en) * 2008-01-31 2010-12-15 东莞市硕源电子材料有限公司 Method for detecting surface finish quality of dust-free cloth
JP2010133744A (en) * 2008-12-02 2010-06-17 Omron Corp Defect detection method, and visual inspection device using the same
CN102221559B (en) * 2011-03-05 2012-08-29 河海大学常州校区 Online automatic detection method of fabric defects based on machine vision and device thereof
CN202119723U (en) * 2011-06-09 2012-01-18 洛阳方智测控股份有限公司 Automatic cloth detecting machine
CN202330297U (en) * 2011-11-28 2012-07-11 陕西长岭纺织机电科技有限公司 Device for automatically detecting texture defects
CN104259111B (en) * 2013-02-27 2016-09-14 南通大学 The Pluma Anseris domestica of Photoelectric Detection scalds whole sorting equipment
US9778205B2 (en) * 2014-03-25 2017-10-03 Kla-Tencor Corporation Delta die and delta database inspection
CN105158272B (en) * 2015-09-22 2018-06-22 浙江工商大学 A kind of method for detecting textile defect
JP2017062154A (en) * 2015-09-24 2017-03-30 アイシン精機株式会社 Defect detection device and defect detection method
CN205749316U (en) * 2016-01-20 2016-11-30 浙江工业职业技术学院 A kind of cloth aberration defect detection machine
US10783624B2 (en) * 2016-07-18 2020-09-22 Instrumental, Inc. Modular optical inspection station
CN207142401U (en) * 2017-03-08 2018-03-27 扬州市嘉鑫织造实业有限公司 A kind of batcher with Defect Detection and processing function
CN107154039B (en) * 2017-04-28 2020-10-27 北京简易科技有限公司 Rubber tube online defect detection method
CN107870172A (en) * 2017-07-06 2018-04-03 黎明职业大学 A kind of Fabric Defects Inspection detection method based on image procossing
CN109521023A (en) * 2017-09-19 2019-03-26 东莞市伟通自动化科技有限公司 A kind of cladding surface detecting system
CN109166098A (en) * 2018-07-18 2019-01-08 上海理工大学 Work-piece burr detection method based on image procossing
CN110021006B (en) * 2018-09-06 2023-11-17 浙江大学台州研究院 Device and method for detecting whether automobile parts are installed or not
CN109727230B (en) * 2018-11-30 2023-04-28 西安工程大学 Device and method for measuring surface quality of fluff fabric
CN109884073A (en) * 2019-03-19 2019-06-14 东华大学 A kind of fabric defects detection device
CN109946311A (en) * 2019-03-25 2019-06-28 江苏博虏智能科技有限公司 Multi-angle multiple light courcess detection device
CN110084787B (en) * 2019-04-12 2021-03-23 浙江大学台州研究院 Synthetic leather gluing online monitoring method based on machine vision
CN110376211B (en) * 2019-08-07 2021-07-27 浙江大学台州研究院 Wet-process-gummed synthetic leather hemming on-line detection device and method

Also Published As

Publication number Publication date
CN113552134A (en) 2021-10-26
CN113552134B (en) 2024-05-24
CN110376211A (en) 2019-10-25
CN113567447A (en) 2021-10-29

Similar Documents

Publication Publication Date Title
CN110376211B (en) Wet-process-gummed synthetic leather hemming on-line detection device and method
CN110349145B (en) Defect detection method, defect detection device, electronic equipment and storage medium
DE102007055912A1 (en) Eyelid detection device, eyelid detection method and associated program
CN106887004A (en) A kind of method for detecting lane lines based on Block- matching
CN113777030A (en) Cloth surface defect detection device and method based on machine vision
CN112819844B (en) Image edge detection method and device
CN114881915A (en) Symmetry-based mobile phone glass cover plate window area defect detection method
CN113706490B (en) Wafer defect detection method
CN110648330B (en) Defect detection method for camera glass
CN102901735B (en) System for carrying out automatic detections upon workpiece defect, cracking, and deformation by using computer
CN116228651A (en) Cloth defect detection method, system, equipment and medium
CN113838043A (en) Machine vision-based quality analysis method in metal foil manufacturing
CN113390882A (en) Tire inner side defect detector based on machine vision and deep learning algorithm
CN111426693A (en) Quality defect detection system and detection method thereof
CN113139943B (en) Method and system for detecting appearance defects of open circular ring workpiece and computer storage medium
US11493453B2 (en) Belt inspection system, belt inspection method, and recording medium for belt inspection program
CN210720181U (en) Wet-process gummed synthetic leather hemming on-line measuring device
CN103605973A (en) Image character detection and identification method
JP4293653B2 (en) Appearance inspection method
CN109886912A (en) A kind of thrust bearing retainer detection method of surface flaw
CN114494142A (en) Mobile terminal middle frame defect detection method and device based on deep learning
JP7469740B2 (en) Belt inspection system and belt inspection program
CN114240920A (en) Appearance defect detection method
CN108062528A (en) A kind of lane recognition system and method based on Streaming Media inside rear-view mirror system
CN112161996A (en) Photovoltaic cell panel welding strip burnout detection method and system based on image recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant