CN111398294B - Straw particle defect detection system and detection method based on machine vision - Google Patents
- Publication number: CN111398294B
- Application number: CN202010301633.1A
- Authority: CN (China)
- Prior art keywords: particle, straw, boundary, line segments, straight
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
- G01N21/8806—Specially adapted optical and illumination features
- G06F18/2135—Feature extraction based on approximation criteria, e.g. principal component analysis
- G06F18/23213—Non-hierarchical clustering techniques with fixed number of clusters, e.g. K-means clustering
- G06F18/2411—Classification techniques based on the proximity to a decision surface, e.g. support vector machines
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/11—Region-based segmentation
- G06T7/13—Edge detection
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
- G01N2021/8466—Investigation of vegetal material, e.g. leaves, plants, fruits
- G01N2021/8854—Grading and classifying of flaws
- G01N2021/8867—Grading and classifying of flaws using sequentially two or more inspection runs, e.g. coarse and fine, or detecting then analysing
- G01N2021/8887—Scan or image signal processing based on image processing techniques
- G06N3/045—Combinations of networks
- G06T2207/10004—Still image; Photographic image
- G06T2207/20084—Artificial neural networks [ANN]
- G06T2207/30188—Vegetation; Agriculture
- Y02E50/10—Biofuels, e.g. bio-diesel
- Y02E50/30—Fuel from waste, e.g. synthetic alcohol or diesel
Abstract
The invention discloses a straw particle defect detection system based on machine vision, which belongs to the field of agricultural mechanization and comprises a particle control module, a particle image acquisition module, a camera bellows and a particle defect detection module. The particle control module acquires and discharges particles. The particle image acquisition module, arranged in the camera bellows, images the particles discharged by the particle control module to obtain particle images. The particle defect detection module detects defects in the particle images. The particle acquisition module comprises a particle loading and unloading unit and an electric control unit; the electric control unit controls the particle loading and unloading unit to obtain straw particles and conveys the obtained straw particles into the camera bellows for unloading. The system is a visual detection system suited to straw particles whose defect detection speed and detection accuracy meet practical requirements. It is dustproof, does not affect the normal operation of the ring die granulator, and can accurately and quickly detect defects of straw particles such as cracks, pits, improper length and improper roundness, so that a worker can be prompted to adjust parameters such as moisture and temperature, or to replace the die.
Description
Technical Field
The invention relates to the field of agricultural mechanization, in particular to a straw particle defect detection system based on machine vision.
Background
Straw solid-formed fuel has high energy density and combustion efficiency, high strength, is convenient to store, transport and use, is environmentally friendly and has great development potential, so the share of solidified straw fuel is gradually increasing and it is favored by many countries. However, in the straw solidification and forming process, the ring die granulator has a low degree of automation and low efficiency and is prone to blockage, which severely restricts the rapid development of the straw forming industry. An automatic straw particle defect detection system that applies machine vision and automatic control technology to improve straw forming quality and efficiency is therefore urgently needed. The ring die granulator forms straw particles from crushed straw at high temperature: if the temperature is too high, cracks appear on the surfaces of the straw particles, and if the temperature is too low, the particles do not form. In addition, the crushed straw requires a proper amount of water to form particles; with too much water the particles cannot form. Moreover, if the die of the ring die granulator is damaged, the straw particles develop pits. In the traditional approach, a worker inspects the state of the straw particles with the naked eye and adjusts parameters such as temperature and moisture, or replaces the die, according to what is observed. Since the ring die granulator is mounted on a large harvesting machine, it is difficult to observe with the naked eye whether the straw particles have defects and what those defects are.
Therefore, the invention provides a straw particle defect detection system based on machine vision, which obtains images of straw particles through a non-contact sensor, analyzes the images to identify defects of the straw particles, and provides the results to the working personnel for judging whether the working state of the ring die granulator is normal.
Disclosure of Invention
According to the problems in the prior art, the invention discloses a straw particle defect detection system based on machine vision, which is characterized in that: the particle defect detection device comprises a particle control module, a particle image acquisition module, a camera bellows and a particle defect detection module;
the particle control module acquires and discharges particles;
the particle image acquisition module arranged in the camera bellows acquires images of the particles discharged by the particle control module to obtain particle images;
the particle defect detection module is used for detecting the defects of the particle images;
the particle acquisition module comprises a particle loading and unloading unit and an electric control unit;
the electric control unit controls the particle loading and unloading unit to obtain straw particles, and the obtained straw particles are sent into the camera bellows for unloading;
the particle image acquisition module comprises a main light source, a backlight source and imaging equipment;
the imaging device is located at the top of the dark box,
the main light source is positioned in the middle of the dark box and below a lens of the imaging device;
the backlight source is positioned at the bottom of the dark box;
the main light source corresponds to the backlight source.
Further: the backlight source adopts a white light LED array.
Further, the method comprises the following steps: the particle defect detection module adopts the following steps to detect:
S1: extracting each independent region in the straw particle image collected under the backlight source in the imaging equipment by adopting an inner-boundary-tracking region detection algorithm based on line scanning;
S2: dividing the regions by area into straw scrap regions, independent straw particle regions and adhered straw particle regions (straw scrap regions are discarded, single straw particle regions are retained, and adhered straw particle regions proceed to the next step);
S3: dividing the adhered particle region into a plurality of sub-regions by using a prior-based k-means clustering algorithm;
S4: processing each sub-region by a local search method to obtain single straw particle regions;
S5: judging whether each single straw particle region obtained in S2 and S4 meets the appearance criterion; if it does, proceed to S6; if it does not, the particle is judged to have a length defect, a roundness defect or a flatness defect;
S6: extracting the features of the single straw particle from the image region collected under the main light source that corresponds to the single straw particle region meeting the appearance criterion in S5, and judging from these features whether the particle is normal or has pit or crack defects;
S7: counting the proportion of straw particles with each kind of defect among the total number of straw particles, and outputting the working state of the ring die granulator according to these proportions.
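As an illustration of the statistics step S7, the sketch below tallies per-particle defect labels and maps the defect proportions to an operating-state hint. The threshold values, label names and advice strings are assumptions for illustration, not values fixed by the patent:

```python
from collections import Counter

def granulator_status(defect_labels, crack_limit=0.05, pit_limit=0.05):
    """S7 sketch: compute the proportion of each defect among all pellets
    and derive a working-state hint for the ring die granulator.
    crack_limit / pit_limit are assumed thresholds, not patent values."""
    n = len(defect_labels)
    counts = Counter(defect_labels)
    props = {label: counts[label] / n for label in counts}
    advice = []
    # Per the description: cracks point at over-temperature, pits at die damage.
    if props.get("crack", 0.0) > crack_limit:
        advice.append("crack ratio high: temperature may be too high")
    if props.get("pit", 0.0) > pit_limit:
        advice.append("pit ratio high: the die may be damaged")
    return props, (advice or ["normal"])
```

For example, two cracked pellets out of ten yield a crack proportion of 0.2 and trigger the temperature hint.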
Further: the following method is specifically adopted in S1:
S11: searching from the upper left of the straw particle image collected by the imaging equipment in line-scan order until a pixel P0 smaller than the boundary threshold is found; the eight neighborhood positions of any pixel are numbered 0-7, with the right-hand pixel numbered 0 and the numbering proceeding counterclockwise, so that the lower-right pixel is numbered 7;
s12, searching an eight-neighborhood of the pixel Pn-1 in a counterclockwise direction until the pixel Pn smaller than a set threshold value M1 is found, and marking the Pn as a boundary point; the starting pixel searched in the anticlockwise direction is determined according to the eight neighborhood position of the boundary point Pn-1 relative to the boundary point Pn-2;
s13: judging whether Pn-1 is a left boundary point or a right boundary point of the region according to the eight neighborhood position of the boundary point Pn-1 relative to the boundary point Pn-2 and the eight neighborhood position of the boundary point Pn-1 relative to the boundary point Pn;
S14: if Pn-1 is a left boundary point, fill to the right, setting each successive pixel smaller than the threshold to a marking value (any value larger than M1) until the last pixel smaller than the threshold is reached, and mark that pixel as a boundary point; if Pn-1 is a right boundary point, fill to the left in the same way, setting each pixel smaller than the threshold to the marking value until the last such pixel, which is marked as a boundary point;
S15: repeating S12-S14 until no pixel smaller than the set threshold can be found in S12;
S16: the total number of pixels passed during filling together with the number of boundary points is recorded as the area of the current region, and P0-Pn together with the boundary points marked in S14 form the boundary point set of the current region.
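The patented tracker above detects boundary points and fills the region in a single line-scan traversal. As a simplified functional stand-in (not the scan-line algorithm itself), the sketch below flood-fills each dark region once and records its area and the pixels that touch the background, i.e. the same outputs that S16 requires:

```python
import numpy as np

def extract_regions(img, thresh):
    """Simplified stand-in for S11-S16: scan top-left to bottom-right;
    each unvisited foreground pixel (value < thresh, as in S11) seeds a
    flood fill that records the region's area and its boundary points."""
    h, w = img.shape
    visited = np.zeros((h, w), dtype=bool)
    regions = []
    for sy in range(h):
        for sx in range(w):
            if img[sy, sx] < thresh and not visited[sy, sx]:
                stack = [(sy, sx)]
                visited[sy, sx] = True
                area, boundary = 0, []
                while stack:
                    y, x = stack.pop()
                    area += 1
                    on_boundary = False
                    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and img[ny, nx] < thresh:
                            if not visited[ny, nx]:
                                visited[ny, nx] = True
                                stack.append((ny, nx))
                        else:
                            on_boundary = True  # neighbor is background or off-image
                    if on_boundary:
                        boundary.append((y, x))
                regions.append({"area": area, "boundary": boundary})
    return regions
```

On a 6x6 bright image with a 3x3 dark square, this returns one region of area 9 whose boundary set is the 8 pixels surrounding the square's center.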
Further: S3 comprises the following steps:
S31: calculating the gradient directions of all boundary points of the adhered straw particle region;
S32: calculating the number of cluster centers Ki according to the region area Ai;
S33: randomly selecting Ki cluster center points and initializing Ki sets to store the boundary points contained in each cluster;
S34: assigning each boundary point, according to its gradient direction, to the set of the nearest cluster center point;
S35: calculating the average gradient direction of each cluster and taking it as the new cluster center point;
S36: repeating S34-S35 until the change in the Ki cluster center points is smaller than a set value;
S37: for the Ki sets of boundary points, fitting a straight line to the boundary points in each set by a least-squares method that eliminates the maximum-error point, retaining the straight segments whose lengths meet the set value, and proceeding to S38;
S38: among the straight segments whose lengths meet the set value, finding segments with parallel directions and calculating the distance between each pair of parallel segments; pairs whose separation is approximately an integer multiple of the average particle width and is the nearest such distance are grouped together, and the same straight segment may be assigned to more than one group;
S39: dividing the region into a plurality of sub-regions according to the distance and length of each group of straight segments.
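The clustering core of S3 (steps S32-S36) can be sketched as below. Gradient directions are treated as plain scalars (angle wrap-around is ignored for brevity), and the rule mapping region area to the cluster count Ki is an assumed placeholder:

```python
import numpy as np

def cluster_count_from_area(area, single_particle_area):
    """Prior-based Ki (S32) sketch: roughly two edge clusters per adhered
    pellet. The factor of 2 is an assumption, not a patent value."""
    return max(2, 2 * int(round(area / single_particle_area)))

def kmeans_directions(grad_dirs, k, iters=100, tol=1e-4, seed=0):
    """S33-S36: k-means over boundary-point gradient directions."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(grad_dirs, size=k, replace=False)  # S33
    for _ in range(iters):
        # S34: assign each point to the nearest center in gradient direction.
        labels = np.argmin(np.abs(grad_dirs[:, None] - centers[None, :]), axis=1)
        # S35: the mean direction of each cluster becomes the new center.
        new = np.array([grad_dirs[labels == i].mean() if np.any(labels == i)
                        else centers[i] for i in range(k)])
        if np.max(np.abs(new - centers)) < tol:  # S36 stopping criterion
            centers = new
            break
        centers = new
    return labels, centers
```

With two well-separated bundles of directions, the two clusters recover the two dominant edge orientations of the adhered region.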
further: s39 specifically adopts the following mode:
s391, calculating the distance of each group of straight line segments, if the distance meets the allowable distance of a single straw particle, S392 is carried out, otherwise, S395 is carried out;
s392: if the lengths of the two straight-line segments meet the length allowed by a single straw particle, respectively connecting the end points of the two straight-line segments which are relatively close to each other to obtain independent sub-regions; otherwise, performing S393;
s393: if one straight line segment l1 in the lengths of the two straight line segments meets the length allowed by a single straw particle, extending the other straight line segment to enable two end points of the other straight line segment to be respectively connected with the two end points of the one straight line segment to obtain two straight line segments which are perpendicular to the one straight line segment l1, so as to obtain independent sub-regions; otherwise, proceed to S394;
s394: in other cases, the lengths of the two straight line segments can not meet the length allowed by a single straw particle, if other straight line segments are intersected with the two straight line segments, the two straight line segments are respectively connected with the end points which are close to each other, otherwise, the two straight line segments are extended to the boundary point of the region, and an independent subregion is obtained;
s395: adding straight line segments which are parallel to the L1 and the L2 and the end points of which are region boundary points at integral multiples of the average length of the two straight line segments of the straw particles away from the L1 or the L2 to obtain a plurality of independent sub-regions.
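The pairing test of S38 can be illustrated as follows. Each fitted segment is reduced to an (angle, perpendicular offset) pair, which is an assumed simplified representation; the function groups roughly parallel segments whose separation is close to an integer multiple of the expected pellet width:

```python
def group_parallel_segments(segments, pellet_width, angle_tol=0.1, dist_tol=0.2):
    """S38 sketch: segments are (angle_rad, offset) pairs, offset being the
    signed perpendicular distance of the segment's line from the origin.
    Returns (i, j, multiple) for every qualifying pair; as in S38, the same
    segment may appear in several groups."""
    groups = []
    for i in range(len(segments)):
        for j in range(i + 1, len(segments)):
            a1, d1 = segments[i]
            a2, d2 = segments[j]
            if abs(a1 - a2) > angle_tol:       # not parallel enough
                continue
            sep = abs(d1 - d2)
            mult = round(sep / pellet_width)   # nearest integer multiple
            if mult >= 1 and abs(sep - mult * pellet_width) <= dist_tol * pellet_width:
                groups.append((i, j, mult))
    return groups
```

Two near-parallel segments one pellet-width apart form a group; a segment at a very different angle is ignored.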
Further: S4 specifically adopts the following method:
S41: detecting edges in the image near the boundary of each sub-region using the Canny algorithm;
S42: if an edge exists, replacing the boundary of the sub-region with the edge; otherwise the boundary of the sub-region keeps the straight segment supplemented in S39;
S43: the processed boundary points of the sub-region are the boundary points of the single straw particle.
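The local search of S4 can be sketched as follows. The patent uses a Canny edge map near each provisional boundary (S41); to keep the sketch dependency-free, a gradient-magnitude threshold stands in for the edge map, which is an assumed simplification:

```python
import numpy as np

def refine_boundary(img, boundary_pts, window=2, grad_thresh=50.0):
    """S41-S42 sketch: for each provisional boundary point (a point on a
    segment supplemented in S39), look for a strong-gradient pixel within
    a small window and snap to the nearest one; otherwise keep the
    straight-line point."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)                      # stand-in for a Canny edge map
    h, w = img.shape
    refined = []
    for y, x in boundary_pts:
        best, best_d = (y, x), None
        for dy in range(-window, window + 1):
            for dx in range(-window, window + 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and mag[ny, nx] >= grad_thresh:
                    d = dy * dy + dx * dx
                    if best_d is None or d < best_d:
                        best, best_d = (ny, nx), d
        refined.append(best)                    # unchanged if no edge found (S42)
    return refined
```

A provisional point near a step edge snaps onto the edge, while a point with no strong gradient in its window is kept as-is.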
Further: the appearance criterion in S5 is as follows: the length of the single particle is within plus or minus 5 percent of the average length, the roundness error of the single particle is within plus or minus 5 percent, and the flatness of the single particle is within plus or minus 5 percent of the average flatness.
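The pre-screen of S5 reduces to three tolerance checks. A minimal sketch, assuming the reference values are calibration constants supplied elsewhere:

```python
def appearance_defects(length, roundness, flatness,
                       ref_length, ref_roundness, ref_flatness, tol=0.05):
    """S5 sketch: flag each attribute that deviates from its reference by
    more than the plus-or-minus 5 percent tolerance. An empty result means
    the pellet proceeds to the texture analysis of S6."""
    defects = []
    if abs(length - ref_length) > tol * ref_length:
        defects.append("length")
    if abs(roundness - ref_roundness) > tol * ref_roundness:
        defects.append("roundness")
    if abs(flatness - ref_flatness) > tol * ref_flatness:
        defects.append("flatness")
    return defects
```

A pellet 10 percent longer than the reference is flagged with a length defect; one within all tolerances passes through.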
Further: S6 specifically adopts the following method:
S61: normalizing the single straw particle image to a standard width;
S62: dividing the single straw particle image into a plurality of overlapping blocks along the length direction;
S63: extracting the features of each block with a convolutional neural network to obtain the feature vector of each block;
S64: reducing the dimensionality of each feature vector with PCA;
S65: classifying the straw particle with a support vector machine according to the dimension-reduced feature vectors, and judging whether cracks or pits exist.
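Steps S64-S65 can be sketched with plain NumPy. The per-block CNN features of S63 are assumed to come from some pretrained backbone upstream; PCA is implemented via SVD on the centered feature matrix, and only the decision rule of an already-trained linear SVM is shown (training itself, e.g. with an off-the-shelf solver, is assumed done offline):

```python
import numpy as np

def pca_reduce(features, n_components):
    """S64: project the (n_blocks, dim) feature matrix onto its top
    principal components, computed by SVD of the centered data."""
    centered = features - features.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T

def linear_svm_predict(w, b, x):
    """S65 decision rule of a trained linear SVM: sign of w . x + b
    (+1 for one class, -1 for the other, e.g. defective vs normal)."""
    return 1 if float(np.dot(w, x) + b) >= 0.0 else -1
```

For features that vary along a single axis, the one-component projection preserves exactly that variation, which is the behavior S64 relies on before the SVM decision of S65.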
By adopting the above technical scheme, the straw particle defect detection system based on machine vision provided by the invention addresses the pit and crack defects of straw particles that arise because the moisture, temperature and other parameters of the ring die granulator are difficult to control. By fully exploiting the imaging characteristics of straw particles, a visual detection system suited to straw particles is designed whose defect detection speed and detection accuracy meet practical requirements. The system is dustproof, does not affect the normal operation of the ring die granulator, and can accurately and quickly detect defects of straw particles such as cracks, pits, improper length and improper roundness, so as to monitor the working state of the ring die granulator.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments described in the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 (a) is a schematic diagram of a hopper for picking up straw particles according to the present invention;
FIG. 1 (b) is a schematic view of a hopper according to the present invention moving to a specified position in a black box;
FIG. 1 (c) is a schematic diagram of the hopper for recovering straw particles according to the present invention;
FIG. 2 is a schematic diagram of a primary light source image according to the present invention;
FIG. 3 is a schematic view of an imaging apparatus according to the present invention;
FIG. 4 is a schematic view of an image of a backlight according to the present invention;
FIG. 5 (a) is a schematic diagram of a backlight according to the present invention;
FIG. 5 (b) is a schematic diagram of a main light source according to the present invention;
FIG. 6 is a schematic diagram of eight neighborhood direction definitions according to the present invention;
FIG. 7 is a schematic diagram illustrating the definition of the left and right boundaries of the independent areas according to the present invention;
FIG. 8 (a) is a schematic view of the gradient direction according to the present invention;
FIG. 8 (b) is a diagram showing the gradient direction finding according to the present invention;
FIG. 9 (a) is a schematic view of an original adhesion image of straw particles according to the present invention;
FIG. 9 (b) is a schematic diagram of the adhesion area of straw particles according to the present invention;
FIG. 10 (a) is a schematic view of an original image before k-means segmentation according to the present invention;
FIG. 10 (b) is a schematic view of the region of each straw particle obtained after k-means segmentation according to the present invention;
FIG. 11 is a schematic view of the rotation of straw particles according to the present invention;
FIG. 12 is a block diagram of the present invention;
FIG. 13 is a schematic diagram of the defective status of straw particles according to the present invention.
Detailed Description
In order to make the technical solutions and advantages of the present invention clearer, the following describes the technical solutions in the embodiments of the present invention clearly and completely with reference to the drawings in the embodiments of the present invention:
the invention adopts an automatic pickup device to sample straw particles produced by a ring die granulator, collects images of the straw particles by combining a main light source and a backlight source, obtains the images of single straw particles in a bottom-up mode, judges whether the defects of length and roundness exist or not by appearance characteristics, judges whether the defects of cracks and pits exist or not by a convolutional neural network and a support vector machine, realizes real-time detection of the defects of the straw particles, and does not influence the normal work of the ring die granulator.
A straw particle defect detection system based on machine vision comprises a particle control module, a particle image acquisition module, a camera bellows and a particle defect detection module;
In order to detect the straw particles in real time, a certain amount of straw particles needs to be picked up from the die outlet of the ring die granulator; the particle control module acquires and discharges the particles. During operation of the ring die granulator, a large amount of dust is inevitably generated, and industrial lenses are susceptible to contamination, and even to blurring and damage, if the visual inspection device is exposed to the open air. Therefore, the particle image acquisition module arranged in the camera bellows images the particles discharged by the particle control module to obtain particle images, and the particle defect detection module detects defects in the particle images.
the particle acquisition module comprises a particle loading and unloading unit and an electric control unit; the electric control unit controls the particle loading and unloading unit to obtain straw particles, and the obtained straw particles are sent into the camera bellows to be unloaded;
The particle control module can adopt the arrangement shown in figures 1 (a), (b) and (c). A hopper picks up straw particles from a plurality of die holes around the die, as shown in fig. 1 (a); then, under motor control, a movable baffle on the side of the camera bellows opens and the hopper is conveyed into the camera bellows, as shown in fig. 1 (b). After the straw particles reach the area above the backlight plate, they are scattered freely onto the backlight plate; the scattering speed and position of the hopper are controlled so that the scattered straw particles overlap and adhere as little as possible. Subsequently, the hopper is moved below the bottom baffle of the camera bellows, as shown by the blue arrow in fig. 1 (c). After the straw particle image has been collected, the movable baffle at the bottom of the camera bellows opens and the straw particles on the backlight plate are swept into the hopper, as shown in fig. 1 (c). The hopper then sends the straw particles to a recovery unit.
The particle image acquisition module comprises a main light source, a backlight source and imaging equipment. The image collected under the main light source alone is shown in fig. 2; an obvious shadow can be seen, which interferes with the extraction of the straw particles. The invention therefore combines a backlight source with the main light source; the imaging device is shown in fig. 3. An image is first collected under the backlight, as shown in fig. 4, to extract single straw particles and detect their appearance defects, and an image is then collected under the main light source so that crack and pit defects can be detected from the texture of the straw particles. By calculation, the invention uses 36 diodes arranged as a backlight in the form shown in fig. 5 (a). Fig. 5 (b) shows the main light source, a conical LED array.
Analysis of the working state of the ring die granulator shows that the straw particles develop defects such as cracks, pits, insufficient length and improper roundness. To detect these defects, single straw particles are first extracted for pre-screening and their length and roundness are analyzed; cracks and pits of each single straw particle are then detected from texture. Finally, the proportion of each defect is counted and the working state of the ring die granulator is output according to these proportions. The particle defect detection module performs the following specific detection steps:
S1: extracting each independent region in the straw particle image collected under the backlight source by the imaging device, using an inner-boundary-tracking region detection algorithm based on line scanning. Since the image contains several regions of straw particles, detection of the next region would be affected if a detected region were not filled. The line-scanning inner-boundary-tracking algorithm detects the boundary points of a region and fills it at the same time, traversing the region only once, so that the boundary points are obtained and the region is filled in a single pass. The algorithm comprises the following specific steps:
S11: searching from the upper left of the straw particle image collected by the imaging device, in line-scanning order, until a pixel P0 smaller than the boundary threshold is found; the eight neighborhood positions of any pixel are numbered 0-7, with the pixel on the right as number 0 and the numbering proceeding counterclockwise so that the lower-right pixel is number 7, as shown in FIG. 6;
S12: searching the eight-neighborhood of pixel Pn-1 counterclockwise until a pixel Pn smaller than the threshold M1 is found, and marking Pn as a boundary point; the starting pixel of the counterclockwise search is determined by the eight-neighborhood position dir of boundary point Pn-1 relative to boundary point Pn-2: if dir is even, the search starts from the neighborhood point numbered (dir + 7) mod 8; if dir is odd, from the neighborhood point numbered (dir + 6) mod 8;
S13: judging whether Pn-1 is a left or right boundary point of the region according to the eight-neighborhood position dir of boundary point Pn-1 relative to boundary point Pn-2 and the eight-neighborhood position dirnext of boundary point Pn-1 relative to boundary point Pn; the left and right boundaries are shown in fig. 7, red marking the left boundary and green the right boundary. If Pn-1 is a left or right boundary point of the region, the filling operation of S14 is performed; in all other cases S15 is performed without filling. Pn-1 is judged as follows: Pn-1 is a left boundary point if (dir = 5) or (dir = 6 and dirnext ≠ 1) or (dir = 7 and dirnext ≠ 1 and dirnext ≠ 2); Pn-1 is a right boundary point if (dir = 1 and dirnext ≠ 7) or (dir = 2 and dirnext ≠ 5) or (dir = 3 and dirnext ≠ 1 and dirnext ≠ 6);
S14: if Pn-1 is a left boundary point, filling to the right: pixels smaller than the set threshold are set in turn to a marking value (any number larger than M1) until the last pixel smaller than the threshold is reached, and that pixel is marked as a boundary point; if Pn-1 is a right boundary point, filling to the left in the same way and marking the last sub-threshold pixel as a boundary point;
S15: updating the value of dir to dirnext, updating n to n + 1, and repeating S12-S14 until the tracking returns to the starting pixel P0;
S16: the number of pixels passed during the filling process plus the number of boundary points is recorded as the area of the current region, and the boundary points P0-Pn together with those marked in S14 form the boundary point set of the current region.
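As a rough illustration, the tracking loop of S11, S12 and S15 can be sketched as follows. This is a minimal reading of the algorithm, not the patent's implementation: the function name, the dark-on-light threshold convention, the initial direction and the simple stop-at-P0 termination are assumptions, and the simultaneous left/right filling and area counting of S13, S14 and S16 are omitted.

```python
import numpy as np

# Eight-neighborhood offsets as (row, col), numbered as in the text:
# 0 is the pixel to the right, the numbering proceeds counterclockwise,
# so 7 is the lower-right pixel.
OFFSETS = [(0, 1), (-1, 1), (-1, 0), (-1, -1),
           (0, -1), (1, -1), (1, 0), (1, 1)]

def trace_inner_boundary(img, threshold):
    """Trace the inner boundary of the first dark region met by line scanning."""
    h, w = img.shape
    start = None
    for r in range(h):              # S11: line scan from the upper left
        for c in range(w):
            if img[r, c] < threshold:
                start = (r, c)
                break
        if start is not None:
            break
    if start is None:
        return []
    boundary = [start]
    cur, dir_ = start, 7            # assumed initial direction for the first search
    while True:
        # S12: counterclockwise search; restart index from the even/odd rule
        first = (dir_ + 7) % 8 if dir_ % 2 == 0 else (dir_ + 6) % 8
        nxt = None
        for k in range(8):
            d = (first + k) % 8
            r, c = cur[0] + OFFSETS[d][0], cur[1] + OFFSETS[d][1]
            if 0 <= r < h and 0 <= c < w and img[r, c] < threshold:
                nxt = ((r, c), d)
                break
        if nxt is None:
            break                   # isolated pixel, nothing to follow
        cur, dir_ = nxt
        if cur == start:
            break                   # simplified S15 stop: back at P0
        boundary.append(cur)
    return boundary
```

On a synthetic image with a single dark block, the function returns the block's perimeter pixels in tracking order, which is the boundary point set that S16 would record.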
S2: judging from its area whether each region is straw debris, an independent straw particle region or an adhered straw particle region; straw debris regions are discarded, independent straw particle regions are retained, and adhered straw particle regions are passed to S3;
S3: dividing each adhered particle region into several sub-regions using a prior-based kMeans clustering algorithm. Although the motor-controlled drop of the straw particles avoids adhesion as far as possible, the particles are cylindrical solids and inevitably move after landing on the backlight plate, causing adhesion. The average area and average length of a single straw particle are the available prior knowledge. To save computation time, the invention computes the gradient direction of each boundary point with the gradient direction lookup table shown in fig. 8. Fig. 8 (a) defines the gradient direction values, 0 vertically downward and increasing counterclockwise to 2π; fig. 8 (b) is the lookup table itself, whose abscissa and ordinate are the horizontal and vertical gradient values computed from the pixel values and whose entries are the gradient directions. The segmentation comprises the following specific steps:
S31: calculating the gradient directions of all boundary points of the adhered straw particle region; after the horizontal and vertical gradient values of a boundary point are computed, its gradient direction is read from the gradient direction lookup table.
S32: calculating the number of cluster centers Ki from the region area Ai; the number of straw particles possibly contained in the region is the quotient of Ai divided by the average area of a single straw particle, and 3 is added to this quotient as a margin for possible short particles; since each particle contributes two straight line segments, the number of cluster centers is Ki = (Ai / average area + 3) × 2;
S33: randomly selecting Ki cluster center points in the range 0 to 2π, and initializing Ki sets to store the boundary points of each class;
S34: assigning each boundary point, according to its gradient direction, to the set of the nearest cluster center;
S35: computing the average gradient direction of each class and taking it as the new cluster center;
S36: repeating S34-S35 until the change of each of the Ki cluster centers is smaller than a set value.
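The clustering loop of S33-S36 can be sketched as follows. This is a hedged illustration, not the patent's code: the names are invented, the wrap-around angular distance and the circular mean used for the "average gradient direction" of S35 are one reasonable treatment of directions in [0, 2π), and the cluster count follows the prior of S32, Ki = (Ai / average area + 3) × 2.

```python
import math
import random

def cluster_gradient_directions(angles, area, avg_area, iters=50):
    """Prior-based kMeans on boundary-point gradient directions (radians)."""
    ki = (int(area // avg_area) + 3) * 2          # S32: cluster count from prior

    def circ_dist(a, b):                          # distance on the circle, so
        d = abs(a - b) % (2 * math.pi)            # 0 and 2*pi count as close
        return min(d, 2 * math.pi - d)

    centers = [random.uniform(0, 2 * math.pi) for _ in range(ki)]  # S33
    groups = [[] for _ in range(ki)]
    for _ in range(iters):
        groups = [[] for _ in range(ki)]
        for a in angles:                          # S34: assign to nearest center
            groups[min(range(ki), key=lambda i: circ_dist(a, centers[i]))].append(a)
        moved = 0.0
        for i, g in enumerate(groups):            # S35: circular mean per class
            if g:
                s = sum(math.sin(a) for a in g)
                c = sum(math.cos(a) for a in g)
                new = math.atan2(s, c) % (2 * math.pi)
                moved = max(moved, circ_dist(new, centers[i]))
                centers[i] = new
        if moved < 1e-3:                          # S36: centers barely move
            break
    return centers, groups
```

Each resulting group collects the boundary points of one gradient direction; S37 then fits straight line segments to them.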
S37: because position is ignored during clustering, each of the Ki boundary-point sets may contain points that do not lie on the same straight line segment; such points may belong to the two ends of a straw particle or to other, same-direction line segments elsewhere. A least squares fit that rejects the maximum-error point is therefore applied to the boundary points of each set, and the rejected points are themselves fitted again by least squares, repeatedly, until the number of remaining points falls below a set value. Straight line segments whose length meets the set value are retained, and the next step is performed;
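The iterated fit-and-reject of S37 might look like the sketch below. The names are illustrative, a non-vertical line is assumed for simplicity, and the patent's exact stopping values are replaced by placeholder tolerances.

```python
import math
import numpy as np

def fit_line_reject_outliers(points, err_tol=1.0, min_points=4):
    """Least-squares line fit that repeatedly rejects the maximum-error point.
    points: iterable of (x, y); returns (slope, intercept, kept_points)."""
    pts = np.asarray(points, dtype=float)
    while len(pts) >= min_points:
        a, b = np.polyfit(pts[:, 0], pts[:, 1], 1)        # fit y = a*x + b
        # perpendicular distance of each point to the fitted line
        err = np.abs(a * pts[:, 0] - pts[:, 1] + b) / math.hypot(a, 1.0)
        worst = int(np.argmax(err))
        if err[worst] <= err_tol:
            return a, b, pts                              # all residuals small
        pts = np.delete(pts, worst, axis=0)               # reject worst point
    return None                                           # too few points left
```

A single stray point on a near-collinear set is discarded on the first pass, after which the fit converges on the remaining points.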
S38: among the straight line segments whose length meets the set value, finding the segments with parallel directions and calculating the distance between each parallel pair; segments whose separation is approximately an integral multiple of the average distance between the two straight line segments of a particle, and which are nearest to each other, are grouped together, and the same segment may appear in different groups. When grouping, the two segments belonging to one straw particle fall into one group: they are parallel, their separation is exactly one average distance and, since straw particles do not overlap, no other segment lies between them, so segments of the same particle are nearest to each other. When several straw particles are adhered side by side, as shown in fig. 9, the segments at the adhered junctions may not be detected; in that case the outer segments of the parallel particles are parallel and separated by an integral multiple of the average distance.
S39: dividing the region into several sub-regions according to the distance and length of each group of straight line segments; the different cases and their handling are as follows:
S391: calculating the distance between the straight line segments L1 and L2 of each group; if it meets the distance allowed for a single straw particle, proceed to S392, otherwise to S395;
S392: if the lengths of both segments meet the length allowed for a single straw particle, connecting the mutually nearer end points of the two segments to obtain an independent sub-region; otherwise proceed to S393;
S393: if one of the two segments, L1, meets the length allowed for a single straw particle, extending the other segment so that its two end points are connected to the two end points of L1 by two segments perpendicular to L1, giving an independent sub-region; otherwise proceed to S394;
S394: in the remaining case, neither segment meets the length allowed for a single straw particle; if other segments intersect the two segments, the mutually nearer end points are connected; otherwise the two segments are extended to the region boundary points, giving an independent sub-region;
S395: adding straight line segments parallel to L1 and L2, with region boundary points as their end points, at integral multiples of the average length of the two straight line segments of a straw particle away from L1 or L2, to obtain several independent sub-regions;
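The pairing rule of S38 — parallel segments whose separation is close to an integral multiple of the average particle width — can be sketched as below. The (angle, signed offset) representation of a segment and all tolerance values are simplifications introduced here for illustration, not the patent's data structures.

```python
def group_parallel_segments(segments, avg_width, angle_tol=0.1, dist_tol=0.2):
    """Group roughly parallel segments whose separation is close to an
    integral multiple of the average particle width (S38). Each segment is
    given as (angle_radians, signed_distance_from_origin)."""
    groups = []
    for i in range(len(segments)):
        for j in range(i + 1, len(segments)):
            a1, d1 = segments[i]
            a2, d2 = segments[j]
            if abs(a1 - a2) > angle_tol:
                continue                      # not parallel enough
            sep = abs(d1 - d2)
            mult = round(sep / avg_width)     # nearest integer multiple
            if mult >= 1 and abs(sep - mult * avg_width) <= dist_tol * avg_width:
                groups.append((i, j, mult))   # mult == 1: two sides of one particle
    return groups
```

A multiplicity of 1 suggests the two sides of one particle; larger multiples correspond to the parallel-adhesion case of fig. 9, where the junction segments go undetected.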
S4: processing each sub-region by a local search method to obtain single straw particle regions; because the sub-regions obtained in S3 are bounded by fitted segments, added segments and boundary points, further processing is required to obtain each sub-region's boundary points on the image, as follows:
S41: detecting edges with the Canny algorithm near the boundary of each sub-region; the Canny edge is a single edge and is therefore suitable for describing the region boundary.
S42: where such an edge exists, it replaces the sub-region boundary; otherwise the boundary keeps the straight line segment compensated in S39;
S43: the processed boundary points of a sub-region are the boundary points of a single straw particle.
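S41-S43 can be read as snapping each provisional boundary point to the nearest edge pixel nearby, keeping the compensated straight-line point when no edge exists. The sketch below assumes the edge map is already computed (in practice it would come from a Canny detector such as OpenCV's); the window size and names are illustrative.

```python
import numpy as np

def refine_boundary(boundary_pts, edge_map, search_radius=3):
    """Snap each provisional boundary point to the nearest edge pixel within
    a small window; keep the original point when no edge is found, i.e. fall
    back to the straight-line segment compensated in S39."""
    h, w = edge_map.shape
    refined = []
    for r, c in boundary_pts:
        best, best_d = (r, c), None
        r0, r1 = max(0, r - search_radius), min(h, r + search_radius + 1)
        c0, c1 = max(0, c - search_radius), min(w, c + search_radius + 1)
        ys, xs = np.nonzero(edge_map[r0:r1, c0:c1])   # edge pixels in window
        for y, x in zip(ys + r0, xs + c0):
            d = (y - r) ** 2 + (x - c) ** 2
            if best_d is None or d < best_d:
                best, best_d = (int(y), int(x)), d
        refined.append(best)
    return refined
```

Points near a detected edge move onto it; points far from any edge are left where the straight-line compensation placed them.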
S5: the single straw particles obtained in S2 and S4 are shown in FIG. 10. Each particle is judged against the following appearance criteria: its length must be within plus or minus 5 percent of the average length, and its roundness within plus or minus 5 percent of the average roundness. Here the average length is the offline-measured average length of the two straight line segments of a single particle; the roundness is the percentage of points on one straight line segment whose distance to the other segment meets a threshold; and the average roundness is the offline-measured average roundness of single straw particles. A particle that meets the criteria proceeds to S6; a particle that does not has a length defect or a roundness defect;
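The ±5% appearance criteria of S5 amount to a simple check. The function below is illustrative; the averages stand for the offline-measured values the text mentions.

```python
def appearance_ok(length, roundness, avg_length, avg_roundness, tol=0.05):
    """S5 sketch: a particle passes the appearance check when its length and
    roundness are both within +/-5% of the offline-measured averages."""
    return (abs(length - avg_length) <= tol * avg_length
            and abs(roundness - avg_roundness) <= tol * avg_roundness)
```

Particles failing either bound are flagged with a length or roundness defect and skip the texture stage of S6.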
S6: for the image region collected under the main light source corresponding to each single straw particle region satisfying the appearance criteria in S5, extracting features of the single straw particle and judging from them whether the particle is normal or has pit or crack defects, as follows:
S61: rotating the single straw particle to a vertical orientation, the rotation angle being the angle between the straight line segments on the particle's two sides and the vertical, as shown in fig. 11; then normalizing the single-particle image to a width equal to the average distance between the two straight line segments of a straw particle;
S62: dividing the single-particle image into several overlapping blocks along its length; because the positions and lengths of pits and cracks are not fixed, overlapping blocks help prevent a pit or crack from being split across several blocks, as shown in FIG. 12;
S63: extracting the features of each block with a convolutional neural network to obtain a feature vector for each block, and reducing its dimensionality with PCA (principal component analysis);
S64: classifying each block with a support vector machine on the reduced feature vector; if any block contains a crack or pit, the straw particle is judged to have a crack or pit defect, as shown in FIG. 13;
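The overlapping split of S62 can be sketched as follows. Block height and stride are illustrative parameters, and the CNN, PCA and SVM stages of S63-S64 are only referenced in the comments rather than implemented.

```python
import numpy as np

def split_overlapping_blocks(particle_img, block_h, stride):
    """S62 sketch: cut the normalized particle image into vertically
    overlapping blocks so a crack or pit near a block border still falls
    wholly inside at least one block. Each block would then go through the
    CNN feature extraction, PCA reduction and SVM classification of S63-S64."""
    h = particle_img.shape[0]
    tops = list(range(0, max(h - block_h, 0) + 1, stride))
    if tops and tops[-1] != h - block_h:
        tops.append(h - block_h)   # final block flush with the bottom edge
    return [particle_img[t:t + block_h] for t in tops]
```

With a stride smaller than the block height, consecutive blocks share rows, so a defect lying on a block border is still fully contained in the neighbouring block.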
S7: counting the proportion of straw particles with each kind of defect among the total number of particles, and outputting the working state of the ring die granulator according to these proportions.
The above description covers only a preferred embodiment of the present invention; the scope of the invention is not limited thereto, and any modification or substitution that a person skilled in the art could readily conceive within the technical scope disclosed herein shall fall within the protection scope of the present invention.
Claims (5)
1. A straw particle defect detection system based on machine vision, characterized in that: the system comprises a particle control module, a particle image acquisition module, a dark box and a particle defect detection module;
the particle control module acquires and discharges particles;
the particle image acquisition module, arranged in the dark box, acquires images of the particles discharged by the particle control module to obtain particle images;
the particle defect detection module is used for detecting the defects of the particle images;
the particle control module comprises a particle loading and unloading unit and an electric control unit;
the electric control unit controls the particle loading and unloading unit to obtain straw particles, and sends the obtained straw particles into the dark box to be unloaded;
the particle image acquisition module comprises a main light source, a backlight source and imaging equipment;
the imaging device is located at the top of the dark box,
the main light source is positioned in the middle of the dark box and below a lens of the imaging device;
the backlight source is positioned at the bottom of the dark box;
the main light source corresponds to the backlight source;
the straw particle defect detection system adopts the following method for detection:
the particle defect detection module adopts the following steps to detect:
s1: extracting each independent area in the straw particle image collected under the backlight source in the imaging equipment by adopting an inner boundary tracking area detection algorithm based on line scanning;
s2: classifying each region, according to its area, as straw debris, an independent straw particle region or an adhered straw particle region;
s3: dividing the adhered particle region into a plurality of sub-regions by using a prior-based kMean clustering algorithm;
s4: processing each subarea by a local search method to obtain a single straw particle area;
s5: judging whether the single straw particle area obtained in the S2 and the S4 meets the appearance judgment standard or not, if the single straw particle meets the appearance judgment standard, carrying out S6, and if the single straw particle does not meet the appearance judgment standard, determining that the particle has a length defect, a roundness defect or a flatness defect;
s6: extracting the characteristics of the single straw particles from the image area collected under the main light source corresponding to the single straw particle area meeting the appearance standard in the step S5, and judging whether the single straw particles are normal or have pit and crack defects according to the characteristics;
s7: counting the proportion of the straw particles with various defects in the total number of the straw particles, and outputting the working state of the ring die granulator according to the proportion;
the following mode is specifically adopted in S1:
s11, searching from the upper left of the straw particle image collected by the imaging device in line-scanning order until a pixel P0 smaller than the boundary threshold is found; the eight neighborhood positions of any pixel are numbered 0-7, with the pixel on the right as number 0 and the numbering proceeding counterclockwise so that the lower-right pixel is number 7;
s12, searching an eight-neighborhood of the pixel Pn-1 in a counterclockwise direction until the pixel Pn smaller than a set threshold value M1 is found, and marking the Pn as a boundary point; the starting pixel searched in the anticlockwise direction is determined according to the eight neighborhood position of the boundary point Pn-1 relative to the boundary point Pn-2;
s13: judging whether Pn-1 is a left boundary point or a right boundary point of the region according to the eight neighborhood position of the boundary point Pn-1 relative to the boundary point Pn-2 and the eight neighborhood position of the boundary point Pn-1 relative to the boundary point Pn;
s14, when Pn-1 is a left boundary point, filling to the right: pixels smaller than the set threshold are set in turn to a marking value (any number larger than M1) until the last pixel smaller than the threshold is reached, and that pixel is marked as a boundary point; when Pn-1 is a right boundary point, filling to the left in the same way and marking the last sub-threshold pixel as a boundary point;
s15: repeating S12-S14 until the tracking returns to the starting pixel P0;
s16: the number of pixel points and the number of boundary points passing through in the filling process are combined and recorded as the area of the current region, and the boundary points marked in P0-Pn and S14 are used as the boundary point set of the current region;
s3 comprises the following steps:
s31: calculating the gradient directions of all boundary points of the adhered straw particle area;
s32, calculating the number Ki of clustering centers according to the area Ai of the region;
s33, randomly selecting Ki clustering center points, and initializing Ki sets to store boundary points contained in each class;
s34, dividing the boundary points into sets corresponding to the clustering center points with the shortest distance according to the gradient direction of the boundary points;
s35, calculating the average gradient direction of each class and taking the average gradient direction as a new clustering center point;
s36, repeating S34-S35 until the variation error of the Ki cluster central points is smaller than a set value;
s37: for the Ki sets containing the boundary points, performing linear fitting on the boundary points in each set by using a least square method for eliminating the maximum error point, reserving straight segments with the lengths meeting the set value, and performing S38;
s38, finding out straight-line segments with parallel directions from the straight-line segments with the lengths meeting a set value, respectively calculating the distances of the straight-line segments with the parallel directions, and grouping the straight-line segments with the distances which are about integral multiples of the average distance of the two straight-line segments and the nearest distance into one group, wherein the same straight-line segment can be grouped into different groups;
s39: dividing the region into a plurality of sub-regions according to the distance and the length of each group of straight line segments;
s4, the following method is specifically adopted:
s41: detecting edges in the image by using a canny algorithm near the boundary of each sub-area;
s42: if the edge exists, the boundary of the sub-region is replaced by the edge, otherwise, the boundary of the sub-region still adopts the straight line segment compensated in the S39;
s43: the treated boundary point of the subarea is the boundary point of the single straw particle.
2. The straw particle defect detection system based on machine vision as claimed in claim 1, further characterized in that: the backlight source adopts a white light LED array.
3. The straw particle defect detection system based on machine vision as claimed in claim 1, further characterized in that: s39 specifically adopts the following mode:
s391, calculating the distance between each group of straight line segments L1 and L2, if the distance meets the allowable distance of a single straw particle, carrying out S392, otherwise, carrying out S395;
s392: if the lengths of the two straight line segments meet the length allowed by a single straw particle, respectively connecting the end points of the two straight line segments which are relatively close to each other to obtain independent sub-regions; otherwise, performing S393;
s393: if one straight line segment L1 in the two straight line segment lengths meets the length allowed by a single straw particle, extending the other straight line segment to enable two end points of the other straight line segment to be respectively connected with the two end points of the L1 to obtain two straight line segments which are perpendicular to the L1, so as to obtain independent sub-regions; otherwise, proceed to S394;
s394: otherwise, the lengths of the two straight-line segments cannot meet the length allowed by a single straw particle, if other straight-line segments intersect with the two straight-line segments, the two straight-line segments are respectively connected with the end points which are close to each other, otherwise, the two straight-line segments are extended to the boundary point of the region, and an independent subregion is obtained;
s395: adding straight line segments which are parallel to the L1 and the L2 and the end points of which are region boundary points at integral multiples of the average length of the two straight line segments of the straw particles away from the L1 or the L2 to obtain a plurality of independent sub-regions.
4. The straw particle defect detection system based on machine vision as claimed in claim 1, further characterized in that: s5, the appearance judgment standard is as follows: the length of the single particle is within plus or minus 5 percent of the average length error, the roundness of the single particle is within plus or minus 5 percent of the roundness error of the single particle, and the flatness of the single particle is within plus or minus 5 percent of the average flatness.
5. The straw particle defect detection system based on machine vision as claimed in claim 1, further characterized in that: s6 specifically adopts the following mode:
s61: standardizing the single straw particle image according to the width;
s62: dividing a single straw particle image into a plurality of overlapped blocks in the length direction;
s63: extracting the characteristics of each block by using a convolutional neural network to obtain the characteristic vector of each block;
s64: reducing the dimension of each feature vector by using PCA;
s65: and classifying the straw particles by using a support vector machine according to the feature vector after the dimension reduction, and judging whether cracks or pits exist.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010301633.1A CN111398294B (en) | 2020-04-16 | 2020-04-16 | Straw particle defect detection system and detection method based on machine vision |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010301633.1A CN111398294B (en) | 2020-04-16 | 2020-04-16 | Straw particle defect detection system and detection method based on machine vision |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111398294A CN111398294A (en) | 2020-07-10 |
CN111398294B true CN111398294B (en) | 2022-11-25 |
Family
ID=71431588
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010301633.1A Active CN111398294B (en) | 2020-04-16 | 2020-04-16 | Straw particle defect detection system and detection method based on machine vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111398294B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116026855B (en) * | 2023-02-27 | 2024-08-20 | 英飞智信(北京)科技有限公司 | Sample analysis system for solid particle inspection based on visual analysis |
CN117589641B (en) * | 2024-01-19 | 2024-04-02 | 北京春风药业有限公司 | Particle screening detection system for traditional Chinese medicine particle production |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110706210A (en) * | 2019-09-18 | 2020-01-17 | 五邑大学 | Deep learning-based rebar counting method and device |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB1561110A (en) * | 1977-12-02 | 1980-02-13 | Probe Eng Co Ltd | Grain detection apparatus |
DE102004056520A1 (en) * | 2004-11-24 | 2006-06-01 | Amazonen-Werke H. Dreyer Gmbh & Co. Kg | Method for determining the particle shape and / or size of agricultural good particles |
CN103752535A (en) * | 2014-01-26 | 2014-04-30 | 东北农业大学 | Machine vision based soybean seed selection method |
DE102014204603B3 (en) * | 2014-03-12 | 2015-07-30 | Deere & Company | A method for automatically adjusting threshing parameters of a combine harvester during harvest using a straw quality detection arrangement |
CN205786372U (en) * | 2016-06-29 | 2016-12-07 | 中国农业大学 | Portable corn particle detection auxiliary device and the detection device of going mouldy |
CN106362958B (en) * | 2016-10-08 | 2018-10-19 | 东北农业大学 | A kind of light source adjusting bracket suitable for little particle body agricultural product vision sorter system |
CN210036917U (en) * | 2019-06-17 | 2020-02-07 | 湖南农业大学 | Thousand grain weight rapid survey device of seed |
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110706210A (en) * | 2019-09-18 | 2020-01-17 | 五邑大学 | Deep learning-based rebar counting method and device |
Also Published As
Publication number | Publication date |
---|---|
CN111398294A (en) | 2020-07-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021168733A1 (en) | Defect detection method and apparatus for defect image, and computer-readable storage medium | |
CN111398294B (en) | Straw particle defect detection system and detection method based on machine vision | |
KR101297395B1 (en) | Secondary battery eloctrode panel vision detection method | |
CN107525808A (en) | Blister medication classification and the online visible detection method of defect on a kind of production line | |
CN111672773B (en) | Product surface defect detection system and detection method based on machine vision | |
CN111766245B (en) | Button cell negative electrode shell defect detection method based on machine vision | |
CN110057198B (en) | Method and device for detecting working state of trolley wheel of sintering machine | |
CN104034637B (en) | Based on the online quality inspection device of diamond wire particle of machine vision | |
CN109671078A (en) | A kind of product surface image abnormity detection method and device | |
CN107301634A (en) | A kind of robot automatic sorting method and system | |
US20150051860A1 (en) | Automatic optical appearance inspection by line scan apparatus | |
CN113343834A (en) | Belt deviation diagnosis method based on machine vision and laser line | |
CN111476712B (en) | Trolley grate image shooting and detecting method and system of sintering machine | |
CN114998217A (en) | Method for determining defect grade of glass substrate, computer device and storage medium | |
CN114519696A (en) | PVC heat shrinkage film detection method and system based on optical intelligence | |
CN118501177A (en) | Appearance defect detection method and system for formed foil | |
CN203965287U (en) | The online quality inspection device of diamond wire particle based on machine vision | |
CN113192061A (en) | LED package appearance detection image extraction method and device, electronic equipment and storage medium | |
CN116026861A (en) | Glass bottle detection method and system | |
CN113628155A (en) | Green ball particle size detection method and system of disc pelletizer | |
CN117191809A (en) | Glass detection equipment fault monitoring and early warning system based on data analysis | |
US20230152781A1 (en) | Manufacturing intelligence service system connected to mes in smart factory | |
CN212284937U (en) | Machine vision detection system for product surface defects | |
KR20150074942A (en) | Apparatus and method of detecting scap defect | |
CN113409297A (en) | Aggregate volume calculation method, particle form grading data generation method, system and equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||