CN111398294A - Straw particle defect detection system and detection method based on machine vision - Google Patents


Info

Publication number
CN111398294A
CN111398294A (application number CN202010301633.1A)
Authority
CN
China
Prior art keywords
particle
straw
straight line
particles
boundary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010301633.1A
Other languages
Chinese (zh)
Other versions
CN111398294B (en)
Inventor
王伟 (Wang Wei)
宫元娟 (Gong Yuanjuan)
李宁 (Li Ning)
白雪卫 (Bai Xuewei)
任德志 (Ren Dezhi)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Liaoning Ningyue Agricultural Equipment Co ltd
Shenyang Agricultural University
Original Assignee
Liaoning Ningyue Agricultural Equipment Co ltd
Shenyang Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Liaoning Ningyue Agricultural Equipment Co ltd, Shenyang Agricultural University filed Critical Liaoning Ningyue Agricultural Equipment Co ltd
Priority to CN202010301633.1A
Publication of CN111398294A
Application granted
Publication of CN111398294B
Status: Active

Classifications

    • G01N21/8806 Specially adapted optical and illumination features
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for detecting different kinds of defects
    • G01N2021/8466 Investigation of vegetal material, e.g. leaves, plants, fruits
    • G01N2021/8854 Grading and classifying of flaws
    • G01N2021/8867 Grading and classifying of flaws using sequentially two or more inspection runs, e.g. coarse and fine, or detecting then analysing
    • G01N2021/8887 Scan or image signal processing based on image processing techniques
    • G06F18/2135 Feature extraction based on approximation criteria, e.g. principal component analysis
    • G06F18/23213 Non-hierarchical clustering techniques with fixed number of clusters, e.g. K-means clustering
    • G06F18/2411 Classification techniques based on the proximity to a decision surface, e.g. support vector machines
    • G06N3/045 Combinations of networks
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/11 Region-based segmentation
    • G06T7/13 Edge detection
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30188 Vegetation; Agriculture
    • Y02E50/10 Biofuels, e.g. bio-diesel
    • Y02E50/30 Fuel from waste, e.g. synthetic alcohol or diesel

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Geometry (AREA)
  • Signal Processing (AREA)
  • Probability & Statistics with Applications (AREA)
  • Quality & Reliability (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a machine-vision-based straw particle defect detection system in the field of agricultural mechanization, comprising a particle control module, a particle image acquisition module, a dark box (camera bellows) and a particle defect detection module. The particle control module acquires and discharges particles; the particle image acquisition module, arranged inside the dark box, images the discharged particles to obtain particle images; the particle defect detection module detects defects in the particle images. The particle control module comprises a particle loading and unloading unit and an electric control unit: the electric control unit drives the loading and unloading unit to pick up straw particles and convey them into the dark box for discharge. The system is a vision inspection system tailored to straw particles whose detection speed and accuracy meet practical requirements. It is dust-proof, does not interfere with the normal operation of the ring die granulator, and can accurately and quickly detect defects such as cracks, pits, improper length and improper roundness, prompting the operator to adjust parameters such as moisture and temperature or to replace the die.

Description

Straw particle defect detection system and detection method based on machine vision
Technical Field
The invention relates to the field of agricultural mechanization, in particular to a straw particle defect detection system based on machine vision.
Background
Straw solid briquette fuel offers high energy density and combustion efficiency, high strength, convenient storage, transport and use, environmental friendliness, and great development potential, so the share of solidified straw fuel is steadily growing and it is favored in many countries. However, during straw solidification the ring die granulator has a low degree of automation, low efficiency and a tendency to clog, which severely constrains the rapid development of the straw-forming industry. An automatic straw particle defect detection system applying machine vision and automatic control technology is therefore particularly urgent for improving straw forming quality and efficiency. The ring die granulator compacts crushed straw into particles at high temperature: if the temperature is too high, cracks appear on the particle surface; if it is too low, particles do not form. In addition, crushed straw needs a suitable moisture content to form particles; with too much water, particles cannot form. Moreover, a damaged die causes pits in the straw particles. In the traditional approach an operator inspects the state of the straw particles by eye and, according to that state, adjusts parameters such as temperature and moisture or replaces the die. Since the ring die granulator is mounted on a large harvester, it is difficult to observe with the naked eye whether, and which, defects are present in the straw particles.
Therefore, the invention provides a straw particle defect detection system based on machine vision, which obtains images of the straw particles through a non-contact sensor, analyzes the images to obtain the particle defects, and gives staff a basis for judging whether the working state of the ring die granulator is normal.
Disclosure of Invention
According to the problems in the prior art, the invention discloses a straw particle defect detection system based on machine vision, characterized in that it comprises a particle control module, a particle image acquisition module, a dark box and a particle defect detection module;
the particle control module acquires and discharges particles;
the particle image acquisition module, arranged in the dark box, images the particles discharged by the particle control module to obtain particle images;
the particle defect detection module detects defects in the particle images;
the particle control module comprises a particle loading and unloading unit and an electric control unit;
the electric control unit controls the particle loading and unloading unit to obtain straw particles and conveys them into the dark box for unloading;
the particle image acquisition module comprises a main light source, a backlight source and imaging equipment;
the imaging device is located at the top of the dark box,
the main light source is positioned in the middle of the dark box and below a lens of the imaging device;
the backlight source is positioned at the bottom of the dark box;
the main light source corresponds to the backlight source.
Further, the backlight adopts a white-light LED array.
Further: the particle defect detection module adopts the following steps to detect:
S1: extract each independent region in the straw particle image collected under the backlight in the imaging device, using a line-scan-based inner boundary tracking region detection algorithm;
S2: classify the regions by area into straw debris regions, independent straw particle regions and adhered straw particle regions (straw debris regions are discarded, independent straw particle regions are retained, and adhered straw particle regions proceed to the next step);
S3: divide each adhered particle region into several sub-regions using a prior-based k-means clustering algorithm;
S4: process each sub-region with a local search method to obtain single straw particle regions;
S5: judge whether the single straw particle regions obtained in S2 and S4 meet the appearance criterion; if a particle meets it, proceed to S6; otherwise the particle is judged to have a length, roundness or flatness defect;
S6: for each single straw particle region that passes S5, extract features of the single particle from the corresponding image region collected under the main light source, and judge from these features whether the particle is normal or has pit or crack defects;
S7: count the proportion of straw particles with each kind of defect among all straw particles, and output the working state of the ring die granulator according to these proportions.
Further: the following method is specifically adopted in S1:
s11, searching from the upper left of the straw particle image collected by the imaging device in a line scanning mode until a pixel P0 smaller than a boundary set threshold is found; numbering eight neighborhood positions of any pixel as 0-7, wherein the right pixel is an initial number 0, the pixel counterclockwise goes to the lower right corner, and the lower right corner pixel is numbered as 7;
s12, searching eight fields of the pixel Pn-1 in a counterclockwise direction until the pixel Pn smaller than a set threshold M1 is found, and marking the Pn as a boundary point; the starting pixel searched in the anticlockwise direction is determined according to the eight neighborhood position of the boundary point Pn-1 relative to the boundary point Pn-2;
s13: judging whether Pn-1 is a left boundary point or a right boundary point of the region according to the eight neighborhood position of the boundary point Pn-1 relative to the boundary point Pn-2 and the eight neighborhood position of the boundary point Pn-1 relative to the boundary point Pn;
s14, when Pn-1 is the left boundary point, filling to the right, setting the pixel points smaller than the set threshold as the marking values which are any number larger than M1 until the last pixel smaller than the set threshold is found, and marking the pixel as the boundary point; if the Pn-1 is the right boundary point, filling leftwards, setting pixel points smaller than a set threshold value as a marking value in sequence until the last pixel smaller than the set threshold value is reached, and marking the pixel as the boundary point;
s15: repeating S12-S14 until S2 satisfies that the pixels smaller than the set threshold have been found;
s16: and (4) the number of the pixel points and the number of the boundary points which pass through in the filling process are combined to be recorded as the area of the current region, and the boundary points marked in P0-Pn and S14 are used as the boundary point set of the current region.
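As a self-contained sketch of the idea behind S11-S16, the following uses a plain flood fill in place of the patent's combined boundary-tracking-and-filling pass; it keeps the text's convention that particle pixels are darker than the threshold (backlight imaging), and returns each region's area and boundary points:

```python
def extract_regions(img, thresh):
    """Line-scan the image; for each unvisited pixel darker than `thresh`,
    fill its region and collect its area and boundary points.

    A boundary point is a region pixel with at least one 8-neighbor that
    lies outside the image or is not darker than `thresh`.
    """
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y0 in range(h):              # S11: line-scan for a seed pixel
        for x0 in range(w):
            if img[y0][x0] < thresh and not seen[y0][x0]:
                stack, area, boundary = [(y0, x0)], 0, set()
                seen[y0][x0] = True
                while stack:         # fill the region (stand-in for S12-S15)
                    y, x = stack.pop()
                    area += 1
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = y + dy, x + dx
                            if not (0 <= ny < h and 0 <= nx < w) or img[ny][nx] >= thresh:
                                boundary.add((y, x))   # touches background
                            elif not seen[ny][nx]:
                                seen[ny][nx] = True
                                stack.append((ny, nx))
                # S16: record area and boundary point set of this region
                regions.append({"area": area, "boundary": sorted(boundary)})
    return regions
```

The patent's one-pass algorithm avoids revisiting pixels the way this flood fill does, which is its stated advantage; the inputs and outputs are the same.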
Further: s3 includes the steps of:
s31: calculating the gradient directions of all boundary points of the adhered straw particle area;
s32, calculating the number Ki of clustering centers according to the area Ai of the region;
s33, randomly selecting Ki cluster center points and initializing Ki sets to store boundary points contained in each class;
s34, dividing the boundary points into sets corresponding to the cluster center points with the shortest distance according to the gradient direction of the boundary points;
s35, calculating the average gradient direction of each class and using the average gradient direction as a new clustering center point;
s36, repeating S34-S35 until the variation error of the Ki cluster center points is smaller than a set value;
s37: for the Ki sets containing the boundary points, performing linear fitting on the boundary points in each set by using a least square method for eliminating the maximum error point, reserving straight segments with the lengths meeting the set value, and performing S38;
s38, finding out straight line segments with parallel directions from the straight line segments with the lengths meeting a set value, respectively calculating the distances of the straight line segments with the parallel directions, and grouping the straight line segments with the distances which are about integral multiples of the average distance of the two straight line segments and the nearest distance into one group, wherein the same straight line segment can be grouped into different groups;
s39: dividing the region into a plurality of sub-regions according to the distance and the length of each group of straight line segments;
further: s39 specifically adopts the following method:
s391, calculating the distance of each group of straight line segments, if the distance meets the allowable distance of a single straw particle, S392 is carried out, otherwise, S395 is carried out;
s392: if the lengths of the two straight line segments meet the length allowed by a single straw particle, respectively connecting the end points of the two straight line segments which are relatively close to each other to obtain independent sub-regions; otherwise, performing S393;
s393: if one straight line segment l1 in the lengths of the two straight line segments meets the length allowed by a single straw particle, extending the other straight line segment to enable two end points of the other straight line segment to be respectively connected with two end points of l1 to obtain two straight line segments which are perpendicular to l1, so as to obtain independent sub-regions; otherwise, proceed to S394;
s394: in other cases, the lengths of the two straight line segments can not meet the length allowed by a single straw particle, if other straight line segments are intersected with the two straight line segments, the two straight line segments are respectively connected with the end points which are close to each other, otherwise, the two straight line segments are extended to the boundary point of the region, and an independent subregion is obtained;
and S395, adding linear segments which are parallel to L1 and L2 and have end points as zone boundary points at integral multiples of the average length of the two linear segments of the straw particles from L1 or L2 to obtain a plurality of independent sub-zones.
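Steps S33-S36 amount to a small k-means over the one-dimensional gradient directions. A deterministic sketch (taking the first Ki values as initial centers instead of random ones, and ignoring the circular wrap-around of angles for brevity):

```python
def kmeans_1d(values, k, iters=50, tol=1e-6):
    """Cluster scalar values (e.g. gradient directions) into k classes.

    S33 picks the initial centers at random; here the first k values are
    used so the sketch is deterministic.
    """
    centers = list(values[:k])
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:                       # S34: assign to nearest center
            i = min(range(k), key=lambda j: abs(v - centers[j]))
            clusters[i].append(v)
        # S35: class mean becomes the new center (kept if a class is empty)
        new = [sum(c) / len(c) if c else centers[i]
               for i, c in enumerate(clusters)]
        moved = max(abs(a - b) for a, b in zip(new, centers))
        centers = new
        if moved < tol:                        # S36: convergence test
            break
    return centers, clusters
```

In the patent's setting the values would be the boundary-point gradient directions from S31, and Ki would come from the area-based rule of S32.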
Further: s4 specifically adopts the following method:
s41: detecting edges in the image by using a canny algorithm near the boundary of each sub-area;
s42: if there is an edge, the boundary of the sub-region is replaced by the edge, otherwise, the boundary of the sub-region still adopts the straight line segment complemented in S39;
s43: the treated boundary point of the subarea is the boundary point of the single straw particle.
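The patent names the Canny algorithm for S41, for which an off-the-shelf implementation (e.g. OpenCV's `cv2.Canny`) would normally be used. To keep the sketch dependency-free, the stand-in below illustrates the S42 decision on a one-dimensional intensity profile taken across a provisional boundary: snap to the strongest nearby intensity step if one exceeds a threshold, otherwise keep the straight-line cut from S39. The window size and gradient threshold are assumed values:

```python
def snap_boundary(profile, provisional, window=2, min_grad=30):
    """Search `window` pixels either side of the provisional boundary index
    for the strongest intensity step; if it exceeds `min_grad`, snap the
    boundary to it (S42 "edge exists"), else keep the straight-line cut.
    """
    lo = max(1, provisional - window)
    hi = min(len(profile) - 1, provisional + window + 1)
    best, best_grad = provisional, min_grad
    for i in range(lo, hi):
        grad = abs(profile[i] - profile[i - 1])
        if grad > best_grad:
            best, best_grad = i, grad
    return best
```

Run along every provisional segment, this realizes the "replace the boundary by the edge where a real edge is found" rule of S42.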
Further: the appearance judgment criteria of S5 are as follows: the length of the single particles meets the condition that the average length error is within plus or minus 5 percent, the roundness of the single particles meets the condition that the roundness error of the particles is within plus or minus 5 percent, and the flatness of the single particles meets the condition that the average flatness is within plus or minus 5 percent.
Further: s6 specifically adopts the following method:
s61: standardizing the single straw particle image according to the width;
s62: dividing a single straw particle image into a plurality of overlapped blocks in the length direction;
s63: extracting the characteristics of each block by using a convolutional neural network to obtain the characteristic vector of each block;
s64: reducing the dimension of each feature vector by using PCA;
s65: and classifying the straw particles by using a support vector machine according to the feature vector after the dimension reduction, and judging whether cracks or pits exist.
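The overlapping-block division of S62 might look as follows; the block and stride sizes are illustrative choices, not values given by the patent:

```python
def overlapping_blocks(length, block, stride):
    """S62: start indices of overlapping windows along the particle's long
    axis; the final block is pinned to the end so no pixels are lost.
    """
    if length <= block:
        return [0]
    starts = list(range(0, length - block + 1, stride))
    if starts[-1] != length - block:
        starts.append(length - block)   # pin the last window to the end
    return starts
```

Each returned index marks a crop that would be fed to the CNN of S63, reduced by PCA (S64) and classified by the SVM (S65).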
With the above technical scheme, the machine-vision-based straw particle defect detection system provided by the invention addresses the fact that the moisture, temperature and other parameters of the ring die granulator are difficult to control, which causes pits, cracks and other defects in the straw particles. By fully exploiting the imaging characteristics of straw particles, a vision inspection system suited to them is designed whose detection speed and accuracy meet practical requirements. The system is dust-proof, does not affect the normal operation of the ring die granulator, and can accurately and quickly detect defects such as cracks, pits, improper length and improper roundness, so as to monitor the working state of the ring die granulator.
Drawings
In order to more clearly illustrate the embodiments of the present application and the technical solutions in the prior art, the drawings needed in their description are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1(a) is a schematic diagram of a hopper for picking up straw particles according to the present invention;
FIG. 1(b) is a schematic view of a hopper according to the present invention moving to a specified position in a black box;
FIG. 1(c) is a schematic diagram of the hopper for recovering straw particles according to the present invention;
FIG. 2 is a schematic diagram of a primary light source image according to the present invention;
FIG. 3 is a schematic view of an imaging apparatus according to the present invention;
FIG. 4 is a schematic view of a backlight image according to the present invention;
FIG. 5(a) is a schematic diagram of a backlight according to the present invention;
FIG. 5(b) is a schematic diagram of a main light source according to the present invention;
FIG. 6 is a schematic view of the eight-neighborhood direction definition used in the present invention;
FIG. 7 is a schematic diagram illustrating the definition of the left and right boundaries of the independent areas according to the present invention;
FIG. 8(a) is a schematic view of the gradient direction according to the present invention;
FIG. 8(b) is a schematic diagram of gradient direction finding according to the present invention;
FIG. 9(a) is a schematic view of an original image of straw particles adhered according to the present invention;
FIG. 9(b) is a schematic view of the adhesion area of straw particles according to the present invention;
FIG. 10(a) is a schematic diagram of the original image before k-means segmentation according to the present invention;
FIG. 10(b) is a schematic view of the region of each straw particle obtained after k-means segmentation according to the present invention;
FIG. 11 is a schematic view of the rotation of straw particles according to the present invention;
FIG. 12 is a block diagram of the present invention;
FIG. 13 is a schematic view of the defective state of straw particles according to the present invention.
Detailed Description
In order to make the technical solutions and advantages of the present invention clearer, the following describes the technical solutions in the embodiments of the present invention clearly and completely with reference to the drawings in the embodiments of the present invention:
the invention adopts an automatic pickup device to sample straw particles produced by a ring die granulator, collects images of the straw particles by combining a main light source and a backlight source, obtains the images of single straw particles in a bottom-up mode, judges whether the defects of length and roundness exist or not by appearance characteristics, judges whether the defects of cracks and pits exist or not by a convolutional neural network and a support vector machine, realizes real-time detection of the defects of the straw particles, and does not influence the normal work of the ring die granulator.
A straw particle defect detection system based on machine vision comprises a particle control module, a particle image acquisition module, a dark box and a particle defect detection module;
In order to detect the straw particles in real time, a certain quantity of particles must be picked up at the die outlet of the ring die granulator; the particle control module acquires and discharges these particles. During the operation of the ring die granulator a large amount of dust is inevitably produced, and if the vision inspection device were exposed to it the industrial lens would easily be contaminated and the images blurred or even ruined. Therefore the particle image acquisition module is arranged inside the dark box, where it images the particles discharged by the particle control module to obtain particle images; the particle defect detection module then detects defects in the particle images.
the particle acquisition module comprises a particle loading and unloading unit and an electric control unit; the electric control unit controls the particle loading and unloading unit to obtain straw particles, and the obtained straw particles are sent into the camera bellows to be unloaded;
The particle control module may take the form shown in FIGS. 1(a), 1(b) and 1(c): a hopper picks up straw particles from several die holes around the die, as shown in FIG. 1(a); a movable baffle on the side of the dark box is then opened under motor control and the hopper is carried into the dark box, as shown in FIG. 1(b). Once above the backlight plate, the straw particles are scattered freely onto it, the scattering speed and position of the hopper being controlled so that the scattered particles overlap and adhere as little as possible. The hopper is then moved below the bottom baffle of the dark box, as indicated by the arrow in FIG. 1(c); after the straw particle image has been collected, the bottom baffle is opened and the particles on the backlight plate are swept into the hopper, as shown in FIG. 1(c). The hopper finally delivers the straw particles to a recovery unit.
The particle image acquisition module comprises a main light source, a backlight and an imaging device. An image acquired under the main light source alone, shown in FIG. 2, contains obvious shadows that hinder extraction of the straw particles; the invention therefore combines a backlight with the main light source, with the imaging arrangement shown in FIG. 3. An image is first acquired under the backlight, as shown in FIG. 4, to extract single straw particles and detect their appearance defects; an image is then acquired under the main light source so that crack and pit defects can be detected from the particle texture. Based on calculation, the backlight uses an array of 36 diodes arranged as shown in FIG. 5(a), while the main light source, shown in FIG. 5(b), uses white-light LED arrays arranged in a cone.
Analysis of the working state of the ring die granulator shows that it gives rise to straw particle defects such as cracks, pits, insufficient length and improper roundness. To detect these defects, single straw particles are first extracted and pre-screened by analyzing their length and roundness; cracks and pits are then detected on each single particle from its texture. Finally, the proportion of each defect is counted, and the working state of the ring die granulator is output according to these proportions. The particle defect detection module performs the following specific detection steps:
S1: extracting each independent region from the straw particle image collected under the backlight source by the imaging device, using a line-scanning-based inner boundary tracking and region detection algorithm. Since the image contains a plurality of straw particle regions, detection of the next region would be disturbed if an already detected region were left unfilled. The line-scanning-based algorithm detects the boundary points of a region and fills the region at the same time, traversing the region only once, so that the boundary points are obtained and the region is filled in a single pass. The algorithm comprises the following specific steps:
S11: searching from the upper left of the straw particle image collected by the imaging device in line-scanning order until a pixel P0 smaller than the boundary threshold is found. The eight neighborhood positions of any pixel are numbered 0 to 7: the pixel to the right is numbered 0, and the numbering proceeds counter-clockwise until the pixel at the lower right corner, which is numbered 7, as shown in FIG. 6;
S12: searching the eight neighborhoods of the pixel Pn-1 in the counter-clockwise direction until a pixel Pn smaller than the set threshold M1 is found, and marking Pn as a boundary point. The starting pixel of the counter-clockwise search is determined from the eight-neighborhood position dir of the boundary point Pn-1 relative to the boundary point Pn-2: if dir is even, the search starts from the neighborhood point numbered (dir + 7) mod 8; if dir is odd, it starts from the neighborhood point numbered (dir + 6) mod 8;
S13: judging whether Pn-1 is a left or right boundary point of the region from the eight-neighborhood position dir of Pn-1 relative to Pn-2 and the eight-neighborhood position dirnext of Pn-1 relative to Pn. The left and right boundaries are shown in FIG. 7, red being the left boundary and green the right boundary. If Pn-1 is a left or right boundary point of the region, the filling operation of S14 is performed; otherwise S15 is performed without filling. Pn-1 is judged as follows: Pn-1 is a left boundary point if (dir = 5) or (dir = 6 and dirnext ≠ 1) or (dir = 7 and dirnext ≠ 1 and dirnext ≠ 2); Pn-1 is a right boundary point if (dir = 1 and dirnext ≠ 7) or (dir = 2 and dirnext ≠ 5) or (dir = 3 and dirnext ≠ 1 and dirnext ≠ 6);
S14: if Pn-1 is a left boundary point, filling to the right: pixels smaller than the set threshold are successively set to a marking value, which is any number larger than M1, until the last pixel smaller than the set threshold is reached, which is marked as a boundary point. If Pn-1 is a right boundary point, filling proceeds to the left in the same manner, and the last pixel smaller than the set threshold is marked as a boundary point;
S15: updating dir to dirnext and n to n + 1, and repeating S12 to S14 until the boundary tracking returns to the starting pixel P0 found in S11;
S16: the number of pixels filled together with the number of boundary points is recorded as the area of the current region, and the boundary points P0 to Pn together with the boundary points marked in S14 form the boundary point set of the current region.
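The tracing loop S11 to S15 can be sketched in Python as follows. This is a simplified illustration, not the patent's exact procedure: tracing stops when it returns to the starting pixel (sufficient for simply connected regions), and instead of the left/right-boundary fill of S13 and S14 the area is approximated by filling each image row between its leftmost and rightmost boundary points, which is adequate for roughly convex particles.

```python
import numpy as np

# Eight-neighborhood offsets in (row, col), numbered 0-7 as in FIG. 6:
# 0 is the pixel to the right; the numbering proceeds counter-clockwise.
OFFS = [(0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1)]

def trace_region(img, threshold):
    """Trace the inner boundary of the first region found by raster scan
    (pixels < threshold belong to the region) and return its boundary
    points plus an approximate filled area."""
    h, w = img.shape
    mask = img < threshold
    # S11: raster scan from the top left for the starting pixel P0
    starts = np.argwhere(mask)
    if starts.size == 0:
        return [], 0
    p0 = tuple(starts[0])
    boundary, cur, dir_ = [p0], p0, 7
    while True:
        # S12: search the eight neighbors counter-clockwise; the start of
        # the search follows the patent's rule: even dir -> (dir + 7) mod 8,
        # odd dir -> (dir + 6) mod 8
        start_dir = (dir_ + 7) % 8 if dir_ % 2 == 0 else (dir_ + 6) % 8
        nxt = None
        for k in range(8):
            d = (start_dir + k) % 8
            r, c = cur[0] + OFFS[d][0], cur[1] + OFFS[d][1]
            if 0 <= r < h and 0 <= c < w and mask[r, c]:
                nxt, dir_ = (r, c), d
                break
        if nxt is None or (nxt == p0 and len(boundary) > 1):
            break                      # isolated pixel, or back at the start
        boundary.append(nxt)
        cur = nxt
        if len(boundary) > 4 * h * w:  # safety guard against degenerate loops
            break
    # Simplified fill: per image row, count the pixels between the leftmost
    # and rightmost boundary points (stands in for the S13/S14 fill).
    rows = {}
    for r, c in boundary:
        lo, hi = rows.get(r, (c, c))
        rows[r] = (min(lo, c), max(hi, c))
    area = sum(hi - lo + 1 for lo, hi in rows.values())
    return boundary, area
```

A production version would perform the fill during tracking, exactly as S14 describes, so the region is traversed only once.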
S2: judging from the region area whether each region is straw debris, an independent straw particle region or an adhered straw particle region; straw debris regions are discarded; independent straw particle regions are retained; adhered straw particle regions are passed to S3;
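The area-based classification of S2 might be sketched as below; the thresholds of 30% and 150% of the average single-particle area are illustrative assumptions, since the patent does not state its cut-off values.

```python
def classify_region(area, mean_particle_area, debris_max=0.3, single_max=1.5):
    """Classify a region by its area relative to the mean single-particle
    area. The two ratio thresholds are assumed for illustration."""
    ratio = area / mean_particle_area
    if ratio < debris_max:
        return "debris"        # straw chips: discard
    if ratio <= single_max:
        return "single"        # keep as an independent particle
    return "adhered"           # send to the splitting step S3
```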
S3: dividing each adhered particle region into a plurality of sub-regions using a prior-based k-means clustering algorithm. Although the motor controls the fall of the straw particles so as to avoid adhesion as far as possible, the particles are solid cylinders and inevitably move after dropping onto the backlight plate, which causes adhesion. The average area and average length of a single straw particle are the available prior knowledge. To save computation time, the invention obtains the gradient direction of each boundary point from the gradient-direction lookup table shown in FIG. 8. FIG. 8(a) shows the gradient-direction values, with 0 pointing vertically down and the angle increasing counter-clockwise to 2π; FIG. 8(b) is the gradient-direction lookup table, whose abscissa and ordinate are the horizontal and vertical gradient values computed from the pixel values, and whose entries are the gradient directions. The segmentation method comprises the following specific steps:
S31: calculating the gradient directions of all boundary points of the adhered straw particle region; after the horizontal and vertical gradient values of a boundary point are computed, its gradient direction is read from the gradient-direction lookup table;
S32: calculating the number of cluster centers Ki from the region area Ai. The number of straw particles possibly contained in the region is the quotient of Ai divided by the average area of a single straw particle; since the region may also contain short particles, 3 is added to this quotient as a margin. Each particle contributes two straight line segments, so the number of possible straight line segments, i.e. the number of cluster centers, is Ki = (Ai / average area + 3) × 2;
S33: randomly selecting Ki cluster center points and initializing Ki sets to store the boundary points contained in each class; the value range of the center points is 0 to 2π;
S34: assigning each boundary point, according to its gradient direction, to the set of the nearest cluster center point;
S35: calculating the average gradient direction of each class and taking it as the new cluster center point;
S36: repeating S34 to S35 until the variation of each of the Ki cluster center points is smaller than a set value.
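The clustering loop S33 to S36 can be sketched as a one-dimensional k-means on angles. Initializing the centers from randomly chosen data points and measuring distances on the circle (so that 0 and 2π coincide) are implementation assumptions not spelled out in the patent.

```python
import numpy as np

def cluster_gradient_directions(angles, k, iters=50, tol=1e-4):
    """Cluster boundary-point gradient directions (radians in [0, 2*pi))
    into k classes, following steps S33-S36. Distances and means are
    taken on the circle; this wrap-around handling is an assumption."""
    rng = np.random.default_rng(0)
    # S33: initialize the centers from k distinct data points
    centers = angles[rng.choice(angles.size, size=k, replace=False)]
    labels = np.zeros(angles.size, dtype=int)
    for _ in range(iters):
        # S34: assign each angle to the circularly nearest center
        diff = np.abs(angles[:, None] - centers[None, :])
        diff = np.minimum(diff, 2 * np.pi - diff)
        labels = diff.argmin(axis=1)
        # S35: the circular mean of each class becomes its new center
        new_centers = centers.copy()
        for j in range(k):
            member = angles[labels == j]
            if member.size:
                new_centers[j] = np.arctan2(np.sin(member).mean(),
                                            np.cos(member).mean()) % (2 * np.pi)
        # S36: stop once every center has barely moved
        move = np.abs(new_centers - centers)
        move = np.minimum(move, 2 * np.pi - move)
        centers = new_centers
        if move.max() < tol:
            break
    return centers, labels
```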
S37: since position is not considered during clustering, each of the Ki boundary-point sets may contain points that do not lie on the same straight line segment; these may be boundary points at the two ends of a particle, or points on equally oriented segments elsewhere. Straight-line fitting is therefore performed on the boundary points of each set with a least-squares method that removes the maximum-error point, and least-squares fitting is repeated on the remaining points until the number of error points falls below a set value. Straight line segments whose length meets the set value are retained, and the next step is performed;
S38: among the straight line segments whose length meets the set value, finding those with parallel directions, calculating the distance between each pair of parallel segments, and grouping together segments whose distance is approximately an integral multiple of the average distance between the two straight line segments of a particle and which are closest to each other; the same segment may be assigned to more than one group. During grouping, the two straight line segments belonging to the same straw particle fall into one group: they are parallel, their distance equals one times the average distance, and, because straw particles do not overlap, no other segment lies between them, so the two segments of one particle are closest to each other. When several straw particles adhere in parallel, as shown in FIG. 9, the segments at the adhesion positions may not be detected; in that case the outer segments of the parallel particles are parallel and their distance is an integral multiple of the average distance.
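The pairing of S38 might be sketched as follows, assuming each detected segment is summarized by its direction angle and its signed perpendicular offset from the origin (a hypothetical representation chosen for brevity; the angle and distance tolerances are likewise assumed):

```python
def group_parallel_segments(segments, mean_width, angle_tol=0.1, dist_tol=0.25):
    """Pair segments that are parallel and whose separation is close to an
    integral multiple of the mean particle width (step S38). Each segment
    is (angle_radians, perpendicular_offset); a multiple greater than 1
    indicates the parallel-adhesion case of FIG. 9."""
    groups = []
    for i in range(len(segments)):
        for j in range(i + 1, len(segments)):
            a1, d1 = segments[i]
            a2, d2 = segments[j]
            if abs(a1 - a2) > angle_tol:
                continue                     # not parallel
            sep = abs(d1 - d2)
            multiple = round(sep / mean_width)
            if multiple >= 1 and abs(sep - multiple * mean_width) <= dist_tol * mean_width:
                groups.append((i, j, multiple))
    return groups
```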
S39: dividing the region into a plurality of sub-regions according to the distance and the length of each group of straight line segments. Both the distance and the lengths of each group must be considered, and the different cases are handled as follows:
S391: calculating the distance between the straight line segments l1 and l2 of each group; if the distance meets the distance allowed for a single straw particle, performing S392, otherwise performing S395;
S392: if the lengths of both straight line segments meet the length allowed for a single straw particle, connecting the mutually nearer end points of the two segments respectively to obtain an independent sub-region; otherwise, performing S393;
S393: if one of the two segments, l1, meets the length allowed for a single straw particle, extending the other segment so that its two end points are connected to the two end points of l1 by two straight line segments perpendicular to l1, thereby obtaining an independent sub-region; otherwise, performing S394;
S394: in the remaining case, neither segment meets the length allowed for a single straw particle; if other straight line segments intersect the two segments, their mutually nearer end points are connected respectively; otherwise the two segments are extended to the boundary points of the region, and an independent sub-region is obtained;
S395: adding straight line segments that are parallel to l1 and l2 and whose end points are region boundary points, at integral multiples of the average length of the two straight line segments of a straw particle away from l1 or l2, thereby obtaining a plurality of independent sub-regions;
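The case analysis S391 to S395 can be summarized as a small dispatch function; the geometric constructions themselves (connecting end points, extending segments, inserting parallel lines) are omitted, and the allowed-length and allowed-distance intervals are passed in as assumed parameters:

```python
def split_group(l1_len, l2_len, dist, allowed_len, allowed_dist):
    """Decide which of cases S392-S395 applies to a group of parallel
    segments with lengths l1_len and l2_len and separation dist.
    allowed_len and allowed_dist are (low, high) intervals for a single
    straw particle; only the case label is returned."""
    lo_d, hi_d = allowed_dist
    lo_l, hi_l = allowed_len
    if not (lo_d <= dist <= hi_d):
        return "S395"                 # parallel adhesion: insert extra lines
    ok1 = lo_l <= l1_len <= hi_l
    ok2 = lo_l <= l2_len <= hi_l
    if ok1 and ok2:
        return "S392"                 # connect the nearer end points
    if ok1 or ok2:
        return "S393"                 # extend the shorter segment
    return "S394"                     # neither segment is long enough
```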
S4: processing each sub-region by a local search method to obtain single straw particle regions. Since the sub-regions obtained in S3 are bounded by fitted straight line segments, added straight line segments and boundary points, further processing is required to obtain the boundary points of each sub-region on the image, as follows:
S41: detecting edges in the image near the boundary of each sub-region using the Canny algorithm; the edges extracted by Canny are single-pixel edges and are suitable for describing the region boundary.
S42: if an edge exists, the boundary of the sub-region is replaced by the edge; otherwise the boundary of the sub-region keeps the straight line segments completed in S39;
S43: the processed boundary points of the sub-region are the boundary points of a single straw particle.
S5: as shown in FIG. 10, judging whether each single straw particle obtained from S2 and S4 meets the following appearance criteria: the length of the single particle is within ±5% of the average length, and the roundness of the single particle is within ±5% of the average roundness. Here the average length is the offline-computed average length of the two straight line segments of a single straw particle; the roundness is the percentage of points on one straight line segment whose distance to the other straight line segment meets the threshold; and the average roundness is the offline-computed average roundness of single straw particles. If a single straw particle meets the appearance criteria, S6 is performed; if it does not, the particle has a length defect or a roundness defect;
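The appearance check of S5 reduces to two tolerance tests; a minimal sketch, where the ±5% tolerance follows the text and the returned list names the defect(s) found:

```python
def check_appearance(length, roundness, mean_length, mean_roundness, tol=0.05):
    """Step S5: a particle passes when its length and roundness are within
    +/-5% of the offline-computed averages; failing quantities are reported
    as length or roundness defects."""
    defects = []
    if abs(length - mean_length) > tol * mean_length:
        defects.append("length")
    if abs(roundness - mean_roundness) > tol * mean_roundness:
        defects.append("roundness")
    return defects        # empty list: proceed to texture inspection (S6)
```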
S6: for each single straw particle region meeting the appearance criteria in S5, extracting features from the corresponding image region collected under the main light source, and judging from these features whether the particle is normal or has pit or crack defects, as follows:
S61: rotating the single straw particle to the vertical orientation, the rotation angle being the included angle between the straight line segments on the two sides of the particle and the vertical line, as shown in FIG. 11; the single straw particle image is then normalized according to the width, which is the average distance between the two straight line segments of the straw particle;
S62: dividing the single straw particle image into a plurality of overlapping blocks along the length direction; since the positions and lengths of pits and cracks are not fixed, the overlapping-block division helps avoid a pit or crack being split across several blocks, as shown in FIG. 12;
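The overlapping-block division of S62 can be sketched as follows; the block height and overlap are assumed parameters, and the last block is anchored to the bottom edge so the whole particle is covered:

```python
import numpy as np

def overlapping_blocks(img, block_h, overlap):
    """Step S62: split the normalized particle image (H, W) into vertically
    overlapping blocks so that a pit or crack near a block border still
    falls entirely inside at least one block."""
    h = img.shape[0]
    if h <= block_h:
        return [img]
    stride = block_h - overlap
    tops = list(range(0, h - block_h + 1, stride))
    if tops[-1] != h - block_h:        # anchor a final block to the bottom
        tops.append(h - block_h)
    return [img[t:t + block_h] for t in tops]
```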
S63: extracting the features of each block with a convolutional neural network to obtain the feature vector of each block, and reducing its dimensionality with PCA (principal component analysis);
S64: classifying each block with a support vector machine according to the dimension-reduced feature vector; if any block shows a crack or pit, the straw particle is judged to have a crack or pit defect, as shown in FIG. 13;
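Of steps S63 and S64, the PCA dimensionality reduction is easy to sketch with plain NumPy; the CNN that produces the block features and the SVM that classifies them (for example via a library such as scikit-learn) are outside this fragment:

```python
import numpy as np

def pca_reduce(features, n_components):
    """Step S63's dimensionality reduction: project the CNN block features
    (n_blocks, d) onto their top principal components. The principal axes
    are the leading right singular vectors of the centered data."""
    mean = features.mean(axis=0)
    centered = features - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T
```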
S7: counting the proportion of straw particles with each kind of defect among the total number of straw particles, and outputting the working state of the ring die granulator according to these proportions.
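The tally of S7 might be sketched as below; the 10% alarm threshold is an illustrative assumption, as the patent only states that the working state is output from the defect proportions:

```python
from collections import Counter

def summarize_defects(defect_labels, ratio_limit=0.1):
    """Step S7: tally each defect's share of all inspected particles and
    report the granulator state. The ratio_limit alarm threshold is an
    assumed value, not one given by the patent."""
    total = len(defect_labels)
    counts = Counter(defect_labels)
    ratios = {d: n / total for d, n in counts.items() if d != "normal"}
    state = "normal" if all(r <= ratio_limit for r in ratios.values()) else "attention"
    return ratios, state
```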
The above description covers only the preferred embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any equivalent substitution or change made, within the technical scope disclosed by the present invention, according to the technical solution and inventive concept of the present invention by a person skilled in the art shall fall within the protection scope of the present invention.

Claims (9)

1. A straw particle defect detection system based on machine vision, characterized in that: the system comprises a particle control module, a particle image acquisition module, a dark box and a particle defect detection module;
the particle control module acquires and discharges particles;
the particle image acquisition module arranged in the dark box acquires images of the particles discharged by the particle control module to obtain particle images;
the particle defect detection module detects defects in the particle images;
the particle control module comprises a particle loading and unloading unit and an electric control unit;
the electric control unit controls the particle loading and unloading unit to obtain straw particles, and the obtained straw particles are sent into the dark box and unloaded;
the particle image acquisition module comprises a main light source, a backlight source and an imaging device;
the imaging device is located at the top of the dark box;
the main light source is located in the middle of the dark box, below the lens of the imaging device;
the backlight source is located at the bottom of the dark box;
the main light source corresponds to the backlight source.
2. The machine vision-based straw particle defect detection system according to claim 1, wherein the backlight source employs a white-light LED array.
3. A detection method of the straw particle defect detection system based on machine vision, characterized in that: the particle defect detection module performs detection by the following steps:
S1: extracting each independent region from the straw particle image collected under the backlight source by the imaging device, using a line-scanning-based inner boundary tracking and region detection algorithm;
S2: classifying each region, according to its area, as straw debris, an independent straw particle region or an adhered straw particle region;
S3: dividing each adhered particle region into a plurality of sub-regions using a prior-based k-means clustering algorithm;
S4: processing each sub-region by a local search method to obtain single straw particle regions;
S5: judging whether the single straw particle regions obtained in S2 and S4 meet the appearance criteria; if a single straw particle meets the appearance criteria, performing S6; if not, the particle has a length defect, a roundness defect or a flatness defect;
S6: for each single straw particle region meeting the appearance criteria in S5, extracting features from the corresponding image region collected under the main light source, and judging from these features whether the particle is normal or has pit or crack defects;
S7: counting the proportion of straw particles with each kind of defect among the total number of straw particles, and outputting the working state of the ring die granulator according to these proportions.
4. The detection method of the machine vision-based straw particle defect detection system according to claim 3, characterized in that S1 specifically adopts the following method:
S11: searching from the upper left of the straw particle image collected by the imaging device in line-scanning order until a pixel P0 smaller than the boundary threshold is found; the eight neighborhood positions of any pixel are numbered 0 to 7, the pixel to the right being numbered 0 and the numbering proceeding counter-clockwise until the pixel at the lower right corner, which is numbered 7;
S12: searching the eight neighborhoods of the pixel Pn-1 in the counter-clockwise direction until a pixel Pn smaller than the set threshold M1 is found, and marking Pn as a boundary point; the starting pixel of the counter-clockwise search is determined from the eight-neighborhood position of the boundary point Pn-1 relative to the boundary point Pn-2;
S13: judging whether Pn-1 is a left or right boundary point of the region from the eight-neighborhood position of Pn-1 relative to Pn-2 and the eight-neighborhood position of Pn-1 relative to Pn;
S14: when Pn-1 is a left boundary point, filling to the right: pixels smaller than the set threshold are successively set to a marking value, which is any number larger than M1, until the last pixel smaller than the set threshold is reached, which is marked as a boundary point; if Pn-1 is a right boundary point, filling proceeds to the left in the same manner, and the last pixel smaller than the set threshold is marked as a boundary point;
S15: repeating S12 to S14 until the boundary tracking returns to the starting pixel P0 found in S11;
S16: the number of pixels filled together with the number of boundary points is recorded as the area of the current region, and the boundary points P0 to Pn together with the boundary points marked in S14 form the boundary point set of the current region.
5. The detection method of the machine vision-based straw particle defect detection system according to claim 3, characterized in that S3 comprises the steps of:
S31: calculating the gradient directions of all boundary points of the adhered straw particle region;
S32: calculating the number of cluster centers Ki from the region area Ai;
S33: randomly selecting Ki cluster center points and initializing Ki sets to store the boundary points contained in each class;
S34: assigning each boundary point, according to its gradient direction, to the set of the nearest cluster center point;
S35: calculating the average gradient direction of each class and taking it as the new cluster center point;
S36: repeating S34 to S35 until the variation of each of the Ki cluster center points is smaller than a set value;
S37: for the Ki sets of boundary points, performing straight-line fitting on the boundary points of each set with a least-squares method that removes the maximum-error point, retaining the straight line segments whose length meets the set value, and performing S38;
S38: among the straight line segments whose length meets the set value, finding those with parallel directions, calculating the distance between each pair of parallel segments, and grouping together segments whose distance is approximately an integral multiple of the average distance between the two straight line segments of a particle and which are closest to each other, the same segment possibly being assigned to more than one group;
S39: dividing the region into a plurality of sub-regions according to the distance and the length of each group of straight line segments.
6. The detection method of the machine vision-based straw particle defect detection system according to claim 5, characterized in that S39 specifically adopts the following method:
S391: calculating the distance between the straight line segments of each group; if the distance meets the distance allowed for a single straw particle, performing S392, otherwise performing S395;
S392: if the lengths of both straight line segments meet the length allowed for a single straw particle, connecting the mutually nearer end points of the two segments respectively to obtain an independent sub-region; otherwise, performing S393;
S393: if one of the two segments, l1, meets the length allowed for a single straw particle, extending the other segment so that its two end points are connected to the two end points of l1 by two straight line segments perpendicular to l1, thereby obtaining an independent sub-region; otherwise, performing S394;
S394: in the remaining case, neither segment meets the length allowed for a single straw particle; if other straight line segments intersect the two segments, their mutually nearer end points are connected respectively; otherwise the two segments are extended to the boundary points of the region, and an independent sub-region is obtained;
S395: adding straight line segments that are parallel to l1 and l2 and whose end points are region boundary points, at integral multiples of the average length of the two straight line segments of a straw particle away from l1 or l2, thereby obtaining a plurality of independent sub-regions.
7. The detection method of the machine vision-based straw particle defect detection system according to claim 3, characterized in that S4 specifically adopts the following method:
S41: detecting edges in the image near the boundary of each sub-region using the Canny algorithm;
S42: if an edge exists, the boundary of the sub-region is replaced by the edge; otherwise the boundary of the sub-region keeps the straight line segments completed in S39;
S43: the processed boundary points of the sub-region are the boundary points of a single straw particle.
8. The detection method of the machine vision-based straw particle defect detection system according to claim 3, characterized in that the appearance criteria of S5 are as follows: the length of the single particle is within ±5% of the average length, the roundness of the single particle is within ±5% of the average roundness, and the flatness of the single particle is within ±5% of the average flatness.
9. The detection method of the machine vision-based straw particle defect detection system according to claim 3, characterized in that S6 specifically adopts the following method:
S61: normalizing the single straw particle image according to the width;
S62: dividing the single straw particle image into a plurality of overlapping blocks along the length direction;
S63: extracting the features of each block with a convolutional neural network to obtain the feature vector of each block;
S64: reducing the dimensionality of each feature vector with PCA;
S65: classifying the straw particle with a support vector machine according to the dimension-reduced feature vectors, and judging whether cracks or pits exist.
CN202010301633.1A 2020-04-16 2020-04-16 Straw particle defect detection system and detection method based on machine vision Active CN111398294B (en)

Publications (2)

Publication Number Publication Date
CN111398294A true CN111398294A (en) 2020-07-10
CN111398294B CN111398294B (en) 2022-11-25


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116026855A (en) * 2023-02-27 2023-04-28 英飞智信(北京)科技有限公司 Sample analysis system for solid particle inspection based on visual analysis
CN117589641A (en) * 2024-01-19 2024-02-23 北京春风药业有限公司 Particle screening detection system for traditional Chinese medicine particle production

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1561110A (en) * 1977-12-02 1980-02-13 Probe Eng Co Ltd Grain detection apparatus
EP1662247A1 (en) * 2004-11-24 2006-05-31 Amazonen-Werke H. Dreyer GmbH & Co. KG Method for determining the particle shape and/or size of agricultural particles
CN103752535A (en) * 2014-01-26 2014-04-30 东北农业大学 Machine vision based soybean seed selection method
US20150262351A1 (en) * 2014-03-12 2015-09-17 Deere & Company Arrangement and Method for Detecting and Documenting Straw Quality
CN205786372U (en) * 2016-06-29 2016-12-07 中国农业大学 Portable corn particle detection auxiliary device and the detection device of going mouldy
CN106362958A (en) * 2016-10-08 2017-02-01 东北农业大学 Light source adjusting bracket suitable for small-granule agricultural product vision sorting system
CN110706210A (en) * 2019-09-18 2020-01-17 五邑大学 Deep learning-based rebar counting method and device
CN210036917U (en) * 2019-06-17 2020-02-07 湖南农业大学 Thousand grain weight rapid survey device of seed





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant