CN115187609A - Method and system for detecting rice yellow grains - Google Patents
- Publication number
- CN115187609A CN115187609A CN202211117703.3A CN202211117703A CN115187609A CN 115187609 A CN115187609 A CN 115187609A CN 202211117703 A CN202211117703 A CN 202211117703A CN 115187609 A CN115187609 A CN 115187609A
- Authority
- CN
- China
- Prior art keywords
- rice
- image
- yellow
- grain
- convolution
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T 7/0002 - Inspection of images, e.g. flaw detection
- G01N 15/10 - Investigating individual particles
- G01N 21/255 - Colour/spectral properties; details, e.g. use of specially adapted sources, lighting or optical systems
- G01N 21/27 - Colour/spectral properties using photo-electric detection
- G06N 3/08 - Neural networks; learning methods
- G06T 5/20 - Image enhancement or restoration using local operators
- G06T 5/70 - Denoising; smoothing
- G06T 7/11 - Region-based segmentation
- G06T 7/187 - Segmentation involving region growing, region merging or connected component labelling
- G06V 10/267 - Segmentation of patterns by operations on regions, e.g. growing, shrinking or watersheds
- G06V 10/30 - Noise filtering
- G06V 10/36 - Applying a local operator, e.g. median filtering
- G06V 10/764 - Recognition using classification
- G06V 10/82 - Recognition using neural networks
- G06T 2207/10024 - Color image
- G06T 2207/20032 - Median filtering
- G06T 2207/20081 - Training; learning
- G06T 2207/20084 - Artificial neural networks [ANN]
Abstract
The invention discloses a method and a system for detecting rice yellow grains. The method comprises the following steps: S1, image acquisition; S2, preprocessing the image acquired in step S1; S3, calculating the yellow rice rate from the image data preprocessed in step S2. The method preprocesses the image and uses a deep fully convolutional neural network to improve recognition accuracy while avoiding manual feature selection. Because of environmental factors such as illumination and background during sample collection, manually selecting a single robust sample feature requires a complicated feature algorithm; constructing a deep fully convolutional neural network avoids this and improves recognition accuracy.
Description
Technical Field
The invention relates to the field of rice detection, in particular to a method and a system for detecting yellow rice grains.
Background
According to the national standard, yellow-grained rice refers to rice grains whose endosperm is yellow and whose color and luster are clearly distinguished from those of standard rice. Existing yellow-grain detection methods include a sampling method and machine-vision-based techniques. In the sampling method, 150 g of rough rice is weighed on a balance with precision e = 0.1 g and milled to national-standard precision; after removing the bran powder, the sample is weighed (W), the yellow-grained rice is sorted out according to the specification and weighed (W1), and the yellow rice rate is calculated as W1/W × 100%. In the machine-vision method, an external light source illuminates the rice as it slides past, a CCD camera collects an image, and preprocessing operations including graying, background segmentation, edge detection, binarization, and erosion and dilation yield the seed region and the background region; the original RGB image is then converted to an HSI model, the image is grayed according to the hue H, and a threshold is set to judge whether a grain is yellow. Existing visual detection methods require manual feature selection; because of environmental factors such as illumination and background during sample collection, the feature-extraction process is complicated and the recognition accuracy is low.
Disclosure of Invention
In order to solve the existing problems, the invention provides a method and a system for detecting rice yellow grains, and the specific scheme is as follows:
a rice yellow grain detection method comprises the following steps:
s1, image acquisition;
s2, preprocessing the image acquired in the step S1;
and S3, calculating the yellow rice rate according to the image data preprocessed in the step S2.
Preferably, the image acquisition in step S1 comprises: collecting images of at least 3 different varieties of rice grain, obtaining visible-spectrum image information of the rice grains in three bands near 700 nm (R channel), 550 nm (G channel) and 440 nm (B channel).
Preferably, the step of preprocessing in step S2 includes:
S21, performing median filtering on the image acquired in step S1 with a 3 × 3 median filter to remove noise, the median of the 3 × 3 pixel neighborhood being assigned to the central point of the matrix;
S22, rotating, flipping and adjusting the image data processed in step S21; the rotation formula is x2 = x1·cosθ − y1·sinθ, y2 = x1·sinθ + y1·cosθ, where θ is the rotation angle, (x1, y1) is the current coordinate and (x2, y2) is the rotated coordinate; the contrast is adjusted using a gamma transform s = c·r^γ, where s is the output gray level, r is the input gray level, γ is the gamma value and c is the gray scaling coefficient.
Preferably, the step S3 of calculating the yellow rice rate comprises:
s31, building a deep full convolution neural network of the U-Net frame;
S32, using the deep fully convolutional neural network built in step S31 to segment the rice grain regions and the yellow grain regions, obtaining the proportion of the yellow grain region within a single rice grain;
S33, counting the total number of rice grains from the segmented rice grain regions and, for each grain, judging whether it is yellow rice by comparing the ratio of its segmented yellow grain area to its grain area with a set threshold, thereby counting the number of yellow rice grains;
preferably, the specific steps of partitioning the yellow grain regions and the rice grain regions in step S32 include:
S321, converting the RGB model of the image acquired in step S1 into an HSI model:

I = (R + G + B)/3
S = 1 − 3·min(R, G, B)/(R + G + B)
H = θ if B ≤ G, otherwise H = 360° − θ, where θ = arccos{[(R − G) + (R − B)] / [2·((R − G)² + (R − B)(G − B))^(1/2)]}

where I is the brightness of the image, R, G and B are the values of the three RGB channels of the pixel point, S is the saturation of the image, min(R, G, B) is the minimum of the RGB values and H is the hue of the image;
s322, calibrating the HSI model image, and manually marking out a rice grain area and a yellow grain area;
s323, inputting the original image before segmentation and the corresponding manually segmented image into the deep fully-convolutional neural network for training;
s324, forward propagation is carried out to calculate a loss value;
s325, adjusting network parameters including parameters of convolution kernels in the convolution layer through the loss values;
and S326, repeating the steps S324 to S325 until the expected value of the loss value or the preset number of training times is reached.
Preferably, calculating the loss value in step S324 specifically comprises: passing the original image through the network layers of the deep fully convolutional neural network to obtain the network's segmentation of the original image, comparing it with the manually segmented image and calculating the loss value; the loss function is the mean absolute error

MAE = (1/n)·Σ|y_i − ŷ_i|

where y_i is the actual value, ŷ_i is the predicted value and n is the total number of training samples; the forward propagation process uses an n × n convolution kernel for valid convolution, the convolution formula being

C(i, j) = Σ_m Σ_n A(i + m, j + n)·K′(m, n)

where A is the matrix to be convolved, A(i + m, j + n) is the point at that position, C(i, j) is the value of the point after convolution, K is the n × n convolution kernel and K′ is the matrix obtained by rotating K by 180°; the matrix obtained after convolution is subjected to a nonlinear transformation by an activation function, the activation function being the ReLU function f(x) = max(0, x).
The invention also discloses a computer readable storage medium, wherein a computer program is stored on the medium, and after the computer program runs, the rice yellow grain detection method is executed.
The invention also discloses a computer system, comprising a processor and a storage medium, wherein the storage medium stores a computer program, and the processor reads the computer program from the storage medium and runs it to execute the rice yellow grain detection method described above.
Preferably, the system for the rice yellow grain detection method comprises a dark box, a CCD camera integrated with an image acquisition card, a computer, a light source and an object stage;
the light sources are arranged in an annular array at the top of the dark box; the object stage is arranged at the center of the bottom of the dark box; background paper is pasted on the inner side wall of the dark box; the CCD camera is arranged outside the dark box directly above the object stage and is used for collecting images of rice samples illuminated by light sources of different wavelengths and uploading them through the image acquisition card to the computer for further processing of the samples.
The invention has the beneficial effects that:
the invention preprocesses the image, improves the identification precision by utilizing the deep full convolution neural network and avoids the manual selection of the characteristics. Due to the environmental factors such as illumination and background during sample collection, compared with manual selection of a sample feature with strong robustness, the construction of the deep full convolution neural network can avoid a complex feature algorithm and improve the identification precision.
Drawings
In order to more clearly illustrate the embodiments or technical solutions of the present invention, the drawings used in the embodiments or technical solutions of the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a schematic diagram of a deep full convolution neural network according to the present invention;
FIG. 3 is a schematic diagram of the system of the present invention.
The reference numbers are as follows: 1. dark box; 2. light source; 3. CCD camera; 4. object stage.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
Referring to fig. 1, a method for detecting rice yellow grains includes the following steps:
and S1, image acquisition.
The acquired images comprise: images of at least 3 different varieties of rice grain, giving visible-spectrum image information of the rice grains in three bands near 700 nm (R channel), 550 nm (G channel) and 440 nm (B channel).
And S2, preprocessing the image acquired in the step S1.
Wherein the step of pre-treating comprises:
S21, median filtering is performed on the image acquired in step S1 with a 3 × 3 median filter to remove noise, the median of the 3 × 3 pixel neighborhood being assigned to the central point of the matrix.
S22, the image data processed in step S21 is rotated, flipped and adjusted; the rotation formula is x2 = x1·cosθ − y1·sinθ, y2 = x1·sinθ + y1·cosθ, where θ is the rotation angle, (x1, y1) is the current coordinate and (x2, y2) is the rotated coordinate; the contrast is adjusted using a gamma transform s = c·r^γ, where s is the output gray level, r is the input gray level, γ is the gamma value and c is the gray scaling coefficient.
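The preprocessing of steps S21 and S22 can be sketched in NumPy as follows. This is an illustrative re-implementation of the stated formulas, not code from the patent; the function names are the editor's own.

```python
import numpy as np

def median_filter_3x3(img):
    """3 x 3 median filter (step S21): the median of each pixel's
    3 x 3 neighborhood is assigned to the matrix center point."""
    padded = np.pad(img, 1, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + 3, j:j + 3])
    return out

def rotate_point(x1, y1, theta):
    """Rotation formula of step S22; theta is in radians."""
    x2 = x1 * np.cos(theta) - y1 * np.sin(theta)
    y2 = x1 * np.sin(theta) + y1 * np.cos(theta)
    return x2, y2

def gamma_transform(r, gamma, c=1.0):
    """Gamma contrast adjustment s = c * r**gamma on gray levels in [0, 1]."""
    return c * np.power(r, gamma)
```

A gamma value below 1 brightens mid-tones while a value above 1 darkens them, which is why the gray scaling coefficient c is usually left at 1 for contrast adjustment.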
And S3, calculating the yellow rice rate according to the image data preprocessed in the step S2.
The steps for calculating the yellow rice rate comprise:
and S31, building a deep full convolution neural network of the U-Net framework.
As shown in fig. 2, the deep fully convolutional neural network is composed of two structurally similar parts on the left and right sides, one side (left side in the figure) of the deep fully convolutional neural network includes 4 blocks, each block includes two convolutional layers C1 and C2 and one maximum pooling layer S1, and the other side (right side in the figure) includes 5 blocks, each block includes two convolutional layers C1 and C2 and one deconvolution layer U1.
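The spatial sizes produced by the encoder half of such a network can be traced with a short helper. This sketch assumes the classic unpadded U-Net convention (two 3 × 3 "valid" convolutions, each trimming 2 pixels, followed by 2 × 2 max pooling); the patent does not specify the padding scheme, so the numbers are illustrative.

```python
def unet_encoder_shapes(size, blocks=4):
    """Trace the spatial size through U-Net encoder blocks: two 3 x 3
    'valid' convolutions (each trims 2 pixels, C1 and C2 in Fig. 2)
    then 2 x 2 max pooling (halves the size, S1 in Fig. 2).
    The unpadded convention is an assumption; with 'same' padding only
    the pooling would change the size."""
    sizes = [size]
    for _ in range(blocks):
        size -= 4          # two 3 x 3 convolutions
        sizes.append(size)
        size //= 2         # 2 x 2 max pooling
        sizes.append(size)
    return sizes
```

For a 572 × 572 input, the classic four-block encoder bottoms out at 32 × 32 before the decoder's deconvolution layers upsample back.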
S32, the rice grain regions and the yellow grain regions are segmented with the deep fully convolutional neural network built in step S31, obtaining the proportion of the yellow grain region within a single grain.
The specific steps of the yellow grain region division and the rice grain region division comprise:
S321, the RGB model of the image acquired in step S1 is converted into an HSI model:

I = (R + G + B)/3
S = 1 − 3·min(R, G, B)/(R + G + B)
H = θ if B ≤ G, otherwise H = 360° − θ, where θ = arccos{[(R − G) + (R − B)] / [2·((R − G)² + (R − B)(G − B))^(1/2)]}

where I is the brightness of the image, R, G and B are the values of the three RGB channels of the pixel point, S is the saturation of the image, min(R, G, B) is the minimum of the RGB values and H is the hue of the image.
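The RGB-to-HSI conversion of step S321 can be written for a single normalized pixel as follows; the small epsilon guarding the division is an implementation detail added here, not part of the patent.

```python
import numpy as np

def rgb_to_hsi(r, g, b):
    """Convert one normalized RGB pixel (values in [0, 1]) to (H, S, I);
    H is returned in degrees."""
    total = r + g + b
    i = total / 3.0
    s = 0.0 if total == 0 else 1.0 - 3.0 * min(r, g, b) / total
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b))
    # clip guards against arccos arguments drifting outside [-1, 1]
    theta = np.degrees(np.arccos(np.clip(num / (den + 1e-12), -1.0, 1.0)))
    h = theta if b <= g else 360.0 - theta
    return h, s, i
```

A pure-red pixel maps to hue 0° with full saturation, while a gray pixel has zero saturation (its hue is then undefined).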
and S322, calibrating the HSI model image, and manually marking out a rice grain area and a yellow grain area.
And S323, inputting the original image before segmentation and the corresponding manually segmented image into the depth fully-convolutional neural network for training.
S324, forward propagation is performed and the loss value is calculated.
Wherein calculating the loss value specifically comprises: passing the original image through the network layers of the deep fully convolutional neural network to obtain the network's segmentation of the original image, comparing it with the manually segmented image and calculating the loss value; the loss function is the mean absolute error

MAE = (1/n)·Σ|y_i − ŷ_i|

where y_i is the actual value, ŷ_i is the predicted value and n is the total number of training samples; the forward propagation process uses an n × n convolution kernel for valid convolution, the convolution formula being

C(i, j) = Σ_m Σ_n A(i + m, j + n)·K′(m, n)

where A is the matrix to be convolved, A(i + m, j + n) is the point at that position, C(i, j) is the value of the point after convolution, K is the n × n convolution kernel and K′ is the matrix obtained by rotating K by 180°; the matrix obtained after convolution is subjected to a nonlinear transformation by an activation function, the activation function being the ReLU function f(x) = max(0, x), where max takes the larger of 0 and x.
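The MAE loss, the valid convolution with a 180°-rotated kernel, and the ReLU activation described above can be sketched in NumPy; these are the textbook definitions, with loop-based convolution used for clarity rather than speed.

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean absolute error over the n training samples."""
    return np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred)))

def relu(x):
    """ReLU activation f(x) = max(0, x)."""
    return np.maximum(0, x)

def conv2d_valid(a, k):
    """'Valid' 2D convolution: the kernel is rotated 180 degrees
    (true convolution rather than cross-correlation) and slid over a."""
    kr = np.rot90(k, 2)
    kh, kw = kr.shape
    h, w = a.shape[0] - kh + 1, a.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(a[i:i + kh, j:j + kw] * kr)
    return out
```

The 180° kernel rotation is what distinguishes the convolution formula in the patent from the cross-correlation most deep-learning libraries actually compute; for learned kernels the two are interchangeable.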
S325, adjusting network parameters including parameters of convolution kernels in the convolution layer through the loss values.
S326, steps S324 to S325 are repeated until the expected loss value or the preset number of training iterations is reached; specifically, the expected loss value is set to at most 0.5, and the preset number of training iterations may be set between 200 and 300.
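The training loop of steps S324-S326 reduces to a simple early-stopping pattern, sketched here with the loss and update functions abstracted away; the loop structure follows the patent, while the toy functions in the usage are illustrative.

```python
def train_until(loss_fn, update_fn, params, target_loss=0.5, max_epochs=300):
    """Repeat steps S324-S325 (forward pass, loss, parameter update)
    until the loss reaches the expected value (at most 0.5 per step S326)
    or the preset number of training iterations is exhausted."""
    for epoch in range(max_epochs):
        loss = loss_fn(params)
        if loss <= target_loss:
            return params, loss, epoch
        params = update_fn(params)
    return params, loss_fn(params), max_epochs
```

In the patent's setting, `loss_fn` would be the MAE between the network's segmentation and the manual segmentation, and `update_fn` would adjust the convolution-kernel parameters from the loss.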
S33, the total number of rice grains is counted from the segmented rice grain regions and, for each grain, whether it is yellow rice is judged by comparing the ratio of its segmented yellow grain area to its grain area with a set threshold, thereby counting the number of yellow rice grains.
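Step S33 can be sketched as follows: connected grain regions are labelled with a flood fill, and a grain whose yellow-area ratio exceeds a set threshold is counted as yellow rice. The 4-connectivity and the 0.25 threshold are illustrative assumptions; the patent only states that a set threshold is used.

```python
import numpy as np

def count_yellow_grains(grain_mask, yellow_mask, ratio_threshold=0.25):
    """Label connected grain regions (4-connectivity flood fill), then flag
    a grain as yellow rice when its yellow-pixel area divided by its grain
    area exceeds ratio_threshold. Returns (total grains, yellow grains,
    yellow rice rate). The threshold value is an assumption."""
    h, w = grain_mask.shape
    labels = np.zeros((h, w), dtype=int)
    total, yellow = 0, 0
    for si in range(h):
        for sj in range(w):
            if grain_mask[si, sj] and labels[si, sj] == 0:
                total += 1
                labels[si, sj] = total
                stack, area, yarea = [(si, sj)], 0, 0
                while stack:
                    i, j = stack.pop()
                    area += 1
                    yarea += int(yellow_mask[i, j])
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w and grain_mask[ni, nj] and labels[ni, nj] == 0:
                            labels[ni, nj] = total
                            stack.append((ni, nj))
                if yarea / area > ratio_threshold:
                    yellow += 1
    return total, yellow, (yellow / total if total else 0.0)
```

In practice the two masks would come from the network's segmentation of the rice grain regions and the yellow grain regions respectively.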
A system for the rice yellow grain detection method comprises a dark box 1, a CCD camera 3 integrated with an image acquisition card, a computer, a light source 2 and an object stage 4.
The light sources 2 are arranged in an annular array at the top of the dark box 1; the object stage 4 is arranged at the center of the bottom of the dark box 1; background paper is pasted on the inner side wall of the dark box 1; the CCD camera 3 is arranged outside the dark box 1 directly above the object stage 4 and is used for collecting images of rice samples illuminated by light sources 2 of different wavelengths and uploading them through the image acquisition card to the computer for further processing of the samples.
The invention preprocesses the image, improves the identification precision by utilizing the deep full convolution neural network and avoids the manual selection of the characteristics. Due to the environmental factors such as illumination and background during sample collection, compared with manual selection of a sample feature with strong robustness, the construction of the deep full convolution neural network can avoid a complex feature algorithm and improve the identification precision.
The invention also discloses a computer-readable storage medium storing a computer program which, when run, executes the rice yellow grain detection method.
The invention also discloses a computer system comprising a processor and a storage medium, wherein the storage medium stores a computer program, and the processor reads the computer program from the storage medium and runs it to execute the rice yellow grain detection method.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In one or more exemplary embodiments, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software as a computer program product, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a web site, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, Digital Subscriber Line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include Compact Disc (CD), laser disc, optical disc, Digital Versatile Disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
The previous description of the disclosure is provided to enable any person skilled in the art to make or use the disclosure. Various modifications to the disclosure will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other variations without departing from the spirit or scope of the disclosure. Thus, the disclosure is not intended to be limited to the examples and designs described herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and that such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
Claims (6)
1. A method for detecting rice yellow grains, characterized by comprising the following steps:
s1, image acquisition; collecting at least 3 different types of rice grains, namely visible spectrum image information of the rice grains in three wave bands near an R channel 700nm, a G channel 550nm and a B channel 440 nm;
s2, preprocessing the image acquired in the step S1;
s21, carrying out median filtering on the image acquired in the step 1 through a 3 x 3 median filter to remove noise points, wherein the median filtering uses median assignment matrix center points of matrix pixel points;
s22, rotating, overturning and adjusting the image data processed in the step S21; wherein, the rotation formula is:in the formula, theta is a rotation angle, (x) 1 ,y 1 ) Is at presentCoordinate (x) 2 ,y 2 ) For the rotated coordinates, the contrast is adjusted using a gamma transform:s is an output gray level, r is an input gray level, gamma is a gamma value, and c is a gray scaling coefficient;
s3, calculating the yellow rice rate according to the image data preprocessed in the step S2;
wherein, the calculation step of the yellow rice rate comprises the following steps:
s31, building a deep full convolution neural network of the U-Net frame;
s32, segmenting a rice grain region and a yellow grain region in the deep full convolution neural network established in the step S31 to obtain the proportion of the yellow grain region in a single grain of rice;
s33, counting the total number of the rice grains according to the segmentation of the rice grain regions, and meanwhile, judging whether the rice grains are yellow rice or not according to the ratio of the yellow grain regions to the areas of the rice grains corresponding to the yellow grain regions after segmentation in combination with a set threshold value, so that the number of the yellow rice is counted;
2. the method according to claim 1, wherein the specific steps of classifying the yellow grain regions and the rice grain regions in step S32 include:
s321, converting the RGB model of the image collected in the step S1 into an HSI model,
(ii) a In the formula, I is the brightness of the image, RGB is the values of RGB three channels of the pixel point, S is the saturation of the image, min (R, G, B) is the minimum value of RGB values, H is the tone of the image,
s322, calibrating the HSI model image, and manually marking out a rice grain area and a yellow grain area;
s323, inputting the original image before segmentation and the corresponding manually segmented image into the depth fully-convolutional neural network for training;
s324, forward propagation is carried out to calculate a loss value;
s325, adjusting network parameters including parameters of convolution kernels in the convolution layer through the loss values;
and S326, repeating the steps S324 to S325 until the expected loss value or the preset training times are reached.
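The RGB-to-HSI conversion of step S321 can be sketched with the textbook formulas (a NumPy illustration, not code from the patent; the small `eps` guard against division by zero is my addition):

```python
import numpy as np

def rgb_to_hsi(r, g, b, eps=1e-8):
    """Textbook RGB -> HSI conversion for normalised channels in [0, 1]:
    I = (R+G+B)/3, S = 1 - 3*min(R,G,B)/(R+G+B),
    H = theta if B <= G else 360 - theta, where
    theta = arccos(((R-G)+(R-B)) / (2*sqrt((R-G)**2 + (R-B)*(G-B))))."""
    i = (r + g + b) / 3.0
    s = 1.0 - 3.0 * min(r, g, b) / (r + g + b + eps)
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
    theta = np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0)))
    h = theta if b <= g else 360.0 - theta
    return h, s, i

# Pure red should give hue near 0 degrees, full saturation, intensity 1/3.
h, s, i = rgb_to_hsi(1.0, 0.0, 0.0)
print(h, s, i)
```

HSI separates chromatic information (H, S) from brightness (I), which is why a yellow discoloration is easier to mark and segment in this space than in raw RGB.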
3. The method according to claim 2, wherein the step S324 of calculating the loss value specifically comprises: passing the original image through the network layers of the deep fully convolutional neural network to obtain the network's segmentation of the original image, comparing that segmentation with the manually segmented image, and calculating a loss value; the loss function is the mean absolute error, MAE = (1/n)·Σ|y_i − ŷ_i|, where y_i is the actual value, ŷ_i is the predicted value and n is the total number of training samples; the forward propagation process uses an n × n convolution kernel for valid convolution, the convolution formula being s(i, j) = Σ_m Σ_l x(i + m, j + l)·k′(m, l), where x is the matrix to be convolved, x(i + m, j + l) is the point at that position, s(i, j) is the value of the point after convolution, k is the n × n convolution kernel, and k′ is the matrix obtained by rotating k by 180°; the result of the convolution is passed through an activation function for a nonlinear transformation, the activation function being the ReLU function f(x) = max(0, x), which takes the maximum of 0 and x.
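The three ingredients of claim 3 (MAE loss, valid convolution with a 180°-rotated kernel, ReLU) can be sketched as (a NumPy illustration of the standard definitions, not the patent's implementation):

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean absolute error: (1/n) * sum(|y_i - yhat_i|)."""
    return np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred)))

def valid_conv2d(x, k):
    """'Valid' 2-D convolution: the kernel is rotated 180 degrees and slid
    over every position where it fits entirely inside x."""
    kr = np.rot90(k, 2)                 # rotate kernel by 180 degrees
    n = k.shape[0]
    h = x.shape[0] - n + 1
    w = x.shape[1] - n + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(x[i:i+n, j:j+n] * kr)
    return out

def relu(x):
    """ReLU activation: f(x) = max(0, x), elementwise."""
    return np.maximum(0, x)

print(mae([1.0, 2.0], [1.5, 1.5]))     # 0.5
x = np.arange(9, dtype=float).reshape(3, 3)
k = np.array([[0.0, 0.0], [0.0, 1.0]])
print(valid_conv2d(x, k))              # rotated kernel picks each window's top-left pixel
print(relu(np.array([-1.0, 2.0])))     # [0. 2.]
```

The 180° rotation is what distinguishes true convolution from the cross-correlation most deep learning frameworks actually compute; with learned kernels the two are equivalent.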
4. A computer-readable storage medium, characterized in that: a computer program is stored on the medium, and when executed, the computer program performs the method of detecting rice yellow grains according to any one of claims 1 to 3.
5. A computer system, characterized in that: it comprises a processor and a storage medium having a computer program stored thereon, the processor reading the computer program from the storage medium and executing it to perform the method of detecting rice yellow grains according to any one of claims 1 to 3.
6. A system for detecting rice yellow grains according to any one of claims 1 to 3, characterized in that: it comprises a dark box (1), a CCD camera (3) integrated with an image acquisition card, a computer, light sources (2) and an object stage (4);
the light sources (2) are arranged in an annular array at the top end of the dark box (1); the object stage (4) is arranged at the center of the bottom of the dark box (1); background paper is pasted on the inner side walls of the dark box (1); the CCD camera (3) is arranged outside the dark box (1), directly above the object stage (4), and is used for capturing images of rice samples illuminated by light sources (2) of different wavelengths and uploading them to the computer through the image acquisition card for further processing.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211117703.3A CN115187609A (en) | 2022-09-14 | 2022-09-14 | Method and system for detecting rice yellow grains |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115187609A true CN115187609A (en) | 2022-10-14 |
Family
ID=83524681
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211117703.3A Pending CN115187609A (en) | 2022-09-14 | 2022-09-14 | Method and system for detecting rice yellow grains |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115187609A (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103674957A (en) * | 2013-12-25 | 2014-03-26 | 浙江工商大学 | Method and system for detecting yellow rice grains |
CN110942454A (en) * | 2019-11-26 | 2020-03-31 | 北京科技大学 | Agricultural image semantic segmentation method |
CN111553240A (en) * | 2020-04-24 | 2020-08-18 | 四川省农业科学院农业信息与农村经济研究所 | Corn disease condition grading method and system and computer equipment |
CN111815574A (en) * | 2020-06-18 | 2020-10-23 | 南通大学 | Coarse set neural network method for fundus retina blood vessel image segmentation |
CN113486975A (en) * | 2021-07-23 | 2021-10-08 | 深圳前海微众银行股份有限公司 | Ground object classification method, device, equipment and storage medium for remote sensing image |
CN114140692A (en) * | 2021-11-25 | 2022-03-04 | 华中农业大学 | Fresh corn maturity prediction method based on unmanned aerial vehicle remote sensing and deep learning |
CN114678121A (en) * | 2022-05-30 | 2022-06-28 | 上海芯超生物科技有限公司 | HP spherical deformation diagnosis model and construction method thereof |
CN114689527A (en) * | 2022-05-31 | 2022-07-01 | 合肥安杰特光电科技有限公司 | Rice chalkiness detection method and system |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116503402A (en) * | 2023-06-26 | 2023-07-28 | 中储粮成都储藏研究院有限公司 | Method and device for detecting impurity content of grain shoulder |
CN116503402B (en) * | 2023-06-26 | 2023-09-08 | 中储粮成都储藏研究院有限公司 | Method and device for detecting impurity content of grain shoulder |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109871884B (en) | Multi-feature-fused object-oriented remote sensing image classification method of support vector machine | |
CN115619793B (en) | Power adapter appearance quality detection method based on computer vision | |
CN114118144A (en) | Anti-interference accurate aerial remote sensing image shadow detection method | |
CN116993731B (en) | Shield tunneling machine tool bit defect detection method based on image | |
WO2020232710A1 (en) | Haze image quality evaluation method and system, storage medium, and electronic device | |
CN111784605A (en) | Image denoising method based on region guidance, computer device and computer readable storage medium | |
CN109903254B (en) | Improved bilateral filtering method based on Poisson nucleus | |
CN109740721A (en) | Wheat head method of counting and device | |
CN111062293A (en) | Unmanned aerial vehicle forest flame identification method based on deep learning | |
CN115082451A (en) | Stainless steel soup ladle defect detection method based on image processing | |
CN116485787B (en) | Method for detecting appearance defects of data line molding outer die | |
CN115187609A (en) | Method and system for detecting rice yellow grains | |
CN118115497B (en) | Quartz sand crushing and grinding detection method and device | |
CN114689527A (en) | Rice chalkiness detection method and system | |
CN117994154A (en) | Intelligent image denoising method based on sensor | |
Wu et al. | Automatic kernel counting on maize ear using RGB images | |
CN108269264B (en) | Denoising and fractal method of bean kernel image | |
CN106846325A (en) | Automatic method for determining optimal segmentation result of remote sensing image | |
CN115511803B (en) | Broken rice detection method and system | |
Garg et al. | Design of Filtration Approach for Image Quality Improvement in Mango Leaf Disease Detection and Pharmaceutical Treatment | |
CN117541484B (en) | Image enhancement method for detecting bran star of flour | |
CN114757892B (en) | Perspective material defect detection method and system based on artificial intelligence | |
CN116681703B (en) | Intelligent switch quality rapid detection method | |
CN117422656B (en) | Low-illumination fuzzy traffic image enhancement method, device, equipment and medium | |
CN117575970B (en) | Classification-based satellite image automatic processing method, device, equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20221014 |