CN115187609A - Method and system for detecting rice yellow grains - Google Patents

Method and system for detecting rice yellow grains

Info

Publication number
CN115187609A
CN115187609A (application CN202211117703.3A)
Authority
CN
China
Prior art keywords
rice
image
yellow
grain
convolution
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211117703.3A
Other languages
Chinese (zh)
Inventor
赵公方
李新奇
樊春晓
沈红艳
严金欣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei Anjiete Optoelectronic Co ltd
Original Assignee
Hefei Anjiete Optoelectronic Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei Anjiete Optoelectronic Co ltd filed Critical Hefei Anjiete Optoelectronic Co ltd
Priority: CN202211117703.3A
Publication: CN115187609A
Legal status: Pending

Classifications

    • G06T 7/0002: Image analysis; inspection of images, e.g. flaw detection
    • G01N 15/10: Investigating characteristics of particles; investigating individual particles
    • G01N 21/255: Colour/spectral investigation of materials; details, e.g. use of specially adapted sources, lighting or optical systems
    • G01N 21/27: Colour/spectral investigation using photo-electric detection; circuits for computing concentration
    • G06N 3/08: Neural networks; learning methods
    • G06T 5/20: Image enhancement or restoration using local operators
    • G06T 5/70: Denoising; smoothing
    • G06T 7/11: Region-based segmentation
    • G06T 7/187: Segmentation involving region growing, region merging or connected component labelling
    • G06V 10/267: Segmentation of patterns by performing operations on regions, e.g. growing, shrinking or watersheds
    • G06V 10/30: Image preprocessing; noise filtering
    • G06V 10/36: Applying a local operator, e.g. median filtering
    • G06V 10/764: Recognition using pattern recognition or machine learning, using classification
    • G06V 10/82: Recognition using pattern recognition or machine learning, using neural networks
    • G06T 2207/10024: Color image (image acquisition modality)
    • G06T 2207/20032: Median filtering
    • G06T 2207/20081: Training; learning
    • G06T 2207/20084: Artificial neural networks [ANN]


Abstract

The invention discloses a method and a system for detecting yellow rice grains. The method comprises the following steps: S1, image acquisition; S2, preprocessing the image acquired in step S1; and S3, calculating the yellow rice rate from the image data preprocessed in step S2. The method preprocesses the image and uses a deep fully convolutional neural network, which improves recognition accuracy and avoids manual feature selection: because of environmental factors such as illumination and background during sample collection, manually selecting a robust sample feature requires a complicated feature algorithm, whereas building a deep fully convolutional neural network avoids such algorithms and improves recognition accuracy.

Description

Method and system for detecting rice yellow grains
Technical Field
The invention relates to the field of rice detection, in particular to a method and a system for detecting yellow rice grains.
Background
According to the national standard, yellow-grained rice refers to rice grains whose endosperm is yellow and whose colour and lustre are clearly distinguished from those of standard rice. Existing yellow-grain detection methods include the sampling method and machine-vision-based methods. In the sampling method, 150 g of coarse grain is weighed on a balance with e = 0.1 g, milled to national-standard precision and freed of bran powder; the result is weighed (W) as the sample weight, the yellow-grained rice is sorted out according to the specification and weighed (W1), and the yellow rice rate is calculated as W1/W × 100%. In the machine-vision method, an external light source irradiates the rice while it slides past, a CCD camera collects an image, preprocessing operations including graying, background segmentation, edge detection, image binarization and erosion/dilation separate the seed region from the background, the original RGB image is converted into an HSI model, the image is grayed according to the hue H, and a threshold decides whether a grain is yellow. The existing visual detection methods require manual feature selection; because of environmental factors such as illumination and background during sample collection, the feature-extraction process is complicated and the recognition accuracy is low.
Disclosure of Invention
In order to solve the existing problems, the invention provides a method and a system for detecting rice yellow grains, and the specific scheme is as follows:
a rice yellow grain detection method comprises the following steps:
s1, image acquisition;
s2, preprocessing the image acquired in the step S1;
and S3, calculating the yellow rice rate according to the image data preprocessed in the step S2.
Preferably, the image acquisition in step S1 comprises: collecting images of at least three different types of rice grains to obtain visible-spectrum image information of the rice grains in three wave bands near 700 nm (R channel), 550 nm (G channel) and 440 nm (B channel).
Preferably, the preprocessing in step S2 comprises:
S21, performing median filtering on the image acquired in step S1 with a 3 × 3 median filter to remove noise, wherein the median filtering replaces the centre point of each pixel matrix with the median of the matrix pixels;
S22, rotating, flipping and adjusting the image data processed in step S21; wherein the rotation formula is:
x2 = x1·cos θ − y1·sin θ
y2 = x1·sin θ + y1·cos θ
in which θ is the rotation angle, (x1, y1) is the current coordinate and (x2, y2) is the rotated coordinate; the contrast is adjusted using a gamma transform:
s = c·r^γ
where s is the output gray level, r is the input gray level, γ is the gamma value, and c is the gray scaling coefficient.
Preferably, the calculation of the yellow rice rate in step S3 comprises:
S31, building a deep fully convolutional neural network with the U-Net framework;
S32, using the deep fully convolutional neural network built in step S31, segmenting the rice grain regions and the yellow regions to obtain the proportion of the yellow region in each single grain of rice;
S33, counting the total number of rice grains from the segmented rice grain regions, and meanwhile judging whether each grain is yellow rice according to the ratio of the area of its segmented yellow region to the area of the corresponding grain, combined with a set threshold, so as to count the number of yellow rice grains;
S34, calculating the yellow rice rate:
yellow rice rate = (number of yellow rice grains / total number of rice grains) × 100%.
preferably, the specific steps of partitioning the yellow grain regions and the rice grain regions in step S32 include:
s321, converting the RGB model of the image collected in the step S1 into an HSI model,
Figure 306406DEST_PATH_IMAGE004
(ii) a Wherein I is the brightness of the image, RGB is the values of RGB three channels of the pixel point, S is the saturation of the image, min (R, G, B) is the minimum value of the RGB values, H is the hue of the image,
Figure 44555DEST_PATH_IMAGE005
s322, calibrating the HSI model image, and manually marking out a rice grain area and a yellow grain area;
s323, inputting the original image before segmentation and the corresponding manually segmented image into the deep fully-convolutional neural network for training;
s324, forward propagation is carried out to calculate a loss value;
s325, adjusting network parameters including parameters of convolution kernels in the convolution layer through the loss values;
and S326, repeating the steps S324 to S325 until the expected value of the loss value or the preset number of training times is reached.
Preferably, calculating the loss value in step S324 specifically comprises: the original image is passed through the network layers of the deep fully convolutional neural network to obtain the network's segmentation of the original image, which is compared with the manually segmented image to compute the loss value; the loss function is the mean absolute error
MAE = (1/n) Σ |y_i − ŷ_i|
where y_i is the actual value, ŷ_i is the predicted value, and n is the total number of training samples. The forward propagation uses an n × n convolution kernel k for valid convolution; the convolution formula is
s(i, j) = Σ_m Σ_n x(i + m, j + n)·k′(m, n)
where x is the matrix to be convolved, x(i + m, j + n) is the point at that position, s(i, j) is the value of the point after convolution, k is the n × n convolution kernel, and k′ is the matrix obtained by rotating k by 180°. The matrix produced by the convolution is passed through an activation function for a nonlinear transformation; the activation function is the ReLU function
f(x) = max(0, x)
where max takes the maximum of 0 and x.
The invention also discloses a computer-readable storage medium storing a computer program which, when run, executes the above method for detecting rice yellow grains.
The invention also discloses a computer system comprising a processor and a storage medium, wherein the storage medium stores a computer program, and the processor reads the computer program from the storage medium and runs it to execute the method for detecting rice yellow grains described above.
Preferably, the system for the rice yellow grain detection method comprises a dark box, a CCD camera integrated with an image acquisition card, a computer, light sources and an object stage;
the light sources are arranged in an annular array at the top of the dark box; the object stage is arranged at the centre of the bottom of the dark box; background paper is pasted on the inner side walls of the dark box; and the CCD camera is arranged outside the dark box directly above the object stage, and is used for collecting images of rice samples under illumination by light sources of different wavelengths and uploading them to the computer through the image acquisition card for further processing.
The invention has the beneficial effects that:
the invention preprocesses the image, improves the identification precision by utilizing the deep full convolution neural network and avoids the manual selection of the characteristics. Due to the environmental factors such as illumination and background during sample collection, compared with manual selection of a sample feature with strong robustness, the construction of the deep full convolution neural network can avoid a complex feature algorithm and improve the identification precision.
Drawings
In order to more clearly illustrate the embodiments or technical solutions of the present invention, the drawings used in the embodiments or technical solutions of the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a schematic diagram of a deep full convolution neural network according to the present invention;
FIG. 3 is a schematic diagram of the system of the present invention.
The reference numbers are as follows: 1, dark box; 2, light source; 3, CCD camera; 4, object stage.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
Referring to fig. 1, a method for detecting rice yellow grains includes the following steps:
and S1, image acquisition.
The acquired images comprise images of at least three different types of rice grains, giving visible-spectrum image information of the rice grains in three wave bands near 700 nm (R channel), 550 nm (G channel) and 440 nm (B channel).
And S2, preprocessing the image acquired in the step S1.
Wherein, the preprocessing comprises:
S21, performing median filtering on the image acquired in step S1 with a 3 × 3 median filter to remove noise, wherein the median filtering replaces the centre point of each pixel matrix with the median of the matrix pixels.
S22, rotating, flipping and adjusting the image data processed in step S21; wherein the rotation formula is:
x2 = x1·cos θ − y1·sin θ
y2 = x1·sin θ + y1·cos θ
in which θ is the rotation angle, (x1, y1) is the current coordinate and (x2, y2) is the rotated coordinate; the contrast is adjusted using a gamma transform:
s = c·r^γ
where s is the output gray level, r is the input gray level, γ is the gamma value, and c is the gray scaling coefficient.
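The preprocessing steps S21 and S22 described above can be written as a minimal NumPy sketch. This is illustrative only: the function names and the assumption that gray levels lie in [0, 1] are additions, not from the patent.

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter: each pixel is replaced by the median of its
    3x3 neighbourhood (borders handled by edge padding)."""
    p = np.pad(img, 1, mode="edge")
    windows = np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                        for i in range(3) for j in range(3)])
    return np.median(windows, axis=0)

def rotate_point(x1, y1, theta):
    """Rotate a coordinate (x1, y1) by angle theta (radians):
    x2 = x1*cos(theta) - y1*sin(theta), y2 = x1*sin(theta) + y1*cos(theta)."""
    x2 = x1 * np.cos(theta) - y1 * np.sin(theta)
    y2 = x1 * np.sin(theta) + y1 * np.cos(theta)
    return x2, y2

def gamma_transform(r, gamma, c=1.0):
    """Gamma contrast adjustment s = c * r**gamma on gray levels in [0, 1]."""
    return c * r ** gamma
```

A single bright noise pixel surrounded by zeros, for example, is removed by `median_filter3`, since eight of the nine values in its window are zero.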
And S3, calculating the yellow rice rate according to the image data preprocessed in the step S2.
The calculation steps of the yellow rice rate comprise:
and S31, building a deep full convolution neural network of the U-Net framework.
As shown in fig. 2, the deep fully convolutional neural network consists of two structurally similar parts: one side (the left in the figure) comprises 4 blocks, each containing two convolutional layers C1 and C2 and one max-pooling layer S1, and the other side (the right in the figure) comprises 5 blocks, each containing two convolutional layers C1 and C2 and one deconvolution layer U1.
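The patent shows this network only in figure 2, which is not reproduced here. As a rough illustration of how such a U-Net-style encoder-decoder is dimensioned, the spatial bookkeeping can be traced as below, assuming the unpadded 3 × 3 convolutions and 2 × 2 pooling of the original U-Net paper; `unet_output_size` and the `depth` parameter are names of this sketch, and the actual patented network may use different sizes.

```python
def unet_output_size(n, depth=4):
    """Trace the spatial size of a square input through a U-Net-style
    network with `depth` contracting blocks, a bottleneck, and `depth`
    expanding blocks, assuming unpadded (valid) 3x3 convolutions."""
    for _ in range(depth):
        n = n - 2 - 2      # two valid 3x3 convolutions, each shrinks by 2
        n = n // 2         # 2x2 max pooling halves the size
    n = n - 2 - 2          # bottleneck: two more convolutions
    for _ in range(depth):
        n = n * 2          # 2x2 up-convolution doubles the size
        n = n - 2 - 2      # two valid 3x3 convolutions
    return n
```

With the classic U-Net input size of 572 × 572 this yields a 388 × 388 output map, matching the original U-Net paper.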
And S32, using the deep fully convolutional neural network built in step S31, segmenting the rice grain regions and the yellow regions to obtain the proportion of the yellow region in each single grain of rice.
The specific steps of segmenting the yellow regions and the rice grain regions comprise:
S321, converting the RGB model of the image collected in step S1 into an HSI model:
I = (R + G + B) / 3
S = 1 − 3·min(R, G, B) / (R + G + B)
H = θ if B ≤ G, otherwise H = 360° − θ, with
θ = arccos( ((R − G) + (R − B)) / (2·√((R − G)² + (R − B)(G − B))) )
in which I is the brightness of the image, R, G and B are the values of the three RGB channels of the pixel, S is the saturation of the image, min(R, G, B) is the minimum of the RGB values, and H is the hue of the image.
and S322, calibrating the HSI model image, and manually marking out a rice grain area and a yellow grain area.
And S323, inputting the original image before segmentation and the corresponding manually segmented image into the depth fully-convolutional neural network for training.
S324, performing forward propagation and calculating the loss value.
Wherein, calculating the loss value specifically comprises: the original image is passed through the network layers of the deep fully convolutional neural network to obtain the network's segmentation of the original image, which is compared with the manually segmented image to compute the loss value; the loss function is the mean absolute error
MAE = (1/n) Σ |y_i − ŷ_i|
where y_i is the actual value, ŷ_i is the predicted value, and n is the total number of training samples. The forward propagation uses an n × n convolution kernel k for valid convolution; the convolution formula is
s(i, j) = Σ_m Σ_n x(i + m, j + n)·k′(m, n)
where x is the matrix to be convolved, x(i + m, j + n) is the point at that position, s(i, j) is the value of the point after convolution, k is the n × n convolution kernel, and k′ is the matrix obtained by rotating k by 180°. The matrix produced by the convolution is passed through an activation function for a nonlinear transformation; the activation function is the ReLU function
f(x) = max(0, x)
where max takes the maximum of 0 and x.
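The valid convolution with a 180°-rotated kernel, the ReLU activation and the mean-absolute-error loss used in step S324 can be sketched in NumPy as follows; the helper names are illustrative, not from the patent.

```python
import numpy as np

def valid_conv(x, k):
    """'Valid' 2-D convolution: the kernel k is rotated by 180 degrees and
    slid over x without padding, so the output shrinks by (kernel - 1)."""
    kr = np.rot90(k, 2)                       # k' = k rotated 180 degrees
    kh, kw = k.shape
    h, w = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            # s(i, j) = sum_m sum_n x(i + m, j + n) * k'(m, n)
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kr)
    return out

def relu(x):
    """ReLU activation f(x) = max(0, x), applied elementwise."""
    return np.maximum(0.0, x)

def mae(y_true, y_pred):
    """Mean absolute error (1/n) * sum |y_i - y_hat_i|."""
    return np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred)))
```

With a 2 × 2 kernel whose only nonzero entry is the bottom-right corner, the rotated kernel picks out the top-left value of each window, which makes the behaviour easy to check by hand.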
S325, adjusting network parameters including parameters of convolution kernels in the convolution layer through the loss values.
And S326, repeating steps S324 to S325 until the expected loss value or the preset number of training iterations is reached; specifically, the expected loss value is set to be less than or equal to 0.5, and the preset number of training iterations can be set between 200 and 300.
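The stopping rule of step S326 (loss at most 0.5, or a preset epoch budget of 200 to 300) can be sketched as a generic training loop. Here `step_fn` stands in for one forward-propagation-plus-parameter-update step and is an assumption of this sketch, not an API from the patent.

```python
def train(step_fn, max_epochs=300, target_loss=0.5):
    """Repeat training steps until the loss reaches target_loss
    (<= 0.5 in the patent) or max_epochs is exhausted.
    Returns (epochs_run, final_loss)."""
    loss = float("inf")
    for epoch in range(1, max_epochs + 1):
        loss = step_fn(epoch)      # one forward pass + parameter update
        if loss <= target_loss:    # expected loss value reached: stop early
            return epoch, loss
    return max_epochs, loss
</n```

For example, a dummy step whose loss decays as 10/epoch triggers the early stop at epoch 20, where the loss first reaches 0.5.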
And S33, counting the total number of the rice grains according to the segmentation of the rice grain regions, and judging whether the rice grains are yellow rice or not according to the ratio of the yellow grain regions to the areas of the rice grains corresponding to the yellow grain regions after segmentation in combination with a set threshold value, so that the number of the yellow rice is counted.
S34, calculating the yellow rice rate:
yellow rice rate = (number of yellow rice grains / total number of rice grains) × 100%.
a system of a rice yellow grain detection method comprises a dark box 1, a CCD camera 3 integrated with an image acquisition card, a computer, a light source 2 and an objective table 4.
The top end of the dark box 1 is provided with the light sources 2 in an annular array; the object stage 4 is arranged in the center of the bottom of the dark box 1; background paper is pasted on the inner side wall of the camera bellows 1; and the CCD camera 3 is arranged right above the objective table 4 outside the dark box 1 and is used for collecting rice samples irradiated by the light sources 2 with different wavelengths and uploading the rice samples to the computer through the image acquisition card for further processing of the samples.
The invention preprocesses the image and uses a deep fully convolutional neural network to improve recognition accuracy while avoiding manual feature selection. Because of environmental factors such as illumination and background during sample collection, manually selecting a robust sample feature requires a complicated feature algorithm; building a deep fully convolutional neural network avoids such algorithms and improves recognition accuracy.
The invention also discloses a computer-readable storage medium storing a computer program which, when run, executes the above method for detecting rice yellow grains.
The invention also discloses a computer system comprising a processor and a storage medium, wherein the storage medium stores a computer program, and the processor reads the computer program from the storage medium and runs it to execute the above method for detecting rice yellow grains.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In one or more exemplary embodiments, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software as a computer program product, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media include both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another. Storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a web site, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, Digital Subscriber Line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include Compact Disc (CD), laser disc, optical disc, Digital Versatile Disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
The previous description of the disclosure is provided to enable any person skilled in the art to make or use the disclosure. Various modifications to the disclosure will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other variations without departing from the spirit or scope of the disclosure. Thus, the disclosure is not intended to be limited to the examples and designs described herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and that such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (6)

1. A method for detecting rice yellow grains, characterized by comprising the following steps:
S1, image acquisition: collecting visible-spectrum image information of at least 3 different types of rice grains in three wave bands, near 700 nm (R channel), 550 nm (G channel) and 440 nm (B channel);
s2, preprocessing the image acquired in the step S1;
s21, carrying out median filtering on the image acquired in the step 1 through a 3 x 3 median filter to remove noise points, wherein the median filtering uses median assignment matrix center points of matrix pixel points;
s22, rotating, overturning and adjusting the image data processed in the step S21; wherein, the rotation formula is:
Figure DEST_PATH_IMAGE002
in the formula, theta is a rotation angle, (x) 1 ,y 1 ) Is at presentCoordinate (x) 2 ,y 2 ) For the rotated coordinates, the contrast is adjusted using a gamma transform:
Figure DEST_PATH_IMAGE004
s is an output gray level, r is an input gray level, gamma is a gamma value, and c is a gray scaling coefficient;
s3, calculating the yellow rice rate according to the image data preprocessed in the step S2;
wherein, the calculation step of the yellow rice rate comprises the following steps:
S31, building a deep full convolution neural network based on the U-Net framework;
S32, segmenting the rice grain regions and yellow grain regions with the deep full convolution neural network built in step S31, to obtain the proportion of the yellow region within a single grain of rice;
S33, counting the total number of rice grains from the segmented rice grain regions; meanwhile, judging whether each grain is yellow rice by comparing the ratio of its segmented yellow region area to its grain area against a set threshold, thereby counting the number of yellow rice grains;
S34, calculating the yellow rice rate:
yellow rice rate = (number of yellow rice grains / total number of rice grains) × 100%.
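The preprocessing and counting arithmetic of steps S21, S22 and S34 can be sketched as follows (an illustrative sketch only, not the patented implementation; the gamma value and scaling coefficient defaults are assumed example values):

```python
import numpy as np

def median_filter_3x3(img):
    """Step S21: replace each pixel by the median of its 3x3 neighbourhood."""
    padded = np.pad(img, 1, mode="edge")
    out = np.empty_like(img)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.median(padded[i:i + 3, j:j + 3])
    return out

def rotate_point(x1, y1, theta):
    """Step S22 rotation: rotate (x1, y1) by angle theta about the origin."""
    x2 = x1 * np.cos(theta) - y1 * np.sin(theta)
    y2 = x1 * np.sin(theta) + y1 * np.cos(theta)
    return x2, y2

def gamma_transform(r, c=1.0, gamma=1.5):
    """Step S22 contrast adjustment: s = c * r ** gamma, gray levels in [0, 1]."""
    return c * np.power(r, gamma)

def yellow_rice_rate(yellow_count, total_count):
    """Step S34: percentage of yellow grains among all counted grains."""
    return 100.0 * yellow_count / total_count
```

In practice the per-pixel median loop would be replaced by a vectorized or library filter; it is written out here only to mirror the claim wording.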
2. the method according to claim 1, wherein the specific steps of classifying the yellow grain regions and the rice grain regions in step S32 include:
S321, converting the RGB model of the image collected in step S1 into an HSI model:
I = (R + G + B) / 3,  S = 1 − 3·min(R, G, B) / (R + G + B)
where I is the brightness of the image, R, G and B are the values of the three RGB channels of the pixel point, S is the saturation of the image, min(R, G, B) is the minimum of the RGB values, and H is the hue of the image:
H = θ if B ≤ G, otherwise H = 360° − θ, with θ = arccos( [(R − G) + (R − B)] / [2·√((R − G)² + (R − B)(G − B))] );
s322, calibrating the HSI model image, and manually marking out a rice grain area and a yellow grain area;
S323, inputting the original images before segmentation and the corresponding manually segmented images into the deep full convolution neural network for training;
s324, forward propagation is carried out to calculate a loss value;
s325, adjusting network parameters including parameters of convolution kernels in the convolution layer through the loss values;
and S326, repeating the steps S324 to S325 until the expected loss value or the preset training times are reached.
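The RGB-to-HSI conversion of step S321 can be sketched per pixel with the textbook conversion formulas (a minimal sketch; the patent's own expressions are shown only as images, so the exact hue branch below is an assumption based on the standard HSI model):

```python
import numpy as np

def rgb_to_hsi(R, G, B):
    """Convert one RGB pixel (components in [0, 1]) to (H, S, I).

    I is the mean of the channels, S falls to 0 for gray pixels, and
    H is the standard geometric hue angle in degrees.
    """
    I = (R + G + B) / 3.0
    s_min = min(R, G, B)
    S = 0.0 if I == 0 else 1.0 - s_min / I
    # Hue: angle between the pixel's chromatic vector and the red axis.
    num = 0.5 * ((R - G) + (R - B))
    den = np.sqrt((R - G) ** 2 + (R - B) * (G - B))
    if den == 0:
        H = 0.0  # achromatic pixel: hue undefined, conventionally 0
    else:
        theta = np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0)))
        H = theta if B <= G else 360.0 - theta
    return H, S, I
```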
3. The method according to claim 2, wherein the step S324 of calculating the loss value specifically comprises: passing the original image through the network layers of the deep full convolution neural network to obtain the network's segmentation of the original image, comparing it with the manually segmented image, and calculating a loss value; the loss function is the mean absolute error,
MAE = (1/n) · Σᵢ |yᵢ − ŷᵢ|
where yᵢ is the actual value, ŷᵢ is the predicted value, and n is the total number of training samples; the forward propagation process uses an n × n convolution kernel for valid convolution, the convolution formula being
g(i, j) = Σₛ Σₜ f(i + s, j + t) · k′(s, t)
where f is the matrix to be convolved, g(i, j) is the value of the point at position (i, j) after convolution, k is the n × n convolution kernel, and k′ is the matrix obtained by rotating k through 180°; the matrix obtained after convolution is subjected to a nonlinear transformation by an activation function, the activation function being the ReLU function,
ReLU(x) = max(0, x)
where max takes the larger of 0 and x.
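The loss, convolution and activation described in claim 3 can be sketched as follows (illustrative only; `conv2d_valid` assumes a square kernel and "valid" output size, which the claim does not spell out):

```python
import numpy as np

def mae_loss(y_true, y_pred):
    """Mean absolute error used as the segmentation loss in claim 3."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean(np.abs(y_true - y_pred))

def conv2d_valid(f, k):
    """'Valid' 2-D convolution: rotate the kernel 180 degrees, then slide
    it over f taking the elementwise sum-product (convolution, not
    cross-correlation)."""
    kr = np.rot90(k, 2)          # k' = k rotated through 180 degrees
    n = k.shape[0]
    h = f.shape[0] - n + 1
    w = f.shape[1] - n + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(f[i:i + n, j:j + n] * kr)
    return out

def relu(x):
    """ReLU activation: elementwise max(0, x)."""
    return np.maximum(0.0, x)
```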
4. A computer-readable storage medium, characterized in that a computer program is stored on the medium which, when executed, performs the method for detecting rice yellow grains according to any one of claims 1 to 3.
5. A computer system, characterized by comprising a processor and a storage medium having a computer program stored thereon, the processor reading the computer program from the storage medium and executing it to perform the method for detecting rice yellow grains according to any one of claims 1 to 3.
6. A system for detecting rice yellow grains according to any one of claims 1 to 3, characterized by comprising a dark box (1), a CCD camera (3) with an integrated image acquisition card, a computer, light sources (2) and an object stage (4);
the light sources (2) are arranged in an annular array at the top end of the dark box (1); the object stage (4) is arranged in the centre of the bottom of the dark box (1); background paper is pasted on the inner side wall of the dark box (1); the CCD camera (3) is arranged outside the dark box (1) directly above the object stage (4), and is used for acquiring images of rice samples irradiated by light sources (2) of different wavelengths and uploading them to the computer through the image acquisition card for further processing.
CN202211117703.3A 2022-09-14 2022-09-14 Method and system for detecting rice yellow grains Pending CN115187609A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211117703.3A CN115187609A (en) 2022-09-14 2022-09-14 Method and system for detecting rice yellow grains


Publications (1)

Publication Number Publication Date
CN115187609A true CN115187609A (en) 2022-10-14

Family

ID=83524681

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211117703.3A Pending CN115187609A (en) 2022-09-14 2022-09-14 Method and system for detecting rice yellow grains

Country Status (1)

Country Link
CN (1) CN115187609A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103674957A (en) * 2013-12-25 2014-03-26 浙江工商大学 Method and system for detecting yellow rice grains
CN110942454A (en) * 2019-11-26 2020-03-31 北京科技大学 Agricultural image semantic segmentation method
CN111553240A (en) * 2020-04-24 2020-08-18 四川省农业科学院农业信息与农村经济研究所 Corn disease condition grading method and system and computer equipment
CN111815574A (en) * 2020-06-18 2020-10-23 南通大学 Coarse set neural network method for fundus retina blood vessel image segmentation
CN113486975A (en) * 2021-07-23 2021-10-08 深圳前海微众银行股份有限公司 Ground object classification method, device, equipment and storage medium for remote sensing image
CN114140692A (en) * 2021-11-25 2022-03-04 华中农业大学 Fresh corn maturity prediction method based on unmanned aerial vehicle remote sensing and deep learning
CN114678121A (en) * 2022-05-30 2022-06-28 上海芯超生物科技有限公司 HP spherical deformation diagnosis model and construction method thereof
CN114689527A (en) * 2022-05-31 2022-07-01 合肥安杰特光电科技有限公司 Rice chalkiness detection method and system


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116503402A (en) * 2023-06-26 2023-07-28 中储粮成都储藏研究院有限公司 Method and device for detecting impurity content of grain shoulder
CN116503402B (en) * 2023-06-26 2023-09-08 中储粮成都储藏研究院有限公司 Method and device for detecting impurity content of grain shoulder


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20221014