CN114755228A - Detection instrument and detection method for chlorophyll and nitrogen contents of cotton leaves - Google Patents

Detection instrument and detection method for chlorophyll and nitrogen contents of cotton leaves

Info

Publication number
CN114755228A
Authority
CN (China)
Prior art keywords
spectral, leaf, spectrum, features, rgb
Legal status
Pending
Application number
CN202210428296.1A
Other languages
Chinese (zh)
Inventors
冯雷, 肖沁林, 张初, 吴娜, 何勇, 刘羽飞, 刘飞
Current Assignee
Zhejiang University (ZJU)
Original Assignee
Zhejiang University (ZJU)
Application filed by Zhejiang University (ZJU)
Priority to CN202210428296.1A
Publication of CN114755228A

Classifications

    • G01N21/84 Systems specially adapted for particular applications (investigating or analysing materials by optical means)
    • G01N21/01 Arrangements or apparatus for facilitating the optical investigation
    • G01N21/55 Specular reflectivity
    • G06F18/253 Fusion techniques of extracted features
    • G06N3/045 Combinations of networks
    • G06N3/08 Learning methods (neural networks)
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/45 Analysis of texture based on statistical description of texture using co-occurrence matrix computation
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T7/90 Determination of colour characteristics
    • G01N2021/8466 Investigation of vegetal material, e.g. leaves, plants, fruits
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]


Abstract

The invention relates to a detection instrument and a detection method for the chlorophyll and nitrogen contents of cotton leaves, and belongs to the technical fields of spectral analysis and plant phenotyping. An RGB camera acquires an RGB image of the cotton leaf to be detected under illumination from a light source, and a spectral probe acquires the spectral data of the leaf under the same illumination. Color features, morphological features and texture features are extracted from the RGB image, spectral features are extracted from the spectral data, and these features are fused to obtain fused features. The fused features are then used as the input of a prediction model that calculates the chlorophyll content and nitrogen content of the cotton leaf to be detected. The invention enables rapid and automatic acquisition of chlorophyll- and nitrogen-content phenotype data of cotton leaves, and addresses the problems that existing plant-leaf phenotyping platforms are difficult to carry and cannot rapidly measure the chlorophyll and nitrogen contents of cotton leaves in the field.

Description

Detection instrument and detection method for chlorophyll and nitrogen contents of cotton leaves
Technical Field
The invention relates to the technical fields of spectral analysis and plant phenotyping, and in particular to an enclosed, portable detection instrument and a detection method that can be used for detecting the chlorophyll and nitrogen contents of cotton leaves.
Background
Cotton has excellent natural characteristics and is one of the important economic crops in China, and detecting the chlorophyll and nitrogen contents of cotton leaves is of great significance for characterizing the physiological and nutritional state of the crop. The traditional methods for detecting the chlorophyll and nitrogen contents of cotton leaves are mainly ultraviolet-visible spectrophotometry and the Kjeldahl nitrogen determination method. Although these methods are feasible, reproducible and accurate, they require a large amount of time, involve tedious work and cause irreversible damage to the sample, which limits their application. In recent years, with the development of automation, machine vision and spectral analysis technology, high-throughput, accurate and efficient plant-leaf phenotyping has used sensors to measure chlorophyll and nitrogen phenotype data of plant leaves and to predict the chlorophyll and nitrogen contents from these data with good results. However, high-throughput plant phenotyping platforms suffer from large equipment volume, high cost, low efficiency and difficulty in information storage and processing, which greatly limits their practical application and makes them hard to carry to the field for on-site data acquisition.
Therefore, a detection instrument and a detection method with small volume and high detection efficiency are needed.
Disclosure of Invention
The invention aims to provide a detection instrument and a detection method for the chlorophyll and nitrogen contents of cotton leaves that enable rapid and automatic acquisition of the corresponding phenotype data and overcome the problems that existing plant-leaf phenotyping platforms are difficult to carry and cannot rapidly measure the chlorophyll and nitrogen contents of cotton leaves in the field.
In order to achieve the purpose, the invention provides the following scheme:
a detection instrument for chlorophyll and nitrogen contents in cotton leaves, comprising: the device comprises a shading shell, a control chip, a reflectivity plate, a light source, an RGB camera and a spectrum probe, wherein the reflectivity plate, the light source, the RGB camera and the spectrum probe are positioned in the shading shell; the reflectivity plate is positioned on the bottom surface of the shading shell; the control chip is respectively in communication connection with the RGB camera and the spectrum probe;
the cotton leaf to be detected is placed on the reflectivity plate; the light source is used for illuminating the cotton leaf to be detected; the RGB camera is used for acquiring an RGB image of the cotton leaf to be detected under illumination from the light source; the spectral probe is used for acquiring the spectral data of the cotton leaf to be detected under the same illumination;
the control chip is used for calculating the chlorophyll content and the nitrogen content of the cotton leaf to be detected based on the RGB image and the spectral data.
A method for detecting chlorophyll and nitrogen contents of cotton leaves comprises the following steps:
receiving RGB images acquired by an RGB camera and spectral data acquired by a spectral probe;
extracting color features, morphological features and texture features of the RGB image, and extracting first spectral features and second spectral features of the spectral data;
fusing the color feature, the morphological feature, the texture feature and the first spectral feature to obtain a first fused feature; fusing the color feature, the morphological feature, the texture feature and the second spectral feature to obtain a second fused feature;
calculating the chlorophyll content of the cotton leaf to be detected by using a first prediction model with the first fused features as input; and calculating the nitrogen content of the cotton leaf to be detected by using a second prediction model with the second fused features as input.
According to the specific embodiment provided by the invention, the invention discloses the following technical effects:
The invention provides a detection instrument and a detection method for the chlorophyll and nitrogen contents of cotton leaves. An RGB camera acquires an RGB image of the cotton leaf to be detected under illumination from a light source, and a spectral probe acquires the corresponding spectral data. Color features, morphological features and texture features are extracted from the RGB image, spectral features are extracted from the spectral data, and these features are fused to obtain fused features. Finally, the fused features are used as the input of a prediction model that calculates the chlorophyll content and nitrogen content of the cotton leaf to be detected. The method enables rapid and automatic acquisition of chlorophyll- and nitrogen-content phenotype data of cotton leaves and solves the problems that existing plant-leaf phenotyping platforms are difficult to carry and cannot rapidly measure the chlorophyll and nitrogen contents of cotton leaves in the field.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a schematic view of the overall structure of a detecting apparatus provided in embodiment 1 of the present invention;
FIG. 2 is a flowchart of the detection method provided in embodiment 2 of the present invention;
FIG. 3 is a flowchart showing the operation of the detection instrument provided in embodiment 2 of the present invention;
fig. 4 is a schematic network structure diagram of the prediction model provided in embodiment 2 of the present invention.
Description of the symbols:
1-a power supply; 2-power plug; 3-a control chip; 4-a control lever; 5-a light source; 6-a spectral probe; 7-RGB camera; 8-a fan; 9-a light-shielding housing; 10-start button; 11-cotton leaves to be detected; 12-a sample holding drawer at the bottom; 13-a reflectance plate; 14-USB output interface.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention aims to provide a detection instrument and a detection method for chlorophyll and nitrogen contents of cotton leaves, which can realize the quick and automatic acquisition of phenotype data of the chlorophyll and nitrogen contents of the cotton leaves and can overcome the problems that the prior plant leaf phenotype platform is difficult to carry and the chlorophyll and nitrogen contents of the cotton leaves in the field can not be quickly acquired on the spot.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Example 1:
This embodiment provides a detection instrument for the chlorophyll and nitrogen contents of cotton leaves; the detection instrument is an enclosed, portable instrument. As shown in fig. 1, the detection instrument includes: a light-shielding shell 9, a control chip 3, and a reflectivity plate 13, a light source 5, an RGB camera 7 and a spectral probe 6 located inside the light-shielding shell 9. The control chip 3 may be an AI control chip with its peripheral circuits; it may be located inside the light-shielding shell 9, specifically mounted on the top of the shell. The light source 5 may be a group of arrayed halogen lamps, which have a long service life and a stable spectrum; the light source 5 may be installed on the top of the light-shielding shell 9, preferably at the center of the top. The RGB camera 7 may be a detachable RGB camera, and the spectral probe 6 may be a rectangular spectral probe.
The reflectivity plate 13 is located on the bottom surface of the light-shielding shell 9, the cotton leaf 11 to be detected is placed on the reflectivity plate 13, and the reflectivity of the reflectivity plate 13 may be 30%. The light source 5 is used for illuminating the cotton leaf 11 to be detected, the RGB camera 7 is used for acquiring an RGB image of the cotton leaf 11 under illumination from the light source 5, and the spectral probe 6 is used for acquiring the spectral data of the cotton leaf 11 under the same illumination.
Because the cotton leaf 11 to be detected is enclosed in the light-shielding shell 9, rests on the reflectivity plate 13 of fixed reflectivity, and is illuminated only by the light source 5, the influence of external illumination is avoided. In this imaging environment, no complicated background-removal processing of the acquired RGB image and spectral data is needed, which reduces the processing load.
The control chip 3 is in communication connection with the RGB camera 7 and the spectral probe 6, respectively, and is used for calculating the chlorophyll content and nitrogen content of the cotton leaf 11 to be detected based on the RGB image and the spectral data.
As an optional implementation, the bottom of the light-shielding shell 9 is designed as a drawer: a bottom sample-holding drawer 12 is arranged at the bottom of the inner side of the shell and is slidably connected with the light-shielding shell 9 through a sliding rail, so that the light-shielding shell 9 and the bottom sample-holding drawer 12 form a closed environment and the detection process is free from interference by the external environment. The bottom surface of the bottom sample-holding drawer 12 is black, the reflectivity plate 13 is located on this bottom surface, and the cotton leaf 11 to be detected still rests on the reflectivity plate 13. When a cotton leaf 11 needs to be placed in the light-shielding shell 9, the bottom sample-holding drawer 12 is pulled out, the cotton leaf 11 is placed in the drawer, and the drawer is then closed. This makes it more convenient to place the cotton leaf 11 to be detected and to adjust its position.
The detection instrument of this embodiment further comprises a moving component, which is located inside the light-shielding shell 9 and may specifically be installed on its top. The RGB camera 7 and the spectral probe 6 are mounted on the moving component, and the moving component is used for driving the RGB camera 7 and the spectral probe 6 to move horizontally and to be raised or lowered. Specifically, the moving component comprises a driving part and a control rod 4, with the driving part in driving connection with the control rod 4; the driving part may be a stepping motor fixedly connected to one side wall inside the light-shielding shell 9, and the control rod 4 may be a stepping control rod. The control rod 4 comprises a horizontal shaft and a longitudinal shaft; the RGB camera 7 and the spectral probe 6 are mounted on the longitudinal shaft, and their lifting and horizontal movement are driven by the stepping motor. The lifting function makes it convenient to set the imaging height: for example, when the cotton leaf 11 to be detected is small, the lenses of the spectral probe 6 and the RGB camera 7 can be lowered to obtain a more accurate spectrum and an RGB image of suitable size.
Because the light source 5, the RGB camera 7 and the spectral probe 6 generate heat during long-term operation, the detection instrument of this embodiment further includes a fan 8 for heat dissipation. The fan 8 is located inside the light-shielding shell 9 and may be installed on its side wall; several fans 8 may be provided.
The control chip 3 of this embodiment may be in control connection with the light source 5, the stepping motor, the spectral probe 6, the RGB camera 7 and the fan 8 and is used for controlling their starting and stopping. The control chip 3 is also used for storing the acquired data, which include the RGB images and the spectral data.
To provide power, the detection instrument of this embodiment further comprises a power supply 1 located outside the light-shielding shell 9; the power supply 1 is an external mobile power supply and is convenient to carry. The power supply 1 is connected to a power plug 2 led out from the outer side of the light-shielding shell 9 and supplies power to the light source 5, the stepping motor, the RGB camera 7, the spectral probe 6 and the fan 8 through the peripheral circuits.
The light-shielding shell 9 of this embodiment is further provided with a start button 10 and a USB output interface 14. The start button 10 is used for starting the detection instrument, and the USB output interface 14 is used for exporting the collected data and the detection results from the control chip 3.
The detection instrument of this embodiment is convenient to carry to the field and can be powered by a mobile power supply. It collects spectral and image information simultaneously, extracts the color, morphological and texture features of the leaf and the spectral features at the characteristic wavelengths, and fuses the image features with the spectral features. The control chip 3 processes the collected data in real time and, using the model integrated in the chip, predicts and outputs the chlorophyll and nitrogen content parameters of the leaf from the fused spectral and image information. The detection instrument provided by this embodiment is an enclosed, portable instrument that can be used for field plant-leaf phenotyping; it has a simple structure, is convenient to carry and operate, has a relatively low cost, and provides an environment in which the algorithm can run. Using the RGB image and spectral data of the leaf acquired by this device, the chlorophyll-content and nitrogen-content phenotype data of the cotton leaf can be obtained quickly and directly. Data acquisition, calculation and output are all performed on the hardware platform of the detection instrument, without copying the data or transmitting it over the Internet to a remote server for computational analysis.
Example 2:
This embodiment provides a method for detecting the chlorophyll and nitrogen contents of a cotton leaf, which controls the operation of the detection instrument described in embodiment 1. As shown in fig. 1 and fig. 2, the method includes:
S1: receiving the RGB image acquired by the RGB camera 7 and the spectral data acquired by the spectral probe 6;
as shown in fig. 3, the RGB image and the spectrum data are acquired by the detecting instrument as follows: after the power supply 1 is started, the bottom sample containing drawer 12 is opened through the sliding rail, the cotton blade 11 to be tested is placed in the bottom sample containing drawer 12, and the bottom sample containing drawer 12 is closed; the control chip 3 controls and turns on the light source 5, the stepping motor, the RGB camera 7, the spectrum probe 6 and the fan 8; after the light source 5 is stabilized, the step control rod is controlled to move to drive the RGB camera 7 and the spectrum probe 6 carried on the step control rod to move, whether the cotton blade 11 to be detected is in a visual field range is judged according to the imaging or the spectrum reflectivity, the position of the cotton blade 11 to be detected is automatically identified according to the color and reflectivity difference of the cotton blade 11 to be detected and the 30% reflectivity plate 13, the position coordinate of the cotton blade 11 to be detected is determined, whether a complete blade sample is in the visual field range is judged mainly according to the RGB imaging, the movement is continuously carried out until the cotton blade 11 to be detected is in the data acquisition range of the RGB camera 7 and the spectrum probe 6, the RGB camera 7 is used for acquiring RGB images, and the spectrum probe 6 is used for acquiring spectrum data in the coverage range of the spectrum probe 6.
The bottom sample holding drawer 12 can also be opened in an automatic manner: the control chip 3 outputs a driving signal to pop up the bottom sample holding drawer 12.
It should be noted that if the cotton leaf 11 to be detected is too large, it cannot be brought entirely within the data-acquisition range of the RGB camera 7 and the spectral probe 6 no matter how they are moved, i.e. the image and spectral data of the whole leaf cannot be acquired in one shot. In this case, the RGB camera 7 and the spectral probe 6 can be moved repeatedly to capture several shots, and the data can be stitched together to obtain the RGB image and spectral data of the entire cotton leaf 11. This resolves the problem that data cannot be acquired directly from a large leaf.
S2: extracting color features, morphological features and texture features of the RGB image, and extracting first spectral features and second spectral features of the spectral data;
Specifically, in S2, extracting the color features, morphological features and texture features of the RGB image may include:
(1) performing threshold segmentation on the RGB image to obtain a binary image; the binary image comprises leaf pixel points and background pixel points;
More specifically, the RGB image is thresholded as follows. The RGB image is converted into HSV space; because the S (saturation) component values of the leaf and of the background differ greatly, a fixed S value is set and used to form a mask, and the RGB image is processed with this mask to produce the binary image. For each pixel in the RGB image, if its S component value is larger than the fixed S value, the pixel is regarded as a leaf pixel point and its value in the binary image is set to 1; otherwise it is a background pixel point and its value is set to 0.
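A minimal sketch of this S-channel thresholding is given below, assuming OpenCV's 8-bit HSV representation; the numeric threshold is an assumed placeholder, since the text only states that a fixed S value is used.

```python
import cv2
import numpy as np

S_THRESHOLD = 60  # assumed fixed saturation threshold (0-255 in OpenCV's HSV)

def segment_leaf(rgb_image: np.ndarray) -> np.ndarray:
    """Return a binary mask with 1 for leaf pixel points and 0 for background."""
    hsv = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2HSV)
    saturation = hsv[:, :, 1]
    return (saturation > S_THRESHOLD).astype(np.uint8)
```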
(2) Converting the RGB image into the HSV color space and the LAB color space respectively, and determining the R, G, B, H, S, V, L, A, B component values of each leaf pixel point, where R, G, B, H, S, V, L, A, B are the components of the three color spaces; that is, the R, G, B, H, S, V, L, A, B component values of the leaf connected region are extracted according to the binary image. The average of each component is then calculated over all leaf pixel points, giving the color features of the RGB image, which comprise the averages of the R, G, B, H, S, V, L, A, B components.
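A sketch of this color-feature extraction is shown below, assuming OpenCV color conversions and the binary mask produced by the segmentation step; the function name is illustrative.

```python
import cv2
import numpy as np

def color_features(rgb_image: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Mean R, G, B, H, S, V, L, A, B values over the leaf pixel points."""
    hsv = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2HSV)
    lab = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2LAB)
    stacked = np.dstack([rgb_image, hsv, lab]).astype(np.float64)  # (H, W, 9)
    leaf_pixels = stacked[mask.astype(bool)]                       # (N, 9)
    return leaf_pixels.mean(axis=0)                                # 9 color features
```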
(3) Calculating the morphological features of the RGB image based on the number of leaf pixel points; the morphological features include the area, perimeter, width, height, MajorAxisLength, MinorAxisLength, Eccentricity and EquivDiameter of the leaf.
More specifically, the leaf image can be regarded as a set of leaf pixel points; the pixel points of the whole leaf are mutually connected, so the leaf region in the binary image is a connected set. The number of pixels in the connected region is analysed, the proportion of leaf-connected-region pixels to the total number of pixels is calculated, and this proportion is multiplied by the image field-of-view area to obtain the leaf area. The perimeter is the boundary between the background and the leaf connected region, i.e. the number of pixels in the outermost ring of the leaf connected region. The width is the number of pixels, along the X-axis direction, of the smallest rectangle enclosing the leaf connected region (i.e. its minimum bounding rectangle), and correspondingly the height is the number of pixels of this rectangle along the Y-axis direction. MajorAxisLength is the length of the major axis of the ellipse that has the same normalized second-order central moments as the leaf connected region, i.e. the number of pixels along the major axis of that ellipse, and MinorAxisLength is the corresponding length of its minor axis. Eccentricity is the eccentricity of the same ellipse, i.e. the ratio of the distance between its foci to the length of its major axis. EquivDiameter is the diameter of the circle that has the same area, i.e. the same number of pixels, as the leaf connected region. These indices may be calculated with the regionprops function in MATLAB.
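An equivalent sketch using scikit-image's regionprops (the open-source counterpart of the MATLAB function named above) is given below; treating the largest connected region as the leaf and reporting width and height from the bounding box are assumptions.

```python
import numpy as np
from skimage.measure import label, regionprops

def morphological_features(mask: np.ndarray) -> dict:
    """Morphology indices of the leaf connected region in a binary mask."""
    regions = regionprops(label(mask))
    leaf = max(regions, key=lambda r: r.area)      # assume largest region is the leaf
    min_row, min_col, max_row, max_col = leaf.bbox
    return {
        "Area": leaf.area,                          # leaf pixel count
        "Perimeter": leaf.perimeter,
        "Width": max_col - min_col,                 # bounding-box extent along X
        "Height": max_row - min_row,                # bounding-box extent along Y
        "MajorAxisLength": leaf.major_axis_length,
        "MinorAxisLength": leaf.minor_axis_length,
        "Eccentricity": leaf.eccentricity,
        "EquivDiameter": leaf.equivalent_diameter,
    }
```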
(4) Calculating the gray-level co-occurrence matrix of the binary image, and calculating the texture features of the RGB image based on this matrix; the texture features include the maximum probability, correlation, contrast, energy, homogeneity and entropy.
The co-occurrence matrix is defined by the joint probability density of pixels at two positions. It reflects not only the distribution of brightness but also the spatial distribution of pixels with the same or similar brightness; it is a second-order statistical characteristic of the image's brightness variation and the basis for defining a group of texture features. The gray-level co-occurrence matrix of an image is a matrix function of pixel distance and angle: by computing the correlation between the gray levels of two points separated by a given distance in a given direction, it reflects the combined information of the image in terms of direction, interval, and the amplitude and speed of variation. The maximum probability is the probability of the most frequent pixel pair; the correlation measures the similarity of the co-occurrence matrix elements along the row or column direction; the contrast reflects the clarity of the image and the depth of the texture grooves; the energy is the sum of squares of the co-occurrence matrix elements and reflects the uniformity of the gray-level distribution and the coarseness of the texture; the homogeneity (also called the inverse difference) reflects the homogeneity of the image texture and measures its local variation; and the entropy is a measure of the image's information content, representing the non-uniformity or complexity of the texture.
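A sketch of these texture measures with scikit-image is given below; the input is assumed to be an 8-bit grayscale image of the leaf region, the distance and angle of the co-occurrence computation are assumed values, and since maximum probability and entropy are not built-in graycoprops properties they are computed from the normalized matrix directly.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_features(gray_leaf: np.ndarray) -> dict:
    """GLCM texture features of an 8-bit grayscale leaf image."""
    glcm = graycomatrix(gray_leaf, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]                              # normalized co-occurrence matrix
    nonzero = p[p > 0]
    return {
        "max_probability": float(p.max()),
        "correlation": float(graycoprops(glcm, "correlation")[0, 0]),
        "contrast": float(graycoprops(glcm, "contrast")[0, 0]),
        "energy": float(graycoprops(glcm, "energy")[0, 0]),
        "homogeneity": float(graycoprops(glcm, "homogeneity")[0, 0]),
        "entropy": float(-np.sum(nonzero * np.log2(nonzero))),
    }
```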
At S2, extracting the first spectral feature and the second spectral feature of the spectral data may include:
(1) performing pixel level correspondence on the RGB image and the spectral data to obtain spectral data of each pixel point in the RGB image;
(2) performing threshold segmentation on the RGB image to obtain a binary image, wherein the binary image comprises leaf pixel points and background pixel points;
(3) calculating according to the spectrum data of all the leaf pixel points to obtain a leaf average spectrum, selecting the leaf average spectrum corresponding to the first characteristic wavelength as a first spectrum characteristic of the spectrum data, and selecting the leaf average spectrum corresponding to the second characteristic wavelength as a second spectrum characteristic of the spectrum data.
The leaf average spectrum is obtained by averaging the spectral data of all the leaf pixel points. The characteristic wavelengths may lie in the ranges 380-450 nm, 1630-2000 nm and 2280-2495 nm.
It should be noted that the characteristic wavelengths are determined while training the prediction models. To train the prediction models, a training set is constructed: RGB images and spectral data of a number of leaf samples are collected, and feature extraction is performed on the RGB images to obtain the color, texture and morphological features of each leaf sample. The spectral data are processed to calculate the average spectrum of each leaf sample, and the leaf average spectra are then preprocessed with a combination of standard normal variate transformation and first derivative. Characteristic wavelengths are selected with the competitive adaptive reweighted sampling (CARS) wavelength-selection method: substituting the leaf average spectrum of each leaf sample together with the measured chlorophyll content into the CARS calculation yields the first characteristic wavelengths, and substituting the leaf average spectrum of each leaf sample together with the measured nitrogen content yields the second characteristic wavelengths. The leaf average spectra at the first characteristic wavelengths form the first spectral features of each leaf sample, and the leaf average spectra at the second characteristic wavelengths form the second spectral features. The color, texture, morphological and first spectral features are fused to obtain the first fused features of each leaf sample; with the first fused features as the sample input data and the measured chlorophyll content of each leaf sample as the label, a first training data set is formed and the initial model is trained to obtain the first prediction model. Likewise, the color, texture, morphological and second spectral features are fused to obtain the second fused features; with the second fused features as the sample input data and the measured nitrogen content as the label, a second training data set is formed and the initial model is trained to obtain the second prediction model.
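A sketch of the spectral-feature computation (leaf mean spectrum, SNV plus first-derivative preprocessing, then selection of the characteristic bands) is shown below; the CARS selection itself runs offline during training, so the selected band indices are passed in as an assumed, precomputed array, and the function names are illustrative.

```python
import numpy as np

def leaf_mean_spectrum(spectral_cube: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """spectral_cube: (H, W, n_bands) spectra aligned pixel-by-pixel with the RGB image."""
    return spectral_cube[mask.astype(bool)].mean(axis=0)

def snv_first_derivative(spectrum: np.ndarray) -> np.ndarray:
    """Standard normal variate correction followed by a first derivative over bands."""
    snv = (spectrum - spectrum.mean()) / spectrum.std()
    return np.gradient(snv)

def spectral_features(spectrum: np.ndarray, selected_bands: np.ndarray) -> np.ndarray:
    """Keep only the bands at the characteristic wavelengths chosen by CARS."""
    return snv_first_derivative(spectrum)[selected_bands]
```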
S3: fusing the color feature, the morphological feature, the texture feature and the first spectral feature to obtain a first fused feature; fusing the color feature, the morphological feature, the texture feature and the second spectral feature to obtain a second fused feature;
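The text does not spell out the fusion operator; the sketch below assumes a plain concatenation of the flattened image and spectral feature vectors into one fused feature vector.

```python
import numpy as np

def fuse_features(color, morphology, texture, spectral) -> np.ndarray:
    """Concatenate all feature groups into one fused feature vector."""
    parts = [np.asarray(color, dtype=float).ravel(),
             np.asarray(list(morphology.values()), dtype=float).ravel(),
             np.asarray(list(texture.values()), dtype=float).ravel(),
             np.asarray(spectral, dtype=float).ravel()]
    return np.concatenate(parts)
```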
S4: calculating the chlorophyll content of the cotton leaf to be detected by using the first prediction model with the first fused features as input; and calculating the nitrogen content of the cotton leaf to be detected by using the second prediction model with the second fused features as input.
In this embodiment, the first prediction model and the second prediction model both use a deep convolutional neural network to calculate the chlorophyll and nitrogen contents. As shown in fig. 4, the deep convolutional neural network comprises an input layer, a feature-extraction layer, a fully connected network layer and an output layer connected in sequence. The feature-extraction layer comprises several convolution blocks connected in sequence; the number of convolution blocks may be 3, and each convolution block comprises a convolution layer, a ReLU activation function and a max-pooling layer, where the convolution kernel size is 3 x 3 with a stride of 1 and the max-pooling window is 3 x 3 with a stride of 2. The fully connected network layer comprises several fully connected layers connected in sequence; the number of fully connected layers may be 2, with 128 and 32 neurons respectively.
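A PyTorch sketch of such a network is given below. Because the fused features form a one-dimensional vector, 1-D convolutions are used here, and the channel counts and the single regression output are assumptions; the text only fixes the kernel and pooling sizes, strides, and fully connected layer widths.

```python
import torch
import torch.nn as nn

class ContentPredictor(nn.Module):
    """Three conv blocks (width-3 conv, stride 1; ReLU; width-3 max-pool, stride 2),
    then fully connected layers of 128 and 32 neurons and one regression output."""

    def __init__(self, n_features: int, channels=(16, 32, 64)):
        super().__init__()
        blocks, in_ch = [], 1
        for out_ch in channels:                       # assumed channel counts
            blocks += [nn.Conv1d(in_ch, out_ch, kernel_size=3, stride=1, padding=1),
                       nn.ReLU(),
                       nn.MaxPool1d(kernel_size=3, stride=2, padding=1)]
            in_ch = out_ch
        self.features = nn.Sequential(*blocks)
        with torch.no_grad():                         # infer flattened size
            flat = self.features(torch.zeros(1, 1, n_features)).numel()
        self.head = nn.Sequential(nn.Flatten(),
                                  nn.Linear(flat, 128), nn.ReLU(),
                                  nn.Linear(128, 32), nn.ReLU(),
                                  nn.Linear(32, 1))   # chlorophyll or nitrogen content

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_features) fused feature vectors
        return self.head(self.features(x.unsqueeze(1)))
```

In this sketch one network instance would be trained on the first training data set for chlorophyll content and a second instance on the second training data set for nitrogen content.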
The deep convolutional neural network is trained with the first training data set to learn the mapping between the first fused features and the chlorophyll content, yielding the first prediction model, and with the second training data set to learn the mapping between the second fused features and the nitrogen content, yielding the second prediction model. Both prediction models are integrated in the control chip 3. The first fused features obtained in S3 are used as the input of the first prediction model, the second fused features as the input of the second prediction model, and the chlorophyll and nitrogen contents are calculated and output by the prediction models solidified in the control chip 3.
The control chip 3 of this embodiment puts the acquired spectral data and the RGB image into pixel-level correspondence, thereby aligning the image information with the spectral information and synthesizing a multi-channel spectral dot-matrix image. It processes the data with the spectral-image processing algorithm integrated in the chip, calculates the leaf phenotype indices (including the morphological, color, texture and spectral features), and, based on the prediction models integrated in the chip, predicts the chlorophyll content and nitrogen content with the fused spectral and image features as input. After the calculation is completed, the control chip 3 packs and stores the data of the single measurement, and an external device can access the calculated results through the USB output interface 14.
In the description, each embodiment is mainly described as different from other embodiments, and the same and similar parts among the embodiments are referred to each other.
The principles and embodiments of the present invention have been described herein using specific examples, which are provided only to help understand the method and the core concept of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, the specific embodiments and the application range may be changed. In view of the foregoing, the description is not to be taken in a limiting sense.

Claims (10)

1. A detection instrument for detecting the chlorophyll and nitrogen contents of cotton leaves, characterized by comprising: a light-shielding shell, a control chip, and a reflectivity plate, a light source, an RGB camera and a spectral probe located inside the light-shielding shell; the reflectivity plate is located on the bottom surface of the light-shielding shell; the control chip is in communication connection with the RGB camera and the spectral probe, respectively;
the cotton leaf to be detected is placed on the reflectivity plate; the light source is used for illuminating the cotton leaf to be detected; the RGB camera is used for acquiring an RGB image of the cotton leaf to be detected under illumination from the light source; the spectral probe is used for acquiring the spectral data of the cotton leaf to be detected under the same illumination;
and the control chip is used for calculating the chlorophyll content and the nitrogen content of the cotton leaf to be detected based on the RGB image and the spectral data.
2. The detection instrument according to claim 1, wherein a bottom sample-holding drawer is arranged at the bottom of the inner side of the light-shielding shell; the bottom sample-holding drawer is slidably connected with the light-shielding shell; the bottom surface of the bottom sample-holding drawer is black, and the reflectivity plate is located on the bottom surface of the bottom sample-holding drawer.
3. The detection instrument of claim 1, further comprising a moving component positioned inside the light-shielding shell; the RGB camera and the spectral probe are mounted on the moving component; the moving component is used for driving the RGB camera and the spectral probe to move horizontally and to be raised or lowered.
4. The detection instrument of claim 3, wherein the moving component comprises a driving part and a control rod; the driving part is in driving connection with the control rod; the control rod comprises a horizontal axis and a longitudinal axis; the RGB camera and the spectral probe are mounted on the longitudinal axis.
5. The detection instrument of claim 1, further comprising a fan positioned inside the light-shielding shell; the fan is used for heat dissipation.
6. The detection instrument of claim 1, further comprising a power supply; the power supply is used for supplying power to the light source, the RGB camera and the spectral probe.
7. A method for detecting the chlorophyll and nitrogen contents of cotton leaves, which controls the operation of the detection instrument of any one of claims 1-6, characterized in that the method comprises the following steps:
receiving the RGB image acquired by the RGB camera and the spectral data acquired by the spectral probe;
extracting color features, morphological features and texture features of the RGB image, and extracting first spectral features and second spectral features of the spectral data;
fusing the color feature, the morphological feature, the texture feature and the first spectral feature to obtain a first fused feature; fusing the color feature, the morphological feature, the texture feature and the second spectral feature to obtain a second fused feature;
calculating the chlorophyll content of the cotton leaf to be detected by using a first prediction model with the first fused features as input; and calculating the nitrogen content of the cotton leaf to be detected by using a second prediction model with the second fused features as input.
8. The method according to claim 7, wherein the extracting color, morphological and texture features of the RGB image specifically comprises:
carrying out threshold segmentation on the RGB image to obtain a binary image; the binary image comprises leaf pixel points and background pixel points;
respectively converting the RGB image into an HSV color space and an LAB color space, and determining R, G, B, H, S, V, L, A, B component values of the pixel points of each leaf; respectively calculating the average value of each component according to the R, G, B, H, S, V, L, A, B component values of all the leaf pixel points to obtain the color characteristics of the RGB image; the color feature comprises an average of R, G, B, H, S, V, L, A, B components;
calculating morphological features of the RGB image based on the number of the leaf pixel points; the morphological features comprise the area, the perimeter, the width, the height, the MajorAxisLength, the MinorAxisLength, the Eccentricity and the EquivDiameter of the leaf;
calculating a gray level co-occurrence matrix of the binary image, and calculating texture characteristics of the RGB image based on the gray level co-occurrence matrix; the texture features include maximum probability, correlation, contrast, energy, homogeneity and entropy.
9. The detection method according to claim 7, wherein the extracting of the first and second spectral features of the spectral data specifically comprises:
performing pixel level correspondence on the RGB image and the spectral data to obtain spectral data of each pixel point in the RGB image;
carrying out threshold segmentation on the RGB image to obtain a binary image; the binary image comprises leaf pixel points and background pixel points;
calculating according to the spectrum data of all the leaf pixel points to obtain a leaf average spectrum, selecting a leaf average spectrum corresponding to a first characteristic wavelength as a first spectrum characteristic of the spectrum data, and selecting a leaf average spectrum corresponding to a second characteristic wavelength as a second spectrum characteristic of the spectrum data.
10. The detection method according to claim 7, wherein the first prediction model and the second prediction model both adopt a deep convolutional neural network model; the deep convolutional neural network model comprises an input layer, a feature extraction layer, a full connection network layer and an output layer which are connected in sequence;
the feature extraction layer comprises a plurality of volume blocks which are connected in sequence; each of the volume blocks includes a convolution layer, a ReLU activation function and a max pooling layer; the fully-connected network layer comprises a plurality of fully-connected layers which are connected in sequence.
CN202210428296.1A 2022-04-22 2022-04-22 Detection instrument and detection method for chlorophyll and nitrogen contents of cotton leaves Pending CN114755228A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210428296.1A CN114755228A (en) 2022-04-22 2022-04-22 Detection instrument and detection method for chlorophyll and nitrogen contents of cotton leaves

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210428296.1A CN114755228A (en) 2022-04-22 2022-04-22 Detection instrument and detection method for chlorophyll and nitrogen contents of cotton leaves

Publications (1)

Publication Number Publication Date
CN114755228A true CN114755228A (en) 2022-07-15

Family

ID=82331187

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210428296.1A Pending CN114755228A (en) 2022-04-22 2022-04-22 Detection instrument and detection method for chlorophyll and nitrogen contents of cotton leaves

Country Status (1)

Country Link
CN (1) CN114755228A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination