CN111428990A - Deep neural network-based method for evaluating flower grade of water-cultured flowers in flowering period - Google Patents

Deep neural network-based method for evaluating flower grade of water-cultured flowers in flowering period

Info

Publication number
CN111428990A
Authority
CN
China
Prior art keywords
flower
flowers
plant
neural network
height
Prior art date: 2020-03-20
Legal status
Pending
Application number
CN202010202571.9A
Other languages
Chinese (zh)
Inventor
陈垣毅
郑增威
闫鹏全
Current Assignee
Hongfujin Precision Industry Shenzhen Co Ltd
Zhejiang University City College ZUCC
Original Assignee
Hongfujin Precision Industry Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Hongfujin Precision Industry Shenzhen Co Ltd
Priority to CN202010202571.9A
Publication of CN111428990A
Legal status: Pending (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02 Agriculture; Fishing; Forestry; Mining
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Health & Medical Sciences (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Operations Research (AREA)
  • Computational Linguistics (AREA)
  • Quality & Reliability (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Game Theory and Decision Science (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Image Processing (AREA)
  • Biomedical Technology (AREA)
  • General Engineering & Computer Science (AREA)
  • Molecular Biology (AREA)
  • Agronomy & Crop Science (AREA)
  • Animal Husbandry (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Mining & Mineral Resources (AREA)
  • Primary Health Care (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a deep neural network-based method for evaluating the flower grade of hydroponic flowers in the flowering period, which comprises the following steps: designing a hardware system and configuring the camera positions; processing the proportional relation; identifying and segmenting top-view and side-view images of the flowers with a region-based mask convolutional neural network to obtain a contour map and a crown block map of the plant; and calculating the values of the four indexes of plant height, crown diameter, flower coverage and flower uniformity from the obtained contour map and the corresponding pixel points. The invention has the beneficial effects that: compared with the traditional approach of relying on manual judgment, the deep neural network-based flower grading system performs automatic judgment and evaluation by computer, greatly improving grading efficiency while reducing the variation introduced by manual labor, and it is broadly applicable. Meanwhile, the repeated handling of the flowers required by traditional grading evaluation is avoided, greatly reducing the possibility of damage to the flowers.

Description

Deep neural network-based method for evaluating flower grade of water-cultured flowers in flowering period
Technical Field
The invention relates to the field of flower grade evaluation, in particular to a deep neural network-based method for evaluating flower grade of water-cultured flowers in a flowering period.
Background
In recent years, with the rapid development of China's economy and the continuous improvement of living standards, people's demand for diversity in daily life and for green, environmentally friendly products has grown steadily, and hydroponic flowers, as a novel category of ornamental plant, attract increasing attention and favor. China is a major producer of flowers but not yet a strong one; its products remain at the low end of the value chain.
In general, flower production can be divided into two stages, the first being plantlet cultivation and the second being cultivation of the plants to the finished stage. The specific process flow of cultivation comprises greenhouse cultivation, cultivation of medium and large seedlings, and grading and packaging of finished products. In this process the flowers need to be graded several times. Grading the plantlets groups plants of similar vigor together, which facilitates more rational watering and fertilizing strategies. Grading the medium seedlings prevents plants with larger crowns from blocking wind and light. Grading the finished flowers before marketing sorts out products that do not meet the standard and distinguishes the grades, ensuring flower quality. Most flower grading in China is done manually; the industry standards lag behind and the level of automation is low, which reduces flower quality and added value. The flower industry of developed countries makes wide use of automated production equipment, which has greatly improved production efficiency and the quality of finished flowers. However, directly introducing foreign grading equipment is expensive and its grading standards differ from China's, so it does not meet Chinese flower grading requirements; a Chinese flower grading system therefore needs to be developed.
Disclosure of Invention
The invention aims to overcome the defects in the prior art and provides a deep neural network-based method for evaluating the flower grade of hydroponic flowers in the flowering period.
The method for evaluating the flower grade of the hydroponic flowers in the flowering period based on the deep neural network comprises the following steps:
step 1, designing a hardware system and configuring the position of a camera; placing a camera at the top of the flower, placing a plurality of cameras on the side surface, and shooting pictures of flower plants; the hardware system is used for detection based on a deep neural network;
step 2, processing the proportional relation: after the camera positions are fixed, a ratio is obtained by measuring the size of a real object and counting the corresponding pixel points; this ratio maps pixel sizes in the captured picture to the real sizes of the flowers (a sketch of this calibration is given after the step list);
step 3, identifying and segmenting images of the top view and the side view of the flower by using a mask convolutional neural network based on the region to obtain a contour map and a crown block map of the plant;
step 4, calculating the numerical values of four indexes of plant height, crown diameter, flower coverage and flower uniformity through the obtained contour map and corresponding pixel points;
step 5, grading according to the numerical values of the plant height, the crown diameter, the flower coverage and the flower uniformity, and obtaining the grades of the plant height, the crown diameter, the flower coverage and the flower uniformity according to the national standard;
and 6, comprehensively evaluating to obtain the comprehensive rating of the flowers in the flowering period according to the evaluation grades of the height of the flower plants, the crown diameter, the coverage degree and the uniformity degree of the flowers and by combining two factors of whether the growth of the flowers is inclined and whether the shape of the crown is in accordance with the specification.
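As an illustrative sketch of the proportional relation in step 2 (this code is not part of the original disclosure; the reference object, its size and all names are assumptions), a fixed camera can be calibrated once against an object of known real size:

```python
def pixel_scale_cm(real_size_cm: float, size_in_pixels: float) -> float:
    """Centimetres represented by one pixel at the object plane, obtained
    from a reference object of known real size and its pixel extent."""
    return real_size_cm / size_in_pixels

# Illustration: a 20 cm reference bar spanning 400 px gives 0.05 cm/px,
# so a plant feature spanning 360 px is estimated at 18 cm.
scale = pixel_scale_cm(20.0, 400.0)
estimated_cm = 360 * scale
```

Because the cameras are fixed, this calibration is done once per camera pose and reused for every subsequent photograph.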
Preferably, the step 3 specifically includes the following steps:
step 3.1, constructing a mask convolutional neural network based on an area, wherein a framework of the mask convolutional neural network is added with a fully-connected segmentation sub-network on the basis of a basic area convolutional neural network; the region-based mask convolutional neural network comprises a total of three tasks: classification, regression, and segmentation;
and 3.2, inputting the picture into a trained mask convolutional neural network model based on the region, and identifying the outline of the input flower picture.
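As a minimal sketch of step 3.2 (an assumption-laden illustration, not the patent's own implementation: torchvision's Mask R-CNN stands in for the region-based mask convolutional neural network, and the file name, score threshold and class handling are invented for the example), inference on a flower photograph could look like this:

```python
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor
from PIL import Image

# A pretrained Mask R-CNN; in practice it would be fine-tuned on flower
# images so that the classes cover e.g. flower, crown and plant contours.
model = maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = to_tensor(Image.open("top_view.jpg").convert("RGB"))
with torch.no_grad():
    pred = model([image])[0]  # classification, regression and segmentation

keep = pred["scores"] > 0.5            # keep confident detections only
masks = pred["masks"][keep, 0] > 0.5   # (N, H, W) boolean instance masks
labels = pred["labels"][keep]          # class index per instance
```

The boolean masks produced here are the contour and crown maps that the index calculations in step 4 consume.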
Preferably, the step 4 of calculating the plant height, the crown diameter, the flower coverage and the flower uniformity comprises the following steps:
step 4.1, calculating the height of the plant:
selecting a standard plant, firstly manually measuring the real height of the standard plant, and then measuring the pixel height of the standard plant by using an image processing algorithm; the height of the plant is the height difference from the upper edge of the culture dish to the highest point of the plant; after the standard plant image is processed, the image height is set as R1; the vertical coordinate of the standard plant, obtained by traversal, is R2, where R2 is the mean value of the plant vertical coordinates over all traversed images; the true height of the standard plant is F1; the height conversion ratio T1 is calculated as:
T1 = F1 / (R1 - R2)    (1)
After the image of the plant to be measured is processed, its height vertical coordinate R3 is obtained by traversal; the real height of the plant to be measured is calculated through the height conversion ratio:
H = (R1 - R3) · T1    (2)
step 4.2, calculating the crown diameter:
the crown diameter is the difference between the maximum value and the minimum value of the average crown diameter; the bias angle of the camera must be taken into account when calculating the crown diameter, and the calculation formula is:
[Formula (3) is reproduced only as an image in the original publication.]
in the above formula, D is the crown diameter, L1 is the length of the circumscribed rectangle of the standard plant, W1 is the width of the circumscribed rectangle of the standard plant, L2 is the length of the circumscribed rectangle of the target plant, W2 is the width of the circumscribed rectangle of the target plant, S is the height of the camera, F1 is the height of the standard plant, and F2 is the crown diameter of the standard plant;
step 4.3, calculating the coverage:
the coverage degree is the density of the flowers within the crown; it is measured as the ratio of the pixel counts in the flower and crown segmentation maps; with the crown area set as M1 and the flower area in the map as M2, the coverage of the flowers is:
coverage = M2 / M1    (4)
step 4.4, calculating the distribution uniformity of the flowers:
dividing the flower edge detection result into a plurality of regions, obtaining the coverage of each region from the ratio of flower pixels to crown pixels in that region, and estimating the evenness of the flower distribution from the standard deviation of the per-region coverage; for the divided regions, the pixels of each region are counted; let the number of flower pixels in region i be Pi, where i ∈ {1,2,3,4,5,6,7,8,9}, and the number of crown pixels in that region be Qi; the coverage Ii of the flowers in each region is calculated as:
Ii = Pi / Qi    (5)
the standard deviation σ over the regions is expressed as follows:
σ = sqrt( (1/9) · Σ (Ii - V)² )    (6)
where the sum runs over the nine regions.
in the above formula, V represents an average value of the coverage.
Preferably, the loss function adopted by the mask convolutional neural network based on the region in the step 3.1 is as follows:
L = Lcls + Lbox + Lmask    (7)
in the above formula, Lcls is the classification loss, Lbox is the bounding-box regression loss, and Lmask is the per-pixel mask classification loss; Lmask has an output of dimension K · m · m, where K is the number of classes and m is the side length of the extracted ROI image; Lmask is calculated using an average binary cross-entropy loss function.
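A simplified PyTorch sketch of formula (7) follows (an illustration, not the patented training code: real Mask R-CNN evaluates Lmask only on the mask channel of the ground-truth class, and that selection is assumed to have already happened here):

```python
import torch
import torch.nn.functional as F

def detection_loss(cls_logits, cls_targets,
                   box_preds, box_targets,
                   mask_logits, mask_targets):
    l_cls = F.cross_entropy(cls_logits, cls_targets)        # Lcls
    l_box = F.smooth_l1_loss(box_preds, box_targets)        # Lbox
    # Lmask: average binary cross-entropy over the m x m ROI mask.
    l_mask = F.binary_cross_entropy_with_logits(mask_logits, mask_targets)
    return l_cls + l_box + l_mask                           # L, formula (7)
```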
Preferably, in the step 3, the image segmentation method includes:
1) preprocessing the picture, and adjusting the size of the picture to be matched with the size of the pixel input by the network;
2) performing convolution operations on the picture with different convolution kernels through a plurality of convolution layers, and extracting features of different levels in the picture;
3) generating picture candidate regions with a region proposal network, and outputting a mask for each candidate region;
4) detecting the position of a target in the candidate area, and dividing the target; fusing target detection and semantic segmentation.
Preferably, in step 4.4 the flower edge detection result is divided into a plurality of regions arranged as a nine-square (3 × 3) grid, which divides the crown well at moderate computational complexity.
Preferably, the step 6 specifically includes the following steps:
step 6.1, when the growth of the flowers is inclined, considering the correction of the grading of the uniformity degree of the distribution of the flowers, and degrading the grading of the uniformity degree of the distribution of the flowers;
step 6.2, after the four indexes of the height, the crown diameter, the coverage and the uniformity of flowers of the hydroponic flowers are evaluated, the final grade evaluation of the flowers is realized according to the comprehensive grading obedience principle of the national grading standard: when all the single-item grade evaluations are positioned at the same grade, the comprehensive grade is judged to be the grade; if all the single-item grade evaluations are in different grades, the comprehensive grade is judged to be a lower grade, and the final grade of the flowers is the same as the lowest grade in the single-item grade evaluations;
and 6.3, degrading the comprehensive rating when the shape of the crown does not meet the specification.
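The step-6 rules reduce to a few lines of code; the sketch below assumes an integer grade encoding (1 best, larger numbers worse) and dictionary keys that are invented for the illustration:

```python
def comprehensive_grade(grades: dict, tilted: bool, crown_ok: bool,
                        worst: int = 4) -> int:
    """Apply the step-6 rules: downgrade uniformity when growth is tilted
    (6.1), take the lowest single grade (6.2), and downgrade the result
    when the crown shape is off-spec (6.3)."""
    g = dict(grades)
    if tilted:
        g["uniformity"] = min(g["uniformity"] + 1, worst)
    overall = max(g.values())  # the lowest single grade dominates
    if not crown_ok:
        overall = min(overall + 1, worst)
    return overall

# Example: all indexes grade 1 except crown diameter at 2; a tilted plant
# drops uniformity to 2, so the comprehensive grade is 2.
grade = comprehensive_grade(
    {"height": 1, "crown": 2, "coverage": 1, "uniformity": 1},
    tilted=True, crown_ok=True)
```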
The invention has the beneficial effects that: compared with the traditional approach of relying on manual judgment, the deep neural network-based flower grading system performs automatic judgment and evaluation by computer, greatly improving grading efficiency while reducing the variation introduced by manual labor, and it is broadly applicable. Meanwhile, the repeated handling of the flowers required by traditional grading evaluation is avoided, greatly reducing the possibility of damage to the flowers.
Drawings
FIG. 1 is a flow chart of the method for evaluating the flower grade of hydroponic flowers in the flowering period according to an embodiment of the present invention;
FIG. 2 is a flowchart of image segmentation with the region-based mask convolutional neural network model according to an embodiment of the present invention;
FIG. 3 is a flow chart of the calculation of the four attributes of the flower according to an embodiment of the present invention;
fig. 4 is a flow chart of the comprehensive flower rating according to an embodiment of the present invention.
Detailed Description
The present invention will be further described with reference to the following examples. The following examples are set forth merely to aid in the understanding of the invention. It should be noted that, for a person skilled in the art, several modifications can be made to the invention without departing from its principle, and these modifications and variations also fall within the protection scope of the claims of the present invention.
According to the method for evaluating the grade of the flowers in the flowering period of the hydroponic flowers based on the deep neural network, during the flowering period of the hydroponic flowers, the side view and the top view of pot flowers are collected, the height, the crown diameter, the flower coverage and the flower uniformity of plants are measured by using the deep neural network, and the grade evaluation of the flowering period of the hydroponic flowers is obtained by integrating the measurement results.
In the growth of flowers, the flowers are graded for many times, and the grading of seedlings can concentrate plants with the same growth vigor, thereby being beneficial to implementing a more reasonable strategy when watering and fertilizing. The grading of the middle seedlings can avoid the wind shielding and shading of plants with larger crown. Grading the finished flowers to be marketed, sorting the products which do not meet the standard, and distinguishing the grades to ensure the quality of the flowers.
In one embodiment, as shown in fig. 1, an automated flower grading method based on a deep neural network implements the following steps:
s102, shooting a flower plant photo by using a camera:
the photographs of the flower plant are divided into two types, one is a photograph of a top view, and the other is a photograph of a side view. The picture of the top view is used for judging the crown, the coverage degree and the uniformity degree of the flower. The photograph of the side view is used for the determination of the plant height and whether a plant growth dip has occurred. The time of shooting by the camera can be in seconds when network training is carried out, and is generally in hours after the network training is finished.
S104, carrying out contour segmentation on the shot picture by using a mask convolution neural network based on the region:
the segmentation of the image is carried out by using a deep neural network technology (trained mask convolutional neural network based on regions), and pixels representing the flower, the outline of the crown and the outline of the plant are extracted.
S106, calculating the numerical values of the four indexes of the plant height, the crown diameter, the flower coverage and the flower uniformity according to the obtained flower contour:
specifically, corresponding calculation modes are provided among the four attributes of the pixels of the outline picture, the real plant height, the crown diameter, the coverage degree and the flower uniformity degree, and corresponding attribute values are calculated by solving a formula in a physical mode.
S108, grading the flower quality according to the obtained flower indexes:
the rating standard adopts national uniform flower rating standard, and if the numerical value of each attribute reaches the corresponding level, the attribute is judged to be the level of the attribute.
S110, carrying out comprehensive grading on flowers according to a single flower grading result:
specifically, when all the individual evaluations are at the same level, the flower is judged to be at the level, and if the flower is at a different level, the flower is judged to be at a lower level, that is, the final level of the flower is the same as the lowest level in the individual evaluations. When the growth of the flowers is inclined, the uniform degree of the distribution of the flowers needs to be considered, if the uniform degree level of the distribution of the flowers is higher, the inclination degree is limited, otherwise, the comprehensive rating is degraded. The composite rating is also degraded when it occurs that the shape of the crown does not meet the specifications.
As shown in fig. 2, in an embodiment, the step of performing image segmentation on the mask convolutional neural network model based on the region specifically includes:
s202, preprocessing the picture:
specifically, the picture size is adjusted to the pixel size required for network input.
S204, extracting the features of the picture:
specifically, the method comprises the steps of performing convolution operation on pictures by using different convolution kernels through a plurality of convolution layers to extract features with different layer sizes,
s206, generating a candidate region:
specifically, the candidate area is generated using an area generation network.
S208, target detection of the candidate area is carried out:
specifically, the mask convolutional neural network based on regions comprises the following two stages of generating picture candidate regions, outputting masks of each candidate region, simultaneously detecting the position of a target and segmenting the target, and fusing target detection (framing the position of the target) and semantic segmentation.
As shown in FIG. 3, in one embodiment, the calculation of the four attributes of the flower implements the following steps:
s302, plant height: is defined as the height difference from the upper edge of the culture dish to the highest point of the plant; selecting a standard plant, firstly manually measuring the real height of the standard plant, and then measuring the pixel height of the standard plant by using an image processing algorithm. After the standard plant image is processed, the image height is set as R1Traversing the image to obtain the vertical coordinate of the standard plant as R2The true height of the standard plant is F1Calculating the height conversion ratio
Figure BDA0002419883650000061
After the plant image to be detected is processed, the height vertical coordinate of the plant image to be detected is obtained as R in a traversing manner3By high transformationCalculating the real height H ═ R (R) of the plant to be measured1-R3)T1The principle is shown in fig. 3.
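A sketch of this arithmetic under the definitions above (the boolean-mask representation and helper names are assumptions; R2 would in practice be averaged over all standard-plant images):

```python
import numpy as np

def plant_top_row(mask: np.ndarray) -> int:
    """Vertical coordinate (row index from the image top) of the highest
    plant pixel in a side-view segmentation mask."""
    return int(np.where(mask.any(axis=1))[0].min())

def height_ratio(r1: int, r2: float, f1: float) -> float:
    """T1 = F1 / (R1 - R2), formula (1)."""
    return f1 / (r1 - r2)

def real_height(r1: int, r3: float, t1: float) -> float:
    """H = (R1 - R3) * T1, formula (2)."""
    return (r1 - r3) * t1
```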
S304, crown diameter: defined as the difference between the maximum and minimum of the average crown diameter. The calculation is similar to the plant-height calculation in step 1), but since the top camera is not directly above the corolla of every flower, the bias angle of the camera must be taken into account when calculating the crown diameter. The calculation formula is:
[Formula (3) is reproduced only as an image in the original publication.]
where L1 and W1 are the length and width of the circumscribed rectangle of the standard plant, L2 and W2 are the length and width of the circumscribed rectangle of the target plant, S is the camera height, and F1 and F2 are the height and crown diameter of the standard plant; the principle is shown in fig. 4.
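Since formula (3) survives only as an image, no reconstruction of it is attempted here; the sketch below merely shows how the circumscribed-rectangle length and width that enter it (L and W, in pixels) can be read off a top-view crown mask (the boolean-mask representation is an assumption):

```python
import numpy as np

def circumscribed_rect(mask: np.ndarray) -> tuple:
    """Length (along columns) and width (along rows), in pixels, of the
    axis-aligned rectangle circumscribing a top-view crown mask."""
    rows = np.where(mask.any(axis=1))[0]
    cols = np.where(mask.any(axis=0))[0]
    length = int(cols.max() - cols.min() + 1)
    width = int(rows.max() - rows.min() + 1)
    return length, width
```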
S306, flower coverage: defined as the density of the flowers within the crown. It is measured as the ratio of the pixel counts in the flower and crown segmentation maps; with the crown area in the segmentation map set as M1 and the flower area as M2, the coverage of the flowers is
coverage = M2 / M1    (4)
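Formula (4) is a direct pixel count over the segmentation masks from step 3; a sketch (boolean masks assumed):

```python
import numpy as np

def flower_coverage(flower_mask: np.ndarray, crown_mask: np.ndarray) -> float:
    """Formula (4): flower pixel count M2 over crown pixel count M1."""
    m1 = int(crown_mask.sum())
    m2 = int(flower_mask.sum())
    return m2 / m1 if m1 else 0.0
```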
S308, flower distribution uniformity: the flower edge detection result is divided into a plurality of regions (many experiments found that a nine-square grid divides well at moderate computational complexity); the flower coverage of each region is obtained from the ratio of flower pixels to crown pixels in that region, and the evenness of the flower distribution is estimated from the standard deviation of the per-region coverage. For the divided regions, the pixels of each region are counted; let the number of flower pixels in region i be Pi, where i ∈ {1,2,3,4,5,6,7,8,9}, and the number of crown pixels in that region be Qi. The coverage Ii of the flowers in each region is calculated as
Ii = Pi / Qi    (5)
Further, the standard deviation σ over the regions is expressed as follows, where V represents the average value of the coverage:
σ = sqrt( (1/9) · Σ (Ii - V)² )    (6)
with the sum running over the nine regions.
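A sketch of formulas (5) and (6) over a nine-square grid (gridding the crown's bounding box, like the mask representation, is an assumption about details the text leaves open):

```python
import numpy as np

def uniformity_std(flower_mask: np.ndarray, crown_mask: np.ndarray) -> float:
    """Split the crown's bounding box into a 3 x 3 grid, compute each
    region's coverage Ii = Pi / Qi (formula (5)), and return the standard
    deviation over the nine regions (formula (6))."""
    rows = np.where(crown_mask.any(axis=1))[0]
    cols = np.where(crown_mask.any(axis=0))[0]
    r = np.linspace(rows.min(), rows.max() + 1, 4).astype(int)
    c = np.linspace(cols.min(), cols.max() + 1, 4).astype(int)
    coverage = []
    for i in range(3):
        for j in range(3):
            q = crown_mask[r[i]:r[i + 1], c[j]:c[j + 1]].sum()
            p = flower_mask[r[i]:r[i + 1], c[j]:c[j + 1]].sum()
            coverage.append(p / q if q else 0.0)
    return float(np.std(coverage))  # ddof=0 matches the 1/9 averaging
```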
for a specific determination method of the degree of uniformity, no specific parameters are given in the national standard. In the method, the standard deviation 10 is taken as a uniformity threshold value, the distribution is uniform when the standard deviation is smaller than the threshold value, and the distribution is non-uniform when the standard deviation is larger than the threshold value.
As shown in FIG. 4, in one embodiment, the comprehensive rating of flowers is implemented by the following steps:
s402, correcting the flower uniformity degree rating:
specifically, when the growth inclination of the flowers occurs, the grading of the uniformity degree of the flower distribution needs to be considered, and the grading is reduced by one step.
S404, carrying out comprehensive flower rating:
after the four indexes of the water culture flower grade are evaluated, the final flower grade evaluation is realized according to the comprehensive grading obeying principle of the national grading standard: and when all the single evaluations are in the same level, judging the level, and if the single evaluations are in different levels, judging the level to be a lower level, namely the final level of the flowers is the same as the lowest level in the single evaluations.
S406, correcting the comprehensive rating:
specifically, when the crown shape does not meet the specification, the overall rating is reduced by one step.
The deep neural network-based flower grading system for the flowering period of hydroponic flowers is rapid, accurate and contactless. By collecting side and top views of the potted flowers and applying a deep neural network algorithm, the plant height is measured in the side view; the crown diameter, the coverage and the flower uniformity are measured in the top view; whether the flowers are growing at an incline is detected; and whether the crown shape meets the specification is checked. Finished-product grading synthesizes the individual measurement results according to the requirements of the national finished-flower grading standard to reach a conclusion. Compared with the traditional approach of relying on manual judgment, the method performs automatic judgment and evaluation and is broadly applicable. Meanwhile, the repeated handling of the flowers required by traditional grading evaluation is avoided, greatly reducing the possibility of damage to the flowers.

Claims (7)

1. A deep neural network-based method for evaluating the flower grade of water-cultured flowers in the flowering period is characterized by comprising the following steps:
step 1, designing a hardware system and configuring the position of a camera; placing a camera at the top of the flower, placing a plurality of cameras on the side surface, and shooting pictures of flower plants; the hardware system is used for detection based on a deep neural network;
step 2, processing the proportional relation: after the camera positions are fixed, a ratio is obtained by measuring the size of a real object and counting the corresponding pixel points; this ratio maps pixel sizes in the captured picture to the real sizes of the flowers;
step 3, identifying and segmenting images of the top view and the side view of the flower by using a mask convolutional neural network based on the region to obtain a contour map and a crown block map of the plant;
step 4, calculating the numerical values of four indexes of plant height, crown diameter, flower coverage and flower uniformity through the obtained contour map and corresponding pixel points;
step 5, grading according to the numerical values of the plant height, the crown diameter, the flower coverage and the flower uniformity, and obtaining the grades of the plant height, the crown diameter, the flower coverage and the flower uniformity according to the national standard;
and 6, comprehensively evaluating to obtain the comprehensive rating of the flowers in the flowering period according to the evaluation grades of the height of the flower plants, the crown diameter, the coverage degree and the uniformity degree of the flowers and by combining two factors of whether the growth of the flowers is inclined and whether the shape of the crown is in accordance with the specification.
2. The deep neural network-based method for evaluating the flower grade of hydroponic flowers in the flowering period according to claim 1, wherein the step 3 specifically comprises the following steps:
step 3.1, constructing a mask convolutional neural network based on an area, wherein a framework of the mask convolutional neural network is added with a fully-connected segmentation sub-network on the basis of a basic area convolutional neural network; the region-based mask convolutional neural network comprises a total of three tasks: classification, regression, and segmentation;
and 3.2, inputting the picture into a trained mask convolutional neural network model based on the region, and identifying the outline of the input flower picture.
3. The deep neural network-based method for evaluating the flower grade of hydroponic flowers in the flowering period according to claim 1, wherein the step 4 of calculating the plant height, the crown diameter, the flower coverage and the flower uniformity specifically comprises the following steps:
step 4.1, calculating the height of the plant:
selecting a standard plant, firstly manually measuring the real height of the standard plant, and then measuring the pixel height of the standard plant by using an image processing algorithm; the height of the plant is the height difference from the upper edge of the culture dish to the highest point of the plant; after the standard plant image is processed, the image height is set as R1; the vertical coordinate of the standard plant, obtained by traversal, is R2, where R2 is the mean value of the plant vertical coordinates over all traversed images; the true height of the standard plant is F1; the height conversion ratio T1 is calculated as:
T1 = F1 / (R1 - R2)    (1)
After the image of the plant to be measured is processed, its height vertical coordinate R3 is obtained by traversal; the real height of the plant to be measured is calculated through the height conversion ratio:
H = (R1 - R3) · T1    (2)
step 4.2, calculating the crown diameter:
the crown diameter is the difference between the maximum value and the minimum value of the average crown diameter; the bias angle of the camera must be taken into account when calculating the crown diameter, and the calculation formula is:
[Formula (3) is reproduced only as an image in the original publication.]
in the above formula, D is the crown diameter, L1 is the length of the circumscribed rectangle of the standard plant, W1 is the width of the circumscribed rectangle of the standard plant, L2 is the length of the circumscribed rectangle of the target plant, W2 is the width of the circumscribed rectangle of the target plant, S is the height of the camera, F1 is the height of the standard plant, and F2 is the crown diameter of the standard plant;
step 4.3, calculating the coverage:
the coverage degree is the density of the flowers within the crown; it is measured as the ratio of the pixel counts in the flower and crown segmentation maps; with the crown area set as M1 and the flower area in the map as M2, the coverage of the flowers is:
coverage = M2 / M1    (4)
step 4.4, calculating the distribution uniformity of the flowers:
dividing the flower edge detection result into a plurality of regions, obtaining the coverage of each region from the ratio of flower pixels to crown pixels in that region, and estimating the evenness of the flower distribution from the standard deviation of the per-region coverage; for the divided regions, the pixels of each region are counted; let the number of flower pixels in region i be Pi, where i ∈ {1,2,3,4,5,6,7,8,9}, and the number of crown pixels in that region be Qi; the coverage Ii of the flowers in each region is calculated as:
Ii = Pi / Qi    (5)
the standard deviation σ over the regions is expressed as follows:
σ = sqrt( (1/9) · Σ (Ii - V)² )    (6)
where the sum runs over the nine regions.
in the above formula, V represents an average value of the coverage.
4. The deep neural network-based method for evaluating the flower grade of hydroponic flowers in the flowering period according to claim 2, wherein the loss function adopted by the region-based mask convolutional neural network in the step 3.1 is:
L = Lcls + Lbox + Lmask    (7)
in the above formula, Lcls is the classification loss, Lbox is the bounding-box regression loss, and Lmask is the per-pixel mask classification loss; Lmask has an output of dimension K · m · m, where K is the number of classes and m is the side length of the extracted ROI image; Lmask is calculated using an average binary cross-entropy loss function.
5. The deep neural network-based method for evaluating the flower grade of hydroponic flowers in the flowering period according to claim 2, wherein the image segmentation in the step 3 is performed by:
1) preprocessing the picture, and adjusting the size of the picture to be matched with the size of the pixel input by the network;
2) performing convolution operations on the picture with different convolution kernels through a plurality of convolution layers, and extracting features of different levels in the picture;
3) generating picture candidate regions with a region proposal network, and outputting a mask for each candidate region;
4) detecting the position of a target in the candidate area, and dividing the target; fusing target detection and semantic segmentation.
6. The deep neural network-based method for evaluating the flower grade of hydroponic flowers in the flowering period according to claim 3, wherein in step 4.4 the flower edge detection result is divided into a plurality of regions arranged as a nine-square (3 × 3) grid.
7. The deep neural network-based method for evaluating the flower grade of hydroponic flowers in the flowering period according to claim 1, wherein the step 6 specifically comprises the following steps:
step 6.1, when the growth of the flowers is inclined, considering the correction of the grading of the uniformity degree of the distribution of the flowers, and degrading the grading of the uniformity degree of the distribution of the flowers;
step 6.2, after the four indexes of the height, the crown diameter, the coverage and the uniformity of flowers of the hydroponic flowers are evaluated, the final grade evaluation of the flowers is realized according to the comprehensive grading obedience principle of the national grading standard: when all the single-item grade evaluations are positioned at the same grade, the comprehensive grade is judged to be the grade; if all the single-item grade evaluations are in different grades, the comprehensive grade is judged to be a lower grade, and the final grade of the flowers is the same as the lowest grade in the single-item grade evaluations;
and 6.3, degrading the comprehensive rating when the shape of the crown does not meet the specification.
Application CN202010202571.9A, filed 2020-03-20 (priority date 2020-03-20): Deep neural network-based method for evaluating flower grade of water-cultured flowers in flowering period. Status: Pending. Published as CN111428990A (en).

Priority Applications (1)

Application Number: CN202010202571.9A; Priority Date: 2020-03-20; Filing Date: 2020-03-20; Title: Deep neural network-based method for evaluating flower grade of water-cultured flowers in flowering period (published as CN111428990A)

Applications Claiming Priority (1)

Application Number: CN202010202571.9A; Priority Date: 2020-03-20; Filing Date: 2020-03-20; Title: Deep neural network-based method for evaluating flower grade of water-cultured flowers in flowering period (published as CN111428990A)

Publications (1)

Publication Number: CN111428990A; Publication Date: 2020-07-17

Family

ID=71548347

Family Applications (1)

Application Number: CN202010202571.9A; Title: Deep neural network-based method for evaluating flower grade of water-cultured flowers in flowering period; Priority Date: 2020-03-20; Filing Date: 2020-03-20; Status: Pending (CN111428990A)

Country Status (1)

Country: CN (1); Link: CN111428990A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102072885A (en) * 2010-12-06 2011-05-25 浙江大学 Machine vision-based paddy neck blast infection degree grading method
CN104162514A (en) * 2013-05-16 2014-11-26 北京京鹏环球科技股份有限公司 Image recognition based potted flower intelligent grading equipment and method
CN104318240A (en) * 2014-07-18 2015-01-28 天津理工大学 Flower bud discriminating method based on computer vision
US20180211393A1 (en) * 2017-01-24 2018-07-26 Beihang University Image guided video semantic object segmentation method and apparatus
CN109086826A (en) * 2018-08-06 2018-12-25 中国农业科学院农业资源与农业区划研究所 Wheat Drought recognition methods based on picture depth study

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
李想 (LI Xiang): "Research on a Flower Grading System Based on Computer Vision" (基于计算机视觉的花卉分级系统研究), China Master's Theses Full-text Database, Information Science and Technology (monthly) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111784688A (en) * 2020-07-24 2020-10-16 征图新视(江苏)科技股份有限公司 Flower automatic grading method based on deep learning
CN117726935A (en) * 2023-11-23 2024-03-19 盐城中科高通量计算研究院有限公司 Agricultural straw ground clearance recognition device and method based on machine vision

Similar Documents

Publication Title
CN109711325B (en) Mango picking point identification method
CN108363983B (en) Urban vegetation classification method based on unmanned aerial vehicle image and reconstructed point cloud
CN105718945B (en) Apple picking robot night image recognition method based on watershed and neural network
CN107609526A (en) Rule-based fine dimension city impervious surface rapid extracting method
CN112418188A (en) Crop growth whole-course digital assessment method based on unmanned aerial vehicle vision
Blok et al. Machine vision for a selective broccoli harvesting robot
CN108710766B (en) Greenhouse plant liquid manure machine fertilizer regulation parameter calculation method based on growth model
CN105335749B (en) Area limit line drawing method is not cut in meadow based on gray level co-occurrence matrixes
CN111428990A (en) Deep neural network-based method for evaluating flower grade of water-cultured flowers in flowering period
CN112614147B (en) Crop seedling stage plant density estimation method and system based on RGB image
Chang et al. Quantifying muskmelon fruit attributes with A-TEP-based model and machine vision measurement
CN114067207A (en) Vegetable seedling field weed detection method based on deep learning and image processing
CN112990103A (en) String mining secondary positioning method based on machine vision
CN112907545A (en) Method for detecting bud length and root length of seeds based on image processing
CN115049925A (en) Method for extracting field ridge, electronic device and storage medium
CN112131952A (en) Corn seedling stage plant number information extraction based on unmanned aerial vehicle remote sensing image
CN115497067A (en) Path identification and planning method for nursery patrol intelligent vehicle
CN115937151A (en) Method for judging curling degree of crop leaves
CN111291686A (en) Method and system for extracting crop root phenotype parameters and judging root phenotype
CN114387343A (en) Method for detecting picking position of mushroom stem of clustered oyster mushroom
CN111369497B (en) Walking type tree fruit continuous counting method and device
CN117152172A (en) Point cloud data-based power transmission line tower and power line extraction method
CN111401121A (en) Method for realizing citrus segmentation based on super-pixel feature extraction
CN115631136A (en) 3D point cloud image-based method for rapidly measuring phenotypic parameters of schima superba seedlings
CN115937687A (en) Remote sensing identification method for rice and shrimp co-culture planting and breeding mode

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination