CN113361642A - Fresh cut flower grading method, device and medium - Google Patents

Fresh cut flower grading method, device and medium

Info

Publication number
CN113361642A
CN113361642A (application CN202110753202.3A; granted as CN113361642B)
Authority
CN
China
Prior art keywords
flower
information
branch
image
leaf
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110753202.3A
Other languages
Chinese (zh)
Other versions
CN113361642B (en)
Inventor
景珑
武光
吕会甫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhongnongshichuang Beijing Environmental Engineering Technology Co ltd
Qijiu Horticultural Technology Beijing Co ltd
Original Assignee
Zhongnongshichuang Beijing Environmental Engineering Technology Co ltd
Qijiu Horticultural Technology Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhongnongshichuang Beijing Environmental Engineering Technology Co ltd, Qijiu Horticultural Technology Beijing Co ltd filed Critical Zhongnongshichuang Beijing Environmental Engineering Technology Co ltd
Priority to CN202110753202.3A priority Critical patent/CN113361642B/en
Publication of CN113361642A publication Critical patent/CN113361642A/en
Application granted granted Critical
Publication of CN113361642B publication Critical patent/CN113361642B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 — Pattern recognition
    • G06F18/20 — Analysing
    • G06F18/24 — Classification techniques
    • G06F18/243 — Classification techniques relating to the number of classes
    • G06F18/2431 — Multiple classes
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 — Pattern recognition
    • G06F18/20 — Analysing
    • G06F18/21 — Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 — Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/90 — Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The application relates to a fresh cut flower grading method, device and medium in the field of fresh flower grading. The method comprises: acquiring first image information and second image information corresponding to each fresh flower, wherein the first image information represents the image information of the flower part of each fresh flower and the second image information represents the image information of the branch and leaf part of each fresh flower; and estimating the grade of each fresh flower based on the first image information and second image information corresponding to it. The application has the effect of improving flower sorting efficiency.

Description

Fresh cut flower grading method, device and medium
Technical Field
The application relates to the field of fresh flower grading, in particular to a method, a device and a medium for grading fresh cut flowers.
Background
Harvested fresh flowers vary in quality and must therefore be sorted and classified according to specific criteria so that the product can be standardized and commercialized.
The selected fresh flowers are further classified according to the length of the flower buds, the quality and size of the flowers, the degree of openness, the number of flowers per inflorescence, the condition of the leaves, the quality of the variety, and the like.
Flower grading is currently carried out by manual screening and grading operations, so grading efficiency is low.
Disclosure of Invention
In order to improve the sorting efficiency of fresh flowers, the application provides a fresh cut flower grading method, a fresh cut flower grading device and a medium.
In a first aspect, the present application provides a method for grading fresh cut flowers, which adopts the following technical scheme:
a method for grading a fresh cut flower, comprising:
acquiring first image information and second image information which respectively correspond to each flower, wherein the first image information is used for representing the image information of flower parts respectively corresponding to the flowers, and the second image information is used for representing the image information of branch and leaf parts respectively corresponding to each flower;
and estimating grades respectively corresponding to the flowers based on the first image information and the second image information respectively corresponding to the flowers.
By adopting the above technical scheme, a fresh flower generally comprises a flower part and a branch and leaf part. When grading a flower, the flower part and the branch and leaf part can be graded separately, and the two part grades are then combined to grade the flower as a whole. The image information collected for the flower part is the first image information, and the image information collected for the branch and leaf part is the second image information. Grading whole flowers by an electronic device based on the first image information and the second image information is more efficient than manual screening and classification.
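The two-part grading flow described above can be sketched as follows. The scoring functions here only stand in for the trained network models, and the grade thresholds are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch of the two-part grading pipeline: score the flower
# part and the branch-and-leaf part separately, then combine the scores
# into an overall grade. All thresholds and score values are assumptions.

def score_flower_part(first_image):
    # Placeholder for the first network model (vividness, plumpness).
    return {"vividness": 5, "plumpness": 4}

def score_branch_leaf_part(second_image):
    # Placeholder for the second network model (curvature, freshness).
    return {"curvature": 3, "freshness": 2}

def grade_flower(first_image, second_image):
    flower = score_flower_part(first_image)
    branch = score_branch_leaf_part(second_image)
    total = sum(flower.values()) + sum(branch.values())
    # Map the combined score onto grades A/B/C (thresholds assumed).
    if total >= 16:
        return "A"
    if total >= 10:
        return "B"
    return "C"

print(grade_flower(None, None))  # 5 + 4 + 3 + 2 = 14 -> "B"
```

In practice the two placeholder functions would be replaced by forward passes through the trained first and second network models.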
In another possible implementation manner, the estimating, based on the first image information and the second image information respectively corresponding to each flower, a grade respectively corresponding to each flower includes:
inputting the first image information corresponding to each flower into the trained first network model to obtain flower vividness information and flower plumpness information corresponding to each flower;
inputting second image information corresponding to each flower into a trained second network model to obtain branch and leaf curvature information and branch and leaf freshness information corresponding to each flower;
and estimating the grade corresponding to each flower based on the flower vividness information and flower plumpness information corresponding to each flower and the branch and leaf curvature information and branch and leaf freshness information corresponding to each flower.
Through the above technical scheme, the first image information is input into the trained first network model to obtain the flower vividness information and flower plumpness information of the flower part, so that the flower part is graded more accurately through these two indicators; the second image information is input into the trained second network model to obtain the branch and leaf curvature information and branch and leaf freshness information of the branch and leaf part, so that the branch and leaf part is likewise graded more accurately. Estimating the grade of a fresh flower from the flower vividness, flower plumpness, branch and leaf curvature and branch and leaf freshness information makes the grading more accurate and closer to reality.
In another possible implementation manner, the inputting the first image information corresponding to each flower into the trained first network model to obtain the flower vividness information and the flower plumpness information corresponding to each flower further includes:
acquiring a flower image sample set, wherein the flower image sample set comprises vividness information and plumpness information which are respectively corresponding to each sample flower image used for training a first network model;
training the first network model based on the flower image sample set.
By adopting the above technical scheme, the flower image sample set is input into the first network model to train it. After the first image information is input into the trained first network model, flower vividness information and plumpness information that closely reflect the actual flower part can be obtained.
In another possible implementation manner, the obtaining a sample set of flower images includes:
performing segmentation processing on each sample flower image in the flower image sample set and generating a plurality of sub-sample flower images corresponding to each sample flower image;
extracting RGB values in the plurality of sub-sample flower images and determining an RGB average value;
comparing the RGB average value with a plurality of preset RGB intervals;
marking the vividness information of the sample flower corresponding to each sample flower image based on the color comparison result;
extracting pixel arrangement information of each sample flower image;
the pixel arrangement information is compared with a plurality of preset pixel arrangement intervals in an arrangement mode;
and marking the plumpness information of the sample flowers corresponding to each sample flower image based on the arrangement comparison result.
By adopting the above technical scheme, each sample flower image is divided into a plurality of sub-sample flower images, RGB values are extracted from the sub-sample images, and their average is taken; the RGB average represents the overall vividness of the flower part more accurately. The RGB average is compared with a plurality of preset RGB intervals, so that the vividness information of flower parts falling into different preset RGB intervals is labelled for training the first network model. The pixel arrangement information represents the plumpness of the flower part; comparing it with a plurality of preset pixel arrangement intervals determines the plumpness information of the sample flower part, and plumpness information in different preset pixel arrangement intervals is labelled to train the first network model. Training the first network model on the flower image sample set thereby improves its information processing effect.
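The vividness-labelling step above can be sketched as follows. The interval boundaries, labels, and the use of the red channel as the comparison key are assumptions for illustration; the patent only specifies comparing an RGB average against preset intervals.

```python
# Sketch of the vividness-labelling step: split a sample flower image into
# sub-images, average their RGB values, and map the average onto preset
# intervals. Interval boundaries and labels are assumed, not from the patent.

def mean_rgb(sub_images):
    """Average (R, G, B) over all pixels of all sub-sample images,
    each given here as a list of (r, g, b) pixel tuples."""
    pixels = [p for img in sub_images for p in img]
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def label_vividness(rgb_mean, intervals):
    """intervals: list of ((lo, hi), label) over the red-channel mean,
    standing in for the patent's preset RGB intervals."""
    r = rgb_mean[0]
    for (lo, hi), label in intervals:
        if lo <= r < hi:
            return label
    return "unlabelled"

sub_images = [[(200, 40, 60), (220, 50, 70)], [(210, 45, 65)]]
intervals = [((0, 128), "dull"), ((128, 256), "vivid")]
print(label_vividness(mean_rgb(sub_images), intervals))  # "vivid"
```

The plumpness labelling would follow the same pattern, with pixel arrangement statistics compared against preset pixel arrangement intervals instead of RGB intervals.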
In another possible implementation manner, the inputting the second image information corresponding to each flower into the trained second network model to obtain the branch and leaf curvature information and the branch and leaf freshness information corresponding to each flower further includes:
acquiring a branch and leaf image sample set, wherein the branch and leaf image sample set comprises curvature information and freshness information which respectively correspond to each sample branch and leaf image used for training a second network model;
and training a second network model based on the branch and leaf image sample set.
By adopting the above technical scheme, the branch and leaf image sample set is input into the second network model to train it. After the second image information is input into the trained second network model, the model can output branch and leaf curvature information and branch and leaf freshness information that closely match the actual branch and leaf part.
In another possible implementation manner, the obtaining a sample set of branch and leaf images includes:
determining pixel arrangement information of each sample branch and leaf image in each branch and leaf sample image based on each branch and leaf sample image in the branch and leaf sample set;
extracting branch and stem pixel arrangement information from the pixel arrangement information;
comparing the branch and stem pixel arrangement information with a plurality of preset branch and stem pixel arrangement intervals;
marking the curvature information of each sample branch and leaf based on the branch and stem pixel arrangement comparison result;
extracting blade pixel arrangement information from the pixel arrangement information;
comparing the blade pixel arrangement information with a plurality of preset blade pixel arrangement intervals;
and marking the freshness information of each sample branch and leaf based on the leaf pixel arrangement comparison result.
By adopting the above technical scheme, the branch and stem pixel arrangement information represents the curvature of the branch and stem. It is compared with a plurality of preset branch and stem pixel arrangement intervals, and branches and stems falling into different intervals are labelled differently. The leaf pixel arrangement information represents the freshness of the leaves; it is compared with a plurality of preset leaf pixel arrangement intervals, and leaves falling into different intervals are labelled differently. The branch and leaf sample image set labelled in this way allows the second network model to be trained more effectively.
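One plausible reading of "branch and stem pixel arrangement" is the column position of the stem in each image row, with curvature measured as the spread of those columns. That interpretation, and the thresholds below, are assumptions; the patent does not define the arrangement metric.

```python
# Sketch of the curvature-labelling step: track the mean stem column per
# image row and label the stem by the spread of those columns.
# The arrangement metric and thresholds are illustrative assumptions.

def stem_columns(mask):
    """mask: 2D list of 0/1 values, one inner list per image row;
    returns the mean column index of stem pixels in each row."""
    cols = []
    for row in mask:
        idx = [i for i, v in enumerate(row) if v]
        if idx:
            cols.append(sum(idx) / len(idx))
    return cols

def label_curvature(cols, thresholds=((1.0, "straight"), (3.0, "slightly bent"))):
    # Larger column spread across rows means a more curved stem.
    spread = max(cols) - min(cols)
    for limit, label in thresholds:
        if spread <= limit:
            return label
    return "bent"

straight = [[0, 1, 0], [0, 1, 0], [0, 1, 0]]
bent = [[1, 0, 0, 0, 0], [0, 0, 1, 0, 0], [0, 0, 0, 0, 1]]
print(label_curvature(stem_columns(straight)))  # "straight"
print(label_curvature(stem_columns(bent)))      # "bent"
```

Leaf freshness labelling would compare leaf pixel arrangement statistics against preset leaf pixel arrangement intervals in the same way.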
In another possible implementation manner, the estimating, based on the flower vividness information and the flower plumpness information respectively corresponding to each flower and the branch and leaf curvature information and the branch and leaf freshness information respectively corresponding to each flower, a grade respectively corresponding to each flower includes:
determining flower grades corresponding to the flowers respectively based on the flower vividness information and the flower plumpness information corresponding to the flowers respectively;
determining the branch and leaf grade corresponding to each flower according to the branch and leaf curvature information and the branch and leaf freshness information corresponding to each flower;
and determining grades respectively corresponding to the flowers based on the grades respectively corresponding to the flowers and the grades respectively corresponding to the branches and leaves of the flowers.
By adopting the above technical scheme, after the flower grade and the branch and leaf grade of a fresh flower are determined, the overall grade is estimated by combining the two, which gives a better grading effect.
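The final combination step can be sketched as follows. Taking the worse of the two part grades is one plausible combination rule; the patent does not fix how the flower grade and branch and leaf grade are merged, so this rule is an assumption.

```python
# Sketch of the final combination step: merge the flower-part grade and
# the branch-and-leaf grade into one overall grade by taking the stricter
# (worse) of the two. The combination rule is an assumption.

GRADES = ["A", "B", "C"]  # ordered best to worst

def overall_grade(flower_grade, branch_leaf_grade):
    # The worse of the two part grades bounds the whole flower's grade.
    return max(flower_grade, branch_leaf_grade, key=GRADES.index)

print(overall_grade("A", "B"))  # "B"
print(overall_grade("C", "A"))  # "C"
```

A weighted-average rule over numeric part scores would be an equally valid alternative under the claim language.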
In a second aspect, the present application provides a fresh cut flower grading device, which adopts the following technical scheme:
a fresh cut flower grading device comprising:
the device comprises an acquisition module, a storage module and a processing module, wherein the acquisition module is used for acquiring first image information and second image information which respectively correspond to each flower, the first image information is used for representing the image information of flower parts which respectively correspond to the flowers, and the second image information is used for representing the image information of branch and leaf parts which respectively correspond to each flower;
and the estimation module is used for estimating the grade corresponding to each flower based on the first image information and the second image information corresponding to each flower.
By adopting the above technical scheme, a fresh flower generally comprises a flower part and a branch and leaf part, which can be graded separately and then combined to grade the flower as a whole. The image information the acquisition module collects for the flower part is the first image information, and the image information collected for the branch and leaf part is the second image information. The estimation module grades the whole fresh flower based on the first image information and the second image information, which is more efficient than manual screening and grading.
In another possible implementation manner, when the estimation module estimates the grades corresponding to the respective flowers based on the first image information and the second image information corresponding to the respective flowers, the estimation module is specifically configured to:
inputting the first image information corresponding to each flower into the trained first network model to obtain flower vividness information and flower plumpness information corresponding to each flower;
inputting second image information corresponding to each flower into a trained second network model to obtain branch and leaf curvature information and branch and leaf freshness information corresponding to each flower;
and estimating the grade corresponding to each flower based on the flower vividness information and flower plumpness information corresponding to each flower and the branch and leaf curvature information and branch and leaf freshness information corresponding to each flower.
In another possible implementation manner, the apparatus further includes:
the first obtaining module is used for obtaining a flower image sample set, and the flower image sample set comprises vividness information and plumpness information which respectively correspond to each sample flower image used for training a first network model;
and the first training module is used for training the first network model based on the flower image sample set.
In another possible implementation manner, when the first obtaining module obtains the flower image sample set, the first obtaining module is specifically configured to:
performing segmentation processing on each sample flower image in the flower image sample set and generating a plurality of sub-sample flower images corresponding to each sample flower image;
extracting RGB values in the plurality of sub-sample flower images and determining an RGB average value;
comparing the RGB average value with a plurality of preset RGB intervals;
marking the vividness information of the sample flower corresponding to each sample flower image based on the color comparison result;
extracting pixel arrangement information of each sample flower image;
the pixel arrangement information is compared with a plurality of preset pixel arrangement intervals in an arrangement mode;
and marking the plumpness information of the sample flowers corresponding to each sample flower image based on the arrangement comparison result.
In another possible implementation manner, the apparatus further includes:
the second acquisition module is used for acquiring a branch and leaf image sample set, and the branch and leaf image sample set comprises curvature information and freshness information which respectively correspond to each sample branch and leaf image used for training a second network model;
and the second training module is used for training a second network model based on the branch and leaf image sample set.
In another possible implementation manner, when the second obtaining module obtains the branch and leaf image sample set, the second obtaining module is specifically configured to:
determining pixel arrangement information of each sample branch and leaf image in each branch and leaf sample image based on each branch and leaf sample image in the branch and leaf sample set;
extracting branch and stem pixel arrangement information from the pixel arrangement information;
comparing the branch and stem pixel arrangement information with a plurality of preset branch and stem pixel arrangement intervals;
marking the curvature information of each sample branch and leaf based on the branch and stem pixel arrangement comparison result;
extracting blade pixel arrangement information from the pixel arrangement information;
comparing the blade pixel arrangement information with a plurality of preset blade pixel arrangement intervals;
and marking the freshness information of each sample branch and leaf based on the leaf pixel arrangement comparison result.
In another possible implementation manner, the estimating module is specifically configured to, when estimating the respective corresponding grades of each flower based on the respective flower vividness information and flower plumpness information, and the respective corresponding branch and leaf curvature information and branch and leaf freshness information of each flower:
determining flower grades corresponding to the flowers respectively based on the flower vividness information and the flower plumpness information corresponding to the flowers respectively;
determining the branch and leaf grade corresponding to each flower according to the branch and leaf curvature information and the branch and leaf freshness information corresponding to each flower;
and determining grades respectively corresponding to the flowers based on the grades respectively corresponding to the flowers and the grades respectively corresponding to the branches and leaves of the flowers.
In a third aspect, the present application provides an electronic device, which adopts the following technical solutions:
an electronic device, comprising:
one or more processors;
a memory;
one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs being configured to perform the fresh cut flower grading method according to any possible implementation of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, which adopts the following technical solutions:
a computer-readable storage medium storing a computer program that can be loaded by a processor to implement the fresh cut flower grading method shown in any possible implementation of the first aspect.
In summary, the present application includes at least one of the following beneficial technical effects:
1. a fresh flower generally comprises a flower part and a branch and leaf part; when grading, the two parts can be graded separately and their grades combined to grade the flower as a whole. The image information collected for the flower part is the first image information and the image information collected for the branch and leaf part is the second image information; grading the whole flower by an electronic device based on the two is more efficient than manual screening;
2. the first image information is input into a trained first network model to obtain the flower vividness information and flower plumpness information of the flower part, so that the flower part is graded more accurately; the second image information is input into a trained second network model to obtain the branch and leaf curvature information and branch and leaf freshness information of the branch and leaf part, so that the branch and leaf part is graded more accurately. Estimating the grade of a fresh flower from these four indicators makes the grading more accurate and closer to reality.
Drawings
Fig. 1 is a schematic flow chart of a method for grading fresh cut flowers according to an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a fresh cut flower grading device according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The present application is described in further detail below with reference to the attached drawings.
After reading the present description, a person skilled in the art may modify the embodiments as required without making any inventive contribution; such modifications remain protected by patent law within the scope of the claims of the present application.
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In addition, the term "and/or" herein is only one kind of association relationship describing an associated object, and means that there may be three kinds of relationships, for example, a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship, unless otherwise specified.
The embodiments of the present application will be described in further detail with reference to the drawings attached hereto.
The embodiment of the application provides a fresh cut flower grading method executed by an electronic device. The electronic device may be a server or a terminal device; the server may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing cloud computing services. The terminal device may be a smart phone, a tablet computer, a notebook computer, a desktop computer, or the like, but is not limited thereto. The terminal device and the server may be connected directly or indirectly through wired or wireless communication, which the embodiment of the application does not limit. As shown in fig. 1, the method includes steps S101 and S102, wherein,
s101, acquiring first image information and second image information corresponding to each flower, wherein the first image information is used for representing the image information of flower parts corresponding to the flowers, and the second image information is used for representing the image information of branch and leaf parts corresponding to the flowers.
For the embodiment of the application, the electronic device may collect the first image information and the second image information corresponding to each flower with a single acquisition device such as a camera, may collect the whole image of each flower with one acquisition device and then divide it into the first image information and the second image information, or may collect the first and second image information with two separate acquisition devices; the options are not limited to these. Flowers generally consist of a flower part and a branch and leaf part, so screening the flower part and the branch and leaf part separately when classifying flowers is more accurate and practical.
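The single-camera variant described above, in which one whole-flower image is divided into the two image parts, can be sketched as follows. The fixed top/bottom split and the 0.4 ratio are assumptions for illustration; in practice the boundary would come from detection or segmentation.

```python
# Sketch of dividing one whole-flower image into first image information
# (flower part, top of frame) and second image information (branch and
# leaf part, bottom of frame). The fixed split ratio is an assumption.

def split_flower_image(image, flower_ratio=0.4):
    """image: 2D list of pixel rows; the top flower_ratio of the rows is
    returned as the flower part, the remaining rows as the branch part."""
    cut = int(len(image) * flower_ratio)
    return image[:cut], image[cut:]

whole = [["head"] * 4] * 4 + [["stem"] * 4] * 6
first, second = split_flower_image(whole)
print(len(first), len(second))  # 4 6
```

With two acquisition devices, each camera would instead produce one of the two image parts directly and this splitting step would be skipped.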
S102, estimating grades corresponding to the flowers based on the first image information and the second image information corresponding to the flowers.
For the embodiment of the application, after the electronic device collects the first image information and the second image information corresponding to each flower, it estimates the grade of the flower based on them. Compared with manual screening, this classification is more efficient and convenient.
In a possible implementation manner of the embodiment of the present application, the step S102 may specifically include a step S1021 (not shown in the figure), a step S1022 (not shown in the figure), and a step S1023 (not shown in the figure), wherein,
s1021, inputting the first image information corresponding to each flower into the trained first network model to obtain flower vividness information and flower plumpness information corresponding to each flower.
For the embodiment of the application, the first image information corresponding to each flower is input into the trained first network model, and the trained first network model outputs the flower vividness information and the flower plumpness information corresponding to each flower. For example, after the first image information of a flower is input into the first network model, the first network model outputs the flower vividness information of the flower as 5, and outputs the flower plumpness information of the flower as 4.
And S1022, inputting the second image information corresponding to each flower into the trained second network model to obtain the branch and leaf curvature information and branch and leaf freshness information corresponding to each flower.
For the embodiment of the application, the second image information corresponding to each fresh flower is input into the trained second network model, which outputs the branch and leaf curvature information and branch and leaf freshness information corresponding to each fresh flower. Continuing the example from step S1021, when the second image information of the flower is input into the second network model, it outputs the flower's branch and leaf curvature information as 3 and its branch and leaf freshness information as 2.
And S1023, estimating the grade corresponding to each flower based on the flower vividness information and flower plumpness information corresponding to each flower and the branch and leaf curvature information and branch and leaf freshness information corresponding to each flower.
In the present embodiment, taking steps S1021 and S1022 as examples, the flower vividness information is 5, the flower plumpness information is 4, the branch and leaf curvature information is 3, and the branch and leaf freshness information is 2. The electronic equipment estimates the grade of the flower based on the flower vividness information, the flower plumpness information, the branch and leaf curvature information and the branch and leaf freshness information of the flower.
In a possible implementation manner of the embodiment of the present application, before step S1021 the method further includes step S103 (not shown in the figure) and step S104 (not shown in the figure), wherein,
S103, obtaining a flower image sample set, wherein the flower image sample set comprises vividness information and plumpness information respectively corresponding to each sample flower image used for training the first network model.
For the embodiment of the present application, the first network model may be trained according to the flower image sample set and a loss function. For example, the loss function of the first network model is LossA = α·LossA1 + β·LossA2, where LossA1 is the loss function of the flower vividness identification task and LossA2 is the loss function of the flower plumpness identification task; α is the weight corresponding to the flower vividness identification task and may be 15, and β is the weight corresponding to the flower plumpness identification task and may be 10.
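The weighted two-task loss above can be sketched as follows; the function name and the plain numeric interface are illustrative assumptions rather than the patent's actual implementation, and the same helper also covers the second network model's loss LossB = a·LossB1 + b·LossB2 described later (weights 8 and 5).

```python
def weighted_multitask_loss(loss1, loss2, w1, w2):
    """Combine two per-task losses with fixed weights,
    e.g. LossA = alpha * LossA1 + beta * LossA2."""
    return w1 * loss1 + w2 * loss2

# First network model: alpha = 15 (vividness task), beta = 10 (plumpness task).
loss_a = weighted_multitask_loss(0.2, 0.4, w1=15, w2=10)   # 15*0.2 + 10*0.4
# Second network model: a = 8 (curvature task), b = 5 (freshness task).
loss_b = weighted_multitask_loss(0.2, 0.4, w1=8, w2=5)
```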
The flower image sample set comprises a plurality of sample flower images; the more sample flower images the set contains, the better the first network model trained on it processes the first image information. The vividness information and the plumpness information corresponding to each sample flower image are labeled so as to facilitate training the first network model based on the flower image sample set.
S104, training the first network model based on the flower image sample set.
After the sample flower images are input into the first network model, the first network model is trained on the basis of the vividness information and the plumpness information which correspond to and are marked on the sample flower images respectively, so that the trained first network model is obtained.
In a possible implementation manner of the embodiment of the present application, step S103 specifically includes step S1031 (not shown), step S1032 (not shown), step S1033 (not shown), step S1034 (not shown), step S1035 (not shown), step S1036 (not shown), and step S1037 (not shown), step S1031 and step S1035 may be executed simultaneously, step S1031 may be executed after step S1035, step S1031 may also be executed before step S1035, where,
and S1031, performing segmentation processing on each sample flower image in the flower image sample set and generating a plurality of sub-sample flower images corresponding to each sample flower image.
For the embodiment of the present application, dividing each sample flower image into a plurality of sub-sample images facilitates the electronic device in analyzing the sample flower images. For example, one sample flower image is divided in a 2 × 3 grid into six sub-sample flower images, and the analysis processing is then performed based on the six sub-sample flower images.
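A minimal sketch of the 2 × 3 segmentation of step S1031, assuming the image is a NumPy array whose dimensions are divisible by the grid (the function name is an illustrative assumption):

```python
import numpy as np

def split_into_grid(image, rows=2, cols=3):
    """Split an H x W x C image into rows * cols sub-images."""
    h, w = image.shape[0] // rows, image.shape[1] // cols
    return [image[r * h:(r + 1) * h, c * w:(c + 1) * w]
            for r in range(rows) for c in range(cols)]

sample = np.zeros((60, 90, 3), dtype=np.uint8)   # stand-in sample flower image
subs = split_into_grid(sample)                   # six 30 x 30 x 3 sub-sample images
```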
S1032, RGB values in the plurality of sub-sample flower images are extracted and an RGB average value is determined.
For the embodiment of the present application, taking step S1031 as an example, the RGB values of each of the six sub-sample flower images are extracted. Assuming that the RGB values of the six sub-sample flower images are 200, 210, 205, 200, 195 and 190, respectively, the extracted RGB values are summed and averaged to obtain an RGB average value of 200, and 200 may represent the vividness of the sample flower corresponding to the sample flower image.
S1033, comparing the RGB average value with a plurality of predetermined RGB intervals.
For the embodiment of the application, taking the step S1032 as an example, assuming that the plurality of predetermined RGB intervals are 180-189, 190-199, 200-209 and 210-219, respectively, the RGB average value of 200 obtained in step S1032 is compared with the plurality of predetermined RGB intervals. Therefore, the sample flower image is located in the interval 200-209.
S1034, based on the color comparison result, marking the vividness information of the sample flowers corresponding to each sample flower image.
Different preset RGB intervals correspond to different vividness labels for the sample flower; for example, the interval 180-189 is labeled 1, the interval 190-199 is labeled 2, the interval 200-209 is labeled 3, and the interval 210-219 is labeled 4. The RGB average value of the flower is 200, which lies within the interval 200-209, so the vividness of the sample flower is labeled 3. Each sample flower image in the flower image sample set is processed through step S1031, step S1032, step S1033 and step S1034, so that the vividness of each sample flower image can be labeled.
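Steps S1032-S1034 can be sketched as below, using interval bounds of 180-189 through 210-219 with labels 1-4 as assumed example values:

```python
# Preset RGB intervals: (low, high, vividness label) -- assumed example values.
RGB_INTERVALS = [(180, 189, 1), (190, 199, 2), (200, 209, 3), (210, 219, 4)]

def rgb_average(sub_image_values):
    """Average the per-sub-sample-image RGB values (step S1032)."""
    return sum(sub_image_values) / len(sub_image_values)

def label_vividness(avg):
    """Map the RGB average to the label of its interval (steps S1033-S1034)."""
    for low, high, label in RGB_INTERVALS:
        if low <= avg <= high:
            return label
    return None   # average falls outside every preset interval

avg = rgb_average([200, 210, 205, 200, 195, 190])   # 200.0
vividness = label_vividness(avg)
```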
S1035, extracting pixel arrangement information of each sample flower image.
For the embodiment of the application, the pixel arrangement information of each sample flower image is extracted to represent the plumpness of the sample flower more clearly; for example, the pixel arrangement information of a sample flower image may be 3 pixels in the first row, 5 pixels in the second row, 7 pixels in the third row, and so on.
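Row-by-row pixel arrangement information of this kind can be computed from a binary flower mask, for example as follows (the toy mask and the helper name are illustrative assumptions):

```python
import numpy as np

def row_pixel_counts(mask):
    """Count the flower pixels in each row of a boolean mask."""
    return mask.sum(axis=1).tolist()

mask = np.zeros((3, 8), dtype=bool)   # toy mask of a flower silhouette
mask[0, :3] = True                    # 3 pixels in the first row
mask[1, :5] = True                    # 5 pixels in the second row
mask[2, :7] = True                    # 7 pixels in the third row
counts = row_pixel_counts(mask)       # [3, 5, 7]
```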
S1036, comparing the pixel arrangement information with a plurality of preset pixel arrangement intervals.
For the embodiment of the present application, taking the step S1035 as an example, assuming that the plurality of preset pixel arrangement sections are pixel arrangement sections of different shapes, the pixel arrangement information corresponding to the sample flower image is compared with each of the preset pixel arrangement sections. The number of pixels in which the pixel arrangement information of the sample flower overlaps each preset pixel arrangement section determines which preset pixel arrangement section the sample flower belongs to. For example, if there are four preset pixel arrangement sections, the sample flower image is compared with each of them and the number of overlapping pixels with each preset pixel arrangement section is determined.
And S1037, marking the plumpness information of the sample flowers corresponding to each sample flower image based on the arrangement comparison result.
For the embodiment of the present application, taking the step S1036 as an example, it is assumed that the four preset pixel arrangement sections are labeled 1, 2, 3 and 4, respectively. If the number of overlapping pixels between the sample flower image and the third preset pixel arrangement section is the largest, the plumpness corresponding to the sample flower image is labeled 3. Each sample flower image in the flower image sample set is processed through step S1035, step S1036 and step S1037, so that the plumpness of each sample flower image can be labeled.
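The overlap-count labeling of steps S1036-S1037 (and the analogous branch and leaf labeling later in the text) can be sketched with boolean template masks; the toy single-column templates below are assumptions for illustration only:

```python
import numpy as np

def label_by_overlap(mask, templates):
    """Return the 1-based label of the preset template that shares
    the most pixels with the given mask."""
    overlaps = [int(np.logical_and(mask, t).sum()) for t in templates]
    return overlaps.index(max(overlaps)) + 1

# Four toy preset pixel arrangement sections, labeled 1-4: each marks one column.
templates = []
for k in range(4):
    t = np.zeros((4, 4), dtype=bool)
    t[:, k] = True
    templates.append(t)

flower_mask = templates[2].copy()                      # overlaps the third template most
plumpness = label_by_overlap(flower_mask, templates)   # 3
```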
In a possible implementation manner of the embodiment of the present application, before step S1022 the method further includes step S105 (not shown in the figure) and step S106 (not shown in the figure), wherein,
and S105, obtaining a branch and leaf image sample set, wherein the branch and leaf image sample set comprises curvature information and freshness information which are respectively corresponding to each sample branch and leaf image used for training the second network model.
For the embodiment of the application, the second network model can be trained according to the branch and leaf image sample set and a loss function. For example, the loss function of the second network model is LossB = a·LossB1 + b·LossB2, where LossB1 is the loss function of the branch and leaf curvature identification task and LossB2 is the loss function of the branch and leaf freshness identification task; a is the weight corresponding to the branch and leaf curvature identification task and may be 8, and b is the weight corresponding to the branch and leaf freshness identification task and may be 5.
The branch and leaf image sample set comprises a plurality of sample branch and leaf images; the more sample branch and leaf images the set contains, the better the second network model trained on it processes the second image information. The curvature information and the freshness information corresponding to each sample branch and leaf image are labeled so as to facilitate training the second network model based on the branch and leaf image sample set.
And S106, training a second network model based on the branch and leaf image sample set.
For the embodiment of the application, after each sample branch and leaf image is input into the second network model, the second network model is trained based on the curvature information and the freshness information which are respectively corresponding to and labeled with each sample branch and leaf image, so as to obtain the trained second network model.
In a possible implementation manner of the embodiment of the present application, the step S105 specifically includes a step S1051 (not shown in the figure), a step S1052 (not shown in the figure), a step S1053 (not shown in the figure), a step S1054 (not shown in the figure), a step S1055 (not shown in the figure), a step S1056 (not shown in the figure), and a step S1057 (not shown in the figure), the steps S1052 and S1055 may be executed at the same time, the step S1052 may be executed after the step S1055, and the step S1052 may also be executed before the step S1055, wherein,
S1051, determining the pixel arrangement information of each sample branch and leaf image based on each sample branch and leaf image in the branch and leaf image sample set.
For the embodiment of the application, the pixel arrangement information of each branch and leaf sample image is determined, so that the electronic equipment can analyze the branch and leaf sample image conveniently.
S1052, extracting branch and stem pixel arrangement information from the pixel arrangement information.
For the embodiment of the application, the pixel arrangement information of each branch and leaf sample image comprises branch and stem pixel arrangement information and leaf pixel arrangement information. And extracting branch and stem pixel arrangement information from the pixel arrangement information of the branch and leaf sample image so as to analyze the curvature of the branch and stem.
And S1053, comparing the branch and stem pixel arrangement information with a plurality of preset branch and stem pixel arrangement intervals.
For the embodiment of the application, it is assumed that the plurality of preset branch and stem pixel arrangement intervals are branch and stem pixel arrangement intervals of different curvatures, and the branch and stem pixel arrangement information corresponding to the sample branch and leaf image is compared with each of them. The number of pixels in which the branch and stem pixel arrangement information of the sample overlaps each preset branch and stem pixel arrangement interval determines which preset interval the sample belongs to. For example, if there are four preset branch and stem pixel arrangement intervals, the sample branch and stem pixel arrangement information is compared with each interval and the number of overlapping pixels with each preset branch and stem pixel arrangement interval is determined.
And S1054, marking the bending information of the branches and leaves of each sample based on the branch and stem pixel arrangement comparison result.
For the embodiment of the present application, taking step S1053 as an example, it is assumed that the four preset branch and stem pixel arrangement intervals are labeled 1, 2, 3 and 4, respectively. If the number of overlapping pixels between the sample branch and stem pixel arrangement information and the third preset branch and stem pixel arrangement interval is the largest, the curvature corresponding to the sample branch and leaf image is labeled 3. Each sample branch and leaf image in the branch and leaf image sample set is processed through step S1051, step S1052, step S1053 and step S1054, so that the curvature of each sample branch and leaf image can be labeled.
S1055, extracting blade pixel arrangement information from the pixel arrangement information.
For the embodiment of the application, the more complete and the flatter the leaves are, the fresher they are, so the leaf shape can characterize the freshness of the branches and leaves. The leaf pixel arrangement information is extracted from the pixel arrangement information of the sample branch and leaf image, thereby facilitating analysis of the freshness of the branches and leaves.
And S1056, comparing the blade pixel arrangement information with a plurality of preset blade pixel arrangement intervals.
For the embodiment of the application, assuming that the plurality of preset leaf pixel arrangement sections are leaf pixel arrangement sections of different shapes, the leaf pixel arrangement information corresponding to the sample branch and leaf image is compared with each of them. The number of pixels in which the leaf pixel arrangement information of the sample overlaps each preset leaf pixel arrangement section determines which preset section the sample leaf belongs to. For example, if there are four preset leaf pixel arrangement sections, the leaf pixel arrangement information corresponding to the sample branch and leaf image is compared with each section and the number of overlapping pixels with each preset leaf pixel arrangement section is determined.
And S1057, labeling the freshness information of each sample branch and leaf based on the leaf pixel arrangement comparison result.
For the embodiment of the present application, taking step S1056 as an example, it is assumed that the four preset blade pixel arrangement sections are respectively labeled as 1, 2, 3, and 4. The freshness corresponding to the four preset blade pixel arrangement intervals is increased in sequence. Assuming that the number of overlapped pixels between the leaf pixel arrangement information corresponding to the sample branch and leaf image and the fourth preset leaf pixel arrangement section is the largest, the freshness corresponding to the sample branch and leaf image is marked as 4. Each sample branch and leaf image in the branch and leaf image sample set is processed through step S1055, step S1056 and step S1057, so that the freshness of each sample branch and leaf image can be labeled.
In a possible implementation manner of the embodiment of the present application, step S1023 specifically includes step S10231 (not shown), step S10232 (not shown), and step S10233 (not shown), step S10231 and step S10232 can be executed simultaneously, step S10231 can be executed after step S10232, step S10231 can also be executed before step S10232, wherein,
S10231, determining flower grades corresponding to the flowers based on the flower vividness information and the flower plumpness information corresponding to the flowers.
For the embodiment of the application, the flower grade is determined by the flower vividness information and the flower plumpness information, and the first network model comprises a Backbone network (Backbone 1) and a Head network (Head 1). The first image information of the flower is input into the trained first network model for recognition. After the first image information is input into the first network model, image characteristic information is extracted through the Backbone network (Backbone 1); the image characteristic information is used for representing the RGB average value of the flower and the pixel arrangement information of the flower in the first image information. The Head network (Head 1) then analyzes the image characteristic information and outputs the flower vividness information and the flower plumpness information. For example, the first image information of a flower is input into the first network model, and the first network model outputs flower vividness information of 4 and flower plumpness information of 4. The flower grade can be estimated through a flower grade calculation formula preset in the electronic device; if the flower grade = flower vividness information × 0.5 + flower plumpness information × 0.5, then the flower grade = 4 × 0.5 + 4 × 0.5 = 4.
And S10232, determining the branch and leaf grade corresponding to each flower according to the branch and leaf curvature information and the branch and leaf freshness information corresponding to each flower.
For the embodiment of the application, the branch and leaf grade is determined by the branch and leaf curvature information and the branch and leaf freshness information, and the second network model comprises a Backbone network (Backbone 2) and a Head network (Head 2). The second image information of the flower is input into the trained second network model for recognition. After the second image information is input into the second network model, image characteristic information is extracted through the Backbone network (Backbone 2); the image characteristic information is used for representing the branch and stem pixel arrangement information and the leaf pixel arrangement information in the second image information. The Head network (Head 2) of the second network model then analyzes the image characteristic information and outputs the branch and leaf curvature information and the branch and leaf freshness information. For example, the second image information of the flower is input into the second network model, and the second network model outputs branch and leaf curvature information of 4 and branch and leaf freshness information of 3. The branch and leaf grade can be estimated through a branch and leaf grade calculation formula preset in the electronic device; if the branch and leaf grade = branch and leaf curvature information × 0.6 + branch and leaf freshness information × 0.4, then the branch and leaf grade of the flower = 4 × 0.6 + 3 × 0.4 = 3.6.
And S10233, determining the grade corresponding to each flower based on the flower grade corresponding to each flower and the grade corresponding to each branch and leaf.
For the embodiment of the application, the grade of the flower is determined by the flower grade and the branch and leaf grade; for example, the weight of the flower grade is higher than that of the branch and leaf grade. The electronic device determines the grade of the flower through a grade calculation formula; assume the formula for calculating the grade of the flower is: grade of the flower = flower grade × 0.8 + branch and leaf grade × 0.2, wherein 0.8 is the weight of the flower grade and 0.2 is the weight of the branch and leaf grade. For example, with a flower grade of 4 and a branch and leaf grade of 3.6, the grade of the flower = 4 × 0.8 + 3.6 × 0.2 = 3.92. A plurality of grades can be preset in the electronic equipment; for example, 3.5-4 corresponds to the first grade, 3.0-3.5 to the second grade, 2.5-3.0 to the third grade, and so on, so the flower with a grade of 3.92 belongs to the first grade.
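The three preset formulas (flower grade, branch and leaf grade, overall grade) can be combined as below; the grade-band cut-offs follow the ranges suggested in the example above and are assumptions, as are the function names:

```python
def flower_grade(vividness, plumpness):
    """Flower grade = vividness * 0.5 + plumpness * 0.5."""
    return vividness * 0.5 + plumpness * 0.5

def branch_leaf_grade(curvature, freshness):
    """Branch and leaf grade = curvature * 0.6 + freshness * 0.4."""
    return curvature * 0.6 + freshness * 0.4

def overall_grade(flower, branch_leaf):
    """Overall grade = flower grade * 0.8 + branch and leaf grade * 0.2."""
    return flower * 0.8 + branch_leaf * 0.2

def grade_bucket(score):
    """Map the overall score to a preset grade band (assumed cut-offs)."""
    if score >= 3.5:
        return "first grade"
    if score >= 3.0:
        return "second grade"
    if score >= 2.5:
        return "third grade"
    return "below third grade"

fg = flower_grade(4, 4)            # 4 * 0.5 + 4 * 0.5
bg = branch_leaf_grade(4, 3)       # 4 * 0.6 + 3 * 0.4
overall = overall_grade(fg, bg)    # 4 * 0.8 + 3.6 * 0.2
```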
The above embodiment describes a method for grading fresh cut flowers from the perspective of the method flow, and the following embodiment describes a fresh cut flower grading device 20 from the perspective of virtual modules or virtual units, as detailed below.
The embodiment of the present application provides a cut flower grading device 20, as shown in fig. 2, the cut flower grading device 20 may specifically include:
the collecting module 201 is configured to collect first image information and second image information corresponding to each flower, where the first image information is used to represent image information of flower parts corresponding to the flowers, and the second image information is used to represent image information of branch and leaf parts corresponding to the flowers;
the estimating module 202 is configured to estimate grades corresponding to the respective flowers based on the first image information and the second image information corresponding to the respective flowers.
For the embodiment of the application, a flower generally includes a flower part and a branch and leaf part. When grading the flower, the flower part and the branch and leaf part may be graded respectively, and the whole flower is then graded by synthesizing the grades of the flower part and the branch and leaf part. The image information of the flower part collected by the collecting module 201 is the first image information, and the collected image information of the branch and leaf part is the second image information. The estimating module 202 estimates the grade of the whole flower based on the first image information and the second image information, which is more efficient than manual screening and grading.
In a possible implementation manner of the embodiment of the application, when estimating the grade corresponding to each flower based on the first image information and the second image information corresponding to each flower, the estimating module 202 is specifically configured to:
inputting first image information corresponding to each flower into the trained first network model to obtain flower vividness information and flower plumpness information corresponding to each flower;
inputting second image information corresponding to each flower into the trained second network model to obtain branch and leaf curvature information and branch and leaf freshness information corresponding to each flower;
and estimating the grade respectively corresponding to each flower based on the flower vividness information and the flower plumpness information respectively corresponding to each flower and the branch and leaf curvature information and the branch and leaf freshness information respectively corresponding to each flower.
In a possible implementation manner of the embodiment of the present application, the apparatus further includes:
the first obtaining module is used for obtaining a flower image sample set, and the flower image sample set comprises vividness information and plumpness information which are respectively corresponding to each sample flower image used for training the first network model;
and the first training module is used for training the first network model based on the flower image sample set.
In a possible implementation manner of the embodiment of the present application, when the first obtaining module obtains the flower image sample set, the first obtaining module is specifically configured to:
segmenting each sample flower image in the flower image sample set and generating a plurality of sub-sample flower images corresponding to each sample flower image;
extracting RGB values in a plurality of sub-sample flower images and determining an RGB average value;
comparing the RGB average value with a plurality of preset RGB intervals;
marking vividness information of the sample flowers corresponding to each sample flower image based on the color comparison result;
extracting pixel arrangement information of each sample flower image;
comparing the pixel arrangement information with a plurality of preset pixel arrangement intervals;
and marking the plumpness information of the sample flowers corresponding to each sample flower image based on the arrangement comparison result.
In a possible implementation manner of the embodiment of the present application, the apparatus further includes:
the second acquisition module is used for acquiring a branch and leaf image sample set, and the branch and leaf image sample set comprises curvature information and freshness information which respectively correspond to each sample branch and leaf image used for training the second network model;
and the second training module is used for training the second network model based on the branch and leaf image sample set.
In a possible implementation manner of the embodiment of the application, the second obtaining module is specifically configured to, when obtaining the branch and leaf image sample set:
determining pixel arrangement information of each sample branch and leaf image in each branch and leaf sample image based on each branch and leaf sample image in the branch and leaf sample set;
extracting branch and stem pixel arrangement information from the pixel arrangement information;
comparing the branch and stem pixel arrangement information with a plurality of preset branch and stem pixel arrangement intervals;
labeling the bending information of branches and leaves of each sample based on the branch and stem pixel arrangement comparison result;
extracting blade pixel arrangement information from the pixel arrangement information;
comparing the blade pixel arrangement information with a plurality of preset blade pixel arrangement intervals;
and marking the freshness information of each sample branch and leaf based on the leaf pixel arrangement comparison result.
In a possible implementation manner of the embodiment of the application, when estimating the grade corresponding to each flower based on the flower vividness information and the flower plumpness information respectively corresponding to each flower and the branch and leaf curvature information and the branch and leaf freshness information respectively corresponding to each flower, the estimating module 202 is specifically configured to:
determining flower grades corresponding to the flowers based on the flower vividness information and the flower plumpness information corresponding to the flowers;
determining the branch and leaf grade corresponding to each flower according to the branch and leaf curvature information and the branch and leaf freshness information corresponding to each flower;
and determining the grade corresponding to each flower based on the grade of the flower corresponding to each flower and the grade of the branch and leaf corresponding to each flower.
For the embodiment of the present application, the first obtaining module and the second obtaining module may be the same obtaining module, and may also be different obtaining modules. The first training module and the second training module can be the same training module or different training modules.
The embodiment of the application provides a fresh cut flower grading device, which is applicable to the foregoing method embodiment and is not described herein again.
In an embodiment of the present application, an electronic device is provided, and as shown in fig. 3, an electronic device 30 shown in fig. 3 includes: a processor 301 and a memory 303. Wherein processor 301 is coupled to memory 303, such as via bus 302. Optionally, the electronic device 30 may also include a transceiver 304. It should be noted that the transceiver 304 is not limited to one in practical applications, and the structure of the electronic device 30 is not limited to the embodiment of the present application.
The Processor 301 may be a CPU (Central Processing Unit), a general-purpose Processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or other Programmable logic device, a transistor logic device, a hardware component, or any combination thereof. Which may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. The processor 301 may also be a combination of computing functions, e.g., comprising one or more microprocessors, a combination of a DSP and a microprocessor, or the like.
Bus 302 may include a path that transfers information between the above components. The bus 302 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus 302 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 3, but this does not mean only one bus or one type of bus.
The Memory 303 may be a ROM (Read Only Memory) or other type of static storage device that can store static information and instructions, a RAM (Random Access Memory) or other type of dynamic storage device that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read Only Memory), a CD-ROM (Compact Disc Read Only Memory) or other optical Disc storage, optical Disc storage (including Compact Disc, laser Disc, optical Disc, digital versatile Disc, blu-ray Disc, etc.), a magnetic Disc storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited to these.
The memory 303 is used for storing application program codes for executing the scheme of the application, and the processor 301 controls the execution. The processor 301 is configured to execute application program code stored in the memory 303 to implement the aspects illustrated in the foregoing method embodiments.
Among them, electronic devices include but are not limited to: mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and fixed terminals such as digital TVs, desktop computers, and the like. But also a server, etc. The electronic device shown in fig. 3 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
The present application provides a computer-readable storage medium, on which a computer program is stored; when the program runs on a computer, the computer can execute the corresponding content in the foregoing method embodiments. Compared with the prior art, a flower generally comprises a flower part and a branch and leaf part, and the flower part and the branch and leaf part can be graded respectively, with the grades of the two parts synthesized when grading the flower. The collected image information of the flower part is the first image information, and the collected image information of the branch and leaf part is the second image information; the electronic equipment grades the whole flower based on the first image information and the second image information, so the grading efficiency is higher than that of manual screening.
It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited in order, and they may be performed in other orders. Moreover, at least a portion of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and which are not necessarily performed in sequence but may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
The foregoing describes only some embodiments of the present application. It should be noted that those skilled in the art may make several modifications and improvements without departing from the principle of the present application, and such modifications and improvements shall also fall within the protection scope of the present application.

Claims (10)

1. A method for grading a fresh cut flower, comprising:
acquiring first image information and second image information respectively corresponding to each flower, wherein the first image information represents an image of the flower part of each flower, and the second image information represents an image of the branch and leaf part of each flower;
and estimating grades respectively corresponding to the flowers based on the first image information and the second image information respectively corresponding to the flowers.
2. The method for grading fresh cut flowers according to claim 1, wherein the estimating of the grade of each flower based on the first image information and the second image information of each flower comprises:
inputting the first image information corresponding to each flower into the trained first network model to obtain flower vividness information and flower plumpness information corresponding to each flower;
inputting second image information corresponding to each flower into a trained second network model to obtain branch and leaf curvature information and branch and leaf freshness information corresponding to each flower;
and estimating the grade corresponding to each flower based on the flower vividness information and the flower plumpness information respectively corresponding to each flower and the branch and leaf curvature information and the branch and leaf freshness information respectively corresponding to each flower.
3. The method for grading fresh cut flowers according to claim 2, wherein before the inputting of the first image information corresponding to each flower into the trained first network model to obtain the flower vividness information and the flower plumpness information corresponding to each flower, the method further comprises:
acquiring a flower image sample set, wherein the flower image sample set comprises vividness information and plumpness information which are respectively corresponding to each sample flower image used for training a first network model;
and training the first network model based on the flower image sample set.
4. The method for grading fresh cut flowers according to claim 3, wherein the obtaining of the sample set of flower images comprises:
performing segmentation processing on each sample flower image in the flower image sample set to generate a plurality of sub-sample flower images corresponding to each sample flower image;
extracting RGB values in the plurality of sub-sample flower images and determining an RGB average value;
comparing the RGB average value with a plurality of preset RGB intervals;
marking vividness information of the sample flower corresponding to each sample flower image based on the color comparison result;
extracting pixel arrangement information of each sample flower image;
comparing the pixel arrangement information with a plurality of preset pixel arrangement intervals;
and marking the plumpness information of the sample flowers corresponding to each sample flower image based on the arrangement comparison result.
5. The method for grading fresh cut flowers according to claim 2, wherein before the inputting of the second image information corresponding to each flower into the trained second network model to obtain the branch and leaf curvature information and the branch and leaf freshness information corresponding to each flower, the method further comprises:
acquiring a branch and leaf image sample set, wherein the branch and leaf image sample set comprises curvature information and freshness information which respectively correspond to each sample branch and leaf image used for training a second network model;
and training a second network model based on the branch and leaf image sample set.
6. The method for grading fresh cut flowers according to claim 5, wherein the obtaining of the sample set of branch and leaf images comprises:
determining, based on each sample branch and leaf image in the branch and leaf image sample set, pixel arrangement information of each sample branch and leaf image;
extracting branch and stem pixel arrangement information from the pixel arrangement information;
comparing the branch and stem pixel arrangement information with a plurality of preset branch and stem pixel arrangement intervals;
marking curvature information of each sample branch and leaf image based on the branch and stem pixel arrangement comparison result;
extracting leaf pixel arrangement information from the pixel arrangement information;
comparing the leaf pixel arrangement information with a plurality of preset leaf pixel arrangement intervals;
and marking freshness information of each sample branch and leaf image based on the leaf pixel arrangement comparison result.
7. The method for grading fresh cut flowers according to claim 6, wherein the estimating of the grade corresponding to each flower based on the flower vividness information and the flower plumpness information and the branch and leaf curvature information and the branch and leaf freshness information of each flower comprises:
determining flower grades corresponding to the flowers respectively based on the flower vividness information and the flower plumpness information corresponding to the flowers respectively;
determining the branch and leaf grade corresponding to each flower according to the branch and leaf curvature information and the branch and leaf freshness information corresponding to each flower;
and determining the grade corresponding to each flower based on the flower grade and the branch and leaf grade respectively corresponding to each flower.
8. A fresh cut flower grading device, characterized by comprising:
an acquisition module configured to acquire first image information and second image information respectively corresponding to each flower, wherein the first image information represents an image of the flower part of each flower, and the second image information represents an image of the branch and leaf part of each flower;
and an estimation module configured to estimate the grade corresponding to each flower based on the first image information and the second image information respectively corresponding to each flower.
9. An electronic device, comprising:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors to perform the method of grading a fresh cut flower according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, carries out the method of grading a fresh cut flower according to any one of claims 1 to 7.
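The sample-labelling idea in claims 4 and 6 — average pixel values and map the mean onto preset intervals to obtain a label — can be sketched as follows. The interval boundaries, the label names, and the choice of the red channel are all assumptions for illustration; the claims leave the concrete preset intervals unspecified.

```python
# Hedged sketch of RGB-interval labelling (claims 4 and 6): compute the
# per-channel mean over an image's pixels, then compare the mean against
# preset intervals to produce a training label. Thresholds are illustrative.

def mean_rgb(pixels):
    """Per-channel mean over a list of (r, g, b) tuples."""
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) / n for c in range(3))

def label_vividness(pixels, high=180.0, medium=120.0):
    """Map the mean red channel onto preset intervals -> vividness label."""
    r = mean_rgb(pixels)[0]
    if r >= high:
        return "high"
    if r >= medium:
        return "medium"
    return "low"
```

The branch-and-stem and leaf labelling in claim 6 follows the same pattern, with pixel-arrangement statistics in place of the per-channel mean.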
CN202110753202.3A 2021-07-02 2021-07-02 Fresh cut flower grading method, device and medium Active CN113361642B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110753202.3A CN113361642B (en) 2021-07-02 2021-07-02 Fresh cut flower grading method, device and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110753202.3A CN113361642B (en) 2021-07-02 2021-07-02 Fresh cut flower grading method, device and medium

Publications (2)

Publication Number Publication Date
CN113361642A true CN113361642A (en) 2021-09-07
CN113361642B CN113361642B (en) 2024-03-19

Family

ID=77538056

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110753202.3A Active CN113361642B (en) 2021-07-02 2021-07-02 Fresh cut flower grading method, device and medium

Country Status (1)

Country Link
CN (1) CN113361642B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002203242A (en) * 2000-12-28 2002-07-19 Japan Science & Technology Corp Plant recognition system
JP6238216B1 (en) * 2016-12-09 2017-11-29 マクタアメニティ株式会社 Crop judgment system
CN109663741A (en) * 2019-01-17 2019-04-23 昆明理工大学 A kind of Fresh Cutting flower image collecting device
CN110111311A (en) * 2019-04-18 2019-08-09 北京奇艺世纪科技有限公司 A kind of image quality evaluating method and device
CN111144270A (en) * 2019-12-23 2020-05-12 智慧神州(北京)科技有限公司 Evaluation method and evaluation device for handwritten text neatness based on neural network
CN111299186A (en) * 2020-02-21 2020-06-19 杨伟 Fruit grading method, device and equipment
CN111784688A (en) * 2020-07-24 2020-10-16 征图新视(江苏)科技股份有限公司 Flower automatic grading method based on deep learning
CN212370614U (en) * 2020-05-29 2021-01-19 云南农业大学 Fresh cut flower grading system
CN112808603A (en) * 2020-12-22 2021-05-18 南京林业大学 Fresh cut flower sorting device and method based on RealSense camera

Also Published As

Publication number Publication date
CN113361642B (en) 2024-03-19

Similar Documents

Publication Publication Date Title
US20210182611A1 (en) Training data acquisition method and device, server and storage medium
CN102792332B (en) Image management apparatus, image management method and integrated circuit
CN111476280A (en) Plant leaf identification method and system
CN110418204B (en) Video recommendation method, device, equipment and storage medium based on micro expression
CN111242318B (en) Service model training method and device based on heterogeneous feature library
CN112818162B (en) Image retrieval method, device, storage medium and electronic equipment
CN111182367A (en) Video generation method and device and computer system
CN111598827A (en) Appearance flaw detection method, electronic device and storage medium
CN111931809A (en) Data processing method and device, storage medium and electronic equipment
CN113239227A (en) Image data structuring method and device, electronic equipment and computer readable medium
CN114066848A (en) FPCA appearance defect visual inspection system
CN110852322B (en) Method and device for determining region of interest
CN117520800A (en) Training method, system, electronic equipment and medium for nutrition literature model
CN110796178B (en) Decision model training method, sample feature selection method, device and electronic equipment
CN113361642A (en) Fresh cut flower grading method, device and medium
CN110096708A (en) A kind of determining method and device of calibration collection
CN111027771A (en) Scenic spot passenger flow volume estimation method, system and device and storable medium
CN111026935B (en) Cross-modal retrieval reordering method based on adaptive measurement fusion
CN115239947A (en) Wheat stripe rust severity evaluation method and device based on unsupervised learning
CN115184674A (en) Insulation test method and device, electronic terminal and storage medium
CN111651410B (en) Dynamic balance method and system for sample data
CN116415020A (en) Image retrieval method, device, electronic equipment and storage medium
CN113962216A (en) Text processing method and device, electronic equipment and readable storage medium
CN113011503A (en) Data evidence obtaining method of electronic equipment, storage medium and terminal
CN112966789A (en) Tobacco maturity identification method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant