CN113298180A - Method and computer system for plant identification - Google Patents

Info

Publication number
CN113298180A
CN113298180A CN202110658721.1A
Authority
CN
China
Prior art keywords
plant
image
classification
identified
computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110658721.1A
Other languages
Chinese (zh)
Inventor
徐青松
李青
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Ruisheng Software Co Ltd
Original Assignee
Hangzhou Ruisheng Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Ruisheng Software Co Ltd filed Critical Hangzhou Ruisheng Software Co Ltd
Priority to CN202110658721.1A priority Critical patent/CN113298180A/en
Publication of CN113298180A publication Critical patent/CN113298180A/en
Priority to PCT/CN2022/096706 priority patent/WO2022262586A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure provides a method for plant identification, comprising: receiving an image, wherein the image comprises at least one part of a plant; identifying, from the image and based on a trained neural network model, at least one of a part, a growing place, and a growth cycle of the plant in the image, as well as the classification of the plant in the image; and performing, according to an operation request of a user, the requested operation on the identified at least one item and the classification of the plant. The present disclosure also relates to a computer system for plant identification.

Description

Method and computer system for plant identification
Technical Field
The present disclosure relates to the field of computer technology, and more particularly, to a method and computer system for plant identification.
Background
In the field of computer technology, there are applications for identifying plants. These applications typically receive imagery (including still images, moving images, videos, and the like) from a user and identify the classification of the plant to be identified in the imagery based on a plant identification model built with artificial intelligence techniques. For example, the identification result, i.e., the classification of the plant, may be its species (Species) or the like. The image from the user typically includes at least one part of the plant to be identified; for example, the image may include the stem, leaves, or flowers of the plant to be identified.
Disclosure of Invention
It is an object of the present disclosure to provide a method and a computer system for plant identification.
According to a first aspect of the present disclosure, there is provided a method for plant identification, comprising: receiving an image, wherein the image comprises at least one part of a plant; identifying, from the image and based on a trained neural network model, at least one of a part, a growing place, and a growth cycle of the plant in the image, as well as the classification of the plant in the image; and performing, according to an operation request of a user, the requested operation on the identified at least one item and the classification of the plant.
According to a second aspect of the present disclosure, there is provided a method for plant identification, comprising: receiving an image, wherein the image comprises at least one part of a plant; identifying, from the image and based on a trained neural network model, a classification of the plant in the image, a part of the plant in the image, a growing place of the plant, a growth cycle of the plant, and an image quality of the image; and selecting one or more items of the identified content on which to perform the requested operation according to an operation request of a user.
According to a third aspect of the present disclosure, there is provided a method for plant identification, comprising: identifying, from an image and based on a trained neural network model, a classification of a plant and at least two of a growing place of the plant, a growth cycle of the plant, and a part of the plant in the image, wherein the image includes at least one part of the plant; determining a maintenance regimen for the plant based on the identified classification and the identified at least two items; and outputting the maintenance regimen for the plant.
According to a fourth aspect of the present disclosure, there is provided a method for plant identification, comprising: receiving an image, wherein the image comprises at least one part of a plant; identifying, from the image, a part of the plant in the image and a classification of the plant; determining an output classification level according to the identified part of the plant in the image; and outputting a classification of the plant at the determined classification level.
According to a fifth aspect of the present disclosure, there is provided a computer system for plant identification, comprising: one or more processors; and one or more memories configured to store a series of computer-executable instructions and computer-accessible data associated with the series of computer-executable instructions, wherein the series of computer-executable instructions, when executed by the one or more processors, cause the computer system to perform any of the methods described above.
According to a sixth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon a series of computer executable instructions which, when executed by one or more computing devices, cause the one or more computing devices to perform any of the methods described above.
Other features of the present disclosure and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description, serve to explain the principles of the disclosure.
The present disclosure may be more clearly understood from the following detailed description, taken with reference to the accompanying drawings, in which:
fig. 1 is a flow diagram schematically illustrating at least a portion of a method for plant identification according to one embodiment of the present disclosure.
Fig. 2 is a flow diagram schematically illustrating at least a portion of a method for plant identification, according to another embodiment of the present disclosure.
Fig. 3 is a block diagram that schematically illustrates at least a portion of a computer system for plant identification, in accordance with an embodiment of the present disclosure.
Fig. 4 is a block diagram schematically illustrating at least a portion of a computer system for plant identification, according to another embodiment of the present disclosure.
Fig. 5 is a flow chart schematically illustrating at least a portion of a method for plant identification according to yet another embodiment of the present disclosure.
Fig. 6 is a flow chart schematically illustrating at least a portion of a method for plant identification according to yet another embodiment of the present disclosure.
Note that in the embodiments described below, the same reference numerals are used in common between different drawings to denote the same portions or portions having the same functions, and a repetitive description thereof will be omitted. In this specification, like reference numerals and letters are used to designate like items, and therefore, once an item is defined in one drawing, further discussion thereof is not required in subsequent drawings.
Detailed Description
Various exemplary embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise. In the following description, numerous details are set forth in order to better explain the present disclosure, however it is understood that the present disclosure may be practiced without these details.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses. In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
Fig. 1 is a flow diagram schematically illustrating at least a portion of a method 100 for plant identification, according to one embodiment of the present disclosure. The method 100 comprises: identifying, from an image, a classification of the plant and at least two of a growing place of the plant, a growth cycle of the plant, and a part of the plant in the image, wherein the image includes at least one part of the plant (step 110); determining a maintenance regimen for the plant according to the identified classification and the identified at least two items (step 120); and outputting the maintenance regimen for the plant (step 130). The maintenance regimen may include, for example, at least one of watering, water spraying, water changing, water adding, fertilizing, pruning, weeding, repotting, sun exposure, shading, temperature regulation, humidity regulation, overwintering protection, and pest control. The inventors of the present application found that the maintenance regimen is related not only to the kind of plant but also to its growing place, growth cycle, and the parts of the plant concerned; a method according to embodiments of the present disclosure can therefore determine a personalized maintenance regimen for the plant in the image from the identified classification and at least two of the growing place, growth cycle, and part of the plant in the image.
The user may input an image including at least one part of the plant to be identified into an application capable of plant identification, so as to identify the classification of the plant. Note that "the image includes at least one part of the plant" means one or more parts of the plant, each of which may appear in whole or in part. The image may be one previously stored by the user, taken in real time, or downloaded from a network, and may take any form of visual presentation, such as a still image, a moving image, or a video. The image can be captured with any camera-equipped device, such as a mobile phone or a tablet computer.
An application implementing the method 100 may receive the image from the user and, in step 110, perform plant identification based on the image. Any known image-based plant identification method may be used. For example, the plant in the image may be identified by a computing device using a pre-trained (or "trained") plant identification model to obtain an identification result, i.e., the classification of the plant. The plant identification model may be built on a neural network, such as a deep convolutional neural network (CNN) or a deep residual network (ResNet). For example, for each plant classification, a certain number of image samples labeled with that classification's name are obtained to form a training sample set, and the neural network is trained with these samples until its output accuracy meets the requirement. The image may also be preprocessed before identification; preprocessing may include normalization, brightness adjustment, noise reduction, and the like. Noise reduction can make the characteristic parts of the image more salient.
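The "train until the output accuracy of the neural network meets the requirement" loop described above can be sketched as follows. This is an illustrative stand-in only: a two-class perceptron over toy feature vectors replaces the deep CNN/ResNet named in the text, and all names, thresholds, and cluster centers are assumptions made for the example.

```python
import random

random.seed(0)

# Toy, linearly separable "feature vectors" standing in for CNN image
# features: class 0 clustered near (0, 0), class 1 near (3, 3).
samples = [((random.gauss(0, 0.5), random.gauss(0, 0.5)), 0) for _ in range(50)]
samples += [((random.gauss(3, 0.5), random.gauss(3, 0.5)), 1) for _ in range(50)]

w = [0.0, 0.0]  # weights of the toy classifier
b = 0.0         # bias

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

def accuracy():
    return sum(predict(x) == y for x, y in samples) / len(samples)

# Train until the output accuracy meets the requirement (here, 95%),
# with an epoch cap as a safety stop.
epoch = 0
while accuracy() < 0.95 and epoch < 100:
    for x, y in samples:
        err = y - predict(x)
        if err:  # perceptron update on misclassified samples only
            w[0] += 0.1 * err * x[0]
            w[1] += 0.1 * err * x[1]
            b += 0.1 * err
    epoch += 1
```

In the patent's setting, the loop body would instead run a gradient-descent step of the CNN/ResNet on the labeled image samples, but the stopping criterion is the same.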
In step 110, the method 100 further identifies from the image at least two of the growing place of the plant, the growth cycle of the plant, and the part of the plant in the image. This identification may be based on trained neural network models.
The maintenance regimen may be related to the growing place of the plant. The growing place may be identified based on a trained growing-place classification model. A plurality of classifications may be established for the growing place based on the relationship between a plant's growing place and its maintenance regimen. For each growing-place classification, a certain number of image samples can be obtained and labeled with the classification name of the growing place of the plant shown, forming a training sample set. The neural network is then trained with this set until its output accuracy meets the requirement.
Some maintenance tasks distinguish potted plants from non-potted plants; for example, repotting applies only to potted plants. Some distinguish fresh cut flowers from other plants; for example, the tasks for cut flowers may include changing the water, while those for other plants may include watering. Thus, in one example, the classifications of the growing place may include potted, non-potted, and cut flower. For example, images in which a flowerpot is visible or the plant grows indoors may be labeled "potted", images of cut stems placed in a container such as a vase may be labeled "cut flower", images that are neither potted plants nor cut flowers may be labeled "non-potted", and images showing no plant, a fake plant, a specimen, or unidentifiable content may be labeled "other".
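The growing-place distinctions above can be sketched as a mapping from the identified growing-place classification to the maintenance tasks that apply to it. The label strings and task sets here are hypothetical, chosen only to mirror the examples in the text (repotting only for potted plants, water changing for cut flowers).

```python
# Hypothetical mapping from growing-place classification to applicable tasks.
PLACE_TASKS = {
    "potted": {"watering", "fertilizing", "repotting"},
    "non-potted": {"watering", "fertilizing"},
    "cut flower": {"water changing"},
}

def applicable_tasks(growing_place):
    # "other" (no plant, fake plant, specimen, unidentifiable) gets no tasks.
    return PLACE_TASKS.get(growing_place, set())
```

A downstream regimen builder would intersect these sets with the tasks suggested by the plant's classification and growth cycle.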
The maintenance regimen may be related to the growth cycle of the plant. The growth cycle may be identified based on a trained growth-cycle classification model. A plurality of classifications may be established for the growth cycle based on the relationship between a plant's growth cycle and its maintenance regimen. For each growth-cycle classification, a certain number of image samples can be obtained and labeled with the classification name of the growth cycle of the plant shown, forming a training sample set. The neural network is then trained with this set until its output accuracy meets the requirement.
The maintenance regimen may differ for plants in different growth cycles. For example, the frequency and amount of watering the same plant needs may vary across its growth cycles, and repotting may apply only to plants whose growth cycle is at the leaf, flowering, or fruit stage. In one example, the classifications of the growth cycle may include just-emerged seedling, seedling, leaf stage, flowering stage, fruit stage, defoliation stage, and dormant stage. For example, an image showing only two cotyledons may be labeled "just-emerged seedling", an image showing several leaves on a plant that has not fully grown may be labeled "seedling", an image taken before flowering may be labeled "leaf stage", and an image whose growth cycle is hard to distinguish may be labeled "other".
The maintenance regimen may be related to the part of the plant. The part may be identified based on a trained part classification model. A plurality of classifications may be established for the parts of a plant based on the relationship between a plant's parts and its maintenance regimen. For each part classification, a certain number of image samples can be obtained and labeled with the classification name of the part of the plant shown, forming a training sample set. The neural network is then trained with this set until its output accuracy meets the requirement.
The part of the plant shown in the image may affect the maintenance regimen output to the user. For example, if the identified part is a leaf and some leaves are yellowing, the user may be reminded to water more; if the leaves are spotted, the plant may have a certain pest or disease, and the user can be reminded to apply the corresponding control. In one example, the classifications of the part of the plant may include trunk, bud, seed, flower bud, fruit, seedling, leaf, flower, stem, and root. In other examples, corresponding classifications may also be established for cases such as a plant subject so far away that its detailed characteristics cannot be discerned, or a plant subject so close that the image shows only a portion of the plant without a complete organ.
In step 120, the method 100 determines a maintenance regimen for the plant based on the identified classification and at least two of the growing place, the growth cycle, and the part of the plant in the image. In step 130, the method 100 outputs the maintenance regimen for the plant.
In one example, a maintenance regimen lookup table as shown in Table 1 may be established in advance, and the maintenance regimen of the plant may be determined from it. The value in each cell of Table 1 indicates the frequency with which a maintenance task should be performed, i.e., the interval in days at which the task is repeated. For example, the regimen for plant classification 1 in growth cycle 1 at growing place 1 is to perform the corresponding maintenance task (e.g., pruning) every 28 days. An empty cell may indicate that the task need not be performed for that plant classification in that growth cycle, and a value of -1 may indicate that the task need only be performed once in that growth cycle. The frequency with which a maintenance task should be performed can thus be determined from the plant's classification and growth cycle using the lookup table.
Table 1 maintenance protocol lookup table
[Table 1 appears as an image in the original document; its cells give the repeat interval, in days, of each maintenance task for each combination of plant classification, growth cycle, and growing place.]
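The lookup-table semantics described above (a positive value is the repeat interval in days, -1 means perform once, an absent entry means the task is not needed) can be sketched in code. The key names and the two sample entries are hypothetical, since the actual table contents are given only as an image.

```python
# Hypothetical in-code version of the Table 1 lookup. Keys are
# (plant_classification, growth_cycle, growing_place, task).
MAINTENANCE_TABLE = {
    ("classification 1", "growth cycle 1", "growing place 1", "pruning"): 28,
    ("classification 1", "flowering stage", "growing place 1", "repotting"): -1,
}

def task_schedule(classification, cycle, place, task):
    interval = MAINTENANCE_TABLE.get((classification, cycle, place, task))
    if interval is None:      # empty cell: task not needed here
        return "not needed"
    if interval == -1:        # sentinel: perform the task exactly once
        return "perform once"
    return f"repeat every {interval} days"
```

For example, `task_schedule("classification 1", "growth cycle 1", "growing place 1", "pruning")` reproduces the every-28-days case described in the text.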
In some embodiments, the maintenance regimen includes a regimen for pest and disease control. In such embodiments, the method 100 identifies the type of pest or disease affecting the plant from the image (referred to herein as "pest diagnosis information") and thereby recommends a personalized pest-control regimen to the user. The pest diagnosis information can be identified from the image using a trained pest diagnosis model, and may indicate either a detected pest or disease or that none was detected. The pest diagnosis model may be a neural network model, specifically a convolutional neural network model or a residual network model.
The training sample set of the pest diagnosis model may contain a large number of images, each labeled with diagnosis information, e.g., the pest or disease affecting the plant in the image, or "no pest detected" for a healthy plant. An image is input to the model to generate output diagnosis information, and the model's parameters are adjusted according to the comparison between the output and the labeled diagnosis information; this training continues until the model's output accuracy meets the requirement, yielding the trained pest diagnosis model.
After the pest diagnosis information is identified, a pest-control maintenance regimen may be extracted from an established database according to the identified classification of the plant, the pest diagnosis information, and at least two of the growing place, the growth cycle, and the part of the plant in the image, and then output. When extracting from the database, the regimen can be retrieved by the plant's classification and pest diagnosis information and then adjusted appropriately according to at least two of the growing place, growth cycle, and part of the plant in the image. If a large amount of data is stored in the database in advance, it can cover the classifications and pest diagnoses of most plants, so that a corresponding regimen can be provided to users.
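The retrieve-then-adjust flow described above can be sketched as follows. The database entry, plant and pest names, and the two adjustment rules are all illustrative assumptions; the patent specifies only the overall pattern of retrieval by (classification, diagnosis) followed by adjustment for growing place and growth cycle.

```python
# Hypothetical pre-built database keyed by (plant classification, diagnosis).
SCHEME_DB = {
    ("rose", "powdery mildew"): "apply fungicide weekly; improve airflow",
}

def pest_control_scheme(classification, diagnosis, growing_place, growth_cycle):
    # Step 1: retrieve the base regimen by classification and diagnosis.
    base = SCHEME_DB.get((classification, diagnosis))
    if base is None:
        return None  # no regimen on file for this combination
    # Step 2: adjust it using the other identified items (rules are assumed).
    notes = []
    if growing_place == "potted":
        notes.append("move the pot away from other plants to limit spread")
    if growth_cycle == "flowering stage":
        notes.append("avoid spraying open blooms directly")
    return base if not notes else base + "; " + "; ".join(notes)
```
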
In some embodiments, the classification level of the output may be determined from the identified part of the plant in the image. In some cases, if the part in the image is a trunk, bud, seed, flower bud, fruit, or seedling, it is difficult to obtain accurate species-level information. Directly outputting a species-level result in these cases is likely to be wrong, which may mislead or confuse the user, whereas a genus-level (Genus) result is generally more accurate. If, on the other hand, the part in the image is a characteristic part such as a leaf, flower, stem, or root, the identified species-level information is generally reliable. Thus, in response to the part of the plant in the image being one of a trunk, a bud, a seed, a flower bud, a fruit, and a seedling, the output classification level is determined to be genus; and in response to the part being one of a leaf, a flower, a stem, and a root, the output classification level is determined to be species.
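The part-to-level rule above is simple enough to state directly in code. The default for parts the text does not mention is an assumption made here.

```python
# Parts from which species identification is unreliable -> output genus.
GENUS_PARTS = {"trunk", "bud", "seed", "flower bud", "fruit", "seedling"}
# Characteristic parts -> species-level output is generally reliable.
SPECIES_PARTS = {"leaf", "flower", "stem", "root"}

def output_level(part):
    if part in GENUS_PARTS:
        return "genus"
    if part in SPECIES_PARTS:
        return "species"
    return "species"  # assumed default; the source does not cover other parts
```
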
The identification results provided by the plant identification model typically include one or more classifications of the identified plant, ranked from high to low confidence (the degree to which a classification approaches the true classification). In one embodiment, the model provides identification results comprising one or more classifications at the species level; the genus-level classification for each result can then be obtained from the correspondence between species and genus. In another embodiment, the model provides identification results comprising one or more classifications at both the species and genus levels.
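Deriving genus-level results from confidence-ranked species-level results via a species-to-genus correspondence can be sketched as below. The taxonomy entries and the choice to keep each genus's highest species confidence are illustrative assumptions.

```python
# Hypothetical species-to-genus correspondence table.
SPECIES_TO_GENUS = {
    "Rosa chinensis": "Rosa",
    "Rosa rugosa": "Rosa",
    "Prunus persica": "Prunus",
}

def genus_results(species_results):
    """species_results: list of (species, confidence), highest first.
    Collapses species into genera, keeping each genus's best confidence."""
    best = {}
    for species, conf in species_results:
        genus = SPECIES_TO_GENUS.get(species)
        if genus is not None and (genus not in best or conf > best[genus]):
            best[genus] = conf
    return sorted(best.items(), key=lambda kv: kv[1], reverse=True)
```
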
In some embodiments, the identifications are based on the subject in the image. The subject may be the entity occupying the largest area of the image, the entity located substantially in the middle of the image, or an entity not located at the corners of the image. In some embodiments, in response to the subject being ambiguous (the subject cannot be distinguished), the subject being a non-plant (the subject can be distinguished but is not a plant), or the subject being a distant view of the entire plant (the subject can be distinguished, e.g., a distant view of a large tree, but is too far away for details such as leaf shape to be recognized), no identification is performed, i.e., step 110 of the method 100 is not executed, and a prompt asking the user to re-input an image is output. This avoids invalid identifications. Determining whether the subject is ambiguous, a non-plant, or a distant view of the entire plant can likewise be performed by a trained neural network model: a certain number of image samples labeled with each of these classifications are prepared and used to train the neural network until its output accuracy meets the requirement.
In some embodiments, the image quality may be identified based on a trained quality classification model. In response to the image being clear, identification is performed, i.e., step 110 of the method 100 is executed. In response to the image being unclear, no identification is performed, i.e., step 110 is not executed, and information prompting the user to re-input an image is output. The "unclear" classification may be subdivided, for example into unclear due to lighting and unclear due to focus, so that more specific prompts can be output, such as asking the user to add light and shoot again. The quality classification model may be trained on an image sample set containing, for each classification, a certain number of samples labeled with that classification; the neural network is trained with these samples until its output accuracy meets the requirement, yielding the trained quality classification model.
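The gating logic of the last two paragraphs (run identification only when the quality check and subject check pass, otherwise prompt the user to re-shoot) can be sketched as follows. The label strings and prompt texts are assumptions standing in for the trained classifiers' outputs.

```python
# Subject classifications that block identification, per the text.
REJECT_SUBJECTS = {"ambiguous", "non-plant", "distant view"}

# Hypothetical fine-grained prompts for subdivided "unclear" classifications.
PROMPTS = {
    "unclear (light)": "Please add light and shoot again.",
    "unclear (focus)": "Please refocus and shoot again.",
    "subject": "Please re-take the image so one plant fills the frame.",
}

def gate(quality_label, subject_label):
    """Return ("identify", None) to proceed, or ("prompt", message)."""
    if quality_label != "clear":
        return ("prompt", PROMPTS.get(quality_label, "Please re-input the image."))
    if subject_label in REJECT_SUBJECTS:
        return ("prompt", PROMPTS["subject"])
    return ("identify", None)
```

Only images that pass both checks would then be passed on to the identification step (step 110).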
Fig. 2 is a flow diagram schematically illustrating at least a portion of a method 200 for plant identification, according to another embodiment of the present disclosure. The method 200 comprises the following steps: receiving an image, wherein the image includes at least one part of a plant (step 210); identifying the plant part in the image and the plant classification according to the image (step 220); determining an output classification level according to the identified plant part in the image (step 230); and outputting a classification of the corresponding classification level of the plant according to the determined classification level (step 240).
In step 210, an application implementing the method 200 may receive from the user an image including at least one part of the plant to be identified. Note that "at least one part of the plant" means one or more parts, each of which may appear in whole or in part. The image may be one previously stored by the user, taken in real time, or downloaded from a network, and may take any form of visual presentation, such as a still image, a moving image, or a video. It can be captured with any camera-equipped device, such as a mobile phone or a tablet computer.
In step 220, the method 200 identifies, from the image, the part of the plant in the image and the classification of the plant. The trained plant identification model described above may be used to identify the classification, and the trained part classification model described above may be used to identify the part.
In step 230, the method 200 determines the output classification level from the identified part of the plant in the image. In some cases, if the part in the image is a trunk, bud, seed, flower bud, fruit, or seedling, it is difficult to obtain accurate species-level information. Directly outputting a species-level result in these cases is likely to be wrong, which may mislead or confuse the user, whereas a genus-level result is generally more accurate. If the part in the image is a characteristic part such as a leaf, flower, stem, or root, the identified species-level information is generally reliable. Thus, in response to the part of the plant in the image being one of a trunk, a bud, a seed, a flower bud, a fruit, and a seedling, the output classification level is determined to be genus; and in response to the part being one of a leaf, a flower, a stem, and a root, the output classification level is determined to be species.
The identification results provided by the plant identification model typically include one or more classifications of the identified plant, ranked from high to low confidence (the degree to which a classification approaches the true classification). In one embodiment, the model provides identification results comprising one or more classifications at the species level; the genus-level classification for each result can then be obtained from the correspondence between species and genus. In another embodiment, the model provides identification results comprising one or more classifications at both the species and genus levels.
In some embodiments, the identification is based on the subject in the image. The subject may be the entity occupying the largest area of the image, the entity located substantially in the middle of the image, or an entity not located at the corners of the image. In some embodiments, in response to the subject being ambiguous (the subject cannot be distinguished), the subject being a non-plant (the subject can be distinguished but is not a plant), or the subject being a distant view of the entire plant (the subject can be distinguished, e.g., a distant view of a large tree, but is too far away for details such as leaf shape to be recognized), no identification is performed, i.e., step 220 of the method 200 is not executed, and information prompting the user to re-input an image is output. This avoids invalid identifications. Determining whether the subject is ambiguous, a non-plant, or a distant view of the entire plant can likewise be performed by a trained neural network model: a certain number of image samples labeled with each of these classifications are prepared and used to train the neural network until its output accuracy meets the requirement.
In some embodiments, the image quality of the image may be identified based on a trained quality classification model. In response to the image quality being clear, identification is performed, i.e., step 220 of the method 200 described above is performed. In response to the image quality being unclear, no identification is performed, i.e., step 220 of method 200 is not performed, and information prompting the user to re-input the image is output. The classifications of unclear image quality may be divided more finely, for example, into unclear due to lighting and unclear due to focal length, so that more specific prompt information can be output to the user, for example, prompting the user to supplement lighting and shoot again. The quality classification model may be trained based on a set of image samples. The image sample set comprises a certain number of image samples labeled with each classification, and these image samples are used to train the neural network until the output accuracy of the neural network meets the requirement, thereby obtaining the trained quality classification model.
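The mapping from fine-grained quality classes to specific user prompts can be sketched as below. The class names and prompt texts are illustrative assumptions, not the patent's actual strings.

```python
# Hypothetical fine-grained "unclear" quality classes and matching prompts.
QUALITY_PROMPTS = {
    "unclear_lighting": "Too dark or overexposed: please supplement lighting and shoot again.",
    "unclear_focus": "Out of focus: please adjust the focal length and shoot again.",
}

def handle_quality(quality_label):
    """Return None when the image is clear (proceed to step 220); otherwise
    return a prompt asking the user to re-input the image."""
    if quality_label == "clear":
        return None
    return QUALITY_PROMPTS.get(quality_label,
                               "Image unclear: please re-input the image.")
```

Unrecognized unclear classes fall back to a generic re-input prompt, so the gate stays safe even if the quality model gains new classes later.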
In some embodiments, the user's need may not simply be a maintenance regimen for the plant or a classification of the plant, in which case methods 100 and 200 according to embodiments of the present invention may not be sufficient. Fig. 5 is a flow diagram schematically illustrating at least a portion of a method 500 for plant identification, according to yet another embodiment of the present disclosure. The method 500 includes: receiving an image, wherein the image includes at least one part of a plant (step 510); identifying, according to the image and based on the trained neural network model, at least one of a part, a growing place, and a growing period of the plant in the image and a classification of the plant in the image (step 520); and performing the requested operation on the identified classification and the at least one item according to an operation request of the user (step 530).
In step 510, an application capable of implementing method 500 may receive an image from a user that includes at least one part of a plant to be identified. It should be noted that "the image includes at least one part of the plant" means that the image shows one or more parts of the plant, where each part may be shown in whole or in part. The image may be previously stored by the user, taken in real time, or downloaded from a network. The image may take any form of visual presentation, such as a still image, a moving image, or a video. The image can be captured using a device with a camera, such as a mobile phone, a tablet computer, and the like.
In step 520, the method 500 identifies, according to the image and based on the trained neural network model, at least one of a part, a growing place, and a growing period of the plant in the image and a classification of the plant in the image. For example, the trained plant recognition model described above may be used to identify the classification of the plant in the image, and the trained part classification model, growing place classification model, and growing period classification model described above may be used to identify the part, growing place, and growing period of the plant in the image, respectively.
In step 530, the method 500 performs the requested operation on at least one of the identified part, growing place and growing period of the plant and the classification of the plant according to the operation request of the user. In one embodiment, the user's operation request is a request for the identified information, e.g., a request to output the identified information, and the application program capable of implementing method 500 may output (e.g., through an interface of the application program) information of the identified at least one of the part, the growing place, and the growing period of the plant and the classification of the plant.
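The dispatch of a user's operation request in step 530 can be sketched as follows. The request names, dictionary keys, and the placeholder regimen function are assumptions for illustration; the patent does not define a concrete API.

```python
def determine_regimen(identified):
    # Placeholder standing in for the application's maintenance-regimen
    # determination module described below (e.g. a Table 1 lookup).
    return f"Care plan for {identified['classification']}"

def handle_request(request, identified):
    """Dispatch a user's operation request against the results of step 520.
    identified: dict with hypothetical keys such as 'classification', 'part',
    'growing_place', 'growing_period' (value None when not identified)."""
    if request == "output_info":
        # Output only the items that were actually identified.
        return {k: v for k, v in identified.items() if v is not None}
    if request == "maintenance_regimen":
        return determine_regimen(identified)
    raise ValueError(f"unsupported request: {request}")
```

For an "output_info" request the application would show the identified items through its interface; other requests route to the corresponding module.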
In one embodiment, where the user's operation request is a request for a plant maintenance regimen, an application capable of implementing method 500 may invoke the application's maintenance regimen determination module to determine a plant maintenance regimen based on the identified at least one of the part, growing area, and growing cycle of the plant and the classification of the plant, and output (e.g., via an interface of the application) the determined plant maintenance regimen to the user. The maintenance regimen determination module may pre-establish a maintenance regimen lookup table as shown in table 1 as described in method 100 and determine a maintenance regimen for the plant based on the maintenance regimen lookup table. In one embodiment, the user's request for a maintenance regimen for the plant may be a request for a pest control regimen. An application capable of implementing method 500 may identify a pest type of a plant (also referred to herein as "pest diagnosis information") from the image using the trained pest diagnosis model, and may extract a maintenance plan related to pest control from an established database according to the identified classification of the plant and the pest diagnosis information and at least one of a growing place, a growing period of the plant, and a part of the plant in the image, thereby recommending a user a personalized maintenance plan related to pest control.
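A maintenance-regimen lookup table of the kind referred to above (Table 1 is not reproduced in this excerpt) can be sketched as a mapping keyed by the identified classification, growing place, and growing period. All table contents here are invented for illustration.

```python
# Hypothetical maintenance-regimen lookup table; keys and regimens are
# illustrative, not the patent's actual Table 1.
REGIMEN_TABLE = {
    ("Rosa chinensis", "potted", "flowering stage"):
        "Water every 2 days; give indirect sunlight; do not change pots.",
    ("Rosa chinensis", "potted", "resting stage"):
        "Reduce watering; prune; protect against overwintering.",
}

def lookup_regimen(classification, growing_place, growing_period):
    """Look up a maintenance regimen; fall back to general guidance when
    no entry matches the identified combination."""
    return REGIMEN_TABLE.get(
        (classification, growing_place, growing_period),
        "No specific regimen found; showing general care guidance.")
```

Keying on the full (classification, growing place, growing period) triple is what makes the recommended regimen personalized rather than species-generic.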
Fig. 6 is a flow diagram schematically illustrating at least a portion of a method 600 for plant identification, according to yet another embodiment of the present disclosure. The method 600 comprises: receiving an image, wherein the image includes at least one part of a plant (step 610); identifying, from the image, a classification of the plant in the image, a part of the plant in the image, a growing place of the plant in the image, a growing period of the plant in the image, and an image quality of the image based on the trained neural network model (step 620); and selecting one or more items of identified content to perform the requested operation according to the operation request of the user (step 630). Unlike the method 500, the method 600 identifies the plant classification, plant part, growing area, growing period, image quality, etc. in the received image, and then selects one or more items of content for use in subsequent steps as needed (determined according to the user's operation request). In one embodiment, the items of content identified in step 620 are stored in association with the corresponding images, for example, as labels for the images for later use. Therefore, after a request of a user is subsequently received, the image does not need to be identified again to obtain the required information, and the required information can be directly extracted from the label information of the picture.
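The label-caching idea of method 600 (identify everything once, store the results with the image, and answer later requests from the stored labels) can be sketched as below. The class and the model-callable interface are assumptions for illustration.

```python
class ImageRecord:
    """Stores recognition results in association with the image, so that a
    later user request reads the cached labels instead of re-running the
    neural network models on the same image."""

    def __init__(self, image):
        self.image = image
        self.labels = None  # populated once, on first identification

    def identify(self, models):
        # models: mapping of item name -> callable(image) -> result,
        # e.g. {"classification": ..., "part": ..., "image_quality": ...}
        if self.labels is None:
            self.labels = {name: model(self.image)
                           for name, model in models.items()}
        return self.labels

    def get(self, item, models):
        """Return one identified item, running the models only if needed."""
        return self.identify(models)[item]
```

Repeated `get` calls for the same image hit the cached labels, which is the behavior the paragraph above describes: the required information is extracted directly from the image's label information.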
In step 610, an application capable of implementing the method 600 may receive an image from a user that includes at least one part of a plant to be identified. It should be noted that "the image includes at least one part of the plant" means that the image shows one or more parts of the plant, where each part may be shown in whole or in part. The image may be previously stored by the user, taken in real time, or downloaded from a network. The image may take any form of visual presentation, such as a still image, a moving image, or a video. The image can be captured using a device with a camera, such as a mobile phone, a tablet computer, and the like.
In step 620, the method 600 may identify, according to the image and based on the trained neural network models, the classification, part, growing place, and growing period of the plant in the image and the image quality of the image. For example, the trained plant recognition model described above may be used to identify the classification of the plant in the image, and the trained part classification model, growing place classification model, growing period classification model, and quality classification model described above may be used to identify the part, growing place, and growing period of the plant and the image quality, respectively. As described above, in the identification step, the method 600 identifies the classification, part, growing place, growing period, image quality, and other information of the plant in the received image for selective use in the subsequent steps.
In step 630, the method 600 may select one or more of the entire contents identified in step 620 for the requested operation according to the operation request of the user. In one embodiment, the operation request of the user is a request for the identified information, for example, a request for outputting information of the identified one or more items of content. An application capable of implementing the method 600 may select one or more items of content from the total content identified in step 620 according to a user request and output (e.g., through an interface of the application) information of the selected one or more items of content.
In one embodiment, the user's operation request is a request for a maintenance regimen for the plant. An application capable of implementing method 600 may select at least one of the location, growing place, and growing period of the plant identified in step 620 and the classification of the plant in the image, and invoke the maintenance program determination module of the application to determine the maintenance program for the plant and output (e.g., via an interface of the application) the determined plant maintenance program to the user. The maintenance regimen determination module may pre-establish a maintenance regimen lookup table as shown in table 1 as described in method 100 and determine a maintenance regimen for the plant based on the maintenance regimen lookup table. In one embodiment, the user's request for a maintenance regimen for the plant may be a request for a pest control regimen. The application program capable of implementing the method 600 may identify the type of pest of the plant (also referred to herein as "pest diagnosis information") from the image using the trained pest diagnosis model, and may extract a maintenance scheme related to pest control from an established database according to the identified classification of the plant and the pest diagnosis information and at least one of a growing place, a growing period, and a part of the plant, thereby recommending a personalized maintenance scheme related to pest control to a user.
Fig. 3 is a block diagram that schematically illustrates at least a portion of a computer system 300 for plant identification, in accordance with an embodiment of the present disclosure. Those skilled in the art will appreciate that the system 300 is only an example and should not be considered as limiting the scope of the present disclosure or the features described herein. In this example, the system 300 may include one or more storage devices 310, one or more user devices 320, and one or more computing devices 330, which may be communicatively connected to each other via a network or bus 340. The one or more storage devices 310 provide storage services for the one or more user devices 320 and the one or more computing devices 330. Although the one or more storage devices 310 are shown in system 300 as separate blocks apart from the one or more user devices 320 and the one or more computing devices 330, it should be understood that the one or more storage devices 310 may actually be hosted on any of the other entities 320, 330 included in system 300. Each of the one or more user devices 320 and the one or more computing devices 330 may be located at a different node of the network or bus 340 and may be capable of communicating directly or indirectly with other nodes of the network or bus 340. Those skilled in the art will appreciate that system 300 may also include other devices not shown in fig. 3, with each different device located at a different node of the network or bus 340.
The one or more storage devices 310 may be configured to store any of the data described above, including but not limited to: image data input by the user, the image samples, the neural network models, the recognition results, the files of the application program, and the like. The one or more computing devices 330 may be configured to perform one or more of the methods according to the embodiments described above, and/or one or more steps of those methods. The one or more user devices 320 may be configured to provide services to the user, such as receiving an image from the user, outputting a maintenance regimen for a plant, outputting a classification of a plant, and outputting information prompting the user to re-input an image. The one or more user devices 320 may also be configured to perform one or more of the methods according to the embodiments described above, and/or one or more steps of those methods.
The network or bus 340 may be any wired or wireless network and may include cables. The network or bus 340 may be part of the Internet, the World Wide Web, a specific intranet, a wide area network, or a local area network. The network or bus 340 may utilize standard communication protocols such as Ethernet, Wi-Fi, and HTTP, protocols that are proprietary to one or more companies, and various combinations of the foregoing. The network or bus 340 may also include, but is not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
Each of the one or more user devices 320 and the one or more computing devices 330 may be configured similarly to the system 400 shown in fig. 4, i.e., with one or more processors 410, one or more memories 420, and instructions 421 and data 422. Each of the one or more user devices 320 and the one or more computing devices 330 may be a personal computing device intended for use by a user or a commercial computer device used by an enterprise, and have all of the components typically used in conjunction with a personal computing device or a commercial computer device, such as a Central Processing Unit (CPU), memory (e.g., RAM and internal hard drives) that stores data and instructions, one or more I/O devices such as a display (e.g., a monitor having a screen, a touch screen, a projector, a television, or other device operable to display information), a mouse, a keyboard, a touch screen, a microphone, speakers, and/or a network interface device, among others.
One or more user devices 320 may also include one or more cameras for capturing still images or recording video streams, as well as all components for connecting these elements to each other. While one or more user devices 320 may each comprise a full-sized personal computing device, they may alternatively comprise a mobile computing device capable of wirelessly exchanging data with a server over a network such as the internet. The one or more user devices 320 may be, for example, mobile phones or devices such as PDAs with wireless support, tablet PCs or netbooks capable of obtaining information via the internet. In another example, one or more user devices 320 may be wearable computing systems.
Fig. 4 is a block diagram that schematically illustrates at least a portion of a computer system 400 for plant identification, in accordance with an embodiment of the present disclosure. The system 400 includes one or more processors 410, one or more memories 420, and other components (not shown) typically present in a computer or like device. Each of the one or more memories 420 may store content accessible by the one or more processors 410, including instructions 421 that may be executed by the one or more processors 410, and data 422 that may be retrieved, manipulated, or stored by the one or more processors 410.
The instructions 421 may be any set of instructions to be executed directly by the one or more processors 410, such as machine code, or indirectly, such as scripts. The terms "instructions," "applications," "processes," "steps," and "programs" herein may be used interchangeably. The instructions 421 may be stored in object code format for direct processing by the one or more processors 410, or in any other computer language, including scripts or collections of independent source code modules that are interpreted or compiled in advance, as needed. The instructions 421 may include instructions that cause, for example, one or more processors 410 to act as neural networks herein. The functions, methods, and routines of the instructions 421 are explained in more detail elsewhere herein.
The one or more memories 420 may be any temporary or non-temporary computer-readable storage medium capable of storing content accessible by the one or more processors 410, such as a hard drive, memory card, ROM, RAM, DVD, CD, USB memory, writable and read-only memories, and the like. One or more of the one or more memories 420 may comprise a distributed storage system, where the instructions 421 and/or data 422 may be stored on a plurality of different storage devices, which may be physically located at the same or different geographic locations. One or more of the one or more memories 420 may be connected to the one or more processors 410 via a network and/or may be directly connected to or incorporated into any of the one or more processors 410.
The one or more processors 410 may retrieve, store, or modify data 422 according to instructions 421. Data 422 stored in the one or more memories 420 may include at least portions of one or more of the items stored in the one or more storage devices 310 described above. Although the subject matter described herein is not limited by any particular data structure, the data 422 may be stored in computer registers (not shown), in a relational database as a table having many different fields and records, or as XML documents. The data 422 may be formatted in any computing-device-readable format, such as, but not limited to, binary values, ASCII, or Unicode. Further, the data 422 may include any information sufficient to identify the relevant information, such as a number, descriptive text, proprietary code, pointer, reference to data stored in other memory such as at other network locations, or information used by a function to compute the relevant data.
The one or more processors 410 may be any conventional processor, such as a commercially available Central Processing Unit (CPU), Graphics Processing Unit (GPU), or the like. Alternatively, the one or more processors 410 may also be special-purpose components, such as an Application Specific Integrated Circuit (ASIC) or other hardware-based processor. Although not required, one or more of the processors 410 may include specialized hardware components to perform certain computational processes faster or more efficiently, such as image processing of imagery.
Although one or more processors 410 and one or more memories 420 are schematically illustrated in fig. 4 within the same block, system 400 may actually comprise multiple processors or memories that may reside within the same physical housing or within different physical housings. For example, one of the one or more memories 420 may be a hard disk drive or other storage medium located in a different housing than the housing of each of the one or more computing devices (not shown) described above. Thus, references to a processor, computer, computing device, or memory are to be understood as including references to a collection of processors, computers, computing devices, or memories that may or may not operate in parallel.
In the specification and claims, the expression "A or B" includes "A and B" and "A or B", rather than exclusively including only "A" or only "B", unless specifically stated otherwise.
Reference in the present disclosure to "one embodiment," "some embodiments," means that a feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment, at least some embodiments, of the present disclosure. Thus, the appearances of the phrases "in one embodiment," "in some embodiments" in various places throughout this disclosure are not necessarily referring to the same or like embodiments. Furthermore, the features, structures, or characteristics may be combined in any suitable combination and/or sub-combination in one or more embodiments.
As used herein, the word "exemplary" means "serving as an example, instance, or illustration," and not as a "model" that is to be replicated accurately. Any implementation exemplarily described herein is not necessarily to be construed as preferred or advantageous over other implementations. Furthermore, the disclosure is not limited by any expressed or implied theory presented in the preceding technical field, background, brief summary or the detailed description.
In addition, certain terminology may also be used in the following description for the purpose of reference only, and thus is not intended to be limiting. For example, the terms "first," "second," and other such numerical terms referring to structures or elements do not imply a sequence or order unless clearly indicated by the context. It will be further understood that the terms "comprises/comprising," "includes" and/or "including," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In this disclosure, the terms "component" and "system" are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, or software in execution. For example, a component can be, but is not limited to being, a process running on a processor, an object, an executable, a thread of execution, a program, and/or the like. By way of illustration, both an application running on a server and the server can be a component. One or more components can reside within a process and/or thread of execution and a component can be localized on one computer and/or distributed between two or more computers.
In addition, embodiments of the present disclosure may also include the following examples:
1. a method for plant identification, comprising:
receiving an image, wherein the image comprises at least one part of a plant;
identifying, from the image, at least one of a part, a growing place, and a growing period of a plant in the image and a classification of the plant in the image based on the trained neural network model; and
performing the requested operation on the identified at least one item and the classification of the plant according to an operation request of a user.
2. The method of 1, wherein the operation request comprises a request for identified information, the method further comprising:
outputting information of the identified classification of the plant and the at least one item.
3. The method of 1, wherein the operation request comprises a request for a maintenance regimen for the plant, the method further comprising:
determining a maintenance regimen for the plant based on the identified classification of the plant and the at least one item; and
outputting the maintenance scheme of the plants.
4. The method of 3, wherein the maintenance regimen comprises an execution scheme of at least one of watering, spraying water, changing water, adding water, fertilizing, trimming, weeding, rotating pots, changing pots, sun exposure, sun shading, adjusting temperature, adjusting humidity, protecting against overwintering, and pest control.
5. The method of 1, wherein the at least one item includes a part of a plant in the image, the method further comprising:
determining a classification level of classification of the plant according to the identified part of the plant in the image; and
adjusting the result of the classification of the identified plant according to the determined classification level.
6. The method of 5, further comprising:
determining the classification level as a genus in response to the part of the plant in the image being one of a trunk, a shoot, a seed, a bud, a fruit, and a seedling; and
determining the classification level as a species in response to the part of the plant in the image being one of a leaf, a flower, a stem, and a root.
7. The method of 1, further comprising, prior to the identifying:
determining an image quality of the image based on the trained quality classification model;
in response to the image quality being clear, performing the identifying; and
in response to the image quality being unclear, outputting prompt information to re-input the image.
8. The method of 1, wherein the identifying is performed according to a subject in the imagery, the method further comprising:
in response to the subject in the image being ambiguous, the subject being a non-plant, or the subject being a distant view of the entire plant, not performing the identification, and outputting prompt information to re-input the image.
9. The method of 1, wherein the growing area of the plant is identified based on a trained growing area classification model trained from a plurality of labeled samples under each of a plurality of classifications of growing areas, wherein the plurality of classifications of growing areas include potted, non-potted, and cut flowers.
10. The method of 1, wherein the growth cycle of the plant is identified based on a trained growth cycle classification model trained from a plurality of labeled samples under each of a plurality of classifications of growth cycle, wherein the plurality of classifications of growth cycle comprise emerging seedling, leaf stage, flowering stage, fruit stage, leaf drop stage, and resting stage.
11. The method of 1, wherein the part of the plant in the image is identified based on a trained part classification model trained from a plurality of labeled samples under each of a plurality of classifications of parts, wherein the plurality of classifications of parts include trunk, shoot, seed, bud, fruit, seedling, leaf, flower, stem, and root.
12. A method for plant identification, comprising:
receiving an image, wherein the image comprises at least one part of a plant;
identifying, from the image, a classification of a plant in the image, a part of the plant in the image, a growing place of the plant in the image, a growing cycle of the plant in the image, and an image quality of the image based on a trained neural network model; and
selecting, according to an operation request of a user, one or more items from the identified content to perform the requested operation.
13. The method of 12, wherein the operation request includes a request for identified information, the method further comprising:
and outputting information of the selected one or more items of content.
14. The method of 12, wherein the operation request comprises a request for a maintenance regimen for the plant, the method further comprising:
selecting at least one of the parts of the plants in the images, the growing places of the plants in the images, the growing periods of the plants in the images and the classifications of the plants in the images;
determining a maintenance scheme of the plant according to the selected classification of the plant and the at least one item; and
outputting the maintenance scheme of the plants.
15. The method of 12, further comprising saving the identified items of content in association with the image.
16. A method for plant identification, comprising:
identifying a classification of a plant based on a trained neural network model from an image, and identifying at least two of a place of growth, a period of growth, and a part of the plant in the image, wherein the image includes at least one part of the plant;
determining a maintenance regimen for the plant based on the identified classification of the plant and the identified at least two of the place of growth, the period of growth, and the part of the plant in the image; and
outputting the maintenance scheme of the plants.
17. A method for plant identification, comprising:
receiving an image, wherein the image comprises at least one part of the plant;
identifying a part of the plant in the image and a classification of the plant according to the image;
determining an output classification level according to the identified plant part in the image; and
outputting a classification of the respective classification level of the plant according to the determined classification level.
18. The method of 17, further comprising:
determining the output classification level as a genus in response to the part of the plant in the image being one of a trunk, a shoot, a seed, a bud, a fruit, and a seedling; and
determining the output classification level as a species in response to the part of the plant in the image being one of a leaf, a flower, a stem, and a root.
19. The method of 17, wherein the identifying is performed according to a subject in the image.
20. The method of 19, further comprising:
in response to the subject in the image being ambiguous, the subject being a non-plant, or the subject being a distant view of the entire plant, not performing the identification, and outputting information prompting the user to re-input the image.
21. A computer system for plant identification, comprising:
one or more processors; and
one or more memories configured to store a series of computer-executable instructions and computer-accessible data associated with the series of computer-executable instructions,
wherein the series of computer-executable instructions, when executed by the one or more processors, cause the computer system to perform the method of any of claims 1-20.
22. A non-transitory computer-readable storage medium having stored thereon a series of computer-executable instructions that, when executed by one or more computer systems, cause the one or more computer systems to perform the method of any one of claims 1-20.
Those skilled in the art will appreciate that the boundaries between the above-described operations are merely illustrative. Multiple operations may be combined into a single operation, a single operation may be distributed among additional operations, and operations may be performed at least partially overlapping in time. Moreover, alternative embodiments may include multiple instances of a particular operation, and the order of operations may be altered in various other embodiments. However, other modifications, variations, and alternatives are also possible. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
Although some specific embodiments of the present disclosure have been described in detail by way of example, it should be understood by those skilled in the art that the foregoing examples are for purposes of illustration only and are not intended to limit the scope of the present disclosure. The various embodiments disclosed herein may be combined in any combination without departing from the spirit and scope of the present disclosure. It will also be appreciated by those skilled in the art that various modifications may be made to the embodiments without departing from the scope and spirit of the disclosure. The scope of the present disclosure is defined by the appended claims.

Claims (10)

1. A method for plant identification, comprising:
receiving an image, wherein the image comprises at least one part of a plant;
identifying, from the image and based on a trained neural network model, a classification of the plant in the image and at least one of: a part of the plant in the image, a growing place of the plant, and a growth period of the plant; and
performing, according to an operation request of a user, the requested operation on the identified at least one item and the identified classification of the plant.
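The flow of claim 1 can be sketched as follows. This is a minimal illustration only, not part of the claims: `identify` is a stub standing in for the trained neural network model, and all names, fields, and example values are assumptions.

```python
# Sketch of the claim-1 flow: identify attributes, then dispatch the
# user's requested operation. `identify` stubs the trained model.
from typing import Any, Callable, Dict

def identify(image: Any) -> Dict[str, str]:
    """Stub for the trained model: returns the classification of the plant
    plus at least one of part / growing place / growth period."""
    return {"classification": "Rosa chinensis", "part": "flower"}

def handle_request(image: Any, operation: str,
                   operations: Dict[str, Callable[[Dict[str, str]], str]]) -> str:
    result = identify(image)              # identify classification + attribute(s)
    return operations[operation](result)  # perform the requested operation

# One possible operation: return the identified information as text (claim 2).
ops = {"info": lambda r: f"{r['classification']} ({r['part']})"}
print(handle_request(None, "info", ops))
```

The dispatch table lets further operations (e.g. a maintenance regimen, claim 3) be added without changing the identification step.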
2. The method of claim 1, wherein the operation request comprises a request for the identified information, the method further comprising:
outputting the identified classification of the plant and the identified at least one item.
3. The method of claim 1, wherein the operation request comprises a request for a maintenance regimen for the plant, the method further comprising:
determining a maintenance regimen for the plant based on the identified classification of the plant and the at least one item; and
outputting the maintenance scheme of the plants.
4. The method of claim 3, wherein the maintenance regimen comprises implementing at least one of: watering, water spraying, water changing, water adding, fertilizer application, pruning, weeding, pot changing, sun exposure, sun shading, temperature regulation, humidity regulation, overwintering protection, and pest control.
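Claims 3-4 determine a maintenance regimen from the identified classification and attributes. One simple realization is a rule table; the sketch below assumes a lookup keyed by classification and growth period, with species names and regimens invented for illustration.

```python
# Hypothetical rule table: (classification, growth period) -> maintenance steps.
MAINTENANCE = {
    ("Monstera deliciosa", "growing"): ["watering", "fertilizer application"],
    ("Monstera deliciosa", "dormant"): ["humidity regulation", "sun shading"],
}
DEFAULT_REGIMEN = ["watering"]  # generic fallback

def maintenance_regimen(classification: str, growth_period: str) -> list:
    # Fall back to a generic regimen when the exact combination is unknown.
    return MAINTENANCE.get((classification, growth_period), DEFAULT_REGIMEN)

print(maintenance_regimen("Monstera deliciosa", "dormant"))
```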
5. The method of claim 1, wherein the at least one item comprises a part of a plant in the image, the method further comprising:
determining a classification level for the classification of the plant according to the identified part of the plant in the image; and
adjusting the identified classification result of the plant according to the determined classification level.
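The adjustment of claim 5 can be read as truncating the classification to the taxonomic rank that the identified part can support (e.g. a flower may allow species-level identification while bark may only allow family level). The mapping below is an assumed example, not taken from the disclosure.

```python
# Assumed mapping: deepest taxonomic rank each identified part can support.
PART_TO_LEVEL = {"flower": "species", "leaf": "genus", "bark": "family"}
RANKS = ["family", "genus", "species"]

def adjust_classification(ranked: dict, part: str) -> dict:
    """Truncate a ranked result {'family': ..., 'genus': ..., 'species': ...}
    to the classification level determined from the identified part."""
    level = PART_TO_LEVEL.get(part, "family")
    return {rank: ranked[rank] for rank in RANKS[: RANKS.index(level) + 1]}

result = {"family": "Rosaceae", "genus": "Rosa", "species": "Rosa chinensis"}
print(adjust_classification(result, "leaf"))
```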
6. A method for plant identification, comprising:
receiving an image, wherein the image comprises at least one part of a plant;
identifying, from the image and based on a trained neural network model, a classification of the plant in the image, a part of the plant in the image, a growing place of the plant in the image, a growth period of the plant in the image, and an image quality of the image; and
selecting, according to an operation request of a user, one or more items from the identified contents and performing the requested operation on the selected items.
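Claim 6 identifies all five items in one pass and then selects only those needed for the request. A sketch, with `identify_all` standing in for a multi-task network (e.g. one output head per item); all names and example values are assumptions.

```python
def identify_all(image) -> dict:
    """Stub for a multi-task model producing all five items of claim 6."""
    return {
        "classification": "Ficus lyrata",
        "part": "leaf",
        "growing place": "indoor",
        "growth period": "growing",
        "image quality": "good",
    }

def select_for_request(image, requested: list) -> dict:
    results = identify_all(image)
    # Keep only the identified contents needed for the requested operation.
    return {key: results[key] for key in requested}

print(select_for_request(None, ["classification", "image quality"]))
```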
7. A method for plant identification, comprising:
identifying, from an image and based on a trained neural network model, a classification of a plant and at least two of: a growing place of the plant, a growth period of the plant, and a part of the plant in the image, wherein the image comprises at least one part of the plant;
determining a maintenance regimen for the plant based on the identified classification of the plant and the identified at least two items; and
outputting the maintenance regimen for the plant.
8. A method for plant identification, comprising:
receiving an image, wherein the image comprises at least one part of a plant;
identifying, from the image, a part of the plant in the image and a classification of the plant;
determining an output classification level according to the identified part of the plant in the image; and
outputting the classification of the plant at the determined output classification level.
9. A computer system for plant identification, comprising:
one or more processors; and
one or more memories configured to store a series of computer-executable instructions and computer-accessible data associated with the series of computer-executable instructions,
wherein the series of computer-executable instructions, when executed by the one or more processors, cause the computer system to perform the method of any of claims 1-8.
10. A non-transitory computer-readable storage medium having stored thereon a series of computer-executable instructions that, when executed by one or more computer systems, cause the one or more computer systems to perform the method of any one of claims 1-8.
CN202110658721.1A 2021-06-15 2021-06-15 Method and computer system for plant identification Pending CN113298180A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110658721.1A CN113298180A (en) 2021-06-15 2021-06-15 Method and computer system for plant identification
PCT/CN2022/096706 WO2022262586A1 (en) 2021-06-15 2022-06-01 Method for plant identification, computer system and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110658721.1A CN113298180A (en) 2021-06-15 2021-06-15 Method and computer system for plant identification

Publications (1)

Publication Number Publication Date
CN113298180A true CN113298180A (en) 2021-08-24

Family

ID=77328225

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110658721.1A Pending CN113298180A (en) 2021-06-15 2021-06-15 Method and computer system for plant identification

Country Status (2)

Country Link
CN (1) CN113298180A (en)
WO (1) WO2022262586A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022262586A1 (en) * 2021-06-15 2022-12-22 杭州睿胜软件有限公司 Method for plant identification, computer system and computer-readable storage medium
WO2024027476A1 (en) * 2022-08-03 2024-02-08 杭州睿胜软件有限公司 Identification processing method and system for plant image, and storage medium

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106303434A (en) * 2016-08-23 2017-01-04 深圳前海弘稼科技有限公司 The control method of plantation equipment, server and plantation equipment
JP6307680B1 (en) * 2017-05-24 2018-04-11 節三 田中 Plant health diagnosis system
CN109522858A (en) * 2018-11-26 2019-03-26 Oppo广东移动通信有限公司 Plant disease detection method, device and terminal device
CN109840549A (en) * 2019-01-07 2019-06-04 武汉南博网络科技有限公司 A kind of pest and disease damage recognition methods and device
CN110555416A (en) * 2019-09-06 2019-12-10 杭州睿琪软件有限公司 Plant identification method and device
CN110619349A (en) * 2019-08-12 2019-12-27 深圳市识农智能科技有限公司 Plant image classification method and device
CN111191552A (en) * 2019-12-23 2020-05-22 合肥美的智能科技有限公司 Image recognition method based on visual terminal and visual terminal
CN111242178A (en) * 2020-01-02 2020-06-05 杭州睿琪软件有限公司 Object identification method, device and equipment
CN111325240A (en) * 2020-01-23 2020-06-23 杭州睿琪软件有限公司 Weed-related computer-executable method and computer system
CN111513673A (en) * 2019-02-01 2020-08-11 百度在线网络技术(北京)有限公司 Image-based growth state monitoring method, device, equipment and storage medium
CN112270297A (en) * 2020-11-13 2021-01-26 杭州睿琪软件有限公司 Method and computer system for displaying recognition result
CN112287790A (en) * 2020-10-20 2021-01-29 北京字跳网络技术有限公司 Image processing method, image processing device, storage medium and electronic equipment
CN112673785A (en) * 2020-12-09 2021-04-20 珠海格力电器股份有限公司 Method, device and system for generating plant feeding strategy
CN112784925A (en) * 2021-02-08 2021-05-11 杭州睿胜软件有限公司 Method, computer system and electronic equipment for object recognition

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110070101B (en) * 2019-03-12 2024-05-14 平安科技(深圳)有限公司 Plant species identification method and device, storage medium and computer equipment
CN110378303B (en) * 2019-07-25 2021-07-09 杭州睿琪软件有限公司 Method and system for object recognition
CN113298180A (en) * 2021-06-15 2021-08-24 杭州睿胜软件有限公司 Method and computer system for plant identification

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XU Zhanhui et al.: "Evaluation of the identification abilities of eight commonly used plant identification apps in China", Biodiversity Science *

Also Published As

Publication number Publication date
WO2022262586A1 (en) 2022-12-22

Similar Documents

Publication Publication Date Title
US20180322353A1 (en) Systems and methods for electronically identifying plant species
CN109583301B (en) Method and device for predicting optimal external planting conditions in crop growth process
WO2022262586A1 (en) Method for plant identification, computer system and computer-readable storage medium
WO2021147528A1 (en) Computer-executable method relating to weeds and computer system
CN107392238A (en) Outdoor knowledge of plants based on moving-vision search expands learning system
CN108073947B (en) Method for identifying blueberry varieties
EP3032473A1 (en) Method and system for classifying plant disease through crowdsourcing using a mobile communication device
WO2023138298A1 (en) Method and apparatus for determining whether container of plant is suitable for plant maintenance
CN115203451A (en) Recognition processing method, system and storage medium for plant image
CN113313193A (en) Plant picture identification method, readable storage medium and electronic device
CN114170509A (en) Plant identification method, plant identification device and plant identification system
WO2021205442A1 (en) Methods for artificial pollination and apparatus for doing the same
CN113744226A (en) Intelligent agricultural pest identification and positioning method and system
Wang et al. Online recognition and yield estimation of tomato in plant factory based on YOLOv3
CN114399108A (en) Tea garden yield prediction method based on multi-mode information
CN110705698B (en) Target counting depth network design method for scale self-adaptive perception
CN112328771A (en) Service information output method, device, server and storage medium
Mithra et al. Cucurbitaceous family flower inferencing using deep transfer learning approaches: CuCuFlower UAV imagery data
CN114841955A (en) Biological species identification method, device, equipment and storage medium
CN114120117A (en) Method and system for displaying plant disease diagnosis information and readable storage medium
CN114937030A (en) Phenotypic parameter calculation method for intelligent agricultural planting of lettuce
CN111814592B (en) Plant leaf identification method based on stackable capsule network
CN113886620A (en) Plant recommendation method, readable storage medium and electronic device
Dinca et al. Halyomorpha Halys Detection in Orchard from UAV Images Using Convolutional Neural Networks
KR102281983B1 (en) Talk with plant service method using mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination