CN112001242B - Intelligent gardening management method and device - Google Patents
Intelligent gardening management method and device
- Publication number
- CN112001242B (Application number: CN202010688777.7A)
- Authority
- CN
- China
- Prior art keywords
- plant
- information
- leaf
- stem
- image information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/56—Extraction of image or video features relating to colour
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- Computing Systems (AREA)
- Multimedia (AREA)
- Artificial Intelligence (AREA)
- Mathematical Physics (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- General Health & Medical Sciences (AREA)
- Computational Linguistics (AREA)
- Biophysics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Medical Informatics (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
The invention provides an intelligent gardening management method and device, which comprises the steps of obtaining image information of a plant; performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; inputting the first identification region into a first training model, wherein the first training model is obtained by training multiple groups of training data, and each group of training data comprises the first identification region and preset leaf and stem state grade information; obtaining output information of the first training model, wherein the output information comprises the leaf and stem state grade information corresponding to the first identification region; and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification region to obtain first plant image information, wherein the first plant is a plant to be managed. The technical effects of raising the level of intelligence in gardening management, reducing manpower consumption and improving the management efficiency of horticultural plants are thereby achieved.
Description
Technical Field
The invention relates to the technical field of garden irrigation, in particular to an intelligent garden management method and device.
Background
In the field of garden irrigation, gardeners are often required to judge, according to personal experience, whether specific treatments such as watering, fertilizing, pest control and the like need to be carried out on individual plants.
However, the applicant of the present invention finds that the prior art has at least the following technical problems:
the garden management means in the prior art require manual care by professionals and consume a large amount of manpower, resulting in the technical problems of a low level of intelligence in horticultural management and a poor management effect.
Disclosure of Invention
The embodiment of the invention provides an intelligent gardening management method and device, which solve the technical problems that, in the prior art, garden management means require manual observation by professionals and consume a large amount of manpower, so that the intelligent level of gardening management is low and the management effect is poor.
In view of the above problems, embodiments of the present application are proposed to provide an intelligent gardening management method and apparatus.
In a first aspect, the present invention provides an intelligent gardening management method, comprising: obtaining image information of a plant; performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information; obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region; and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
Preferably, after obtaining the image information of the plant, the method includes: performing feature extraction on the image information of the plant to obtain a second identification area, wherein the second identification area comprises soil information of the plant; inputting the second recognition area into a second training model, wherein the second training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the second identification area and preset soil state grade information; obtaining output information of the second training model, wherein the output information comprises soil state grade information corresponding to the second identification area; and screening and filtering the image information of the plant according to the soil state grade information corresponding to the second identification area to obtain second plant image information, wherein the second plant is a plant to be managed.
Preferably, the first identification region and the second identification region are obtained by performing feature extraction on the plant image information through a feature extraction layer.
Preferably, the method comprises: according to the leaf and stem information, obtaining water content information of the leaves and stems and integrity information of the leaves and stems; and taking the preset leaf and stem state grade information as first supervision data, and obtaining leaf and stem state grade information corresponding to the first identification region according to the water content information of the leaves and stems and the leaf and stem integrity information.
Preferably, the method comprises: according to the soil information, acquiring the wettability information and the nutrient content information of the soil; and taking the preset soil state grade information as second supervision data, and obtaining soil state grade information corresponding to the second identification area according to the wettability information and the nutrient content information of the soil.
Preferably, the screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification region to obtain first plant image information includes: obtaining a predetermined plant health status criterion; judging whether the leaf and stem state grade information corresponding to the first identification region meets the preset plant health state standard or not; if the leaf and stem state grade information corresponding to the first identification region meets the preset plant health state standard, filtering plant image information corresponding to the first identification region; and if the leaf and stem state grade information corresponding to the first identification region does not meet the preset plant health state standard, determining that the plant image information corresponding to the first identification region is first plant information, and performing targeted management on the first plant.
In a second aspect, the present invention provides an intelligent gardening management device, comprising:
a first obtaining unit for obtaining image information of a plant;
a second obtaining unit, configured to perform feature extraction on the image information of the plant to obtain a first identification region, where the first identification region includes information of leaves and stems of the plant;
a first input unit, configured to input the first recognition area into a first training model, where the first training model is obtained by training multiple sets of training data, and each set of training data in the multiple sets includes: the first identification area and preset leaf and stem state grade information;
a third obtaining unit, configured to obtain output information of the first training model, where the output information includes leaf and stem state grade information corresponding to the first identification region;
and a fourth obtaining unit, configured to perform screening and filtering on the image information of the plant according to the leaf and stem state grade information corresponding to the first identification region, so as to obtain first plant image information, where the first plant is a plant to be managed.
Preferably, the apparatus further comprises:
a fifth obtaining unit, configured to perform feature extraction on the image information of the plant to obtain a second identification region, where the second identification region includes soil information of the plant;
a second input unit, configured to input the second recognition area into a second training model, where the second training model is obtained by training multiple sets of training data, and each set of training data in the multiple sets includes: the second identification area and preset soil state grade information;
- a sixth obtaining unit, configured to obtain output information of the second training model, where the output information includes soil state grade information corresponding to the second recognition area;
and the seventh obtaining unit is used for screening and filtering the image information of the plant according to the soil state grade information corresponding to the second identification area to obtain second plant image information, wherein the second plant is a plant to be managed.
Preferably, the device further includes that the first identification region and the second identification region are obtained by performing feature extraction on the plant image information through a feature extraction layer.
Preferably, the apparatus further comprises:
an eighth obtaining unit, configured to obtain, according to the leaf and stem information, water content information of the leaf and stem and integrity information of the leaf and stem;
a ninth obtaining unit, configured to obtain, by using the preset leaf-stem state grade information as first supervision data, leaf-stem state grade information corresponding to the first identification region according to the water content information of the leaf and the leaf-stem integrity information.
Preferably, the apparatus further comprises:
a tenth obtaining unit, configured to obtain, according to the soil information, wettability information and nutrient content information of the soil;
an eleventh obtaining unit, configured to obtain soil state grade information corresponding to the second identification area according to the information on the wettability and the content of nutrients of the soil, using the preset soil state grade information as second supervision data.
Preferably, the apparatus further comprises:
a twelfth obtaining unit for obtaining a predetermined plant health status criterion;
the first judging unit is used for judging whether the leaf and stem state grade information corresponding to the first identification area meets the preset plant health state standard or not;
the first filtering unit is used for filtering the plant image information corresponding to the first identification region if the leaf and stem state grade information corresponding to the first identification region meets the preset plant health state standard;
and the first determining unit is used for determining that the plant image information corresponding to the first identification region is first plant information and performing targeted management on the first plant if the leaf and stem state grade information corresponding to the first identification region does not meet the preset plant health state standard.
In a third aspect, the present invention provides an intelligent gardening management device, comprising a memory, a processor and a computer program stored on the memory and operable on the processor, wherein the processor executes the program to implement the following steps: obtaining image information of a plant; performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information; obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region; and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
In a fourth aspect, the present invention provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of: obtaining image information of a plant; performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information; obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region; and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
One or more technical solutions in the embodiments of the present application have at least one or more of the following technical effects:
According to the intelligent gardening management method and device provided by the embodiment of the invention, image information of a plant is obtained; feature extraction is performed on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; the first identification region is input into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data comprises the first identification region and preset leaf and stem state grade information; output information of the first training model is obtained, wherein the output information comprises the leaf and stem state grade information corresponding to the first identification region; and the image information of the plant is screened and filtered according to the leaf and stem state grade information corresponding to the first identification region to obtain first plant image information, wherein the first plant is a plant to be managed. This solves the technical problems that the garden management means in the prior art need manual supervision by professionals and consume a large amount of manpower, so that the intelligent level of gardening management is low and the management effect is poor, and achieves the technical effects of raising the level of intelligence, reducing manpower consumption and improving the management efficiency of horticultural plants.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
Fig. 1 is a schematic flow chart illustrating an intelligent gardening management method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an intelligent gardening management device according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of another intelligent gardening management device according to an embodiment of the present invention.
Description of reference numerals: a first obtaining unit 11, a second obtaining unit 12, a first input unit 13, a third obtaining unit 14, a fourth obtaining unit 15, a bus 300, a receiver 301, a processor 302, a transmitter 303, a memory 304, and a bus interface 306.
Detailed Description
The embodiment of the invention provides an intelligent gardening management method and device, which are used for solving the technical problems that a garden management means in the prior art needs manual supervision of professionals and consumes a large amount of manpower, so that the intelligent level of gardening management is low and the management effect is poor.
The technical scheme provided by the invention has the following general idea:
obtaining image information of a plant; performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; inputting the first identification region into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data comprises the first identification region and preset leaf and stem state grade information; obtaining output information of the first training model, wherein the output information comprises the leaf and stem state grade information corresponding to the first identification region; and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification region to obtain first plant image information, wherein the first plant is a plant to be managed. In this way, the technical effects of raising the level of intelligence in gardening management, reducing manpower consumption and improving the management efficiency of horticultural plants are achieved.
The technical solutions of the present invention are described in detail below with reference to the drawings and specific embodiments, and it should be understood that the specific features in the embodiments and examples of the present invention are described in detail in the technical solutions of the present application, and are not limited to the technical solutions of the present application, and the technical features in the embodiments and examples of the present application may be combined with each other without conflict.
The term "and/or" herein is merely an association relationship describing an associated object, and means that there may be three relationships, for example, a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
Example one
Fig. 1 is a schematic flow chart illustrating an intelligent gardening management method according to an embodiment of the present invention. As shown in fig. 1, an embodiment of the present invention provides an intelligent gardening management method, including:
step 110: obtaining image information of a plant;
step 120: performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant;
specifically, the image information of the plant may be acquired by an image acquisition device, and in terms of the type of the simulation image, the image may be a video image obtained by capturing a still image or a sequence of video frames, a signal trend graph obtained by a detection device, or a synthesized image. The first identification region is obtained by performing feature extraction on the obtained image information of the plant to be identified, and according to the needs of practical application, the number of the first identification regions can be one or more, and information feature extraction is performed from the extracted first identification region. The method can only train the interested region aiming at the feature extraction of the interested region in each plant image information, thereby achieving the effect of improving the training efficiency.
Step 130: inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets of training data includes: the first identification area and preset leaf and stem state grade information;
step 140: obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region;
specifically, a training model, namely a Neural network model in machine learning, a Neural Network (NN) is a complex network system formed by a large number of simple processing units (called neurons) which are widely connected with each other, reflects many basic features of human brain functions, and is a highly complex nonlinear dynamical learning system. The neural network has the capabilities of large-scale parallel, distributed storage and processing, self-organization, self-adaptation and self-learning, and is particularly suitable for processing the inaccurate and fuzzy information processing problems which need to consider many factors and conditions simultaneously. Neural network models are described based on mathematical models of neurons. Artificial neural networks (artificalnearl new tokr) s, are a description of the first-order properties of the human brain system. Briefly, it is a mathematical model. The neural network model is represented by a network topology, node characteristics, and learning rules. In the embodiment of the application, the first identification region is used as input data and is input into a training model, each set of input training data comprises the first identification region and preset leaf and stem state grade information, and the preset leaf and stem state grade information is used as supervision data, so that the first identification region is trained, and output data, namely the leaf and stem state grade information corresponding to the first identification region, is obtained.
Step 150: and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
Specifically, after the leaf and stem state grade information corresponding to the first identification area in each piece of plant image information is obtained, a standard can be established according to actual needs, and all plant image information is screened against that standard to obtain the plant image information meeting the requirements, namely the first plant image information. The requirements can be determined according to the actual situation: for example, plants that need watering, fertilizing and the like may be screened out so that they can be managed in a targeted manner, and withered or dead plants may also be screened out so that they can be cleared away in time.
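A minimal sketch of such screening is shown below, assuming the grades are ordered so that larger values mean healthier plants and that one grade value is reserved for withered or dead plants; both the threshold and the grade encoding are illustrative assumptions.

```python
# Sketch of step 150: screening and filtering plant images by leaf-and-stem state grade.
HEALTHY_THRESHOLD = 3   # assumed: grades >= 3 are treated as healthy and filtered out
WITHERED_GRADE = 0      # assumed: grade 0 marks withered or dead plants to be cleared

def screen_plants(graded_images):
    """graded_images: iterable of (plant_id, image, grade)."""
    to_manage, to_clear = [], []
    for plant_id, image, grade in graded_images:
        if grade == WITHERED_GRADE:
            to_clear.append((plant_id, image))    # schedule for timely removal
        elif grade < HEALTHY_THRESHOLD:
            to_manage.append((plant_id, image))   # "first plant image information"
        # healthy plants are filtered out and need no treatment
    return to_manage, to_clear
```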
Preferably, after obtaining the image information of the plant, the method includes: performing feature extraction on the image information of the plant to obtain a second identification area, wherein the second identification area comprises soil information of the plant; inputting the second recognition area into a second training model, wherein the second training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets of training data includes: the second identification area and preset soil state grade information; obtaining output information of the second training model, wherein the output information comprises soil state grade information corresponding to the second recognition area; and screening and filtering the image information of the plant according to the soil state grade information corresponding to the second identification area to obtain second plant image information, wherein the second plant is a plant to be managed.
Specifically, the second recognition area is used as input data and is input into a training model; each set of input training data comprises the second recognition area and the preset soil state grade information, and the preset soil state grade information is used as supervision data, so that the second recognition area is trained and output data, namely the soil state grade information corresponding to the second recognition area, is obtained. After the soil state grade information corresponding to the second identification area in each piece of plant image information is obtained, a standard can be made according to actual needs and all plant image information is screened against that standard, so that the plant image information meeting the requirements, namely the second plant image information, is obtained. The requirements can be determined according to the actual situation: for example, plants that need watering, fertilizing and the like may be screened out so that they can be managed in a targeted manner, and withered or dead plants may also be screened out so that they can be cleared away in time. The embodiment of the application trains on the two characteristic regions of the leaves and stems and of the soil separately, so that the health state of the plant can be identified more comprehensively and the plant can be treated in a targeted manner according to its health state.
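The following sketch illustrates, under assumptions, how the two branches could be combined per plant: each model is assumed to map a cropped region to an integer grade (higher meaning better), the extractor functions are assumed to return lists of cropped regions, and a plant is flagged for leaf/stem care or soil care when the corresponding grade falls below its standard. The decision rule and names are illustrative.

```python
# Sketch: combining the leaf-and-stem branch and the soil branch for one plant image.
def classify_plant(image, leaf_stem_model, soil_model, extract_leaf_stem, extract_soil,
                   leaf_stem_standard=3, soil_standard=3):
    # Worst (lowest) grade over all detected regions determines the plant's state.
    leaf_grade = min((leaf_stem_model(r) for r in extract_leaf_stem(image)), default=None)
    soil_grade = min((soil_model(r) for r in extract_soil(image)), default=None)
    needs_leaf_care = leaf_grade is not None and leaf_grade < leaf_stem_standard
    needs_soil_care = soil_grade is not None and soil_grade < soil_standard
    return needs_leaf_care, needs_soil_care
```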
Preferably, the first identification region and the second identification region are obtained by performing feature extraction on the plant image information through a feature extraction layer.
Specifically, feature extraction is performed on the plant image information, and the feature extraction or target object recognition is carried out through a neural network model, which may be any suitable neural network capable of feature extraction or target object recognition, including but not limited to a convolutional neural network, a reinforcement-learning neural network, the generator network of a generative adversarial network, and the like. The neural network model comprises a feature extraction layer and can be partially adjusted according to actual needs, where the feature extraction layer may be a convolutional neural network. The image to be identified, namely the plant image information, is passed through the feature extraction layer to obtain the first identification region of the image to be identified, and the leaf and stem state grade information, including features such as the water content of the leaves and stems, whether the leaves and stems have been eaten by insects, and whether the leaves and stems are damaged, is obtained from the first identification region.
Preferably, the method comprises: according to the leaf and stem information, obtaining water content information of the leaves and stems and integrity information of the leaves and stems; and taking the preset leaf and stem state grade information as first supervision data, and obtaining leaf and stem state grade information corresponding to the first identification area according to the water content information of the leaves and stems and the leaf and stem integrity information.
Specifically, the corresponding water content information and leaf-and-stem integrity information can be obtained from the first identification region according to the colour and shape of the leaves and stems. The preset leaf and stem state grade information comprises preset water content grade information for the water content and preset leaf-and-stem integrity grade information for the leaf-and-stem integrity. The preset leaf and stem state grade information is used as the first supervision data and the input first identification region is trained, so that the output data are obtained: the leaf and stem water content and the leaf-and-stem integrity are combined to give the leaf and stem state grade information in one-to-one correspondence with the plant image information.
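The heuristics below are illustrative stand-ins (not the patented computation) for these two cues: water content approximated from colour saturation, and leaf-and-stem integrity approximated from the solidity of the leaf contour, where insect damage or breakage lowers the value.

```python
# Illustrative proxies for the leaf-and-stem features described above (assumptions only).
import cv2
import numpy as np

def leaf_water_content_proxy(region_bgr: np.ndarray) -> float:
    """Mean HSV saturation scaled to [0, 1]; drier leaves tend to look duller."""
    hsv = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2HSV)
    return float(hsv[:, :, 1].mean() / 255.0)

def leaf_integrity_proxy(leaf_mask: np.ndarray) -> float:
    """Solidity (area / convex-hull area) of the largest contour; damage lowers it."""
    contours, _ = cv2.findContours(leaf_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0.0
    c = max(contours, key=cv2.contourArea)
    hull_area = cv2.contourArea(cv2.convexHull(c))
    return float(cv2.contourArea(c) / hull_area) if hull_area > 0 else 0.0
```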
Preferably, the method comprises: according to the soil information, acquiring the wettability information and the nutrient content information of the soil; and taking the preset soil state grade information as second supervision data, and obtaining soil state grade information corresponding to the second identification area according to the soil wettability information and the nutrient content information.
Specifically, the corresponding wettability information and nutrient content information can be obtained from the second identification area according to the colour of the soil and the degree of dispersion of the soil particles; for example, soil with higher wettability is usually darker than soil with lower wettability, and its graininess is less pronounced. The preset soil state grade information comprises preset wettability grade information for the wettability and preset nutrient grade information for the nutrient content. The preset soil state grade information is used as the second supervision data and the input second identification area is trained, so that the output data are obtained: the wettability and the nutrient content of the soil are combined to give the soil state grade information in one-to-one correspondence with the plant image information.
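As with the leaf features, the following are illustrative stand-ins for the soil cues described above, under the stated assumptions that wetter soil photographs darker and that dry, loose soil shows a grainier texture; the statistics chosen are assumptions, not the patented method.

```python
# Illustrative proxies for the soil cues: darkness for wettability, local contrast for
# graininess. Both are assumptions made for the sketch.
import cv2
import numpy as np

def soil_wettability_proxy(soil_region_bgr: np.ndarray) -> float:
    """1 - normalised mean brightness: darker soil -> higher wettability score."""
    gray = cv2.cvtColor(soil_region_bgr, cv2.COLOR_BGR2GRAY)
    return float(1.0 - gray.mean() / 255.0)

def soil_granularity_proxy(soil_region_bgr: np.ndarray) -> float:
    """Local contrast (Laplacian variance); loose dry particles give higher values."""
    gray = cv2.cvtColor(soil_region_bgr, cv2.COLOR_BGR2GRAY)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())
```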
The method integrates two identification areas of the soil part and the leaves and the stems of the plant, can identify the growth state of the plant more completely and pertinently, and judges the health condition of each plant, so that the judgment result is more accurate and reliable.
Preferably, the screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification region to obtain the first plant image information includes: obtaining a predetermined plant health status standard; judging whether the leaf and stem state grade information corresponding to the first identification area meets the preset plant health state standard or not; if the leaf and stem state grade information corresponding to the first identification region meets the preset plant health state standard, filtering plant image information corresponding to the first identification region; and if the leaf and stem state grade information corresponding to the first identification region does not meet the preset plant health state standard, determining that the plant image information corresponding to the first identification region is first plant information, and performing targeted management on the first plant.
Specifically, the predetermined plant health status standard may be a criterion established for plant health according to actual requirements. Once the leaf and stem state grade information corresponding to the first identification area has been obtained, it is compared with the predetermined plant health status standard to determine whether the plant meets that standard. If the standard is met, the plant is growing well and no special treatment is required. If the leaf and stem state grade information corresponding to the first identification area does not meet the predetermined plant health status standard, the plant is marked as first plant information, that is, a plant in poor health, and specific treatment such as watering, fertilizing or applying pesticide is required. This achieves the technical effects of managing garden and horticultural plants intelligently: manpower is saved, management is more intelligent, and the plants grow more healthily.
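A sketch of this targeted-management step is given below, assuming the predetermined plant health status standard is a simple grade threshold and that low leaf/stem, soil-wettability and soil-nutrient grades map to watering, fertilizing and pest treatment respectively; the mapping and names are illustrative.

```python
# Sketch: deciding targeted management actions from the grade information.
def plan_actions(leaf_grade: int, soil_wet_grade: int, soil_nutrient_grade: int,
                 health_standard: int = 3):
    actions = []
    if (leaf_grade >= health_standard
            and min(soil_wet_grade, soil_nutrient_grade) >= health_standard):
        return actions                           # meets the standard: filter out, no treatment
    if soil_wet_grade < health_standard:
        actions.append("water")                  # dry soil -> irrigate
    if soil_nutrient_grade < health_standard:
        actions.append("fertilize")              # nutrient-poor soil -> fertilize
    if leaf_grade < health_standard:
        actions.append("apply pesticide or prune")  # damaged leaves/stems -> treat directly
    return actions
```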
Example two
Based on the same inventive concept as the intelligent gardening management method in the previous embodiment, the present invention further provides an intelligent gardening management device, as shown in fig. 2, the device comprising:
a first obtaining unit 11, wherein the first obtaining unit 11 is used for obtaining image information of a plant;
a second obtaining unit 12, where the second obtaining unit 12 is configured to perform feature extraction on the image information of the plant to obtain a first identification region, where the first identification region includes information of a leaf and a stem of the plant;
a first input unit 13, where the first input unit 13 is configured to input the first recognition area into a first training model, where the first training model is obtained by training multiple sets of training data, and each set of training data in the multiple sets includes: the first identification area and preset leaf and stem state grade information;
a third obtaining unit 14, where the third obtaining unit 14 is configured to obtain output information of the first training model, where the output information includes leaf and stem state grade information corresponding to the first identification region;
a fourth obtaining unit 15, where the fourth obtaining unit 15 is configured to perform screening and filtering on the image information of the plant according to the leaf and stem state grade information corresponding to the first identification region, so as to obtain first plant image information, where the first plant is a plant to be managed.
Preferably, the apparatus further comprises:
a fifth obtaining unit, configured to perform feature extraction on the image information of the plant to obtain a second identification area, where the second identification area includes soil information of the plant;
a second input unit, configured to input the second recognition area into a second training model, where the second training model is obtained by training multiple sets of training data, and each set of training data in the multiple sets includes: the second identification area and preset soil state grade information;
- a sixth obtaining unit, configured to obtain output information of the second training model, where the output information includes soil state grade information corresponding to the second recognition area;
and the seventh obtaining unit is used for screening and filtering the image information of the plant according to the soil state grade information corresponding to the second identification area to obtain second plant image information, wherein the second plant is a plant to be managed.
Preferably, the device further includes that the first identification area and the second identification area are obtained by feature extraction of the plant image information through a feature extraction layer.
Preferably, the apparatus further comprises:
an eighth obtaining unit, configured to obtain, according to the leaf and stem information, water content information of the leaf and stem and integrity information of the leaf and stem;
a ninth obtaining unit, configured to obtain, by using the preset leaf-stem state grade information as first supervision data, leaf-stem state grade information corresponding to the first identification region according to the water content information of the leaf and the leaf-stem integrity information.
Preferably, the apparatus further comprises:
a tenth obtaining unit, configured to obtain, according to the soil information, wettability information and nutrient content information of the soil;
and the eleventh obtaining unit is used for taking the preset soil state grade information as second supervision data, and obtaining soil state grade information corresponding to the second identification area according to the wettability information and the nutrient content information of the soil.
Preferably, the apparatus further comprises:
a twelfth obtaining unit for obtaining a predetermined plant health status criterion;
the first judging unit is used for judging whether the leaf and stem state grade information corresponding to the first identification area meets the preset plant health state standard or not;
the first filtering unit is used for filtering the plant image information corresponding to the first identification region if the leaf and stem state grade information corresponding to the first identification region meets the preset plant health state standard;
and the first determining unit is used for determining that the plant image information corresponding to the first identification region is first plant information and performing targeted management on the first plant if the leaf and stem state grade information corresponding to the first identification region does not meet the preset plant health state standard.
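For orientation, the sketch below arranges the units listed above as one class whose methods play their roles, assuming the training model maps a region to an integer grade with higher values meaning healthier; the class and method names are assumptions, since the patent defines the units only functionally.

```python
# Structural sketch of the device in fig. 2 (names and grade convention are assumptions).
class IntelligentGardeningDevice:
    def __init__(self, feature_extractor, training_model, health_standard: int = 3):
        self.feature_extractor = feature_extractor   # yields first identification regions
        self.training_model = training_model         # maps a region to a leaf/stem grade
        self.health_standard = health_standard

    def manage(self, plant_images):
        """Return the 'first plant image information': plants that need management."""
        to_manage = []
        for image in plant_images:                        # first obtaining unit
            for region in self.feature_extractor(image):  # second obtaining unit
                grade = self.training_model(region)       # first input / third obtaining unit
                if grade < self.health_standard:          # fourth obtaining unit (screening)
                    to_manage.append(image)
                    break
        return to_manage
```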
Various modifications and embodiments of the aforementioned intelligent gardening management method in the first embodiment of fig. 1 are also applicable to the intelligent gardening management device of this embodiment, and the implementation of the intelligent gardening management device in this embodiment will be apparent to those skilled in the art from the foregoing detailed description of the intelligent gardening management method, and therefore, for the sake of brevity of the description, detailed description is omitted here.
EXAMPLE III
Based on the same inventive concept as the intelligent gardening management method in the previous embodiment, the present invention further provides an intelligent gardening management device having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of any one of the aforementioned intelligent gardening management methods.
In fig. 3, a bus architecture (represented by bus 300) is shown. Bus 300 may include any number of interconnected buses and bridges, and links together various circuits including one or more processors, represented by processor 302, and memory, represented by memory 304. The bus 300 may also link together various other circuits such as peripherals, voltage regulators and power management circuits, which are well known in the art and are therefore not described further herein. A bus interface 306 provides an interface between the bus 300 and the receiver 301 and transmitter 303. The receiver 301 and the transmitter 303 may be one and the same element, i.e. a transceiver, providing a means for communicating with various other apparatus over a transmission medium.
The processor 302 is responsible for managing the bus 300 and general processing, and the memory 304 may be used for storing data used by the processor 302 in performing operations.
Example four
Based on the same inventive concept as the method of intelligent gardening management in the previous embodiment, the present invention also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of: obtaining image information of a plant; performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets of training data includes: the first identification area and preset leaf and stem state grade information; obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region; and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
In this embodiment, the program may further implement any one of the method steps of the first embodiment when the program is executed by a processor.
One or more technical solutions in the embodiments of the present application have at least one or more of the following technical effects:
According to the intelligent gardening management method and device provided by the embodiment of the invention, image information of a plant is obtained; feature extraction is performed on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; the first identification region is input into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data comprises the first identification region and preset leaf and stem state grade information; output information of the first training model is obtained, wherein the output information comprises the leaf and stem state grade information corresponding to the first identification region; and the image information of the plant is screened and filtered according to the leaf and stem state grade information corresponding to the first identification region to obtain first plant image information, wherein the first plant is a plant to be managed. This solves the technical problems that the garden management means in the prior art need manual supervision by professionals and consume a large amount of manpower, so that the intelligent level of gardening management is low and the management effect is poor, and achieves the technical effects of raising the level of intelligence, reducing manpower consumption and improving the management efficiency of horticultural plants.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention has been described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.
Claims (7)
1. An intelligent horticultural management method, the method comprising:
obtaining image information of a plant;
performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant;
inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information;
obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region;
screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed;
after the obtaining of the image information of the plant, the method comprises the following steps:
performing feature extraction on the image information of the plant to obtain a second identification area, wherein the second identification area comprises soil information of the plant;
inputting the second recognition area into a second training model, wherein the second training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets of training data includes: the second identification area and preset soil state grade information;
obtaining output information of the second training model, wherein the output information comprises soil state grade information corresponding to the second identification area;
screening and filtering the image information of the plants according to the soil state grade information corresponding to the second identification area to obtain second plant image information, wherein the second plants are plants to be managed;
according to the leaf and stem information, obtaining water content information of the leaf and stem and integrity information of the leaf and stem;
and taking the preset leaf and stem state grade information as first supervision data, and obtaining leaf and stem state grade information corresponding to the first identification region according to the water content information of the leaves and stems and the leaf and stem integrity information.
2. The method of claim 1, wherein the first identification region and the second identification region are obtained by feature extraction of the plant image information by a feature extraction layer.
3. The method of claim 1, wherein the method comprises:
according to the soil information, obtaining the wettability information and the nutrient content information of the soil;
and taking the preset soil state grade information as second supervision data, and obtaining soil state grade information corresponding to the second identification area according to the wettability information and the nutrient content information of the soil.
4. The method as claimed in claim 1, wherein the step of performing filtering on the image information of the plant according to the leaf and stem status grade information corresponding to the first identification region to obtain the first plant image information comprises:
obtaining a predetermined plant health status standard;
judging whether the leaf and stem state grade information corresponding to the first identification area meets the preset plant health state standard or not;
if the leaf and stem state grade information corresponding to the first identification region meets the preset plant health state standard, filtering the plant image information corresponding to the first identification region;
and if the leaf and stem state grade information corresponding to the first identification region does not meet the preset plant health state standard, determining that the plant image information corresponding to the first identification region is first plant information, and performing targeted management on the first plant.
5. An intelligent gardening management device, wherein the device comprises:
a first obtaining unit for obtaining image information of a plant;
a second obtaining unit, configured to perform feature extraction on the image information of the plant to obtain a first identification region, where the first identification region includes information of leaves and stems of the plant;
a first input unit, configured to input the first recognition area into a first training model, where the first training model is obtained by training multiple sets of training data, and each set of training data in the multiple sets includes: the first identification area and preset leaf and stem state grade information;
a third obtaining unit, configured to obtain output information of the first training model, where the output information includes leaf and stem state grade information corresponding to the first identification region;
and a fourth obtaining unit, configured to perform screening and filtering on the image information of the plant according to the leaf and stem state grade information corresponding to the first identification region, so as to obtain first plant image information, where the first plant is a plant to be managed.
6. An intelligent gardening management device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program implements the steps of:
obtaining image information of a plant;
performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant;
inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information;
obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region;
and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
7. A computer-readable storage medium, on which a computer program is stored, which program, when executed by a processor, carries out the steps of:
obtaining image information of a plant;
performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant;
inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets of training data includes: the first identification area and preset leaf and stem state grade information;
obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region;
and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010688777.7A CN112001242B (en) | 2020-07-16 | 2020-07-16 | Intelligent gardening management method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010688777.7A CN112001242B (en) | 2020-07-16 | 2020-07-16 | Intelligent gardening management method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112001242A CN112001242A (en) | 2020-11-27 |
CN112001242B (en) | 2022-12-06
Family
ID=73467501
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010688777.7A Active CN112001242B (en) | 2020-07-16 | 2020-07-16 | Intelligent gardening management method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112001242B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112529076A (en) * | 2020-12-10 | 2021-03-19 | 广西顶俏食品科技集团有限公司 | Method and device for reducing impurity yield of high-quality koji soy sauce |
CN112597827A (en) * | 2020-12-11 | 2021-04-02 | 西北农林科技大学 | Plant phenological period prediction method and system based on big data |
CN112617101A (en) * | 2020-12-21 | 2021-04-09 | 柳州市汇方科技有限公司 | Control method and device for rice noodle extruding |
CN112868455A (en) * | 2021-01-14 | 2021-06-01 | 柳州市彭莫山农业科技有限公司 | Method and device for improving planting yield of uncaria |
- 2020-07-16 CN CN202010688777.7A patent/CN112001242B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107844771A (en) * | 2017-11-03 | 2018-03-27 | 深圳春沐源控股有限公司 | Method, system, computer installation and the storage medium of crop production management |
CN111122453A (en) * | 2018-11-01 | 2020-05-08 | 阿里巴巴集团控股有限公司 | Information processing method, device and system |
CN110352660A (en) * | 2019-07-12 | 2019-10-22 | 塔里木大学 | A kind of ginning cotton seed vigor Fast nondestructive evaluation information processing method and device |
CN110404688A (en) * | 2019-07-12 | 2019-11-05 | 塔里木大学 | A kind of machine pick cotton information processing method miscellaneous clearly and device merging electrostatic and wind-force |
Also Published As
Publication number | Publication date |
---|---|
CN112001242A (en) | 2020-11-27 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |