CN112001242A - Intelligent gardening management method and device - Google Patents

Intelligent gardening management method and device

Info

Publication number
CN112001242A
Authority
CN
China
Prior art keywords
plant
information
leaf
stem
image information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010688777.7A
Other languages
Chinese (zh)
Other versions
CN112001242B (en)
Inventor
陈顺义
王丹丹
陈浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiamen University of Technology
Original Assignee
Xiamen University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiamen University of Technology filed Critical Xiamen University of Technology
Priority to CN202010688777.7A priority Critical patent/CN112001242B/en
Publication of CN112001242A publication Critical patent/CN112001242A/en
Application granted granted Critical
Publication of CN112001242B publication Critical patent/CN112001242B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour

Abstract

The invention provides an intelligent gardening management method and device. Image information of a plant is obtained; feature extraction is performed on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; the first identification region is input into a first training model, wherein the first training model is obtained by training multiple groups of training data, and each group of training data comprises the first identification region and preset leaf and stem state grade information; output information of the first training model is obtained, wherein the output information comprises the leaf and stem state grade information corresponding to the first identification region; and the image information of the plant is screened and filtered according to the leaf and stem state grade information corresponding to the first identification region to obtain first plant image information, wherein the first plant is a plant to be managed. The technical effects of improving the level of intelligent management, reducing manpower consumption and improving the management efficiency of horticultural plants are achieved.

Description

Intelligent gardening management method and device
Technical Field
The invention relates to the technical field of garden irrigation, in particular to an intelligent gardening management method and device.
Background
In the field of garden irrigation, gardeners are often required to judge, according to personal experience, whether various plants need specific treatment such as watering, fertilizing or pest control.
However, the applicant of the present invention has found that the prior art has at least the following technical problems:
the garden management approaches in the prior art rely on manual care by professionals, consume a large amount of manpower, and result in a low level of intelligence in horticultural management and a poor management effect.
Disclosure of Invention
The embodiment of the invention provides an intelligent gardening management method and device, which solve the technical problems that, in the prior art, garden management relies on manual observation and care by professionals, consumes a large amount of manpower, and results in a low level of intelligence in gardening management and a poor management effect.
In view of the above problems, embodiments of the present application are proposed to provide an intelligent gardening management method and apparatus.
In a first aspect, the present invention provides an intelligent gardening management method, comprising: obtaining image information of a plant; performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information; obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region; and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
Preferably, after obtaining the image information of the plant, the method includes: performing feature extraction on the image information of the plant to obtain a second identification area, wherein the second identification area comprises soil information of the plant; inputting the second recognition area into a second training model, wherein the second training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the second identification area and preset soil state grade information; obtaining output information of the second training model, wherein the output information comprises soil state grade information corresponding to the second identification area; and screening and filtering the image information of the plant according to the soil state grade information corresponding to the second identification area to obtain second plant image information, wherein the second plant is a plant to be managed.
Preferably, the first identification region and the second identification region are obtained by performing feature extraction on the plant image information through a feature extraction layer.
Preferably, the method comprises: according to the leaf and stem information, obtaining water content information of the leaves and stems and integrity information of the leaves and stems; and taking the preset leaf and stem state grade information as first supervision data, and obtaining leaf and stem state grade information corresponding to the first identification region according to the water content information of the leaves and stems and the leaf and stem integrity information.
Preferably, the method comprises: according to the soil information, acquiring the wettability information and the nutrient content information of the soil; and taking the preset soil state grade information as second supervision data, and obtaining soil state grade information corresponding to the second identification area according to the wettability information and the nutrient content information of the soil.
Preferably, the screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification region to obtain the first plant image information includes: obtaining a predetermined plant health status standard; judging whether the leaf and stem state grade information corresponding to the first identification region meets the preset plant health state standard or not; if the leaf and stem state grade information corresponding to the first identification region meets the preset plant health state standard, filtering the plant image information corresponding to the first identification region; and if the leaf and stem state grade information corresponding to the first identification region does not meet the preset plant health state standard, determining that the plant image information corresponding to the first identification region is first plant information, and performing targeted management on the first plant.
In a second aspect, the present invention provides an intelligent gardening management device, comprising:
a first obtaining unit for obtaining image information of a plant;
a second obtaining unit, configured to perform feature extraction on the image information of the plant to obtain a first identification region, where the first identification region includes information of leaves and stems of the plant;
a first input unit, configured to input the first recognition area into a first training model, where the first training model is obtained by training multiple sets of training data, and each set of training data in the multiple sets includes: the first identification area and preset leaf and stem state grade information;
a third obtaining unit, configured to obtain output information of the first training model, where the output information includes leaf and stem state grade information corresponding to the first identification region;
and the fourth obtaining unit is used for screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
Preferably, the apparatus further comprises:
a fifth obtaining unit, configured to perform feature extraction on the image information of the plant to obtain a second identification region, where the second identification region includes soil information of the plant;
a second input unit, configured to input the second recognition area into a second training model, where the second training model is obtained by training multiple sets of training data, and each set of training data in the multiple sets includes: the second identification area and preset soil state grade information;
a sixth obtaining unit, configured to obtain output information of the second training model, where the output information includes soil state grade information corresponding to the second recognition area;
and the seventh obtaining unit is used for screening and filtering the image information of the plant according to the soil state grade information corresponding to the second identification area to obtain second plant image information, wherein the second plant is a plant to be managed.
Preferably, the device further includes that the first identification region and the second identification region are obtained by performing feature extraction on the plant image information through a feature extraction layer.
Preferably, the apparatus further comprises:
an eighth obtaining unit, configured to obtain, according to the leaf and stem information, water content information of the leaf and stem and integrity information of the leaf and stem;
a ninth obtaining unit, configured to obtain, by using the preset leaf-stem state grade information as first supervision data, leaf-stem state grade information corresponding to the first identification region according to the water content information of the leaf-stem and the leaf-stem integrity information.
Preferably, the apparatus further comprises:
a tenth obtaining unit, configured to obtain, according to the soil information, wettability information and nutrient content information of the soil;
and the eleventh obtaining unit is used for taking the preset soil state grade information as second supervision data, and obtaining soil state grade information corresponding to the second identification area according to the wettability information and the nutrient content information of the soil.
Preferably, the apparatus further comprises:
a twelfth obtaining unit for obtaining a predetermined plant health status criterion;
the first judging unit is used for judging whether the leaf and stem state grade information corresponding to the first identification area meets the preset plant health state standard or not;
the first filtering unit is used for filtering the plant image information corresponding to the first identification region if the leaf and stem state grade information corresponding to the first identification region meets the preset plant health state standard;
and the first determining unit is used for determining that the plant image information corresponding to the first identification region is first plant information and performing targeted management on the first plant if the leaf and stem state grade information corresponding to the first identification region does not meet the preset plant health state standard.
In a third aspect, the present invention provides an intelligent gardening management device, comprising a memory, a processor and a computer program stored on the memory and operable on the processor, wherein the processor executes the program to implement the following steps: obtaining image information of a plant; performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information; obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region; and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
In a fourth aspect, the present invention provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of: obtaining image information of a plant; performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information; obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region; and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
One or more technical solutions in the embodiments of the present application have at least one or more of the following technical effects:
according to the intelligent gardening management method and device provided by the embodiment of the invention, the image information of the plants is obtained; performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information; obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region; and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed. This solves the technical problems that garden management in the prior art requires manual supervision by professionals, consumes a large amount of manpower, and results in a low level of intelligence in garden management and a poor management effect. The technical effects of improving the level of intelligent management, reducing manpower consumption and improving the management efficiency of horticultural plants are achieved.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
Fig. 1 is a schematic flow chart illustrating an intelligent gardening management method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an intelligent gardening management device according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of another intelligent gardening management device according to an embodiment of the present invention.
Description of reference numerals: a first obtaining unit 11, a second obtaining unit 12, a first input unit 13, a third obtaining unit 14, a fourth obtaining unit 15, a bus 300, a receiver 301, a processor 302, a transmitter 303, a memory 304, and a bus interface 306.
Detailed Description
The embodiment of the invention provides an intelligent gardening management method and device, which are used for solving the technical problems that, in the prior art, garden management relies on manual observation by professionals, consumes a large amount of manpower, and results in a low level of intelligence in gardening management and a poor management effect.
The technical scheme provided by the invention has the following general idea:
obtaining image information of a plant; performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information; obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region; and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed. The technical effects of improving the level of intelligent management, reducing manpower consumption and improving the management efficiency of horticultural plants are achieved.
The technical solutions of the present invention are described in detail below with reference to the drawings and specific embodiments. It should be understood that the specific features in the embodiments and examples of the present invention are intended to explain the technical solutions of the present application in detail rather than to limit them, and the technical features in the embodiments and examples of the present application may be combined with each other without conflict.
The term "and/or" herein merely describes an association relationship between associated objects, meaning that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
Example one
Fig. 1 is a schematic flow chart illustrating an intelligent gardening management method according to an embodiment of the present invention. As shown in fig. 1, an embodiment of the present invention provides an intelligent gardening management method, including:
step 110: obtaining image information of a plant;
step 120: performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant;
specifically, the image information of the plant may be acquired by an image acquisition device, and in terms of the type of the simulation image, the image may be a video image obtained by capturing a still image or a sequence of video frames, a signal trend graph obtained by a detection device, or a synthesized image. The first identification region is obtained by performing feature extraction on the obtained image to be identified, namely, image information of the plant, and according to the needs of practical application, the number of the first identification regions may be one or more, and information feature extraction is performed from the extracted first identification region. The feature extraction of the interested region in each plant image information can be aimed at, and only the interested region can be trained, so that the effect of improving the training efficiency is achieved.
Step 130: inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information;
step 140: obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region;
specifically, a training model, namely a Neural network model in machine learning, a Neural Network (NN) is a complex network system formed by a large number of simple processing units (called neurons) which are widely connected with each other, reflects many basic features of human brain functions, and is a highly complex nonlinear dynamical learning system. The neural network has the capabilities of large-scale parallel, distributed storage and processing, self-organization, self-adaptation and self-learning, and is particularly suitable for processing inaccurate and fuzzy information processing problems which need to consider many factors and conditions simultaneously. Neural network models are described based on mathematical models of neurons. Artificial neural networks (artificalnearl new tokr) s, are a description of the first-order properties of the human brain system. Briefly, it is a mathematical model. The neural network model is represented by a network topology, node characteristics, and learning rules. In the embodiment of the application, the first identification region is used as input data and is input into a training model, each set of input training data comprises the first identification region and preset leaf and stem state grade information, and the preset leaf and stem state grade information is used as supervision data, so that the first identification region is trained, and output data, namely the leaf and stem state grade information corresponding to the first identification region, is obtained.
Step 150: and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
Specifically, after the leaf and stem state grade information corresponding to the first identification region in each piece of plant image information is obtained, a standard can be formulated according to actual needs, and all plant image information is screened against that standard to obtain the plant image information meeting the requirements, namely the first plant image information. The requirements can be determined according to the actual situation: for example, plants needing watering or fertilizing can be screened out so that they are managed in a targeted manner, and withered or dead plants can be screened out so that they can be cleared away in time.
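The screening step can be pictured as a simple threshold comparison. The sketch below assumes integer grades in which a higher value means a healthier plant; that convention and the threshold are assumptions, since the patent leaves the grading scale open.

```python
# Assumed convention: higher grade = healthier. The threshold stands in for
# the "standard formulated according to actual needs".
def screen_plants(grades_by_image: dict, healthy_threshold: int = 3) -> list:
    """Return image IDs whose leaf-and-stem grade falls below the standard."""
    return [img_id for img_id, grade in grades_by_image.items()
            if grade < healthy_threshold]

# Example: plant_002 is flagged as a "first plant" needing targeted management.
to_manage = screen_plants({"plant_001.jpg": 4, "plant_002.jpg": 2})
```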
Preferably, after obtaining the image information of the plant, the method includes: performing feature extraction on the image information of the plant to obtain a second identification area, wherein the second identification area comprises soil information of the plant; inputting the second recognition area into a second training model, wherein the second training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the second identification area and preset soil state grade information; obtaining output information of the second training model, wherein the output information comprises soil state grade information corresponding to the second identification area; and screening and filtering the image information of the plant according to the soil state grade information corresponding to the second identification area to obtain second plant image information, wherein the second plant is a plant to be managed.
Specifically, the second recognition area is used as input data and is input into the training model; each set of input training data comprises the second recognition area and the preset soil state grade information, the preset soil state grade information is used as supervision data, and the model is trained on the second recognition area to produce output data, namely the soil state grade information corresponding to the second recognition area. After the soil state grade information corresponding to the second identification area in each piece of plant image information is obtained, a standard can be established according to actual requirements, and all plant image information is screened against that standard to obtain the plant image information meeting the requirements, namely the second plant image information. The requirements can be determined according to the actual situation: for example, plants needing watering or fertilizing can be screened out and managed in a targeted manner, and withered or dead plants can be screened out so that they can be cleared away in time. In the embodiment of the application, the leaves and stems and the soil are trained separately through the two characteristic regions, so that the health state of the plant can be identified more comprehensively and the plant can be treated in a targeted way according to its health state; one way of combining the two assessments is sketched below.
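As a sketch of how the two parallel assessments might be combined downstream, the function below flags a plant if either its leaf-and-stem grade or its soil grade falls below its respective standard. The or-combination is an assumption: the patent describes the two assessments separately and does not fix a merging rule.

```python
# Assumed combination rule: flag the plant if either assessment is below
# standard; the patent itself does not prescribe how the results are merged.
def needs_management(leaf_stem_grade: int, soil_grade: int,
                     leaf_stem_standard: int = 3, soil_standard: int = 3) -> bool:
    """True if the plant should be treated based on either recognition area."""
    return leaf_stem_grade < leaf_stem_standard or soil_grade < soil_standard
```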
Preferably, the first identification region and the second identification region are obtained by performing feature extraction on the plant image information through a feature extraction layer.
Specifically, feature extraction is performed on the plant image information, and the feature extraction or target object recognition is performed through a neural network model, which may be any suitable neural network capable of feature extraction or target object recognition, including but not limited to a convolutional neural network, a reinforcement learning network, the generator network of a generative adversarial network, and the like. The neural network model comprises a feature extraction layer and can be partially adjusted according to actual needs. The feature extraction layer may be a convolutional neural network; the image to be identified, namely the plant image information, passes through the feature extraction layer to obtain the first identification region of the image to be identified, and leaf and stem state grade information, including the water content of the leaves and stems and whether the leaves and stems have been eaten by insects or otherwise damaged, is obtained from the first identification region.
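One way to realise a shared feature-extraction layer feeding both recognition branches is sketched below. The convolutional backbone, the layer sizes, the 64x64 input and the two linear heads are assumptions for illustration, not the claimed architecture.

```python
# Hypothetical shared backbone with separate leaf/stem and soil heads;
# all layer sizes are illustrative (assumes 64x64 RGB input crops).
import torch
import torch.nn as nn

feature_extraction_layer = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
)
leaf_stem_head = nn.Sequential(nn.Flatten(), nn.Linear(64 * 16 * 16, 5))
soil_head = nn.Sequential(nn.Flatten(), nn.Linear(64 * 16 * 16, 5))

x = torch.randn(1, 3, 64, 64)                 # one assumed 64x64 RGB crop
features = feature_extraction_layer(x)        # shared features (1, 64, 16, 16)
leaf_stem_logits = leaf_stem_head(features)   # (1, 5) leaf/stem grade scores
soil_logits = soil_head(features)             # (1, 5) soil grade scores
```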
Preferably, the method comprises: according to the leaf and stem information, obtaining water content information of the leaves and stems and integrity information of the leaves and stems; and taking the preset leaf and stem state grade information as first supervision data, and obtaining leaf and stem state grade information corresponding to the first identification region according to the water content information of the leaves and stems and the leaf and stem integrity information.
Specifically, the corresponding water content information and leaf and stem integrity information can be obtained from the first identification region according to the colour and shape of the leaves and stems. The preset leaf and stem state grade information comprises preset water content grade information for the water content and preset integrity grade information for the leaf and stem integrity. Using the preset leaf and stem state grade information as the first supervision data, the input first identification region is trained, and the output data integrates the leaf and stem water content and the leaf and stem integrity to give the leaf and stem state grade information corresponding one-to-one to the plant image information.
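As a rough illustration of how colour and shape cues could be turned into the two intermediate quantities, the sketch below estimates a water-content proxy from the mean saturation of foliage pixels and an integrity proxy from how "solid" the foliage mask is (bite holes lower the ratio to the convex hull). These heuristics, the thresholds and the OpenCV 4.x API are assumptions; in the patent these relationships are learned under the preset grade labels.

```python
# Heuristic proxies only; the patent learns these relationships from labelled
# data, so treat the formulas below as illustrative assumptions (OpenCV 4.x).
import cv2
import numpy as np

def leaf_stem_cues(region_bgr: np.ndarray) -> tuple:
    hsv = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([35, 40, 40]), np.array([85, 255, 255]))
    # Water-content proxy: mean saturation of foliage pixels, scaled to 0..1.
    sat = hsv[:, :, 1][mask > 0]
    water_proxy = float(sat.mean() / 255.0) if sat.size else 0.0
    # Integrity proxy: foliage area vs. its convex-hull area; insect damage
    # and missing tissue lower this ratio.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return water_proxy, 0.0
    largest = max(contours, key=cv2.contourArea)
    hull_area = cv2.contourArea(cv2.convexHull(largest))
    integrity_proxy = float(cv2.contourArea(largest) / hull_area) if hull_area else 0.0
    return water_proxy, integrity_proxy
```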
Preferably, the method comprises: according to the soil information, acquiring the wettability information and the nutrient content information of the soil; and taking the preset soil state grade information as second supervision data, and obtaining soil state grade information corresponding to the second identification area according to the wettability information and the nutrient content information of the soil.
Specifically, the corresponding wettability information and nutrient content information can be obtained from the second identification area according to the colour of the soil and the dispersion of its particles; for example, soil with higher moisture is usually darker than drier soil, and its particles appear less distinct. The preset soil state grade information comprises preset wettability grade information for the wettability and preset nutrient grade information for the nutrient content. Using the preset soil state grade information as the second supervision data, the input second identification area is trained, and the output data integrates the wettability and the nutrient content of the soil to give the soil state grade information corresponding one-to-one to the plant image information.
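In the same spirit, the sketch below derives a rough soil-moisture proxy from how dark the soil region is and a texture proxy from local intensity variation (distinct dry particles produce more local contrast). Both heuristics are assumptions; nutrient content in particular is not directly observable from a single RGB image and, as described above, would come from the trained model and its preset soil grade labels.

```python
# Heuristic soil cues; darkness and local-variance texture are assumptions
# standing in for the learned relationship described in the patent.
import cv2
import numpy as np

def soil_cues(soil_region_bgr: np.ndarray) -> tuple:
    gray = cv2.cvtColor(soil_region_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    # Moisture proxy: wetter soil tends to be darker, so invert mean brightness.
    moisture_proxy = float(1.0 - gray.mean() / 255.0)
    # Texture proxy: dry, loose particles show more local contrast.
    local_mean = cv2.blur(gray, (7, 7))
    texture_proxy = float(np.sqrt(((gray - local_mean) ** 2).mean()) / 255.0)
    return moisture_proxy, texture_proxy
```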
The method integrates two identification areas of the soil part and the leaves and the stems of the plant, can identify the growth state of the plant more completely and pertinently, and judges the health condition of each plant, so that the judgment result is more accurate and reliable.
Preferably, the screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification region to obtain the first plant image information includes: obtaining a predetermined plant health status standard; judging whether the leaf and stem state grade information corresponding to the first identification region meets the preset plant health state standard or not; if the leaf and stem state grade information corresponding to the first identification region meets the preset plant health state standard, filtering the plant image information corresponding to the first identification region; and if the leaf and stem state grade information corresponding to the first identification region does not meet the preset plant health state standard, determining that the plant image information corresponding to the first identification region is first plant information, and performing targeted management on the first plant.
Specifically, the predetermined plant health status standard may be a measure established for plant health according to actual requirements. After the leaf and stem state grade information corresponding to the first identification region is obtained, it is compared with the predetermined plant health status standard to determine whether the plant meets that standard. If the standard is met, the plant is growing well and needs no special treatment. If the leaf and stem state grade information corresponding to the first identification region does not meet the predetermined plant health status standard, the plant is marked as first plant information, i.e. a plant in poor health, and needs targeted treatment such as watering, fertilizing or applying pesticide. The above also applies to the second identification region. The treatment is chosen according to which item is not satisfied, i.e. the water content of the leaves and stems, the integrity of the leaves and stems, or the soil moisture and nutrient content; for example, if the water content of the leaves and stems does not meet the plant health standard, the plant is watered, and the watering amount is determined by the water content grade. The technical effects of intelligent management of garden plants, manpower savings and healthier plant growth are thereby achieved.
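Finally, a toy dispatch of the targeted-management step: an action is chosen for whichever criterion fails, and the watering amount scales with how far the water-content grade falls below the standard. The grade scale, the action names and the scaling rule are all assumptions used only to make the flow concrete.

```python
# Illustrative dispatcher; grade scales, actions and the watering rule are
# assumptions, not taken from the patent text.
def plan_actions(water_grade: int, integrity_grade: int, soil_moisture_grade: int,
                 nutrient_grade: int, standard: int = 3) -> list:
    actions = []
    if water_grade < standard:
        # Watering amount grows with the shortfall in the water-content grade.
        actions.append(("water", 0.5 * (standard - water_grade)))  # litres, assumed
    if integrity_grade < standard:
        actions.append(("apply_pesticide", 1))
    if soil_moisture_grade < standard:
        actions.append(("irrigate_soil", 1))
    if nutrient_grade < standard:
        actions.append(("fertilize", 1))
    return actions  # empty list means the plant meets the health standard
```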
Example two
Based on the same inventive concept as the intelligent gardening management method in the previous embodiment, the present invention further provides an intelligent gardening management device, as shown in fig. 2, the device comprising:
a first obtaining unit 11, wherein the first obtaining unit 11 is used for obtaining image information of a plant;
a second obtaining unit 12, where the second obtaining unit 12 is configured to perform feature extraction on the image information of the plant to obtain a first identification region, where the first identification region includes information of a leaf and a stem of the plant;
a first input unit 13, where the first input unit 13 is configured to input the first recognition area into a first training model, where the first training model is obtained by training multiple sets of training data, and each set of training data in the multiple sets includes: the first identification area and preset leaf and stem state grade information;
a third obtaining unit 14, where the third obtaining unit 14 is configured to obtain output information of the first training model, where the output information includes leaf and stem state grade information corresponding to the first identification region;
a fourth obtaining unit 15, where the fourth obtaining unit 15 is configured to perform screening and filtering on the image information of the plant according to the leaf and stem state grade information corresponding to the first identification region, so as to obtain first plant image information, where the first plant is a plant to be managed.
Preferably, the apparatus further comprises:
a fifth obtaining unit, configured to perform feature extraction on the image information of the plant to obtain a second identification region, where the second identification region includes soil information of the plant;
a second input unit, configured to input the second recognition area into a second training model, where the second training model is obtained by training multiple sets of training data, and each set of training data in the multiple sets includes: the second identification area and preset soil state grade information;
a sixth obtaining unit, configured to obtain output information of the second training model, where the output information includes soil state grade information corresponding to the second recognition area;
and the seventh obtaining unit is used for screening and filtering the image information of the plant according to the soil state grade information corresponding to the second identification area to obtain second plant image information, wherein the second plant is a plant to be managed.
Preferably, the device further includes that the first identification region and the second identification region are obtained by performing feature extraction on the plant image information through a feature extraction layer.
Preferably, the apparatus further comprises:
an eighth obtaining unit, configured to obtain, according to the leaf and stem information, water content information of the leaf and stem and integrity information of the leaf and stem;
a ninth obtaining unit, configured to obtain, by using the preset leaf-stem state grade information as first supervision data, leaf-stem state grade information corresponding to the first identification region according to the water content information of the leaf-stem and the leaf-stem integrity information.
Preferably, the apparatus further comprises:
a tenth obtaining unit, configured to obtain, according to the soil information, wettability information and nutrient content information of the soil;
and the eleventh obtaining unit is used for taking the preset soil state grade information as second supervision data, and obtaining soil state grade information corresponding to the second identification area according to the wettability information and the nutrient content information of the soil.
Preferably, the apparatus further comprises:
a twelfth obtaining unit for obtaining a predetermined plant health status criterion;
the first judging unit is used for judging whether the leaf and stem state grade information corresponding to the first identification area meets the preset plant health state standard or not;
the first filtering unit is used for filtering the plant image information corresponding to the first identification region if the leaf and stem state grade information corresponding to the first identification region meets the preset plant health state standard;
and the first determining unit is used for determining that the plant image information corresponding to the first identification region is first plant information and performing targeted management on the first plant if the leaf and stem state grade information corresponding to the first identification region does not meet the preset plant health state standard.
Various modifications and embodiments of the aforementioned intelligent gardening management method in the first embodiment of fig. 1 are also applicable to the intelligent gardening management device of this embodiment, and the implementation of the intelligent gardening management device in this embodiment will be apparent to those skilled in the art from the foregoing detailed description of the intelligent gardening management method, and therefore, for the sake of brevity of the description, detailed description is omitted here.
EXAMPLE III
Based on the same inventive concept as the intelligent gardening management method in the previous embodiment, the present invention further provides an intelligent gardening management device having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of any one of the aforementioned intelligent gardening management methods.
In fig. 3, a bus architecture is represented by bus 300. Bus 300 may include any number of interconnected buses and bridges, linking together various circuits including one or more processors, represented by processor 302, and memory, represented by memory 304. The bus 300 may also link together various other circuits such as peripherals, voltage regulators and power management circuits, which are well known in the art and therefore will not be described further herein. A bus interface 306 provides an interface between the bus 300 and the receiver 301 and transmitter 303. The receiver 301 and the transmitter 303 may be the same element, i.e., a transceiver, providing a means for communicating with various other apparatus over a transmission medium.
The processor 302 is responsible for managing the bus 300 and general processing, and the memory 304 may be used for storing data used by the processor 302 in performing operations.
Example four
Based on the same inventive concept as the method of intelligent gardening management in the previous embodiment, the present invention also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of: obtaining image information of a plant; performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information; obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region; and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
In a specific implementation, when the program is executed by a processor, any method step in the first embodiment may be further implemented.
One or more technical solutions in the embodiments of the present application have at least one or more of the following technical effects:
according to the intelligent gardening management method and device provided by the embodiment of the invention, the image information of the plants is obtained; performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant; inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information; obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region; and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed. This solves the technical problems that garden management in the prior art requires manual supervision by professionals, consumes a large amount of manpower, and results in a low level of intelligence in garden management and a poor management effect. The technical effects of improving the level of intelligent management, reducing manpower consumption and improving the management efficiency of horticultural plants are achieved.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (9)

1. An intelligent horticultural management method, the method comprising:
obtaining image information of a plant;
performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant;
inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information;
obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region;
and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
2. The method of claim 1, wherein obtaining image information of the plant comprises, after:
performing feature extraction on the image information of the plant to obtain a second identification area, wherein the second identification area comprises soil information of the plant;
inputting the second recognition area into a second training model, wherein the second training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the second identification area and preset soil state grade information;
obtaining output information of the second training model, wherein the output information comprises soil state grade information corresponding to the second identification area;
and screening and filtering the image information of the plant according to the soil state grade information corresponding to the second identification area to obtain second plant image information, wherein the second plant is a plant to be managed.
3. The method of claim 2, wherein the first identification region and the second identification region are obtained by feature extraction of the plant image information by a feature extraction layer.
4. The method of claim 1, wherein the method comprises:
according to the leaf and stem information, obtaining water content information of the leaves and stems and integrity information of the leaves and stems;
and taking the preset leaf and stem state grade information as first supervision data, and obtaining leaf and stem state grade information corresponding to the first identification region according to the water content information of the leaves and stems and the leaf and stem integrity information.
5. The method of claim 2, wherein the method comprises:
according to the soil information, acquiring the wettability information and the nutrient content information of the soil;
and taking the preset soil state grade information as second supervision data, and obtaining soil state grade information corresponding to the second identification area according to the wettability information and the nutrient content information of the soil.
6. The method as claimed in claim 1, wherein the step of performing filtering on the image information of the plant according to the leaf and stem status grade information corresponding to the first identification region to obtain the first plant image information comprises:
obtaining a predetermined plant health status standard;
judging whether the leaf and stem state grade information corresponding to the first identification region meets the preset plant health state standard or not;
if the leaf and stem state grade information corresponding to the first identification region meets the preset plant health state standard, filtering the plant image information corresponding to the first identification region;
and if the leaf and stem state grade information corresponding to the first identification region does not meet the preset plant health state standard, determining that the plant image information corresponding to the first identification region is first plant information, and performing targeted management on the first plant.
7. An intelligent gardening management device, wherein the device comprises:
a first obtaining unit for obtaining image information of a plant;
a second obtaining unit, configured to perform feature extraction on the image information of the plant to obtain a first identification region, where the first identification region includes information of leaves and stems of the plant;
a first input unit, configured to input the first recognition area into a first training model, where the first training model is obtained by training multiple sets of training data, and each set of training data in the multiple sets includes: the first identification area and preset leaf and stem state grade information;
a third obtaining unit, configured to obtain output information of the first training model, where the output information includes leaf and stem state grade information corresponding to the first identification region;
and the fourth obtaining unit is used for screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
8. An intelligent gardening management device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program implements the steps of:
obtaining image information of a plant;
performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant;
inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information;
obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region;
and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
9. A computer-readable storage medium, on which a computer program is stored, which program, when executed by a processor, carries out the steps of:
obtaining image information of a plant;
performing feature extraction on the image information of the plant to obtain a first identification region, wherein the first identification region comprises the leaf and stem information of the plant;
inputting the first recognition area into a first training model, wherein the first training model is obtained by training a plurality of sets of training data, and each set of training data in the plurality of sets includes: the first identification area and preset leaf and stem state grade information;
obtaining output information of the first training model, wherein the output information comprises leaf and stem state grade information corresponding to the first identification region;
and screening and filtering the image information of the plant according to the leaf and stem state grade information corresponding to the first identification area to obtain first plant image information, wherein the first plant is a plant to be managed.
CN202010688777.7A 2020-07-16 2020-07-16 Intelligent gardening management method and device Active CN112001242B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010688777.7A CN112001242B (en) 2020-07-16 2020-07-16 Intelligent gardening management method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010688777.7A CN112001242B (en) 2020-07-16 2020-07-16 Intelligent gardening management method and device

Publications (2)

Publication Number Publication Date
CN112001242A true CN112001242A (en) 2020-11-27
CN112001242B CN112001242B (en) 2022-12-06

Family

ID=73467501

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010688777.7A Active CN112001242B (en) 2020-07-16 2020-07-16 Intelligent gardening management method and device

Country Status (1)

Country Link
CN (1) CN112001242B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112529076A (en) * 2020-12-10 2021-03-19 广西顶俏食品科技集团有限公司 Method and device for reducing impurity yield of high-quality koji soy sauce
CN112597827A (en) * 2020-12-11 2021-04-02 西北农林科技大学 Plant phenological period prediction method and system based on big data
CN112617101A (en) * 2020-12-21 2021-04-09 柳州市汇方科技有限公司 Control method and device for rice noodle extruding
CN112868455A (en) * 2021-01-14 2021-06-01 柳州市彭莫山农业科技有限公司 Method and device for improving planting yield of uncaria

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107844771A (en) * 2017-11-03 2018-03-27 深圳春沐源控股有限公司 Method, system, computer installation and the storage medium of crop production management
CN111122453A (en) * 2018-11-01 2020-05-08 阿里巴巴集团控股有限公司 Information processing method, device and system
CN110352660A (en) * 2019-07-12 2019-10-22 塔里木大学 A kind of ginning cotton seed vigor Fast nondestructive evaluation information processing method and device
CN110404688A (en) * 2019-07-12 2019-11-05 塔里木大学 A kind of machine pick cotton information processing method miscellaneous clearly and device merging electrostatic and wind-force

Also Published As

Publication number Publication date
CN112001242B (en) 2022-12-06

Similar Documents

Publication Publication Date Title
CN112001242B (en) Intelligent gardening management method and device
Papageorgiou et al. Yield prediction in apples using Fuzzy Cognitive Map learning approach
CN110909679B (en) Remote sensing identification method and system for fallow crop rotation information of winter wheat historical planting area
CN108510102A (en) A kind of water-fertilizer integral control method of irrigation using big data calculative strategy
US20220248616A1 (en) Irrigation control with deep reinforcement learning and smart scheduling
CN112836623B (en) Auxiliary method and device for agricultural decision of facility tomatoes
CN110648020A (en) Greenhouse crop water demand prediction method and device
Kolmanič et al. An algorithm for automatic dormant tree pruning
Avigal et al. Learning seed placements and automation policies for polyculture farming with companion plants
CN113359628B (en) Control method and device for green tea processing process
CN113377141A (en) Artificial intelligence agricultural automatic management system
CN112868455A (en) Method and device for improving planting yield of uncaria
CN112989688A (en) Method and device for intelligently monitoring pesticide residues of kumquats
CN113010529A (en) Crop management method and device based on knowledge graph
CN112818154A (en) Method and device for efficiently treating surface of fruit
CN112099557A (en) Internet-based household plant planting method and system
Deepthi et al. Application of expert systems for agricultural crop disease diagnoses—A review
Rathore Application of artificial intelligence in agriculture including horticulture
CN114637351B (en) Greenhouse environment regulation and control method and system for facility crops
CN114821253B (en) Method and system for regulating and controlling applicability of water and fertilizer spraying and drip irrigation integrated system
CN112699805A (en) Intelligent recognition system for vegetable pest control
CN114662609B (en) Intelligent greenhouse farm management method and system
CN113408849B (en) Method and system for evaluating mixed pesticide effect of fenpyroximate and pesticide
AU2021100096A4 (en) Ai-based crop recommendation system for smart farming towards agriculture 5.0
CN113552806B (en) Breeding house environment parameter analysis method based on fuzzy control algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant