CN115482467B - Automatic irrigation system for intelligent gardens - Google Patents

Automatic irrigation system for intelligent gardens

Info

Publication number
CN115482467B
CN115482467B (application CN202211203487.4A)
Authority
CN
China
Prior art keywords
feature
vector
plant
matrix
irrigation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211203487.4A
Other languages
Chinese (zh)
Other versions
CN115482467A (en)
Inventor
宋彦峰
高效田
高原
丁娟
王君
申胜歌
黄艳丽
张晓盼
赵礼浩
张赵亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Henan Landscape Architecture Planning And Design Co ltd
Original Assignee
Henan Landscape Architecture Planning And Design Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Henan Landscape Architecture Planning And Design Co ltd filed Critical Henan Landscape Architecture Planning And Design Co ltd
Priority to CN202211203487.4A priority Critical patent/CN115482467B/en
Publication of CN115482467A publication Critical patent/CN115482467A/en
Application granted granted Critical
Publication of CN115482467B publication Critical patent/CN115482467B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/10 - Terrestrial scenes
    • G06V 20/188 - Vegetation
    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G - HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G 25/00 - Watering gardens, fields, sports grounds or the like
    • A01G 25/16 - Control of watering
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 - Complex mathematical operations
    • G06F 17/16 - Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/40 - Extraction of image or video features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 - Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/7715 - Feature extraction, e.g. by transforming the feature space, e.g. multi-dimensional scaling [MDS]; Mappings, e.g. subspace methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 - Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/80 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V 10/806 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of extracted features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 40/00 - Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A 40/10 - Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
    • Y02A 40/22 - Improving land use; Improving water use or availability; Controlling erosion

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Medical Informatics (AREA)
  • Mathematical Physics (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Data Mining & Analysis (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Water Supply & Treatment (AREA)
  • Algebra (AREA)
  • Environmental Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses an automatic irrigation system for intelligent gardens. The system adopts a deep-learning-based artificial intelligence control algorithm to extract plant features, such as the plant type and growth state, together with the features of the environmental elements in which the plants are located, and then performs judgment and estimation of the recommended irrigation amount based on the transfer vector of the plant feature vector, representing the plant type and growth state, relative to the environmental element feature matrix, representing the environmental element features. In this way the estimated irrigation amount is adapted not only to the plant type and growth state but also to the growth environment, so that the growth requirements of the plants can be ensured while irrigation water is saved.

Description

Automatic irrigation system for intelligent gardens
Technical Field
The application relates to the technical field of garden irrigation, in particular to an automatic irrigation system for intelligent gardens.
Background
Irrigation for gardens is a technical measure for supplementing soil moisture required for the growth of garden plants so as to improve the growth conditions of the garden plants. The soil moisture of the garden greenbelt is supplemented in different irrigation modes by using an artificial method or a mechanical method, so that the moisture requirement of plants is met.
In existing landscape irrigation systems there are already walking irrigation devices. Such a device has a simple structure and a wide irrigation range, but it irrigates different types of plants indiscriminately, which on the one hand wastes water resources and on the other hand ignores the different water requirements of different plants, thereby affecting their growth. In addition, when a walking irrigation device is used, the irrigation amount also needs to be adaptively adjusted in combination with weather conditions.
Thus, an optimized intelligent garden automatic irrigation scheme is desired.
Disclosure of Invention
The present application has been made to solve the above technical problems. Embodiments of the application provide an automatic irrigation system for intelligent gardens that adopts a deep-learning-based artificial intelligence control algorithm to extract plant features, such as the plant type and growth state, together with the features of the environmental elements in which the plants are located, and then performs judgment and estimation of the recommended irrigation amount based on the transfer vector of the plant feature vector, representing the plant type and growth state, relative to the environmental element feature matrix, representing the environmental element features, so that the estimated irrigation amount is adapted not only to the plant type and growth state but also to the growth environment, thereby ensuring the growth requirements of the plants while saving irrigation water.
According to one aspect of the present application, there is provided an automatic irrigation system for intelligent gardens, comprising:
the environment data acquisition module is used for acquiring temperature values and humidity values of a plurality of preset time points of an area to be irrigated in the intelligent garden within a preset time period;
the environment element association module is used for respectively arranging the temperature values and the humidity values of a plurality of preset time points of the to-be-irrigated area in a preset time period into a temperature input vector and a humidity input vector according to a time dimension, and then calculating the product between the transposed vector of the temperature input vector and the humidity input vector to obtain an environment element matrix;
the environment element feature extraction module is used for inputting the environment element matrix into a first convolution neural network model of which adjacent layers use convolution kernels which are transposed with each other to obtain an environment element feature matrix;
the irrigation object acquisition module is used for acquiring images of plants in the area to be irrigated, which are acquired by the camera;
the irrigation object identification module is used for enabling the image of the plant in the area to be irrigated to pass through a second convolution neural network model serving as a feature extractor so as to obtain a plant feature vector;
the transfer module is used for calculating a transfer vector of the plant feature vector relative to the environment element feature matrix to serve as a decoding feature vector;
The feature distribution correction module is used for correcting the feature distribution of the decoding feature vector to obtain a corrected decoding feature vector; and the irrigation result generation module is used for enabling the corrected decoding characteristic vector to pass through a decoder to obtain a decoding value, wherein the decoding value is used for representing the recommended irrigation quantity.
In the above-mentioned automatic irrigation system of wisdom gardens, the environmental element feature draws the module, includes: a shallow feature map extracting unit, configured to extract a shallow feature map from an mth layer of the first convolutional neural network model, where M is an even number; a deep feature map extracting unit, configured to extract a deep feature map from an nth layer of the first convolutional neural network model, where N is an even number and N is greater than 2 times of M; the feature map fusion unit is used for fusing the shallow feature map and the deep feature map to generate an environment element feature map; and the dimension reduction unit is used for carrying out global average pooling processing on the environment element feature map along the channel dimension so as to obtain the environment element feature matrix.
In the above-mentioned automatic irrigation system for intelligent gardens, the irrigation object identification module is further configured to: perform, in the forward pass of each layer of the second convolutional neural network model, the following operations on the input data: convolution processing to obtain a convolution feature map; mean pooling based on a local feature matrix on the convolution feature map to obtain a pooled feature map; and nonlinear activation on the pooled feature map to obtain an activation feature map; wherein the output of the last layer of the second convolutional neural network model is the plant feature vector, and the input of the first layer of the second convolutional neural network model is the image of the plants in the area to be irrigated.
In the above-mentioned automatic irrigation system for intelligent gardens, the transfer module is further configured to: calculating a transfer vector of the plant feature vector relative to the environmental element feature matrix by the following formula; wherein, the formula is:
V_c = V_a ⊗ M_b, wherein V_a represents the plant feature vector, M_b represents the environmental element feature matrix, V_c represents the transfer vector, and ⊗ represents multiplication of a vector by a matrix.
In the above-mentioned automatic irrigation system for intelligent gardens, the feature distribution correction module is further configured to: correcting the feature distribution of the decoding feature vector by the following formula to obtain the corrected decoding feature vector; wherein, the formula is:
wherein v_i represents the feature value at the i-th position of the decoded feature vector, v_i' represents the feature value at the i-th position of the corrected decoded feature vector, and log represents the logarithm with base 2.
In the above-mentioned automatic irrigation system for intelligent gardens, the irrigation result generation module is further configured to: perform a decoding regression on the corrected decoded feature vector using a plurality of fully connected layers of the decoder to obtain the decoded value, wherein the formula is: Y = h(W ⊗ X + B), where X is the corrected decoded feature vector, Y is the decoded value, W is the weight matrix, B is the bias vector, ⊗ represents matrix multiplication, and h(·) is the activation function.
According to another aspect of the present application, there is also provided an automatic irrigation method for intelligent gardens, comprising:
acquiring temperature values and humidity values of a plurality of preset time points of a region to be irrigated in the intelligent garden within a preset time period;
after arranging temperature values and humidity values of a plurality of preset time points of the area to be irrigated into a temperature input vector and a humidity input vector according to a time dimension respectively, calculating the product between a transposed vector of the temperature input vector and the humidity input vector to obtain an environment element matrix;
inputting the environment element matrix into a first convolutional neural network model whose adjacent layers use mutually transposed convolution kernels to obtain an environment element feature matrix;
acquiring an image of the plant in the area to be irrigated, which is acquired by a camera;
the image of the plant in the area to be irrigated is passed through a second convolution neural network model serving as a feature extractor to obtain a plant feature vector;
calculating a transfer vector of the plant feature vector relative to the environmental element feature matrix as a decoding feature vector;
Correcting the feature distribution of the decoding feature vector to obtain a corrected decoding feature vector; and passing the corrected decoded feature vector through a decoder to obtain a decoded value, the decoded value being indicative of a recommended irrigation quantity.
In the above automatic irrigation method for intelligent gardens, inputting the environment element matrix into a first convolutional neural network model whose adjacent layers use mutually transposed convolution kernels to obtain an environment element feature matrix includes: extracting a shallow feature map from the M-th layer of the first convolutional neural network model, wherein M is an even number; extracting a deep feature map from the N-th layer of the first convolutional neural network model, wherein N is an even number and is greater than 2 times M; fusing the shallow feature map and the deep feature map to generate an environment element feature map; and performing global average pooling along the channel dimension on the environment element feature map to obtain the environment element feature matrix.
In the above automatic irrigation method for intelligent gardens, passing the image of the plants in the area to be irrigated through a second convolutional neural network model serving as a feature extractor to obtain a plant feature vector includes: performing, in the forward pass of each layer of the second convolutional neural network model, the following operations on the input data: convolution processing to obtain a convolution feature map; mean pooling based on a local feature matrix on the convolution feature map to obtain a pooled feature map; and nonlinear activation on the pooled feature map to obtain an activation feature map; wherein the output of the last layer of the second convolutional neural network model is the plant feature vector, and the input of the first layer of the second convolutional neural network model is the image of the plants in the area to be irrigated.
In the above-mentioned intelligent garden automatic irrigation method, the calculating the transfer vector of the plant feature vector with respect to the environmental element feature matrix as a decoded feature vector includes: calculating a transfer vector of the plant feature vector relative to the environmental element feature matrix by the following formula; wherein, the formula is:
V_c = V_a ⊗ M_b, wherein V_a represents the plant feature vector, M_b represents the environmental element feature matrix, V_c represents the transfer vector, and ⊗ represents multiplication of a vector by a matrix.
In the above-mentioned automatic irrigation method for intelligent gardens, the correcting the feature distribution of the decoded feature vector to obtain a corrected decoded feature vector includes: correcting the feature distribution of the decoding feature vector by the following formula to obtain the corrected decoding feature vector; wherein, the formula is:
wherein v_i represents the feature value at the i-th position of the decoded feature vector, v_i' represents the feature value at the i-th position of the corrected decoded feature vector, and log represents the logarithm with base 2.
In the above-mentioned automatic irrigation method for intelligent gardens, passing the corrected decoded feature vector through a decoder to obtain a decoded value includes: performing a decoding regression on the corrected decoded feature vector using a plurality of fully connected layers of the decoder to obtain the decoded value, wherein the formula is: Y = h(W ⊗ X + B), where X is the corrected decoded feature vector, Y is the decoded value, W is the weight matrix, B is the bias vector, ⊗ represents matrix multiplication, and h(·) is the activation function.
According to still another aspect of the present application, there is provided an electronic apparatus including: a processor; and a memory in which computer program instructions are stored which, when executed by the processor, cause the processor to perform the method of automatic irrigation of intelligent gardens as described above.
According to a further aspect of the present application there is provided a computer readable medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the method of automatic irrigation of intelligent gardens as described above.
Compared with the prior art, the intelligent garden automatic irrigation system provided by the application adopts the artificial intelligent control algorithm based on deep learning to extract the plant characteristics such as the plant type and the growth state and the environmental element characteristics of the plant, and further carries out judgment and estimation of the recommended irrigation amount based on the transfer vector of the plant characteristic vector serving as the plant type and the growth state relative to the environmental element characteristic matrix serving as the environmental element characteristics, so that the estimated irrigation amount can be adapted to the plant type and the growth state and the growth environment, further the growth requirement of the plant can be ensured, and the irrigation amount can be saved.
Drawings
The above and other objects, features and advantages of the present application will become more apparent from the following detailed description of embodiments of the present application with reference to the accompanying drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification; they illustrate the application and, together with the embodiments of the application, serve to explain the application, and do not constitute a limitation of the application. In the drawings, like reference numerals generally refer to like parts or steps.
Fig. 1 illustrates an application scenario diagram of an automatic irrigation system for intelligent gardens according to an embodiment of the present application.
Fig. 2 illustrates a block diagram of an automatic irrigation system for intelligent gardens, according to an embodiment of the present application.
Fig. 3 illustrates a system architecture diagram of an automatic irrigation system for intelligent gardens according to an embodiment of the present application.
Fig. 4 illustrates a block diagram of an environmental element feature extraction module in an automatic irrigation system for intelligent gardens, according to an embodiment of the application.
Fig. 5 illustrates a flowchart of an automatic irrigation method for intelligent gardens according to an embodiment of the present application.
Fig. 6 illustrates a block diagram of an electronic device according to an embodiment of the application.
Detailed Description
Hereinafter, exemplary embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and it should be understood that the present application is not limited by the example embodiments described herein.
Summary of the application
As described above, in existing landscape irrigation systems there are already walkable irrigation devices. Such a device has a simple structure and a wide irrigation range, but it irrigates different types of plants indiscriminately, which on the one hand wastes water resources and on the other hand ignores the different water requirements of different plants, thereby affecting plant growth. In addition, when a walking irrigation device is used, the irrigation amount also needs to be adaptively adjusted in combination with weather conditions. Thus, an optimized automatic irrigation scheme for intelligent gardens is desired.
At present, deep learning and neural networks have been widely used in the fields of computer vision, natural language processing, speech signal processing, and the like. In addition, deep learning and neural networks have also shown levels approaching and even exceeding humans in the fields of image classification, object detection, semantic segmentation, text translation, and the like.
In recent years, deep learning and development of neural networks provide new solutions and schemes for automatic irrigation of intelligent gardens.
Specifically, in the technical solution of the application, a deep-learning-based artificial intelligence control algorithm is adopted to perform feature extraction on plant features, such as the plant type and growth state, and on the features of the environmental elements in which the plants are located. Judgment and estimation of the recommended irrigation amount is then performed based on the transfer vector of the plant feature vector, representing the plant type and growth state, relative to the environmental element feature matrix, representing the environmental element features, so that the estimated irrigation amount is adapted not only to the plant type and growth state but also to the growth environment, thereby ensuring the growth requirements of the plants while saving irrigation water.
Specifically, in the technical solution of the application, temperature values and humidity values of the area to be irrigated of the intelligent garden at a plurality of predetermined time points within a predetermined period of time are first acquired. Then, in order to express the environmental element feature information of the area to be irrigated, the correlation between its temperature variation features and its humidity variation features is constructed. Accordingly, the temperature values and the humidity values of the area to be irrigated at the plurality of predetermined time points within the predetermined period of time are further arranged, along the time dimension, into a temperature input vector and a humidity input vector respectively, so as to integrate the temperature data and the humidity data of the area to be irrigated in the time dimension; then, the product between the transposed vector of the temperature input vector and the humidity input vector is calculated to obtain an environment element matrix carrying the correlation between the regional temperature and humidity.
Implicit correlation feature extraction of the environmental elements can then be performed using a convolutional neural network model, which has excellent performance in extracting local implicit correlation features. Considering, however, that the temperature and the humidity among the environmental elements are correlated to a considerable degree, and in order to fully extract the implicit features of the plant growth environment elements in the area to be irrigated so as to accurately estimate the irrigation amount, in the technical solution of the application the environment element matrix is further feature-mined using a first convolutional neural network model whose adjacent layers use mutually transposed convolution kernels, so as to extract deeper and more sufficient implicit feature information of the plant growth environment elements in the area to be irrigated, thereby obtaining the environment element feature matrix. It should be understood that, by using mutually transposed convolution kernels, the adjacent convolution layers of the first convolutional neural network model can simultaneously update the network parameters and search for a network parameter structure suited to the specific data structure during training, thereby improving the accuracy of subsequent classification.
Further, after characteristic excavation is performed on the plant growth environment element characteristics in the to-be-irrigated area, the fact that the water quantity required by different types of plants and different growth states of the same type of plants is different is considered. Therefore, in order to obtain more accurate irrigation quantity to ensure the plant growth requirement and avoid waste, in the technical scheme of the application, the characteristic excavation is also required for the plant type and the growth state in the irrigation area to be irrigated. Specifically, an image of the plant in the area to be irrigated is acquired through a camera, and the image of the plant in the area to be irrigated is processed through a second convolution neural network model serving as a feature extractor, so that implicit feature distribution information about the type and growth state of the plant in the image of the plant in the area to be irrigated is extracted, and a plant feature vector is obtained.
Next, decoding regression is performed based on the transfer vector of the plant feature vector as the plant type and the growth state with respect to the environmental element feature matrix as the environmental element feature as a decoding feature vector to obtain a decoded value representing the recommended irrigation amount. In this way, the recommended irrigation amount decoded can be adapted to the kind and growth state of the plant and to the growth environment.
In particular, in the technical solution of the present application, when calculating the transfer vector of the plant feature vector with respect to the environmental element feature matrix as the decoding feature vector, since the image semantic association feature distribution expressed by the plant feature vector and the time sequence association feature distribution of the temperature and the humidity expressed by the environmental element feature matrix belong to heterogeneous distributions, local abnormal distribution may be introduced in the decoding feature vector, thereby causing generalized deviation of regression in decoding regression.
Therefore, the decoded feature vector is subjected to a differentiable-operator conversion optimization of the regression deviation, specifically:
wherein v_i is the feature value at the i-th position of the decoded feature vector.
That is, for the generalization deviation of the high-dimensional feature distribution of the decoded feature vector under the decoding regression problem caused by the local abnormal distribution, the deviation is converted, through a derivative constraint form based on the generalization convergence rate, into an informational combination of differentiable-operator expressions, so that the decision domain under the regression limit converges under the generalization constraint of the regression problem and the certainty of the generalization result under the target regression problem is improved, that is, the accuracy of the decoded value obtained by passing the decoded feature vector through the decoder is improved. In this way, the irrigation amount can be adaptively controlled according to the actual situation, so that the finally obtained irrigation amount is adapted not only to the type and growth state of the plants but also to their growth environment; the recommended irrigation amount can therefore both guarantee the growth needs of the plants and save irrigation water.
Based on this, the application proposes an automatic irrigation system for intelligent gardens, comprising: the environment data acquisition module is used for acquiring temperature values and humidity values of a plurality of preset time points of an area to be irrigated in the intelligent garden within a preset time period; the environment element association module is used for respectively arranging the temperature values and the humidity values of a plurality of preset time points of the to-be-irrigated area in a preset time period into a temperature input vector and a humidity input vector according to a time dimension, and then calculating the product between the transposed vector of the temperature input vector and the humidity input vector to obtain an environment element matrix; the environment element feature extraction module is used for inputting the environment element matrix into a first convolution neural network model of which adjacent layers use convolution kernels which are transposed with each other to obtain an environment element feature matrix; the irrigation object acquisition module is used for acquiring images of plants in the area to be irrigated, which are acquired by the camera; the irrigation object identification module is used for enabling the image of the plant in the area to be irrigated to pass through a second convolution neural network model serving as a feature extractor so as to obtain a plant feature vector; the transfer module is used for calculating a transfer vector of the plant feature vector relative to the environment element feature matrix to serve as a decoding feature vector; the feature distribution correction module is used for correcting the feature distribution of the decoding feature vector to obtain a corrected decoding feature vector; and the irrigation result generation module is used for enabling the corrected decoding characteristic vector to pass through a decoder to obtain a decoding value, wherein the decoding value is used for representing the recommended irrigation quantity.
Fig. 1 illustrates an application scenario diagram of an automatic irrigation system for intelligent gardens according to an embodiment of the present application. As shown in fig. 1, in this application scenario, first, temperature values and humidity values of a region to be irrigated (e.g., a as illustrated in fig. 1) of a smart garden (e.g., G as illustrated in fig. 1) acquired by a temperature sensor (e.g., se1 as illustrated in fig. 1) and a humidity sensor (e.g., se2 as illustrated in fig. 1) at a plurality of predetermined time points within a predetermined period of time, and images of plants (e.g., P as illustrated in fig. 1) within the region to be irrigated acquired by a camera (e.g., C as illustrated in fig. 1) are acquired. Further, the temperature and humidity values of the area to be irrigated of the smart gardens at a plurality of predetermined time points within a predetermined period of time and the image of the plant within the area to be irrigated are input to a server (e.g., S as illustrated in fig. 1) where an automatic irrigation algorithm of the smart gardens is deployed, wherein the server is capable of processing the input temperature and humidity values of the area to be irrigated of the smart gardens at a plurality of predetermined time points within a predetermined period of time and the image of the plant within the area to be irrigated with the automatic irrigation algorithm of the smart gardens to obtain a decoded value representing a recommended irrigation amount.
Having described the basic principles of the present application, various non-limiting embodiments of the present application will now be described in detail with reference to the accompanying drawings.
Exemplary System
Fig. 2 illustrates a block diagram of an automatic irrigation system for intelligent gardens, according to an embodiment of the present application. As shown in fig. 2, the automatic irrigation system 100 for intelligent gardens according to an embodiment of the present application includes: the environment data acquisition module 110 is used for acquiring temperature values and humidity values of a plurality of preset time points of an area to be irrigated in the intelligent garden in a preset time period; the environmental element association module 120 is configured to arrange the temperature values and the humidity values of the to-be-irrigated area at a plurality of predetermined time points in a predetermined time period into a temperature input vector and a humidity input vector according to a time dimension, and calculate a product between a transposed vector of the temperature input vector and the humidity input vector to obtain an environmental element matrix; the environmental element feature extraction module 130 is configured to input the environmental element matrix into a first convolutional neural network model of an adjacent layer using mutually transposed convolution kernels to obtain an environmental element feature matrix; the irrigation object acquisition module 140 is used for acquiring the image of the plant in the area to be irrigated, which is acquired by the camera; the irrigation object identifying module 150 is configured to pass the image of the plant in the area to be irrigated through a second convolutional neural network model serving as a feature extractor to obtain a plant feature vector; a transfer module 160 for calculating a transfer vector of the plant feature vector with respect to the environmental element feature matrix as a decoded feature vector; a feature distribution correction module 170, configured to correct a feature distribution of the decoded feature vector to obtain a corrected decoded feature vector; and an irrigation result generation module 180, configured to pass the corrected decoded feature vector through a decoder to obtain a decoded value, where the decoded value is used to represent a recommended irrigation amount.
Fig. 3 illustrates a system architecture diagram of an automatic irrigation system for intelligent gardens according to an embodiment of the present application. As shown in fig. 3, in the system architecture, first, temperature values and humidity values of a region to be irrigated of a smart garden at a plurality of predetermined time points within a predetermined period of time and images of plants within the region to be irrigated, which are acquired by a camera, are acquired. And then, respectively arranging the temperature values and the humidity values of a plurality of preset time points of the area to be irrigated into a temperature input vector and a humidity input vector according to a time dimension, and calculating the product between the transposed vector of the temperature input vector and the humidity input vector to obtain an environment element matrix. Then, the environmental element matrix is input into a first convolution neural network model of adjacent layers using convolution kernels which are transposed with each other to obtain an environmental element feature matrix. And then, passing the image of the plant in the area to be irrigated through a second convolution neural network model serving as a feature extractor to obtain plant feature vectors. Then, calculating a transfer vector of the plant feature vector relative to the environmental element feature matrix as a decoding feature vector, and correcting the feature distribution of the decoding feature vector to obtain a corrected decoding feature vector. The corrected decoded feature vector is then passed through a decoder to obtain a decoded value, which is used to represent the recommended irrigation quantity.
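The following shape-level sketch walks through the data flow described above using NumPy and random stand-ins for the learned models. All dimensions (six time points, the stand-in arrays and decoder weights) are illustrative assumptions, not values from the patent, and the feature distribution correction step is skipped because its exact formula is not reproduced in the text.

```python
import numpy as np

T = 6                                            # number of predetermined time points (assumed)
temps = np.random.rand(1, T)                     # temperature input vector arranged along the time dimension (1 x T)
hums = np.random.rand(1, T)                      # humidity input vector arranged along the time dimension (1 x T)
env_matrix = temps.T @ hums                      # environment element matrix (T x T)

env_feat = np.random.rand(T, T)                  # stand-in for the first CNN's environment element feature matrix
plant_vec = np.random.rand(T)                    # stand-in for the second CNN's plant feature vector,
                                                 # assumed already aligned with the matrix's row dimension
decoding_vec = plant_vec @ env_feat              # transfer vector, used as the decoding feature vector (T,)
corrected_vec = decoding_vec                     # feature distribution correction omitted in this sketch
W, b = np.random.rand(1, T), np.random.rand(1)   # stand-in weights of a single fully connected decoder layer
recommended = max((W @ corrected_vec + b).item(), 0.0)  # decoded value, i.e. a recommended irrigation amount
print(env_matrix.shape, decoding_vec.shape, round(recommended, 3))
```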
In the automatic irrigation system 100 for intelligent gardens, the environmental data collection module 110 is configured to obtain temperature values and humidity values of the area to be irrigated for intelligent gardens at a plurality of predetermined time points within a predetermined time period. In the existing landscape irrigation system, there is already an irrigation device capable of walking, the device has a simple structure and a wide irrigation area, but the problem is that the device faces different types of plants to perform indiscriminate irrigation, on one hand, waste of water resources is caused, and on the other hand, the difference of water demand of different plants is not considered, so that the growth of the plants is affected. In addition, when irrigation is performed by using a walking irrigation device, the irrigation amount needs to be adaptively adjusted in combination with weather conditions. Thus, an optimized intelligent garden automatic irrigation scheme is desired.
Specifically, in the technical solution of the application, a deep-learning-based artificial intelligence control algorithm is adopted to perform feature extraction on plant features, such as the plant type and growth state, and on the features of the environmental elements in which the plants are located; judgment and estimation of the recommended irrigation amount is then performed based on the transfer vector of the plant feature vector, representing the plant type and growth state, relative to the environmental element feature matrix, representing the environmental element features, so that the estimated irrigation amount is adapted not only to the plant type and growth state but also to the growth environment, thereby ensuring the growth requirements of the plants while saving irrigation water. Thus, first, temperature values and humidity values of the area to be irrigated of the intelligent garden at a plurality of predetermined time points within a predetermined period of time are acquired, the temperature values and humidity values being collected by a temperature sensor and a humidity sensor.
In the automatic irrigation system 100 for intelligent gardens, the environmental element association module 120 is configured to arrange the temperature values and the humidity values of the area to be irrigated at a plurality of predetermined time points within a predetermined time period into a temperature input vector and a humidity input vector according to the time dimension, respectively, and then calculate the product between the transposed vector of the temperature input vector and the humidity input vector to obtain an environment element matrix. In order to express the environmental element feature information of the area to be irrigated of the intelligent garden, the correlation between its temperature variation features and its humidity variation features is constructed. Accordingly, the temperature values and the humidity values of the area to be irrigated at the plurality of predetermined time points within the predetermined period of time are further arranged, along the time dimension, into a temperature input vector and a humidity input vector respectively, so as to integrate the temperature data and the humidity data of the area to be irrigated in the time dimension; then, the product between the transposed vector of the temperature input vector and the humidity input vector is calculated to obtain an environment element matrix carrying the correlation between the regional temperature and humidity.
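As a concrete illustration of this module, the sketch below arranges the sensor readings into time-dimension vectors and forms the environment element matrix as the product of the transposed temperature input vector and the humidity input vector. The helper name build_environment_matrix and the example readings are hypothetical; the patent provides no code.

```python
import numpy as np

def build_environment_matrix(temperatures, humidities):
    """Arrange the readings at the predetermined time points into a temperature
    input vector and a humidity input vector along the time dimension, then
    return transpose(temperature) @ humidity, a T x T environment element
    matrix carrying the temperature-humidity correlation."""
    t = np.asarray(temperatures, dtype=np.float32).reshape(1, -1)  # 1 x T
    h = np.asarray(humidities, dtype=np.float32).reshape(1, -1)    # 1 x T
    return t.T @ h                                                 # T x T

# Example readings at six predetermined time points (values are illustrative)
temps = [21.3, 22.1, 23.0, 24.2, 24.8, 25.1]   # degrees Celsius
hums = [0.61, 0.58, 0.55, 0.52, 0.50, 0.49]    # relative humidity
env_matrix = build_environment_matrix(temps, hums)
print(env_matrix.shape)  # (6, 6)
```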
In the above-mentioned automatic irrigation system 100 for intelligent gardens, the environmental element feature extraction module 130 is configured to input the environment element matrix into a first convolutional neural network model whose adjacent layers use mutually transposed convolution kernels to obtain the environment element feature matrix. That is, implicit correlation feature extraction of the environmental elements is performed using a convolutional neural network model, which has excellent performance in extracting local implicit correlation features. However, it is considered that the temperature and the humidity among the environmental elements are correlated to a considerable degree. Thus, in order to fully extract the implicit features of the plant growth environment elements in the area to be irrigated and thereby accurately estimate the irrigation amount, in the technical solution of the application the environment element matrix is further feature-mined using the first convolutional neural network model whose adjacent layers use mutually transposed convolution kernels, so as to extract deeper and more sufficient implicit feature information of the plant growth environment elements in the area to be irrigated, thereby obtaining the environment element feature matrix.
In one example, in the above-described automatic irrigation system 100 for intelligent gardens, the environmental element feature extraction module 130 is further configured to: extract a shallow feature map from the M-th layer of the first convolutional neural network model using a shallow feature map extraction unit, wherein M is an even number; extract a deep feature map from the N-th layer of the first convolutional neural network model using a deep feature map extraction unit, wherein N is an even number and N is greater than 2 times M; fuse the shallow feature map and the deep feature map using a feature map fusion unit to generate an environment element feature map; and perform global average pooling along the channel dimension on the environment element feature map using a dimension reduction unit to obtain the environment element feature matrix.
It should be understood that, the adjacent convolution layers of the first convolutional neural network model can update network parameters and search network parameter structures suitable for specific data structures at the same time during training by using the convolution kernels which are transposed with each other, so as to improve the accuracy of subsequent classification.
Fig. 4 illustrates a block diagram of an environmental element feature extraction module in an automatic irrigation system for intelligent gardens, according to an embodiment of the application. As shown in fig. 4, the environmental element feature extraction module 130 includes: a shallow feature map extracting unit 131, configured to extract a shallow feature map from an mth layer of the first convolutional neural network model, where M is an even number; a deep feature map extracting unit 132, configured to extract a deep feature map from an nth layer of the first convolutional neural network model, where N is an even number and N is greater than 2 times of M; a feature map fusion unit 133, configured to fuse the shallow feature map and the deep feature map to generate an environmental element feature map; and a dimension reduction unit 134, configured to perform global averaging processing along a channel dimension on the environmental element feature map to obtain the environmental element feature matrix.
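A hedged PyTorch sketch of this module is given below. The patent does not specify the architecture, so the class name, layer count, channel width, the reading of "mutually transposed convolution kernels" as weight sharing between adjacent layers with the kernel's input and output channels swapped, the fusion of the shallow and deep feature maps by element-wise addition, and the choice M = 2, N = 6 are all assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TransposedKernelCNN(nn.Module):
    """Sketch of the first CNN: each pair of adjacent layers shares one kernel,
    applied once as-is and once with its input/output channels transposed."""

    def __init__(self, channels=8, kernel_size=3, num_pairs=4, shallow_layer=2, deep_layer=6):
        super().__init__()
        assert shallow_layer % 2 == 0 and deep_layer % 2 == 0 and deep_layer > 2 * shallow_layer
        self.shallow_layer, self.deep_layer = shallow_layer, deep_layer
        self.stem = nn.Conv2d(1, channels, kernel_size, padding=kernel_size // 2)  # not counted as a layer below
        self.pair_weights = nn.ParameterList(
            [nn.Parameter(0.1 * torch.randn(channels, channels, kernel_size, kernel_size))
             for _ in range(num_pairs)]
        )
        self.pad = kernel_size // 2

    def forward(self, env_matrix):                       # env_matrix: (B, T, T)
        x = F.relu(self.stem(env_matrix.unsqueeze(1)))   # (B, C, T, T)
        shallow = deep = None
        layer_idx = 0
        for w in self.pair_weights:
            for weight in (w, w.transpose(0, 1)):        # adjacent layers: a kernel and its transpose
                x = F.relu(F.conv2d(x, weight, padding=self.pad))
                layer_idx += 1
                if layer_idx == self.shallow_layer:
                    shallow = x                           # shallow feature map from layer M
                if layer_idx == self.deep_layer:
                    deep = x                              # deep feature map from layer N
        fused = shallow + deep                            # feature map fusion (element-wise sum, assumed)
        return fused.mean(dim=1)                          # global average pooling over channels -> (B, T, T)

# Example: a 6x6 environment element matrix in, a 6x6 environment element feature matrix out
env_feat = TransposedKernelCNN()(torch.randn(1, 6, 6))
print(env_feat.shape)  # torch.Size([1, 6, 6])
```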
In the automatic irrigation system 100 for intelligent gardens, the irrigation object collection module 140 is configured to obtain an image of the plant in the area to be irrigated, which is collected by the camera. Further, after characteristic excavation is performed on the plant growth environment element characteristics in the to-be-irrigated area, the fact that the water quantity required by different types of plants and different growth states of the same type of plants is different is considered. Therefore, in order to obtain more accurate irrigation quantity to ensure the plant growth requirement and avoid waste, in the technical scheme of the application, the characteristic excavation is also required for the plant type and the growth state in the irrigation area to be irrigated. Specifically, firstly, an image of the plant in the area to be irrigated is acquired by a camera.
In the automatic irrigation system 100 for intelligent gardens, the irrigation object identification module 150 is configured to obtain the plant feature vector by passing the image of the plant in the area to be irrigated through a second convolutional neural network model serving as a feature extractor. That is, after obtaining the image of the plant in the area to be irrigated, the image of the plant in the area to be irrigated is processed through a second convolutional neural network model serving as a feature extractor to extract implicit feature distribution information about the type and growth state of the plant in the image of the plant in the area to be irrigated, thereby obtaining a plant feature vector.
In one example, in the above-described automatic irrigation system 100 for intelligent gardens, the irrigation object identification module 150 is further configured to: perform, in the forward pass of each layer of the second convolutional neural network model, the following operations on the input data: convolution processing to obtain a convolution feature map; mean pooling based on a local feature matrix on the convolution feature map to obtain a pooled feature map; and nonlinear activation on the pooled feature map to obtain an activation feature map; wherein the output of the last layer of the second convolutional neural network model is the plant feature vector, and the input of the first layer of the second convolutional neural network model is the image of the plants in the area to be irrigated.
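A minimal PyTorch sketch of the second CNN is shown below. The per-layer sequence of convolution, local mean pooling and nonlinear activation follows the description above, while the class name, depth, channel widths, input resolution, and the final pooling and flattening into a vector are assumptions.

```python
import torch
import torch.nn as nn

class PlantFeatureExtractor(nn.Module):
    """Sketch of the second CNN: each layer applies convolution, mean pooling over
    local patches, then a nonlinear activation; a projection head (assumed) turns
    the last feature map into the plant feature vector."""

    def __init__(self, in_channels=3, widths=(16, 32, 64), feature_dim=128):
        super().__init__()
        layers, prev = [], in_channels
        for w in widths:
            layers += [
                nn.Conv2d(prev, w, kernel_size=3, padding=1),  # convolution -> convolution feature map
                nn.AvgPool2d(kernel_size=2),                   # mean pooling based on local patches
                nn.ReLU(),                                     # nonlinear activation -> activation feature map
            ]
            prev = w
        self.backbone = nn.Sequential(*layers)
        self.head = nn.Sequential(                             # assumed projection to a vector
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(widths[-1], feature_dim)
        )

    def forward(self, plant_image):                            # (B, 3, H, W) image of the plants
        return self.head(self.backbone(plant_image))           # (B, feature_dim) plant feature vector

# Example with a dummy 224x224 RGB image of the plants in the area to be irrigated
plant_vec = PlantFeatureExtractor()(torch.randn(1, 3, 224, 224))
print(plant_vec.shape)  # torch.Size([1, 128])
```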
In the above-mentioned automatic irrigation system 100 for intelligent gardens, the transfer module 160 is configured to calculate the transfer vector of the plant feature vector relative to the environmental element feature matrix as the decoded feature vector. This is done in order to perform judgment and estimation of the recommended irrigation amount based on the transfer vector of the plant feature vector, representing the plant type and growth state, relative to the environmental element feature matrix, representing the environmental element features; that is, decoding regression is performed with this transfer vector as the decoding feature vector.
In one example, in the above-described intelligent garden automatic irrigation system 100, the transfer module 160 is further configured to: calculating a transfer vector of the plant feature vector relative to the environmental element feature matrix by the following formula; wherein, the formula is:
V_c = V_a ⊗ M_b, wherein V_a represents the plant feature vector, M_b represents the environmental element feature matrix, V_c represents the transfer vector, and ⊗ represents multiplication of a vector by a matrix.
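A possible implementation of this vector-by-matrix product is sketched below; the function name and the linear projection that aligns the plant feature vector's dimension with the row dimension of the environment element feature matrix are assumptions, since the patent does not state how the dimensions are matched.

```python
import torch
import torch.nn as nn

def transfer_vector(plant_vec, env_feat_matrix, proj=None):
    """Compute V_c = V_a (x) M_b, the transfer vector of the plant feature vector
    with respect to the environment element feature matrix."""
    if proj is not None:
        plant_vec = proj(plant_vec)                       # (B, D) -> (B, T), assumed alignment step
    # batched vector-matrix product: (B, 1, T) x (B, T, T) -> (B, 1, T)
    return torch.bmm(plant_vec.unsqueeze(1), env_feat_matrix).squeeze(1)

# Example wiring the two previous sketches together (dimensions are illustrative)
B, D, T = 1, 128, 6
plant_vec = torch.randn(B, D)       # plant feature vector
env_feat = torch.randn(B, T, T)     # environment element feature matrix
proj = nn.Linear(D, T)
decoding_vec = transfer_vector(plant_vec, env_feat, proj)
print(decoding_vec.shape)  # torch.Size([1, 6])
```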
In the intelligent garden automatic irrigation system 100, the feature distribution correction module 170 is configured to correct the feature distribution of the decoded feature vector to obtain a corrected decoded feature vector. In particular, in the technical solution of the present application, when calculating the transfer vector of the plant feature vector with respect to the environmental element feature matrix as the decoding feature vector, since the image semantic association feature distribution expressed by the plant feature vector and the time sequence association feature distribution of the temperature and the humidity expressed by the environmental element feature matrix belong to heterogeneous distributions, local abnormal distribution may be introduced in the decoding feature vector, thereby causing generalized deviation of regression in decoding regression. Thus, a micromanipulatable transformation optimization of regression bias is performed on the decoded feature vectors.
In one example, in the above-described intelligent garden automatic irrigation system 100, the feature distribution correction module 170 is further configured to: correcting the feature distribution of the decoding feature vector by the following formula to obtain the corrected decoding feature vector; wherein, the formula is:
wherein v_i represents the feature value at the i-th position of the decoded feature vector, v_i' represents the feature value at the i-th position of the corrected decoded feature vector, and log represents the logarithm with base 2.
That is, for the generalization deviation of the high-dimensional feature distribution of the decoded feature vector under the decoding regression problem caused by the local abnormal distribution, the deviation is converted, through a derivative constraint form based on the generalization convergence rate, into an informational combination of differentiable-operator expressions, so that the decision domain under the regression limit converges under the generalization constraint of the regression problem, and the certainty of the generalization result under the target regression problem, that is, the accuracy of the decoded value obtained by passing the decoded feature vector through the decoder, is improved.
In the automatic irrigation system 100 for intelligent gardens, the irrigation result generation module 180 is configured to pass the corrected decoded feature vector through a decoder to obtain a decoded value, where the decoded value is used to represent the recommended irrigation amount. In this way, the recommended irrigation amount obtained by decoding is adapted to the type and growth state of the plants as well as to their growth environment; that is, the irrigation amount can be adaptively controlled according to the actual situation, so that the finally obtained irrigation amount is adapted not only to the type and growth state of the plants but also to their growth environment, and the recommended irrigation amount can therefore both guarantee the growth needs of the plants and save irrigation water.
In one example, in the above-described automatic irrigation system 100 for intelligent gardens, the irrigation result generation module 180 is further configured to: perform a decoding regression on the corrected decoded feature vector using a plurality of fully connected layers of the decoder to obtain the decoded value, wherein the formula is: Y = h(W ⊗ X + B), where X is the corrected decoded feature vector, Y is the decoded value, W is the weight matrix, B is the bias vector, ⊗ represents matrix multiplication, and h(·) is the activation function.
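A sketch of such a decoder with a plurality of fully connected layers follows; the class name, depth, layer widths, and the use of ReLU as the activation h(·) are assumptions for illustration.

```python
import torch
import torch.nn as nn

class IrrigationDecoder(nn.Module):
    """Decoder sketch: a stack of fully connected layers performing the decoding
    regression Y = h(W X + B) layer by layer, ending in a single decoded value."""

    def __init__(self, in_dim=6, hidden_dims=(32, 16)):
        super().__init__()
        layers, prev = [], in_dim
        for width in hidden_dims:
            layers += [nn.Linear(prev, width), nn.ReLU()]   # per-layer Y = h(W X + B)
            prev = width
        layers += [nn.Linear(prev, 1)]                      # final regression to one decoded value
        self.mlp = nn.Sequential(*layers)

    def forward(self, corrected_vec):                       # (B, in_dim) corrected decoded feature vector
        return self.mlp(corrected_vec)                      # (B, 1) recommended irrigation amount

decoder = IrrigationDecoder()
recommended = decoder(torch.randn(1, 6))
print(float(recommended))  # scalar recommended irrigation amount (arbitrary units)
```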
In summary, the intelligent garden automatic irrigation system 100 according to the embodiment of the present application is illustrated, which performs feature extraction on plant features such as plant types and growth states and environmental element features where plants are located by adopting an artificial intelligent control algorithm based on deep learning, and further performs judgment estimation of recommended irrigation amount based on a transfer vector of a plant feature vector as the plant types and growth states relative to an environmental element feature matrix as the environmental element features, so that the estimated irrigation amount can be adapted to not only plant types and growth states but also growth environments, thereby not only ensuring growth requirements of plants, but also saving irrigation amount.
As described above, the automatic irrigation system 100 for intelligent gardens according to the embodiment of the present application can be implemented in various terminal devices, such as a server for automatic irrigation of intelligent gardens, etc. In one example, the intelligent garden automatic irrigation system 100 according to an embodiment of the present application may be integrated into the terminal device as a software module and/or hardware module. For example, the intelligent garden automatic irrigation system 100 may be a software module in the operating system of the terminal device, or may be an application developed for the terminal device; of course, the intelligent garden automatic irrigation system 100 could also be one of the hardware modules of the terminal device.
Alternatively, in another example, the intelligent-garden automatic irrigation system 100 and the terminal device may be separate devices, and the intelligent-garden automatic irrigation system 100 may be connected to the terminal device through a wired and/or wireless network, and transmit interactive information in a contracted data format.
Exemplary method
Fig. 5 illustrates a flowchart of an automatic irrigation method for intelligent gardens according to an embodiment of the present application. As shown in fig. 5, the automatic irrigation method for intelligent gardens according to the embodiment of the application comprises the following steps: s110, acquiring temperature values and humidity values of a plurality of preset time points of a region to be irrigated in the intelligent garden in a preset time period; s120, after arranging temperature values and humidity values of a plurality of preset time points of the area to be irrigated into a temperature input vector and a humidity input vector according to a time dimension respectively, calculating the product between a transposed vector of the temperature input vector and the humidity input vector to obtain an environment element matrix; s130, inputting the environment element matrix into a first convolution neural network model of which adjacent layers use convolution kernels which are transposed to obtain an environment element feature matrix; s140, acquiring an image of the plant in the area to be irrigated, which is acquired by a camera; s150, passing the image of the plant in the area to be irrigated through a second convolution neural network model serving as a feature extractor to obtain a plant feature vector; s160, calculating a transfer vector of the plant feature vector relative to the environment element feature matrix as a decoding feature vector; s170, correcting the feature distribution of the decoding feature vector to obtain a corrected decoding feature vector; and S180, passing the corrected decoding characteristic vector through a decoder to obtain a decoding value, wherein the decoding value is used for representing the recommended irrigation quantity.
Specifically, in an embodiment of the present application, inputting the environment element matrix into the first convolutional neural network model in which adjacent layers use mutually transposed convolution kernels to obtain the environment element feature matrix includes: extracting a shallow feature map from an M-th layer of the first convolutional neural network model, wherein M is an even number; extracting a deep feature map from an N-th layer of the first convolutional neural network model, wherein N is an even number and is greater than 2 times M; fusing the shallow feature map and the deep feature map to generate an environment element feature map; and performing global average pooling on the environment element feature map along the channel dimension to obtain the environment element feature matrix.
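The following sketch is one plausible PyTorch reading of this embodiment, not the application's own implementation: the layer count, channel width, the additive fusion of the shallow and deep feature maps, and the interpretation of "mutually transposed convolution kernels" as the spatial transpose of a shared kernel are all assumptions made for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EnvFeatureExtractor(nn.Module):
    """Sketch of the first CNN: adjacent layers share a kernel, the second layer of each
    pair applying its spatial transpose; a shallow feature map (layer M) is fused with a
    deep feature map (layer N > 2M) and averaged along the channel dimension."""

    def __init__(self, channels: int = 16, num_layers: int = 6, shallow_at: int = 2, deep_at: int = 6):
        super().__init__()
        assert shallow_at % 2 == 0 and deep_at % 2 == 0 and deep_at > 2 * shallow_at
        self.num_layers, self.shallow_at, self.deep_at = num_layers, shallow_at, deep_at
        self.stem = nn.Conv2d(1, channels, kernel_size=3, padding=1)
        # One learnable kernel per pair of adjacent layers.
        self.kernels = nn.ParameterList(
            [nn.Parameter(torch.randn(channels, channels, 3, 3) * 0.05) for _ in range(num_layers // 2)]
        )

    def forward(self, env_matrix: torch.Tensor) -> torch.Tensor:
        x = F.relu(self.stem(env_matrix))                     # (B, C, H, W)
        shallow = deep = None
        for i in range(self.num_layers):
            w = self.kernels[i // 2]
            if i % 2 == 1:                                    # adjacent layer: transposed kernel
                w = w.transpose(-1, -2).contiguous()
            x = F.relu(F.conv2d(x, w, padding=1))
            if i + 1 == self.shallow_at:
                shallow = x                                   # shallow feature map from layer M
            if i + 1 == self.deep_at:
                deep = x                                      # deep feature map from layer N
        fused = shallow + deep                                # assumed additive fusion
        return fused.mean(dim=1)                              # global average along the channel dimension

# Example with the 8x8 environment element matrix from the previous sketch:
# env_feature = EnvFeatureExtractor()(env_matrix.unsqueeze(0).unsqueeze(0))
```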
Specifically, in an embodiment of the present application, passing the image of the plants in the area to be irrigated through the second convolutional neural network model serving as a feature extractor to obtain the plant feature vector includes performing, in the forward pass of each layer of the second convolutional neural network model: convolution processing on the input data to obtain a convolution feature map; mean pooling based on a local feature matrix on the convolution feature map to obtain a pooled feature map; and nonlinear activation on the pooled feature map to obtain an activated feature map; wherein the output of the last layer of the second convolutional neural network model is the plant feature vector, and the input of the first layer of the second convolutional neural network model is the image of the plants in the area to be irrigated.
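A minimal sketch of such a feature extractor follows; the depth, channel widths, and output dimensionality are assumptions for illustration, since the passage does not specify them.

```python
import torch
import torch.nn as nn

class PlantFeatureExtractor(nn.Module):
    """Sketch of the second CNN used as feature extractor: each layer performs
    convolution, mean pooling over local feature matrices, and nonlinear activation;
    the last layer's output is flattened into the plant feature vector."""

    def __init__(self):
        super().__init__()
        channels = [3, 16, 32, 64]                 # assumed channel progression
        layers = []
        for c_in, c_out in zip(channels[:-1], channels[1:]):
            layers += [
                nn.Conv2d(c_in, c_out, kernel_size=3, padding=1),  # convolution processing
                nn.AvgPool2d(kernel_size=2),                       # mean pooling on local feature matrices
                nn.ReLU(),                                         # nonlinear activation
            ]
        self.body = nn.Sequential(*layers)
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, plant_image: torch.Tensor) -> torch.Tensor:
        x = self.body(plant_image)                 # (B, 64, H/8, W/8)
        return self.pool(x).flatten(1)             # plant feature vector, shape (B, 64)

# Example: plant_vec = PlantFeatureExtractor()(torch.rand(1, 3, 224, 224))
```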
Specifically, in an embodiment of the present application, calculating the transfer vector of the plant feature vector relative to the environment element feature matrix as the decoded feature vector includes calculating the transfer vector by the following formula:
V_c = V_a ⊗ M_b
wherein V_a represents the plant feature vector, M_b represents the environment element feature matrix, V_c represents the transfer vector, and ⊗ denotes vector-matrix multiplication.
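As a numeric illustration of this formula (the dimensions are assumptions, chosen only so that the vector-matrix product is defined):

```python
import torch

V_a = torch.rand(1, 8)     # plant feature vector, assumed length 8 (row vector)
M_b = torch.rand(8, 8)     # environment element feature matrix, assumed 8 x 8

V_c = V_a @ M_b            # transfer vector: V_c = V_a (x) M_b, a vector-matrix product
print(V_c.shape)           # torch.Size([1, 8])
```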
Specifically, in an embodiment of the present application, correcting the feature distribution of the decoded feature vector to obtain the corrected decoded feature vector includes correcting the feature distribution of the decoded feature vector by the following formula:
wherein v_i represents the feature value at the i-th position of the decoded feature vector, v_i' represents the feature value at the i-th position of the corrected decoded feature vector, and log denotes the logarithm to base 2.
Specifically, in an embodiment of the present application, passing the corrected decoded feature vector through a decoder to obtain a decoded value includes performing decoding regression on the corrected decoded feature vector using a plurality of fully connected layers of the decoder to obtain the decoded value, with the formula: Y = h(W ⊗ X + B), wherein X is the corrected decoded feature vector, Y is the decoded value, W is a weight matrix, B is a bias vector, ⊗ denotes matrix multiplication, and h(·) is the activation function.
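A possible sketch of such a decoder follows, with an assumed number of fully connected layers and hidden width; the single output neuron regresses the decoded value representing the recommended irrigation amount.

```python
import torch
import torch.nn as nn

class IrrigationDecoder(nn.Module):
    """Sketch of the decoder: stacked fully connected layers, each computing
    h(W (x) X + B), ending in a single-value regression head."""

    def __init__(self, in_dim: int = 8, hidden: int = 32):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),   # Y = h(W X + B)
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),                   # decoded value: recommended irrigation amount
        )

    def forward(self, corrected_vector: torch.Tensor) -> torch.Tensor:
        return self.layers(corrected_vector)

# Example: irrigation_amount = IrrigationDecoder()(torch.rand(1, 8))
```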
In summary, the automatic irrigation method for intelligent gardens according to the embodiment of the application has been described. It employs a deep-learning-based artificial intelligence control algorithm to extract plant features, such as plant type and growth state, together with the features of the environment in which the plants are located, and then estimates the recommended irrigation amount from the transfer vector of the plant feature vector relative to the environment element feature matrix, so that the estimated irrigation amount is adapted both to the plant type and growth state and to the growth environment, ensuring the growth requirements of the plants while saving irrigation water.
Exemplary electronic device
Next, an electronic device according to an embodiment of the present application is described with reference to fig. 6. Fig. 6 illustrates a block diagram of an electronic device according to an embodiment of the application.
As shown in fig. 6, the electronic device 10 includes one or more processors 11 and a memory 12.
The processor 11 may be a Central Processing Unit (CPU) or other form of processing unit having data processing and/or instruction execution capabilities, and may control other components in the electronic device 10 to perform desired functions.
The memory 12 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random access memory (RAM) and/or cache memory. The non-volatile memory may include, for example, read-only memory (ROM), a hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium, and the processor 11 may execute the program instructions to implement the functions of the automatic irrigation method for intelligent gardens of the various embodiments of the application described above and/or other desired functions. Various contents such as temperature values, humidity values, and plant images may also be stored in the computer-readable storage medium.
In one example, the electronic device 10 may further include: an input device 13 and an output device 14, which are interconnected by a bus system and/or other forms of connection mechanisms (not shown).
The input device 13 may include, for example, a keyboard, a mouse, and the like.
The output device 14 may output various information, including the decoded value, to the outside. The output device 14 may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto.
Of course, for simplicity, only some of the components of the electronic device 10 that are relevant to the present application are shown in Fig. 6; components such as buses and input/output interfaces are omitted. In addition, the electronic device 10 may include any other suitable components depending on the particular application.
Exemplary computer program product and computer-readable storage medium
In addition to the methods and apparatus described above, embodiments of the application may also be a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform the steps of the automatic irrigation method for intelligent gardens according to the various embodiments of the application described in the "Exemplary method" section of this specification.
The program code for carrying out operations of embodiments of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium having computer program instructions stored thereon which, when executed by a processor, cause the processor to perform the steps of the automatic irrigation method for intelligent gardens according to the various embodiments of the present application described in the "Exemplary method" section above.
The computer-readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The basic principles of the present application have been described above in connection with specific embodiments. However, it should be noted that the advantages, benefits, effects, and the like mentioned in the present application are merely examples and are not limiting; these advantages, benefits, and effects are not to be considered essential to the various embodiments of the present application. Furthermore, the specific details disclosed above are for purposes of illustration and ease of understanding only, and are not limiting; the application is not necessarily limited to practice with the above-described specific details.
The block diagrams of the devices, apparatuses, equipment, and systems referred to in the present application are only illustrative examples and are not intended to require or imply that they must be connected, arranged, or configured in the manner shown in the block diagrams. As will be appreciated by those skilled in the art, these devices, apparatuses, equipment, and systems may be connected, arranged, or configured in any manner. Words such as "including", "comprising", and "having" are open-ended words meaning "including but not limited to" and are used interchangeably therewith. The term "or" as used herein refers to, and is used interchangeably with, the term "and/or", unless the context clearly dictates otherwise. The term "such as" as used herein refers to, and is used interchangeably with, the phrase "such as but not limited to".
It is also noted that in the apparatus, devices, and methods of the present application, the components or steps may be decomposed and/or recombined. Such decompositions and/or recombinations should be considered equivalent solutions of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit embodiments of the application to the form disclosed herein. Although a number of example aspects and embodiments have been discussed above, a person of ordinary skill in the art will recognize certain variations, modifications, alterations, additions, and subcombinations thereof.

Claims (3)

1. An automatic irrigation system for intelligent gardens, comprising:
the environment data acquisition module is used for acquiring temperature values and humidity values of a plurality of preset time points of an area to be irrigated in the intelligent garden within a preset time period;
the environment element association module is used for respectively arranging the temperature values and the humidity values of a plurality of preset time points of the to-be-irrigated area in a preset time period into a temperature input vector and a humidity input vector according to a time dimension, and then calculating the product between the transposed vector of the temperature input vector and the humidity input vector to obtain an environment element matrix;
the environment element feature extraction module is used for inputting the environment element matrix into a first convolutional neural network model in which adjacent layers use mutually transposed convolution kernels to obtain an environment element feature matrix;
the irrigation object acquisition module is used for acquiring an image of the plants in the area to be irrigated captured by the camera;
the irrigation object identification module is used for passing the image of the plants in the area to be irrigated through a second convolutional neural network model serving as a feature extractor to obtain a plant feature vector;
the transfer module is used for calculating a transfer vector of the plant feature vector relative to the environment element feature matrix as a decoded feature vector;
the feature distribution correction module is used for correcting the feature distribution of the decoded feature vector to obtain a corrected decoded feature vector; and the irrigation result generation module is used for passing the corrected decoded feature vector through a decoder to obtain a decoded value, the decoded value being used to represent the recommended irrigation amount;
the transfer module is further configured to: calculate the transfer vector of the plant feature vector relative to the environment element feature matrix by the following formula:
V_c = V_a ⊗ M_b
wherein V_a represents the plant feature vector, M_b represents the environment element feature matrix, V_c represents the transfer vector, and ⊗ denotes vector-matrix multiplication;
the feature distribution correction module is further configured to: correct the feature distribution of the decoded feature vector by the following formula to obtain the corrected decoded feature vector:
wherein v_i represents the feature value at the i-th position of the decoded feature vector, v_i' represents the feature value at the i-th position of the corrected decoded feature vector, and log denotes the logarithm to base 2;
the irrigation result generation module is further configured to:
perform decoding regression on the corrected decoded feature vector using a plurality of fully connected layers of the decoder to obtain the decoded value, with the formula: Y = h(W ⊗ X + B), wherein X is the corrected decoded feature vector, Y is the decoded value, W is a weight matrix, B is a bias vector, ⊗ denotes matrix multiplication, and h(·) is the activation function.
2. The intelligent garden automatic irrigation system as claimed in claim 1, wherein the environmental element feature extraction module comprises:
a shallow feature map extracting unit, configured to extract a shallow feature map from an mth layer of the first convolutional neural network model, where M is an even number;
a deep feature map extracting unit, configured to extract a deep feature map from an N-th layer of the first convolutional neural network model, wherein N is an even number and is greater than 2 times M;
a feature map fusion unit, configured to fuse the shallow feature map and the deep feature map to generate an environment element feature map; and a dimension reduction unit, configured to perform global average pooling on the environment element feature map along the channel dimension to obtain the environment element feature matrix.
3. The intelligent garden automatic irrigation system as recited in claim 2, wherein the irrigation object identification module is further configured to:
use each layer of the second convolutional neural network model to perform, in the forward pass of the layer:
convolution processing on input data to obtain a convolution feature map;
mean pooling based on a local feature matrix on the convolution feature map to obtain a pooled feature map; and nonlinear activation on the pooled feature map to obtain an activated feature map;
wherein the output of the last layer of the second convolutional neural network model is the plant feature vector, and the input of the first layer of the second convolutional neural network model is the image of the plants in the area to be irrigated.
CN202211203487.4A 2022-09-29 2022-09-29 Automatic irrigation system for intelligent gardens Active CN115482467B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211203487.4A CN115482467B (en) 2022-09-29 2022-09-29 Automatic irrigation system for intelligent gardens

Publications (2)

Publication Number Publication Date
CN115482467A CN115482467A (en) 2022-12-16
CN115482467B true CN115482467B (en) 2023-09-05

Family

ID=84393163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211203487.4A Active CN115482467B (en) 2022-09-29 2022-09-29 Automatic irrigation system for intelligent gardens

Country Status (1)

Country Link
CN (1) CN115482467B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116739868B (en) * 2023-07-05 2024-04-23 浙江星宸环境建设有限公司 Afforestation management system and method based on artificial intelligence
CN116649159B (en) * 2023-08-01 2023-11-07 江苏慧岸信息科技有限公司 Edible fungus growth parameter optimizing system and method
CN117726308B (en) * 2024-02-18 2024-05-24 中铁水利信息科技有限公司 Intelligent water conservancy management system and method based on Internet of things and 5G
CN117743975A (en) * 2024-02-21 2024-03-22 君研生物科技(山西)有限公司 Hillside cultivated land soil environment improvement method
CN117765403B (en) * 2024-02-22 2024-04-30 山西余得水农牧有限公司 Fertilizing method for improving lodging resistance and grain quality of crops

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110457982A (en) * 2018-12-28 2019-11-15 中国科学院合肥物质科学研究院 A kind of crop disease image-recognizing method based on feature transfer learning
WO2021007363A1 (en) * 2019-07-09 2021-01-14 The Texas A&M University System Irrigation control with deep reinforcement learning and smart scheduling
CN110458335A (en) * 2019-07-23 2019-11-15 华北水利水电大学 Adaptability water-saving irrigation method based on dynamic drought forccast
CN111492959A (en) * 2020-06-02 2020-08-07 山东贵合信息科技有限公司 Irrigation method and equipment based on Internet of things
CN113724504A (en) * 2021-08-06 2021-11-30 之江实验室 Urban area traffic prediction system and method oriented to vehicle track big data
CN113988153A (en) * 2021-09-24 2022-01-28 中国科学院空天信息创新研究院 High-resolution aerosol estimation method based on condition generation countermeasure network
CN113994868A (en) * 2021-09-27 2022-02-01 上海易航海芯农业科技有限公司 Automatic irrigation method and system based on plant growth period
CN114600750A (en) * 2022-03-02 2022-06-10 上海继睿机械工程有限公司 Intelligent water-saving irrigation system and operation method thereof
CN114821253A (en) * 2022-03-18 2022-07-29 北京国垦节水科技有限公司 Method and system for regulating and controlling applicability of water and fertilizer spraying and drip irrigation integrated system
CN114982606A (en) * 2022-05-26 2022-09-02 河南省景观规划设计研究院有限公司 Garden soil intelligent management method and device, computer and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design of an intelligent irrigation system based on the Arduino single-chip microcomputer; Fu Ning et al.; 信息与电脑(理论版) (Information & Computer (Theory Edition)), No. 8; full text *

Also Published As

Publication number Publication date
CN115482467A (en) 2022-12-16

Similar Documents

Publication Publication Date Title
CN115482467B (en) Automatic irrigation system for intelligent gardens
Truong et al. An IoT environmental data collection system for fungal detection in crop fields
CN107392097B (en) Three-dimensional human body joint point positioning method of monocular color video
CN111414987B (en) Training method and training device of neural network and electronic equipment
CN110674323B (en) Unsupervised cross-modal Hash retrieval method and system based on virtual label regression
KR20190041819A (en) Apparatus and method for convolution operation of convolution neural network
CN111325381A (en) Multi-source heterogeneous farmland big data yield prediction method, system and device
CN113034592B (en) Three-dimensional scene target detection modeling and detection method based on natural language description
CN116071667B (en) Method and system for detecting abnormal aircraft targets in specified area based on historical data
CN116307624A (en) Resource scheduling method and system of ERP system
CN116308754A (en) Bank credit risk early warning system and method thereof
CN115984745A (en) Moisture control method for black garlic fermentation
CN116859830A (en) Production management control system for electronic grade ammonium fluoride production
CN116151545A (en) Multi-wind motor group power control optimization system
CN116071601A (en) Method, apparatus, device and medium for training model
CN116624977B (en) Building automatic control system and method based on artificial intelligence
CN117039894B (en) Photovoltaic power short-term prediction method and system based on improved dung beetle optimization algorithm
CN110990630B (en) Video question-answering method based on graph modeling visual information and guided by using questions
CN113344243A (en) Wind speed prediction method and system for optimizing ELM based on improved Harris eagle algorithm
CN115982573A (en) Multifunctional feeder and control method thereof
CN113449893A (en) Insect pest prediction model training method, insect pest prediction method and insect pest prediction device
CN116467485A (en) Video image retrieval construction system and method thereof
Airlangga et al. Initial machine learning framework development of agriculture cyber physical systems
CN116483132A (en) Coal flow control system and method based on drive motor current control coordination
CN115564092A (en) Short-time wind power prediction system and method for wind power plant

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant