CN116721348A - Automatic fertilization control system and method for landscape garden seedlings - Google Patents


Info

Publication number
CN116721348A
CN116721348A (application CN202310742398.5A)
Authority
CN
China
Prior art keywords
seedling
image
feature vector
nursery stock
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310742398.5A
Other languages
Chinese (zh)
Inventor
马璐通
武睿
李松珍
杨松
聂其涛
周楠
苏云鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yiwen Environmental Development Co ltd
Original Assignee
Yiwen Environmental Development Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yiwen Environmental Development Co ltd filed Critical Yiwen Environmental Development Co ltd
Priority to CN202310742398.5A
Publication of CN116721348A
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/188Vegetation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P60/00Technologies relating to agriculture, livestock or agroalimentary industries
    • Y02P60/20Reduction of greenhouse gas [GHG] emissions in agriculture, e.g. CO2
    • Y02P60/21Dinitrogen oxide [N2O], e.g. using aquaponics, hydroponics or efficiency measures

Abstract

Disclosed are an automated fertilization control system and method for landscape garden seedlings. First, a seedling image is collected by a camera; next, a seedling multi-scale feature vector is extracted from the image; finally, whether a fertilization operation is needed is determined based on that feature vector. In this way, the seedling image can be grayscale-processed and analyzed with a deep-learning-based machine vision technique, so that characteristic information about the growth state of the seedlings in the image drives intelligent control of the fertilization operation, thereby optimizing the fertilization effect and efficiency of landscape garden seedlings and ensuring their normal growth.

Description

Automatic fertilization control system and method for landscape garden seedlings
Technical Field
The present disclosure relates to the field of automated control, and more particularly, to a system and method for automated fertilization control of landscape architecture seedlings.
Background
Landscape garden seedlings provide an attractive green environment, improve living comfort, and play a vital role in fields such as urban greening and landscaping. During their cultivation, proper fertilization is essential for healthy growth.
At present, fertilization of landscape garden seedlings typically relies on manual observation and experience. That is, maintenance personnel must patrol the seedlings periodically and judge, by visually inspecting their growth condition, whether fertilization is needed. This approach requires professionals to evaluate the growth state of the seedlings at regular intervals, consuming considerable time and effort at low efficiency. Moreover, because the growth state of seedlings changes quickly, the traditional method often fails to discover in time that fertilization is needed, so fertilization control lags and the normal growth of the seedlings is affected.
Therefore, an optimized automated fertilization control scheme for landscape architecture seedlings is desired.
Disclosure of Invention
In view of this, the disclosure provides an automated fertilization control system and method for landscape architecture seedlings that perform grayscale processing and analysis of seedling images with a deep-learning-based machine vision technique, so that characteristic information about the growth state of the seedlings in the images drives intelligent control of the fertilization operation, thereby optimizing the fertilization effect and efficiency of landscape architecture seedlings and ensuring their normal growth.
According to an aspect of the present disclosure, there is provided an automated fertilization control method for landscape architecture seedlings, comprising: collecting a seedling image through a camera; extracting a seedling multi-scale feature vector from the seedling image; and determining, based on the seedling multi-scale feature vector, whether a fertilization operation is required.
According to another aspect of the present disclosure, there is provided an automated fertilization control system for landscape architecture seedlings, comprising: an image acquisition module for acquiring a seedling image through a camera; a multi-scale extraction module for extracting a seedling multi-scale feature vector from the seedling image; and a fertilization operation control module for determining, based on the seedling multi-scale feature vector, whether a fertilization operation is required.
According to the embodiment of the disclosure, a seedling image is first collected through a camera, a seedling multi-scale feature vector is then extracted from the image, and finally whether a fertilization operation is needed is determined based on that feature vector. In this way, the seedling image can be grayscale-processed and analyzed with a deep-learning-based machine vision technique, so that characteristic information about the growth state of the seedlings drives intelligent control of the fertilization operation, optimizing the fertilization effect and efficiency of landscape garden seedlings and ensuring their normal growth.
Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments, features and aspects of the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 shows a flowchart of a landscape architecture seedling automated fertilization control method according to an embodiment of the present disclosure.
Fig. 2 shows a schematic architecture diagram of a landscape architecture seedling automated fertilization control method according to an embodiment of the present disclosure.
Fig. 3 shows a flowchart of substep S120 of the automated fertilization control method for landscape architecture seedlings according to an embodiment of the present disclosure.
Fig. 4 shows a flowchart of substep S123 of the automated fertilization control method for landscape architecture seedlings according to an embodiment of the present disclosure.
Fig. 5 shows a block diagram of a landscape architecture seedling automated fertilization control system according to an embodiment of the present disclosure.
Fig. 6 shows an application scenario diagram of a landscape architecture seedling automated fertilization control method according to an embodiment of the present disclosure.
Detailed Description
The following description of the embodiments of the present disclosure will be made clearly and fully with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some, but not all embodiments of the disclosure. All other embodiments, which can be made by one of ordinary skill in the art without undue burden based on the embodiments of the present disclosure, are also within the scope of the present disclosure.
As used in this disclosure and in the claims, the terms "a," "an," and "the" do not denote the singular but may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the listed steps and elements are explicitly identified; they do not constitute an exclusive list, and a method or apparatus may include other steps or elements.
Various exemplary embodiments, features and aspects of the disclosure will be described in detail below with reference to the drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Although various aspects of the embodiments are illustrated in the accompanying drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
In addition, numerous specific details are set forth in the following detailed description in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements, and circuits well known to those skilled in the art have not been described in detail in order not to obscure the present disclosure.
The traditional fertilization method for landscape garden seedlings requires professionals to evaluate the growth state of the seedlings at regular intervals, consuming considerable time and effort at low efficiency; moreover, because the growth state of seedlings changes quickly, it often fails to discover in time that fertilization is needed, so fertilization control lags and the normal growth of the seedlings is affected. Therefore, an optimized automated fertilization control scheme for landscape architecture seedlings is desired.
Accordingly, in the process of automated fertilization control of landscape architecture seedlings, the growth state of the seedlings must be analyzed so as to judge accurately whether a fertilization operation is required. However, in an automated fertilization control system for landscape nursery stock, the growth state features of the seedlings are small-scale feature information, easily disturbed by environmental factors, so the detection precision of seedling growth information is low. Therefore, in the technical scheme of the disclosure, a deep-learning-based machine vision technique is used to perform grayscale processing and analysis of the seedling image, so that the growth-state feature information of the seedlings in the image drives intelligent control of the fertilization operation, optimizing the fertilization effect and efficiency of the landscape garden seedlings and ensuring their normal growth.
Fig. 1 shows a flowchart of a landscape architecture seedling automated fertilization control method according to an embodiment of the present disclosure. Fig. 2 shows a schematic architecture diagram of a landscape architecture seedling automated fertilization control method according to an embodiment of the present disclosure. As shown in fig. 1 and 2, the automated fertilization control method for landscape architecture seedlings according to the embodiment of the present disclosure includes the steps of: s110, acquiring a seedling image through a camera; s120, extracting a seedling multiscale feature vector from the seedling image; and S130, determining whether fertilization operation is needed or not based on the seedling multiscale feature vector.
More specifically, in step S110, a seedling image is acquired by a camera. In one example, a high-resolution digital camera may be used to capture images of the nursery stock. Selecting a digital camera with high resolution and clear image quality ensures that the details and features of the seedlings are captured. A proper shooting angle should be chosen so that the overall morphology and features of the seedlings can be captured completely, and shooting from different angles may be attempted to obtain more comprehensive information. The shooting environment should have suitable lighting: bright, even light yields clearer images and aids extraction of the seedling features. Excessive background clutter should be avoided during shooting so as not to impair image clarity and feature extraction; a clean, simple background better highlights the seedlings. Finally, the focal length and focus of the camera should be adjusted according to the size of the seedlings and the required feature extraction, to ensure the clarity and detail of the seedling image. Thus, when photographing landscape garden seedlings, choosing a suitable camera and attending to shooting angle, lighting conditions, background interference, and the adjustment of focal length and focus improves the quality of the seedling image and the accuracy of feature extraction.
More specifically, in step S120, a nursery stock multiscale feature vector is extracted from the nursery stock image. Accordingly, in one possible implementation, as shown in fig. 3, extracting the multi-scale feature vector of the seedling from the seedling image includes: s121, carrying out gray scale processing on the seedling image to obtain a gray scale seedling image; s122, carrying out local feature extraction on the gray nursery stock image to obtain a nursery stock local feature vector; s123, carrying out global feature extraction on the gray nursery stock image to obtain a nursery stock global feature vector; and S124, fusing the local characteristic vector of the seedling and the global characteristic vector of the seedling by using a cascading function to obtain the multi-scale characteristic vector of the seedling.
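The cascade fusion in step S124 amounts to feature concatenation. A minimal sketch of the multi-scale fusion step, with illustrative vector dimensions (the patent does not specify them):

```python
import numpy as np

def fuse_multiscale(local_vec, global_vec):
    """Cascade (concatenation) fusion of the seedling local and global
    feature vectors, as in step S124. Dimensions are illustrative."""
    return np.concatenate([local_vec, global_vec], axis=-1)

local_vec = np.random.rand(128)   # hypothetical CNN local feature vector
global_vec = np.random.rand(256)  # hypothetical ViT global feature vector
multiscale = fuse_multiscale(local_vec, global_vec)  # shape (384,)
```

The cascade function simply stacks the two representations, so the downstream classifier sees both the fine local features and the global context features in one vector.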
When the growth state of the seedlings is actually detected, the seedling image may be affected by factors such as illumination conditions or camera performance, producing noise that in turn affects detection and evaluation of the growth state. In addition, the growth state features of the seedlings in the image are easily disturbed by the environmental background. Therefore, grayscale processing is performed on the seedling image before feature extraction to obtain a grayscale seedling image. Grayscale processing converts a color image to a grayscale image, i.e., it combines the RGB values of each pixel into a single gray value. This reduces the influence of environmental noise and the interference caused by color differences; moreover, using the grayscale image for subsequent feature extraction effectively reduces the amount of computation, helps prevent overfitting, and facilitates subsequent image processing and analysis.
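The grayscale conversion described above can be sketched as a weighted combination of the RGB channels; the ITU-R BT.601 luminance weights used here are one common choice, not one specified by the patent:

```python
import numpy as np

def to_grayscale(rgb):
    """Combine the R, G, B values of each pixel into a single gray value
    using the common ITU-R BT.601 luminance weights (an illustrative choice)."""
    weights = np.array([0.299, 0.587, 0.114])
    return rgb @ weights  # (H, W, 3) -> (H, W)

img = np.ones((4, 4, 3)) * 255.0  # small all-white test image
gray = to_grayscale(img)          # every pixel maps to 255.0
```

Because the weights sum to 1, a white pixel (255, 255, 255) maps to a gray value of 255, preserving the dynamic range of the original image.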
Accordingly, in one possible implementation, performing local feature extraction on the grayscale seedling image to obtain the seedling local feature vector includes passing the grayscale seedling image through a local feature extractor based on a convolutional neural network model. A convolutional-neural-network-based local feature extractor, which performs well at extracting hidden image features, mines the grayscale seedling image to extract local hidden feature distribution information about the growth state of the seedlings, yielding the seedling local feature vector. By extracting local features from the grayscale image, different parts of the seedlings, including leaves, branches, and other detailed features, can be better distinguished, which benefits accurate detection of the growth state.
It should be appreciated that a convolutional neural network (CNN) is a deep learning model particularly suited to image processing tasks. By mimicking the structure and function of the human visual system, a CNN can effectively extract local features of an image. It consists of convolutional layers, pooling layers, and fully connected layers. In image processing, a convolutional layer convolves the image in a sliding-window fashion, extracting local features that capture information such as edges, textures, and shapes. In the embodiment of the disclosure, the grayscale seedling image is processed by a local feature extractor based on a convolutional neural network model to obtain the seedling local feature vector. Here, the local feature extractor is the part of the network that performs feature extraction using the convolutional neural network model. By training the model, a feature extraction scheme suited to seedling images can be learned, producing local feature vectors that can be used for subsequent fertilization control decisions. The specific architecture and parameter settings of the model may be designed and adjusted for the particular application scenario and dataset to obtain the best feature extraction effect.
In one example, passing the grayscale seedling image through the local feature extractor based on the convolutional neural network model to obtain the seedling local feature vector includes: in the forward pass of each layer of the extractor, performing two-dimensional convolution, feature-matrix-based mean pooling, and nonlinear activation on the input data, so that the last layer outputs the seedling local feature vector, where the input to the first layer is the grayscale seedling image.
It is worth mentioning that, in the local feature extractor based on the convolutional neural network model, two-dimensional convolution, feature-matrix-based mean pooling, and nonlinear activation are used to extract features from the input data and apply nonlinear transformations. Two-dimensional convolution (2D Convolution) is one of the core operations of a convolutional neural network: in the forward pass of each layer of the local feature extractor, the input data is convolved with a set of learnable convolution kernels (also called filters) in a sliding-window fashion to extract local features of the image. Each kernel multiplies a small region of the input element-wise and sums the results to produce an output feature map; different kernels extract different features. Average pooling (Average Pooling) over the feature matrix reduces the size of the feature map while preserving important features: the feature map is divided into non-overlapping regions, the features in each region are averaged, and the average serves as the output of that region. Nonlinear activation functions (Nonlinear Activation), typically added after convolution and pooling, introduce nonlinear transformations; common choices include ReLU (Rectified Linear Unit), Sigmoid, and Tanh.
The activation function increases the expressive power of the network by nonlinearly mapping the features, so that the network can learn more complex feature representations. Through these operations, the local feature extractor extracts local features of the seedlings from the grayscale seedling image and converts them into a feature vector representation for subsequent seedling classification or other tasks.
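The three per-layer operations above (convolution, mean pooling, activation) can be sketched from scratch in numpy; this is a toy single-channel layer for illustration, not the patent's trained extractor, and the kernel here is random rather than learned:

```python
import numpy as np

def conv2d(x, kernel):
    """Valid 2-D convolution of a single-channel image via a sliding window."""
    kh, kw = kernel.shape
    h, w = x.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # multiply a small region element-wise and sum the results
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

def mean_pool(x, size=2):
    """Average pooling over non-overlapping size x size regions."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).mean(axis=(1, 3))

def relu(x):
    """Nonlinear activation: rectified linear unit."""
    return np.maximum(x, 0.0)

# One forward layer: convolution -> mean pooling -> nonlinear activation
gray = np.random.rand(8, 8)     # stand-in for the grayscale seedling image
kernel = np.random.randn(3, 3)  # a "learnable" kernel (random here)
feat = relu(mean_pool(conv2d(gray, kernel)))  # (8,8)->(6,6)->(3,3)
local_vector = feat.flatten()   # flattened local feature vector, length 9
```

In practice each layer holds many such kernels and the kernels are learned by backpropagation; the sketch only shows the data flow of a single layer's forward pass.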
Given the inherent limitations of convolution operations, pure CNN methods have difficulty learning explicit global and long-range semantic interactions. Moreover, the hidden features of the grayscale seedling image that relate to growth state are small-scale, fine features that are difficult to capture and extract. Therefore, to improve the expression of these hidden small-scale fine features and thereby improve the precision of fertilization control, in the technical scheme of the disclosure the grayscale seedling image is divided into blocks and then encoded by a ViT model containing an embedding layer, extracting the hidden local context semantic association feature distribution of the growth state in the image and yielding a seedling global feature vector. It should be understood that, within each image block obtained by partitioning the grayscale seedling image, the implicit features related to growth state are no longer small-scale feature information, which benefits their subsequent extraction. In particular, the embedding layer linearly projects each image block into a one-dimensional embedding vector via a learnable embedding matrix: the pixel values at all positions in each block are first arranged into a one-dimensional vector, which is then encoded by a fully connected layer.
And, here, the ViT model can process the image blocks directly through a Transformer-style self-attention mechanism, extracting implicit context semantic association features of the growth state across the whole grayscale seedling image, i.e., global context association features of the growth state of the seedlings in the grayscale seedling image. Accordingly, in one possible implementation, performing global feature extraction on the grayscale seedling image to obtain the seedling global feature vector includes extracting features from the grayscale seedling image with a global feature extractor based on a deep neural network model. Accordingly, in one possible implementation, the deep neural network model is a ViT model.
It should be appreciated that the ViT (Vision Transformer) model is an image classification model based on the Transformer architecture. Conventional convolutional neural networks (CNNs) perform well in image processing tasks but are computationally and memory intensive for larger images and complex visual scenes. The ViT model achieves global feature extraction by introducing the Transformer's self-attention mechanism: the image is divided into a series of blocks, which are processed by a multi-layer Transformer encoder. The core idea is to treat the image as sequential data, passing each image block as an input token to the Transformer encoder. Through self-attention, the model learns the relationships between image blocks and their context, thereby capturing global features of the image. Finally, a fully connected layer maps the global features to specific categories for image classification. Compared with conventional convolutional models, the ViT model offers better scalability and computational efficiency on large images and can adapt to different application fields through pre-training and fine-tuning. The ViT model targets image classification; other image processing tasks (e.g., object detection, image segmentation) may require corresponding modifications.
Accordingly, in one possible implementation, as shown in fig. 4, extracting features from the grayscale seedling image with the global feature extractor based on a deep neural network model to obtain the seedling global feature vector includes: S1231, partitioning the grayscale seedling image into a plurality of grayscale seedling image blocks; S1232, embedding each of the grayscale seedling image blocks with the embedding layer of the ViT model to obtain a plurality of grayscale seedling image block embedding vectors; S1233, passing the plurality of embedding vectors through the Transformer encoder of the ViT model to obtain a plurality of contextual seedling image feature vectors; S1234, performing feature distribution optimization on the contextual seedling image feature vectors to obtain a plurality of optimized contextual seedling image feature vectors; and S1235, cascading the optimized contextual seedling image feature vectors to obtain the seedling global feature vector.
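Steps S1231 and S1232 (block partitioning and linear embedding) can be sketched as follows; the block size and embedding dimension are illustrative choices, and the embedding matrix here is random rather than learned:

```python
import numpy as np

rng = np.random.default_rng(0)

def patchify(img, p):
    """S1231: split a square grayscale image into non-overlapping p x p blocks,
    each flattened into a one-dimensional vector of pixel values."""
    h, w = img.shape
    return img.reshape(h // p, p, w // p, p).swapaxes(1, 2).reshape(-1, p * p)

def embed(blocks, W):
    """S1232: linearly project each flattened block with a learnable
    embedding matrix W (i.e., a fully connected encoding)."""
    return blocks @ W

img = rng.random((16, 16))          # stand-in grayscale seedling image
blocks = patchify(img, 4)           # 16 blocks, each with 16 pixel values
W = rng.standard_normal((16, 32))   # embedding matrix (dims illustrative)
tokens = embed(blocks, W)           # 16 embedding vectors of dimension 32
```

These token vectors are what the Transformer encoder of step S1233 would consume to produce the contextual seedling image feature vectors.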
In particular, in the technical scheme of the disclosure, when the grayscale seedling image is passed through the ViT-based global feature extractor to obtain the seedling global feature vector, the contextual seedling image feature vector obtained from each image block expresses the context-associated features of that block's local image semantics, so each contextual feature vector can express image semantic features in the global context-association dimension. However, when the plurality of contextual seedling image feature vectors are simply cascaded into the seedling global feature vector, it is still desirable to strengthen the global relevance among them, that is, to improve the global-association expression effect of the seedling global feature vector over the plurality of contextual seedling image feature vectors.
Here, because the seedling global feature vector is obtained by cascading the contextual seedling image feature vectors, it can be regarded as an overall feature set formed from the local feature sets of the individual contextual seedling image feature vectors. And, because each of the contextual seedling image feature vectors follows the local image semantic association distribution of the image blocks of the grayscale seedling image within the whole image, the contextual seedling image feature vectors within the seedling global feature vector have not only a mutually associated neighborhood distribution relationship in the feature-set dimension, but also a multi-source information association relationship corresponding to the local image semantic association distribution of the whole grayscale seedling image under the global image. Thus, to promote the overall association distribution expression effect of the seedling global feature vector over the plurality of contextual seedling image feature vectors, the applicants of the present disclosure perform multi-source information fusion pre-verified distribution evaluation optimization on each contextual seedling image feature vector, denoted V_i, to obtain an optimized contextual seedling image feature vector V̂_i.
Accordingly, in one possible implementation, performing feature distribution optimization on the plurality of contextual seedling image feature vectors to obtain the plurality of optimized contextual seedling image feature vectors includes carrying out multi-source information fusion pre-verified distribution evaluation optimization on each contextual seedling image feature vector according to the following optimization formula:
V̂_i = (1 / (2k + 1)) · Σ_{j = i−k}^{i+k} ( V_j ⊖ log₂( V_j ⊖ V̄ ) )
wherein V_i is the i-th contextual seedling image feature vector among the plurality of contextual seedling image feature vectors, V_j is the j-th contextual seedling image feature vector within the neighborhood of V_i, V̄ is the mean feature vector of the plurality of contextual seedling image feature vectors, k is a neighborhood setting hyperparameter, log₂ denotes the logarithm with base 2, ⊖ denotes position-wise subtraction, and V̂_i is the i-th optimized contextual seedling image feature vector. And when j is less than or equal to zero or greater than the number of contextual seedling image feature vectors, V_j may be taken as an all-zero vector or a unit vector.
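One possible reading of this neighborhood-based optimization can be sketched in numpy. Note that the formula in the published text is garbled, so this sketch is an illustrative reconstruction, not the patent's exact formula; the absolute value and the small epsilon in the logarithm are added here purely to keep the computation well defined:

```python
import numpy as np

def optimize_context_vectors(V, k=1):
    """Illustrative neighborhood optimization of contextual feature vectors:
    each vector is replaced by an average over its k-neighborhood, corrected
    by a log2 term taken against the mean feature vector. Out-of-range
    neighbors are taken as all-zero vectors, per the patent's boundary rule."""
    n, d = V.shape
    v_bar = V.mean(axis=0)          # mean feature vector over all vectors
    out = np.zeros_like(V)
    for i in range(n):
        acc = np.zeros(d)
        for j in range(i - k, i + k + 1):
            vj = V[j] if 0 <= j < n else np.zeros(d)
            # abs + epsilon keep the log2 defined (added for this sketch)
            acc += vj - np.log2(np.abs(vj - v_bar) + 1e-8)
        out[i] = acc / (2 * k + 1)
    return out

V = np.random.rand(6, 4)            # six contextual feature vectors, dim 4
V_opt = optimize_context_vectors(V, k=1)
```

The sketch only illustrates the data flow (per-vector neighborhood aggregation against a global mean); the precise weighting in the original claim cannot be recovered from the published text.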
Here, the multi-source information fusion pre-verification distribution evaluation optimization treats the feature local set formed by a plurality of mutually associated neighborhood parts with a quasi-maximum-likelihood estimation of feature distribution fusion robustness, thereby achieving an effective folding of the pre-verification information of each contextual seedling image feature vector onto the local synthesis distribution. By constructing the pre-verification distribution under the multi-source condition, an optimization paradigm is obtained that evaluates the standard expected fusion information between the intra-set association and the inter-set change relationship. This improves the information expression effect of the multi-source information association fusion among the contextual seedling image feature vectors, and thereby improves the overall association distribution expression effect of the seedling global feature vector on the contextual seedling image feature vectors. In this way, adaptive control of the fertilization operation can be performed based on the growth state of the seedlings, so that the fertilization effect and efficiency for landscape garden seedlings are optimized and the normal growth of the seedlings is ensured.
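The neighborhood-based optimization described here can be sketched in NumPy. Note that the combination below — adding the average of the two $k$-neighbors to each vector and subtracting the position-wise base-2 log of the mean vector, with out-of-range neighbors replaced by all-zero vectors — is one plausible reading of the patent's description, not a verbatim transcription of its formula:

```python
import numpy as np

def optimize_context_vectors(vectors, k=1, eps=1e-6):
    """Multi-source pre-verification distribution optimization sketch.

    vectors: (N, D) array of contextual seedling image feature vectors.
    k: neighborhood-setting hyperparameter; out-of-range neighbors
    are replaced by all-zero vectors, as the text allows.
    The exact combination rule is an assumption.
    """
    n, d = vectors.shape
    mean_vec = vectors.mean(axis=0)
    # position-wise base-2 log of the mean vector; eps guards log of zero
    log_term = np.log2(np.abs(mean_vec) + eps)
    out = np.empty_like(vectors)
    for i in range(n):
        left = vectors[i - k] if i - k >= 0 else np.zeros(d)
        right = vectors[i + k] if i + k < n else np.zeros(d)
        out[i] = vectors[i] + 0.5 * (left + right) - log_term  # position-wise subtraction
    return out

optimized = optimize_context_vectors(np.random.rand(5, 8), k=1)
```

The optimized vectors keep the same shape as the inputs, so the subsequent cascade into the global feature vector is unchanged.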
It should be appreciated that multi-source information fusion pre-verification distribution evaluation optimization is a method for fusing data from multiple information sources and for evaluating and optimizing the fused data. In many practical problems, data may be acquired from different sources that have different characteristics and errors; the goal of multi-source information fusion is to fuse these data into a more accurate and reliable result. In multi-source information fusion, the pre-verification distribution refers to the prior distribution of each information source's data before fusion, and the aim of optimization is to improve the fused result by considering the reliability and accuracy of each information source and the correlations between the sources. The process of multi-source information fusion pre-verification distribution evaluation optimization generally includes the following steps: 1. data fusion: fuse the data from the different information sources, which may use statistical methods, machine learning methods, or other fusion techniques; 2. pre-verification distribution evaluation: evaluate the data of each information source, taking into account factors such as accuracy, reliability, and weight, to obtain the pre-verification distribution of each information source; 3. correlation analysis: analyze the correlation between the different information sources to understand their degree of association; correlation analysis methods can be used to evaluate the similarity or relatedness between sources; 4. optimization of the fusion result: optimize the fused result according to the pre-verification distributions of the information sources and the correlation analysis results, for example using optimization algorithms or Bayesian reasoning.
Through multisource information fusion pre-verification distribution evaluation optimization, accuracy and reliability of data fusion results can be improved, and therefore uncertainty and noise in practical problems can be better dealt with.
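The four-step process above can be illustrated with a simple precision-weighted (inverse-variance) fusion, one common statistical choice for the data-fusion and optimization steps; the sensor readings, variances, and the fusion rule here are illustrative assumptions, not the patent's prescribed method:

```python
import numpy as np

def fuse_sources(estimates, variances):
    """Precision-weighted fusion of several noisy estimates of the same
    quantity: more reliable (lower-variance) sources get larger weights."""
    estimates = np.asarray(estimates, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)  # per-source reliability
    fused = np.sum(weights * estimates) / np.sum(weights)
    fused_var = 1.0 / np.sum(weights)  # fused result is more certain than any single source
    return fused, fused_var

# Three hypothetical sensors reading the same soil-nutrient level.
value, var = fuse_sources([10.0, 12.0, 11.0], [1.0, 4.0, 2.0])
```

The fused variance is smaller than the best individual source's variance, which is the sense in which fusion "better deals with uncertainty and noise."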
Then, the seedling local feature vector and the seedling global feature vector are fused using a cascading function to obtain the seedling multi-scale feature vector, so as to express the fused association feature distribution information between the local detail features (leaves and branches) and the global features of the seedling in the seedling image, which is beneficial to detecting the growth state of the seedling and thereby improves the classification accuracy. It should be understood that a cascading function refers to an operation that joins or concatenates multiple vectors or features. In seedling image processing, a cascading function can be used to fuse the local feature vector and the global feature vector of the seedling to obtain its multi-scale feature vector. Specifically, assuming that the local feature vector is v_local and the global feature vector is v_global, a cascading function can cascade them into a new feature vector v_concat, which represents the fused association feature distribution information of the local detail features and the global features of the seedling in the seedling image. The cascading function may simply connect the two vectors together in a fixed order to form a longer vector. In this case, the cascading function can be expressed as: v_concat = [v_local, v_global], where the square brackets denote the concatenation operation and the comma marks the junction. By fusing the local and global feature vectors through the cascading function, the local and overall information in the seedling image can be considered comprehensively, improving the detection and classification precision of the growth state of the seedlings.
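The cascading operation described here is a one-line concatenation in practice; the vector dimensions below are illustrative placeholders, not values specified by the patent:

```python
import numpy as np

v_local = np.random.rand(128)   # local (CNN) feature vector, illustrative size
v_global = np.random.rand(256)  # global (ViT) feature vector, illustrative size

# v_concat = [v_local, v_global]: joins the two vectors end to end
v_concat = np.concatenate([v_local, v_global])
```

The resulting multi-scale vector simply has the combined length of its two parts, so both scales of information are preserved for the downstream classifier.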
More specifically, in step S130, it is determined whether a fertilization operation is required based on the seedling multi-scale feature vector. Accordingly, in one possible implementation, determining whether a fertilization operation is required based on the seedling multi-scale feature vector includes: passing the seedling multi-scale feature vector through a classifier to obtain a classification result, wherein the classification result is used to indicate whether a fertilization operation is required. That is, in the technical solution of the present disclosure, the labels of the classifier include "a fertilization operation is required" (first label) and "a fertilization operation is not required" (second label), and the classifier determines, through a soft maximum (Softmax) function, to which classification label the seedling multi-scale feature vector belongs. It should be noted that the first label p1 and the second label p2 do not carry a manually set concept; in fact, during training, the computer model has no notion of "whether fertilization is required" — these are merely two classification labels, and the model outputs the probability of the features under the two labels, i.e., the sum of p1 and p2 is one. Therefore, the classification result of whether a fertilization operation is required is actually converted, through the classification labels, into a classification probability distribution conforming to natural law; what is used is essentially the physical meaning of the natural probability distribution of the labels rather than the linguistic meaning of "whether a fertilization operation is required".
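The Softmax step that turns the two label scores into the complementary probabilities p1 and p2 can be sketched as follows; the logit values are made up for illustration:

```python
import numpy as np

def softmax(logits):
    """Softmax over class scores; subtracting the max is for numerical stability."""
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

logits = np.array([2.0, -1.0])  # scores for the two labels (illustrative)
p1, p2 = softmax(logits)        # p1 + p2 == 1 by construction; larger logit wins
```

This is exactly why the two labels need no manually set meaning: Softmax only guarantees a valid probability distribution over them.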
It should be understood that, in the technical scheme of the disclosure, the classification label of the classifier is a control policy label of whether fertilization operation is needed, so after the classification result is obtained, intelligent control of fertilization operation can be performed based on the classification result, thereby optimizing fertilization effect and efficiency of landscape garden seedlings.
It should be appreciated that the role of the classifier is to learn classification rules from given, labeled training data and then classify (or predict) unknown data. Logistic regression, SVM, and the like are commonly used to solve binary classification problems; for multi-class classification, logistic regression or SVM can also be used, but multiple binary classifiers must be combined, which is error-prone and inefficient. The commonly used multi-classification method is the Softmax classification function.
In one example, the multi-scale feature vector of the seedling is passed through a classifier to obtain a classification result, where the classification result is used to indicate whether fertilization is required, and the method includes: performing full-connection coding on the seedling multi-scale feature vectors by using a full-connection layer of the classifier to obtain coded classification feature vectors; and inputting the coding classification feature vector into a Softmax classification function of the classifier to obtain the classification result.
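A minimal sketch of this classifier head — a fully-connected encoding of the multi-scale feature vector followed by a Softmax over the two control labels. The layer sizes are placeholders and the random weights stand in for trained parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def classifier_head(feature_vec, hidden=64, n_classes=2):
    """Fully-connected coding of the multi-scale feature vector,
    then Softmax over the classification labels (untrained weights)."""
    d = feature_vec.shape[0]
    w1 = rng.standard_normal((hidden, d)) * 0.01
    b1 = np.zeros(hidden)
    w2 = rng.standard_normal((n_classes, hidden)) * 0.01
    b2 = np.zeros(n_classes)
    encoded = np.maximum(w1 @ feature_vec + b1, 0.0)  # full-connection coding + ReLU
    logits = w2 @ encoded + b2
    e = np.exp(logits - logits.max())
    return e / e.sum()  # classification probability distribution

probs = classifier_head(np.random.rand(384))
```

In a real system these weights would be learned end to end; the sketch only shows the shape of the computation.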
In summary, according to the automated fertilization control method for landscape garden seedlings of the embodiments of the present disclosure, grayscale processing and analysis can be performed on the seedling image based on a deep-learning machine vision technique, so that intelligent control of the fertilization operation is performed using the growth state feature information about the seedlings in the image, thereby optimizing the fertilization effect and efficiency for landscape garden seedlings and ensuring the normal growth of the seedlings.
Fig. 5 shows a block diagram of a landscape architecture seedling automated fertilization control system 100 according to an embodiment of the present disclosure. As shown in fig. 5, the automated fertilization control system 100 for landscape architecture seedlings according to the embodiment of the present disclosure includes: the image acquisition module 110 is used for acquiring a seedling image through a camera; a multi-scale extraction module 120, configured to extract a multi-scale feature vector of the seedling from the seedling image; and a fertilizing operation control module 130, configured to determine whether a fertilizing operation is required based on the multi-scale feature vector of the seedling.
In one possible implementation, the multi-scale extraction module 120 includes: the gray processing unit is used for carrying out gray processing on the seedling image to obtain a gray seedling image; the local feature extraction unit is used for extracting local features of the gray nursery stock image to obtain nursery stock local feature vectors; the global feature extraction unit is used for carrying out global feature extraction on the gray nursery stock image to obtain a nursery stock global feature vector; and a fusion unit for fusing the local feature vector of the seedling and the global feature vector of the seedling by using a cascading function to obtain the multi-scale feature vector of the seedling.
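The module decomposition above maps onto a straightforward pipeline. In the sketch below, the extractor stubs stand in for the CNN local extractor and the ViT global extractor (their output sizes and the image size are assumptions); only the gray processing and the cascade fusion follow standard practice:

```python
import numpy as np

def to_gray(rgb):
    """Gray processing unit: ITU-R BT.601 luma weights."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def extract_local(gray):
    """Stand-in for the CNN-based local feature extractor."""
    return np.histogram(gray, bins=128, range=(0, 1))[0].astype(float)

def extract_global(gray):
    """Stand-in for the ViT-based global feature extractor:
    per-block means over an 8x8 grid of image blocks."""
    blocks = gray.reshape(8, gray.shape[0] // 8, 8, gray.shape[1] // 8)
    return blocks.mean(axis=(1, 3)).ravel()

def multi_scale_vector(rgb):
    gray = to_gray(rgb)
    # fusion unit: cascade local and global feature vectors
    return np.concatenate([extract_local(gray), extract_global(gray)])

vec = multi_scale_vector(np.random.rand(64, 64, 3))
```

Each stub would be replaced by the trained model in the deployed system; the pipeline structure (gray → local + global → cascade) is what the module list specifies.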
Here, it will be understood by those skilled in the art that the specific functions and operations of the respective units and modules in the above-described landscape architecture seedling automated fertilization control system 100 have been described in detail in the above description of the landscape architecture seedling automated fertilization control method with reference to fig. 1 to 4, and thus, repetitive descriptions thereof will be omitted.
As described above, the automated fertilization control system 100 for landscape architecture seedlings according to the embodiments of the present disclosure may be implemented in various wireless terminals, such as a server or the like having an automated fertilization control algorithm for landscape architecture seedlings. In one possible implementation, the landscape architecture seedling automated fertilization control system 100 according to embodiments of the present disclosure may be integrated into the wireless terminal as one software module and/or hardware module. For example, the landscape architecture seedling automated fertilization control system 100 may be a software module in the operating system of the wireless terminal, or may be an application developed for the wireless terminal; of course, the automated fertilization control system 100 for landscape architecture seedlings can also be one of a plurality of hardware modules of the wireless terminal.
Alternatively, in another example, the landscape architecture seedling automated fertilization control system 100 and the wireless terminal may be separate devices, and the landscape architecture seedling automated fertilization control system 100 may be connected to the wireless terminal through a wired and/or wireless network and transmit interactive information in an agreed data format.
Fig. 6 shows an application scenario diagram of a landscape architecture seedling automated fertilization control method according to an embodiment of the present disclosure. As shown in fig. 6, in this application scenario, first, a seedling image (e.g., D illustrated in fig. 6) is acquired by a camera, and then, the seedling image is input to a server (e.g., S illustrated in fig. 6) in which a landscape architecture seedling automated fertilization control algorithm is deployed, wherein the server can process the seedling image using the landscape architecture seedling automated fertilization control algorithm to obtain a classification result for indicating whether a fertilization operation is required.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of the embodiments of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (10)

1. An automated fertilization control method for landscape garden seedlings, characterized by comprising: collecting a seedling image through a camera; extracting a seedling multi-scale feature vector from the seedling image; and determining whether a fertilization operation is required based on the seedling multi-scale feature vector.
2. The automated fertilization control method of a landscape architecture seedling according to claim 1, wherein extracting a seedling multiscale feature vector from the seedling image comprises: gray processing is carried out on the seedling image to obtain a gray seedling image; extracting local features of the gray nursery stock image to obtain nursery stock local feature vectors; global feature extraction is carried out on the gray nursery stock image so as to obtain a nursery stock global feature vector; and fusing the seedling local feature vector and the seedling global feature vector by using a cascading function to obtain the seedling multi-scale feature vector.
3. The automated fertilization control method for landscape architecture seedlings according to claim 2, wherein the local feature extraction of the grayscale seedling image to obtain a seedling local feature vector comprises: and the grey-scale nursery stock image is passed through a local feature extractor based on a convolutional neural network model to obtain the nursery stock local feature vector.
4. The automated fertilization control method for landscape architecture seedlings according to claim 3, wherein the global feature extraction is performed on the grayscale seedling image to obtain a seedling global feature vector, comprising: and extracting features of the gray nursery stock image by a global feature extractor based on a deep neural network model to obtain the nursery stock global feature vector.
5. The automated fertilization control method for landscape architecture seedlings according to claim 4, wherein the deep neural network model is a ViT model.
6. The automated fertilization control method of landscape architecture seedlings according to claim 5, wherein the feature extraction of the grayscale seedling image by a global feature extractor based on a deep neural network model to obtain the seedling global feature vector, comprises: image blocking is carried out on the grayscale seedling image to obtain a plurality of grayscale seedling image blocks; embedding each gray seedling image block in the plurality of gray seedling image blocks by using an embedding layer of the ViT model to obtain a plurality of gray seedling image block embedding vectors; embedding the plurality of grayscale seedling image blocks into vectors through a converter of the ViT model to obtain a plurality of contextual seedling image feature vectors; performing feature distribution optimization on the plurality of context seedling image feature vectors to obtain a plurality of optimized context seedling image feature vectors; and cascading the plurality of optimization context seedling image feature vectors to obtain the seedling global feature vector.
7. The automated fertilization control method of landscape architecture seedlings according to claim 6, wherein performing feature distribution optimization on the plurality of contextual seedling image feature vectors to obtain a plurality of optimized contextual seedling image feature vectors, comprises: carrying out multisource information fusion pre-verification distribution evaluation optimization on each contextual seedling image feature vector in the contextual seedling image feature vectors according to the following optimization formula so as to obtain the optimized contextual seedling image feature vectors;
wherein the optimization formula is:
$$V_i' = V_i + \frac{1}{2}\left(V_{i-k} + V_{i+k}\right) \ominus \log_2 V_m$$
wherein $V_{i-k}$ is the $(i-k)$-th contextual seedling image feature vector among the plurality of contextual seedling image feature vectors, $V_{i+k}$ is the $(i+k)$-th contextual seedling image feature vector among the plurality of contextual seedling image feature vectors, $V_m$ is the mean feature vector, $k$ is a neighborhood-setting hyperparameter, $\log_2$ represents the base-2 logarithm applied position-wise, $\ominus$ represents position-wise subtraction, and $V_i'$ is the $i$-th optimized contextual seedling image feature vector among the plurality of optimized contextual seedling image feature vectors.
8. The automated fertilization control method of a landscape architecture seedling according to claim 7, wherein determining whether fertilization is required based on the seedling multi-scale feature vector comprises: and the seedling multiscale feature vector passes through a classifier to obtain a classification result, wherein the classification result is used for indicating whether fertilization operation is needed or not.
9. An automated fertilization control system for landscape architecture seedlings, comprising: the image acquisition module is used for acquiring seedling images through the camera; the multi-scale extraction module is used for extracting a seedling multi-scale feature vector from the seedling image; and the fertilization operation control module is used for determining whether fertilization operation is needed or not based on the multi-scale feature vector of the nursery stock.
10. The automated fertilization control system of landscape architecture seedlings of claim 9, wherein the multi-scale extraction module comprises: the gray processing unit is used for carrying out gray processing on the seedling image to obtain a gray seedling image; the local feature extraction unit is used for extracting local features of the gray nursery stock image to obtain nursery stock local feature vectors; the global feature extraction unit is used for carrying out global feature extraction on the gray nursery stock image to obtain a nursery stock global feature vector; and a fusion unit for fusing the local feature vector of the seedling and the global feature vector of the seedling by using a cascading function to obtain the multi-scale feature vector of the seedling.
CN202310742398.5A 2023-06-21 2023-06-21 Automatic fertilization control system and method for landscape garden seedlings Pending CN116721348A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310742398.5A CN116721348A (en) 2023-06-21 2023-06-21 Automatic fertilization control system and method for landscape garden seedlings


Publications (1)

Publication Number Publication Date
CN116721348A true CN116721348A (en) 2023-09-08

Family

ID=87864402



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination