CN115240126B - Intelligent drip irrigation method based on artificial intelligence - Google Patents

Intelligent drip irrigation method based on artificial intelligence

Info

Publication number
CN115240126B
CN115240126B (application CN202211167732.0A)
Authority
CN
China
Prior art keywords
blade
image
land
value
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211167732.0A
Other languages
Chinese (zh)
Other versions
CN115240126A (en)
Inventor
郭宽
曹冬梅
杜朗
Current Assignee
Jiangsu Jingrui Agriculture Technology Development Co ltd
Original Assignee
Jiangsu Jingrui Agriculture Technology Development Co ltd
Priority date
Filing date
Publication date
Application filed by Jiangsu Jingrui Agriculture Technology Development Co ltd filed Critical Jiangsu Jingrui Agriculture Technology Development Co ltd
Priority to CN202211167732.0A
Publication of CN115240126A
Application granted
Publication of CN115240126B
Legal status: Active
Anticipated expiration

Classifications

    • G06V 20/50 — Scenes; scene-specific elements: context or environment of the image
    • A01G 25/167 — Control of watering by humidity of the soil itself or of devices simulating soil or of the atmosphere; soil humidity sensors
    • G06T 7/62 — Image analysis: analysis of geometric attributes of area, perimeter, diameter or volume
    • G06V 10/267 — Segmentation of patterns in the image field by performing operations on regions, e.g. growing, shrinking or watersheds
    • G06V 10/28 — Image preprocessing: quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G06V 10/42 — Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
    • G06V 10/44 — Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components
    • G06V 10/48 — Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation
    • G06V 10/50 — Extraction of features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; projection analysis
    • G06V 10/54 — Extraction of image or video features relating to texture
    • G06V 10/56 — Extraction of image or video features relating to colour
    • G06V 10/82 — Image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G06T 2207/10024 — Image acquisition modality: color image
    • G06T 2207/10028 — Image acquisition modality: range image; depth image; 3D point clouds
    • G06T 2207/20081 — Special algorithmic details: training; learning
    • G06T 2207/20084 — Special algorithmic details: artificial neural networks [ANN]
    • G06V 2201/07 — Target detection
    • Y02A 40/22 — Adaptation to climate change in agriculture: improving land use; improving water use or availability; controlling erosion

Abstract

The invention relates to the technical field of data processing, in particular to an intelligent drip irrigation method based on artificial intelligence. A depth camera captures a crop color image and a land gray-level image; the acquired data are then processed and analyzed with an improved method: the two images are analyzed separately according to specific procedures to determine the water-shortage condition of the crops and of the land around them, the resulting water-shortage features are fed into a trained DNN, and the most appropriate drip irrigation grade is selected for irrigating the crops. Because the method performs a dedicated water-shortage analysis on both the captured crop images and the images of the surrounding land, it improves the accuracy of judging whether crops lack water and how severely, and thus further reduces the waste of water resources during drip irrigation.

Description

Intelligent drip irrigation method based on artificial intelligence
Technical Field
The invention relates to the technical field of data processing, in particular to an intelligent drip irrigation method based on artificial intelligence.
Background
Although the Earth's water resources are quite abundant, the fresh water available to human beings accounts for only about one percent of the total. Agricultural irrigation requires large amounts of fresh water, and traditional techniques such as flood, furrow and ditch irrigation use water extremely inefficiently while consuming much labor. Modern irrigation techniques such as sprinkler, drip and micro irrigation were therefore developed and have greatly improved the efficiency of water use.
However, although modern irrigation technology has improved water-use efficiency over traditional methods, the fuzzy control algorithms it currently relies on still lack accuracy in determining whether crops are short of water, so water resources are still wasted.
Disclosure of Invention
In order to further reduce the waste of water resources by modern irrigation technology, the application provides an intelligent drip irrigation method based on artificial intelligence, and the adopted technical scheme is as follows:
the invention relates to an intelligent drip irrigation method based on artificial intelligence, which comprises the following steps:
after the previous drip irrigation operation is finished, acquiring images of crops and the land around the crops by adopting a depth camera so as to acquire a color image of the crops and a gray level image of the land, and taking the color image of the crops and the gray level image of the land as an initial color image of the crops and an initial gray level image of the land for the current drip irrigation control;
waiting for a set time interval after the previous drip irrigation operation is finished, acquiring images of crops and the land around the crops again by adopting the depth camera, and taking the obtained crop color images and land gray level images as real-time crop color images and real-time land gray level images of the current drip irrigation control;
comparing the data of the real-time land gray level image with the data of the initial land gray level image, and determining the color value variable of the land surface part in the real-time land gray level image, the crack area in the real-time land gray level image, the variance of the crack areas in different areas and the maximum crack width to obtain the water shortage of the land;
performing target identification on the real-time crop color image, determining a leaf area where each leaf is located, and calculating a deviation value between a pixel value of each pixel point in the leaf area and a green preset value to obtain the color offset of the leaf, wherein the green preset value is a pixel value when the water content of the leaf is normal;
graying the leaf area in the real-time crop color image, subtracting the gray value of the green preset value from the grayed leaf area to obtain a color deviation map of the leaf area, and calculating the entropy of the color deviation map to obtain the color offset entropy;
calculating the irregularity of the blade edge, the symmetry of the blade, the curling of the blade surface and the blade height, and determining the morphological water-shortage degree of the blade;
and inputting the land water-shortage degree, the color offset entropy and the morphological water-shortage degree into a trained neural network, determining the drip irrigation grade for the current drip irrigation control, and completing the drip irrigation operation.
The invention has the beneficial effects that:
the method comprises the steps of obtaining crop state images including the crops and the land around the crops by shooting through a depth camera, subdividing the images into crop color images and land gray level images, respectively carrying out specific analysis on the two images to obtain a characteristic quantity representing the water shortage degree of the land and a characteristic quantity representing the water shortage condition of the crops, inputting the two characteristic quantities together into a trained neural network to obtain the drip irrigation grade under the specific water shortage condition, and finishing the most appropriate and accurate drip irrigation operation. The method improves the accuracy of identifying the water shortage condition, so that a more accurate drip irrigation scheme can be provided according to the more accurate water shortage condition, and the waste of water resources in the drip irrigation process is further reduced.
Further, the method for acquiring the color image and the gray level image of the crop by acquiring the image of the crop and the land around the crop by adopting the depth camera comprises the following steps:
the depth camera photographs the crops and the land around them to obtain a crop state image; the crop state image is grayed to obtain a crop state gray image; threshold segmentation of the crop state gray image yields a binary image; and the binary image and its inverse are combined bit-wise with the crop state image and the crop state gray image respectively, yielding the crop color image and the land gray image.
Further, the specific process of determining the color value variable of the land surface in the real-time land gray image, the crack area in the real-time land gray image, the variance of the crack areas in different regions and the maximum crack width, to obtain the land water-shortage degree, is as follows:
determining the region in which the gray value of the real-time land gray image is less than or equal to that of the initial land gray image, setting the gray value of that region to 255 and the gray value of the remainder to 0, to obtain a mask image;
determining a land surface part in the real-time land gray scale image according to the mask image, and calculating a color value variable of the land surface part in the real-time land gray scale image due to water content change:
$$\Delta g = \bar{g}_{rt} - \bar{g}_{0}$$
wherein $\Delta g$ is the color value variable, $\bar{g}_{rt}$ is the mean gray value of the land surface in the real-time land gray image, and $\bar{g}_{0}$ is the mean gray value of the land surface in the initial land gray image;
setting the gray value of a pixel point with the gray value of 255 in the mask image as 1, and solving the sum of the gray values of all the pixel points in the mask image to obtain the crack area S in the real-time land gray image;
dividing the mask image into 25 regions equally, determining the crack area of each region, and calculating the variance of the crack area of each region:
$$\bar{S} = \frac{1}{25}\sum_{i=1}^{25} S_i, \qquad \sigma_S^{2} = \frac{1}{25}\sum_{i=1}^{25}\left(S_i - \bar{S}\right)^{2}$$
wherein $\sigma_S^{2}$ is the variance of the crack areas over the regions evenly divided in the mask image, $S_i$ is the crack area of the $i$-th region, and $\bar{S}$ is the mean crack area over the evenly divided regions;
carrying out Hough line detection in each region of the mask image and determining the longest straight-line segment in the region; performing connected-domain identification from any point on that segment to obtain the connected domain containing it; calculating the mean length of the straight-line segments perpendicular to the longest segment within the connected domain, this mean being the maximum crack width of the region; and taking the maximum over all regions to obtain the maximum crack width W in the real-time land gray image;
the final land water shortage degree is as follows:
$$Q = \alpha_1\,\Delta g + \alpha_2\,S + \alpha_3\,\sigma_S^{2} + \alpha_4\,W$$
wherein $Q$ is the land water-shortage degree, and $\alpha_1$, $\alpha_2$, $\alpha_3$, $\alpha_4$ are weight coefficients whose values are determined empirically.
Further, the color offset is:
$$P = \frac{1}{N}\sum_{i=1}^{m}\sum_{j=1}^{n}\left|I(i,j) - I_0\right|$$
wherein $P$ is the color offset, $I(i,j)$ is the pixel value at row $i$, column $j$ of the blade area, $I_0$ is the green preset value, $m$ and $n$ are respectively the length and width of the blade area, and $N$ is the total number of pixels in the blade area.
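The color offset described in the preceding paragraph amounts to a mean absolute deviation of the blade pixels from the green preset value. A minimal numpy sketch — the function name, `GREEN_REF`, and the toy pixel values are illustrative, not from the patent:

```python
import numpy as np

def color_offset(leaf_region: np.ndarray, green_ref: float) -> float:
    """Mean absolute deviation of blade pixels from the green preset value."""
    return float(np.abs(leaf_region.astype(float) - green_ref).sum() / leaf_region.size)

# Toy 2x2 leaf patch: values drift away from the preset as the leaf yellows.
GREEN_REF = 120.0  # hypothetical "healthy green" pixel value
leaf = np.array([[120.0, 125.0], [130.0, 121.0]])
offset = color_offset(leaf, GREEN_REF)  # (0 + 5 + 10 + 1) / 4 = 4.0
```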
Further, the color offset entropy is:
$$H_c = -\sum_{k} p_k \log p_k$$
wherein $H_c$ is the color offset entropy and $p_k$ is the statistical probability of the $k$-th offset value in the histogram obtained by counting the values of the color deviation map with a histogram statistical method.
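The color offset entropy is an ordinary Shannon entropy over a histogram of the deviation map. A minimal numpy sketch, assuming the offsets are binned with a plain histogram (the bin count of 16 is an illustrative choice, not from the patent):

```python
import numpy as np

def offset_entropy(offset_map: np.ndarray, bins: int = 16) -> float:
    """Shannon entropy (bits) of the histogram of color-deviation values."""
    counts, _ = np.histogram(offset_map, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # 0*log(0) terms are dropped
    return float(-(p * np.log2(p)).sum())

uniform = offset_entropy(np.arange(16.0), bins=16)   # one value per bin -> max entropy
constant = offset_entropy(np.zeros(16), bins=16)     # all mass in one bin -> 0
```

A uniform leaf (constant deviation) gives zero entropy; an unevenly yellowing leaf spreads the histogram and the entropy rises.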
Further, the method for calculating the irregularity of the blade edge, the symmetry of the blade, the curling of the blade surface and the blade height, and for determining the morphological water-shortage degree of the blade, comprises the following steps:
edge detection is performed in the gray blade area to determine the blade edge. For each point on the blade edge, the distance to the inscribed ellipse of the blade is measured along the line joining that point to the ellipse center, and the irregularity of the blade edge is characterized by the variance of these distances:
$$d_k = \sqrt{(x_k - x'_k)^2 + (y_k - y'_k)^2}, \qquad \sigma_e^{2} = \frac{1}{K}\sum_{k=1}^{K}\left(d_k - \bar{d}\right)^{2}$$
wherein $\sigma_e^{2}$ is the variance of the distances, $(x_k, y_k)$ are the coordinates of the $k$-th point on the blade edge, $(x'_k, y'_k)$ are the coordinates of the corresponding point on the circumference of the inscribed ellipse along the line to the ellipse center, $\bar{d}$ is the mean distance, and $K$ is the number of edge points;
taking the major axis of the inscribed ellipse as the symmetry axis, the blade edge is divided into a left curve and a right curve; one curve is flipped about the axis, the dynamic time warping (DTW) distance between the two curves is computed, and the blade symmetry value is obtained as a decreasing function of that distance, for example:
$$B = \frac{1}{1 + D_{\mathrm{DTW}}}$$
wherein $B$ is the blade symmetry value and $D_{\mathrm{DTW}}$ is the dynamic time warping distance between the left and right edge curves of the blade;
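The "dynamic time normalization" above is dynamic time warping (DTW). A small sketch of the classic DTW recurrence; `symmetry_value` uses the hypothetical decreasing mapping 1/(1 + distance), since the patent does not disclose the exact form:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance between 1-D curves."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

def symmetry_value(left_curve, right_curve_flipped):
    """Hypothetical mapping: identical halves give 1; distance pushes it toward 0."""
    return 1.0 / (1.0 + dtw_distance(left_curve, right_curve_flipped))

left = [0.0, 1.0, 2.0, 1.0, 0.0]
sym_perfect = symmetry_value(left, left)                     # distance 0 -> 1.0
sym_off = symmetry_value(left, [0.0, 1.0, 3.0, 1.0, 0.0])    # distance 1 -> 0.5
```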
and a leaf withering degree value is obtained from the edge-irregularity variance and the blade symmetry value, for example as a combination that grows with both irregularity and asymmetry:
$$F = \frac{\sigma_e^{2}}{B}$$
wherein $F$ is the leaf withering degree value;
gradient changes of the blade area along the x axis and the y axis are respectively obtained with the Sobel operator, and the two are combined by a weighted sum to obtain the depth gradient of the whole blade area:
$$G = \omega_x G_x + \omega_y G_y$$
wherein $G$ is the depth gradient of the whole blade image, $G_x$ and $G_y$ are the leaf-image depth gradients computed by the Sobel operator in the x-axis and y-axis directions respectively, and $\omega_x$, $\omega_y$ are the axis weights;
uniformly selecting a certain number of sampling points on the blade area to obtain a depth gradient on each sampling point, arranging the depth gradients of each sampling point into a matrix according to the coordinate position of each sampling point to obtain a sampling point depth gradient matrix, and quantizing the sampling point depth gradient matrix to obtain a sampling point depth gradient quantization map;
according to the method for acquiring the gray level co-occurrence matrix, obtaining the depth gradient co-occurrence matrix of each sampling point in the sampling point depth gradient quantization graph, and calculating the entropy value of the depth gradient co-occurrence matrix of each sampling point:
$$H_g = -\sum_{u}\sum_{v} p(u,v)\,\log p(u,v)$$
wherein $H_g$ is the entropy of the depth-gradient co-occurrence matrix of the sampling points and $p(u,v)$ is the probability that the depth-gradient point pair $(u,v)$ occurs in the depth-gradient co-occurrence matrix, $(u,v)$ being a coordinate in that matrix;
then, the entropy values of the depth-gradient co-occurrence matrices of all sampling points are averaged to obtain the leaf surface curl degree:
$$E = \frac{1}{h\,w}\sum_{i=1}^{h}\sum_{j=1}^{w} H_g(i,j)$$
wherein $E$ is the leaf surface curl degree, $H_g(i,j)$ is the co-occurrence entropy at sampling point $(i,j)$, and $h$ and $w$ are respectively the length and width of the sampling-point depth-gradient quantization map;
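The curl measure above is a co-occurrence entropy over a quantized gradient map. A minimal sketch using horizontal neighbor pairs only — the patent builds a co-occurrence matrix per sampling point, so this single, whole-map matrix is a simplification:

```python
import numpy as np

def cooccurrence_entropy(quantized: np.ndarray, levels: int) -> float:
    """Entropy of the horizontal co-occurrence matrix of a quantized gradient map."""
    pairs = np.stack([quantized[:, :-1].ravel(), quantized[:, 1:].ravel()], axis=1)
    counts = np.zeros((levels, levels))
    for u, v in pairs:
        counts[u, v] += 1
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

flat = np.zeros((4, 4), dtype=int)                       # flat leaf: one repeated pair
textured = np.array([[0, 1, 2, 3], [3, 2, 1, 0],
                     [0, 0, 1, 1], [2, 2, 3, 3]])        # varied gradients: curled leaf
e_flat = cooccurrence_entropy(flat, 4)
e_textured = cooccurrence_entropy(textured, 4)
```

A flat surface yields zero entropy; a curled, textured surface yields a strictly positive value.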
the morphological water-shortage degree of the blade combines the withering value, the curl degree and the blade height; since wilting lowers the blade, one combination that grows as the blade dries is
$$T = \frac{F + E}{L}$$
wherein $T$ is the morphological water-shortage degree of the blade, $F$ is the leaf withering degree value, $E$ is the leaf surface curl degree, and $L$ is the height of the crop leaf obtained by the depth camera.
Further, the method for determining the blade area where each blade is located comprises the following steps:
and performing target detection on the real-time crop color image by adopting a target detection method based on a deep learning neural network model, separating the foreground from the background by utilizing the height difference between the foreground blade and the background blade determined by the depth camera, and obtaining a single blade area of the foreground to obtain a complete blade area.
Drawings
Fig. 1 is a flow chart of the intelligent drip irrigation method based on artificial intelligence of the present invention.
Detailed Description
The intelligent drip irrigation method based on artificial intelligence of the invention is explained in detail below with reference to the accompanying drawings and examples.
The method comprises the following steps:
the invention discloses an embodiment of an intelligent drip irrigation method based on artificial intelligence, which has the overall flow as shown in figure 1 and comprises the following specific processes:
the method comprises the following steps of collecting a crop image and a land image around the crop.
And arranging a depth camera right above the crops, and vertically shooting the crops and the land around the crops downwards to obtain the crop state images. And performing Gaussian filtering on the crop state image to eliminate image noise, and performing gray processing on the image to obtain a gray image.
After the image is converted into a gray image, threshold segmentation is performed to binarize it. Because the crop and the land differ considerably in gray level, the threshold is selected automatically by the maximum between-class variance (Otsu) method. Segmenting the image produces a binary image; this binary image and its inverse are combined bit-wise with the original color image and the gray image respectively, extracting a color image of the crop and a gray image of the land and thereby separating the crop and land images.
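A minimal numpy sketch of this separation step: an Otsu (maximum between-class variance) threshold followed by masking. Names and pixel values are illustrative; a real pipeline would typically call an image-processing library instead:

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Maximum between-class-variance (Otsu) threshold on an 8-bit gray image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                      # class-0 probability up to each level
    mu = np.cumsum(p * np.arange(256))        # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)          # undefined at empty classes
    return int(np.argmax(sigma_b))

# Toy scene: dark land (~40) and bright crop (~200) separate cleanly.
gray = np.array([[40, 42, 200], [41, 198, 201], [39, 40, 199]], dtype=np.uint8)
t = otsu_threshold(gray)
crop_mask = gray > t          # binary image selecting the crop
land_mask = ~crop_mask        # inverted mask selects the land pixels
```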
When each irrigation operation is finished, the crops and the land around the crops are shot by adopting a depth camera, and the obtained crop color image and the land gray level image are used as initial images for the next drip irrigation control, wherein the crop color image is an initial crop color image, and the land gray level image is an initial land gray level image.
After the irrigation operation is finished and the set time interval has elapsed, the depth camera again photographs the crops and the land around them to obtain a real-time crop color image and a real-time land gray image.
Step two, analyzing the land portion of the image to determine the land water-shortage condition.
When the water content of the land differs, the surface state of the land differs, which shows in two prominent ways: the color of the land changes as the water content falls, and the land cracks as it dries. That is, when the soil moisture content is high, the soil is darker and free of cracks; when it is low, the soil is lighter in color and cracks appear.
As for the cracks themselves: when the moisture content of the ground falls, the ground surface becomes lighter and its gray values rise, while the cracks, being darker, show clearly lower gray values than the surrounding surface.
And comparing the shot real-time land gray level image with the initial land gray level image in the initial image to accurately identify the cracks generated in the land, and generating a mask image. And if the gray value of a certain position in the real-time land gray image is less than or equal to the gray value of the position in the initial land gray image, setting the gray value of the mask image corresponding to the position to be 255, otherwise, setting the gray value to be 0.
And performing opening operation processing on the obtained mask image to obtain a mask image capable of accurately representing the position of the crack, wherein if the mask image has no crack, the mask image is an image with the gray value of 0 at each position.
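A sketch of the mask construction and the opening step, using `scipy.ndimage.binary_opening` with a small structuring element as one plausible choice (the element size is not specified in the patent; all image values are toy data):

```python
import numpy as np
from scipy.ndimage import binary_opening

def crack_mask(initial_gray: np.ndarray, realtime_gray: np.ndarray) -> np.ndarray:
    """Pixels whose real-time gray value did not rise above the initial one are
    crack candidates (255); opening removes isolated false positives."""
    raw = realtime_gray <= initial_gray
    cleaned = binary_opening(raw, structure=np.ones((2, 2)))
    return cleaned.astype(np.uint8) * 255

initial = np.full((5, 5), 100, dtype=np.uint8)
realtime = np.full((5, 5), 140, dtype=np.uint8)   # land dried: brighter overall
realtime[1:3, 1:3] = 80                            # a dark 2x2 crack
realtime[4, 4] = 90                                # lone noisy pixel
mask = crack_mask(initial, realtime)               # crack kept, noise removed
```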
Based on the obtained mask image, the land-surface portion of the real-time land gray image, i.e., the portion other than the cracks, can be determined. The mean gray value $\bar{g}_{rt}$ of the land surface in the real-time land gray image is calculated, and the mean gray value $\bar{g}_{0}$ of the land surface in the initial land gray image is subtracted from it to obtain the color value variable produced by the change in water content:
$$\Delta g = \bar{g}_{rt} - \bar{g}_{0}$$
wherein $\Delta g$ is the color value variable, $\bar{g}_{rt}$ is the mean gray value of the land surface in the real-time land gray image, and $\bar{g}_{0}$ is the mean gray value of the land surface in the initial land gray image.
Meanwhile, the area of the crack in the mask image is obtained. Specifically, the gray value of a pixel point with the gray value of 255 in the mask image is set to be 1, and then all the pixel points are summed to obtain the crack area S. Of course, other methods known in the art may be used to determine the area of the crack in the mask image in other embodiments.
The mask image is uniformly divided into 25 regions (5×5), the crack area $S_i$ of each region is obtained, and the variance of the crack areas over the evenly divided regions is calculated:
$$\bar{S} = \frac{1}{25}\sum_{i=1}^{25} S_i, \qquad \sigma_S^{2} = \frac{1}{25}\sum_{i=1}^{25}\left(S_i - \bar{S}\right)^{2}$$
wherein $\sigma_S^{2}$ is the variance of the crack areas over the evenly divided regions, $S_i$ is the crack area of the $i$-th region, and $\bar{S}$ is the mean crack area over the regions. The larger the variance, the more uneven the crack distribution in the mask image and the lower the moisture content of the land.
In the present embodiment the mask image is divided equally into 25 regions; obviously, in other embodiments it may be divided equally into another suitable number of regions.
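The per-region crack areas and their variance can be computed with a single reshape when the mask dimensions are multiples of the grid. A sketch with toy values, keeping the patent's 5×5 split:

```python
import numpy as np

def region_crack_stats(mask: np.ndarray, grid: int = 5):
    """Split a binary crack mask into grid x grid regions and return each
    region's crack area plus the variance of the areas across regions."""
    h, w = mask.shape
    blocks = mask.reshape(grid, h // grid, grid, w // grid)
    areas = blocks.sum(axis=(1, 3)).astype(float)     # crack area per region
    return areas, float(areas.var())

mask = np.zeros((10, 10), dtype=np.uint8)
mask[0:2, 0:2] = 1           # all cracking concentrated in one 2x2 region
areas, var = region_crack_stats(mask, grid=5)
```

Concentrated cracking produces a large variance; uniformly spread cracking of the same total area would give zero.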
Then, Hough line detection is carried out in each region of the mask image and the longest straight-line segment in each region is determined. Connected-domain identification is performed starting from any point on that segment to obtain the connected domain containing it, and the mean length of the straight-line segments perpendicular to the longest segment within the connected domain is calculated; this mean is the maximum crack width of the corresponding region. Taking the maximum over all regions gives the final maximum crack width W. Since the Hough transform and the Cartesian coordinate system are known technologies, they are not described here again. In this embodiment the maximum crack widths of the evenly divided regions are found separately and then compared as a whole; in other embodiments the maximum crack width W may also be found directly over the whole mask image.
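As a Hough-free alternative to the width measurement just described — a deliberate simplification, not the patent's method — the maximum crack width can be approximated with a distance transform: twice the largest distance from a crack pixel to the nearest non-crack pixel.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def max_crack_width(mask: np.ndarray) -> float:
    """Approximate maximum crack width as twice the largest inscribed radius."""
    dist = distance_transform_edt(mask > 0)   # distance to nearest background pixel
    return float(2.0 * dist.max())

mask = np.zeros((7, 7), dtype=np.uint8)
mask[2:5, :] = 1                  # a horizontal crack, 3 pixels wide
w = max_crack_width(mask)         # slight overestimate of the pixel width
```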
Finally, from the color value variable ΔC obtained above, the crack area S, the variance σ_S² of the crack areas over the evenly divided regions of the mask image, and the maximum crack width W, the final land water shortage degree is obtained:

D_E = α₁·ΔC + α₂·S + α₃·σ_S² + α₄·W

where D_E is the land water shortage degree, and α₁, α₂, α₃ and α₄ are weight coefficients whose values are determined from practical experience; in this embodiment the four weight coefficients are set to 0.1, 0.5, 1 and 1, respectively.
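The weighted combination above is a one-liner; this sketch uses the embodiment's weights (0.1, 0.5, 1, 1) as defaults, and the function name is hypothetical:

```python
import numpy as np

def land_water_shortage(delta_c, crack_area, crack_var, max_width,
                        weights=(0.1, 0.5, 1.0, 1.0)):
    """Linear combination of the four land features into D_E.

    The default weights are the four empirical coefficients of this
    embodiment; in practice they are tuned to the crop and soil.
    """
    return float(np.dot(weights, [delta_c, crack_area, crack_var, max_width]))
```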
And step three, analyzing partial images of the crops to determine the water shortage condition of the crops.
For the crop color image acquired and processed by the depth camera, target detection is used to detect the leaves in the image. Target detection adopts a deep-learning-based method such as the YOLOv5 model; the model needs to be trained before use, and since training a neural network is well known to those skilled in the art, the process is not described again here.
For overlapped leaves, the foreground is separated from the background using the depth information of the height difference between foreground and background leaves, yielding a single foreground leaf image. To remove holes that may exist on the leaf surface, the obtained leaf image is binarized and a closing operation is applied, giving a complete leaf surface region. Finally, a separate leaf region is acquired for each detected target.
When water is sufficient, crop leaves are green and glossy and their texture is not obvious; when water is short, the leaves droop and turn from green to yellow, and because the color change is unevenly distributed during this process, the leaf texture becomes more pronounced. Therefore, the water shortage degree of a leaf can be characterized by the color offset and color offset entropy of the leaf together with the water shortage value computed by analyzing the morphological state of the leaf.
1. The color shift amount is calculated.
The color offset is obtained by computing, in the real-time crop color image, the Euclidean distance between the pixel values of the leaf region and a green preset value corresponding to normal leaf water content; in this embodiment the green preset value is chosen as (50, 185, 10).
The Euclidean distance between each pixel of the leaf region and the green preset value is computed in RGB color space, and the mean of these distances is taken for each leaf. This mean is the color offset of the leaf and reflects its water shortage degree: the more severely the crop lacks water, the larger the color offset.
The color offset is:

P = (1/N) · Σ_{x=1}^{n} Σ_{y=1}^{m} ‖I(x, y) − G‖₂

where P is the color offset, I(x, y) is the pixel value at row x, column y of the leaf region, G is the green preset value, n and m are respectively the length and width of the leaf region, and N is the total number of pixels in the leaf region.
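The color offset computation can be sketched as follows, assuming the leaf region is an RGB NumPy array and using the embodiment's green preset (50, 185, 10):

```python
import numpy as np

def color_offset(leaf_rgb, green_ref=(50, 185, 10)):
    """Mean Euclidean distance between each leaf pixel and a green reference.

    `leaf_rgb` is an (n, m, 3) array of the leaf region; the reference
    (50, 185, 10) is the green preset value of this embodiment.
    """
    diff = leaf_rgb.astype(float) - np.asarray(green_ref, dtype=float)
    distances = np.linalg.norm(diff, axis=-1)  # per-pixel Euclidean distance
    return float(distances.mean())
```

A perfectly healthy-green region yields an offset of 0; the offset grows as pixel colors drift away from the preset.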
2. The color shift entropy is calculated.
When a leaf lacks water and its color changes from green to yellow, the color change is irregular and non-uniform, producing complex random textures on the leaf.
The single-leaf image is converted to grayscale and the gray value of the green preset value, 125, is subtracted from it to obtain the color shift map of the leaf; Gaussian filtering is then applied to the color shift map to remove white noise. The occurrences of each offset value in the color shift map are counted with a histogram, and the color offset entropy corresponding to the histogram is computed.
When the crops are not lack of water, the texture of the leaves is not obvious, the distribution of pixels in a histogram obtained by corresponding to the color shift diagram is concentrated, and the corresponding entropy value is smaller; when the crops lack water, the leaf texture is obvious, the pixel distribution in the histogram obtained by the color migration diagram is dispersed, and the corresponding entropy value is larger. Therefore, the water shortage degree of the crops can be reflected by the change of the entropy value, and the color deviation entropy is as follows:
H = −Σ_k p_k · log₂ p_k

where H is the color shift entropy and p_k is the statistical probability with which the k-th offset value occurs in the histogram of the color shift map.
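The histogram entropy can be sketched as follows; 256 histogram bins and a base-2 logarithm are assumptions made for this sketch, since the patent does not fix either:

```python
import numpy as np

def color_shift_entropy(shift_map, bins=256):
    """Shannon entropy of the histogram of a single-channel color shift map.

    A concentrated histogram (uniform leaf, no water stress) gives low
    entropy; a dispersed histogram (textured, stressed leaf) gives high
    entropy, matching the interpretation above.
    """
    hist, _ = np.histogram(np.asarray(shift_map).ravel(), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins so log2 is well defined
    return float(-(p * np.log2(p)).sum())
```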
3. The morphological water deficit was calculated.
The leaf water content manifests itself in several morphological ways; the change in leaf water content is therefore analyzed from different morphological angles, and parameters characterizing the leaf water content are determined accordingly.
From the first morphological angle: under normal conditions, the leaf edge as a whole appears as a relatively smooth ellipse. When the plant lacks water, the leaf shrinks, the edge curls toward the middle, and the overall contour of the leaf becomes irregular. The lower the moisture content of the leaf, the more irregular its edge.
Therefore, according to this characteristic, the color image of the leaf is converted into a grayscale image and edge detection is performed on it. Edge detection uses the Canny operator to obtain the edge image; since the Canny operator is well known in the art, it is not described in detail here. An ellipse is then fitted to the leaf edge curve, giving the inscribed ellipse. For each point on the leaf edge, the distance to the point where the line joining that edge point and the ellipse center crosses the circumference of the inscribed ellipse is computed, and the variance of these distances measures the smoothness of the leaf edge: the larger the variance, the more irregular the edge and the higher the water shortage degree of the leaf. The variance is:
σ_b² = (1/M) · Σ_{i=1}^{M} ( √((x_i − x'_i)² + (y_i − y'_i)²) − d̄ )²

where σ_b² is the variance of the distances between each point on the leaf edge and the corresponding point on the circumference of the inscribed ellipse (taken along the line joining the edge point and the ellipse center), (x_i, y_i) and (x'_i, y'_i) are respectively the coordinates of the i-th edge point and of the corresponding point on the inscribed ellipse circumference, M is the number of edge points, and d̄ is the mean of the distances.
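Given edge points already paired with their ellipse counterparts (the pairing along rays from the ellipse center is assumed to have been done upstream), the variance reduces to a few lines:

```python
import numpy as np

def edge_irregularity(edge_pts, ellipse_pts):
    """Variance of distances between paired blade-edge points and the
    corresponding points on the inscribed ellipse.

    Both inputs are sequences of (x, y) pairs of equal length; pairing
    along the line through the ellipse center is assumed to be done by
    the caller.
    """
    e = np.asarray(edge_pts, dtype=float)
    c = np.asarray(ellipse_pts, dtype=float)
    distances = np.linalg.norm(e - c, axis=1)
    return float(np.var(distances))
```

A smooth edge at constant offset from the ellipse gives zero variance; curling that pulls some edge points inward raises it.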
From the second morphological angle: the leaf is more symmetrical when its water content is normal, and its symmetry decreases as the water content falls, so the water shortage of the leaf can be reflected by detecting its symmetry.
After the inscribed ellipse of the leaf is obtained, the major axis of the ellipse can be determined; this axis is the symmetry axis of the leaf. The symmetry axis divides the leaf edge into a left curve and a right curve, and detecting the symmetry of the leaf amounts to detecting the similarity of these two curves. One curve is mirrored about the symmetry axis, the dynamic time warping (DTW) distance between the two curves is obtained with the dynamic time warping algorithm, the curve similarity is obtained from this distance, and the leaf symmetry degree value is determined:
F = 1 / (1 + L_dtw)

where F is the leaf symmetry degree value and L_dtw is the dynamic time warping distance between the left and right edge curves of the leaf.
And obtaining a leaf withering degree value according to the variance and the leaf symmetry degree value:
K = σ_b² / F

where K is the leaf withering degree value. When the leaf curls for lack of water, σ_b² increases and F decreases, so K increases; when the leaf water content rises and the leaf unfolds, σ_b² decreases and F increases, so K decreases.
From the third morphological angle: when water is sufficient, the leaf surface is flat and the depth values of the leaf captured by the depth camera vary little across positions; after the leaf lacks water, the whole leaf curls, the depth values captured by the depth camera change markedly across positions, and large gradients appear. Therefore, the change in leaf flatness, and hence the water shortage degree of the leaf, can be reflected by detecting the depth gradient of the leaf.
The flatness of the leaf is reflected by the gradient change of the leaf depth, which is computed with the Sobel operator. The Sobel operator yields the gradient changes of the leaf image along the x axis and the y axis respectively, and the two are weighted and summed to give the depth gradient of the whole leaf image. The formula is:
G = λ₁·G_x + λ₂·G_y

where G is the depth gradient of the whole leaf image, G_x is the leaf image depth gradient computed by the Sobel operator in the x-axis direction, G_y is the leaf image depth gradient computed by the Sobel operator in the y-axis direction, and λ₁ and λ₂ are weighting coefficients. Since the Sobel operator is a well-known technique, it is not described here in detail.
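A dependency-light Sobel sketch in plain NumPy (no image library); the equal 0.5/0.5 axis weights are an assumption, as the text only says the two axis gradients are weighted and summed:

```python
import numpy as np

def sobel_gradient(depth, wx=0.5, wy=0.5):
    """Weighted sum of absolute x- and y-direction Sobel responses
    of a 2-D depth map, with edge-replicated borders."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(depth.astype(float), 1, mode='edge')
    h, w = depth.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    return wx * np.abs(gx) + wy * np.abs(gy)
```

A flat depth map yields zero gradient everywhere; a linear depth ramp yields a constant nonzero gradient, consistent with the flatness interpretation above.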
After the depth gradient of the whole leaf image is obtained, a certain number of sampling points are selected uniformly on the leaf image and the depth gradient at each sampling point is read. The per-point depth gradients are then arranged into a matrix according to the coordinate positions of the sampling points, giving the sampling-point depth gradient matrix. Histogram equalization is applied to this matrix to increase the global contrast of the gradients, and compression quantization then yields the sampling-point depth gradient quantization map.
The depth gradient co-occurrence matrix of each sampling point in the quantization map is obtained following the method for computing a gray-level co-occurrence matrix. When the leaf curls, the overall depth gradient variation of the leaf increases and the randomness of the quantization map grows, so the entropy value of each sampling point's depth gradient co-occurrence matrix increases. The entropy values of the co-occurrence matrices corresponding to all sampling points in the quantization map are averaged, and this mean characterizes the leaf curl degree:
e = −Σ_u Σ_v p(u, v) · log₂ p(u, v)

E = (1/(h·w)) · Σ e

where E is the leaf surface curl degree, e is the entropy value of the depth gradient co-occurrence matrix of a sampling point, p(u, v) is the probability with which the depth gradient pair (u, v) occurs in that co-occurrence matrix, (u, v) are the coordinates of a point in the co-occurrence matrix, and h and w respectively represent the length and width of the sampling-point depth gradient quantization map.
When E increases, the degree and complexity of the gradient variation rise and the gradient of the leaf changes more sharply, so the leaf curl degree increases and the crop water shortage grows.
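The co-occurrence entropy can be sketched as follows; the horizontal (0, 1) offset and 8 quantization levels are assumptions carried over from standard gray-level co-occurrence matrix practice:

```python
import numpy as np

def cooccurrence_entropy(quantized, levels=8, offset=(0, 1)):
    """Entropy of the co-occurrence matrix of an integer-quantized
    gradient map.

    `quantized` must hold integers in [0, levels); the offset defines
    which pixel pairs are counted (here: horizontal neighbours).
    """
    di, dj = offset
    h, w = quantized.shape
    C = np.zeros((levels, levels))
    for i in range(h - di):
        for j in range(w - dj):
            C[quantized[i, j], quantized[i + di, j + dj]] += 1
    p = C / C.sum()
    p = p[p > 0]  # only occupied cells contribute to the entropy
    return float(-(p * np.log2(p)).sum())
```

A uniform quantization map (flat leaf) gives zero entropy, while an alternating pattern (curled, rapidly varying depth) gives a positive entropy, matching the curl interpretation above.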
From the fourth morphological angle: when the leaf water content is insufficient, the leaf height changes. Crop leaves stand upright when water is sufficient and droop when water is short. Based on this feature, the depth values from the depth camera are used to measure the plant: the depth camera senses the height change of the crop leaves and determines their specific height. The crop leaf height V is obtained by computing the change in depth value of the crop region in the image. When the crop water content is normal, the leaf height is generally normal and V is large; when the crop lacks water, the leaves droop and V decreases accordingly, indicating that the crop needs drip irrigation.
From the parameters characterizing the leaf water content from these different angles, the morphological water shortage of the leaf can finally be determined:

Q = (K·E) / V

where Q is the leaf morphological water shortage value, K is the leaf withering degree value, E is the leaf surface curl degree, and V is the crop leaf height.
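Under the reading that withering and curl drive the water shortage up while leaf height drives it down, the combination is a one-liner; the exact multiplicative/ratio form is an assumption, since the patent only fixes the direction of each dependency:

```python
def morphological_water_shortage(wither, curl, height):
    """Leaf morphological water shortage Q = K * E / V.

    Q rises with the withering degree K and curl degree E and falls
    as the leaf height V grows; the functional form is one plausible
    reading of the combined measure.
    """
    if height <= 0:
        raise ValueError("leaf height must be positive")
    return wither * curl / height
```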
And step four, inputting the land water shortage condition and the crop water shortage condition into the trained neural network, and controlling the drip irrigation system to execute the drip irrigation operation.
After analyzing the soil image and the crop image, four parameters characterizing the crop water shortage degree are finally obtained: the land water shortage degree D_E, the color offset P, the color offset entropy H, and the morphological water shortage Q. Each leaf identified from the live image taken by the depth camera thus has a four-dimensional vector consisting of these four parameters.
The number of leaves differs between crops and is usually around 30; 20 leaves are randomly selected from them to form a vector matrix. If fewer than 20 leaves are found, the matrix is padded with zeros. The vector matrix formed from the 20 leaves is fed to a trained DNN, which outputs a six-dimensional vector through a softmax function; this vector gives the network's confidence for whether the crop needs drip irrigation, with each dimension corresponding to a preset drip irrigation grade. Drip irrigation is divided into six grades with preset flow rates and durations: grade 0 requires no drip irrigation; grade 1 is 10 ml/min for 3 hours; grade 2 is 20 ml/min for 3 hours; grade 3 is 30 ml/min for 3 hours; grade 4 is 30 ml/min for 5 hours; grade 5 is 30 ml/min for 7 hours. During drip irrigation no further crop images are taken. At the moment the drip irrigation operation finishes, images of the crop and the surrounding land are collected again by the depth camera as the initial images for the next drip irrigation control; after the set time interval has elapsed, images of the crop and the surrounding land are collected by the depth camera as the real-time images for a new round of drip irrigation control. Since the DNN is composed and trained in a conventional manner, it is not described here in detail.
In this embodiment the set waiting interval is 2 hours. Because six drip irrigation grades are set, the DNN outputs a six-dimensional vector; in other embodiments a different number of drip irrigation grades may be set, with the DNN outputting a vector of the corresponding dimension. For example, if five drip irrigation grades are set, the DNN outputs a five-dimensional vector.
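The input assembly and the grading read-out can be sketched as follows; the zero-padding to 20 leaves and the six-way softmax follow this embodiment, while the function names and the random subsampling are choices made for this sketch:

```python
import numpy as np

def build_leaf_matrix(leaf_features, n_leaves=20, rng=None):
    """Stack per-leaf feature vectors [D_E, P, H, Q] into a fixed
    (n_leaves, 4) matrix: randomly subsample when more leaves are
    detected, zero-pad when fewer (as described in this embodiment)."""
    rng = np.random.default_rng(rng)
    feats = np.asarray(leaf_features, dtype=float)
    if len(feats) >= n_leaves:
        idx = rng.choice(len(feats), size=n_leaves, replace=False)
        return feats[idx]
    out = np.zeros((n_leaves, feats.shape[1]))
    out[:len(feats)] = feats
    return out

def drip_level(logits):
    """Softmax over the network's output vector; the argmax is the
    selected drip irrigation grade (grade 0 = no irrigation)."""
    e = np.exp(logits - np.max(logits))
    probs = e / e.sum()
    return int(np.argmax(probs)), probs
```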
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (7)

1. An intelligent drip irrigation method based on artificial intelligence is characterized by comprising the following steps:
after the previous drip irrigation operation is finished, acquiring images of crops and the land around the crops by adopting a depth camera so as to acquire a color image of the crops and a gray level image of the land, and taking the color image of the crops and the gray level image of the land as an initial color image of the crops and an initial gray level image of the land for the current drip irrigation control;
waiting for a set time interval after the previous drip irrigation operation is finished, acquiring images of crops and the land around the crops again by adopting the depth camera, and taking the obtained crop color images and land gray level images as real-time crop color images and real-time land gray level images of the current drip irrigation control;
comparing the data of the real-time land gray level image with the data of the initial land gray level image, and determining the color value variable of the land surface part in the real-time land gray level image, the crack area in the real-time land gray level image, the variance of the crack areas in different areas and the maximum crack width to obtain the water shortage of the land;
performing target identification on the real-time crop color image, determining a leaf area where each leaf is located, and calculating a deviation value between a pixel value of each pixel point in the leaf area and a green preset value to obtain the color offset of the leaf, wherein the green preset value is a pixel value when the water content of the leaf is normal;
graying a leaf area in a real-time crop color image, obtaining a color deviation map of the leaf area by making a difference between the obtained gray leaf area and a gray value of the green preset value, and calculating an entropy value of the color deviation map to obtain a color deviation entropy;
calculating the irregularity degree of the blade edge, the symmetry degree of the blade, the blade surface curling degree and the blade height, and determining the form water shortage degree of the blade;
and inputting the land water shortage, the color offset, the color offset entropy and the form water shortage into a trained neural network, determining the drip irrigation grade of the drip irrigation control, and finishing the drip irrigation operation.
2. The intelligent drip irrigation method based on artificial intelligence according to claim 1, wherein the method for acquiring the color image and the gray scale image of the crop by acquiring the image of the crop and the land around the crop by using the depth camera comprises the following steps:
the depth camera shoots the crops and the land around the crops to obtain a crop state image; the crop state image is grayed to obtain a crop state gray level image; threshold segmentation is performed on the crop state gray level image to obtain a binary image; and the binary image after color inversion is respectively subjected to bit operations with the crop state image and the crop state gray level image to obtain the crop color image and the land gray level image.
3. The intelligent drip irrigation method based on artificial intelligence according to claim 1, wherein the color value variable of the soil surface portion in the real-time soil gray level image, the crack area in the real-time soil gray level image, the variance of the crack areas in different regions and the maximum crack width are determined, and the specific process of obtaining the water shortage of the soil is as follows:
determining a region in which the gray value of the real-time land gray level image is less than or equal to that of the initial land gray level image, setting the gray value of the region to 255, and setting the gray value of the part outside the region to 0 to obtain a mask image;
determining a land surface part in the real-time land gray scale image according to the mask image, and calculating a color value variable of the land surface part in the real-time land gray scale image due to the change of the water content:
ΔC = C₁ − C₀

wherein ΔC is the color value variable, C₁ is the gray mean of the land surface part in the real-time land gray level image, and C₀ is the gray mean of the land surface part in the initial land gray level image;
setting the gray value of a pixel point with the gray value of 255 in the mask image as 1, and solving the sum of the gray values of all the pixel points in the mask image to obtain the crack area S in the real-time land gray image;
dividing the mask image into 25 regions equally, determining the crack area of each region, and calculating the variance of the crack area of each region:
σ_S² = (1/25) · Σ_{i=1}^{25} (S_i − S̄)²

wherein σ_S² is the variance of the crack areas over the regions evenly divided in the mask image, S_i is the crack area on the i-th evenly divided region, and S̄ is the mean of the crack areas over the evenly divided regions;
carrying out Hough line detection in each area of the mask image, determining the longest straight line segment in the area, carrying out connected domain identification by taking any point on the longest straight line segment as a starting point to obtain a connected domain containing the longest straight line segment, calculating the length average value of each straight line segment perpendicular to the longest straight line segment in the connected domain, wherein the length average value is the maximum crack width in the area, and taking the maximum value in the maximum crack width in each area to obtain the maximum crack width W in the real-time land gray level image;
the final land water shortage degree is:

D_E = α₁·ΔC + α₂·S + α₃·σ_S² + α₄·W

wherein D_E is the land water shortage degree, and α₁, α₂, α₃ and α₄ are weight coefficients whose values are determined empirically.
4. The intelligent drip irrigation method based on artificial intelligence according to claim 1, wherein the color offset is:
P = (1/N) · Σ_{x=1}^{n} Σ_{y=1}^{m} ‖I(x, y) − G‖₂

wherein P is the color offset, I(x, y) is the pixel value at row x, column y of the blade region, G is the green preset value, n and m are respectively the length and width of the blade region, and N is the total number of pixel points in the blade region.
5. The intelligent drip irrigation method based on artificial intelligence according to claim 1, wherein the color shift entropy is:
H = −Σ_k p_k · log₂ p_k

wherein H is the color shift entropy and p_k is the statistical probability with which the k-th offset value occurs in the histogram obtained by counting the values of the respective offsets in the color offset map.
6. The intelligent drip irrigation method based on artificial intelligence of claim 1, wherein the method for calculating the irregularity degree of the blade edge, the symmetry degree of the blade, the curling degree of the blade surface and the height of the blade and determining the form water shortage degree of the blade comprises the following steps:
edge detection is carried out in the gray level blade area to determine the blade edge, the distance between each point on the blade edge and the circumference of the blade inscribed ellipse on the connecting line of the center of the blade inscribed ellipse and each point on the blade edge is calculated, the irregularity degree of the blade edge is represented by the variance of the distance, and the variance is as follows:
σ_b² = (1/M) · Σ_{i=1}^{M} ( √((x_i − x'_i)² + (y_i − y'_i)²) − d̄ )²

wherein σ_b² is the variance of the distances between each point on the blade edge and the corresponding point on the circumference of the blade inscribed ellipse along the line joining the edge point and the ellipse center, (x_i, y_i) and (x'_i, y'_i) are respectively the coordinates of the i-th edge point and of the corresponding point on the inscribed ellipse circumference, M is the number of edge points, and d̄ is the mean of the distances;
taking the major axis of the blade inscribed ellipse as the symmetry axis, dividing the blade edge into a left curve and a right curve, mirroring one curve about the symmetry axis, obtaining the dynamic time warping distance of the two curves with the dynamic time warping algorithm, and then obtaining the blade symmetry degree value:

F = 1 / (1 + L_dtw)

wherein F is the blade symmetry degree value and L_dtw is the dynamic time warping distance between the left and right edge curves of the blade;
and obtaining a leaf withering degree value according to the variance and the leaf symmetry degree value:
K = σ_b² / F

wherein K is the leaf withering degree value;
gradient changes of an x axis and a y axis of the blade area are respectively obtained by using a Sobel operator, and the two-axis gradient changes are weighted and summed to obtain the depth gradient of the whole blade area:
G = λ₁·G_x + λ₂·G_y

wherein G is the depth gradient of the whole blade image, G_x is the leaf image depth gradient computed by the Sobel operator in the x-axis direction, G_y is the leaf image depth gradient computed by the Sobel operator in the y-axis direction, and λ₁ and λ₂ are weighting coefficients;
uniformly selecting a certain number of sampling points on the blade area to obtain a depth gradient on each sampling point, arranging the depth gradients of each sampling point into a matrix according to the coordinate position of each sampling point to obtain a sampling point depth gradient matrix, and quantizing the sampling point depth gradient matrix to obtain a sampling point depth gradient quantization map;
according to the method for acquiring the gray level co-occurrence matrix, obtaining the depth gradient co-occurrence matrix of each sampling point in the sampling point depth gradient quantization graph, and calculating the entropy value of the depth gradient co-occurrence matrix of each sampling point:
e = −Σ_u Σ_v p(u, v) · log₂ p(u, v)

wherein e is the entropy value of the depth gradient co-occurrence matrix of the sampling point, p(u, v) is the probability with which the depth gradient pair (u, v) occurs in the depth gradient co-occurrence matrix of the sampling point, and (u, v) are the coordinates of a point in the depth gradient co-occurrence matrix of the sampling point;
then, averaging the entropy values of the depth gradient co-occurrence matrixes of all sampling points to obtain the leaf surface curl degree:
E = (1/(h·w)) · Σ e

wherein E is the leaf surface curl degree, and h and w respectively represent the length and width of the sampling-point depth gradient quantization map;
the morphological water shortage of the blade is:

Q = (K·E) / V

wherein Q is the blade morphological water shortage value, K is the leaf withering degree value, E is the leaf surface curl degree, and V is the height of the crop leaf obtained by the depth camera.
7. The intelligent drip irrigation method based on artificial intelligence according to claim 1, wherein the method for determining the area of each blade comprises the following steps:
and performing target detection on the real-time crop color image by adopting a target detection method based on a deep learning neural network model, separating the foreground from the background by utilizing the height difference between the foreground blade and the background blade determined by the depth camera, and obtaining a single blade area of the foreground to obtain a complete blade area.
CN202211167732.0A 2022-09-23 2022-09-23 Intelligent drip irrigation method based on artificial intelligence Active CN115240126B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211167732.0A CN115240126B (en) 2022-09-23 2022-09-23 Intelligent drip irrigation method based on artificial intelligence


Publications (2)

Publication Number Publication Date
CN115240126A CN115240126A (en) 2022-10-25
CN115240126B true CN115240126B (en) 2022-12-20

Family

ID=83667471


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102564593A (en) * 2011-12-30 2012-07-11 河海大学常州校区 Plant growth condition monitoring system based on computer vision and internet of things
CN114429592A (en) * 2021-12-30 2022-05-03 山东浪潮工业互联网产业股份有限公司 Automatic irrigation method and equipment based on artificial intelligence

Also Published As

Publication number Publication date
CN115240126A (en) 2022-10-25

Similar Documents

Publication Publication Date Title
CN108009542B (en) Weed image segmentation method in rape field environment
CN110837768B (en) Online detection and identification method for rare animal protection
CN110120042B (en) Crop image pest and disease damage area extraction method based on SLIC super-pixel and automatic threshold segmentation
CN103984946B (en) High resolution remote sensing map road extraction method based on K-means
CN106446942A (en) Crop disease identification method based on incremental learning
CN110427922A (en) Pest and disease identification system and method based on machine vision and convolutional neural networks
CN111369597A (en) Particle filter target tracking method based on multi-feature fusion
CN108319973A (en) On-tree citrus fruit detection method
CN111967511B (en) Foundation cloud picture classification method based on heterogeneous feature fusion network
CN109978848B (en) Method for detecting hard exudation in fundus image based on multi-light-source color constancy model
CN111340824A (en) Image feature segmentation method based on data mining
CN109886146B (en) Flood information remote sensing intelligent acquisition method and device based on machine vision detection
CN113077486B (en) Method and system for monitoring vegetation coverage rate in mountainous area
CN106446925A (en) Dolphin identity recognition method based on image processing
CN111784597A (en) Plant protection robot autonomous navigation path detection method, server and storage medium
CN112418087B (en) Underwater video fish identification method based on neural network
CN111798470A (en) Crop image entity segmentation method and system applied to intelligent agriculture
CN111199245A (en) Rape pest identification method
CN115497067A (en) Path identification and planning method for nursery patrol intelligent vehicle
CN115147746A (en) Saline-alkali geological identification method based on unmanned aerial vehicle remote sensing image
CN115908371A (en) Plant leaf disease and insect pest degree detection method based on optimized segmentation
CN111582198A (en) Automatic sea-land segmentation method for remote sensing image
CN114862902A (en) Illumination self-adaptive ORB feature extraction and matching method based on quadtree
CN115240126B (en) Intelligent drip irrigation method based on artificial intelligence
CN111667509B (en) Automatic tracking method and system for moving target under condition that target and background colors are similar

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant