CN115240126A - Intelligent drip irrigation method based on artificial intelligence - Google Patents
- Publication number
- CN115240126A (application CN202211167732.0A)
- Authority
- CN
- China
- Prior art keywords
- blade
- image
- land
- value
- gray
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06V20/50—Scenes; scene-specific elements: context or environment of the image
- A01G25/167—Control of watering by humidity of the soil itself or of devices simulating soil or of the atmosphere; soil humidity sensors
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
- G06V10/267—Segmentation of patterns in the image field by performing operations on regions, e.g. growing, shrinking or watersheds
- G06V10/28—Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
- G06V10/42—Global feature extraction by analysis of the whole pattern
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. edges, contours, corners
- G06V10/48—Feature extraction by mapping characteristic values into a parameter space, e.g. Hough transformation
- G06V10/50—Feature extraction using histograms or projection analysis
- G06V10/54—Extraction of features relating to texture
- G06V10/56—Extraction of features relating to colour
- G06V10/82—Recognition or understanding using neural networks
- G06T2207/10024—Color image
- G06T2207/10028—Range image; depth image; 3D point clouds
- G06T2207/20081—Training; learning
- G06T2207/20084—Artificial neural networks [ANN]
- G06V2201/07—Target detection
- Y02A40/22—Improving land use; improving water use or availability; controlling erosion
Abstract
The invention relates to the technical field of data processing, in particular to an intelligent drip irrigation method based on artificial intelligence. A crop color image and a land gray scale image captured by a depth camera are acquired and analyzed: each image is processed by a dedicated method to determine the water shortage condition of the crop and of the land around it, the resulting water shortage measures are input into a trained DNN, and the most appropriate drip irrigation grade is selected to irrigate the crop. By performing a targeted water-shortage analysis on the photographed crop images and the surrounding land images, the method improves the accuracy of judging whether a crop is short of water and how severe the shortage is, thereby reducing the waste of water resources during drip irrigation.
Description
Technical Field
The invention relates to the technical field of data processing, in particular to an intelligent drip irrigation method based on artificial intelligence.
Background
Although the earth's water resources are quite abundant, the fresh water available to human beings accounts for only about one percent of all water resources. Agricultural irrigation consumes a large amount of fresh water, and traditional irrigation techniques such as flood, furrow and ditch irrigation use water very inefficiently while requiring much manual labor. Modern irrigation techniques such as spray irrigation, drip irrigation and micro irrigation were therefore developed, greatly improving the utilization efficiency of water resources.
However, although modern irrigation technology improves water utilization over traditional methods, the fuzzy control algorithms it currently relies on still judge crop water shortage too imprecisely, so water resources are still wasted.
Disclosure of Invention
In order to further reduce the waste of water resources by modern irrigation technology, the application provides an intelligent drip irrigation method based on artificial intelligence, and the adopted technical scheme is as follows:
the invention relates to an intelligent drip irrigation method based on artificial intelligence, which comprises the following steps:
after the previous drip irrigation operation is finished, acquiring images of crops and the land around the crops by adopting a depth camera so as to acquire a color image of the crops and a gray level image of the land, and taking the color image of the crops and the gray level image of the land as an initial color image of the crops and an initial gray level image of the land for the current drip irrigation control;
waiting for a set time interval after the previous drip irrigation operation is finished, acquiring images of crops and the land around the crops again by adopting the depth camera, and taking the obtained crop color images and land gray level images as real-time crop color images and real-time land gray level images of the current drip irrigation control;
comparing the data of the real-time land gray level image with the data of the initial land gray level image, and determining the color value variable of the land surface part in the real-time land gray level image, the crack area in the real-time land gray level image, the variance of the crack areas in different areas and the maximum crack width to obtain the water shortage of the land;
performing target identification on the real-time crop color image, determining a leaf area where each leaf is located, and calculating a deviation value between a pixel value of each pixel point in the leaf area and a green preset value to obtain the color offset of the leaf, wherein the green preset value is a pixel value when the water content of the leaf is normal;
graying the leaf area in the real-time crop color image, subtracting the gray value of the green preset value from the grayed leaf area to obtain a color offset map of the leaf area, and calculating the entropy of the color offset map to obtain the color offset entropy;
calculating the irregularity degree of the blade edge, the symmetry degree of the blade, the blade surface curling degree and the blade height, and determining the form water shortage degree of the blade;
and inputting the land water shortage, the color offset entropy and the form water shortage into a trained neural network, determining the drip irrigation grade of the drip irrigation control, and finishing the drip irrigation operation.
The invention has the beneficial effects that:
the method comprises the steps of obtaining crop state images including the crops and the land around the crops by shooting through a depth camera, subdividing the images into crop color images and land gray level images, respectively carrying out specific analysis on the two images to obtain a characteristic quantity representing the water shortage degree of the land and a characteristic quantity representing the water shortage condition of the crops, inputting the two characteristic quantities together into a trained neural network to obtain the drip irrigation grade under the specific water shortage condition, and finishing the most appropriate and accurate drip irrigation operation. The method improves the accuracy of identifying the water shortage condition, so that a more accurate drip irrigation scheme can be provided according to the more accurate water shortage condition, and the waste of water resources in the drip irrigation process is further reduced.
Further, the method for acquiring the color image and the gray level image of the crop by acquiring the image of the crop and the land around the crop by adopting the depth camera comprises the following steps:
the depth camera photographs the crop and the land around it to obtain a crop state image; the crop state image is grayed to obtain a crop state gray image, threshold segmentation of the gray image yields a binary image, and the binary image and its inverse are bit-wise combined with the crop state image and the crop state gray image respectively to obtain the crop color image and the land gray image.
Further, the specific process of determining the color value variable of the soil surface part in the real-time soil gray level image, the crack area in the real-time soil gray level image, the variance of the crack areas in different areas and the maximum crack width to obtain the water shortage degree of the soil is as follows:
determining the region where the gray value in the real-time land gray image is less than or equal to the gray value at the same position in the initial land gray image, setting the gray value of that region to 255 and of everything outside it to 0, to obtain a mask image;
determining the land surface part in the real-time land gray scale image according to the mask image, and calculating the color value variable of the land surface part caused by the change of the water content:

\Delta g = \bar{g}_{rt} - \bar{g}_{0}

wherein \Delta g is the color value variable, \bar{g}_{rt} is the gray average value of the soil surface part in the real-time soil gray image, and \bar{g}_{0} is the gray average value of the soil surface part in the initial soil gray image;
setting the gray value of a pixel point with the gray value of 255 in the mask image as 1, and solving the sum of the gray values of all the pixel points in the mask image to obtain the crack area S in the real-time land gray image;
dividing the mask image into 25 equal regions, determining the crack area of each region, and calculating the variance of the crack areas:

\sigma^{2} = \frac{1}{25}\sum_{i=1}^{25}\left(S_{i}-\bar{S}\right)^{2}

wherein \sigma^{2} is the variance of the crack areas of the equally divided regions, S_{i} is the crack area of the i-th region, and \bar{S} is the mean crack area over all regions;
carrying out Hough line detection in each area of the mask image, determining the longest straight line segment in the area, carrying out connected domain identification by taking any point on the longest straight line segment as a starting point to obtain a connected domain containing the longest straight line segment, calculating the length average value of each straight line segment perpendicular to the longest straight line segment in the connected domain, wherein the length average value is the maximum crack width in the area, and taking the maximum value in the maximum crack width in each area to obtain the maximum crack width W in the real-time land gray level image;
the final land water shortage degree is:

Q = \alpha\,\Delta g + \beta\,S + \gamma\,\sigma^{2} + \delta\,W

wherein Q is the land water shortage degree and \alpha, \beta, \gamma, \delta are weight coefficients whose values are determined empirically.
Further, the color offset is:

P = \frac{1}{N}\sum_{i=1}^{n}\sum_{j=1}^{m}\left|I(i,j)-T\right|

wherein P is the color offset, I(i,j) is the pixel value at row i and column j of the blade area, T is the green preset value, n and m are respectively the length and width of the blade area, and N = n \times m is the total number of pixels in the blade area.
Further, the color offset entropy is:

H = -\sum_{k} p_{k}\log p_{k}

wherein H is the color offset entropy and p_{k} is the statistical probability of the k-th offset value in the histogram obtained by counting the values of the color offset map with a histogram statistical method.
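A minimal numpy sketch of the two color statistics, assuming a mean-absolute-deviation form for the offset and a base-2 Shannon entropy over the offset histogram; the patent's exact normalization may differ.

```python
import numpy as np

def color_offset(leaf_gray, green_preset):
    # mean absolute deviation of every leaf pixel from the green preset value
    return float(np.abs(leaf_gray.astype(float) - green_preset).mean())

def color_offset_entropy(leaf_gray, green_preset):
    # histogram the per-pixel offsets, then take the Shannon entropy
    offsets = np.abs(leaf_gray.astype(int) - green_preset)
    hist = np.bincount(offsets.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins (0 * log 0 := 0)
    return float(-(p * np.log2(p)).sum())
```

A uniformly colored leaf gives entropy 0; uneven yellowing spreads the offset histogram and raises the entropy, matching the texture argument in the description.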
Further, the method for calculating the irregularity degree of the blade edge, the symmetry degree of the blade, the curling degree of the blade surface and the height of the blade and determining the form water shortage degree of the blade comprises the following steps:
edge detection is performed in the grayed blade area to determine the blade edge; for each point on the blade edge, the distance to the point where the line through the center of the blade's inscribed ellipse intersects the ellipse circumference is computed, and the irregularity of the blade edge is represented by the variance of these distances:

\sigma_{e}^{2} = \frac{1}{M}\sum_{i=1}^{M}\left(d_{i}-\bar{d}\right)^{2},\qquad d_{i}=\sqrt{(x_{i}-x_{i}')^{2}+(y_{i}-y_{i}')^{2}}

wherein \sigma_{e}^{2} is the variance, (x_{i}, y_{i}) is the i-th point on the blade edge, (x_{i}', y_{i}') is the point on the circumference of the blade's inscribed ellipse lying on the line from (x_{i}, y_{i}) to the ellipse center, M is the number of edge points, and \bar{d} is the mean of the distances d_{i};
taking the major axis of the blade's inscribed ellipse as the symmetry axis, dividing the blade edge into left and right curves, flipping one curve about the axis, obtaining the dynamic time warping distance of the two curves with a dynamic time warping algorithm, and then obtaining the blade symmetry degree value:

B = \frac{1}{1 + d_{DTW}}

wherein B is the blade symmetry degree value and d_{DTW} is the dynamic time warping distance between the left and right edge curves of the blade;
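The curve comparison in this step can be sketched with a textbook dynamic time warping distance. The mapping from distance to a symmetry value, here 1/(1 + d), is an assumed form for illustration, since the patent's exact expression is not reproduced in this extraction.

```python
import numpy as np

def dtw_distance(a, b):
    # classic O(len(a) * len(b)) dynamic time warping distance
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

def symmetry_degree(left_curve, right_curve_flipped):
    # hypothetical mapping: identical curves -> 1, large distance -> near 0
    return 1.0 / (1.0 + dtw_distance(left_curve, right_curve_flipped))
```

Feeding the same curve twice yields distance 0 and symmetry 1, which is the expected behavior for a perfectly symmetric leaf.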
and obtaining a leaf withering degree value according to the variance and the leaf symmetry degree value:
gradient changes along the x axis and y axis of the blade area are obtained with the Sobel operator, and the two are combined in a weighted sum to give the depth gradient of the whole blade area:

G = w_{x}\,G_{x} + w_{y}\,G_{y}

wherein G is the depth gradient of the whole blade image, G_{x} is the blade image depth gradient computed by the Sobel operator in the x-axis direction, G_{y} is the depth gradient computed in the y-axis direction, and w_{x}, w_{y} are the axis weights;
uniformly selecting a certain number of sampling points on the blade area to obtain a depth gradient on each sampling point, arranging the depth gradients of each sampling point into a matrix according to the coordinate position of each sampling point to obtain a sampling point depth gradient matrix, and quantizing the sampling point depth gradient matrix to obtain a sampling point depth gradient quantization map;
following the method for computing a gray level co-occurrence matrix, the depth gradient co-occurrence matrix of each sampling point in the sampling point depth gradient quantization map is obtained, and its entropy is calculated:

e = -\sum_{u}\sum_{v} p(u,v)\log p(u,v)

wherein e is the entropy of the depth gradient co-occurrence matrix of the sampling point and p(u,v) is the probability that the depth gradient pair (u,v) occurs in that co-occurrence matrix;
the entropies of the depth gradient co-occurrence matrices of all sampling points are then averaged to obtain the leaf surface curl degree:

E = \frac{1}{h\,w}\sum_{i=1}^{h}\sum_{j=1}^{w} e_{ij}

wherein E is the leaf surface curl degree, h and w respectively represent the length and width of the sampling point depth gradient quantization map, and e_{ij} is the entropy at sampling point (i, j);
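A sketch of the texture measurement, assuming a horizontal-neighbor co-occurrence matrix over the quantized depth-gradient map; the patent does not fix the offset direction or the number of quantization levels here, so both are illustrative choices.

```python
import numpy as np

def cooccurrence_entropy(quantized, levels=8):
    # count co-occurrences of horizontally adjacent quantized gradient values
    P = np.zeros((levels, levels), dtype=float)
    left = quantized[:, :-1].ravel()
    right = quantized[:, 1:].ravel()
    for u, v in zip(left, right):
        P[u, v] += 1.0
    P /= P.sum()
    p = P[P > 0]                      # Shannon entropy of the pair distribution
    return float(-(p * np.log2(p)).sum())
```

A flat (uncurled) region yields entropy 0, while an alternating-gradient (curled) surface yields a strictly positive entropy, which is the behavior the curl measure relies on.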
form water shortage degree of the blade:

F = \lambda_{1}\,K + \lambda_{2}\,E + \lambda_{3}\,\frac{1}{H_{l}}

wherein F is the form water shortage degree value of the blade, K is the leaf withering degree value, E is the leaf surface curl degree, H_{l} is the height of the crop leaf obtained by the depth camera (a drooping, water-short leaf sits lower), and \lambda_{1}, \lambda_{2}, \lambda_{3} are weight coefficients.
Further, the method for determining the blade area where each blade is located comprises the following steps:
and performing target detection on the real-time crop color image by adopting a target detection method based on a deep learning neural network model, separating the foreground from the background by utilizing the height difference between the foreground blade and the background blade determined by the depth camera, and obtaining a single blade area of the foreground to obtain a complete blade area.
Drawings
Fig. 1 is a flow chart of the intelligent drip irrigation method based on artificial intelligence of the present invention.
Detailed Description
The intelligent drip irrigation method based on artificial intelligence of the invention is explained in detail below with reference to the accompanying drawings and examples.
The method comprises the following steps:
the invention relates to an embodiment of an intelligent drip irrigation method based on artificial intelligence, which has the overall flow shown in figure 1 and comprises the following specific processes:
the method comprises the following steps of collecting a crop image and a land image around the crop.
And arranging a depth camera right above the crops, and vertically shooting the crops and the land around the crops downwards to obtain the crop state images. And performing Gaussian filtering on the crop state image to eliminate image noise, and performing gray processing on the image to obtain a gray image.
After the image is converted into a grayscale image, threshold segmentation is performed on the grayscale image to binarize the grayscale image. Because the crop and the land have larger gray difference, the threshold value is automatically selected by using a maximum variance threshold value method. The image is divided to generate a binary image, and the binary image and the inverted binary image are respectively subjected to bit operation with the color original image and the gray image, so that a crop color image and a land gray image can be extracted, and the separation of the crop and the land image is realized.
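The thresholding and separation described above can be sketched as follows. `otsu_threshold` re-implements the maximum between-class variance rule in plain numpy for self-containment (in practice `cv2.threshold` with the Otsu flag would be used), and the assumption that crop pixels form the brighter class is illustrative only.

```python
import numpy as np

def otsu_threshold(gray):
    # maximum between-class variance (Otsu) threshold over a uint8 image
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    sum_all = (np.arange(256) * hist).sum()
    best_t, best_var = 0, -1.0
    w0, sum0 = 0.0, 0.0
    for t in range(256):
        w0 += hist[t]
        sum0 += t * hist[t]
        if w0 == 0 or w0 == total:
            continue
        m0 = sum0 / w0                        # mean of class below threshold
        m1 = (sum_all - sum0) / (total - w0)  # mean of class above threshold
        var = w0 * (total - w0) * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def split_crop_and_land(color_img, gray_img):
    # assumption for illustration: crop leaves are the brighter class
    t = otsu_threshold(gray_img)
    crop_mask = gray_img > t
    crop_color = color_img * crop_mask[..., None]   # bit-style masking
    land_gray = gray_img * ~crop_mask               # inverted mask for land
    return crop_color, land_gray
```

The masking multiplications play the role of the bit operations in the description: one mask keeps the crop pixels of the color image, its inverse keeps the land pixels of the gray image.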
When each irrigation operation is finished, the crops and the land around the crops are shot by adopting a depth camera, and the obtained crop color image and the land gray level image are used as initial images for the next drip irrigation control, wherein the crop color image is an initial crop color image, and the land gray level image is an initial land gray level image.
After the irrigation operation is finished, the depth camera is adopted to shoot the crops and the land around the crops again after the set time interval is waited, and a real-time crop color image and a real-time land gray level image are obtained.
Step two, analyzing the land part of the image to determine the land water shortage condition.
When the water content of the land differs, the surface state of the land differs, most visibly in two ways: the color of the land changes as the water content falls, and the land cracks. That is, when the soil moisture content is high the soil is darker and free of cracks; when it is low, the soil is lighter in color and cracks appear.
When the ground cracks as its moisture content falls, the surface becomes lighter and its gray value rises, while the cracks, being darker, have a markedly lower gray value than the surrounding surface.
And comparing the shot real-time land gray level image with the initial land gray level image in the initial image to accurately identify the cracks generated in the land, and generating a mask image. And if the gray value of a certain position in the real-time land gray image is less than or equal to the gray value of the position in the initial land gray image, setting the gray value of the mask image corresponding to the position to be 255, otherwise, setting the gray value to be 0.
And performing opening operation processing on the obtained mask image to obtain a mask image capable of accurately representing the position of the crack, wherein if the mask image has no crack, the mask image is an image with the gray value of 0 at each position.
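The mask construction and the opening operation can be sketched as below. The erosion/dilation pair is a naive 3x3 stand-in for what would normally be `cv2.morphologyEx` with `MORPH_OPEN`; it is written out in numpy so the sketch stays self-contained.

```python
import numpy as np

def crack_mask(rt_gray, init_gray):
    # 255 where the real-time gray value is <= the initial gray value
    # (cracks are darker than the drying, lightening surface), else 0
    return np.where(rt_gray <= init_gray, 255, 0).astype(np.uint8)

def binary_open(mask, k=3):
    # naive k x k erosion followed by dilation (morphological opening);
    # removes isolated noise pixels while keeping larger crack regions
    pad = k // 2
    m = mask > 0
    padded = np.pad(m, pad, constant_values=False)
    eroded = np.ones_like(m)
    for dy in range(k):
        for dx in range(k):
            eroded &= padded[dy:dy + m.shape[0], dx:dx + m.shape[1]]
    padded = np.pad(eroded, pad, constant_values=False)
    dilated = np.zeros_like(m)
    for dy in range(k):
        for dx in range(k):
            dilated |= padded[dy:dy + m.shape[0], dx:dx + m.shape[1]]
    return (dilated * 255).astype(np.uint8)
```

Opening removes single-pixel speckle (which would otherwise be counted as crack area) while a genuine multi-pixel crack patch survives.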
Based on the resulting mask image, the land surface part of the real-time land gray image, i.e. the part other than the cracks, can be determined. The gray average value \bar{g}_{rt} of the land surface part in the real-time land gray image is computed, and the gray average value \bar{g}_{0} of the land surface part in the initial land gray image is subtracted from it, giving the color value variable of the land surface caused by the change in water content:

\Delta g = \bar{g}_{rt} - \bar{g}_{0}

wherein \Delta g is the color value variable, \bar{g}_{rt} is the gray average value of the soil surface part in the real-time soil gray image, and \bar{g}_{0} is the gray average value of the soil surface part in the initial soil gray image.
Meanwhile, the area of the crack in the mask image is obtained. Specifically, the gray value of a pixel point with the gray value of 255 in the mask image is set to be 1, and then all the pixel points are summed to obtain the crack area S. Of course, other methods known in the art may be used to determine the area of the crack in the mask image in other embodiments.
The mask image is divided into 25 regions (5 by 5), the crack area S_{i} of each region is calculated, and the variance of the crack areas over the equally divided regions is then:

\sigma^{2} = \frac{1}{25}\sum_{i=1}^{25}\left(S_{i}-\bar{S}\right)^{2}

wherein \sigma^{2} is the variance of the crack areas, S_{i} is the crack area of the i-th region, and \bar{S} is the mean crack area over the regions. The larger the variance, the more unevenly the cracks are distributed in the mask image and the lower the moisture content of the soil.
In the present embodiment, the mask image is divided equally into 25 regions, and it is obviously determined that the mask image may be divided equally into other suitable number of regions in other embodiments.
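The total crack area and the per-region variance can be sketched together; the grid split uses floor indexing so it also works when the mask size is not an exact multiple of the grid (an implementation detail not fixed by the description).

```python
import numpy as np

def crack_area_stats(mask, grid=5):
    # total crack area S, and variance of crack areas over a grid x grid split
    ones = (mask > 0).astype(int)          # 255 -> 1, as in the description
    h, w = ones.shape
    areas = np.array([
        ones[r * h // grid:(r + 1) * h // grid,
             c * w // grid:(c + 1) * w // grid].sum()
        for r in range(grid) for c in range(grid)], dtype=float)
    return int(ones.sum()), float(((areas - areas.mean()) ** 2).mean())
```

A mask with evenly spread cracks yields variance 0; concentrating the same crack area in one corner drives the variance up, which is exactly the unevenness signal used above.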
Then Hough line detection is performed in each region of the mask image and the longest straight line segment in each region is determined. Connected domain identification is carried out starting from any point on that segment to obtain the connected domain containing it, and the mean length of the straight line segments perpendicular to the longest segment within the connected domain is computed; this mean is the maximum crack width of the corresponding region. Taking the maximum over all regions gives the final maximum crack width W. Since the Hough transform and the Cartesian coordinate system are known techniques, they are not described again here. In this embodiment the maximum crack width is found per region and the overall maximum W is then taken; in other embodiments W can also be found directly over the whole mask image.
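A simplified stand-in for the Hough-based width estimate: approximate the crack width at each crack pixel as the shorter of its horizontal and vertical runs, and take the maximum. This is an approximation of "length perpendicular to the longest segment", not the patent's exact procedure (which would use e.g. `cv2.HoughLinesP`).

```python
import numpy as np

def run_lengths(m, axis):
    # length of the consecutive True run covering each pixel along `axis`
    a = m.astype(int)
    if axis == 1:
        a = a.T
    out = np.zeros_like(a)
    for j in range(a.shape[1]):
        col = a[:, j]
        i = 0
        while i < len(col):
            if col[i]:
                k = i
                while k < len(col) and col[k]:
                    k += 1
                out[i:k, j] = k - i
                i = k
            else:
                i += 1
    return out if axis == 0 else out.T

def max_crack_width(mask):
    m = mask > 0
    if not m.any():
        return 0
    # width at a pixel ~ min of vertical and horizontal run through it
    width = np.minimum(run_lengths(m, 0), run_lengths(m, 1))
    return int(width[m].max())
```

For a long horizontal crack two pixels thick, the vertical run (2) is shorter than the horizontal run, so the estimated width is 2 regardless of the crack's length.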
Finally, from the color value variable F obtained above, the crack area S, the variance σ² of the crack areas of the evenly divided regions in the mask image, and the maximum crack width W, the final land water shortage degree is obtained:

D = α₁·F + α₂·S + α₃·σ² + α₄·W

where D is the land water shortage degree and α₁, α₂, α₃, α₄ are weight coefficients determined according to practical experience; in this embodiment the weight coefficients are set to 0.1, 0.5, 1.
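The way the four land indicators are combined is not fully reproduced in the extracted text; assuming a linear weighted sum, which matches the four weight coefficients described, a minimal sketch could be (weight values are illustrative defaults):

```python
def land_water_shortage(F, S, var, W, weights=(0.1, 0.5, 1.0, 1.0)):
    """Combine the four land indicators into one water-shortage score.

    F: color value variable of the soil surface, S: crack area,
    var: per-region crack-area variance, W: maximum crack width.
    A linear weighted sum is an assumption; the weights would be tuned
    from practical experience as the text describes.
    """
    a1, a2, a3, a4 = weights
    return a1 * F + a2 * S + a3 * var + a4 * W
```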
And step three, analyzing partial images of the crops to determine the water shortage condition of the crops.
For the crop color image acquired and processed by the depth camera, target detection is used to locate the leaves in the image. The target detection adopts a deep-learning-based method, such as a YoloV5 model; the model needs to be trained before use, and how to train the neural network is well known to those skilled in the art, so the process is not described again here.
For overlapped leaves, the depth information of the height difference between the foreground leaf and the background leaves is used to separate foreground from background and obtain a single foreground leaf image. To remove holes that may exist on the leaf surface, the obtained leaf image is binarized and a closing operation is then performed, giving a complete leaf-surface region. Finally, a separate leaf region is acquired for each detected target.
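A minimal sketch of the depth-based separation and hole-closing step, assuming a simple depth threshold stands in for the foreground/background height difference (the text does not specify how the threshold is chosen), with the morphological closing implemented as plain NumPy 3×3 dilation and erosion:

```python
import numpy as np

def _dilate(b):
    """3x3 binary dilation via shifted logical OR."""
    p = np.pad(b, 1)
    out = np.zeros_like(b)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= p[1 + dy : 1 + dy + b.shape[0], 1 + dx : 1 + dx + b.shape[1]]
    return out

def _erode(b):
    """3x3 binary erosion as the complement of the dilated complement."""
    return ~_dilate(~b)

def foreground_leaf_mask(depth, depth_threshold):
    """Separate a foreground leaf from overlapped background leaves.

    Pixels nearer than depth_threshold are taken as foreground; a
    morphological closing (dilate, then erode) fills small holes in the
    leaf surface, as the binarize-then-close step in the text describes.
    """
    fg = depth < depth_threshold
    return _erode(_dilate(fg))
```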
When water is sufficient, the crop leaves are green, glossy and without obvious texture. When the crop lacks water, the leaves droop and turn from green to yellow, and because this color change is unevenly distributed, the leaf texture becomes more pronounced. Therefore, the water shortage degree of a leaf can be characterized by the morphological water shortage calculated from the leaf's shape, together with the color offset and the color offset entropy of the leaf.
1. The color offset is calculated.
The color offset is obtained by calculating the Euclidean distance between the pixel values of the leaf region in the real-time crop color image and the green preset value corresponding to normal leaf water content; in this embodiment the green preset value is chosen as (50, 185, 10).
The Euclidean distance between each pixel of the leaf region and the green preset value is calculated in RGB color space, and the mean of these distances over each leaf is the color offset of that leaf. The color offset reflects the water shortage degree of the leaf: the more severe the crop's water shortage, the larger the color offset.
The color offset is:

P = (1/N) · Σᵢ Σⱼ ‖I(i, j) − C₀‖₂

where P is the color offset, I(i, j) is the pixel value at row i and column j of the leaf region, C₀ is the green preset value, n and m are respectively the length and width of the leaf region, and N is the total number of pixels in the leaf region.
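The mean Euclidean distance to the green preset value can be computed directly, for example (the reference value follows the embodiment above):

```python
import numpy as np

def color_offset(leaf_rgb, green_ref=(50, 185, 10)):
    """Mean Euclidean distance between leaf pixels and the green reference.

    leaf_rgb: (n, m, 3) array of RGB values of the leaf region.
    """
    diff = leaf_rgb.astype(np.float64) - np.asarray(green_ref, dtype=np.float64)
    return float(np.sqrt((diff ** 2).sum(axis=-1)).mean())
```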
2. The color offset entropy is calculated.
When the leaves lack water and turn from green to yellow, the color change is irregular and non-uniform, producing complex random textures on the leaves.
The single-leaf image is grayed, and the gray value 125 of the green preset value is subtracted to obtain the color shift map of the leaf; Gaussian filtering is then applied to the color shift map to remove white noise. The values in the color shift map are counted with a histogram, and the color offset entropy corresponding to the histogram is calculated.
When the crop is not short of water, the leaf texture is not obvious, the pixel distribution in the histogram of the color shift map is concentrated, and the corresponding entropy is small; when the crop lacks water, the leaf texture is obvious, the histogram distribution is dispersed, and the entropy is large. The water shortage degree of the crop can therefore be reflected by the change of the entropy value. The color offset entropy is:

H = − Σₖ pₖ · log₂ pₖ

where H is the color offset entropy and pₖ is the statistical probability of occurrence of the k-th pixel value in the histogram of the color shift map.
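The histogram entropy above is a standard Shannon entropy; a sketch (the bin count is an assumption, the text does not specify it):

```python
import numpy as np

def color_shift_entropy(shift_map, bins=256):
    """Shannon entropy of the color-shift histogram.

    A concentrated histogram (uniform leaf colour) gives a small entropy;
    a dispersed one (irregular yellowing texture) gives a large one.
    """
    hist, _ = np.histogram(shift_map, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]  # skip empty bins: 0*log(0) is taken as 0
    return float(-(p * np.log2(p)).sum())
```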
3. The morphological water shortage is calculated.
The water content of the leaves is reflected in various morphological features; the change of leaf water content is analyzed from different morphological angles, and parameters characterizing the leaf water content are determined accordingly.
The first morphological angle: under normal conditions the leaf edge appears overall as a relatively smooth ellipse. When the crop lacks water, the leaf shrinks, the edge curls toward the middle, and the overall contour of the leaf becomes irregular. The lower the leaf moisture content, the higher the degree of edge irregularity.
Therefore, according to this characteristic, the leaf color image is converted into a gray image and edge detection is performed on it. The edge detection uses the Canny operator to obtain the edge image; the Canny operator is known in the art and is not described in detail here. Ellipse fitting is then performed on the leaf edge curve to obtain the inscribed ellipse curve. For each point on the leaf edge, the distance to the point where the line through it and the ellipse center crosses the inscribed-ellipse circumference is calculated, and the variance of these distances measures the smoothness of the leaf edge: the greater the variance, the more irregular the leaf edge and the higher the leaf's water shortage degree. The variance is:

B = (1/K) · Σₖ (dₖ − d̄)², dₖ = √((xₖ − x′ₖ)² + (yₖ − y′ₖ)²)

where B is the variance of the distances, (xₖ, yₖ) is the k-th point on the leaf edge, (x′ₖ, y′ₖ) is the point on the inscribed-ellipse circumference lying on the line between (xₖ, yₖ) and the ellipse center, d̄ is the mean of the distances dₖ, and K is the number of edge points.
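A sketch of this edge-irregularity variance, assuming an axis-aligned inscribed ellipse so that the ellipse point on each center ray has a closed form (a fitted ellipse would additionally supply a rotation angle):

```python
import numpy as np

def edge_irregularity(edge_points, center, a, b):
    """Variance of edge-to-ellipse distances along rays from the centre.

    For an axis-aligned ellipse with semi-axes a, b, the ellipse point on
    the ray at angle theta lies at radius
    r(theta) = a*b / sqrt((b*cos(theta))**2 + (a*sin(theta))**2).
    The distance for each edge point is |r_edge - r(theta)|; a larger
    variance of these distances means a more irregular leaf edge.
    """
    pts = np.asarray(edge_points, dtype=np.float64) - np.asarray(center, dtype=np.float64)
    theta = np.arctan2(pts[:, 1], pts[:, 0])
    r_edge = np.hypot(pts[:, 0], pts[:, 1])
    r_ell = a * b / np.sqrt((b * np.cos(theta)) ** 2 + (a * np.sin(theta)) ** 2)
    return float(np.var(np.abs(r_edge - r_ell)))
```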
The second morphological angle: the leaf is more symmetrical when its water content is normal, and its symmetry decreases as the water content decreases; detecting the leaf symmetry therefore reflects the leaf's water shortage.
After the inscribed ellipse of the leaf is obtained, the major axis of the ellipse can be determined; this axis is the symmetry axis of the leaf. The symmetry axis divides the leaf edge into a left curve and a right curve, and detecting the leaf symmetry amounts to measuring the similarity of these two curves. One curve is flipped about the symmetry axis, the dynamic time warping distance of the two curves is obtained with the dynamic time warping algorithm, and the curve similarity is obtained from this distance, giving the leaf symmetry degree value, for example:

C = 1 / (1 + L)

where C is the leaf symmetry degree value and L is the dynamic time warping distance between the left and right curves of the leaf edge.
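Dynamic time warping and the distance-to-similarity step can be sketched as follows; the 1/(1+d) mapping is an assumption, since the text only states that the similarity is obtained from the distance:

```python
import numpy as np

def dtw_distance(s, t):
    """Classic O(len(s)*len(t)) dynamic-time-warping distance of 1-D series."""
    n, m = len(s), len(t)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(s[i - 1] - t[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

def leaf_symmetry(left_curve, right_curve_mirrored):
    """Symmetry value from the DTW distance of the two half-edge curves.

    Identical curves give distance 0 and symmetry 1; larger distances
    give smaller symmetry values.
    """
    return 1.0 / (1.0 + dtw_distance(left_curve, right_curve_mirrored))
```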
A leaf withering degree value is obtained from the variance and the leaf symmetry degree value; it increases with B and decreases with C, for example:

Z = B / C

where Z is the leaf withering degree value. When the leaf curls from lack of water, B increases and C decreases, so Z increases; when the leaf water content rises and the leaf unfolds, B decreases and C increases, so Z decreases.
The third morphological angle: when water is sufficient, the leaf surface is flat and the depth values captured by the depth camera vary little across the leaf; when the leaf lacks water, the whole leaf curls and the depth values change markedly, exhibiting large gradients. The flatness of the leaf, and hence its water shortage degree, can therefore be reflected by detecting the depth gradient of the leaf.
The flatness of the leaf is reflected by the gradient of the leaf depth, which is computed with the Sobel operator. The Sobel operator gives the gradients of the leaf image along the x axis and the y axis, and the two axis gradients are weighted and summed to obtain the depth gradient of the whole leaf image:

G = w₁·Gₓ + w₂·G_y

where G is the depth gradient of the whole leaf image, Gₓ is the leaf image depth gradient computed by the Sobel operator in the x-axis direction, G_y is the leaf image depth gradient computed by the Sobel operator in the y-axis direction, and w₁, w₂ are the weights. Since the Sobel operator is a well-known technique, it is not described again here.
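The weighted Sobel combination might look like this in NumPy (the equal x/y weights are an assumption; the text only says the two axis gradients are weighted and summed):

```python
import numpy as np

def depth_gradient(depth, wx=0.5, wy=0.5):
    """Weighted sum of absolute Sobel responses of a depth map.

    Edge-replicated padding keeps the output the same size as the input.
    """
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
    ky = kx.T
    p = np.pad(depth.astype(np.float64), 1, mode="edge")
    h, w = depth.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    # Manual 3x3 correlation: accumulate each kernel tap over a shifted window.
    for dy in range(3):
        for dx in range(3):
            window = p[dy : dy + h, dx : dx + w]
            gx += kx[dy, dx] * window
            gy += ky[dy, dx] * window
    return wx * np.abs(gx) + wy * np.abs(gy)
```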
After the depth gradient of the whole leaf image is obtained, a certain number of sampling points are selected uniformly on the leaf image and the depth gradient of each sampling point is taken. The sampling-point depth gradients are arranged into a matrix according to the sampling-point coordinates, giving the sampling-point depth gradient matrix; histogram equalization is applied to this matrix to increase the global contrast of the gradients, and compression quantization then yields the sampling-point depth gradient quantization map.
The depth gradient co-occurrence matrix of each sampling point in the sampling-point depth gradient quantization map is obtained in the same way as a gray-level co-occurrence matrix. When the leaf curls, the overall depth gradient variation increases, the quantization map becomes more random, and the entropy of each sampling point's depth gradient co-occurrence matrix increases. Averaging the entropies of the co-occurrence matrices over all sampling points characterizes the degree of leaf-surface curl:

Eₛ = − Σᵤ Σᵥ P(u, v) · log₂ P(u, v), E = (1/(h·w)) · Σₛ Eₛ

where E is the leaf-surface curl degree, Eₛ is the entropy of the depth gradient co-occurrence matrix of sampling point s, P(u, v) is the probability of occurrence of the depth gradient point pair (u, v) in that co-occurrence matrix, (u, v) are coordinates in the co-occurrence matrix, and h and w respectively represent the length and width of the sampling-point depth gradient quantization map.
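A simplified sketch of the co-occurrence entropy: instead of one matrix per sampling point as the text describes, a single horizontal-pair co-occurrence matrix is built over the whole quantization map and its entropy returned as one global estimate of the curl degree:

```python
import numpy as np

def curl_degree(quantized, levels):
    """Entropy of horizontal-neighbour co-occurrence in a quantized map.

    quantized: 2-D integer array with values in [0, levels).
    A random (curled-leaf) map yields a dispersed co-occurrence matrix
    and a large entropy; a uniform (flat-leaf) map yields entropy 0.
    """
    pairs_u = quantized[:, :-1].ravel()
    pairs_v = quantized[:, 1:].ravel()
    P = np.zeros((levels, levels), dtype=np.float64)
    np.add.at(P, (pairs_u, pairs_v), 1.0)  # count each neighbour pair
    P /= P.sum()
    nz = P[P > 0]
    return float(-(nz * np.log2(nz)).sum())
```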
When the E value increases, the degree and complexity of the gradient variation rise and the leaf gradient changes more sharply, so the leaf curl degree increases and the crop's water shortage is greater.
The fourth morphological angle: when the leaf water content is insufficient, the leaf height changes. Crop leaves stand upright under normal conditions and droop when the crop lacks water. According to this characteristic, the leaf height is measured with the depth camera, which senses the height change of the crop leaves and determines their specific height T; the height can be obtained by calculating the change of the depth values of the crop region in the image. When the crop's water content is normal, the leaf height is normal and T is large; when the crop lacks water, the leaves droop and T decreases correspondingly, indicating that the crop needs drip irrigation.
From the parameters characterizing the leaf water content at these different angles, the morphological water shortage of the leaf can finally be determined; it increases with the withering degree and the curl degree and decreases with the leaf height, for example:

Q = Z · E / T

where Q is the morphological water shortage value of the leaf, Z is the leaf withering degree value, E is the leaf-surface curl degree, and T is the crop leaf height.
And step four, inputting the land water shortage condition and the crop water shortage condition into the trained neural network, and controlling the drip irrigation system to execute the drip irrigation operation.
After the analysis of the soil image and the crop image, four parameters characterizing the crop water shortage degree are finally obtained: the land water shortage degree D, the color offset P, the color offset entropy H, and the morphological water shortage Q. Each leaf identified in the live image taken by the depth camera therefore has a four-dimensional vector consisting of these four parameters.
The number of leaves differs between crops and is usually about 30; 20 leaves are selected at random to form a vector matrix, and if fewer than 20 leaves are available the matrix is padded with 0. The vector matrix of 20 leaves is fed into the trained DNN, which outputs through a softmax function a six-dimensional vector of confidences on whether the crop needs drip irrigation, each dimension corresponding to a preset drip irrigation level. Drip irrigation is performed at six levels, with preset flow rate and duration: level 0 requires no drip irrigation; level 1 is 10 ml/min for 3 hours; level 2 is 20 ml/min for 3 hours; level 3 is 30 ml/min for 3 hours; level 4 is 30 ml/min for 5 hours; level 5 is 30 ml/min for 7 hours. During drip irrigation no crop images are taken. At the moment the drip irrigation operation finishes, images of the crop and the surrounding land are collected again by the depth camera as the initial images for the next round of drip irrigation control; then, after the set time interval, images of the crop and the surrounding land are collected as real-time images for a new round of drip irrigation control. Since the DNN is composed and trained in a conventional manner, it is not described in detail here.
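Mapping the softmax confidences to the preset drip irrigation schedule reduces to an argmax over the confidence vector plus a lookup table; the rates and durations follow the six levels described above (the level-4 rate is taken as 30 ml/min for consistency with the neighbouring levels):

```python
def select_drip_schedule(confidences):
    """Pick the drip-irrigation level with the highest network confidence.

    Returns (level, (rate_ml_per_min, duration_hours)).
    """
    # One (rate, duration) entry per preset level, level 0 meaning no irrigation.
    schedule = [(0, 0), (10, 3), (20, 3), (30, 3), (30, 5), (30, 7)]
    level = max(range(len(confidences)), key=lambda i: confidences[i])
    return level, schedule[level]
```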
In this embodiment the set waiting time interval is 2 hours. Because six drip irrigation levels are set in this embodiment, the DNN outputs six-dimensional vectors; in other embodiments a different number of drip irrigation levels can be set and the DNN outputs a vector of the corresponding dimension. For example, with five drip irrigation levels the DNN outputs five-dimensional vectors.
The above-mentioned embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
Claims (7)
1. An intelligent drip irrigation method based on artificial intelligence is characterized by comprising the following steps:
after the previous drip irrigation operation is finished, acquiring images of crops and the land around the crops by adopting a depth camera so as to acquire a color image of the crops and a gray level image of the land, and taking the color image of the crops and the gray level image of the land as an initial color image of the crops and an initial gray level image of the land for the current drip irrigation control;
waiting for a set time interval after the previous drip irrigation operation is finished, acquiring images of crops and the land around the crops again by adopting the depth camera, and taking the obtained crop color images and land gray level images as real-time crop color images and real-time land gray level images of the current drip irrigation control;
comparing the data of the real-time land gray level image with the data of the initial land gray level image, and determining the color value variable of the land surface part in the real-time land gray level image, the crack area in the real-time land gray level image, the variance of the crack areas in different areas and the maximum crack width to obtain the water shortage of the land;
performing target identification on the real-time crop color image, determining a leaf area where each leaf is located, and calculating a deviation value between a pixel value of each pixel point in the leaf area and a green preset value to obtain the color offset of the leaf, wherein the green preset value is a pixel value when the water content of the leaf is normal;
graying a leaf area in the real-time crop color image, obtaining a color shift map of the leaf area by subtracting the gray value of the green preset value from the obtained gray leaf area, and calculating the entropy value of the color shift map to obtain the color offset entropy;
calculating the irregularity degree of the blade edge, the symmetry degree of the blade, the blade surface curling degree and the blade height, and determining the form water shortage degree of the blade;
and inputting the land water shortage degree, the color offset, the color offset entropy and the morphological water shortage into a trained neural network, determining the drip irrigation level of the current drip irrigation control, and completing the drip irrigation operation.
2. The intelligent drip irrigation method based on the artificial intelligence as claimed in claim 1, wherein the method for acquiring the color image and the gray scale image of the crop by acquiring the image of the crop and the land around the crop by using the depth camera comprises the following steps:
the depth camera shoots crops and the land around the crops to obtain crop state images, graying is conducted on the crop state images to obtain crop state gray level images, threshold segmentation is conducted on the crop state gray level images to obtain binary images, and the binary images after the reverse color are respectively subjected to bit operation with the crop state images and the crop state gray level images to obtain crop color images and land gray level images.
3. The intelligent drip irrigation method based on artificial intelligence of claim 1, wherein the specific process of determining the color value variable of the soil surface part in the real-time soil gray-scale image, the crack area in the real-time soil gray-scale image, the variance of the crack areas of different areas and the maximum crack width to obtain the water shortage of the soil is as follows:
determining a region of which the gray value on the real-time land gray image is less than or equal to that of the initial land gray image, setting the gray value of the region to be 255, and setting the gray value of the part outside the region to be 0 to obtain a mask image;
determining a land surface part in the real-time land gray image according to the mask image, and calculating the color value variable of the land surface part in the real-time land gray image caused by the change of the water content:

F = ḡ₁ − ḡ₀

wherein F is the color value variable, ḡ₁ is the gray mean of the land surface part in the real-time land gray image, and ḡ₀ is the gray mean of the land surface part in the initial land gray image;
setting the gray value of a pixel point with the gray value of 255 in the mask image as 1, and solving the sum of the gray values of all the pixel points in the mask image to obtain the crack area S in the real-time land gray image;
dividing the mask image into 25 regions equally, determining the crack area of each region, and calculating the variance of the crack areas:

σ² = (1/25) · Σᵢ (Sᵢ − S̄)², i = 1, …, 25

wherein σ² is the variance of the crack areas of the evenly divided regions in the mask image, Sᵢ is the crack area of the i-th evenly divided region, and S̄ is the mean crack area over the evenly divided regions;
carrying out Hough line detection in each area of the mask image, determining the longest straight line segment in the area, carrying out connected domain identification by taking any point on the longest straight line segment as a starting point to obtain a connected domain containing the longest straight line segment, calculating the length average value of each straight line segment perpendicular to the longest straight line segment in the connected domain, wherein the length average value is the maximum crack width in the area, and taking the maximum value in the maximum crack width in each area to obtain the maximum crack width W in the real-time land gray level image;
the final land water shortage degree is:

D = α₁·F + α₂·S + α₃·σ² + α₄·W

wherein D is the land water shortage degree and α₁, α₂, α₃, α₄ are weight coefficients.
4. The intelligent drip irrigation method based on artificial intelligence according to claim 1, wherein the color offset is:

P = (1/N) · Σᵢ Σⱼ ‖I(i, j) − C₀‖₂

wherein P is the color offset, I(i, j) is the pixel value at row i and column j of the leaf region, C₀ is the green preset value, and N is the total number of pixels in the leaf region.
5. The intelligent drip irrigation method based on artificial intelligence according to claim 1, wherein the color offset entropy is:

H = − Σₖ pₖ · log₂ pₖ

wherein H is the color offset entropy and pₖ is the statistical probability of occurrence of the k-th pixel value in the histogram of the color shift map.
6. The intelligent drip irrigation method based on artificial intelligence of claim 1, wherein the method for calculating the irregularity degree of the blade edge, the symmetry degree of the blade, the curling degree of the blade surface and the height of the blade and determining the form water shortage degree of the blade comprises the following steps:
edge detection is carried out in the gray blade area to determine the blade edge; for each point on the blade edge, the distance to the point on the blade inscribed-ellipse circumference lying on the connecting line between that edge point and the ellipse center is calculated, and the irregularity degree of the blade edge is characterized by the variance of these distances:

B = (1/K) · Σₖ (dₖ − d̄)², dₖ = √((xₖ − x′ₖ)² + (yₖ − y′ₖ)²)

wherein B is the variance of the distances, (xₖ, yₖ) is the k-th point on the blade edge, (x′ₖ, y′ₖ) is the corresponding point on the blade inscribed-ellipse circumference, d̄ is the mean of the distances dₖ, and K is the number of edge points;
taking the major axis of the blade inscribed ellipse as the symmetry axis, dividing the blade edge into a left curve and a right curve, flipping one curve about the symmetry axis, obtaining the dynamic time warping distance of the two curves with the dynamic time warping algorithm, and then obtaining the blade symmetry degree value:

C = 1 / (1 + L)

wherein C is the blade symmetry degree value and L is the dynamic time warping distance between the left and right curves of the blade edge;
and obtaining a leaf withering degree value according to the variance and the blade symmetry degree value:

Z = B / C

wherein Z is the leaf withering degree value;
respectively obtaining the gradients of the blade area along the x axis and the y axis with the Sobel operator, and weighting and summing the two-axis gradients to obtain the depth gradient of the whole blade area:

G = w₁·Gₓ + w₂·G_y

wherein G is the depth gradient of the whole blade image, Gₓ and G_y are the blade image depth gradients computed by the Sobel operator in the x-axis and y-axis directions, and w₁, w₂ are the weights;
uniformly selecting a certain number of sampling points on the blade area to obtain the depth gradient of each sampling point, arranging the depth gradients of each sampling point into a matrix according to the coordinate position of each sampling point to obtain a sampling point depth gradient matrix, and quantizing the sampling point depth gradient matrix to obtain a sampling point depth gradient quantization map;
according to the method for acquiring a gray-level co-occurrence matrix, obtaining the depth gradient co-occurrence matrix of each sampling point in the sampling-point depth gradient quantization map, and calculating the entropy value of each sampling point's depth gradient co-occurrence matrix:

Eₛ = − Σᵤ Σᵥ P(u, v) · log₂ P(u, v)

wherein Eₛ is the entropy value of the depth gradient co-occurrence matrix of sampling point s, P(u, v) is the probability of occurrence of the depth gradient point pair (u, v) in that co-occurrence matrix, and (u, v) are coordinates in the co-occurrence matrix;
then averaging the entropy values of the depth gradient co-occurrence matrices of all sampling points to obtain the blade surface curl degree:

E = (1/(h·w)) · Σₛ Eₛ

wherein E is the blade surface curl degree, and h and w respectively represent the length and width of the sampling-point depth gradient quantization map;
shape water shortage of the blade:
7. The intelligent drip irrigation method based on artificial intelligence according to claim 1, wherein the method for determining the area of each blade comprises the following steps:
and performing target detection on the real-time crop color image by adopting a target detection method based on a deep learning neural network model, separating the foreground from the background by utilizing the height difference between the foreground blade and the background blade determined by the depth camera, and obtaining a single blade area of the foreground to obtain a complete blade area.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211167732.0A CN115240126B (en) | 2022-09-23 | 2022-09-23 | Intelligent drip irrigation method based on artificial intelligence |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115240126A true CN115240126A (en) | 2022-10-25 |
CN115240126B CN115240126B (en) | 2022-12-20 |
Family
ID=83667471
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211167732.0A Active CN115240126B (en) | 2022-09-23 | 2022-09-23 | Intelligent drip irrigation method based on artificial intelligence |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115240126B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102564593A (en) * | 2011-12-30 | 2012-07-11 | 河海大学常州校区 | Plant growth condition monitoring system based on compute vision and internet of things |
CN114429592A (en) * | 2021-12-30 | 2022-05-03 | 山东浪潮工业互联网产业股份有限公司 | Automatic irrigation method and equipment based on artificial intelligence |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||