CN117789067A - Unmanned aerial vehicle crop monitoring method and system based on machine learning - Google Patents

Unmanned aerial vehicle crop monitoring method and system based on machine learning

Info

Publication number
CN117789067A
Authority
CN
China
Prior art keywords
distance
historical
aerial vehicle
unmanned aerial
crop
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202410215219.7A
Other languages
Chinese (zh)
Other versions
CN117789067B (en)
Inventor
王吉林
张立超
王军
游忠川
张枨枨
管竟尧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Byte Information Technology Co ltd
Original Assignee
Shandong Byte Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Byte Information Technology Co ltd filed Critical Shandong Byte Information Technology Co ltd
Priority to CN202410215219.7A priority Critical patent/CN117789067B/en
Publication of CN117789067A publication Critical patent/CN117789067A/en
Application granted granted Critical
Publication of CN117789067B publication Critical patent/CN117789067B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Catching Or Destruction (AREA)

Abstract

The invention relates to the technical field of crop monitoring, and discloses a machine learning-based unmanned aerial vehicle crop monitoring method and system.

Description

Unmanned aerial vehicle crop monitoring method and system based on machine learning
Technical Field
The invention relates to the technical field of crop monitoring, in particular to an unmanned aerial vehicle crop monitoring method and system based on machine learning.
Background
With the advent of the information age, the construction of a modern agricultural development system has become a focus of attention in China's agricultural field. A large number of practices have proved that applying unmanned aerial vehicle remote sensing technology to agricultural production enables rapid, accurate and comprehensive monitoring and management of each growth stage of crops. The unmanned aerial vehicle can rapidly acquire crop growth information, so that conditions such as crop pigments, diseases and insect pests, and growth vigour can be accurately monitored. In addition, the camera carried by the unmanned aerial vehicle has a high resolution, which ensures the accuracy of the data, and relevant ground object information can be effectively extracted through information processing technology. In general, the unmanned aerial vehicle first acquires the height information and images of the crops, and then uses these data to realize accurate monitoring of the crops. The application of this technology brings great convenience and potential to the agricultural field;
For example, patent application publication No. CN113989689A discloses a method and a system for identifying crop diseases and insect pests based on an unmanned aerial vehicle. That patent processes crop pictures shot by the unmanned aerial vehicle: according to the vein structure of the crop leaves, it first identifies the leaves, then inputs the identified leaves into a convolutional neural network or deep learning model to detect crop disease and insect pest information, thereby realizing unmanned-aerial-vehicle-based identification of crop diseases and insect pests. Patent application publication No. CN106643529A discloses a method for rapidly measuring the growth height of crops in mountain areas based on unmanned aerial vehicle images; that patent monitors crops by obtaining the growth height of the crops at each stage. Although the above patents realize monitoring of crops to a certain extent, the following problems still exist:
although the insect damage of the crops is identified through the blades and the deep learning model of the crops, the efficiency of monitoring is not high due to the fact that the area of the crops is large, the work load is definitely increased by the fact that the images are sequentially identified, the monitoring of the crops is achieved through the rapid measurement of the growth heights of the crops in spite of the fact that the prior art exists, the problem that inaccuracy occurs in the monitoring of the crops only through the growth heights is solved, the effect of insect damage on the growth heights of the crops is achieved, the lodging phenomenon occurs in the crops due to weather causes, the error occurs in the detection of the growth heights of the crops, and therefore misjudgment occurs in the growth states of the crops in the area.
In view of the above, the present invention provides an unmanned aerial vehicle crop monitoring method and system based on machine learning to solve the above problems.
Disclosure of Invention
In order to overcome the defects in the prior art, the invention provides an unmanned aerial vehicle crop monitoring method and system based on machine learning.
In order to achieve the above purpose, the present invention provides the following technical solutions:
an unmanned aerial vehicle crop monitoring method based on machine learning, comprising:
s10: acquiring a first distance of an ith area according to a sequence of dividing sequence numbers, and judging whether the first distance is in a standard interval or not, wherein the first distance is the distance between the unmanned aerial vehicle and crops;
s20: if the first distance is not in the standard interval, acquiring a crop image, and acquiring a recognition result based on the crop image and a pre-constructed pest recognition model, wherein the recognition result comprises the existence of pests and the absence of pests;
s30: if there is a pest, marking the ith area as a first area, if there is no pest, marking the ith area as a second area, letting i=i+1, and returning to S10;
s40: repeating the steps S10 to S30 until i=k, ending the cycle, obtaining a second area number, and determining whether the lodging phenomenon exists based on the second area number.
Further, the method for generating the standard interval comprises the following steps:
acquiring the growth time and plant types of crops, and inputting the growth time and the plant types into a pre-constructed height prediction model to obtain a first height interval output by the height prediction model;
obtaining a second height interval from the database based on the growth time of the crop and the plant category, and generating a third height interval according to the first height interval and the second height interval;
and acquiring a second distance, and generating a standard interval based on the third height interval and the second distance, wherein the second distance is the distance from the unmanned aerial vehicle to the ground.
Further, the method of generating a third altitude interval from the first altitude interval and the second altitude interval comprises:
an intersection of the first height interval and the second height interval is taken to form the third height interval.
Further, the method for constructing the height prediction model comprises the following steps:
A sample data set is obtained, wherein the sample data set comprises historical growth times, historical plant categories and historical standard intervals; the sample data set is divided into a sample training set and a sample test set; a regression network is constructed, the historical growth times and the historical plant categories in the sample training set are taken as input data of the regression network, and the historical standard intervals in the sample training set are taken as output data of the regression network; the regression network is trained to obtain an initial regression network for predicting the real-time standard interval; the initial regression network is tested by using the sample test set, and the regression network whose error is smaller than a preset error value is output as the height prediction model.
Further, the construction method of the pest identification model comprises the following steps:
h groups of data are acquired, wherein h is a positive integer greater than 1 and the data comprise historical crop images and historical recognition results; the historical crop images and the historical recognition results are used as a sample set, and the sample set is divided into a training set and a test set; a classifier is constructed, the historical crop images in the training set are used as input data, and the historical recognition results in the training set are used as output data; the classifier is trained to obtain an initial classifier; the test set is used to test the initial classifier, and the classifier meeting the preset accuracy is output as the pest identification model.
Further, the method for determining whether the lodging phenomenon exists based on the second region number comprises the following steps:
judging whether the number of the second areas is smaller than a preset area number threshold value; if so, no lodging phenomenon has occurred, and if not, the lodging phenomenon has occurred.
Further, the method for constructing the distance compensation model comprises the following steps:
A sample data set is obtained, wherein the sample data set comprises historical wind information, historical plant categories, historical first distances and historical second distances; the sample data set is divided into a sample training set and a sample test set; a regression network is constructed, the historical wind information, the historical plant categories and the historical first distances in the sample training set are taken as input data of the regression network, and the historical second distances in the sample training set are taken as output data of the regression network; the regression network is trained to obtain an initial regression network for predicting the real-time second distance; the initial regression network is tested by using the sample test set, and the regression network whose error is smaller than a preset error value is output as the distance compensation model.
An unmanned aerial vehicle crop monitoring system based on machine learning, used for realizing the above unmanned aerial vehicle crop monitoring method based on machine learning, comprising:
distance acquisition module: acquiring a first distance of an ith area according to a sequence of dividing sequence numbers, and judging whether the first distance is in a standard interval or not, wherein the first distance is the distance between the unmanned aerial vehicle and crops;
pest identification module: if the first distance is not in the standard interval, acquiring a crop image, and acquiring a recognition result based on the crop image and a pre-constructed pest recognition model, wherein the recognition result comprises the existence of pests and the absence of pests;
region marking module: if the insect pest exists, marking the ith area as a first area, if the insect pest does not exist, marking the ith area as a second area, enabling i=i+1, and returning to the distance acquisition module;
the quantity determining module: repeating the distance acquisition module to the region marking module until i=k, ending the cycle to obtain a second region number, and determining whether a lodging phenomenon exists based on the second region number.
An electronic device comprises a power supply, an interface, a keyboard, a memory, a central processing unit and a computer program stored on the memory and capable of running on the central processing unit, wherein the central processing unit implements the above unmanned aerial vehicle crop monitoring method based on machine learning when executing the computer program; the interface comprises a network interface and a data interface, the network interface comprises a wired or wireless interface, and the data interface comprises an input or output interface.
A computer readable storage medium having stored thereon a computer program which when executed implements a machine learning based unmanned aerial vehicle crop monitoring method of any of the above.
Compared with the prior art, the invention has the beneficial effects that:
According to the method, the distance from the unmanned aerial vehicle to the crops is obtained first and a judgment is made according to this first distance, which shortens the time spent monitoring the crop areas while still monitoring whether insect damage occurs in them; the abnormal areas are then marked as first areas or second areas according to the identification result; finally, whether the crops have lodged is further judged according to the number of second areas. If lodging has occurred, no further measures are required for the second areas and only the first areas need to be treated, which improves the accuracy and efficiency of treatment; if lodging has not occurred, both the first areas and the second areas need to be treated.
Drawings
FIG. 1 is a schematic diagram of a machine learning-based unmanned aerial vehicle crop monitoring method in the present invention;
FIG. 2 is a schematic view of the crop area division in the present invention;
FIG. 3 is a schematic diagram of a computer readable storage medium according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Embodiment 1: referring to fig. 1, this embodiment provides an unmanned aerial vehicle crop monitoring method based on machine learning, which includes:
s10: acquiring a first distance of an ith area according to a sequence of dividing sequence numbers, and judging whether the first distance is in a standard interval or not, wherein the first distance is the distance between the unmanned aerial vehicle and crops;
Specifically, as shown in fig. 2, the crop area is taken as a square by way of example and is divided into k areas; in this embodiment k is 8. The unmanned aerial vehicle acquires the first distance of the ith area in the order of the division sequence numbers, that is, it starts from the area with sequence number 1 and traverses the whole crop area so as to monitor all of it. An infrared sensor is usually installed on the unmanned aerial vehicle and emits infrared rays downwards to obtain the first distance, which refers to the distance between the unmanned aerial vehicle and the vegetation. The unmanned aerial vehicle usually flies at a constant height, so the shorter the first distance, the higher the growth height of the crops;
the generation method of the standard interval comprises the following steps:
acquiring the growth time and plant types of crops, and inputting the growth time and the plant types into a pre-constructed height prediction model to obtain a first height interval output by the height prediction model;
obtaining a second height interval from the database based on the growth time of the crop and the plant category, and generating a third height interval according to the first height interval and the second height interval;
acquiring a second distance, and generating a standard interval based on a third height interval and the second distance, wherein the second distance is the distance from the unmanned aerial vehicle to the ground;
it will be appreciated that the first height interval is obtained experimentally by a person skilled in the art, that is to say, the crop height is measured continuously in experiments to form experimental data for training the model, while the second height interval is obtained by connecting the background to the internet, for example from reference books and agricultural manuals on plant growth;
the method for generating the third height interval according to the first height interval and the second height interval comprises the following steps:
taking an intersection of the first height interval and the second height interval to form the third height interval;
it should be noted that, although the first height interval is obtained experimentally by a person skilled in the art, the experiment may contain errors and contingencies; meanwhile, the second height interval is obtained from the database, and such data may be too standardized to conform completely to the actual situation. Therefore, the first height interval and the second height interval are combined to form the third height interval, whose range is more accurate than either of the two: errors, contingencies and similar factors are eliminated, and the situation in which over-standardized data fails to conform to the actual situation is avoided;
the method for generating the standard interval based on the third altitude interval and the second distance comprises the following steps:
obtaining a minimum boundary value and a maximum boundary value of the third height interval, taking the difference value between the second distance and the minimum boundary value as the maximum value of the standard interval, and taking the difference value between the second distance and the maximum boundary value as the minimum value of the standard interval;
it will be appreciated that the second distance refers to the distance of the drone to the ground, and the third height interval refers to the height interval where the crop grows, so the distance of the drone to the plant needs to be obtained by subtracting the growth height of the crop from the second distance;
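As an illustration of the interval arithmetic just described, the following minimal Python sketch intersects the two height intervals and subtracts the bounds of the result from the second distance. The function names, the metre units and the example numbers are assumptions introduced for illustration only; they do not come from the patent.

```python
# Hypothetical sketch of the standard-interval generation described above.
def intersect(interval_a, interval_b):
    """Intersection of two closed intervals given as (low, high) tuples."""
    low = max(interval_a[0], interval_b[0])
    high = min(interval_a[1], interval_b[1])
    if low > high:
        raise ValueError("height intervals do not overlap")
    return (low, high)


def standard_interval(first_height, second_height, second_distance):
    """Standard interval for the drone-to-crop distance (first distance).

    first_height:    interval predicted by the height prediction model
    second_height:   interval looked up in the database
    second_distance: drone-to-ground distance (constant flight height assumed)
    """
    third_low, third_high = intersect(first_height, second_height)  # third height interval
    # A taller crop means a shorter drone-to-crop distance, so the bounds swap:
    # maximum = second distance - minimum crop height, minimum = second distance - maximum crop height.
    return (second_distance - third_high, second_distance - third_low)


# Example: model predicts 0.8-1.2 m, database gives 0.9-1.3 m, drone flies at 10 m.
print(standard_interval((0.8, 1.2), (0.9, 1.3), 10.0))  # approximately (8.8, 9.1)
```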
the construction method of the height prediction model comprises the following steps:
A sample data set is obtained, wherein the sample data set comprises historical growth times, historical plant categories and historical standard intervals; the sample data set is divided into a sample training set and a sample test set; a regression network is constructed, the historical growth times and the historical plant categories in the sample training set are taken as input data of the regression network, and the historical standard intervals in the sample training set are taken as output data of the regression network; the regression network is trained to obtain an initial regression network for predicting the real-time standard interval; the initial regression network is tested by using the sample test set, and the regression network whose error is smaller than a preset error value is output as the height prediction model, the regression network preferably being a neural network model;
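The paragraph above only fixes the inputs, outputs and acceptance criterion of the height prediction model; the sketch below shows one possible realization of that training procedure using scikit-learn's MLPRegressor as the regression network. The integer plant-category coding, the sample values and the error check are assumptions made for the example, not the patent's concrete implementation.

```python
# Minimal sketch of training a regression network for the height prediction model.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Hypothetical historical samples: growth time (days) and an integer plant-category code.
plant_codes = {"rice": 0, "wheat": 1, "maize": 2}
X = np.array([
    [30, plant_codes["rice"]],  [45, plant_codes["rice"]],
    [60, plant_codes["wheat"]], [75, plant_codes["wheat"]],
    [90, plant_codes["maize"]], [105, plant_codes["maize"]],
], dtype=float)
# Hypothetical historical standard intervals as (minimum, maximum) distances in metres.
y = np.array([[9.5, 9.7], [9.2, 9.5], [9.3, 9.6], [9.0, 9.4], [8.6, 9.0], [8.0, 8.5]])

# Divide the sample data set into a sample training set and a sample test set.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)

regression_network = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
regression_network.fit(X_train, y_train)   # initial regression network

# The network is output as the height prediction model only if its test error
# stays below a preset error value, as required above.
preset_error = 0.5
test_error = np.abs(regression_network.predict(X_test) - y_test).mean()
print("mean absolute test error (m):", test_error, "accepted:", test_error < preset_error)
```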
s20: if the first distance is not in the standard interval, acquiring a crop image, and acquiring a recognition result based on the crop image and a pre-constructed pest recognition model, wherein the recognition result comprises the existence of pests and the absence of pests;
It should be noted that if the first distance is not in the standard interval, this indicates that there is a problem with the growth state of the crops, and insect damage greatly affects the growth state, so a target image needs to be acquired, where the target image refers to an image of the crop surface. If the first distance is in the standard interval, the growth state of the crops is not problematic and the unmanned aerial vehicle moves on to the next area according to the division sequence numbers. In this way the unmanned aerial vehicle does not acquire and analyse a crop surface image in every area; instead, it first acquires the distance from the unmanned aerial vehicle to the crops and judges according to this first distance, which not only shortens the time spent monitoring the crop area but also monitors whether insect damage occurs in the crop area;
the construction method of the insect pest identification model comprises the following steps:
h groups of data are acquired, wherein h is a positive integer greater than 1 and the data comprise historical crop images and historical recognition results; the historical crop images and the historical recognition results are used as a sample set, and the sample set is divided into a training set and a test set; a classifier is constructed, the historical crop images in the training set are used as input data, and the historical recognition results in the training set are used as output data; the classifier is trained to obtain an initial classifier; the test set is used to test the initial classifier, and the classifier meeting the preset accuracy is output as the pest identification model, the classifier preferably being one of a naive Bayesian model or a support vector machine model;
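As a rough illustration of the classifier training described above, the sketch below uses a support vector machine, one of the two preferred classifiers named, on stand-in feature vectors. The random feature vectors replace real crop-image features and all numbers are assumptions for the example; in practice the historical crop images would first be converted to feature vectors by an image-processing step the patent does not specify.

```python
# Minimal sketch of training the pest identification classifier.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
h = 200                                   # h groups of data, h > 1
features = rng.random((h, 32 * 32))       # stand-in for features of historical crop images
labels = rng.integers(0, 2, size=h)       # 1 = pest present, 0 = no pest (historical results)

# Divide the sample set into a training set and a test set.
X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.25, random_state=0)

classifier = SVC(kernel="rbf")
classifier.fit(X_train, y_train)          # initial classifier

# Output the classifier as the pest identification model only if it meets the preset accuracy.
preset_accuracy = 0.9
accuracy = classifier.score(X_test, y_test)
print("test accuracy:", accuracy, "accepted:", accuracy >= preset_accuracy)
```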
s30: if there is a pest, marking the ith area as a first area, if there is no pest, marking the ith area as a second area, letting i=i+1, and returning to S10;
It is noted that if there is a pest, the ith area is marked as a first area. As shown in fig. 2, fta denotes a first area and sda denotes a second area: the areas with sequence numbers 3 and 5 in the drawing are marked as first areas, the areas with sequence numbers 2 and 6 are marked as second areas, and the remaining areas with sequence numbers 1, 4, 7 and 8 are normal areas. A normal area is one in which the first distance obtained by the unmanned aerial vehicle is in the standard interval; a first area is one in which the first distance is not in the standard interval and the crops suffer from insect damage; a second area is one in which the first distance is not in the standard interval but the crops do not suffer from insect damage, so the crops in a second area have, with high probability, lodged, which is why the first distance obtained over that area is not in the standard interval;
s40: repeating the steps S10 to S30 until i=k, ending the cycle to obtain a second area number, and determining whether a lodging phenomenon exists or not based on the second area number;
As stated above, a second area is one in which the first distance obtained by the unmanned aerial vehicle is not in the standard interval but the crops show no insect damage, so the crops in a second area have lodged with high probability; however, it is difficult to judge from a single area whether the crops have lodged;
The invention judges whether crops have lodged from the fact that the first distance is not in the standard interval while the crops in the area show no insect damage; however, apart from lodging, other factors can also cause the first distance to fall outside the standard interval, for example an area planted too early or too late, and such areas account for only a small proportion. Therefore, whether the crop area is abnormal over a large area is judged by the number of second areas; if a large area is abnormal, lodging has occurred with high probability, which further improves the accuracy of judging the state of the crops;
the method for determining whether the lodging phenomenon exists based on the second area number comprises the following steps:
judging whether the number of the second areas is smaller than a preset area number threshold value; if so, no lodging phenomenon has occurred, and if not, the lodging phenomenon has occurred;
In this embodiment, the distance from the unmanned aerial vehicle to the crops is obtained first and judged against the standard interval, which shortens the time spent monitoring the crop area while still monitoring whether insect damage occurs in it; the abnormal areas are marked as first areas or second areas according to the identification result; finally, whether the crops have lodged is further judged according to the number of second areas. If lodging has occurred, no further measures are required for the second areas and only the first areas need to be treated, which improves the accuracy and efficiency of treatment; if lodging has not occurred, both the first areas and the second areas need to be treated.
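Putting steps S10 to S40 together, the following condensed sketch shows one possible control flow for the monitoring loop of this embodiment. measure_first_distance, capture_image and pest_model are hypothetical stand-ins for the drone's ranging sensor, its camera and the pre-built pest identification model; they are assumptions introduced only so the sketch is self-contained.

```python
# Condensed sketch of the S10-S40 monitoring loop under the assumptions stated above.
def monitor_crop_regions(k, standard_interval, area_threshold,
                         measure_first_distance, capture_image, pest_model):
    low, high = standard_interval
    first_areas, second_areas = [], []

    for i in range(1, k + 1):                      # S10: traverse areas in division order
        first_distance = measure_first_distance(i)
        if low <= first_distance <= high:          # growth height normal, move to next area
            continue
        image = capture_image(i)                   # S20: photograph only abnormal areas
        if pest_model(image):                      # S30: mark according to identification result
            first_areas.append(i)
        else:
            second_areas.append(i)

    # S40: many pest-free abnormal areas indicate large-area lodging with high probability
    lodging = len(second_areas) >= area_threshold
    return first_areas, second_areas, lodging
```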
Embodiment 2: on the basis of embodiment 1, this embodiment further provides an unmanned aerial vehicle crop monitoring method based on machine learning, including:
s50: if the lodging phenomenon exists, acquiring wind information of the previous n days, and inputting the wind information, the plant category and the first distance into a pre-constructed distance compensation model to acquire a second distance output by the distance compensation model;
It should be noted that embodiment 1 determines whether lodging has occurred according to the number of second areas. On this basis, if lodging has occurred, a second distance is required, where the second distance refers to the distance between the unmanned aerial vehicle and the crops in the normal, non-lodged state. It can be understood that crops generally lodge because they are blown down by rainstorms or strong winds, and the higher the wind speed, the more serious the influence; likewise, the plant category influences the strength of the crop stalks, for example, there is a significant difference in strength between sugarcane stalks and rice stalks. The wind information may be the wind strength, and the wind strength of the previous n days may be obtained directly through the networked background;
the construction method of the distance compensation model comprises the following steps:
A sample data set is obtained, wherein the sample data set comprises historical wind information, historical plant categories, historical first distances and historical second distances; the sample data set is divided into a sample training set and a sample test set; a regression network is constructed, the historical wind information, the historical plant categories and the historical first distances in the sample training set are taken as input data of the regression network, and the historical second distances in the sample training set are taken as output data of the regression network; the regression network is trained to obtain an initial regression network for predicting the real-time second distance; the initial regression network is tested by using the sample test set, and the regression network whose error is smaller than a preset error value is output as the distance compensation model, the regression network preferably being a neural network model;
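Analogously to the height prediction model, the distance compensation model can be sketched as a small regression network. The wind strengths, plant-category codes and distances below are invented purely for illustration, and scikit-learn's MLPRegressor again stands in for the neural network that the text prefers; none of these concrete values come from the patent.

```python
# Minimal sketch of training a regression network for the distance compensation model.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Hypothetical history: wind strength (m/s), plant-category code, measured first distance (m).
X = np.array([[12.0, 0, 9.6], [15.0, 0, 9.9], [8.0, 1, 9.0],
              [14.0, 1, 9.7], [6.0, 2, 8.4], [13.0, 2, 9.2]])
# Hypothetical historical second distances: drone-to-crop distance in the non-lodged state (m).
y = np.array([8.9, 8.8, 8.9, 8.7, 8.3, 8.5])

# Divide the sample data set into a sample training set and a sample test set.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)

regression_network = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
regression_network.fit(X_train, y_train)   # initial regression network

# Accept the network as the distance compensation model only if the test error
# is below the preset error value.
preset_error = 0.5
test_error = np.abs(regression_network.predict(X_test) - y_test).mean()
print("mean absolute test error (m):", test_error, "accepted:", test_error < preset_error)
```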
In this embodiment, the second distance is further obtained from the wind information, the plant category and the first distance; since the second distance refers to the distance between the unmanned aerial vehicle and the crops in the normal state, the growth state of the crops is then judged according to the second distance and the standard interval, which eliminates the interference of crop lodging with the monitoring by the unmanned aerial vehicle and further improves the monitoring accuracy for the crops in that area.
Embodiment 3: on the basis of embodiment 1, this embodiment provides an unmanned aerial vehicle crop monitoring system based on machine learning, comprising:
distance acquisition module: acquiring a first distance of an ith area according to a sequence of dividing sequence numbers, and judging whether the first distance is in a standard interval or not, wherein the first distance is the distance between the unmanned aerial vehicle and crops;
Specifically, as shown in fig. 2, the crop area is taken as a square by way of example and is divided into k areas; in this embodiment k is 8. The unmanned aerial vehicle acquires the first distance of the ith area in the order of the division sequence numbers, that is, it starts from the area with sequence number 1 and traverses the whole crop area so as to monitor all of it. An infrared sensor is usually installed on the unmanned aerial vehicle and emits infrared rays downwards to obtain the first distance, which refers to the distance between the unmanned aerial vehicle and the vegetation. The unmanned aerial vehicle usually flies at a constant height, so the shorter the first distance, the higher the growth height of the crops;
the generation method of the standard interval comprises the following steps:
acquiring the growth time and plant types of crops, and inputting the growth time and the plant types into a pre-constructed height prediction model to obtain a first height interval output by the height prediction model;
obtaining a second height interval from the database based on the growth time of the crop and the plant category, and generating a third height interval according to the first height interval and the second height interval;
acquiring a second distance, and generating a standard interval based on a third height interval and the second distance, wherein the second distance is the distance from the unmanned aerial vehicle to the ground;
it will be appreciated that the first height interval is obtained experimentally by a person skilled in the art, that is to say, the crop height is measured continuously in experiments to form experimental data for training the model, while the second height interval is obtained by connecting the background to the internet, for example from reference books and agricultural manuals on plant growth;
the method for generating the third height interval according to the first height interval and the second height interval comprises the following steps:
taking an intersection of the first height interval and the second height interval to form the third height interval;
it should be noted that, although the first height interval is obtained experimentally by a person skilled in the art, the experiment may contain errors and contingencies; meanwhile, the second height interval is obtained from the database, and such data may be too standardized to conform completely to the actual situation. Therefore, the first height interval and the second height interval are combined to form the third height interval, whose range is more accurate than either of the two: errors, contingencies and similar factors are eliminated, and the situation in which over-standardized data fails to conform to the actual situation is avoided;
the method for generating the standard interval based on the third altitude interval and the second distance comprises the following steps:
obtaining a minimum boundary value and a maximum boundary value of the third height interval, taking the difference value between the second distance and the minimum boundary value as the maximum value of the standard interval, and taking the difference value between the second distance and the maximum boundary value as the minimum value of the standard interval;
it will be appreciated that the second distance refers to the distance of the drone to the ground, and the third height interval refers to the height interval where the crop grows, so the distance of the drone to the plant needs to be obtained by subtracting the growth height of the crop from the second distance;
the construction method of the height prediction model comprises the following steps:
A sample data set is obtained, wherein the sample data set comprises historical growth times, historical plant categories and historical standard intervals; the sample data set is divided into a sample training set and a sample test set; a regression network is constructed, the historical growth times and the historical plant categories in the sample training set are taken as input data of the regression network, and the historical standard intervals in the sample training set are taken as output data of the regression network; the regression network is trained to obtain an initial regression network for predicting the real-time standard interval; the initial regression network is tested by using the sample test set, and the regression network whose error is smaller than a preset error value is output as the height prediction model, the regression network preferably being a neural network model;
pest identification module: if the first distance is not in the standard interval, acquiring a crop image, and acquiring a recognition result based on the crop image and a pre-constructed pest recognition model, wherein the recognition result comprises the existence of pests and the absence of pests;
It should be noted that if the first distance is not in the standard interval, this indicates that there is a problem with the growth state of the crops, and insect damage greatly affects the growth state, so a target image needs to be acquired, where the target image refers to an image of the crop surface. If the first distance is in the standard interval, the growth state of the crops is not problematic and the unmanned aerial vehicle moves on to the next area according to the division sequence numbers. In this way the unmanned aerial vehicle does not acquire and analyse a crop surface image in every area; instead, it first acquires the distance from the unmanned aerial vehicle to the crops and judges according to this first distance, which not only shortens the time spent monitoring the crop area but also monitors whether insect damage occurs in the crop area;
the construction method of the insect pest identification model comprises the following steps:
h groups of data are acquired, wherein h is a positive integer greater than 1 and the data comprise historical crop images and historical recognition results; the historical crop images and the historical recognition results are used as a sample set, and the sample set is divided into a training set and a test set; a classifier is constructed, the historical crop images in the training set are used as input data, and the historical recognition results in the training set are used as output data; the classifier is trained to obtain an initial classifier; the test set is used to test the initial classifier, and the classifier meeting the preset accuracy is output as the pest identification model, the classifier preferably being one of a naive Bayesian model or a support vector machine model;
region marking module: if the insect pest exists, marking the ith area as a first area, if the insect pest does not exist, marking the ith area as a second area, enabling i=i+1, and returning to the distance acquisition module;
It is noted that if there is a pest, the ith area is marked as a first area. As shown in fig. 2, fta denotes a first area and sda denotes a second area: the areas with sequence numbers 3 and 5 in the drawing are marked as first areas, the areas with sequence numbers 2 and 6 are marked as second areas, and the remaining areas with sequence numbers 1, 4, 7 and 8 are normal areas. A normal area is one in which the first distance obtained by the unmanned aerial vehicle is in the standard interval; a first area is one in which the first distance is not in the standard interval and the crops suffer from insect damage; a second area is one in which the first distance is not in the standard interval but the crops do not suffer from insect damage, so the crops in a second area have, with high probability, lodged, which is why the first distance obtained over that area is not in the standard interval;
the quantity determining module: repeating the distance acquisition module to the region marking module until i=k, ending the cycle to obtain a second region number, and determining whether a lodging phenomenon exists based on the second region number;
As stated above, a second area is one in which the first distance obtained by the unmanned aerial vehicle is not in the standard interval but the crops show no insect damage, so the crops in a second area have lodged with high probability; however, it is difficult to judge from a single area whether the crops have lodged;
The invention judges whether crops have lodged from the fact that the first distance is not in the standard interval while the crops in the area show no insect damage; however, apart from lodging, other factors can also cause the first distance to fall outside the standard interval, for example an area planted too early or too late, and such areas account for only a small proportion. Therefore, whether the crop area is abnormal over a large area is judged by the number of second areas; if a large area is abnormal, lodging has occurred with high probability, which further improves the accuracy of judging the state of the crops;
the method for determining whether the lodging phenomenon exists based on the second area number comprises the following steps:
judging whether the number of the second areas is smaller than a preset area number threshold value; if so, no lodging phenomenon has occurred, and if not, the lodging phenomenon has occurred;
In this embodiment, the distance from the unmanned aerial vehicle to the crops is obtained first and judged against the standard interval, which shortens the time spent monitoring the crop area while still monitoring whether insect damage occurs in it; the abnormal areas are marked as first areas or second areas according to the identification result; finally, whether the crops have lodged is further judged according to the number of second areas. If lodging has occurred, no further measures are required for the second areas and only the first areas need to be treated, which improves the accuracy and efficiency of treatment; if lodging has not occurred, both the first areas and the second areas need to be treated.
Embodiment 4: this embodiment provides an electronic device including a power supply, an interface, a keyboard, a memory, a central processing unit, and a computer program stored on the memory and capable of running on the central processing unit, wherein the central processing unit implements any one of the above methods when executing the computer program; the interface comprises a network interface and a data interface, the network interface comprises a wired or wireless interface, and the data interface comprises an input or output interface.
Since the electronic device described in this embodiment is the electronic device used to implement the machine learning-based unmanned aerial vehicle crop monitoring method of this application, those skilled in the art can, based on the method described herein, understand the specific implementation of the electronic device and its various modifications, so how the electronic device implements the method of the embodiments of the present application is not described in detail here. Any electronic device used by those skilled in the art to implement the machine learning-based unmanned aerial vehicle crop monitoring method of the embodiments of the present application falls within the scope of protection intended by the present application.
Embodiment 5: as shown in fig. 3, this embodiment provides a computer readable storage medium on which a computer program is stored, and the computer program, when executed, implements the above-mentioned unmanned aerial vehicle crop monitoring method based on machine learning.
The above formulas are all dimensionless, numerically evaluated forms; the formulas are obtained by software simulation from a large amount of collected data so as to approximate the real situation as closely as possible, and the preset parameters, weights and threshold selections in the formulas are set by those skilled in the art according to the actual situation.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any other combination. When implemented in software, the above-described embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product comprises one or more computer instructions or computer programs. When the computer instructions or computer program are loaded or executed on a computer, the processes or functions described in accordance with embodiments of the present invention are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be transmitted from one website site, computer, server, or data center to another website site, computer, server, or data center over a wired network or a wireless network. The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains one or more sets of available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium. The semiconductor medium may be a solid state disk.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; the division of the units is merely a logical functional division, and there may be other divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed between the parts may be an indirect coupling or communication connection via some interfaces, devices or units, and may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Finally: the foregoing description of the preferred embodiments of the invention is not intended to limit the invention to the precise form disclosed, and any such modifications, equivalents, and alternatives falling within the spirit and principles of the invention are intended to be included within the scope of the invention.

Claims (10)

1. A machine learning-based unmanned aerial vehicle crop monitoring method, characterized by comprising the following steps:
s10: acquiring a first distance of an ith area according to a sequence of dividing sequence numbers, and judging whether the first distance is in a standard interval or not, wherein the first distance is the distance between the unmanned aerial vehicle and crops;
s20: if the first distance is not in the standard interval, acquiring a crop image, and acquiring a recognition result based on the crop image and a pre-constructed pest recognition model, wherein the recognition result comprises the existence of pests and the absence of pests;
s30: if there is a pest, marking the ith area as a first area, if there is no pest, marking the ith area as a second area, letting i=i+1, and returning to S10;
s40: repeating the steps S10 to S30 until i=k, ending the cycle, obtaining a second area number, and determining whether the lodging phenomenon exists based on the second area number.
2. The unmanned aerial vehicle crop monitoring method based on machine learning of claim 1, wherein the standard interval generating method comprises:
acquiring the growth time and plant types of crops, and inputting the growth time and the plant types into a pre-constructed height prediction model to obtain a first height interval output by the height prediction model;
obtaining a second height interval from the database based on the growth time of the crop and the plant category, and generating a third height interval according to the first height interval and the second height interval;
and acquiring a second distance, and generating a standard interval based on the third height interval and the second distance, wherein the second distance is the distance from the unmanned aerial vehicle to the ground.
3. The machine learning based unmanned aerial vehicle crop monitoring method of claim 2, wherein the method of generating a third altitude interval from the first altitude interval and the second altitude interval comprises:
an intersection of the first height interval and the second height interval is taken to form the third height interval.
4. The unmanned aerial vehicle crop monitoring method based on machine learning of claim 2, wherein the method for constructing the height prediction model comprises the following steps:
obtaining a sample data set, wherein the sample data set comprises historical growth times, historical plant categories and historical standard intervals; dividing the sample data set into a sample training set and a sample test set; constructing a regression network, taking the historical growth times and the historical plant categories in the sample training set as input data of the regression network, and taking the historical standard intervals in the sample training set as output data of the regression network; training the regression network to obtain an initial regression network for predicting the real-time standard interval; and testing the initial regression network by using the sample test set, and outputting the regression network whose error is smaller than a preset error value as the height prediction model.
5. The unmanned aerial vehicle crop monitoring method based on machine learning of claim 1, wherein the method for constructing the pest identification model comprises the following steps:
acquiring h groups of data, wherein h is a positive integer greater than 1 and the data comprise historical crop images and historical recognition results; using the historical crop images and the historical recognition results as a sample set, and dividing the sample set into a training set and a test set; constructing a classifier, using the historical crop images in the training set as input data and the historical recognition results in the training set as output data; training the classifier to obtain an initial classifier; and testing the initial classifier by using the test set, and outputting the classifier meeting the preset accuracy as the pest identification model.
6. The machine learning based unmanned aerial vehicle crop monitoring method of claim 1, wherein the method of determining whether a lodging phenomenon exists based on the second number of regions comprises:
judging whether the number of the second areas is smaller than a preset area number threshold value; if so, no lodging phenomenon has occurred, and if not, the lodging phenomenon has occurred.
7. The unmanned aerial vehicle crop monitoring method based on machine learning of claim 1, wherein the method for constructing the distance compensation model comprises the following steps:
obtaining a sample data set, wherein the sample data set comprises historical wind information, historical plant categories, historical first distances and historical second distances; dividing the sample data set into a sample training set and a sample test set; constructing a regression network, taking the historical wind information, the historical plant categories and the historical first distances in the sample training set as input data of the regression network, and taking the historical second distances in the sample training set as output data of the regression network; training the regression network to obtain an initial regression network for predicting the real-time second distance; and testing the initial regression network by using the sample test set, and outputting the regression network whose error is smaller than a preset error value as the distance compensation model.
8. A machine learning based unmanned aerial vehicle crop monitoring system for implementing the machine learning based unmanned aerial vehicle crop monitoring method of any of claims 1-7, comprising:
distance acquisition module: acquiring a first distance of an ith area according to a sequence of dividing sequence numbers, and judging whether the first distance is in a standard interval or not, wherein the first distance is the distance between the unmanned aerial vehicle and crops;
pest identification module: if the first distance is not in the standard interval, acquiring a crop image, and acquiring a recognition result based on the crop image and a pre-constructed pest recognition model, wherein the recognition result comprises the existence of pests and the absence of pests;
region marking module: if the insect pest exists, marking the ith area as a first area, if the insect pest does not exist, marking the ith area as a second area, enabling i=i+1, and returning to the distance acquisition module;
the quantity determining module: repeating the distance acquisition module to the region marking module until i=k, ending the cycle to obtain a second region number, and determining whether a lodging phenomenon exists based on the second region number.
9. An electronic device comprising a power supply, an interface, a keyboard, a memory, a central processing unit and a computer program stored on the memory and executable on the central processing unit, characterized in that the central processing unit implements a machine learning based unmanned aerial vehicle crop monitoring method according to any one of claims 1 to 7 when executing the computer program, the interface comprising a network interface and a data interface, the network interface comprising a wired or wireless interface, the data interface comprising an input or output interface.
10. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program which, when executed, implements a machine learning based unmanned aerial vehicle crop monitoring method according to any of claims 1 to 7.
CN202410215219.7A 2024-02-27 2024-02-27 Unmanned aerial vehicle crop monitoring method and system based on machine learning Active CN117789067B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410215219.7A CN117789067B (en) 2024-02-27 2024-02-27 Unmanned aerial vehicle crop monitoring method and system based on machine learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410215219.7A CN117789067B (en) 2024-02-27 2024-02-27 Unmanned aerial vehicle crop monitoring method and system based on machine learning

Publications (2)

Publication Number Publication Date
CN117789067A (en) 2024-03-29
CN117789067B (en) 2024-05-10

Family

ID=90402168

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410215219.7A Active CN117789067B (en) 2024-02-27 2024-02-27 Unmanned aerial vehicle crop monitoring method and system based on machine learning

Country Status (1)

Country Link
CN (1) CN117789067B (en)

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018058821A1 (en) * 2016-09-30 2018-04-05 深圳前海弘稼科技有限公司 Disease and insect pest forecasting method and apparatus based on planting equipment
CN106971167A (en) * 2017-03-30 2017-07-21 北京兴农丰华科技有限公司 Crop growth analysis method and its analysis system based on unmanned aerial vehicle platform
CN108776106A (en) * 2018-04-23 2018-11-09 中国农业大学 A kind of crop condition monitoring method and system based on unmanned plane low-altitude remote sensing
EP3847610A1 (en) * 2018-09-04 2021-07-14 Viliam Kiss Method of identifying and displaying areas of lodged crops
CN109813286A (en) * 2019-01-28 2019-05-28 中科光启空间信息技术有限公司 A kind of lodging disaster remote sensing damage identification method based on unmanned plane
US20230196117A1 (en) * 2020-08-31 2023-06-22 Huawei Technologies Co., Ltd. Training method for semi-supervised learning model, image processing method, and device
CN112163639A (en) * 2020-10-20 2021-01-01 华南农业大学 Crop lodging classification method based on height distribution characteristic vector
CN112597855A (en) * 2020-12-15 2021-04-02 中国农业大学 Crop lodging degree identification method and device
JP2022181163A (en) * 2021-05-25 2022-12-07 国立研究開発法人農業・食品産業技術総合研究機構 Information processing apparatus, information processing method, and program
CN114220022A (en) * 2021-12-10 2022-03-22 中国科学院南京土壤研究所 Remote sensing monitoring method for rice lodging based on satellite and unmanned aerial vehicle cooperative observation
CN114550108A (en) * 2022-04-26 2022-05-27 广东省农业科学院植物保护研究所 Spodoptera frugiperda identification and early warning method and system
CN115482465A (en) * 2022-09-20 2022-12-16 广东交通职业技术学院 Crop disease and insect pest prediction method and system based on machine vision and storage medium
CN116205879A (en) * 2023-02-22 2023-06-02 中科合肥智慧农业协同创新研究院 Unmanned aerial vehicle image and deep learning-based wheat lodging area estimation method
CN115953690A (en) * 2023-03-09 2023-04-11 济宁市保田农机技术推广专业合作社 Lodging crop identification method for advancing calibration of unmanned harvester
CN117557897A (en) * 2023-09-07 2024-02-13 北京观微科技有限公司 Lodging monitoring method and device for target crops, electronic equipment and storage medium
CN117193347A (en) * 2023-11-08 2023-12-08 北京市农林科学院智能装备技术研究中心 Unmanned aerial vehicle flight height control method and device, electronic equipment and storage medium
CN117557914A (en) * 2024-01-08 2024-02-13 成都大学 Crop pest identification method based on deep learning

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
QILONG WANG;YILIN REN;HAOJIE WANG;JIANSONG WANG;YANG YANG;QIANGQIANG ZHANG;GUANGSHENG ZHOU: "Wind-induced response of rapeseed seedling stage and lodging prediction based on UAV imagery and machine learning methods", COMPUTERS AND ELECTRONICS IN AGRICULTURE, 30 January 2024 (2024-01-30) *
朱文静;冯展康;戴世元;张平平;嵇文;王爱臣;魏新华: "Multi-feature fusion detection of wheat lodging information from UAV multispectral images", SPECTROSCOPY AND SPECTRAL ANALYSIS, 8 January 2024 (2024-01-08) *
龙佳宁;张昭;刘晓航;李云霞;芮照钰;余江帆;漫;FLORES PAULO;韩哲雄;胡灿;王旭峰: "Detection of wheat lodging types using improved EfficientNetV2 and UAV images", SMART AGRICULTURE, 3 November 2023 (2023-11-03) *

Also Published As

Publication number Publication date
CN117789067B (en) 2024-05-10

Similar Documents

Publication Publication Date Title
US11586913B2 (en) Power equipment fault detecting and positioning method of artificial intelligence inference fusion
CN109840549B (en) Method and device for identifying plant diseases and insect pests
CN111459700B (en) Equipment fault diagnosis method, diagnosis device, diagnosis equipment and storage medium
US8433539B2 (en) Wind turbine monitoring device, method, and program
CN110210434A (en) Pest and disease damage recognition methods and device
US20220082625A1 (en) Data processor, data processing method, and computer program
Yalcin An approximation for a relative crop yield estimate from field images using deep learning
CN111868780A (en) Learning data generation device, learning model generation system, learning data generation method, and program
CN113822366A (en) Service index abnormality detection method and device, electronic equipment and storage medium
CN112580671A (en) Automatic detection method and system for multiple development stages of rice ears based on deep learning
US20180137579A1 (en) Method for optimizing crop production efficiency and apparatus for the same
CN115953436A (en) Intelligent assessment early warning method and system for pregnant pet behavior
CN115601585A (en) Agricultural pest and disease diagnosis method and device based on picture analysis
CN107480721A (en) A kind of ox only ill data analysing method and device
JP5164802B2 (en) Recognition system, recognition method, and recognition program
CN117789067B (en) Unmanned aerial vehicle crop monitoring method and system based on machine learning
CN111610026B (en) Rotary machine fault diagnosis method based on deep clustering
CN116186561B (en) Running gesture recognition and correction method and system based on high-dimensional time sequence diagram network
CN116579521A (en) Yield prediction time window determining method, device, equipment and readable storage medium
CN116310913A (en) Natural resource investigation monitoring method and device based on unmanned aerial vehicle measurement technology
Gómez-Zamanillo et al. Damage assessment of soybean and redroot amaranth plants in greenhouse through biomass estimation and deep learning-based symptom classification
CN113240340B (en) Soybean planting area analysis method, device, equipment and medium based on fuzzy classification
US20220222580A1 (en) Deterioration detection method, non-transitory computer-readable storage medium, and information processing device
CN108764183A (en) A kind of plant disease diagnostic method, device and storage medium
EP4071570A1 (en) Prediction system, information processing device, and information processing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant