CN111352404A - Method, system, medium and device for acquiring agricultural condition by cooperation of unmanned aerial vehicle and ground mobile robot - Google Patents
- Publication number: CN111352404A (application CN202010217922.3A)
- Authority: CN (China)
- Prior art keywords: agricultural, ground, information, agricultural condition, unmanned aerial
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G05B19/4183 — Total factory control [DNC/FMS/IMS/CIM] characterised by data acquisition, e.g. workpiece identification
- G05B19/41835 — Total factory control characterised by programme execution
- G05B19/41845 — Total factory control characterised by system universality, reconfigurability, modularity
- G05B19/4189 — Total factory control characterised by the transport system
- G06F18/2414 — Pattern recognition; classification based on distances to prototypes; smoothing the distance, e.g. radial basis function networks [RBFN]
- G06Q50/02 — Agriculture; Fishing; Mining
Abstract
The invention discloses a method, system, medium and device for acquiring agricultural conditions through cooperation between an unmanned aerial vehicle and a ground mobile robot. The method comprises the following steps: the server drives the unmanned aerial vehicle to fly above the monitoring area and remotely acquires the agricultural condition data; the server classifies the monitoring area with a neural network and identifies uncertain sub-regions; ground secondary observation points are determined based on the sensing area and path information; the ground robot is driven to complete a secondary detection of the agricultural condition at the determined observation points; the obtained ground agricultural condition information is extracted; and the agricultural condition information is corrected and supplemented to complete the acquisition task. Because information collection is completed by a ground-air cooperative method, with a ground mobile robot performing a secondary detection of the sub-regions where the agricultural condition is uncertain, the invention obtains more comprehensive and accurate agricultural condition information.
Description
Technical Field
The invention relates to the technical field of agricultural information, in particular to a method, a system, a medium and equipment for acquiring agricultural conditions by cooperation of an unmanned aerial vehicle and a ground mobile robot.
Background
Smart agriculture is the smart economy applied to agriculture, that is, the concrete manifestation of the smart economy in the agricultural sector. Smart agriculture is an important part of the smart economy; for developing countries it is a principal component, offering a way to reduce poverty, exploit latecomer advantages and overtake more developed economies. Agricultural condition acquisition is a key foundation for realizing smart agriculture, but traditional acquisition relies mainly on wireless sensor networks and similar means.
With the rapid development in recent years of agricultural unmanned aerial vehicles and of hyperspectral and multispectral sensing, novel agricultural condition acquisition techniques based on UAV multispectral imaging have been proposed. UAV hyperspectral and multispectral technology is described in patents such as "Cabin of an unmanned aerial vehicle carrying a hyperspectral sensor - 201810107354.4", "Airborne imaging hyperspectral geometric correction method for unmanned aerial vehicles - 201610415761.2" and "Multispectral remote sensing system for unmanned aerial vehicles - 201820850776.6", demonstrating the feasibility of UAV-based agricultural condition acquisition.
Meanwhile, ground agricultural condition acquisition robot systems are described in patents such as "Data acquisition robot for agricultural greenhouses - 201711131665.6" and "Agricultural inspection robot with navigation and obstacle avoidance functions - 201710628327.7", demonstrating the feasibility of constructing a ground agricultural condition acquisition robot.
The patent "A cooperative tracking and obstacle avoidance system of an unmanned aerial vehicle and an unmanned ground robot and a method thereof - 201910188193.0" describes a cooperative air-ground obstacle avoidance method, which suggests a new approach to agricultural condition acquisition.
However, the prior art mostly acquires agricultural conditions with either an unmanned aerial vehicle or a ground robot alone, and both single-platform strategies suffer from incomplete agricultural condition information. In particular, a UAV has difficulty acquiring information from below the crop canopy, while a ground robot cannot easily acquire global information about an entire field or orchard.
Disclosure of Invention
To overcome the defects and shortcomings of the prior art, the invention provides a method, system, medium and device for acquiring agricultural conditions through cooperation between an unmanned aerial vehicle and a ground mobile robot. A new agricultural condition acquisition system is constructed through this cooperation, so that agricultural condition information combining the global and the local is obtained; at the same time, the two platforms cross-verify the information, preventing distortion of the agricultural condition information by other factors.
The invention aims to provide a method for acquiring the agricultural condition by the cooperation of an unmanned aerial vehicle and a ground mobile robot, which comprises the following steps:
the server drives the unmanned aerial vehicle to fly to the upper part of the monitoring area, and the agricultural condition related data is remotely obtained;
the server classifies the monitoring area with a neural network and identifies uncertain sub-regions, an uncertain sub-region being one whose classification has a low degree of certainty;
determining a ground secondary observation point based on the perception area and the path information;
driving a ground robot to finish secondary detection of agricultural conditions according to the determined ground secondary observation point;
re-extracting the obtained ground agricultural condition information;
and correcting and supplementing the agricultural condition information to complete the task of acquiring the agricultural condition information.
As a preferred technical solution, the step in which the server drives the unmanned aerial vehicle to complete the acquisition of the aerial agricultural condition data specifically comprises:
when the server receives an agricultural condition acquisition task, it sends an instruction M1 to the unmanned aerial vehicle;
after the unmanned aerial vehicle completes the relevant assembly on the ground, a flight trajectory adapted to the plot is constructed;
the unmanned aerial vehicle flies along the trajectory, drives its sensors and camera to collect the agricultural condition information, and sends the collected information to the server; the server then obtains the relevant agricultural condition information with a neural network algorithm.
As a preferred technical solution, the instruction M1 includes the plot position, the plot area, and the type of agricultural condition information to be acquired.
As a preferred technical solution, in the step of producing the agricultural condition information and identifying uncertain information, the server identifies the uncertain sub-regions as follows:
the agricultural condition acquisition area is gridded, the acquired data are mapped to the corresponding grid cells, and an agricultural condition data set linking each grid cell to its acquired data is established;
each grid data set is classified with a neural network that has been trained on a historical data set and can classify agricultural condition information;
the classification of each grid data set is repeated n times; for some grid cells the classification result is not unique across the repetitions;
an uncertainty threshold is set, and any sub-region whose classification consistency falls below the threshold is identified as an uncertain sub-region of the agricultural condition.
As a preferred technical solution, the step of determining the ground secondary observation points based on the sensing area specifically comprises:
extracting and delimiting the uncertain sub-regions in the plot, based on the above;
for each uncertain sub-region, the server randomly selects a number of observation points, the number being determined from the ratio of the area sensed by the ground mobile robot at one stop to the area of the uncertain sub-region;
points that the mobile robot cannot reach are removed when the observation points are determined;
a shortest path is constructed from the mobile robot's starting point through the observation points;
and the shortest path is sent to the ground mobile robot.
As a preferred technical solution, the step of driving the ground robot to complete the secondary detection of the agricultural condition at the determined ground secondary observation points specifically comprises:
after the ground mobile robot receives the shortest path of the agricultural condition acquisition instruction, it completes the acquisition of the ground agricultural condition: when the mobile robot reaches each ground observation point, it drives the relevant sensors and camera to complete the data acquisition, and then sends the data to the server through the wireless communication module.
As a preferred technical solution, the step of correcting and supplementing the agricultural condition information specifically comprises:
after the observation point data have been collected, the server analyses the observed data with a neural network algorithm to obtain the relevant ground information; then, based on this ground agricultural condition information, the UAV-based agricultural condition information is corrected and supplemented.
A second object of the present invention is to provide a system for acquiring agricultural conditions through cooperation between an unmanned aerial vehicle and a ground mobile robot, comprising:
a data acquisition module, used by the server to drive the unmanned aerial vehicle and complete the acquisition of the aerial agricultural condition data;
an identification module, used by the server to produce the agricultural condition information and identify the uncertain sub-regions;
a secondary observation point determination module, for determining the ground secondary observation points based on the sensing area;
a secondary detection module, for driving the ground robot to complete the secondary detection of the agricultural condition at the determined ground secondary observation points;
an information extraction module, for extracting the obtained ground agricultural condition information;
and a correction and supplement module, for correcting and supplementing the agricultural condition information to complete the acquisition task.
A third object of the present invention is to provide a storage medium storing a program which, when executed by a processor, implements the method for acquiring agricultural conditions through cooperation between an unmanned aerial vehicle and a ground mobile robot according to any one of claims 1 to 7.
The invention also provides a computing device comprising a processor and a memory storing a program executable by the processor; when the processor executes the program stored in the memory, the method for acquiring agricultural conditions through cooperation between the unmanned aerial vehicle and the ground mobile robot is realized.
Compared with the prior art, the invention has the following advantages and beneficial effects:
the invention provides a method for acquiring agricultural conditions by mutual cooperation of an unmanned aerial vehicle and a ground mobile robot, in particular to a method for acquiring the agricultural conditions in the air of crops by the unmanned aerial vehicle, and then driving the ground mobile robot to finish secondary acquisition of agricultural condition information aiming at a region with low certainty, so as to correct and supplement the agricultural condition information in the air. The traditional method mostly adopts a single means to complete the agricultural condition acquisition, and has the defects of incomplete information and the like. The invention can obtain more comprehensive and accurate agricultural condition information. The reason is that the information collection of the agricultural conditions is completed by adopting a ground-air cooperative method. And (4) finishing secondary detection of information by adopting a ground mobile robot aiming at the subarea where the agricultural condition is uncertain.
Drawings
Fig. 1 is a schematic flow chart of an agricultural condition acquisition method by cooperation of an unmanned aerial vehicle and a ground mobile robot according to the invention;
fig. 2 is a schematic structural diagram of a system of the method for acquiring agricultural conditions by cooperation of the unmanned aerial vehicle and the ground mobile robot.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Examples
The agricultural condition acquisition system is composed of an unmanned aerial vehicle, a ground mobile robot and a server as its main components. The unmanned aerial vehicle comprises a power supply, a flight control system, an agricultural condition acquisition camera (multispectral, hyperspectral, visible light and other options may be selected), a positioning system, a navigation system, a communication system and the like. The ground mobile robot is composed of a power supply, a positioning system (GPS or BeiDou), a travelling system, an agricultural condition acquisition system (the relevant cameras, sensors and corresponding driver programs), a communication system and the like. The server connects these main components through the communication module to construct a cyber-physical system for agricultural condition acquisition.
As shown in fig. 1, the present invention provides a method for acquiring an agricultural condition by cooperation of an unmanned aerial vehicle and a ground mobile robot, comprising the following steps:
S101, the server drives the unmanned aerial vehicle to fly above the monitoring area and remotely acquires the agricultural condition data;
S102, the server classifies the monitoring area with a neural network and identifies the uncertain sub-regions;
An uncertain sub-region is a sub-region whose classification has a low degree of certainty. For example, consider a plot A of 100 area units planted with a certain crop. The unmanned aerial vehicle flies above the plot and takes a picture, and a neural network model is then used to classify the pests and diseases of the plot. Suppose a small area a in the lower left corner cannot be classified with confidence (for example, the neural network outputs 30% pest, 40% disease, 30% normal). The classification of area a then has low certainty, and this cell is regarded as an uncertain sub-region of the agricultural classification. When this situation occurs, the ground sensing robot is driven to sub-region a for a secondary exploration, after which agricultural condition information of high certainty about the plot is established.
Optionally, the uncertain sub-regions are identified as follows:
First, the agricultural condition acquisition area is gridded (that is, divided into a number of small cells), the acquired data are mapped to the corresponding grid cells, and an agricultural condition data set linking each grid cell to its acquired data is established.
Second, each grid data set is classified with a neural network (the network has already been trained on a historical data set and can classify agricultural condition information).
Third, the second step is repeated n times; for some grid cells the classification is not unique across the repetitions (for example, out of 100 repetitions, 30 give a first class, 30 a second class, and 40 a third class).
Fourth, an uncertainty threshold is set (for example, a cell is considered determinate if the same classification is obtained in more than 80 of n = 100 repetitions), and any sub-region below the threshold is identified as an uncertain sub-region of the agricultural condition.
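The four steps above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the trained neural network is replaced by a random stand-in classifier whose per-class probabilities are simply attached to each grid cell, so only the repeat-and-threshold logic is real.

```python
import random
from collections import Counter

def classify_cell(cell_probs):
    # Stand-in for one stochastic forward pass of the trained neural
    # network; the per-class probabilities are attached to the cell
    # purely for illustration.
    labels = list(cell_probs)
    weights = list(cell_probs.values())
    return random.choices(labels, weights=weights, k=1)[0]

def find_uncertain_cells(grid, n=100, threshold=0.8):
    # A cell whose most frequent label appears in fewer than threshold*n
    # of the n repeated classifications is flagged as uncertain.
    uncertain = []
    for cell_id, probs in grid.items():
        counts = Counter(classify_cell(probs) for _ in range(n))
        _, top_count = counts.most_common(1)[0]
        if top_count < threshold * n:
            uncertain.append(cell_id)
    return uncertain

# Mirrors the worked example: cell "a" splits 30% pest / 40% disease /
# 30% normal, while cell "b" is classified as normal with high confidence.
grid = {
    "a": {"pest": 0.30, "disease": 0.40, "normal": 0.30},
    "b": {"pest": 0.01, "disease": 0.01, "normal": 0.98},
}
random.seed(0)
print(find_uncertain_cells(grid))
```

With these probabilities, cell "a" is flagged as the uncertain sub-region and cell "b" passes the 80% consistency threshold.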
S103, determining a ground secondary observation point based on the perception area and the path information;
s104, driving a ground robot to finish secondary detection of agricultural conditions according to the determined ground secondary observation point;
s105, extracting the obtained ground agricultural condition information again;
and S106, correcting and supplementing the agricultural condition information to complete the task of acquiring the agricultural condition information.
Optionally, in step S101, the step in which the server drives the unmanned aerial vehicle to complete the acquisition of the aerial agricultural condition data specifically comprises:
S1011, after the server receives the agricultural condition acquisition task, it sends an instruction M1 to the unmanned aerial vehicle (the instruction includes the plot position and area, the type of agricultural condition information to be acquired, and the like);
S1012, after the unmanned aerial vehicle completes the relevant assembly on the ground, a flight trajectory adapted to the plot is constructed (this process is generally completed at the ground station configured with the unmanned aerial vehicle);
S1013, the unmanned aerial vehicle flies along the trajectory, drives its sensors and camera to collect the agricultural condition information, and sends the collected information to the server; the server then obtains the relevant agricultural condition information, such as a ground disease map, with a suitable algorithm (many patents provide related methods, which are not discussed in detail here).
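As a hedged sketch of step S1011, instruction M1 could be serialized as a simple message. The patent only specifies the contents of M1 (plot position, plot area, type of agricultural condition information to be acquired), so every field name and value below is an illustrative assumption, not part of the invention:

```python
import json

# Hypothetical wire format for instruction M1; all field names and
# values are assumptions for illustration only.
m1 = {
    "plot_position": {"lat": 23.16, "lon": 113.35},  # illustrative coordinates
    "plot_area_m2": 10000,                           # illustrative plot area
    "info_types": ["pest", "disease"],               # information to acquire
}
message = json.dumps(m1)        # what the server would send to the UAV
received = json.loads(message)  # what the UAV would decode on arrival
print(received["info_types"])
```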
Optionally, in step S102, after the server produces the plot-related agricultural condition information, the areas of low certainty (i.e. the areas that cannot be determined) are identified (a neural network or similar method may be used).
Optionally, in step S103, the step of determining the ground secondary observation points based on the sensing area specifically comprises:
S1031, extracting and delimiting the uncertain sub-regions in the plot, based on the above;
S1032, for each uncertain sub-region, the server randomly selects a number of observation points (for example 5 or 10, determined from the ratio of the area sensed by the ground mobile robot at one stop to the area of the uncertain sub-region);
S1033, points that the mobile robot cannot reach are eliminated when the observation points are determined;
S1034, a shortest path is constructed from the mobile robot's starting point through the observation points (the travelling-salesman problem is solved with an ant colony algorithm);
S1035, the shortest path is sent to the ground mobile robot.
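Steps S1032 to S1034 can be sketched as follows. The point count comes from the area ratio, unreachable candidates are dropped, and the tour is ordered with a nearest-neighbour heuristic, used here only as a simple stand-in for the ant colony travelling-salesman solver named above; all coordinates and areas are illustrative.

```python
import math

def plan_route(start, sub_region_area, sensed_area, candidates, reachable):
    # Number of observation points from the area ratio (step S1032).
    n_points = max(1, math.ceil(sub_region_area / sensed_area))
    # Drop candidate points the robot cannot reach (step S1033).
    points = [p for p in candidates if reachable(p)][:n_points]
    # Order the tour with a nearest-neighbour heuristic (step S1034);
    # the patent solves this TSP with an ant colony algorithm instead.
    route, current, remaining = [], start, points[:]
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        route.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return route

route = plan_route(
    start=(0, 0),
    sub_region_area=40.0,             # uncertain sub-region, illustrative
    sensed_area=10.0,                 # area sensed per stop, illustrative
    candidates=[(3, 4), (1, 1), (6, 8), (2, 0), (9, 9)],
    reachable=lambda p: p != (9, 9),  # e.g. (9, 9) is behind an obstacle
)
print(route)
```

Here 40.0 / 10.0 yields four observation points, the blocked candidate (9, 9) is eliminated, and the remaining points are visited in nearest-neighbour order starting from (0, 0).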
Optionally, in step S104, the step of driving the ground robot to complete the secondary detection of the agricultural condition at the determined ground secondary observation points specifically comprises:
after the ground mobile robot receives the shortest path of the agricultural condition acquisition instruction, it completes the acquisition of the ground agricultural condition: when the mobile robot reaches each ground observation point, it drives the relevant sensors and camera to complete the data acquisition, and then sends the data to the server through the wireless communication module.
Optionally, in step S106, the step of correcting and supplementing the agricultural condition information specifically comprises:
after the observation point data have been collected, the server analyses (for example, classifies) the observed data with a neural network algorithm (various patents provide related methods for the different sensors, such as multispectral and image sensors, which are not repeated here) to obtain the relevant ground information; then, based on this ground agricultural condition information, the UAV-based agricultural condition information is corrected and supplemented.
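A minimal sketch of this correction-and-supplement step, under the assumption (not fixed by the patent) that both data sources are reduced to one label per grid cell: the higher-certainty ground result overrides the aerial one wherever a secondary observation exists, and fills in cells the drone could not resolve.

```python
def correct_and_supplement(aerial, ground):
    # Start from the UAV classification per grid cell, then let the
    # ground observations correct (overwrite) and supplement (add) entries.
    fused = dict(aerial)
    fused.update(ground)
    return fused

aerial = {"a": "uncertain", "b": "normal", "c": "disease"}  # from the UAV
ground = {"a": "pest", "d": "normal"}   # secondary detection results
print(correct_and_supplement(aerial, ground))
```

In this toy run the uncertain cell "a" is corrected to "pest" and the previously unseen cell "d" is supplemented, while cells "b" and "c" keep their aerial classification.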
Example 2
As shown in fig. 2, the present embodiment further provides a system for acquiring agricultural conditions through cooperation between an unmanned aerial vehicle and a ground mobile robot, comprising: a data acquisition module 1, an identification module 2, a secondary observation point determination module 3, a secondary detection module 4, an information extraction module 5 and a correction and supplement module 6.
In the data acquisition module 1, the server drives the unmanned aerial vehicle to fly above the monitoring area and remotely acquires the agricultural condition data;
in the identification module 2, the server classifies the monitoring area with a neural network and identifies the uncertain sub-regions, an uncertain sub-region being one whose classification has a low degree of certainty;
the secondary observation point determination module 3 determines the ground secondary observation points based on the sensing area and the path information;
the secondary detection module 4 drives the ground robot to complete the secondary detection of the agricultural condition at the determined ground secondary observation points;
the information extraction module 5 extracts the obtained ground agricultural condition information;
and the correction and supplement module 6 corrects and supplements the agricultural condition information to complete the acquisition task.
Optionally, the data acquisition module 1 includes an instruction sending module 11, a flight trajectory construction module 12 and an information acquisition module 13.
The instruction sending module 11 is configured to send an instruction M1 to the unmanned aerial vehicle after the server receives the agricultural condition acquisition task (the instruction includes the plot position and area, the type of agricultural condition information to be acquired, and the like);
the flight trajectory construction module 12 is configured to construct a flight trajectory adapted to the plot after the unmanned aerial vehicle completes the relevant assembly on the ground (this process is generally completed at the ground station configured with the unmanned aerial vehicle);
in the information acquisition module 13, the unmanned aerial vehicle flies along the trajectory, drives its sensors and camera to collect the agricultural condition information, and sends the collected information to the server; the server then obtains the relevant agricultural condition information, such as a ground disease map, with a suitable algorithm (many patents provide related methods, which are not discussed in detail here).
Optionally, the secondary observation point determination module 3 includes an uncertain sub-region processing module 31, an observation point selection module 32, an elimination module 33, a shortest path construction module 34 and a path sending module 35.
The uncertain sub-region processing module 31 is configured to extract and delimit the uncertain sub-regions in the plot, based on the above;
in the observation point selection module 32, the server randomly selects a number of observation points for each uncertain sub-region (for example 5 or 10, determined from the ratio of the area sensed by the ground mobile robot at one stop to the area of the uncertain sub-region);
the elimination module 33 is configured to eliminate points that the mobile robot cannot reach when the observation points are determined;
the shortest path construction module 34 constructs a shortest path from the mobile robot's starting point through the observation points (the travelling-salesman problem is solved with an ant colony algorithm);
and the path sending module 35 is configured to send the shortest path to the ground mobile robot.
Optionally, in the second detection module 4, after the ground mobile robot receives the shortest path of the agricultural condition acquisition instruction, the acquisition of the ground agricultural condition is completed, that is, after the mobile robot reaches a certain ground observation point, the relevant sensor and camera are driven to complete data acquisition, and then the data is sent to the server through the wireless communication module.
Optionally, in the correction and supplement module 6, after the observation point data are collected, the server analyzes them to obtain the relevant ground information (related methods for the different sensors, such as multispectral and image sensors, are provided in various patents and are not described here again). The unmanned-aerial-vehicle-based agricultural condition information is then corrected and supplemented based on this ground agricultural condition information.
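The patent does not fix a fusion rule for this correction step. One simple sketch, under the assumption that ground readings are treated as more reliable than the aerial classification: overwrite the UAV-derived class of any grid cell that a ground observation covers (the map representation and labels below are hypothetical):

```python
def correct_agri_map(uav_map: dict, ground_obs: dict) -> dict:
    """Fuse UAV and ground data: ground observations override the
    UAV-derived class for the grid cells they cover, and supply classes
    for cells the UAV classification left undetermined."""
    corrected = dict(uav_map)      # keep the original UAV map intact
    corrected.update(ground_obs)   # ground readings take precedence
    return corrected

# grid cells keyed by (row, col); labels are illustrative
uav_map = {(0, 0): "healthy", (0, 1): "uncertain", (1, 0): "diseased"}
ground_obs = {(0, 1): "diseased"}
print(correct_agri_map(uav_map, ground_obs))
```

A production system would likely weight the two sources (e.g. by classifier confidence) rather than overwrite outright, but the override rule captures the "correct and supplement" intent.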
Example 3
This embodiment further provides a storage medium storing one or more programs which, when executed by a processor, implement the method for acquiring agricultural conditions by cooperation of the unmanned aerial vehicle and the ground mobile robot. The storage medium may be a ROM, a RAM, a magnetic disk, an optical disk, or the like. The method comprises the following steps:
the server drives the unmanned aerial vehicle to complete the acquisition of the aerial agriculture condition related data;
the server provides agricultural condition information and marks uncertain sub-areas;
determining a ground secondary observation point based on the perception area;
driving a ground robot to finish secondary detection of agricultural conditions according to the determined ground secondary observation point;
extracting the obtained ground agricultural condition information;
and correcting and supplementing the agricultural condition information to complete the task of acquiring the agricultural condition information.
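The uncertain sub-area identification in the second step above (detailed in claim 4) can be sketched as repeated classification per grid cell, flagging cells whose label agreement falls below a threshold. The toy classifier below is a deterministic stand-in for the trained neural network; the cell keys, labels, and the 0.9 threshold are assumptions:

```python
import itertools
from collections import Counter

def find_uncertain_cells(grid_data, classify, n_repeats=20, threshold=0.9):
    """Classify every grid cell n_repeats times; a cell whose most
    frequent label wins fewer than threshold * n_repeats of the runs
    is flagged as part of an uncertain sub-area."""
    uncertain = []
    for cell, data in grid_data.items():
        labels = [classify(data) for _ in range(n_repeats)]
        top = Counter(labels).most_common(1)[0][1]
        if top / n_repeats < threshold:
            uncertain.append(cell)
    return uncertain

# stand-in for the trained network: stable on clear data, oscillating
# on ambiguous data (so the sketch stays deterministic)
_flip = itertools.cycle(["healthy", "diseased"])
def toy_classifier(data):
    return "healthy" if data == "clear" else next(_flip)

grid = {(0, 0): "clear", (0, 1): "ambiguous"}
print(find_uncertain_cells(grid, toy_classifier))  # → [(0, 1)]
```

The flagged cells are exactly those forwarded to the secondary observation point determination, so the ground robot only revisits the parts of the parcel the aerial pass could not classify consistently.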
Example 4
This embodiment further provides a computing device comprising a processor and a memory storing one or more programs which, when executed by the processor, implement the method for acquiring agricultural conditions by cooperation of the unmanned aerial vehicle and the ground mobile robot. The computing device may be a desktop computer, a notebook computer, a smartphone, a PDA handheld terminal, a tablet computer, or another terminal device with a display function. The method comprises the following steps:
the server drives the unmanned aerial vehicle to complete the acquisition of the aerial agriculture condition related data;
the server provides agricultural condition information and marks uncertain sub-areas;
determining a ground secondary observation point based on the perception area;
driving a ground robot to finish secondary detection of agricultural conditions according to the determined ground secondary observation point;
extracting the obtained ground agricultural condition information;
and correcting and supplementing the agricultural condition information to complete the task of acquiring the agricultural condition information.
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to the above embodiments, and any other changes, modifications, substitutions, combinations, and simplifications which do not depart from the spirit and principle of the present invention should be construed as equivalents thereof, and all such changes, modifications, substitutions, combinations, and simplifications are intended to be included in the scope of the present invention.
Claims (10)
1. An agricultural condition acquisition method based on cooperation of an unmanned aerial vehicle and a ground mobile robot is characterized by comprising the following steps:
the server drives the unmanned aerial vehicle to fly to the upper part of the monitoring area, and the agricultural condition related data is remotely obtained;
the server classifies the monitoring area by adopting a neural network method and identifies uncertain sub-areas, wherein the uncertain sub-areas are sub-areas whose classification is determined with a low degree of confidence;
determining a ground secondary observation point based on the perception area and the path information;
driving a ground robot to finish secondary detection of agricultural conditions according to the determined ground secondary observation point;
re-extracting the obtained ground agricultural condition information;
and correcting and supplementing the agricultural condition information to complete the task of acquiring the agricultural condition information.
2. The method for acquiring agricultural conditions by cooperation of an unmanned aerial vehicle and a ground mobile robot according to claim 1, wherein the step of the server driving the unmanned aerial vehicle to complete the acquisition of the aerial agricultural condition related data specifically comprises:
when the server receives an agricultural condition acquisition task, it sends an instruction M1 to the unmanned aerial vehicle;
after the unmanned aerial vehicle completes relevant assembly on the ground, constructing a flight track adapted to the land parcel;
the unmanned aerial vehicle flies along the track and drives the sensors and camera to collect the agricultural condition related information, sends the collected information to the server, and the server then obtains the related agricultural condition information by adopting a neural network algorithm.
3. The method for acquiring agricultural conditions by cooperation of an unmanned aerial vehicle and a ground mobile robot according to claim 2, wherein the instruction M1 includes the parcel position, the area, and the kind of agricultural information to be acquired.
4. The method for acquiring agricultural conditions by cooperation of an unmanned aerial vehicle and a ground mobile robot according to claim 1, wherein in the step of the server providing agricultural condition information and identifying uncertain sub-areas, an uncertain sub-area is identified by adopting the following steps:
carrying out gridding processing on the agricultural condition acquisition area, mapping the acquired data to the corresponding grid area information, and establishing an agricultural condition data set associating each grid area with its acquired data;
classifying each grid data set by adopting a neural network, wherein the neural network is trained based on a historical data set, and can realize classification of agricultural condition information;
repeating the step of classifying each grid data set n times, whereby certain grid areas appear whose classification is not unique across the repetitions;
and setting an uncertain sub-region decision threshold, and determining any sub-region whose classification consistency falls below the threshold to be an uncertain sub-region of the agricultural condition.
5. The method for acquiring agricultural conditions by cooperation of an unmanned aerial vehicle and a ground mobile robot according to claim 1, wherein the step of determining the ground secondary observation point based on the sensing area specifically comprises the steps of:
based on the above, extracting and dividing uncertain sub-regions in the land parcel;
the server randomly selects a number of observation points within a given uncertain sub-area, the number of observation points being determined according to the ratio of the area sensed by the ground mobile robot to the area of the uncertain sub-area;
when the observation points are determined, points which cannot be reached by the mobile robot are removed;
constructing a shortest path based on a starting point and an observation point of the mobile robot;
and then sending the shortest path to the ground mobile robot.
6. The method for acquiring agricultural conditions by cooperation of an unmanned aerial vehicle and a ground mobile robot according to claim 1, wherein the step of driving the ground robot to complete the secondary detection of the agricultural condition according to the determined ground secondary observation point specifically comprises:
after the ground mobile robot receives the shortest path in the agricultural condition acquisition instruction, it completes the acquisition of the ground agricultural condition, that is, upon reaching each ground observation point, the mobile robot drives the relevant sensors and camera to complete data acquisition, and then sends the data to the server through the wireless communication module.
7. The method for acquiring agricultural conditions by cooperation of an unmanned aerial vehicle and a ground mobile robot according to claim 1, wherein the step of correcting and supplementing the agricultural condition information specifically comprises the steps of:
after the observation point data are collected, the server analyzes the observed data by adopting a neural network algorithm to obtain the related ground information, and then corrects and supplements the unmanned-aerial-vehicle-based agricultural condition information based on the ground agricultural condition information.
8. A system for acquiring agricultural conditions by cooperation of an unmanned aerial vehicle and a ground mobile robot, characterized by comprising:
the data acquisition module is used for driving the unmanned aerial vehicle by using the server to complete the acquisition of the aerial agriculture condition related data;
the identification module is used for providing agricultural condition information by using the server and identifying uncertain sub-regions;
the secondary observation point determining module is used for determining a ground secondary observation point based on the sensing area;
the second detection module is used for driving the ground robot to finish secondary detection of agricultural conditions according to the determined ground secondary observation point;
the information extraction module is used for extracting the obtained ground agricultural condition information;
and the correction and supplement module is used for correcting and supplementing the agricultural condition information to complete the task of acquiring the agricultural condition information.
9. A storage medium storing a program, wherein the program, when executed by a processor, implements the method for acquiring agricultural conditions by cooperation of an unmanned aerial vehicle and a ground mobile robot according to any one of claims 1 to 7.
10. A computing device comprising a processor and a memory for storing a processor-executable program, wherein the processor, when executing the program stored in the memory, implements the method for acquiring agricultural conditions by cooperation of an unmanned aerial vehicle and a ground mobile robot according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010217922.3A CN111352404A (en) | 2020-03-25 | 2020-03-25 | Method, system, medium and device for acquiring agricultural condition by cooperation of unmanned aerial vehicle and ground mobile robot |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111352404A true CN111352404A (en) | 2020-06-30 |
Family
ID=71193025
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010217922.3A Pending CN111352404A (en) | 2020-03-25 | 2020-03-25 | Method, system, medium and device for acquiring agricultural condition by cooperation of unmanned aerial vehicle and ground mobile robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111352404A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112020981A (en) * | 2020-09-14 | 2020-12-04 | 皖西学院 | Intelligent weeding robot system based on machine vision |
CN112462756A (en) * | 2020-10-29 | 2021-03-09 | 久瓴(上海)智能科技有限公司 | Agriculture and forestry operation task generation method and device, computer equipment and storage medium |
CN116883875A (en) * | 2023-07-18 | 2023-10-13 | 仲恺农业工程学院 | Agricultural condition monitoring method and system by cooperation of unmanned aerial vehicle and ground robot |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108389354A (en) * | 2018-04-16 | 2018-08-10 | 南京森林警察学院 | A kind of method of unmanned plane joint ground micro robot detection forest ground fire |
CN108541683A (en) * | 2018-04-18 | 2018-09-18 | 济南浪潮高新科技投资发展有限公司 | A kind of unmanned plane pesticide spraying system based on convolutional neural networks chip |
CN109000664A (en) * | 2018-09-25 | 2018-12-14 | 宁夏大学 | A kind of the robot search and rescue system and rescue method of air-ground coordination |
CN109977924A (en) * | 2019-04-15 | 2019-07-05 | 北京麦飞科技有限公司 | For real time image processing and system on the unmanned plane machine of crops |
KR20190100107A (en) * | 2019-08-09 | 2019-08-28 | 엘지전자 주식회사 | Method for wireless communication of vehicle in autonomous driving system and apparatus thereof |
CN110796011A (en) * | 2019-09-29 | 2020-02-14 | 湖北工程学院 | Rice ear recognition method, system, device and medium based on deep learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20200630 |