CN117274844A - Rapid extraction method for field peanut seedling emergence condition by using unmanned aerial vehicle remote sensing image - Google Patents
- Publication number: CN117274844A (application CN202311524223.3A)
- Authority: CN (China)
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06V20/17 — Terrestrial scenes taken from planes or by drones
- G06V10/22 — Image preprocessing by selection of a specific region containing or referencing a pattern; locating or processing of specific regions to guide the detection or recognition
- G06V10/25 — Determination of region of interest [ROI] or a volume of interest [VOI]
- G06V10/26 — Segmentation of patterns in the image field; cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques
- G06V10/762 — Recognition or understanding using pattern recognition or machine learning using clustering
- G06V20/188 — Vegetation
Abstract
The invention discloses a method for rapidly extracting the field emergence status of peanut seedlings from unmanned aerial vehicle remote sensing images, which belongs to the technical field of agricultural informatization and comprises the following steps: selecting an unmanned aerial vehicle carrying a multispectral sensor to collect image data according to the field data collection requirements of a test area; processing the image data to generate orthophoto data; calculating an excess green index image; performing local-maximum detection on the calculated excess green index image to preliminarily determine key points of the peanut plant positions; fitting straight lines to the key points to determine the peanut plant area; generating a distance transform map; and performing image segmentation to divide the peanut plant area into a series of segmented objects, each segmented object being one recognized peanut plant. The invention realizes rapid and accurate extraction and recognition of field peanut seedling status information.
Description
Technical Field
The invention belongs to the technical field of agricultural informatization, and particularly relates to a method for rapidly extracting the emergence status of field peanut seedlings using unmanned aerial vehicle remote sensing images.
Background
Peanut is a major crop and one of the important oil crops, and its high and stable yield is important for guaranteeing national edible oil security and promoting sustainable agricultural development. With the rapid development of seed science and genetics, research on crop genes has become increasingly important. Peanut is one of the important cash crops, and optimizing peanut planting and peanut varieties is of great significance. Differences in soil conditions, machinery performance and operator experience often cause uneven peanut emergence, missing seedlings and broken ridges, which affect peanut yield and quality to varying degrees. Therefore, peanut seedling status information needs to be monitored so that varieties suited to local peanut growth and development and to the local growing season can be bred.
Traditional peanut seedling information extraction methods rely on manual observation and measurement, which require a great deal of manpower and time. In recent years, unmanned aerial vehicle and image processing technologies have developed rapidly: large-scale images of peanut crops can be obtained by a UAV carrying a high-resolution camera, and seedling information can be extracted with image processing and analysis algorithms, effectively solving the long cycle and low efficiency of traditional manual extraction. Therefore, automatic acquisition of peanut seedling status information from UAV remote sensing images is a current research hotspot.
The prior art has the following defects. Differences in field preparation conditions and machinery performance can cause inconsistent seeding depth, which affects peanut seed germination and leads to inconsistent emergence times. In the peanut seedling stage, plant sizes differ greatly: large plants tend to touch and adhere to one another, while smaller plants appear blurred in the UAV remote sensing image. It is therefore necessary to explore a more efficient, rapid and accurate method of acquiring peanut seedling status information from UAV visible-light remote sensing images, so as to infer peanut genotypes and screen out high-quality varieties.
Disclosure of Invention
To solve the above problems, the invention provides a method for rapidly extracting the field emergence status of peanuts using UAV remote sensing images; compared with the traditional manual observation method, UAV visible-light remote sensing has obvious advantages in extracting field peanut seedling status information.
The technical scheme of the invention is as follows:
A method for rapidly extracting the field emergence status of peanuts using unmanned aerial vehicle remote sensing images comprises the following steps:
step 1, selecting an unmanned aerial vehicle to carry a multispectral sensor to acquire image data according to the field data acquisition requirement of a test area;
step 2, processing the image data with DJI Terra mapping software to generate orthophoto data;
step 3, randomly cropping a region-of-interest image of a specified size from the orthophoto, and calculating an excess green index image of the test-area field remote sensing image according to the excess green index;
step 4, performing local-maximum detection on the calculated excess green index image to preliminarily determine key points of the peanut plant positions;
step 5, fitting straight lines to the preliminarily determined key points of the peanut plant positions using a random sample consensus algorithm to determine the peanut plant area;
step 6, generating a distance transform map based on the local maxima;
and step 7, using the generated distance transform map as the input image of a marker-controlled watershed algorithm and the positions of the remaining local maxima as its peanut plant markers, performing image segmentation to divide the peanut plant area into a series of segmented objects, each segmented object being one recognized peanut plant.
Further, in step 1, the selected unmanned aerial vehicle model is DJI Mavic 3; the collected image data comprise an image sequence and the corresponding airborne position and orientation system data; flights are conducted under stable illumination intensity, clear cloudless weather and little or no wind; and the flight period is controlled between 11:00 and 13:00.
Further, the specific process of step 2 is as follows:
step 2.1, first, the high-definition image sequence collected by the unmanned aerial vehicle is stitched with DJI Terra mapping software in combination with the corresponding airborne position and orientation system data;
step 2.2, the spatial attitude at the moment each image was taken is restored from the airborne position and orientation system data and the corresponding high-definition image sequence;
and step 2.3, finally, orthophoto generation is performed, comprising aerial triangulation, orthorectification and mosaicking operations.
Further, in step 3, the excess green index is calculated as follows:

EGI = 2 × G - R - B (1);

where EGI is the excess green index value, G is the green band of the unmanned aerial vehicle image, R is the red band, and B is the blue band.
Further, the specific steps of step 4 are as follows:
step 4.1, a circular sliding window with a radius of 20 pixels is defined, and local maxima are located through it; if the current centre pixel value of the sliding window is greater than all other pixel values in the window, the current centre pixel position is a local maximum;
step 4.2, a distance threshold d_t of 15 cm is set and Euclidean clustering is performed on the local-maximum points to obtain a series of maximum-point clusters; the distances between the points of each cluster are then judged: if the farthest distance between points in the current cluster is less than d_t, the points in the cluster are averaged to obtain the peanut plant position; if the farthest distance is greater than or equal to d_t, the current cluster is divided into several clusters by farthest-point sampling until the farthest distance between points in every new cluster is less than d_t; all points in each cluster are then averaged as the key point of the peanut plant position.
Further, the specific steps of step 5 are as follows:
formula (2) is used as the mathematical expression of the fitted straight line, and a random sample consensus algorithm is used to fit straight lines to the key points of the preliminarily determined peanut plant positions:

a·x + b·y + c = 0 (2);

where x and y represent the abscissa and ordinate of the current peanut plant point, and a, b and c represent the parameters of the fitted straight line;
then the distance from each plant point to the fitted straight line is calculated according to formula (3):

d_i = |a·x_i + b·y_i + c| / sqrt(a^2 + b^2) (3);

where x_i and y_i represent the abscissa and ordinate of any peanut plant point p_i;
if the distance to the fitted straight line is less than a specified threshold, the peanut plant point is considered to be in the current peanut plant area.
Further, in step 6, positions whose excess green index value is greater than 0.3 are judged to be the peanut plant area, and the distance from each pixel of the peanut plant area to its nearest local-maximum point is extracted to generate the distance transform map; in the generated distance transform map, the non-plant area is assigned 0, and each pixel of the peanut plant area is assigned its distance to the nearest local-maximum point.
Further, in step 7, the marker-controlled watershed algorithm realizes segmentation by simulating terrain being flooded by water, specifically as follows: the distance transform map is regarded as a terrain surface in which high pixel values are peaks and low pixel values are valleys; the isolated valleys, i.e. the local minima of the distance transform map, are filled with water, with the water of different valleys coming from different sources; a dam is built wherever the waters of adjacent valleys are about to meet; water is injected and dams are built continuously until all peaks are submerged, at which point the positions of the built dams delimit the segmented objects, each of which is one recognized peanut plant.
The beneficial technical effects of the invention are as follows: the invention not only greatly saves labour and time costs but also improves accuracy and efficiency, realizing rapid and accurate extraction and recognition of field peanut seedling status information. In addition, UAV remote sensing enables comprehensive monitoring of large peanut planting areas, offering more possibilities for peanut genetic research and breeding. The research results of the invention provide a novel and efficient method for studying the phenotypic traits of peanut crops and screening varieties, and a useful reference for applying UAV remote sensing technology in agriculture.
Drawings
Fig. 1 is a flowchart of the method for rapidly extracting the field emergence status of peanuts using unmanned aerial vehicle remote sensing images.
Fig. 2 is the RGB image of scene 1 collected by the unmanned aerial vehicle in the experiment of the invention.
Fig. 3 is the EGI image obtained for scene 1 in the experiment of the invention.
Fig. 4 is the superimposed RGB and EGI image obtained for scene 1 in the experiment of the invention.
Fig. 5 is the distance transform map obtained for scene 1 in the experiment of the invention.
Fig. 6 is a partial view of the peanut recognition results obtained for scene 1 in the experiment of the invention.
Fig. 7 is the RGB image of scene 2 collected by the unmanned aerial vehicle in the experiment of the invention.
Fig. 8 is the EGI image obtained for scene 2 in the experiment of the invention.
Fig. 9 is the superimposed RGB and EGI image obtained for scene 2 in the experiment of the invention.
Fig. 10 is the distance transform map obtained for scene 2 in the experiment of the invention.
Fig. 11 is a partial view of the peanut recognition results obtained for scene 2 in the experiment of the invention.
Detailed Description
The invention is described in further detail below with reference to the attached drawings and detailed description:
as shown in fig. 1, the method for rapidly extracting the seedling emergence condition of the field peanuts by using the remote sensing image of the unmanned aerial vehicle comprises the following steps:
Step 1: according to the field data collection requirements of the test area, a DJI Mavic 3 unmanned aerial vehicle, which is small and stable in flight, is selected to carry a multispectral sensor and collect image data, comprising an image sequence and the corresponding airborne position and orientation system data. To ensure the stability of the UAV in flight and the accuracy of data collection, the test area is continuously monitored under environmental conditions of stable illumination intensity, clear cloudless weather and little or no wind. Meanwhile, to reduce information error, a period with sufficient light is selected for flight, so the flight period is controlled between 11:00 and 13:00. The airborne position and orientation system data collected together with the image sequence are referred to as POS data for short. POS data are acquired by the UAV's POS system, also called an IMU/DGPS system, which consists of a dynamic differential GPS (DGPS), an inertial measurement unit (IMU), a main control computer system and corresponding post-processing software. This high-precision position and attitude measurement system can measure the position and attitude of the sensor in real time during imaging, so that high-precision exterior orientation elements of the images can be obtained without performing aerial triangulation with ground control points.
Step 2: image data processing is performed with DJI Terra mapping software to generate orthophoto data. The specific process is as follows:
step 2.1, first, the high-definition image sequence collected by the unmanned aerial vehicle is stitched with DJI Terra in combination with the corresponding POS data;
step 2.2, the spatial attitude at the moment each image was taken is restored from the POS data and the corresponding high-definition image sequence collected by the UAV;
step 2.3, finally, orthophoto generation is performed; a high-quality orthophoto is generated to ensure accuracy and consistency, comprising operations such as aerial triangulation, orthorectification and mosaicking, which are completed with existing software.
This image processing procedure improves the accuracy and comparability of the images and provides a reliable basis for subsequent data analysis and processing.
Step 3: a region-of-interest image of a specified size is randomly cropped from the orthophoto, and the excess green index image (EGI image for short) of the test-area field remote sensing image is calculated according to the excess green index:

EGI = 2 × G - R - B (1);

where EGI is the excess green index value, G is the green band of the unmanned aerial vehicle image, R is the red band, and B is the blue band.
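The index computation above can be sketched in a few lines of NumPy (a hypothetical helper, not from the patent; the `excess_green` name and the assumption that the bands arrive as an H×W×3 array in R, G, B order are mine; if the bands are first normalised to [0, 1], the 0.3 plant-area threshold of step 6 applies directly):

```python
import numpy as np

def excess_green(rgb):
    """Excess green index per pixel: EGI = 2*G - R - B.

    `rgb` is an H x W x 3 array with the red, green and blue bands of
    the UAV image in that order.
    """
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 2.0 * g - r - b
```

A pure-green pixel scores high while a grey soil pixel scores near zero, which is what makes the index usable for separating peanut plants from soil.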
Step 4: the test area generally contains two ground-cover types, peanut plants and soil, and positions with larger values in the EGI image are generally peanut plant positions. Therefore, local-maximum detection is performed on the calculated EGI image to preliminarily determine key points of the peanut plant positions. In agricultural planting practice, adjacent peanuts are planted about 15 cm apart, so repeated local maxima are deleted and merged. The specific steps are as follows:
Step 4.1: a circular sliding window (radius set to 20 pixels) is defined, through which local maxima are located. If the centre pixel value of the sliding window is greater than all other pixel values in the window, the centre pixel position is a local maximum.
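A minimal brute-force sketch of the circular-window peak search of step 4.1 (pure NumPy; discarding plateaus where the maximum is not unique is my assumption, not stated in the patent):

```python
import numpy as np

def local_maxima(img, radius=20):
    """Return (row, col) positions whose value is strictly greater than
    every other pixel inside a circular window of the given radius."""
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    circle = ys ** 2 + xs ** 2 <= radius ** 2     # circular neighbourhood
    h, w = img.shape
    peaks = []
    for y in range(h):
        for x in range(w):
            # clip the window (and its circular mask) to the image bounds
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            win = img[y0:y1, x0:x1]
            m = circle[y0 - y + radius:y1 - y + radius,
                       x0 - x + radius:x1 - x + radius]
            vals = win[m]
            # strict maximum: the centre value occurs exactly once
            if img[y, x] == vals.max() and (vals == img[y, x]).sum() == 1:
                peaks.append((y, x))
    return peaks
```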
Step 4.2: owing to image noise and the characteristics of peanut plants, several maximum points may be detected on the same plant, which would cause over-segmentation in subsequent operations. Therefore, Euclidean clustering is performed on the detected local maxima (with the distance threshold d_t set to 15 cm) to obtain a series of maximum-point clusters. The distances between the points of each cluster are then judged: if the farthest distance between points in a cluster is less than d_t, the points in the cluster are averaged to obtain the peanut plant position; if the farthest distance is greater than or equal to d_t, the cluster is divided into several clusters by farthest-point sampling until the farthest distance between points in every new cluster is less than d_t, and all points within each cluster are averaged as the plant position. The core idea of farthest-point sampling is to keep the sampled points as far apart as possible; the specific process is as follows:
step 4.2.1, the Euclidean distances between all points of the current cluster are computed, the two farthest points are taken as cluster centres, and nearest-neighbour assignment divides the current cluster into two new clusters C1 and C2;
step 4.2.2, it is judged whether the farthest distance between points within C1 and within C2 is less than d_t; if the farthest distance within a cluster is less than d_t, its points are averaged to obtain the peanut plant position; otherwise, farthest-point sampling is executed iteratively until the farthest distance between points within every new cluster is less than d_t.
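Steps 4.2 to 4.2.2 can be sketched as follows (function names and the greedy single-linkage formulation of the Euclidean clustering are my choices; the patent only prescribes the 15 cm threshold, Euclidean clustering and farthest-point splitting):

```python
import numpy as np

def euclidean_cluster(points, d_t):
    """Greedy single-linkage Euclidean clustering: a point joins any
    cluster containing a point closer than d_t; touched clusters merge."""
    clusters = []
    for p in map(np.asarray, points):
        merged = None
        for c in clusters:
            if any(np.linalg.norm(p - q) < d_t for q in c):
                if merged is None:
                    c.append(p)
                    merged = c
                else:                     # p bridges two clusters: merge
                    merged.extend(c)
                    c.clear()
        clusters = [c for c in clusters if c]
        if merged is None:
            clusters.append([p])
    return clusters

def split_by_farthest(cluster, d_t):
    """Recursively split a cluster whose diameter >= d_t around its two
    farthest points, then return the mean of each final sub-cluster."""
    pts = np.asarray(cluster, dtype=float)
    dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    if dist.max() < d_t:
        return [pts.mean(axis=0)]         # one key point per tight cluster
    i, j = np.unravel_index(dist.argmax(), dist.shape)
    near_i = (np.linalg.norm(pts - pts[i], axis=1)
              <= np.linalg.norm(pts - pts[j], axis=1))
    return (split_by_farthest(pts[near_i], d_t)
            + split_by_farthest(pts[~near_i], d_t))
```

In the pipeline, `d_t` would be 15 cm converted into pixels at the orthophoto's ground sampling distance.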
Step 5: the EGI image, being computed from the image spectrum, is easily affected by noise points (such as weeds). Since peanut plants are planted in straight rows along ridges, straight lines are fitted to the preliminarily determined key points of the peanut plant positions using the random sample consensus (RANSAC) algorithm, constraining the plant positions to a linear arrangement, eliminating the influence of noise points and determining the peanut plant area.
Formula (2) is used as the mathematical expression of the fitted straight line, and the RANSAC algorithm is used to fit straight lines to the key points of the preliminarily determined peanut plant positions:

a·x + b·y + c = 0 (2);

where x and y represent the abscissa and ordinate of the current peanut plant point, and a, b and c represent the parameters of the fitted straight line.
Then the distance from each plant point to the fitted straight line is calculated according to formula (3):

d_i = |a·x_i + b·y_i + c| / sqrt(a^2 + b^2) (3);

where x_i and y_i represent the abscissa and ordinate of any peanut plant point p_i.
If the distance to the fitted straight line is less than a specified threshold, the peanut plant point is considered to be in the current peanut plant area.
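Formulas (2) and (3) combined with RANSAC can be sketched as follows (`ransac_line` is a hypothetical helper; the iteration count and inlier threshold are illustrative defaults, not values from the patent):

```python
import random
import numpy as np

def ransac_line(points, n_iter=200, threshold=0.5, seed=0):
    """Fit a line ax + by + c = 0 (formula (2)) by random sample consensus.

    Repeatedly picks two points, builds the line through them and counts
    inliers by the point-line distance of formula (3); returns the line
    with the most inliers and the inlier mask.
    """
    rng = random.Random(seed)
    pts = np.asarray(points, dtype=float)
    candidates = list(map(tuple, pts))
    best_line, best_inliers = None, None
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(candidates, 2)
        a, b = y2 - y1, x1 - x2            # normal vector of the line
        c = -(a * x1 + b * y1)
        norm = np.hypot(a, b)
        if norm == 0.0:                    # coincident points, skip
            continue
        # formula (3): d_i = |a*x_i + b*y_i + c| / sqrt(a^2 + b^2)
        d = np.abs(a * pts[:, 0] + b * pts[:, 1] + c) / norm
        inliers = d < threshold
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_line = (a / norm, b / norm, c / norm)
            best_inliers = inliers
    return best_line, best_inliers
```

Running this once per ridge (and removing each row's inliers before fitting the next) would implement the row-by-row constraint the patent describes.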
Step 6: a distance transform map is generated based on the local maxima. In general, positions where the EGI image exceeds 0.3 belong to the peanut plant area. The distance from each pixel of the extracted peanut plant area to its nearest local-maximum point is computed to generate the distance transform map. The distance transform map has the same size as the EGI image; non-plant pixels are assigned 0 and each plant pixel is assigned its distance to the nearest local-maximum point, so the local minima of the distance transform map correspond to the local maxima of the EGI image.
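Step 6 can be sketched in brute-force NumPy as follows (a real implementation would likely use dedicated distance-transform tooling such as `scipy.ndimage.distance_transform_edt`; the 0.3 threshold assumes a normalised EGI image, as above):

```python
import numpy as np

def distance_map(egi, maxima, threshold=0.3):
    """Distance transform map of step 6: each plant pixel (EGI above the
    threshold) is assigned its Euclidean distance to the nearest
    local-maximum point; non-plant pixels are assigned 0."""
    h, w = egi.shape
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.asarray(maxima, dtype=float)          # (n, 2) rows of (y, x)
    # distance from every pixel to every maximum point, then the minimum
    d = np.sqrt((ys[..., None] - pts[:, 0]) ** 2
                + (xs[..., None] - pts[:, 1]) ** 2)
    dmap = d.min(axis=-1)
    dmap[egi <= threshold] = 0.0                   # zero out non-plant area
    return dmap
```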
Step 7: the generated distance transform map is regarded as a terrain surface, in which peaks correspond to high pixel values and valleys to low pixel values. The distance transform map is used as the input image of the marker-controlled watershed algorithm, and the positions of the local maxima obtained in step 4 are used as its peanut plant markers; image segmentation then divides the peanut plant area into a series of segmented objects, each of which is one recognized peanut plant. The specific steps are as follows:
step 7.1, the marker-controlled watershed algorithm realizes segmentation by simulating terrain being flooded by water. The isolated valleys (i.e. the local minima of the distance transform map) are filled with water of different colours, the water from different valleys having different colours. As the water level rises, the waters from adjacent valleys begin to merge; to avoid this, dams are built where the waters of adjacent valleys are about to meet, preventing them from converging. Water is injected and dams are built continuously until all peaks are submerged, and the segmentation result is determined by the built dams;
step 7.2, the role of the peanut plant markers is to decide, during dam building, which valleys' waters may merge and which may not, thereby avoiding over-segmentation.
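The dam-building analogy of steps 7.1 and 7.2 corresponds to marker-controlled watershed segmentation. A minimal priority-flooding sketch follows (production code would more likely call `skimage.segmentation.watershed` with a marker image; masking out the zero-valued background is omitted here for brevity):

```python
import heapq
import numpy as np

def marker_watershed(height, markers):
    """Marker-controlled watershed by priority flooding.

    Water rises from the marked valleys (lowest pixels first); each
    flooding front keeps its marker's label, so fronts from different
    plants stop where they meet, which is where the 'dams' of step 7.1
    would be built.
    """
    labels = np.zeros(height.shape, dtype=int)
    heap = []
    for lab, (y, x) in enumerate(markers, start=1):
        labels[y, x] = lab
        heapq.heappush(heap, (height[y, x], y, x))
    h, w = height.shape
    while heap:
        _, y, x = heapq.heappop(heap)
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and labels[ny, nx] == 0:
                labels[ny, nx] = labels[y, x]   # flood with the same label
                heapq.heappush(heap, (height[ny, nx], ny, nx))
    return labels
```

Fed with the distance transform map of step 6 (whose local minima are the plant key points) and one marker per remaining local maximum, two touching plants are separated along the ridge between their markers.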
To demonstrate the feasibility and superiority of the invention, 16,348 peanut seedlings from 267 plots across different regions, different years and different growth stages were collected and identified.
Figs. 2-6 show the experimental process for scene 1. Fig. 2 is the RGB image of scene 1 collected by the UAV. The EGI image shown in Fig. 3 is calculated with formula (1) in step 3, and a pseudo-colour image is generated according to pixel value: the range of EGI pixel values is divided equally into four subintervals corresponding to blue, green, yellow and red, where blue represents low values, red represents high values, and green and yellow represent intermediate values. The colour EGI image is superimposed on the RGB image collected by the UAV to generate the overlay shown in Fig. 4. The distance transform map shown in Fig. 5 is generated in step 6, and the final peanut recognition result shown in Fig. 6 is obtained in step 7.
Figs. 7-11 show the experimental process for scene 2. Fig. 7 is the RGB image of scene 2 collected by the UAV. The EGI image shown in Fig. 8 is calculated with formula (1) in step 3, and a pseudo-colour image is generated according to pixel value in the same four-subinterval blue/green/yellow/red scheme as for scene 1. The colour EGI image is superimposed on the RGB image collected by the UAV to generate the overlay shown in Fig. 9. The distance transform map shown in Fig. 10 is generated in step 6, and the final peanut recognition result shown in Fig. 11 is obtained in step 7.
Each peanut seedling in the recognition results of the experiment is marked, realizing rapid and accurate extraction and recognition of peanut seedling status information and demonstrating that the invention is feasible and efficient.
It should be understood that the above description is not intended to limit the invention to the particular embodiments disclosed, and that the invention is intended to cover the modifications, adaptations, additions and alternatives falling within its spirit and scope.
Claims (8)
1. A method for rapidly extracting the field emergence status of peanuts using unmanned aerial vehicle remote sensing images, characterized by comprising the following steps:
step 1, selecting an unmanned aerial vehicle to carry a multispectral sensor to acquire image data according to the field data acquisition requirement of a test area;
step 2, processing the image data with the DJI Terra mapping software to generate orthophoto data;
step 3, randomly cutting out a region-of-interest image of the designated size from the orthophoto, and calculating the over-green vegetation index image of the test area field remote sensing image according to the over-green vegetation index;
step 4, calculating local maxima by using the calculated over-green vegetation index image, and preliminarily determining key points of the peanut plant position information;
step 5, performing straight line fitting on the preliminarily determined key points of the peanut plant position information by using the random sample consensus (RANSAC) algorithm to determine the peanut plant area;
step 6, generating a distance transformation graph based on the local maximum value;
step 7, taking the generated distance transformation map as the input image of a marked watershed algorithm, taking the positions of the remaining local maxima as the peanut plant markers of the marked watershed algorithm, and performing image segmentation to divide the peanut plant area into a series of segmentation objects, each segmentation object being one recognized peanut plant.
2. The method for rapidly extracting the seedling emergence condition of the field peanuts by utilizing the remote sensing images of the unmanned aerial vehicle according to claim 1, wherein in the step 1, the model of the unmanned aerial vehicle is DJI Mavic 3; the acquired image data comprise an image sequence and the aviation positioning and orientation system data corresponding to the image sequence; flights are conducted under stable illumination intensity, clear and cloudless weather, and little or no wind; and the flight period is controlled to 11:00-13:00 noon.
3. The method for rapidly extracting the seedling emergence condition of the field peanuts by using the unmanned aerial vehicle remote sensing image according to claim 1, wherein the specific process of the step 2 is as follows:
step 2.1, firstly, using the DJI Terra mapping software to stitch the high-definition image sequences acquired by the unmanned aerial vehicle in combination with the aviation positioning and orientation system data corresponding to the image sequences;
step 2.2, restoring the space attitude of the image shooting moment based on the aviation positioning and orientation system data and the corresponding high-definition image sequence acquired by the unmanned aerial vehicle;
and 2.3, finally, performing orthophoto generation, which comprises aerial triangulation, orthorectification and mosaicking operations.
4. The method for rapidly extracting the emergence condition of peanuts in a field by using the remote sensing image of an unmanned aerial vehicle according to claim 1, wherein in the step 3, the calculation formula of the over-green vegetation index is as follows:
EGI = 2G - R - B (1);
wherein EGI is the value of the over-green vegetation index; G is the green band of the unmanned aerial vehicle image; R is the red band of the unmanned aerial vehicle image; and B is the blue band of the unmanned aerial vehicle image.
5. The method for rapidly extracting the seedling emergence condition of the field peanuts by using the unmanned aerial vehicle remote sensing image according to claim 1, wherein the specific steps of the step 4 are as follows:
step 4.1, defining a circular sliding window with a radius of 20 pixels, and locating local maxima through the circular sliding window; if the current central pixel value of the sliding window is larger than all other pixel values in the window, the current central pixel position is a local maximum;
step 4.2, setting a distance threshold d to 15 cm and performing Euclidean clustering on the local maximum points to obtain a series of maximum point clusters; then judging the distances between the points in each point cluster: if the furthest distance between the points in the current point cluster is less than d, averaging the points in the current point cluster to obtain the position of the peanut plant; if the furthest distance between the points in the current point cluster is greater than or equal to d, dividing the current point cluster into a plurality of point clusters by farthest point sampling until the furthest distance between the points in each new point cluster is less than d; all points in each point cluster are then averaged to serve as the key points of peanut plant position information.
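The circular-sliding-window search of step 4.1 can be sketched as follows; the brute-force scan, the tiny radius of the demo, and the strict-inequality handling of ties are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def local_maxima(img, radius):
    """Scan a circular sliding window over img and return the (row, col)
    positions whose value is strictly larger than every other pixel in
    the window (step 4.1 of the method, computed brute-force)."""
    h, w = img.shape
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    disk = yy ** 2 + xx ** 2 <= radius ** 2  # circular window mask
    peaks = []
    for r in range(radius, h - radius):
        for c in range(radius, w - radius):
            win = img[r - radius:r + radius + 1, c - radius:c + radius + 1]
            vals = win[disk]
            # centre is a local maximum if every other windowed pixel is smaller
            if np.sum(vals < img[r, c]) == vals.size - 1:
                peaks.append((r, c))
    return peaks

demo = np.zeros((7, 7))
demo[3, 3] = 5.0
peaks = local_maxima(demo, radius=2)  # -> [(3, 3)]
```

In practice the radius would be the 20 pixels of claim 5, and the peaks would then be clustered and split by farthest point sampling as described in step 4.2.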
6. The method for rapidly extracting the seedling emergence condition of the field peanuts by using the unmanned aerial vehicle remote sensing image according to claim 1, wherein the specific steps of the step 5 are as follows:
using formula (2) as the mathematical expression of the fitted straight line, fitting straight lines to the preliminarily determined key points of peanut plant position information by using the random sample consensus (RANSAC) algorithm;
a·x + b·y + c = 0 (2);
wherein x and y respectively represent the abscissa and the ordinate of the current peanut plant point, and a, b and c represent the parameters of different fitted straight lines;
then, the distance from each plant point to the fitted straight line is calculated according to formula (3);
d_i = |a·x_i + b·y_i + c| / √(a² + b²) (3);
wherein x_i and y_i respectively represent the abscissa and the ordinate of any peanut plant point p_i;
if the distance from a peanut plant point to the fitted straight line is less than the specified threshold, the point is considered to belong to the current peanut plant area.
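The RANSAC line fit of claim 6 can be sketched in plain Python; the iteration count, the inlier threshold, and the normalisation of the line parameters (so that a² + b² = 1 and formula (3) reduces to |a·x + b·y + c|) are assumptions for illustration:

```python
import math
import random

def fit_line(p, q):
    """Line a*x + b*y + c = 0 through two points, normalised to a^2 + b^2 = 1."""
    (x1, y1), (x2, y2) = p, q
    a, b = y2 - y1, x1 - x2
    norm = math.hypot(a, b)
    a, b = a / norm, b / norm
    c = -(a * x1 + b * y1)
    return a, b, c

def point_line_distance(pt, line):
    a, b, c = line
    x, y = pt
    return abs(a * x + b * y + c)  # valid because a^2 + b^2 = 1

def ransac_line(points, threshold, iterations=200, seed=0):
    """Repeatedly fit a line through two random points and keep the line
    with the most inliers (points closer than the threshold)."""
    rng = random.Random(seed)
    best_line, best_inliers = None, []
    for _ in range(iterations):
        p, q = rng.sample(points, 2)
        line = fit_line(p, q)
        inliers = [pt for pt in points
                   if point_line_distance(pt, line) < threshold]
        if len(inliers) > len(best_inliers):
            best_line, best_inliers = line, inliers
    return best_line, best_inliers

# four collinear plant points on y = x plus one off-row outlier
pts = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0), (3.0, 3.0), (10.0, 0.0)]
line, inliers = ransac_line(pts, threshold=0.5)
```

The inlier set recovered here is the planting row; the outlier at (10, 0) is rejected, which is the behaviour the claim relies on to delimit the peanut plant area.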
7. The method for rapidly extracting the seedling emergence condition of the field peanuts by utilizing the unmanned aerial vehicle remote sensing image according to claim 1, wherein in the step 6, positions with an over-green vegetation index value greater than 0.3 are judged as the peanut plant area, and the distance from each peanut plant area pixel to its nearest local maximum point is extracted to generate the distance transformation map; in the generated distance transformation map, the non-peanut-plant area is assigned 0, and each pixel of the peanut plant area is assigned the distance to its nearest local maximum point.
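A brute-force sketch of the distance transformation map of claim 7: plant-area pixels (over-green index above 0.3) receive the distance to their nearest local maximum, all other pixels are set to 0. Pure Python, for illustration only; a real implementation would use a proper distance transform:

```python
import math

def distance_map(egi, maxima, threshold=0.3):
    """For each pixel with over-green index above the threshold, store the
    Euclidean distance to the nearest local-maximum point; assign 0 to
    non-plant pixels (claim 7), computed brute-force."""
    h, w = len(egi), len(egi[0])
    out = [[0.0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            if egi[r][c] > threshold:
                out[r][c] = min(math.hypot(r - mr, c - mc)
                                for mr, mc in maxima)
    return out

egi = [[0.5, 0.1],
       [0.5, 0.5]]
dmap = distance_map(egi, maxima=[(0, 0)])
```

The resulting map is zero on background, zero at the plant centres themselves, and grows toward the plant edges, which is exactly the surface the watershed of claim 8 floods.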
8. The method for rapidly extracting the seedling emergence condition of the field peanuts by utilizing the unmanned aerial vehicle remote sensing image according to claim 1, wherein in the step 7, the marked watershed algorithm realizes segmentation by simulating the process of water flow submerging the terrain, specifically: the distance transformation map is regarded as a terrain surface, in which high pixel values are peaks and low pixel values are valleys; water is injected to fill the isolated valleys, which are the local minima in the distance transformation map, so that rivers rise from different valleys; a dam is built wherever the rivers of adjacent valleys are about to meet; water injection and dam building continue until all peaks are submerged, at which point the dams delimit the segmented objects, namely each peanut plant obtained through recognition.
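The flooding process of claim 8 can be sketched as a priority-flood, marker-controlled watershed. In this sketch each basin grows outward from its marker in order of decreasing distance value, 4-connectivity is assumed, and restricting the labels to the plant area (masking out zero-distance background) is omitted for brevity; these are choices the claim leaves open, not the patent's implementation:

```python
import heapq

def marker_watershed(dist, markers):
    """Marker-controlled watershed by priority flooding: starting from the
    marker pixels (the surviving local maxima), unlabelled pixels are
    claimed in order of decreasing distance value, so each basin grows
    downhill from its peak; two basins meet where a dam would be built."""
    h, w = len(dist), len(dist[0])
    labels = [[0] * w for _ in range(h)]
    heap = []
    for lab, (r, c) in enumerate(markers, start=1):
        labels[r][c] = lab
        heapq.heappush(heap, (-dist[r][c], r, c))  # negate: flood high first
    while heap:
        _, r, c = heapq.heappop(heap)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and labels[nr][nc] == 0:
                labels[nr][nc] = labels[r][c]  # claimed by this basin
                heapq.heappush(heap, (-dist[nr][nc], nr, nc))
    return labels

# one row with two plant centres: markers at columns 0 and 5
dist = [[3.0, 2.0, 1.0, 1.0, 2.0, 3.0]]
labels = marker_watershed(dist, markers=[(0, 0), (0, 5)])
```

Each connected region of equal label in the output corresponds to one segmented object, i.e. one recognized peanut plant; library implementations such as scikit-image's marker watershed follow the same principle.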
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311524223.3A CN117274844B (en) | 2023-11-16 | 2023-11-16 | Rapid extraction method for field peanut seedling emergence condition by using unmanned aerial vehicle remote sensing image |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117274844A true CN117274844A (en) | 2023-12-22 |
CN117274844B CN117274844B (en) | 2024-02-06 |
Family
ID=89216296
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311524223.3A Active CN117274844B (en) | 2023-11-16 | 2023-11-16 | Rapid extraction method for field peanut seedling emergence condition by using unmanned aerial vehicle remote sensing image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117274844B (en) |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101922914A (en) * | 2010-08-27 | 2010-12-22 | 中国林业科学研究院资源信息研究所 | Crown information extraction method and system based on high spatial resolution remote sense image |
CN107516068A (en) * | 2017-07-26 | 2017-12-26 | 福州大学 | A kind of method that single ebon hat is extracted in the high resolution image from unmanned plane |
CN108038487A (en) * | 2017-11-22 | 2018-05-15 | 湖北工业大学 | Plant leaf blade discriminating conduct based on image segmentation with Fusion Features |
CN109084690A (en) * | 2018-10-31 | 2018-12-25 | 杨凌禾讯遥感科技有限公司 | Crop plant height calculation method based on unmanned plane visual remote sensing |
CN109283937A (en) * | 2018-09-18 | 2019-01-29 | 广东省智能制造研究所 | A kind of plant protection based on unmanned plane sprays the method and system of operation |
US20200068816A1 (en) * | 2018-09-05 | 2020-03-05 | University Of Georgia Research Foundation, Inc. | Peanut maturity grading systems and methods |
AU2020100917A4 (en) * | 2020-06-02 | 2020-07-09 | Guizhou Institute Of Pratacultural | A Method For Extracting Vegetation Information From Aerial Photographs Of Synergistic Remote Sensing Images |
CN112016388A (en) * | 2020-07-08 | 2020-12-01 | 珠江水利委员会珠江水利科学研究院 | Vegetation information extraction method based on visible light waveband unmanned aerial vehicle remote sensing image |
CN112036102A (en) * | 2019-05-15 | 2020-12-04 | 北京兆易创新科技股份有限公司 | Clock control method and device for multi-bit register |
CN112131952A (en) * | 2020-08-26 | 2020-12-25 | 航天信德智图(北京)科技有限公司 | Corn seedling stage plant number information extraction based on unmanned aerial vehicle remote sensing image |
CN115690081A (en) * | 2022-11-15 | 2023-02-03 | 电子科技大学长三角研究院(湖州) | Tree counting method, system, storage medium, computer equipment and terminal |
CN115760885A (en) * | 2022-11-09 | 2023-03-07 | 南京林业大学 | High-canopy-density wetland forest parameter extraction method based on consumption-level unmanned aerial vehicle image |
WO2023029373A1 (en) * | 2021-08-30 | 2023-03-09 | 广东海洋大学 | High-precision farmland vegetation information extraction method |
Non-Patent Citations (2)
Title |
---|
RAJU, M.N.; NATARAJAN, K.; VASAMSETTY, C.S.: "Remote Sensing Image Classification Using CNN-LSTM Model", Revue d'Intelligence Artificielle, pages 147 - 153 *
LIU Shuaibing et al.: "Method for extracting plant number information of maize at seedling stage based on UAV remote sensing images", Transactions of the Chinese Society of Agricultural Engineering, pages 69 - 77 *
Also Published As
Publication number | Publication date |
---|---|
CN117274844B (en) | 2024-02-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Reza et al. | Rice yield estimation based on K-means clustering with graph-cut segmentation using low-altitude UAV images | |
CN108009542B (en) | Weed image segmentation method in rape field environment | |
Che et al. | Estimation of maize plant height and leaf area index dynamics using an unmanned aerial vehicle with oblique and nadir photography | |
AU2020103026A4 (en) | A Single Tree Crown Segmentation Algorithm Based on Super-pixels and Topological Features in Aerial Images | |
CN106971167B (en) | Crop growth analysis method and system based on unmanned aerial vehicle platform | |
CN110163138B (en) | Method for measuring and calculating wheat tillering density based on multispectral remote sensing image of unmanned aerial vehicle | |
CN112418188A (en) | Crop growth whole-course digital assessment method based on unmanned aerial vehicle vision | |
CN111767865A (en) | Method for inverting mangrove forest biomass by using aerial image and laser data | |
CN111340826A (en) | Single tree crown segmentation algorithm for aerial image based on superpixels and topological features | |
CN111881816B (en) | Long-time-sequence river and lake ridge culture area monitoring method | |
CN102542560B (en) | Method for automatically detecting density of rice after transplantation | |
Xu et al. | Classification method of cultivated land based on UAV visible light remote sensing | |
Liu et al. | Estimating maize seedling number with UAV RGB images and advanced image processing methods | |
CN111353402B (en) | Remote sensing extraction method for oil palm forest | |
CN116228041A (en) | Method for calculating carbon index after ecological restoration of abandoned mine | |
CN111339953B (en) | Clustering analysis-based mikania micrantha monitoring method | |
CN117274844B (en) | Rapid extraction method for field peanut seedling emergence condition by using unmanned aerial vehicle remote sensing image | |
CN115760885A (en) | High-canopy-density wetland forest parameter extraction method based on consumption-level unmanned aerial vehicle image | |
CN114778476A (en) | Alfalfa cotton field soil water content monitoring model based on unmanned aerial vehicle remote sensing | |
Kothawade et al. | High throughput canopy characterization of a commercial apple orchard using aerial RGB imagery | |
CN110288645B (en) | Terrain surface area calculation method based on slope direction constraint | |
CN113554675A (en) | Edible fungus yield estimation method based on unmanned aerial vehicle visible light remote sensing | |
CN113487636A (en) | Automatic extraction method for plant height and line spacing of wide-ridge crops based on laser radar | |
Yusof et al. | Land clearing, preparation and drone monitoring using Red-Green-Blue (RGB) and thermal imagery for Smart Durian Orchard Management project | |
Feng et al. | Evaluation of cotton stand count using UAV-based hyperspectral imagery |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||