CN117274844A - Rapid extraction method for field peanut seedling emergence condition by using unmanned aerial vehicle remote sensing image


Info

Publication number: CN117274844A (granted as CN117274844B)
Application number: CN202311524223.3A
Authority: CN (China)
Prior art keywords: image, unmanned aerial vehicle, peanut plant, peanut
Other languages: Chinese (zh)
Other versions: CN117274844B (en)
Inventors: 杨俊涛, 袁悦茹, 李国卫, 白波, 李振海, 杨吉顺, 邱天凤, 牛召兴
Current Assignee: Shandong University of Science and Technology
Original Assignee: Shandong University of Science and Technology
Application filed by Shandong University of Science and Technology
Priority to CN202311524223.3A
Legal status: Active (granted)

Classifications

    • G06V20/17: Terrestrial scenes taken from planes or by drones
    • G06V10/22: Image preprocessing by selection of a specific region containing or referencing a pattern; locating or processing of specific regions to guide the detection or recognition
    • G06V10/25: Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V10/26: Segmentation of patterns in the image field; cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; detection of occlusion
    • G06V10/762: Recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G06V20/188: Vegetation

Abstract

The invention discloses a method for rapidly extracting the seedling emergence condition of field peanuts by using unmanned aerial vehicle remote sensing images, which belongs to the technical field of agricultural informatization and comprises the following steps: selecting an unmanned aerial vehicle carrying a multispectral sensor to collect image data according to the field data collection requirements of a test area; processing the image data to generate orthophoto data; calculating an excess-green vegetation index (EGI) image; performing local maximum calculation on the EGI image to preliminarily determine key points of peanut plant position information; performing straight-line fitting on the preliminarily determined key points to determine the peanut plant area; generating a distance transform map; and performing image segmentation to divide the peanut plant area into a series of segmented objects, each segmented object being one recognized peanut plant. The invention realizes rapid and accurate extraction and recognition of field peanut seedling condition information.

Description

Rapid extraction method for field peanut seedling emergence condition by using unmanned aerial vehicle remote sensing image
Technical Field
The invention belongs to the technical field of agricultural informatization, and particularly relates to a method for rapidly extracting seedling emergence conditions of field peanuts by using unmanned aerial vehicle remote sensing images.
Background
Peanut is a major crop and one of the important oil crops, and its high and stable yield is important for guaranteeing national oil-crop security and promoting sustainable agricultural development. With the rapid development of seed science and genetics, research on crop genes has become increasingly important. Peanut is one of the important cash crops, and optimizing peanut planting and peanut genotypes is of great significance. Due to differences in soil conditions, machine performance and operator experience, problems of uneven peanut emergence, seedling shortage and broken ridges often occur, which affect peanut yield and quality to different degrees. Therefore, peanut seedling condition information needs to be monitored, so that varieties suited to local peanut growth and development and to the local growing season can be bred.
Traditional peanut seedling information extraction methods rely on manual observation and measurement, which require a great deal of manpower and time. In recent years, unmanned aerial vehicle technology and image processing technology have developed rapidly: large-scale peanut crop images can be obtained by an unmanned aerial vehicle carrying a high-resolution camera, and seedling information can be extracted with image processing and analysis algorithms, effectively solving the long cycle and low efficiency of traditional manual extraction methods. Therefore, automatic acquisition of peanut seedling condition information from unmanned aerial vehicle remote sensing images is a current research hotspot.
The prior art has the following defects. Differences in field soil preparation and machine performance can lead to inconsistent seeding depth, which affects peanut seed germination and thus causes inconsistent emergence times. At the peanut seedling stage, plant sizes differ greatly: large plants tend to adhere to one another, while smaller plants appear blurred in the unmanned aerial vehicle remote sensing image. Therefore, it is necessary to explore a more efficient, rapid and accurate method of acquiring peanut seedling condition information using unmanned aerial vehicle visible-light remote sensing imagery, so as to infer peanut genotypes and screen out high-quality genes.
Disclosure of Invention
In order to solve the above problems, the invention provides a method for rapidly extracting the seedling emergence condition of field peanuts by using unmanned aerial vehicle remote sensing images. Compared with the traditional manual observation method, the unmanned aerial vehicle visible-light remote sensing image technology has obvious advantages in extracting field peanut seedling condition information.
The technical scheme of the invention is as follows:
a method for rapidly extracting the emergence condition of peanuts in a field by using unmanned aerial vehicle remote sensing images comprises the following steps:
step 1, selecting an unmanned aerial vehicle to carry a multispectral sensor to acquire image data according to the field data acquisition requirement of a test area;
step 2, processing the image data with DJI Terra software to generate orthophoto data;
step 3, randomly cropping a region-of-interest image of specified size from the orthophoto, and calculating an excess-green vegetation index (EGI) image of the test-area field remote sensing image according to the excess-green vegetation index;
step 4, performing local maximum calculation on the calculated EGI image, and preliminarily determining key points of peanut plant position information;
step 5, performing straight-line fitting on the preliminarily determined key points of peanut plant position information by using a random sample consensus algorithm to determine the peanut plant area;
step 6, generating a distance transform map based on the local maxima;
and step 7, taking the generated distance transform map as the input image of a marker-controlled watershed algorithm, taking the positions of the remaining local maxima as the peanut plant markers of the algorithm, and performing image segmentation to divide the peanut plant area into a series of segmented objects, each segmented object being one recognized peanut plant.
Further, in step 1, the selected unmanned aerial vehicle model is DJI Mavic 3; the acquired image data comprise an image sequence and the corresponding aerial position and orientation system data; flights are conducted under environmental conditions of stable illumination intensity, clear cloudless weather and little or no wind; and the flight period is controlled between 11:00 and 13:00.
Further, the specific process of step 2 is as follows:
step 2.1, first, stitching the high-definition image sequence acquired by the unmanned aerial vehicle with DJI Terra software in combination with the corresponding aerial position and orientation system data;
step 2.2, restoring the spatial attitude at the moment of image capture based on the aerial position and orientation system data and the corresponding high-definition image sequence acquired by the unmanned aerial vehicle;
and step 2.3, finally, performing orthophoto generation, comprising aerial triangulation, orthorectification and mosaicking operations.
Further, in step 3, the excess-green vegetation index is calculated as:

EGI = 2 × G - R - B (1)

where EGI is the excess-green vegetation index value, G is the green band of the unmanned aerial vehicle image, R is the red band of the unmanned aerial vehicle image, and B is the blue band of the unmanned aerial vehicle image.
Further, the specific steps of step 4 are as follows:
step 4.1, defining a circular sliding window with a radius of 20 pixels, and locating local maxima through it: if the current central pixel value of the sliding window is larger than all other pixel values in the window, the current central pixel position is a local maximum;
step 4.2, setting a distance threshold d_thr = 15 cm and performing Euclidean clustering on the local maximum points to obtain a series of maximum point clusters; then judging the distances between the points in each cluster: if the farthest distance between points in the current cluster is less than d_thr, averaging the points in the current cluster to obtain the peanut plant position; if the farthest distance between points in the current cluster is greater than or equal to d_thr, dividing the current cluster into several clusters by the farthest-point sampling method until the farthest distance between points in each new cluster is less than d_thr; and averaging all points in each cluster as key points of peanut plant position information.
Further, the specific steps of step 5 are as follows:
formula (2) is used as the mathematical expression of the fitted straight line, and a random sample consensus algorithm fits a straight line to the key points of the preliminarily determined peanut plant position information:

a·x + b·y + c = 0 (2)

where x and y respectively represent the abscissa and ordinate of the current peanut plant point, and a, b and c are the parameters of the fitted straight line;
then, the distance d_i from each plant point to the fitted straight line is calculated according to formula (3):

d_i = |a·x_i + b·y_i + c| / √(a² + b²) (3)

where x_i and y_i respectively represent the abscissa and ordinate of any peanut plant point p_i;
if the distance from a peanut plant point to the fitted straight line is less than the specified threshold, the point is considered to be in the current peanut plant area.
Further, in step 6, positions with an excess-green vegetation index value greater than 0.3 are judged to be the peanut plant area, and the distance from each peanut plant pixel to its nearest local maximum point is extracted to generate the distance transform map; in the generated distance transform map, the non-peanut-plant area is assigned 0, and each pixel of the peanut plant area is assigned its distance to the nearest local maximum point.
In step 7, the marker-controlled watershed algorithm realizes segmentation by simulating the process of water flooding terrain, specifically as follows: the distance transform map is regarded as a terrain surface, with high pixel values as peaks and low pixel values as valleys; isolated valleys, which are the local minima of the distance transform map, are filled with water from different rivers; a dam is built wherever the rivers of adjacent valleys are about to meet; and water is injected and dams are built continuously until all peaks are submerged, at which point the built dams delimit each segmented object, i.e., each recognized peanut plant.
The invention has the following beneficial technical effects. The invention not only greatly saves labor and time costs but also improves accuracy and efficiency, realizing rapid and accurate extraction and recognition of field peanut seedling condition information. In addition, unmanned aerial vehicle remote sensing enables comprehensive monitoring of large peanut planting areas, opening further possibilities for peanut gene research and breeding. The research results of the invention provide a novel and efficient method for studying peanut phenotypic traits and screening genes, and a useful reference for applying unmanned aerial vehicle remote sensing in agriculture.
Drawings
Fig. 1 is a flowchart of the method for rapidly extracting the seedling emergence condition of field peanuts by using unmanned aerial vehicle remote sensing images.
Fig. 2 is the RGB image of scene 1 acquired by the unmanned aerial vehicle in an experiment of the invention.
Fig. 3 is the EGI image obtained for scene 1 in the experiment of the invention.
Fig. 4 is the superimposed RGB and EGI image obtained for scene 1 in the experiment of the invention.
Fig. 5 is the distance transform map obtained for scene 1 in the experiment of the invention.
Fig. 6 shows part of the peanut recognition results obtained for scene 1 in the experiment of the invention.
Fig. 7 is the RGB image of scene 2 acquired by the unmanned aerial vehicle in the experiment of the invention.
Fig. 8 is the EGI image obtained for scene 2 in the experiment of the invention.
Fig. 9 is the superimposed RGB and EGI image obtained for scene 2 in the experiment of the invention.
Fig. 10 is the distance transform map obtained for scene 2 in the experiment of the invention.
Detailed Description
The invention is described in further detail below with reference to the accompanying drawings and specific embodiments:
As shown in Fig. 1, the method for rapidly extracting the seedling emergence condition of field peanuts by using unmanned aerial vehicle remote sensing images comprises the following steps:
Step 1, according to the field data collection requirements of the test area, a DJI Mavic 3 unmanned aerial vehicle, small in size and stable in flight, is selected to carry a multispectral sensor and collect image data, comprising an image sequence and the corresponding aerial position and orientation system data. To ensure flight stability and data accuracy, continuous flight monitoring of the test area is conducted under environmental conditions of stable illumination intensity, clear cloudless weather and little or no wind. Meanwhile, to reduce information errors, a period with sufficient light is chosen, so the flight period is controlled between 11:00 and 13:00. The aerial position and orientation system data collected with the image sequence are referred to as POS data for short. POS data are acquired by the POS system of the unmanned aerial vehicle, also called an IMU/DGPS system, which consists of a dynamic differential GPS (DGPS), an inertial measurement unit (IMU), a master control computer system and corresponding post-processing software. The DGPS/IMU combination forms a high-precision position and attitude measurement system that can measure the position and attitude of the sensor in real time during imaging, so that high-precision exterior orientation elements of the images can be obtained without performing aerial triangulation with ground control points.
Step 2, image data processing is performed with DJI Terra software to generate orthophoto data. The specific process is as follows:
Step 2.1, first, the high-definition image sequence acquired by the unmanned aerial vehicle is stitched with DJI Terra software in combination with the corresponding POS data;
Step 2.2, the spatial attitude at the moment of image capture is restored based on the POS data and the corresponding high-definition image sequence acquired by the unmanned aerial vehicle;
Step 2.3, finally, orthophoto generation is performed to produce a high-quality orthophoto and ensure accuracy and consistency; this comprises aerial triangulation, orthorectification, mosaicking and similar operations, completed with existing software.
This image processing procedure improves the accuracy and comparability of the images and provides a reliable basis for subsequent data analysis and processing.
Step 3, a region-of-interest image of specified size is randomly cropped from the orthophoto, and the excess-green vegetation index image (EGI image for short) of the test-area field remote sensing image is calculated according to the excess-green vegetation index:

EGI = 2 × G - R - B (1)

where EGI is the excess-green vegetation index value, G is the green band of the unmanned aerial vehicle image, R is the red band of the unmanned aerial vehicle image, and B is the blue band of the unmanned aerial vehicle image.
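As an illustration only, formula (1) can be implemented in a few lines of Python; the 8-bit input, the R-G-B band order and the [0, 1] normalization below are assumptions, since the patent does not fix these conventions.

```python
import numpy as np

def excess_green_index(rgb: np.ndarray) -> np.ndarray:
    """Compute the excess-green vegetation index EGI = 2*G - R - B (formula (1)).

    rgb: H x W x 3 array in R, G, B band order (assumed convention).
    Bands are scaled to [0, 1] so the 0.3 plant threshold of step 6 is meaningful.
    """
    img = rgb.astype(np.float64) / 255.0  # assumes 8-bit imagery
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    return 2.0 * g - r - b
```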
Step 4, the test area mainly contains two ground objects, peanut plants and soil, and positions with larger values in the EGI image are generally the positions of peanut plants. Therefore, local maximum calculation is performed on the calculated EGI image to preliminarily determine key points of peanut plant position information. In agricultural planting practice, adjacent peanut plants are spaced about 15 cm apart, so repeated local maxima are deleted and merged, as illustrated in the sketch following step 4.2.2. The specific steps are as follows:
Step 4.1, a circular sliding window (radius set to 20 pixels) is defined, and local maxima are located through it: if the central pixel value of the sliding window is larger than all other pixel values in the window, the central pixel position is a local maximum.
In step 4.2, image noise and the characteristics of a peanut plant may cause multiple maximum points to be detected for the same plant, which would lead to over-segmentation in subsequent operations. Therefore, Euclidean clustering (with the distance threshold d_thr set to 15 cm) is performed on the detected local maxima to obtain a series of maximum point clusters. The distances between the points in each cluster are then examined: if the farthest distance between points in a cluster is less than d_thr, the points in that cluster are averaged to obtain the peanut plant position; if the farthest distance between points in a cluster is greater than or equal to d_thr, the cluster is divided into several clusters by the farthest-point sampling method until the farthest distance between points in each new cluster is less than d_thr, and all points within each cluster are averaged as the peanut plant position. The core idea of farthest-point sampling is to keep the sampled points as far apart as possible; the specific process is as follows:
Step 4.2.1, the Euclidean distances between all points of the current cluster are calculated, the two points farthest apart are taken as cluster centres, and nearest-neighbour assignment divides the current cluster into two new clusters C1 and C2.
Step 4.2.2, it is judged whether the farthest distance between the points within each of C1 and C2 is less than d_thr. If so, the points in that cluster are averaged to obtain the peanut plant position; if not, farthest-point sampling is executed iteratively until the farthest distance between the points in every new cluster is less than d_thr.
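The following Python sketch illustrates steps 4.1 and 4.2 under stated assumptions: "Euclidean clustering" is read as single-linkage clustering at threshold d_thr, the 0.3 EGI cutoff from step 6 is reused to suppress background maxima, and d_thr is expressed in pixels (the patent's 15 cm would be converted via the ground sampling distance). Function and variable names are illustrative, not from the patent.

```python
import numpy as np
from scipy import ndimage
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import cdist, pdist

def local_maxima(egi: np.ndarray, radius: int = 20) -> np.ndarray:
    """Step 4.1: (row, col) positions whose value is the maximum of a
    circular window of the given radius (20 px in the patent)."""
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    footprint = (yy ** 2 + xx ** 2) <= radius ** 2        # circular window
    max_img = ndimage.maximum_filter(egi, footprint=footprint, mode="nearest")
    mask = (egi == max_img) & (egi > 0.3)                 # 0.3 cutoff reused from step 6
    return np.argwhere(mask)

def merge_maxima(points: np.ndarray, d_thr: float) -> np.ndarray:
    """Step 4.2: Euclidean (single-linkage) clustering at threshold d_thr,
    then recursive farthest-point splitting of clusters whose diameter is
    still >= d_thr; each final cluster is averaged into one plant key point."""
    if len(points) < 2:
        return points.astype(float)
    labels = fcluster(linkage(points, method="single"), t=d_thr, criterion="distance")
    keypoints = []
    for lbl in np.unique(labels):
        stack = [points[labels == lbl]]
        while stack:
            cluster = stack.pop()
            if len(cluster) == 1 or pdist(cluster).max() < d_thr:
                keypoints.append(cluster.mean(axis=0))    # plant position
            else:
                d = cdist(cluster, cluster)               # farthest pair as centres
                i, j = np.unravel_index(d.argmax(), d.shape)
                near_i = d[i] <= d[j]                     # nearest-neighbour assignment
                stack.extend((cluster[near_i], cluster[~near_i]))
    return np.array(keypoints)
```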
Step 5, the EGI image calculated from the image spectra is easily affected by noise points (such as weeds). Since peanut plants are sown in rows along ridges and thus arranged linearly, straight-line fitting is performed on the preliminarily determined key points of peanut plant position information with the random sample consensus (RANSAC) algorithm, constraining the peanut plant positions to a linear arrangement, eliminating the influence of noise points, and determining the peanut plant area.
Formula (2) is used as the mathematical expression of the fitted straight line, and the RANSAC algorithm fits a straight line to the key points of the preliminarily determined peanut plant position information:

a·x + b·y + c = 0 (2)

where x and y respectively represent the abscissa and ordinate of the current peanut plant point, and a, b and c are the parameters of the fitted straight line.
Then, the distance d_i from each plant point to the fitted straight line is calculated according to formula (3):

d_i = |a·x_i + b·y_i + c| / √(a² + b²) (3)

where x_i and y_i respectively represent the abscissa and ordinate of any peanut plant point p_i.
If the distance from a peanut plant point to the fitted straight line is less than the specified threshold, the point is considered to be in the current peanut plant area.
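A minimal RANSAC sketch for formulas (2) and (3), assuming the key points are given as (x, y) pixel coordinates (the (row, col) output of the previous sketch would be column-swapped first); the iteration count and inlier threshold are free parameters that the patent leaves unspecified.

```python
import numpy as np

def ransac_line(points: np.ndarray, dist_thr: float, n_iter: int = 1000,
                seed=None):
    """Fit a*x + b*y + c = 0 (formula (2)) to plant key points with RANSAC.

    points: (N, 2) array of (x, y) key points.
    Returns the best (a, b, c) and the boolean inlier mask from formula (3).
    """
    rng = np.random.default_rng(seed)
    best_model, best_inliers = None, np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        p1, p2 = points[rng.choice(len(points), size=2, replace=False)]
        # line through the two sampled points
        a, b = p1[1] - p2[1], p2[0] - p1[0]
        c = p1[0] * p2[1] - p2[0] * p1[1]
        norm = np.hypot(a, b)
        if norm == 0:                                    # degenerate sample
            continue
        # formula (3): point-to-line distance for every key point
        dist = np.abs(a * points[:, 0] + b * points[:, 1] + c) / norm
        inliers = dist < dist_thr
        if inliers.sum() > best_inliers.sum():
            best_model, best_inliers = np.array([a, b, c]), inliers
    return best_model, best_inliers
```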
Step 6, a distance transform map is generated based on the local maxima. In general, positions where the EGI value is greater than 0.3 belong to the peanut plant area. The distance transform map is generated by calculating the distance from each pixel of the extracted peanut plant area to its nearest local maximum point. The distance transform map has the same size as the EGI image; the non-peanut-plant area in the map is assigned 0, and each pixel of the peanut plant area is assigned its distance to the nearest local maximum point, so the local minima of the distance transform map correspond to the local maxima of the EGI image.
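Step 6 can be sketched with a Euclidean distance transform that measures, for every pixel, the distance to the nearest retained key point and then zeroes the non-plant area; the 0.3 threshold comes from the text above, while snapping key points to integer pixel indices is an assumption.

```python
import numpy as np
from scipy import ndimage

def distance_map(egi: np.ndarray, keypoints: np.ndarray, thr: float = 0.3) -> np.ndarray:
    """Step 6: distance transform map the same size as the EGI image.

    keypoints: (N, 2) array of (row, col) plant key points from step 4.
    Non-plant pixels (EGI <= thr) are assigned 0; plant pixels get the
    Euclidean distance to the nearest key point.
    """
    seeds = np.ones(egi.shape, dtype=bool)
    idx = np.round(keypoints).astype(int)               # snap key points to pixels
    seeds[idx[:, 0], idx[:, 1]] = False                 # zeros at key-point positions
    dist = ndimage.distance_transform_edt(seeds)        # distance to nearest key point
    dist[egi <= thr] = 0.0                              # non-peanut area assigned 0
    return dist
```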
Step 7, the generated distance transform map is regarded as a terrain surface in which peaks are high pixel values and valleys are low pixel values. The distance transform map is used as the input image of the marker-controlled watershed algorithm, the positions of the local maxima obtained in step 4 are used as the peanut plant markers of the algorithm, and image segmentation divides the peanut plant area into a series of segmented objects, each segmented object being one recognized peanut plant. The specific steps are as follows:
Step 7.1, the marker-controlled watershed algorithm realizes segmentation by simulating the process of water flooding terrain. Isolated valleys (i.e., the local minima of the distance transform map) are filled with water of different colors, one color per valley. As the water level rises, the rivers of adjacent valleys begin to merge; to prevent this, dams are built at the positions where the rivers of adjacent valleys are about to meet. Water is injected and dams are built continuously until all peaks are submerged, and the segmentation result is determined by the built dams.
Step 7.2, the role of the peanut plant markers is to decide, during dam construction, which valleys' rivers may merge and which may not, thereby avoiding over-segmentation.
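A sketch of the marker-controlled watershed of step 7 using scikit-image: labelling each key point as its own marker and restricting flooding to the plant mask follows the text above, while the choice of library is ours rather than the patent's.

```python
import numpy as np
from skimage.segmentation import watershed

def segment_plants(dist: np.ndarray, keypoints: np.ndarray, egi: np.ndarray,
                   thr: float = 0.3) -> np.ndarray:
    """Step 7: flood the distance transform map from one marker per plant
    key point; every positive label in the result is one peanut plant."""
    markers = np.zeros(dist.shape, dtype=np.int32)
    idx = np.round(keypoints).astype(int)
    markers[idx[:, 0], idx[:, 1]] = np.arange(1, len(idx) + 1)  # one marker per plant
    mask = egi > thr                       # flood only the peanut plant area
    return watershed(dist, markers=markers, mask=mask)
```

Chaining these sketches (EGI image, key points, RANSAC filtering, distance map, watershed) reproduces the pipeline of Fig. 1 under the stated assumptions.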
To demonstrate the feasibility and superiority of the invention, 16348 peanut seedlings from 267 plots across different regions, different years and different growth stages were collected and identified.
Figs. 2-6 show the experimental process for scene 1. Fig. 2 is the RGB image of scene 1 acquired by the unmanned aerial vehicle. The EGI image shown in Fig. 3 is calculated with formula (1) in step 3, and a color image is generated according to pixel value: the range of EGI pixel values is divided equally into four subintervals corresponding to blue, green, yellow and red, where blue represents low values, red represents high values, and green and yellow represent intermediate values. The color EGI image is superimposed on the RGB image acquired by the unmanned aerial vehicle to generate the superimposed image shown in Fig. 4. The distance transform map shown in Fig. 5 is generated in step 6, and the final peanut recognition result shown in Fig. 6 is obtained in step 7.
Figs. 7-11 show the experimental process for scene 2. Fig. 7 is the RGB image of scene 2 acquired by the unmanned aerial vehicle. The EGI image shown in Fig. 8 is calculated with formula (1) in step 3 and rendered in color in the same way as for scene 1. The color EGI image is superimposed on the RGB image acquired by the unmanned aerial vehicle to generate the superimposed image shown in Fig. 9. The distance transform map shown in Fig. 10 is generated in step 6, and the final peanut recognition result shown in Fig. 11 is obtained in step 7.
Each peanut seedling in the recognition results of the experiment is marked, realizing rapid and accurate extraction and recognition of peanut seedling condition information and demonstrating that the invention is feasible and efficient.
It should be understood that the above description is not intended to limit the invention to the particular embodiments disclosed; rather, the invention is intended to cover modifications, adaptations, additions and alternatives falling within its spirit and scope.

Claims (8)

1. A method for rapidly extracting the seedling emergence condition of field peanuts by using unmanned aerial vehicle remote sensing images, characterized by comprising the following steps:
step 1, selecting an unmanned aerial vehicle carrying a multispectral sensor to acquire image data according to the field data collection requirements of a test area;
step 2, processing the image data with DJI Terra software to generate orthophoto data;
step 3, randomly cropping a region-of-interest image of specified size from the orthophoto, and calculating an excess-green vegetation index (EGI) image of the test-area field remote sensing image according to the excess-green vegetation index;
step 4, performing local maximum calculation on the calculated EGI image, and preliminarily determining key points of peanut plant position information;
step 5, performing straight-line fitting on the preliminarily determined key points of peanut plant position information by using a random sample consensus algorithm to determine the peanut plant area;
step 6, generating a distance transform map based on the local maxima;
and step 7, taking the generated distance transform map as the input image of a marker-controlled watershed algorithm, taking the positions of the remaining local maxima as the peanut plant markers of the algorithm, and performing image segmentation to divide the peanut plant area into a series of segmented objects, each segmented object being one recognized peanut plant.
2. The method for rapidly extracting the seedling emergence condition of field peanuts by using unmanned aerial vehicle remote sensing images according to claim 1, characterized in that in step 1, the unmanned aerial vehicle model is DJI Mavic 3; the acquired image data comprise an image sequence and the corresponding aerial position and orientation system data; flights are conducted under environmental conditions of stable illumination intensity, clear cloudless weather and little or no wind; and the flight period is controlled between 11:00 and 13:00.
3. The method for rapidly extracting the seedling emergence condition of field peanuts by using unmanned aerial vehicle remote sensing images according to claim 1, characterized in that the specific process of step 2 is as follows:
step 2.1, first, stitching the high-definition image sequence acquired by the unmanned aerial vehicle with DJI Terra software in combination with the corresponding aerial position and orientation system data;
step 2.2, restoring the spatial attitude at the moment of image capture based on the aerial position and orientation system data and the corresponding high-definition image sequence acquired by the unmanned aerial vehicle;
and step 2.3, finally, performing orthophoto generation, comprising aerial triangulation, orthorectification and mosaicking operations.
4. The method for rapidly extracting the seedling emergence condition of field peanuts by using unmanned aerial vehicle remote sensing images according to claim 1, characterized in that in step 3, the excess-green vegetation index is calculated as:

EGI = 2 × G - R - B (1)

where EGI is the excess-green vegetation index value, G is the green band of the unmanned aerial vehicle image, R is the red band of the unmanned aerial vehicle image, and B is the blue band of the unmanned aerial vehicle image.
5. The method for rapidly extracting the seedling emergence condition of field peanuts by using unmanned aerial vehicle remote sensing images according to claim 1, characterized in that the specific steps of step 4 are as follows:
step 4.1, defining a circular sliding window with a radius of 20 pixels, and locating local maxima through it: if the current central pixel value of the sliding window is larger than all other pixel values in the window, the current central pixel position is a local maximum;
step 4.2, setting a distance threshold d_thr = 15 cm and performing Euclidean clustering on the local maximum points to obtain a series of maximum point clusters; then judging the distances between the points in each cluster: if the farthest distance between points in the current cluster is less than d_thr, averaging the points in the current cluster to obtain the peanut plant position; if the farthest distance between points in the current cluster is greater than or equal to d_thr, dividing the current cluster into several clusters by the farthest-point sampling method until the farthest distance between points in each new cluster is less than d_thr; and averaging all points in each cluster as key points of peanut plant position information.
6. The method for rapidly extracting the seedling emergence condition of field peanuts by using unmanned aerial vehicle remote sensing images according to claim 1, characterized in that the specific steps of step 5 are as follows:
using formula (2) as the mathematical expression of the fitted straight line, and fitting a straight line to the key points of the preliminarily determined peanut plant position information with the random sample consensus algorithm:

a·x + b·y + c = 0 (2)

where x and y respectively represent the abscissa and ordinate of the current peanut plant point, and a, b and c are the parameters of the fitted straight line;
then calculating the distance d_i from each plant point to the fitted straight line according to formula (3):

d_i = |a·x_i + b·y_i + c| / √(a² + b²) (3)

where x_i and y_i respectively represent the abscissa and ordinate of any peanut plant point p_i;
and if the distance from a peanut plant point to the fitted straight line is less than the specified threshold, considering the point to be in the current peanut plant area.
7. The method for rapidly extracting the seedling emergence condition of field peanuts by using unmanned aerial vehicle remote sensing images according to claim 1, characterized in that in step 6, positions with an excess-green vegetation index value greater than 0.3 are judged to be the peanut plant area, and the distance from each peanut plant pixel to its nearest local maximum point is extracted to generate the distance transform map; in the generated distance transform map, the non-peanut-plant area is assigned 0, and each pixel of the peanut plant area is assigned its distance to the nearest local maximum point.
8. The method for rapidly extracting the seedling emergence condition of field peanuts by using unmanned aerial vehicle remote sensing images according to claim 1, characterized in that in step 7, the marker-controlled watershed algorithm realizes segmentation by simulating the process of water flooding terrain, specifically: the distance transform map is regarded as a terrain surface, with high pixel values as peaks and low pixel values as valleys; isolated valleys, which are the local minima of the distance transform map, are filled with water from different rivers; a dam is built wherever the rivers of adjacent valleys are about to meet; and water is injected and dams are built continuously until all peaks are submerged, at which point the built dams delimit each segmented object, i.e., each recognized peanut plant.
CN202311524223.3A 2023-11-16 2023-11-16 Rapid extraction method for field peanut seedling emergence condition by using unmanned aerial vehicle remote sensing image Active CN117274844B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311524223.3A CN117274844B (en) 2023-11-16 2023-11-16 Rapid extraction method for field peanut seedling emergence condition by using unmanned aerial vehicle remote sensing image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311524223.3A CN117274844B (en) 2023-11-16 2023-11-16 Rapid extraction method for field peanut seedling emergence condition by using unmanned aerial vehicle remote sensing image

Publications (2)

Publication Number Publication Date
CN117274844A true CN117274844A (en) 2023-12-22
CN117274844B CN117274844B (en) 2024-02-06

Family

ID=89216296

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311524223.3A Active CN117274844B (en) 2023-11-16 2023-11-16 Rapid extraction method for field peanut seedling emergence condition by using unmanned aerial vehicle remote sensing image

Country Status (1)

Country Link
CN (1) CN117274844B (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101922914A (en) * 2010-08-27 2010-12-22 中国林业科学研究院资源信息研究所 Crown information extraction method and system based on high spatial resolution remote sense image
CN107516068A (en) * 2017-07-26 2017-12-26 福州大学 A kind of method that single ebon hat is extracted in the high resolution image from unmanned plane
CN108038487A (en) * 2017-11-22 2018-05-15 湖北工业大学 Plant leaf blade discriminating conduct based on image segmentation with Fusion Features
US20200068816A1 (en) * 2018-09-05 2020-03-05 University Of Georgia Research Foundation, Inc. Peanut maturity grading systems and methods
CN109283937A (en) * 2018-09-18 2019-01-29 广东省智能制造研究所 A kind of plant protection based on unmanned plane sprays the method and system of operation
CN109084690A (en) * 2018-10-31 2018-12-25 杨凌禾讯遥感科技有限公司 Crop plant height calculation method based on unmanned plane visual remote sensing
CN112036102A (en) * 2019-05-15 2020-12-04 北京兆易创新科技股份有限公司 Clock control method and device for multi-bit register
AU2020100917A4 (en) * 2020-06-02 2020-07-09 Guizhou Institute Of Pratacultural A Method For Extracting Vegetation Information From Aerial Photographs Of Synergistic Remote Sensing Images
CN112016388A (en) * 2020-07-08 2020-12-01 珠江水利委员会珠江水利科学研究院 Vegetation information extraction method based on visible light waveband unmanned aerial vehicle remote sensing image
CN112131952A (en) * 2020-08-26 2020-12-25 航天信德智图(北京)科技有限公司 Corn seedling stage plant number information extraction based on unmanned aerial vehicle remote sensing image
WO2023029373A1 (en) * 2021-08-30 2023-03-09 广东海洋大学 High-precision farmland vegetation information extraction method
CN115760885A (en) * 2022-11-09 2023-03-07 南京林业大学 High-canopy-density wetland forest parameter extraction method based on consumption-level unmanned aerial vehicle image
CN115690081A (en) * 2022-11-15 2023-02-03 电子科技大学长三角研究院(湖州) Tree counting method, system, storage medium, computer equipment and terminal

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
RAJU, M.N.; NATARAJAN, K.; VASAMSETTY, C.S.: "Remote Sensing Image Classification Using CNN-LSTM Model", Revue d'Intelligence Artificielle, pages 147-153 *
LIU Shuaibing et al.: "Method for extracting maize plant number information at the seedling stage based on UAV remote sensing images", Transactions of the Chinese Society of Agricultural Engineering, pages 69-77 *

Also Published As

Publication number Publication date
CN117274844B (en) 2024-02-06

Similar Documents

Publication Publication Date Title
Reza et al. Rice yield estimation based on K-means clustering with graph-cut segmentation using low-altitude UAV images
CN108009542B (en) Weed image segmentation method in rape field environment
Che et al. Estimation of maize plant height and leaf area index dynamics using an unmanned aerial vehicle with oblique and nadir photography
AU2020103026A4 (en) A Single Tree Crown Segmentation Algorithm Based on Super-pixels and Topological Features in Aerial Images
CN106971167B (en) Crop growth analysis method and system based on unmanned aerial vehicle platform
CN110163138B (en) Method for measuring and calculating wheat tillering density based on multispectral remote sensing image of unmanned aerial vehicle
CN112418188A (en) Crop growth whole-course digital assessment method based on unmanned aerial vehicle vision
CN111767865A (en) Method for inverting mangrove forest biomass by using aerial image and laser data
CN111340826A (en) Single tree crown segmentation algorithm for aerial image based on superpixels and topological features
CN111881816B (en) Long-time-sequence river and lake ridge culture area monitoring method
CN102542560B (en) Method for automatically detecting density of rice after transplantation
Xu et al. Classification method of cultivated land based on UAV visible light remote sensing
Liu et al. Estimating maize seedling number with UAV RGB images and advanced image processing methods
CN111353402B (en) Remote sensing extraction method for oil palm forest
CN116228041A (en) Method for calculating carbon index after ecological restoration of abandoned mine
CN111339953B (en) Clustering analysis-based mikania micrantha monitoring method
CN117274844B (en) Rapid extraction method for field peanut seedling emergence condition by using unmanned aerial vehicle remote sensing image
CN115760885A (en) High-canopy-density wetland forest parameter extraction method based on consumption-level unmanned aerial vehicle image
CN114778476A (en) Alfalfa cotton field soil water content monitoring model based on unmanned aerial vehicle remote sensing
Kothawade et al. High throughput canopy characterization of a commercial apple orchard using aerial RGB imagery
CN110288645B (en) Terrain surface area calculation method based on slope direction constraint
CN113554675A (en) Edible fungus yield estimation method based on unmanned aerial vehicle visible light remote sensing
CN113487636A (en) Automatic extraction method for plant height and line spacing of wide-ridge crops based on laser radar
Yusof et al. Land clearing, preparation and drone monitoring using Red-Green-Blue (RGB) and thermal imagery for Smart Durian Orchard Management project
Feng et al. Evaluation of cotton stand count using UAV-based hyperspectral imagery

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant