CN110909652B - Monthly-scale dynamic extraction method of crop planting structure based on texture feature optimization - Google Patents


Info

Publication number
CN110909652B
CN110909652B
Authority
CN
China
Prior art keywords
texture; image; gray level; calculation formula
Prior art date
Legal status
Active
Application number
CN201911122874.3A
Other languages
Chinese (zh)
Other versions
CN110909652A
Inventor
王镕
赵红莉
郝震
蒋云钟
段浩
朱彦儒
李向龙
Current Assignee
China Institute of Water Resources and Hydropower Research
Original Assignee
China Institute of Water Resources and Hydropower Research
Priority date
Filing date
Publication date
Application filed by China Institute of Water Resources and Hydropower Research
Priority to CN201911122874.3A
Publication of CN110909652A
Application granted
Publication of CN110909652B
Status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/10: Terrestrial scenes
    • G06V20/188: Vegetation


Abstract

The invention discloses a monthly-scale dynamic extraction method for crop planting structures based on texture feature optimization, comprising the following steps. S1: determine the spatial extent of the analysis area and prepare data; collect a time-series satellite remote sensing data set at monthly or finer temporal resolution, aggregate it to monthly-scale data, and pre-acquire sample data in the study area. S2: using the preprocessed monthly-scale satellite imagery, compute image texture features based on the gray-level co-occurrence matrix. S3: compute the mean and variance of each texture feature over the measured samples, and quantify how well each feature distinguishes between different sample classes. The method addresses two shortcomings of traditional approaches to extracting crop planting structure information: they neglect the screening of classification features and thereby increase time complexity.

Description

Monthly-scale dynamic extraction method of crop planting structure based on texture feature optimization

Technical Field

The invention relates to the field of remote sensing monitoring of planting structures, in particular to a monthly-scale dynamic extraction method for crop planting structures based on optimized texture features.

Background Art

The crop planting structure reflects how human agricultural production uses agricultural resources across space. It provides essential information for studying crop types, their quantitative composition and spatial distribution, and serves as the basis for adjusting and optimizing crop structure and for the fine-grained management of agricultural water use. Traditional methods for obtaining crop planting structure information classify imagery mainly on texture information computed from the gray-level co-occurrence matrix, but they neglect the screening of classification features, which increases time complexity.

Texture features are widely used in classifying high-resolution imagery. With their clear and detailed information on the structure, shape and texture of ground objects, high-resolution satellite images have rapidly taken over satellite applications in urban planning and regional environmental monitoring and assessment. Dekker (2003) built texture features from SAR data to analyze urban built-up areas, found interactions among the texture feature parameters of crops, and noted that the errors they introduce into classification accuracy must be removed during recognition. Chen Junying et al. (2007) applied the spectral and texture features of IKONOS imagery to decision-tree classification of agricultural vegetation and found that spectral features alone reached 83% accuracy, rising to 91% when texture features were added. Elmqvist (2008) and Zhou (2009) used high-spatial-resolution imagery for land-use mapping and verified the results in the field. Ye Shiping (2008) derived QuickBird texture features from the gray-level co-occurrence matrix, raising the extraction accuracy of built-up areas to 93%. Hou Xuehui et al. (2013) used SPOT-NDVI time series to analyze growth trends in the agro-pastoral transition zone, with results consistent with earlier findings. Liu Kebao et al. (2014) extracted crops from 5 m RapidEye data and found that the 5 m resolution removes a large amount of error during classification and maximizes accuracy. Liu Guodong et al. (2015) completed a remote sensing sampling survey based on the texture of 2 m/8 m Gaofen imagery, combining crop phenological calendars with multi-temporal images to extract crops with a final accuracy better than 80%. Piazza et al. (2016) studied high-resolution imagery for forest mapping, compared several classifiers, and applied the best one to large-area recognition. Song Qian et al. (2016) and Zhang Chao et al. (2016) extracted texture features of specific crops from Gaofen-1 PMS imagery and produced maps of planting-area change through object-oriented classification. Zheng Lijuan (2017) analyzed agricultural applications of the Gaofen satellite series and highlighted the advantages of Gaofen-1 and Gaofen-6 in the extraction process. None of these studies, however, computed and screened the texture features themselves: too many classification features increase the time complexity of classification, and ignoring the correlation among features degrades the final accuracy, making operational remote sensing crop classification difficult.

Summary of the Invention

To solve problems in the prior art, the present invention provides a monthly-scale dynamic extraction method for crop planting structures based on optimized texture features, addressing the fact that traditional methods for obtaining crop planting structure information neglect the screening of classification features and thereby increase time complexity.

The technical scheme adopted by the present invention is a monthly-scale dynamic extraction method of crop planting structure with optimized texture features, comprising the following steps:

S1: Determine the spatial extent of the analysis area and prepare the data: collect a time-series satellite remote sensing data set at monthly or finer temporal resolution, aggregate it to monthly-scale data, and pre-acquire sample data in the study area;

S2: From the preprocessed satellite remote sensing imagery, compute image texture features based on the gray-level co-occurrence matrix, using eight feature quantities to describe the texture characteristics of crops;

S3: Compute the mean and variance of each texture feature over the measured samples, and compute each feature's ability to distinguish between different sample classes;

S4: Establish a selection formula from the distinguishing ability of each feature, use it to determine the optimal number of texture features to include in classification, and stack the selected features into a new image;

S5: Use a random forest classifier to finely identify the crop types in the study area, enabling refined crop management; generate a complete time series of thematic maps of crop spatio-temporal distribution and verify the accuracy.

Preferably, S1 comprises the following steps:

S11: According to the location and extent of the study area, select GF-1 WFV data, which offer both high temporal and high spatial resolution. If this source cannot provide full coverage, consider Sentinel-2, GF-2, Landsat-8 or HJ-1A/B instead, while surveying the crop types in the study area and their respective phenological periods;

S12: Apply remote sensing image preprocessing to the collected data; if substitute data are used, resample them to a unified spatial resolution;
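The resampling in S12 can be sketched with a simple nearest-neighbour index mapping. This is a stripped-down illustration only: an operational pipeline would use a geospatial library (e.g. GDAL or rasterio) that handles georeferencing, and the function name and resolution arguments here are assumptions, not part of the patent.

```python
import numpy as np

def resample_nearest(img, src_res, dst_res):
    """Nearest-neighbour resample of a 2-D band from src_res to dst_res (metres per pixel)."""
    scale = src_res / dst_res                    # >1 upsamples, <1 downsamples
    rows = int(round(img.shape[0] * scale))
    cols = int(round(img.shape[1] * scale))
    # Map each output pixel back to its nearest source pixel.
    r_idx = np.minimum((np.arange(rows) / scale).astype(int), img.shape[0] - 1)
    c_idx = np.minimum((np.arange(cols) / scale).astype(int), img.shape[1] - 1)
    return img[np.ix_(r_idx, c_idx)]
```

For example, a 16 m band resampled to 8 m doubles each grid dimension, while resampling to 32 m halves it, so bands from different sensors can be brought onto one common grid before texture computation.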

S13: Sample collection must consider representativeness, typicality and timeliness. Divide the study area into n blocks of equal area with a regular grid, and select samples of the different crops within each block.
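The gridded sampling of S13 can be sketched as follows, assuming a labeled raster of crop classes. The function names, the n × n grid shape and the per-cell, per-class quota are illustrative assumptions; the patent only requires equal-area blocks with samples drawn in each.

```python
import numpy as np

def grid_cells(n_rows, n_cols, n):
    """Split an n_rows x n_cols image into an n x n regular grid of (row, col) slices."""
    row_edges = np.linspace(0, n_rows, n + 1, dtype=int)
    col_edges = np.linspace(0, n_cols, n + 1, dtype=int)
    return [(slice(row_edges[i], row_edges[i + 1]),
             slice(col_edges[j], col_edges[j + 1]))
            for i in range(n) for j in range(n)]

def sample_per_cell(label_img, n, per_class, rng=None):
    """Draw up to per_class sample pixels of each crop label inside each grid cell."""
    rng = np.random.default_rng(rng)
    samples = []                                   # (row, col, label) tuples
    for cell in grid_cells(*label_img.shape, n):
        block = label_img[cell]
        r0, c0 = cell[0].start, cell[1].start
        for lab in np.unique(block):
            rows, cols = np.nonzero(block == lab)
            k = min(per_class, rows.size)
            idx = rng.choice(rows.size, size=k, replace=False)
            samples += [(r0 + rows[t], c0 + cols[t], int(lab)) for t in idx]
    return samples
```

Sampling every cell separately keeps the sample set spatially representative instead of clustering it in one part of the study area.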

Preferably, S2 comprises the following steps:

S21: Compute the texture feature information from the gray-level co-occurrence matrix (GLCM), which tallies the gray-level co-occurrence between two pixels a given distance apart:

p(i,j) = [p(i,j,d,θ)]

where p(i,j) is the frequency with which the gray-level pair (i,j) occurs for a fixed distance and direction; d is the distance between the two pixels, and θ is the angle of the vector joining them, usually taken as 0°, 45°, 90° and 135°;

S22: When computing texture with the gray-level co-occurrence matrix, select eight feature quantities to characterize the texture:

Mean: reflects the average gray level within the window and the regularity of the texture:

$$\mathrm{Mean}=\sum_{i}\sum_{j} i\,p(i,j)$$

Variance: reflects how far the matrix elements deviate from the mean and the magnitude of gray-level variation:

$$\mathrm{Variance}=\sum_{i}\sum_{j}(i-\mu)^{2}\,p(i,j)$$

where μ is the mean of p(i,j).

Contrast: reflects image sharpness and the depth of the texture grooves:

$$\mathrm{Contrast}=\sum_{i}\sum_{j}(i-j)^{2}\,p(i,j)$$

Homogeneity (inverse difference moment): reflects the smoothness of the image distribution and measures image uniformity:

$$\mathrm{Homogeneity}=\sum_{i}\sum_{j}\frac{p(i,j)}{1+(i-j)^{2}}$$

Dissimilarity: detects the degree of difference within the image:

$$\mathrm{Dissimilarity}=\sum_{i}\sum_{j}|i-j|\,p(i,j)$$

Entropy (the information content of the image): measures the randomness of the image texture; it is a characteristic parameter of the randomness of the gray-level distribution and characterizes the disorder of the image gray levels:

$$\mathrm{Entropy}=-\sum_{i}\sum_{j}p(i,j)\,\ln p(i,j)$$

Angular second moment (uniformity of the gray-level distribution): reflects the uniformity of the image gray-level distribution and the coarseness of the texture:

$$\mathrm{ASM}=\sum_{i}\sum_{j}p(i,j)^{2}$$

Correlation: reflects the similarity of the GLCM elements along the row or column direction:

$$\mathrm{Correlation}=\frac{\sum_{i}\sum_{j}(i-\mu_{x})(j-\mu_{y})\,p(i,j)}{\sigma_{x}\sigma_{y}}$$

where

$$\mu_{x}=\sum_{i} i\sum_{j}p(i,j),\qquad \mu_{y}=\sum_{j} j\sum_{i}p(i,j)$$

$$\sigma_{x}^{2}=\sum_{i}(i-\mu_{x})^{2}\sum_{j}p(i,j),\qquad \sigma_{y}^{2}=\sum_{j}(j-\mu_{y})^{2}\sum_{i}p(i,j)$$
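A minimal NumPy sketch of S21 and S22 follows. The quantization to a fixed number of gray levels and the whole-image (rather than per-window) computation are simplifying assumptions; the feature formulas are the standard Haralick forms given above.

```python
import numpy as np

def glcm(img, d=1, theta=0, levels=16):
    """Normalised gray-level co-occurrence matrix p(i, j) for offset (d, theta in degrees)."""
    dr, dc = {0: (0, d), 45: (-d, d), 90: (-d, 0), 135: (-d, -d)}[theta]
    q = (img.astype(float) / img.max() * (levels - 1)).astype(int)  # quantise gray levels
    P = np.zeros((levels, levels))
    rows, cols = q.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                P[q[r, c], q[r2, c2]] += 1
    return P / P.sum()

def texture_features(P):
    """The eight GLCM statistics of S22, in their standard Haralick forms."""
    lv = P.shape[0]
    i, j = np.mgrid[0:lv, 0:lv]
    mu_x, mu_y = (i * P).sum(), (j * P).sum()
    sd_x = np.sqrt(((i - mu_x) ** 2 * P).sum())
    sd_y = np.sqrt(((j - mu_y) ** 2 * P).sum())
    eps = 1e-12                                 # guards log(0) and division by zero
    return {
        "mean": (i * P).sum(),
        "variance": ((i - mu_x) ** 2 * P).sum(),
        "contrast": ((i - j) ** 2 * P).sum(),
        "homogeneity": (P / (1 + (i - j) ** 2)).sum(),
        "dissimilarity": (np.abs(i - j) * P).sum(),
        "entropy": -(P * np.log(P + eps)).sum(),
        "asm": (P ** 2).sum(),
        "correlation": ((i - mu_x) * (j - mu_y) * P).sum() / (sd_x * sd_y + eps),
    }
```

On a checkerboard image the horizontal contrast is maximal and the correlation strongly negative, while on a constant image contrast vanishes and ASM reaches 1, matching the descriptions of the eight quantities above.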

Preferably, S3 comprises the following steps:

S31: Compute the mean and variance of each feature quantity over the n observations of each sample class:

$$\mu=\frac{1}{n}\sum_{m=1}^{n}x_{m},\qquad \sigma^{2}=\frac{1}{n}\sum_{m=1}^{n}\left(x_{m}-\mu\right)^{2}$$

S32: Build the separability measure for the samples from the Bhattacharyya distance, computing for each texture feature the pairwise distinguishing ability between different samples. In the standard closed form for one-dimensional Gaussian distributions:

$$D_{ij}=\frac{1}{4}\,\frac{(\mu_{i}-\mu_{j})^{2}}{\sigma_{i}^{2}+\sigma_{j}^{2}}+\frac{1}{2}\ln\!\left(\frac{\sigma_{i}^{2}+\sigma_{j}^{2}}{2\,\sigma_{i}\sigma_{j}}\right)$$

where μ denotes the mean of the same feature quantity in each of the two classes on the image, and σ the corresponding standard deviation.
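The pairwise separability of S32 can be sketched with the closed-form Bhattacharyya distance between two one-dimensional Gaussians. Whether the patent uses exactly this variant cannot be confirmed from the scanned text, so treat the formula choice as an assumption.

```python
import math

def bhattacharyya_1d(mu1, sd1, mu2, sd2):
    """Bhattacharyya distance between N(mu1, sd1^2) and N(mu2, sd2^2).

    Used here as the separability D_ij of one texture feature between
    two crop classes described by their sample mean and std (S31).
    """
    v1, v2 = sd1 ** 2, sd2 ** 2
    return (0.25 * (mu1 - mu2) ** 2 / (v1 + v2)
            + 0.5 * math.log((v1 + v2) / (2.0 * math.sqrt(v1 * v2))))
```

The distance is zero for identical class distributions and grows as the class means move apart relative to their spreads, which is exactly the "distinguishing ability" the step needs.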

Preferably, S4 comprises the following steps:

S41: Compute the overall separating ability of each feature quantity across the c crop classes by aggregating its pairwise separabilities:

$$D=\sum_{i=1}^{c-1}\sum_{j=i+1}^{c}D_{ij}$$

where D_ij is the degree of separability between samples i and j;

S42: Sort the feature quantities in descending order of their distinguishing-ability values;

S43: Compute the cumulative sum of the separability of the features, taken in the order produced by S42:

$$S_{k}=\sum_{m=1}^{k}D_{m}$$

S44: Fit a function relating the cumulative separability of the texture features to the corresponding classification accuracy. The relationship is found to follow a logarithmic law, so take the derivative of the fitted function, i.e. the slope at each point:

y = 0.0625 ln(x) + 0.642

R² = 0.9191

$$y'=\frac{0.0625}{x}$$

where x is the cumulative D value corresponding to each feature quantity;

S45: Take the difference between consecutive slopes; once the difference falls below 0.0001, the corresponding point gives the optimal number of texture features, and the result is output. The selection formula is:

D = |y'_i − y'_{i−1}|

S46: Stack the selected texture feature quantities into a new image.
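Steps S42 to S45 can be sketched as follows. The fitted coefficient 0.0625 and the 0.0001 threshold come from the patent; the exact indexing of the stopping rule (how many ranked features are kept once the slopes level off) is an assumption.

```python
def select_feature_count(d_values, a=0.0625, tol=1e-4):
    """Optimal number of texture features to keep (steps S42-S45).

    d_values: the overall separability D of each candidate feature (S41).
    """
    d_sorted = sorted(d_values, reverse=True)   # S42: rank by separability
    cum, slopes = 0.0, []
    for d in d_sorted:                          # S43: cumulative separability S_k
        cum += d
        slopes.append(a / cum)                  # S44: y' = a/x for y = a*ln(x) + b
    for k in range(1, len(slopes)):             # S45: stop once slopes level off
        if abs(slopes[k] - slopes[k - 1]) < tol:
            return k + 1                        # keep the features ranked 1..k+1
    return len(d_sorted)
```

Because the slope of the fitted accuracy curve decays as 1/x, each additional feature buys less accuracy; the rule cuts the list at the point where the marginal gain becomes negligible.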

Preferably, S5 comprises the following steps:

S51: Identify the planting structure of the study area month by month and generate a complete time series of thematic maps of crop spatio-temporal distribution;

S52: Verify the results against the validation samples to obtain the overall classification accuracy and the Kappa coefficient.
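The accuracy check of S52 can be sketched as below. The random forest classification itself would typically come from an off-the-shelf implementation (e.g. sklearn.ensemble.RandomForestClassifier, an assumption rather than something the patent specifies), so only the overall accuracy and Cohen's Kappa are computed here from the validation samples.

```python
import numpy as np

def accuracy_and_kappa(y_true, y_pred, n_classes):
    """Overall accuracy and Cohen's Kappa from validation labels (S52)."""
    cm = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1                                       # confusion matrix
    n = cm.sum()
    p_o = np.trace(cm) / n                                  # observed agreement
    p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2  # chance agreement
    return p_o, (p_o - p_e) / (1 - p_e)                     # kappa = (p_o - p_e)/(1 - p_e)
```

Kappa discounts agreement expected by chance, so it is a stricter score than overall accuracy when class frequencies are unbalanced.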

The beneficial effects of the monthly-scale dynamic extraction method of crop planting structure with optimized texture features of the present invention are as follows:

1. By selecting image texture features before crop planting structure recognition, the present invention not only identifies crop planting structures effectively but also saves time and reduces the computational workload.

2. The technique is fast to compute and widely applicable. It eases the classification limitations of medium- and high-resolution data, raises the classification rate, and improves the accuracy and efficiency of Gaofen-1 imagery in classification, which matters for the operational roll-out of crop planting structure recognition technology.

Brief Description of the Drawings

FIG. 1 is the overall flow chart of the monthly-scale dynamic extraction method of crop planting structure with optimized texture features of the present invention.

FIG. 2 is a flow chart of the sub-steps of S1 of the method.

FIG. 3 is a flow chart of the sub-steps of S2 of the method.

FIG. 4 is a flow chart of the sub-steps of S3 of the method.

FIG. 5 is a flow chart of the sub-steps of S4 of the method.

FIG. 6 is a flow chart of the sub-steps of S5 of the method.

Detailed Description of the Embodiments

Specific embodiments of the present invention are described below to help those skilled in the art understand the invention, but it should be clear that the invention is not limited to the scope of these embodiments. To those of ordinary skill in the art, all changes that fall within the spirit and scope of the invention as defined and determined by the appended claims are obvious, and every invention or creation drawing on the inventive concept is within the scope of protection.

As shown in FIG. 1, a monthly-scale dynamic extraction method of crop planting structure comprises the following steps:

S1: Determine the spatial extent of the analysis area and prepare the data: collect a time-series satellite remote sensing data set at monthly or finer temporal resolution, aggregate it to monthly-scale data, and pre-acquire sample data in the study area;

S2: From the preprocessed satellite remote sensing imagery, compute image texture features based on the gray-level co-occurrence matrix, using eight feature quantities to describe the texture characteristics of crops;

S3: Compute the mean and variance of each texture feature over the measured samples, compute each feature's ability to distinguish between different samples, and finally evaluate the distinguishing ability of the different features across samples;

S4: Establish a selection formula from the distinguishing ability of each feature, use it to determine the optimal number of texture features to include in classification, and stack the selected features into a new image;

S5: Use a random forest classifier to finely identify the crop types in the study area, enabling refined crop management; generate a complete time series of thematic maps of crop spatio-temporal distribution and verify the accuracy.

As shown in FIG. 2, S1 of this embodiment comprises the following steps:

S11: According to the location and extent of the study area, select China's independently developed GF-1 WFV data, which offer both high temporal and high spatial resolution. If this source cannot provide full coverage, consider Sentinel-2, GF-2, Landsat-8 or HJ-1A/B instead, while surveying the crop types in the study area and their respective phenological periods;

S12: Apply remote sensing image preprocessing to the collected data; if substitute data are used, resample them to a unified spatial resolution;

S13: Sample collection must consider representativeness, typicality and timeliness. Divide the study area into n blocks of equal area with a regular grid, and select crop samples within each block.

As shown in FIG. 3, S2 of this embodiment comprises the following steps:

S21: Compute the texture feature information from the gray-level co-occurrence matrix (GLCM), which tallies the gray-level co-occurrence between two pixels a given distance apart and represents the probability distribution of recurring gray levels:

p(i,j) = [p(i,j,d,θ)]

where p(i,j) is the frequency with which the gray-level pair (i,j) occurs for a fixed distance and direction; d is the distance between the two pixels, and θ is the angle of the vector joining them, usually taken as 0°, 45°, 90° and 135°.

When computing texture with the gray-level co-occurrence matrix, eight commonly used feature quantities are selected to characterize it.

Mean: reflects the average gray level within the window and the regularity of the texture.

$$\mathrm{Mean}=\sum_{i}\sum_{j} i\,p(i,j)$$

Variance: reflects how far the matrix elements deviate from the mean and the magnitude of gray-level variation. The larger the gray-level variation and the deviation from the mean, the larger the value.

$$\mathrm{Variance}=\sum_{i}\sum_{j}(i-\mu)^{2}\,p(i,j)$$

where μ is the mean of p(i,j); the value is large when the gray levels in the image vary strongly.

Contrast: reflects image sharpness and the depth of the texture grooves. The deeper the grooves, the larger the value; a larger value indicates a clearer texture.

$$\mathrm{Contrast}=\sum_{i}\sum_{j}(i-j)^{2}\,p(i,j)$$

Homogeneity: also called the inverse difference moment, it reflects the smoothness of the image distribution and measures image uniformity. If the local gray levels of the image are uniform, the inverse difference moment is large.

$$\mathrm{Homogeneity}=\sum_{i}\sum_{j}\frac{p(i,j)}{1+(i-j)^{2}}$$

Dissimilarity: detects the degree of difference within the image; the value is large when the local variation is large.

$$\mathrm{Dissimilarity}=\sum_{i}\sum_{j}|i-j|\,p(i,j)$$

Entropy: measures the randomness of the image texture, i.e. the information content of the image; it is a characteristic parameter of the randomness of the gray-level distribution and characterizes the disorder of the image gray levels. The larger the entropy, the greater the class uncertainty of the sample; conversely, if the gray levels in the image are uniform, the entropy is small.

$$\mathrm{Entropy}=-\sum_{i}\sum_{j}p(i,j)\,\ln p(i,j)$$

Angular second moment: reflects the uniformity of the image gray-level distribution and the coarseness of the texture; it is a measure of how uniform the gray-level distribution is. When the GLCM elements concentrate near the main diagonal, the local gray-level distribution is relatively uniform, the image shows a coarser texture, and the angular second moment is correspondingly large.

$$\mathrm{ASM}=\sum_{i}\sum_{j}p(i,j)^{2}$$

Correlation: reflects the similarity of the GLCM elements along the row or column direction. If the matrix elements are uniform and equal, the correlation is high and the value large; conversely, large differences between elements indicate low correlation.

$$\mathrm{Correlation}=\frac{\sum_{i}\sum_{j}(i-\mu_{x})(j-\mu_{y})\,p(i,j)}{\sigma_{x}\sigma_{y}}$$

where

$$\mu_{x}=\sum_{i} i\sum_{j}p(i,j),\qquad \mu_{y}=\sum_{j} j\sum_{i}p(i,j)$$

$$\sigma_{x}^{2}=\sum_{i}(i-\mu_{x})^{2}\sum_{j}p(i,j),\qquad \sigma_{y}^{2}=\sum_{j}(j-\mu_{y})^{2}\sum_{i}p(i,j)$$

As shown in FIG. 4, S3 of this embodiment comprises the following steps:

S31: Compute the mean and variance of each feature quantity over the n observations of each sample class:

$$\mu=\frac{1}{n}\sum_{m=1}^{n}x_{m},\qquad \sigma^{2}=\frac{1}{n}\sum_{m=1}^{n}\left(x_{m}-\mu\right)^{2}$$

S32: Build the separability measure for the samples from the Bhattacharyya distance, computing for each texture feature the pairwise distinguishing ability between different samples. In the standard closed form for one-dimensional Gaussian distributions:

$$D_{ij}=\frac{1}{4}\,\frac{(\mu_{i}-\mu_{j})^{2}}{\sigma_{i}^{2}+\sigma_{j}^{2}}+\frac{1}{2}\ln\!\left(\frac{\sigma_{i}^{2}+\sigma_{j}^{2}}{2\,\sigma_{i}\sigma_{j}}\right)$$

where μ denotes the mean of the same feature quantity in each of the two classes on the image, and σ the corresponding standard deviation.

As shown in FIG. 5, S4 of this embodiment comprises the following steps:

S41: Compute the overall separating ability of each feature quantity across the c crop classes by aggregating its pairwise separabilities:

$$D=\sum_{i=1}^{c-1}\sum_{j=i+1}^{c}D_{ij}$$

where D_ij is the degree of separability between samples i and j;

S42: Sort the feature quantities in descending order of their distinguishing-ability values;

S43: Compute the cumulative sum of the separability of the features, taken in the order produced by S42:

$$S_{k}=\sum_{m=1}^{k}D_{m}$$

S44: Fit a function relating the cumulative separability of the texture features to the corresponding classification accuracy. The relationship is found to follow a logarithmic law, so take the derivative of the fitted function, i.e. the slope at each point:

y = 0.0625 ln(x) + 0.642

R² = 0.9191

$$y'=\frac{0.0625}{x}$$

where x is the cumulative D value corresponding to each feature quantity;

S45: Take the difference between consecutive slopes; once the difference falls below 0.0001, the corresponding point gives the optimal number of texture features, and the result is output. The selection formula is:

D = |y'_i − y'_{i−1}|

S46: Stack the selected texture feature quantities into a new image.

As shown in FIG. 6, S5 of this embodiment comprises the following steps:

S51:根据随机森林原理对研究区域的地物进行识别;S51: Identify the ground objects in the study area according to the random forest principle;

S52:设置分类器所需各类参数,输入分类样本对研究区域的农作物种植结构进行识别分类,完成动态识别。S52: Set various parameters required by the classifier, input classification samples to identify and classify the crop planting structure in the research area, and complete the dynamic identification.

S53:对研究区进行农作物识别,生成完整时间序列的农作物时空分布专题图;S53: Identify crops in the study area, and generate a thematic map of the spatial and temporal distribution of crops in a complete time series;

S54:根据验证样本进行结果验证,得到总体分类精度以及Kappa系数。S54: Validate the result according to the validation sample, and obtain the overall classification accuracy and Kappa coefficient.
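The accuracy assessment of step S54 can be sketched as follows. `overall_accuracy_and_kappa` is a name chosen here for illustration; it computes both measures from the confusion matrix of reference and predicted labels.

```python
def overall_accuracy_and_kappa(reference, predicted):
    """Overall classification accuracy and Kappa coefficient (step S54)."""
    classes = sorted(set(reference) | set(predicted))
    index = {c: k for k, c in enumerate(classes)}
    n = len(classes)
    # build the confusion matrix: rows = reference, columns = predicted
    cm = [[0] * n for _ in range(n)]
    for r, p in zip(reference, predicted):
        cm[index[r]][index[p]] += 1
    total = len(reference)
    # observed agreement (overall accuracy) = diagonal / total
    oa = sum(cm[k][k] for k in range(n)) / total
    # chance agreement from the row and column marginals
    pe = sum(sum(cm[k]) * sum(row[k] for row in cm) for k in range(n)) / total ** 2
    kappa = (oa - pe) / (1 - pe)
    return oa, kappa
```

A Kappa near 1 indicates agreement well beyond chance; near 0, agreement no better than chance.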

When this embodiment is implemented, the method proposed by the present invention is based on the principle that different crops exhibit different texture features in imagery. The crop types of the experimental area are first analyzed and summarized; then, based on the calculation of the separability and accuracy of the texture feature quantities, the optimal texture feature quantities are selected and combined into a new image; finally, the random forest classification method is applied to extract the crop planting structure information. The technical solution is simple, effective and widely applicable. It can obtain crop spatial distribution information over large areas quickly and accurately, remedies the loss of classification speed and accuracy caused by the excessive number of features participating in classification in traditional methods, improves the computational efficiency and precision of remote sensing monitoring of crop planting structures, and supports the operational deployment of remote sensing technology for monitoring crop planting structures.

Claims (1)

1. A method for monthly-scale dynamic extraction of a crop planting structure based on texture feature optimization, characterized by comprising the following steps:
s1: determining the space range of an analysis area, preparing data, collecting a time series satellite remote sensing data set not larger than a month scale, uniformly processing the time series satellite remote sensing data set into month scale data in time, and simultaneously completing pre-acquisition of sample data in a research area;
s2: calculating image texture characteristics based on a gray level co-occurrence matrix according to the preprocessed satellite remote sensing image data, and describing the texture characteristics of crops by using eight characteristic quantities;
s3: calculating the mean value and the variance of different texture characteristic quantities based on the actually measured samples, and calculating the differentiable capacity of the texture characteristic quantities among different samples;
s4: establishing an optimal calculation formula based on the distinguishing capability of each characteristic quantity, determining the optimal number of the texture characteristic quantities participating in classification by using the formula, and constructing a new image;
s5: finely identifying the crop types in the research area by using a random forest classifier, realizing fine management of the crops, generating a space-time distribution thematic map of the crops with a complete time sequence and verifying the precision;
the S1 comprises the following steps:
s11: selecting GF-1 WFV data with high temporal resolution and high spatial resolution according to the position and range of the research area; if this data source cannot provide complete coverage, Sentinel-2, GF-2, Landsat 8 or HJ-1A/B data are considered as substitutes; meanwhile, the crop types and their respective phenological growth periods are investigated;
s12: performing remote sensing image preprocessing on the collected data; if substitute data are used, resampling them to a uniform spatial resolution;
s13: the representativeness, the typicality and the timeliness of the samples need to be considered for collecting the samples, a regular grid is established to divide a research area into n areas with the same area, and different crop samples are selected in each area;
the S2 comprises the following steps:
s21: calculating texture feature information quantity based on a gray level co-occurrence matrix, counting the gray level correlation between two pixel points at a certain distance according to the gray level co-occurrence matrix GLCM, wherein the expression is as follows:

P(i, j | d, θ)

wherein P(i, j | d, θ) is the frequency at which the same pixel pair occurs for a determined distance and direction; i, j represent the spatial positions of the pixels; d is the distance between the two pixel points; θ is the angle of the vector connecting the two pixels, taking 0°, 45°, 90° and 135°;
s22, when the texture is calculated by utilizing the gray level co-occurrence matrix, eight characteristic quantities are selected to characterize the texture:

mean: reflects the average gray level and the degree of regularity of the texture in the window, and the calculation formula is as follows:

Mean = Σ_{i=1}^{L} Σ_{j=1}^{L} i · p(i, j)

wherein L represents the number of gray levels;

variance: reflects the degree of deviation of the matrix elements from the mean value and the gray scale change, and the calculation formula is as follows:

Variance = Σ_{i=1}^{L} Σ_{j=1}^{L} (i − μ)² · p(i, j)

wherein μ is the mean of p(i, j);

contrast: reflects the clarity of the image and the depth of the texture grooves, and the calculation formula is as follows:

Contrast = Σ_{i=1}^{L} Σ_{j=1}^{L} (i − j)² · p(i, j)

inverse difference (homogeneity): reflects the smoothness of the image distribution and is a measure of image uniformity, and the calculation formula is as follows:

Homogeneity = Σ_{i=1}^{L} Σ_{j=1}^{L} p(i, j) / (1 + (i − j)²)

dissimilarity: detects the degree of difference of the image;

entropy: measures the randomness of the image texture; it is a characteristic parameter of the randomness of the gray level distribution and represents the degree of disorder of the image gray levels, and the calculation formula is as follows:

Entropy = − Σ_{i=1}^{L} Σ_{j=1}^{L} p(i, j) · ln p(i, j)

angular second moment (energy): reflects the uniformity of the gray level distribution of the image and the coarseness of the texture, and the calculation formula is as follows:

ASM = Σ_{i=1}^{L} Σ_{j=1}^{L} p(i, j)²

correlation: reflects the degree of similarity of the elements of the spatial gray level co-occurrence matrix in the row or column direction, and the calculation formula is as follows:

Correlation = Σ_{i=1}^{L} Σ_{j=1}^{L} (i − μ_x)(j − μ_y) · p(i, j) / (σ_x σ_y)

wherein μ_x represents the mean value in the row direction; μ_y represents the mean value in the column direction; σ_x represents the standard deviation in the row direction; σ_y represents the standard deviation in the column direction;
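The eight feature quantities of step S22 can be sketched from a normalized co-occurrence matrix. The NumPy implementation below is illustrative only: `glcm_features` and all parameter names are chosen here, the offsets `dx`, `dy` are assumed non-negative (covering the 0°, 45° and 90° directions), and the correlation is guarded against zero variance.

```python
import numpy as np

def glcm_features(img, dx=1, dy=0, levels=8):
    """Eight GLCM texture features for one distance/direction offset.

    img: 2-D array of integer gray levels in [0, levels).
    """
    img = np.asarray(img)
    h, w = img.shape
    glcm = np.zeros((levels, levels), dtype=float)
    # count co-occurring gray level pairs at the given offset
    a = img[:h - dy, :w - dx]
    b = img[dy:, dx:]
    for gi, gj in zip(a.ravel(), b.ravel()):
        glcm[gi, gj] += 1
    glcm /= glcm.sum()                      # normalized frequencies p(i, j)
    i, j = np.indices(glcm.shape)
    mu = (i * glcm).sum()
    nz = glcm[glcm > 0]
    feats = {
        "mean": mu,
        "variance": ((i - mu) ** 2 * glcm).sum(),
        "contrast": ((i - j) ** 2 * glcm).sum(),
        "homogeneity": (glcm / (1.0 + (i - j) ** 2)).sum(),  # inverse difference
        "dissimilarity": (np.abs(i - j) * glcm).sum(),
        "entropy": -(nz * np.log(nz)).sum(),
        "asm": (glcm ** 2).sum(),           # angular second moment / energy
    }
    mx, my = (i * glcm).sum(), (j * glcm).sum()
    sx = np.sqrt(((i - mx) ** 2 * glcm).sum())
    sy = np.sqrt(((j - my) ** 2 * glcm).sum())
    feats["correlation"] = (((i - mx) * (j - my) * glcm).sum() / (sx * sy)
                            if sx > 0 and sy > 0 else 0.0)
    return feats
```

Running this per window and per direction yields the texture bands that later steps rank and combine.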
the S3 comprises the following steps:
s31: counting the mean value and the variance of the different samples on each characteristic quantity, wherein the calculation formulas are as follows:

μ = (1/n) Σ_{k=1}^{n} x_k

σ² = (1/n) Σ_{k=1}^{n} (x_k − μ)²

s32: constructing the separability calculation of the samples based on the Bhattacharyya distance, and calculating the differentiable capacity between different samples for the different texture characteristic quantities, wherein the calculation formula is as follows:

D_ab = (1/4) · (μ_a − μ_b)² / (σ_a² + σ_b²) + (1/2) · ln[(σ_a² + σ_b²) / (2σ_a σ_b)]

wherein μ_a is the mean value of the same feature of the image for class a, μ_b is the mean value for class b, σ_a is the standard deviation for class a, and σ_b is the standard deviation for class b;
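Since the formula of step S32 survives only as an image reference, the sketch below assumes the standard univariate Bhattacharyya distance between two normal distributions, which matches the quantities the claim names (per-class means and standard deviations); the function name is chosen here for illustration.

```python
import math

def bhattacharyya_separability(mu_a, sigma_a, mu_b, sigma_b):
    """Separability of one texture feature between two classes (step S32),
    assuming the standard Bhattacharyya distance between two 1-D normals."""
    va, vb = sigma_a ** 2, sigma_b ** 2
    # first term: separation of the means; second term: mismatch of the spreads
    return (0.25 * (mu_a - mu_b) ** 2 / (va + vb)
            + 0.5 * math.log((va + vb) / (2.0 * sigma_a * sigma_b)))
```

Identical class distributions give 0; the value grows as the means separate or the spreads diverge.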
the S4 comprises the following steps:
s41: calculating the total separation capacity of each characteristic quantity in different crop categories, wherein the specific calculation formula is as follows:

[formula image DEST_PATH_IMAGE019]

wherein D_ij is the degree of separability between different samples;

s42: sorting the characteristic quantities in descending order of their distinguishing capacity values;

s43: calculating the cumulative sum of the separability of the characteristic quantities, wherein the calculation formula is as follows:

x_k = D_1 + D_2 + … + D_k

s44: constructing a corresponding function expression based on the accumulated separation value corresponding to the texture characteristic quantity and the corresponding precision, finding that it meets the basic law of a logarithmic function, and acquiring the derivative corresponding to the function, namely the slope of each point, wherein the calculation formula is as follows:

y = 0.0625ln(x) + 0.642

R² = 0.9191

y′ = 0.0625/x

wherein x represents the accumulated D value corresponding to each characteristic quantity; y represents the precision corresponding to the accumulated D value; y′ is the first derivative of y;

s45: calculating the difference of consecutive slopes; if the difference is less than 0.0001, the corresponding point gives the optimal number of texture feature quantities, and the result is output; the specific selection formula is as follows:

D = |y′_i − y′_{i−1}|

s46: combining the screened texture characteristic quantities to form a new image;
the S5 comprises the following steps:
s51: recognizing the ground features of the research area according to a random forest principle;
s52: setting various parameters required by a classifier, inputting classification samples to identify and classify crop planting structures in a research area, and completing dynamic identification;
s53: identifying crops in a research area to generate a complete time sequence crop space-time distribution thematic map;
s54: and verifying the result according to the verification sample to obtain the overall classification precision and the Kappa coefficient.
CN201911122874.3A 2019-11-16 2019-11-16 Monthly-scale dynamic extraction method of crop planting structure based on texture feature optimization Active CN110909652B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911122874.3A CN110909652B (en) 2019-11-16 2019-11-16 Monthly-scale dynamic extraction method of crop planting structure based on texture feature optimization


Publications (2)

Publication Number Publication Date
CN110909652A CN110909652A (en) 2020-03-24
CN110909652B true CN110909652B (en) 2022-10-21

Family

ID=69816875

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911122874.3A Active CN110909652B (en) 2019-11-16 2019-11-16 Monthly-scale dynamic extraction method of crop planting structure based on texture feature optimization

Country Status (1)

Country Link
CN (1) CN110909652B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111950530B (en) * 2020-09-08 2024-04-12 中国水利水电科学研究院 Multi-feature optimization and fusion method for crop planting structure extraction
CN112949607A (en) * 2021-04-15 2021-06-11 辽宁工程技术大学 Wetland vegetation feature optimization and fusion method based on JM Relief F
CN114519721A (en) * 2022-02-16 2022-05-20 广东皓行科技有限公司 Crop lodging identification method, device and equipment based on remote sensing image
CN114219847B (en) * 2022-02-18 2022-07-01 清华大学 Method and system for determining crop planting area based on phenological characteristics and storage medium
CN114490449B (en) * 2022-04-18 2022-07-08 飞腾信息技术有限公司 Memory access method and device and processor

Citations (7)

Publication number Priority date Publication date Assignee Title
AU2007317180A1 (en) * 2006-11-10 2008-05-15 National Ict Australia Limited Texture feature extractor
CN103500344A (en) * 2013-09-02 2014-01-08 中国测绘科学研究院 Method and module for extracting and interpreting information of remote-sensing image
CN104715255A (en) * 2015-04-01 2015-06-17 电子科技大学 Landslide information extraction method based on SAR (Synthetic Aperture Radar) images
CN105678281A (en) * 2016-02-04 2016-06-15 中国农业科学院农业资源与农业区划研究所 Plastic film mulching farmland remote sensing monitoring method based on spectrum and texture features
CN108399400A (en) * 2018-03-23 2018-08-14 中国农业科学院农业资源与农业区划研究所 A kind of early stage crop recognition methods and system based on high-definition remote sensing data
CN109584284A (en) * 2018-12-13 2019-04-05 宁波大学 A kind of seashore wetland ground object sample extracting method of hierarchical decision making
CN110321861A (en) * 2019-07-09 2019-10-11 中国水利水电科学研究院 A kind of main crops production moon scale Dynamic Extraction method

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JP6321026B2 (en) * 2012-11-20 2018-05-09 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Integrated phenotypic analysis using image texture features
TWI668666B (en) * 2018-02-14 2019-08-11 China Medical University Hospital Prediction model for grouping hepatocellular carcinoma, prediction system thereof, and method for determining hepatocellular carcinoma group


Non-Patent Citations (4)

Title
"Feature Selection Techniques for Breast Cancer Image Classification with Support Vector Machine";Kedkarn Chaiyakhan et al;《Proceedings of the International MultiConference of Engineers and Computer Scientists》;20160318(第1期);全文 *
"Monthly spatial distributed water resources assessment: a case study";Yuhui Wang et al;《Computers & Geosciences》;20111206;全文 *
"SAR 图像纹理特征提取与分类研究";胡召玲等;《中国矿业大学学报》;20090531;第38卷(第3期);全文 *
"特征选择的全极化SAR影像面向对象土地覆盖分类";陆翔;《航天返回与遥感》;20180430;第39卷(第2期);全文 *

Also Published As

Publication number Publication date
CN110909652A (en) 2020-03-24

Similar Documents

Publication Publication Date Title
CN110909652B (en) Monthly-scale dynamic extraction method of crop planting structure based on texture feature optimization
CN110321861A (en) A kind of main crops production moon scale Dynamic Extraction method
Malambo et al. Automated detection and measurement of individual sorghum panicles using density-based clustering of terrestrial lidar data
CN111950530B (en) Multi-feature optimization and fusion method for crop planting structure extraction
CN110288647B (en) Method for monitoring irrigation area of irrigation area based on high-resolution satellite data
CN112164062A (en) Wasteland information extraction method and device based on remote sensing time sequence analysis
CN112395914A (en) Method for identifying land parcel crops by fusing remote sensing image time sequence and textural features
CN109753916A (en) A method and device for constructing a scale conversion model of normalized differential vegetation index
CN113642464B (en) Time sequence remote sensing image crop classification method combining TWDTW algorithm and fuzzy set
Sun et al. Remote estimation of grafted apple tree trunk diameter in modern orchard with RGB and point cloud based on SOLOv2
CN113609899B (en) Remote sensing time sequence analysis-based tilling land information positioning display system
CN109063660B (en) Crop identification method based on multispectral satellite image
Yalcin Phenology monitoring of agricultural plants using texture analysis
CN117475313A (en) A method for early identification of large-scale winter wheat based on automated extraction of training samples
CN113807269A (en) Landscape scale plant diversity evaluation method with space-time consistency
CN115953685A (en) Multi-layer multi-scale division agricultural greenhouse type information extraction method and system
CN115131668A (en) A small sample crop classification method, system, storage device and electronic device
CN113408370B (en) Forest change remote sensing detection method based on adaptive parameter genetic algorithm
CN117171533B (en) Real-time acquisition and processing method and system for geographical mapping operation data
Aparna et al. Analytical approach for soil and land classification using image processing with deep learning
CN105184234B (en) A method and device for measuring and calculating pollutant emissions from winter wheat straw incineration
CN112801481A (en) Smart power grid infrastructure project construction cost evaluation management cloud platform based on cloud computing and big data
CN111444824A (en) A survey method of vegetation spatial distribution pattern and vegetation classification method based on UAV technology
Yuan et al. Rapidly count crop seedling emergence based on waveform Method (WM) using drone imagery at the early stage
CN118230188B (en) Cultivated land type change detection method, cultivated land type change detection device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant