CN110909652B - Method for dynamically extracting monthly scale of crop planting structure with optimized textural features


Info

Publication number
CN110909652B
Authority
CN
China
Prior art keywords: texture, image, calculation formula, gray level
Prior art date
Legal status: Active
Application number
CN201911122874.3A
Other languages
Chinese (zh)
Other versions: CN110909652A (en)
Inventor
王镕
赵红莉
郝震
蒋云钟
段浩
朱彦儒
李向龙
Current Assignee: China Institute of Water Resources and Hydropower Research
Original Assignee
China Institute of Water Resources and Hydropower Research
Priority date
Filing date
Publication date
Application filed by China Institute of Water Resources and Hydropower Research filed Critical China Institute of Water Resources and Hydropower Research
Priority to CN201911122874.3A priority Critical patent/CN110909652B/en
Publication of CN110909652A publication Critical patent/CN110909652A/en
Application granted granted Critical
Publication of CN110909652B publication Critical patent/CN110909652B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/188 Vegetation

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method for monthly-scale dynamic extraction of a crop planting structure with optimized textural features, comprising the following steps. S1: determine the spatial range of the analysis area and prepare data: collect a time-series satellite remote sensing dataset at monthly or finer temporal resolution, uniformly process it into monthly data, and at the same time complete the pre-acquisition of sample data in the study area. S2: using the preprocessed monthly satellite remote sensing image data, calculate the texture features of the images based on the gray-level co-occurrence matrix. S3: based on the measured samples, calculate the mean and variance of the different texture feature quantities and compute the separability of each texture feature quantity between different samples. The method solves the problems of the traditional approach to acquiring crop-planting-structure information, which neglects the screening of classification feature quantities and thereby increases time complexity.

Description

Method for dynamically extracting monthly scale of crop planting structure with optimized textural features
Technical Field
The invention relates to the field of remote-sensing monitoring of planting structures, and in particular to a monthly-scale dynamic extraction method for a crop planting structure with optimized textural features.
Background
The crop planting structure reflects how agricultural production resources are used within the spatial range of human agricultural production. It is important information for studying the types, quantity structures and spatial distribution characteristics of crops, and it is also the basis for adjusting and optimizing crop structure and for fine management of agricultural water. The traditional way of acquiring crop planting structure information classifies images mainly on the basis of texture information calculated from the gray-level co-occurrence matrix, but it neglects the screening of the classification feature quantities and thereby increases time complexity.
Texture features are widely used in the classification of high-resolution images. High-resolution satellite imagery has rapidly taken over the satellite application market for urban planning and for regional environmental monitoring and assessment thanks to its clear and detailed information on the structure, shape and texture of ground objects. For example, Dekker (2003) built SAR-based texture features to analyze urban building areas and found that the texture feature parameters influence one another, so that errors affecting classification accuracy must be removed during identification. Chen Junying et al. (2007) applied the spectral and textural features of IKONOS images to decision-tree classification of agricultural vegetation: using spectral features alone the classification accuracy reached 83%, and adding texture features raised it to 91%. Elmqvist B (2008) and Zhou W (2009) used high-spatial-resolution images for land-use mapping and verified the results through field checks. She Shiping (2008) derived QuickBird image texture features based on the gray-level co-occurrence matrix, bringing the extraction accuracy of built-up areas to 93%. Hou Xuehui (2013) analyzed the growth trend of the farming-pastoral ecotone with SPOT-NDVI time series, with results consistent with existing findings. Liu Kebao et al. (2014) extracted crops from RapidEye data at 5 m resolution, which greatly reduced errors in the classification process and gave the highest classification accuracy. Liu Guodong et al.
(2015) completed a ground remote-sensing sampling survey based on the texture of high-resolution 2 m/8 m imagery, and completed crop extraction using the crop phenological calendar and multi-temporal images, with a final accuracy better than 80%. Piazza G. A. et al. (2016) studied high-resolution images for forest mapping and compared different classifiers to obtain an optimal classifier and apply it to large-area recognition. Song Qian and Zhang Chao (2016) extracted texture features of specific crops from GF-1 PMS imagery and produced planting-area change maps through object-oriented classification. Zheng Lijuan (2017) analyzed agricultural applications of the GF satellite series and presented the advantages of GF-1 and GF-6 in the extraction process. However, these methods do not screen the texture features computationally, so an excess of classification features increases the time complexity of classification; at the same time, the correlation among features is neglected, which affects the final classification accuracy and makes operational remote-sensing crop classification difficult.
Disclosure of Invention
In order to solve the problems in the prior art, the invention provides a monthly-scale dynamic extraction method for a crop planting structure with optimized textural features, which addresses the traditional method's neglect of classification feature screening and the resulting increase in time complexity.
The technical scheme adopted by the invention is a method for dynamically extracting the monthly scale of the crop planting structure with optimized textural features, comprising the following steps:
S1: determine the spatial range of the analysis area and prepare data: collect a time-series satellite remote sensing dataset at monthly or finer temporal resolution, uniformly process it into monthly data, and at the same time complete the pre-acquisition of sample data in the study area;
S2: from the preprocessed satellite remote sensing image data, calculate image texture features based on the gray-level co-occurrence matrix and describe the texture characteristics of crops with eight feature quantities;
S3: based on the measured samples, calculate the mean and variance of the different texture feature quantities and compute the separability of each texture feature quantity between different samples;
S4: establish an optimization formula based on the separability of each feature quantity, use the formula to determine the optimal number of texture feature quantities participating in classification, and construct these into a new image;
S5: finely identify the crop types in the study area with a random forest classifier to enable fine crop management, generate a complete time-series spatio-temporal distribution thematic map of crops, and verify the accuracy.
Preferably, S1 comprises the steps of:
S11: select GF-1 WFV data, which has high temporal and spatial resolution, according to the position and range of the study area; if this data source cannot provide full coverage, consider substituting Sentinel-2, GF-2, Landsat 8 or HJ-1A/B data; at the same time, investigate the crop types and their respective phenological growth stages within the range of the embodiment;
S12: perform remote sensing image preprocessing on the collected data, and if substitute data are used, resample them to a uniform spatial resolution;
S13: sample collection needs to consider the representativeness, typicality and timeliness of the samples; divide the study area into n areas of equal size by establishing a regular grid, and select samples of the different crops in each area.
Preferably, S2 comprises the steps of:
S21: calculate the texture feature information based on the gray-level co-occurrence matrix (GLCM), which counts the gray-level correlation between two pixel points separated by a certain distance; the expression is as follows:
$p(i,j)=[p(i,j,d,\theta)]$
where $p(i,j)$ is the frequency with which the same pixel pair occurs at the determined distance and direction; $d$ is the distance between the pixel points and $\theta$ is the angle of the vector connecting the two pixels, usually taken as 0°, 45°, 90° and 135°;
S22: when the texture is calculated from the gray-level co-occurrence matrix, eight feature quantities are selected to characterize it:
Mean: reflects the average gray level and the regularity of the texture within the window; the calculation formula is as follows:
$\mathrm{Mean}=\sum_{i}\sum_{j} i\,p(i,j)$
Variance: reflects the deviation of the matrix elements from the mean and the gray-level variation; the calculation formula is as follows:
$\mathrm{Var}=\sum_{i}\sum_{j}(i-\mu)^{2}\,p(i,j)$
where $\mu$ is the mean of $p(i,j)$;
Contrast: reflects the sharpness of the image and the depth of the texture grooves; the calculation formula is as follows:
$\mathrm{Con}=\sum_{i}\sum_{j}(i-j)^{2}\,p(i,j)$
Homogeneity (inverse difference moment): reflects the smoothness of the image distribution and is a measure of its uniformity; the calculation formula is as follows:
$\mathrm{Hom}=\sum_{i}\sum_{j}\frac{p(i,j)}{1+(i-j)^{2}}$
Dissimilarity: detects the degree of difference in the image; the calculation formula is as follows:
$\mathrm{Dis}=\sum_{i}\sum_{j}|i-j|\,p(i,j)$
Entropy: measures the randomness of the image texture, i.e. the amount of information contained in the image; it is a characteristic parameter of the randomness of the gray-level distribution and represents the degree of disorder of the image gray levels; the calculation formula is as follows:
$\mathrm{Ent}=-\sum_{i}\sum_{j}p(i,j)\,\ln p(i,j)$
Angular second moment: reflects the uniformity of the image gray-level distribution and the coarseness of the texture; the calculation formula is as follows:
$\mathrm{ASM}=\sum_{i}\sum_{j}p(i,j)^{2}$
Correlation: reflects the similarity of the elements of the spatial gray-level co-occurrence matrix in the row or column direction; the calculation formula is as follows:
$\mathrm{Cor}=\sum_{i}\sum_{j}\frac{(i-\mu_{x})(j-\mu_{y})\,p(i,j)}{\sigma_{x}\sigma_{y}}$
in the formula:
$\mu_{x}=\sum_{i} i\sum_{j}p(i,j)$, $\mu_{y}=\sum_{j} j\sum_{i}p(i,j)$
$\sigma_{x}^{2}=\sum_{i}(i-\mu_{x})^{2}\sum_{j}p(i,j)$, $\sigma_{y}^{2}=\sum_{j}(j-\mu_{y})^{2}\sum_{i}p(i,j)$
preferably, S3 comprises the steps of:
S31: compute the mean and variance of the different samples for each feature quantity; the calculation formulas are as follows:
$\mu=\frac{1}{n}\sum_{k=1}^{n}x_{k}$
$\sigma^{2}=\frac{1}{n}\sum_{k=1}^{n}(x_{k}-\mu)^{2}$
S32: construct the separability of the samples based on the Bhattacharyya distance, computing for each texture feature quantity the separability between different samples; the calculation formula is as follows:
$B=\frac{1}{4}\ln\!\left[\frac{1}{4}\left(\frac{\sigma_{1}^{2}}{\sigma_{2}^{2}}+\frac{\sigma_{2}^{2}}{\sigma_{1}^{2}}+2\right)\right]+\frac{1}{4}\,\frac{(\mu_{1}-\mu_{2})^{2}}{\sigma_{1}^{2}+\sigma_{2}^{2}}$
where $\mu_{1}$, $\mu_{2}$ are the means of the same feature quantity in the 2 different classes and $\sigma_{1}$, $\sigma_{2}$ the corresponding standard deviations.
Preferably, S4 comprises the steps of:
S41: calculate the total separability of each feature quantity across the different crop classes; the specific calculation formula is as follows:
$D=\sum_{i=1}^{m}\sum_{j=i+1}^{m}D_{ij}$
where $D_{ij}$ is the separability between the samples of classes $i$ and $j$;
S42: sort the feature quantities in descending order of their separability values;
S43: calculate the cumulative sum of the separabilities of the feature quantities; the calculation formula is as follows:
$X_{k}=\sum_{t=1}^{k}D_{t}$
S44: construct a function expression relating the cumulative separability value of the texture feature quantities to the corresponding accuracy; it is found to follow the basic law of a logarithmic function, so the derivative of the function, i.e. the slope at each point, is obtained; the calculation formulas are as follows:
$y=0.0625\ln(x)+0.642,\quad R^{2}=0.9191$
$y'=\frac{0.0625}{x}$
where $x$ denotes the cumulative $D$ value corresponding to each feature quantity;
S45: calculate the difference between successive slopes; if the difference is less than 0.0001, the calculated point corresponds to the optimal number of texture feature quantities and the result is output; the specific optimization criterion is as follows:
$D=|y'_{i}-y'_{i-1}|$
S46: combine the screened texture feature quantities to form a new image.
Preferably, S5 comprises the steps of:
S51: identify the planting structure of the study area month by month to generate a complete time-series spatio-temporal distribution thematic map of crops;
S52: verify the result against the verification samples to obtain the overall classification accuracy and the Kappa coefficient.
The method for dynamically extracting the monthly scale of the crop planting structure with the optimized textural features has the following beneficial effects:
1. The invention screens texture features and finally applies them to the identification of the crop planting structure based on image texture; it not only identifies the crop planting structure effectively but also saves time and reduces the computational workload.
2. The technique is fast to compute and widely applicable; it alleviates the classification limitations of medium- and high-resolution data, improves classification speed, and improves the accuracy and efficiency of GF-1 imagery in classification, which is of great significance for the operational popularization of crop planting structure identification technology.
Drawings
Fig. 1 is a general flowchart of the monthly-scale dynamic extraction method for a crop planting structure with optimized textural features according to the invention.
Fig. 2 is a flowchart of the substeps of S1 of the monthly-scale dynamic extraction method for a crop planting structure with optimized textural features of the invention.
Fig. 3 is a flowchart of the substeps of S2 of the monthly-scale dynamic extraction method for a crop planting structure with optimized textural features of the invention.
Fig. 4 is a flowchart of the substeps of S3 of the monthly-scale dynamic extraction method for a crop planting structure with optimized textural features of the invention.
Fig. 5 is a flowchart of the substeps of S4 of the monthly-scale dynamic extraction method for a crop planting structure with optimized textural features of the invention.
Fig. 6 is a flowchart of the substeps of S5 of the monthly-scale dynamic extraction method for a crop planting structure with optimized textural features of the invention.
Detailed Description
The following description of the embodiments of the present invention is provided to facilitate understanding by those skilled in the art. It should be understood, however, that the invention is not limited to the scope of the embodiments; to those skilled in the art, various changes are possible without departing from the spirit and scope of the invention as defined by the appended claims, and all matter produced using the inventive concept is protected.
As shown in fig. 1, a method for dynamically extracting a monthly scale of a crop planting structure comprises the following steps:
s1: determining the space range of an analysis area, preparing data, collecting a time series satellite remote sensing data set not larger than a month scale, uniformly processing the time series satellite remote sensing data set into month scale data in time, and simultaneously completing pre-acquisition of sample data in a research area;
s2: calculating image texture characteristics based on a gray level co-occurrence matrix according to the preprocessed satellite remote sensing image data, and describing the texture characteristics of crops by using eight characteristic quantities;
s3: calculating the mean value and the variance of different texture characteristic quantities based on the actually measured samples, calculating the distinguishing capability of the texture characteristic quantities among different samples, and finally evaluating the distinguishing capability of the different characteristic quantities among the samples;
s4: establishing an optimal formula based on the distinguishable capability of each characteristic quantity, determining the optimal number of the texture characteristic quantities participating in classification by using the formula, and constructing the optimal number as a new image;
s5: and finely identifying the crop types in the research area by using a random forest classifier, realizing fine management of the crops, generating a space-time distribution thematic map of the crops with a complete time sequence and verifying the precision.
As shown in fig. 2, S1 of the present embodiment includes the steps of:
S11: select GF-1 WFV data, independently developed by China, with high temporal and spatial resolution, according to the position and range of the study area; if this data source cannot provide full coverage, consider substituting Sentinel-2, GF-2, Landsat 8 or HJ-1A/B data; at the same time, investigate the crop types and their respective phenological growth stages within the range of the embodiment;
S12: perform remote sensing image preprocessing on the collected data, and if substitute data are used, resample them to a uniform spatial resolution;
S13: sample collection needs to consider the representativeness, typicality and timeliness of the samples; divide the study area into n areas of equal size by establishing a regular grid, and select crop samples in each area.
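The grid partition of S13 can be sketched as follows; `grid_cells` is a hypothetical helper (not from the patent) that splits the bounding box of the study area into equal-area cells, from which samples of each crop are then drawn:

```python
def grid_cells(xmin, ymin, xmax, ymax, nrows, ncols):
    """Split a bounding box into nrows x ncols equal-area rectangular cells.

    Returns a list of (x0, y0, x1, y1) tuples, one per cell, so that
    crop samples can be selected from every cell separately.
    """
    dx = (xmax - xmin) / ncols
    dy = (ymax - ymin) / nrows
    cells = []
    for r in range(nrows):
        for c in range(ncols):
            cells.append((xmin + c * dx, ymin + r * dy,
                          xmin + (c + 1) * dx, ymin + (r + 1) * dy))
    return cells

# e.g. divide a study area into n = 3 x 4 = 12 equal-area zones
zones = grid_cells(0.0, 0.0, 40.0, 30.0, nrows=3, ncols=4)
```

In practice the bounding box would come from the study-area shapefile, and cells falling outside the area boundary would be clipped or discarded.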
As shown in fig. 3, S2 of the present embodiment includes the steps of:
S21: calculate the texture feature information based on the gray-level co-occurrence matrix (GLCM), which counts the gray-level correlation between two pixel points separated by a certain distance and expresses the probability distribution of recurring gray levels; the expression is as follows:
$p(i,j)=[p(i,j,d,\theta)]$
where $p(i,j)$ is the frequency with which the same pixel pair occurs at the determined distance and direction; $d$ is the distance between the pixel points and $\theta$ is the angle of the vector connecting the two pixels, usually taken as 0°, 45°, 90° and 135°;
S22: when the texture is calculated from the gray-level co-occurrence matrix, eight commonly used feature quantities are selected to characterize it.
Mean: reflects the average gray level and the regularity of the texture within the window.
$\mathrm{Mean}=\sum_{i}\sum_{j} i\,p(i,j)$
Variance: reflects how far the matrix elements deviate from the mean and the magnitude of the gray-level variation; the larger the gray-level variation and the deviation from the mean, the larger the value.
$\mathrm{Var}=\sum_{i}\sum_{j}(i-\mu)^{2}\,p(i,j)$
where $\mu$ is the mean of $p(i,j)$. This value is large when the gray level varies strongly within the image.
Contrast: reflects the sharpness of the image and the depth of the texture grooves. The deeper the grooves, the larger the value; a larger value indicates sharper texture.
$\mathrm{Con}=\sum_{i}\sum_{j}(i-j)^{2}\,p(i,j)$
Homogeneity: also known as the inverse difference moment, reflects the smoothness of the image distribution and is a measure of its uniformity. If the local gray level of the image is uniform, the value is large.
$\mathrm{Hom}=\sum_{i}\sum_{j}\frac{p(i,j)}{1+(i-j)^{2}}$
Dissimilarity: detects the degree of difference in the image. If the variation in a local area is large, the value is large.
$\mathrm{Dis}=\sum_{i}\sum_{j}|i-j|\,p(i,j)$
Entropy: measures the randomness of the image texture, i.e. the amount of information contained in the image; it is a characteristic parameter of the randomness of the gray-level distribution and represents the degree of disorder of the image gray levels. The larger the entropy, the larger the class uncertainty of the sample; conversely, when the gray level in the image is uniform, the entropy is small.
$\mathrm{Ent}=-\sum_{i}\sum_{j}p(i,j)\,\ln p(i,j)$
Angular second moment: reflects the uniformity of the image gray-level distribution and the coarseness of the texture; it is a measure of the uniformity of the gray-level distribution. When the elements of the co-occurrence matrix concentrate near the main diagonal, the local gray-level distribution is more uniform, the image presents coarser texture, and the angular second moment is correspondingly larger.
$\mathrm{ASM}=\sum_{i}\sum_{j}p(i,j)^{2}$
Correlation: reflects the similarity of the elements of the spatial gray-level co-occurrence matrix in the row or column direction. If the matrix elements are uniform and equal, the correlation is high and the value is large; conversely, if the elements differ greatly, the correlation is small.
$\mathrm{Cor}=\sum_{i}\sum_{j}\frac{(i-\mu_{x})(j-\mu_{y})\,p(i,j)}{\sigma_{x}\sigma_{y}}$
in the formula:
$\mu_{x}=\sum_{i} i\sum_{j}p(i,j)$, $\mu_{y}=\sum_{j} j\sum_{i}p(i,j)$
$\sigma_{x}^{2}=\sum_{i}(i-\mu_{x})^{2}\sum_{j}p(i,j)$, $\sigma_{y}^{2}=\sum_{j}(j-\mu_{y})^{2}\sum_{i}p(i,j)$
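The eight feature quantities can be evaluated numerically from a normalized co-occurrence matrix. The sketch below uses plain NumPy; `glcm` and `texture_features` are illustrative helpers, not names from the patent, and only a single offset is computed for brevity:

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=4):
    """Normalized gray-level co-occurrence matrix p(i, j) for offset (dx, dy)."""
    h, w = img.shape
    m = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m / m.sum()

def texture_features(p):
    """Evaluate the eight GLCM feature quantities on a normalized matrix p."""
    n = p.shape[0]
    i, j = np.mgrid[0:n, 0:n]
    mean = (i * p).sum()
    variance = ((i - mean) ** 2 * p).sum()
    contrast = ((i - j) ** 2 * p).sum()
    homogeneity = (p / (1.0 + (i - j) ** 2)).sum()
    dissimilarity = (np.abs(i - j) * p).sum()
    entropy = -(p[p > 0] * np.log(p[p > 0])).sum()
    asm = (p ** 2).sum()
    mu_x, mu_y = (i * p).sum(), (j * p).sum()
    sig_x = np.sqrt(((i - mu_x) ** 2 * p).sum())
    sig_y = np.sqrt(((j - mu_y) ** 2 * p).sum())
    correlation = ((i - mu_x) * (j - mu_y) * p).sum() / (sig_x * sig_y)
    return {"mean": mean, "variance": variance, "contrast": contrast,
            "homogeneity": homogeneity, "dissimilarity": dissimilarity,
            "entropy": entropy, "asm": asm, "correlation": correlation}
```

In an operational setting these features would be computed per moving window and per direction (0°, 45°, 90°, 135°) over the monthly imagery, then stacked into the feature image used for classification.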
as shown in fig. 4, S3 of the present embodiment includes the steps of:
S31: compute the mean and variance of the different samples for each feature quantity; the calculation formulas are as follows:
$\mu=\frac{1}{n}\sum_{k=1}^{n}x_{k}$
$\sigma^{2}=\frac{1}{n}\sum_{k=1}^{n}(x_{k}-\mu)^{2}$
S32: construct the separability of the samples based on the Bhattacharyya distance, computing for each texture feature quantity the separability between different samples; the calculation formula is as follows:
$B=\frac{1}{4}\ln\!\left[\frac{1}{4}\left(\frac{\sigma_{1}^{2}}{\sigma_{2}^{2}}+\frac{\sigma_{2}^{2}}{\sigma_{1}^{2}}+2\right)\right]+\frac{1}{4}\,\frac{(\mu_{1}-\mu_{2})^{2}}{\sigma_{1}^{2}+\sigma_{2}^{2}}$
where $\mu_{1}$, $\mu_{2}$ are the means of the same feature quantity in the 2 different classes and $\sigma_{1}$, $\sigma_{2}$ the corresponding standard deviations.
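Assuming each texture feature is approximately normally distributed within a crop class, the Bhattacharyya distance between two classes has a closed form that uses only the two means and standard deviations. The patent does not spell out the exact variant it uses, so the following is a sketch of the standard univariate-Gaussian form:

```python
import math

def bhattacharyya(mu1, sigma1, mu2, sigma2):
    """Bhattacharyya distance between two univariate normal distributions.

    A larger value means the same texture feature separates the two crop
    classes better.
    """
    s1, s2 = sigma1 ** 2, sigma2 ** 2
    return (0.25 * math.log(0.25 * (s1 / s2 + s2 / s1 + 2))
            + 0.25 * (mu1 - mu2) ** 2 / (s1 + s2))
```

With equal variances the log term vanishes and the distance reduces to a scaled squared difference of the class means.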
As shown in fig. 5, S4 of the present embodiment includes the steps of:
S41: calculate the total separability of each feature quantity across the different crop classes; the specific calculation formula is as follows:
$D=\sum_{i=1}^{m}\sum_{j=i+1}^{m}D_{ij}$
where $D_{ij}$ is the separability between the samples of classes $i$ and $j$;
S42: sort the feature quantities in descending order of their separability values;
S43: calculate the cumulative sum of the separabilities of the feature quantities; the calculation formula is as follows:
$X_{k}=\sum_{t=1}^{k}D_{t}$
S44: construct a function expression relating the cumulative separability value of the texture feature quantities to the corresponding accuracy; it is found to follow the basic law of a logarithmic function, so the derivative of the function, i.e. the slope at each point, is obtained, specifically as follows:
$y=0.0625\ln(x)+0.642,\quad R^{2}=0.9191$
$y'=\frac{0.0625}{x}$
where $x$ denotes the cumulative $D$ value corresponding to each feature quantity;
S45: calculate the difference between successive slopes; if the difference is less than 0.0001, the point corresponds to the optimal number of texture feature quantities and the result is output; the specific optimization criterion is obtained as follows:
$D=|y'_{i}-y'_{i-1}|$
S46: combine the screened texture feature quantities to form a new image.
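Steps S42 to S45 can be sketched as follows: the features are sorted by separability, the cumulative D values feed the fitted logarithmic curve y = 0.0625 ln(x) + 0.642, and the first point where successive slopes y' = 0.0625/x differ by less than 0.0001 gives the number of features to keep. The separability values below are illustrative, not data from the patent:

```python
def optimal_feature_count(separabilities, a=0.0625, tol=1e-4):
    """Return how many texture features to keep.

    separabilities: per-feature total separability D, in any order.
    Sorts descending (S42), accumulates (S43), evaluates the slope
    y' = a / x of the fitted log curve at each cumulative value (S44),
    and stops at the first point where successive slopes differ by
    less than tol (S45).
    """
    d = sorted(separabilities, reverse=True)
    cum, total = [], 0.0
    for v in d:
        total += v
        cum.append(total)
    slopes = [a / x for x in cum]
    for k in range(1, len(slopes)):
        if abs(slopes[k] - slopes[k - 1]) < tol:
            return k + 1  # number of features up to and including this point
    return len(d)
```

Because the sorted separabilities shrink quickly, the cumulative curve flattens and the slope differences decay, so the criterion selects the point of diminishing returns.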
As shown in fig. 6, S5 of the present embodiment includes the steps of:
S51: identify the ground objects in the study area according to the random forest principle;
S52: set the parameters required by the classifier and input the classification samples to identify and classify the crop planting structure of the study area, completing the dynamic identification;
S53: identify the crops in the study area and generate a complete time-series spatio-temporal distribution thematic map of crops;
S54: verify the result against the verification samples to obtain the overall classification accuracy and the Kappa coefficient.
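The verification in S54 reports overall accuracy and the Kappa coefficient, both of which can be derived from the confusion matrix of the classified map against the verification samples. A minimal NumPy sketch follows; the 3-class matrix is illustrative, not data from the patent:

```python
import numpy as np

def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's Kappa from a confusion matrix.

    cm[i, j] = number of verification pixels of true class i that the
    classifier labelled as class j.
    """
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                                   # overall accuracy
    pe = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / n ** 2   # chance agreement
    return po, (po - pe) / (1.0 - pe)

# illustrative 3-class confusion matrix (e.g. wheat / maize / other)
acc, kappa = accuracy_and_kappa([[50, 2, 3],
                                 [4, 40, 6],
                                 [1, 5, 44]])
```

Kappa discounts the agreement expected by chance, so it is a stricter measure than overall accuracy when class sizes are unbalanced.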
In the implementation of this embodiment, the method provided by the invention starts from the principle that different crops show different texture characteristics in imagery: the crop types of the experimental area are analyzed and summarized; the optimal texture feature quantities are then screened based on the separability and accuracy of the texture feature quantities and combined into a new image; finally, the crop planting structure information is identified with the random forest classification method. The technical scheme is simple, effective and widely applicable; it can quickly and accurately acquire large-area spatial distribution information of crops, remedies the reduced classification speed and poor accuracy caused by too many features participating in classification in traditional methods, improves the computational efficiency and accuracy of remote-sensing monitoring of the crop planting structure, and is conducive to the operational popularization of monitoring crop planting structures by remote sensing.

Claims (1)

1. The method for dynamically extracting the monthly scale of the crop planting structure with the optimized textural features is characterized by comprising the following steps of:
S1: determine the spatial range of the analysis area and prepare data: collect a time-series satellite remote sensing dataset at monthly or finer temporal resolution, uniformly process it into monthly data, and at the same time complete the pre-acquisition of sample data in the study area;
S2: from the preprocessed satellite remote sensing image data, calculate image texture features based on the gray-level co-occurrence matrix and describe the texture characteristics of crops with eight feature quantities;
S3: based on the measured samples, calculate the mean and variance of the different texture feature quantities and compute the separability of the texture feature quantities between different samples;
S4: establish an optimization calculation formula based on the separability of each feature quantity, use the formula to determine the optimal number of texture feature quantities participating in classification, and construct a new image;
S5: finely identify the crop types in the study area with a random forest classifier to enable fine crop management, generate a complete time-series spatio-temporal distribution thematic map of crops and verify the accuracy;
the S1 comprises the following steps:
S11: select GF-1 WFV data, which has high temporal and spatial resolution, according to the position and range of the study area; if this data source cannot provide full coverage, consider substituting Sentinel-2, GF-2, Landsat 8 or HJ-1A/B data; at the same time, investigate the crop types and their respective phenological growth stages;
S12: perform remote sensing image preprocessing on the collected data, and if substitute data are used, resample them to a uniform spatial resolution;
S13: sample collection needs to consider the representativeness, typicality and timeliness of the samples; establish a regular grid to divide the study area into n areas of equal size, and select samples of the different crops in each area;
the S2 comprises the following steps:
s21: calculating texture characteristic information quantity based on a gray level co-occurrence matrix, and counting a gray level correlation coefficient between two pixel points at a certain distance according to a gray level co-occurrence matrix GLCM, wherein the expression is as follows:
$p(i,j)=[p(i,j,d,\theta)]$
wherein $p(i,j)$ is the frequency at which the same pixel pair occurs at the determined distance and direction; $i$, $j$ denote the spatial position of the pixel; $d$ is the distance between the pixel points; $\theta$ is the angle of the vector connecting the two pixels, taking 0°, 45°, 90° and 135°;
s22, when the texture is calculated by utilizing the gray level co-occurrence matrix, eight characteristic quantities are selected to characterize the characteristics of the texture:
mean value: reflects the average gray level and the degree of regularity of the texture in the window, and the calculation formula is as follows:

$$\mathrm{Mean}=\sum_{i=0}^{L-1}\sum_{j=0}^{L-1} i\,p(i,j)$$

wherein L represents the number of gray levels;
variance: reflects the degree to which the matrix elements deviate from the mean and the magnitude of gray level variation, and the calculation formula is as follows:

$$\mathrm{Var}=\sum_{i=0}^{L-1}\sum_{j=0}^{L-1}(i-\mu)^{2}\,p(i,j)$$

wherein μ is the mean of p(i, j);
contrast: reflects the clarity of the image and the depth of the texture grooves, and the calculation formula is as follows:

$$\mathrm{Con}=\sum_{i=0}^{L-1}\sum_{j=0}^{L-1}(i-j)^{2}\,p(i,j)$$
inverse difference moment (homogeneity): reflects the smoothness of the image distribution and is a measure of image uniformity, and the calculation formula is as follows:

$$\mathrm{Hom}=\sum_{i=0}^{L-1}\sum_{j=0}^{L-1}\frac{p(i,j)}{1+(i-j)^{2}}$$
dissimilarity: detects the degree of difference in the image; its standard calculation formula is as follows:

$$\mathrm{Dis}=\sum_{i=0}^{L-1}\sum_{j=0}^{L-1}\lvert i-j\rvert\,p(i,j)$$
entropy (information content of the image): measures the randomness of the image texture; it is a characteristic parameter of the randomness of the gray level distribution and characterizes the degree of disorder of the image gray levels, and the calculation formula is as follows:

$$\mathrm{Ent}=-\sum_{i=0}^{L-1}\sum_{j=0}^{L-1} p(i,j)\,\ln p(i,j)$$
angular second moment (energy): reflects the uniformity of the image gray level distribution and the coarseness of the texture, and the calculation formula is as follows:

$$\mathrm{ASM}=\sum_{i=0}^{L-1}\sum_{j=0}^{L-1} p(i,j)^{2}$$
correlation: reflects the degree of similarity of the elements of the spatial gray level co-occurrence matrix in the row or column direction, and the calculation formula is as follows:

$$\mathrm{Cor}=\frac{\sum_{i=0}^{L-1}\sum_{j=0}^{L-1}(i-\mu_{x})(j-\mu_{y})\,p(i,j)}{\sigma_{x}\sigma_{y}}$$

wherein $\mu_{x}$ represents the mean value in the row direction; $\mu_{y}$ represents the mean value in the column direction; $\sigma_{x}$ represents the standard deviation in the row direction; $\sigma_{y}$ represents the standard deviation in the column direction;
the S3 comprises the following steps:
s31: counting the mean value and the variance of the different samples on each characteristic quantity, wherein the calculation formulas are as follows:

$$\mu=\frac{1}{N}\sum_{k=1}^{N}x_{k}$$

$$\sigma^{2}=\frac{1}{N}\sum_{k=1}^{N}(x_{k}-\mu)^{2}$$

wherein $x_{k}$ is the value of the characteristic quantity for the k-th sample pixel and N is the number of sample pixels;
s32: constructing a sample separability measure based on the Bhattacharyya distance, and calculating the distinguishing capability between different samples for each texture characteristic quantity, wherein the calculation formula is as follows:

$$D_{ab}=\frac{1}{4}\,\frac{(\mu_{a}-\mu_{b})^{2}}{\sigma_{a}^{2}+\sigma_{b}^{2}}+\frac{1}{2}\ln\!\left[\frac{\sigma_{a}^{2}+\sigma_{b}^{2}}{2\,\sigma_{a}\sigma_{b}}\right]$$

in the formula, $\mu_{a}$ is the mean value of the same feature over the class a samples, $\mu_{b}$ is the mean value over the class b samples, $\sigma_{a}$ is the standard deviation over the class a samples, and $\sigma_{b}$ is the standard deviation over the class b samples;
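Step S32 can be sketched directly from the formula above. This assumes, as in S31, that each class is summarized by the mean and standard deviation of the feature over its samples; the function name is illustrative.

```python
# Sketch of step S32: Bhattacharyya distance between two classes for one
# texture feature, each class summarized as a 1-D Gaussian (mean, std).
import math

def bhattacharyya_distance(mu_a, sigma_a, mu_b, sigma_b):
    """Separability of two Gaussian class distributions of one feature."""
    var_a, var_b = sigma_a ** 2, sigma_b ** 2
    term_mean = 0.25 * (mu_a - mu_b) ** 2 / (var_a + var_b)
    term_var = 0.5 * math.log((var_a + var_b) / (2.0 * sigma_a * sigma_b))
    return term_mean + term_var
```

Identical distributions give a distance of 0, and the distance grows as the class means separate or the spreads diverge, which is what makes it usable as a per-feature ranking score.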
the S4 comprises the following steps:
s41: calculating the total separating capacity of each characteristic quantity over all crop category pairs, wherein the specific calculation formula is as follows:

$$D=\sum_{i=1}^{m-1}\sum_{j=i+1}^{m}D_{ij}$$

in the formula, $D_{ij}$ is the degree of separability between the samples of classes i and j, and m is the number of classes;
s42: sorting the characteristic quantities in descending order of their total separating capacity values;
s43: calculating the cumulative sum of the separability of the sorted characteristic quantities, wherein the calculation formula is as follows:

$$S_{k}=\sum_{t=1}^{k}D_{t}$$

wherein $D_{t}$ is the total separability of the characteristic quantity ranked t-th;
s44: constructing a function expression based on the accumulated separability value corresponding to the texture characteristic quantities and the corresponding classification precision; the relation is found to follow the basic law of a logarithmic function, fitted in this study as:

$$y = 0.0625\ln(x) + 0.642,\qquad R^{2} = 0.9191$$

the derivative of the function, namely the slope at each point, is then acquired, and its calculation formula is as follows:

$$y'=\frac{0.0625}{x}$$

wherein x represents the accumulated D value corresponding to each characteristic quantity; y represents the precision corresponding to the accumulated D value; y' is the first derivative of y;
s45: calculating the difference of the slopes at successive points; if this difference is less than 0.0001, the corresponding point gives the optimal number of texture characteristic quantities and the result is output; the specific criterion is as follows:

$$\left|\,y'(x_{k+1})-y'(x_{k})\,\right|=\left|\frac{0.0625}{x_{k+1}}-\frac{0.0625}{x_{k}}\right|<0.0001$$
s46: combining the screened texture characteristic quantities to form a new image;
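Steps S41 to S45 can be sketched as one selection routine. The log-fit coefficient below reuses the example value 0.0625 from the fitted expression in S44; in practice the coefficient comes from fitting accuracy against cumulative D on your own data, and the function name is illustrative.

```python
# Sketch of steps S41-S45: rank features by total separability, accumulate
# D values, and stop where the slope of the fitted log curve flattens.

def select_feature_count(total_D, a=0.0625, threshold=1e-4):
    """Return how many ranked features to keep.

    total_D: per-feature total separability values (step S41).
    a:       coefficient of the fitted curve y = a*ln(x) + b (step S44).
    """
    ranked = sorted(total_D, reverse=True)   # step S42: descending sort
    cum, s = [], 0.0
    for d in ranked:                          # step S43: cumulative sums S_k
        s += d
        cum.append(s)
    # Steps S44-S45: slope of y = a*ln(x) + b is y' = a/x; stop when the
    # change in slope between successive cumulative-D points drops below
    # the 0.0001 threshold of step S45.
    for k in range(1, len(cum)):
        if abs(a / cum[k] - a / cum[k - 1]) < threshold:
            return k                          # keep the first k features
    return len(ranked)
```

The selected features would then be stacked into the new image of step S46.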
the S5 comprises the following steps:
s51: recognizing the ground features of the research area according to the random forest principle;
s52: setting the parameters required by the classifier, inputting the classification samples to identify and classify the crop planting structure of the research area, and completing the dynamic identification;
s53: identifying the crops in the research area to generate a crop spatio-temporal distribution thematic map with a complete time sequence;
s54: verifying the result against the verification samples to obtain the overall classification accuracy and the Kappa coefficient.
CN201911122874.3A 2019-11-16 2019-11-16 Method for dynamically extracting monthly scale of crop planting structure with optimized textural features Active CN110909652B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911122874.3A CN110909652B (en) 2019-11-16 2019-11-16 Method for dynamically extracting monthly scale of crop planting structure with optimized textural features

Publications (2)

Publication Number Publication Date
CN110909652A CN110909652A (en) 2020-03-24
CN110909652B CN110909652B (en) 2022-10-21

Family

ID=69816875

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911122874.3A Active CN110909652B (en) 2019-11-16 2019-11-16 Method for dynamically extracting monthly scale of crop planting structure with optimized textural features

Country Status (1)

Country Link
CN (1) CN110909652B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111950530B (en) * 2020-09-08 2024-04-12 中国水利水电科学研究院 Multi-feature optimization and fusion method for crop planting structure extraction
CN112949607A (en) * 2021-04-15 2021-06-11 辽宁工程技术大学 Wetland vegetation feature optimization and fusion method based on JM Relief F
CN114519721A (en) * 2022-02-16 2022-05-20 广东皓行科技有限公司 Crop lodging identification method, device and equipment based on remote sensing image
CN114219847B (en) * 2022-02-18 2022-07-01 清华大学 Method and system for determining crop planting area based on phenological characteristics and storage medium
CN114490449B (en) * 2022-04-18 2022-07-08 飞腾信息技术有限公司 Memory access method and device and processor

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2007317180A1 (en) * 2006-11-10 2008-05-15 National Ict Australia Limited Texture feature extractor
CN103500344A (en) * 2013-09-02 2014-01-08 中国测绘科学研究院 Method and module for extracting and interpreting information of remote-sensing image
CN104715255A (en) * 2015-04-01 2015-06-17 电子科技大学 Landslide information extraction method based on SAR (Synthetic Aperture Radar) images
CN105678281A (en) * 2016-02-04 2016-06-15 中国农业科学院农业资源与农业区划研究所 Plastic film mulching farmland remote sensing monitoring method based on spectrum and texture features
CN108399400A (en) * 2018-03-23 2018-08-14 中国农业科学院农业资源与农业区划研究所 A kind of early stage crop recognition methods and system based on high-definition remote sensing data
CN109584284A (en) * 2018-12-13 2019-04-05 宁波大学 A kind of seashore wetland ground object sample extracting method of hierarchical decision making
CN110321861A (en) * 2019-07-09 2019-10-11 中国水利水电科学研究院 A kind of main crops production moon scale Dynamic Extraction method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9552649B2 (en) * 2012-11-20 2017-01-24 Koninklijke Philips N.V. Integrated phenotyping employing image texture features
TWI668666B (en) * 2018-02-14 2019-08-11 China Medical University Hospital Prediction model for grouping hepatocellular carcinoma, prediction system thereof, and method for determining hepatocellular carcinoma group

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"Feature Selection Techniques for Breast Cancer Image Classification with Support Vector Machine";Kedkarn Chaiyakhan et al;《Proceedings of the International MultiConference of Engineers and Computer Scientists》;20160318(第1期);全文 *
"Monthly spatial distributed water resources assessment: a case study";Yuhui Wang et al;《Computers & Geosciences》;20111206;全文 *
"SAR 图像纹理特征提取与分类研究";胡召玲等;《中国矿业大学学报》;20090531;第38卷(第3期);全文 *
"特征选择的全极化SAR影像面向对象土地覆盖分类";陆翔;《航天返回与遥感》;20180430;第39卷(第2期);全文 *


Similar Documents

Publication Publication Date Title
CN110909652B (en) Method for dynamically extracting monthly scale of crop planting structure with optimized textural features
Halme et al. Utility of hyperspectral compared to multispectral remote sensing data in estimating forest biomass and structure variables in Finnish boreal forest
Klodt et al. Field phenotyping of grapevine growth using dense stereo reconstruction
CN109726698B (en) Method for identifying seasonal irrigation area based on remote sensing data
CN111950530B (en) Multi-feature optimization and fusion method for crop planting structure extraction
CN107862255B (en) Wetland information extraction and ecological sensitivity evaluation method based on microwave remote sensing and optical remote sensing technology
CN105389799B (en) SAR image object detection method based on sketch map and low-rank decomposition
Delenne et al. From pixel to vine parcel: A complete methodology for vineyard delineation and characterization using remote-sensing data
CN109753916A (en) A kind of vegetation index spatial scaling model building method and device
CN110321861A (en) A kind of main crops production moon scale Dynamic Extraction method
CN118225711B (en) Intelligent soil water potential detection method based on spectral analysis
Yalcin Phenology monitoring of agricultural plants using texture analysis
CN109960972B (en) Agricultural and forestry crop identification method based on middle-high resolution time sequence remote sensing data
CN111291818A (en) Non-uniform class sample equalization method for cloud mask
CN116740579B (en) Intelligent collection method for territorial space planning data
CN113807269A (en) Landscape scale plant diversity evaluation method with space-time consistency
Aparna et al. Analytical approach for soil and land classification using image processing with deep learning
CN112949607A (en) Wetland vegetation feature optimization and fusion method based on JM Relief F
Warner et al. Remote sensing of land cover change
McCloy et al. Optimizing image resolution to maximize the accuracy of hard classification
CN115130547A (en) Polarized SAR crop classification method, system, equipment and medium based on multi-feature combined time sequence matching
CN113221790A (en) Method and device for generating field crop rotation mode based on radar data
Yuan et al. Rapidly count crop seedling emergence based on waveform Method (WM) using drone imagery at the early stage
CN118038286B (en) Crop area extraction method and system combining Beidou position service and knowledge decision
CN117273273B (en) Urban planning construction land partition management and control system based on big data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant