CN112241981A - Method and device for verifying precision of secondary measurement data of crop planting area - Google Patents

Method and device for verifying precision of secondary measurement data of crop planting area

Info

Publication number
CN112241981A
Authority
CN
China
Prior art keywords
area
precision
spot
error
accuracy
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010930985.3A
Other languages
Chinese (zh)
Other versions
CN112241981B (en)
Inventor
吕争
王小燕
崔小贺
姜涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Center for Resource Satellite Data and Applications CRESDA
Original Assignee
China Center for Resource Satellite Data and Applications CRESDA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Center for Resource Satellite Data and Applications CRESDA filed Critical China Center for Resource Satellite Data and Applications CRESDA
Priority to CN202010930985.3A priority Critical patent/CN112241981B/en
Publication of CN112241981A publication Critical patent/CN112241981A/en
Application granted granted Critical
Publication of CN112241981B publication Critical patent/CN112241981B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/60: Analysis of geometric attributes
    • G06T7/62: Analysis of geometric attributes of area, perimeter, diameter or volume
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10: Complex mathematical operations
    • G06F17/11: Complex mathematical operations for solving equations, e.g. nonlinear equations, general mathematical optimization problems
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10: Complex mathematical operations
    • G06F17/16: Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10032: Satellite or aerial image; Remote sensing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30181: Earth observation
    • G06T2207/30188: Vegetation; Agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Analysis (AREA)
  • Computational Mathematics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Algebra (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Operations Research (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The application discloses a method and a device for verifying the precision of second-class measurement data of a crop planting area. The method comprises the following steps: determining a first vector pattern spot of the crop planting area to be measured in any crop maturity period; determining a registered second vector pattern spot covering the same area range as the first vector pattern spot, the second vector pattern spot containing classification information from a field survey; superposing the first vector pattern spot and the second vector pattern spot and taking their intersection to obtain a third vector pattern spot, and determining the pattern spot areas of a plurality of ground features from the third vector pattern spot; establishing an error matrix from the pattern spot areas and calculating the overall accuracy, the user accuracy and the producer accuracy from the error matrix; and calculating the pattern spot area error of each ground feature, determining the value ranges of the overall accuracy, the user accuracy and the producer accuracy from the influence of the pattern spot area error on each accuracy, and generating a verification report. The method and the device solve the technical problem of low accuracy in prior-art precision verification of second-class measurement data.

Description

Method and device for verifying precision of secondary measurement data of crop planting area
Technical Field
The application relates to the technical field of remote sensing image processing and crop planting area statistics, in particular to a method and a device for verifying the precision of secondary measurement data of a crop planting area.
Background
Remote sensing measurement of the crop planting area combines ground sampling survey techniques for agricultural yield with remote sensing measurement techniques to acquire, accurately and in a timely manner, the main crop area data of each grain-producing region, and thereby to establish a forward-looking, modern agricultural statistics system. There are several approaches to remote sensing measurement of the crop planting area. Among them, second-class measurement refers to remote sensing measurement of the crop planting area to be measured based on remote sensing imagery with a resolution of 2 meters acquired in the crop maturity period. In the second-class measurement process, attribute information of various ground feature classes can be measured; for example, the classes may comprise nine types of ground features such as facility agricultural land, crops, garden fruit trees, forest, grass, water, buildings, roads and other features, and the total local crop planting area is determined according to the attribute information of these classes. In the course of second-class measurement, however, precision auditing and precision checking of the measurement data are required, where precision checking mainly means analyzing and evaluating the classification accuracy and area precision of the resulting pattern spots.
At present, precision checking of second-class measurement data mainly compares the classification accuracy of the pattern spots in the second-class measurement result data against preset ground-truth verification pattern spot data to generate an error matrix, and then calculates the overall accuracy, the user accuracy and the producer accuracy from that matrix for a comprehensive verification. In the prior art, however, the error matrix is generated only from the classification accuracy of the pattern spots in the two data sets; that is, only the influence of the number of correctly classified pattern spots on the precision of the measurement data is considered, while the influence of the pattern spot areas is not. As a result, the accuracy of the precision verification result for second-class measurement data is low.
Disclosure of Invention
The technical problem solved by this application is the low accuracy, in the prior art, of precision verification results for second-class measurement data. To address it, the application provides a method and a device for verifying the precision of second-class measurement data of the crop planting area.
In a first aspect, an embodiment of the present application provides a method for verifying the precision of second-class measurement data of a crop planting area, the method comprising:
determining a first remote sensing image of a crop planting area to be measured in any crop maturity period, and visually interpreting and sketching the first remote sensing image to obtain a first vector pattern spot;
determining a second remote sensing image which has the same area range as the first remote sensing image and is orthorectified with the first remote sensing image, and sketching actual classification information obtained by preset field investigation into the second remote sensing image to obtain a second vector pattern spot;
carrying out polygon superposition on the first vector pattern spot and the second vector pattern spot and taking their intersection to obtain a third vector pattern spot, and determining the pattern spot areas of a plurality of preset ground features according to the third vector pattern spot;
and establishing an error matrix according to the area of the pattern spot, calculating according to the error matrix to obtain overall accuracy, user accuracy and producer accuracy, and generating a verification report according to the overall accuracy, the user accuracy and the producer accuracy.
Optionally, the third vector pattern spot includes the predicted category information, predicted position information and predicted area information of the plurality of preset ground features, as well as their actual category information, actual position information and actual area information.
Optionally, establishing an error matrix according to the area of the pattern spot, including:
determining the area of the pattern spot corresponding to the correctly classified ground object according to the predicted position information, the predicted category information, the actual position information and the actual category information corresponding to each ground object;
and taking the predicted area information corresponding to each ground feature as the column of the error matrix, taking the actual area information corresponding to each ground feature as the row of the error matrix, and taking the area of the pattern spot corresponding to the ground feature with correct classification as the data on the diagonal line of the error matrix to establish and obtain the error matrix.
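As an illustrative sketch (not the patented implementation; the class names, data layout and numbers below are hypothetical), an area-based error matrix of the kind described above can be accumulated from overlay fragments, with actual classes as rows, predicted classes as columns, and the correctly classified areas landing on the diagonal:

```python
# Hypothetical nine ground feature classes, as named in the description
CLASSES = ["facility_agriculture", "crop", "orchard", "forest",
           "grass", "water", "building", "road", "other"]

def build_error_matrix(fragments, classes=CLASSES):
    """fragments: iterable of (predicted_class, actual_class, area).
    Rows are actual classes, columns are predicted classes; the diagonal
    accumulates the pattern spot area of correctly classified features."""
    idx = {c: i for i, c in enumerate(classes)}
    n = len(classes)
    matrix = [[0.0] * n for _ in range(n)]
    for pred, actual, area in fragments:
        matrix[idx[actual]][idx[pred]] += area
    return matrix

# Invented overlay fragments: areas in arbitrary units
fragments = [("crop", "crop", 120.0),    # correctly classified crop area
             ("crop", "grass", 10.0),    # grass mapped as crop
             ("water", "water", 30.0)]
m = build_error_matrix(fragments)
```

Each fragment contributes its area to the cell addressed by its (actual, predicted) class pair, so a diagonal entry grows only when the two classifications agree.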
Optionally, calculating an overall accuracy, a user accuracy, and a producer accuracy according to the error matrix includes:
calculating the overall accuracy, the user accuracy and the producer accuracy by the following formulas:
$$OA = \frac{\sum_{i=1}^{m} X_{ii}}{N}$$

$$UA_{j} = \frac{X_{jj}}{X_{+j}}$$

$$PA_{i} = \frac{X_{ii}}{X_{i+}}$$

wherein $OA$ represents the overall accuracy; $X_{ii}$ (a diagonal entry of the error matrix) represents the pattern spot area of the correctly classified features of the $i$-th class, and $m$ is the number of classes; $N$ represents the total area of the crop planting area to be measured; $UA_{j}$ represents the user accuracy of the $j$-th class, with $X_{+j}$ the area sum of the $j$-th column of the error matrix; and $PA_{i}$ represents the producer accuracy of the $i$-th class, with $X_{i+}$ the area sum of the $i$-th row of the error matrix.
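As a hedged sketch of the calculation above (assuming, as in the error matrix construction, that rows hold actual areas and columns hold predicted areas; the sample matrix is invented):

```python
def accuracies(matrix):
    """Overall, user and producer accuracy from an area-based error matrix.

    Assumes rows are actual (reference) areas and columns are predicted
    (map) areas, so user accuracy divides the diagonal by column totals
    and producer accuracy divides it by row totals."""
    n = len(matrix)
    total = sum(sum(row) for row in matrix)          # N: total overlaid area
    diag = [matrix[i][i] for i in range(n)]
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    row_sums = [sum(row) for row in matrix]
    overall = sum(diag) / total
    user = [d / c if c else 0.0 for d, c in zip(diag, col_sums)]
    producer = [d / r if r else 0.0 for d, r in zip(diag, row_sums)]
    return overall, user, producer

# Invented 2-class example: 80 area units of class 0 correctly mapped, etc.
mat = [[80.0, 20.0],
       [10.0, 90.0]]
oa, ua, pa = accuracies(mat)
```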
Optionally, generating a verification report according to the overall accuracy, the user accuracy, and the producer accuracy includes:
calculating a total standard deviation according to a preset random error standard deviation formula, and determining the number of pixels along the second vector pattern spot boundary and the area occupied by a single pixel;
calculating the area error of the pattern spot of each ground feature according to the total standard deviation, the number of the pixels and the area occupied by the single pixel;
and determining the value ranges of the overall precision, the user precision and the producer precision according to the image spot area error of each ground feature, and generating the verification report according to the value ranges of the overall precision, the user precision and the producer precision.
Optionally, calculating a spot area error of each feature according to the total standard deviation, the number of pixels, and the area occupied by the single pixel, includes:
calculating the area error of the image spot of each ground feature by the following formula:
$$\sigma_{Area} = L \times A \times \sigma$$

wherein $\sigma_{Area}$ represents the pattern spot area error of each ground feature; $L$ represents the number of pixels along the second vector pattern spot boundary on the second remote sensing image; $A$ represents the area occupied by a single pixel; and $\sigma$ represents the total standard deviation.
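A minimal sketch of the spot area error formula above; the numbers (400 boundary pixels, 1 m² per pixel, a total standard deviation of 0.5) are illustrative only:

```python
def spot_area_error(boundary_pixels, pixel_area, total_std):
    """sigma_Area = L * A * sigma: pattern spot area error from the number
    of boundary pixels (L), the area of a single pixel (A) and the total
    standard deviation (sigma)."""
    return boundary_pixels * pixel_area * total_std

# Illustrative values: 400 boundary pixels, 1 m^2 per pixel, sigma = 0.5
err = spot_area_error(400, 1.0, 0.5)
```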
Optionally, determining the value ranges of the overall accuracy, the user accuracy and the producer accuracy according to the spot area error of each feature includes:
the value ranges of the overall accuracy, the user accuracy and the producer accuracy are represented by the following formulas:
$$OA_{error} = \frac{\sum_{i=1}^{m} X_{ii} \pm \sigma_{ij}}{N}$$

$$UA_{error,j} = \frac{X_{jj} \pm \sigma_{Area}^{j}}{X_{+j}}$$

$$PA_{error,i} = \frac{X_{ii} \pm \sigma_{Area}^{i}}{X_{i+}}$$

wherein $OA_{error}$ represents the value range of the overall accuracy; $UA_{error}$ represents the value range of the user accuracy; $PA_{error}$ represents the value range of the producer accuracy; $\sigma_{ij}$ represents the sum of the pattern spot area errors of the correctly classified features in the error matrix,

$$\sigma_{ij} = \sum_{t=1}^{m} \sigma_{Area}^{t},$$

$m$ represents the number of preset ground feature classes, and $\sigma_{Area}^{t}$ represents the pattern spot area error of the $t$-th preset ground feature.
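The value ranges above can be read as perturbing the correctly classified area by plus or minus its pattern spot area error. The helper below sketches that reading under this assumption (clamping the result to [0, 1]), with invented numbers:

```python
def accuracy_range(diag_area, denom_area, area_error):
    """Value range of an accuracy figure when the correctly classified
    area is perturbed by +/- its pattern spot area error (an assumed
    reading of the range formulas); the result is clamped to [0, 1]."""
    lo = max(0.0, (diag_area - area_error) / denom_area)
    hi = min(1.0, (diag_area + area_error) / denom_area)
    return lo, hi

# e.g. 170 correctly classified area units out of 200, +/- 10 units of error
lo, hi = accuracy_range(170.0, 200.0, 10.0)
```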
In a second aspect, an embodiment of the present application provides a device for verifying the precision of second-class measurement data of a crop planting area, the device comprising:
the determining unit is used for determining a first remote sensing image of a crop planting area to be measured in any crop maturity period, and visually interpreting and sketching the first remote sensing image to obtain a first vector pattern spot;
the delineation unit is used for determining a second remote sensing image which has the same area range as the first remote sensing image and is orthorectified together with the first remote sensing image, and delineating actual classification information obtained by a preset field survey onto the second remote sensing image to obtain a second vector pattern spot;
the superposition unit is used for carrying out polygon superposition on the first vector pattern spot and the second vector pattern spot to obtain an intersection so as to obtain a third vector pattern spot, and determining the pattern spot areas of a plurality of preset ground objects according to the third vector pattern spot;
and the calculation unit is used for establishing an error matrix according to the area of the pattern spot, calculating according to the error matrix to obtain overall precision, user precision and producer precision, and generating a verification report according to the overall precision, the user precision and the producer precision.
Optionally, the third vector pattern spot includes the predicted category information, predicted position information and predicted area information of the plurality of preset ground features, as well as their actual category information, actual position information and actual area information.
Optionally, the computing unit is specifically configured to:
determining the area of the pattern spot corresponding to the correctly classified ground object according to the predicted position information, the predicted category information, the actual position information and the actual category information corresponding to each ground object;
and taking the predicted area information corresponding to each ground feature as the column of the error matrix, taking the actual area information corresponding to each ground feature as the row of the error matrix, and taking the area of the pattern spot corresponding to the ground feature with correct classification as the data on the diagonal line of the error matrix to establish and obtain the error matrix.
Optionally, the computing unit is specifically configured to:
calculating the overall accuracy, the user accuracy and the producer accuracy by the following formulas:
$$OA = \frac{\sum_{i=1}^{m} X_{ii}}{N}$$

$$UA_{j} = \frac{X_{jj}}{X_{+j}}$$

$$PA_{i} = \frac{X_{ii}}{X_{i+}}$$

wherein $OA$ represents the overall accuracy; $X_{ii}$ (a diagonal entry of the error matrix) represents the pattern spot area of the correctly classified features of the $i$-th class, and $m$ is the number of classes; $N$ represents the total area of the crop planting area to be measured; $UA_{j}$ represents the user accuracy of the $j$-th class, with $X_{+j}$ the area sum of the $j$-th column of the error matrix; and $PA_{i}$ represents the producer accuracy of the $i$-th class, with $X_{i+}$ the area sum of the $i$-th row of the error matrix.
Optionally, the computing unit is specifically configured to:
calculating a total standard deviation according to a preset random error standard deviation formula, and determining the number of pixels along the second vector pattern spot boundary and the area occupied by a single pixel;
calculating the area error of the pattern spot of each ground feature according to the total standard deviation, the number of the pixels and the area occupied by the single pixel;
and determining the value ranges of the overall precision, the user precision and the producer precision according to the image spot area error of each ground feature, and generating the verification report according to the value ranges of the overall precision, the user precision and the producer precision.
Optionally, the computing unit is specifically configured to:
calculating the area error of the image spot of each ground feature by the following formula:
$$\sigma_{Area} = L \times A \times \sigma$$

wherein $\sigma_{Area}$ represents the pattern spot area error of each ground feature; $L$ represents the number of pixels along the second vector pattern spot boundary on the second remote sensing image; $A$ represents the area occupied by a single pixel; and $\sigma$ represents the total standard deviation.
Optionally, the computing unit is specifically configured to:
the value ranges of the overall accuracy, the user accuracy and the producer accuracy are represented by the following formulas:
$$OA_{error} = \frac{\sum_{i=1}^{m} X_{ii} \pm \sigma_{ij}}{N}$$

$$UA_{error,j} = \frac{X_{jj} \pm \sigma_{Area}^{j}}{X_{+j}}$$

$$PA_{error,i} = \frac{X_{ii} \pm \sigma_{Area}^{i}}{X_{i+}}$$

wherein $OA_{error}$ represents the value range of the overall accuracy; $UA_{error}$ represents the value range of the user accuracy; $PA_{error}$ represents the value range of the producer accuracy; $\sigma_{ij}$ represents the sum of the pattern spot area errors of the correctly classified features in the error matrix,

$$\sigma_{ij} = \sum_{t=1}^{m} \sigma_{Area}^{t},$$

$m$ represents the number of preset ground feature classes, and $\sigma_{Area}^{t}$ represents the pattern spot area error of the $t$-th preset ground feature.
Compared with the prior art, the scheme provided by the embodiment of the application has the following beneficial effects:
1. In the scheme provided by the embodiment of the application, the error matrix is constructed from the pattern spot areas of a plurality of ground features in the first vector pattern spot and the second vector pattern spot. Both the number of correctly classified pattern spots and the pattern spot areas therefore influence the error matrix, which improves the accuracy of the precision verification result for second-class measurement data.
2. In the scheme provided by the embodiment of the application, the registration error and the scale error are also considered after the precision verification is carried out: the pattern spot area error of each ground feature is calculated from the registration error and the scale error, the accuracy ranges with these errors removed are calculated from the pattern spot area errors, and the verification report is generated from those ranges, further improving the accuracy of the precision verification result for second-class measurement data.
Drawings
Fig. 1 is a schematic flowchart of a method for verifying the precision of second-class measurement data of a crop planting area according to an embodiment of the present disclosure;
fig. 2a is a schematic view of a vector pattern spot generated from an image with a resolution of 2 meters according to an embodiment of the present disclosure;
fig. 2b is a schematic view of a vector pattern spot generated from an image with a resolution of 1 meter according to an embodiment of the present disclosure;
fig. 2c is a schematic diagram of the scale error between a vector pattern spot generated from an image with a resolution of 2 meters and one generated from an image with a resolution of 1 meter according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a registration error provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of a device for verifying the precision of second-class measurement data of a crop planting area according to an embodiment of the present disclosure.
Detailed Description
In the solutions provided in the embodiments of the present application, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The method for verifying the precision of second-class measurement data of the crop planting area provided by the embodiment of the present application is described in further detail below with reference to the accompanying drawings. A specific implementation may include the following steps (the flow of the method is shown in fig. 1):
step 101, determining a first remote sensing image of a crop planting area to be measured in any crop maturity period, and visually interpreting and sketching the first remote sensing image to obtain a first vector pattern spot.
In the scheme provided by the embodiment of the application, a plurality of remote sensing images of the crop planting area to be measured in different periods are stored in a local database in advance. A first remote sensing image of the crop planting area to be measured in any crop maturity period is selected from the local database, the resolution of the first remote sensing image generally being 2 meters or finer. The first remote sensing image is then visually interpreted and delineated to obtain a first vector pattern spot. The first vector pattern spot comprises prediction data for a plurality of preset ground feature classes; for example, it may comprise prediction data for nine classes such as facility agricultural land, crops, garden fruit trees, forest, grass, water, buildings, roads and other features. The prediction data includes the predicted category information, predicted position information and predicted area information of each class, and the first vector pattern spot is stored in a vector data format (.shp).
And 102, determining a second remote sensing image which has the same area range as the first remote sensing image and is orthorectified with the first remote sensing image, and sketching actual classification information obtained by preset field investigation into the second remote sensing image to obtain a second vector graphic spot.
In the scheme provided by the embodiment of the application, after the first remote sensing image is determined, a second remote sensing image which has the same area range as the first remote sensing image and is orthorectified together with it is selected from the local database, the resolution of the second remote sensing image generally being 1 meter or finer.
Further, actual classification information acquired through field surveys of the crop planting area to be measured in different periods is stored in the local database in advance, and the actual classification information from the same period as the first remote sensing image is selected. The second remote sensing image then serves as the base map for the field survey, and the actual classification information is delineated onto this base map to obtain a second vector pattern spot. The second vector pattern spot comprises actual data for the various ground feature classes, for example actual data for nine classes such as facility agricultural land, crops, garden fruit trees, forest, grass, water, buildings, roads and other features, where the actual data includes the actual category information, actual position information and actual area information of each class. The second vector pattern spot is also stored in a vector data format. The first vector pattern spot is treated as the prediction data and the second vector pattern spot as the verification data.
And 103, carrying out polygon superposition on the first vector pattern spot and the second vector pattern spot to obtain an intersection to obtain a third vector pattern spot, and determining the pattern spot areas of a plurality of preset ground objects according to the third vector pattern spot.
In the scheme provided by the embodiment of the application, after the first vector pattern spot and the second vector pattern spot are determined, polygon superposition is carried out on the first vector pattern spot and the second vector pattern spot to obtain an intersection so as to obtain a third vector pattern spot.
In a possible implementation manner, the third vector patch includes predicted category information, predicted position information, predicted area information of the preset multiple surface features, and actual category information, actual position information, and actual area information of the preset multiple surface features.
Specifically, the first vector pattern spot and the second vector pattern spot are both in a vector data format; that is, each comprises a plurality of vector elements, and the polygon superposition that yields the intersection is performed between every pair of corresponding vector elements of the two pattern spots. Because the third vector pattern spot is obtained as this intersection, it contains both the prediction data from the first vector pattern spot and the verification data from the second vector pattern spot. After the third vector pattern spot is obtained, the predicted pattern spot area information and the actual pattern spot area information of each ground feature are determined from it.
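The polygon superposition step can be illustrated with a toy overlay. The sketch below uses axis-aligned rectangles instead of real vector pattern spots (a simplification; a GIS library would perform true polygon intersection in practice), and produces the (predicted class, actual class, area) fragments that later fill the error matrix; all geometry and class names are invented:

```python
def rect_intersection_area(r1, r2):
    """Intersection area of two axis-aligned rectangles (xmin, ymin, xmax, ymax)."""
    w = min(r1[2], r2[2]) - max(r1[0], r2[0])
    h = min(r1[3], r2[3]) - max(r1[1], r2[1])
    return w * h if w > 0 and h > 0 else 0.0

def overlay(predicted, actual):
    """predicted/actual: lists of (class_name, rect) pattern spots.
    Returns the overlay fragments (predicted class, actual class, area)."""
    frags = []
    for p_cls, p_rect in predicted:
        for a_cls, a_rect in actual:
            area = rect_intersection_area(p_rect, a_rect)
            if area > 0:
                frags.append((p_cls, a_cls, area))
    return frags

# One predicted crop spot overlapping two field-surveyed spots
pred = [("crop", (0.0, 0.0, 10.0, 10.0))]
act = [("crop", (5.0, 0.0, 15.0, 10.0)), ("grass", (0.0, 0.0, 5.0, 10.0))]
frags = overlay(pred, act)
```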
And 104, establishing an error matrix according to the area of the pattern spot, calculating according to the error matrix to obtain overall precision, user precision and producer precision, and generating a verification report according to the overall precision, the user precision and the producer precision.
In the solution provided in the embodiment of the present application, there are various ways to establish the error matrix according to the area of the pattern spot, and a preferred way is taken as an example for description below.
In one possible implementation, establishing an error matrix according to the area of the image spot includes: determining the area of the pattern spot corresponding to the correctly classified ground object according to the predicted position information, the predicted category information, the actual position information and the actual category information corresponding to each ground object; and taking the predicted area information corresponding to each ground feature as the column of the error matrix, taking the actual area information corresponding to each ground feature as the row of the error matrix, and taking the area of the pattern spot corresponding to the ground feature with correct classification as the data on the diagonal line of the error matrix to establish and obtain the error matrix.
To facilitate understanding of the above process of establishing the error matrix, assume the vector pattern spots include nine ground feature classes: facility agricultural land, crops, garden fruit trees, forest, grass, water, buildings, roads and other features.
And establishing a 9 x 9 error matrix according to the predicted data of the nine types of ground objects in the first vector image spots and the actual data in the second vector image spots, wherein the columns of the error matrix represent the predicted area information corresponding to each ground object, the rows of the error matrix represent the actual area information corresponding to each ground object, and the area of the image spots corresponding to the ground objects with correct classification is used as the data on the diagonal line of the error matrix. Referring specifically to table 1, the embodiments of the present application provide an error matrix.
TABLE 1
(Table 1: a 9 × 9 area-based error matrix in which the columns give the predicted area of each ground feature class, the rows give the actual area, and the diagonal entries are the pattern spot areas of the correctly classified features.)
Further, after the error matrix is determined, the overall accuracy, the user accuracy and the producer accuracy are calculated from it. There are various ways to perform this calculation; a preferred way is described below as an example.
In one possible implementation, calculating from the error matrix an overall accuracy, a user accuracy, and a producer accuracy includes:
calculating the overall accuracy, the user accuracy and the producer accuracy by the following formulas:
$$OA = \frac{\sum_{i=1}^{m} X_{ii}}{N}$$

$$UA_{j} = \frac{X_{jj}}{X_{+j}}$$

$$PA_{i} = \frac{X_{ii}}{X_{i+}}$$

wherein $OA$ represents the overall accuracy; $X_{ii}$ (a diagonal entry of the error matrix) represents the pattern spot area of the correctly classified features of the $i$-th class, and $m$ is the number of classes; $N$ represents the total area of the crop planting area to be measured; $UA_{j}$ represents the user accuracy of the $j$-th class, with $X_{+j}$ the area sum of the $j$-th column of the error matrix; and $PA_{i}$ represents the producer accuracy of the $i$-th class, with $X_{i+}$ the area sum of the $i$-th row of the error matrix.
To further improve the accuracy of the precision verification results of the two types of measurement data, in one possible implementation, generating a verification report according to the overall precision, the user precision and the producer precision includes: calculating a total standard deviation according to a preset random error standard deviation formula, and determining the number of pixels on the second vector pattern spot boundary and the area occupied by a single pixel; calculating the spot area error of each ground feature according to the total standard deviation, the number of pixels and the area occupied by a single pixel; and determining the value ranges of the overall precision, the user precision and the producer precision according to the spot area error of each ground feature, and generating the verification report according to those value ranges.
Specifically, errors exist between remote sensing images with different resolutions, and these errors affect the accuracy of the precision verification of the second type of measurement data. The influence of such errors is not considered in the calculation of the overall precision, the user precision and the producer precision in step 104. Therefore, in order to further improve the accuracy of the precision verification result of the second type of measurement data, the errors between remote sensing images with different resolutions are taken into account when calculating the overall precision, the user precision and the producer precision.
In the solution provided in the embodiment of the present application, the errors considered mainly include the scale error and the registration error between images. In the remote sensing measurement of the crop planting area, the resolution of the first remote sensing image used for the second type of measurement is generally 2 meters, while the resolution of the second remote sensing image corresponding to the base map for the field survey is generally better than 1 meter. Because of the different scales, mixed pixels can appear on target edges between the first vector image spot determined in step 101 and the second vector image spot determined in step 102; that is, the real target class and the background class may fall within a single pixel, so a boundary deviation is produced when the vector image spots are generated. This boundary deviation is the scale error. Specifically, refer to fig. 2a, fig. 2b and fig. 2c, wherein fig. 2a is a schematic diagram of a vector patch generated from an image with a resolution of 2 meters according to an embodiment of the present disclosure; fig. 2b is a schematic diagram of a vector patch generated from an image with a resolution of 1 meter according to an embodiment of the present disclosure; and fig. 2c is a schematic diagram of the scale error between the vector patch generated from the 2-meter-resolution image and the vector patch generated from the 1-meter-resolution image according to an embodiment of the present application.
In addition, in step 102, the second remote sensing image is obtained by performing ortho-rectification on the first remote sensing image. Theoretically, each pixel of the first remote sensing image and the corresponding pixel of the second remote sensing image refer to the same geographic position, but in actual ortho-rectification a deviation of 1 pixel remains between the two images even in the best case, which results in a deviation of at least 1 pixel between the prediction result and the verification data; this is the registration error. Specifically, refer to fig. 3, a schematic diagram of the registration error provided in an embodiment of the present application.
Furthermore, due to the randomness of the scale error and the registration error, it is quite difficult to determine these errors exactly, but their influence range can be estimated. From the formation mechanism of the scale error, in the scheme provided by the embodiment of the present application, the scale difference of the images involved in the remote sensing measurement of the crop planting area is generally the difference between the 2-meter resolution of the first remote sensing image and the 1-meter resolution of the second remote sensing image; that is, the range represented by one pixel of the first remote sensing image corresponds to 4 pixels of the second remote sensing image, and, human error aside, the error range of the boundary of the vector image spot data on the second remote sensing image is within 1 pixel. Likewise, in the scheme provided by the embodiment of the present application, the image registration error involved in the remote sensing measurement of the crop planting area is also within 1 pixel. Specifically, the total standard deviation can be calculated by the following standard deviation formula for random errors:
σ = sqrt( Σ_{n=1}^{q} a_n² σ_n² )
wherein σ represents the total standard deviation; σ_n represents the standard deviation of a single random error term; a_n represents the transfer coefficient of a single random error term; and q represents the number of random error terms.
Further, in the scheme provided in the embodiment of the present application, the number of random error terms q is 2, the transfer coefficient a_n of each single random error is 1, and the standard deviation σ_n of each single random error is 1, so the total standard deviation is

σ = sqrt(1² + 1²) = √2 ≈ 1.414,

in units of pixels.
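The error-propagation step above can be sketched directly; the two 1-pixel error terms (scale and registration) with unit transfer coefficients are the values stated in the text.

```python
import math

def total_std(coeffs, stds):
    """Total standard deviation: sigma = sqrt(sum_n (a_n * sigma_n)^2)."""
    return math.sqrt(sum((a * s) ** 2 for a, s in zip(coeffs, stds)))

# q = 2 error terms (scale error, registration error),
# each with a_n = 1 and sigma_n = 1 pixel.
sigma = total_std([1, 1], [1, 1])   # = sqrt(2) ≈ 1.414 pixels
```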
In the scheme provided by the embodiment of the present application, after the precision verification is carried out, the registration error and the scale error are also considered: the spot area error corresponding to each ground feature is calculated from the registration error and the scale error, the precision range with these errors removed is calculated from the spot area error, and the verification report is generated according to that precision range, which further improves the accuracy of the precision verification result of the second type of measurement data.
That is: a total standard deviation is calculated according to a preset random error standard deviation formula, and the number of pixels on the second vector pattern spot boundary and the area occupied by a single pixel are determined; the spot area error of each ground feature is calculated according to the total standard deviation, the number of pixels and the area occupied by a single pixel; and the value ranges of the overall precision, the user precision and the producer precision are determined according to the spot area error of each ground feature, and the verification report is corrected according to those value ranges.
Further, since the error matrix in step 104 is calculated from areas, the spot area error of each feature can be calculated from the total standard deviation, the number of pixels and the area occupied by a single pixel. There are various ways to do so; a preferred way is described below as an example.
In one possible implementation, calculating a spot area error of each feature according to the total standard deviation, the number of pixels, and an area occupied by the single pixel includes:
calculating the area error of the image spot of each ground feature by the following formula:
σ_Area = L × A × σ
wherein σ_Area represents the spot area error of each ground feature; L represents the number of pixels corresponding to the second vector image spot boundary on the second remote sensing image; A represents the area occupied by a single pixel; and σ represents the total standard deviation.
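The spot-area-error formula above is a single product; the following sketch applies it with illustrative numbers (a patch boundary running through 200 pixels of a 1-meter image, i.e. 1 m² per pixel, with the √2-pixel total standard deviation derived earlier).

```python
import math

def spot_area_error(boundary_pixels, pixel_area, sigma):
    """sigma_Area = L * A * sigma."""
    return boundary_pixels * pixel_area * sigma

# Hypothetical patch: 200 boundary pixels, 1 m^2 per pixel, sigma = sqrt(2).
err = spot_area_error(200, 1.0, math.sqrt(2))   # area error in m^2
```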
Further, in a possible implementation manner, determining the value ranges of the overall accuracy, the user accuracy, and the producer accuracy according to the spot area error of each feature includes:
the value ranges of the overall accuracy, the user accuracy and the producer accuracy are represented by the following formulas:
OA_error = (Σ_{i=1}^{m} x_{ii} ± σ_{ij}) / N

PA_error = (x_{jj} ± σ_{jj}) / x_{+j}

UA_error = (x_{ii} ± σ_{ii}) / x_{i+}

wherein OA_error represents the value range of the overall accuracy; PA_error represents the value range of the user precision; UA_error represents the value range of the producer precision; σ_{ij} represents the combined area error of the spots corresponding to the correctly classified ground objects in the error matrix,

σ_{ij} = sqrt( Σ_{t=1}^{m} σ_t² )

wherein m represents the number of preset ground features, σ_t represents the total standard deviation (spot area error) of the t-th preset ground feature, and σ_{jj} and σ_{ii} denote the spot area errors of the jth and ith correctly classified ground features, respectively.
A final precision verification report is obtained according to the value ranges of the overall precision, the user precision and the producer precision. The final precision verification report includes the quantity and distribution of the effective verification data (from the field survey), the error matrix, the overall classification precision, the precision verification of the spot area check and the crop variety check, and so on. Since the two types of measurement results target the crop planting area, the overall accuracy, the user accuracy of the crop type and the producer accuracy of the crop type are mainly considered when judging whether the result precision is qualified. Generally, if the minimum of these three precision ranges exceeds 85 percent, the result is considered qualified.
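The range-and-threshold logic above can be sketched as follows, under stated assumptions: each accuracy becomes an interval [(x − σ)/d, (x + σ)/d], and the result is qualified when the lower bound of every interval clears the 85% threshold. All numeric values are illustrative.

```python
def accuracy_range(correct_area, denominator_area, area_error):
    """Value range of an accuracy: [(x - sigma)/d, (x + sigma)/d]."""
    return ((correct_area - area_error) / denominator_area,
            (correct_area + area_error) / denominator_area)

def qualified(ranges, threshold=0.85):
    """Qualified when every interval's lower bound meets the threshold."""
    return all(lo >= threshold for lo, _ in ranges)

# Hypothetical areas (m^2): correctly classified area, denominator, error.
oa = accuracy_range(920.0, 1000.0, 10.0)   # overall accuracy range
pa = accuracy_range(450.0, 500.0, 5.0)     # per-class accuracy range
ok = qualified([oa, pa])
```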
According to the scheme provided by the embodiment of the present application, the first vector pattern spot and the second vector pattern spot are overlaid as polygons to obtain their intersection, yielding a third vector pattern spot; the spot areas of a plurality of preset ground objects are then determined from the third vector pattern spot, and an error matrix is established from those spot areas; finally, the overall precision, the user precision and the producer precision are calculated from the error matrix, and a verification report is generated from them. Because the error matrix is constructed from the spot areas of the ground features in the first and second vector spots, both the influence of the number of classified spots and the influence of the spot areas on the precision of the measured data are taken into account when constructing the error matrix, which further improves the accuracy of the precision verification result of the second type of measurement data.
Based on the same inventive concept as the method shown in fig. 1, the embodiment of the present application provides an apparatus for verifying the accuracy of two types of measurement data of crop planting area, referring to fig. 4, the apparatus includes:
the determining unit 401 is configured to determine a first remote sensing image of a crop planting area to be measured in any crop maturity period, and visually interpret and delineate the first remote sensing image to obtain a first vector pattern spot;
a delineating unit 402, configured to determine a second remote sensing image that covers the same area range as the first remote sensing image and is orthorectified with the first remote sensing image, and to delineate actual classification information obtained by a preset field survey into the second remote sensing image to obtain a second vector pattern spot;
the superposition unit 403 is configured to perform polygon superposition on the first vector pattern spot and the second vector pattern spot to obtain an intersection to obtain a third vector pattern spot, and determine the pattern spot areas of a plurality of preset surface features according to the third vector pattern spot;
a calculating unit 404, configured to establish an error matrix according to the area of the pattern spot, calculate a total accuracy, a user accuracy, and a producer accuracy according to the error matrix, and generate a verification report according to the total accuracy, the user accuracy, and the producer accuracy.
Optionally, the third vector patch includes predicted category information, predicted position information, predicted area information of the preset multiple surface features, and actual category information, actual position information, and actual area information of the preset multiple surface features.
Optionally, the calculating unit 404 is specifically configured to:
determining the area of the pattern spot corresponding to the correctly classified ground object according to the predicted position information, the predicted category information, the actual position information and the actual category information corresponding to each ground object;
and taking the predicted area information corresponding to each ground feature as the column of the error matrix, taking the actual area information corresponding to each ground feature as the row of the error matrix, and taking the area of the pattern spot corresponding to the ground feature with correct classification as the data on the diagonal line of the error matrix to establish and obtain the error matrix.
Optionally, the calculating unit 404 is specifically configured to:
calculating the overall accuracy, the user accuracy and the producer accuracy by the following formulas:
OA = (Σ_{i=1}^{m} x_{ii}) / N

PA = x_{jj} / x_{+j}

UA = x_{ii} / x_{i+}

wherein OA represents the overall accuracy; x_{ii} represents the spot area of the correctly classified ground objects (the diagonal elements of the error matrix); N represents the total area of the crop planting area to be measured; PA represents the user precision; UA represents the producer precision; x_{+j} represents the sum of the areas of the jth column of the error matrix; and x_{i+} represents the sum of the areas of the ith row of the error matrix.
Optionally, the calculating unit 404 is specifically configured to:
calculating to obtain a total standard deviation according to a preset random error standard deviation formula, and determining the number of pixels of the second vector pattern spot boundary and the area occupied by a single pixel;
calculating the area error of the pattern spot of each ground feature according to the total standard deviation, the number of the pixels and the area occupied by the single pixel;
and determining the value ranges of the overall precision, the user precision and the producer precision according to the image spot area error of each ground feature, and generating the verification report according to the value ranges of the overall precision, the user precision and the producer precision.
Optionally, the calculating unit 404 is specifically configured to:
calculating the area error of the image spot of each ground feature by the following formula:
σ_Area = L × A × σ

wherein σ_Area represents the spot area error of each ground feature; L represents the number of boundary pixels of the second vector image spot; A represents the area occupied by a single pixel; and σ represents the total standard deviation.
Optionally, the calculating unit 404 is specifically configured to:
the value ranges of the overall accuracy, the user accuracy and the producer accuracy are represented by the following formulas:
OA_error = (Σ_{i=1}^{m} x_{ii} ± σ_{ij}) / N

PA_error = (x_{jj} ± σ_{jj}) / x_{+j}

UA_error = (x_{ii} ± σ_{ii}) / x_{i+}

wherein OA_error represents the value range of the overall accuracy; PA_error represents the value range of the user precision; UA_error represents the value range of the producer precision; σ_{ij} represents the combined area error of the spots corresponding to the correctly classified ground objects in the error matrix,

σ_{ij} = sqrt( Σ_{t=1}^{m} σ_t² )

wherein m represents the number of preset ground features, σ_t represents the total standard deviation (spot area error) of the t-th preset ground feature, and σ_{jj} and σ_{ii} denote the spot area errors of the jth and ith correctly classified ground features, respectively.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. A method for verifying the accuracy of second-class measurement data of crop planting area is characterized by comprising the following steps:
determining a first remote sensing image of a crop planting area to be measured in any crop maturity period, and visually interpreting and sketching the first remote sensing image to obtain a first vector pattern spot;
determining a second remote sensing image which has the same area range as the first remote sensing image and is orthorectified with the first remote sensing image, and sketching actual classification information obtained by preset field investigation into the second remote sensing image to obtain a second vector pattern spot;
carrying out polygon superposition on the first vector pattern spot and the second vector pattern spot to obtain an intersection to obtain a third vector pattern spot, and determining the pattern spot areas of a plurality of preset surface features according to the third vector pattern spot;
and establishing an error matrix according to the area of the pattern spot, calculating according to the error matrix to obtain overall accuracy, user accuracy and producer accuracy, and generating a verification report according to the overall accuracy, the user accuracy and the producer accuracy.
2. The method of claim 1, wherein the third vector patch includes predicted category information, predicted position information, predicted area information of the preset plurality of features, and actual category information, actual position information, actual area information of the preset plurality of features.
3. The method of claim 2, wherein building an error matrix from the spot area comprises:
determining the area of the pattern spot corresponding to the correctly classified ground object according to the predicted position information, the predicted category information, the actual position information and the actual category information corresponding to each ground object;
and taking the predicted area information corresponding to each ground feature as the column of the error matrix, taking the actual area information corresponding to each ground feature as the row of the error matrix, and taking the area of the pattern spot corresponding to the ground feature with correct classification as the data on the diagonal line of the error matrix to establish and obtain the error matrix.
4. The method of claim 3, wherein calculating from the error matrix yields an overall accuracy, a user accuracy, and a producer accuracy, comprising:
calculating the overall accuracy, the user accuracy and the producer accuracy by the following formulas:
OA = (Σ_{i=1}^{m} x_{ii}) / N

PA = x_{jj} / x_{+j}

UA = x_{ii} / x_{i+}

wherein OA represents the overall accuracy; x_{ii} represents the spot area of the correctly classified ground objects (the diagonal elements of the error matrix); N represents the total area of the crop planting area to be measured; PA represents the user precision; UA represents the producer precision; x_{+j} represents the sum of the areas of the jth column of the error matrix; and x_{i+} represents the sum of the areas of the ith row of the error matrix.
5. The method of any of claims 1 to 4, wherein generating a validation report based on the overall accuracy, the user accuracy, and the producer accuracy comprises:
calculating to obtain a total standard deviation according to a preset random error standard deviation formula, and determining the number of pixels of the second vector pattern spot boundary and the area occupied by a single pixel;
calculating the area error of the pattern spot of each ground feature according to the total standard deviation, the number of the pixels and the area occupied by the single pixel;
and determining the value ranges of the overall precision, the user precision and the producer precision according to the image spot area error of each ground feature, and generating the verification report according to the value ranges of the overall precision, the user precision and the producer precision.
6. The method of claim 5, wherein calculating the spot area error for each feature based on the total standard deviation, the number of pixels, and the area occupied by the single pixel comprises:
calculating the area error of the image spot of each ground feature by the following formula:
σ_Area = L × A × σ

wherein σ_Area represents the spot area error of each ground feature; L represents the number of pixels corresponding to the second vector image spot boundary on the second remote sensing image; A represents the area occupied by a single pixel; and σ represents the total standard deviation.
7. The method of claim 6, wherein determining the value ranges of the overall accuracy, the user accuracy, and the producer accuracy from the spot area error of each feature comprises:
the value ranges of the overall accuracy, the user accuracy and the producer accuracy are represented by the following formulas:
OA_error = (Σ_{i=1}^{m} x_{ii} ± σ_{ij}) / N

PA_error = (x_{jj} ± σ_{jj}) / x_{+j}

UA_error = (x_{ii} ± σ_{ii}) / x_{i+}

wherein OA_error represents the value range of the overall accuracy; PA_error represents the value range of the user precision; UA_error represents the value range of the producer precision; σ_{ij} represents the combined area error of the spots corresponding to the correctly classified ground objects in the error matrix,

σ_{ij} = sqrt( Σ_{t=1}^{m} σ_t² )

wherein m represents the number of preset ground features, σ_t represents the total standard deviation (spot area error) of the t-th preset ground feature, and σ_{jj} and σ_{ii} denote the spot area errors of the jth and ith correctly classified ground features, respectively.
8. An apparatus for verifying the precision of second-class measurement data of crop planting area, characterized by comprising:
the determining unit is used for determining a first remote sensing image of a crop planting area to be measured in any crop maturity period, and visually interpreting and sketching the first remote sensing image to obtain a first vector pattern spot;
the delineation unit is used for determining a second remote sensing image which has the same area range as the first remote sensing image and is orthorectified with the first remote sensing image, and delineating actual classification information obtained by a preset field survey in the second remote sensing image to obtain a second vector graphic spot;
the superposition unit is used for carrying out polygon superposition on the first vector pattern spot and the second vector pattern spot to obtain an intersection so as to obtain a third vector pattern spot, and determining the pattern spot areas of a plurality of preset ground objects according to the third vector pattern spot;
and the calculation unit is used for establishing an error matrix according to the area of the pattern spot, calculating according to the error matrix to obtain overall precision, user precision and producer precision, and generating a verification report according to the overall precision, the user precision and the producer precision.
9. The apparatus of claim 8, wherein the third vector patch comprises predicted category information, predicted position information, predicted area information of the preset plurality of features, and actual category information, actual position information, actual area information of the preset plurality of features.
10. The apparatus of claim 9, wherein the computing unit is specifically configured to:
calculating to obtain a total standard deviation according to a preset random error standard deviation formula, and determining the number of pixels of the second vector pattern spot boundary and the area occupied by a single pixel;
calculating the area error of the pattern spot of each ground feature according to the total standard deviation, the number of the pixels and the area occupied by the single pixel;
and determining the value ranges of the overall precision, the user precision and the producer precision according to the image spot area error of each ground feature, and generating the verification report according to the value ranges of the overall precision, the user precision and the producer precision.
CN202010930985.3A 2020-09-07 2020-09-07 Method and device for verifying accuracy of second-class measurement data of crop planting area Active CN112241981B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010930985.3A CN112241981B (en) 2020-09-07 2020-09-07 Method and device for verifying accuracy of second-class measurement data of crop planting area


Publications (2)

Publication Number Publication Date
CN112241981A true CN112241981A (en) 2021-01-19
CN112241981B CN112241981B (en) 2024-03-22

Family

ID=74170707

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010930985.3A Active CN112241981B (en) 2020-09-07 2020-09-07 Method and device for verifying accuracy of second-class measurement data of crop planting area

Country Status (1)

Country Link
CN (1) CN112241981B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113554675A (en) * 2021-07-19 2021-10-26 贵州师范大学 Edible fungus yield estimation method based on unmanned aerial vehicle visible light remote sensing
CN114283335A (en) * 2021-12-27 2022-04-05 河南大学 Historical period remote sensing identification precision verification preparation method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101446981A (en) * 2008-12-26 2009-06-03 北京农业信息技术研究中心 Land-utilization modification investigation method and system based on the combination of PDA and 3S
CN105225227A (en) * 2015-09-07 2016-01-06 中国测绘科学研究院 The method and system that remote sensing image change detects
CN106123812A (en) * 2016-08-14 2016-11-16 覃泽林 The method and device of relief surface sugarcane acreage is obtained based on remote sensing image
CN107084688A (en) * 2017-05-06 2017-08-22 湖北大学 A kind of crop area Dynamic Change by Remote Sensing monitoring method based on plot yardstick


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MICHAEL CRAIG ET AL: "A Literature Review of Crop Area Estimation", 《UN-FAO》 *
WANG Lijun et al.: "Research on the accounting method for winter-wheat interpreted area from GF-1 imagery", Chinese Journal of Agricultural Resources and Regional Planning *



Similar Documents

Publication Publication Date Title
Han et al. Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data
Wu et al. An improved high spatial and temporal data fusion approach for combining Landsat and MODIS data to generate daily synthetic Landsat imagery
An et al. Quantifying time-series of leaf morphology using 2D and 3D photogrammetry methods for high-throughput plant phenotyping
Mahmon et al. Differences of image classification techniques for land use and land cover classification
CN114091613B (en) Forest biomass estimation method based on high-score joint networking data
CN104408463B (en) High-resolution construction land pattern spot identification method
CN112241981B (en) Method and device for verifying accuracy of second-class measurement data of crop planting area
US11719858B2 (en) Determination of location-specific weather information for agronomic decision support
CN109508881B (en) Sea island region classification and ecological resource value evaluation method
Liu et al. A novel entropy-based method to quantify forest canopy structural complexity from multiplatform lidar point clouds
CN115457408A (en) Land monitoring method and device, electronic equipment and medium
US20120154398A1 (en) Method of determining implicit hidden features of phenomena which can be represented by a point distribution in a space
CN110210112A (en) Couple the urban heat land effect Scene Simulation method of land use planning
Tong et al. A least squares-based method for adjusting the boundaries of area objects
Agarwal et al. Development of machine learning based approach for computing optimal vegetation index with the use of sentinel-2 and drone data
CN117035174A (en) Method and system for estimating biomass on single-woodland of casuarina equisetifolia
CN109726679B (en) Remote sensing classification error spatial distribution mapping method
CN116485174A (en) Method and device for evaluating risk of ozone pollution on crop yield reduction
CN114662621B (en) Agricultural machinery working area calculation method and system based on machine learning
CN116091936A (en) Agricultural condition parameter inversion method for fusing point-land block-area scale data
CN114842326A (en) Calibration-free sandalwood plant seedling shortage positioning method
Zimmermann et al. Accuracy assessment of normalized digital surface models from aerial images regarding tree height determination in Saxony, Germany
CN113420875A (en) Convolutional neural network soil available phosphorus analysis model construction system and method
Congalton How to Assess the Accuracy of Maps Generated from Remotely Sensed Data
CN116844075B (en) Tillage environment judging method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant