Accurate calculation method for volume variation of multi-temporal photogrammetric data
Technical Field
The invention relates to the technical field of surveying and mapping, and in particular to an accurate calculation method for the volume variation of multi-temporal photogrammetric data.
Background
In urban construction, earthwork volume calculation requires joint computation across multi-temporal photogrammetric data acquired over multiple periods: the deviation between the surface three-dimensional models must be resolved before an accurate cut-and-fill volume can be calculated. Existing methods such as the cross-section method and topographic point surveying suffer from heavy workload and inaccurate results.
An unmanned aerial vehicle (UAV) can quickly acquire three-dimensional surface data by combining photogrammetry or computer-vision methods. However, UAV photogrammetry for earthwork volume calculation usually requires an accurate spatial reference, so either a high-precision POS module (RTK/PPK/PPP) must be installed on the UAV for precise positioning, or ground image control points must be surveyed. Both approaches require a greater cost investment and increase the manual workload of both field and office work.
Without image control points or high-precision POS assistance, the spatial registration of UAV image data acquired at different times faces the following problems: 1) with only GPS assistance and no high-precision POS, the precise flight position and attitude of the UAV cannot be determined; 2) without image control points, the accurate spatial reference of the photogrammetric result cannot be determined; 3) the observed object changes over time, so direct matching leads to spatial registration inaccuracies.
Therefore, the problem to be solved is how to realize the spatial registration of UAV image data from different periods without the assistance of image control points or a high-precision POS (position and orientation system), and how to accurately calculate the volume change between the data of different periods.
Disclosure of Invention
In order to overcome the above defects in the prior art, the invention provides an accurate calculation method for the volume variation of multi-temporal photogrammetric data, which solves the technical problems that, without image control points and high-precision POS assistance, the spatial registration of UAV image data acquired at different times is difficult and the calculated volume variation of the object is inaccurate.
The invention is realized by the following technical scheme:
A method for accurately calculating the volume change of multi-temporal photogrammetric data specifically comprises the following steps:
S1: using a UAV to collect photogrammetric data over multiple periods before and after the volume change of a measured object, obtaining UAV aerial images and corresponding GPS data;
S2: performing a space-three (aerial triangulation) solution on the multi-period UAV aerial images with the assistance of their respective GPS data, to obtain space-three results;
S3: performing dense matching on the multi-period UAV aerial images on the basis of their respective space-three results, to obtain dense matching point clouds;
S4: taking the space where the dense matching point cloud of a certain period is located as a fixed reference space, and registering and converting the dense matching point clouds of the other periods into the fixed reference space;
S5: calculating the volume change from the dense matching point clouds of each period after conversion into the same fixed reference space.
Further, the acquisition of the photogrammetric data in S1 comprises the following steps:
S11: defining a volume calculation boundary for the volume-change part of the measured object;
S12: expanding the measurement boundary of the volume-change part outwards by a certain range to obtain the UAV flight acquisition range;
S13: the UAV flies within the range defined in S12 and collects photogrammetric data for the different periods.
Further, the forward and side overlap between the photographs taken by the UAV is not less than 60%.
Furthermore, the UAV flies along a zigzag route when collecting photogrammetric data, and the photogrammetric range, flight height, overlap, and focal length parameters are kept consistent across the multiple flights.
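As a rough illustration (not part of the claimed method), the exposure spacing needed to achieve a given forward overlap follows from the flight height, lens focal length, and sensor size under a simple pinhole-camera model; the camera parameters below are hypothetical.

```python
def exposure_spacing(flight_height_m, focal_mm, sensor_mm, overlap):
    """Ground footprint along one axis and the exposure spacing that
    yields the requested overlap, using a pinhole-camera approximation."""
    footprint = flight_height_m * sensor_mm / focal_mm  # ground coverage, metres
    spacing = footprint * (1.0 - overlap)               # distance between exposures
    return footprint, spacing

# Hypothetical camera: 24 mm lens, 36 mm sensor width, 100 m flight height
footprint, spacing = exposure_spacing(100.0, 24.0, 36.0, 0.60)
# footprint = 150 m of ground per image; exposures every 60 m give 60% overlap
```

With the 60% overlap required above, each new exposure advances by only 40% of the image footprint, which is what makes dense matching between adjacent images possible.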
Further, the space-three results of S2 are the interior and exterior orientation elements of the UAV aerial images.
Further, the dense matching methods usable in S3 include SGM, PMVS, Plane-Sweep, and PatchMatch.
Further, S4 specifically comprises:
S41: dividing the dense matching point clouds resolved from the multi-period UAV aerial images using the boundary of the measured object from S1, and separating out, for each period, the dense matching point cloud of the expanded region lying outside the boundary of the measured object;
S42: taking the expanded-region dense matching point cloud of the m-th period (outside the boundary of the measured object) as the fixed point set, taking the expanded-region point clouds of the other periods as floating point sets, and registering each period's floating point set against the fixed point set to obtain the transformation matrix of each other period's dense matching point cloud relative to that of the m-th period;
S43: transforming the dense matching point clouds of the other periods using the transformation matrices from S42, so that they and the m-th period's dense matching point cloud lie in a unified reference space.
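The registration and transformation of S42–S43 can be sketched as follows. This is a minimal closed-form rigid alignment (Kabsch/SVD) that assumes point correspondences between the floating and fixed clouds are already known; a full ICP, as named in the method, would additionally re-estimate nearest-neighbour correspondences at each iteration.

```python
import numpy as np

def rigid_align(floating, fixed):
    """Closed-form rigid registration (Kabsch): find the 4x4 matrix T
    (rotation R, translation t) minimising ||R @ floating + t - fixed||,
    given (N, 3) arrays of corresponding points."""
    cf = floating.mean(axis=0)
    cx = fixed.mean(axis=0)
    H = (floating - cf).T @ (fixed - cx)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # correct for possible reflection so that det(R) = +1
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cx - R @ cf
    T = np.eye(4)                           # the "change matrix" of S42
    T[:3, :3], T[:3, 3] = R, t
    return T

def apply_transform(T, points):
    """Transform an (N, 3) point cloud into the fixed reference space (S43)."""
    return points @ T[:3, :3].T + T[:3, 3]
```

In practice the fixed point set is the expanded-region cloud of the m-th period and the floating set is that of another period; the resulting T is then applied to the *entire* cloud of that period, not only the expanded region.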
Further, the registration in S42 is rigid registration or non-rigid ICP registration.
Further, S5 specifically comprises:
S51: filtering the dense matching point cloud of each period, removing objects and retaining the terrain;
S52: uniformly sampling the terrain points remaining after filtering, generating a continuous or rasterized ground surface by triangulated-network construction or spline surface fitting, differencing the surface data of the multiple periods, and calculating the volume change.
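The surface-differencing volume computation of S52 can be sketched with a simple raster surface, taking the mean height of the points in each grid cell; the cell size and the mean-height surface are simplifying assumptions of this sketch, whereas the method as described would use a triangulated network or a spline-fitted surface.

```python
import numpy as np

def volume_change(points_before, points_after, cell=1.0):
    """Rasterise two terrain point clouds onto a common grid (mean height
    per cell) and integrate the height difference into a signed volume.
    Cells observed in only one epoch are ignored."""
    all_xy = np.vstack([points_before[:, :2], points_after[:, :2]])
    x0, y0 = all_xy.min(axis=0)
    nx = int((all_xy[:, 0].max() - x0) / cell) + 1
    ny = int((all_xy[:, 1].max() - y0) / cell) + 1

    def rasterise(pts):
        ix = ((pts[:, 0] - x0) / cell).astype(int)
        iy = ((pts[:, 1] - y0) / cell).astype(int)
        s = np.zeros((nx, ny))
        n = np.zeros((nx, ny))
        np.add.at(s, (ix, iy), pts[:, 2])   # sum of heights per cell
        np.add.at(n, (ix, iy), 1.0)         # point count per cell
        return np.divide(s, n, out=np.full((nx, ny), np.nan), where=n > 0)

    dz = rasterise(points_after) - rasterise(points_before)
    # each cell observed in both epochs contributes cell_area * dz
    return np.nansum(dz) * cell * cell
```

A positive result indicates net fill and a negative result net excavation over the measured area; cut and fill could be separated by summing the positive and negative parts of `dz` individually.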
Compared with the prior art, the invention has the beneficial effects that:
The accurate calculation method for the volume variation of multi-temporal photogrammetric data requires neither a high-precision POS module nor ground image control points, so the operators' workload is small and the cost is low. The aerial images of the same site, acquired by the UAV at different times, are processed by space-three calculation and dense matching to obtain point clouds that are far denser than the measurements of the traditional cross-section method or topographic point surveying, giving higher measurement accuracy; and UAV aerial photography is more efficient than traditional data acquisition.
Drawings
Fig. 1 is a flowchart illustrating a method for accurately calculating a volume change of multi-temporal photogrammetric data according to an embodiment of the present invention.
Detailed Description
The following examples illustrate particular embodiments of the invention and should not be construed as limiting its scope. Modifications of the materials, methods, and conditions described herein are intended to fall within the spirit and scope of the invention.
As shown in Fig. 1, for a certain construction site (area of 0.5 km²), data are acquired once before and once after the cut-and-fill work, and the volume change produced by the cut-and-fill construction is analysed and calculated. The method specifically comprises the following steps:
s1: acquiring photogrammetric data of the unmanned aerial vehicle in two periods before filling and excavating work and after filling and excavating work by using the unmanned aerial vehicle to obtain aerial images of the unmanned aerial vehicle and corresponding GPS data;
the acquisition of photogrammetric data comprises the following steps:
S11: defining an earthwork calculation boundary according to the construction red line;
S12: extending the earthwork calculation boundary outwards by one flight height to obtain the UAV flight acquisition range;
S13: the UAV flies within the range defined in S12 and collects the photogrammetric data; only GPS positioning is needed during flight data collection, and no high-precision POS positioning is required;
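For an axis-aligned boundary the outward extension of S12 reduces to widening the bounding box by one flight height; this is a simplified sketch, since a real construction red line is a polygon for which a proper polygon buffer operation would be used instead.

```python
def expand_boundary(bbox, flight_height_m):
    """Expand an axis-aligned boundary box (xmin, ymin, xmax, ymax)
    outwards by one flight height to obtain the flight acquisition range."""
    xmin, ymin, xmax, ymax = bbox
    h = flight_height_m
    return (xmin - h, ymin - h, xmax + h, ymax + h)

# A 100 m x 50 m site flown at 100 m height
flight_range = expand_boundary((0.0, 0.0, 100.0, 50.0), 100.0)
# flight_range = (-100.0, -100.0, 200.0, 150.0)
```

The ring between the earthwork boundary and this expanded range is the stable terrain later used as the fixed/floating point sets for registration in S42.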
The UAV shoots vertically downwards or at a certain tilt angle, with forward and side overlap between photographs of not less than 60%; it flies along a zigzag route when collecting photogrammetric data, and the photogrammetric range, flight height, overlap, and focal length parameters are kept consistent between the two flights.
S2: the UAV aerial images of the two periods are each subjected to a space-three solution with the assistance of their respective GPS data; the position information restrains the space-three bending caused by camera distortion and gives the space-three result accurate absolute scale and a rough spatial reference, and the interior and exterior orientation elements of the camera are solved;
S3: performing dense matching on the UAV aerial images of the two periods, on the basis of their respective space-three results, using a dense matching method such as SGM, PMVS, Plane-Sweep, or PatchMatch, to obtain dense matching point clouds;
S4: taking the space where the dense matching point cloud of the 1st period is located as the fixed reference space, and registering and converting the dense matching point cloud of the 2nd period into the fixed reference space, which specifically comprises the following steps:
S41: dividing the dense matching point clouds resolved from the UAV aerial images of the two periods using the boundary of the measured object from S1, and separating out, for both periods, the dense matching point cloud of the expanded region lying outside the boundary of the measured object;
S42: taking the expanded-region dense matching point cloud of the 1st period (outside the boundary of the measured object) as the fixed point set and that of the 2nd period as the floating point set, and registering the floating point set against the fixed point set, by rigid registration or non-rigid ICP registration, to obtain the transformation matrix of the 2nd period's dense matching point cloud relative to the 1st period's;
S43: transforming the 2nd period's dense matching point cloud using the transformation matrix from S42, so that the dense matching point clouds of the two periods lie in a unified reference space.
S5: calculating the volume change from the two periods' dense matching point clouds after conversion into the same fixed reference space, which specifically comprises the following steps:
S51: filtering the dense matching point clouds of the two periods, removing objects and retaining the terrain.
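The terrain filtering of S51 can be sketched with a crude grid-minimum filter that keeps only points near the lowest elevation in each cell, discarding elevated objects such as vehicles or machinery; the cell size and tolerance are illustrative assumptions, and production pipelines would use more robust filters such as progressive TIN densification or cloth simulation.

```python
import numpy as np

def ground_filter(points, cell=2.0, tol=0.5):
    """Keep points within `tol` metres of the lowest point of their grid
    cell; `points` is an (N, 3) array of x, y, z coordinates."""
    ix = ((points[:, 0] - points[:, 0].min()) / cell).astype(int)
    iy = ((points[:, 1] - points[:, 1].min()) / cell).astype(int)
    key = ix * (iy.max() + 1) + iy                 # flatten 2-D cell index
    zmin = np.full(key.max() + 1, np.inf)
    np.minimum.at(zmin, key, points[:, 2])         # lowest z per cell
    return points[points[:, 2] <= zmin[key] + tol]
```

For example, a point 5 m above its neighbours within the same cell (a parked truck, say) would be removed while the surrounding terrain points are retained.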
S52: uniformly sampling the terrain points remaining after filtering for each period, generating a continuous or rasterized ground surface by triangulated-network construction or spline surface fitting, differencing the surface data of the two periods, and calculating the volume change.
In conclusion, the accurate calculation method for the volume variation of multi-temporal photogrammetric data requires neither a high-precision POS module nor ground image control points, so the operators' workload is small and the cost is low; the aerial images of the same site, acquired by the UAV at different times, are processed by space-three calculation and dense matching to obtain point clouds far denser than the measurements of the traditional cross-section method or topographic point surveying, giving higher measurement accuracy; and UAV aerial photography is more efficient than traditional data acquisition.
The above description is only one embodiment of the present invention and is not intended to limit its scope; all equivalent structural or process modifications made using the contents of the present specification and drawings, whether applied directly or indirectly in other related technical fields, are likewise included within the scope of the present invention.