CN114838710B - Engineering rapid mapping method and mapping system based on unmanned aerial vehicle photographing - Google Patents


Info

Publication number
CN114838710B
CN114838710B (application CN202210318956.0A)
Authority
CN
China
Prior art keywords
mapping
grid
data
aerial vehicle
unmanned aerial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210318956.0A
Other languages
Chinese (zh)
Other versions
CN114838710A (en)
Inventor
陈超
方三陵
臧凌宇
Current Assignee
China First Metallurgical Group Co Ltd
Original Assignee
China First Metallurgical Group Co Ltd
Priority date
Filing date
Publication date
Application filed by China First Metallurgical Group Co Ltd filed Critical China First Metallurgical Group Co Ltd
Priority to CN202210318956.0A
Publication of CN114838710A
Application granted
Publication of CN114838710B


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/02: Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G01C 11/04: Interpretation of pictures
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C 39/00: Aircraft not otherwise provided for
    • B64C 39/02: Aircraft not otherwise provided for characterised by special use
    • B64D: EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D 47/00: Equipment not otherwise provided for
    • B64D 47/08: Arrangements of cameras
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06: Systems determining position data of a target
    • G01S 17/08: Systems determining position data of a target for measuring distance only
    • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/30: Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

The application discloses a rapid engineering surveying and mapping method and system based on unmanned aerial vehicle photographing. The mapping method comprises the steps of: collecting aerial photographs of a mapping area through the camera equipment, the photographs containing grid coordinates generated by the grid filter; collecting flying height data of the mapping area through the laser range finder; obtaining, by the processing end, projection distance data between target mapping points of the mapping area from the flying height data with reference to the grid coordinates; and obtaining a target mapping model from the projection distance data through the processing end. After the data are collected, the distances between points are obtained by simple calculation, and the mapping model is obtained by combining them with the height data, so that the redundancy of mapping data processing is reduced while basic accuracy is guaranteed, and the related engineering applications are better served.

Description

Engineering rapid mapping method and mapping system based on unmanned aerial vehicle photographing
Technical Field
The application relates to the technical field of engineering surveying and mapping, in particular to a rapid surveying and mapping method and system for engineering based on unmanned aerial vehicle photographing.
Background
In project practice, various engineering mapping tasks are required, performed mainly with equipment such as rulers, theodolites and total stations. In some scenarios, however, such as site plan mapping and high-building facade mapping, the work is difficult.
In the related art, a carried lidar or similar equipment is generally used for such work. Although its mapping accuracy is high, it is costly and time-consuming, and its cost-effectiveness is insufficient for a project department in tasks that do not depend on high accuracy.
Disclosure of Invention
In view of the above, the present application provides a rapid mapping method and a mapping system for engineering based on unmanned aerial vehicle photographing, which can reduce complexity of mapping data processing while ensuring mapping accuracy.
In a first aspect, the application provides a rapid engineering mapping method based on unmanned aerial vehicle photographing, which is applied to a mapping system, wherein the mapping system comprises a processing end, an unmanned aerial vehicle provided with a camera device and a laser range finder hung on the unmanned aerial vehicle, and the camera device is provided with a grid filter;
the mapping method comprises the following steps:
acquiring aerial photographs of a mapping area through the image pickup equipment, wherein the aerial photographs contain grid coordinates generated by the grid filter;
collecting flight height data of a mapping area through the laser range finder;
obtaining, by the processing end, projection distance data between target mapping points of the mapping region according to the flying height data with reference to the grid coordinates;
and obtaining a target mapping model according to the projection distance data through the processing end.
Optionally, the grid filter is a 2×2 filter or a 4×4 filter.
Optionally, the unmanned aerial vehicle is loaded with an RTK module.
Optionally, the laser rangefinder is mounted coaxially with the image capturing apparatus.
Optionally, the measuring range of the laser range finder is 5 cm to 40 m.
In another aspect, the present application provides a rapid engineering mapping system based on unmanned aerial vehicle photographing, comprising:
the unmanned aerial vehicle is provided with an image pickup device, wherein the image pickup device is provided with a grid filter, the image pickup device collects aerial pictures of a mapping area, and the aerial pictures contain grid coordinates generated by the grid filter;
the laser range finder is hung on the unmanned aerial vehicle and used for collecting flight height data of a mapping area;
the processing end is used for obtaining projection distance data between target mapping points of the mapping area according to the flying height data under the reference of the grid coordinates and obtaining a target mapping model according to the projection distance data.
The engineering rapid mapping method and mapping system based on unmanned aerial vehicle photographing collect the required data once the laser range finder is carried. After the data are collected, the distances between points are obtained by simple calculation, and the mapping model is obtained by combining them with the height data, so that the redundancy of mapping data processing is reduced while basic accuracy is guaranteed, and the related engineering applications are better served.
Drawings
The technical solution and other advantageous effects of the present application will be made apparent by the following detailed description of the specific embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of a grid filter according to an embodiment of the present application.
FIG. 2 is a schematic diagram of calculating a projection distance with grid coordinate reference according to an embodiment of the present application.
FIG. 3 is another schematic diagram of calculating a projection distance with grid coordinate reference according to an embodiment of the present application.
FIG. 4 is a schematic illustration of a continuous process point network provided by an embodiment of the present application.
FIG. 5 is a schematic diagram of measurement continuity according to an embodiment of the present application.
Fig. 6 is a schematic diagram of a combining point network according to an embodiment of the present application.
Fig. 7 is a schematic diagram of facade (elevation) measurement according to an embodiment of the present application.
FIG. 8 is a schematic diagram of a slope measurement according to an embodiment of the present application.
FIG. 9 is a schematic diagram of a flatness array according to an embodiment of the present application.
Fig. 10 is a schematic diagram of flatness measurement according to an embodiment of the present application.
FIG. 11 is a schematic diagram of error analysis according to an embodiment of the present application.
Fig. 12 is a flow chart of a mapping method according to an embodiment of the present application.
FIG. 13 is a structural frame diagram of a mapping system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to fall within the scope of the application.
In the description of the present application, it should be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more of the described features. In the description of the present application, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the description of the present application, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be either fixedly connected, detachably connected, or integrally connected, for example; can be mechanically connected, electrically connected or can be communicated with each other; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art according to the specific circumstances.
The following disclosure provides many different embodiments, or examples, for implementing different features of the application. In order to simplify the present disclosure, components and arrangements of specific examples are described below. They are, of course, merely examples and are not intended to limit the application. Furthermore, the present application may repeat reference numerals and/or letters in the various examples, which are for the purpose of brevity and clarity, and which do not themselves indicate the relationship between the various embodiments and/or arrangements discussed. In addition, the present application provides examples of various specific processes and materials, but one of ordinary skill in the art will recognize the application of other processes and/or the use of other materials.
The application provides a rapid engineering mapping method based on unmanned aerial vehicle photographing, which is applied to a mapping system, wherein the mapping system comprises a processing end, an unmanned aerial vehicle provided with camera equipment and a laser range finder hung on the unmanned aerial vehicle, and the camera equipment is provided with a grid filter;
referring to fig. 12, the mapping method includes:
102. Acquiring aerial photographs of the mapping area through the image pickup equipment, the aerial photographs containing grid coordinates generated by the grid filter.
104. Collecting flying height data of the mapping area through the laser range finder.
106. Obtaining, by the processing end, projection distance data between target mapping points of the mapping area from the flying height data with reference to the grid coordinates.
108. Obtaining a target mapping model from the projection distance data through the processing end.
As a demonstration of the unmanned aerial vehicle, taking into consideration parameters such as economy, flight precision, payload capacity, imaging quality and endurance, the project selects a certain brand's Phantom 4 Pro unmanned aerial vehicle fitted with an RTK module as the flight platform. Limited by the payload capacity of the flight platform, a certain brand's small laser range finder is selected; it weighs 103 g, has a measuring range of 5 cm to 40 m and an accuracy of ±3 mm, and communicates over Bluetooth. The unmanned aerial vehicle is refitted so that the laser range finder is installed on the camera gimbal, coaxially with the camera.
Regarding the grid filter, see FIG. 1. In it, (a) is a 2×2, 4-cell filter; its grid parameters are the horizontal-axis spacing X_n^4 and the vertical-axis spacing Y_n^4. Similarly, (b) is a 4×4, 16-cell filter whose grid parameters are X_m^16 and Y_m^16, and so on.
As an exemplary way of obtaining the projection distance from the flying height data with reference to the grid coordinates, the following is given:
referring to fig. 2, a 4-grid filter is taken as an example to demonstrate the calculated relationship. Unmanned aerial vehicle is formulated as origin O 4 0 (i.e. the origin of the camera and the range finder), the center point of the grid filter is O 4 0 ' then O 4 0 With O 4 0 The point of the extension line of the' connecting line projected to the mapping surface is O 4 0 ''. Known parameters are O 4 0 To O 4 0 Distance h of 4 n And the distance X of the grid 4 n (taking the horizontal axis direction as an example). H can be measured by a laser range finder 4 n ' then X can be calculated by the following formula 4 n '。
Formula 2-1;
for example, the unmanned plane platform shoots in the air with the altitude of 40m, and the point is assumed to be the 0 number point. Reading h of laser range finder 4 0 ' 15716mm. In particular, by measuring h 4 n 6mm, X 4 n 8mm. From formula 2-1, it can be seen that:
i.e. in 4-grid filter mode, the first photo has a projected spot distance of 20954.67mm.
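The similar-triangle relation of formula 2-1 and the worked example above can be sketched in a few lines of Python (the function name and structure are illustrative, not part of the patent):

```python
def projected_spacing(grid_spacing_mm: float,
                      grid_distance_mm: float,
                      rangefinder_mm: float) -> float:
    """Formula 2-1: X' = X * h' / h, by similar triangles between the
    grid-filter plane and the mapped surface."""
    return grid_spacing_mm * rangefinder_mm / grid_distance_mm

# Worked example from the text: X = 8 mm, h = 6 mm
x0 = projected_spacing(8.0, 6.0, 15716.0)  # point-0 rangefinder reading
x1 = projected_spacing(8.0, 6.0, 15648.0)  # point-1 rangefinder reading
print(round(x0, 2), round(x1, 2))  # 20954.67 20864.0
```

The two values reproduce the 20954.67 mm and 20864 mm spacings quoted in the text.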
The unmanned-aerial-vehicle platform is then controlled to fly 20954.67 mm along the planned horizontal-axis direction, and this station is designated point 1. The laser range finder reads h_1^4' = 15648 mm; in the same way, X_t1^4' = 20864 mm.
The relationship between points 0 and 1 in the above data is shown in fig. 3. In the figure, 0 is the point numbered "0", and O_0^4 is the position of the flight platform at point 0. 1'' is the projection of point 1 onto the horizontal plane of point 0. 1' is the intersection, with that plane, of the circle centred on point 0 with radius 01; equivalently, 1' lies where the extension through O_0^4 and X_n^4 meets the horizontal plane of point 0, so that 01' is the projected point spacing of the first photo, X_t0^4' (20954 mm). In other words, 1' is the "position" where point 1 should theoretically be; mathematically, however, point 1 is actually at 1''. This occurs because of the height difference between points 1 and 0, a difference that is evident from the laser-range-finder readings h_n^4'. Likewise, on the second photo the theoretical distance 1''0 should be 10'' (i.e. X_t1^4' = 20864 mm), namely the intersection of the actual horizontal plane of point 1 with the view cone formed over the 2×2 grid spacing X_n^4 of the camera on the flight platform. The actual distance 01 is therefore distorted differently in different photos; it is from this situation that oblique-photogrammetry techniques were developed. The present method dispenses with complex restoration algorithms and builds an approximate model of the measured surface through a more efficient combination-superposition procedure.
At this point the continuation method can be chosen: the shooting station of the 3rd photo, point 2, is selected directly on the basis of point 0 by flying the platform another 20954.67 mm along the planned horizontal-axis direction, and the following points are chosen on the same principle. As the mathematical relationship above shows, the actual distance 01, the projection distance X_t0^4' from the point-0 photo and the projection distance X_t1^4' from the point-1 photo are all continuous. In short, when point-network measurement is performed, the data need only be superimposed by their respective methods; points 0, 1, 2, 3 and 4 in fig. 5 then remain continuous, and no discontinuous region 11' is formed.
With respect to the density of the point network of the mapping points of the mapping region (the fineness of the terrain mesh), it is determined by the distribution of the flight platform's mapping stations. As follows from formula 2-1, the projected point spacing X_n^4' is the mapping-point spacing, and in practice it has two influencing factors: the grid spacing X_n^4 (taking the horizontal-axis direction as an example) and the flying height h_n^4' of the flight platform. Hence the denser the grid filter, the denser the point network, and the higher the platform flies, the sparser the point network. Because grid-filter precision strongly affects mapping precision, the reasonable arrangements on a consumer-grade platform are the 2×2 filter or the 4×4 filter; although denser virtual filters can be used with software assistance, the computation and the number of flight stations then increase greatly, as discussed later. For example, with a 2×2 grid filter at a flying height h_0^4' of 15716 mm, the mapping-point-network spacing X_t0^4' is 20954.67 mm, whereas at the same flying height (h_0^16' = h_0^4') the 4×4 grid filter gives a spacing X_t0^16' of 10477.33 mm; i.e. the 4×4 filter's point-network density is 4 times that of the 2×2 filter (the spacing is halved along each axis). The computation increases along with the precision.
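The two density factors described above can be made concrete in a short sketch. It assumes, as the quoted numbers imply but the text does not state outright, that the filter's overall size is fixed, so a 4×4 filter's cell spacing is half the 2×2 filter's 8 mm:

```python
def point_spacing_mm(cells_per_side: int, height_mm: float,
                     cell_2x2_mm: float = 8.0,
                     grid_distance_mm: float = 6.0) -> float:
    # Assumed: the cell spacing shrinks as 2/cells_per_side of the 2x2 value.
    cell_mm = cell_2x2_mm * 2.0 / cells_per_side
    # Formula 2-1: a denser filter or a lower flight tightens the network.
    return cell_mm * height_mm / grid_distance_mm

s_2x2 = point_spacing_mm(2, 15716.0)   # 20954.67 mm
s_4x4 = point_spacing_mm(4, 15716.0)   # 10477.33 mm
```

The spacing halves in each axis, which corresponds to the "4 times" areal point density quoted for the 4×4 filter.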
Furthermore, to make full use of computing resources, the station density can be increased or decreased in local areas according to the mapping requirements by combining point networks, balancing computation against precision. As shown in fig. 6, segment AB is flat ground and the remaining segments are slopes, so the station density can be reduced over AB while high-density point-network measurement is retained elsewhere; the same applies spatially. Note, however, that to preserve data continuity the spacing may only be changed by "skipping points"; it must not be changed by altering the flying height of the flight platform.
With regard to the above-described mapping model, the projected point spacing may also be preset in software, e.g. formulated as 2000m. From the mathematical relationship in fig. 2, with X_n^16' fixed, X_n^16 can be computed by combining the data h_n^16 and h_n^16'. With software assistance, the physical-precision limit of grid-filter manufacture can be broken through and a higher-precision mapping model obtained economically. Likewise, X_n^16 can be changed with software assistance and the vehicles connected dynamically and simultaneously, realising a multi-vehicle combined-measurement function and improving measurement efficiency.
Likewise, the multi-angle spherical-rotation function of the camera platform can be used to measure facing surfaces or other planes. As shown in fig. 7, the flight platform performs the point-network measurement and then the component mapping model on the same principle as before. The same operation can further be performed on an inclined plane: the plane's inclination angle is measured before the survey, so that the unmanned aerial vehicle can measure along the normal direction of the inclined plane. During inclined-plane measurement, the flight track of the platform must be computed separately: as shown in fig. 8, after completing the measurement at position O_1, the platform should fly along O_1-A, descend along the trailing edge O_2-A to position O_2, measure there, and so on. With dedicated software, the calculation and operation workload is greatly reduced and work under more complex conditions becomes possible. Note that in engineering practice, horizontal and vertical surfaces should generally be used as reference surfaces.
Facade measurement has a further application in engineering practice. For example, when measuring the flatness of a large aluminium formwork panel, a lattice is drawn on the panel in advance, as shown in fig. 9; the lattice must be uniformly distributed.
The flight platform shoots near the centre of the panel under test, with the shooting angle perpendicular to the panel. After shooting, the lattice in the photograph is analysed as shown in fig. 10. Since the lattice is preset, its spacing is known: the distances AB, BC and CD in the picture are equal, and the distance AB in the picture can be calculated by the method of fig. 2. At a point where the flatness deviation is not zero, as at C' in the figure, the length BC' is greater than the actual length BC, by the principle disclosed in fig. 3. The flatness state is therefore obtained by analysing the lattice spacings in the photograph.
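The flatness check reduces to comparing the recovered lattice intervals with the known preset spacing. A minimal sketch, with hypothetical readings (the 3 mm bulge is an invented figure for illustration, not from the patent):

```python
def flatness_deviations_mm(apparent_mm, preset_mm):
    """Deviation of each measured lattice interval from the known
    uniform spacing; all zeros means the panel is flat."""
    return [a - preset_mm for a in apparent_mm]

# AB, BC', CD recovered from the photo via the method of FIG. 2;
# BC' reads long because C has moved to C' out of the panel plane.
devs = flatness_deviations_mm([200.0, 203.0, 200.0], 200.0)
print(devs)  # [0.0, 3.0, 0.0]
```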
To ensure that the method can actually be applied in engineering, its measurement accuracy is analysed below.
The method uses a consumer-grade unmanned aerial vehicle fitted with an RTK module as the flight platform; according to the manufacturer's published data, its hovering precision in RTK FIX mode is 10 mm horizontally and 15 mm vertically. The laser range finder used has an accuracy of ±3 mm. In terms of measurement accuracy, the maximum horizontal error is therefore 13 mm and the maximum vertical error 18 mm. As in fig. 11, the analysis takes a horizontal measurement as the example.
As shown by the foregoing data and the calculation of formula 2-1, X_t0^4' is 20954.67 mm, X_r0^4' is 20978.67 mm, and ΔX is 34 mm. Adding the cumulative measurement error of 13 mm, the total difference ΣX is 47 mm. Under ideal (zero-error) conditions, the flight platform would be at O_1^4 when measuring the second station; under the planned maximum error it is actually at O_e1^4. The actual plane is therefore AB, while the data show the plane AB_e. Taking the calculation of an earthwork volume as the working condition, the face BCC_eB_e is the error amount, and the error rate R_e is the ratio of this error amount to the measured amount.
therefore, the method can be considered to have reliable measurement results and meet the practical requirements of engineering sites.
From reading, collating and analysing the literature, the following is known. Among oblique-photogrammetry techniques, the five-camera system is the most advanced: combined with techniques such as image-distortion-restoration algorithms and multi-image compound-angle coordinate algorithms, its resolution can reach 4 cm, and the comprehensive error can be controlled to 0.4% for terrain-level measurement and 0.3% for building-level measurement. Such platforms have a certain economy, with prices usually in the range of 100,000 to 300,000 RMB. For dense-point-cloud reconstruction models, the data for a single building is typically 800 to 1500 GB, and terrain data is typically 400 to 900 GB per hundred mu. Lidar mapping, depending on the working mode, power and algorithms of the carried radar, generally offers accuracies of 3 mm, 5 mm, 7 mm, 35 mm, 50 mm and so on. These platforms are more expensive, with most products on the market concentrated in the 600,000 to 2,000,000 RMB range. With their higher precision, and through dedicated software optimisation, topographic-mapping data per mu can usually be kept to about 100 GB; likewise, owing to software optimisation, the measured data volume for buildings and structures does not differ greatly from that of topographic mapping.
Referring to fig. 13, in another aspect, the present application provides an engineering rapid mapping system based on unmanned aerial vehicle photographing, comprising:
a drone 202 equipped with an imaging device, wherein the imaging device is equipped with a grid filter, the imaging device collects aerial pictures of a mapping region, the aerial pictures containing grid coordinates generated by the grid filter;
a laser range finder 204, hung on the unmanned aerial vehicle, for collecting the flying height data of the mapping area;
a processing end 206 is configured to obtain, with reference to the grid coordinates, projection distance data between target mapping points of the mapping region according to the flight height data, and to obtain a target mapping model according to the projection distance data.
It should be appreciated that the processing end may be any device including a storage medium on which the model is stored, such as a PC or a server, and that it computes the intermediate data such as the flying height data and the projection distance data.
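Steps 102 to 108 can be tied together in a minimal end-to-end sketch, assuming a straight flight line, the 8 mm / 6 mm filter geometry of the description, and the continuation method's superposition of spacings; all names below are illustrative, not from the patent:

```python
def build_point_network(rangefinder_mm,
                        grid_spacing_mm: float = 8.0,
                        grid_distance_mm: float = 6.0):
    """Superpose successive projected spacings (formula 2-1 at each
    station) into a continuous 1-D point network of
    (ground_x_mm, height_mm) pairs."""
    network, x = [], 0.0
    for h in rangefinder_mm:
        network.append((x, h))
        x += grid_spacing_mm * h / grid_distance_mm
    return network

net = build_point_network([15716.0, 15648.0, 15700.0])
# station 1 lies 20954.67 mm along the planned axis from station 0
```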
The present application is not limited to the above-mentioned embodiments, and any changes or substitutions that can be easily understood by those skilled in the art within the technical scope of the present application are intended to be included in the scope of the present application.

Claims (6)

1. The engineering rapid mapping method based on unmanned aerial vehicle photographing is characterized by being applied to a mapping system, wherein the mapping system comprises a processing end, an unmanned aerial vehicle provided with a camera device and a laser range finder hung on the unmanned aerial vehicle, and the camera device is provided with a grid filter;
the mapping method comprises the following steps:
acquiring aerial photographs of a mapping area through the image pickup equipment, wherein the aerial photographs contain grid coordinates generated by the grid filter;
collecting flight height data of a mapping area through the laser range finder;
obtaining, by the processing end, projection distance data between target mapping points of the mapping region according to the flying height data with reference to the grid coordinates;
obtaining a target mapping model according to the projection distance data through the processing end;
wherein the projection distance data between the target mapping points of the mapping area is obtained from the flying height data, with reference to the grid coordinates, by the following formula:
X_n^4' = X_n^4 × h_n^4' / h_n^4
in the above formula, the common origin of the camera and the laser range finder is O_0^4, the centre point of the grid filter is O_0^4', the point where the extension of line O_0^4-O_0^4' meets the mapping surface is O_0^4'', the distance between O_0^4 and O_0^4' is h_n^4, the grid spacing is X_n^4, the reading of the laser range finder is h_n^4', and the projection distance is X_n^4'.
2. The mapping method of claim 1, wherein the grid filter is a 2 x 2 filter or a 4 x 4 filter.
3. The mapping method of claim 1, wherein the unmanned aerial vehicle carries an RTK module.
4. The mapping method of claim 1, wherein the laser range finder is mounted coaxially with the camera device.
5. The mapping method of claim 1, wherein the laser rangefinder has a measurement range of 5cm to 40m.
6. An engineering rapid mapping system based on unmanned aerial vehicle photographing, characterized by comprising:
an unmanned aerial vehicle provided with a camera device, wherein the camera device is provided with a grid filter and collects aerial photographs of a mapping area, the aerial photographs containing grid coordinates generated by the grid filter;
a laser range finder suspended from the unmanned aerial vehicle and used to collect flying height data of the mapping area;
a processing end used to obtain projection distance data between target mapping points of the mapping area from the flying height data with reference to the grid coordinates, and to obtain a target mapping model from the projection distance data;
wherein the projection distance data between the target mapping points of the mapping area is obtained from the flying height data with reference to the grid coordinates according to the following formula:
Xn' = (hn' / Hn) · Xn
in the above formula, the origin of the camera or the laser range finder is O0, the center point of the grid filter is O0', the point at which the extension of the line connecting O0 and O0' is projected onto the mapping surface is O0'', the distance between O0 and O0' is Hn, the grid distance is Xn, the reading of the laser range finder is hn', and the projection distance between O0'' and the target mapping point is Xn'.
CN202210318956.0A 2022-03-29 2022-03-29 Engineering rapid mapping method and mapping system based on unmanned aerial vehicle photographing Active CN114838710B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210318956.0A CN114838710B (en) 2022-03-29 2022-03-29 Engineering rapid mapping method and mapping system based on unmanned aerial vehicle photographing


Publications (2)

Publication Number Publication Date
CN114838710A (en) 2022-08-02
CN114838710B (en) 2023-08-29

Family

ID=82564143

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210318956.0A Active CN114838710B (en) 2022-03-29 2022-03-29 Engineering rapid mapping method and mapping system based on unmanned aerial vehicle photographing

Country Status (1)

Country Link
CN (1) CN114838710B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109239725A (en) * 2018-08-20 2019-01-18 Guangzhou Xaircraft Technology Co., Ltd. Ground mapping method and terminal based on laser ranging system
WO2020103023A1 (en) * 2018-11-21 2020-05-28 Guangzhou Xaircraft Technology Co., Ltd. Surveying and mapping system, surveying and mapping method, apparatus, device and medium
CN112729260A (en) * 2020-12-15 2021-04-30 Guangzhou Xaircraft Technology Co., Ltd. Surveying and mapping system and surveying and mapping method
CN113624208A (en) * 2021-08-10 2021-11-09 Qian Lin Map mapping method based on laser ranging technology

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9891049B2 (en) * 2015-10-29 2018-02-13 Trimble Inc. Method of solving initial azimuth for survey instruments, cameras, and other devices with position and tilt information
CN109472806B (en) * 2017-09-07 2020-11-17 广州极飞科技有限公司 Method and device for planning flight area of unmanned aerial vehicle and remote controller
US10788428B2 (en) * 2017-09-25 2020-09-29 The Boeing Company Positioning system for aerial non-destructive inspection


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Lü Shuchun. Application research on UAV oblique photogrammetry in multi-survey integration. Geomatics & Spatial Information Technology. 2022, Vol. 45, pp. 249-251. *

Also Published As

Publication number Publication date
CN114838710A (en) 2022-08-02

Similar Documents

Publication Publication Date Title
CA2821759C (en) Oblique geolocation and measurement system
JP4719753B2 (en) Digital photogrammetry method and apparatus using heterogeneous sensor integrated modeling
CN109597095A (en) Backpack type 3 D laser scanning and three-dimensional imaging combined system and data capture method
CN107504957A Method for rapidly constructing a three-dimensional terrain model using UAV multi-view photography
CN106767706A Aerial image acquisition method and system for UAV reconnaissance of a traffic accident scene
CN101919235A Orthophotographic image creating method and imaging device
CN107749072A UAV image calibration method suitable for slope measurement
CN110345925A Quality inspection and aerial triangulation processing method for five-lens aerial photographs
CN112150629A (en) Vision-based coal inventory system and method
CN112862966B (en) Method, device, equipment and storage medium for constructing surface three-dimensional model
CN109631854B (en) Method for determining mining area coal mining collapse cracks through low-altitude unmanned aerial vehicle photography
Nasrullah Systematic analysis of unmanned aerial vehicle (UAV) derived product quality
JP2021117047A (en) Photogrammetric method using unmanned flight vehicle and photogrammetric system using the same
CN108050995B (en) Oblique photography non-image control point aerial photography measurement area merging method based on DEM
CN114838710B (en) Engineering rapid mapping method and mapping system based on unmanned aerial vehicle photographing
KR102262120B1 (en) Method of providing drone route
CN116129064A (en) Electronic map generation method, device, equipment and storage medium
CN115077394A (en) Power station dam slope displacement detection method and device and electronic equipment
Thoeni et al. The potential of low-cost RPAS for multi-view reconstruction of sub-vertical rock faces
CN114565725A (en) Reverse modeling method for three-dimensional scanning target area of unmanned aerial vehicle, storage medium and computer equipment
CN113650783A (en) Fixed wing oblique photography cadastral mapping method, system and equipment
JP2006145357A (en) Photographing plan supporting device and program for the same
CN111932622B (en) Device, method and system for determining flight altitude of unmanned aerial vehicle
JPH0969148A (en) Method for preparing map data
Niu et al. Accuracy Assessment of UAV Photogrammetry System with RTK Measurements for Direct Georeferencing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant