CN112729260A - Surveying and mapping system and surveying and mapping method - Google Patents


Info

Publication number
CN112729260A
CN112729260A (application CN202011480482.7A)
Authority
CN
China
Prior art keywords
mapping
surveying
point cloud
aerial
image processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011480482.7A
Other languages
Chinese (zh)
Other versions
CN112729260B (en)
Inventor
邓杭
池鹏可
张毫杰
翁立宇
伍宇明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Xaircraft Technology Co Ltd
Original Assignee
Guangzhou Xaircraft Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Xaircraft Technology Co Ltd filed Critical Guangzhou Xaircraft Technology Co Ltd
Priority to CN202011480482.7A
Publication of CN112729260A
Application granted
Publication of CN112729260B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00 Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiments of the present application relate to the technical field of surveying and mapping, and provide a surveying and mapping system and a surveying and mapping method. The surveying and mapping system comprises an unmanned aerial vehicle and a surveying and mapping device; the surveying and mapping device is mounted on the unmanned aerial vehicle, giving the system good portability. During surveying and mapping, the unmanned aerial vehicle flies along a set route and triggers the surveying and mapping device to photograph when it reaches a set photographing point; the device captures aerial pictures of the surveying and mapping area and generates surveying and mapping data of the area from those pictures, so the data are produced as soon as the pictures are taken, giving the system good real-time performance. Compared with the prior art, the embodiments of the present application can complete surveying and mapping accurately and quickly.

Description

Surveying and mapping system and surveying and mapping method
Technical Field
The embodiment of the application relates to the technical field of surveying and mapping, in particular to a surveying and mapping system and a surveying and mapping method.
Background
When using unmanned equipment (e.g., unmanned aerial vehicle, unmanned vehicle, etc.) to perform work, it is necessary to clearly understand the topography of the work area, such as elevation information, height of ground objects, gradient, etc., and to understand the topography of the work area, it is necessary to survey and map the work area.
Existing surveying and mapping approaches mainly consist of manual surveying and of mapping during farm tool operation. Manual surveying and mapping is inefficient and labor-intensive, and is unsuitable for surveying large plots; the data obtained by mapping during farm tool operation are generally topographic data carried by track points, which provide too few data points to fully reflect the topography of the whole plot, and meanwhile the relevant sensors are not yet widely deployed.
Disclosure of Invention
An object of the embodiments of the present application is to provide a mapping system and a mapping method, so as to solve the problem that the existing mapping method cannot accurately and quickly complete mapping.
In order to achieve the above purpose, the embodiments of the present application employ the following technical solutions:
in a first aspect, an embodiment of the present application provides a surveying and mapping system, where the surveying and mapping system includes an unmanned aerial vehicle and a surveying and mapping device communicatively connected to the unmanned aerial vehicle, and the surveying and mapping device is mounted on the unmanned aerial vehicle; the unmanned aerial vehicle is used for flying according to a set air route and triggering the mapping equipment to take a picture when flying to a set picture taking point; the surveying and mapping equipment is used for shooting an aerial picture of a surveying and mapping area under the triggering of the unmanned aerial vehicle, generating surveying and mapping data of the surveying and mapping area according to the aerial picture, and sending the surveying and mapping data to the unmanned aerial vehicle.
Optionally, the mapping device comprises a camera and an image processing chip, the camera and the image processing chip being communicatively connected; the camera is used for shooting aerial pictures of the surveying and mapping area under the trigger of the unmanned aerial vehicle and transmitting the aerial pictures to the image processing chip in real time, wherein one set shooting point corresponds to one aerial picture; and the image processing chip is used for generating mapping data of the mapping area according to all the aerial pictures.
Optionally, the image processing chip is further configured to generate pose information of each aerial image and a sparse three-dimensional point cloud of the mapping area in a world coordinate system according to all the aerial images; the image processing chip is further used for reconstructing dense three-dimensional point cloud of the surveying and mapping area based on each pose information; the image processing chip is also used for generating a digital orthophoto map of the surveying and mapping area according to the sparse three-dimensional point cloud; wherein the mapping data comprises the dense three-dimensional point cloud and the digital orthophotomap.
Optionally, the image processing chip includes a central processing unit CPU and a graphics processing unit GPU;
the GPU is used for extracting the features of the aerial photo and outputting the feature information of the aerial photo; the CPU is further used for solving first pose information of each aerial image and first sparse three-dimensional point cloud of the surveying and mapping area by utilizing an SFM algorithm according to each feature information; the CPU is further used for acquiring shooting information of any target aerial picture, wherein RTK coordinates of the unmanned aerial vehicle are recorded in the shooting information, and one set shooting point corresponds to one RTK coordinate; the CPU is further used for calculating a similarity transformation matrix according to the first position and orientation information of the target aerial image and the RTK coordinates; the CPU is further used for transforming each first pose information and the first sparse three-dimensional point cloud according to the similarity transformation matrix to obtain each pose information and the sparse three-dimensional point cloud.
Optionally, the CPU is further configured to perform binocular stereo correction on any two adjacent aerial pictures having a matching relationship according to each pose information, so as to obtain each corrected aerial picture; the GPU is also used for generating a disparity map corresponding to each aerial image by utilizing a semi-global matching SGM algorithm according to each corrected aerial image; the GPU is further used for converting each parallax image into a depth image corresponding to each aerial image according to the relation between parallax and depth; and the CPU is also used for fusing each depth map to obtain the dense three-dimensional point cloud.
Optionally, the CPU is further configured to construct a digital mesh model for the sparse three-dimensional point cloud; the CPU is also used for projecting the grids on the surface of the digital grid model into the corresponding aerial pictures by utilizing a back projection method to obtain a projection area; the CPU is also used for adding texture information to the grid according to the projection area to obtain a digital grid model attached with image textures; and the CPU is also used for carrying out forward projection on the digital grid model attached with the image texture to obtain the digital orthographic projection image.
Optionally, the image processing chip comprises an embedded neural network processor NPU;
the NPU is used for carrying out obstacle identification on the dense three-dimensional point cloud to obtain an obstacle boundary of the surveying and mapping area; the NPU is further used for carrying out boundary identification on the digital orthophoto map to obtain a plot boundary of the surveying and mapping area; wherein the mapping data further comprises the obstacle boundary and the parcel boundary.
Optionally, the image processing chip is further configured to generate pose information of each aerial image and a sparse three-dimensional point cloud of the mapping area in a world coordinate system according to all the aerial images; the image processing chip is further used for reconstructing dense three-dimensional point cloud of the surveying and mapping area based on each pose information; the image processing chip is also used for generating a digital orthophoto map of the surveying and mapping area according to the sparse three-dimensional point cloud; the image processing chip is also used for converting the dense three-dimensional point cloud into a voxel point cloud, and compressing and slicing the voxel point cloud to obtain a voxel point cloud tile; the image processing chip is also used for carrying out slicing processing on the digital orthophoto map to obtain an orthophoto tile; wherein the mapping data comprises the voxel point cloud tile and the orthoimage tile.
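The conversion of the dense three-dimensional point cloud into a voxel point cloud is not detailed in this excerpt; a minimal sketch of one common approach, centroid-based voxel-grid downsampling (the function name and the 0.1 m voxel size are illustrative assumptions, not from the patent), might look like:

```python
import numpy as np

def to_voxel_cloud(points, voxel_size=0.1):
    """Quantize a point cloud onto a voxel grid, keeping one
    representative point (the centroid) per occupied voxel."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel index and average the points in each voxel.
    _, inverse, counts = np.unique(keys, axis=0,
                                   return_inverse=True, return_counts=True)
    centroids = np.zeros((counts.size, 3))
    np.add.at(centroids, inverse, points)  # scatter-add points into voxels
    return centroids / counts[:, None]

dense = np.array([[0.01, 0.02, 0.0],
                  [0.03, 0.01, 0.0],   # falls in the same voxel as the first point
                  [1.00, 1.00, 1.00]])
voxels = to_voxel_cloud(dense, voxel_size=0.1)
print(voxels.shape)  # (2, 3): two occupied voxels remain
```

Compressing and slicing the resulting voxel cloud into tiles would then operate on this much smaller array.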
Optionally, the image processing chip comprises an embedded neural network processor NPU;
the NPU is used for carrying out obstacle identification on the voxel point cloud to obtain an obstacle boundary of the mapping area; the NPU is further used for carrying out boundary identification on the orthoimage tiles to obtain the parcel boundaries of the surveying and mapping area; wherein the mapping data further comprises the obstacle boundary and the parcel boundary.
Optionally, the mapping system further includes a user terminal and a cloud server, the cloud server is in communication connection with the unmanned aerial vehicle, and the user terminal is in communication connection with the cloud server;
the cloud server is used for receiving and storing the mapping data sent by the unmanned aerial vehicle; the user terminal is used for generating a route generation instruction; the cloud server is further used for receiving the route generation instruction sent by the user terminal and planning the operation route of the surveying area according to the route generation instruction and the surveying data.
Optionally, the cloud server is deployed with an image processing model; the unmanned aerial vehicle is further used for sending the aerial pictures shot by the surveying and mapping equipment to the cloud server; the cloud server is further used for calling the image processing model to generate mapping data of the mapping area according to the aerial picture.
Optionally, the drone comprises a flight controller; the flight controller is used for planning the operation air route of the surveying and mapping area according to the surveying and mapping data.
In a second aspect, the embodiment of the present application further provides a surveying and mapping method, which is applied to a surveying and mapping system, where the surveying and mapping system includes an unmanned aerial vehicle and a surveying and mapping device communicatively connected to the unmanned aerial vehicle, and the surveying and mapping device is mounted on the unmanned aerial vehicle; the mapping method comprises the following steps: the unmanned aerial vehicle flies according to a set air route and triggers the mapping equipment to take a picture when flying to a set picture taking point; the surveying and mapping equipment shoots aerial pictures of surveying and mapping areas under the triggering of the unmanned aerial vehicle, generates surveying and mapping data of the surveying and mapping areas according to the aerial pictures, and sends the surveying and mapping data to the unmanned aerial vehicle.
Compared with the prior art, the surveying and mapping system and method provided by the embodiments of the present application comprise an unmanned aerial vehicle and a surveying and mapping device; the surveying and mapping device is mounted on the unmanned aerial vehicle, giving the system good portability. During surveying and mapping, the unmanned aerial vehicle flies along a set route and triggers the surveying and mapping device to photograph when it reaches a set photographing point; the device captures aerial pictures of the surveying and mapping area and generates surveying and mapping data of the area from those pictures, so the data are produced as soon as the pictures are taken, giving the system good real-time performance. The embodiments of the present application can therefore complete surveying and mapping accurately and quickly.
Drawings
Fig. 1 shows a schematic structural diagram of a mapping system provided in an embodiment of the present application.
Fig. 2 shows another schematic structural diagram of the mapping system provided in the embodiment of the present application.
Fig. 3 shows yet another schematic structural diagram of the mapping system provided in an embodiment of the present application.
Fig. 4 shows a schematic flow chart of a mapping method provided by an embodiment of the present application.
Fig. 5 is a schematic flowchart of step S200 in the mapping method shown in fig. 4.
Fig. 6 is a schematic flow chart of step S202 in the mapping method shown in fig. 5.
Fig. 7 is another schematic flow chart of step S202 in the mapping method shown in fig. 5.
Fig. 8 is a schematic flowchart of step S2021 in the mapping method shown in fig. 6 and 7.
Fig. 9 is a flowchart illustrating step S2022 in the mapping method illustrated in fig. 6 and 7.
Fig. 10 is a flowchart illustrating step S2023 in the mapping method illustrated in fig. 6 and 7.
Fig. 11 shows another schematic flow chart of the mapping method provided in the embodiment of the present application.
Fig. 12 is a schematic flow chart of a mapping method provided by an embodiment of the present application.
Reference numerals: 100-a mapping system; 110-a drone; 120-a mapping device; 130-cloud server; 140-a user terminal; 121-a camera; 122-an image processing chip; 1221-CPU; 1222-a GPU; 1223-NPU.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
Before working, an unmanned device (e.g., an unmanned aerial vehicle, an unmanned vehicle, etc.) needs to plan a working route first and then work according to the working route. However, in actual work, the topography has a great influence on the effectiveness of the work, such as plant protection work, forest fire extinguishment, and the like. Thus, a way of building a three-dimensional model of a specific area by mapping has emerged.
In one existing scheme, a three-dimensional model of a specific area can be generated by manually performing dotting measurements over the area, or by measuring track points during farm tool operation; however, this is labor-intensive, and the accuracy of the generated three-dimensional model is limited because the manually sampled data points are insufficient.
In another existing scheme, a surveying and mapping unmanned aerial vehicle collects images and sends them to a ground station or the cloud, where dedicated three-dimensional reconstruction software generates a three-dimensional model of the specific area. However, generating the three-dimensional model in this way is time-consuming, so this approach offers neither portability nor real-time performance.
In view of the above, the embodiments of the present application provide a surveying and mapping system and a surveying and mapping method. The surveying and mapping system comprises an unmanned aerial vehicle and a surveying and mapping device; the device is mounted on the unmanned aerial vehicle, with no restriction on the mounting position, and can be plugged in and removed. During surveying and mapping, the unmanned aerial vehicle flies along a set route and triggers the surveying and mapping device to photograph when it reaches a set photographing point; the device captures aerial pictures of the surveying and mapping area and generates surveying and mapping data of the area from those pictures, so the data are produced as soon as the pictures are taken. The surveying and mapping system provided by the embodiments of the present application therefore has good portability and real-time performance, and can complete surveying and mapping accurately and quickly.
The following detailed description is made with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 shows a schematic structural diagram of a surveying and mapping system 100 provided in an embodiment of the present application, where the surveying and mapping system 100 includes an unmanned aerial vehicle 110 and a surveying and mapping device 120, the unmanned aerial vehicle 110 is in communication connection with the surveying and mapping device 120, and the surveying and mapping device 120 is mounted on the unmanned aerial vehicle 110.
The unmanned aerial vehicle 110 is used for flying according to a set air route and triggering the mapping equipment to take a picture when flying to a set picture taking point.
And the mapping equipment 120 is used for shooting an aerial image of the mapping area under the triggering of the unmanned aerial vehicle 110, generating mapping data of the mapping area according to the aerial image, and sending the mapping data to the unmanned aerial vehicle.
The drone 110 may be a surveying drone dedicated to geographic surveying and mapping, or a general-purpose drone such as a plant protection drone. When a surveying and mapping area needs to be surveyed, a flight route can first be planned for the area, and photographing points are arranged along the route.
Optionally, the photographing points may be set by surveying and mapping personnel according to the actual conditions of the area: for areas with simple topography, such as lakes and grassland, a few photographing points may be set on the corresponding flight segment, while for areas with more complicated topography, such as forests and hills, more photographing points may be set on the corresponding flight segment. The photographing points may also be set at a fixed interval, which can be chosen according to the flight route, flight speed, surveying and mapping requirements, and so on.
It should be noted that aerial pictures taken at two adjacent photographing points need to satisfy a certain overlap rate, for example 80%. The overlap rate depends on the surveying and mapping requirements and can be set flexibly by surveying and mapping personnel according to the actual situation; it is not limited here.
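As a rough illustration of how a fixed trigger interval could follow from a target forward overlap (standard photogrammetry geometry; the altitude, focal length, and sensor size below are invented for the example, not taken from the patent):

```python
def photo_point_spacing(altitude_m, focal_len_mm, sensor_len_mm, overlap=0.8):
    """Along-track distance between photographing points that achieves
    the given forward overlap, assuming a nadir-pointing pinhole camera."""
    # Ground footprint of one frame along the flight direction.
    footprint_m = altitude_m * sensor_len_mm / focal_len_mm
    return footprint_m * (1.0 - overlap)

# e.g. 100 m altitude, 8.8 mm focal length, 13.2 mm sensor length, 80 % overlap
spacing = photo_point_spacing(100.0, 8.8, 13.2, overlap=0.8)
print(round(spacing, 1))  # 30.0 (metres between trigger points)
```

A higher overlap rate shrinks the spacing, so complicated terrain (which the text says warrants more photographing points) naturally corresponds to a larger overlap setting.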
In one embodiment, referring to fig. 2, the mapping apparatus 120 includes a camera 121 and an image processing chip 122, and the camera 121 is communicatively connected to the image processing chip 122. The camera 121 may have a pan and tilt head.
The camera 121 is configured to take aerial pictures of the mapping area under the trigger of the unmanned aerial vehicle 110, and transmit the aerial pictures to the image processing chip 122 in real time, where one set photographing point corresponds to one aerial picture;
and the image processing chip 122 is used for generating mapping data of the mapping area according to all the aerial pictures.
The mapping device 120 can be detachably mounted on the drone 110, with no restriction on the mounting position. During the flight of the drone 110, the lens of the camera 121 faces vertically downward; the lens may be a wide-angle lens, and the shutter of the camera 121 may be mechanical or electronic, which is not limited here.
During the surveying and mapping process, the unmanned aerial vehicle 110 flies according to the route planned in advance and triggers the camera 121 to photograph when it reaches a set photographing point; at the same time, the unmanned aerial vehicle 110 obtains its current RTK (Real-Time Kinematic) coordinate and writes it into the EXIF (Exchangeable Image File Format) data of the aerial picture taken by the camera 121. Thus one set photographing point corresponds to one aerial picture, and the EXIF of each aerial picture records the RTK coordinate of the drone 110 at the corresponding set photographing point.
Then, the camera 121 transmits the aerial images to the image processing chip 122 in real time, and the image processing chip 122 may resolve the aerial images, thereby generating mapping data of the mapping area.
The following describes a process of generating the mapping data of the mapping area by the image processing chip 122.
The image processing chip 122 is further configured to generate pose information of each aerial image and a sparse three-dimensional point cloud of a surveying area in a world coordinate system according to all the aerial images.
The image processing chip 122 is further configured to reconstruct a dense three-dimensional point cloud of the mapping area based on each pose information.
The image processing chip 122 is further configured to generate a digital orthophotomap of the mapping area according to the sparse three-dimensional point cloud; wherein the mapping data comprises a dense three-dimensional point cloud and a digital orthophoto map.
In some embodiments, referring to fig. 3, the image Processing chip 122 includes a CPU (Central Processing Unit) 1221 and a GPU (Graphics Processing Unit) 1222, and the CPU1221 and the GPU1222 are communicatively connected by a bus.
The following describes the process by which the image processing chip 122 generates a dense three-dimensional point cloud of a survey area.
And the GPU1222 is used for performing feature extraction on the aerial photo and outputting feature information of the aerial photo.
And the CPU1221 is used for calculating the first pose information of each aerial image and the first sparse three-dimensional point cloud of the surveying and mapping area by using the SFM algorithm according to each piece of characteristic information.
The CPU1221 is further configured to acquire shooting information of any one target aerial photograph, where an RTK coordinate of the unmanned aerial vehicle is recorded in the shooting information, and one set shooting point corresponds to one RTK coordinate.
And the CPU1221 is further configured to calculate a similarity transformation matrix according to the first pose information of the target aerial photograph and the RTK coordinates.
The CPU1221 is further configured to transform each first pose information and the first sparse three-dimensional point cloud according to the similarity transformation matrix, so as to obtain each pose information and the sparse three-dimensional point cloud.
That is, after the unmanned aerial vehicle 110 flies to a set photographing point and the camera 121 takes an aerial picture of the surveying and mapping area, the aerial picture is transmitted to the CPU1221 in real time, and the RTK coordinate of the unmanned aerial vehicle 110 at that set photographing point is recorded in the picture's shooting information; that is, one set photographing point corresponds to one aerial picture, and each aerial picture records the RTK coordinate of its corresponding set photographing point. The shooting information of an aerial picture may be its EXIF data, which is designed specifically for digital camera photographs and records the attribute information and shooting data of a digital photo.
The CPU1221 receives and processes the aerial pictures in real time. First, the GPU1222 accelerates feature extraction on the aerial pictures received in real time: each time the camera 121 transmits an aerial picture, the GPU1222 extracts its feature information and passes it to the CPU1221. Then, the CPU1221 performs feature matching on the feature information of the aerial pictures and uses an SFM (structure-from-motion) algorithm to solve, from the matching result, the first pose information of each aerial picture and the first sparse three-dimensional point cloud of the mapping area. The first pose information is the position and rotation of an aerial picture in a local coordinate system, comprising the pitch angle, roll angle, yaw angle, and three-dimensional coordinates of the drone 110 in the local coordinate system. The first sparse three-dimensional point cloud is a three-dimensional model of the mapping area in the local coordinate system.
As those skilled in the art will understand, the SFM algorithm refers to the computer vision technique of recovering three-dimensional structure information by analyzing motion; the specific way the first pose information of each aerial picture and the first sparse three-dimensional point cloud of the mapping area are solved with the SFM algorithm is not described in detail in the embodiments of the present application.
Optionally, the first pose information and the first sparse three-dimensional point cloud may also be solved with a visual SLAM (Simultaneous Localization and Mapping) algorithm, whose solving process has better real-time performance.
After the CPU1221 solves the first pose information of each aerial photograph and the first sparse three-dimensional point cloud of the surveying and mapping area, since the first pose information and the first sparse three-dimensional point cloud are in the local coordinate system, the first pose information and the first sparse three-dimensional point cloud need to be converted into the world coordinate system.
Specifically, the CPU1221 aligns the first pose information of the aerial pictures with the RTK coordinates recorded in their shooting information; the alignment consists of calculating a similarity transformation matrix from the first pose information to the RTK coordinates. The first pose information of each aerial picture is then transformed by the similarity transformation matrix to obtain its pose information in the world coordinate system, and the first sparse three-dimensional point cloud of the mapping area is transformed by the same matrix to obtain the sparse three-dimensional point cloud in the world coordinate system. The pose information and the sparse three-dimensional point cloud then carry the absolute coordinates and scale of real geographic locations, i.e., they correspond to the survey area in the real world.
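The patent does not state how the similarity transformation matrix is computed; one common closed-form choice is the Umeyama alignment between corresponding points (here, SFM camera positions and RTK positions). A NumPy sketch under that assumption, with illustrative test data:

```python
import numpy as np

def similarity_transform(src, dst):
    """Closed-form (Umeyama) scale s, rotation R, translation t
    minimizing ||dst - (s * R @ src + t)||^2 over corresponding points."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)          # cross-covariance matrix
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                        # avoid a reflection
    R = U @ S @ Vt
    s = np.trace(np.diag(D) @ S) / src_c.var(axis=0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t

# Synthetic check: local SFM positions vs. "RTK" positions related by a
# known similarity (scale 2, 90-degree yaw, offset [1, 2, 3]).
src = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
Rz = np.array([[0., -1, 0], [1, 0, 0], [0, 0, 1]])
dst = 2.0 * src @ Rz.T + np.array([1., 2, 3])
s, R, t = similarity_transform(src, dst)
print(round(s, 6))  # 2.0
```

At least three non-collinear correspondences are needed; with RTK coordinates at every photographing point there are typically many more, and the least-squares form absorbs measurement noise.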
And the CPU1221 is further configured to perform binocular stereo correction on any two adjacent aerial pictures having a matching relationship according to each pose information, so as to obtain each corrected aerial picture.
And the GPU1222 is further configured to generate a disparity map corresponding to each aerial image by using a semi-global matching SGM algorithm according to each corrected aerial image.
And the GPU1222 is further configured to convert each disparity map into a depth map corresponding to each aerial image according to the relationship between disparity and depth.
And the CPU1221 is further used for fusing each depth map to obtain dense three-dimensional point cloud.
The two adjacent aerial pictures with a matching relationship are two aerial pictures that overlap, i.e., that share common view points. As those skilled in the art will understand, binocular stereo correction (rectification) in computer vision transforms the left and right image planes, using the calibrated intrinsic and extrinsic parameters, so that they become coplanar and row-aligned; the specific process of performing binocular stereo correction on two adjacent aerial pictures according to the pose information is not described in detail in the embodiments of the present application.
For two adjacent corrected aerial pictures, one serves as the reference picture and the other as the target picture, so a disparity map from the target picture to the reference picture can be generated; in this way a disparity map corresponding to each aerial picture is obtained. As those skilled in the art will understand, the SGM (semi-global matching) algorithm sits between local and global matching and is commonly used for stereo matching; the specific process of generating the disparity maps with the SGM algorithm is not described in detail in the embodiments of the present application.
After the GPU1222 generates the disparity map corresponding to each aerial picture, the depth value of each pixel in the disparity map is calculated from the relationship between depth and disparity, Z = B·f / D, where Z is the depth value, B is the shooting distance (baseline) between the two adjacent aerial pictures, f is the focal length of the camera 121, and D is the disparity; the disparity map is thus converted into a depth map.
Then, for the depth map corresponding to each aerial image, the CPU1221 may obtain depth information of spatial points in the mapping area according to the depth map, and may remove common-view redundant points through fusion of the depth maps, to finally obtain a dense three-dimensional point cloud in the mapping area.
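The disparity-to-depth conversion using Z = B·f/D can be sketched as follows (the array values are illustrative; mapping zero-disparity pixels to infinite depth is one possible convention for invalid matches, not specified by the patent):

```python
import numpy as np

def disparity_to_depth(disparity, baseline_m, focal_px):
    """Convert a disparity map to a depth map via Z = B * f / D.
    Pixels with no valid disparity (D <= 0) are marked as infinitely far."""
    depth = np.full_like(disparity, np.inf, dtype=np.float64)
    valid = disparity > 0
    depth[valid] = baseline_m * focal_px / disparity[valid]
    return depth

disp = np.array([[4.0, 0.0],
                 [2.0, 8.0]])  # disparities in pixels; 0.0 = no match
depth = disparity_to_depth(disp, baseline_m=1.0, focal_px=800.0)
print(depth)  # [[200.  inf] [400. 100.]]
```

Each depth map produced this way is then fused with the others, discarding redundant co-visible points, to build the dense three-dimensional point cloud.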
The process of the image processing chip 122 to generate a digital orthophotomap of the mapped area is described below.
The CPU1221 is further configured to construct a digital mesh model for the sparse three-dimensional point cloud.
The CPU1221 is further configured to project the mesh to the corresponding aerial image by using a back projection method for the mesh on the surface of the digital mesh model, so as to obtain a projection area.
The CPU1221 is further configured to add texture information to the mesh according to the projection area, and obtain a digital mesh model attached with the image texture.
The CPU1221 is further configured to perform forward projection on the digital mesh model with the image texture attached thereto, so as to obtain the digital orthophoto map.
For the sparse three-dimensional point cloud of the mapping region, the CPU1221 may construct a mesh over the sparse three-dimensional point cloud; for example, a Delaunay triangulation algorithm may be used to construct triangles over the points, the resulting triangular mesh being a data structure that represents the sparse three-dimensional point cloud. In some embodiments, the mesh cells may have shapes other than triangles, which is not limited herein.
Then, each mesh cell of the digital mesh model is projected into the corresponding aerial picture by back projection to obtain its projection area in that picture, texture information is added to the mesh cell according to the pixel values in the projection area, and finally the digital mesh model with the image texture attached is forward-projected to obtain a DOM (Digital Orthophoto Map).
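The triangulation step can be sketched with SciPy's Delaunay implementation, triangulating the 2-D (x, y) footprint of the sparse points, a common simplification for near-nadir aerial imagery (the sample points are made up):

```python
import numpy as np
from scipy.spatial import Delaunay

# Sparse 3-D points; the triangulation is built on their 2-D (x, y)
# footprint, which is a usual simplification for near-nadir mapping.
points = np.array([
    [0.0, 0.0, 10.0],
    [1.0, 0.0, 10.5],
    [0.0, 1.0, 11.0],
    [1.0, 1.1, 10.2],
])
tri = Delaunay(points[:, :2])
# tri.simplices holds the vertex indices of each triangle;
# a convex quadrilateral splits into two triangles.
print(tri.simplices)
```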
In some embodiments, referring again to fig. 3, the image Processing chip 122 may further include an NPU (Neural-network Processing Unit) 1223, and the CPU1221, the GPU1222, and the NPU1223 are communicatively connected by a bus.
And the NPU1223 is used for performing obstacle identification on the dense three-dimensional point cloud to obtain an obstacle boundary of the surveying and mapping area.
The NPU1223 is further used for carrying out boundary identification on the digital orthophoto map to obtain a plot boundary of the surveying and mapping area; wherein the mapping data further comprises obstacle boundaries and parcel boundaries.
That is, the mapping data may include a dense three-dimensional point cloud of the mapping area, the DOM, the obstacle boundaries, and the parcel boundaries.
The process of obtaining the obstacle boundary of the mapping area by the NPU1223 may be: the NPU1223 performs AI obstacle recognition on the dense three-dimensional point cloud, recognizes an obstacle point cloud, and calculates a vector bounding box of the obstacle point cloud, thereby obtaining an obstacle boundary. Obstacles may include, but are not limited to, solid elements such as trees, poles, and towers in a survey area.
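The text speaks of a vector bounding box computed from the recognized obstacle point cloud; the simplest variant is an axis-aligned box, sketched below (an oriented bounding box would need an extra PCA or rotating-calipers step, and the function name is illustrative):

```python
import numpy as np

def bounding_box(points):
    """Axis-aligned bounding box of an obstacle point cloud,
    returned as (min_corner, max_corner) over all three axes."""
    pts = np.asarray(points)
    return pts.min(axis=0), pts.max(axis=0)
```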
The process by which the NPU1223 obtains the parcel boundaries of the mapping area may be: the NPU1223 performs boundary identification on the DOM; an AI boundary identification model can be deployed on the NPU1223, and the NPU1223 obtains the parcel boundaries by running the AI boundary identification model.
Finally, the image processing chip 122 may transmit the mapping data of the mapping area, i.e., the dense three-dimensional point cloud, DOM, obstacle boundaries, and parcel boundaries of the mapping area, to the drone 110.
In some embodiments, after the image processing chip 122 generates the dense three-dimensional point cloud and the DOM of the mapping area, the dense three-dimensional point cloud and the DOM may be compressed separately to save storage space and increase data transmission speed.
The image processing chip 122 is further configured to convert the dense three-dimensional point cloud into a voxel point cloud, and compress and slice the voxel point cloud to obtain a voxel point cloud tile.
The image processing chip 122 is further configured to slice the digital orthophoto map to obtain an orthophoto tile; wherein the mapping data comprises a voxel point cloud tile and an orthoimage tile.
Alternatively, the process of compressing the dense three-dimensional point cloud by the image processing chip 122 may be:
The CPU1221 is used for converting the dense three-dimensional point cloud into a voxel point cloud.
The CPU1221 is further configured to compress the voxel point cloud according to a set compression format, and slice the compressed voxel point cloud to obtain a voxel point cloud tile.
Converting the dense three-dimensional point cloud into a voxel point cloud means representing the dense three-dimensional point cloud by voxelization; the voxel grid size may be 0.1 m. The set compression format may be the LAZ format, and compressing the voxel point cloud into LAZ effectively reduces the data size. Meanwhile, the voxel point cloud can be sliced with an open-source framework (e.g., Entwine) to obtain the voxel point cloud tiles.
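The voxelization step at a 0.1 m grid can be sketched as follows, keeping one representative point (the voxel centre) per occupied cell; the LAZ compression and Entwine tiling stages are left to those external tools:

```python
import numpy as np

def voxelize(points, voxel_size=0.1):
    """Voxelize a dense point cloud: snap each point to a grid of the
    given cell size and keep one centre point per occupied voxel."""
    idx = np.floor(np.asarray(points) / voxel_size).astype(np.int64)
    cells = np.unique(idx, axis=0)        # one row per occupied voxel
    return (cells + 0.5) * voxel_size     # voxel centres
```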
Alternatively, the process of the image processing chip 122 performing compression processing on the DOM may be:
the CPU1221 is further configured to slice the digital orthophoto map to obtain an orthophoto tile. The ortho-image tiles include ortho-image edge tiles and ortho-image non-edge tiles, and the ortho-image edge tiles are saved in a first setting format and the ortho-image non-edge tiles are saved in a second setting format.
The first set format may be the png format and the second set format the jpeg format. That is, the CPU1221 may slice the DOM into orthoimage tiles, saving orthoimage edge tiles in png format and orthoimage non-edge tiles in jpeg format. Compared with storing every tile in png, this mixed-format scheme effectively compresses the data volume while preserving the transparency (alpha) channel at the edges of the orthoimage.
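The mixed-format decision described above reduces to checking whether a tile contains any transparent pixels; a sketch assuming RGBA tiles, with the actual png/jpeg encoding delegated to an image library:

```python
import numpy as np

def tile_format(tile_rgba):
    """Choose the storage format for an orthoimage tile: png when the
    tile contains transparent pixels (map edge), jpeg otherwise."""
    alpha = np.asarray(tile_rgba)[..., 3]
    return "png" if np.any(alpha < 255) else "jpeg"
```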
Correspondingly, the NPU1223 is configured to perform obstacle identification on the voxel point cloud, and obtain an obstacle boundary of the mapping area.
The NPU1223 is further used for carrying out boundary identification on the orthoimage tiles to obtain the parcel boundaries of the surveying and mapping area; wherein the mapping data further comprises obstacle boundaries and parcel boundaries.
That is, the mapping data may include voxel point cloud tiles, orthophoto tiles, obstacle boundaries, and parcel boundaries of the mapping area.
The process of obtaining the obstacle boundary of the mapping area by the NPU1223 may be: the NPU1223 performs AI obstacle recognition on the voxel point cloud, recognizes an obstacle point cloud, and calculates a vector bounding box of the obstacle point cloud, thereby obtaining an obstacle boundary.
The process by which the NPU1223 obtains the parcel boundaries of the mapping area may be: the NPU1223 performs boundary identification on the orthoimage tiles; an AI boundary identification model can be deployed on the NPU1223, and the NPU1223 obtains the parcel boundaries by running the AI boundary identification model.
Finally, the image processing chip 122 may transmit the mapping data of the mapping area, i.e., the voxel point cloud tiles, orthophoto tiles, obstacle boundaries, and parcel boundaries of the mapping area, to the drone 110.
In some embodiments, referring to fig. 1 to fig. 3, the mapping system 100 may further include a cloud server 130 and a user terminal 140, where the cloud server 130 is connected to the drone 110 through a network in a communication manner, and the user terminal 140 is connected to the cloud server 130 through a network in a communication manner.
And the cloud server 130 is configured to receive and store the mapping data sent by the unmanned aerial vehicle 110.
And the user terminal 140 is used for generating the route generation instruction.
The cloud server 130 is further configured to receive an airline generation instruction sent by the user terminal 140, and plan an operation airline of the mapping area according to the airline generation instruction and the mapping data.
After the mapping device 120 generates the mapping data, the mapping data may be sent to the drone 110, and then stored to the cloud server 130 by the drone 110. In this way, if a working route of the mapping area is to be generated, the user may operate the user terminal 140, and the user terminal 140 generates a route generation instruction based on the user operation and sends the route generation instruction to the cloud server 130, where the route generation instruction is used to instruct the cloud server 130 to generate the working route of the mapping area.
In some embodiments, the cloud server 130 may be deployed with an image processing model.
The unmanned aerial vehicle 110 is further configured to send the aerial photograph taken by the mapping device 120 to the cloud server 130.
The cloud server 130 is further configured to invoke the image processing model to generate mapping data of the mapping area according to the aerial image.
The camera 121 may transmit the aerial image to the drone 110 after shooting the aerial image of the mapping area, and the drone 110 sends the aerial image to the cloud server 130 to generate mapping data.
It should be noted that the process of generating the mapping data of the mapping area by the cloud server 130 according to the aerial image is similar to the above-described process of generating the mapping data by the mapping device 120, and is not described herein again. In this embodiment, the mapping device 120 and the cloud server 130 can both generate mapping data of a mapping area according to an aerial picture, and the two have the following differences:
On the one hand, the cloud server 130 has stronger computing power, so it can deploy more image processing models, such as various types of AI boundary recognition models; it can also process aerial pictures of higher resolution; and, during picture processing, it can employ a wider variety of image processing algorithms.
On the other hand, the mapping device 120 can be detachably mounted on the drone 110, and pictures can be processed into mapping data as soon as they are taken, so the device offers good portability and real-time performance.
Therefore, in practice, those skilled in the art can flexibly choose, according to actual needs, whether the mapping data is generated by the mapping device 120 or by the cloud server 130, which is not limited herein.
Optionally, the drone 110 may include a flight controller, to which the mapping device 120 may send the final generated mapping data.
And the flight controller is used for planning the operation air route of the mapping area according to the mapping data.
The operation route for the mapping area may also be generated by the flight controller, and the flight controller may transmit the generated operation route to the user terminal 140.
The following describes a mapping method applied to the above mapping system.
Referring to fig. 4, fig. 4 is a schematic flow chart of a mapping method provided in an embodiment of the present application, where the mapping method includes the following steps:
S100, the unmanned aerial vehicle flies according to a set air route, and triggers the mapping equipment to take a picture when flying to a set picture taking point.
S200, the mapping equipment shoots aerial pictures of the mapping area under the triggering of the unmanned aerial vehicle, generates mapping data of the mapping area according to the aerial pictures and sends the mapping data to the unmanned aerial vehicle.
Referring to fig. 5, based on fig. 4, step S200 may include:
S201, the camera shoots aerial pictures of the surveying and mapping area under the triggering of the unmanned aerial vehicle, and transmits the aerial pictures to the image processing chip in real time, wherein one set shooting point corresponds to one aerial picture.
S202, the image processing chip generates mapping data of the mapping area according to all aerial pictures.
As an implementation manner, referring to fig. 6 on the basis of fig. 5, step S202 may include:
S2021, the image processing chip generates pose information of each aerial image and sparse three-dimensional point cloud of a surveying and mapping area under a world coordinate system according to all the aerial images.
S2022, the image processing chip reconstructs dense three-dimensional point cloud of the surveying and mapping area based on each pose information.
S2023, the image processing chip generates a digital orthophoto map of the surveying and mapping area according to the sparse three-dimensional point cloud; wherein the mapping data comprises a dense three-dimensional point cloud and a digital orthophoto map.
S2024, the NPU identifies the obstacles to the dense three-dimensional point cloud to obtain the obstacle boundary of the surveying and mapping area.
S2025, carrying out boundary identification on the digital orthophoto map by the NPU to obtain a plot boundary of the surveying and mapping area; wherein the mapping data further comprises obstacle boundaries and parcel boundaries.
As another embodiment, referring to fig. 6 on the basis of fig. 5, step S202 may include:
S2021, the image processing chip generates pose information of each aerial image and sparse three-dimensional point cloud of a surveying and mapping area under a world coordinate system according to all the aerial images.
S2022, the image processing chip reconstructs dense three-dimensional point cloud of the surveying and mapping area based on each pose information.
S2023, the image processing chip generates a digital orthophoto map of the surveying and mapping area according to the sparse three-dimensional point cloud.
S202a, the image processing chip converts the dense three-dimensional point cloud into a voxel point cloud, and the voxel point cloud is compressed and sliced to obtain a voxel point cloud tile.
S202b, the image processing chip carries out slicing processing on the digital orthophoto map to obtain an orthophoto tile; wherein the mapping data comprises a voxel point cloud tile and an orthoimage tile.
S202c, the NPU identifies the barrier of the voxel point cloud to obtain the barrier boundary of the mapping area.
S202d, the NPU carries out boundary identification on the orthophoto tile to obtain a plot boundary of the surveying and mapping area; wherein the mapping data further comprises obstacle boundaries and parcel boundaries.
Referring to fig. 8 based on fig. 6 and fig. 7, step S2021 may include:
S2021a, the GPU performs feature extraction on the aerial image and outputs feature information of the aerial image.
S2021b, the CPU uses the SFM algorithm to calculate the first pose information of each aerial image and the first sparse three-dimensional point cloud of the surveying and mapping area according to each feature information.
S2021c, the CPU obtains shooting information of any target aerial image, wherein the shooting information records RTK coordinates of the unmanned aerial vehicle, and one set shooting point corresponds to one RTK coordinate.
S2021d, the CPU calculates a similarity transformation matrix according to the first pose information and the RTK coordinates of the target aerial photo.
S2021e, the CPU transforms each first pose information and the first sparse three-dimensional point cloud according to the similarity transformation matrix to obtain each pose information and the sparse three-dimensional point cloud.
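The similarity transformation between the SfM frame and the geodetic frame can be estimated by aligning the camera centres from the first pose information with the recorded RTK coordinates; a least-squares sketch using Umeyama's method (the function name and the use of NumPy are assumptions, not the patent's implementation):

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares similarity transform (scale s, rotation R,
    translation t) mapping src points onto dst, via Umeyama's method.
    src, dst: (N, 3) arrays of corresponding points."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    xs, xd = src - mu_s, dst - mu_d
    cov = xd.T @ xs / len(src)            # cross-covariance of the pairs
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1                      # guard against reflections
    R = U @ S @ Vt
    s = np.trace(np.diag(D) @ S) / xs.var(axis=0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t
```

The recovered (s, R, t) can then be applied to every first pose and to the first sparse point cloud to express them in the RTK frame.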
Referring to fig. 9, based on fig. 6 and fig. 7, step S2022 may include:
S2022a, the CPU carries out binocular stereo correction on any two adjacent aerial pictures with matching relation according to each pose information to obtain each corrected aerial picture.
S2022b, the GPU generates a disparity map corresponding to each aerial image from each corrected aerial image by using the semi-global matching SGM algorithm.
S2022c, the GPU converts each disparity map into a depth map corresponding to each aerial image according to the relationship between disparity and depth.
S2022d, fusing each depth map by the CPU to obtain dense three-dimensional point cloud.
Referring to fig. 10 based on fig. 6 and fig. 7, step S2023 may include:
S2023a, the CPU constructs a digital grid model for the sparse three-dimensional point cloud.
S2023b, the CPU projects the grids on the surface of the digital grid model to the corresponding aerial images by using a back projection method to obtain a projection area.
S2023c, the CPU adds texture information to the mesh according to the projection area to obtain a digital mesh model with the image texture attached.
S2023d, the CPU performs forward projection on the digital mesh model with the image texture attached to obtain the digital orthophoto map.
Referring to fig. 11, after step S200, the mapping method may further include steps S300 to S500 based on fig. 4.
S300, the cloud server receives and stores the mapping data sent by the unmanned aerial vehicle.
S400, the user terminal generates a route generation instruction.
S500, the cloud server receives an air route generation instruction sent by the user terminal, and plans an operation air route of the surveying and mapping area according to the air route generation instruction and the surveying and mapping data.
Referring to fig. 12, on the basis of fig. 4, after step S200, the mapping method may further include steps S600 to S700.
S600, the unmanned aerial vehicle sends aerial pictures shot by the surveying and mapping equipment to a cloud server.
S700, the cloud server calls the image processing model to generate the mapping data of the mapping area according to the aerial picture.
Optionally, the drone 110 includes a flight controller, and the mapping device 120 may transmit the finally generated mapping data to the flight controller, and thus, after step S200, the mapping method may further include step S800.
And S800, planning the operation air route of the surveying and mapping area by the flight controller according to the surveying and mapping data.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific processes of the mapping method described above may refer to the corresponding processes in the foregoing system embodiments, and are not described herein again.
In summary, the mapping system and mapping method provided by the embodiments of the present application include an unmanned aerial vehicle and a mapping device mounted on the unmanned aerial vehicle. During mapping, the unmanned aerial vehicle flies along a set route and triggers the mapping device to take a picture when it reaches a set shooting point; the mapping device captures aerial pictures of the mapping area and generates mapping data of the mapping area from those pictures, so that mapping data can be generated as soon as the pictures are taken. The mapping system provided by the embodiments of the present application therefore offers good portability and real-time performance, and can complete surveying and mapping accurately and quickly.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (13)

1. A surveying system, characterized in that the surveying system comprises an unmanned aerial vehicle and a surveying device in communication connection with the unmanned aerial vehicle, the surveying device being mounted on the unmanned aerial vehicle;
the unmanned aerial vehicle is used for flying according to a set air route and triggering the mapping equipment to take a picture when flying to a set picture taking point;
the surveying and mapping equipment is used for shooting an aerial picture of a surveying and mapping area under the triggering of the unmanned aerial vehicle, generating surveying and mapping data of the surveying and mapping area according to the aerial picture, and sending the surveying and mapping data to the unmanned aerial vehicle.
2. The mapping system of claim 1, wherein the mapping device includes a camera and an image processing chip, the camera and the image processing chip being communicatively connected;
the camera is used for shooting aerial pictures of the surveying and mapping area under the trigger of the unmanned aerial vehicle and transmitting the aerial pictures to the image processing chip in real time, wherein one set shooting point corresponds to one aerial picture;
and the image processing chip is used for generating mapping data of the mapping area according to all the aerial pictures.
3. The mapping system of claim 2,
the image processing chip is further used for generating pose information of each aerial image and sparse three-dimensional point cloud of the surveying and mapping area under a world coordinate system according to all the aerial images;
the image processing chip is further used for reconstructing dense three-dimensional point cloud of the surveying and mapping area based on each pose information;
the image processing chip is also used for generating a digital orthophoto map of the surveying and mapping area according to the sparse three-dimensional point cloud; wherein the mapping data comprises the dense three-dimensional point cloud and the digital orthophotomap.
4. The mapping system of claim 3, wherein the image processing chip includes a Central Processing Unit (CPU) and a Graphics Processor (GPU);
the GPU is used for extracting the features of the aerial photo and outputting the feature information of the aerial photo;
the CPU is further used for solving first pose information of each aerial image and first sparse three-dimensional point cloud of the surveying and mapping area by utilizing an SFM algorithm according to each feature information;
the CPU is further used for acquiring shooting information of any target aerial picture, wherein RTK coordinates of the unmanned aerial vehicle are recorded in the shooting information, and one set shooting point corresponds to one RTK coordinate;
the CPU is further used for calculating a similarity transformation matrix according to the first pose information of the target aerial image and the RTK coordinates;
the CPU is further used for transforming each first pose information and the first sparse three-dimensional point cloud according to the similarity transformation matrix to obtain each pose information and the sparse three-dimensional point cloud.
5. The mapping system of claim 4,
the CPU is also used for carrying out binocular stereo correction on any two adjacent aerial pictures with matching relation according to each pose information to obtain each corrected aerial picture;
the GPU is also used for generating a disparity map corresponding to each aerial image by utilizing a semi-global matching SGM algorithm according to each corrected aerial image;
the GPU is further used for converting each parallax image into a depth image corresponding to each aerial image according to the relation between parallax and depth;
and the CPU is also used for fusing each depth map to obtain the dense three-dimensional point cloud.
6. The mapping system of claim 4,
the CPU is also used for constructing a digital grid model for the sparse three-dimensional point cloud;
the CPU is also used for projecting the grids on the surface of the digital grid model into the corresponding aerial pictures by utilizing a back projection method to obtain a projection area;
the CPU is also used for adding texture information to the grid according to the projection area to obtain a digital grid model attached with image textures;
and the CPU is also used for carrying out forward projection on the digital grid model attached with the image texture to obtain the digital orthophoto map.
7. The mapping system of claim 3, wherein the image processing chip includes an embedded neural network processor NPU;
the NPU is used for carrying out obstacle identification on the dense three-dimensional point cloud to obtain an obstacle boundary of the surveying and mapping area;
the NPU is further used for carrying out boundary identification on the digital orthophoto map to obtain a plot boundary of the surveying and mapping area; wherein the mapping data further comprises the obstacle boundary and the parcel boundary.
8. The mapping system of claim 2,
the image processing chip is further used for generating pose information of each aerial image and sparse three-dimensional point cloud of the surveying and mapping area under a world coordinate system according to all the aerial images;
the image processing chip is further used for reconstructing dense three-dimensional point cloud of the surveying and mapping area based on each pose information;
the image processing chip is also used for generating a digital orthophoto map of the surveying and mapping area according to the sparse three-dimensional point cloud;
the image processing chip is also used for converting the dense three-dimensional point cloud into a voxel point cloud, and compressing and slicing the voxel point cloud to obtain a voxel point cloud tile;
the image processing chip is also used for carrying out slicing processing on the digital orthophoto map to obtain an orthophoto tile; wherein the mapping data comprises the voxel point cloud tile and the orthoimage tile.
9. The mapping system of claim 8, wherein the image processing chip includes an embedded neural network processor NPU;
the NPU is used for carrying out obstacle identification on the voxel point cloud to obtain an obstacle boundary of the mapping area;
the NPU is further used for carrying out boundary identification on the orthoimage tiles to obtain the parcel boundaries of the surveying and mapping area; wherein the mapping data further comprises the obstacle boundary and the parcel boundary.
10. The surveying and mapping system according to any one of claims 1-9, further comprising a user terminal and a cloud server, the cloud server being communicatively connected to the drone, the user terminal being communicatively connected to the cloud server;
the cloud server is used for receiving and storing the mapping data sent by the unmanned aerial vehicle;
the user terminal is used for generating a route generation instruction;
the cloud server is further used for receiving the route generation instruction sent by the user terminal and planning the operation route of the surveying area according to the route generation instruction and the surveying data.
11. The mapping system of claim 10, wherein the cloud server is deployed with an image processing model; the unmanned aerial vehicle is further used for sending the aerial pictures shot by the surveying and mapping equipment to the cloud server;
the cloud server is further used for calling the image processing model to generate mapping data of the mapping area according to the aerial picture.
12. The mapping system of any of claims 1-9, wherein the drone includes a flight controller;
and the flight controller is used for planning the operation air route of the mapping area according to the mapping data.
13. A surveying and mapping method is applied to a surveying and mapping system, wherein the surveying and mapping system comprises an unmanned aerial vehicle and surveying and mapping equipment in communication connection with the unmanned aerial vehicle, and the surveying and mapping equipment is mounted on the unmanned aerial vehicle;
the mapping method comprises the following steps:
the unmanned aerial vehicle flies according to a set air route and triggers the mapping equipment to take a picture when flying to a set picture taking point;
the surveying and mapping equipment shoots aerial pictures of surveying and mapping areas under the triggering of the unmanned aerial vehicle, generates surveying and mapping data of the surveying and mapping areas according to the aerial pictures, and sends the surveying and mapping data to the unmanned aerial vehicle.
CN202011480482.7A 2020-12-15 2020-12-15 Surveying system and surveying method Active CN112729260B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011480482.7A CN112729260B (en) 2020-12-15 2020-12-15 Surveying system and surveying method

Publications (2)

Publication Number Publication Date
CN112729260A true CN112729260A (en) 2021-04-30
CN112729260B CN112729260B (en) 2023-06-09

Family

ID=75602267

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011480482.7A Active CN112729260B (en) 2020-12-15 2020-12-15 Surveying system and surveying method

Country Status (1)

Country Link
CN (1) CN112729260B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113670273A (en) * 2021-08-06 2021-11-19 诚邦测绘信息科技(浙江)有限公司 Beach soil environment detection method and system for surveying and mapping, storage medium and intelligent terminal
CN114046776A (en) * 2021-09-22 2022-02-15 北京洛斯达科技发展有限公司 Power transmission engineering water and soil conservation measure implementation checking system
CN114838710A (en) * 2022-03-29 2022-08-02 中国一冶集团有限公司 Rapid mapping method and mapping system for engineering based on unmanned aerial vehicle photographing
CN115880466A (en) * 2023-02-14 2023-03-31 山东省地质测绘院 Urban engineering surveying and mapping method and system based on unmanned aerial vehicle remote sensing

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170084037A1 (en) * 2015-09-17 2017-03-23 Skycatch, Inc. Generating georeference information for aerial images
CN207336750U (en) * 2017-11-08 2018-05-08 北京数字绿土科技有限公司 Airborne mapping equipment, unmanned plane and airborne mapping system
CN111006645A (en) * 2019-12-23 2020-04-14 青岛黄海学院 Unmanned aerial vehicle surveying and mapping method based on motion and structure reconstruction
CN111247564A (en) * 2019-03-12 2020-06-05 深圳市大疆创新科技有限公司 Method for constructing digital earth surface model, processing equipment and system
CN111540049A (en) * 2020-04-28 2020-08-14 华北科技学院 Geological information identification and extraction system and method

Also Published As

Publication number Publication date
CN112729260B (en) 2023-06-09

Similar Documents

Publication Publication Date Title
CN112729260B (en) Surveying system and surveying method
KR102001728B1 (en) Method and system for acquiring three dimentional position coordinates in non-control points using stereo camera drone
KR100912715B1 (en) Method and apparatus of digital photogrammetry by integrated modeling for different types of sensors
KR101754599B1 (en) System and Method for Extracting Automatically 3D Object Based on Drone Photograph Image
KR102007567B1 (en) Stereo drone and method and system for calculating earth volume in non-control points using the same
KR102525281B1 (en) Method and system for visual localization
Barazzetti et al. True-orthophoto generation from UAV images: Implementation of a combined photogrammetric and computer vision approach
CN110246221A (en) True orthophoto preparation method and device
CN104361628A (en) Three-dimensional real scene modeling system based on aviation oblique photograph measurement
Erenoglu et al. Accuracy assessment of low cost UAV based city modelling for urban planning
Wang et al. Estimating earthwork volumes through use of unmanned aerial systems
CN110880202B (en) Three-dimensional terrain model creating method, device, equipment and storage medium
CN112484703A (en) Surveying and mapping equipment and unmanned aerial vehicle
JP2017201261A (en) Shape information generating system
Wendel et al. Automatic alignment of 3D reconstructions using a digital surface model
Eisenbeiss et al. Photogrammetric recording of the archaeological site of Pinchango Alto (Palpa, Peru) using a mini helicopter (UAV)
JP2021117047A (en) Photogrammetric method using unmanned flight vehicle and photogrammetric system using the same
KR102567800B1 (en) Drone used 3d mapping method
KR102484772B1 (en) Method of generating map and visual localization system using the map
WO2022064242A1 (en) The method of automatic 3d designing of constructions and colonies in an smart system using a combination of machine scanning and imaging and machine learning and reconstruction of 3d model through deep learning and with the help of machine learning methods
Duan et al. Research on estimating water storage of small lake based on unmanned aerial vehicle 3D model
CN214173346U (en) Surveying and mapping equipment and unmanned aerial vehicle
CN113129422A (en) Three-dimensional model construction method and device, storage medium and computer equipment
Gotovac et al. Mapping aerial images from UAV
KR20220169342A (en) Drone used 3d mapping method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant