CN214173346U - Surveying and mapping equipment and unmanned aerial vehicle - Google Patents

Surveying and mapping equipment and unmanned aerial vehicle

Info

Publication number
CN214173346U
Authority
CN
China
Prior art keywords: target area, image, processing unit, image processing, survey data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202023060583.7U
Other languages
Chinese (zh)
Inventor
邓杭
朱嘉炜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Xaircraft Technology Co Ltd
Original Assignee
Guangzhou Xaircraft Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Xaircraft Technology Co Ltd filed Critical Guangzhou Xaircraft Technology Co Ltd
Priority to CN202023060583.7U priority Critical patent/CN214173346U/en
Application granted granted Critical
Publication of CN214173346U publication Critical patent/CN214173346U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

Embodiments of the present application relate to the technical field of surveying and mapping, and provide a surveying and mapping device and an unmanned aerial vehicle. The surveying and mapping device can be detachably mounted on the unmanned aerial vehicle, giving it good portability. The device comprises an image acquisition device and an image processing unit: during mapping, the image acquisition device captures original images of a target area and sends them to the image processing unit, which generates aerial survey data of the target area from those images. Because the original images can be acquired and processed in real time, the aerial survey data is generated in real time, so mapping can be completed accurately and quickly.

Description

Surveying and mapping equipment and unmanned aerial vehicle
Technical Field
The embodiment of the application relates to the technical field of surveying and mapping, and particularly relates to surveying and mapping equipment and an unmanned aerial vehicle.
Background
When unmanned equipment (e.g., an unmanned aerial vehicle or an unmanned ground vehicle) performs a task, the topography of the work area must be clearly understood, including elevation information, the height of ground objects, gradient, and so on; obtaining this information requires surveying and mapping the work area.
Currently, there are two common mapping methods. The first is manual dotting and surveying, which is labor-intensive and inefficient. The second is to survey with an unmanned aerial vehicle and send the images to a ground station or the cloud, where dedicated three-dimensional reconstruction software generates a three-dimensional model of the area; this process is time-consuming and offers no real-time capability.
SUMMARY OF THE UTILITY MODEL
An object of the embodiments of the present application is to provide a surveying and mapping device and an unmanned aerial vehicle, so as to solve the problem that existing mapping methods cannot perform surveying and mapping accurately and in real time.
In order to achieve the above purpose, the embodiments of the present application employ the following technical solutions:
in a first aspect, an embodiment of the present application provides a mapping apparatus, which is detachably mounted on an unmanned aerial vehicle, and includes an image acquisition apparatus and an image processing unit, where the image acquisition apparatus and the image processing unit are in communication connection; the image acquisition equipment is used for acquiring an original image of a target area and sending the original image to the image processing unit; and the image processing unit is used for receiving the original image and generating aerial survey data of the target area according to the original image.
Optionally, the image processing unit includes a central processing unit CPU and a graphics processing unit GPU, and the CPU and the GPU are communicatively connected through a bus; the CPU is used for generating aerial survey data of the target area according to the original image, and the aerial survey data comprises a three-dimensional model and a digital orthophoto map of the target area; and the GPU is used for being matched with the CPU to reconstruct a three-dimensional model of the target area.
Optionally, the image processing unit further comprises an embedded neural network processor NPU, and the CPU, the GPU and the NPU are communicatively connected by a bus; the NPU is used for carrying out obstacle identification on the three-dimensional model of the target area to obtain an obstacle boundary of the target area; the NPU is further used for carrying out boundary identification on the digital orthophoto map of the target area to obtain a plot boundary of the target area; wherein the aerial survey data further comprises an obstacle boundary and a parcel boundary of the target area.
Optionally, the image processing unit comprises an image processing chip; the CPU, the GPU and the NPU are integrated in one image processing chip.
Optionally, the mapping device further comprises a memory, the memory and the image processing unit being communicatively connected by a bus; the memory is used for storing the original image acquired by the image acquisition equipment and the aerial survey data generated by the image processing unit and temporarily storing an intermediate result generated by the image processing unit in the process of generating the aerial survey data.
Optionally, the image acquisition device is connected to the image processing unit through an image video transmission interface; the image video transmission interface comprises a USB interface.
In a second aspect, the embodiment of the present application further provides a drone, where the drone includes a flight controller and the surveying and mapping device described above, and the flight controller is in communication connection with the surveying and mapping device; the flight controller is used for controlling the unmanned aerial vehicle to fly according to a set air route and triggering the mapping equipment to take a picture when the unmanned aerial vehicle flies to a set picture taking point; the surveying and mapping equipment is used for acquiring an original image of a target area under the trigger of the flight controller, generating aerial survey data of the target area according to the original image, and transmitting the aerial survey data back to the flight controller.
Optionally, the drone is a plant protection drone.
Optionally, the drone further comprises an RTK module communicatively coupled with the flight controller; the RTK module is used for acquiring RTK coordinates of the unmanned aerial vehicle in the flight process in real time; the flight controller is further configured to acquire a current RTK coordinate of the unmanned aerial vehicle when the surveying and mapping device is triggered to take a picture, and write the current RTK coordinate into shooting information of the original image.
Optionally, the drone further comprises a communication module, the communication module being in communication with the flight controller; and the communication module is used for sending the aerial survey data of the target area to a ground station, so that the ground station can plan a three-dimensional route or a two-dimensional route of the target area according to the aerial survey data.
Compared with the prior art, the surveying and mapping device provided by the embodiments of the present application can be detachably mounted on the unmanned aerial vehicle and therefore has good portability. The device comprises an image acquisition device and an image processing unit: during mapping, the image acquisition device captures original images of the target area and sends them to the image processing unit, which generates aerial survey data of the target area from those images. Because the images are acquired and processed in real time, the aerial survey data is also generated in real time, so mapping can be completed accurately and quickly.
Drawings
Fig. 1 shows a schematic structural diagram of a mapping apparatus provided in an embodiment of the present application.
Fig. 2 shows a schematic structural diagram of another mapping apparatus provided in an embodiment of the present application.
Fig. 3 shows a schematic structural diagram of an unmanned aerial vehicle provided in an embodiment of the present application.
Fig. 4 shows a schematic structural diagram of another unmanned aerial vehicle provided in the embodiment of the present application.
Icon: 100-a mapping device; 110-an image acquisition device; 120-an image processing unit; 121-a CPU; 122-GPU; 123-NPU; 124-a memory; 10-unmanned aerial vehicle; 200-a flight controller; 300-an RTK module; 400-a communication module.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a mapping apparatus 100 according to an embodiment of the present disclosure. Surveying equipment 100 is detachably mounted on the unmanned aerial vehicle, and surveying equipment 100 comprises an image acquisition device 110 and an image processing unit 120, wherein image acquisition device 110 and image processing unit 120 are in communication connection.
The image capturing device 110 is configured to capture an original image of the target area and send the original image to the image processing unit 120.
And the image processing unit 120 is configured to receive the original image and generate aerial survey data of the target area according to the original image.
Optionally, when the topography of a target area needs to be surveyed, the surveying device 100 is mounted on the unmanned aerial vehicle; because the surveying device 100 can be detached afterwards, the arrangement has good portability. The unmanned aerial vehicle may be a dedicated surveying and mapping drone or an ordinary drone.
Optionally, aerial photography parameters of the unmanned aerial vehicle in the target area can be determined in advance according to the satellite map, and the aerial photography parameters are used for indicating the aerial photography state of the unmanned aerial vehicle. The aerial photography parameter may include at least one of a flight line, a flight altitude, a flight speed, a photographing distance interval, a photographing time interval, and the like.
When the flight route is set, the distance between any two adjacent flight segments is determined by the aerial survey requirements: it must be ensured that the overlap ratio between the original images acquired by the image acquisition device 110 exceeds a certain value, for example 60%. The specific value depends on the aerial survey requirements and can be set flexibly by the surveying staff according to the actual situation; it is not limited here.
The flight altitude is determined according to the map resolution of the satellite map, and the flight speed is determined according to the flight route and the flight parameters of the unmanned aerial vehicle.
The shooting distance interval and shooting time interval are determined according to the flight route, flight speed, and aerial survey requirements; for example, the number of original images acquired by the image acquisition device 110 may need to exceed a preset value (e.g., 100), or the overlap ratio between two adjacent original images may need to be no lower than a preset value (e.g., 30%). This is not limited here.
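As an illustrative sketch (not taken from the patent), the trigger spacing implied by a forward-overlap requirement can be derived from the flight altitude and a pinhole-camera ground footprint; all parameter values below are hypothetical:

```python
def shot_spacing(altitude_m, focal_mm, sensor_along_track_mm, overlap):
    """Distance between photo trigger points for a given forward overlap.

    With a pinhole camera, the ground footprint along the flight track is
    altitude * sensor_size / focal_length; consecutive images then overlap
    by `overlap` (0..1) when triggers are footprint * (1 - overlap) apart.
    """
    footprint_m = altitude_m * sensor_along_track_mm / focal_mm
    return footprint_m * (1.0 - overlap)

def shot_interval_s(spacing_m, speed_mps):
    """Time between triggers at a constant ground speed."""
    return spacing_m / speed_mps

# Hypothetical values: 100 m altitude, 10 mm focal length,
# 6 mm along-track sensor size, 60 % forward overlap, 8 m/s ground speed.
spacing = shot_spacing(100.0, 10.0, 6.0, 0.60)   # 24 m between shots
interval = shot_interval_s(spacing, 8.0)         # 3 s between triggers
```

The same footprint relation gives the side-lap-driven spacing between adjacent flight segments when the across-track sensor size is used instead.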
In one embodiment, the image capturing device 110 may be integrated with the image processing unit 120 into one device, or may be two devices independent from each other.
Alternatively, the image capturing device 110 may be a camera, a video camera, an optical camera, or other device or module with a shooting function.
The image capturing device 110 may be a camera commonly used in the field of unmanned aerial vehicle aerial survey; for example, a camera based on the IMX377 sensor may be used.
Optionally, the image acquisition device 110 may include an image sensor (sensor) and a processing element.
The image sensor is used for raw image acquisition and may be a CCD (Charge-Coupled Device), a CMOS (Complementary Metal-Oxide-Semiconductor) sensor, or the like.
The processing element is used to generate the shooting information of the original image, so as to achieve high-precision synchronization between shooting and positioning for the image acquisition device 110. The shooting information may be stored as EXIF (Exchangeable Image File Format) data, a format designed for digital-camera photographs that records the attribute information and shooting data of each photograph.
The shooting information of each original image records the RTK (Real-Time Kinematic) coordinates of the unmanned aerial vehicle at the moment the image acquisition device captured that image.
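EXIF stores GPS coordinates in its GPS IFD as degree/minute/second rational pairs, so an implementation would convert the decimal-degree RTK coordinate before writing it into the shooting information. A minimal sketch of that conversion (the coordinate value is hypothetical, and the actual EXIF tag writing is omitted):

```python
def deg_to_dms_rationals(decimal_deg):
    """Convert a decimal-degree coordinate into the (degrees, minutes,
    seconds) rational triple used by the EXIF GPS IFD; seconds are kept
    to 1/100 precision as (numerator, denominator) pairs."""
    value = abs(decimal_deg)
    d = int(value)
    m = int((value - d) * 60)
    s = (value - d - m / 60.0) * 3600.0
    return (d, 1), (m, 1), (int(round(s * 100)), 100)

# Hypothetical latitude in decimal degrees
lat_dms = deg_to_dms_rationals(23.1291)
```

The hemisphere (N/S, E/W) is carried separately in the GPSLatitudeRef/GPSLongitudeRef tags, which is why the sign is dropped here.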
In one embodiment, the image capturing device 110 may be connected to the image processing unit 120 through an image video transmission interface, so that the image capturing device 110 transmits the captured original image to the image processing unit through the image video transmission interface.
Alternatively, the image video transmission interface may be, but is not limited to, a USB (Universal Serial Bus) interface, an HDMI (High-Definition Multimedia Interface), an AV (Audio/Video) interface, or another special-purpose interface.
The image processing unit 120 is mainly responsible for controlling the system operation of the surveying apparatus 100 and data interaction between the surveying apparatus 100 and other modules of the drone.
Referring to fig. 2, the image processing unit 120 may include a central processing unit (CPU) 121 and a graphics processing unit (GPU) 122, and the CPU 121 and the GPU 122 are communicatively connected through a bus.
The CPU121 is used for generating aerial survey data of a target area according to an original image, wherein the aerial survey data comprises a three-dimensional model and a digital orthophoto map of the target area;
and the GPU122 is used for cooperating with the CPU to reconstruct a three-dimensional model of the target area.
The CPU121 is mainly responsible for high-performance mapping calculation in the mapping process, for example, generating a three-dimensional model and a digital orthophoto map of the target region. GPU122 is primarily responsible for image processing algorithm acceleration, such as mapping algorithm front-end feature extraction acceleration, texture mapping algorithm acceleration, and the like.
Optionally, in the mapping process, after the image capturing device 110 captures an original image of the target area, the original image is transmitted to the CPU121 in real time through the image video transmission interface, and the CPU121 receives the original image in real time for processing.
The process of the CPU121 generating the three-dimensional model and the digital orthophoto map of the target region may include the following:
the CPU121 first performs feature extraction acceleration on the original image by the GPU122, and the GPU122 extracts image features and then sends the image features to the CPU 121. The CPU121 performs feature matching according to the image features, and calculates pose information of the original image by an SFM (structure-from-motion) algorithm according to the feature matching result, where the pose information refers to the position and the rotational orientation of the original image, and includes: the pitch angle, roll angle, yaw angle and three-dimensional coordinates of the unmanned aerial vehicle. Until the image acquisition equipment 110 finishes transmitting all the original images, the CPU121 finishes extracting the image features of all the original images, and the CPU121 obtains the pose information of each original image and the sparse three-dimensional point cloud of the target area by calculating through an SFM algorithm according to the image features of all the original images.
The pose information and sparse three-dimensional point cloud computed above are expressed in a local coordinate system, that is, a coordinate system whose origin is the center of the image acquisition device 110 at capture time, so the poses of the individual images share no common reference point. They therefore need to be converted into the world coordinate system, which provides a unified reference and maps onto the real world.
The CPU 121 may take any one of the original images, read the RTK coordinates recorded in its shooting information, and align the computed pose information of that image with those RTK coordinates, that is, solve for the similarity transformation matrix from the pose information to the RTK coordinates.
The pose information of every original image is then transformed by this similarity transformation matrix to obtain its pose in the world coordinate system, and the sparse three-dimensional point cloud of the target area is transformed in the same way to obtain the sparse point cloud in the world coordinate system. In the world coordinate system, the pose information and the sparse point cloud carry the absolute coordinates and scale of real geographic positions.
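A similarity transform of this kind (scale, rotation, translation from the local SfM frame into world coordinates) is commonly solved with the Umeyama method over corresponding point pairs; a self-contained numpy sketch, offered as an illustration rather than the patent's own implementation:

```python
import numpy as np

def umeyama(src, dst):
    """Least-squares similarity transform (s, R, t) such that, for each
    pair of corresponding rows, dst ≈ s * R @ src + t.

    src, dst: (N, 3) arrays of corresponding points, e.g. camera centres
    solved by SfM in the local frame and matching RTK positions.
    """
    n = len(src)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    X, Y = src - mu_s, dst - mu_d
    U, S, Vt = np.linalg.svd(Y.T @ X / n)          # cross-covariance
    d = np.ones(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:   # avoid reflections
        d[2] = -1.0
    R = U @ np.diag(d) @ Vt
    s = (S * d).sum() / ((X ** 2).sum() / n)       # optimal scale
    t = mu_d - s * R @ mu_s
    return s, R, t
```

In practice several images' RTK positions would be stacked into `src`/`dst` so that scale and rotation are well constrained.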
After the pose information of each original image is obtained, the CPU 121 may perform binocular stereo rectification on any two adjacent original images that have a matching relationship, obtain the rectified images, and pass them to the GPU 122. Two adjacent original images with a matching relationship are two images that overlap, or two images that share a common viewpoint.
The GPU 122 receives the rectified images and recovers the disparity map corresponding to each original image with a semi-global matching (SGM) algorithm; the depth map corresponding to each original image is then computed from the camera focal length, the stereo baseline, and that image's disparity map.
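For a rectified stereo pair, depth follows from disparity as Z = f·B/d. A small illustrative sketch of that final step (the SGM matching itself is omitted, and all values are hypothetical):

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Depth map from the disparity map of a rectified pair: Z = f * B / d.
    Pixels with no match (disparity <= 0) are mapped to +inf."""
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full(d.shape, np.inf)
    valid = d > 0
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth
```

For example, with a 1000 px focal length and a 0.5 m baseline, a 10 px disparity corresponds to a depth of 50 m.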
Then, the CPU121 fuses the depth maps corresponding to each of the obtained original images, so as to obtain a dense three-dimensional point cloud of the complete scene.
Meanwhile, in order to save storage space and increase the data transmission speed, the CPU 121 may also compress the resulting dense three-dimensional point cloud.
Alternatively, the CPU 121 may represent the dense three-dimensional point cloud by voxelization, for example with a voxel grid size of 0.1 m, to obtain a voxel point cloud. The voxel point cloud is compressed in the LAZ format, which effectively reduces the data size. Finally, the voxel point cloud is sliced with the open-source framework Entwine to obtain the final voxel point-cloud tiles.
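A minimal numpy sketch of the voxelization step (one representative point per occupied 0.1 m voxel); the LAZ compression and Entwine tiling mentioned above are not reproduced here:

```python
import numpy as np

def voxel_downsample(points, voxel=0.1):
    """Replace all points falling into the same voxel by their centroid,
    turning a dense cloud into a voxel point cloud."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, inverse, counts = np.unique(keys, axis=0,
                                   return_inverse=True, return_counts=True)
    inverse = inverse.reshape(-1)          # guard against shape quirks
    sums = np.zeros((len(counts), points.shape[1]))
    np.add.at(sums, inverse, points)       # accumulate per-voxel sums
    return sums / counts[:, None]
```

Production pipelines would use a dedicated point-cloud library for this, but the grouping idea is the same.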
It should be noted that the dense three-dimensional point cloud, the voxel point cloud, and the voxel point cloud tile are three-dimensional models of the target area, the voxel point cloud is used for subsequent obstacle identification, and the voxel point cloud tile is used for final storage or interaction.
After the sparse three-dimensional point cloud of the target area is obtained, the CPU 121 may run a Delaunay triangulation algorithm to construct a mesh over the sparse point cloud; the mesh may be 2.5D or 3D and is not limited here. The CPU 121 then interpolates this mesh to obtain a DSM (Digital Surface Model) of the target area.
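As a much-simplified stand-in for this meshing-and-interpolation step (not the patent's Delaunay approach), the essence of a surface model can be sketched by binning points into a grid and keeping the highest elevation per cell:

```python
import numpy as np

def rasterize_dsm(points_xyz, cell=1.0):
    """Toy DSM: bin 3-D points into a regular XY grid and keep the
    maximum Z per cell (the visible surface); empty cells stay NaN."""
    ij = np.floor(points_xyz[:, :2] / cell).astype(int)
    ij -= ij.min(axis=0)                      # shift to non-negative indices
    dsm = np.full((ij[:, 0].max() + 1, ij[:, 1].max() + 1), np.nan)
    for (i, j), z in zip(ij, points_xyz[:, 2]):
        if np.isnan(dsm[i, j]) or z > dsm[i, j]:
            dsm[i, j] = z
    return dsm
```

A real DSM would additionally interpolate the NaN cells from the triangulated mesh, as the description above states.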
Finally, each grid cell on the DSM surface is projected into the corresponding original image by back-projection to obtain its projection area in that image, and texture and color information are assigned to the cell from the pixel values and color values inside the projection area, yielding a DOM (Digital Orthophoto Map) with image textures attached.
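The back-projection of a DSM cell into an original image is a standard pinhole projection with the image's recovered pose; a minimal sketch with a hypothetical intrinsic matrix:

```python
import numpy as np

def project(point_world, K, R, t):
    """Project a world point into pixel coordinates: x ~ K (R X + t).
    Returns the pixel position and the depth in the camera frame."""
    p_cam = R @ point_world + t
    u, v, w = K @ p_cam
    return np.array([u / w, v / w]), w

# Hypothetical intrinsics: 1000 px focal length, principal point (640, 360)
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
pixel, depth = project(np.array([1.0, 0.0, 10.0]), K, np.eye(3), np.zeros(3))
```

The returned depth lets the texturing step reject cells that are occluded in a given image before sampling its pixels.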
Likewise, to save memory space and increase data transfer speed, the CPU121 may also compress the resulting DOM.
Alternatively, the CPU 121 may slice the resulting DOM with GDAL to obtain orthoimage tiles, saving tiles at the edge of the orthoimage in PNG format and interior tiles in JPEG format. Compared with using PNG for every tile, this mixed format effectively compresses the data volume while preserving the transparency channel at the orthoimage edges.
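The PNG-versus-JPEG decision reduces to checking each RGBA tile for transparent pixels; a hedged sketch of that rule (tile layout and RGBA ordering are assumptions, not details from the patent):

```python
import numpy as np

def tile_format(rgba_tile):
    """Edge tiles of the orthoimage contain fully transparent pixels
    (alpha == 0), so they must stay PNG to keep the alpha channel;
    fully opaque interior tiles can be stored as smaller JPEGs."""
    alpha = np.asarray(rgba_tile)[..., 3]
    return "png" if (alpha == 0).any() else "jpeg"
```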
Referring to fig. 2 again, the image processing unit 120 may further include an NPU (embedded neural-network processor) 123, and the CPU 121, the GPU 122 and the NPU 123 are communicatively connected through a bus.
And the NPU123 is used for identifying the obstacle of the three-dimensional model of the target area to obtain the obstacle boundary of the target area.
The NPU123 is also used for carrying out boundary identification on the digital orthophoto map of the target area to obtain a land boundary of the target area; the aerial survey data further comprises an obstacle boundary and a plot boundary of the target area.
Optionally, the NPU123 may perform AI obstacle identification on the obtained voxel point cloud of the target area to obtain a vector bounding box of the obstacle. Obstacles may include, but are not limited to, solid elements such as trees, poles, and towers in a survey area.
An AI boundary identification model may also be deployed on the NPU123, and the NPU123 may identify the parcel boundary in the ortho-image tile by running the AI boundary identification model.
In one embodiment, the image processing unit 120 may be an image processing chip, and the CPU121, the GPU122, and the NPU123 are integrated in one image processing chip.
Optionally, the CPU 121 and the GPU 122 may use chips common in the artificial-intelligence field; for example, the CPU 121 may use an A311D and the GPU an ARM Mali-G52, and the NPU 123 may be the independent NPU built into the A311D.
Referring again to fig. 2, the image processing unit 120 may further include a memory 124, and the CPU121, the GPU122, the NPU123 and the memory 124 are communicatively connected through a bus.
The memory 124 is used for storing the original image acquired by the image acquisition device 110 and the aerial survey data generated by the image processing unit 120, and temporarily storing intermediate results generated by the image processing unit 120 in the process of generating the aerial survey data.
Alternatively, the memory 124 may be, but is not limited to, a DRAM (Dynamic Random-Access Memory) or an SRAM (Static Random-Access Memory); for example, a DDR SDRAM (Double Data Rate Synchronous Dynamic Random-Access Memory) may be used, such as a chip with model number SU1G32Z11ND4DNQ-053BT.
Referring to fig. 3, fig. 3 shows a schematic structural diagram of the unmanned aerial vehicle 10 provided in the embodiment of the present application. The drone 10 includes the surveying device 100 and the flight controller 200 described above, with the surveying device 100 and the flight controller 200 being communicatively connected.
Alternatively, the flight controller 200 may be an embedded chip commonly used in the unmanned aerial vehicle field, such as an STM32 or i.MX 6.
And the flight controller 200 is used for controlling the unmanned aerial vehicle 10 to fly according to a set air route, and triggering the mapping equipment 100 to take a picture when the unmanned aerial vehicle 10 flies to a set picture taking point.
The mapping device 100 is configured to acquire an original image of the target area under the trigger of the flight controller 200, generate aerial survey data of the target area according to the original image, and transmit the aerial survey data back to the flight controller 200.
Optionally, the set photographing points may be chosen by the surveying staff according to the actual conditions of the target area. For example, in areas with simple landforms, such as lakes and grasslands, a few photographing points may be set on the corresponding flight segment, while in areas with complex landforms, such as forests and hills, more photographing points may be set on the corresponding flight segments. This is not limited here.
Alternatively, the drone 10 may be a plant protection drone, an aerial survey drone, or another drone used for terrain mapping.
Referring to fig. 4, the drone 10 may further include an RTK module 300, and the RTK module 300 is communicatively connected to the flight controller 200 through a bus.
And the RTK module 300 is configured to acquire an RTK coordinate of the unmanned aerial vehicle 10 in the flight process in real time.
The flight controller 200 is further configured to acquire the current RTK coordinates of the unmanned aerial vehicle 10 when the surveying and mapping device is triggered to take a picture, and to write those coordinates into the shooting information of the original image.

Referring again to fig. 4, the drone 10 further includes a communication module 400, and the communication module 400 is communicatively connected to the flight controller 200 through a bus.
And the communication module 400 is configured to send the aerial survey data of the target area to the ground station, so that the ground station can plan a three-dimensional route or a two-dimensional route of the target area according to the aerial survey data.
Alternatively, mapping device 100 may be coupled to flight controller 200 via an image video transmission interface such that mapping device 100 may transmit the generated aerial survey data to flight controller 200 via the image video transmission interface.
Optionally, the flight controller 200 is further configured to generate a three-dimensional route or a two-dimensional route of the target area according to the aerial survey data returned by the mapping apparatus 100.
Meanwhile, after the flight controller 200 generates the three-dimensional or two-dimensional flight path of the target area, the three-dimensional or two-dimensional flight path of the target area may be transmitted to the ground station through the communication module 400.
Alternatively, the image video transmission interface may be, but is not limited to, a USB (Universal Serial Bus) interface, an HDMI (High-Definition Multimedia Interface), an AV (Audio/Video) interface, or another special-purpose interface.
Alternatively, the flight controller 200 may be connected to the image capture device 110 via an EVENT interface. During the mapping process, the image capture device 110 may continuously send EVENT pulses through the EVENT interface to obtain the current RTK coordinates of the drone 10, and when the flight controller 200 triggers the image capture device 110 to take a photograph, the currently acquired RTK coordinates can be written into the EXIF information of that photograph.
Optionally, the mapping device 100 may also include a power interface electrically connected to the power supply of the drone 10, so that the drone 10 powers the mapping device 100, for example with a 12 V / 2 A supply.
In summary, the surveying and mapping device provided by the embodiments of the present application can be detachably mounted on the unmanned aerial vehicle and therefore has good portability. The device comprises an image acquisition device and an image processing unit: during mapping, the image acquisition device captures original images of the target area and sends them to the image processing unit, which generates aerial survey data of the target area from those images. Because the images are acquired and processed in real time, the aerial survey data is also generated in real time, so mapping can be completed accurately and quickly.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A surveying device, wherein the surveying device is detachably mounted on an unmanned aerial vehicle and comprises an image acquisition device and an image processing unit, the image acquisition device being in communication connection with the image processing unit;
the image acquisition equipment is used for acquiring an original image of a target area and sending the original image to the image processing unit;
and the image processing unit is used for receiving the original image and generating aerial survey data of the target area according to the original image.
2. The surveying device according to claim 1, wherein the image processing unit comprises a central processing unit CPU and a graphics processing unit GPU, the CPU and the GPU being communicatively connected by a bus;
the CPU is used for generating aerial survey data of the target area according to the original image, and the aerial survey data comprises a three-dimensional model and a digital orthophoto map of the target area;
and the GPU is used for being matched with the CPU to reconstruct a three-dimensional model of the target area.
3. The mapping device of claim 2, wherein the image processing unit further comprises an embedded neural-network processor (NPU), the CPU, the GPU, and the NPU being communicatively connected by a bus;
the NPU is used for carrying out obstacle identification on the three-dimensional model of the target area to obtain an obstacle boundary of the target area;
the NPU is further used for carrying out boundary identification on the digital orthophoto map of the target area to obtain a plot boundary of the target area; wherein the aerial survey data further comprises an obstacle boundary and a parcel boundary of the target area.
4. The surveying apparatus according to claim 3, wherein the image processing unit includes an image processing chip;
the CPU, the GPU and the NPU are integrated in one image processing chip.
5. The surveying device of claim 1, further comprising a memory, the memory and the image processing unit being communicatively connected by a bus;
the memory is used for storing the original image acquired by the image acquisition equipment and the aerial survey data generated by the image processing unit and temporarily storing an intermediate result generated by the image processing unit in the process of generating the aerial survey data.
6. The surveying device according to claim 1, wherein the image acquisition device is connected with the image processing unit via an image/video transmission interface;
The image/video transmission interface comprises a USB interface.
7. An unmanned aerial vehicle, characterized in that it comprises a flight controller and a surveying device according to any one of claims 1-6, the flight controller being in communication connection with the surveying device;
the flight controller is used for controlling the unmanned aerial vehicle to fly according to a set air route and triggering the mapping equipment to take a picture when the unmanned aerial vehicle flies to a set picture taking point;
the mapping equipment is used for acquiring an original image of a target area under the trigger of the flight controller, generating aerial survey data of the target area according to the original image and transmitting the aerial survey data back to the flight controller.
8. The unmanned aerial vehicle of claim 7, wherein the unmanned aerial vehicle is a plant protection unmanned aerial vehicle.
9. The unmanned aerial vehicle of claim 7, further comprising an RTK module communicatively connected with the flight controller;
the RTK module is used for acquiring RTK coordinates of the unmanned aerial vehicle in the flight process in real time;
the flight controller is further configured to acquire a current RTK coordinate of the unmanned aerial vehicle when the surveying and mapping device is triggered to take a picture, and write the current RTK coordinate into shooting information of the original image.
10. The unmanned aerial vehicle of claim 7, further comprising a communication module communicatively connected with the flight controller;
and the communication module is used for sending the aerial survey data of the target area to a ground station, so that the ground station can plan a three-dimensional route or a two-dimensional route of the target area according to the aerial survey data.
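Claim 10's ground-station route planning can be pictured with a simple boustrophedon (lawn-mower) pattern over the bounding box of a parcel boundary; the line spacing, coordinate convention, and function name are assumptions for illustration, not the patent's planner:

```python
def lawnmower_route(boundary_points, spacing):
    """Plan a back-and-forth route covering the bounding box of a parcel.

    boundary_points: (x, y) vertices of the parcel boundary.
    spacing: distance between adjacent flight lines (e.g. camera swath width).
    Returns the waypoints of a two-dimensional route.
    """
    xs = [p[0] for p in boundary_points]
    ys = [p[1] for p in boundary_points]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)

    waypoints = []
    y = y_min
    leftward = False
    while y <= y_max:
        # Alternate sweep direction on each flight line.
        if leftward:
            waypoints += [(x_max, y), (x_min, y)]
        else:
            waypoints += [(x_min, y), (x_max, y)]
        leftward = not leftward
        y += spacing
    return waypoints

# A 100 m x 20 m parcel flown with 10 m line spacing → three flight lines.
route = lawnmower_route([(0, 0), (100, 0), (100, 20), (0, 20)], spacing=10)
print(route)
# → [(0, 0), (100, 0), (100, 10), (0, 10), (0, 20), (100, 20)]
```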
CN202023060583.7U 2020-12-15 2020-12-15 Surveying and mapping equipment and unmanned aerial vehicle Active CN214173346U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202023060583.7U CN214173346U (en) 2020-12-15 2020-12-15 Surveying and mapping equipment and unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202023060583.7U CN214173346U (en) 2020-12-15 2020-12-15 Surveying and mapping equipment and unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
CN214173346U true CN214173346U (en) 2021-09-10

Family

ID=77606763

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202023060583.7U Active CN214173346U (en) 2020-12-15 2020-12-15 Surveying and mapping equipment and unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN214173346U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115077394A (en) * 2022-07-21 2022-09-20 清华四川能源互联网研究院 Power station dam slope displacement detection method and device and electronic equipment


Similar Documents

Publication Publication Date Title
CN112484703A (en) Surveying and mapping equipment and unmanned aerial vehicle
JP7213809B2 (en) Video-based positioning and mapping method and system
CN109102537B (en) Three-dimensional modeling method and system combining two-dimensional laser radar and dome camera
WO2020135446A1 (en) Target positioning method and device and unmanned aerial vehicle
CN110648389A (en) 3D reconstruction method and system for city street view based on cooperation of unmanned aerial vehicle and edge vehicle
CN104361628A (en) Three-dimensional real scene modeling system based on aviation oblique photograph measurement
CN112740269B (en) Target detection method and device
CN112729260B (en) Surveying system and surveying method
CN103426165A (en) Precise registration method of ground laser-point clouds and unmanned aerial vehicle image reconstruction point clouds
WO2020062434A1 (en) Static calibration method for external parameters of camera
WO2021035731A1 (en) Control method and apparatus for unmanned aerial vehicle, and computer readable storage medium
JP2022077976A (en) Image-based positioning method and system
CN115641401A (en) Construction method and related device of three-dimensional live-action model
WO2023056789A1 (en) Obstacle identification method and system for automatic driving of agricultural machine, device, and storage medium
CN111247564A (en) Method for constructing digital earth surface model, processing equipment and system
Wendel et al. Automatic alignment of 3D reconstructions using a digital surface model
CN214173346U (en) Surveying and mapping equipment and unmanned aerial vehicle
CN114761997A (en) Target detection method, terminal device and medium
CN115330594A (en) Target rapid identification and calibration method based on unmanned aerial vehicle oblique photography 3D model
WO2021120389A1 (en) Coordinate transformation method and apparatus for aerial panoramic roaming data
CN108564654B (en) Picture entering mode of three-dimensional large scene
US20220113423A1 (en) Representation data generation of three-dimensional mapping data
CN109978997A (en) A kind of transmission line of electricity three-dimensional modeling method and system based on inclination image
WO2021051220A1 (en) Point cloud fusion method, device, and system, and storage medium
CN107784666B (en) Three-dimensional change detection and updating method for terrain and ground features based on three-dimensional images

Legal Events

Date Code Title Description
GR01 Patent grant