CN112837378B - Aerial camera attitude external dynamic calibration and mapping method based on multi-unmanned aerial vehicle formation - Google Patents


Info

Publication number
CN112837378B
CN112837378B (application CN202110151354.6A / CN202110151354A)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
camera
image
mapping
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110151354.6A
Other languages
Chinese (zh)
Other versions
CN112837378A (en)
Inventor
徐辰晓
王蛟龙
杨晓庆
陈泽阳
段一铭
祝丽娜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangnan University
Original Assignee
Jiangnan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangnan University
Priority application: CN202110151354.6A
Published as CN112837378A (application publication) and CN112837378B (granted patent)
Legal status: Active


Classifications

    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G01C 11/00 — Photogrammetry or videogrammetry, e.g. stereogrammetry; photographic surveying
    • G01C 11/04 — Interpretation of pictures
    • G01C 25/00 — Manufacturing, calibrating, cleaning, or repairing instruments or devices of this subclass
    • G01S 19/42 — Satellite radio beacon positioning systems (e.g. GPS, GLONASS, GALILEO): determining position
    • G06T 3/4038 — Image mosaicing, e.g. composing plane images from plane sub-images
    • G06T 7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2200/32 — Indexing scheme involving image mosaicing
    • G06T 2207/10004 — Image acquisition modality: still image; photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Manufacturing & Machinery (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Processing (AREA)

Abstract

An aerial camera attitude external dynamic calibration and mapping method based on multi-unmanned aerial vehicle formation belongs to the technical field of geographic-information calibration of images. In the aerial image taken by the camera of the formation's main unmanned aerial vehicle, the invention combines the global absolute coordinates of the several cooperative unmanned aerial vehicles with their pixel positions in the camera picture, inverts the attitude angles of the main camera, and then performs mapping, solving the low-precision problem of the prior art. The resulting aerial-photography system has low dependence on image feature points, high mapping precision, high dynamic-shooting precision and high efficiency, breaking the dependence of conventional image-stitching technology on usable image features.

Description

Aerial camera attitude external dynamic calibration and mapping method based on multi-unmanned aerial vehicle formation
Technical Field
The invention belongs to the technical field of geographic-information calibration of images, and particularly relates to a method, independent of flight attitude, for calibrating the geographic information of aerial images taken by a multi-unmanned aerial vehicle formation.
Background
For aerial images that lack effectively identifiable image features, i.e. optical features that can be recognized and used as references, image-stitching methods based on feature matching often fail, seriously degrading the accuracy and consistency of environment recognition, detection and modeling. Attitude-free approaches place high demands on precision, efficiency, update frequency and real-time performance, require expensive hardware, and are therefore unsuitable for low-cost unmanned aerial vehicle systems. Achieving time synchronization and spatial synchronization across vehicles is a further difficulty.
In complex aerial-photography environments the images to be stitched contain many feature points, but when photographing, for example, a lake, image features in the scene are sparse. Airborne optical mapping places high demands on the attitude accuracy of the aerial images, and the positioning accuracy of the inertial sensors and compasses used in conventional calibration methods cannot meet the requirements of high-precision aerial mapping. A higher-precision method is therefore needed to solve this problem of the prior art.
Disclosure of Invention
The invention provides a method, independent of flight attitude, for calibrating the geographic information of aerial images taken by a multi-vehicle formation. It constructs an unmanned aerial vehicle aerial-photography system with low dependence on image feature points, high mapping precision, high dynamic-shooting precision and high efficiency, breaking the dependence of conventional image-stitching technology on usable image features.
The method is realized by a calibration system of dynamically flying unmanned aerial vehicles in cooperative formation; the system comprises one main unmanned aerial vehicle and three or more cooperative unmanned aerial vehicles. The calibration and mapping method comprises the following steps:
First, form the main unmanned aerial vehicle and the cooperative unmanned aerial vehicles into a suitable formation and plan the flight path, ensuring that the cooperative unmanned aerial vehicles stay within the image captured by the main unmanned aerial vehicle and form a definite spatial geometric relationship with it. The global absolute coordinates of the cooperative unmanned aerial vehicles (e.g. GPS fixes) define this actual multi-vehicle spatial geometry. All vehicles are time-synchronized and share their coordinates synchronously in real time.
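As a concrete illustration of the shared-coordinate geometry in this first step, the GPS fixes of the vehicles can be converted into a common local East-North-Up (ENU) frame so that the formation's spatial relationship is expressed in metres. A minimal sketch; the flat-earth approximation and all coordinate values are illustrative assumptions, not values from the patent:

```python
import math

def geodetic_to_enu(lat, lon, alt, lat0, lon0, alt0):
    """Small-area equirectangular approximation, adequate for a
    formation a few hundred metres across."""
    R_EARTH = 6378137.0  # WGS-84 equatorial radius, metres
    d_lat = math.radians(lat - lat0)
    d_lon = math.radians(lon - lon0)
    east = d_lon * R_EARTH * math.cos(math.radians(lat0))
    north = d_lat * R_EARTH
    up = alt - alt0
    return east, north, up

# Main drone's ground point as local origin; one cooperative drone
# 0.001 degrees (about 111 m) further north at 100 m altitude.
e, n, u = geodetic_to_enu(31.001, 120.0, 100.0, 31.0, 120.0, 0.0)
```

With all vehicles expressed in one such frame, the "actual space geometric relationship" of the formation is simply the set of ENU vectors between them.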
Second, keeping the formation and altitude of the main and cooperative unmanned aerial vehicles unchanged, fly synchronously along the planned path; at each coordinate point to be mapped, the main unmanned aerial vehicle photographs the cooperative unmanned aerial vehicles together with the mapping area.
Third, after shooting, obtain the attitude angles of the main unmanned aerial vehicle camera by combining the pixel positions of the cooperative unmanned aerial vehicles in the captured image with their global absolute coordinates; the shapes of the two geometric relationships involved are directly related to the camera's attitude angles. Specifically: first, identify the pixel positions of the cooperative unmanned aerial vehicles in the camera image to determine the multi-vehicle geometric relationship within the camera's field of view; second, determine the actual multi-vehicle spatial geometric relationship from their global absolute coordinates; then invert the attitude angles of the main camera, comprising pitch, roll and yaw, from the projective transformation relating the two geometries.
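The inversion in this third step can be sketched as follows. Note that this replaces the patent's unspecified projective-transformation solve with a standard equivalent, bearing-vector alignment via the Kabsch/SVD method; the pinhole intrinsics and the synthetic formation below are assumptions for illustration only:

```python
import numpy as np

def pixels_to_bearings(pix, fx, fy, cx, cy):
    """Back-project pixels to unit bearing vectors in the camera frame
    (pinhole model, +z along the optical axis)."""
    v = np.column_stack([(pix[:, 0] - cx) / fx,
                         (pix[:, 1] - cy) / fy,
                         np.ones(len(pix))])
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def solve_attitude(world_pts, cam_center, cam_bearings):
    """Kabsch: find rotation R (world -> camera) such that
    cam_bearing_i ~= R @ world_bearing_i."""
    u = world_pts - cam_center
    u /= np.linalg.norm(u, axis=1, keepdims=True)
    H = u.T @ cam_bearings                    # 3x3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T                     # proper rotation, det = +1

# Synthetic check: main camera 500 m up, pointing straight down; three
# cooperative drones at 100 m altitude in a known ENU formation.
R_true = np.diag([1.0, -1.0, -1.0])           # nadir-looking camera
cam = np.array([0.0, 0.0, 500.0])
world = np.array([[0.0, 0.0, 100.0],
                  [50.0, 0.0, 100.0],
                  [0.0, 50.0, 100.0]])
b = (R_true @ ((world - cam) /
     np.linalg.norm(world - cam, axis=1, keepdims=True)).T).T
fx = fy = 800.0
cx, cy = 320.0, 240.0
pix = np.column_stack([cx + fx * b[:, 0] / b[:, 2],
                       cy + fy * b[:, 1] / b[:, 2]])
R_est = solve_attitude(world, cam, pixels_to_bearings(pix, fx, fy, cx, cy))
```

Pitch, roll and yaw can then be read off `R_est` with any chosen Euler-angle convention; with noiseless detections the recovered rotation matches the true camera attitude exactly.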
Fourth, in the captured image, use the pixel positions of the cooperative unmanned aerial vehicles and their global positioning coordinates, together with the camera's global coordinates and the attitude angles determined in the third step, to apply a rotational orthographic projection to the image pixels, converting every pixel other than those occupied by the cooperative unmanned aerial vehicles into corresponding global absolute coordinates; this completes the processing of the original image.
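The rotational orthographic projection of this fourth step amounts to casting each pixel's viewing ray through the recovered camera pose and intersecting it with the ground. A minimal sketch, assuming a pinhole camera and flat ground at z = 0; the intrinsics and pose values are illustrative:

```python
import numpy as np

def pixel_to_ground(px, py, R_wc, cam_pos, fx, fy, cx, cy):
    """Intersect the viewing ray of pixel (px, py) with the ground
    plane z = 0.  R_wc rotates camera-frame vectors into the world
    (ENU) frame; cam_pos is the camera centre in world coordinates."""
    ray_cam = np.array([(px - cx) / fx, (py - cy) / fy, 1.0])
    ray_world = R_wc @ ray_cam
    if ray_world[2] >= 0:            # ray does not descend: no ground hit
        return None
    t = -cam_pos[2] / ray_world[2]   # scale factor down to z = 0
    return cam_pos + t * ray_world

# Nadir camera 500 m up: the principal point maps straight down.
R_wc = np.diag([1.0, -1.0, -1.0])    # camera optical axis points down
g = pixel_to_ground(320, 240, R_wc, np.array([10.0, 20.0, 500.0]),
                    800.0, 800.0, 320.0, 240.0)
```

Applying this to every pixel yields the global absolute coordinates the step describes; `R_wc` is the transpose of the world-to-camera rotation recovered in the third step.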
Fifth, using the global absolute coordinates of the aerial data, stitch the orthographically projected image data inverted from the formation's multi-frame aerial mapping into complete image information and coordinate-consistent mapping data under a unified coordinate system. The method calibrates the image and coordinate data in a Riemannian manifold space, which effectively improves the calibration precision and consistency of the image pixels.
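The stitching in this fifth step can be pictured as rasterising each georeferenced, orthographically projected frame into one shared world grid. A toy sketch with illustrative names and an assumed common resolution; the patent's Riemannian-manifold calibration is not modelled here:

```python
import numpy as np

def place_patch(mosaic, origin_en, res, patch, patch_origin_en):
    """Copy one rectified patch into the global mosaic.
    origin_en / patch_origin_en: (east, north) of the lower-left
    corners, in metres; res: metres per pixel, shared by both."""
    col = int(round((patch_origin_en[0] - origin_en[0]) / res))
    row = int(round((patch_origin_en[1] - origin_en[1]) / res))
    h, w = patch.shape[:2]
    mosaic[row:row + h, col:col + w] = patch
    return mosaic

# 100 m x 100 m map at 1 m/pixel; one 10 m x 10 m frame placed at
# (east=30 m, north=40 m) relative to the map origin.
mosaic = np.zeros((100, 100))
patch = np.ones((10, 10))
place_patch(mosaic, (0.0, 0.0), 1.0, patch, (30.0, 40.0))
```

Because every frame carries global coordinates from the fourth step, no feature matching between frames is needed to decide where each patch lands.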
Further, each unmanned aerial vehicle is a four-rotor unmanned aerial vehicle.
Further, the four-rotor unmanned aerial vehicle comprises a carbon-fiber airframe, motors, carbon-fiber propeller blades, a rechargeable lithium battery, a wireless data-transmission communication module, a GPS module, a Pixhawk flight-control module and a Firefly camera gimbal. The functions of the latter four are as follows:
Wireless data-transmission communication module: transmits the aerial photographs, their GPS information and the vehicle's attitude data in real time, and shares data among the vehicles to maintain the formation;
GPS module: determines the vehicle's position and attitude data via GPS;
Pixhawk flight-control module: automatically maintains the normal flight attitude of the aircraft;
Firefly camera gimbal: balances and stabilizes the camera and adjusts its pitch angle.
The beneficial effects of the invention are as follows: in the aerial image taken by the camera of the formation's main unmanned aerial vehicle, the global absolute coordinates of the several cooperative unmanned aerial vehicles are combined with their pixel positions in the camera picture to invert the attitude angles of the main camera (pitch, roll and yaw), and mapping is then performed, solving the low-precision problem of the prior art.
Drawings
Fig. 1 is a diagram of the projection geometry of the main unmanned aerial vehicle and the cooperative unmanned aerial vehicles on the photograph.
Fig. 2 is a schematic diagram of the positions of the cooperative unmanned aerial vehicles in the image when the main unmanned aerial vehicle camera is tilted rather than pointing vertically downward.
Detailed Description
The invention achieves its effects through software design and through modification and configuration of the hardware.
A main unmanned aerial vehicle performs image shooting at an altitude of 500 meters. Three cooperative unmanned aerial vehicles fly at 100 meters, within the image area directly below the main vehicle, arranged into a suitable formation that establishes the spatial geometric relationship. After the main vehicle shoots the first image, all four vehicles translate simultaneously while shooting continuously at high speed. The other images are inverted using the GPS coordinates of the three cooperative vehicles, and finally the multi-vehicle, multi-frame aerial-mapping inversion data are stitched into complete image information and coordinate-consistent mapping data under a unified coordinate system. Geographic-information calibration of multi-vehicle formation aerial images independent of flight attitude is thereby realized. Software operation:
1. Setting up the multi-vehicle formation: right-click and select "Connection Options" in the Mission Planner ground station, connect the 4 unmanned aerial vehicles to the ground station over TCP, select "Swarm" in "Temp", and set the main unmanned aerial vehicle; the other vehicles automatically enter "Guided" mode. Select any slave and click Takeoff: the 3 slaves take off to the set height, which, since they are four-rotor vehicles, can be set directly to 100 meters. Then switch to the main vehicle, right-click on the map interface and select Takeoff; it can climb directly to 500 meters, or take off to low altitude first and then climb. In the Control panel, the master and one slave can be steered into the required formation; thereafter only the master needs to be controlled, and the remaining slaves automatically keep formation and follow it to execute the task.
2. Multi-vehicle formation landing: command the main unmanned aerial vehicle back above the landing point; the remaining slaves keep the original formation and follow it back. Then land each slave individually, and finally land the main vehicle.
3. Time and space synchronization among the vehicles: once the formation is established, time synchronization is largely solved; at any absolute time the GPS coordinates of the master and every slave are unified. The absolute time of a photograph taken by the master can be computed simply from the relative time difference between the master and any slave's display. As for spatial synchronization, the formation algorithm is robust: the GPS coordinates and pose data of the master can be computed from the images combined with a slave's GPS, and comparing the computed data with the flight-control log data yields the relative error, ensuring spatial synchronization and mapping accuracy.
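The timestamp bookkeeping described above can be sketched as follows: the photo's absolute time is a slave's clock time plus the measured master-slave offset, and the slave's GPS fix at that instant is linearly interpolated from its logged track. The offset, track values and the interpolation scheme are illustrative assumptions:

```python
import numpy as np

def fix_at(t_photo, t_fix, fixes):
    """Linearly interpolate a slave's GPS track (N x 3 array of
    lat/lon/alt, timestamped by t_fix) at the photo's absolute time."""
    return np.array([np.interp(t_photo, t_fix, fixes[:, k])
                     for k in range(fixes.shape[1])])

t_offset = 0.12                      # measured master-vs-slave offset, s
t_photo = 100.00 + t_offset          # absolute time of the exposure
t_fix = np.array([100.0, 100.2])     # slave GPS fix timestamps, s
fixes = np.array([[31.0,     120.0, 100.0],
                  [31.0002,  120.0, 100.0]])
pos = fix_at(t_photo, t_fix, fixes)
```

Interpolating every slave's track to the same exposure time gives the synchronized coordinate set that the attitude inversion consumes.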
4. Image-stitching algorithm: in each photograph taken by the master, the three slaves themselves are the optical features of the image. After the slaves' GPS coordinate data in each image are processed uniformly, stitching based on these optical features can be carried out with a simple algorithm while maintaining high precision.
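Because the slaves serve as matched features with known coordinates, two overlapping frames can be registered from as few as two matched drone detections. A sketch that recovers a 2-D similarity transform (scale, rotation, shift) with complex arithmetic; the matched pixel values are made up for illustration:

```python
def similarity_from_two(pA, qA, pB, qB):
    """Similarity transform mapping image-A pixels to image-B pixels
    from two matched detections, as complex map z_B = s * z_A + t."""
    zA1, zA2 = complex(*pA), complex(*qA)
    zB1, zB2 = complex(*pB), complex(*qB)
    s = (zB2 - zB1) / (zA2 - zA1)    # encodes scale and rotation
    t = zB1 - s * zA1                # translation
    return s, t

def apply(s, t, p):
    z = s * complex(*p) + t
    return (z.real, z.imag)

# Two drones seen in both frames; the second frame is the first
# shifted by (+50, -20) pixels.
s, t = similarity_from_two((100, 100), (200, 150),
                           (150, 80), (250, 130))
```

With three or more slaves the transform is over-determined, so a least-squares fit over all detections would improve robustness to detection noise.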
5. Improving precision: GPS positioning accuracy can be further improved by lidar calibration or RTK positioning.
In terms of hardware, the invention uses four-rotor rather than fixed-wing unmanned aerial vehicles. Four-rotor vehicles are easy to modify and operate, and have clear advantages over fixed-wing vehicles for monitoring water environments. Fixed-wing vehicles suit large-area aerial photography and survey, but cannot hover precisely at fixed points, fly fast, and are unsuited to high-quality acquisition of static images. Since the attitude-independent multi-vehicle calibration method requires fixed-point, static acquisition of geographic imagery of the water environment, four-rotor vehicles meet these requirements better and are therefore adopted.
The four-rotor hardware used by the invention comprises a carbon-fiber airframe, motors, carbon-fiber propeller blades, a rechargeable lithium battery, a wireless data-transmission communication module, a GPS module, a Pixhawk flight-control module, a Firefly camera gimbal, etc.

Claims (7)

1. An aerial camera attitude external dynamic calibration and mapping method based on multi-unmanned aerial vehicle formation, characterized in that the calibration method is realized by a calibration system of dynamically flying unmanned aerial vehicles in cooperative formation, the system comprising one main unmanned aerial vehicle and three or more cooperative unmanned aerial vehicles; the calibration and mapping method comprises the following steps:
First, form the main unmanned aerial vehicle and the cooperative unmanned aerial vehicles into a suitable formation and plan the flight path, ensuring that the cooperative unmanned aerial vehicles stay within the image captured by the main unmanned aerial vehicle and form a definite spatial geometric relationship with it; the global absolute coordinates of the cooperative unmanned aerial vehicles define the actual multi-vehicle spatial geometry; all vehicles are time-synchronized and share their coordinates synchronously in real time;
Second, keeping the formation and altitude of the main and cooperative unmanned aerial vehicles unchanged, fly synchronously along the planned path, the main unmanned aerial vehicle photographing the cooperative unmanned aerial vehicles and the mapping area at each coordinate point to be mapped;
Third, after shooting, obtain the attitude angles of the main camera by combining the pixel positions of the cooperative unmanned aerial vehicles in the captured image with their global absolute coordinates: first, identify the pixel positions of the cooperative unmanned aerial vehicles in the camera image to determine the multi-vehicle geometric relationship within the camera's field of view; second, determine the actual multi-vehicle spatial geometric relationship from their global absolute coordinates; then invert the attitude angles of the main camera, comprising pitch, roll and yaw, from the projective transformation relating the two geometries;
Fourth, in the captured image, use the pixel positions of the cooperative unmanned aerial vehicles and their global positioning coordinates, together with the camera's global coordinates and the attitude angles determined in the third step, to apply a rotational orthographic projection to the image pixels, converting every pixel other than those occupied by the cooperative unmanned aerial vehicles into corresponding global absolute coordinates, thereby completing the processing of the original image;
Fifth, using the global absolute coordinates of the aerial data, stitch the orthographically projected image data inverted from the formation's multi-frame aerial mapping into complete image information and coordinate-consistent mapping data in a global absolute coordinate system.
2. The aerial camera attitude external dynamic calibration and mapping method based on multi-unmanned aerial vehicle formation according to claim 1, characterized in that the unmanned aerial vehicles are all four-rotor unmanned aerial vehicles.
3. The aerial camera attitude external dynamic calibration and mapping method based on multi-unmanned aerial vehicle formation according to claim 1, characterized in that the four-rotor unmanned aerial vehicle comprises a carbon-fiber airframe, motors, carbon-fiber propeller blades, a rechargeable lithium battery, a wireless data-transmission communication module, a GPS module, a Pixhawk flight-control module and a Firefly camera gimbal.
4. The aerial camera attitude external dynamic calibration and mapping method based on multi-unmanned aerial vehicle formation according to claim 3, characterized in that the wireless data-transmission communication module transmits the aerial photographs, their GPS information and the vehicle's attitude data in real time, and shares data among the vehicles to maintain the formation.
5. The aerial camera attitude external dynamic calibration and mapping method based on multi-unmanned aerial vehicle formation according to claim 3, characterized in that the GPS module determines the position and attitude data of the vehicle and its payload via GPS.
6. The aerial camera attitude external dynamic calibration and mapping method based on multi-unmanned aerial vehicle formation according to claim 3, characterized in that the Pixhawk flight-control module automatically maintains the normal flight attitude of the aircraft.
7. The aerial camera attitude external dynamic calibration and mapping method based on multi-unmanned aerial vehicle formation according to claim 3, characterized in that the Firefly camera gimbal balances and stabilizes the camera and adjusts its pitch angle.
CN202110151354.6A 2021-02-03 2021-02-03 Aerial camera attitude external dynamic calibration and mapping method based on multi-unmanned aerial vehicle formation Active CN112837378B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110151354.6A CN112837378B (en) 2021-02-03 2021-02-03 Aerial camera attitude external dynamic calibration and mapping method based on multi-unmanned aerial vehicle formation


Publications (2)

Publication Number    Publication Date
CN112837378A    2021-05-25
CN112837378B    2024-04-30

Family

ID=75931966

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110151354.6A Active CN112837378B (en) 2021-02-03 2021-02-03 Aerial camera attitude external dynamic calibration and mapping method based on multi-unmanned aerial vehicle formation

Country Status (1)

Country Link
CN (1) CN112837378B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI819569B (en) * 2022-04-14 2023-10-21 實踐大學 A zero-time huge aerial photo shooting device based on drone swarm flight

Citations (5)

Publication number Priority date Publication date Assignee Title
CN107808362A (en) * 2017-11-15 2018-03-16 北京工业大学 A kind of image split-joint method combined based on unmanned plane POS information with image SURF features
CN109358642A (en) * 2018-10-22 2019-02-19 江南大学 A kind of fruit tree plant protection and picking method based on multi-rotor unmanned aerial vehicle
CN110223386A (en) * 2019-06-10 2019-09-10 河南科技大学 A kind of digital terrain modeling method based on multi-source unmanned aerial vehicle remote sensing data fusion
CN111754451A (en) * 2019-12-31 2020-10-09 广州极飞科技有限公司 Surveying and mapping unmanned aerial vehicle achievement detection method and device, electronic equipment and storage medium
CN112288634A (en) * 2020-10-29 2021-01-29 江苏理工学院 Splicing method and device for aerial images of multiple unmanned aerial vehicles

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US10685229B2 (en) * 2017-12-21 2020-06-16 Wing Aviation Llc Image based localization for unmanned aerial vehicles, and associated systems and methods


Non-Patent Citations (2)

Title
Application of UAV Technology in Surveying and Mapping Engineering; Chen Zhongliang; Science and Technology Innovation; 2018-12-05 (No. 34); full text *
On Key Technologies and Applications of UAV Surveying and Mapping Data Processing; Li Pingsheng; Technology Innovation and Application; 2018-12-03 (No. 34); full text *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant