CN110887461B - Unmanned aerial vehicle real-time computer vision processing method based on GPS attitude estimation - Google Patents

Unmanned aerial vehicle real-time computer vision processing method based on GPS attitude estimation

Info

Publication number
CN110887461B
CN110887461B (application CN201911137550.7A)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
gps
attitude
computer vision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911137550.7A
Other languages
Chinese (zh)
Other versions
CN110887461A (en)
Inventor
刘贞报
邢轶超
江飞鸿
严月浩
张超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN201911137550.7A priority Critical patent/CN110887461B/en
Publication of CN110887461A publication Critical patent/CN110887461A/en
Application granted granted Critical
Publication of CN110887461B publication Critical patent/CN110887461B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 - Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 - Determining position
    • G01S19/43 - Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The invention provides a method for accelerating real-time computer vision processing on an unmanned aerial vehicle based on GPS attitude estimation. Several GPS receivers are mounted at fixed positions on the unmanned aerial vehicle, and a computer processor receives the raw GPS measurements from each GPS satellite and determines the distance between each GPS receiver and the satellite. From these distances the processor determines the attitude of the unmanned aerial vehicle and, from it, the camera attitude rotation matrix. Using this rotation matrix together with the target position in the image at the previous moment, a computer vision algorithm predicts the target position in the image at the next moment, and the target search starts from that predicted position. This reduces the search space by orders of magnitude, accelerates the search, and greatly reduces the computational complexity of the computer vision processing.

Description

Unmanned aerial vehicle real-time computer vision processing method based on GPS attitude estimation
Technical Field
The invention relates to the field of computer vision processing of unmanned aerial vehicles, in particular to a real-time computer vision processing method of an unmanned aerial vehicle based on GPS attitude estimation.
Background
Unmanned aerial vehicle technology has seen many major innovations and great progress in recent years, but major technical challenges remain. Chief among them is enabling the unmanned aerial vehicle to adapt to arbitrary environments. The best way to address this is real-time computer vision processing: with computer vision, an unmanned aerial vehicle flying in an unknown environment can quickly find or track a target, which is also significant for multi-vehicle cooperation and similar tasks.
In view of these needs, it is essential to implement computer vision processing functions on the unmanned aerial vehicle platform itself. Since these functions generally require good real-time performance, the computation cannot be offloaded to a powerful back-end cloud computing system, and the unmanned aerial vehicle must be self-sufficient in computing power.
The weight and energy consumption of the onboard computing platform are therefore tightly constrained, and a corresponding method is needed to reduce the amount of computation in the vision processing pipeline.
Disclosure of Invention
Aiming at the problem that computing capability is limited by equipment size and power consumption when an unmanned aerial vehicle performs real-time computer vision processing, a method is provided for accelerating computer vision processing through GPS attitude estimation, realized with several GPS receivers mounted at different positions on the unmanned aerial vehicle.
The invention is realized by the following technical scheme:
the unmanned aerial vehicle real-time computer vision processing method based on GPS attitude estimation comprises the following steps:
step 1, acquiring the attitudes of the unmanned aerial vehicle at time n and time n+1, and acquiring the images shot by the unmanned aerial vehicle in the attitudes corresponding to time n and time n+1, where n ≥ 1;
step 2, determining the attitude rotation matrix of the camera from time n to time n+1 according to the attitudes of the unmanned aerial vehicle at time n and time n+1;
step 3, calculating the predicted position of the target in the image at time n+1 from the camera attitude rotation matrix from time n to time n+1 combined with the target position in the image at time n, and executing a target search starting from the predicted position to obtain the target position in the image at time n+1;
and step 4, at time n+2, returning to step 1 until the target search is completed, thereby completing the GPS-attitude-estimation-based real-time computer vision processing of the unmanned aerial vehicle.
Preferably, the specific method for acquiring the attitude of the unmanned aerial vehicle in step 1 is as follows: several GPS receivers are mounted on the unmanned aerial vehicle, the distance from each GPS receiver to the satellite is acquired, and the attitude of the unmanned aerial vehicle is determined from those distances.
Preferably, the method for acquiring the distance between the satellite and the GPS receiver is as follows:
the GPS receivers acquire raw GPS measurements between the GPS receivers and the satellites, the raw GPS measurements including pseudorange and carrier phase data, and the distance between each GPS receiver and the satellite is determined by a carrier phase and pseudorange combined measurement method.
Preferably, the camera attitude rotation matrix R in step 2 satisfies:
R×A=B
where A is the measurement matrix of the unmanned aerial vehicle attitude at time n, B is the measurement matrix of the unmanned aerial vehicle attitude at time n+1, and R is the attitude rotation matrix of the camera from time n to time n+1.
Preferably, the measurement matrix of the unmanned aerial vehicle attitude is a 3 × 3 matrix.
Preferably, in step 3, the camera attitude rotation matrix from time n-1 to time n and the target position in the image at time n-1 are input into a computer vision algorithm to obtain the predicted target position in the image at time n.
Preferably, in step 3, when n = 1, the picture obtained at time n is input into a computer vision algorithm to obtain the target position in that image.
Compared with the prior art, the invention has the following beneficial technical effects:
the unmanned aerial vehicle real-time computer vision processing method based on GPS attitude estimation provided by the invention obtains the attitudes of the unmanned aerial vehicle at two adjacent moments and the images shot under the attitudes, determines the rotation matrix of the camera according to the attitudes of the unmanned aerial vehicle at the two adjacent moments, predicts the target predicted position in the image at the next moment by adopting a computer vision algorithm according to the rotation matrix of the camera and combining the target position in the image at the previous moment, and performs target search by taking the target predicted position as a starting point, thereby reducing the magnitude order in the search space, accelerating the search speed, and greatly reducing the computational complexity by reducing the computer vision search space.
Drawings
FIG. 1 is a flow chart of the real-time computer vision processing method for an unmanned aerial vehicle based on GPS attitude estimation of the present invention;
FIG. 2 is a diagram of the arrangement of the GPS receivers of the present invention on the drone;
FIG. 3 is an effect diagram of the unmanned aerial vehicle real-time computer vision processing method based on GPS attitude estimation.
Detailed Description
The present invention will now be described in further detail with reference to the attached drawings, which are illustrative, but not limiting, of the present invention.
As shown in fig. 1, the real-time computer vision processing method for an unmanned aerial vehicle based on GPS attitude estimation includes the following steps:
Step 1: acquire the distances between the satellite and the unmanned aerial vehicle at time n.
Specifically, several GPS receivers are installed on the unmanned aerial vehicle, spaced so that the distance between adjacent receivers is as large as possible.
As shown in fig. 2, a plurality of GPS receivers, preferably four, are mounted on the drone; the four GPS receivers 1 are mounted on the four arms 2 of the drone, respectively, and all connect to the drone body 3. The GPS receivers are physically fixed to the arms, the distance between each pair of receivers must be accurately measured, and the receivers are spaced as far apart as possible to lengthen the computed baselines and ultimately improve the accuracy of the attitude estimate.
The raw GPS measurements between the unmanned aerial vehicle and the satellites are obtained through the GPS receivers and transmitted from the receivers to the onboard data processing computer via USB, wireless RF, or another communication mechanism. The raw GPS measurements include pseudorange and carrier phase data, and the distance between each GPS receiver and the satellite is determined by a combined carrier-phase and pseudorange measurement method.
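The patent does not spell out the combined carrier-phase and pseudorange computation; one common realization is carrier smoothing (the Hatch filter), sketched below on simulated data. The noise levels, window length, and ambiguity bias are illustrative assumptions:

```python
import numpy as np

def hatch_filter(pseudorange, carrier_range, window=20):
    """Carrier-smoothed pseudorange (Hatch filter).

    Pseudoranges are unambiguous but noisy at the metre level; carrier-
    phase ranges are centimetre-precise but offset by an unknown
    ambiguity. Propagating the previous smoothed value with the
    (ambiguity-free) carrier-phase delta and blending in each new
    pseudorange averages the code noise down without resolving the
    ambiguity.
    """
    smoothed = np.empty_like(pseudorange)
    smoothed[0] = pseudorange[0]
    for k in range(1, len(pseudorange)):
        m = min(k + 1, window)
        predicted = smoothed[k - 1] + (carrier_range[k] - carrier_range[k - 1])
        smoothed[k] = pseudorange[k] / m + predicted * (m - 1) / m
    return smoothed

# Simulated receiver-to-satellite geometry (illustrative numbers only).
rng = np.random.default_rng(0)
true_range = 2.0e7 + 800.0 * np.arange(200.0)            # metres
pr = true_range + rng.normal(0.0, 3.0, 200)              # code noise ~3 m
ph = true_range + 5.417e4 + rng.normal(0.0, 0.01, 200)   # ambiguity bias + cm noise
sm = hatch_filter(pr, ph)
```

In the real system each receiver would run such a filter per satellite and feed the smoothed distances to the attitude solver.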
Step 2: determine the attitude of the unmanned aerial vehicle at time n from the satellite-to-vehicle distances, and acquire the image captured in that attitude by the camera carried on the unmanned aerial vehicle.
Specifically, the attitude of the unmanned aerial vehicle at this moment is determined from the distances received by the several GPS receivers installed at different positions on it; because the camera is fixed to the unmanned aerial vehicle, the attitude of the unmanned aerial vehicle is also the attitude of the camera. Determining attitude with GPS is an important aspect of GPS applications on spacecraft, and many methods and algorithms exist for GPS-based attitude determination, typically vector observation methods, recursive algorithms, and the like.
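As a sketch of the vector observation approach mentioned above: with the baselines between receivers known in the body frame (from where the receivers are bolted to the arms) and measured in the navigation frame by carrier-phase GPS, the attitude is the rotation best aligning the two sets, i.e. Wahba's problem, solvable in closed form by SVD (the Kabsch algorithm). The baseline vectors and the test rotation below are illustrative assumptions:

```python
import numpy as np

def attitude_from_baselines(body_vecs, ref_vecs):
    """Least-squares attitude from vector observations (Wahba's problem).

    body_vecs: 3xN matrix of inter-receiver baselines in the body frame,
               known from where the receivers are fixed on the arms.
    ref_vecs:  the same baselines resolved in the navigation frame from
               carrier-phase GPS positioning of the receivers.
    Returns the rotation R with ref_vecs ~= R @ body_vecs (Kabsch/SVD).
    """
    H = ref_vecs @ body_vecs.T
    U, _, Vt = np.linalg.svd(H)
    # Force det(R) = +1 so a reflection is never returned.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt

# Illustrative baselines (columns, body frame); not from the patent.
body = np.array([[1.0, 0.0, 0.5],
                 [0.0, 1.0, 0.5],
                 [0.0, 0.0, 0.2]])
a = np.radians(25.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
ref = R_true @ body            # noise-free navigation-frame measurements
R_est = attitude_from_baselines(body, ref)
```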
Step 3: repeat steps 1 and 2 to acquire the attitude of the unmanned aerial vehicle at time n+1 and the image shot by the camera in that attitude.
Step 4: determine the attitude rotation matrix of the camera from time n to time n+1 according to the attitudes of the unmanned aerial vehicle at time n and time n+1.
specifically, the attitude of the unmanned aerial vehicle at the time n is a first attitude, the attitude of the unmanned aerial vehicle at the time n +1 is a second attitude, the attitude matrix of the attitude of the unmanned aerial vehicle is a 3 × 3 matrix, and the attitude matrix of the first attitude is multiplied by the reciprocal of the attitude matrix of the second attitude to be used as an attitude rotation matrix of the camera from the first attitude to the second attitude, so as to represent the rotation amount of the camera between the time n and the time n + 1.
When multiple video frames or multiple still images are taken by the camera at different times, the poses corresponding to the different times may provide an accurate measure of rotation.
The attitude matrix of each moment is expressed as a 3 x 3 matrix, and interframe rotation is formed between two continuous moment images, and the rotation is three-dimensional and comprises yaw, pitch and roll; considering the two attitude matrices a and B, and also considering the attitude rotation matrix R that occurs between a and B, so that R ═ a ═ B-1. The representation of the pose rotation matrix R may optionally be modified for use in computer vision. For example, R can be converted to any other representation of a 3D rotation by standard formulas, such as quaternions.
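The quaternion conversion mentioned above uses standard formulas; a minimal sketch, where branching on the largest diagonal element keeps the divisions well-conditioned for any rotation:

```python
import numpy as np

def quat_from_matrix(R):
    """Convert a 3x3 rotation matrix to a unit quaternion (w, x, y, z)."""
    t = np.trace(R)
    if t > 0.0:
        s = 0.5 / np.sqrt(t + 1.0)
        return np.array([0.25 / s,
                         (R[2, 1] - R[1, 2]) * s,
                         (R[0, 2] - R[2, 0]) * s,
                         (R[1, 0] - R[0, 1]) * s])
    # Trace non-positive: branch on the largest diagonal element.
    i = int(np.argmax(np.diag(R)))
    j, k = (i + 1) % 3, (i + 2) % 3
    s = 2.0 * np.sqrt(1.0 + R[i, i] - R[j, j] - R[k, k])
    q = np.empty(4)
    q[0] = (R[k, j] - R[j, k]) / s
    q[1 + i] = 0.25 * s
    q[1 + j] = (R[j, i] + R[i, j]) / s
    q[1 + k] = (R[k, i] + R[i, k]) / s
    return q

# 90-degree yaw: expected quaternion (cos 45 deg, 0, 0, sin 45 deg).
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
q = quat_from_matrix(Rz90)
```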
Step 5: calculate the predicted position of the target in the image at time n+1 from the camera attitude rotation matrix from time n to time n+1 combined with the target position in the image at time n, and perform the target search starting from the predicted position to obtain the target position in the image at time n+1.
Specifically, the camera attitude rotation matrix from time n to time n+1 and the target position in the image at time n are input into a computer vision algorithm to obtain the predicted target position in the image at time n+1.
When n = 1, the picture obtained at time n is input into a computer vision algorithm to obtain the target position in that image.
The position in the image at time n+2 of the target found in the image at time n+1 can be predicted with the computed camera attitude rotation matrix, and the corresponding target detected at time n+1 is searched for in the image at time n+2 starting from that position. Searching first at the predicted position and expanding the search boundary outward from there (e.g., in concentric circles or progressively larger bounding boxes), instead of searching arbitrarily over the entire second image, increases the search speed considerably.
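The progressively larger bounding boxes can be sketched as a ring-by-ring scan outward from the predicted pixel. The frame contents and the target encoding (a single bright pixel) are illustrative assumptions:

```python
import numpy as np

def expanding_search(image, template_value, start, max_radius=None):
    """Search outward from a predicted pixel in growing square rings.

    Visits the predicted pixel first and then concentric rings around
    it, returning as soon as the target is found.
    Returns ((row, col), pixels_examined), or (None, pixels_examined).
    """
    rows, cols = image.shape
    r0, c0 = start
    if max_radius is None:
        max_radius = max(rows, cols)
    examined = 0
    for radius in range(max_radius + 1):
        for r in range(r0 - radius, r0 + radius + 1):
            for c in range(c0 - radius, c0 + radius + 1):
                # Keep only cells whose Chebyshev distance equals `radius`.
                if max(abs(r - r0), abs(c - c0)) != radius:
                    continue
                if 0 <= r < rows and 0 <= c < cols:
                    examined += 1
                    if image[r, c] == template_value:
                        return (r, c), examined
    return None, examined

# Illustrative frame: target pixel at (52, 61), prediction at (50, 60).
frame = np.zeros((100, 100), dtype=np.uint8)
frame[52, 61] = 255
loc, examined = expanding_search(frame, 255, start=(50, 60))
```

A full scan would look at up to 10,000 pixels here; the expanding search stops after a few dozen because the prediction is close.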
Step 6: at time n+2, return to step 1 until the target search is completed, completing the real-time computer vision processing of the unmanned aerial vehicle based on GPS attitude estimation.
The invention provides a real-time computer vision processing method for an unmanned aerial vehicle based on GPS attitude estimation, using several GPS receivers mounted at fixed positions on the unmanned aerial vehicle and one or more computer processors, and causes the computer system to receive raw GPS measurements from each GPS satellite. The raw GPS measurements include pseudorange and carrier phase data, and the distance between each GPS receiver and the satellite is determined by a combined carrier-phase and pseudorange measurement method. The attitude of the unmanned aerial vehicle is determined from these relative distance measurements, the camera attitude rotation matrix is determined from the attitude, a computer vision algorithm predicts the target position in the image at the next moment from the rotation matrix and the target position in the image at the previous moment, and the target search starts from the predicted position, reducing the search space by orders of magnitude, accelerating the search, and greatly reducing computational complexity.
The invention uses several GPS receivers to estimate the attitude of the unmanned aerial vehicle and applies it to computer vision processing, achieving an order-of-magnitude reduction in the search space of spatial computer vision heuristics. For example, the system finds several feature points in image A that correspond to the same points in image B; if the rotation angle from image A to image B is known, the system can re-project A based on that angle to determine the starting point of the search for each feature in image B, speeding up the process.
As shown in fig. 3, the first image 7 contains a certain feature point 4. Without GPS attitude acceleration, the starting point 5 of the feature search in the second image 8 is generally taken at the image center and the search is slow; with GPS attitude acceleration, the starting point 5 of the feature search in the second image 9 is taken at the position obtained by passing the feature point's position in the first image through the rotation matrix, which greatly reduces the search time.
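The re-projection through the rotation matrix can be sketched with the infinite homography H = K R K⁻¹, which is exact for a pure camera rotation (scene depth drops out, which is what makes the GPS attitude alone useful as a search seed). The pinhole intrinsics and the 2-degree yaw below are illustrative assumptions:

```python
import numpy as np

def predict_pixel(K, R, pixel):
    """Predict where a feature seen at `pixel` in image A lands in image B.

    For a pure inter-frame rotation R, image points map through the
    infinite homography H = K @ R @ K^-1, independent of depth.
    """
    H = K @ R @ np.linalg.inv(K)
    p = H @ np.array([pixel[0], pixel[1], 1.0])
    return p[:2] / p[2]

# Illustrative pinhole intrinsics: focal length 800 px, 640x480 frame.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
# Small yaw between frames, 2 degrees about the camera's y axis.
a = np.radians(2.0)
R = np.array([[np.cos(a), 0.0, np.sin(a)],
              [0.0, 1.0, 0.0],
              [-np.sin(a), 0.0, np.cos(a)]])
start = predict_pixel(K, R, (320.0, 240.0))  # search seed in image B
```

A feature at the principal point shifts horizontally by about f·tan(2°) ≈ 28 px, so the search seed already sits next to the true match.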
The camera attitude rotation matrix is used as an input to basic computer vision algorithms, and computational complexity is greatly reduced by shrinking the computer vision search space. Applications of such computer vision algorithms include, but are not limited to: 1. keypoint detection; 2. keypoint matching; 3. structure from motion; 4. motion tracking, such as optical flow; 5. blob detection and tracking. The camera attitude is used for visual feature matching: a 2D point in one image is matched with a 2D point in a second image such that the two represent the same real point in 3D space. These points can be tracked using the features output by a commonly used SIFT feature detector.
Computer vision algorithms can be accelerated by sensor measurement inputs, so the main problem is how to estimate the relative camera attitude across a series of images. Drones can of course carry a variety of spatial and inertial sensors, such as compasses, barometers, accelerometers, and gyroscopes; these could also be applied, but their accuracy falls short of what is needed.
The pose of the camera can be divided into two parts, position and orientation; the relative pose between two images from the same camera includes a rotation and a translation. Unmanned aerial vehicles typically fly in open outdoor environments, which provides good working conditions for GPS.
For measuring translation, differential GPS can already estimate centimetre-level relative motion, using an auxiliary GPS base station on the ground to compensate for ionospheric transmission delays. Considering that a drone's flight range is large, errors of a few centimetres translate into very small errors in the resulting images, so the translation of the camera position can be measured with differential GPS.
However, accurate rotation measurement of the unmanned aerial vehicle is difficult: driven by small high-torque direct-current motors, it can change direction rapidly. Accelerometers and gyroscopes, themselves subject to these inertial forces, can track relative rotational position, but their accuracy is unsatisfactory. In computer vision applications, absolute measurements are more reliable than inertial estimates.
GPS is already used in some applications to estimate absolute orientation (attitude): on aircraft, ships, and the like, GPS receivers can be installed at known separations, and the relative distances between each receiver and the satellite can be used to infer angular position. Sophisticated GPS calculations based on carrier phase measurements make these relative measurements very accurate.
Although the present disclosure has been described primarily in the context of a drone, the method is equally applicable to other smart devices such as autonomous cars and the like. The computer system is only one example of a processing system and does not set any limit to the scope of use or functionality of the methods described herein. In addition, the processing system may operate in conjunction with many other general purpose or special purpose computing systems, including but not limited to personal computer systems, server computer systems, hand-held devices, multiprocessor systems, programmable consumer electronics, network PCs, minicomputers, and any distributed cloud computing environment that includes the above systems or devices, and the like.
Components of the computer system may include one or more processors or processing units, a system memory, and various system components coupled to a processor bus. The processor may execute the program modules to implement the methods described herein. The computer system also includes a variety of readable storage media that can be any available media that can be accessed by the computer system including both removable and non-removable media.
The computer system may also communicate with one or more external devices, such as a keyboard, pointing device, or display, allowing one or more users to interact with it; the computer system can likewise communicate with one or more other computing devices through communication devices such as network cards. The connection may be through a local area network, a wide area network, or a public network (e.g., the Internet).
The computer-readable program instructions for performing the methods of the present invention may be assembly instructions, instruction set architecture instructions, machine-dependent instructions, microcode, firmware instructions, or the like. Source code or object code may be written in any combination of one or more programming languages, including object-oriented languages such as C++ and procedural languages such as the "C" programming language.
The above-mentioned contents are only for illustrating the technical idea of the present invention, and the protection scope of the present invention is not limited thereby, and any modification made on the basis of the technical idea of the present invention falls within the protection scope of the claims of the present invention.

Claims (7)

1. An unmanned aerial vehicle real-time computer vision processing method based on GPS attitude estimation, comprising the following steps: step 1, acquiring the attitudes of the unmanned aerial vehicle at time n and time n+1, and acquiring the images shot by the unmanned aerial vehicle in the attitudes corresponding to time n and time n+1, where n ≥ 1;
step 2, determining the attitude rotation matrix of the camera from time n to time n+1 according to the attitudes of the unmanned aerial vehicle at time n and time n+1;
step 3, calculating the predicted position of the target in the image at time n+1 from the camera attitude rotation matrix from time n to time n+1 combined with the target position in the image at time n, and executing a target search starting from the predicted position to obtain the target position in the image at time n+1;
and step 4, at time n+2, returning to step 1 until the target search is completed, thereby completing the GPS-attitude-estimation-based real-time computer vision processing of the unmanned aerial vehicle.
2. The real-time computer vision processing method for unmanned aerial vehicle based on GPS attitude estimation as claimed in claim 1, wherein the specific method for obtaining the attitude of unmanned aerial vehicle in step 1 is as follows: the method comprises the steps that a plurality of GPS receivers are installed on the unmanned aerial vehicle, the distance from each GPS receiver to a satellite is obtained, and the attitude of the unmanned aerial vehicle is determined according to the distance from each GPS receiver to the satellite.
3. The method of real-time computer vision processing for unmanned aerial vehicle based on GPS attitude estimation of claim 2, wherein the method of obtaining the distance between the satellite and the GPS receiver is as follows:
the GPS receivers acquire raw GPS measurements between the GPS receivers and the satellites, the raw GPS measurements including pseudorange and carrier phase data, and the distance between each GPS receiver and the satellite is determined by a carrier phase and pseudorange combined measurement method.
4. The real-time computer vision processing method for an unmanned aerial vehicle based on GPS attitude estimation according to claim 1, wherein the camera attitude rotation matrix R in step 2 satisfies:
R×A=B
where A is the measurement matrix of the unmanned aerial vehicle attitude at time n, B is the measurement matrix of the unmanned aerial vehicle attitude at time n+1, and R is the attitude rotation matrix of the camera from time n to time n+1.
5. The method of claim 4, wherein the measurement matrix of the pose of the drone is a 3 x 3 matrix.
6. The real-time computer vision processing method for an unmanned aerial vehicle based on GPS attitude estimation according to claim 1, wherein in step 3, the camera attitude rotation matrix from time n-1 to time n and the target position in the image at time n-1 are input into a computer vision algorithm to obtain the predicted target position in the image at time n.
7. The real-time computer vision processing method for an unmanned aerial vehicle based on GPS attitude estimation according to claim 1, wherein in step 3, when n = 1, the picture obtained at time n is input into a computer vision algorithm to obtain the target position in that image.
CN201911137550.7A 2019-11-19 2019-11-19 Unmanned aerial vehicle real-time computer vision processing method based on GPS attitude estimation Active CN110887461B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911137550.7A CN110887461B (en) 2019-11-19 2019-11-19 Unmanned aerial vehicle real-time computer vision processing method based on GPS attitude estimation


Publications (2)

Publication Number Publication Date
CN110887461A CN110887461A (en) 2020-03-17
CN110887461B true CN110887461B (en) 2021-04-06

Family

ID=69748051

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911137550.7A Active CN110887461B (en) 2019-11-19 2019-11-19 Unmanned aerial vehicle real-time computer vision processing method based on GPS attitude estimation

Country Status (1)

Country Link
CN (1) CN110887461B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112585946A (en) * 2020-03-27 2021-03-30 深圳市大疆创新科技有限公司 Image shooting method, image shooting device, movable platform and storage medium

Citations (5)

Publication number Priority date Publication date Assignee Title
CN104457761A (en) * 2014-11-18 2015-03-25 上海新跃仪表厂 Characteristic relay method for relative position and attitude based on multi-vision
CN105698762A (en) * 2016-01-15 2016-06-22 中国人民解放军国防科学技术大学 Rapid target positioning method based on observation points at different time on single airplane flight path
CN108845335A (en) * 2018-05-07 2018-11-20 中国人民解放军国防科技大学 Unmanned aerial vehicle ground target positioning method based on image and navigation information
CN109405821A (en) * 2018-09-21 2019-03-01 北京三快在线科技有限公司 Method, apparatus used for positioning and target device
CN109974693A (en) * 2019-01-31 2019-07-05 中国科学院深圳先进技术研究院 Unmanned plane localization method, device, computer equipment and storage medium


Non-Patent Citations (3)

Title
Research Status and Development Trends of GPS Attitude Measurement Technology; Sun Ligang; Automation Application; 2017-12-31; pp. 109-110 *
SCKF-Based Attitude Estimation for Quadrotor UAVs; Zhang Huangjun et al.; Journal of Jiangxi Normal University (Natural Science Edition); 2019-03-30; Vol. 43, No. 2; pp. 154-159 *
Research on High-Precision GPS Positioning Methods and Their Application in UAV Positioning Systems; Ge Changli; China Master's Theses Full-text Database, Engineering Science & Technology II; 2019-02-15; pp. 10-15, 35-39 *

Also Published As

Publication number Publication date
CN110887461A (en) 2020-03-17

Similar Documents

Publication Publication Date Title
CN112347840B (en) Vision sensor laser radar integrated unmanned aerial vehicle positioning and image building device and method
US10976446B2 (en) Acceleration of real time computer vision processing on UAVs through GPS attitude estimation
Savage Strapdown inertial navigation integration algorithm design part 2: Velocity and position algorithms
CN107850436B (en) Sensor fusion using inertial and image sensors
CN107850901B (en) Sensor fusion using inertial and image sensors
Schneider et al. Fast and effective online pose estimation and mapping for UAVs
CN107850899B (en) Sensor fusion using inertial and image sensors
WO2016187759A1 (en) Sensor fusion using inertial and image sensors
US11875519B2 (en) Method and system for positioning using optical sensor and motion sensors
Wang et al. Bearing-only visual SLAM for small unmanned aerial vehicles in GPS-denied environments
CN111623773B (en) Target positioning method and device based on fisheye vision and inertial measurement
CN112967392A (en) Large-scale park mapping and positioning method based on multi-sensor contact
CN112136137A (en) Parameter optimization method and device, control equipment and aircraft
CN110887461B (en) Unmanned aerial vehicle real-time computer vision processing method based on GPS attitude estimation
CN112154480B (en) Positioning method and device for movable platform, movable platform and storage medium
Michalczyk et al. Radar-inertial state-estimation for UAV motion in highly agile manoeuvres
CN111351487A (en) Clock synchronization method and device of multiple sensors and computing equipment
Dhahbane et al. Attitude determination and attitude estimation in aircraft and spacecraft navigation. A survey
Azizi et al. 3D inertial algorithm of SLAM for using on UAV
Yang et al. Inertial-aided vision-based localization and mapping in a riverine environment with reflection measurements
Stepanyan et al. Adaptive multi-sensor information fusion for autonomous urban air mobility operations
RU2812755C2 (en) Device for determining object coordinates
CN113470342B (en) Method and device for estimating self-movement
Chathuranga et al. Aerial image matching based relative localization of a uav in urban environments
Steiner et al. Vision-based navigation and hazard detection for terrestrial rocket approach and landing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant