CN115144867A - Target detection positioning method based on unmanned aerial vehicle carrying three-axis pan-tilt camera - Google Patents

Target detection positioning method based on unmanned aerial vehicle carrying three-axis pan-tilt camera

Info

Publication number
CN115144867A
CN115144867A (application CN202210733457.8A)
Authority
CN
China
Prior art keywords
target
aerial vehicle
unmanned aerial
camera
point
Prior art date
Legal status
Pending
Application number
CN202210733457.8A
Other languages
Chinese (zh)
Inventor
胡佳
李锐
王洪添
朱翔宇
Current Assignee
Shandong Inspur Science Research Institute Co Ltd
Original Assignee
Shandong Inspur Science Research Institute Co Ltd
Priority date
Filing date
Publication date
Application filed by Shandong Inspur Science Research Institute Co Ltd
Priority to CN202210733457.8A
Publication of CN115144867A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00 Equipment not otherwise provided for
    • B64D47/08 Arrangements of cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/43 Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to a target detection and positioning method based on an unmanned aerial vehicle carrying a three-axis pan-tilt camera. The method takes the unmanned aerial vehicle as a platform and, combined with the three-axis pan-tilt, captures a target after enhancing, denoising, segmenting and extracting the image acquired by the vision sensor. A simple geometric positioning model is constructed, and the existing image data are analyzed and converted in combination with the camera parameters. To obtain the actual position coordinates of the target, the pan-tilt camera keeps the target locked at the center of the field of view and follows its movement, and the relative position of the target and the unmanned aerial vehicle is calculated promptly and accurately in combination with the distance to the target measured by the laser ranging sensor; the specific position of the monitored target is then calculated through coordinate transformation. The invention effectively addresses the complex operation, high cost, poor robustness and poor accuracy of existing target detection and positioning systems under conditions of low cost and low information content, and provides an accurate and rapid positioning system for rescue work.

Description

Target detection positioning method based on unmanned aerial vehicle carrying three-axis pan-tilt camera
Technical Field
The invention relates to the field of target detection, in particular to a method for real-time target detection and positioning based on an unmanned aerial vehicle carrying a three-axis pan-tilt camera in combination with TinyML (Tiny Machine Learning).
Background
The camera is an important visual monitoring device that can remotely acquire useful local information within a fixed area. With the growing number of monitoring devices, video monitoring has gradually become an important tool for maintaining public security and order, and an important support for coordinating logistics traffic and detecting potential safety hazards in smart cities. However, in some situations a fixed camera can be occluded, leaving blind spots in the monitored area, and a target can easily move out of the monitored region, so many potential safety hazards remain. Moreover, dense large-area deployment of fixed monitoring systems requires extremely high cost and also places pressure on the communication network. An unmanned aerial vehicle has a wide viewing angle at high altitude, and a three-axis pan-tilt camera has a high degree of spatial freedom; together they can serve as an important component of a tracking and monitoring system and, combined with RTK positioning technology, can locate a target well in real time through spatial geometric relationships.
Some algorithms and models for drone-based tracking already exist, but current implementations are complex and consume excessive resources at runtime. How to make full use of limited information and camera parameters to detect a moving target quickly and accurately, and to obtain the position of the detected target in real time regardless of the external geographical location, remains an open problem; using an unmanned aerial vehicle combined with TinyML for real-time target detection, extraction and positioning is an important way to solve it.
Disclosure of Invention
Aiming at the problems in the prior art, the invention discloses a target detection and positioning method based on an unmanned aerial vehicle carrying a three-axis pan-tilt camera, which establishes a rapid and accurate fire-source target detection and positioning system under conditions of low cost and low information content.
In order to solve the technical problem, the technical scheme adopted by the invention is as follows: a target detection and positioning method based on an unmanned aerial vehicle carrying a three-axis pan-tilt camera comprises the following steps:
S01) fixing a three-axis pan-tilt at the bottom of the unmanned aerial vehicle and fixing the sensors on a lever arm below the three-axis pan-tilt, the sensors comprising a vision sensor and a laser ranging sensor;
S02) calibrating the attitudes of the three-axis pan-tilt camera and the unmanned aerial vehicle, the attitudes being resolved through the IMU sensors carried by each;
S03) deploying TinyML to the airborne application platform, capturing video information in real time through the vision sensor, and feeding the video stream into the deployed TinyML detection model for detection; after a target appears in the field of view, acquiring its position in the image: the recognized target is enclosed in a rectangular frame, the diagonal coordinates of the frame are extracted, the center coordinates of the frame are calculated through the midpoint coordinate formula, and the offset from the center of the field of view is calculated to obtain the offset coordinates (Δx, Δy); the movement of the unmanned aerial vehicle is corrected through (Δx, Δy) so that the target always stays at the center point; the laser range finder then starts working, and the distance L between the unmanned aerial vehicle and the target is obtained by reading its data;
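As an illustrative sketch of the midpoint and offset computation described in this step (the image size, box coordinates and function name are assumed example values, not from the patent):

```python
def box_center_offset(x1, y1, x2, y2, img_w, img_h):
    """Compute the bounding-box center via the midpoint formula and its
    offset (dx, dy) from the center of the field of view."""
    cx = (x1 + x2) / 2.0   # midpoint of the diagonal, x
    cy = (y1 + y2) / 2.0   # midpoint of the diagonal, y
    dx = cx - img_w / 2.0  # horizontal offset from the image center
    dy = cy - img_h / 2.0  # vertical offset from the image center
    return dx, dy

# Example: a 1920x1080 frame with a detection box (800, 400)-(900, 520)
dx, dy = box_center_offset(800, 400, 900, 520, 1920, 1080)
print(dx, dy)  # -> -110.0 -80.0 (target is left of and above center)
```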
S04) obtaining the distance L and the relative angle between the unmanned aerial vehicle and the target through steps S02) and S03), establishing a coordinate system with the camera as the origin, and obtaining the position coordinates of point P relative to point O through the following calculation, where point P is the target position and point O is the camera position:
Z′ = L·cosβ,
X′ = L·sinβ·sinα,
Y′ = L·sinβ·cosγ,
where L is the distance between the unmanned aerial vehicle and the target, and α, β, γ are the angles of point P relative to the X, Z and Y axes of point O respectively;
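A direct transcription of these formulas as a small sketch (angles in radians; the function name is illustrative):

```python
import math

def relative_position(L, alpha, beta, gamma):
    """Target position P relative to camera origin O, per the patent's
    formulas: Z' = L*cos(beta), X' = L*sin(beta)*sin(alpha),
    Y' = L*sin(beta)*cos(gamma)."""
    x = L * math.sin(beta) * math.sin(alpha)
    y = L * math.sin(beta) * math.cos(gamma)
    z = L * math.cos(beta)
    return x, y, z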
S05) acquiring the unmanned aerial vehicle position, i.e. the position coordinates of point O: the longitude, latitude and height of the unmanned aerial vehicle in the navigation coordinate system are obtained through the airborne RTK satellite navigation positioning receiver; by means of the coordinate transformation matrix $C_b^R$, the real coordinate point of the target in the navigation coordinate system, that is, the longitude, latitude and altitude of the target, can be calculated.
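The patent only names the transformation matrix $C_b^R$ here. As a hedged illustration, assuming the camera-frame vector (X′, Y′, Z′) has already been rotated into a local East-North-Up frame by $C_b^R$, a small-offset spherical-Earth conversion to geodetic coordinates could be sketched as follows (the Earth radius constant and approximation are textbook values, not from the patent):

```python
import math

R_EARTH = 6378137.0  # WGS-84 equatorial radius in meters (approximation)

def target_geodetic(lat_deg, lon_deg, alt_m, offset_enu):
    """Add a small East-North-Up offset (meters) to the UAV's RTK position.
    Small-offset spherical approximation, adequate for short ranges only."""
    de, dn, du = offset_enu
    lat = lat_deg + math.degrees(dn / R_EARTH)
    lon = lon_deg + math.degrees(de / (R_EARTH * math.cos(math.radians(lat_deg))))
    alt = alt_m + du
    return lat, lon, alt
```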
Further, the calibration process of the pan-tilt camera and unmanned aerial vehicle attitudes is as follows:
S21) A quaternion is selected to determine the attitude angles of the unmanned aerial vehicle. The quaternion differential equation is shown in formula (1):

$$\dot{Q}(t) = \frac{1}{2}\,\Omega(\omega)\,Q(t), \qquad Q(t_0) = Q_0 \tag{1}$$

where $\dot{Q}$ and $\omega$ denote the time derivative of the quaternion and the angular velocity respectively; $Q = q_0 + q_1 i + q_2 j + q_3 k$, with $q_0$ the real part, $q_1$, $q_2$, $q_3$ the imaginary parts and $i$, $j$, $k$ the imaginary units; $t_0$ is the initial moment of the movement and $Q_0$ is the initial value of the quaternion. The initial value of the angular velocity is obtained from the gyroscope, and its matrix form is

$$\Omega(\omega) = \begin{bmatrix} 0 & -\omega_1 & -\omega_2 & -\omega_3 \\ \omega_1 & 0 & \omega_3 & -\omega_2 \\ \omega_2 & -\omega_3 & 0 & \omega_1 \\ \omega_3 & \omega_2 & -\omega_1 & 0 \end{bmatrix} \tag{2}$$

where $T_s$ denotes the sampling interval of the differentiation, $\omega_1$, $\omega_2$, $\omega_3$ are the angular velocities about the X, Y and Z axes in space, and $\Omega(\omega)$ is the matrix-valued function of the angular velocity;
S22) Assuming the angular velocity is constant within one sampling period, the discrete-domain quaternion formula is obtained by discretizing formula (2):

$$Q_{k+1} = \left(I + \frac{1}{2}\,\Omega(\omega T_s)\right) Q_k \tag{3}$$

where $Q_k$ and $Q_{k+1}$ are the quaternions at the k-th and (k+1)-th sampling instants, $I$ is the identity matrix, and $\Omega(\omega T_s)$ denotes $\Omega(\cdot)$ evaluated at the angular increment $\omega T_s$.
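As a NumPy-based sketch of formulas (2) and (3) (not from the patent text; function names are illustrative), one propagation step with re-normalization could look like:

```python
import numpy as np

def omega_matrix(w):
    """4x4 skew-symmetric matrix Omega(w) of formula (2) for w = (w1, w2, w3)."""
    w1, w2, w3 = w
    return np.array([[0.0, -w1, -w2, -w3],
                     [w1,  0.0,  w3, -w2],
                     [w2, -w3,  0.0,  w1],
                     [w3,  w2, -w1,  0.0]])

def propagate(q, w, Ts):
    """One first-order update of formula (3): Q_{k+1} = (I + Omega(w*Ts)/2) Q_k.
    q is a length-4 array [q0, q1, q2, q3]; re-normalized to suppress drift."""
    q_next = (np.eye(4) + 0.5 * omega_matrix(np.asarray(w) * Ts)) @ q
    return q_next / np.linalg.norm(q_next)

# Example: identity attitude, small body rates, 10 ms sampling period
q = propagate(np.array([1.0, 0.0, 0.0, 0.0]), (0.01, 0.02, 0.005), 0.01)
```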
the coordinate transformation matrix from the b system to the R system determined by the quaternion is as follows: wherein b is a carrier coordinate system, and R represents a navigation coordinate system;
Figure BDA0003713113150000029
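A direct transcription of this matrix as a sketch (q is assumed to be a unit quaternion):

```python
import numpy as np

def quat_to_dcm(q):
    """Coordinate transformation matrix C_b^R from a unit quaternion
    q = [q0, q1, q2, q3] (carrier frame b to reference frame R)."""
    q0, q1, q2, q3 = q
    return np.array([
        [q0*q0 + q1*q1 - q2*q2 - q3*q3, 2*(q1*q2 - q0*q3),             2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3),             q0*q0 - q1*q1 + q2*q2 - q3*q3, 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2),             2*(q2*q3 + q0*q1),             q0*q0 - q1*q1 - q2*q2 + q3*q3],
    ])
```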
When the reference coordinate system R is the navigation coordinate system, the coordinate transformation matrices corresponding to the three basic rotations are:

$$C_x(R) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos R & \sin R \\ 0 & -\sin R & \cos R \end{bmatrix}, \quad C_y(P) = \begin{bmatrix} \cos P & 0 & -\sin P \\ 0 & 1 & 0 \\ \sin P & 0 & \cos P \end{bmatrix}, \quad C_z(Y) = \begin{bmatrix} \cos Y & \sin Y & 0 \\ -\sin Y & \cos Y & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
With the roll angle R, the pitch angle P and the yaw angle Y, the attitude angles are obtained from the quaternion as

$$R = \arctan\frac{2(q_0 q_1 + q_2 q_3)}{1 - 2(q_1^2 + q_2^2)}, \qquad P = \arcsin\!\big(2(q_0 q_2 - q_1 q_3)\big), \qquad Y = \arctan\frac{2(q_0 q_3 + q_1 q_2)}{1 - 2(q_2^2 + q_3^2)}$$

These angles are taken as the rotation angles of the camera about the three directions, and the rotational offset of the camera is obtained by differencing them with the initial gimbal angles.
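A sketch of this angle extraction, assuming the standard Z-Y-X (yaw-pitch-roll) convention used above:

```python
import math

def quat_to_euler(q):
    """Roll R, pitch P, yaw Y (radians) from a unit quaternion [q0, q1, q2, q3],
    assuming the standard Z-Y-X (yaw-pitch-roll) convention."""
    q0, q1, q2, q3 = q
    roll  = math.atan2(2*(q0*q1 + q2*q3), 1 - 2*(q1*q1 + q2*q2))
    pitch = math.asin(max(-1.0, min(1.0, 2*(q0*q2 - q1*q3))))  # clamp for numerical safety
    yaw   = math.atan2(2*(q0*q3 + q1*q2), 1 - 2*(q2*q2 + q3*q3))
    return roll, pitch, yaw
```

Differencing these angles with the initial gimbal angles then yields the camera's rotational offset, as the text describes.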
Further, the method is used for detecting and positioning the fire source.
Further, in step S03), the TinyML detection model performs enhancement, denoising, segmentation and extraction on the video data stream before capturing the target.
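As a minimal sketch of such preprocessing (the OpenCV pipeline and HSV thresholds are assumptions for illustration, not the patent's model):

```python
import cv2
import numpy as np

def preprocess_and_segment(frame_bgr):
    """Denoise one video frame and segment bright, fire-colored regions.
    The HSV thresholds below are illustrative placeholders."""
    denoised = cv2.GaussianBlur(frame_bgr, (5, 5), 0)  # suppress sensor noise
    hsv = cv2.cvtColor(denoised, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 120, 180], dtype=np.uint8)    # reddish-orange, saturated, bright
    upper = np.array([35, 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None                                    # no candidate region in view
    largest = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(largest)                   # (x, y, w, h) of the target
```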
The invention has the beneficial effects that: the invention provides a target detection and positioning method based on an unmanned aerial vehicle carrying a three-axis pan-tilt camera. The relative position of the fire source and the unmanned aerial vehicle is determined using the camera and laser ranging sensor carried on the vehicle; the accurate position of the vehicle is then obtained through RTK, and the relative position of the target is converted into a position in the actual navigation coordinate system through coordinate system conversion. A rapid and accurate fire-source target detection and positioning system is thus established under conditions of low cost and low information content.
Drawings
FIG. 1 is a schematic diagram of an unmanned aerial vehicle carrying a three-axis pan-tilt camera and a laser ranging sensor;
FIG. 2 is a schematic view of the target enclosed within a rectangular frame;
FIG. 3 is a schematic diagram of a coordinate system of the relative positions of the target and the camera;
In the figures: 1. motor I; 2. motor II; 3. motor III; 4. lever arm.
Detailed Description
The present invention will be further described with reference to the following examples.
Example 1
This embodiment discloses a method for real-time target detection and positioning based on an unmanned aerial vehicle carrying a three-axis pan-tilt camera in combination with TinyML, used in particular for fire detection and positioning. The method relies on the unmanned aerial vehicle, the three-axis pan-tilt and the sensors. As shown in FIG. 1, the three-axis pan-tilt is fixed at the bottom of the unmanned aerial vehicle through motor I 1, and the sensors are fixed on the lever arm below the pan-tilt: a vision sensor and a laser ranging sensor are installed on lever arm 4, and the measuring point of the laser ranging sensor is aligned with the image center point through calibration. In this embodiment, the vision sensor is a camera.
All data processing is performed by a high-performance ARM deep learning chip, which can complete the image processing and run the main algorithm. The processor is connected to the camera via USB to acquire the image data stream, which is passed to the processor for fire-source recognition. After a fire source is detected, the axis displacement is compensated according to the error between the target position fed back by the camera in real time and the camera's center position; the three motors are controlled to rotate so that the fire source stays at the center of the field of view, and the relative position of the fire source with respect to the camera is derived from the rotation angles of the three motors. The communication module comprises an RS485 part for communicating with the motors and a 5G network communication part for communicating with the monitoring platform. In serial communication both parties must adopt a standard interface so that different devices can be conveniently connected; RS485 is one of the most commonly used serial communication interfaces. The 5G network communication module features high communication speed, a wide network spectrum and flexible communication, and serves as the bridge to the remote monitoring platform.
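As a hedged sketch of this centering loop (the patent does not specify a control law; the proportional gains and function name are assumptions), a simple correction from the pixel error might look like:

```python
def gimbal_correction(dx, dy, k_yaw=0.05, k_pitch=0.05):
    """Proportional correction of the gimbal axes from the pixel error
    (dx, dy) between the target center and the image center. Gains are
    illustrative and would be tuned for the actual motors and camera FOV."""
    yaw_cmd   = -k_yaw * dx    # pan toward the target horizontally
    pitch_cmd = -k_pitch * dy  # tilt toward the target vertically
    return yaw_cmd, pitch_cmd
```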
The method comprises the following steps:
S01) fixing a three-axis pan-tilt at the bottom of the unmanned aerial vehicle and fixing the sensors on a lever arm below the three-axis pan-tilt, the sensors comprising a vision sensor and a laser ranging sensor, the vision sensor being a camera;
S02) calibrating the attitudes of the three-axis pan-tilt camera and the unmanned aerial vehicle, the attitudes being resolved through the IMU sensors carried by each;
In the initial static stage the unmanned aerial vehicle has a spatial initial attitude: on horizontal ground the roll and pitch angles are close to 0, and the yaw angle is determined by the nose direction; the pan-tilt camera has a similar attitude, and calibration requires that the relative attitude of the camera and the unmanned aerial vehicle remain consistent.
S03) deploying TinyML to the airborne application platform, capturing video information in real time through the image acquisition device, and feeding the video stream into the deployed TinyML detection model for detection; after enhancement, denoising, segmentation and extraction, the position of a target found in the field of view is acquired in the image. When the system captures the target, as shown in FIG. 2, the recognized target is enclosed in a rectangular frame; the diagonal coordinates of the frame are extracted, the center coordinates of the frame are calculated through the midpoint coordinate formula, and the offset from the center of the field of view is calculated to obtain the offset coordinates (Δx, Δy); the movement of the unmanned aerial vehicle is corrected through (Δx, Δy) so that the target always stays at the center point; the laser range finder then starts working, and the distance L between the unmanned aerial vehicle and the target is obtained by reading its data;
S04) obtaining the distance L and the relative angle between the unmanned aerial vehicle and the target through steps S02) and S03); the calibration of step S02) makes the viewing direction of the camera consistent with that of the unmanned aerial vehicle, the camera rotates relative to the unmanned aerial vehicle when tracking the object in this step, and the rotation angle is the relative angle.
As shown in FIG. 3, a coordinate system is established with the camera as the origin, and the position coordinates of point P relative to point O are obtained by the following calculation, where point P is the target position and point O is the camera position:
Z′ = L·cosβ,
X′ = L·sinβ·sinα,
Y′ = L·sinβ·cosγ,
where L is the distance between the unmanned aerial vehicle and the target, and α, β, γ are the angles of point P relative to the X, Z and Y axes of point O respectively;
S05) acquiring the unmanned aerial vehicle position, i.e. the position coordinates of point O: the longitude, latitude and height of the vehicle in the navigation coordinate system are obtained through the airborne RTK satellite navigation positioning receiver; by means of the coordinate transformation matrix $C_b^R$, the true coordinate point of the target in the navigation coordinate system, that is, the longitude, latitude and altitude of the target, can be calculated.
In this embodiment, the calibration process of the pan-tilt camera and unmanned aerial vehicle attitudes is as follows:
S21) A quaternion is selected to determine the attitude angles of the unmanned aerial vehicle. The quaternion differential equation is shown in formula (1):

$$\dot{Q}(t) = \frac{1}{2}\,\Omega(\omega)\,Q(t), \qquad Q(t_0) = Q_0 \tag{1}$$

where $\dot{Q}$ and $\omega$ denote the time derivative of the quaternion and the angular velocity respectively; $Q = q_0 + q_1 i + q_2 j + q_3 k$, with $q_0$ the real part, $q_1$, $q_2$, $q_3$ the imaginary parts and $i$, $j$, $k$ the imaginary units; $t_0$ is the initial moment of the movement and $Q_0$ is the initial value of the quaternion. The initial value of the angular velocity is obtained from the gyroscope, and its matrix form is

$$\Omega(\omega) = \begin{bmatrix} 0 & -\omega_1 & -\omega_2 & -\omega_3 \\ \omega_1 & 0 & \omega_3 & -\omega_2 \\ \omega_2 & -\omega_3 & 0 & \omega_1 \\ \omega_3 & \omega_2 & -\omega_1 & 0 \end{bmatrix} \tag{2}$$

where $T_s$ denotes the sampling interval of the differentiation, $\omega_1$, $\omega_2$, $\omega_3$ are the angular velocities about the X, Y and Z axes in space, and $\Omega(\omega)$ is the matrix-valued function of the angular velocity;
S22) Assuming the angular velocity is constant within one sampling period, the discrete-domain quaternion formula is obtained by discretizing formula (2):

$$Q_{k+1} = \left(I + \frac{1}{2}\,\Omega(\omega T_s)\right) Q_k \tag{3}$$

where $Q_k$ and $Q_{k+1}$ are the quaternions at the k-th and (k+1)-th sampling instants, $I$ is the identity matrix, and $\Omega(\omega T_s)$ denotes $\Omega(\cdot)$ evaluated at the angular increment $\omega T_s$.
The coordinate transformation matrix from the b system to the R system determined by the quaternion is as follows, where b is the carrier (pan-tilt) coordinate system and R is the reference coordinate system:

$$C_b^R = \begin{bmatrix} q_0^2 + q_1^2 - q_2^2 - q_3^2 & 2(q_1 q_2 - q_0 q_3) & 2(q_1 q_3 + q_0 q_2) \\ 2(q_1 q_2 + q_0 q_3) & q_0^2 - q_1^2 + q_2^2 - q_3^2 & 2(q_2 q_3 - q_0 q_1) \\ 2(q_1 q_3 - q_0 q_2) & 2(q_2 q_3 + q_0 q_1) & q_0^2 - q_1^2 - q_2^2 + q_3^2 \end{bmatrix}$$
When the reference coordinate system R is the navigation coordinate system, the coordinate transformation matrices corresponding to the three basic rotations are:

$$C_x(R) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos R & \sin R \\ 0 & -\sin R & \cos R \end{bmatrix}, \quad C_y(P) = \begin{bmatrix} \cos P & 0 & -\sin P \\ 0 & 1 & 0 \\ \sin P & 0 & \cos P \end{bmatrix}, \quad C_z(Y) = \begin{bmatrix} \cos Y & \sin Y & 0 \\ -\sin Y & \cos Y & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
With the roll angle R, the pitch angle P and the yaw angle Y, the attitude angles are obtained from the quaternion as

$$R = \arctan\frac{2(q_0 q_1 + q_2 q_3)}{1 - 2(q_1^2 + q_2^2)}, \qquad P = \arcsin\!\big(2(q_0 q_2 - q_1 q_3)\big), \qquad Y = \arctan\frac{2(q_0 q_3 + q_1 q_2)}{1 - 2(q_2^2 + q_3^2)}$$

These angles are taken as the rotation angles of the camera about the three directions, and the rotational offset of the camera is obtained by differencing them with the initial gimbal angles.
The invention relates to a device and method for real-time fire-source target detection and positioning based on an unmanned aerial vehicle carrying a three-axis pan-tilt camera in combination with TinyML. In the proposed scheme, the unmanned aerial vehicle combined with the three-axis pan-tilt captures the fire-source target; to recognize the fire source, TinyML is deployed on the ARM airborne embedded application platform. Once the fire source is recognized, the pan-tilt camera keeps the target locked at the center of the field of view and moves with it; the distance to the fire source is obtained with the laser ranging sensor, the relative position of the fire-source target and the unmanned aerial vehicle is calculated through mathematical derivation, and this relative position is converted into the navigation coordinate system, yielding the real geographic position of the monitored fire source. The invention is not limited to fire-source detection and is also applicable to positioning other targets.
The foregoing description is only for the basic principle and the preferred embodiments of the present invention, and modifications and substitutions by those skilled in the art are included in the scope of the present invention.

Claims (4)

1. A target detection and positioning method based on an unmanned aerial vehicle carrying a three-axis pan-tilt camera, characterized by comprising the following steps:
s01) fixing a three-axis pan-tilt at the bottom of the unmanned aerial vehicle, and fixing a sensor at a rod arm below the three-axis pan-tilt, wherein the sensor comprises a vision sensor and a laser ranging sensor;
S02) calibrating the attitudes of the three-axis pan-tilt camera and the unmanned aerial vehicle, the attitudes being resolved through the IMU sensors carried by each;
S03) deploying TinyML to the airborne application platform, capturing video information in real time through the vision sensor, and feeding the video stream into the deployed TinyML detection model for detection; after a target appears in the field of view, acquiring its position in the image: the recognized target is enclosed in a rectangular frame, the diagonal coordinates of the frame are extracted, the center coordinates of the frame are calculated through the midpoint coordinate formula, and the offset from the center of the field of view is calculated to obtain the offset coordinates (Δx, Δy); the movement of the unmanned aerial vehicle is corrected through (Δx, Δy) so that the target always stays at the center point; the laser range finder then starts working, and the distance L between the unmanned aerial vehicle and the target is obtained by reading its data;
S04) obtaining the distance L and the relative angle between the unmanned aerial vehicle and the target through steps S02) and S03), establishing a coordinate system with the vision sensor as the origin, and obtaining the position coordinates of point P relative to point O through the following calculation, where point P is the target position and point O is the camera position:
Z′ = L·cosβ,
X′ = L·sinβ·sinα,
Y′ = L·sinβ·cosγ,
where L is the distance between the unmanned aerial vehicle and the target, and α, β, γ are the angles of point P relative to the X, Z and Y axes of point O respectively;
S05) acquiring the unmanned aerial vehicle position, i.e. the position coordinates of point O: the longitude, latitude and height of the vehicle in the navigation coordinate system are obtained through the airborne RTK satellite navigation positioning receiver; by means of the coordinate transformation matrix $C_b^R$, the real coordinate point of the target in the navigation coordinate system, that is, the longitude, latitude and altitude of the target, can be calculated.
2. The target detection and positioning method based on an unmanned aerial vehicle carrying a three-axis pan-tilt camera according to claim 1, characterized in that the calibration process of the pan-tilt camera and unmanned aerial vehicle attitudes is as follows:
S21) A quaternion is selected to determine the attitude angles of the unmanned aerial vehicle. The quaternion differential equation is shown in formula (1):

$$\dot{Q}(t) = \frac{1}{2}\,\Omega(\omega)\,Q(t), \qquad Q(t_0) = Q_0 \tag{1}$$

where $\dot{Q}$ and $\omega$ denote the time derivative of the quaternion and the angular velocity respectively; $Q = q_0 + q_1 i + q_2 j + q_3 k$, with $q_0$ the real part, $q_1$, $q_2$, $q_3$ the imaginary parts and $i$, $j$, $k$ the imaginary units; $t_0$ is the initial moment of the movement and $Q_0$ is the initial value of the quaternion. The initial value of the angular velocity is obtained from the gyroscope, and its matrix form is

$$\Omega(\omega) = \begin{bmatrix} 0 & -\omega_1 & -\omega_2 & -\omega_3 \\ \omega_1 & 0 & \omega_3 & -\omega_2 \\ \omega_2 & -\omega_3 & 0 & \omega_1 \\ \omega_3 & \omega_2 & -\omega_1 & 0 \end{bmatrix} \tag{2}$$

where $T_s$ denotes the sampling interval of the differentiation, $\omega_1$, $\omega_2$, $\omega_3$ are the angular velocities about the X, Y and Z axes in space, and $\Omega(\omega)$ is the matrix-valued function of the angular velocity;
S22) Assuming the angular velocity is constant within one sampling period, the discrete-domain quaternion formula is obtained by discretizing formula (2):

$$Q_{k+1} = \left(I + \frac{1}{2}\,\Omega(\omega T_s)\right) Q_k \tag{3}$$

where $Q_k$ and $Q_{k+1}$ are the quaternions at the k-th and (k+1)-th sampling instants, $I$ is the identity matrix, and $\Omega(\omega T_s)$ denotes $\Omega(\cdot)$ evaluated at the angular increment $\omega T_s$.
The coordinate transformation matrix from the b system to the R system determined by the quaternion is as follows, where b is the carrier coordinate system and R is the navigation coordinate system:

$$C_b^R = \begin{bmatrix} q_0^2 + q_1^2 - q_2^2 - q_3^2 & 2(q_1 q_2 - q_0 q_3) & 2(q_1 q_3 + q_0 q_2) \\ 2(q_1 q_2 + q_0 q_3) & q_0^2 - q_1^2 + q_2^2 - q_3^2 & 2(q_2 q_3 - q_0 q_1) \\ 2(q_1 q_3 - q_0 q_2) & 2(q_2 q_3 + q_0 q_1) & q_0^2 - q_1^2 - q_2^2 + q_3^2 \end{bmatrix}$$
When the reference coordinate system R is the navigation coordinate system, the coordinate transformation matrices corresponding to the three basic rotations are:

$$C_x(R) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos R & \sin R \\ 0 & -\sin R & \cos R \end{bmatrix}, \quad C_y(P) = \begin{bmatrix} \cos P & 0 & -\sin P \\ 0 & 1 & 0 \\ \sin P & 0 & \cos P \end{bmatrix}, \quad C_z(Y) = \begin{bmatrix} \cos Y & \sin Y & 0 \\ -\sin Y & \cos Y & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
With the roll angle R, the pitch angle P and the yaw angle Y, the attitude angles are obtained from the quaternion as

$$R = \arctan\frac{2(q_0 q_1 + q_2 q_3)}{1 - 2(q_1^2 + q_2^2)}, \qquad P = \arcsin\!\big(2(q_0 q_2 - q_1 q_3)\big), \qquad Y = \arctan\frac{2(q_0 q_3 + q_1 q_2)}{1 - 2(q_2^2 + q_3^2)}$$

These angles are taken as the rotation angles of the camera about the three directions, and the rotational offset of the camera is obtained by differencing them with the initial gimbal angles.
3. The target detection and positioning method based on an unmanned aerial vehicle carrying a three-axis pan-tilt camera according to claim 1, characterized in that the method is used for detecting and positioning a fire source.
4. The target detection and positioning method based on an unmanned aerial vehicle carrying a three-axis pan-tilt camera according to claim 1, characterized in that in step S03) the TinyML detection model performs enhancement, denoising, segmentation and extraction on the video data stream before capturing the target.
CN202210733457.8A 2022-06-24 2022-06-24 Target detection positioning method based on unmanned aerial vehicle carrying three-axis pan-tilt camera Pending CN115144867A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210733457.8A CN115144867A (en) 2022-06-24 2022-06-24 Target detection positioning method based on unmanned aerial vehicle carrying three-axis pan-tilt camera


Publications (1)

Publication Number Publication Date
CN115144867A 2022-10-04

Family

ID=83407324

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210733457.8A Pending CN115144867A (en) 2022-06-24 2022-06-24 Target detection positioning method based on unmanned aerial vehicle carrying three-axis pan-tilt camera

Country Status (1)

Country Link
CN (1) CN115144867A (en)



Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107247458A (en) * 2017-05-24 2017-10-13 中国电子科技集团公司第二十八研究所 UAV Video image object alignment system, localization method and cloud platform control method
CN109753076A (en) * 2017-11-03 2019-05-14 南京奇蛙智能科技有限公司 A kind of unmanned plane vision tracing implementing method
CN108680143A (en) * 2018-04-27 2018-10-19 南京拓威航空科技有限公司 Object localization method, device based on long-distance ranging and unmanned plane
CN109084766A (en) * 2018-08-28 2018-12-25 桂林电子科技大学 A kind of interior unmanned plane positioning system and method
CN111596693A (en) * 2020-06-17 2020-08-28 中国人民解放军国防科技大学 Ground target tracking control method and system of unmanned aerial vehicle based on pan-tilt camera
CN111966133A (en) * 2020-08-29 2020-11-20 山东翔迈智能科技有限公司 Visual servo control system of holder
CN113850126A (en) * 2021-08-20 2021-12-28 武汉卓目科技有限公司 Target detection and three-dimensional positioning method and system based on unmanned aerial vehicle

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
宋宇; 翁新武; 郭昕刚: "Attitude estimation for small unmanned aerial vehicles based on a quaternion EKF algorithm", Journal of Jilin University (Science Edition), no. 03, 26 May 2015 (2015-05-26) *
车玉涵; 刘富; 康冰: "Quadrotor unmanned aerial vehicle tracking control system based on a pan-tilt camera", Journal of Jilin University (Information Science Edition), no. 03, 15 May 2019 (2019-05-15) *
马力; 李天松; 阳荣凯; 黄艳虎: "Unmanned aerial vehicle attitude algorithm based on enhanced explicit complementary filtering", Journal of Guilin University of Electronic Technology, no. 05, 25 October 2019 (2019-10-25) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116840243A (en) * 2023-09-01 2023-10-03 湖南睿图智能科技有限公司 Correction method and system for machine vision object recognition
CN116840243B (en) * 2023-09-01 2023-11-28 湖南睿图智能科技有限公司 Correction method and system for machine vision object recognition

Similar Documents

Publication Publication Date Title
US10515458B1 (en) Image-matching navigation method and apparatus for aerial vehicles
CN109887057B (en) Method and device for generating high-precision map
US20190385339A1 (en) Sensor fusion using inertial and image sensors
Panahandeh et al. Vision-aided inertial navigation based on ground plane feature detection
WO2016187757A1 (en) Sensor fusion using inertial and image sensors
WO2016187759A1 (en) Sensor fusion using inertial and image sensors
WO2016187758A1 (en) Sensor fusion using inertial and image sensors
CN101598556A (en) Unmanned plane vision/inertia integrated navigation method under a kind of circumstances not known
Leira et al. A light-weight thermal camera payload with georeferencing capabilities for small fixed-wing UAVs
KR102239562B1 (en) Fusion system between airborne and terrestrial observation data
CN111426320A (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
CN109341686B (en) Aircraft landing pose estimation method based on visual-inertial tight coupling
KR102075028B1 (en) Unmanned High-speed Flying Precision Position Image Acquisition Device and Accurate Position Acquisition Method Using the same
CN108253966A (en) Unmanned plane during flying three-dimensional simulation display methods
Hirose et al. Implementation of UAV localization methods for a mobile post-earthquake monitoring system
CN108444468B (en) Directional compass integrating downward vision and inertial navigation information
CN108613675B (en) Low-cost unmanned aerial vehicle movement measurement method and system
Lo et al. The direct georeferencing application and performance analysis of UAV helicopter in GCP-free area
CN111247389B (en) Data processing method and device for shooting equipment and image processing equipment
CN115144867A (en) Target detection positioning method based on unmanned aerial vehicle carrying three-axis pan-tilt camera
Wang et al. Monocular vision and IMU based navigation for a small unmanned helicopter
Hosseinpoor et al. Pricise target geolocation based on integeration of thermal video imagery and rtk GPS in UAVS
CN113375665B (en) Unmanned aerial vehicle pose estimation method based on multi-sensor elastic coupling
CN111402324A (en) Target measuring method, electronic equipment and computer storage medium
CN109341685B (en) Fixed wing aircraft vision auxiliary landing navigation method based on homography transformation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination