CN118112496A - Positioning method and device for unmanned aerial vehicle, equipment and storage medium - Google Patents


Info

Publication number
CN118112496A
CN118112496A (application CN202410228932.5A)
Authority
CN
China
Prior art keywords: unmanned aerial vehicle, module, positioning, moving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410228932.5A
Other languages
Chinese (zh)
Inventor
肖志昂
杨纪刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Jinwo Technology Co ltd
Original Assignee
Shenzhen Jinwo Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Jinwo Technology Co ltd
Priority to CN202410228932.5A
Publication of CN118112496A
Legal status: Pending


Classifications

    • G01S 5/0246 — Position-fixing by co-ordinating two or more direction or position-line determinations or distance determinations, using radio waves, involving frequency difference of arrival or Doppler measurements
    • G01C 21/00 — Navigation; navigational instruments not provided for in groups G01C 1/00–G01C 19/00
    • G01C 21/005 — Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C 21/16 — Navigation by integrating acceleration or speed, executed aboard the object being navigated, i.e. inertial navigation (dead reckoning)
    • G01S 1/02 — Beacons or beacon systems transmitting signals capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters, using radio waves
    • G01S 19/42 — Satellite radio beacon positioning systems transmitting time-stamped messages (e.g. GPS, GLONASS, GALILEO); determining position
    • G01S 5/02 — Position-fixing by co-ordinating two or more direction or position-line determinations or distance determinations, using radio waves

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application relates to the technical field of positioning and discloses a positioning method for an unmanned aerial vehicle. The distance between the unmanned aerial vehicle and a positioning tag representing the position of a person awaiting search and rescue is measured based on the moving track, position information, and movement information gathered while the unmanned aerial vehicle flies, and the person is then located from that distance. Because measuring the distance between the unmanned aerial vehicle and the positioning tag only requires time synchronization between the unmanned aerial vehicle and the tag, not between unmanned aerial vehicles, the positioning accuracy can be improved. The application also discloses a positioning device and equipment for the unmanned aerial vehicle, and a storage medium.

Description

Positioning method and device for unmanned aerial vehicle, equipment and storage medium
Technical Field
The present application relates to the field of positioning technologies, and for example, to a positioning method and apparatus for an unmanned aerial vehicle, a device, and a storage medium.
Background
Currently, disasters such as fires and earthquakes may occur in inhabited buildings. When a disaster occurs in a building, the people trapped inside need to be searched for and rescued, and during the search-and-rescue operation those people must be located.
In the related art, people trapped in a building are typically located by an unmanned aerial vehicle formation. Specifically, the formation locates people in the building using a TDOA (Time Difference of Arrival) positioning method.
In the process of implementing the embodiments of the present disclosure, it was found that the related art has at least the following problems:
The TDOA positioning method requires time synchronization between the unmanned aerial vehicles in the formation. However, when the formation uses TDOA to locate people in a building, the unmanned aerial vehicles are easily occluded from one another by the building, so the time-synchronization information between them is lost. This causes abnormal positioning, and the positioning accuracy is hard to guarantee.
It should be noted that the information disclosed in the background section above is only for enhancing the understanding of the background of the application and therefore may include information that does not constitute prior art already known to a person of ordinary skill in the art.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, and is intended to neither identify key/critical elements nor delineate the scope of such embodiments, but is intended as a prelude to the more detailed description that follows.
The embodiments of the disclosure provide a positioning method, device, and equipment for an unmanned aerial vehicle and a storage medium, which can improve positioning accuracy when the unmanned aerial vehicle is used to locate a person awaiting search and rescue in a building.
In a first aspect, an embodiment of the present disclosure provides a positioning method for an unmanned aerial vehicle, where the unmanned aerial vehicle includes a flight control module and an air-to-ground data link module, and further includes a distance measurement module, an inertial navigation module, a position determining module, an image acquisition module, and a positioning module. The method comprises the following steps:
the inertial navigation module calculates the moving track of the unmanned aerial vehicle from movement data gathered while the unmanned aerial vehicle moves;
the position determining module calculates position information while the unmanned aerial vehicle moves;
the positioning module acquires images captured during the movement from the image acquisition module and calculates movement information of the unmanned aerial vehicle from the images;
the distance measurement module measures the distance between the unmanned aerial vehicle and the positioning tag according to the moving track, the position information, and the movement information; the positioning tag is a position marker of a person awaiting search and rescue;
and the positioning module performs a positioning calculation according to the distance between the unmanned aerial vehicle and the positioning tag to determine the position of the person awaiting search and rescue.
Optionally, the inertial navigation module calculating the moving track of the unmanned aerial vehicle from movement data gathered while the unmanned aerial vehicle moves includes: while the unmanned aerial vehicle is not moving, the inertial navigation module determines the initial attitude and initial velocity of the unmanned aerial vehicle from its initial movement data; the inertial navigation module then determines the moving track of the unmanned aerial vehicle from the initial attitude, the initial velocity, and the movement data gathered while the unmanned aerial vehicle moves.
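As an illustrative sketch (not the patent's implementation), the track determination above can be reduced to dead reckoning: starting from the initial velocity and position obtained while the vehicle is at rest, navigation-frame accelerations are integrated once for velocity and again for position. The function name and fixed sampling interval `dt` are assumptions for illustration.

```python
import numpy as np

def dead_reckon(accel_nav, v0, p0, dt):
    """Integrate navigation-frame accelerations sampled at interval dt:
    once for velocity, again for position, starting from the initial
    velocity v0 and position p0 determined while stationary."""
    positions = [np.asarray(p0, dtype=float)]
    v = np.asarray(v0, dtype=float)
    p = np.asarray(p0, dtype=float)
    for a in np.asarray(accel_nav, dtype=float):
        v = v + a * dt   # velocity update
        p = p + v * dt   # position update
        positions.append(p.copy())
    return np.array(positions)
```

A real strapdown system would first rotate body-frame accelerations into the navigation frame using the attitude and subtract gravity before this integration step.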
Optionally, the position determining module calculates position information during the movement of the unmanned aerial vehicle, including: the position determining module obtains longitude and latitude information of the unmanned aerial vehicle, and calculates position information of the unmanned aerial vehicle according to the longitude and latitude information.
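A minimal sketch of turning the longitude and latitude information into local position information, assuming an equirectangular approximation around a reference point; the helper name and the mean Earth radius constant are illustrative, not from the patent, and the approximation is only adequate over the short distances flown around a single building.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres (illustrative constant)

def latlon_to_local(lat, lon, ref_lat, ref_lon):
    """Convert latitude/longitude in degrees into east/north offsets in
    metres from a reference point, using the equirectangular approximation."""
    east = math.radians(lon - ref_lon) * EARTH_RADIUS_M * math.cos(math.radians(ref_lat))
    north = math.radians(lat - ref_lat) * EARTH_RADIUS_M
    return east, north
```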
Optionally, the positioning module calculating the movement information of the unmanned aerial vehicle from the images includes: the positioning module acquires, from the image acquisition module, a first image corresponding to a first moment and a second image corresponding to a second moment, the first moment being earlier than the second moment; the positioning module extracts first feature points from the first image and second feature points from the second image; and the positioning module calculates the movement information of the unmanned aerial vehicle from the first feature points and the second feature points.
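The feature-point step can be illustrated with a deliberately simplified sketch: given already-matched feature points from the first and second images, the apparent motion between the two moments is estimated as the mean displacement of the matches. A real visual-odometry pipeline would robustly match descriptors and estimate an essential matrix or homography; everything here is an assumed simplification.

```python
import numpy as np

def estimate_translation(pts_first, pts_second):
    """Estimate the apparent image-plane translation between two frames as
    the mean displacement of matched feature points. pts_first[i] and
    pts_second[i] are assumed to be the same physical feature at the first
    and second moments."""
    pts_first = np.asarray(pts_first, dtype=float)
    pts_second = np.asarray(pts_second, dtype=float)
    return (pts_second - pts_first).mean(axis=0)
```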
Optionally, the distance measurement module measuring the distance between the unmanned aerial vehicle and the positioning tag according to the moving track, the position information, and the movement information includes: the distance measurement module determines the real-time position of the unmanned aerial vehicle from the moving track, the position information, and the movement information; and the distance measurement module calculates the real-time distance between the unmanned aerial vehicle and the positioning tag from that real-time position.
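How the three inputs might combine into one real-time position can be sketched as a weighted average; the weights and function name are hypothetical placeholders for the estimator (for example, a Kalman filter or the factor-graph optimization suggested by the drawings) that a production system would use.

```python
import numpy as np

def fuse_position(p_inertial, p_gnss, p_visual, weights=(0.3, 0.4, 0.3)):
    """Combine the position implied by the inertial moving track, the
    position-determination module, and the visual movement information into
    one real-time position via a weighted average. Weights are illustrative."""
    estimates = np.array([p_inertial, p_gnss, p_visual], dtype=float)
    w = np.asarray(weights, dtype=float)
    return (w[:, None] * estimates).sum(axis=0) / w.sum()
```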
In a second aspect, embodiments of the present disclosure provide a positioning device for an unmanned aerial vehicle, the unmanned aerial vehicle including a flight control module and an air-to-ground data link module; the device is integrated in the unmanned aerial vehicle, which further includes a distance measurement module, an inertial navigation module, a position determining module, an image acquisition module, and a positioning module, wherein:
the inertial navigation module is used for calculating the movement track of the unmanned aerial vehicle according to the movement data in the movement process of the unmanned aerial vehicle;
the position determining module is used for calculating position information in the moving process of the unmanned aerial vehicle;
the positioning module is used for acquiring images in the moving process of the unmanned aerial vehicle from the image acquisition module and calculating the moving information of the unmanned aerial vehicle according to the images;
the distance measurement module is configured to measure the distance between the unmanned aerial vehicle and the positioning tag according to the moving track, the position information, and the movement information gathered while the unmanned aerial vehicle moves; the positioning tag is a position marker of a person awaiting search and rescue;
the positioning module is further configured to perform a positioning calculation according to the distance between the unmanned aerial vehicle and the positioning tag to determine the position of the person awaiting search and rescue.
Optionally, the distance measurement module is specifically configured to: determining the real-time position of the unmanned aerial vehicle according to the moving track, the position information and the moving information in the moving process of the unmanned aerial vehicle; and calculating the real-time distance between the real-time position of the unmanned aerial vehicle and the positioning tag according to the real-time position of the unmanned aerial vehicle.
In a third aspect, an embodiment of the disclosure provides an electronic device, including a processor and a memory storing program instructions, where the processor is configured to execute the positioning method for a drone according to the first aspect when the program instructions are executed.
In a fourth aspect, an embodiment of the present disclosure provides an unmanned aerial vehicle including a flight control module and an air-to-ground data link module, and further including a distance measurement module, an inertial navigation module, a position determining module, an image acquisition module, and a positioning module;
the inertial navigation module is used for calculating the movement track of the unmanned aerial vehicle according to the movement data in the movement process of the unmanned aerial vehicle;
the position determining module is used for calculating position information in the moving process of the unmanned aerial vehicle;
the positioning module is used for acquiring images in the moving process of the unmanned aerial vehicle from the image acquisition module and calculating the moving information of the unmanned aerial vehicle according to the images;
the distance measurement module is configured to measure the distance between the unmanned aerial vehicle and the positioning tag according to the moving track, the position information, and the movement information gathered while the unmanned aerial vehicle moves; the positioning tag is a position marker of a person awaiting search and rescue;
the positioning module is further configured to perform a positioning calculation according to the distance between the unmanned aerial vehicle and the positioning tag to determine the position of the person awaiting search and rescue.
In a fifth aspect, an embodiment of the disclosure provides a storage medium storing program instructions, where the program instructions, when executed, perform the positioning method for a drone according to the first aspect.
The positioning method, the device, the equipment and the storage medium for the unmanned aerial vehicle provided by the embodiment of the disclosure can realize the following technical effects:
When an unmanned aerial vehicle is used to locate people awaiting search and rescue in a building, the unmanned aerial vehicle can be controlled to fly around the building. During this flight, the inertial navigation module calculates the moving track of the unmanned aerial vehicle from its movement data, the position determining module calculates the position information of the unmanned aerial vehicle, and the positioning module calculates the movement information of the unmanned aerial vehicle from the images captured by the image acquisition module. The distance measurement module can then measure the distance between the unmanned aerial vehicle and the positioning tag that represents the position of the person awaiting search and rescue, based on the moving track, the position information, and the movement information, and the person is located from the measured distance. In the embodiments of the disclosure, measuring the distance between the unmanned aerial vehicle and the positioning tag only requires time synchronization between the unmanned aerial vehicle and the tag; no time synchronization between unmanned aerial vehicles is needed, so there is no situation in which time synchronization between unmanned aerial vehicles is lost through building occlusion. Therefore, the positioning accuracy can be improved when an unmanned aerial vehicle is used to locate people awaiting search and rescue in a building.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
One or more embodiments are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which like reference numerals refer to similar elements, and in which:
Fig. 1 is a schematic view of a drone provided by an embodiment of the present disclosure;
Fig. 2 is a schematic diagram of a positioning method for a drone provided by an embodiment of the present disclosure;
Fig. 3 is a schematic diagram of another positioning method for a drone provided by an embodiment of the present disclosure;
Fig. 4 is a schematic diagram of a Bayesian network corresponding to another positioning method for a drone provided by an embodiment of the present disclosure;
Fig. 5 is a factor graph corresponding to a Bayesian network corresponding to another positioning method for a drone provided by an embodiment of the present disclosure;
Fig. 6 is a schematic diagram of another Bayesian network corresponding to another positioning method for a drone provided by an embodiment of the present disclosure;
Fig. 7 is a schematic view of a positioning device for a drone provided by an embodiment of the present disclosure;
Fig. 8 is a schematic diagram of an electronic device provided by an embodiment of the disclosure.
Detailed Description
So that the manner in which the features and techniques of the disclosed embodiments can be understood in more detail, a more particular description of the embodiments of the disclosure may be had by reference to the appended drawings, which are not intended to limit the embodiments of the disclosure. In the following description of the technology, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, one or more embodiments may still be practiced without these details. In other instances, well-known structures and devices may be shown in simplified form in order to simplify the drawings.
The terms first, second and the like in the description and in the claims of the embodiments of the disclosure and in the above-described figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate in order to describe embodiments of the present disclosure. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion.
The term "plurality" means two or more, unless otherwise specified.
In the embodiments of the present disclosure, the character "/" indicates that the objects before and after it are in an "or" relationship. For example, A/B represents: A or B.
The term "and/or" describes an association between objects and indicates that three relationships may exist. For example, "A and/or B" represents: A, or B, or both A and B.
The term "corresponding" may refer to an association or binding relationship; "A corresponds to B" means that there is an association or binding relationship between A and B.
An unmanned aerial vehicle, or "drone" for short, is an unmanned aircraft that is maneuvered using a radio remote-control device and a self-contained program-control device, or that is operated autonomously, either entirely or intermittently, by an on-board computer.
Currently, disasters such as fires and earthquakes may occur in inhabited buildings. When a disaster occurs in a building, the people trapped inside need to be searched for and rescued, and during the search-and-rescue operation those people must be located.
In the related art, people trapped in a building are typically located by an unmanned aerial vehicle formation using a TDOA positioning method, which requires time synchronization between the unmanned aerial vehicles in the formation. However, when the formation uses TDOA to locate people in a building, the unmanned aerial vehicles are easily occluded from one another by the building, so the time-synchronization information between them is lost. This causes abnormal positioning, and the positioning accuracy is hard to guarantee.
In view of this, the embodiments of the disclosure provide a positioning method, device, and equipment for an unmanned aerial vehicle and a storage medium, in which the unmanned aerial vehicle locates the people trapped in a building using the TOF (Time of Flight) ranging principle, reducing the coupling between unmanned aerial vehicles. While the unmanned aerial vehicle locates the people awaiting search and rescue, measuring the distance between the unmanned aerial vehicle and the positioning tag only requires time synchronization between the unmanned aerial vehicle and the tag; no time synchronization between unmanned aerial vehicles is needed, so time-synchronization information cannot be lost through building occlusion. Therefore, the positioning accuracy can be improved when an unmanned aerial vehicle is used to locate people awaiting search and rescue in a building.
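The TOF ranging relationship underlying this design can be sketched directly: with the tag's transmit timestamp and the drone's receive timestamp taken on clocks synchronized between the drone and the tag, the distance is the one-way flight time multiplied by the speed of light. The helper name is illustrative, not from the patent.

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(t_send, t_receive):
    """One-way TOF ranging: the tag timestamps its transmission and the
    drone timestamps the reception; with the two clocks synchronized, the
    distance is the flight time multiplied by the speed of light. No clock
    on any other drone is involved."""
    return C * (t_receive - t_send)
```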
As shown in conjunction with fig. 1, an embodiment of the present disclosure provides a drone 10, the drone 10 including a flight control module 11, an air-to-ground data link module 12, a distance measurement module 13, an inertial navigation module 14, a position determination module 15, an image acquisition module 16, and a positioning module 17. Wherein:
The flight control module 11 is connected to the air-ground data link module 12, the distance measurement module 13, the inertial navigation module 14, the position determining module 15, the image acquisition module 16, and the positioning module 17, respectively, and controls the operation of each module. In addition, the flight control module 11 flies the vehicle according to control instructions sent by a user through a remote-control device.
The air-ground data link module 12 can sense the electromagnetic environment of the working area and dynamically adjust the working parameters of the communication system (including communication protocol, working frequency, modulation characteristics, network structure, etc.) in real time according to the environment and the communication requirements, in order to communicate reliably or save communication resources. The air-to-ground data link module 12 includes an uplink and a downlink. The uplink carries remote-control commands from the ground station to the drone, for example control commands sent by the user via the remote-control device. The downlink carries telemetry data and infrared or television images from the drone to the ground station, for example to feed back to the user the flight status (e.g., battery level, altitude) of the drone 10.
The distance measurement module 13 is a UWB (Ultra-Wideband) ranging module configured to measure the distance between the unmanned aerial vehicle 10 and a positioning tag worn by a person awaiting search and rescue; the positioning tag serves as the identifier of that person and characterizes the person's position.
The inertial navigation module 14 is an IMU (Inertial Measurement Unit) inertial navigation module including an acceleration sensor, an angular velocity sensor, a gyroscope, a geomagnetic sensor, and a velocity sensor. The inertial navigation module 14 obtains movement data of the drone 10 (e.g., acceleration) through these sensors.
The position determining module 15 is configured to determine an absolute position of the drone 10. The position determining module 15 may be a global navigation satellite system, a global positioning system, or a beidou satellite navigation system.
The image acquisition module 16 is configured to acquire an image during the flight of the unmanned aerial vehicle 10, and may be a camera. The positioning module 17 is configured to perform a calculation process in the process of positioning the person to be searched and rescued by the unmanned aerial vehicle 10.
As shown in fig. 2, an embodiment of the present disclosure provides a positioning method for a drone, and the method is applied to the drone of the above embodiment. The method comprises the following steps:
S21, the inertial navigation module calculates the movement track of the unmanned aerial vehicle according to movement data in the movement process of the unmanned aerial vehicle.
S22, the position determining module calculates position information in the moving process of the unmanned aerial vehicle.
S23, the positioning module acquires images in the moving process of the unmanned aerial vehicle from the image acquisition module, and calculates moving information of the unmanned aerial vehicle according to the images.
S24, the distance measurement module measures the distance between the unmanned aerial vehicle and the positioning tag according to the moving track, the position information, and the movement information gathered while the unmanned aerial vehicle moves; the positioning tag is a position marker of a person awaiting search and rescue.
S25, the positioning module performs a positioning calculation according to the distance between the unmanned aerial vehicle and the positioning tag to determine the position of the person awaiting search and rescue.
When an unmanned aerial vehicle is used to locate people awaiting search and rescue in a building, the unmanned aerial vehicle can be controlled to fly around the building. During this flight, the inertial navigation module calculates the moving track of the unmanned aerial vehicle from its movement data, the position determining module calculates the position information of the unmanned aerial vehicle, and the positioning module calculates the movement information of the unmanned aerial vehicle from the images captured by the image acquisition module. The distance measurement module measures the distance between the unmanned aerial vehicle and the positioning tag that represents the position of the person awaiting search and rescue, based on the moving track, the position information, and the movement information, and the positioning module then locates that person from the measured distance. With the positioning method for an unmanned aerial vehicle provided by the embodiments of the disclosure, measuring the distance between the unmanned aerial vehicle and the positioning tag only requires time synchronization between the unmanned aerial vehicle and the tag; no time synchronization between unmanned aerial vehicles is needed, so there is no situation in which time synchronization between unmanned aerial vehicles is lost through building occlusion. Therefore, the positioning accuracy can be improved when an unmanned aerial vehicle is used to locate people awaiting search and rescue in a building.
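The final positioning calculation in S25 can be sketched as multilateration: given the drone's known positions along its flight and the measured distances to the tag at those positions, squaring and subtracting the range equations yields a linear least-squares problem. This is a generic reconstruction under stated assumptions, not the patent's specific algorithm.

```python
import numpy as np

def locate_tag(drone_positions, distances):
    """Solve for the tag position x from ||x - p_i|| = d_i: subtracting the
    first squared equation from the others linearises the system, i.e.
    2 (p_i - p_0) . x = d_0^2 - d_i^2 + |p_i|^2 - |p_0|^2, solved here in
    the least-squares sense."""
    p = np.asarray(drone_positions, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (p[1:] - p[0])
    b = (d[0] ** 2 - d[1:] ** 2) + (p[1:] ** 2).sum(axis=1) - (p[0] ** 2).sum()
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

At least four non-coplanar measurement positions are needed for a unique 3D solution; a single drone collects them at different points along its flight, which is why no inter-drone synchronization is required.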
Further, the step in which the inertial navigation module calculates the movement track of the unmanned aerial vehicle from movement data during its movement comprises: when the unmanned aerial vehicle is not moving, the inertial navigation module determines the initial attitude and initial speed of the unmanned aerial vehicle from its initial movement data; the inertial navigation module then determines the movement track of the unmanned aerial vehicle from the initial attitude and the initial speed, together with the movement data acquired during movement.
In this embodiment, before the inertial navigation module determines the movement track of the unmanned aerial vehicle, the attitude of the unmanned aerial vehicle needs to be determined. When the unmanned aerial vehicle is in a static state, the direction of gravity can be determined from the acceleration data collected by the acceleration sensor, and the initial attitude of the unmanned aerial vehicle can then be calculated, specifically as follows:
When the unmanned aerial vehicle is in a static state, the acceleration data acquired by the acceleration sensor, expressed in the navigation coordinate system n, are represented by the following formula (1):

a^n = [0, 0, g]^T (1);

In formula (1), g is the gravitational acceleration. The acceleration data expressed in the body coordinate system b are the following formula (2):

a^b = [a_x, a_y, a_z]^T (2);

Bringing a^n and a^b into the following formula (3):

a^b = (C_b^n)^T * a^n (3);

In formula (3), C_b^n is the conversion from the b coordinate system to the n coordinate system. Because the rotation matrix is an orthogonal matrix, i.e. (C_b^n)^(-1) = (C_b^n)^T, the following formula (4) is derived:

[a_x, a_y, a_z]^T = g*[-sinθ, cosθ*sinφ, cosθ*cosφ]^T (4);

In formula (4), θ is the pitch angle expressed in Euler angles and φ is the roll angle expressed in Euler angles. Solving yields the following formulas (5) and (6):

θ = atan2(-a_x, sqrt(a_y^2 + a_z^2)) (5);

φ = atan2(a_y, a_z) (6);

The yaw angle ψ is calculated from geomagnetism, i.e. the following formula (7):

ψ = atan2(-H_y, H_x) (7);

In formula (7), H_x and H_y are the components, along the x-axis and y-axis of the sensor, of the vector pointing to magnetic north, acquired by the geomagnetic sensor.
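As an illustration of formulas (5) to (7), the initial attitude can be computed from one static accelerometer sample and the horizontal geomagnetic components. This is a minimal sketch assuming a z-up body frame and the ZYX Euler convention; the function name and axis conventions are illustrative, not prescribed by the patent:

```python
import math

def initial_attitude(ax, ay, az, hx, hy):
    """Roll/pitch from a static accelerometer reading and yaw from the
    horizontal geomagnetic components, all in radians. Assumes the
    accelerometer reads approximately +g along body z when at rest."""
    roll = math.atan2(ay, az)                    # formula (6) analogue
    pitch = math.atan2(-ax, math.hypot(ay, az))  # formula (5) analogue
    yaw = math.atan2(-hy, hx)                    # formula (7) analogue
    return roll, pitch, yaw

# Level, stationary vehicle pointing toward magnetic north:
r, p, y = initial_attitude(0.0, 0.0, 9.81, 1.0, 0.0)
# all three angles are ~0 in this configuration
```

In practice one would average many samples while the vehicle is confirmed static, since a single accelerometer reading is noisy.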
So far, the determination of the initial attitude of the unmanned aerial vehicle is complete. The inertial navigation module can further calculate the movement track of the unmanned aerial vehicle from the initial attitude of the unmanned aerial vehicle, the initial speed provided by the speed sensor, and the movement data acquired by each sensor. The measurement model (8) of each sensor in the inertial navigation module is as follows:

ω̃ = ω + b_g + η_g, ã = a + b_a + η_a (8);

In formula (8), b_g is the slowly time-varying zero-bias error of the gyroscope and η_g is the white noise of the gyroscope measurement; b_a is the slowly time-varying zero-bias error of the acceleration sensor and η_a is the white noise of the acceleration sensor measurement.
Then, according to Newton's laws, the differential equations of the motion model of the unmanned aerial vehicle are given by formula (9):

Ṙ = R*ω^∧, V̇ = a, Ṗ = V (9);

In formula (9), ω^∧ is the Lie-algebra (skew-symmetric) form of the rotation vector. Discretizing formula (9) yields the following formula (10):

R_(k+1) = R_k*Exp(ω_k*Δt), V_(k+1) = V_k + a_k*Δt, P_(k+1) = P_k + V_k*Δt + (1/2)*a_k*Δt^2 (10);

In order to avoid repeated integration caused by changes of the attitude and position of the previous frame, the following pre-integration formula (11) is employed in this embodiment:

ΔV_ik = Σ_(j=i..k-1) ΔR_ij*(ã_j - b_a^j - η_a^j)*Δt (11);

In formula (11), ΔV_ik is the speed increment of the unmanned aerial vehicle from time i to time k, Δt is the sampling interval, ΔR_ik is the updated gyroscope rotation matrix from time i to time k, ã_k is the acceleration sensor measurement at time k, b_a^k is the zero-bias error of the acceleration sensor at time k, and η_g^k, η_a^k are the sampling noise of the gyroscope and the acceleration sensor at time k, respectively.
Further, the position determining module calculates position information in the moving process of the unmanned aerial vehicle, including: the position determining module acquires longitude and latitude information of the unmanned aerial vehicle, and calculates position information of the unmanned aerial vehicle according to the longitude and latitude information.
In this embodiment, the position determining module represents the longitude and latitude information of the unmanned aerial vehicle in a Mercator rectangular coordinate system as the following formulas (12) and (13):

x = r*(λ-λ0)*cos(γ) (12);

y = r*γ (13);

where x is the abscissa and y is the ordinate, λ is the longitude, λ0 is the reference longitude, γ is the latitude, and r is the Earth radius.
In this embodiment, the altitude of the unmanned aerial vehicle at the first acquisition is taken as altitude 0, so that the z-axis coordinate of the unmanned aerial vehicle in the Mercator rectangular coordinate system can be determined, and thereby the position information of the unmanned aerial vehicle can be determined.
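A rough sketch of the projection described above, assuming a spherical Earth of mean radius and a cos(latitude) east-scaling factor (an assumption for illustration; the patent does not give the exact projection constants):

```python
import math

EARTH_R = 6_371_000.0  # mean Earth radius in metres (assumed value)

def latlon_to_xy(lon_deg, lat_deg, lon0_deg):
    """Project longitude/latitude to a local rectangular frame in the
    spirit of formulas (12)-(13): x grows eastward from the reference
    longitude (scaled by cos(latitude)), y grows northward."""
    lam = math.radians(lon_deg)
    lam0 = math.radians(lon0_deg)
    gamma = math.radians(lat_deg)
    x = EARTH_R * (lam - lam0) * math.cos(gamma)
    y = EARTH_R * gamma
    return x, y

x, y = latlon_to_xy(114.001, 0.0, 114.0)
# 0.001 degrees of longitude on the equator is roughly 111 m east
```

Over the short ranges involved in positioning a person inside one building, this flat-Earth approximation introduces negligible error.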
Further, the step in which the positioning module calculates the movement information of the unmanned aerial vehicle from the images comprises: the positioning module acquires, from the image acquisition module, a first image corresponding to a first moment and a second image corresponding to a second moment, wherein the first moment is earlier than the second moment; the positioning module extracts first feature points in the first image and second feature points in the second image; and the positioning module calculates the movement information of the unmanned aerial vehicle from the first feature points and the second feature points.
In this embodiment, in order to further improve the accuracy of the attitude and position of the unmanned aerial vehicle and reduce the accumulated error of the pre-integration of the inertial navigation module, the positioning module first acquires the first image corresponding to the first moment from the image acquisition module and extracts its feature points with the ORB algorithm. Then, with no person moving in the scene, the second image corresponding to the second moment is acquired and its feature points are likewise extracted with the ORB algorithm. The feature points of the first image are then matched with those of the second image using a fast approximate nearest-neighbor matching method, and when matching succeeds, the normalized displacement and attitude transformation of the unmanned aerial vehicle between the two frames are determined from the matched feature points using epipolar geometry.
It should be noted that the normalized displacement has no true physical scale, and that epipolar geometry is used only for initialization and calibration. The first image is acquired while the unmanned aerial vehicle is controlled to stay relatively still; the unmanned aerial vehicle is then controlled to fly a certain distance before the second image is acquired, during which the inertial navigation module determines the displacement track of the unmanned aerial vehicle. The physical distance corresponding to one unit of normalized displacement is therefore given by the pre-integration result of the inertial navigation module. The matched feature points are then converted into three-dimensional coordinates in the navigation frame by triangulation. Afterwards, while the unmanned aerial vehicle moves, the 3D-2D PnP algorithm can determine the attitude transformation and displacement information of the unmanned aerial vehicle.
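The scale-initialization step described above, in which the inertial pre-integration fixes the metric scale of the normalized visual-odometry displacement, can be sketched as follows (names are illustrative, not from the patent):

```python
import numpy as np

def recover_scale(t_normalized, ins_displacement):
    """Rescale a unit-norm visual-odometry translation to metric units,
    using the INS pre-integrated displacement over the same interval.
    A sketch of the initialization step only, not the full
    epipolar-geometry pipeline."""
    s = np.linalg.norm(ins_displacement)  # true metric distance travelled
    t = np.asarray(t_normalized, float)
    return s * t / np.linalg.norm(t)

# VO direction [1, 0, 0] with unknown scale; INS says 2 m were travelled:
t_metric = recover_scale([1.0, 0.0, 0.0], [0.0, 2.0, 0.0])
# direction kept, length set to 2 m
```

Only the norm of the INS displacement is used here: the direction of motion comes from the camera, the scale from the inertial sensors.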
Further, the step in which the distance measurement module measures the distance between the unmanned aerial vehicle and the positioning tag during movement, according to the movement track, the position information and the movement information, comprises: the distance measurement module determines the real-time position of the unmanned aerial vehicle from the movement track, the position information and the movement information; and the distance measurement module calculates the real-time distance between the unmanned aerial vehicle and the positioning tag from the real-time position of the unmanned aerial vehicle.
In this embodiment, after the real-time position of the unmanned aerial vehicle is determined, the positioning module may transmit a pulse signal to the positioning tag at a third time t3; after receiving the pulse signal, the positioning tag feeds back a response signal to the positioning module of the unmanned aerial vehicle at a fourth time t4, and the positioning module of the unmanned aerial vehicle receives the response signal at a fifth time t5. The real-time distance between the unmanned aerial vehicle and the positioning tag can thus be calculated from the flight time of the signal between the positioning module of the unmanned aerial vehicle and the positioning tag. Further, with the unmanned aerial vehicle at different positions or angles, multiple distances to the positioning tag can be calculated: for example, at a first position the distance is S1, at a second position different from the first position the distance is S2, and at a third position the distance is S3. The specific position of the positioning tag can then be calculated from S1, S2 and S3, realizing the positioning of the person to be searched and rescued.
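The two-way time-of-flight ranging and the position solution from ranges S1, S2, S3 described above can be sketched as follows. The handling of the tag's reply delay and the Gauss-Newton multilateration solver are assumptions for illustration; the patent does not prescribe a particular solver:

```python
import numpy as np

C = 299_792_458.0  # radio propagation speed, m/s

def two_way_range(t3, reply_delay, t5):
    """Two-way time-of-flight distance: total round trip (t5 - t3) minus
    the tag's reply delay, halved. The explicit reply-delay term is an
    assumption; the patent only names the timestamps t3, t4, t5."""
    return C * ((t5 - t3) - reply_delay) / 2.0

def locate_tag(positions, distances, iters=20):
    """Gauss-Newton multilateration: find the tag position whose distances
    to the known vehicle positions best match the measured ranges."""
    p = np.mean(positions, axis=0)  # start from the centroid of positions
    for _ in range(iters):
        diffs = p - positions
        r = np.linalg.norm(diffs, axis=1)
        J = diffs / r[:, None]       # Jacobian of each range w.r.t. p
        dp, *_ = np.linalg.lstsq(J, distances - r, rcond=None)
        p = p + dp
    return p

# Four vehicle positions (non-coplanar) and exact ranges to a known tag:
uav = np.array([[0.0, 0, 0], [10.0, 0, 0], [0.0, 10, 0], [0.0, 0, 10]])
tag = np.array([3.0, 4.0, 5.0])
d = np.linalg.norm(uav - tag, axis=1)
est = locate_tag(uav, d)
# est converges to [3, 4, 5]
```

With noisy ranges the same least-squares loop returns the best-fit position; more measurement positions simply add rows to the system.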
As shown in connection with fig. 3, an embodiment of the present disclosure provides another positioning method for a drone, which is applied to the drone in the above embodiment. In the embodiment of the disclosure, two unmanned aerial vehicles are taken as an example for explanation.
In this embodiment, each of the two unmanned aerial vehicles can perform distance measurement with the positioning tag twice to complete the positioning of the person to be searched and rescued. The Bayesian network corresponding to the positioning method for the unmanned aerial vehicle provided in the embodiment of the present disclosure is shown in fig. 4, where J1(1), J1(2), J2(1), J2(2) respectively represent the state frames of the two unmanned aerial vehicles at different moments, and each state frame is converted into the vector a of the following formula (14):

a = [b_g^k, b_a^k, ω_k, P_k, V_k]^T (14);

In formula (14), b_g^k is the zero-bias error of the gyroscope, b_a^k is the zero-bias error of the acceleration sensor, ω_k is the current angular velocity, P_k is the coordinate position [x, y, z] of the unmanned aerial vehicle, and V_k is the current velocity vector [vx, vy, vz] of the unmanned aerial vehicle.
A is a position vector of a person to be searched and rescued, and specifically expressed as the following formula (15):
A=[p]T (15);
in the formula (15), p is the position [ tx, ty, tz ] of the person to be searched for and rescued.
Based on the constraint of the inertial navigation module pre-integration, the edge between the state frames of the unmanned aerial vehicle at successive moments is defined as f(k); the edge based on the visual odometry constraint is defined as v(k). The connection between the person to be searched and rescued A and the unmanned aerial vehicle J(k) is the ranging constraint, defined as r(k). The constraint of the position determining module is g(k).
In this embodiment, solving for the most appropriate state parameters means maximizing the joint probability, i.e. the maximum likelihood probability, specifically as shown in the following formula (16):
Max P=max P(J1(k),J2(k),A,Z) (16);
In formula (16), the observed value Z = [f1(1) f2(1) v1(1) v2(1) g1(1) g2(1) g2(2) r(1) r(2) r(3) r(4)]^T is the vector set of all sensor measurements at different moments (inertial navigation module pre-integration, visual odometry, position determining module and ranging module).
And in this embodiment ,P(J1(k),J2(k),A,Z)=P(J1(1))×P(J1(2))×P(J2(1))×P(J2(2))×P(A)×P(f1(1)|J1(2),J1(1))×P(v1(1)|J1(2),J1(1))×P(f2(1)|J2(2),J2(1))×P(v2(1)|J2(2),J2(1))×P(g1(1)|J1(1))×P(g2(1)|J2(1))×P(g2(2)|J2(2))×P(r1(1)|J1(1),A)×P(r1(2)|J1(2),A)×P(r2(1)|J2(1),A)×P(r2(2)|J2(2),A)
∝P(J1(k),J2(k),A|Z)=P(J1(1))×P(J1(2))×P(J2(1))×P(J2(2))×P(A)×P(J1(2),J1(1)|f1(1))×P(J1(2),J1(1)|v1(1))×P(J2(2),J2(1)|f2(1))×P(J2(2),J2(1)|v2(1))×P(J1(1)|g1(1))×P(J2(1)|g2(1))×P(J2(2)|g2(2))×P(J1(1),A|r1(1))×P(J1(2),A|r1(2))×P(J2(1),A|r2(1))×P(J2(2),A|r2(2))
Wherein k = [1, 2]. At this point the whole formula is transformed into: given the sensor measurement value Z, which states of the unmanned aerial vehicles and the person to be searched and rescued at the first and second moments have the greatest probability.
The bayesian network in fig. 4 is converted into the factor graph shown in fig. 5. The factorization in fig. 5 is as follows equation (17):
Φ(J1(k),J2(k),A)=Φ1(J1(1))×Φ2(J1(2))×Φ3(J2(1))×Φ4(J2(2))×Φ5(A)×Φ6(J1(2),J1(1))×Φ7(J1(2),J1(1))×Φ8(J2(2),J2(1))×Φ9(J2(2),J2(1))×Φ10(J1(1))×Φ11(J2(1))×Φ12(J2(2))×Φ14(J1(1),A)×Φ15(J1(2),A)×Φ16(J2(1),A)×Φ17(J2(2),A) (17);
In formula (17), Φ1 and Φ3 correspond to the prior constraints factored from P(J1(1)) and P(J2(1)), respectively; these constraints are known from the initial attitude solution of the inertial navigation module and from the position determining module. Φ2, Φ4 and Φ5 correspond to the prior factor constraints of the other state frames, but since the prior states of those frames cannot be obtained at the initial calculation in this embodiment, their covariance is effectively infinite, i.e. there is no constraint relationship, and they can be ignored directly, so the following formula (18) is obtained:
Φ(J1(k),J2(k),A)=Φ1(J1(1))×Φ3(J2(1))×Φ6(J1(2),J1(1))×Φ7(J1(2),J1(1))×Φ8(J2(2),J2(1))×Φ9(J2(2),J2(1))×Φ10(J1(1))×Φ11(J2(1))×Φ12(J2(2))×Φ14(J1(1),A)×Φ15(J1(2),A)×Φ16(J2(1),A)×Φ17(J2(2),A) (18);
In formula (18):
Φ6: pre-integration constraint of the inertial navigation module of unmanned aerial vehicle 1.
Φ7: visual odometry constraint of unmanned aerial vehicle 1.
Φ8: pre-integration constraint of the inertial navigation module of unmanned aerial vehicle 2.
Φ9: visual odometry constraint of unmanned aerial vehicle 2.
Φ10: satellite positioning constraint of unmanned aerial vehicle 1 at the 1st moment.
Φ11: satellite positioning constraint of unmanned aerial vehicle 2 at the 1st moment.
Φ12: satellite positioning constraint of unmanned aerial vehicle 2 at the 2nd moment.
Φ14 to Φ17: ranging constraints between the unmanned aerial vehicles at different positions and the person to be searched and rescued.
A Gaussian distribution is assumed for each factor constraint, namely the following formula (19):

Φ_i ∝ exp(-(1/2)*(h_i(Y(i)) - z(i))^T * Σ_i^(-1) * (h_i(Y(i)) - z(i))) (19);

In formula (19), Y(i) is the column vector of state variables related to the current factor, h_i(Y(i)) - z(i) is the prediction error at linearization point i, and Σ is the error covariance matrix.

As can be seen from formula (19), the prediction error is normalized, i.e. divided by sqrt(Σ_i). With all errors brought to the same scale, the problem of maximizing the product of the factor functions can be converted into a standard least-squares problem of minimizing the sum of squared prediction errors, as follows:
Y* = argmin_Y Σ_i ||h_i(Y(i)) - z(i)||^2_(Σ_i) (20);

In formula (20), Y is the set of all state variables, and each summand corresponds to a factor function, i.e. Φ_i. Converting formula (20) to a 2-norm gives the following formula (21):

Y* = argmin_Y Σ_i ||Σ_i^(-1/2)*(h_i(Y(i)) - z(i))||^2 (21);

In formula (21), each term likewise corresponds to a factor function, i.e. Φ_i. Through the above calculation, the whole function to be optimized is obtained; the optimal parameters are then calculated with a common nonlinear optimization algorithm, thereby solving the track of the unmanned aerial vehicle and the position of the person to be rescued. In this embodiment, the optimization algorithm includes, but is not limited to, a direct linear solution, the Gauss-Newton iteration method, the Levenberg-Marquardt algorithm and the Dogleg minimization method, which is not specifically limited in this disclosure.
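The conversion from the Gaussian factor product to a whitened least-squares problem can be sketched on a toy example, solved here with a small Gauss-Newton loop. The factor structure (two position priors plus one range factor on a 2D point) and all values are illustrative, not from the patent:

```python
import numpy as np

def solve_factors(factors, y0, iters=15):
    """Stack whitened residuals (h_i(y) - z_i) / sqrt(sigma_i) from all
    factors and minimise the sum of squares with Gauss-Newton. Each
    factor supplies a prediction h(y), its Jacobian, the observation z
    and a scalar variance sigma (a diagonal stand-in for Σ_i)."""
    y = np.asarray(y0, float)
    for _ in range(iters):
        rows, rhs = [], []
        for h, jac, z, sigma in factors:
            w = 1.0 / np.sqrt(sigma)      # whitening: divide by sqrt(Σ)
            rows.append(w * jac(y))
            rhs.append(w * (z - h(y)))
        J = np.vstack(rows)
        e = np.hstack(rhs)
        dy, *_ = np.linalg.lstsq(J, e, rcond=None)
        y = y + dy
    return y

# Two position "priors" (e.g. satellite fixes) and one tightly weighted
# range-to-origin factor on a 2D point:
factors = [
    (lambda y: y, lambda y: np.eye(2), np.array([1.0, 0.0]), 1.0),
    (lambda y: y, lambda y: np.eye(2), np.array([1.2, 0.0]), 1.0),
    (lambda y: np.array([np.linalg.norm(y)]),
     lambda y: (y / np.linalg.norm(y))[None, :], np.array([1.1]), 0.01),
]
y_opt = solve_factors(factors, [1.0, 0.1])
# the two priors average to x = 1.1 and the range factor agrees,
# so the optimum is [1.1, 0.0]
```

The same loop scales to the full problem: every pre-integration, odometry, satellite and ranging constraint just contributes its own whitened residual rows.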
Optionally, in a case where a person to be searched and rescued moves within a building, a bayesian network corresponding to the positioning method for the unmanned aerial vehicle provided in the embodiment of the present disclosure is shown in fig. 6. Accordingly, the factor graph can be decomposed into the following equation (22):
Φ(J1(k),J2(k),A)=Φ1(J1(1))×Φ2(J1(2))×Φ3(J2(1))×Φ4(J2(2))×Φ5(A)×Φ6(J1(2),J1(1))×Φ7(J1(2),J1(1))×Φ8(J2(2),J2(1))×Φ9(J2(2),J2(1))×Φ10(J1(1))×Φ11(J2(1))×Φ12(J2(2))×Φ14(J1(1),A)×Φ15(J1(2),A)×Φ16(J2(1),A)×Φ17(J2(2),A)×Φ18(J1(3),J1(2))×Φ19(J1(3),J1(2))×Φ20(J2(3),J2(2))×Φ21(J2(3),J2(2))×Φ22(J1(3)) (22);
From formula (22) it is apparent that, as time passes and measurements accumulate, the whole Bayesian network grows larger, the factor graph to be optimized becomes more complex, and the optimization algorithm becomes slower, affecting efficiency. In view of this, the concept of fixed-lag smoothing is introduced: state frames beyond a certain time window are optimized out. They are not simply discarded, but converted into the form of a prior probability. For example, if the states J1(1) and J2(1) in the dashed box need to be removed, their marginal probabilities P(J1(1)) and P(J2(1)) are computed.
Because the state frames in the foregoing implementation satisfy the assumption of a multivariate Gaussian distribution, the marginal probability in formula (22) is obtained by integrating out the other variables, P(J(k)) = ∫ P(J(k), O) dO, where O is the set of the other state variables, so formula (23) can be derived:
P(J(k)) = N(u_J(k), Σ_J(k)J(k)) (23);

In formula (23), u_J(k) is the mean of the state quantity J(k) and Σ_J(k)J(k) is its variance. The elimination algorithm decomposes the factor graph into the following formula (24), which is a joint probability form related only to the remaining state variables, namely:
With the joint probability density function known, the mean and covariance matrices for each state variable can be further calculated. Specifically, the above factor graph is further simplified into the following formula (25):
Φ(J1(k),J2(k),A)=Φ1'(J1(1))×Φ2(J1(2))×Φ3'(J2(1))×Φ4(J2(2))×Φ5(A)×Φ6(J1(2),J1(1))×Φ7(J1(2),J1(1))×Φ8(J2(2),J2(1))×Φ9(J2(2),J2(1))×Φ10(J1(1))×Φ11(J2(1))×Φ12(J2(2))×Φ14(J1(1),A)×Φ15(J1(2),A)×Φ16(J2(1),A)×Φ17(J2(2),A)×Φ18(J1(3),J1(2))×Φ19(J1(3),J1(2))×Φ20(J2(3),J2(2))×Φ21(J2(3),J2(2))×Φ22(J1(3)) (25);
In formula (25), Φ'(X(k)) is a factor function conforming to the Gaussian distribution N(u_X(k), Σ_X(k)X(k)). Only the latest two state variables are retained for each unmanned aerial vehicle, and the variables before them are marginalized to reduce the amount of calculation; this is fixed-lag smoothing with a window of n = 2, and when n = 1 it degenerates into a Kalman filter. It follows that, as a back-end optimizer, the factor graph outperforms the Kalman filter.
Still further, if J1(2), J2(2) and A(1) are to be marginalized after the previous marginalization, the remaining state variables are only J1(3), J2(3) and A(2), which are related only to A(1), J1(2) and J2(2). It is therefore sufficient to continue marginalizing A(1), J1(2) and J2(2) on the basis of the previous marginalization and ignore the unrelated variables; the final factor graph is shown in the following formula (26):
Φ(J1(k),J2(k),A)=Φ2'(J1(2))×Φ4'(J2(2))×Φ5'(A)×Φ18(J1(3),J1(2))×Φ19(J1(3),J1(2))×Φ20(J2(3),J2(2))×Φ21(J2(3),J2(2))×Φ22(J1(3)) (26);
Formula (26) simplifies the function to be optimized, which reduces the corresponding calculation time and realizes real-time positioning calculation.
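The marginalization that turns removed state frames into a prior on the remaining ones can be sketched in information (Hessian) form with a Schur complement. This is a generic illustration of the technique, not the patent's specific elimination ordering:

```python
import numpy as np

def marginalize_information(H, b, keep, drop):
    """Marginalise the `drop` states out of the Gaussian system H*x = b
    via the Schur complement. The reduced (H_marg, b_marg) retains the
    evidence of the dropped states as a prior on the kept ones."""
    Hkk = H[np.ix_(keep, keep)]
    Hkd = H[np.ix_(keep, drop)]
    Hdd = H[np.ix_(drop, drop)]
    Hdd_inv = np.linalg.inv(Hdd)
    H_marg = Hkk - Hkd @ Hdd_inv @ Hkd.T
    b_marg = b[keep] - Hkd @ Hdd_inv @ b[drop]
    return H_marg, b_marg

# Two coupled states; marginalise the second one out:
H = np.array([[2.0, 1.0], [1.0, 2.0]])  # joint information matrix
b = np.array([1.0, 2.0])                 # information vector
H_m, b_m = marginalize_information(H, b, keep=[0], drop=[1])
# solving the reduced system reproduces the full solution
# for the kept state
```

Because the reduced system gives the same estimate for the kept variables as the full one, old frames can be dropped from the optimization window without losing the information they contributed.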
In the related art, when an unmanned aerial vehicle formation positions a person inside a building through the TDOA positioning method, the formation must comprise a plurality of unmanned aerial vehicles to realize positioning. In contrast, when the positioning method for the unmanned aerial vehicle provided herein is used to position a person to be searched and rescued, there is no need to form a multi-vehicle formation; the deployment requirement is simplified, and accurate positioning can be realized even with a single unmanned aerial vehicle. Thereby, the complexity of unmanned aerial vehicle positioning is reduced, the efficiency of positioning is improved, and the cost of the positioning equipment is likewise reduced.
As shown in conjunction with fig. 7, an embodiment of the present disclosure provides a positioning device 700 for a drone, the device 700 being integrated into the drone in the above embodiment, the device including: a distance measurement module 701, an inertial navigation module 702, a position determination module 703, an image acquisition module 704, and a positioning module 705; wherein:
the inertial navigation module 702 is configured to calculate a movement track of the unmanned aerial vehicle according to movement data in a movement process of the unmanned aerial vehicle;
the position determining module 703 is configured to calculate position information during the movement of the unmanned aerial vehicle;
the positioning module 705 is configured to acquire an image during the movement of the unmanned aerial vehicle from the image acquisition module 704, and calculate movement information of the unmanned aerial vehicle according to the image;
The distance measurement module 701 is configured to measure a distance between the unmanned aerial vehicle and the positioning tag in the moving process of the unmanned aerial vehicle according to the moving track, the position information and the moving information in the moving process of the unmanned aerial vehicle; the positioning tag is a position mark of a person to be searched and rescued;
The positioning module 705 is further configured to perform positioning calculation according to a distance between the unmanned aerial vehicle and the positioning tag, so as to determine a position of the person to be searched and rescuing.
Optionally, the inertial navigation module 702 is specifically configured to: under the condition that the unmanned aerial vehicle does not move, determining the initial gesture and the initial speed of the unmanned aerial vehicle according to initial movement data of the unmanned aerial vehicle; and determining the movement track of the unmanned aerial vehicle according to the initial gesture and the initial speed of the unmanned aerial vehicle and according to movement data in the movement process of the unmanned aerial vehicle.
Optionally, the location determining module 703 is specifically configured to: and acquiring longitude and latitude information of the unmanned aerial vehicle, and calculating position information of the unmanned aerial vehicle according to the longitude and latitude information.
Optionally, the positioning module 705 is specifically configured to: acquire a first image corresponding to a first moment and a second image corresponding to a second moment from the image acquisition module 704, wherein the first moment is earlier than the second moment; extract first feature points in the first image and second feature points in the second image; and calculate the movement information of the unmanned aerial vehicle from the first feature points and the second feature points.
Optionally, the distance measurement module 701 is specifically configured to: determining the real-time position of the unmanned aerial vehicle according to the moving track, the position information and the moving information in the moving process of the unmanned aerial vehicle; and calculating the real-time distance between the real-time position of the unmanned aerial vehicle and the positioning tag according to the real-time position of the unmanned aerial vehicle.
The positioning device for the unmanned aerial vehicle provided by the embodiment of the disclosure may perform the same actions as the positioning method for the unmanned aerial vehicle in the above embodiment, and its implementation principle and technical effects are similar, and will not be described herein.
As shown in connection with fig. 8, an embodiment of the present disclosure provides an electronic device 800 including a processor (processor) 100 and a memory (memory) 101. Optionally, the electronic device 800 may also include a communication interface (Communication Interface) 102 and a bus 103. The processor 100, the communication interface 102, and the memory 101 may communicate with each other via the bus 103. The communication interface 102 may be used for information transfer. The processor 100 may call logic instructions in the memory 101 to perform the positioning method for the drone of the above-described embodiments.
Further, the logic instructions in the memory 101 described above may be implemented in the form of software functional units and may be stored in a computer readable storage medium when sold or used as a stand alone product.
The memory 101 is a computer readable storage medium that can be used to store a software program, a computer executable program, such as program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 100 executes the functional applications and data processing by running the program instructions/modules stored in the memory 101, i.e. implements the positioning method for the drone in the above-described embodiments.
The memory 101 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, at least one application program required for a function; the storage data area may store data created according to the use of the terminal device, etc. Further, the memory 101 may include a high-speed random access memory, and may also include a nonvolatile memory.
The embodiments of the present disclosure provide a storage medium storing computer executable instructions configured to perform the above-described positioning method for a drone.
The storage medium may be a transitory computer readable storage medium or a non-transitory computer readable storage medium.
Embodiments of the present disclosure may be embodied in a software product stored on a storage medium, including one or more instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of a method according to embodiments of the present disclosure. And the aforementioned storage medium may be a non-transitory storage medium including: a plurality of media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or a transitory storage medium.
The above description and the drawings illustrate embodiments of the disclosure sufficiently to enable those skilled in the art to practice them. Other embodiments may involve structural, logical, electrical, process, and other changes. The embodiments represent only possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in, or substituted for, those of others. Moreover, the terminology used in the present application is for the purpose of describing embodiments only and is not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a," "an," and "the" (the) are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this disclosure is meant to encompass any and all possible combinations of one or more of the associated listed. Furthermore, when used in the present disclosure, the terms "comprises," "comprising," and/or variations thereof, mean that the recited features, integers, steps, operations, elements, and/or components are present, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising one …" does not exclude the presence of other like elements in a process, method or apparatus that includes the element. In this context, each embodiment may be described with emphasis on the differences from the other embodiments, and the same similar parts between the various embodiments may be referred to each other. For the methods, products, etc. disclosed in the embodiments, if they correspond to the method sections disclosed in the embodiments, the description of the method sections may be referred to for relevance.
Those of skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. The skilled artisan may use different methods for each particular application to achieve the described functionality, but such implementation should not be considered to be beyond the scope of the embodiments of the present disclosure. It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the embodiments disclosed herein, the disclosed methods, articles of manufacture (including but not limited to devices, apparatuses, etc.) may be practiced in other ways. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the units may be merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interface, device or unit indirect coupling or communication connection, which may be in electrical, mechanical or other form. The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to implement the present embodiment. In addition, each functional unit in the embodiments of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram may represent a module, segment or portion of code that comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in a block may occur out of the order noted in the figures: two blocks shown in succession may in fact be executed substantially concurrently, or sometimes in the reverse order, depending on the functionality involved. Likewise, in the description corresponding to the flowcharts and block diagrams, operations or steps may occur in orders different from that disclosed, and sometimes no specific order exists between different operations or steps. Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks therein, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.

Claims (10)

1. A positioning method for an unmanned aerial vehicle, the unmanned aerial vehicle comprising a flight control module and an air-ground data link module, characterized in that the unmanned aerial vehicle further comprises a distance measurement module, an inertial navigation module, a position determination module, an image acquisition module and a positioning module, and in that the method comprises:
the inertial navigation module calculating a movement trajectory of the unmanned aerial vehicle from movement data acquired while the unmanned aerial vehicle is moving;
the position determination module calculating position information of the unmanned aerial vehicle while it is moving;
the positioning module acquiring, from the image acquisition module, images captured while the unmanned aerial vehicle is moving, and calculating movement information of the unmanned aerial vehicle from the images;
the distance measurement module measuring the distance between the unmanned aerial vehicle and a positioning tag while the unmanned aerial vehicle is moving, according to the movement trajectory, the position information and the movement information, wherein the positioning tag is a position marker of a person awaiting search and rescue;
and the positioning module performing a positioning calculation according to the distance between the unmanned aerial vehicle and the positioning tag, so as to determine the position of the person awaiting search and rescue.
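The claim does not spell out how the final positioning calculation turns tag ranges into a position. A common approach for a single moving ranging platform is linearized least-squares multilateration over ranges measured at several known points along the flight path. The sketch below is illustrative only (the function name `locate_tag` and the use of NumPy are assumptions, not part of the patent):

```python
import numpy as np

def locate_tag(drone_positions, ranges):
    """Estimate a stationary tag's position from ranges measured at
    several known drone positions, by linearized least squares.

    Subtracting the first range equation ||x - p_0||^2 = r_0^2 from the
    others cancels the quadratic term |x|^2, leaving the linear system
    2 (p_i - p_0) . x = r_0^2 - r_i^2 + |p_i|^2 - |p_0|^2.
    """
    p = np.asarray(drone_positions, dtype=float)
    r = np.asarray(ranges, dtype=float)
    A = 2.0 * (p[1:] - p[0])
    b = (r[0] ** 2 - r[1:] ** 2
         + np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2))
    # Least-squares solve tolerates noisy, over-determined range sets.
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

With noise-free ranges from at least three non-collinear 2-D positions (four non-coplanar in 3-D) the solution is exact; with real ranging noise the least-squares fit averages the error.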
2. The method of claim 1, wherein the inertial navigation module calculating a movement trajectory of the unmanned aerial vehicle from movement data acquired while the unmanned aerial vehicle is moving comprises:
the inertial navigation module determining an initial attitude and an initial velocity of the unmanned aerial vehicle from initial movement data acquired while the unmanned aerial vehicle is stationary;
and the inertial navigation module determining the movement trajectory of the unmanned aerial vehicle from the initial attitude, the initial velocity and the movement data acquired while the unmanned aerial vehicle is moving.
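The dead-reckoning integration described in claim 2 — start from an initial attitude and velocity, then integrate motion data into a trajectory — can be sketched in a deliberately simplified 2-D form. The function name, the reduction to yaw rate plus forward acceleration, and the fixed sample interval are all assumptions for illustration; a real inertial navigation module integrates full 3-axis gyroscope and accelerometer data:

```python
import math

def dead_reckon(heading0, speed0, samples, dt):
    """2-D dead-reckoning sketch: integrate yaw rate and forward
    acceleration samples into a track relative to the start point.

    samples: iterable of (yaw_rate_rad_s, forward_accel_m_s2) pairs
    taken at a fixed interval dt seconds.
    Returns a list of (x, y) positions, starting at (0, 0).
    """
    heading, speed = heading0, speed0
    x = y = 0.0
    track = [(x, y)]
    for yaw_rate, accel in samples:
        heading += yaw_rate * dt            # integrate attitude
        speed += accel * dt                 # integrate velocity
        x += speed * math.cos(heading) * dt # integrate position
        y += speed * math.sin(heading) * dt
        track.append((x, y))
    return track
```

This also shows why the claim fixes the initial attitude and velocity while the vehicle is stationary: every integrated position inherits any error in those two starting values.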
3. The method of claim 1, wherein the position determination module calculating position information of the unmanned aerial vehicle while it is moving comprises:
the position determination module obtaining latitude and longitude information of the unmanned aerial vehicle and calculating the position information of the unmanned aerial vehicle from the latitude and longitude information.
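Claim 3 leaves open how latitude/longitude is turned into usable position information. One common choice over a search area of a few kilometres is a flat-Earth (equirectangular) projection into local east/north metres around a reference point; the function name and the spherical-Earth radius below are illustrative assumptions, not taken from the patent:

```python
import math

R_EARTH = 6371000.0  # mean Earth radius in metres (spherical model)

def geodetic_to_local(lat, lon, lat0, lon0):
    """Convert latitude/longitude in degrees to local (east, north)
    metres relative to the reference point (lat0, lon0).

    Flat-Earth approximation: adequate for small areas; a survey-grade
    system would use a proper ellipsoidal ENU conversion instead.
    """
    east = math.radians(lon - lon0) * R_EARTH * math.cos(math.radians(lat0))
    north = math.radians(lat - lat0) * R_EARTH
    return east, north
```

One degree of latitude maps to roughly 111 km, which gives a quick sanity check on the conversion.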
4. The method of claim 1, wherein the positioning module calculating movement information of the unmanned aerial vehicle from the images comprises:
the positioning module acquiring, from the image acquisition module, a first image corresponding to a first time and a second image corresponding to a second time, the first time being earlier than the second time;
the positioning module extracting first feature points from the first image and second feature points from the second image;
and the positioning module calculating the movement information of the unmanned aerial vehicle from the first feature points and the second feature points.
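Claim 4 describes a basic visual-odometry step: match feature points between two frames and infer the vehicle's motion. The minimal sketch below reduces that to the mean image-plane displacement of already-matched point pairs; the matching itself (e.g. ORB descriptors with a brute-force matcher, as in OpenCV) and the camera-geometry step that converts pixel shift to metric motion are omitted, and the function name is an assumption:

```python
def estimate_shift(points_t1, points_t2):
    """Mean displacement of matched feature points between two frames.

    points_t1, points_t2: equal-length sequences of (x, y) pixel
    coordinates, where points_t2[i] is the match of points_t1[i].
    Returns the average (dx, dy) shift, a crude proxy for the
    camera's translation between the two capture times.
    """
    n = len(points_t1)
    dx = sum(p2[0] - p1[0] for p1, p2 in zip(points_t1, points_t2)) / n
    dy = sum(p2[1] - p1[1] for p1, p2 in zip(points_t1, points_t2)) / n
    return dx, dy
```

A production implementation would reject mismatched pairs (e.g. with RANSAC) before averaging, since a single bad match can bias the mean.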
5. The method of claim 1, wherein the distance measurement module measuring the distance between the unmanned aerial vehicle and the positioning tag according to the movement trajectory, the position information and the movement information comprises:
the distance measurement module determining the real-time position of the unmanned aerial vehicle from the movement trajectory, the position information and the movement information;
and the distance measurement module calculating the real-time distance between the unmanned aerial vehicle and the positioning tag from the real-time position of the unmanned aerial vehicle.
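The first step of claim 5 fuses three position sources — the inertial trajectory, the latitude/longitude-derived position and the vision-derived movement information — into one real-time position. The claim does not name a fusion method; the sketch below uses a naive fixed-weight blend purely for illustration (function name and weights are assumptions), whereas a real system would typically run a Kalman filter:

```python
def fuse_position(inertial_pos, gnss_pos, visual_pos,
                  weights=(0.2, 0.6, 0.2)):
    """Blend three position estimates with fixed weights.

    Each argument is an (x, y, z) tuple in metres in a common frame;
    the weights sum to 1 so an unbiased trio stays unbiased. This is
    a stand-in for proper covariance-weighted sensor fusion.
    """
    w_i, w_g, w_v = weights
    return tuple(w_i * a + w_g * b + w_v * c
                 for a, b, c in zip(inertial_pos, gnss_pos, visual_pos))
```

The fixed weights encode a rough trust ordering (satellite fix most, drifting dead reckoning and scale-ambiguous vision less); a Kalman filter would instead adapt the weighting to each sensor's current uncertainty.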
6. A positioning device for an unmanned aerial vehicle, the unmanned aerial vehicle comprising a flight control module and an air-ground data link module, characterized in that the device is integrated in the unmanned aerial vehicle, and in that the unmanned aerial vehicle further comprises a distance measurement module, an inertial navigation module, a position determination module, an image acquisition module and a positioning module, wherein:
the inertial navigation module is configured to calculate a movement trajectory of the unmanned aerial vehicle from movement data acquired while the unmanned aerial vehicle is moving;
the position determination module is configured to calculate position information of the unmanned aerial vehicle while it is moving;
the positioning module is configured to acquire, from the image acquisition module, images captured while the unmanned aerial vehicle is moving, and to calculate movement information of the unmanned aerial vehicle from the images;
the distance measurement module is configured to measure the distance between the unmanned aerial vehicle and a positioning tag while the unmanned aerial vehicle is moving, according to the movement trajectory, the position information and the movement information, wherein the positioning tag is a position marker of a person awaiting search and rescue;
and the positioning module is further configured to perform a positioning calculation according to the distance between the unmanned aerial vehicle and the positioning tag, so as to determine the position of the person awaiting search and rescue.
7. The device of claim 6, wherein the distance measurement module is specifically configured to:
determine the real-time position of the unmanned aerial vehicle from the movement trajectory, the position information and the movement information;
and calculate the real-time distance between the unmanned aerial vehicle and the positioning tag from the real-time position of the unmanned aerial vehicle.
8. An electronic device comprising a processor and a memory storing program instructions, characterized in that the processor is configured to perform the positioning method for an unmanned aerial vehicle of any one of claims 1 to 5 when executing the program instructions.
9. An unmanned aerial vehicle comprising a flight control module and an air-ground data link module, characterized in that the unmanned aerial vehicle further comprises a distance measurement module, an inertial navigation module, a position determination module, an image acquisition module and a positioning module, wherein:
the inertial navigation module is configured to calculate a movement trajectory of the unmanned aerial vehicle from movement data acquired while the unmanned aerial vehicle is moving;
the position determination module is configured to calculate position information of the unmanned aerial vehicle while it is moving;
the positioning module is configured to acquire, from the image acquisition module, images captured while the unmanned aerial vehicle is moving, and to calculate movement information of the unmanned aerial vehicle from the images;
the distance measurement module is configured to measure the distance between the unmanned aerial vehicle and a positioning tag while the unmanned aerial vehicle is moving, according to the movement trajectory, the position information and the movement information, wherein the positioning tag is a position marker of a person awaiting search and rescue;
and the positioning module is further configured to perform a positioning calculation according to the distance between the unmanned aerial vehicle and the positioning tag, so as to determine the position of the person awaiting search and rescue.
10. A storage medium storing program instructions which, when executed, perform the positioning method for an unmanned aerial vehicle of any one of claims 1 to 5.
CN202410228932.5A 2024-02-29 2024-02-29 Positioning method and device for unmanned aerial vehicle, equipment and storage medium Pending CN118112496A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410228932.5A CN118112496A (en) 2024-02-29 2024-02-29 Positioning method and device for unmanned aerial vehicle, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN118112496A true CN118112496A (en) 2024-05-31

Family

ID=91213627

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410228932.5A Pending CN118112496A (en) 2024-02-29 2024-02-29 Positioning method and device for unmanned aerial vehicle, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN118112496A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination