CN114018241A - Positioning method and device for unmanned aerial vehicle - Google Patents

Positioning method and device for unmanned aerial vehicle Download PDF

Info

Publication number
CN114018241A
CN114018241A CN202111294756.8A CN202111294756A CN114018241A CN 114018241 A CN114018241 A CN 114018241A CN 202111294756 A CN202111294756 A CN 202111294756A CN 114018241 A CN114018241 A CN 114018241A
Authority
CN
China
Prior art keywords
coordinate system
unmanned aerial
aerial vehicle
value
acceleration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111294756.8A
Other languages
Chinese (zh)
Other versions
CN114018241B (en
Inventor
蔡泽斌
陈明亮
廖益木
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou On Bright Electronics Co Ltd
Original Assignee
Guangzhou On Bright Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou On Bright Electronics Co Ltd filed Critical Guangzhou On Bright Electronics Co Ltd
Priority to CN202111294756.8A priority Critical patent/CN114018241B/en
Priority to TW110148001A priority patent/TWI805141B/en
Publication of CN114018241A publication Critical patent/CN114018241A/en
Application granted granted Critical
Publication of CN114018241B publication Critical patent/CN114018241B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching

Abstract

Disclosed are a positioning method and apparatus for a drone loaded with an acceleration sensor, a gyro sensor, a height sensor, and an optical flow sensor, the positioning method including: acquiring an original value of the motion acceleration of the unmanned aerial vehicle under a horizontal course coordinate system based on an acceleration value measured by an acceleration sensor and an angular velocity value measured by a gyroscope sensor; acquiring a motion acceleration correction value of the unmanned aerial vehicle in a horizontal course coordinate system based on a relative displacement value measured by an optical flow sensor, a relative height value measured by a height sensor, an acceleration value measured by an acceleration sensor, an angular velocity value measured by a gyroscope sensor and a motion acceleration original value of the unmanned aerial vehicle in the horizontal course coordinate system; and acquiring the position of the unmanned aerial vehicle in the horizontal course coordinate system based on the motion acceleration original value and the motion acceleration correction value of the unmanned aerial vehicle in the horizontal course coordinate system.

Description

Positioning method and device for unmanned aerial vehicle
Technical Field
The invention relates to the field of unmanned aerial vehicles, in particular to a positioning method and equipment for an unmanned aerial vehicle.
Background
Along with the development of unmanned aerial vehicle technology, unmanned aerial vehicle's range of application is wider and wider, and consumer group is bigger and bigger. In the middle and low end market, which is sensitive to manufacturing costs, there is an increasing demand for unmanned aerial vehicles with low cost and high performance. For scenes and limitations that no Global Positioning System (GPS) signal or the GPS signal is weak, and that the unmanned aerial vehicle is small in size and weak in load capacity, displacement and moving speed of the unmanned aerial vehicle relative to a starting position on a detected plane (i.e., a position where the unmanned aerial vehicle is located at the time of takeoff) can be detected by using an optical flow sensor. However, in the case of various maneuvers such as taking-off and landing of an unmanned aerial vehicle, flight navigation, and target tracking, if positioning of the unmanned aerial vehicle is performed by simply using an optical flow sensor, the cumulative offset of the optical flow sensor itself, changes in the attitude and height position of the unmanned aerial vehicle itself, and non-ideal or complex ground characteristics are extremely likely to cause instability of the unmanned aerial vehicle.
Disclosure of Invention
According to the embodiment of the invention, the positioning method for the unmanned aerial vehicle is used for loading the unmanned aerial vehicle with an acceleration sensor, a gyroscope sensor, a height sensor and an optical flow sensor, and comprises the following steps: acquiring an original value of the motion acceleration of the unmanned aerial vehicle under a horizontal course coordinate system based on an acceleration value measured by an acceleration sensor and an angular velocity value measured by a gyroscope sensor; acquiring a motion acceleration correction value of the unmanned aerial vehicle in a horizontal course coordinate system based on a relative displacement value measured by an optical flow sensor, a relative height value measured by a height sensor, an acceleration value measured by an acceleration sensor, an angular velocity value measured by a gyroscope sensor and a motion acceleration original value of the unmanned aerial vehicle in the horizontal course coordinate system; and acquiring the position of the unmanned aerial vehicle in the horizontal course coordinate system based on the motion acceleration original value and the motion acceleration correction value of the unmanned aerial vehicle in the horizontal course coordinate system.
According to an embodiment of the present invention, a positioning apparatus for a drone, wherein the drone is loaded with an acceleration sensor, a gyro sensor, a height sensor, and an optical flow sensor, comprises: a memory having computer-executable instructions stored thereon; and one or more processors configured to execute computer-executable instructions to implement the above-described positioning method for a drone.
According to the positioning method and the positioning device for the unmanned aerial vehicle, the measurement results of the acceleration sensor, the gyroscope sensor, the height sensor and the optical flow sensor are fused with one another, so that the unmanned aerial vehicle can be positioned more truly, more smoothly, in real time and more accurately.
Drawings
The invention may be better understood from the following description of specific embodiments thereof taken in conjunction with the accompanying drawings, in which:
fig. 1 shows a flow chart of a positioning method for a drone according to an embodiment of the invention.
Fig. 2 shows a schematic diagram of a computer system that may implement the positioning method for a drone according to an embodiment of the invention.
Detailed Description
Features and exemplary embodiments of various aspects of the present invention will be described in detail below. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present invention by illustrating examples of the present invention. The present invention is in no way limited to any specific configuration and algorithm set forth below, but rather covers any modification, replacement or improvement of elements, components or algorithms without departing from the spirit of the invention. In the drawings and the following description, well-known structures and techniques are not shown in order to avoid unnecessarily obscuring the present invention.
In view of the above problems existing in the positioning of the unmanned aerial vehicle by independently using the optical flow sensor, the positioning method for the unmanned aerial vehicle, which is complementary in multi-sensor fusion, high in precision and high in robustness, is provided.
Fig. 1 shows a flow chart of a positioning method 100 for a drone according to an embodiment of the invention. It should be noted that the positioning method 100 will use the measurement results from the acceleration sensor, the gyro sensor, the altitude sensor, and the optical flow sensor of the drone control system, and position the drone in the reference coordinate system of the horizontal heading coordinate system of the relative earth motion centered on the drone and always having the head direction of the drone as 0 azimuth.
As shown in fig. 1, the positioning method 100 includes: s102, acquiring an original value of the motion acceleration of the unmanned aerial vehicle under a horizontal course coordinate system based on an acceleration value measured by an acceleration sensor and an angular velocity value measured by a gyroscope sensor; s104, acquiring a motion acceleration correction value of the unmanned aerial vehicle in a horizontal course coordinate system based on a relative displacement value measured by the optical flow sensor, a relative height value measured by the height sensor, an acceleration value measured by the acceleration sensor, an angular velocity value measured by the gyroscope sensor and a motion acceleration original value of the unmanned aerial vehicle in the horizontal course coordinate system; and S106, acquiring the position of the unmanned aerial vehicle in the horizontal course coordinate system based on the motion acceleration original value and the motion acceleration correction value of the unmanned aerial vehicle in the horizontal course coordinate system.
In some embodiments, since the acceleration sensor measures the acceleration value of the unmanned aerial vehicle in the unmanned aerial vehicle body coordinate system, the acceleration value measured by the acceleration sensor needs to be subjected to coordinate system conversion to obtain the acceleration value of the unmanned aerial vehicle in the horizontal heading coordinate system. Namely, the acquiring of the motion acceleration original value of the unmanned aerial vehicle in the horizontal heading coordinate system comprises: acquiring a direction cosine matrix for converting the acceleration value measured by the acceleration sensor into an original value of the motion acceleration of the unmanned aerial vehicle under a horizontal course coordinate system based on the acceleration value measured by the acceleration sensor and the angular velocity value measured by the gyroscope sensor; and acquiring an original value of the motion acceleration of the unmanned aerial vehicle under a horizontal course coordinate system based on the acceleration value measured by the acceleration sensor and the direction cosine matrix.
For example, the acceleration value measured by the acceleration sensor may be converted into the original value of the motion acceleration of the drone in the horizontal heading coordinate system according to equation (1).
an=ab·R (1)
In equation (1), R is an acceleration value for converting the acceleration sensor measurement toChanging the direction into a direction cosine matrix (a direction cosine matrix when the yaw angle is 0) of the original value of the motion acceleration of the unmanned aerial vehicle in a horizontal course coordinate system, and anIs the original value of the motion acceleration of the unmanned aerial vehicle under a horizontal course coordinate system, abAcceleration values measured for the acceleration sensor (in cm/s/s). Here, because the use of direction cosine matrix has reduced the use of trigonometric function, has reduced the power of calculation requirement to the computational equipment that unmanned aerial vehicle carried on and has promoted the computational efficiency of this computational equipment.
For convenience of description, all of the following are units of acceleration of movement in cm/s, speed of movement in cm/s, and relative displacement in cm.
In some embodiments, obtaining the motion acceleration correction value of the drone in the horizontal heading coordinate system includes: acquiring a relative displacement value after the height and angle compensation of the unmanned aerial vehicle under a horizontal course coordinate system based on a relative displacement value measured by an optical flow sensor, a relative height value measured by a height sensor, an acceleration value measured by an acceleration sensor and an angular velocity value measured by a gyroscope sensor; acquiring a relative displacement difference value of the unmanned aerial vehicle after the height and angle compensation in the horizontal course coordinate system based on the compensated relative displacement value of the unmanned aerial vehicle in the horizontal course coordinate system and the complementarily filtered relative displacement value; and acquiring a motion acceleration correction value of the unmanned aerial vehicle in the horizontal course coordinate system based on the relative displacement difference value and the first adaptive integral coefficient after the height and angle compensation of the unmanned aerial vehicle in the horizontal course coordinate system.
For example, the motion acceleration correction value of the drone in the horizontal heading coordinate system may be obtained based on the altitude and angle compensated relative displacement value of the drone in the horizontal heading coordinate system and the first adaptive integral coefficient according to equations (2) and (2-1):
Figure BDA0003336220930000041
Figure BDA0003336220930000042
in equations (2) and (2-1), an_x_correction、an_y_correctionRespectively the motion acceleration correction values of the unmanned aerial vehicle in the horizontal direction (x direction) and the vertical direction (y direction) under the horizontal course coordinate system, Sopt_x、Sopt_yRespectively is the relative displacement value S after the altitude and angle compensation of the unmanned aerial vehicle in the x direction and the y direction under the horizontal course coordinate systemopt_x_dealt、Sopt_y_dealtRespectively compensating the height and the angle of the unmanned aerial vehicle in the x direction and the y direction under a horizontal course coordinate system to obtain a relative displacement difference value Sn_x_filter、Sn_y_filterThe complementary filtered relative displacement values of the unmanned aerial vehicle in the x direction and the y direction under the horizontal heading coordinate system are respectively obtained, and k1 is a first adaptive integral coefficient. Here, an_x_correction、an_y_correctionAll initial values of (a) are zero.
In some embodiments, obtaining the position of the drone in the horizontal heading coordinate system includes: acquiring a motion acceleration value of the unmanned aerial vehicle after complementary filtering in a horizontal course coordinate system based on the motion acceleration original value and the motion acceleration correction value of the unmanned aerial vehicle in the horizontal course coordinate system; and acquiring an original value of the movement speed of the unmanned aerial vehicle in the horizontal course coordinate system based on the movement acceleration value of the unmanned aerial vehicle in the horizontal course coordinate system after complementary filtering.
For example, the complementary filtered motion acceleration value of the drone in the horizontal heading coordinate system may be obtained based on the motion acceleration original value and the motion acceleration corrected value of the drone in the horizontal heading coordinate system according to equation (3).
Figure BDA0003336220930000051
In equation (3), an_x_filter、an_y_filterRespectively are the motion acceleration values a of the unmanned aerial vehicle after complementary filtering in the x direction and the y direction under a horizontal course coordinate systemn_x_origion、an_y_origionThe original values of the motion acceleration of the unmanned aerial vehicle in the x direction and the y direction under the horizontal heading coordinate system, a, are obtained according to equation (1)n_x_correction、an_y_correctionThe correction values of the motion acceleration of the unmanned aerial vehicle in the x direction and the y direction under the horizontal heading coordinate system are respectively.
For example, the original value of the movement speed of the drone in the horizontal heading coordinate system may be obtained based on the complementary filtered values of the movement acceleration of the drone in the horizontal heading coordinate system according to equations (4) and (5).
Figure BDA0003336220930000052
Figure BDA0003336220930000053
In equation (4), Vn_x_origion、Vn_x_origionThe original values V of the movement speed of the unmanned aerial vehicle in the x direction and the y direction under the horizontal course coordinate systemn_x_dealt、Vn_y_dealtThe speed increment in the unit time (e.g., 1s) of the unmanned aerial vehicle in the x direction and the y direction respectively in the horizontal heading coordinate system.
In some embodiments, obtaining the position of the drone under the horizontal heading coordinate system may further include: acquiring a movement speed correction value of the unmanned aerial vehicle in the horizontal course coordinate system based on the relative displacement difference value and the second adaptive integral coefficient after the height and angle compensation of the unmanned aerial vehicle in the horizontal course coordinate system; and acquiring a motion speed value of the unmanned aerial vehicle after complementary filtering in the horizontal course coordinate system based on the motion speed original value and the motion speed correction value of the unmanned aerial vehicle in the horizontal course coordinate system.
For example, the correction value of the movement speed of the unmanned aerial vehicle in the horizontal heading coordinate system may be obtained according to equation (6) based on the compensated relative displacement difference between the altitude and the angle of the unmanned aerial vehicle in the horizontal heading coordinate system and the second adaptive integral coefficient.
Figure BDA0003336220930000061
In equation (6), Vn_x_correction、Vn_y_correctionRespectively correcting the movement speed of the unmanned aerial vehicle in the x direction and the y direction under a horizontal course coordinate system, Sopt_x_dealt、Sopt_y_dealtThe difference values of the altitude and the relative displacement after angle compensation in the x direction and the y direction of the unmanned aerial vehicle under the horizontal heading coordinate system are respectively, and k2 is a second self-adaptive integral coefficient. Here, Vn_x_correction、Vn_y_correctionAll initial values of (a) are zero.
For example, a complementary filtered moving speed value of the drone in the horizontal heading coordinate system may be obtained based on the original moving speed value and the corrected moving speed value of the drone in the horizontal heading coordinate system according to equation (7).
Figure BDA0003336220930000062
In equation (7), Vn_x_filter、Vn_y_filterRespectively are the motion speed values V of the unmanned aerial vehicle after complementary filtering in the x direction and the y direction under the horizontal course coordinate systemn_x_origion、Vn_x_origionThe original values V of the movement speed of the unmanned aerial vehicle in the x direction and the y direction under the horizontal course coordinate systemn_x_correction、Vn_y_correctionThe correction values of the movement speed of the unmanned aerial vehicle in the x direction and the y direction under the horizontal heading coordinate system are respectively.
In some embodiments, obtaining the position of the drone under the horizontal heading coordinate system may further include: and acquiring a relative displacement original value of the unmanned aerial vehicle in the horizontal course coordinate system based on the motion speed value of the unmanned aerial vehicle after complementary filtering in the horizontal course coordinate system.
For example, a relative displacement original value of the drone in the horizontal heading coordinate system may be obtained based on the complementary filtered motion velocity value of the drone in the horizontal heading coordinate system according to equation (8).
Figure BDA0003336220930000071
In equation (8), Sn_x_origion、Sn_y_origionRespectively are the original values V of the relative displacement of the unmanned aerial vehicle in the x direction and the y direction under the horizontal course coordinate systemn_x_filter、Vn_y_filterRespectively are the motion speed values V of the unmanned aerial vehicle after complementary filtering in the x direction and the y direction under the horizontal course coordinate systemn_x_dealt、Vn_y_dealtThe speed increment in the unit time (e.g., 1s) of the unmanned aerial vehicle in the x direction and the y direction respectively in the horizontal heading coordinate system.
In some embodiments, obtaining the position of the drone under the horizontal heading coordinate system may further include: acquiring a relative displacement correction value of the unmanned aerial vehicle in the horizontal course coordinate system based on the relative displacement difference value and the third adaptive integral coefficient after the height and angle compensation of the unmanned aerial vehicle in the horizontal course coordinate system; and acquiring a relative displacement value after complementary filtering of the unmanned aerial vehicle in the horizontal course coordinate system based on the relative displacement initial value and the relative displacement correction value of the unmanned aerial vehicle in the horizontal course coordinate system.
For example, a relative displacement correction value of the drone in the horizontal heading coordinate system may be obtained according to equation (9) based on the compensated relative displacement difference value of the altitude and the angle of the drone in the horizontal heading coordinate system and the third adaptive integral coefficient.
Figure BDA0003336220930000072
In equation (9), Sn_x_correction、Sn_y_correctionRespectively the relative displacement correction values S of the unmanned aerial vehicle in the x direction and the y direction under the horizontal course coordinate systemopt_x_dealt、Sopt_y_dealtThe difference values of the altitude and the relative displacement after the angle compensation in the x direction and the y direction of the unmanned aerial vehicle under the horizontal heading coordinate system are respectively shown, and k3 is a third adaptive productAnd (4) dividing coefficient. Here, Sn_x_correction、Sn_y_correctionAll initial values of (a) are zero.
For example, a complementary filtered relative displacement value of the drone in the horizontal heading coordinate system may be obtained based on the relative displacement initial value and the relative displacement correction value of the drone in the horizontal heading coordinate system according to equation (10).
Figure BDA0003336220930000073
In equation (10), Sn_x_filter、Sn_y_filterRespectively are the complementary filtered relative displacement values S and S of the unmanned aerial vehicle in the x direction and the y direction under the horizontal course coordinate systemn_x_origion、Sn_y_origionRespectively are the original values of the relative displacement of the unmanned aerial vehicle in the x direction and the y direction under the horizontal course coordinate system, Sn_x_correction、Sn_y_correctionAnd the correction values are the relative displacement correction values of the unmanned aerial vehicle in the x direction and the y direction under the horizontal heading coordinate system respectively.
In some embodiments, the first, second, and third adaptive integral coefficients are determined based on the adaptive filter coefficients and the first, second, and third constants, respectively. For example, the first, second, and third adaptive integration coefficients may be determined based on the adaptive filter coefficients and the first, second, and third constants according to equation (11).
Figure BDA0003336220930000081
In equation (11), k1, k2, and k3 are the first, second, and third adaptive integral coefficients, respectively, kfilterFor adaptive filter coefficients, dynamic adjustment is required to obtain smooth true relative displacement of the drone, e1, e2, e3 being first, second, and third constants, respectively.
In some embodiments, it may be determined whether the drone is in a take-off, landing, ascending, descending, or maneuver flight state by monitoring the speed of movement of the drone relative to the detected plane, and the adaptive filter coefficient may be adjusted according to the state of the drone.
For example, the fused motion speed of the drone in three directions, i.e., the x direction, the y direction, and the z direction (vertical direction) in the horizontal heading coordinate system may be obtained according to equation (12), and then the adaptive filter coefficient may be adaptively adjusted according to equation (13) based on the fused motion speed of the drone in the horizontal heading coordinate system.
Figure BDA0003336220930000082
Figure BDA0003336220930000083
In equation (12), Vn_mixThe integrated motion speed V of the unmanned aerial vehicle in the x direction, the y direction and the z direction under the horizontal course coordinate systemn_x_filter、Vn_y_filter、Vn_z_filterThe motion speed values of the unmanned aerial vehicle after complementary filtering in the x direction, the y direction and the z direction under the horizontal heading coordinate system are respectively.
In equation (13), V is expressedn_mixClipping at [0, C]In the interval, the value of C depends on the upper limit of the measurable speed range of the optical flow sensor and the upper limit of the speed of the unmanned aerial vehicle, and the value of B depends on the speed fault-tolerant range of hovering of the unmanned aerial vehicle. When V isn_mixIs at [0, B]In the interval, the coefficient k 'is maintained'filterIs unchanged when Vn_mixIs at (B, C)]In intervals, the function g (V) is usedn_mix) K 'is calculated'filterWherein the function g (V)n_mix) Output value of [1, D ]]In the interval, D is the final fusion velocity measured at the velocity C, the maximum adjustment coefficient with good displacement.
Here, the data quality parameter Q based on the optical flow sensor may be based on equation (14)optFor the original filter coefficient k ″)filterAnd carrying out dynamic adjustment.
Figure BDA0003336220930000091
In equation (14), k ″)filterThe original complementary filter coefficients obtained by setting under the condition of the first quality threshold A. As shown in equation (14), the data quality parameter Q of the optical flow sensoroptK 'when being less than first quality threshold value A'filterIs the original complementary filter coefficient k ″)filter(ii) a Data quality parameter Q of optical flow sensoroptIn the [ A, B) interval (B is a second quality threshold for measuring the data quality of the optical flow sensor), the function f (Q) is usedopt) Calculate k ″)filterWherein the function f (Q)opt) In the output range of (0, 1)]In the interval.
In summary, according to the positioning method for the unmanned aerial vehicle of the embodiment of the invention, the measurement results from the acceleration sensor, the gyroscope sensor, the altitude sensor and the optical flow sensor are fused with each other, so that the unmanned aerial vehicle can be positioned more truly, more smoothly, in real time and more accurately. In addition, the self-adaptive filter coefficient is self-adaptively adjusted according to the motion speed of the unmanned aerial vehicle after complementary filtering in the horizontal course coordinate system, so that the real displacement and speed conditions of the unmanned aerial vehicle can be restored with high precision when the unmanned aerial vehicle is in a hovering and flying moving state. In addition, for the situations that the unmanned aerial vehicle is in a hovering or navigation control state, a target follows and other maneuvering flight states and the environment changes, the real displacement and speed conditions of the unmanned aerial vehicle can be restored with high precision. Meanwhile, the unmanned aerial vehicle is positioned based on basic sensors (namely, an acceleration sensor, a gyroscope sensor, a height sensor and an optical flow sensor) of an unmanned aerial vehicle control system, extra hardware cost and load cannot be increased, the calculation amount is small, and the unmanned aerial vehicle positioning system is suitable for various occasions sensitive to cost and limited in unmanned aerial vehicle volume load capacity.
Fig. 2 shows a schematic diagram of a computer system that may implement the displacement compensation method and apparatus for a multi-rotor drone according to an embodiment of the invention. A computer system 200 suitable for use in implementing embodiments of the present disclosure is described below in conjunction with FIG. 2. It should be understood that the computer system 200 shown in FIG. 2 is only one example and should not impose any limitations on the scope of use or functionality of embodiments of the disclosure.
As shown in fig. 2, computer system 200 may include a processing device (e.g., central processing unit, graphics processor, etc.) 201 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)202 or a program loaded from a storage device 208 into a Random Access Memory (RAM) 203. In the RAM 203, various programs and data necessary for the operation of the computer system 200 are also stored. The processing device 201, the ROM 202, and the RAM 203 are connected to each other via a bus 204. An input/output (I/O) interface 205 is also connected to bus 204.
Generally, the following devices may be connected to the I/O interface 205: input devices 206 including, for example, touch screens, touch pads, cameras, accelerometers, gyroscopes, sensors, etc.; an output device 207 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, a motor, an electronic governor, and the like; a storage device 208 including, for example, a Flash memory (Flash Card); and a communication device 209. The communication means 209 may allow the computer system 200 to communicate with other devices, wirelessly or by wire, for exchanging data. While FIG. 2 illustrates a computer system 200 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 2 may represent one device or may represent multiple devices, as desired.
In particular, according to an embodiment of the present invention, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure provide a computer-readable storage medium storing a computer program comprising program code for performing the method 100 shown in fig. 1. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 209, or installed from the storage means 208, or installed from the ROM 202. The computer program realizes the above-described functions defined in the apparatus according to an embodiment of the present invention when executed by the processing apparatus 201.
It should be noted that the computer readable medium according to the embodiment of the present invention may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. A computer readable storage medium according to an embodiment of the present invention may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In addition, a computer readable signal medium according to an embodiment of the present invention may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (Radio Frequency), etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C + +, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. For example, the algorithms described in the specific embodiments may be modified without departing from the basic spirit of the invention. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims (13)

1. A positioning method for a drone loaded with an acceleration sensor, a gyroscope sensor, a height sensor, and an optical flow sensor, comprising:
acquiring an original value of the motion acceleration of the unmanned aerial vehicle under a horizontal course coordinate system based on the acceleration value measured by the acceleration sensor and the angular velocity value measured by the gyroscope sensor;
acquiring a motion acceleration correction value of the unmanned aerial vehicle in a horizontal heading coordinate system based on a relative displacement value measured by the optical flow sensor, a relative height value measured by the height sensor, an acceleration value measured by the acceleration sensor, an angular velocity value measured by the gyroscope sensor and a motion acceleration original value of the unmanned aerial vehicle in the horizontal heading coordinate system; and
and acquiring the position of the unmanned aerial vehicle in the horizontal course coordinate system based on the motion acceleration original value and the motion acceleration correction value of the unmanned aerial vehicle in the horizontal course coordinate system.
2. The positioning method according to claim 1, wherein the obtaining of the motion acceleration original value of the unmanned aerial vehicle in the horizontal heading coordinate system comprises:
acquiring a direction cosine matrix for converting the acceleration value measured by the acceleration sensor into an original value of the motion acceleration of the unmanned aerial vehicle under a horizontal course coordinate system based on the acceleration value measured by the acceleration sensor and the angular velocity value measured by the gyroscope sensor; and
acquiring an original value of the motion acceleration of the unmanned aerial vehicle under a horizontal course coordinate system based on the acceleration value measured by the acceleration sensor and the direction cosine matrix,
and the acceleration value measured by the acceleration sensor is the motion acceleration value of the unmanned aerial vehicle under the coordinate system of the unmanned aerial vehicle.
3. The positioning method according to claim 1, wherein obtaining the correction value of the motion acceleration of the drone in the horizontal heading coordinate system comprises:
acquiring the height and angle compensated relative displacement value of the unmanned aerial vehicle in a horizontal heading coordinate system based on the relative displacement value measured by the optical flow sensor, the relative height value measured by the height sensor, the acceleration value measured by the acceleration sensor and the angular velocity value measured by the gyroscope sensor;
acquiring a relative displacement difference value of the unmanned aerial vehicle after the height and angle compensation in the horizontal course coordinate system based on the compensated relative displacement value of the unmanned aerial vehicle in the horizontal course coordinate system and the complementarily filtered relative displacement value; and
acquiring a motion acceleration correction value of the unmanned aerial vehicle in a horizontal course coordinate system based on a relative displacement difference value and a first adaptive integral coefficient after the height and angle compensation of the unmanned aerial vehicle in the horizontal course coordinate system,
wherein the first adaptive integral coefficient is determined based on an adaptive filter coefficient and a first constant, and the adaptive filter coefficient is determined based on a motion acceleration original value and a motion acceleration correction value of the unmanned aerial vehicle in a horizontal heading coordinate system.
4. The positioning method according to claim 3, wherein acquiring the position of the drone in a horizontal heading coordinate system comprises:
acquiring a motion acceleration value of the unmanned aerial vehicle after complementary filtering in a horizontal course coordinate system based on the motion acceleration original value and the motion acceleration correction value of the unmanned aerial vehicle in the horizontal course coordinate system; and
and acquiring an original value of the movement speed of the unmanned aerial vehicle in the horizontal course coordinate system based on the movement acceleration value of the unmanned aerial vehicle in the horizontal course coordinate system after complementary filtering.
5. The positioning method according to claim 4, wherein obtaining the position of the drone in a horizontal heading coordinate system further comprises:
acquiring a movement speed correction value of the unmanned aerial vehicle in a horizontal course coordinate system based on a relative displacement difference value and a second adaptive integral coefficient after the height and angle compensation of the unmanned aerial vehicle in the horizontal course coordinate system; and
acquiring a motion speed value of the unmanned aerial vehicle after complementary filtering in a horizontal course coordinate system based on the motion speed original value and the motion speed correction value of the unmanned aerial vehicle in the horizontal course coordinate system,
wherein the second adaptive integral coefficient is determined based on an adaptive filter coefficient and a second constant, and the adaptive filter coefficient is determined based on a complementary filtered motion velocity value of the drone in a horizontal heading coordinate system.
6. The positioning method according to claim 5, wherein obtaining the position of the drone in a horizontal heading coordinate system further comprises:
and acquiring a relative displacement original value of the unmanned aerial vehicle in the horizontal course coordinate system based on the motion speed value of the unmanned aerial vehicle after the complementary filtering in the horizontal course coordinate system.
7. The positioning method according to claim 6, wherein obtaining the position of the drone in a horizontal heading coordinate system further comprises:
acquiring a relative displacement correction value of the unmanned aerial vehicle in a horizontal course coordinate system based on a relative displacement difference value and a third adaptive integral coefficient after the height and angle compensation of the unmanned aerial vehicle in the horizontal course coordinate system; and
acquiring a relative displacement value after complementary filtering of the unmanned aerial vehicle in a horizontal course coordinate system based on the relative displacement initial value and the relative displacement correction value of the unmanned aerial vehicle in the horizontal course coordinate system,
wherein the third adaptive integration coefficient is determined based on the adaptive filter coefficient and a third constant.
8. The positioning method according to claim 3, wherein the process of determining the adaptive filter coefficients comprises:
acquiring a motion acceleration value of the unmanned aerial vehicle after complementary filtering in a horizontal course coordinate system based on the motion acceleration original value and the motion acceleration correction value of the unmanned aerial vehicle in the horizontal course coordinate system;
acquiring an original value of the movement speed of the unmanned aerial vehicle in a horizontal course coordinate system based on the movement acceleration value of the unmanned aerial vehicle in the horizontal course coordinate system after complementary filtering;
acquiring a movement speed correction value of the unmanned aerial vehicle in a horizontal course coordinate system based on a relative displacement difference value and a second adaptive integral coefficient after the height and angle compensation of the unmanned aerial vehicle in the horizontal course coordinate system;
acquiring a motion speed value of the unmanned aerial vehicle after complementary filtering in a horizontal course coordinate system based on the motion speed original value and the motion speed correction value of the unmanned aerial vehicle in the horizontal course coordinate system; and
determining the self-adaptive filter coefficient based on the motion speed value of the unmanned aerial vehicle after complementary filtering in a horizontal heading coordinate system,
wherein the second adaptive integral coefficient is determined based on the adaptive filter coefficient and a second constant, the adaptive filter coefficient is determined based on a motion velocity value of the unmanned aerial vehicle after complementary filtering in a horizontal heading coordinate system, and an initial value of the adaptive filter coefficient is a predetermined value that depends on a data quality parameter of the optical flow sensor, a data quality threshold for measuring data quality of the optical flow sensor, and an original complementary filter coefficient set by the data quality threshold.
9. The positioning method according to claim 8, wherein the initial values of the motion speed original value, the motion speed correction value and the motion acceleration correction value of the unmanned aerial vehicle in the horizontal heading coordinate system are all zero.
10. The positioning method according to claim 8, wherein the adaptive filter coefficient is determined based on a data quality parameter of the optical flow sensor, a data quality threshold for measuring the data quality of the optical flow sensor, a raw complementary filter coefficient obtained by tuning the data quality threshold, and a complementary filtered motion velocity value of the drone under a horizontal heading coordinate system.
11. The positioning method according to claim 7, wherein the relative displacement original value and the relative displacement correction value of the unmanned aerial vehicle in the horizontal heading coordinate system are both zero.
12. A positioning apparatus for a drone loaded with an acceleration sensor, a gyroscope sensor, a height sensor, and an optical flow sensor, comprising:
a memory having computer-executable instructions stored thereon; and
one or more processors configured to execute the computer-executable instructions to implement the positioning method of any one of claims 1 to 11.
13. An unmanned aerial vehicle, comprising:
an acceleration sensor;
a gyroscope sensor;
a height sensor;
an optical flow sensor; and
the positioning apparatus for drones of claim 12.
CN202111294756.8A 2021-11-03 2021-11-03 Positioning method and device for unmanned aerial vehicle Active CN114018241B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111294756.8A CN114018241B (en) 2021-11-03 2021-11-03 Positioning method and device for unmanned aerial vehicle
TW110148001A TWI805141B (en) 2021-11-03 2021-12-21 Positioning method and device for unmanned aerial vehicles

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111294756.8A CN114018241B (en) 2021-11-03 2021-11-03 Positioning method and device for unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN114018241A true CN114018241A (en) 2022-02-08
CN114018241B CN114018241B (en) 2023-12-26

Family

ID=80060164

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111294756.8A Active CN114018241B (en) 2021-11-03 2021-11-03 Positioning method and device for unmanned aerial vehicle

Country Status (2)

Country Link
CN (1) CN114018241B (en)
TW (1) TWI805141B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115790574A (en) * 2023-02-14 2023-03-14 飞联智航(北京)科技有限公司 Unmanned aerial vehicle optical flow positioning method and device and unmanned aerial vehicle

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU118740U1 (en) * 2012-01-12 2012-07-27 Открытое акционерное общество "Концерн "Созвездие" ADAPTIVE NAVIGATION COMPLEX
CN103853156A (en) * 2014-02-07 2014-06-11 中山大学 Small four-rotor aircraft control system and method based on airborne sensor
CN104808231A (en) * 2015-03-10 2015-07-29 天津大学 Unmanned aerial vehicle positioning method based on GPS and optical flow sensor data fusion
US20150293138A1 (en) * 2012-11-07 2015-10-15 Ecole Polytechnique Federale De Lausanne (Epfl) Method to determine a direction and amplitude of a current velocity estimate of a moving device
CN105352495A (en) * 2015-11-17 2016-02-24 天津大学 Unmanned-plane horizontal-speed control method based on fusion of data of acceleration sensor and optical-flow sensor
CN106017463A (en) * 2016-05-26 2016-10-12 浙江大学 Aircraft positioning method based on positioning and sensing device
KR101769602B1 (en) * 2016-08-09 2017-08-18 아이디어주식회사 Apparatus and method of position revision for hovering using optical flow and imu and ultrasonic sensor
CN109916394A (en) * 2019-04-04 2019-06-21 山东智翼航空科技有限公司 A kind of Integrated Navigation Algorithm merging optical flow position and velocity information
CN111398522A (en) * 2020-03-24 2020-07-10 山东智翼航空科技有限公司 Indoor air quality detection system and detection method based on micro unmanned aerial vehicle

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9279683B2 (en) * 2012-03-02 2016-03-08 Moog Inc. Real-time aircraft status detection system and method
US20160349746A1 (en) * 2015-05-29 2016-12-01 Faro Technologies, Inc. Unmanned aerial vehicle having a projector and being tracked by a laser tracker
CN111857176A (en) * 2020-07-20 2020-10-30 广州狸园科技有限公司 GPS unmanned aerial vehicle control method
CN112924999B (en) * 2021-01-14 2023-08-22 华南理工大学 Unmanned aerial vehicle positioning method, system, device and medium

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU118740U1 (en) * 2012-01-12 2012-07-27 Открытое акционерное общество "Концерн "Созвездие" ADAPTIVE NAVIGATION COMPLEX
US20150293138A1 (en) * 2012-11-07 2015-10-15 Ecole Polytechnique Federale De Lausanne (Epfl) Method to determine a direction and amplitude of a current velocity estimate of a moving device
CN103853156A (en) * 2014-02-07 2014-06-11 中山大学 Small four-rotor aircraft control system and method based on airborne sensor
CN104808231A (en) * 2015-03-10 2015-07-29 天津大学 Unmanned aerial vehicle positioning method based on GPS and optical flow sensor data fusion
CN105352495A (en) * 2015-11-17 2016-02-24 天津大学 Unmanned-plane horizontal-speed control method based on fusion of data of acceleration sensor and optical-flow sensor
CN106017463A (en) * 2016-05-26 2016-10-12 浙江大学 Aircraft positioning method based on positioning and sensing device
KR101769602B1 (en) * 2016-08-09 2017-08-18 아이디어주식회사 Apparatus and method of position revision for hovering using optical flow and imu and ultrasonic sensor
CN109916394A (en) * 2019-04-04 2019-06-21 山东智翼航空科技有限公司 A kind of Integrated Navigation Algorithm merging optical flow position and velocity information
CN111398522A (en) * 2020-03-24 2020-07-10 山东智翼航空科技有限公司 Indoor air quality detection system and detection method based on micro unmanned aerial vehicle

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115790574A (en) * 2023-02-14 2023-03-14 飞联智航(北京)科技有限公司 Unmanned aerial vehicle optical flow positioning method and device and unmanned aerial vehicle

Also Published As

Publication number Publication date
TWI805141B (en) 2023-06-11
TW202319706A (en) 2023-05-16
CN114018241B (en) 2023-12-26

Similar Documents

Publication Publication Date Title
CN106959110B (en) Cloud deck attitude detection method and device
CN112066985B (en) Initialization method, device, medium and electronic equipment for combined navigation system
US11669109B2 (en) Method and apparatus for yaw fusion and aircraft
US20220155800A1 (en) Method and apparatus for yaw fusion and aircraft
CN105352495A (en) Unmanned-plane horizontal-speed control method based on fusion of data of acceleration sensor and optical-flow sensor
CN112835085B (en) Method and device for determining vehicle position
CN113183975B (en) Control method, device, equipment and storage medium for automatic driving vehicle
CN103712598A (en) Attitude determination system and method of small unmanned aerial vehicle
CN114018241B (en) Positioning method and device for unmanned aerial vehicle
US20220189318A1 (en) Aircraft sensor system synchronization
CN109143303B (en) Flight positioning method and device and fixed-wing unmanned aerial vehicle
CN100587644C (en) Integrated single loop controller for camera optical axis stable tracing
Emran et al. A cascaded approach for quadrotor's attitude estimation
CN109521785A (en) It is a kind of to clap Smart Rotor aerocraft system with oneself
KR20130079881A (en) Method and computer-readable recording medium for calibrating position of a target using a fixed target for unmanned aerial vehicle
CN115727842A (en) Rapid alignment method and system for unmanned aerial vehicle, computer equipment and storage medium
Wang et al. Attitude estimation for UAV with low-cost IMU/ADS based on adaptive-gain complementary filter
CN112414365B (en) Displacement compensation method and apparatus and velocity compensation method and apparatus
CA3081595C (en) Drone control device using model prediction control
Tehrani et al. Gyroscope offset estimation using panoramic vision-based attitude estimation and Extended Kalman Filter
CN106527464A (en) Unmanned aerial vehicle (UAV) attitude maintaining method and device
RU106971U1 (en) AUTOMOTIVE CONTROL SYSTEM FOR UNMANNED AIRCRAFT
CN111226093A (en) Information processing device, flight path generation method, program, and recording medium
CN110785722A (en) Parameter optimization method and device for mobile platform, control equipment and aircraft
US20230044834A1 (en) Micro-electromechanical inertial measurement unit

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant