CN114442133A - Unmanned aerial vehicle positioning method, device, equipment and storage medium - Google Patents
- Publication number
- CN114442133A (application CN202210123261.7A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/485—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
Abstract
The application discloses an unmanned aerial vehicle positioning method, device, equipment and storage medium. The method comprises: acquiring Global Positioning System (GPS) data and point cloud data at the current moment, wherein the point cloud data is acquired by a laser radar on the unmanned aerial vehicle device; determining node coordinates of the unmanned aerial vehicle device according to the GPS data, the node coordinates being relative coordinates of the unmanned aerial vehicle device in a first coordinate system; determining edge coordinates of the unmanned aerial vehicle device according to the point cloud data, the edge coordinates representing the pose variation of the unmanned aerial vehicle device between the current moment and the previous moment; and fusing the node coordinates and the edge coordinates by a preset graph optimization algorithm to obtain the longitude and latitude coordinates of the unmanned aerial vehicle device. This overcomes the limitation of a single sensor: on the one hand, it solves the problems of low positioning accuracy and poor robustness of GPS positioning alone; on the other hand, adding GPS to laser radar positioning avoids the error that accumulates with distance in laser radar positioning alone, improving the positioning effect of the unmanned aerial vehicle device.
Description
Technical Field
The application relates to the technical field of unmanned aerial vehicles, in particular to an unmanned aerial vehicle positioning method, device, equipment and storage medium.
Background
During the construction and maintenance of a power grid system, the power transmission lines need to be patrolled and inspected, but many transmission lines are built in environments such as valleys and forests, making manual patrol difficult. With the rapid development of unmanned aerial vehicle technology, it has become possible to use unmanned aerial vehicles for multi-angle patrol inspection of transmission lines.
At present, the positioning technology of unmanned aerial vehicles mainly relies on the Global Positioning System (GPS), but GPS is prone to multipath effects in environments such as valleys and forests, so its positioning robustness is poor. In addition, unmanned aerial vehicles rely on civil GPS devices and, limited by cost, cannot obtain higher-precision positioning: the accuracy ranges from 2 m to 5 m, which cannot meet the requirement of high-precision positioning. Therefore, improving the positioning effect of unmanned aerial vehicles, so that they can be used for multi-angle inspection of power transmission lines, is a technical problem that urgently needs to be solved.
Disclosure of Invention
The application provides an unmanned aerial vehicle positioning method, device, equipment and storage medium, which are used for solving the technical problem of poor positioning effect of the current unmanned aerial vehicle.
In order to solve the above technical problem, in a first aspect, an embodiment of the present application provides an unmanned aerial vehicle positioning method, including:
acquiring Global Positioning System (GPS) data and point cloud data at the current moment, wherein the point cloud data is acquired based on laser radar on unmanned aerial vehicle equipment;
determining node coordinates of the unmanned aerial vehicle equipment according to the GPS data, wherein the node coordinates are relative coordinates of the unmanned aerial vehicle equipment in a first coordinate system, and the first coordinate system is a coordinate system established by taking a laser radar as an origin;
determining edge coordinates of the unmanned aerial vehicle equipment according to the point cloud data, wherein the edge coordinates represent the pose variation of the unmanned aerial vehicle equipment between the current moment and the previous moment;
and fusing the node coordinates and the edge coordinates by using a preset graph optimization algorithm to obtain the longitude and latitude coordinates of the unmanned aerial vehicle equipment.
In this embodiment, the GPS data and point cloud data at the current moment are acquired, the point cloud data being collected by the laser radar on the unmanned aerial vehicle device, which overcomes the limitation of a single sensor: on the one hand, the problems of low positioning accuracy and poor robustness of GPS positioning alone are solved; on the other hand, adding GPS to laser radar positioning avoids the error that accumulates with distance in laser radar positioning alone. Node coordinates of the unmanned aerial vehicle device are determined according to the GPS data, the node coordinates being the relative coordinates of the device in the first coordinate system, and edge coordinates are determined according to the point cloud data, the edge coordinates representing the pose variation of the device between the current moment and the previous moment. Finally, the node coordinates and edge coordinates are fused by a preset graph optimization algorithm to obtain the longitude and latitude coordinates of the unmanned aerial vehicle device. The method thus combines the advantages of GPS positioning and laser radar positioning, and uses the graph optimization algorithm to fuse states, so that the obtained longitude and latitude coordinates are more accurate and the positioning effect of the unmanned aerial vehicle device is improved.
In an embodiment, determining edge coordinates of the drone device from the point cloud data includes:
carrying out noise filtering on the point cloud data to obtain target point cloud data;
extracting first point cloud characteristics of the target point cloud data at the current moment, wherein the point cloud characteristics comprise line characteristics and surface characteristics;
and matching the first point cloud characteristics with the second point cloud characteristics at the previous moment to obtain the edge coordinates of the unmanned aerial vehicle equipment through decoupling, wherein the edge coordinates comprise a target rotation matrix and a target translation vector.
In an embodiment, performing noise filtering on the point cloud data to obtain the target point cloud data includes:
clustering point cloud data based on a K-means clustering algorithm to obtain clustered point cloud data;
and carrying out voxel filtering on the clustered point cloud data to obtain target point cloud data.
In an embodiment, matching the first point cloud feature with the second point cloud feature at the previous time to obtain the edge coordinates of the drone device through decoupling includes:
matching the first point cloud characteristics with the second point cloud characteristics at the previous moment, and determining a rotation matrix and a translation vector of the unmanned aerial vehicle equipment;
and correcting the rotation matrix and the translation vector by using a preset optimization algorithm until the target error function is smaller than a preset value, so as to obtain a target rotation matrix and a target translation vector.
In one embodiment, determining node coordinates of the drone device from GPS data includes:
determining longitude and latitude data of the unmanned aerial vehicle equipment according to the GPS data;
converting the longitude and latitude data into position coordinates under a second coordinate system, wherein the second coordinate system is a geocentric coordinate system;
and carrying out coordinate conversion on the position coordinates to obtain the node coordinates of the unmanned aerial vehicle equipment in the first coordinate system.
In an embodiment, fusing the node coordinates and the edge coordinates by using a preset graph optimization algorithm to obtain the longitude and latitude coordinates of the unmanned aerial vehicle device includes:
determining the optimal state quantity of the unmanned aerial vehicle equipment by taking the node coordinates as node constraints of a graph optimization algorithm and taking the edge coordinates as edge constraints of the graph optimization algorithm;
and carrying out coordinate conversion on the position coordinate corresponding to the optimal state quantity to obtain the longitude and latitude coordinate of the unmanned aerial vehicle equipment.
In an embodiment, determining the optimal state quantity of the unmanned aerial vehicle device by using the node coordinates as the node constraints of the graph optimization algorithm and using the edge coordinates as the edge constraints of the graph optimization algorithm includes:
determining the latest state quantity of the unmanned aerial vehicle equipment by using a graph optimization algorithm according to node constraint and edge constraint;
and performing sliding window operation on the latest state quantity and the historical state quantity of the unmanned aerial vehicle equipment by using a least square optimization algorithm to obtain the optimal state quantity of the unmanned aerial vehicle equipment.
In a second aspect, an embodiment of the present application provides an unmanned aerial vehicle positioning device, including:
the system comprises an acquisition module, a data acquisition module and a data processing module, wherein the acquisition module is used for acquiring Global Positioning System (GPS) data and point cloud data at the current moment, and the point cloud data is acquired based on laser radar on unmanned aerial vehicle equipment;
the first determining module is used for determining node coordinates of the unmanned aerial vehicle equipment according to the GPS data, wherein the node coordinates are relative coordinates of the unmanned aerial vehicle equipment in a first coordinate system, and the first coordinate system is a coordinate system established by taking a laser radar as an origin;
the second determining module is used for determining the edge coordinates of the unmanned aerial vehicle equipment according to the point cloud data, wherein the edge coordinates represent the pose variation of the unmanned aerial vehicle equipment between the current moment and the previous moment;
and the fusion module is used for fusing the node coordinates and the edge coordinates by using a preset graph optimization algorithm to obtain the longitude and latitude coordinates of the unmanned aerial vehicle equipment.
In a third aspect, an embodiment of the present application provides a drone device, including a processor and a memory, where the memory is used to store a computer program, and the computer program, when executed by the processor, implements the drone positioning method according to the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the method for positioning a drone according to the first aspect is implemented.
Please refer to the relevant description of the first aspect for the beneficial effects of the second to fourth aspects, which are not repeated herein.
Drawings
Fig. 1 is a schematic flow chart of a positioning method for an unmanned aerial vehicle according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an unmanned aerial vehicle positioning system provided in an embodiment of the present application;
fig. 3 is a schematic view of a window formed by edge coordinates and node coordinates according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an unmanned aerial vehicle positioning device provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of the unmanned aerial vehicle device provided by the embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
As described in the related art, the conventional positioning technology of unmanned aerial vehicles mainly relies on GPS (Global Positioning System) positioning. The specific positioning principle of GPS is as follows: since the positions of the satellites are accurately known, the distances from 3 satellites to the receiver can be combined, using the distance formula in three-dimensional coordinates, into 3 equations and solved for the position (X, Y, Z) of the observation point. Considering the offset between the satellite clock and the receiver clock, there are actually 4 unknowns (X, Y, Z and the clock offset), so a 4th satellite must be introduced to form 4 equations. Solving them yields the longitude, latitude and elevation of the observation point, and hence the specific position information of the unmanned aerial vehicle.
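The 4-satellite, 4-unknown system described above can be sketched numerically. The following is an illustrative Gauss-Newton solver, not part of the patent; the satellite positions and pseudoranges are synthetic, and the clock offset is expressed in metres:

```python
import numpy as np

def solve_position(sats, rhos, iters=10):
    """Gauss-Newton solve of ||p - s_i|| + b = rho_i for the receiver
    position p = (X, Y, Z) and clock offset b (both in metres)."""
    x = np.zeros(4)  # initial guess: p at the origin, zero clock offset
    for _ in range(iters):
        diffs = x[:3] - sats                       # (n, 3)
        dists = np.linalg.norm(diffs, axis=1)      # predicted ranges
        res = dists + x[3] - rhos                  # pseudorange residuals
        J = np.hstack([diffs / dists[:, None], np.ones((len(sats), 1))])
        x -= np.linalg.solve(J.T @ J, J.T @ res)   # normal-equation step
    return x

# Synthetic scenario: 4 satellites ~20,000 km away, receiver near the
# origin, with a 10 m clock offset folded into every pseudorange.
sats = np.array([[2.0e7, 0.0, 0.0],
                 [0.0, 2.0e7, 0.0],
                 [0.0, 0.0, 2.0e7],
                 [1.5e7, 1.5e7, 1.0e7]])
true_p, true_b = np.array([1000.0, 2000.0, 3000.0]), 10.0
rhos = np.linalg.norm(sats - true_p, axis=1) + true_b
est = solve_position(sats, rhos)   # est[:3] ~ true_p, est[3] ~ true_b
```

With noiseless pseudoranges the iteration recovers the position and clock offset essentially exactly; real receivers solve the same system with more satellites and a weighted least-squares step.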
However, the positioning effect of GPS is poor in some environments: for example, GPS is prone to multipath effects in environments such as valleys and forests, so positioning robustness is poor. Moreover, unmanned aerial vehicles rely on civil GPS devices and, limited by cost, cannot obtain higher-precision positioning, with an accuracy range of 2 m to 5 m that cannot meet the requirement of high-precision positioning.
Therefore, the embodiment of the application provides an unmanned aerial vehicle positioning method. Global Positioning System (GPS) data and point cloud data at the current moment are acquired, the point cloud data being collected by the laser radar on the unmanned aerial vehicle device, which overcomes the limitation of a single sensor: on the one hand, the problems of low positioning accuracy and poor robustness of GPS positioning alone are solved; on the other hand, adding GPS to laser radar positioning avoids the error that accumulates with distance in laser radar positioning alone. Node coordinates of the unmanned aerial vehicle device are determined according to the GPS data, the node coordinates being the relative coordinates of the device in a first coordinate system, and edge coordinates are determined according to the point cloud data, the edge coordinates representing the pose variation of the device between the current moment and the previous moment. Finally, the node coordinates and edge coordinates are fused by a preset graph optimization algorithm to obtain the longitude and latitude coordinates of the unmanned aerial vehicle device, thereby combining the advantages of GPS positioning and laser radar positioning; the graph optimization algorithm realizes state fusion, so that the obtained longitude and latitude coordinates are more accurate and the positioning effect of the unmanned aerial vehicle device is improved.
Referring to fig. 1, fig. 1 is a schematic flow chart of an unmanned aerial vehicle positioning method provided in an embodiment of the present application. The method can be applied to unmanned aerial vehicle devices, including but not limited to patrol/surveillance, agricultural, meteorological, exploration, and surveying and mapping unmanned aerial vehicles.
Fig. 2 shows a schematic structural diagram of an unmanned aerial vehicle positioning system provided in an embodiment of the present application. The system comprises a satellite acquisition layer, a longitude and latitude resolving layer, a coordinate system conversion layer, a point cloud acquisition layer, a point cloud filter layer, a point cloud calibration layer, a feature extraction layer, a point cloud matching layer, a graph optimization layer, a coordinate system inversion layer and other system framework layers. It should be noted that the system architecture shown in fig. 2 is not a limitation of the system, and in other embodiments, more or fewer components may be included.
As shown in fig. 1, the positioning method of the unmanned aerial vehicle of the present embodiment includes steps S101 to S104, and the following steps are detailed as follows in conjunction with the system structure of fig. 2:
step S101, Global Positioning System (GPS) data and point cloud data at the current moment are obtained, and the point cloud data are acquired based on laser radar on unmanned aerial vehicle equipment.
In this step, the satellite information acquisition layer is responsible for receiving the raw GPS satellite signals, i.e. the GPS data, while the point cloud acquisition layer collects and generates point cloud data of the environment around the unmanned aerial vehicle device using the airborne mechanical laser radar; the point cloud data may take the form of three-dimensional coordinate points (X, Y, Z). The point cloud acquisition layer performs data acquisition and data transmission, providing raw point cloud data for the subsequent processing layers.
Step S102, determining node coordinates of the unmanned aerial vehicle equipment according to the GPS data, wherein the node coordinates are relative coordinates of the unmanned aerial vehicle equipment in a first coordinate system, and the first coordinate system is a coordinate system established by taking the laser radar as an origin.
In this step, the GPS data in the latitude and longitude coordinate system is converted to the node coordinates in the first coordinate system by performing data conversion on the GPS data.
Optionally, determining longitude and latitude data of the unmanned aerial vehicle device according to the GPS data;
converting the longitude and latitude data into position coordinates under a second coordinate system, wherein the second coordinate system is a geocentric coordinate system;
and carrying out coordinate conversion on the position coordinates to obtain the node coordinates of the unmanned aerial vehicle equipment in the first coordinate system.
In the optional embodiment, the longitude and latitude resolving layer calculates the longitude and latitude information of the unmanned aerial vehicle equipment according to the received signals and provides initial information for the coordinate system conversion layer; and the coordinate system conversion layer is responsible for converting the longitude and latitude information into coordinates under a geocentric coordinate system and finally converting the coordinates into relative coordinates under the coordinate system where the laser radar is located, namely node coordinates.
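The conversion chain just described (longitude and latitude, to the geocentric coordinate system, to the lidar-relative frame) can be sketched as follows. This is a generic WGS-84 sketch rather than the patent's implementation, and the lidar-centred first coordinate system is approximated here by an East-North-Up frame at a reference point:

```python
import numpy as np

# WGS-84 ellipsoid constants
A = 6378137.0                                      # semi-major axis (m)
F = 1.0 / 298.257223563                            # flattening
E2 = F * (2.0 - F)                                 # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Latitude/longitude/height to geocentric (ECEF) coordinates."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    n = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)   # prime-vertical radius
    return np.array([(n + h) * np.cos(lat) * np.cos(lon),
                     (n + h) * np.cos(lat) * np.sin(lon),
                     (n * (1.0 - E2) + h) * np.sin(lat)])

def ecef_to_enu(p, ref_lat_deg, ref_lon_deg, ref_h):
    """ECEF point to local East-North-Up coordinates at a reference origin
    (standing in for the lidar-centred first coordinate system)."""
    lat, lon = np.radians(ref_lat_deg), np.radians(ref_lon_deg)
    ref = geodetic_to_ecef(ref_lat_deg, ref_lon_deg, ref_h)
    r = np.array([[-np.sin(lon),                np.cos(lon),               0.0],
                  [-np.sin(lat) * np.cos(lon), -np.sin(lat) * np.sin(lon), np.cos(lat)],
                  [ np.cos(lat) * np.cos(lon),  np.cos(lat) * np.sin(lon), np.sin(lat)]])
    return r @ (p - ref)

# Sanity checks: the equator/prime-meridian point lies on the X axis, and a
# point 0.001 deg east of that reference is ~111 m east in the local frame.
p0 = geodetic_to_ecef(0.0, 0.0, 0.0)
enu = ecef_to_enu(geodetic_to_ecef(0.0, 0.001, 0.0), 0.0, 0.0, 0.0)
```

The coordinate system inversion layer described later performs the same chain in reverse to recover longitude and latitude from the fused local coordinates.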
Step S103, determining edge coordinates of the unmanned aerial vehicle equipment according to the point cloud data, wherein the edge coordinates represent the pose variation of the unmanned aerial vehicle equipment between the current moment and the previous moment.
In this step, the point cloud data is filtered by the point cloud filtering layer, the filtered point cloud data is calibrated by the point cloud calibration layer, the point cloud features are extracted by the feature extraction layer, and finally the rotation matrix R and translation vector T of the unmanned aerial vehicle device, i.e. the edge coordinates, are decoupled by the point cloud matching layer through feature matching.
And step S104, fusing the node coordinates and the edge coordinates by using a preset graph optimization algorithm to obtain longitude and latitude coordinates of the unmanned aerial vehicle equipment.
In this step, the graph optimization layer uses a sliding-window optimization method to fuse the pose change information between two frames of point clouds output by the point cloud matching layer (i.e. the edge coordinates) with the relative pose output by the coordinate conversion layer from the GPS (i.e. the node coordinates), and simultaneously performs least square optimization on a plurality of historical state quantities to solve the state quantities of a plurality of poses, which are finally converted into longitude and latitude coordinates.
In an embodiment, on the basis of the embodiment shown in fig. 1, the step S103 includes:
noise filtering is carried out on the point cloud data to obtain target point cloud data;
extracting first point cloud characteristics of the target point cloud data at the current moment, wherein the point cloud characteristics comprise line characteristics and surface characteristics;
and matching the first point cloud characteristics with the second point cloud characteristics at the previous moment to obtain the edge coordinates of the unmanned aerial vehicle equipment through decoupling, wherein the edge coordinates comprise a target rotation matrix and a target translation vector.
In this embodiment, illustratively, during the k-th run, a point cloud set PC0(k) is acquired over one scan cycle and calibrated against the navigation pose sequence T0(k) of that cycle to form the scan-time point cloud set PC1(k), so that all points are calibrated to the same time instant. A point cloud set PC2(k) is then formed by K-means clustering denoising, after which the point cloud line and surface features Q(k) are extracted and matched with the line and surface features Q(k-1) of the (k-1)-th run. An error function X(k) is constructed with the rotation matrix R and translation vector T as optimization variables; while the absolute value of X(k) is not less than m (m being a preset precision threshold), the rotation matrix R(k) and translation vector T(k) are corrected by the LM optimization method, and the iteration continues until the condition is met and R(k) and T(k) are output, thereby obtaining the pose between the two nodes.
Optionally, the noise filtering the point cloud data to obtain target point cloud data includes:
clustering the point cloud data based on a K-means clustering algorithm to obtain clustered point cloud data;
and carrying out voxel filtering on the clustered point cloud data to obtain the target point cloud data.
In this optional embodiment, the point cloud filtering layer applies the K-means clustering method to the output of the point cloud acquisition layer to retain the effective point cloud and remove small irrelevant noise clusters, and then applies voxel filtering to reduce the number of points and the algorithm complexity, improving real-time performance.
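A minimal sketch of this denoising stage is given below, assuming a toy scene with one dense structure cluster and a handful of distant noise points; the value of K, the voxel size, and the center initialization are illustrative choices, not values from the patent:

```python
import numpy as np

def kmeans(points, init_centers, iters=20):
    """Plain Lloyd's K-means; returns a cluster label per point."""
    centers = init_centers.astype(float).copy()
    labels = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)                  # nearest-center assignment
        for j in range(len(centers)):
            if (labels == j).any():
                centers[j] = points[labels == j].mean(axis=0)
    return labels

def voxel_filter(points, size):
    """Replace all points in each size^3 voxel by their centroid."""
    buckets = {}
    for p in points:
        buckets.setdefault(tuple(np.floor(p / size).astype(int)), []).append(p)
    return np.array([np.mean(b, axis=0) for b in buckets.values()])

rng = np.random.default_rng(0)
structure = rng.normal(0.0, 0.5, (200, 3))         # dense "real" returns
noise = rng.normal(0.0, 0.5, (5, 3)) + 50.0        # small far-away noise blob
pts = np.vstack([structure, noise])

labels = kmeans(pts, init_centers=pts[[0, -1]])    # seed one center per blob
kept = pts[labels == np.bincount(labels).argmax()] # keep the large cluster
filtered = voxel_filter(kept, size=0.5)            # then voxel-downsample
```

Dropping the small cluster removes the noise blob, and the voxel step shrinks the remaining cloud, which is what reduces the downstream matching cost.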
Optionally, the matching the first point cloud feature and the second point cloud feature at the previous time to obtain the edge coordinate of the drone device through decoupling includes:
matching the first point cloud feature with a second point cloud feature at the previous moment, and determining a rotation matrix and a translation vector of the unmanned aerial vehicle device;
and correcting the rotation matrix and the translation vector by using a preset optimization algorithm until a target error function is smaller than a preset value, so as to obtain the target rotation matrix and the target translation vector.
In this optional embodiment, the point cloud calibration layer converts the point cloud image of each scanning period to a common time instant by a prediction method, based on the raw point cloud data and the matching results of the two previous frames, and inputs the processed point cloud image to the feature extraction layer. The feature extraction layer extracts line features and surface features of the point cloud data according to the curvature of the point cloud image, and stores the feature sequences corresponding to different moments for matching. The point cloud matching layer matches the features of the same position according to the feature sequences between the two moments and, based on this feature matching, decouples the rotation matrix R and translation vector T of the unmanned aerial vehicle device using the LM (Levenberg-Marquardt) nonlinear optimization method; the processed result is output to the positioning fusion layer.
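The patent's matching layer recovers R and T by LM optimization over line and surface feature residuals. As a simplified stand-in (not the patent's algorithm), the sketch below aligns already-matched point pairs with the closed-form SVD (Kabsch) solution, which minimizes the same kind of rigid-registration error:

```python
import numpy as np

def align(P, Q):
    """Rigid R, t minimizing sum_i ||R @ P[i] + t - Q[i]||^2 over
    matched point pairs, via the closed-form SVD (Kabsch) solution."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                             # proper rotation, det = +1
    t = cq - R @ cp
    return R, t

rng = np.random.default_rng(2)
P = rng.normal(0.0, 1.0, (50, 3))                  # matched features, time k-1
ang = 0.3                                          # ground-truth yaw (rad)
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0,          0.0,         1.0]])
t_true = np.array([1.0, 2.0, 3.0])
Q = P @ R_true.T + t_true                          # same features, time k

R, t = align(P, Q)
err = np.linalg.norm(P @ R.T + t - Q, axis=1).max()  # registration residual
```

In the iterative LM setting of the patent, this alignment would be one inner solve, repeated until the error function drops below the preset threshold m.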
In an embodiment, based on the embodiment shown in fig. 1, the step S104 includes:
determining the optimal state quantity of the unmanned aerial vehicle equipment by taking the node coordinates as node constraints of a graph optimization algorithm and taking the edge coordinates as edge constraints of the graph optimization algorithm;
and performing coordinate conversion on the position coordinate corresponding to the optimal state quantity to obtain the longitude and latitude coordinates of the unmanned aerial vehicle equipment.
In this embodiment, the graph optimization layer uses a sliding-window optimization method to fuse the pose transformation information between two frames of point clouds output by the point cloud matching layer with the relative pose output by the coordinate conversion layer from the GPS, and performs least square optimization on a plurality of historical state quantities to solve the state quantities of a plurality of poses. The coordinate system inversion layer then converts the fused coordinates into coordinates under the geocentric coordinate system and finally into longitude and latitude coordinates, which are output as the final result.
Optionally, the determining the optimal state quantity of the drone device by using the node coordinates as a node constraint of a graph optimization algorithm and using the edge coordinates as an edge constraint of the graph optimization algorithm includes:
determining the latest state quantity of the unmanned aerial vehicle equipment according to the node constraint and the edge constraint by using the graph optimization algorithm;
and performing sliding window operation on the latest state quantity and the historical state quantity of the unmanned aerial vehicle equipment by using a least square optimization algorithm to obtain the optimal state quantity of the unmanned aerial vehicle equipment.
In this optional embodiment, as shown in the schematic diagram of a window formed by edge coordinates and node coordinates in fig. 3, the GPS-calculated values (i.e. the node coordinates) serve as node constraints and the laser radar-calculated values (i.e. the edge coordinates) serve as edge constraints between nodes, constructing 6 node constraints and 5 edge constraints. In a sliding-window manner, the latest calculated quantities are continually added and stale state quantities discarded, and iterative optimization by the LM method finally yields the optimal solution of the multiple state quantities, from which the pose is output.
The specific iteration steps are as follows: first, an equivalent linear equation is constructed and solved to obtain the increment of the optimization variable; the calculated increment is then superimposed on the current optimization variable to trigger the next iteration of the loop. When the number of iterations reaches the upper limit, or the objective function evaluated at the current optimization variable is smaller than a certain threshold, the optimization is considered to have converged, and the value of the optimization variable at that point is the solved value.
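The iteration described above can be sketched as a minimal Levenberg-Marquardt loop. This is an illustrative sketch only: the function names, the fixed damping constant, and the toy problem are assumptions for illustration, not part of the disclosure (a practical implementation would also adapt the damping factor).

```python
import numpy as np

def levenberg_marquardt(residual_fn, jacobian_fn, x0, max_iters=50, tol=1e-8, lam=1e-3):
    """Minimal LM loop following the steps in the text: build the damped
    normal equation, solve for the increment, superimpose it on the
    current variable, and stop at the iteration cap or objective threshold."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        r = residual_fn(x)                  # stacked residuals (node + edge constraints)
        J = jacobian_fn(x)                  # Jacobian of the residuals
        H = J.T @ J + lam * np.eye(x.size)  # the "equivalent linear equation" (damping fixed for brevity)
        dx = np.linalg.solve(H, -J.T @ r)   # increment of the optimization variable
        if 0.5 * float(r @ r) < tol:        # objective below threshold -> converged
            break
        x = x + dx                          # superimpose the increment, next iteration
    return x

# Toy usage: minimize (x - 3)^2, so the solved value approaches 3
sol = levenberg_marquardt(lambda x: x - 3.0, lambda x: np.eye(1), np.zeros(1))
```

In the disclosed method the residuals would stack the GPS node constraints and the lidar edge constraints inside the sliding window; here a scalar toy problem stands in for them.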
Corresponding to the above method embodiment, an embodiment of the present application further provides an unmanned aerial vehicle positioning apparatus for executing the unmanned aerial vehicle positioning method and realizing the corresponding functions and technical effects. Referring to fig. 4, fig. 4 shows a block diagram of an unmanned aerial vehicle positioning apparatus according to an embodiment of the present application. For convenience of explanation, only the parts relevant to this embodiment are shown. The unmanned aerial vehicle positioning apparatus provided in this embodiment of the application includes:
the acquisition module 401 is configured to acquire Global Positioning System (GPS) data and point cloud data at the current time, where the point cloud data is collected by a laser radar on an unmanned aerial vehicle device;
a first determining module 402, configured to determine a node coordinate of the unmanned aerial vehicle device according to the GPS data, where the node coordinate is a relative coordinate of the unmanned aerial vehicle device in a first coordinate system, and the first coordinate system is a coordinate system constructed with the laser radar as an origin;
a second determining module 403, configured to determine, according to the point cloud data, an edge coordinate of the drone device, where the edge coordinate represents a pose variation of the drone device between a current time and a previous time;
and a fusion module 404, configured to fuse the node coordinates and the edge coordinates by using a preset graph optimization algorithm to obtain longitude and latitude coordinates of the unmanned aerial vehicle device.
In an embodiment, the second determining module 403 includes:
the filtering unit is used for carrying out noise filtering on the point cloud data to obtain target point cloud data;
the extraction unit is used for extracting a first point cloud feature corresponding to the target point cloud data at the current moment, wherein the point cloud feature comprises a line feature and a surface feature;
and the matching unit is used for matching the first point cloud characteristics with the second point cloud characteristics at the previous moment so as to obtain the edge coordinates of the unmanned aerial vehicle equipment through decoupling, wherein the edge coordinates comprise a target rotation matrix and a target translation vector.
In one embodiment, the filtering unit includes:
the clustering subunit is used for clustering the point cloud data based on a K-means clustering algorithm to obtain clustered point cloud data;
and the filtering subunit is used for carrying out voxel filtering on the clustered point cloud data to obtain the target point cloud data.
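The voxel filtering performed by the filtering subunit can be sketched as follows. This is a minimal illustration under assumed names and an assumed voxel size; the preceding K-means-based clustering step is not shown.

```python
import numpy as np

def voxel_filter(points, voxel_size=0.2):
    """Voxel-grid downsampling sketch: bucket points (N x 3) into cubic
    voxels and replace each occupied voxel by the centroid of its points."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel index and average each group.
    uniq, inverse = np.unique(keys, axis=0, return_inverse=True)
    sums = np.zeros((uniq.shape[0], 3))
    counts = np.zeros(uniq.shape[0])
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]

cloud = np.array([[0.01, 0.02, 0.0], [0.03, 0.01, 0.02],  # fall in the same voxel
                  [1.0, 1.0, 1.0]])                        # a different voxel
filtered = voxel_filter(cloud, voxel_size=0.2)             # 3 points reduced to 2
```

Downsampling this way bounds the point density before feature extraction, which keeps the per-frame matching cost roughly constant regardless of scene clutter.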
In one embodiment, the matching unit includes:
the matching subunit is used for matching the first point cloud feature with a second point cloud feature at the previous moment, and determining a rotation matrix and a translation vector of the unmanned aerial vehicle device;
and the correcting subunit is used for correcting the rotation matrix and the translation vector by using a preset optimization algorithm until a target error function is smaller than a preset value, so as to obtain the target rotation matrix and the target translation vector.
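The matching subunit's recovery of a rotation matrix and translation vector from matched features can be illustrated with a closed-form SVD (Kabsch) alignment. Note this is a generic sketch assuming known point correspondences, not the disclosed iterative correction scheme; in practice it would supply the initial estimate that the correcting subunit then refines.

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Closed-form (Kabsch/SVD) estimate of rotation R and translation t
    aligning matched points so that dst ~= R @ src + t."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance of centered clouds
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Usage: recover a known rotation about the z-axis plus a translation
rng = np.random.default_rng(0)
src = rng.normal(size=(10, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, -2.0, 0.5])
dst = src @ R_true.T + t_true
R_est, t_est = estimate_rigid_transform(src, dst)
```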
In one embodiment, the first determining module 402 includes:
the first determining unit is used for determining longitude and latitude data of the unmanned aerial vehicle equipment according to the GPS data;
the first conversion unit is used for converting the longitude and latitude data into position coordinates under a second coordinate system, and the second coordinate system is a geocentric coordinate system;
and the second conversion unit is used for carrying out coordinate conversion on the position coordinates to obtain the node coordinates of the unmanned aerial vehicle equipment in the first coordinate system.
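The first conversion unit's transformation of longitude and latitude into geocentric coordinates can be sketched with the standard WGS-84 geodetic-to-ECEF formulas (an illustrative sketch; the constants are the standard WGS-84 values, and the subsequent rotation into the lidar-origin frame performed by the second conversion unit is not shown).

```python
import math

# WGS-84 ellipsoid constants
A = 6378137.0             # semi-major axis (m)
F = 1 / 298.257223563     # flattening
E2 = F * (2 - F)          # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert latitude/longitude/altitude to geocentric (ECEF) coordinates."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)   # prime-vertical radius of curvature
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + alt_m) * math.sin(lat)
    return x, y, z

# A point on the equator at the prime meridian lies one semi-major axis from the center
x, y, z = geodetic_to_ecef(0.0, 0.0, 0.0)
```

The local node coordinates would then typically be obtained by subtracting the ECEF coordinates of the lidar origin and rotating into a local tangent (e.g. ENU) frame; the inverse of the same chain yields the final latitude/longitude output.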
In one embodiment, the fusion module includes:
the second determining unit is used for determining the optimal state quantity of the unmanned aerial vehicle equipment by taking the node coordinates as node constraints of a graph optimization algorithm and taking the edge coordinates as edge constraints of the graph optimization algorithm;
and the third conversion unit is used for carrying out coordinate conversion on the position coordinate corresponding to the optimal state quantity to obtain the longitude and latitude coordinates of the unmanned aerial vehicle equipment.
In an embodiment, the second determining unit includes:
a determining subunit, configured to determine, by using the graph optimization algorithm, a latest state quantity of the unmanned aerial vehicle device according to the node constraint and the edge constraint;
and the sliding window subunit is used for performing sliding window operation on the latest state quantity and the historical state quantity of the unmanned aerial vehicle equipment by using a least square optimization algorithm to obtain the optimal state quantity of the unmanned aerial vehicle equipment.
The above unmanned aerial vehicle positioning apparatus can implement the unmanned aerial vehicle positioning method of the above method embodiment. The alternatives in the above method embodiment are also applicable to this embodiment and will not be described in detail here; for the rest, reference may be made to the contents of the above method embodiment.
Fig. 5 is a schematic structural diagram of an unmanned aerial vehicle device provided in an embodiment of the present application. As shown in fig. 5, the drone device 5 of this embodiment includes: at least one processor 50 (only one is shown in fig. 5), a memory 51, a computer program 52 stored in the memory 51 and executable on the at least one processor 50, a GPS 53 and a laser radar 54. When the computer program 52 is executed by the processor 50, the steps of any of the above method embodiments are implemented.
The drone device 5 may be an inspection drone, an agricultural drone, a meteorological drone, an exploration drone, a surveying drone, or another type of drone device. The drone device may include, but is not limited to, the processor 50, the memory 51, the GPS 53, and the laser radar 54. Those skilled in the art will appreciate that fig. 5 is merely an example of the drone device 5 and does not constitute a limitation on it; the device may include more or fewer components than shown, combine certain components, or use different components, such as a network access device.
The processor 50 may be a Central Processing Unit (CPU), and may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 51 may in some embodiments be an internal storage unit of the drone device 5, such as a hard disk or memory of the drone device 5. In other embodiments, the memory 51 may also be an external storage device of the drone device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the drone device 5. Further, the memory 51 may include both an internal storage unit and an external storage device of the drone device 5. The memory 51 is used for storing an operating system, an application program, a boot loader, data, and other programs, such as the program code of the computer program. The memory 51 may also be used to temporarily store data that has been output or is to be output.
In addition, an embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in any of the method embodiments described above.
An embodiment of the present application further provides a computer program product which, when run on an unmanned aerial vehicle device, enables the unmanned aerial vehicle device to implement the steps in the above method embodiments.
In the several embodiments provided in the present application, it should be understood that each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
If the functions are implemented in the form of software functional modules and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a drone device to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
The above embodiments are only used to further explain the objects, technical solutions and advantages of the present application in detail; they are merely examples of the present application and are not intended to limit its scope of protection. Any modifications, equivalents, improvements and the like made within the spirit and principle of the present application shall be included within the scope of protection of the present application.
Claims (10)
1. An unmanned aerial vehicle positioning method is characterized by comprising the following steps:
acquiring Global Positioning System (GPS) data and point cloud data at the current moment, wherein the point cloud data is collected by a laser radar on unmanned aerial vehicle equipment;
determining node coordinates of the unmanned aerial vehicle equipment according to the GPS data, wherein the node coordinates are relative coordinates of the unmanned aerial vehicle equipment in a first coordinate system, and the first coordinate system is a coordinate system established by taking the laser radar as an origin;
determining edge coordinates of the unmanned aerial vehicle device according to the point cloud data, wherein the edge coordinates represent the pose variation of the unmanned aerial vehicle device between the current moment and the previous moment;
and fusing the node coordinates and the edge coordinates by using a preset graph optimization algorithm to obtain longitude and latitude coordinates of the unmanned aerial vehicle equipment.
2. The drone positioning method of claim 1, wherein the determining edge coordinates of the drone device from the point cloud data comprises:
noise filtering is carried out on the point cloud data to obtain target point cloud data;
extracting first point cloud characteristics of the target point cloud data at the current moment, wherein the point cloud characteristics comprise line characteristics and surface characteristics;
and matching the first point cloud characteristics with the second point cloud characteristics at the previous moment to obtain the edge coordinates of the unmanned aerial vehicle equipment through decoupling, wherein the edge coordinates comprise a target rotation matrix and a target translation vector.
3. The unmanned aerial vehicle positioning method of claim 2, wherein the noise filtering the point cloud data to obtain target point cloud data comprises:
clustering the point cloud data based on a K-means clustering algorithm to obtain clustered point cloud data;
and carrying out voxel filtering on the clustered point cloud data to obtain the target point cloud data.
4. The unmanned aerial vehicle positioning method of claim 2, wherein the matching the first point cloud feature with a second point cloud feature at a previous time to decouple edge coordinates of the unmanned aerial vehicle device comprises:
matching the first point cloud feature with a second point cloud feature at the previous moment, and determining a rotation matrix and a translation vector of the unmanned aerial vehicle device;
and correcting the rotation matrix and the translation vector by using a preset optimization algorithm until a target error function is smaller than a preset value, so as to obtain the target rotation matrix and the target translation vector.
5. The unmanned aerial vehicle positioning method of any one of claims 1 to 4, wherein the determining node coordinates of the unmanned aerial vehicle device according to the GPS data comprises:
determining longitude and latitude data of the unmanned aerial vehicle equipment according to the GPS data;
converting the longitude and latitude data into position coordinates under a second coordinate system, wherein the second coordinate system is a geocentric coordinate system;
and carrying out coordinate conversion on the position coordinates to obtain the node coordinates of the unmanned aerial vehicle equipment in the first coordinate system.
6. The unmanned aerial vehicle positioning method of any one of claims 1 to 4, wherein the fusing the node coordinates and the edge coordinates using a preset graph optimization algorithm to obtain longitude and latitude coordinates of the unmanned aerial vehicle device comprises:
determining the optimal state quantity of the unmanned aerial vehicle equipment by taking the node coordinates as node constraints of a graph optimization algorithm and taking the edge coordinates as edge constraints of the graph optimization algorithm;
and performing coordinate conversion on the position coordinate corresponding to the optimal state quantity to obtain the longitude and latitude coordinates of the unmanned aerial vehicle equipment.
7. The unmanned aerial vehicle positioning method of claim 6, wherein determining the optimal state quantity of the unmanned aerial vehicle device using the node coordinates as a node constraint of a graph optimization algorithm and the edge coordinates as an edge constraint of the graph optimization algorithm comprises:
determining the latest state quantity of the unmanned aerial vehicle equipment according to the node constraint and the edge constraint by using the graph optimization algorithm;
and performing sliding window operation on the latest state quantity and the historical state quantity of the unmanned aerial vehicle equipment by using a least square optimization algorithm to obtain the optimal state quantity of the unmanned aerial vehicle equipment.
8. An unmanned aerial vehicle positioner, its characterized in that includes:
the acquisition module is used for acquiring Global Positioning System (GPS) data and point cloud data at the current moment, wherein the point cloud data is collected by a laser radar on unmanned aerial vehicle equipment;
the first determining module is used for determining a node coordinate of the unmanned aerial vehicle equipment according to the GPS data, wherein the node coordinate is a relative coordinate of the unmanned aerial vehicle equipment in a first coordinate system, and the first coordinate system is a coordinate system constructed by taking the laser radar as an origin;
the second determining module is used for determining the edge coordinate of the unmanned aerial vehicle equipment according to the point cloud data, wherein the edge coordinate represents the pose variation of the unmanned aerial vehicle equipment between the current moment and the previous moment;
and the fusion module is used for fusing the node coordinates and the edge coordinates by using a preset graph optimization algorithm to obtain longitude and latitude coordinates of the unmanned aerial vehicle equipment.
9. A drone device, characterized by comprising a processor and a memory for storing a computer program which, when executed by the processor, implements the drone positioning method of any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that it stores a computer program which, when executed by a processor, implements the drone positioning method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210123261.7A CN114442133A (en) | 2022-02-09 | 2022-02-09 | Unmanned aerial vehicle positioning method, device, equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210123261.7A CN114442133A (en) | 2022-02-09 | 2022-02-09 | Unmanned aerial vehicle positioning method, device, equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114442133A true CN114442133A (en) | 2022-05-06 |
Family
ID=81371629
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210123261.7A Pending CN114442133A (en) | 2022-02-09 | 2022-02-09 | Unmanned aerial vehicle positioning method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114442133A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116908818A (en) * | 2023-07-13 | 2023-10-20 | 广东喜讯智能科技有限公司 | Laser radar calibration method and device based on RTK unmanned aerial vehicle and storage medium |
WO2023236643A1 (en) * | 2022-06-09 | 2023-12-14 | 腾讯科技(深圳)有限公司 | Positioning method and apparatus, device and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108303099A (en) * | 2018-06-14 | 2018-07-20 | 江苏中科院智能科学技术应用研究院 | Autonomous navigation method in unmanned plane room based on 3D vision SLAM |
CN110927765A (en) * | 2019-11-19 | 2020-03-27 | 博康智能信息技术有限公司 | Laser radar and satellite navigation fused target online positioning method |
CN112462385A (en) * | 2020-10-21 | 2021-03-09 | 南开大学 | Map splicing and positioning method based on laser radar under outdoor large environment |
CN113296121A (en) * | 2021-05-26 | 2021-08-24 | 广东电网有限责任公司 | Airborne lidar-based assisted navigation systems, methods, media, and devices |
CN113538677A (en) * | 2021-06-30 | 2021-10-22 | 深圳市优必选科技股份有限公司 | Positioning method, robot and storage medium |
CN113587933A (en) * | 2021-07-29 | 2021-11-02 | 山东山速机器人科技有限公司 | Indoor mobile robot positioning method based on branch-and-bound algorithm |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108303099A (en) * | 2018-06-14 | 2018-07-20 | 江苏中科院智能科学技术应用研究院 | Autonomous navigation method in unmanned plane room based on 3D vision SLAM |
CN110927765A (en) * | 2019-11-19 | 2020-03-27 | 博康智能信息技术有限公司 | Laser radar and satellite navigation fused target online positioning method |
CN112462385A (en) * | 2020-10-21 | 2021-03-09 | 南开大学 | Map splicing and positioning method based on laser radar under outdoor large environment |
CN113296121A (en) * | 2021-05-26 | 2021-08-24 | 广东电网有限责任公司 | Airborne lidar-based assisted navigation systems, methods, media, and devices |
CN113538677A (en) * | 2021-06-30 | 2021-10-22 | 深圳市优必选科技股份有限公司 | Positioning method, robot and storage medium |
CN113587933A (en) * | 2021-07-29 | 2021-11-02 | 山东山速机器人科技有限公司 | Indoor mobile robot positioning method based on branch-and-bound algorithm |
Non-Patent Citations (2)
Title |
---|
LI Chang: "Research on Localization and Mapping Algorithms for Outdoor Cleaning Robots Based on 3D Lidar", China Master's Theses Full-text Database, Basic Sciences, no. 5, 15 May 2021 (2021-05-15), pages 005 - 207 *
LU Shidong; TU Meiyi; LUO Xiaoyong; GUO Chao: "Pose Optimization Algorithm Based on Graph Optimization Theory and GNSS-Laser SLAM", Laser & Optoelectronics Progress, vol. 57, no. 08, 30 April 2020 (2020-04-30), pages 081024 - 1 *
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023236643A1 (en) * | 2022-06-09 | 2023-12-14 | 腾讯科技(深圳)有限公司 | Positioning method and apparatus, device and storage medium |
CN116908818A (en) * | 2023-07-13 | 2023-10-20 | 广东喜讯智能科技有限公司 | Laser radar calibration method and device based on RTK unmanned aerial vehicle and storage medium |
CN116908818B (en) * | 2023-07-13 | 2024-05-28 | 广东喜讯智能科技有限公司 | Laser radar calibration method and device based on RTK unmanned aerial vehicle and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2018278849B2 (en) | Vehicle navigation system using pose estimation based on point cloud | |
CN108732603B (en) | Method and device for locating a vehicle | |
US9378585B2 (en) | System and method for automatic geometric correction using RPC | |
CN112197764B (en) | Real-time pose determining method and device and electronic equipment | |
EP2922023A1 (en) | Three-dimensional object recognition device and three-dimensional object recognition method | |
CN114442133A (en) | Unmanned aerial vehicle positioning method, device, equipment and storage medium | |
CN110187375A (en) | A kind of method and device improving positioning accuracy based on SLAM positioning result | |
Kaniewski et al. | Estimation of UAV position with use of smoothing algorithms | |
KR101439213B1 (en) | Method for 3D Location determination in single image using Rational Polynomial Coefficients information of stereo satellite images | |
KR101890612B1 (en) | Method and apparatus for detecting object using adaptive roi and classifier | |
CN109507706B (en) | GPS signal loss prediction positioning method | |
CN114111775B (en) | Multi-sensor fusion positioning method and device, storage medium and electronic equipment | |
US9529093B2 (en) | Systems and methods for estimating attitude using double differenced GPS carrier phase measurements | |
WO2012071320A1 (en) | Coded filter | |
CN111080682B (en) | Registration method and device for point cloud data | |
CN114264301B (en) | Vehicle-mounted multi-sensor fusion positioning method, device, chip and terminal | |
Dawood et al. | Harris, SIFT and SURF features comparison for vehicle localization based on virtual 3D model and camera | |
CN111912430A (en) | On-orbit geometric calibration method, device, equipment and medium for high-orbit optical satellite | |
CN116399324A (en) | Picture construction method and device, controller and unmanned vehicle | |
CN117367419A (en) | Robot positioning method, apparatus and computer readable storage medium | |
Greco et al. | The study on SAR images exploitation for air platform navigation purposes | |
CN114111791A (en) | Indoor autonomous navigation method and system for intelligent robot and storage medium | |
CN116429090A (en) | Synchronous positioning and mapping method and device based on line laser and mobile robot | |
CN113495281B (en) | Real-time positioning method and device for movable platform | |
CN112987010B (en) | System and method for multi-radar mapping of robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||