CN116193581B - Indoor unmanned aerial vehicle hybrid positioning method and system based on set-membership filtering - Google Patents
Indoor unmanned aerial vehicle hybrid positioning method and system based on set-membership filtering
- Publication number: CN116193581B
- Application number: CN202310484854.0A
- Authority: CN (China)
- Legal status: Active
Classifications
- H04W64/00—Locating users or terminals or network equipment for network management purposes, e.g. mobility management
- H04W64/006—Locating users or terminals or network equipment for network management purposes with additional information processing, e.g. for direction or speed determination
- G06T7/70—Determining position or orientation of objects or cameras
- H04W4/023—Services making use of mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
- H04W4/33—Services specially adapted for indoor environments, e.g. buildings
- H04W4/40—Services specially adapted for vehicles, e.g. vehicle-to-pedestrians [V2P]
- G06T2207/30232—Surveillance (indexing scheme for image analysis)
- Y02D30/70—Reducing energy consumption in wireless communication networks
Abstract
The invention belongs to the field of indoor positioning, and discloses an indoor unmanned aerial vehicle hybrid positioning method and system based on set-membership filtering, wherein the method comprises the following steps: S1, judging whether the positioning data of the unmanned aerial vehicle to be positioned can be normally acquired; if so, entering S2, and if not, entering S6; S2, acquiring a first positioning position of the unmanned aerial vehicle to be positioned based on the positioning base station, and acquiring a second positioning position of the unmanned aerial vehicle to be positioned based on the camera; S3, judging whether the second positioning position satisfies a trigger event; if so, entering S4, and if not, entering S5; S4, performing weighted fusion of the first positioning position and the second positioning position to obtain the positioning position of the unmanned aerial vehicle to be positioned; S5, taking the first positioning position as the positioning position of the unmanned aerial vehicle to be positioned; S6, acquiring the positioning position of the unmanned aerial vehicle to be positioned based on the accelerometer of the unmanned aerial vehicle to be positioned. The invention improves the positioning accuracy of the unmanned aerial vehicle to be positioned in a complex indoor environment.
Description
Technical Field
The invention relates to the field of indoor positioning, and in particular to an indoor unmanned aerial vehicle hybrid positioning method and system based on set-membership filtering.
Background
With the rapid development of mobile intelligent terminals, wireless communication technologies and sensor network technologies, the demand for real-time location services that provide target position information is increasing. Existing positioning technologies include GPS satellite positioning, infrared positioning, Bluetooth and ultra-wideband (UWB) technologies; they are widely applied in fields such as the military, commerce and agriculture, and have achieved high-precision positioning in open outdoor environments.
For indoor environments, however, positioning technology is still greatly limited by factors such as complex indoor layouts, the blocking of GPS signals by buildings and the presence of multiple obstacles, and high-precision real-time positioning of indoor moving targets is difficult to achieve.
Compared with traditional indoor positioning technologies, UWB has the advantages of strong penetration, low power consumption and high positioning accuracy, and is widely applied in indoor wireless communication and positioning scenarios.
However, when positioning with UWB, abrupt changes in the position of the target to be positioned, or shielding by obstacles such as walls, still lead to low UWB positioning accuracy.
Disclosure of Invention
In view of the above, the invention aims to disclose an indoor unmanned aerial vehicle hybrid positioning method and system based on set-membership filtering, which solve the problem of how to improve the accuracy of positioning an indoor unmanned aerial vehicle by UWB technology.
In order to achieve the above purpose, the present invention provides the following technical solutions:
in a first aspect, the invention provides an indoor unmanned aerial vehicle hybrid positioning method based on set-membership filtering, which comprises the following steps:
s1, judging whether positioning data of the unmanned aerial vehicle to be positioned can be normally acquired, if so, entering S2, and if not, entering S6;
s2, acquiring a first positioning position of the unmanned aerial vehicle to be positioned based on the positioning base station, and acquiring a second positioning position of the unmanned aerial vehicle to be positioned based on the camera;
s3, judging whether the second positioning position meets a trigger event, if so, entering S4, and if not, entering S5;
s4, carrying out weighted fusion on the first positioning position and the second positioning position to obtain the positioning position of the unmanned aerial vehicle to be positioned;
s5, taking the first positioning position as the positioning position of the unmanned aerial vehicle to be positioned;
s6, acquiring the positioning position of the unmanned aerial vehicle to be positioned based on the accelerometer of the unmanned aerial vehicle to be positioned.
Optionally, the positioning data includes a distance between the unmanned aerial vehicle to be positioned and the positioning base station and a target image including the unmanned aerial vehicle to be positioned.
Optionally, determining whether the positioning data of the unmanned aerial vehicle to be positioned can be normally acquired includes:
if the distance between the unmanned aerial vehicle to be positioned and the positioning base station cannot be acquired and/or the image containing the unmanned aerial vehicle to be positioned cannot be acquired, the situation that the positioning data of the unmanned aerial vehicle to be positioned cannot be acquired normally is indicated.
Optionally, acquiring, based on the positioning base station, the first positioning position of the unmanned aerial vehicle to be positioned includes:
UWB ranging is carried out on the unmanned aerial vehicle to be positioned through a plurality of positioning base stations, and distances between the unmanned aerial vehicle to be positioned and the positioning base stations are obtained;
and calculating the first positioning position of the unmanned aerial vehicle based on the distances between the unmanned aerial vehicle to be positioned and the positioning base stations.
Optionally, acquiring, based on the camera, a second positioning position of the unmanned aerial vehicle to be positioned includes:
acquiring a target image containing the unmanned aerial vehicle to be positioned through at least two cameras;
and calculating a second positioning position of the unmanned aerial vehicle to be positioned based on the target image.
Optionally, determining whether the second positioning location satisfies the triggering event includes:
calculating the distance between the second positioning position at the current moment and the second positioning position at the previous moment;
if the distance is larger than the set trigger event threshold, the second positioning position is indicated to meet the trigger event, and if the distance is smaller than or equal to the set trigger event threshold, the second positioning position is indicated to not meet the trigger event.
Optionally, performing weighted fusion on the first positioning position and the second positioning position to obtain a positioning position of the unmanned aerial vehicle to be positioned, including:
the shielding coefficient of the indoor barrier is obtained according to the target image;
determining weights of the first positioning position and the second positioning position respectively based on the shielding coefficient;
and carrying out weighted calculation on the first positioning position and the second positioning position based on the weights of the first positioning position and the second positioning position to obtain the positioning position of the unmanned aerial vehicle to be positioned.
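The weighted fusion step can be sketched as follows; the exact mapping from the occlusion coefficient to the weights is not specified in the text, so the linear rule below (camera weight shrinking as occlusion grows) is an illustrative assumption, and all names are hypothetical:

```python
def fuse_positions(p_uwb, p_cam, occlusion):
    """Weighted fusion of the first (UWB) and second (camera) positioning positions.

    occlusion: assumed coefficient in [0, 1]; 0 = camera view clear,
    1 = fully shielded by indoor obstacles (illustrative convention).
    """
    w_cam = 1.0 - occlusion           # trust the camera less when occluded
    w_uwb = 1.0                       # UWB weight kept fixed in this sketch
    s = w_uwb + w_cam                 # normalize so the weights sum to 1
    return tuple((w_uwb * u + w_cam * c) / s for u, c in zip(p_uwb, p_cam))
```

With full occlusion the fused position collapses to the UWB estimate; with a clear view the two sources are averaged equally.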
Optionally, acquiring the positioning position of the unmanned aerial vehicle to be positioned based on the accelerometer of the unmanned aerial vehicle to be positioned includes:
using $(x_0, y_0, z_0)$ to denote the position of the unmanned aerial vehicle at the latest moment $t_0$ at which the positioning data could still be normally acquired;
establishing a dynamics model;
acquiring the acceleration of the unmanned aerial vehicle at each moment from $t_0$ to the positioning time $t$;
inputting the accelerations into the dynamics model to obtain the positioning position $(x_t, y_t, z_t)$ of the unmanned aerial vehicle at the positioning time $t$.
Alternatively, the dynamics model is the dead-reckoning recursion

$$v_x(\tau) = v_x(\tau-1) + a_x(\tau)\,\Delta t,\qquad x(\tau) = x(\tau-1) + v_x(\tau)\,\Delta t,$$

and similarly for $y$ and $z$, iterated from $\tau = t_0$ to the positioning time $t$;

wherein $v_x(\tau)$, $v_y(\tau)$ and $v_z(\tau)$ are the three-dimensional velocity of the unmanned aerial vehicle to be positioned at moment $\tau$; at moment $t_0$ the initial velocity of the unmanned aerial vehicle to be positioned is 0, i.e. $v_x(t_0) = v_y(t_0) = v_z(t_0) = 0$; $a_x(\tau)$, $a_y(\tau)$ and $a_z(\tau)$ denote the three-dimensional acceleration of the unmanned aerial vehicle to be positioned at moment $\tau$; $\Delta t$ is the sampling interval.
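The accelerometer-based dead reckoning of S6 can be sketched as a discrete double integration under the stated assumption of zero initial velocity at $t_0$ (function and variable names are illustrative):

```python
def dead_reckon(p0, accels, dt):
    """Estimate position by double-integrating accelerometer samples.

    p0: (x0, y0, z0), position at the last moment t0 with valid positioning data.
    accels: sequence of (ax, ay, az) samples taken between t0 and t.
    dt: sampling interval; initial velocity at t0 is zero.
    """
    x, y, z = p0
    vx = vy = vz = 0.0
    for ax, ay, az in accels:
        vx += ax * dt; vy += ay * dt; vz += az * dt   # velocity update
        x += vx * dt; y += vy * dt; z += vz * dt      # position update
    return (x, y, z)
```

Note that pure double integration drifts quickly, which is why the patent uses it only as a fallback when the peripheral sensors fail.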
In a second aspect, the invention provides an indoor unmanned aerial vehicle hybrid positioning system based on set-membership filtering, which comprises a data acquisition judging module, a primary positioning module, a trigger event judging module, a secondary positioning module and an acceleration positioning module;
the data acquisition judging module is used for judging whether the positioning data of the unmanned aerial vehicle to be positioned can be normally acquired or not;
the primary positioning module is used for acquiring a first positioning position of the unmanned aerial vehicle to be positioned based on the positioning base station and acquiring a second positioning position of the unmanned aerial vehicle to be positioned based on the camera when the positioning data of the unmanned aerial vehicle to be positioned can be normally acquired;
the trigger event judging module is used for judging whether the second positioning position meets the trigger event or not;
the secondary positioning module is used for taking the first positioning position as the positioning position of the unmanned aerial vehicle to be positioned when the second positioning position does not meet the triggering event, and carrying out weighted fusion on the first positioning position and the second positioning position when the second positioning position meets the triggering event to obtain the positioning position of the unmanned aerial vehicle to be positioned;
the acceleration positioning module is used for acquiring the positioning position of the unmanned aerial vehicle to be positioned based on the accelerometer of the unmanned aerial vehicle to be positioned when the positioning data of the unmanned aerial vehicle to be positioned cannot be normally acquired.
According to the invention, the UWB positioning result and the position information obtained by camera ranging are combined by weighting, which eliminates both the influence of obstacles such as walls when UWB ranging is used alone in a complex indoor environment and the influence of obstacles shielding the cameras' image acquisition when multi-camera positioning is used, thereby improving the positioning accuracy of the unmanned aerial vehicle to be positioned in a complex indoor environment. When the data of the peripheral sensors are lost or unreliable, the acceleration of the airframe is acquired by the onboard sensor and a dynamics model is established to estimate the position information, guaranteeing real-time positioning of the indoor unmanned aerial vehicle.
Drawings
The present disclosure will become more fully understood from the detailed description given herein below and the accompanying drawings, which are given by way of illustration only, and thus are not limiting of the present disclosure, and wherein:
fig. 1 is a schematic diagram of the indoor unmanned aerial vehicle hybrid positioning method based on set-membership filtering according to the present invention.
Fig. 2 is a schematic diagram of the principles of the trilateration method of the present invention.
FIG. 3 is a schematic illustration of the principles of the improved trilateration positioning of the present invention.
Fig. 4 is a schematic diagram of the principle of the present invention for localization by a target image.
FIG. 5 is a schematic diagram of the coordinate system of the present invention.
Fig. 6 is a schematic diagram of the indoor unmanned aerial vehicle hybrid positioning system based on set-membership filtering according to the present invention.
Detailed Description
In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. The concepts and features of the invention will be readily apparent to one of ordinary skill in the art in view of the specification, claims and drawings disclosed herein. The following examples further illustrate aspects of the invention but are not intended to limit its scope.
Examples
The invention provides an indoor unmanned aerial vehicle hybrid positioning method based on set-membership filtering, as shown in the embodiment of fig. 1, comprising the following steps:
s1, judging whether positioning data of the unmanned aerial vehicle to be positioned can be normally acquired, if so, entering S2, and if not, entering S6;
s2, acquiring a first positioning position of the unmanned aerial vehicle to be positioned based on the positioning base station, and acquiring a second positioning position of the unmanned aerial vehicle to be positioned based on the camera;
s3, judging whether the second positioning position meets a trigger event, if so, entering S4, and if not, entering S5;
s4, carrying out weighted fusion on the first positioning position and the second positioning position to obtain the positioning position of the unmanned aerial vehicle to be positioned;
s5, taking the first positioning position as the positioning position of the unmanned aerial vehicle to be positioned;
s6, acquiring the positioning position of the unmanned aerial vehicle to be positioned based on the accelerometer of the unmanned aerial vehicle to be positioned.
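Assuming the per-subsystem positions have already been computed, the S1-S6 dispatch logic can be sketched as follows (all names are illustrative, and the fixed fusion weight `w` stands in for the occlusion-derived weights described later):

```python
import math

def locate(data_ok, p_uwb, p_cam, prev_p_cam, threshold, p_accel, w=0.5):
    """Choose the positioning position according to steps S1-S6.

    data_ok: whether positioning data can be normally acquired (S1).
    p_uwb / p_cam: first and second positioning positions (S2).
    p_accel: accelerometer-based estimate used as the fallback (S6).
    """
    if not data_ok:
        return p_accel                                # S6: dead reckoning
    if math.dist(p_cam, prev_p_cam) > threshold:      # S3: trigger event met
        return tuple(w * u + (1 - w) * c              # S4: weighted fusion
                     for u, c in zip(p_uwb, p_cam))
    return p_uwb                                      # S5: UWB position alone
```

The trigger event fires only when the camera position jumps between consecutive moments, so in steady conditions the cheaper UWB-only path (S5) is taken.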
Considering that abrupt changes in the position of the target to be positioned, or shielding by obstacles such as walls, lower the UWB positioning accuracy, event-triggered camera ranging is added to assist in improving real-time positioning accuracy.
Meanwhile, considering that the many obstacles in an indoor environment may make the peripheral sensors unreliable, when sensor data are lost, the accelerometer carried by the unmanned aerial vehicle to be positioned is used for estimation and positioning.
Therefore, the invention can remarkably improve the positioning accuracy.
Optionally, the positioning data includes a distance between the unmanned aerial vehicle to be positioned and the positioning base station and a target image including the unmanned aerial vehicle to be positioned.
Specifically, the distance between the unmanned aerial vehicle and the positioning base station can be acquired through the positioning base station, and the target image containing the unmanned aerial vehicle to be positioned is acquired through the camera.
Specifically, in the target image, the unmanned aerial vehicle to be positioned is located in the center of the target image.
Optionally, determining whether the positioning data of the unmanned aerial vehicle to be positioned can be normally acquired includes:
if the distance between the unmanned aerial vehicle to be positioned and the positioning base station cannot be acquired and/or the image containing the unmanned aerial vehicle to be positioned cannot be acquired, the situation that the positioning data of the unmanned aerial vehicle to be positioned cannot be acquired normally is indicated.
Optionally, acquiring, based on the positioning base station, the first positioning position of the unmanned aerial vehicle to be positioned includes:
UWB ranging is carried out on the unmanned aerial vehicle to be positioned through a plurality of positioning base stations, and distances between the unmanned aerial vehicle to be positioned and the positioning base stations are obtained;
and calculating the first positioning position of the unmanned aerial vehicle based on the distances between the unmanned aerial vehicle to be positioned and the positioning base stations.
Specifically, the first positioning position of the unmanned aerial vehicle is calculated from the distances between the unmanned aerial vehicle to be positioned and the plurality of positioning base stations, by the following steps:
step one: determining initial value coordinates of unmanned aerial vehicle to be positioned by UWB ranging
Three positioning base stations are deployed in the indoor space to be positioned; the unmanned aerial vehicle to be positioned is regarded as the tag to be positioned, the positioning base stations are regarded as nodes, and the tag communicates with the nodes using UWB wireless communication technology.
The initial coordinates of the unmanned aerial vehicle to be positioned are acquired by trilateration. The method is applicable when three or more UWB base stations are available.
As shown in figure 2, circles are drawn with the three nodes $i$, $j$ and $p$ as centers; the coordinates of the centers are $(x_i, y_i)$, $(x_j, y_j)$ and $(x_p, y_p)$ respectively, the initial coordinate of the tag $O$ to be measured is $(x_0, y_0)$, and the distances from the tag $O$ to the three nodes are $d_i$, $d_j$ and $d_p$, giving

$$(x_0 - x_n)^2 + (y_0 - y_n)^2 = d_n^2,\qquad n = i, j, p.$$
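A minimal sketch of solving the three circle equations by the standard linearization (subtracting the third equation from the first two cancels the quadratic terms); names are illustrative, and the case of circles that do not meet exactly is handled later by the filtering step:

```python
def trilaterate(anchors, dists):
    """Solve (x - xn)^2 + (y - yn)^2 = dn^2 for three 2-D anchors."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Two linear equations A [x, y]^T = b after subtracting the third circle
    a11, a12 = 2 * (x1 - x3), 2 * (y1 - y3)
    a21, a22 = 2 * (x2 - x3), 2 * (y2 - y3)
    b1 = d3**2 - d1**2 + x1**2 - x3**2 + y1**2 - y3**2
    b2 = d3**2 - d2**2 + x2**2 - x3**2 + y2**2 - y3**2
    det = a11 * a22 - a12 * a21           # Cramer's rule for the 2x2 system
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```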
Step two: filtering the initial coordinates with a set-membership filter to obtain the set-membership filtering initial value
Due to the influence of sensor hardware, environmental factors and the like in actual measurement, the three circles may not intersect at a single point, but instead intersect in a region or not intersect at all, so the tag coordinates calculated by trilateration carry a certain error. Filtering is therefore required to obtain more accurate position information.
Considering the influence of noise on positioning accuracy, the ranging result needs to be processed by a filtering algorithm; compared with the existing Kalman filtering algorithm, the set-membership filtering algorithm has the advantage of not requiring the statistical characteristics of the noise.
Kalman filtering and set-membership filtering are both state estimators for dynamic systems, but their basic assumptions and methods differ.
1. The Kalman filter assumes that the system is Gaussian, meaning that the state equation and the measurement equation can be described with Gaussian noise. The Kalman filter estimates the current state of the system from the prior state estimate and the current measurement.
2. The set-membership filter makes no assumptions about the noise distribution or the linearity of the system. It uses set theory and interval analysis to represent state and measurement uncertainties. The working principle of the set-membership filter is to compute a set of possible values of the state vector that is consistent with the measurements and the system dynamics. This set of possible values is updated in each time step using operations of set theory, such as intersection, union and Minkowski sums. The set-membership filter is very useful in applications where the dynamics of the system are unknown or the noise is non-Gaussian.
In summary, the Kalman filter is suitable for linear systems with Gaussian noise, while the set-membership filter is suitable for nonlinear systems with arbitrarily distributed bounded noise.
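The contrast can be illustrated with a one-dimensional interval version of set-membership estimation: only noise bounds are assumed, and the update is a set intersection rather than a probabilistic correction. This is a toy sketch, not the patent's ellipsoidal algorithm:

```python
def sm_predict(interval, w_bound):
    """Predict step for x_{k+1} = x_k + w_k with |w_k| <= w_bound:
    the state interval simply inflates by the process-noise bound."""
    lo, hi = interval
    return (lo - w_bound, hi + w_bound)

def sm_update(interval, meas, v_bound):
    """Update step: intersect the predicted interval with the measurement
    interval [meas - v_bound, meas + v_bound]; the true state is guaranteed
    to lie in the result as long as the noise bounds hold."""
    lo, hi = interval
    return (max(lo, meas - v_bound), min(hi, meas + v_bound))
```

No mean or covariance appears anywhere: the filter tracks a guaranteed enclosure of the state instead of a point estimate.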
The distance between each node and the tag obtained by UWB indoor positioning is continuously updated, and the initial coordinates of tag $O$ obtained by the three nodes $i$, $j$ and $p$ are filtered with a set-membership filter, thereby reducing the actual errors in the relative distances between the tag and the three nodes.
Taking node $i$ as an example, nodes $j$ and $p$ are its neighbor nodes, and node $i$ can receive the position information of the tag as well as the coordinates and measurement information of the neighbor nodes $j$ and $p$.
The state equation of the system is:

$$x_{k+1} = A_k x_k + B_k w_k$$

wherein $A_k$ and $B_k$ are known time-varying coefficient matrices; the system state $x_k$ represents the position of the unmanned aerial vehicle to be positioned, and $k-1$, $k$ and $k+1$ denote the previous moment, the current moment and the predicted next moment respectively; the process noise satisfies $w_k^{\mathsf T} Q_k^{-1} w_k \le 1$, where $Q_k$ is a known positive-definite symmetric matrix representing the range of the unknown bounded process noise.
In set-membership filtering, the process noise $w_k$ refers to the uncertainty or error in the state equation of the dynamic system being modeled. This uncertainty is typically modeled as a set of possible values or intervals that represents the possible range of the actual process noise.
This set of possible values or intervals is typically expressed as a set of constraints on the state equation, where each constraint represents one possible value or interval of the process noise. These constraints are then used to generate the set of possible values of the next state of the system, which is updated in each time step using operations of set theory such as intersection, union or Minkowski sums.
Measurement equation of node $i$:

$$y_{i,k} = C_{i,k} x_k + D_{i,k} v_{i,k}$$

wherein $C_{i,k}$ and $D_{i,k}$ are known time-varying coefficient matrices; the measurement noise satisfies $v_{i,k}^{\mathsf T} R_{i,k}^{-1} v_{i,k} \le 1$, where $R_{i,k}$ is a known symmetric matrix representing the range of the unknown bounded measurement noise.
In set-membership filtering, the measurement noise $v_{i,k}$ refers to the uncertainty or error of node $i$ when measuring the state $x$ of the modeled dynamic system. This uncertainty is typically modeled as a set of possible values or intervals representing the possible range of the actual measurement noise.
This set of possible values or intervals is typically expressed as a set of constraints on the measurement equation, where each constraint represents one possible value or interval of the measurement noise. These constraints are then used to generate the set of possible values of the current state of the system, which is updated in each time step using operations of set theory such as intersection, union or Minkowski sums.
The set-membership filtering initial value is then obtained through positioning by the improved trilateration method.
As shown in fig. 3, the three circles are respectively expressed as ellipsoidal sets:

$$E_n = \{\, x : (x - c_n)^{\mathsf T} Q_n^{-1} (x - c_n) \le 1 \,\},\qquad n = i, j, p,$$

where $c_n$ is the position of node $n$, $Q_n = d_n^2 I$, and $I$ is the identity matrix, ensuring that the dimension of $Q_n$ is consistent with the dimension of the state quantity $x$.
The minimal outer-bounding ellipsoids of the pairwise unions of the three circles are computed: $E_{ij} \supseteq E_i \cup E_j$, $E_{jp} \supseteq E_j \cup E_p$ and $E_{ip} \supseteq E_i \cup E_p$, each chosen with minimal volume.
Within the intersection of the three minimal outer-bounding ellipsoids, the inner ellipsoid of maximal volume, $E_0 \subseteq E_{ij} \cap E_{jp} \cap E_{ip}$, is then obtained.
The ellipsoid $E_0$ thus obtained is the set-membership filtering initial value. The trilateration method developed in this way can still yield an ellipsoid containing the positioning information of the target to be measured when the circles drawn around three or more UWB ranging sensors intersect at one point, intersect in a region, or do not intersect at all.
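As a simple planar analogue of the minimal outer-bounding step, the smallest circle enclosing the union of two ranging circles can be computed in closed form. This is a geometric sketch only, not the patent's ellipsoid formulas:

```python
import math

def enclosing_circle(c1, r1, c2, r2):
    """Smallest circle containing the union of circles (c1, r1) and (c2, r2)."""
    dx, dy = c2[0] - c1[0], c2[1] - c1[1]
    d = math.hypot(dx, dy)
    if d + r2 <= r1:                  # circle 2 already inside circle 1
        return c1, r1
    if d + r1 <= r2:                  # circle 1 already inside circle 2
        return c2, r2
    r = (d + r1 + r2) / 2             # diameter spans both far edges
    t = (r - r1) / d                  # slide the center from c1 toward c2
    return (c1[0] + t * dx, c1[1] + t * dy), r
```

The ellipsoidal case follows the same idea: find the smallest set of a fixed family that is guaranteed to contain the union.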
Step three: correcting the positioning region by distributed set-membership filtering
1) Node i one-step prediction
From node $i$'s estimate $\hat{x}_{i,k-1}$ at the previous moment and the measurement $y_{i,k}$ obtained by node $i$ at the current moment $k$, the one-step prediction of node $i$ can be obtained in the following form.
Set-membership filter:

$$\hat{x}_{i,k} = A_{k-1}\hat{x}_{i,k-1} + K_{i,k}\left(y_{i,k} - C_{i,k} A_{k-1}\hat{x}_{i,k-1}\right)$$

where $\hat{x}_{i,k}$ is the estimate of the state $x$ by sensor node $i$ at moment $k$.
Filter error: $e_{i,k} = x_k - \hat{x}_{i,k}$.
Substituting the state and measurement equations yields the filtered error system:

$$e_{i,k} = \left(I - K_{i,k} C_{i,k}\right)\left(A_{k-1} e_{i,k-1} + B_{k-1} w_{k-1}\right) - K_{i,k} D_{i,k} v_{i,k}$$

wherein $K_{i,k}$ is the filter parameter.
For the ellipsoid $E_{i,k} = \{\, e : e^{\mathsf T} P_{i,k}^{-1} e \le 1 \,\}$ with a given elliptic constraint matrix $P_{i,k}$, the initial constraint at the initial moment is $e_{i,0}^{\mathsf T} P_{i,0}^{-1} e_{i,0} \le 1$. When $e_{i,k}^{\mathsf T} P_{i,k}^{-1} e_{i,k} \le 1$ is satisfied at each step, it can be shown by mathematical induction that the state at every moment of the iteration lies within the estimated range.
Optimizing the filter parameter: $\min_{K_{i,k}} \operatorname{tr}(P_{i,k})$, where $\operatorname{tr}$ denotes the trace of a matrix.
Solving for the filter parameter $K_{i,k}$ in this way realizes the local node's ellipsoidal state estimate containing the true position information of the unmanned aerial vehicle to be positioned. The local estimates of the other nodes are obtained in the same way.
2) Fusing neighbor node information to perform global estimation and optimizing positioning information
The estimates of the multiple nodes are weighted and averaged to obtain the global state estimate:

$$\hat{x}_k = \sum_{m=1}^{N} \omega_{m,k}\, \hat{x}_{m,k}$$

wherein $\hat{x}_k$ represents the global state estimate at moment $k$; assuming node $i$ is the first node, $m = 1$ denotes node $i$, and $N$ is the total number of nodes, i.e. with nodes $i$, $j$ and $p$ the total number is $N = 3$; $\omega_{m,k}$ is a weight coefficient representing the confidence of node $m$ at moment $k$, solved from the information of node $m$ and its neighbor nodes $n$ (for node $i$, $n$ denotes its neighbors $j$ and $p$), thereby realizing a global ellipsoidal state estimate containing the true position information of the unmanned aerial vehicle to be positioned.
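The global fusion step, a confidence-weighted average of the local node estimates, can be sketched as follows (the weight values are assumed inputs here; the patent derives them from each node's confidence):

```python
def global_estimate(estimates, weights):
    """Weighted average of local node estimates, e.g. from nodes i, j, p."""
    s = sum(weights)                          # normalize so weights sum to 1
    return tuple(
        sum(w / s * e[d] for w, e in zip(weights, estimates))
        for d in range(len(estimates[0]))
    )
```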
The UWB positioning result is filtered by the distributed set-membership filtering algorithm and the obtained position information is optimized; the algorithm does not need the statistical characteristics of the disturbance noise, and positioning of the indoor target has high real-time performance and good robustness.
Optionally, acquiring, based on the camera, a second positioning position of the unmanned aerial vehicle to be positioned includes:
acquiring a target image containing the unmanned aerial vehicle to be positioned through at least two cameras;
and calculating a second positioning position of the unmanned aerial vehicle to be positioned based on the target image.
Specifically, four cameras are arranged at the four corners of the indoor space to be positioned, and the distance between every two cameras is measured. The unmanned aerial vehicle to be positioned is kept at the center of at least two camera images, and the position of the target is calculated from the geographic position information and angle information of the at least two cameras that successfully acquire the target image. Because the two cameras and the unmanned aerial vehicle to be positioned form a triangle, the indoor position of the unmanned aerial vehicle can be calculated through projection and the law of sines. The method extends to any two or more cameras that successfully acquire the target image of the unmanned aerial vehicle to be positioned.
Take as an example the two cameras A and B in fig. 4 that successfully acquire the target image of the unmanned aerial vehicle to be positioned.
The two cameras are mounted at the tops of two corners of the indoor space; the height difference h between the cameras and the ground and the distance d between the two cameras are measured in advance and recorded in the server, and the two cameras monitor the unmanned aerial vehicle O to be positioned simultaneously. Angle-acquisition software records the angle information of the two cameras when the unmanned aerial vehicle to be positioned is at the center of both camera images: the included angle between camera A and line segment BA in the horizontal direction is denoted α1, and its included angle with the camera plane in the vertical direction is β1; the included angle between camera B and line segment AB in the horizontal direction is α2, and its included angle with the camera plane in the vertical direction is β2. For convenience of calculation, the target unmanned aerial vehicle O is projected onto the plane of the cameras to obtain O'. In triangle ABO', the law of sines gives:
horizontal distance from target to be positioned to two camera connecting lines:
vertical distance of target to be positioned from ground:
After these results are obtained, the three-dimensional position of the unmanned aerial vehicle to be positioned is obtained by combining the known camera position information.
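The two-camera triangulation above can be sketched as follows. This is an illustrative implementation under stated assumptions: camera A is taken as the origin with the baseline AB along one axis, the angle symbols match the description above (the patent's own figure labels are not rendered), and the vertical angle is treated as a depression angle from the camera plane.

```python
import math

def locate_target(d, h, alpha_a, beta_a, alpha_b):
    """Locate drone O from two ceiling cameras A and B separated by distance d.

    alpha_a / alpha_b: horizontal angles at A and B between the baseline AB
    and the sightline to O', the projection of the drone onto the camera plane.
    beta_a: camera A's vertical (depression) angle toward O.
    h: height of the cameras above the ground.
    Returns (x, y, z) with A at the origin and AB along the x axis.
    """
    # Law of sines in triangle ABO': the angle at O' is pi - alpha_a - alpha_b.
    ao = d * math.sin(alpha_b) / math.sin(alpha_a + alpha_b)  # |AO'|
    x = ao * math.cos(alpha_a)      # along the baseline AB
    y = ao * math.sin(alpha_a)      # horizontal offset from the baseline
    z = h - ao * math.tan(beta_a)   # height above the ground
    return x, y, z
```

For instance, with d = 10, h = 3 and both horizontal angles equal to 45°, O' lies 5 units along and 5 units off the baseline, and the depression angle then fixes the drone's height.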
Optionally, determining whether the second positioning location satisfies the triggering event includes:
calculating the distance between the second positioning position at the current moment and the second positioning position at the previous moment;
if the distance is greater than the set trigger-event threshold, the second positioning position satisfies the trigger event; if the distance is less than or equal to the threshold, the second positioning position does not satisfy the trigger event.
Specifically, event-trigger detection is performed on the target image acquired by the camera. If the second positioning position corresponding to the target image does not satisfy the event-trigger condition, the UWB positioning result, i.e., the first positioning position, is output as the final result, and the camera position data at that moment is not transmitted;
if the second positioning position corresponding to the target image satisfies the event-trigger condition, the weighted fusion of the second positioning position and the first positioning position is output as the final result.
Setting an event-trigger condition for the camera data reduces the amount of information transmitted and the computational load of camera positioning.
In the coordinate system shown in FIG. 5, let the target coordinates at the previous time be (x1, y1, z1) and the target coordinates at the current time be (x2, y2, z2); the trigger event is:
where the threshold denotes the set trigger-event threshold and can be chosen according to actual conditions, so that the target coordinate information calculated by the camera is transmitted only when the position of the unmanned aerial vehicle to be positioned changes abruptly.
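The trigger check above amounts to comparing the Euclidean distance between consecutive camera fixes against the threshold, for example (an illustrative sketch; the function and parameter names are not from the patent):

```python
import math

def satisfies_trigger(prev, curr, threshold):
    """Return True when the camera position jump exceeds the trigger threshold.

    prev and curr are (x, y, z) second-positioning results at the previous
    and current times; threshold is the configurable trigger-event threshold.
    """
    dist = math.dist(prev, curr)  # Euclidean distance between the two fixes
    return dist > threshold
```

Only when this returns True is the camera result transmitted and fused; otherwise the UWB result alone is output, which is what keeps the transmitted data volume low.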
Optionally, performing weighted fusion on the first positioning position and the second positioning position to obtain a positioning position of the unmanned aerial vehicle to be positioned, including:
the shielding coefficient of the indoor barrier is obtained according to the target image;
determining weights of the first positioning position and the second positioning position respectively based on the shielding coefficient;
and carrying out weighted calculation on the first positioning position and the second positioning position based on the weights of the first positioning position and the second positioning position to obtain the positioning position of the unmanned aerial vehicle to be positioned.
Specifically, the proportion of obstacle pixels in the target image can be used as the shielding coefficient. The larger the proportion, the heavier the occlusion and the greater the weight of the first positioning position.
Specifically, because the indoor environment is complex, with many obstacles and possible abrupt changes in the position of the unmanned aerial vehicle to be positioned, both the influence of obstacles such as walls on UWB positioning accuracy and their occlusion of the cameras' line of sight are considered. UWB positioning and multi-camera positioning are therefore combined by weighting, with the weights adjusted to actual conditions: the UWB weight is increased when the cameras are more occluded, and the camera weight is increased when the camera view is clear, improving the positioning accuracy of indoor targets.
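A minimal sketch of this occlusion-driven weighting follows. Using the shielding coefficient directly as the UWB weight is an illustrative assumption; the patent only requires that heavier occlusion raise the UWB weight and a clearer view raise the camera weight.

```python
def weighted_position(uwb_pos, cam_pos, shielding):
    """Blend the UWB (first) and camera (second) positioning positions.

    shielding in [0, 1] is the fraction of obstacle pixels in the target
    image; it is used here directly as the UWB weight (an assumption).
    """
    w_uwb = shielding        # more occlusion -> trust UWB more
    w_cam = 1.0 - shielding  # clearer view -> trust the cameras more
    return tuple(w_uwb * u + w_cam * c for u, c in zip(uwb_pos, cam_pos))
```

With a shielding coefficient of 0.25, for example, the output lies three quarters of the way from the UWB fix toward the camera fix.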
Optionally, acquiring the positioning position of the unmanned aerial vehicle to be positioned based on the accelerometer of the unmanned aerial vehicle to be positioned includes:
using (x0, y0, z0) to denote the position of the unmanned aerial vehicle at the latest time t0 before the positioning data of the unmanned aerial vehicle to be positioned could no longer be normally acquired;
establishing a dynamics model;
acquiring the acceleration of the unmanned aerial vehicle at each time from t0 to the positioning time t;
inputting the accelerations into the dynamics model to obtain the positioning position (xt, yt, zt) of the unmanned aerial vehicle at the positioning time t.
Alternatively, the kinetic model is as follows:
where vx(t), vy(t) and vz(t) are the three-dimensional velocity components of the unmanned aerial vehicle to be positioned at time t; at time t0 the initial velocity of the unmanned aerial vehicle to be positioned is 0, i.e., vx(t0) = vy(t0) = vz(t0) = 0; ax(t), ay(t) and az(t) denote the three-dimensional acceleration components of the unmanned aerial vehicle to be positioned at time t.
When indoor obstacles cause occlusion or sensor data is lost, the motion state is obtained from the data of the onboard accelerometer of the unmanned aerial vehicle to be positioned and a dynamics model of the unmanned aerial vehicle is established, so that the unmanned aerial vehicle can still be positioned for a short time while the external sensors are unreliable or their data is lost.
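The dead-reckoning step above can be sketched by integrating the accelerometer samples from the last reliable fix, starting from zero velocity as the dynamics model specifies. Simple Euler integration over a fixed sample interval is an illustrative choice, not the patent's exact discretization.

```python
def dead_reckon(p0, accels, dt):
    """Estimate position by integrating onboard accelerometer samples.

    p0 is the last reliable (x, y, z) fix at time t0; accels is a list of
    (ax, ay, az) samples taken every dt seconds from t0 onward. The initial
    velocity at t0 is 0, as in the dynamics model above.
    """
    pos = list(p0)
    vel = [0.0, 0.0, 0.0]   # vx(t0) = vy(t0) = vz(t0) = 0
    for a in accels:
        for k in range(3):  # integrate acceleration -> velocity -> position
            vel[k] += a[k] * dt
            pos[k] += vel[k] * dt
    return tuple(pos)
```

This keeps producing position estimates until UWB or camera data becomes available again, at which point the normal pipeline resumes.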
Examples
As shown in fig. 6, the invention provides an indoor unmanned aerial vehicle hybrid positioning system based on set-membership filtering, which comprises a data acquisition judging module, a primary positioning module, a trigger event judging module, a secondary positioning module and an acceleration positioning module;
the data acquisition judging module is used for judging whether the positioning data of the unmanned aerial vehicle to be positioned can be normally acquired or not;
the primary positioning module is used for acquiring a first positioning position of the unmanned aerial vehicle to be positioned based on the positioning base station and acquiring a second positioning position of the unmanned aerial vehicle to be positioned based on the camera when the positioning data of the unmanned aerial vehicle to be positioned can be normally acquired;
the trigger event judging module is used for judging whether the second positioning position meets the trigger event or not;
the secondary positioning module is used for taking the first positioning position as the positioning position of the unmanned aerial vehicle to be positioned when the second positioning position does not meet the triggering event, and carrying out weighted fusion on the first positioning position and the second positioning position when the second positioning position meets the triggering event to obtain the positioning position of the unmanned aerial vehicle to be positioned;
the acceleration positioning module is used for acquiring the positioning position of the unmanned aerial vehicle to be positioned based on the accelerometer of the unmanned aerial vehicle to be positioned when the positioning data of the unmanned aerial vehicle to be positioned cannot be normally acquired.
According to the invention, the UWB positioning result and the position information obtained by camera ranging are combined by weighting, which removes both the influence of obstacles such as walls on pure UWB ranging in complex indoor environments and the influence of obstacle occlusion on image acquisition in multi-camera positioning, thereby improving the positioning accuracy of the unmanned aerial vehicle to be positioned in complex indoor environments. When peripheral sensor data is lost or unreliable, body acceleration data is acquired by the onboard sensor and a dynamics model is established to estimate the position, ensuring real-time positioning of the indoor unmanned aerial vehicle.
Finally, it is noted that the above embodiments are only for illustrating the technical solution of the present invention and not for limiting the same, and although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications and equivalents may be made thereto without departing from the spirit and scope of the present invention, which is intended to be covered by the claims of the present invention.
Claims (7)
1. An indoor unmanned aerial vehicle hybrid positioning method based on set-membership filtering, characterized by comprising the following steps:
s1, judging whether positioning data of the unmanned aerial vehicle to be positioned can be normally acquired, if so, entering S2, and if not, entering S6;
s2, acquiring a first positioning position of the unmanned aerial vehicle to be positioned based on the positioning base station, and acquiring a second positioning position of the unmanned aerial vehicle to be positioned based on the camera;
acquiring a first positioning position of the unmanned aerial vehicle to be positioned based on the positioning base station comprises the following steps:
UWB ranging is carried out on the unmanned aerial vehicle to be positioned through a plurality of positioning base stations, and distances between the unmanned aerial vehicle to be positioned and the positioning base stations are obtained;
calculating a first positioning position of the unmanned aerial vehicle based on the plurality of distances between the unmanned aerial vehicle to be positioned and the positioning base stations;
s3, judging whether the second positioning position meets a trigger event, if so, entering S4, and if not, entering S5;
s4, carrying out weighted fusion on the first positioning position and the second positioning position to obtain the positioning position of the unmanned aerial vehicle to be positioned;
s5, taking the first positioning position as the positioning position of the unmanned aerial vehicle to be positioned;
s6, acquiring the positioning position of the unmanned aerial vehicle to be positioned based on the accelerometer of the unmanned aerial vehicle to be positioned;
acquiring the positioning position of the unmanned aerial vehicle to be positioned based on the accelerometer of the unmanned aerial vehicle to be positioned, comprising:
using (x0, y0, z0) to denote the position of the unmanned aerial vehicle at the latest time t0 before the positioning data of the unmanned aerial vehicle to be positioned could no longer be normally acquired;
establishing a dynamics model;
acquiring the acceleration of the unmanned aerial vehicle at each time from t0 to the positioning time t;
inputting the accelerations into the dynamics model to obtain the positioning position (xt, yt, zt) of the unmanned aerial vehicle at the positioning time t;
The kinetic model is as follows:
where vx(t), vy(t) and vz(t) are the three-dimensional velocity components of the unmanned aerial vehicle to be positioned at time t; at time t0 the initial velocity of the unmanned aerial vehicle to be positioned is 0, i.e., vx(t0) = vy(t0) = vz(t0) = 0; ax(t), ay(t) and az(t) denote the three-dimensional acceleration components of the unmanned aerial vehicle to be positioned at time t.
2. The indoor unmanned aerial vehicle hybrid positioning method based on set-membership filtering according to claim 1, wherein the positioning data comprises the distance between the unmanned aerial vehicle to be positioned and the positioning base station and a target image containing the unmanned aerial vehicle to be positioned.
3. The indoor unmanned aerial vehicle hybrid positioning method based on set-membership filtering according to claim 2, wherein the judging whether the positioning data of the unmanned aerial vehicle to be positioned can be normally acquired comprises:
if the distance between the unmanned aerial vehicle to be positioned and the positioning base station cannot be acquired and/or the image containing the unmanned aerial vehicle to be positioned cannot be acquired, it indicates that the positioning data of the unmanned aerial vehicle to be positioned cannot be normally acquired.
4. The indoor unmanned aerial vehicle hybrid positioning method based on set-membership filtering according to claim 1, wherein the acquiring the second positioning position of the unmanned aerial vehicle to be positioned based on the camera comprises:
acquiring a target image containing the unmanned aerial vehicle to be positioned through at least two cameras;
and calculating a second positioning position of the unmanned aerial vehicle to be positioned based on the target image.
5. The indoor unmanned aerial vehicle hybrid positioning method based on set-membership filtering according to claim 1, wherein the determining whether the second positioning position satisfies the trigger event comprises:
calculating the distance between the second positioning position at the current moment and the second positioning position at the previous moment;
if the distance is greater than the set trigger-event threshold, the second positioning position satisfies the trigger event; if the distance is less than or equal to the threshold, the second positioning position does not satisfy the trigger event.
6. The indoor unmanned aerial vehicle hybrid positioning method based on set-membership filtering according to claim 2, wherein the weighted fusion of the first positioning position and the second positioning position to obtain the positioning position of the unmanned aerial vehicle to be positioned comprises:
the shielding coefficient of the indoor barrier is obtained according to the target image;
determining weights of the first positioning position and the second positioning position respectively based on the shielding coefficient;
and carrying out weighted calculation on the first positioning position and the second positioning position based on the weights of the first positioning position and the second positioning position to obtain the positioning position of the unmanned aerial vehicle to be positioned.
7. An indoor unmanned aerial vehicle hybrid positioning system based on set-membership filtering, characterized by comprising a data acquisition judging module, a primary positioning module, a trigger event judging module, a secondary positioning module and an acceleration positioning module;
the data acquisition judging module is used for judging whether the positioning data of the unmanned aerial vehicle to be positioned can be normally acquired or not;
the primary positioning module is used for acquiring a first positioning position of the unmanned aerial vehicle to be positioned based on the positioning base station and acquiring a second positioning position of the unmanned aerial vehicle to be positioned based on the camera when the positioning data of the unmanned aerial vehicle to be positioned can be normally acquired;
acquiring a first positioning position of the unmanned aerial vehicle to be positioned based on the positioning base station comprises the following steps:
UWB ranging is carried out on the unmanned aerial vehicle to be positioned through a plurality of positioning base stations, and distances between the unmanned aerial vehicle to be positioned and the positioning base stations are obtained;
calculating a first positioning position of the unmanned aerial vehicle based on the plurality of distances between the unmanned aerial vehicle to be positioned and the positioning base stations;
the trigger event judging module is used for judging whether the second positioning position meets the trigger event or not;
the secondary positioning module is used for taking the first positioning position as the positioning position of the unmanned aerial vehicle to be positioned when the second positioning position does not meet the triggering event, and carrying out weighted fusion on the first positioning position and the second positioning position when the second positioning position meets the triggering event to obtain the positioning position of the unmanned aerial vehicle to be positioned;
the acceleration positioning module is used for acquiring the positioning position of the unmanned aerial vehicle to be positioned based on the accelerometer of the unmanned aerial vehicle to be positioned when the positioning data of the unmanned aerial vehicle to be positioned cannot be normally acquired;
acquiring the positioning position of the unmanned aerial vehicle to be positioned based on the accelerometer of the unmanned aerial vehicle to be positioned, comprising:
using (x0, y0, z0) to denote the position of the unmanned aerial vehicle at the latest time t0 before the positioning data of the unmanned aerial vehicle to be positioned could no longer be normally acquired;
establishing a dynamics model;
acquiring the acceleration of the unmanned aerial vehicle at each time from t0 to the positioning time t;
inputting the accelerations into the dynamics model to obtain the positioning position (xt, yt, zt) of the unmanned aerial vehicle at the positioning time t;
The kinetic model is as follows:
where vx(t), vy(t) and vz(t) are the three-dimensional velocity components of the unmanned aerial vehicle to be positioned at time t; at time t0 the initial velocity of the unmanned aerial vehicle to be positioned is 0, i.e., vx(t0) = vy(t0) = vz(t0) = 0; ax(t), ay(t) and az(t) denote the three-dimensional acceleration components of the unmanned aerial vehicle to be positioned at time t.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310484854.0A CN116193581B (en) | 2023-05-04 | 2023-05-04 | Indoor unmanned aerial vehicle hybrid positioning method and system based on member-collecting filtering |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116193581A CN116193581A (en) | 2023-05-30 |
CN116193581B true CN116193581B (en) | 2023-08-04 |
Family
ID=86440668
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310484854.0A Active CN116193581B (en) | 2023-05-04 | 2023-05-04 | Indoor unmanned aerial vehicle hybrid positioning method and system based on member-collecting filtering |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116193581B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110650427A (en) * | 2019-04-29 | 2020-01-03 | 国网浙江省电力有限公司物资分公司 | Indoor positioning method and system based on fusion of camera image and UWB |
CN111982100A (en) * | 2020-07-07 | 2020-11-24 | 广东工业大学 | Course angle resolving algorithm of unmanned aerial vehicle |
WO2022170863A1 (en) * | 2021-02-09 | 2022-08-18 | 华为技术有限公司 | Ultra-wideband positioning method and system |
CN115103439A (en) * | 2022-06-07 | 2022-09-23 | 北京钢铁侠科技有限公司 | Ultra-wideband visual auxiliary positioning method and device and storage medium |
CN115793007A (en) * | 2022-12-19 | 2023-03-14 | 交控科技股份有限公司 | Multi-source fusion positioning method and device applied to navigation service |
CN115979250A (en) * | 2023-03-20 | 2023-04-18 | 山东上水环境科技集团有限公司 | Positioning method based on UWB module, semantic map and visual information |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230113061A1 (en) * | 2021-10-12 | 2023-04-13 | Samsung Electronics Co., Ltd. | System and method for rf based robot localization |
Also Published As
Publication number | Publication date |
---|---|
CN116193581A (en) | 2023-05-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN1940591B (en) | System and method of target tracking using sensor fusion | |
Li et al. | Simultaneous registration and fusion of multiple dissimilar sensors for cooperative driving | |
JP2020528994A (en) | Vehicle navigation system using attitude estimation based on point cloud | |
CN113074727A (en) | Indoor positioning navigation device and method based on Bluetooth and SLAM | |
CN110837080A (en) | Rapid calibration method of laser radar mobile measurement system | |
CN109375168B (en) | RSSI-based low-speed moving vehicle positioning method | |
CN112923919B (en) | Pedestrian positioning method and system based on graph optimization | |
Rohani et al. | A new decentralized Bayesian approach for cooperative vehicle localization based on fusion of GPS and inter-vehicle distance measurements | |
Wen et al. | Object-detection-aided GNSS and its integration with lidar in highly urbanized areas | |
CN113706612A (en) | Underground coal mine vehicle positioning method fusing UWB and monocular vision SLAM | |
CN112346104A (en) | Unmanned aerial vehicle information fusion positioning method | |
CN114222240A (en) | Multi-source fusion positioning method based on particle filtering | |
CN113899369A (en) | ultra-wideband/PDR (pulse-modulated Power Rate) indoor positioning method based on adaptive noise reduction algorithm | |
CN116193581B (en) | Indoor unmanned aerial vehicle hybrid positioning method and system based on member-collecting filtering | |
CN114459467B (en) | VI-SLAM-based target positioning method in unknown rescue environment | |
Obst et al. | Probabilistic multipath mitigation for GNSS-based vehicle localization in urban areas | |
CN116105726A (en) | Multi-sensor fusion type wall climbing robot elevation positioning method | |
Kong et al. | Hybrid indoor positioning method of BLE and monocular VINS based smartphone | |
CN113219452B (en) | Distributed multi-radar joint registration and multi-target tracking method under unknown vision field | |
CN114705223A (en) | Inertial navigation error compensation method and system for multiple mobile intelligent bodies in target tracking | |
CN112798020A (en) | System and method for evaluating positioning accuracy of intelligent automobile | |
Gingras et al. | Signal processing requirements and uncertainty modeling issues in cooperative vehicular positioning | |
Han et al. | Research on indoor positioning based on fusion of Wi-Fi/PDR | |
Souli et al. | Online Distributed Relative Positioning Utilizing Multiple Cooperative Autonomous Agents | |
Khamooshi | Cooperative Vehicle Perception and Localization Using Infrastructure-based Sensor Nodes |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||