CN110220513B - Target positioning method, system, unmanned aerial vehicle and storage medium - Google Patents


Info

Publication number
CN110220513B
CN110220513B (application CN201910364502.5A)
Authority
CN
China
Prior art keywords
target
unmanned aerial
aerial vehicle
sampling moment
targets
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910364502.5A
Other languages
Chinese (zh)
Other versions
CN110220513A (en)
Inventor
徐升
欧勇盛
李�浩
王志扬
段江哗
吴新宇
Current Assignee
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN201910364502.5A priority Critical patent/CN110220513B/en
Publication of CN110220513A publication Critical patent/CN110220513A/en
Application granted granted Critical
Publication of CN110220513B publication Critical patent/CN110220513B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application discloses a target positioning method, system, unmanned aerial vehicle and storage medium. The method comprises: obtaining the state information of each target at the current sampling moment by using the direction data of a plurality of targets measured by the same unmanned aerial vehicle at the current sampling moment; evaluating the positioning accuracy with which the unmanned aerial vehicle would measure the plurality of targets at each of a plurality of undetermined path points; taking the undetermined path point whose positioning accuracy meets a preset condition as the next path point of the unmanned aerial vehicle; and controlling the unmanned aerial vehicle to fly to the next path point, continuing to measure during flight to obtain the direction data of the multiple targets at the next sampling moment, and obtaining the state information of each target at the next sampling moment from that direction data. By determining the next path point in this way during the positioning of a plurality of targets, the target positioning method provided by the application enables the plurality of targets to be positioned more accurately.

Description

Target positioning method, system, unmanned aerial vehicle and storage medium
Technical Field
The application relates to the field of unmanned aerial vehicles, in particular to a target positioning method, a target positioning system, an unmanned aerial vehicle and a storage medium.
Background
Compared with satellite positioning technology, unmanned aerial vehicles have advantages in accuracy and search speed, and so have been widely applied to target tracking and searching in various fields in recent years. However, when a search task is executed over multiple targets, the search range is often large, so the prior art typically deploys multiple unmanned aerial vehicles operating in a distributed manner to lock onto the targets quickly. Coordinating distributed unmanned aerial vehicles is difficult, however, so how to position multiple targets more accurately with less resource investment is the technical problem to be solved at present.
Disclosure of Invention
The technical problem mainly solved by the application is to provide a target positioning method, system, unmanned aerial vehicle and storage medium by which more accurate positioning of a plurality of targets can be realized.
In order to solve the technical problem, the application adopts a technical scheme that: a method of target localization is provided, the method comprising:
obtaining the state information of each target at the current sampling moment by using the direction data of a plurality of targets at the current sampling moment, measured by the same unmanned aerial vehicle;
evaluating the positioning accuracy with which the unmanned aerial vehicle would measure the plurality of targets at each of a plurality of undetermined path points;
taking the undetermined path point whose positioning accuracy meets a preset condition as the next path point of the unmanned aerial vehicle;
and controlling the unmanned aerial vehicle to fly towards the next path point, continuing to measure during flight to obtain the direction data of the multiple targets at the next sampling moment, and obtaining the state information of each target at the next sampling moment by using that direction data.
In order to solve the above technical problem, another technical solution adopted by the present application is to provide an object positioning system, where the system includes a processor and a memory connected to the processor;
wherein the memory is used for storing program data;
the processor is configured to execute the program data to perform the method of object localization as described above.
To solve the above technical problem, a further technical solution adopted by the present application is to provide an unmanned aerial vehicle, which includes a sensing assembly, a driving circuit and a target positioning system, the sensing assembly and the driving circuit each being connected with a processor in the target positioning system;
the sensing assembly is used for acquiring position data of a plurality of targets;
the driving circuit is used for responding to a control instruction of the processor and flying to a next path point;
the target positioning system is used for positioning the plurality of targets, and is the system described above.
In order to solve the above technical problem, a further technical solution of the present application is to provide a storage medium, where program data are stored, and when the program data are executed, the method for positioning an object as described above is implemented.
According to the above scheme, the direction data of a plurality of targets measured by the same unmanned aerial vehicle at the current sampling moment is used to obtain the state information of each target at the current sampling moment; the positioning accuracy with which the unmanned aerial vehicle would measure the plurality of targets at each of a plurality of undetermined path points is evaluated; the undetermined path point whose positioning accuracy meets the preset condition is taken as the next path point of the unmanned aerial vehicle; and the unmanned aerial vehicle is controlled to fly to the next path point, measuring continuously during flight to obtain the direction data of the plurality of targets at the next sampling moment, from which the state information of each target at the next sampling moment is obtained. Because the next path point of the unmanned aerial vehicle is determined based on the accuracy with which the plurality of targets can be positioned from measurements at the undetermined path points, a path point that enables more accurate positioning of the plurality of targets can be selected, so the plurality of targets can be positioned more accurately.
Drawings
FIG. 1 is a schematic flow chart diagram illustrating an embodiment of a method for object localization according to the present application;
FIG. 2 is a schematic illustration of an azimuth and elevation angle of a target relative to a drone in an embodiment of a method of target location of the present application;
FIG. 3 is a schematic flow chart diagram illustrating a method for object localization according to another embodiment of the present application;
FIG. 4 is a schematic flow chart diagram illustrating a method of object localization according to another embodiment of the present application;
FIG. 5 is a schematic flow chart diagram illustrating a method of locating a target according to yet another embodiment of the present application;
FIG. 6 is a flowchart illustrating an embodiment of a method for object localization according to the present application;
FIG. 7a is a schematic diagram illustrating a path search range in a further embodiment of a method for object localization according to the present application;
FIG. 7b is a schematic diagram of a cost function in an embodiment of a method of object localization according to the present application;
FIG. 7c is a schematic diagram of a cost function in another embodiment of a method of object localization according to the present application;
FIG. 8 is a schematic flow chart diagram illustrating a method of locating an object according to yet another embodiment of the present application;
FIG. 9 is a schematic flow chart diagram illustrating a method of object localization according to another embodiment of the present application;
FIG. 10 is a schematic flow chart diagram illustrating a method for locating an object according to yet another embodiment of the present application;
FIG. 11 is a schematic flow chart diagram illustrating a method of object localization according to another embodiment of the present application;
FIG. 12 is a schematic block diagram illustrating an embodiment of a system for object localization according to the present application;
fig. 13 is a schematic structural diagram of an embodiment of an unmanned aerial vehicle according to the present application;
fig. 14 is a schematic structural diagram of an embodiment of a storage medium according to the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless explicitly specifically limited otherwise. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
It should first be explained that the technical scheme provided by the application can be applied to an unmanned aerial vehicle: the drone can acquire the initial position data of a target through its own sensing assembly, and in the technical scheme provided by the application a single drone can be used to position a plurality of targets. In one embodiment, the drone is equipped with an AOA (Angle-of-Arrival) sensor for tracking multiple moving targets in an area in real time. The type of drone to which the method provided by the present application is applied is not limited.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an embodiment of a target positioning method according to the present application.
S10: and acquiring direction data of a plurality of targets measured by the same unmanned aerial vehicle at the current sampling moment.
The sampling time refers to the time point at which the direction data of a target is sampled. The interval between adjacent sampling times, defined as the time step, is set as needed; its length can be adjusted according to actual requirements and is not limited herein. The technical scheme provided by the application can position a plurality of targets tracked simultaneously by the same drone, so step S10 obtains the direction data of the plurality of targets measured by the same drone at the current sampling moment.
In the current embodiment, the direction data includes measurements of the azimuth and pitch angles of the target relative to the drone. Referring to fig. 2, fig. 2 is a schematic diagram of the azimuth and pitch angles of a target relative to a drone in an embodiment of the target positioning method of the present application. For the target $P_{k,1}$, the azimuth $\theta_{k,1}$ of $P_{k,1}$ relative to the drone is the angle between the reference line $Ar_k$ and the line connecting the drone $r_k$ with the orthographic projection $P'_{k,1}$ of the target on the xy plane, where the reference line $Ar_k$ is the straight line through the drone $r_k$ parallel to the x axis in the xy plane; the pitch angle $\phi_{k,1}$ of $P_{k,1}$ relative to the drone $r_k$ is the angle between the line connecting $P_{k,1}$ with the drone and the line connecting the orthographic projection $P'_{k,1}$ with the drone, where $r_k$ is the position of the drone at the k-th sampling instant. Since the drone has a certain volume, its position may be taken as the position of the sensing assembly equipped on the drone. In the technical solution provided by the present application, the same drone obtains the direction data of the multiple targets at the current sampling time simultaneously, and the position information of the multiple targets relative to the drone differs from target to target; neither the specific type of the direction data obtained in step S10 nor the specific method of obtaining it is limited.
In the current embodiment, the direction data of the multiple targets at the current sampling instant is obtained by the AOA sensor equipped on the drone. Specifically, the AOA sensor can acquire direction data of multiple different targets simultaneously at discrete sampling instants k = 1, 2, 3, …. In the technical scheme provided herein, k denotes the sampling instant, i identifies the i-th target among the targets tracked simultaneously by the drone, and $p_{k,i}$ denotes the position information of target i at the k-th sampling instant. In the present embodiment, the mathematical model of the ideal azimuth $\theta_{k,i}$ and pitch angle $\phi_{k,i}$ of target $p_{k,i}$ is given by formula (1):
$\theta_{k,i} = \tan^{-1}\dfrac{p_{yk,i} - r_{yk}}{p_{xk,i} - r_{xk}}$, $\phi_{k,i} = \tan^{-1}\dfrac{p_{zk,i} - r_{zk}}{\left\|[p_{xk,i}, p_{yk,i}] - [r_{xk}, r_{yk}]\right\|}$  (1)
where $p_{k,i} = [p_{xk,i}, p_{yk,i}, p_{zk,i}]$ and $r_k = [r_{xk}, r_{yk}, r_{zk}]$ are the position information of target i and of the drone at the k-th sampling instant, respectively: $p_{xk,i}$, $p_{yk,i}$ and $p_{zk,i}$ are the coordinates of target i on the x, y and z coordinate axes at the k-th sampling instant, and $r_{xk}$, $r_{yk}$, $r_{zk}$ are the position data of the drone in the x-axis, y-axis and z-axis directions at the k-th sampling instant. $\left\|[p_{xk,i}, p_{yk,i}] - [r_{xk}, r_{yk}]\right\|$ denotes the Euclidean norm of $[p_{xk,i}, p_{yk,i}] - [r_{xk}, r_{yk}]$, and $\tan^{-1}$ is the four-quadrant arctangent function.
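As a concrete illustration of formula (1), the ideal azimuth and pitch angle can be computed with a four-quadrant arctangent. The following sketch (function name and coordinates are illustrative, not from the patent) mirrors the model:

```python
import numpy as np

def ideal_angles(p, r):
    """Ideal azimuth and pitch angle of target p relative to drone r, formula (1)."""
    dx, dy, dz = p[0] - r[0], p[1] - r[1], p[2] - r[2]
    theta = np.arctan2(dy, dx)              # four-quadrant arctangent in the xy plane
    phi = np.arctan2(dz, np.hypot(dx, dy))  # elevation above the xy plane
    return theta, phi

# Drone at the origin, target at (1, 1, 1): azimuth pi/4, pitch atan(1/sqrt(2)).
theta, phi = ideal_angles([1.0, 1.0, 1.0], [0.0, 0.0, 0.0])
```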
In another embodiment, because the drone is affected by wind, engine vibration and other factors during flight, the measurements of the airborne sensing assembly contain considerable Gaussian noise, which affects the target positioning accuracy. Correspondingly, in the current embodiment, the noise encountered in practical application is taken into account when constructing the mathematical models of the azimuth and pitch angles; the noisy azimuth $\tilde\theta_{k,i}$ and pitch angle $\tilde\phi_{k,i}$ of target $p_{k,i}$ follow the actual measurement model of formula (2), the outputs of which may simply be referred to as actual measurement values:

$\tilde\theta_{k,i} = \theta_{k,i} + n_{k,i}$, $\tilde\phi_{k,i} = \phi_{k,i} + m_{k,i}$  (2)
where $n_{k,i}$ and $m_{k,i}$ are independent additive white Gaussian noises with zero mean and variances $\sigma_{\theta,i}^2$ and $\sigma_{\phi,i}^2$, respectively, and $\theta_{k,i}$ and $\phi_{k,i}$ are the ideal azimuth and pitch angles. The position data and speed of the drone can be obtained at least by an onboard navigation device; it is understood that the manner of obtaining the position data and speed of the drone is not limited herein.
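The measurement model of formula (2) simply adds zero-mean Gaussian noise to the ideal angles; a minimal simulation sketch (names and noise levels are illustrative) is:

```python
import numpy as np

def measured_angles(p, r, sigma_theta, sigma_phi, rng):
    """Noisy azimuth/pitch measurement, formula (2): ideal angle plus
    zero-mean Gaussian noise n_{k,i}, m_{k,i}."""
    dx, dy, dz = p[0] - r[0], p[1] - r[1], p[2] - r[2]
    theta = np.arctan2(dy, dx) + rng.normal(0.0, sigma_theta)
    phi = np.arctan2(dz, np.hypot(dx, dy)) + rng.normal(0.0, sigma_phi)
    return theta, phi

rng = np.random.default_rng(0)
# With zero noise standard deviation the measurement reduces to the ideal model.
t0, p0 = measured_angles([3.0, 4.0, 5.0], [0.0, 0.0, 0.0], 0.0, 0.0, rng)
```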
S20: and processing the direction data of each target at the current sampling moment by using a three-dimensional pseudo-linear Kalman filtering algorithm to obtain the state information of each target at the current sampling moment.
That is, the direction data of each target measured at the current sampling moment is processed to obtain the state information of each target at the current sampling moment.
The direction data of the target at the current sampling moment includes the azimuth angle and pitch angle measured at the current sampling moment; the three-dimensional pseudo-linear Kalman filtering algorithm includes a pseudo-linear Kalman filtering algorithm for the xy plane and a pseudo-linear Kalman filtering algorithm for the z axis; and the state information of the target includes at least the three-dimensional coordinate position of the target and its speed of motion along the three axes. It can be understood that in other embodiments the state information of the target may further include other contents.
In the embodiment illustrated in fig. 1, by acquiring the direction data of the multiple targets measured by the same unmanned aerial vehicle at the current sampling time, and then processing the direction data of each target at the current sampling time by using the three-dimensional pseudo-linear kalman filter algorithm, the state information of each target at the current sampling time is obtained, so that the multiple targets can be well positioned by the same unmanned aerial vehicle, and resources required to be invested in positioning and tracking the multiple targets are saved.
As described above, the three-dimensional pseudo-linear Kalman filtering algorithm includes an xy-plane pseudo-linear Kalman filtering algorithm and a z-axis pseudo-linear Kalman filtering algorithm.
Correspondingly, in another embodiment, the step S20 uses a three-dimensional pseudo-linear kalman filtering algorithm to process the directional data of each target at the current sampling time, and obtaining the state information of each target at the current sampling time includes: processing the direction data of each target at the current sampling moment by using a pseudo-linear Kalman filtering algorithm of the xy plane to obtain the state information of the xy plane of each target at the current sampling moment; and processing the direction data of each target at the current sampling moment by using a pseudo-linear Kalman filtering algorithm of the z axis to obtain the state information of the z axis of each target at the current sampling moment.
In the current embodiment, whether the pseudo-linear Kalman filtering algorithm of the xy plane is applied first to the direction data of each target at the current sampling moment to obtain the state information of each target on the xy plane, or the pseudo-linear Kalman filtering algorithm of the z axis is applied first to obtain the state information of each target on the z axis, is not specially limited; the order can be set and adjusted according to the requirements of different embodiments.
Further, please refer to fig. 3, wherein fig. 3 is a schematic flowchart illustrating a target positioning method according to another embodiment of the present application. In the current embodiment, the step of processing the direction data of each target at the current sampling time by using a pseudo-linear kalman filtering algorithm of xy planes/z axes to obtain the state information of the xy planes/z axes of each target at the current sampling time includes steps S31 to S33.
S31: and predicting to obtain the predicted state information of the target at the current sampling moment by using the final state information of the target at the previous sampling moment.
In the present embodiment, the three-dimensional dynamic model can be divided into two parts, the xy plane and the z axis. The dynamic model of the target on the xy plane is defined over the state $a_{k,i} = [p_{xk,i}, p_{yk,i}, \dot p_{xk,i}, \dot p_{yk,i}]^T$ and the dynamic model of the target in the z-axis direction over the state $b_{k,i} = [p_{zk,i}, \dot p_{zk,i}]^T$, where the superscript T denotes the transpose of a vector or matrix. Specifically, the predicted state information of the target on the xy plane at the current sampling time is obtained from the final state information of the target on the xy plane at the previous sampling time via formula (3), and the predicted state information of the target in the z-axis direction at the current sampling time from the final state information in the z-axis direction at the previous sampling time via formula (4).

$a_{k|k-1,i} = U a_{k-1|k-1,i}$  (3)

$b_{k|k-1,i} = G b_{k-1|k-1,i}$  (4)

where k denotes the current sampling instant and k-1 the previous sampling instant; $a_{k|k-1,i}$ is the predicted state information of target i on the xy plane at the current sampling instant, $a_{k-1|k-1,i}$ the final state information of target i on the xy plane at the previous sampling instant, $b_{k|k-1,i}$ the predicted state information of target i in the z-axis direction at the current sampling instant, and $b_{k-1|k-1,i}$ the final state information of target i in the z-axis direction at the previous sampling instant.
In the technical scheme provided by the application, the state information of the i-th target in three-dimensional space is defined as $[p_{xk,i}, p_{yk,i}, p_{zk,i}, \dot p_{xk,i}, \dot p_{yk,i}, \dot p_{zk,i}]^T$, i.e. the three-dimensional position of the target together with its speeds along the three axes, and the dynamic model of the target satisfies formulas (3) and (4) up to the process noise. Over a sufficiently small time interval, the motion of the target in the xy plane can be regarded as uniform linear motion; U is the transition matrix of this uniform linear motion and G is the motion transition matrix in the z-axis direction, specifically

$U = \begin{bmatrix} 1 & 0 & T & 0 \\ 0 & 1 & 0 & T \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$, $G = \begin{bmatrix} 1 & T \\ 0 & 1 \end{bmatrix}$

where $\dot p_{xk,i}$ denotes the movement speed of target i in the x-axis direction at the current sampling instant and, similarly, $\dot p_{yk,i}$ and $\dot p_{zk,i}$ denote the movement speeds of target i in the y-axis and z-axis directions, respectively; the element T in the matrices is the time step between discrete instants k and k+1, and $q_{k,i}$ denotes the system process noise.
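The prediction step of formulas (3) and (4) is a single matrix-vector product per target. The sketch below works under the constant-velocity model described above; the state ordering [positions, velocities] and the numeric values of T and the states are assumptions for illustration:

```python
import numpy as np

T = 0.5  # time step between sampling instants (illustrative value)

# Constant-velocity transition matrices for the xy plane (U) and the z axis (G).
U = np.array([[1, 0, T, 0],
              [0, 1, 0, T],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
G = np.array([[1, T],
              [0, 1]], dtype=float)

# a_{k-1|k-1,i}: xy position (2, 3) moving with velocity (1, -1);
# b_{k-1|k-1,i}: altitude 10 climbing at 0.2.
a_prev = np.array([2.0, 3.0, 1.0, -1.0])
b_prev = np.array([10.0, 0.2])

a_pred = U @ a_prev  # formula (3): xy-plane prediction
b_pred = G @ b_prev  # formula (4): z-axis prediction
```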
S32: and processing the direction data of the target at the current sampling moment to obtain the measurement state information of the target at the current sampling moment. The measurement state information of the target in the xy plane at the current sampling moment is obtained by converting the nonlinear direction data of the target into a pseudo-linear form.
where the measurement state information $h_{k,i}$ of target i in the xy plane is obtained from formula (5), and the measurement state information $l_{k,i}$ of target i in the z-axis direction from formula (6).

$h_{k,i} = [\sin\tilde\theta_{k,i}, -\cos\tilde\theta_{k,i}]^T$  (5)

$l_{k,i} = [1, 0]^T$  (6)
where $\tilde\theta_{k,i}$ is the actual measured value of the azimuth of target i measured by the drone at the current sampling instant k, and the measurement state of target i in the z-axis direction can be directly defined as $l_{k,i} = [1, 0]^T$. In the technical scheme provided by the application, converting the nonlinear operation on the actual measured values obtained by the drone's sensing assembly into a linear operation avoids the data loss caused by approximating the data, thereby reducing the error in positioning the multiple targets.
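The pseudo-linear idea can be illustrated as follows: with a pseudo-measurement vector of the form $h = [\sin\tilde\theta, -\cos\tilde\theta]^T$ (an assumed standard form, since the patent's original formula image is not preserved), a noiseless bearing turns into a linear constraint on the target position, because h is normal to the measured line of sight:

```python
import numpy as np

def plkf_measurement_vector(theta_meas):
    """Pseudo-linear bearing vector (one common form of formula (5), an assumption):
    h = [sin(theta), -cos(theta)]^T is normal to the measured bearing line."""
    return np.array([np.sin(theta_meas), -np.cos(theta_meas)])

# For a noiseless bearing, h is orthogonal to (target - drone) in the xy plane,
# i.e. the nonlinear bearing becomes the linear constraint h^T (p - r) = 0.
p_xy = np.array([4.0, 2.0])
r_xy = np.array([1.0, -2.0])
theta = np.arctan2(p_xy[1] - r_xy[1], p_xy[0] - r_xy[0])
h = plkf_measurement_vector(theta)
residual = h @ (p_xy - r_xy)
```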
S33: and obtaining the final state information of the target at the current sampling moment based on the predicted state information and the measured state information of the target at the current sampling moment.
The final state information of the target at the current sampling moment is obtained from the predicted state information and the measurement state information of the target at the current sampling moment obtained above, and includes at least the state information of the target on the xy plane and the state information of the target in the z-axis direction. See the detailed description of the content of step S33 in fig. 4.
Further, please refer to fig. 4, which is a schematic flowchart illustrating a target positioning method according to another embodiment of the present application. In the current embodiment, step S41 precedes step S33 illustrated in fig. 3, and steps S42 to S45 follow step S33 (obtaining the final state information of the target at the current sampling moment based on the predicted state information and the measurement state information of the target at the current sampling moment). Specifically, the method in the current embodiment includes:
S41: and predicting to obtain the predicted state accuracy of the target at the current sampling moment by using the final state accuracy of the target at the previous sampling moment.
The final state accuracy is a parameter measuring the accuracy of the final state information; correspondingly, the predicted state accuracy is a parameter measuring the accuracy of the predicted state information. In the technical scheme of the application, after the final state information of the target is obtained at each sampling moment, the final state accuracy of the target at that sampling moment is further obtained, and W denotes the measurement accuracy of target i on the xy plane. Specifically, in the present embodiment $W_{k|k-1,i}$ denotes the predicted state accuracy of target i at the current sampling time calculated from the final state accuracy $W_{k-1|k-1,i}$ at the previous sampling time; it is obtained from formula (7).

$W_{k|k-1,i} = U W_{k-1|k-1,i} U^T + M_{k-1,i}$  (7)

where $U^T$ denotes the transpose of U and $M_{k-1,i}$ is the covariance matrix of the process noise at time k-1.
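Formula (7) is the standard covariance-prediction step of a Kalman filter. A minimal sketch follows; the matrix sizes match the assumed four-dimensional xy-plane state, and the numeric values are illustrative:

```python
import numpy as np

T = 0.5
U = np.array([[1, 0, T, 0],
              [0, 1, 0, T],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)

W_prev = np.eye(4)         # W_{k-1|k-1,i}: final state accuracy at the previous instant
M_prev = 0.01 * np.eye(4)  # M_{k-1,i}: process-noise covariance (illustrative)

W_pred = U @ W_prev @ U.T + M_prev  # formula (7): predicted state accuracy
```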
S42: and acquiring the error of the measurement state information of the target at the current sampling moment relative to the prediction state information.
where the error $g_{k,i}$ of the measurement state information of the target at the current sampling instant relative to the prediction state information is obtained from formula (8).

$g_{k,i} = h_{k,i}^T \left( r_{xyk} - [p_{xk|k-1,i}, p_{yk|k-1,i}]^T \right)$  (8)

where $h_{k,i}^T$ is the transpose of the measurement state information of the target in the xy plane, $r_{xyk} = [r_{xk}, r_{yk}]^T$ is the position data of the drone on the xy plane at the current sampling moment, and $[p_{xk|k-1,i}, p_{yk|k-1,i}]^T$ is the position component of the predicted state information $a_{k|k-1,i}$ of target i at the current sampling instant. In the current embodiment, the error of the measurement state information of the target at the current sampling time relative to the prediction state information is thus the product of the transposed measurement state vector of the target in the xy plane and the difference between the position information of the drone and the predicted position of the target at the current sampling moment.
S43: and acquiring the distance between the target and the unmanned aerial vehicle, and obtaining the measurement accuracy of the unmanned aerial vehicle on the target at the current sampling moment based on the distance between the target and the unmanned aerial vehicle.
where the distance between target i and the drone at the current sampling moment is obtained from formula (9), and the measurement accuracy $\xi_{k,i}$ of target i at the current sampling moment from formula (10).

$\Delta = \left\| [p_{xk|k-1,i}, p_{yk|k-1,i}]^T - [r_{xk}, r_{yk}]^T \right\|$  (9)

$\xi_{k,i} = \Delta^2 \sigma_{\theta,i}^2$  (10)

where $\Delta$ denotes the distance between the target and the drone at the current sampling moment and $p_{xk|k-1,i}, p_{yk|k-1,i}$ are the position components of the predicted state of the target; the distance is obtained with the distance formula from the predicted state information of the target at the current sampling moment and the current position information of the drone. The current position information of the drone can be obtained from the movement speed and track of the drone and is a clearly known quantity, which is not described further herein. $\sigma_{\theta,i}$ is the standard deviation of the Gaussian noise on the azimuth measurement of target i, which is a known quantity.
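Formulas (9) and (10) can be sketched as below. Note that the closed form $\xi = \Delta^2\sigma_\theta^2$ used here is an assumption reconstructed from the pseudo-linear-filtering context, since the patent's original formula image is not preserved; all numeric values are illustrative:

```python
import numpy as np

p_pred_xy = np.array([30.0, 40.0])  # [p_{xk|k-1,i}, p_{yk|k-1,i}]: predicted target position
r_xy = np.array([0.0, 0.0])         # [r_{xk}, r_{yk}]: drone position on the xy plane
sigma_theta = 0.02                  # sigma_{theta,i}: azimuth-noise std in radians

delta = np.linalg.norm(p_pred_xy - r_xy)  # formula (9): target-drone distance
xi = (delta * sigma_theta) ** 2           # formula (10), assumed form: farther targets
                                          # yield a larger pseudo-measurement variance
```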
S44: and obtaining the weight of the measurement state information of the target at the current sampling moment by utilizing the measurement state information, the prediction state accuracy and the measurement accuracy of the target at the current sampling moment.
Wherein the weight ω_{k,i} of the measured state information of the target at the current sampling moment is obtained from formula (11), which is as follows:

ω_{k,i} = W_{k|k-1,i} h_{k,i} (h_{k,i}^T W_{k|k-1,i} h_{k,i} + ξ_{k,i})^{-1}    (11)

where the weight ω_{k,i} expresses the share of the measured state information of the target at the current sampling moment in the combination of its measured and predicted state information; the other parameters in the formula are as defined elsewhere in this application.
S45: and obtaining final state information of the target at the current sampling moment by using the error and the weight.
Further, step S45 includes: obtaining the final state information a_{k|k,i} of target i in the xy plane at the current sampling moment using formula (12). In the current embodiment, the final state information a_{k|k,i} of target i equals the predicted state information of the target at the current sampling moment plus the product of the weight ω_{k,i} and the error g_{k,i}. Specifically, formula (12) is as follows:

a_{k|k,i} = a_{k|k-1,i} + ω_{k,i} g_{k,i}    (12)
Further, in the embodiment illustrated in fig. 3, after the final state information of the target at the current sampling moment is obtained through step S33, the method provided by the present application further includes step S46, which obtains the final state accuracy of target i at the current sampling moment.
S46: and obtaining the final state accuracy of the target at the current sampling moment based on the predicted state accuracy, the measured state information and the weight of the target at the current sampling moment.
Further, step S46 includes obtaining the final state accuracy W_{k|k,i} of target i on the xy plane at the current sampling moment using formula (13), and storing the output for use when positioning the target at the next sampling moment:

W_{k|k,i} = (I − ω_{k,i} h_{k,i}^T) W_{k|k-1,i}    (13)

where I is an identity matrix and W_{k|k-1,i} is the predicted covariance matrix, i.e. the predicted state accuracy of the target at the current sampling moment; ω_{k,i} and h_{k,i} are obtained from the formulas above and are not repeated here.
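Taken together, steps S42 to S46 on the xy plane amount to a scalar-measurement Kalman-style update. The sketch below is a minimal illustration under the assumption that h is the pseudo-linear measurement vector, g the scalar error of step S42 and ξ the measurement accuracy of step S43; the function and variable names are ours, not the patent's.

```python
import numpy as np

def plkf_update_xy(a_pred, W_pred, h, g, xi):
    """One xy-plane update: a_pred/W_pred are the predicted state and
    covariance, h the pseudo-linear measurement vector, g the scalar
    error, xi the distance-dependent measurement accuracy."""
    s = float(h @ W_pred @ h) + xi                # denominator of the gain, formula (11)
    omega = (W_pred @ h) / s                      # weight omega_{k,i}, formula (11)
    a_new = a_pred + omega * g                    # final state, formula (12)
    W_new = (np.eye(len(a_pred)) - np.outer(omega, h)) @ W_pred  # accuracy, formula (13)
    return a_new, W_new

# toy example with a 4-dimensional state [x, y, vx, vy]
a_pred = np.array([10.0, 5.0, 1.0, 0.0])
W_pred = np.eye(4) * 4.0
h = np.array([1.0, 0.0, 0.0, 0.0])
a_new, W_new = plkf_update_xy(a_pred, W_pred, h, g=2.0, xi=1.0)
```

Note how the covariance shrinks along the observed direction while the unobserved components are untouched, which is the behaviour the patent relies on when it later uses tr(W_{k|k,i}) as a positioning-accuracy measure.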
In yet another embodiment, the solving process of the final state information of the target in the z-axis direction is further described.
First, the final state accuracy S_{k-1|k-1,i} of the target in the z-axis direction at the previous sampling moment is used to predict the state accuracy S_{k|k-1,i} of the target in the z-axis direction at the current sampling moment. The calculation is shown in formula (14).
S k|k-1,i =GS k-1|k-1,i G T +Q k-1,i (14)
Wherein Q k-1,i The covariance matrix of the system process noise is shown, and G is the motion transformation matrix in the z-axis direction.
Furthermore, the error c_{k,i} of the measured state information of the target at the current sampling moment relative to the predicted state information is obtained, specifically according to formula (15):

c_{k,i} = l_{k,i} − H b_{k|k-1,i}    (15)

where l_{k,i} is the measured state information of target i in the z-axis direction and H the corresponding observation matrix. The distance between the target and the unmanned aerial vehicle on the xy plane is then obtained from formula (16):

Δ̃_{k,i} = ||[p_{xk|k,i}, p_{yk|k,i}]^T − [r_{xk}, r_{yk}]^T||    (16)

and the measurement accuracy f_{k,i} of the unmanned aerial vehicle for the target at the current sampling moment is obtained from this distance with formula (17):

f_{k,i} = (Δ̃_{k,i} σ_{φ,i})^2    (17)

where σ_{φ,i} is the Gaussian white noise on the pitch measurement and φ̃_{k,i} is the actual measurement of the pitch angle of target i relative to the drone. It should be noted that, in the current embodiment, where the final state information of the target on the xy plane is obtained first and the final state information in the z-axis direction afterwards, the distance Δ̃_{k,i} between the target and the unmanned aerial vehicle on the xy plane is calculated with the already obtained, more accurate final state information [p_{xk|k,i}, p_{yk|k,i}]^T of the target on the xy plane at the current sampling moment. It is to be understood that when, in another embodiment, the final state information of the target on the xy plane and on the z axis are obtained simultaneously, the predicted state information of the target on the xy plane at the current sampling moment is used in the calculation instead.
Then, using the measured state information l_{k,i} of the target at the current sampling moment, the predicted state accuracy S_{k|k-1,i} and the measurement accuracy f_{k,i}, the weight t_{k,i} of the measured state information at the current sampling moment is obtained; t_{k,i} expresses the share of the actually measured state information of the target in the combination of its measured and predicted state information. The specific calculation is shown in formula (18):

t_{k,i} = S_{k|k-1,i} H (H^T S_{k|k-1,i} H + f_{k,i})^{-1}    (18)
Further, when calculating the final state information of the target on the z axis, step S45 in fig. 4 (obtaining the final state information of the target at the current sampling moment using the error and the weight) includes: obtaining the final state information b_{k|k,i} of target i on the z axis at the current sampling moment k using formula (19), which is as follows:

b_{k|k,i} = b_{k|k-1,i} + t_{k,i} c_{k,i}    (19)
and finally, after the final state information of the target in the z-axis direction at the current sampling moment is obtained, the final state accuracy of the target at the current sampling moment is further obtained.
Specifically, step S46 (obtaining the final state accuracy of the target at the current sampling moment based on its predicted state accuracy, measured state information and weight) includes: obtaining the final state accuracy S_{k|k,i} of target i on the z axis at the current sampling moment using formula (20):

S_{k|k,i} = (I − t_{k,i} H^T) S_{k|k-1,i}    (20)

where I is the identity matrix and S_{k|k-1,i} is the predicted state accuracy of target i in the z-axis direction at the current sampling moment k.
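The z-axis update mirrors the xy-plane one. A minimal sketch, under the assumption of a two-dimensional state [z, vz] with a position-only observation H = [1, 0]; c is the error of formula (15) and f the measurement accuracy of formula (17), both taken as given here:

```python
import numpy as np

def plkf_update_z(b_pred, S_pred, c, f):
    """One z-axis update: b_pred/S_pred are the predicted state and
    accuracy, c the error (innovation), f the measurement accuracy."""
    H = np.array([1.0, 0.0])                       # assumed observation direction
    s = float(H @ S_pred @ H) + f
    t = (S_pred @ H) / s                           # weight t_{k,i}, formula (18)
    b_new = b_pred + t * c                         # final state, formula (19)
    S_new = (np.eye(len(b_pred)) - np.outer(t, H)) @ S_pred  # accuracy, formula (20)
    return b_new, S_new

b_new, S_new = plkf_update_z(np.array([100.0, 2.0]), np.eye(2) * 9.0, c=3.0, f=1.0)
```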
In the current embodiment, the method provided by the application can realize more accurate positioning of the target by dividing the three-dimensional dynamic model into an xy plane and a z axis.
Referring to fig. 5, fig. 5 is a schematic flowchart illustrating a target positioning method according to another embodiment of the present application. In the current embodiment, the influence of the waypoint of the drone on the target positioning accuracy is further considered, so the method provided by the present application further includes the content shown in fig. 5, and the method includes:
s51: and evaluating the accurate positioning condition of the unmanned aerial vehicle for measuring a plurality of targets at the plurality of path points to be determined respectively.
In the process of positioning targets with the unmanned aerial vehicle, the search path of the unmanned aerial vehicle affects the positioning accuracy. The target positioning method therefore also determines, while or after processing the direction data of the targets at the current sampling moment with the three-dimensional pseudo-linear Kalman filtering algorithm, the next path point of the unmanned aerial vehicle that makes the target positioning more accurate. To determine the next path point, several path points to be determined are first selected preliminarily and then evaluated; specifically, the positioning accuracy with which the unmanned aerial vehicle would measure the multiple targets at each undetermined path point is evaluated. After these positioning-accuracy conditions have been obtained, the next path point of the unmanned aerial vehicle can be determined on their basis.
The positioning-accuracy condition is obtained based on the corresponding predicted state accuracy when the unmanned aerial vehicle is at the undetermined path point. Correspondingly, the positioning accuracy of each target is expressed by a cost function, so the positioning accuracy of each target can be obtained from the cost-function model of formula (21).
J i (r k )=J xy,i (r xyk )+J z,i (r zk )=tr(W k|k,i )+tr(S k|k,i ) (21)
where tr(·) denotes the matrix trace, i.e. tr(W_{k|k,i}) and tr(S_{k|k,i}) are the traces of W_{k|k,i} and S_{k|k,i} respectively, r_{xyk} is the position information of the unmanned aerial vehicle in the xy plane at the kth sampling moment, and r_{zk} its position information on the z axis at the kth sampling moment.
Furthermore, because in the technical scheme provided by the application the same unmanned aerial vehicle positions multiple targets simultaneously, the positioning accuracy of the multiple targets at an undetermined path point needs to be evaluated jointly when determining the next path point. Since the cost functions of the targets are additive, the cost functions of the individual targets can be summed to describe the positioning accuracy for multiple targets. In the current embodiment, the smaller the value of the cost function J_i(r_k) corresponding to a certain undetermined path point, the more accurate the positioning at that path point.
Furthermore, since the multiple targets tracked by the unmanned aerial vehicle at the same time carry different weights, the weight of each target is also taken into account when calculating the positioning accuracy of the multiple targets, which is obtained according to formula (22):

J(r_k) = Σ_{i=1}^{N_t} λ_i J_i(r_k)    (22)

where λ_i is the weight of each target, which can be preset and adjusted as required, and N_t denotes the number of targets.
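The weighted cost of formulas (21) and (22) reduces to traces of the per-target accuracy matrices. A small sketch, with hypothetical covariance values:

```python
import numpy as np

def waypoint_cost(W_list, S_list, lam):
    """Formula (21)/(22): J = sum_i lam_i * (tr(W_i) + tr(S_i)),
    where W_i is target i's xy-plane accuracy and S_i its z-axis accuracy."""
    return sum(l * (np.trace(W) + np.trace(S))
               for l, W, S in zip(lam, W_list, S_list))

# two targets: xy-plane (4x4) and z-axis (2x2) accuracy matrices
W_list = [np.eye(4) * 2.0, np.eye(4) * 1.0]
S_list = [np.eye(2) * 3.0, np.eye(2) * 0.5]
J = waypoint_cost(W_list, S_list, lam=[0.7, 0.3])
```

A smaller J means tighter covariances, i.e. better expected positioning at that candidate waypoint.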
S52: and taking the undetermined path point with the accurate positioning condition meeting the preset condition as the next path point of the unmanned aerial vehicle.
In the current embodiment, the undetermined path point whose positioning-accuracy condition meets the preset condition is selected as the next path point of the unmanned aerial vehicle. The preset condition at least comprises that the positioning accuracy of the undetermined path point is greater than or equal to a preset value, or that the positioning accuracy corresponding to the undetermined path point is optimal. Here, a positioning accuracy greater than or equal to the preset value means that the cost-function value corresponding to the undetermined path point is less than or equal to a certain preset value, and optimal positioning accuracy means that the cost-function value corresponding to the undetermined path point is minimal. For example, the path point with the optimal positioning accuracy for measuring the multiple targets can be determined as the next path point of the unmanned aerial vehicle.
S53: and controlling the unmanned aerial vehicle to fly to the next path point, continuously measuring in the flying process to obtain direction data of the multiple targets at the next sampling moment, and obtaining the state information of each target at the next sampling moment by using the direction data of the multiple targets at the next sampling moment.
After the next path point of the unmanned aerial vehicle is determined, a control command is generated and sent to a driving circuit of the unmanned aerial vehicle so as to control the unmanned aerial vehicle to fly to the next path point. And continuously measuring the direction data of the plurality of targets at the next sampling moment in the flying process, and obtaining the state information of each target at the next sampling moment by using the direction data of the plurality of targets at the next sampling moment.
Referring to fig. 6, fig. 6 is a schematic flowchart illustrating an embodiment of a target positioning method according to the present application.
S61: and obtaining the state information of each target at the current sampling moment by using the direction data of the plurality of targets at the current sampling moment, which is measured by the same unmanned aerial vehicle.
The direction data of the multiple targets at the current sampling moment is obtained by an onboard sensing assembly of the unmanned aerial vehicle; the state information of each target at the current sampling moment can be obtained according to at least one of the embodiments shown in figs. 1 to 4, and the specific process may refer to the above, which is not repeated here.
After the state information of each target at the current sampling time is obtained, and before the state information of each target at the next sampling time is obtained, the technical scheme provided by the application can execute the related evaluation and path optimization algorithm as described in the steps S62 to S63 to obtain a more accurate next path point, so as to improve the positioning accuracy of a plurality of targets.
S62: and evaluating the positioning accuracy with which the unmanned aerial vehicle measures the multiple targets at each of the plurality of path points to be determined.

After the state information of each target at the current sampling moment has been obtained, a number of undetermined path points is first determined; the preliminarily determined undetermined path points are found with a 3D grid search algorithm. Then, based on step S62, the positioning accuracy of the unmanned aerial vehicle measuring the multiple targets at each undetermined path point is evaluated. In the current embodiment, this positioning-accuracy condition is obtained by processing the positioning accuracies of the multiple targets: what step S62 evaluates is the overall accuracy of the state information of the multiple targets that would be obtained if the unmanned aerial vehicle flew to the undetermined path point and positioned the targets there, as explained for step S51 in fig. 5 above.
S63: and taking the undetermined path point with the accurate positioning condition meeting the preset condition as the next path point of the unmanned aerial vehicle.
The preset conditions are set according to requirements and can be adjusted as needed in different embodiments. They at least include: the positioning accuracy is optimal, or the positioning accuracy is greater than a set value, among other related conditions not specifically set forth herein.
S64: and controlling the unmanned aerial vehicle to fly to the next path point, continuously measuring in the flying process to obtain the direction data of the multiple targets at the next sampling moment, and obtaining the state information of each target at the next sampling moment by using the direction data of the multiple targets at the next sampling moment. Step S64 is the same as step S53 in fig. 5, and may specifically refer to the corresponding explanation in fig. 5, which is not described herein again.
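The cycle of S61 to S64 can be sketched as a small orchestration loop. All callables below are stand-ins of ours for the patent's sub-procedures (state estimation, waypoint evaluation, flight control, measurement), not its actual interfaces:

```python
def track_step(estimate_states, candidate_waypoints, cost_at, fly_to, measure):
    """One S61-S64 cycle: estimate the targets, pick the candidate
    waypoint with minimal cost, fly there, take the next measurement."""
    states = estimate_states()                        # S61: current state estimates
    best = min(candidate_waypoints, key=cost_at)      # S62-S63: evaluate and select
    fly_to(best)                                      # S64: command the drone
    return states, best, measure()                    # S64: next direction data

states, best, meas = track_step(
    estimate_states=lambda: {"target1": "state"},     # placeholder estimator
    candidate_waypoints=[(0, 0, 70), (49, 49, 14), (70, 0, 0)],
    cost_at=lambda w: w[0] ** 2 + w[1] ** 2,          # toy cost: prefer small x, y
    fly_to=lambda w: None,                            # placeholder flight command
    measure=lambda: ["bearing data"],                 # placeholder measurement
)
```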
Referring to fig. 7a, fig. 7a is a schematic diagram illustrating a path search range in another embodiment of a method for object location according to the present application. In the present embodiment, limited by the flight speed of the drone, all possible next waypoints of the drone lie on a sphere of radius v_xyz·T, and this sphere needs to be searched to find the best next waypoint. Fig. 7a shows the search area required by the 3D mesh search algorithm, where R1 indicates the current position of the drone and all search waypoints in R2 are possible next waypoints, defined in this application as undetermined waypoints.
In the present embodiment, all possible next waypoints on the spherical surface should satisfy the relationship defined by formula (23), and all possible next waypoints are defined as the undetermined waypoints.
r_{k+1} = r_k + μ_j, j = 1, 2, …, N_xy    (23)

where r_k is the position of the unmanned aerial vehicle at the current sampling moment and μ_j the vector from the current position of the unmanned aerial vehicle to the next path point; N_xy is defined as the number of searches in the xy plane, N_z the number of searches on the z axis, and j is the serial number of the currently evaluated undetermined path point. μ_j satisfies formula (24):

μ_j = v_xyz T [cos φ̄_s cos θ̄_l, cos φ̄_s sin θ̄_l, sin φ̄_s]^T    (24)
In formula (24), s = 1, 2, …, N_z, l = 1, 2, …, N_xy, and θ̄_l and φ̄_s are the azimuth and pitch angles of the undetermined-path-point vector relative to the path point where the unmanned aerial vehicle is currently located.
In the process of uniformly searching the next path point, the azimuth and pitch angles of each undetermined path point satisfy the relationship defined by formula (25):

θ̄_l = 2πl / N_xy,  φ̄_s = −π/2 + πs / N_z    (25)

where N_xy > 1 and N_z > 1, so that the azimuth angles cover the full circle and the pitch angles the range from −π/2 to π/2.
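The spherical grid of formulas (23) to (25) can be sketched as follows. The exact angle spacing is our reading of the (partly garbled) source formulas; the function generates N_xy × N_z candidate waypoints, all at distance v_xyz·T from the current position:

```python
import math

def pending_waypoints(r, v_xyz, T, N_xy, N_z):
    """Candidate next waypoints on the sphere of radius v_xyz*T around
    the current drone position r, on a uniform azimuth/pitch grid."""
    R = v_xyz * T
    pts = []
    for s in range(1, N_z + 1):
        phi = -math.pi / 2 + math.pi * s / N_z          # pitch angle
        for l in range(1, N_xy + 1):
            theta = 2 * math.pi * l / N_xy              # azimuth angle
            mu = (R * math.cos(phi) * math.cos(theta),  # formula (24)
                  R * math.cos(phi) * math.sin(theta),
                  R * math.sin(phi))
            pts.append(tuple(r[i] + mu[i] for i in range(3)))  # formula (23)
    return pts

pts = pending_waypoints((0.0, 0.0, 0.0), v_xyz=70.0, T=1.0, N_xy=8, N_z=4)
```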
After the azimuth and pitch angles of the undetermined path points have been obtained, the final state information of the targets at the current sampling moment can be recomputed for each undetermined path point based on the pending position information [r_{xk+1}, r_{yk+1}, r_{zk+1}] of the unmanned aerial vehicle, yielding the pending azimuth angle θ̂_{k+1,i} and pending pitch angle φ̂_{k+1,i} of each target when the unmanned aerial vehicle flies to the undetermined path point. The pending azimuth and pitch angles of each target are then substituted into formulas (26) and (27) respectively to obtain the pending accuracies corresponding to the undetermined path point; the obtained pending accuracies are substituted into formula (21), which is then further combined via formula (22) to obtain the cost-function value of the targets for the current undetermined path point. The pending accuracy on the xy plane is as described in formula (26):

ξ̂_{k+1,i} = (Δ̂_{k+1,i} σ_{θ,i})^2    (26)

where k+1 denotes the next sampling moment and Δ̂_{k+1,i} is the distance on the xy plane, at the next sampling moment, between target i and the sensing assembly with which the drone is equipped. Formula (27) is as follows:

f̂_{k+1,i} = (Δ̂_{k+1,i} σ_{φ,i})^2    (27)

where θ̂_{k+1,i} and φ̂_{k+1,i} are calculated from [r_{xk+1}, r_{yk+1}, r_{zk+1}] and the final target state information at moment k, after which the final cost-function value of the currently undetermined path point is obtained. Details of formulas (26) and (27) can be found in the embodiments corresponding to figs. 1 to 4 and are not repeated here.

J(r_{k+1}) = Σ_{i=1}^{N_t} λ_i J_i(r_{k+1})    (28)
As described above, the undetermined waypoint whose positioning-accuracy condition meets the preset condition is taken as the next waypoint of the unmanned aerial vehicle. In the present embodiment the preset condition is that the positioning accuracy is optimal, so the next path point r_{k+1} is the result of formula (29):

r_{k+1} = arg min_{j=1,…,N} J(r_k + μ_j)    (29)

where arg min takes the minimum of the cost function and N is the total number of search points determined by N_xy and N_z; for given N_xy and N_z, N = N_xy × N_z.
Because in the technical scheme provided by the application the same unmanned aerial vehicle positions multiple targets simultaneously, a local-minimum problem can appear in the determination of the next waypoint and degrade the accuracy of the target positioning, so the number and range of the search points need to be further optimized.
Referring to fig. 7b and fig. 7c, fig. 7b is a diagram illustrating the cost function for understanding the local-minimum problem according to an embodiment of the method for object localization; the local-minimum problem is explained with a specific numerical example. In one embodiment, the three targets start moving from [0, 300, 200]^T m, [100, −200, 100]^T m and [200, 100, 300]^T m respectively, with constant velocities [3, 3, 1]^T m/s, [6, 4, 2]^T m/s and [4, 0, 4]^T m/s. In addition, the targets undergo zero-mean Gaussian random accelerations while moving, with acceleration variances [0.5², 0.1², 0.1²]^T m⁴/s⁴, [0.1², 0.4², 0.2²]^T m⁴/s⁴ and [0.5², 0.5², 0.1²]^T m⁴/s⁴. For each target, the parameter of the three-dimensional pseudo-linear Kalman filter is preset to q_x = q_y = q_z = 0.01. The initial position of the unmanned aerial vehicle is [0, 0, 0]^T m, and the initial state matrix of the target estimator is initialized accordingly. The initial covariance matrices of the 3D PLKF (three-dimensional pseudo-linear Kalman filter algorithm) are the same for all three targets, diag[W_{0|0}, S_{0|0}] = diag[10⁴, 10⁴, 10⁴, 10⁴, 10⁴, 10⁴]. The speed of the unmanned aerial vehicle is v_xyz = 70 m/s and T = 1 s. Using a search with a fixed number of search points N_xy = N_z, all cost-function values of the candidate points in the next search range at k = 7 are obtained at k = 6. Fig. 7b and 7c show the cost-function values for the waypoints on the sphere and the corresponding contour plot, respectively.

Azimuth and pitch angles (θ, φ) are used to represent points on the sphere: fig. 7b plots the cost-function value of each waypoint, with θ the azimuth of the target relative to the drone and φ its pitch. Fig. 7b shows the cost-function values for points on the sphere at k = 7; in the contour plot of fig. 7c the global minimum is denoted by Q_2 and the local minimum by Q_1.

From fig. 7b and 7c it can be seen that the cost function is non-convex and has a local minimum. Under the influence of the prediction accuracy, the local-minimum problem is more severe in the initial stage of target tracking, i.e. when few measurements are available. As measurements accumulate and the estimation algorithm runs, the positioning accuracy improves gradually, the shape of the cost function changes and the local minima may disappear, so the size of the search grid should be dynamically adjustable during path search. Moreover, if the grid size is inappropriate, the selected waypoints may never come sufficiently close to the global minimum. The number and range of the search points therefore need further optimization.
Specifically, please refer to fig. 8 for solving the above problem, wherein fig. 8 is a schematic flowchart of another embodiment of a target positioning method according to the present application. In the present embodiment, step S81 is further included before step S62 illustrated in fig. 6, and step S62 illustrated in fig. 6 includes steps S82 to S84. In the current embodiment, the method includes:
s81: obtaining the search number N of the xy plane by using a Leptochis constant estimation algorithm xy And the number of searches in the z-axis N z
In the present embodiment, the cost function of path optimization can be seen as a rischz continuous function with respect to the drone position and the target position. Therefore, in the current embodiment, a Rippschitz constant estimation algorithm is adopted for optimizing the path point search number, and the xy plane search number N is obtained xy And the number of searches N in the z-axis z . Details of step S81 can be further referred to corresponding contents of fig. 10.
S82: and determining N undetermined path points, and estimating estimated direction data of each target at the next sampling moment, which is obtained by measurement of the unmanned aerial vehicle positioned at each undetermined path point.
After the number of searches N_xy in the xy plane and N_z on the z axis have been obtained with the Lipschitz constant estimation algorithm, the N undetermined path points are determined based on them, where N = N_xy × N_z. The estimated direction data of each target at the next sampling moment is then obtained using the contents described in figs. 1 to 4. In other embodiments, step S82 may also include steps S91 and S92 as illustrated in fig. 9.
S83: and determining the accurate positioning condition of each target corresponding to the undetermined path point on the xy plane and the accurate positioning condition of each target on the z axis by using the estimated direction data of each target corresponding to the undetermined path point at the next sampling moment. In other embodiments, step S83 may further include step S93 and step S94 as illustrated in fig. 9.
S84: and obtaining the accurate positioning conditions of the multiple targets corresponding to the path points to be determined according to the accurate positioning condition of each target in the xy plane and the accurate positioning condition of each target in the z axis. In other embodiments, step S84 may further include step S95 as illustrated in fig. 9.
Step S83 and step S84 can refer to the content of the embodiment portion corresponding to fig. 5, and are not described herein again.
Referring to fig. 9, fig. 9 is a schematic flowchart illustrating a target positioning method according to another embodiment of the present application. In the current embodiment, a method for positioning a target provided by the present application includes:
s91: and determining N undetermined path points.
After the unmanned aerial vehicle obtains the state information of each target at the current sampling moment, N undetermined path points are further determined. They may be selected randomly according to a certain rule, or the range of undetermined path points may first be determined based on the flight characteristics of the unmanned aerial vehicle and the specific points then chosen within that range.
S92: and obtaining the pre-estimated direction data of each target at the next sampling moment, which is measured when the unmanned aerial vehicle is positioned at the undetermined path point, by using the three-dimensional vector from the current position of the unmanned aerial vehicle to the undetermined path point and the state information of each target at the current sampling moment.
The estimated direction data of each target at the next sampling moment comprises: the estimated direction data of each target on the xy plane at the next sampling moment and the estimated direction data of each target in the z-axis direction at the next sampling moment. The former at least comprises an estimated azimuth angle and the latter at least an estimated pitch angle. In other embodiments, the estimated direction data of the target may also be understood as the predicted measurement information of the target at the next moment, obtainable with equations (3) and (4) above.
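The geometry behind S92 is just the angle a drone at the candidate waypoint would see toward the target's estimated position. A minimal sketch (atan2 is used for quadrant robustness; the names are ours):

```python
import math

def predicted_angles(target_xyz, waypoint_xyz):
    """Estimated azimuth and pitch of a target, as measured from a
    candidate waypoint, given the target's current estimated position."""
    dx = target_xyz[0] - waypoint_xyz[0]
    dy = target_xyz[1] - waypoint_xyz[1]
    dz = target_xyz[2] - waypoint_xyz[2]
    d_xy = math.hypot(dx, dy)          # distance on the xy plane
    theta = math.atan2(dy, dx)         # estimated azimuth on the xy plane
    phi = math.atan2(dz, d_xy)         # estimated pitch toward the z axis
    return theta, phi

theta, phi = predicted_angles((100.0, 100.0, 100.0), (0.0, 0.0, 0.0))
```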
S93: and processing the estimated direction data of each target at the next sampling moment by using a pseudo-linear Kalman filtering algorithm of the xy plane to obtain the state accuracy of each target on the xy plane, and performing matrix tracing on the state accuracy of each target on the xy plane to obtain the accurate positioning condition of each target on the xy plane. The functional relationship of the accurate positioning of the target on the xy plane is represented as the cost function.
In the current embodiment, the positioning accuracy of each target in the xy plane can be obtained according to the formula (30), specifically, the formula (30) is as follows:
J xy,i (r xyk )=tr(W k|k,i ) (30)
where J_i(r_k) is the cost-function value corresponding to target i, J_{xy,i}(r_{xyk}) its cost-function value on the xy plane and J_{z,i}(r_{zk}) its cost-function value in the z-axis direction; tr(·) denotes the matrix trace, r_{xyk} is the position information of the unmanned aerial vehicle in the xy plane and r_{zk} its position information in the z-axis direction at the kth sampling moment; W_{k|k,i} is the accuracy of target i on the xy plane and S_{k|k,i} its accuracy in the z-axis direction at the current sampling moment.
S94: and processing the pre-estimated direction data of each target at the next sampling moment by using a pseudo-linear Kalman filtering algorithm of the z axis to obtain the state accuracy of each target in the z axis, and performing matrix tracing on the state accuracy of each target in the z axis to obtain the positioning accuracy condition of each target in the z axis. In the current embodiment, the positioning accuracy of each target in the z-axis can be obtained according to formula (31), specifically, formula (31) is as follows:
J z,i (r zk )=tr(S k|k,i ) (31)
S95: Sum the positioning accuracy of each target on the xy plane and its positioning accuracy on the z axis, and weight-sum these per-target sums to obtain the positioning accuracy of the multiple targets for the undetermined path point. Because the cost function is additive, the positioning accuracy of each target can be obtained directly by adding its xy-plane and z-axis parts with formula (32), and the positioning accuracy of the multiple targets currently positioned by the unmanned aerial vehicle by summing the weighted cost functions of all targets with formula (33).
J i (r k )=J xy,i (r xyk )+J z,i (r zk )=tr(W k|k,i )+tr(S k|k,i ) (32)
J(r_k) = Σ_{i=1}^{n} w_i · J_i(r_k)    (33)

where w_i is the weight assigned to target i and n is the number of targets.
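The per-target cost of formula (32) and the weighted multi-target sum can be sketched in a few lines. The covariance matrices and weights below are hypothetical stand-ins for W_k|k,i, S_k|k,i, and the per-target weights, which the embodiment actually obtains from the pseudo-linear Kalman filters:

```python
def trace(m):
    """Sum of diagonal entries of a square matrix given as a list of lists."""
    return sum(m[i][i] for i in range(len(m)))

def per_target_cost(W_xy, S_z):
    # J_i(r_k) = tr(W_k|k,i) + tr(S_k|k,i) -- formula (32)
    return trace(W_xy) + trace(S_z)

def multi_target_cost(covs, weights):
    # Weighted sum over all targets -- formula (33); the weights w_i are an
    # assumption standing in for each target's relative importance.
    return sum(w * per_target_cost(W, S) for (W, S), w in zip(covs, weights))

# Hypothetical 2x2 xy-plane and 1x1 z-axis accuracy matrices for two targets.
covs = [([[4.0, 0.0], [0.0, 1.0]], [[2.0]]),   # target 1: 5 + 2 = 7
        ([[9.0, 0.0], [0.0, 4.0]], [[1.0]])]   # target 2: 13 + 1 = 14
print(multi_target_cost(covs, [0.5, 0.5]))     # -> 10.5
```

Because the cost is additive, adding a target only appends one more weighted term to the sum.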
S96: taking the undetermined path point whose positioning accuracy meets the preset condition as the next path point of the unmanned aerial vehicle.
S97: and controlling the unmanned aerial vehicle to fly to the next path point, continuously measuring in the flying process to obtain the direction data of the multiple targets at the next sampling moment, and obtaining the state information of each target at the next sampling moment by using the direction data of the multiple targets at the next sampling moment.
Step S96 and step S97 are the same as step S63 and step S64 illustrated in fig. 6, and reference may be specifically made to the description of the corresponding parts in fig. 6, which is not repeated herein.
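A minimal sketch of the selection in step S96, assuming the "preset condition" is simply minimizing the predicted cost over the candidate waypoints (the patent leaves the condition unspecified; the candidate tuples and the toy cost below are hypothetical):

```python
def pick_next_waypoint(candidates, cost_fn):
    """Return the candidate waypoint with the smallest predicted multi-target
    cost. Treating the 'preset condition' of step S96 as a minimum of the
    cost function is an assumption, not the patent's stated rule."""
    return min(candidates, key=cost_fn)

# Hypothetical candidates as (pitch, azimuth) pairs with a toy cost function.
cands = [(0.1, 0.0), (0.2, 1.0), (-0.1, 2.0)]
toy_cost = lambda p: abs(p[0] - 0.2) + abs(p[1] - 1.0)
print(pick_next_waypoint(cands, toy_cost))  # -> (0.2, 1.0)
```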
Further, please refer to fig. 10, fig. 10 is a flowchart illustrating a target positioning method according to another embodiment of the present application. In the present embodiment, step S81 in fig. 8 includes:
S101: solving a target Lipschitz constant based on a Lipschitz constant estimation algorithm. To guarantee that the global optimum remains reachable, the grid distance between adjacent candidate path points on the three-dimensional spherical surface must be no greater than ε/M, where M is the Lipschitz constant and ε is the preset positioning accuracy value; an accurate Lipschitz constant therefore needs to be obtained.
Further, please refer to fig. 11, which is a schematic flowchart of a target positioning method according to another embodiment of the present application. The present embodiment elaborates the process of solving the target Lipschitz constant; step S101 further includes:
s111: and randomly selecting L groups of test path points within a preset range.
Each group of test path points comprises a plurality of test path points, and the preset range is the region defined by the pitch-angle search range and the azimuth-angle search range. In the present embodiment, the search range of the pitch angle is [-π/2, π/2] and the search range of the azimuth angle is [-π, π].
S112: calculating the positioning accuracy with which each test path point measures the multiple targets, and calculating the partial derivative with respect to position at each test path point from that positioning accuracy.
The process of calculating the positioning accuracy with which each test path point measures the multiple targets in step S112 follows equations (26) and (27) above and is not repeated here. After the positioning accuracy of each test path point is obtained, the partial derivative with respect to the position of each test path point is taken according to formula (34).
γ_l = |∂J(s_l)/∂s_l|    (34)

wherein γ_l is the partial derivative of the positioning accuracy with respect to the position s_l of the test path point, which can also be understood as the slope at each path point.
S113: selecting the partial derivative with the largest absolute value from the partial derivatives of each group of test path points to obtain L maximum partial derivative values. After the partial derivatives of a group are obtained, their absolute values form the data set M = {M_1, M_2, ..., M_D}, where D is the number of test path points in the group and s_l denotes the position of a path point, which in the present embodiment comprises at least the azimuth angle and the pitch angle of the path point relative to the current path point. The largest element of the group is then selected and output according to formula (35):

m_l = max_{s_l ∈ group l} |γ_l|    (35)
S114: fitting the L maximum partial derivative values to a reverse Weibull distribution model.
After the data set M of a cycle is obtained, its maximum is taken as that cycle's value m_l, that is, m_l = max{M_1, M_2, ..., M_D}. Since steps S111 to S113 are executed for a set number of cycles, the data set m = {m_1, m_2, ..., m_L} is obtained, and the data m_1, m_2, ..., m_L are then fitted to a reverse Weibull distribution model.
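The sampling pipeline of steps S111 to S113 can be sketched as follows. As a crude stand-in for the reverse Weibull fit and the second-moment location estimate of steps S114 and S115, the sketch simply returns the largest per-group maximum, which the fitted location parameter upper-bounds; the test function sin(3x), the group counts, and the finite-difference step are all assumptions for illustration:

```python
import math
import random

random.seed(0)  # deterministic for the example

def max_slope(f, points, h=1e-4):
    """Largest absolute finite-difference slope of f over one group of test
    points (stands in for |partial derivative| in formulas (34)-(35))."""
    return max(abs(f(s + h) - f(s)) / h for s in points)

def estimate_lipschitz(f, lo, hi, groups=200, group_size=50):
    # One max-slope value m_l per group, as in steps S111-S113.
    m = [max_slope(f, [random.uniform(lo, hi) for _ in range(group_size)])
         for _ in range(groups)]
    # Steps S114-S115 fit {m_l} to a reverse Weibull distribution and return
    # its location parameter; the sample maximum below is a crude stand-in.
    return max(m)

# sin(3x) has true Lipschitz constant 3, so the estimate should approach 3.
M_hat = estimate_lipschitz(lambda x: math.sin(3.0 * x), -math.pi, math.pi)
print(round(M_hat, 2))  # close to 3
```

A proper implementation would replace the final `max` with a reverse Weibull (extreme-value) fit, e.g. via a moment-based estimator, to extrapolate beyond the observed slopes.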
S115: solving the location parameter of the reverse Weibull distribution based on a second-moment method, and outputting the location parameter as the target Lipschitz constant.
After the location parameter of the reverse Weibull distribution is obtained, it is output as the target Lipschitz constant M; the unit search distance is then obtained from this constant, and step S102 in fig. 10 is executed.
S102: and solving a unit search distance according to the target Lipschitz constant.
Further, step S102 includes: the unit search distance d is calculated using equation (36).
d = ε / M    (36)
Wherein M is a target Lipschitz constant, and epsilon is a preset positioning accuracy value.
S103: the unit search distance is converted into a unit search pitch angle and a unit search azimuth angle.
[formula image in the original document: conversion of the unit search distance d into the unit search pitch angle and the unit search azimuth angle]
S104: taking the quotient of the pitch-angle search range and the unit search pitch angle as the search number N_xy in the xy plane, and taking the quotient of the azimuth-angle search range and the unit search azimuth angle as the search number N_z on the z axis. Here the pitch-angle search range is the altitude turn rate of the unmanned aerial vehicle, and the azimuth-angle search range is the azimuth turn rate of the unmanned aerial vehicle.
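Steps S102 to S104 can be sketched end to end. The arc-length conversion d/r and the sphere radius r are assumptions, since the embodiment's conversion formula is given only as an image:

```python
import math

def search_counts(M, eps, r, pitch_range, azimuth_range):
    """Unit search distance d = eps / M (formula (36)), converted to an
    angular step on a candidate sphere of radius r (the radius and the
    arc-length conversion d/r are assumptions), then divided into the
    pitch and azimuth search ranges as in step S104."""
    d = eps / M                       # S102: unit search distance
    unit_angle = d / r                # S103: arc length -> angle (assumption)
    n_xy = math.ceil(pitch_range / unit_angle)    # S104: N_xy
    n_z = math.ceil(azimuth_range / unit_angle)   # S104: N_z
    return n_xy, n_z

# Hypothetical values: M = 3, eps = 0.3, sphere radius 10 m,
# pitch range pi, azimuth range 2*pi.
print(search_counts(M=3.0, eps=0.3, r=10.0,
                    pitch_range=math.pi, azimuth_range=2 * math.pi))
```

Larger Lipschitz constants or tighter accuracy targets shrink d and so increase both search counts.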
Fig. 12 is a schematic structural diagram of an embodiment of a target positioning system according to the present application. The target positioning system 120 comprises a processor 121 and a memory 122 connected to the processor 121, wherein the memory 122 stores program data and the execution results of the operations of the processor 121, and the processor 121, when executing the stored program data, is configured to perform the target positioning method described in the embodiments above. In one embodiment, the target positioning system 120 provided herein may be mounted directly on an unmanned aerial vehicle; in another embodiment, it may be mounted on a device capable of real-time communication with the unmanned aerial vehicle, which is not specifically enumerated here.
The present application further provides an unmanned aerial vehicle; please refer to fig. 13, which is a schematic structural diagram of an embodiment of the unmanned aerial vehicle of the present application. The unmanned aerial vehicle 130 provided by the present application comprises a sensing component 133, a driving circuit 134, and a target positioning system 135, wherein the sensing component 133 and the driving circuit 134 are respectively connected to the processor 131 in the target positioning system 135.
The sensing component 133 is configured to acquire direction data of a plurality of targets and feed the data back to the processor 131 in the target positioning system 135. Further, the sensing component 133 comprises an AOA (Angle of Arrival) sensor.
The driving circuit 134 is connected to the processor 131 and is configured to drive the unmanned aerial vehicle to fly to the next path point in response to a control instruction from the processor 131. The next path point is determined as in any of the embodiments shown in fig. 6 to 11 and their corresponding descriptions.
The memory 132 is used to store program data.
The target positioning system 135 is a system as illustrated in fig. 12, and the processor 131 in the target positioning system 135 is configured to run program data to execute the target positioning method as described in any one of fig. 1 to fig. 11 and the corresponding embodiments thereof, so as to position a plurality of targets.
The present application further provides a storage medium; please refer to fig. 14, which is a schematic structural diagram of an embodiment of the storage medium of the present application. The storage medium 140 stores program data 141, and the program data 141, when executed, implements the target positioning method described above. Specifically, the storage medium 140 having a storage function may be a memory, a personal computer, a server, a network device, or a USB flash drive.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (9)

1. A method of object localization, the method comprising:
obtaining state information of each target at the current sampling moment by using direction data of a plurality of targets measured by the same unmanned aerial vehicle at the current sampling moment;
determining N undetermined path points, and obtaining the estimated direction data of each target at the next sampling moment, as measured by the unmanned aerial vehicle located at each undetermined path point, by using the three-dimensional vector from the current position of the unmanned aerial vehicle to the undetermined path point and the state information of each target at the current sampling moment, wherein N equals N_xy * N_z, and N_xy and N_z respectively denote the numbers of searches in the xy plane and on the z axis;
processing the estimated direction data of each target at the next sampling moment by using a pseudo-linear Kalman filtering algorithm of an xy plane to obtain the state accuracy of each target in the xy plane, and performing matrix tracing on the state accuracy of each target in the xy plane to obtain the accurate positioning condition of each target in the xy plane; and
processing the pre-estimated direction data of each target at the next sampling moment by using a pseudo-linear Kalman filtering algorithm of a z-axis to obtain the state accuracy of each target on the z-axis, and performing matrix tracing on the state accuracy of each target on the z-axis to obtain the positioning accuracy condition of each target on the z-axis;
summing the positioning accuracy of each target in an xy plane and the positioning accuracy of each target in a z axis, and performing weighted summation on the sum of each target to obtain the positioning accuracy of the plurality of targets corresponding to the undetermined path point;
taking the undetermined path point with the accurate positioning condition meeting the preset condition as a next path point of the unmanned aerial vehicle;
and controlling the unmanned aerial vehicle to fly towards the next path point, continuously measuring in the flying process to obtain direction data of a plurality of targets at the next sampling moment, and obtaining the state information of each target at the next sampling moment by using the direction data of the plurality of targets at the next sampling moment.
2. The method of claim 1, wherein prior to determining the N pending waypoints, further comprising:
obtaining the search number N_xy of the xy plane and the search number N_z on the z axis by using a Lipschitz constant estimation algorithm.
3. The method according to claim 2, wherein the obtaining the search number N_xy of the xy plane and the search number N_z on the z axis by using the Lipschitz constant estimation algorithm comprises:
solving a target Lipschitz constant based on the Lipschitz constant estimation algorithm;

solving a unit search distance according to the target Lipschitz constant;
converting the unit search distance into a unit search pitch angle and a unit search azimuth angle;
taking the quotient of the pitch-angle search range and the unit search pitch angle as the search number N_xy of the xy plane, and taking the quotient of the azimuth-angle search range and the unit search azimuth angle as the search number N_z on the z axis.
4. The method of claim 3, wherein the solving a target Lipschitz constant based on the Lipschitz constant estimation algorithm comprises:
randomly selecting L groups of test path points in a preset range, wherein each group of test path points comprises a plurality of test path points, and the preset range is an area defined by the pitch angle search range and the azimuth angle search range;
calculating the positioning accuracy with which each test path point measures the plurality of targets, and calculating the partial derivative with respect to the position of each test path point according to the positioning accuracy of each test path point;
selecting the partial derivative value with the largest absolute value from the partial derivative values of each group of test path points to obtain L maximum partial derivative values;
fitting the L maximum partial derivative values to a reverse Weibull distribution model;
and solving the location parameter of the reverse Weibull distribution based on a second-moment method, and outputting the location parameter as the target Lipschitz constant.
5. The method of claim 3,
the pitch-angle search range is the altitude turn rate of the unmanned aerial vehicle, and the azimuth-angle search range is the azimuth turn rate of the unmanned aerial vehicle.
6. The method of claim 3, wherein the solving a unit search distance according to the target Lipschitz constant comprises:
the unit search distance d is calculated using the following formula:
d = ε / M
wherein M is the target Lipschitz constant and ε is the preset positioning accuracy value.
7. An object positioning system, comprising a processor and a memory coupled to the processor;
wherein the memory is used for storing program data;
the processor is configured to execute the program data to perform the method according to any one of claims 1 to 6.
8. A drone, characterized in that it comprises: the system comprises a sensing assembly, a driving circuit and a target positioning system, wherein the sensing assembly and the driving circuit are respectively connected with a processor in the target positioning system;
the sensing assembly is used for acquiring direction data of a plurality of targets;
the driving circuit is used for responding to a control instruction of the processor and flying to a next path point;
the object localization system for localizing the plurality of objects, the object localization system being the system of claim 7.
9. A storage medium characterized in that it stores program data which, when executed, implement the method according to any one of claims 1 to 6.
CN201910364502.5A 2019-04-30 2019-04-30 Target positioning method, system, unmanned aerial vehicle and storage medium Active CN110220513B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910364502.5A CN110220513B (en) 2019-04-30 2019-04-30 Target positioning method, system, unmanned aerial vehicle and storage medium


Publications (2)

Publication Number Publication Date
CN110220513A CN110220513A (en) 2019-09-10
CN110220513B true CN110220513B (en) 2022-10-04

Family

ID=67820544

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910364502.5A Active CN110220513B (en) 2019-04-30 2019-04-30 Target positioning method, system, unmanned aerial vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN110220513B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112578354B (en) * 2020-02-28 2024-02-23 加特兰微电子科技(上海)有限公司 Method for determining azimuth angle of target object, computer device and storage medium
CN114216463B (en) * 2021-11-04 2024-05-28 国家电网有限公司 Path optimization target positioning method and device, storage medium and unmanned equipment

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101251593A (en) * 2008-03-31 2008-08-27 中国科学院计算技术研究所 Method for tracking target of wireless sensor network
CN101458325A (en) * 2009-01-08 2009-06-17 华南理工大学 Wireless sensor network tracking method based on self-adapting prediction
CN102830391A (en) * 2011-06-16 2012-12-19 中国科学院沈阳自动化研究所 Accuracy index calculating method of infrared search and track system
CN105848285A (en) * 2016-05-16 2016-08-10 国网重庆市电力公司电力科学研究院 Compressive sensing-based power grid equipment patrol inspection positioning method
CN107084714A (en) * 2017-04-29 2017-08-22 天津大学 A kind of multi-robot Cooperation object localization method based on RoboCup3D
CN107255795A (en) * 2017-06-13 2017-10-17 山东大学 Localization Approach for Indoor Mobile and device based on EKF/EFIR mixed filterings
WO2017189771A1 (en) * 2016-04-27 2017-11-02 Skogsrud Simen Method of iterative motion control
CN107743299A (en) * 2017-09-08 2018-02-27 天津大学 Towards the consensus information filtering algorithm of unmanned aerial vehicle onboard mobile sensor network
CN108254716A (en) * 2017-12-12 2018-07-06 四川大学 A kind of observation platform track optimizing method based on particle cluster algorithm
JP2018147467A (en) * 2017-03-03 2018-09-20 アルパイン株式会社 Flight controller and flight control method for unmanned aircraft
CN109254591A (en) * 2018-09-17 2019-01-22 北京理工大学 The dynamic route planning method of formula sparse A* and Kalman filtering are repaired based on Anytime

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SG10201400810VA (en) * 2014-03-20 2015-10-29 Nanyang Polytechnic Method and system for multi-layer positioning system
CN106225790B (en) * 2016-07-13 2018-11-02 百度在线网络技术(北京)有限公司 A kind of determination method and device of unmanned vehicle positioning accuracy


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
基于特征压缩和卡尔曼滤波融合的车辆实时跟踪算法;郑斌等;《Infats Proceedings of the 14th International Forum of Automotive Traffic Safety》;20171201;240-245 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant