CN113340272A - Ground target real-time positioning method based on a micro-swarm of unmanned aerial vehicles

- Publication number: CN113340272A (application CN202110728010.7A)
- Authority: CN (China)
- Legal status: Granted
Abstract
The invention discloses a ground-target real-time positioning method based on a micro-swarm of unmanned aerial vehicles, comprising the following steps: step one, perform elevation-measurement null-shift correction for each unmanned aerial vehicle in the micro-swarm; step two, perform direction-angle-measurement null-shift correction for each unmanned aerial vehicle in the micro-swarm; step three, hover the micro-swarm at any time k, and solve the pitch angle and direction angle of the line connecting each unmanned aerial vehicle with the target center; step four, calculate the target-center longitude and latitude and the target elevation based on single-UAV positioning; step five, fuse the target-center longitude and latitude and target elevation from each unmanned aerial vehicle to obtain the micro-swarm-based estimates of the target longitude and latitude and the target elevation. Because the invention applies null-shift correction to both the direction-angle and elevation measurements before positioning, and fuses the positioning results of the multiple unmanned aerial vehicles during positioning, it improves the accuracy of target positioning.
Description
Technical Field
The invention relates to the technical field of unmanned-aerial-vehicle ground-target positioning, and in particular to a ground-target real-time positioning method based on a micro-swarm of unmanned aerial vehicles.
Background
Modern unmanned aerial vehicles are often equipped with a single-chip-microcomputer-based flight control system, an onboard GPS/BeiDou receiver, a controllable camera gimbal, an attitude measurement instrument, and similar devices. They can fly autonomously while knowing their own real-time position and elevation, and can simultaneously photograph aerial or ground target areas with the onboard camera, so ground-target positioning based on unmanned aerial vehicles has received wide attention.
To locate a target's longitude, latitude, and elevation with an unmanned aerial vehicle, existing methods generally use an onboard camera gimbal and a laser rangefinder to obtain the position of the ground target relative to the unmanned aerial vehicle and then solve for the absolute position of the target by combining it with the unmanned aerial vehicle's own position; alternatively, an image-matching method compares an image containing the target object with a reference image to determine the target's position, but this approach is suitable only for longitude-latitude positioning and cannot give elevation information. Civil unmanned aerial vehicles have the outstanding advantages of low price, light weight, and easy networked deployment, but applying them to target positioning also has obvious drawbacks: the onboard camera, elevation-measuring instrument, attitude-measuring instrument, and other sensors of a civil unmanned aerial vehicle have low precision and large random errors, and civil unmanned aerial vehicles are generally not equipped with laser rangefinders. High positioning accuracy therefore cannot be expected when a single vehicle performs ground-target positioning. However, performing target positioning with a micro-swarm formed from a small number of unmanned aerial vehicles can exploit the advantages while avoiding the disadvantages, improving positioning accuracy while retaining the advantage of low cost; the positioning algorithm is thus the technical key.
A ground-target positioning method in the prior art proceeds as follows: at least two unmanned aerial vehicles aim at and photograph the same target, and position and angle data are used to determine a direction vector between the target and each unmanned aerial vehicle, thereby determining the position of the target. However, this method does not make full use of the measurement information from the unmanned aerial vehicles' sensors, provides no solution for the null shift of key error sources such as elevation and direction angle, and gives neither real-time performance indexes nor specific positioning accuracy.
Therefore, a ground-target positioning method based on a micro-swarm of unmanned aerial vehicles, which improves target-positioning accuracy by correcting the null shift of the elevation and direction-angle measurements, is urgently needed by those skilled in the art.
Disclosure of Invention
In view of the above, the invention provides a ground-target real-time positioning method based on a micro-swarm of unmanned aerial vehicles, which performs null-shift correction on the direction-angle and elevation measurements before positioning, and fuses the positioning results of the multiple unmanned aerial vehicles during positioning, thereby improving the accuracy of target positioning.
In order to achieve the purpose, the invention adopts the following technical scheme:
a ground target real-time positioning method based on micro-swarm of unmanned aerial vehicles comprises the following steps:
step one, before positioning the ground target, perform elevation-measurement null-shift correction for each unmanned aerial vehicle in the micro-swarm;
step two, before positioning the ground target, perform direction-angle-measurement null-shift correction for each unmanned aerial vehicle in the micro-swarm;
step three, hover the micro-swarm at any time k, and solve the pitch angle and direction angle of the line connecting each unmanned aerial vehicle with the target center, using each unmanned aerial vehicle's own attitude, the attitude of its onboard camera gimbal, and the detection result of the target in the image;
step four, calculate the target-center longitude and latitude and the target elevation based on single-UAV positioning, combining the pitch angle and direction angle of the line connecting each unmanned aerial vehicle with the target center with each unmanned aerial vehicle's own elevation and own longitude-latitude position;
step five, fuse the target-center longitude and latitude and target elevation from each unmanned aerial vehicle to obtain the micro-swarm-based estimates of the target longitude and latitude and the target elevation.
Preferably, in the above ground-target real-time positioning method based on a micro-swarm of unmanned aerial vehicles, step one comprises:
before positioning the ground target, have each unmanned aerial vehicle take off from a specific parking apron, adjusting the position of the airframe during ascent to keep the apron at the center of the image;
after the unmanned aerial vehicle reaches a height of 10 meters, hover and acquire images multiple times, and calculate the unmanned aerial vehicle's geodetic elevation value from the size of the apron in each image:
α = 2·arctan(γ / (2f)),  ĥ = x / (2·tan(α/2)) = x·f / γ
wherein f is the focal length of the camera, γ is the length of the apron in the image, α is the visual angle of the apron in the camera, x is the actual side length of the apron, and ĥ is the geodetic elevation value of the unmanned aerial vehicle;
smooth the geodetic elevation values calculated from the multiple images with wavelet filtering, and take the smoothed value as the accurate value of the geodetic elevation at the current moment;
take the difference between this geodetic elevation value and the unmanned aerial vehicle's geodetic elevation measurement at the same moment, and average the differences to obtain the unmanned aerial vehicle's geodetic-elevation-measurement null-shift error.
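The step-one calibration above can be sketched as follows. This is a minimal Python illustration: the function names, the use of a simple mean in place of the wavelet filtering, and the sign convention for the null-shift error (sensor reading minus image-derived value) are assumptions, not the patent's exact procedure.

```python
import math
from statistics import mean

def apron_view_angle(f_mm: float, gamma_mm: float) -> float:
    """Visual angle (rad) subtended by the apron on the image plane:
    alpha = 2 * arctan(gamma / (2f))."""
    return 2.0 * math.atan(gamma_mm / (2.0 * f_mm))

def geodetic_height_from_apron(f_mm: float, gamma_mm: float, x_m: float) -> float:
    """Height above the apron from similar triangles: h = x / (2 * tan(alpha / 2)),
    which reduces to h = x * f / gamma."""
    alpha = apron_view_angle(f_mm, gamma_mm)
    return x_m / (2.0 * math.tan(alpha / 2.0))

def elevation_null_shift(image_heights: list, sensor_heights: list) -> float:
    """Null-shift error = mean(sensor reading - image-derived accurate value).
    A plain average of the image-derived heights stands in for the
    wavelet-filtering step described in the text."""
    h_ref = mean(image_heights)
    return mean(s - h_ref for s in sensor_heights)
```

For example, with f = 25 mm, an apron side of 1 m, and an apron image length of 2.5 mm, the height works out to 10 m, matching the hover height given in the text.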
Preferably, in the above ground-target real-time positioning method based on a micro-swarm of unmanned aerial vehicles, step two comprises:
before positioning the ground target, have the micro-swarm hover a first time;
during the first hover, record each single hovering unmanned aerial vehicle's longitude, latitude, direction angle, and the vertical-axis reading of its three-axis attitude instrument as the initial direction-angle calibration information;
control the unmanned aerial vehicle to advance 10 m in the direction of its nose and then hover a second time, recording the unmanned aerial vehicle's longitude, latitude, direction angle, and the vertical-axis reading of its three-axis attitude instrument during the second hover;
calculate the unmanned aerial vehicle's direction-angle null-shift error,
wherein lo_0, la_0, φ_0, and θ_0 respectively denote the single unmanned aerial vehicle's longitude, latitude, direction angle, and three-axis-attitude-instrument vertical-axis reading during the first hover; and lo_1, la_1, φ_1, and θ_1 respectively denote the single unmanned aerial vehicle's longitude, latitude, direction angle, and three-axis-attitude-instrument vertical-axis reading during the second hover.
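A hedged sketch of the step-two heading calibration: the displacement between the two hover points yields a reference heading, and the null shift is taken as the measured heading minus that reference. The flat-earth approximation, the Earth-radius constant, the function names, and the drift sign convention are assumptions for illustration; the patent's own formula is not reproduced in this text.

```python
import math

R_EARTH = 6_371_000.0  # mean Earth radius in metres (assumed constant)

def heading_from_displacement(lo0, la0, lo1, la1):
    """True course (degrees, clockwise from north) of the hover-to-hover
    displacement, using a local flat-earth approximation, which is
    adequate over the 10 m calibration leg."""
    north = math.radians(la1 - la0) * R_EARTH
    east = math.radians(lo1 - lo0) * R_EARTH * math.cos(math.radians(la0))
    return math.degrees(math.atan2(east, north)) % 360.0

def direction_angle_null_shift(measured_headings, lo0, la0, lo1, la1):
    """Null shift = mean measured heading minus the displacement-derived
    reference heading, wrapped into (-180, 180]."""
    ref = heading_from_displacement(lo0, la0, lo1, la1)
    drift = sum(measured_headings) / len(measured_headings) - ref
    return (drift + 180.0) % 360.0 - 180.0
```

The wrap into (-180, 180] keeps a drift near due north from jumping to values near 360 degrees.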
Preferably, in the above ground-target real-time positioning method based on a micro-swarm of unmanned aerial vehicles, step three comprises:
by utilizing the collinear property of camera imaging, the center of the target and the airborne vehicle are calculated according to the pixel position of the center of the target in the shot imageIncluded angle theta in vertical direction of optical axis of camera lens2And the included angle between the target center and the positive north direction of the optical axis of the airborne camera lensθ2Andthe calculation formula of (a) is as follows:
wherein the pixel position of the target in the image is (x)p,yp) The pixel size of the image is (w)p,hp);
obtain the angle θ_1 between the optical axis of the onboard camera lens and the vertical direction, and the angle φ_1 between the optical axis of the camera lens and due north, from the unmanned aerial vehicle's own attitude and the attitude of the onboard camera gimbal;
combining the angle θ_2 between the target center and the optical axis of the onboard camera lens with the angle θ_1 between the optical axis of the onboard camera lens and the vertical direction, calculate the angle between the line connecting the target center and the unmanned aerial vehicle and the vertical direction as the target pitch angle θ, with the calculation formula:
θ = θ_1 − θ_2;
combining the angle φ_2 between the target center and the optical axis of the onboard camera lens with the angle φ_1 between the optical axis and due north, calculate the angle between the line connecting the target center and the unmanned aerial vehicle and due north as the target direction angle φ, with the calculation formula:
φ = φ_1 − φ_2.
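The pixel-to-angle solution of step three can be illustrated with a pinhole-model sketch. The linear-in-tangent mapping, the top-left pixel origin, and the function names are assumptions made for this illustration; the subtraction θ = θ_1 − θ_2 follows the text's stated convention, and the analogous form is assumed for the direction angle.

```python
import math

def offset_angles(xp, yp, wp, hp, fov_w_deg, fov_h_deg):
    """Angles (degrees) between the target's line of sight and the optical
    axis, from the target's pixel position via the pinhole (collinearity)
    model. The pixel origin is assumed at the top-left corner."""
    # normalised offsets from the image centre, in [-1, 1]
    u = (xp - wp / 2.0) / (wp / 2.0)
    v = (yp - hp / 2.0) / (hp / 2.0)
    # on the image plane the tangent, not the angle, scales linearly
    phi2 = math.degrees(math.atan(u * math.tan(math.radians(fov_w_deg) / 2.0)))
    theta2 = math.degrees(math.atan(v * math.tan(math.radians(fov_h_deg) / 2.0)))
    return theta2, phi2

def target_angles(theta1, phi1, theta2, phi2):
    """Pitch angle theta = theta1 - theta2 (axis-to-vertical minus
    target-to-axis); direction angle phi = phi1 - phi2 by analogy."""
    return theta1 - theta2, phi1 - phi2
```

A target imaged at the exact centre of the frame gives zero offset angles, so the target line of sight coincides with the optical axis; a target at the right edge of a 60-degree-wide frame gives a 30-degree horizontal offset.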
Preferably, in the above ground-target real-time positioning method based on a micro-swarm of unmanned aerial vehicles, step four comprises:
according to the target pitch angle and target direction angle obtained in step three, combined with the unmanned aerial vehicle's geodetic elevation measurement, solve the longitude-latitude position of the target relative to the unmanned aerial vehicle;
combined with the unmanned aerial vehicle's own longitude-latitude measurement, solve the target-center longitude and latitude based on single-UAV positioning;
using the unmanned aerial vehicle's altitude-elevation measurement and its geodetic-elevation measurement, solve the target's altitude elevation based on single-UAV positioning, with the calculation formula:
h = h_F − h_D;
wherein (x, y) denotes the geodetic-coordinate-system coordinates of the target, (x_0, y_0) denotes the geodetic-coordinate-system coordinates of the unmanned aerial vehicle, h denotes the resulting target elevation, h_F denotes the altitude elevation of the unmanned aerial vehicle, and h_D is the geodetic (above-ground) elevation measurement of the unmanned aerial vehicle.
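Step four reduces to laying off a horizontal range along the target direction angle. Below is a minimal sketch, assuming a local geodetic frame with x pointing east and y pointing north and the direction angle measured clockwise from due north (the convention suggested by the FIG. 5 description); the function names are this sketch's own.

```python
import math

def locate_target(x0, y0, h_d, theta_deg, phi_deg):
    """Geodetic-plane target coordinates from one UAV: the horizontal
    range is h_D * tan(theta), laid off along the direction angle phi
    (clockwise from due north, so sin gives east/x and cos gives north/y)."""
    r = h_d * math.tan(math.radians(theta_deg))
    x = x0 + r * math.sin(math.radians(phi_deg))
    y = y0 + r * math.cos(math.radians(phi_deg))
    return x, y

def target_elevation(h_f, h_d):
    """Target absolute elevation: UAV altitude elevation minus the UAV's
    above-ground (geodetic) elevation, per h = h_F - h_D."""
    return h_f - h_d
```

With a 45-degree pitch angle the horizontal range equals the above-ground height, so a UAV 100 m up looking due east places the target 100 m east of its ground projection.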
Preferably, in the above ground-target real-time positioning method based on a micro-swarm of unmanned aerial vehicles, step five comprises:
derive the pitch-angle and direction-angle errors of target positioning from the single unmanned aerial vehicle's onboard-camera-lens optical errors, onboard-camera-gimbal attitude-measurement errors, and the unmanned aerial vehicle's own attitude-measurement errors;
fuse the single unmanned aerial vehicle's geodetic-elevation measurement error, target pitch-angle error, and target direction-angle error to obtain the single unmanned aerial vehicle's relative target longitude-latitude positioning-error area, which is an annular sector (fan ring) centered on the unmanned aerial vehicle;
introduce the single unmanned aerial vehicle's own longitude-latitude measurement error: translate the center of the relative positioning-error area within the range of this measurement error, take the circumscribed circle of the maximum coverage area as the single unmanned aerial vehicle's final target longitude-latitude positioning-error area, and take the radius of this final area as the single unmanned aerial vehicle's target longitude-latitude positioning error;
fuse the single unmanned aerial vehicle's altitude-elevation measurement error and geodetic-elevation measurement error to obtain its one-dimensional target-elevation positioning-error probability circle, and take the radius of this circle as the single unmanned aerial vehicle's target-elevation positioning error;
fuse the target longitude-latitude positioning errors and target-elevation positioning errors of the n unmanned aerial vehicles to obtain the micro-swarm's estimates of the target longitude and latitude and the target elevation.
Preferably, in the above ground-target real-time positioning method based on a micro-swarm of unmanned aerial vehicles, in the calculation of the pitch-angle and direction-angle errors of target positioning:
δ_p is the target-positioning pitch-angle error; δ_y is the target-positioning direction-angle error; δ_2a is the direction-angle measurement error of the single unmanned aerial vehicle's onboard camera gimbal; δ_4a is the single unmanned aerial vehicle's own direction-angle measurement error; δ_2c is the roll-angle measurement error of the onboard camera gimbal; δ_4c is the single unmanned aerial vehicle's own roll-angle measurement error; δ_3b is the lateral optical-distortion error of the onboard camera; δ_2b is the pitch-angle measurement error of the onboard camera gimbal; δ_4b is the single unmanned aerial vehicle's own pitch-angle measurement error; δ_3a is the longitudinal optical-distortion error of the onboard camera; θ_gr is the roll angle of the gimbal; θ_r is the roll angle of the single unmanned aerial vehicle; h is the pixel height of the image; w is the pixel width of the image; θ_cw is the transverse field of view of the camera lens; and θ_ch is the longitudinal field of view of the camera lens.
Preferably, in the above ground-target real-time positioning method based on a micro-swarm of unmanned aerial vehicles, in the calculation of the single unmanned aerial vehicle's relative target longitude-latitude positioning-error area:
r_1 is the inner radius of the annular sector; r_2 is the outer radius of the annular sector; θ_fr is the central angle of the annular sector; h_f is the geodetic elevation of the single unmanned aerial vehicle; δ_5 is the single unmanned aerial vehicle's own geodetic-elevation measurement error; θ_gp is the pitch angle of the single unmanned aerial vehicle's onboard camera gimbal; and θ_p is the pitch angle of the single unmanned aerial vehicle.
Preferably, in the above ground-target real-time positioning method based on a micro-swarm of unmanned aerial vehicles, in the calculation of the radius r_d of the final positioning-error area, δ_1 denotes the single unmanned aerial vehicle's own longitude-latitude measurement error; and in the calculation of the radius r_h of the one-dimensional target-elevation positioning-error probability circle, δ_6 is the altitude-elevation measurement error of the single unmanned aerial vehicle.
Preferably, in the above ground-target real-time positioning method based on a micro-swarm of unmanned aerial vehicles, in the calculation of the estimated target longitude-latitude and elevation values, n indicates that the micro-swarm contains n unmanned aerial vehicles, (x_Q, y_Q) is the calculated target longitude-latitude value based on the micro-swarm, and h_Q is the calculated target elevation value based on the micro-swarm.
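The step-five fusion formula is not transcribed in this text, so the sketch below substitutes a common choice consistent with the description: each unmanned aerial vehicle's estimate is weighted by the inverse square of its own positioning-error radius, so more accurate vehicles dominate. This weighting, and the function names, are assumptions rather than the patent's stated rule.

```python
def fuse_estimates(positions, radii):
    """Inverse-variance-style fusion of per-UAV (x, y) target estimates:
    each estimate is weighted by 1 / r**2, where r is that UAV's target
    longitude-latitude positioning-error radius."""
    weights = [1.0 / (r * r) for r in radii]
    wsum = sum(weights)
    x_q = sum(w * p[0] for w, p in zip(weights, positions)) / wsum
    y_q = sum(w * p[1] for w, p in zip(weights, positions)) / wsum
    return x_q, y_q

def fuse_elevations(elevations, radii):
    """Same weighting applied to the per-UAV target-elevation estimates,
    with r the target-elevation positioning-error radius."""
    weights = [1.0 / (r * r) for r in radii]
    return sum(w * h for w, h in zip(weights, elevations)) / sum(weights)
```

With equal error radii this reduces to a plain average over the n vehicles; unequal radii pull the fused estimate toward the more accurate vehicle.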
According to the technical scheme above, compared with the prior art, the ground-target real-time positioning method based on a micro-swarm of unmanned aerial vehicles compensates for the low target-positioning precision of a single civil unmanned aerial vehicle, effectively reduces the null-shift errors of the elevation and angle measurements, and can improve the ground-target positioning precision of low-cost unmanned aerial vehicles.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are merely embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flow chart of a method for real-time positioning of ground targets based on micro-swarm of unmanned aerial vehicles according to the present invention;
FIG. 2 is a schematic diagram of the calculation of the target elevation value based on single-UAV positioning according to the present invention;
FIG. 3 is a schematic diagram of the calculation of the target pitch angle based on single-UAV positioning according to the present invention;
FIG. 4 is a schematic diagram of the calculation of the target direction angle based on single-UAV positioning according to the present invention;
FIG. 5 is a schematic diagram of the solution of the target-center longitude and latitude based on single-UAV positioning according to the present invention;
FIG. 6 is a scatter diagram of target longitude-latitude position measurements provided by the present invention;
FIG. 7 is a graphical illustration of target elevation measurements provided by the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in FIG. 1, an embodiment of the present invention discloses a ground-target real-time positioning method based on a micro-swarm of unmanned aerial vehicles, comprising the following steps:
step one, before positioning the ground target, perform elevation-measurement null-shift correction for each unmanned aerial vehicle in the micro-swarm;
step two, before positioning the ground target, perform direction-angle-measurement null-shift correction for each unmanned aerial vehicle in the micro-swarm;
step three, hover the micro-swarm at any time k, and solve the pitch angle and direction angle of the line connecting each unmanned aerial vehicle with the target center, using each unmanned aerial vehicle's own attitude, the attitude of its onboard camera gimbal, and the detection result of the target in the image;
step four, calculate the target-center longitude and latitude and the target elevation based on single-UAV positioning, combining the pitch angle and direction angle of the line connecting each unmanned aerial vehicle with the target center with each unmanned aerial vehicle's own elevation and own longitude-latitude position;
step five, fuse the target-center longitude and latitude and target elevation from each unmanned aerial vehicle to obtain the micro-swarm-based estimates of the target longitude and latitude and the target elevation.
The above steps are described in detail below with an embodiment.
The unmanned-aerial-vehicle micro-swarm of this embodiment consists of three identical DJI Matrice 200 (M200) unmanned aerial vehicles; the onboard camera gimbal is a Zenmuse X5S three-axis mechanical gimbal, and the onboard camera is a BLK35S (about 20 g, 200 frames per second, i.e. one frame every 5 ms, with 1,000,000 effective pixels). The computer used for the solution has a 3.1 GHz CPU and 8 GB of memory, and a single target-positioning solution takes 1 ms.
In step one, the three unmanned aerial vehicles take off from a specific parking apron, a white square marker of fixed size. During ascent each airframe is adjusted to keep the apron at the center of the image; after the unmanned aerial vehicle reaches a height of 10 meters, images are acquired multiple times, optical distortion is removed according to the camera's distortion coefficients, and the current elevation is calculated. The calculation is illustrated in FIG. 2 and proceeds as follows:
knowing the focal length f of the camera and the length γ of the apron image within the whole image, the visual angle α of the apron in the camera is obtained by the calculation formula:
α = 2·arctan(γ / (2f));
knowing the actual side length x of the apron, the measured height h of the unmanned aerial vehicle above the ground follows from the trigonometric relation:
h = x / (2·tan(α/2)) = x·f / γ;
and calculating the elevation measurement values of the images acquired for multiple times respectively, smoothing by using average filtering, taking the smoothed values as accurate values of the current elevation, and taking the difference between the current measurement values and the accurate values as the elevation measurement null shift error of the unmanned aerial vehicle. Therefore, when each unmanned aerial vehicle measures, the zero drift value is added to the measured height measurement value so as to eliminate the zero drift error. The current measurement value is the elevation obtained by the sensor of the unmanned aerial vehicle; the data for smoothing filtering processing is that a camera is used for carrying out image acquisition on a standard reference object, the geometric dimension and camera parameters of the reference object are known, the accurate distance between the camera and the reference object can be calculated and can be regarded as an accurate value, and the difference between the elevation and the accurate value obtained by the unmanned aerial vehicle sensor is the null shift error.
In step two, the three unmanned aerial vehicles hover at 100 m. For each unmanned aerial vehicle, its own longitude lo_0 and latitude la_0, its direction angle φ_0, and the vertical-axis reading θ_0 of its three-axis attitude instrument are recorded as the initial direction-angle calibration information. After the unmanned aerial vehicle's forward-looking camera confirms that there is no obstacle ahead, the unmanned aerial vehicle's own pitch angle is set to 1 and it is commanded to advance in the direction of its nose while its longitude-latitude position is recorded in real time. When the unmanned aerial vehicle reaches a position 10 m from the initial position, its pitch angle is set to 0 and it performs a powered hover, at which point its longitude lo_1, latitude la_1, direction angle φ_1, and the vertical-axis reading θ_1 of the three-axis attitude instrument are recorded. Assuming the unmanned aerial vehicle undergoes no lateral displacement during the flight, an estimate of its true direction angle is computed from the recorded start and end positions, and the difference between the measured direction angle and this estimate gives the estimated direction-angle null-shift value. Thereafter, whenever each unmanned aerial vehicle takes a measurement, its own null-shift value is applied to the direction-angle measurement to eliminate the null-shift error.
Steps one and two serve to improve the target-positioning accuracy of each single unmanned aerial vehicle.
In step three, the micro-swarm hovers at any time k. Using the collinearity property of camera imaging, the angle between the target center and the camera's optical axis is calculated from the pixel position of the target center in the captured picture, and the angles between the camera's optical axis and the vertical and due-north directions are obtained from the measured attitudes of the unmanned aerial vehicle and the gimbal, as shown in FIGS. 3 and 4:
As shown in FIG. 3, point O is the center of the camera lens, HG is the virtual image plane, D is the position of the target on the ground, and E is the image point of the target. OC is the optical axis of the onboard camera. Let ∠TOC = θ_1; the angle ∠DOC = θ_2 can be solved from the attitude data of the onboard camera and the unmanned aerial vehicle together with the longitudinal view angle of the onboard camera. ∠TOD is the angle to be solved, namely the angle between the line connecting the target and the unmanned aerial vehicle and the vertical direction, with the calculation formula:
θ = θ_1 − θ_2;
As shown in FIG. 4, point O is the center of the onboard camera lens, AB is the virtual imaging plane, D is the position of the target, and H is the image point of the target. OF is the optical axis of the camera. The angle φ_1 can be obtained from the attitude data of the onboard camera and the unmanned aerial vehicle, and with the transverse view angle of the onboard camera the angle φ_2 of the target relative to the optical axis is obtained; the angle to be calculated (see FIG. 4) is the angle φ between the line connecting the target and the unmanned aerial vehicle and due north, with the calculation formula:
φ = φ_1 − φ_2;
In step four, the target longitude and latitude are calculated from the angles between the target-center-to-UAV line and the vertical and due-north directions, the ground elevation of the unmanned aerial vehicle, and the unmanned aerial vehicle's accurate longitude and latitude. The specific geometric relationship is shown in FIG. 5, where the coordinates (x_0, y_0) of point T are the geodetic-coordinate-system coordinates of the unmanned aerial vehicle, the coordinates (x, y) of point C are the geodetic-coordinate-system coordinates of the target, OT = h is the ground elevation of the unmanned aerial vehicle, θ is the angle between the UAV-to-object line and the vertical direction, namely the pitch angle, and φ is the angle between the ground projection of the UAV-to-object line and due north, namely the direction angle. The geodetic-coordinate-system coordinates of the target point are solved as:
x = x_0 + h·tan θ·sin φ
y = y_0 + h·tan θ·cos φ
Because the vertical ground-projection point of the unmanned aerial vehicle and the positioning target lie at the same elevation, the absolute elevation of the target is obtained as the difference between the altitude elevation and the geodetic elevation of the unmanned aerial vehicle. The target elevation formula is as follows:
h=hF-hD
wherein hF is the altitude elevation of the unmanned aerial vehicle, namely the altitude elevation output by the GPS (global positioning system) locator; hD is the geodetic elevation of the unmanned aerial vehicle, namely the height of the unmanned aerial vehicle above its vertical ground projection point at the current moment, output by the barometric altitude sensor.
Step five, for a given unmanned aerial vehicle, the optical error of the onboard camera lens, the attitude measurement error of the onboard camera holder, and the attitude measurement error of the unmanned aerial vehicle are first measured, and the pitch-angle and direction-angle errors of the target positioning are deduced from them. Let the target positioning pitch-angle error be δp and the direction-angle error be δy; the formulas can be derived from the spatial relationship and the coordinate transformation.
Wherein δ2a is the direction-angle measurement error of the onboard camera holder of the unmanned aerial vehicle, δ4a is the direction-angle measurement error of the unmanned aerial vehicle itself, δ2c is the roll-angle measurement error of the onboard camera holder, δ4c is the roll-angle measurement error of the unmanned aerial vehicle, δ3b is the transverse optical-distortion error of the onboard camera holder, δ2b is the pitch-angle measurement error of the onboard camera holder, δ4b is the pitch-angle measurement error of the unmanned aerial vehicle, δ3a is the longitudinal optical-distortion error of the onboard camera holder, θgr is the roll angle of the holder, θr is the roll angle of the unmanned aerial vehicle, h is the pixel height of the image, w is the pixel width of the image, θcw is the transverse visual angle of the camera lens, and θch is the longitudinal visual angle of the camera lens.
The geodetic elevation measurement error of the unmanned aerial vehicle is fused with the target pitch-angle and direction-angle errors to obtain the relative longitude-and-latitude positioning error area of a single unmanned aerial vehicle. This area is a fan ring centered on the unmanned aerial vehicle, with inner radius r1, outer radius r2, and central angle θfr given by the following formulas:
wherein hf is the geodetic elevation of the unmanned aerial vehicle, δ5 is the geodetic elevation measurement error of the unmanned aerial vehicle, θgp is the pitch angle of the holder, θp is the pitch angle of the unmanned aerial vehicle, δp is the target positioning pitch-angle error, and δy is the target positioning direction-angle error.
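The fan-ring formulas are likewise rendered as images in the original. Under the flat-ground geometry of step four, one plausible reading — an assumption, not the patent's verbatim formula — is that the ring spans the ground ranges reached at total pitch θ ± δp with central angle 2δy:

```python
import math

def fan_ring(hf, theta, delta_p, delta_y):
    """Sketch of the relative-positioning error fan ring (assumed geometry).

    hf      : UAV geodetic elevation above ground (m)
    theta   : total target pitch angle from vertical (rad); the patent
              combines the holder pitch and the UAV pitch into this angle
    delta_p : target positioning pitch-angle error (rad)
    delta_y : target positioning direction-angle error (rad)
    """
    r1 = hf * math.tan(theta - delta_p)   # inner radius of the fan ring
    r2 = hf * math.tan(theta + delta_p)   # outer radius of the fan ring
    theta_fr = 2.0 * delta_y              # central angle of the fan ring
    return r1, r2, theta_fr

# 100 m above ground, 45 deg pitch, 0.5 deg errors in pitch and direction
r1, r2, a = fan_ring(100.0, math.radians(45), math.radians(0.5), math.radians(0.5))
```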
The longitude-and-latitude measurement error of the unmanned aerial vehicle itself is then added, namely the center of the relative positioning error area is translated within the longitude-and-latitude measurement error range of the unmanned aerial vehicle and the maximum coverage area is taken. Because this area may be a complex polygon, its circumscribed circle is generally taken as the final positioning error area; the center coordinates of this circular area are (x, y), and its radius rd is given by the following formula:
Because the target elevation is obtained as the difference between the altitude elevation and the geodetic elevation of the unmanned aerial vehicle, the radius of the one-dimensional probability circle of the target elevation positioning error is given by:
wherein δ6 is the altitude elevation measurement error of the unmanned aerial vehicle.
The target positioning results of the three unmanned aerial vehicles are fused to obtain the target positioning result of the unmanned aerial vehicle micro-group, and the formula is as follows:
Thus, the calculation of the longitude, latitude, and elevation of the target at time k is completed. At the next time (k + 1), k ← (k + 1) is assigned, and the process returns to step three for the next round of solution. In one embodiment, a fixed target is located. The target-location refresh period is 10 ms: every 10 ms one positioning is completed through data compression, data transmission/communication (a single beat takes 1 ms), and target-position calculation. FIGS. 6 and 7 compare the calculated target longitude-latitude and elevation values with the actual values over 1000 refresh cycles. Statistically, the longitude-latitude positioning accuracy is 1.67 m and the elevation positioning accuracy is 1.03 m, both given as circular probable errors. The technique of this patent is therefore real-time for a fixed target. For a low-speed moving target below 100 km/h (about 28 m/s, i.e. 0.028 m/ms), the target moves less than 0.28 m within one 10 ms cycle, so with a longitude-latitude accuracy of 1.67 m and an elevation accuracy of 1.03 m the positioning accuracy for such a moving target is similar to that for a fixed target; that is, the technique has a clear real-time positioning advantage for low-speed moving targets.
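The real-time claim rests on simple arithmetic, which can be checked directly from the figures quoted above:

```python
# verify the displacement-per-refresh figure quoted in the embodiment
speed_kmh = 100.0                        # upper bound for a low-speed target
refresh_s = 0.010                        # 10 ms target-location refresh period
speed_ms = speed_kmh * 1000.0 / 3600.0   # about 27.8 m/s
displacement = speed_ms * refresh_s      # metres moved per refresh cycle
# about 0.28 m per cycle, small against the 1.67 m longitude-latitude accuracy
```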
The embodiments in this description are described in a progressive manner; each embodiment focuses on its differences from the others, and the same or similar parts among the embodiments may be referred to one another. Since the device disclosed in an embodiment corresponds to the method disclosed in an embodiment, its description is brief, and the relevant points can be found in the description of the method.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (10)
1. A ground target real-time positioning method based on micro-swarm of unmanned aerial vehicles is characterized by comprising the following steps:
firstly, before positioning a ground target, performing elevation measurement zero drift correction on each unmanned aerial vehicle in the unmanned aerial vehicle micro group;
secondly, before positioning the ground target, performing direction angle measurement zero drift correction on each unmanned aerial vehicle in the unmanned aerial vehicle micro group;
step three, hovering the micro-group of unmanned aerial vehicles at any time k, and resolving a pitch angle and a direction angle of a connecting line between each unmanned aerial vehicle and a target center by using the self attitude of each unmanned aerial vehicle, the attitude of an airborne camera holder and the detection result of the target in the image;
step four, calculating the target center longitude and latitude and the target elevation based on single unmanned aerial vehicle positioning by combining the pitch angle and the direction angle of the line connecting each unmanned aerial vehicle and the target center with the self elevation and the self longitude and latitude of each unmanned aerial vehicle;
and fifthly, carrying out data fusion on the target center longitude and latitude and the target elevation of each unmanned aerial vehicle to obtain the estimated values of the target longitude and latitude and the target elevation based on the micro-group of the unmanned aerial vehicles.
2. The method of claim 1, wherein the first step comprises:
before positioning a ground target, enabling the unmanned aerial vehicle to take off from a specific parking apron, and adjusting the position of a machine body to keep the parking apron in an image center in the ascending process;
after the unmanned aerial vehicle reaches a height of 10 meters, hovering and acquiring images multiple times, and calculating the geodetic elevation value of the unmanned aerial vehicle according to the size of the parking apron in the images;
wherein f is the focal length of the camera, γ is the length of the apron in the image, α is the visual angle subtended by the apron in the camera, x is the actual side length of the apron, and the computed value is the geodetic elevation value of the unmanned aerial vehicle;
smoothing the geodetic elevation values obtained by calculating the plurality of images by using wavelet filtering, and taking the smoothed values as accurate values of the geodetic elevation at the current moment;
taking the difference between the geodetic elevation value and the geodetic elevation measurement value of the unmanned aerial vehicle at the same moment, and averaging the differences to obtain the geodetic elevation measurement zero-drift error of the unmanned aerial vehicle.
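The height formula of claim 2 is rendered as an image in the original; a minimal sketch assuming the standard pinhole similar-triangles relation (h / x = f / γ, an assumption consistent with the symbols defined above):

```python
def geodetic_height_pinhole(f_mm, apron_side_m, apron_image_mm):
    """Estimate UAV height above the apron by similar triangles.

    Assumed form of the omitted claim-2 formula: the apron of real side
    length x imaged at length gamma on the sensor at focal length f gives
    h = f * x / gamma (f and gamma in the same units).
    """
    return f_mm * apron_side_m / apron_image_mm

# 8 mm lens, 5 m apron imaged at 4 mm on the sensor
h = geodetic_height_pinhole(8.0, 5.0, 4.0)
```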
3. The method for positioning the ground target in real time based on the micro-swarm of the unmanned aerial vehicles according to claim 1, wherein the second step comprises:
before positioning a ground target, the micro-group of unmanned aerial vehicles is hovered for the first time;
recording the longitude, the latitude, the direction angle and the vertical axis reading of a triaxial attitude measuring instrument of a single hovering unmanned aerial vehicle in the primary hovering process as direction angle calibration initial information;
controlling the unmanned aerial vehicle to carry out secondary hovering after the unmanned aerial vehicle advances for 10m towards the direction of the machine head, and recording the longitude, the latitude, the direction angle and the vertical axis reading of the three-axis attitude measuring instrument of the unmanned aerial vehicle in the secondary hovering process;
calculating the zero-drift error of the direction angle of the unmanned aerial vehicle by using the following formula:
wherein lo0, la0, and θ0 respectively represent the longitude, the latitude, and the vertical-axis reading of the three-axis attitude measuring instrument of the single unmanned aerial vehicle during the first hover, along with its direction angle; lo1, la1, and θ1 respectively represent the corresponding quantities during the second hover.
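The zero-drift formula of claim 3 is rendered as an image in the original. A sketch of one standard way to compute it, assuming an equirectangular approximation over the short 10 m baseline (the function name and sign convention are illustrative):

```python
import math

def heading_drift(lo0, la0, lo1, la1, measured_heading_deg):
    """Estimate the direction-angle zero drift (sketch).

    Compares the GPS track bearing between the two hover points with the
    heading the attitude sensor reported while the UAV flew straight
    ahead; the difference is the drift to be corrected.
    """
    # equirectangular approximation: valid over the short 10 m leg
    dlon = math.radians(lo1 - lo0) * math.cos(math.radians((la0 + la1) / 2.0))
    dlat = math.radians(la1 - la0)
    track_bearing = math.degrees(math.atan2(dlon, dlat)) % 360.0
    return measured_heading_deg - track_bearing

# due-north 10 m leg while the sensor read 1.5 deg -> about 1.5 deg drift
drift = heading_drift(116.0, 40.0, 116.0, 40.00009, 1.5)
```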
4. The method according to claim 3, wherein the third step comprises:
calculating, by utilizing the collinearity of camera imaging and according to the pixel position of the target center in the captured image, the included angle θ2 between the target center and the optical axis of the onboard camera lens in the vertical direction, and the included angle between the target center and the optical axis in the true-north direction; the calculation formulas are as follows:
wherein the pixel position of the target in the image is (xp, yp) and the pixel size of the image is (wp, hp);
obtaining the included angle θ1 between the optical axis of the onboard camera lens and the vertical direction, and the included angle between the optical axis and the true-north direction, according to the attitude of the unmanned aerial vehicle and the attitude of the onboard camera holder;
combining the included angle θ2 between the target center and the optical axis of the onboard camera lens with the included angle θ1 between the optical axis and the vertical direction, calculating the included angle between the line connecting the target center and the unmanned aerial vehicle and the vertical direction as the target pitch angle θ, with the calculation formula:
θ=θ1-θ2;
combining the included angle between the target center and the optical axis of the onboard camera lens with the included angle between the optical axis and the true-north direction, calculating the included angle between the line connecting the target center and the unmanned aerial vehicle and the true-north direction as the target direction angle, with the calculation formula:
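The angle formulas of claim 4 are rendered as images in the original. A sketch assuming a common pinhole mapping from pixel offset to angular offset (the exact form used by the patent may differ):

```python
import math

def pixel_offset_angles(xp, yp, wp, hp, fov_w_deg, fov_h_deg):
    """Angles between the target pixel and the optical axis (assumed
    pinhole form of the omitted claim-4 formulas).

    Returns (theta2, phi-like angle) in degrees: the vertical-plane and
    horizontal-plane offsets of pixel (xp, yp) in a wp x hp image, given
    the transverse and longitudinal visual angles of the lens.
    """
    # signed fraction of the half-image the target sits from the centre
    fx = (xp - wp / 2.0) / (wp / 2.0)
    fy = (yp - hp / 2.0) / (hp / 2.0)
    phi2 = math.degrees(math.atan(fx * math.tan(math.radians(fov_w_deg / 2.0))))
    theta2 = math.degrees(math.atan(fy * math.tan(math.radians(fov_h_deg / 2.0))))
    return theta2, phi2

# a target at the image centre lies on the optical axis: both angles zero
t2, p2 = pixel_offset_angles(960, 540, 1920, 1080, 60.0, 40.0)
```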
5. the method of claim 4, wherein the fourth step comprises:
solving the longitude and latitude position of the target relative to the unmanned aerial vehicle according to the target pitch angle and the target direction angle obtained by the third step by combining the geodetic elevation measurement value of the unmanned aerial vehicle;
solving the longitude and latitude of the target center based on single unmanned aerial vehicle positioning by combining the measured value of the longitude and latitude of the unmanned aerial vehicle;
the altitude elevation value of the target based on single-machine positioning is solved by utilizing the altitude elevation measurement value of the unmanned aerial vehicle and the geodetic elevation measurement value of the unmanned aerial vehicle, and the calculation formula is as follows:
h=hF-hD;
wherein (x, y) represents the geodetic coordinates of the target, (x0, y0) represents the geodetic coordinates of the unmanned aerial vehicle, h represents the target elevation value, hF represents the altitude elevation of the unmanned aerial vehicle, and hD is the geodetic elevation measurement value of the unmanned aerial vehicle.
6. The method according to claim 1, wherein the step five comprises:
deducing pitch angle and direction angle errors of target positioning according to airborne camera lens optical errors, airborne camera pan-tilt attitude measurement errors and unmanned aerial vehicle attitude measurement errors of a single unmanned aerial vehicle;
performing fusion calculation on the geodetic elevation measurement error, the target pitch angle error and the target direction angle error of the single unmanned aerial vehicle to obtain a target longitude and latitude relative positioning error area of the single unmanned aerial vehicle, wherein the area is a fan ring taking the single unmanned aerial vehicle as a center;
introducing a self longitude and latitude measurement error of the single unmanned aerial vehicle, translating a target longitude and latitude relative positioning error area center within a self longitude and latitude measurement error range of the single unmanned aerial vehicle, taking a circumscribed circle of a maximum coverage area as a target longitude and latitude final positioning error area of the single unmanned aerial vehicle, and taking the radius of the final positioning error area as the target longitude and latitude positioning error of the single unmanned aerial vehicle;
obtaining a target elevation positioning error one-dimensional probability circle of the single unmanned aerial vehicle according to the fusion of the altitude elevation measurement error and the geodetic elevation measurement error of the single unmanned aerial vehicle, and taking the radius of the target elevation positioning error one-dimensional probability circle as the target elevation positioning error of the single unmanned aerial vehicle;
and fusing the target longitude and latitude positioning errors and the target elevation positioning errors of the n unmanned aerial vehicles to obtain the estimated values of the target longitude and latitude and the target elevation of the micro-group of the unmanned aerial vehicles.
7. The method of claim 6, wherein the calculation formula of the pitch angle and the direction angle errors of the target positioning is as follows:
In the above formulas, δp is the target positioning pitch-angle error, δy is the target positioning direction-angle error, δ2a is the direction-angle measurement error of the onboard camera holder of the single unmanned aerial vehicle, δ4a is the direction-angle measurement error of the single unmanned aerial vehicle itself, δ2c is the roll-angle measurement error of the onboard camera holder, δ4c is the roll-angle measurement error of the single unmanned aerial vehicle itself, δ3b is the transverse optical-distortion error of the onboard camera holder, δ2b is the pitch-angle measurement error of the onboard camera holder, δ4b is the pitch-angle measurement error of the single unmanned aerial vehicle itself, δ3a is the longitudinal optical-distortion error of the onboard camera holder, θgr is the roll angle of the holder, θr is the roll angle of the single unmanned aerial vehicle, h is the pixel height of the image, w is the pixel width of the image, θcw is the transverse visual angle of the camera lens, and θch is the longitudinal visual angle of the camera lens.
8. The method of claim 7, wherein the calculation formula of the target longitude and latitude relative positioning error area of a single unmanned aerial vehicle is as follows:
wherein r1 is the inner radius of the fan ring, r2 is the outer radius of the fan ring, θfr is the central angle of the fan ring, hf is the geodetic elevation of the single unmanned aerial vehicle, δ5 is the geodetic elevation measurement error of the single unmanned aerial vehicle itself, θgp is the pitch angle of the onboard camera holder of the single unmanned aerial vehicle, and θp is the pitch angle of the single unmanned aerial vehicle.
9. The ground target real-time positioning method based on micro-swarm of unmanned aerial vehicles according to claim 8, wherein the calculation formula of the radius rd of the final positioning error area is as follows:
wherein δ1 represents the longitude-and-latitude measurement error of the single unmanned aerial vehicle itself;
the calculation formula of the radius rh of the one-dimensional probability circle of the target elevation positioning error is as follows:
wherein δ6 is the altitude elevation measurement error of the single unmanned aerial vehicle.
10. The method of claim 9, wherein the estimated values of the latitude and longitude and the elevation of the target of the micro-cluster of unmanned aerial vehicles are calculated according to the following formula:
wherein n indicates that the unmanned aerial vehicle micro-group contains n unmanned aerial vehicles, (xQ, yQ) is the calculated value of the target longitude and latitude based on the unmanned aerial vehicle micro-group, and hQ is the calculated value of the target elevation based on the unmanned aerial vehicle micro-group.
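The fusion formula of claim 10 is rendered as an image in the original. A sketch of one plausible scheme — an inverse-error weighted mean of the n single-UAV fixes, an assumption rather than the patent's verbatim formula:

```python
def fuse_positions(estimates, errors):
    """Weighted fusion of n single-UAV target fixes (assumed scheme).

    estimates : list of (x, y, h) single-UAV target solutions
    errors    : list of positioning-error radii, one per UAV; tighter
                fixes receive proportionally more weight
    """
    weights = [1.0 / e for e in errors]
    s = sum(weights)
    xq = sum(w * e[0] for w, e in zip(weights, estimates)) / s
    yq = sum(w * e[1] for w, e in zip(weights, estimates)) / s
    hq = sum(w * e[2] for w, e in zip(weights, estimates)) / s
    return xq, yq, hq

# three UAVs with equal errors reduce to a plain average
xq, yq, hq = fuse_positions([(0, 0, 10), (2, 2, 12), (4, 4, 14)], [1.0, 1.0, 1.0])
```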
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110728010.7A CN113340272B (en) | 2021-06-29 | 2021-06-29 | Ground target real-time positioning method based on micro-group of unmanned aerial vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113340272A true CN113340272A (en) | 2021-09-03 |
CN113340272B CN113340272B (en) | 2022-09-06 |
Family
ID=77481579
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110728010.7A Active CN113340272B (en) | 2021-06-29 | 2021-06-29 | Ground target real-time positioning method based on micro-group of unmanned aerial vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113340272B (en) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0148704A2 (en) * | 1984-01-06 | 1985-07-17 | Thomson-Csf | Monitoring method for target localization by way of an unmanned aircraft |
CN101191994A (en) * | 2006-12-01 | 2008-06-04 | 鸿富锦精密工业(深圳)有限公司 | Optical look angle measuring systems and its measuring method |
CN103822615A (en) * | 2014-02-25 | 2014-05-28 | 北京航空航天大学 | Unmanned aerial vehicle ground target real-time positioning method with automatic extraction and gathering of multiple control points |
CN104501779A (en) * | 2015-01-09 | 2015-04-08 | 中国人民解放军63961部队 | High-accuracy target positioning method of unmanned plane on basis of multi-station measurement |
CN106705936A (en) * | 2016-12-06 | 2017-05-24 | 浙江华飞智能科技有限公司 | Method and device for optimizing altitude of unmanned aerial vehicle |
CN107990874A (en) * | 2017-11-23 | 2018-05-04 | 南京中高知识产权股份有限公司 | A kind of ground elevation three-dimensional laser scanner and scan method |
CN109116864A (en) * | 2018-09-07 | 2019-01-01 | 佛山皖和新能源科技有限公司 | A kind of unmanned plane cluster terrestrial information acquisition identification management method |
CN110134132A (en) * | 2019-04-29 | 2019-08-16 | 西北工业大学 | A kind of system and method for multiple no-manned plane collaboration target positioning |
US20200278418A1 (en) * | 2019-01-02 | 2020-09-03 | Electronics And Telecommunications Research Institute | Method and apparatus for identifying location information of signal source by using unmanned aerial vehicle |
CN111879313A (en) * | 2020-07-31 | 2020-11-03 | 中国人民解放军国防科技大学 | Multi-target continuous positioning method and system based on unmanned aerial vehicle image recognition |
US20210157336A1 (en) * | 2019-11-26 | 2021-05-27 | Lg Electronics Inc. | Unmanned aerial vehicle and station |
Non-Patent Citations (2)
Title |
---|
YANG SHUAI et al.: "UAV image reconnaissance target positioning method and accuracy analysis", Infrared Technology *
MA JUNJIE et al.: "Positioning of targets of interest based on cooperative UAVs", Computer Measurement & Control *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114194698A (en) * | 2021-12-14 | 2022-03-18 | 北京华横科技有限公司 | Tally terminal and information processing method |
CN114194698B (en) * | 2021-12-14 | 2023-11-24 | 北京华横科技有限公司 | Tally terminal and information processing method |
CN114240758A (en) * | 2021-12-24 | 2022-03-25 | 柳州市侗天湖农业生态旅游投资有限责任公司 | Mountain tea garden low-altitude image splicing method taking quadrilateral plots as reference objects |
CN114240758B (en) * | 2021-12-24 | 2022-08-05 | 柳州市侗天湖农业生态旅游投资有限责任公司 | Mountain tea garden low-altitude image splicing method taking quadrilateral plots as reference objects |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||