CN103852077B - Automatic anti-cheating judgment method for unmanned aerial vehicle positioning information in link failure process

Automatic anti-cheating judgment method for unmanned aerial vehicle positioning information in link failure process

Info

Publication number
CN103852077B
Authority
CN
China
Prior art keywords
information
destination
latitude
longitude
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410103184.4A
Other languages
Chinese (zh)
Other versions
CN103852077A (en)
Inventor
向锦武
丁文锐
康传波
李红光
刘硕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing northern sky long hawk UAV Technology Co. Ltd.
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201410103184.4A priority Critical patent/CN103852077B/en
Publication of CN103852077A publication Critical patent/CN103852077A/en
Application granted granted Critical
Publication of CN103852077B publication Critical patent/CN103852077B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The invention discloses an automatic anti-cheating (anti-spoofing) judgment method for unmanned aerial vehicle positioning information during link failure. The method comprises the following steps: step 1, pre-flight leg information planning; step 2, acquiring waypoint coordinate information and matching it against table T1; step 3, extracting ground-feature corner features and position/elevation information in flight and matching the mission payload planning information with the landform information; step 4, calculating a weight coefficient and a matching threshold from the working-condition information table T4; and step 5, performing a weighted calculation and judging whether spoofing is present. The method fuses onboard multi-sensor information, improving the utilization of onboard resources, and establishes a composite information model, improving the safety of the unmanned aerial vehicle.

Description

Autonomous anti-spoofing judgment method for UAV positioning information during link failure
Technical field
The invention belongs to the fields of UAV autonomous control and image processing, and specifically relates to an autonomous method for judging false positioning information of a UAV during link failure. It can be used for anti-spoofing of UAV positioning information, to compensate for the shortcomings of the "unmanned" aspect.
Background technology
At present, UAV systems and related technologies such as informatization are developing rapidly. UAVs have wide application value in both military and civilian fields, while their application scenarios are increasingly complex and variable. To realize autonomous UAV flight, the navigation system must provide true and reliable spatial position information. When the data link communicates normally, although the aircraft itself is "unmanned", the whole UAV system loop is "manned": the UAV operator can judge from the downlink telemetry data on the ground whether the flight path matches the expectation and, if the deviation is large, take manual remote-control corrective action.
As described above, on the one hand, from a technical standpoint, because GPS signals can be intercepted and cracked, the system loop becomes "unmanned" during link failure and the UAV may be interfered with by false navigation information. On the other hand, related incidents further illustrate that an "unmanned" system is very likely to be spoofed. For example, Iran declared that it had cracked the US military GPS navigation signal, jammed the communication equipment to cause link failure, and successfully lured with false positioning information and captured a US RQ-170 "Sentinel" reconnaissance aircraft performing a mission over Iranian territory. In addition, engineering researchers at the University of Texas at Austin, in an experiment commissioned by the US Department of Homeland Security, used home-made "electronic spoofing" equipment to successfully hijack a civilian UAV; the principle was similar to Iran's capture of the US military UAV, namely causing the UAV data link to fail so that the aircraft loses contact with the ground-station operator, and then, after cracking the navigation information format, feeding it false positioning information to take control.
Therefore, improving UAV anti-spoofing capability is a new problem that cannot be ignored. Two approaches can be taken: the first is to improve the anti-jamming and anti-interception capability of the UAV navigation and data-link equipment, preventing the UAV from being accessed and controlled externally at the signal and information level; the second, which is even more important, is to improve the UAV's ability to recognize deceptive information.
Summary of the invention
The purpose of the invention is to solve the above "unmanned" problem. Starting from the way humans perceive things and make decisions, and based on data that can be obtained in real time during UAV flight together with prior information, an autonomous anti-spoofing judgment method for UAV positioning information during link failure is proposed. The method includes extracting and storing information corresponding to the route and waypoints before flight; extracting relevant information in real time from the data obtained when flying to a waypoint; jointly processing the multi-source information between multiple waypoints; weighting the result according to complex working-condition information such as geography, meteorology and human environment; and autonomously judging whether the UAV is being deceived by false positioning information, thereby improving the safe autonomy of the UAV.
The autonomous anti-spoofing judgment method for UAV positioning information during link failure according to the invention includes the following steps:
Step 1: pre-flight leg information planning.
Let the starting point of the leg be the "from" point and the arrival point be the "to" point. According to the route-planning requirements, N waypoints are arranged along the leg in the air, with N greater than or equal to 2.
Build leg planning table T1: it contains the waypoint number, longitude, latitude and altitude of each waypoint.
Build mission payload planning table T2: it contains the platform azimuth, platform pitch angle and camera focal length of the UAV when taking pictures.
Build ground imaging region landform table T3: it contains the longitude, latitude and elevation of the ground features in the imaging regions of the leg between waypoints, together with their SUSAN corner features.
Build working-condition information table T4: it contains geography G, meteorology V, human environment H and imaging resolution R.
Input the leg planning table T1, mission payload planning table T2, ground imaging region landform table T3 and working-condition information table T4 into the onboard flight control computer.
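For illustration, the four planning tables could be held on board as simple records. A minimal sketch in Python is given below; the field names and types are assumptions, not the patent's on-board data format.

```python
# Illustrative data structures for the planning tables T1-T4 (a minimal sketch;
# field names are assumptions, not the patent's exact on-board record layout).
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Waypoint:                  # one row of leg planning table T1
    number: int
    longitude: float             # degrees
    latitude: float              # degrees
    altitude: float              # metres

@dataclass
class PayloadPlan:               # one row of mission payload planning table T2
    platform_azimuth: float      # degrees
    platform_pitch: float        # degrees
    camera_focal_length: float   # millimetres

@dataclass
class LandmarkRecord:            # one row of ground imaging region landform table T3
    longitude: float
    latitude: float
    elevation: float
    susan_corners: List[Tuple[int, int]]   # pre-extracted SUSAN corner features

@dataclass
class WorkingCondition:          # one row of working-condition information table T4
    G: int                       # geography grade
    V: int                       # meteorology grade
    H: int                       # human-environment grade
    R: int                       # imaging-resolution grade
```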
Step 2: waypoint coordinate information data acquisition and table T1 information matching.
When the UAV anti-spoofing operation is required, the waypoint being flown over is detected and identified. This specifically includes the following steps:
(1) Waypoint detection
The ground imaging region corresponding to the waypoint is detected with the SUSAN corner detection method, yielding α corner features.
The coordinate information and elevation information of the waypoint are obtained. The coordinate information includes the longitude and latitude of the waypoint and is denoted P_Way-point(longitude, latitude); the elevation information includes the altitude of the waypoint and is denoted A_Altitude.
(2) Waypoint matching
The ground imaging region landform table T3 is accessed to obtain the β corner features of the corresponding waypoint. The α corner features obtained in step (1) are matched against the β corner features; the number of successfully matched corners is γ. If γ satisfies the preset corner-matching condition, the match is successful and M0 = 1; otherwise the match fails and M0 = 0.
At the same time, the leg planning table T1 is accessed to obtain the coordinate information and elevation information of the corresponding waypoint. The coordinate information includes the longitude and latitude of the waypoint and is denoted P'_Way-point(longitude, latitude); the elevation information includes the altitude of the waypoint and is denoted A'_Altitude. The P_Way-point(longitude, latitude) and A_Altitude obtained in step (1) are matched against P'_Way-point(longitude, latitude) and A'_Altitude by computing their differences, giving the results M1 and M2:
M_P = | P_Way-point(longitude, latitude) - P'_Way-point(longitude, latitude) |
M_A = | A_Altitude - A'_Altitude |
M_P represents the difference of the waypoint coordinates and M_A represents the difference of the waypoint altitude. If the differences of both longitude and latitude are within 0.1, M1 = 1, otherwise M1 = 0; at the same time, if the altitude difference is within 10 metres, M2 = 1, otherwise M2 = 0.
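As an illustration of this step, a minimal Python sketch of the three flags follows. The corner-match ratio is an assumed, configurable parameter, since the text only states that a preset condition on γ is applied.

```python
# Minimal sketch of the step-2 flags M0, M1, M2 (illustrative names; the
# corner-match ratio is an assumed parameter).
def waypoint_match_flags(gamma, beta_count, p_measured, a_measured,
                         p_planned, a_planned,
                         corner_ratio=0.5, coord_tol=0.1, alt_tol=10.0):
    """Return (M0, M1, M2) for one waypoint.

    gamma:      number of corners matched between the alpha features detected
                in flight and the beta features stored in table T3
    beta_count: number of corner features for this waypoint in table T3
    p_*:        (longitude, latitude) tuples; a_*: altitudes in metres
    """
    m0 = 1 if beta_count > 0 and gamma >= corner_ratio * beta_count else 0
    d_lon = abs(p_measured[0] - p_planned[0])
    d_lat = abs(p_measured[1] - p_planned[1])
    m1 = 1 if d_lon <= coord_tol and d_lat <= coord_tol else 0   # both within 0.1
    m2 = 1 if abs(a_measured - a_planned) <= alt_tol else 0      # within 10 m
    return m0, m1, m2
```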
Step 3: extract ground-feature corner features and position/elevation information in flight, and match the mission payload planning information with the landform information.
Upon arriving at the specified waypoint, in combination with prior image information or a geographic information system, the UAV camera angle is adjusted according to the UAV parameters of the corresponding waypoint in mission payload planning table T2, and multi-source data are collected.
The multi-source data include:
A. Image information: the CCD camera attitude is adjusted according to mission payload planning table T2, obtaining a high-definition aerial image of the landform corresponding to the waypoint;
B. Navigation information: the integrated navigation data are used to obtain real-time landform position data;
C. Elevation information: laser ranging is used to obtain multi-point elevation information.
Multi-source data matching is then carried out, including:
(1) Landform information matching
First, the aerial image is pre-processed. Second, the N corner features of the aerial image are computed. Finally, the onboard landform database is accessed, several corner features of the current position are extracted and matched against the N corner features, and the number of matched feature points M between the landform image data and the aerial image data is computed. If M satisfies the preset feature-matching condition, the match is considered successful and M3 = 1; otherwise M3 = 0.
(2) Position and elevation data matching
First, landform image coordinate information and multi-point elevation information are obtained from the collected data. The landform coordinate information includes the longitude and latitude of the landform and is denoted P'_Object(longitude, latitude); the onboard current landform coordinate information is denoted P_Object(longitude, latitude). If the difference M_O is within the predetermined range, M5 = 1 is returned; otherwise M5 = 0:
M_O = | P_Object(longitude, latitude) - P'_Object(longitude, latitude) |
The elevation information is matched by calculating the mean and variance of the collected multi-point elevations, denoted α1 and β1 respectively; the onboard elevation database is then accessed, the elevation data of the points corresponding to the current position are extracted, and their mean and variance are calculated, denoted α2 and β2:
M4 = |α1 - α2| + |β1 - β2|
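As an illustration of these matching quantities, a minimal Python sketch follows. The feature-match ratio is an assumed parameter, and M_O is treated here as the sum of the absolute longitude and latitude differences.

```python
# Minimal sketch of the step-3 quantities M3, M5 and M4 (numpy assumed; the
# feature-match ratio and the scalar form of M_O are assumptions).
import numpy as np

def landform_match_flag(matched_count, n_image_corners, min_ratio=0.5):
    """M3: 1 if enough aerial-image corners match the onboard landform database."""
    return 1 if n_image_corners > 0 and matched_count >= min_ratio * n_image_corners else 0

def position_match_flag(p_object, p_object_ref, tol=0.1):
    """M5: 1 if the measured landform coordinates agree with the stored
    coordinates within the predetermined range (0.1 in claim 2)."""
    m_o = abs(p_object[0] - p_object_ref[0]) + abs(p_object[1] - p_object_ref[1])
    return 1 if m_o <= tol else 0

def elevation_mismatch(measured_elevations, database_elevations):
    """M4 = |alpha1 - alpha2| + |beta1 - beta2|: combined difference of the mean
    and variance of measured vs. stored multi-point elevations."""
    a1, b1 = np.mean(measured_elevations), np.var(measured_elevations)
    a2, b2 = np.mean(database_elevations), np.var(database_elevations)
    return float(abs(a1 - a2) + abs(b1 - b2))
```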
Step 4: calculate the weight coefficient and matching threshold from the working-condition information table T4.
According to working-condition information table T4, different feature-matching weight coefficients λ and matching thresholds V are assigned:
λ = 2 * (G + V + H + R)
V = 2 / (G + V + H + R) + 10
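A minimal sketch of this weighting step follows. Note that the grouping of the constant term in the threshold formula is reconstructed from a garbled fraction in the source and is therefore an assumption.

```python
# Minimal sketch of the step-4 weighting (the "+ 10" grouping in the threshold
# is a reconstruction of a garbled fraction and therefore an assumption).
def weight_and_threshold(G, V_w, H, R):
    """G, V_w, H, R: working-condition grades from table T4 (e.g. 1, 3, 5, 7)."""
    lam = 2 * (G + V_w + H + R)          # weight coefficient, grows with the grades
    thr = 2.0 / (G + V_w + H + R) + 10   # matching threshold, inversely related to the grades
    return lam, thr
```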
Step 5: weighted calculation and judgment of whether spoofing is present.
When P is greater than the threshold V, the UAV is judged not to be spoofed; otherwise, the UAV is judged to be spoofed:
P = λ * M0 * M1 * M2 * M3 * M5 / M4
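Putting the pieces together, a minimal sketch of the final decision is shown below (an illustrative function, not the patent's on-board implementation). Because M0, M1, M2, M3 and M5 are all binary, any single failed check drives P to zero, so a "not spoofed" verdict requires every match to pass while the elevation mismatch M4 stays small relative to λ.

```python
# Minimal sketch of the step-5 decision (illustrative; eps guards against a zero
# elevation mismatch M4).
def is_spoofed(lam, threshold, m0, m1, m2, m3, m5, m4, eps=1e-6):
    p = lam * m0 * m1 * m2 * m3 * m5 / max(m4, eps)
    return p <= threshold                # P > V  ->  not spoofed
```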
The advantages of the invention are:
(1) onboard multi-sensor information is fused, improving the utilization of onboard resources;
(2) a composite information model is established, improving UAV safety.
Brief description of the drawings
Fig. 1 is the flow chart of the method of the present invention;
Fig. 2 is the image information matching flow chart of the present invention.
Detailed description of the invention
The specific embodiments of the present invention are described in detail below in conjunction with the accompanying drawings.
The flow of the autonomous anti-spoofing judgment method for UAV positioning information during link failure according to the invention is shown in Fig. 1 and includes the following steps:
Step 1: pre-flight leg information planning.
Let the starting point of the leg be the "from" point and the arrival point be the "to" point. According to the route-planning requirements, N waypoints are arranged along the leg in the air, with N greater than or equal to 2.
Build leg planning table T1: the leg planning table T1 contains the waypoint number, longitude, latitude and altitude of each waypoint. For example, a leg planning table containing the "from" waypoint and the "to" waypoint is shown in table T1:
Table T1 leg planning table
Where From-Num is the "from" waypoint number, From-Lon is the "from" point longitude, From-Lat is the "from" point latitude, From-Alt is the "from" point altitude; Arrive-Num is the "to" waypoint number, Arrive-Lon is the "to" point longitude, Arrive-Lat is the "to" point latitude, and Arrive-Alt is the "to" point altitude.
Build mission payload planning table T2: the mission payload planning table T2 contains the platform azimuth, platform pitch angle and camera focal length of the UAV when taking pictures. For example, mission payload planning table T2 is as shown in table T2:
Table T2 mission payload planning table
Where Platform-Azimuth is the platform azimuth, Platform-Pitching Angle is the platform pitch angle, and Camera Focal Length Value is the camera focal length.
Build ground imaging region landform table T3: the ground imaging regions along the leg between waypoints during flight are obtained, and significant physical ground features within them, such as signal beacons, mountains, rivers or above-ground structures, are identified from prior image data or a geographic information system. After extracting the corner features and coordinate position information of these ground features, the ground imaging region landform table T3 is generated. It contains the longitude, latitude and elevation of the ground features in the imaging regions of the leg between waypoints, together with their SUSAN corner features. For example, ground imaging region landform table T3 is as shown in table T3:
Table T3 ground imaging region landform table
Where Marker-Lon is the ground-feature longitude, Marker-Lat is the ground-feature latitude, Marker-Alt is the ground-feature elevation, and Marker-feature is the corner feature.
Build working-condition information table T4: the working-condition information of each waypoint is filled into table T4 according to grades. The working-condition information includes geography G, meteorology V, human environment H and imaging resolution R. The geography grades are divided into buildings, mountains and lakes; the meteorology grades into sunny, cloudy, haze and sleet; the human-environment grades into city and suburb; and the imaging-resolution grades into high, intermediate and low resolution.
For example, the working-condition items geography, meteorology, human environment and imaging resolution are each divided into N grades such as (1, 3, 5, 7) and stored in table T4, where geography G is graded as buildings: 1, mountains: 3, lakes: 5, ...; meteorology V as sunny: 1, cloudy: 3, haze: 5, sleet: 7, ...; human environment H as city: 1, suburb: 3, ...; and imaging resolution R as high resolution: 1, intermediate resolution: 3, low resolution: 5, .... The working-condition information table T4 is shown below:
Table T4 work information table
Input the leg planning table T1, mission payload planning table T2, ground imaging region landform table T3 and working-condition information table T4 into the onboard flight control computer.
Tables T1, T2, T3 and T4 can be loaded into the onboard flight control computer in the manner of table T5. The leg planning table T1 information is recorded in table T5 as waypoint number, coordinate and elevation information; the mission payload planning table T2 information is recorded in table T5 as payload planning information; the ground imaging region landform table T3 information is recorded in table T5 as SUSAN features together with their ground-feature coordinate and elevation information; finally, the working-condition information table T4 is recorded in table T5 as working-condition information.
Table T5 integrated information table
Step 2: waypoint coordinate information data acquisition and table T1 information matching.
During actual flight, when the UAV is in cruise mode and the anti-spoofing operation is required, the waypoint being flown over is detected and identified. This specifically includes the following aspects:
(1) Waypoint detection
The ground imaging region corresponding to the waypoint is detected with the SUSAN corner detection method, yielding α corner features.
Because the SUSAN corner detection method is algorithmically simple, localizes corners accurately and has strong noise resistance, the invention uses this method to detect waypoint landmarks.
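For illustration, a minimal (unoptimized) SUSAN corner-response sketch in Python is given below; the mask radius, brightness threshold and geometric threshold are assumed parameter values, not those of the patent.

```python
# Minimal SUSAN corner-response sketch (illustrative; parameter values are assumptions).
import numpy as np

def susan_corner_response(gray, radius=3, t=27.0, g_frac=0.5):
    """gray: 2-D float array. Returns a response map whose local maxima are corner candidates."""
    h, w = gray.shape
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    offsets = np.argwhere(ys**2 + xs**2 <= radius**2) - radius   # circular mask offsets
    g = g_frac * len(offsets)                                    # geometric threshold
    resp = np.zeros_like(gray, dtype=float)
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            nucleus = gray[y, x]
            # USAN area: mask pixels with brightness similar to the nucleus
            usan = sum(np.exp(-((gray[y + dy, x + dx] - nucleus) / t) ** 6)
                       for dy, dx in offsets)
            resp[y, x] = max(g - usan, 0.0)                      # small USAN -> corner response
    return resp
```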
The coordinate information and elevation information of the waypoint are obtained. The coordinate information includes the longitude and latitude of the waypoint and is denoted P_Way-point(longitude, latitude); the elevation information includes the altitude of the waypoint and is denoted A_Altitude.
(2) Waypoint matching
The ground imaging region landform table T3 is accessed to obtain the β corner features of the corresponding waypoint. The α corner features obtained in step (1) are matched against the β corner features; the number of successfully matched corners is γ. If γ satisfies the preset corner-matching condition, the match is successful and M0 = 1; otherwise the match fails and M0 = 0.
At the same time, the leg planning table T1 is accessed to obtain the coordinate information and elevation information of the corresponding waypoint. The coordinate information includes the longitude and latitude of the waypoint and is denoted P'_Way-point(longitude, latitude); the elevation information includes the altitude of the waypoint and is denoted A'_Altitude. The P_Way-point(longitude, latitude) and A_Altitude obtained in step (1) are matched against P'_Way-point(longitude, latitude) and A'_Altitude by computing their differences, giving the results M1 and M2:
M_P = | P_Way-point(longitude, latitude) - P'_Way-point(longitude, latitude) |
M_A = | A_Altitude - A'_Altitude |
M_P represents the difference of the waypoint coordinates and M_A represents the difference of the waypoint altitude. If the differences of both longitude and latitude are within 0.1, M1 = 1, otherwise M1 = 0; at the same time, if the altitude difference is within 10 metres, M2 = 1, otherwise M2 = 0.
Step 3: extract ground-feature corner features and position/elevation information in flight, and match the mission payload planning information with the landform information.
Upon arriving at the specified waypoint, in combination with prior image information or a geographic information system, the UAV camera angle is adjusted according to the UAV parameters of the corresponding waypoint in mission payload planning table T2, and multi-source data are collected.
The multi-source data include:
A. Image information: the CCD camera attitude is adjusted according to mission payload planning table T2, obtaining a high-definition aerial image of the landform corresponding to the waypoint;
B. Navigation information: the integrated navigation data (mainly GPS) are used to obtain real-time landform position data;
C. Elevation information: laser ranging is used to obtain multi-point elevation information.
Multi-source data matching is then carried out, including:
(1) Landform information matching
As shown in Fig. 2, because UAV flight is affected by factors such as weather, temperature and humidity, the current aerial image differs somewhat from the onboard landform database. First, the aerial image is pre-processed, including median-filter denoising and grayscale enhancement. Second, the N corner features of the aerial image are computed. Finally, the onboard landform database is accessed, several corner features of the current position are extracted and matched against the N corner features, and the number of matched feature points M between the landform image data and the aerial image data is computed. If M satisfies the preset feature-matching condition, the match is considered successful and M3 = 1; otherwise M3 = 0.
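A minimal sketch of this preprocessing step using OpenCV is shown below; the kernel size and the use of histogram equalization as the grayscale enhancement are assumptions.

```python
# Minimal sketch of the aerial-image preprocessing (median-filter denoising plus
# grayscale enhancement); kernel size and equalization choice are assumptions.
import cv2

def preprocess_aerial_image(bgr_image):
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)  # convert to grayscale
    denoised = cv2.medianBlur(gray, 5)                   # median filtering to suppress noise
    enhanced = cv2.equalizeHist(denoised)                # grayscale (contrast) enhancement
    return enhanced
```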
(2) Position and elevation data matching
First, the landform image coordinate information and multi-point elevation information obtained from the collected data are matched as described in step 2 (2). The landform coordinate information includes the longitude and latitude of the landform and is denoted P'_Object(longitude, latitude); the onboard current landform coordinate information is denoted P_Object(longitude, latitude). If the difference M_O is within the range of 0.1, M5 = 1 is returned; otherwise M5 = 0. For the elevation information matching, the mean and variance of the collected multi-point elevations are calculated, denoted α1 and β1 respectively; the onboard elevation database is then accessed, the elevation data of the points corresponding to the current position are extracted, and their mean and variance are calculated, denoted α2 and β2:
M_O = | P_Object(longitude, latitude) - P'_Object(longitude, latitude) |
M4 = |α1 - α2| + |β1 - β2|
Step 4: calculate the weight coefficient and matching threshold from the working-condition information table T4.
Under different working conditions, image feature extraction and matching results differ; for example, when the meteorological conditions are poor, fewer image corner features are extracted and the matching result is unsatisfactory. Therefore, according to working-condition information table T4, different feature-matching weight coefficients λ and matching thresholds V are assigned: λ is directly proportional to the working-condition grades, and V is inversely related to the grades:
λ = 2 * (G + V + H + R)
V = 2 / (G + V + H + R) + 10
Step 5: weighted calculation and judgment of whether spoofing is present.
The UAV flight altitude and the landform environment of the current position change with factors such as shooting time and weather, so a certain error is allowed in the position matching. Summarizing the collected data, and as verified by experiments, the following formula is chosen: when P is greater than the threshold V, the UAV is considered not to be spoofed; otherwise it is considered to be spoofed and emergency measures are started.
P = λ * M0 * M1 * M2 * M3 * M5 / M4
Aiming at the practical application requirements of UAV anti-spoofing, the present invention proposes an autonomous anti-spoofing judgment scheme for UAV positioning information that does not depend on the data link: based on onboard equipment measurement data and the prior data carried on board, autonomous position recognition is carried out to judge whether the UAV is being spoofed.

Claims (2)

1. An autonomous anti-spoofing judgment method for UAV positioning information during link failure, comprising the following steps:
Step 1: pre-flight leg information planning;
Let the starting point of the leg be the "from" point and the arrival point be the "to" point; according to the route-planning requirements, N waypoints are arranged along the leg in the air, with N greater than or equal to 2;
Build leg planning table T1: it contains the waypoint number, longitude, latitude and altitude of each waypoint;
Build mission payload planning table T2: it contains the platform azimuth, platform pitch angle and camera focal length of the UAV when taking pictures;
Build ground imaging region landform table T3: it contains the longitude, latitude and elevation of the ground features in the imaging regions of the leg between waypoints, together with their SUSAN corner features;
Build working-condition information table T4: it contains geography G, meteorology V, human environment H and imaging resolution R;
Input the leg planning table T1, mission payload planning table T2, ground imaging region landform table T3 and working-condition information table T4 into the onboard flight control computer;
Step 2: waypoint coordinate information data acquisition and table T1 information matching;
When the UAV anti-spoofing operation is required, the waypoint being flown over is detected and identified, specifically including the following steps:
(1) Waypoint detection
The ground imaging region corresponding to the waypoint is detected with the SUSAN corner detection method, yielding α corner features;
The coordinate information and elevation information of the waypoint are obtained, wherein the coordinate information includes the longitude and latitude of the waypoint and is denoted P_Way-point(longitude, latitude), and the elevation information includes the altitude of the waypoint and is denoted A_Altitude;
(2) Waypoint matching
The ground imaging region landform table T3 is accessed to obtain the β corner features of the corresponding waypoint; the α corner features obtained in step (1) are matched against the β corner features, the number of successfully matched corners being γ; if γ satisfies the preset corner-matching condition, the match is successful and M0 = 1; otherwise the match fails and M0 = 0;
At the same time, the leg planning table T1 is accessed to obtain the coordinate information and elevation information of the corresponding waypoint, wherein the coordinate information includes the longitude and latitude of the waypoint and is denoted P'_Way-point(longitude, latitude), and the elevation information includes the altitude of the waypoint and is denoted A'_Altitude; the P_Way-point(longitude, latitude) and A_Altitude obtained in step (1) are matched against P'_Way-point(longitude, latitude) and A'_Altitude by computing their differences, giving the results M1 and M2:
M_P = | P_Way-point(longitude, latitude) - P'_Way-point(longitude, latitude) |
M_A = | A_Altitude - A'_Altitude |
M_P represents the difference of the waypoint coordinates and M_A represents the difference of the waypoint altitude; if the differences of both longitude and latitude are within 0.1, M1 = 1, otherwise M1 = 0; if the altitude difference is within 10 metres, M2 = 1, otherwise M2 = 0;
Step 3: extract ground-feature corner features and position/elevation information in flight, and match the mission payload planning information with the landform information;
Upon arriving at the specified waypoint, in combination with prior image information or a geographic information system, the UAV camera angle is adjusted according to the UAV parameters of the corresponding waypoint in mission payload planning table T2, and multi-source data are collected;
The multi-source data include:
A. Image information: the CCD camera attitude is adjusted according to mission payload planning table T2, obtaining a high-definition aerial image of the landform corresponding to the waypoint;
B. Navigation information: the integrated navigation data are used to obtain real-time landform position data;
C. Elevation information: laser ranging is used to obtain multi-point elevation information;
Multi-source data matching is then carried out, including:
(1) Landform information matching
First, the aerial image is pre-processed; second, the N corner features of the aerial image are computed; finally, the onboard landform database is accessed, several corner features of the current position are extracted and matched against the N corner features, and the number of matched feature points M between the landform image data and the aerial image data is computed; if M satisfies the preset feature-matching condition, the match is considered successful and M3 = 1, otherwise M3 = 0;
(2) Position and elevation data matching
First, landform image coordinate information and multi-point elevation information are obtained from the collected data, wherein the landform coordinate information includes the longitude and latitude of the landform and is denoted P'_Object(longitude, latitude), and the onboard current landform coordinate information is denoted P_Object(longitude, latitude); if the difference M_O is within the predetermined range, M5 = 1 is returned, otherwise M5 = 0:
M_O = | P_Object(longitude, latitude) - P'_Object(longitude, latitude) |
The elevation information is matched by calculating the mean and variance of the collected multi-point elevations, denoted α1 and β1 respectively; the onboard elevation database is then accessed, the elevation data of the points corresponding to the current position are extracted, and their mean and variance are calculated, denoted α2 and β2:
M4 = |α1 - α2| + |β1 - β2|
Step 4: calculate the weight coefficient and matching threshold from the working-condition information table T4;
According to working-condition information table T4, different feature-matching weight coefficients λ and matching thresholds V are assigned:
λ = 2 * (G + V + H + R)
V = 2 / (G + V + H + R) + 10
Step 5: weighted calculation;
When P is greater than the threshold V, the UAV is judged not to be spoofed; otherwise, the UAV is judged to be spoofed:
P = λ * M0 * M1 * M2 * M3 * M5 / M4
2. The autonomous anti-spoofing judgment method for UAV positioning information during link failure according to claim 1, wherein the predetermined range is less than or equal to 0.1.
CN201410103184.4A 2014-03-19 2014-03-19 Automatic anti-cheating judgment method for unmanned aerial vehicle positioning information in link failure process Active CN103852077B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410103184.4A CN103852077B (en) 2014-03-19 2014-03-19 Automatic anti-cheating judgment method for unmanned aerial vehicle positioning information in link failure process

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410103184.4A CN103852077B (en) 2014-03-19 2014-03-19 Automatic anti-cheating judgment method for unmanned aerial vehicle positioning information in link failure process

Publications (2)

Publication Number Publication Date
CN103852077A CN103852077A (en) 2014-06-11
CN103852077B true CN103852077B (en) 2017-01-11

Family

ID=50860018

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410103184.4A Active CN103852077B (en) 2014-03-19 2014-03-19 Automatic anti-cheating judgment method for unmanned aerial vehicle positioning information in link failure process

Country Status (1)

Country Link
CN (1) CN103852077B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104713548A (en) * 2015-03-30 2015-06-17 成都好飞机器人科技有限公司 Automatic positioning method for air vehicle
EP3152089A4 (en) 2015-03-31 2017-08-02 SZ DJI Technology Co., Ltd. Systems and methods for geo-fencing device communications
JP6467520B2 (en) * 2015-03-31 2019-02-13 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Method and system for issuing an alarm
CN107409051B (en) 2015-03-31 2021-02-26 深圳市大疆创新科技有限公司 Authentication system and method for generating flight controls
CN105242285B (en) * 2015-10-15 2017-11-07 北京航空航天大学 A kind of Navigation of Pilotless Aircraft data based on satellite communication are spoofed recognition methods
CN105353342B (en) * 2015-10-27 2018-01-12 西安电子科技大学 A kind of spoofing attack method of FM signal indoor locating system
US10671091B2 (en) * 2017-03-03 2020-06-02 Alpine Electronics, Inc. Flight control device and flight control method for unmanned aerial vehicle
CN108521803A (en) * 2017-03-15 2018-09-11 深圳市大疆创新科技有限公司 Unmanned vehicle destination planing method, system, electronic equipment and storage medium
CN107314772B (en) * 2017-07-25 2020-05-15 哈尔滨工业大学(威海) Unmanned aerial vehicle self-learning waypoint track flight method and system thereof
CN107861135B (en) * 2017-10-26 2021-06-11 国家电网公司 Unmanned aerial vehicle satellite navigation deception detection method for power inspection
CN107976690B (en) * 2017-11-17 2021-04-30 西安电子科技大学 Multi-unmanned aerial vehicle GPS deception resisting method in intelligent cooperative environment
CN110376613B (en) * 2019-07-11 2022-12-23 西安爱生技术集团公司 GPS deception jamming detection method based on unmanned aerial vehicle magnetic heading
CN113033508B (en) * 2021-05-20 2021-08-10 成都奥伦达科技有限公司 Point cloud-based fine routing inspection waypoint rapid generation method
CN113985923B (en) * 2021-12-27 2022-03-22 山东天空之眼智能科技有限公司 Unmanned aerial vehicle automatic capturing method and automatic capturing system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102426019A (en) * 2011-08-25 2012-04-25 航天恒星科技有限公司 Unmanned aerial vehicle scene matching auxiliary navigation method and system
CN102435188A (en) * 2011-09-15 2012-05-02 南京航空航天大学 Monocular vision/inertia autonomous navigation method for indoor environment
US8315794B1 (en) * 2006-09-05 2012-11-20 Honeywell International Inc. Method and system for GPS-denied navigation of unmanned aerial vehicles
CN103175524A (en) * 2013-02-20 2013-06-26 清华大学 Visual-sense-based aircraft position and attitude determination method under mark-free environment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7725257B2 (en) * 2006-09-05 2010-05-25 Honeywell International Inc. Method and system for navigation of an ummanned aerial vehicle in an urban environment

Also Published As

Publication number Publication date
CN103852077A (en) 2014-06-11

Similar Documents

Publication Publication Date Title
CN103852077B (en) Automatic anti-cheating judgment method for unmanned aerial vehicle positioning information in link failure process
CN103822635B (en) The unmanned plane during flying spatial location real-time computing technique of view-based access control model information
CN105865454B (en) A kind of Navigation of Pilotless Aircraft method generated based on real-time online map
Samad et al. The potential of Unmanned Aerial Vehicle (UAV) for civilian and mapping application
CN102353377B (en) High altitude long endurance unmanned aerial vehicle integrated navigation system and navigating and positioning method thereof
CN107316012B (en) Fire detection and tracking method of small unmanned helicopter
CN101201248B (en) Aviation close range photography displacement measurement system based on unmanned aerial vehicle as well as measurement method thereof
CN113359810A (en) Unmanned aerial vehicle landing area identification method based on multiple sensors
CN107247458A (en) UAV Video image object alignment system, localization method and cloud platform control method
US20140297068A1 (en) Identification and analysis of aircraft landing sites
CN111829963B (en) Distributed remote sensing satellite system
CN108475442A (en) Augmented reality method, processor and unmanned plane for unmanned plane
KR101771492B1 (en) Method and system for mapping using UAV and multi-sensor
Revuelto et al. Light and shadow in mapping alpine snowpack with unmanned aerial vehicles in the absence of ground control points
CN114004977A (en) Aerial photography data target positioning method and system based on deep learning
CN105956081A (en) Ground station map updating method and apparatus
CN110244381A (en) Unmanned plane geological exploration methods, devices and systems
Li et al. Unmanned Aerial Vehicle Remote Sensing for Antarctic Research: A review of progress, current applications, and future use cases
Paulin et al. Application of raycast method for person geolocalization and distance determination using UAV images in Real-World land search and rescue scenarios
Starek et al. Application of unmanned aircraft systems for coastal mapping and resiliency
Rauhala et al. Measuring the spatiotemporal variability in snow depth in subarctic environments using UASs–Part 1: Measurements, processing, and accuracy assessment
Lu et al. Multi-information Based Safe Area Step Selection Algorithm for UAV's Emergency Forced Landing.
CN108983820B (en) Ecological protection red line area boundary exploration method and system
Funk et al. A passive cloud detection system for UAV: Analysis of issues, impacts and solutions
CN115574816B (en) Bionic vision multi-source information intelligent perception unmanned platform

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20170421

Address after: 100191 Beijing, Xueyuan Road, No. 37, inside the

Patentee after: Beijing northern sky long hawk UAV Technology Co. Ltd.

Address before: 100191 Haidian District, Xueyuan Road, No. 37,

Patentee before: Beihang University

TR01 Transfer of patent right