CN107783545A - Obstacle avoidance system for a post-disaster rescue rotor unmanned aerial vehicle based on OODA-loop multi-sensor information fusion - Google Patents


Info

Publication number
CN107783545A
CN107783545A
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
sensor
height
obstacle
Prior art date
Legal status
Granted
Application number
CN201610726244.7A
Other languages
Chinese (zh)
Other versions
CN107783545B (en)
Inventor
田雨农
王鑫照
Current Assignee
Dalian Roiland Technology Co Ltd
Original Assignee
Dalian Roiland Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Dalian Roiland Technology Co Ltd filed Critical Dalian Roiland Technology Co Ltd
Priority to CN201610726244.7A priority Critical patent/CN107783545B/en
Publication of CN107783545A publication Critical patent/CN107783545A/en
Application granted granted Critical
Publication of CN107783545B publication Critical patent/CN107783545B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 — Simultaneous control of position or course in three dimensions
    • G05D1/101 — Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01D — MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 — Measuring or testing not otherwise provided for
    • G01D21/02 — Measuring two or more variables by means not covered by a single other subclass

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An obstacle avoidance system for a post-disaster rescue rotor unmanned aerial vehicle based on OODA-loop multi-sensor information fusion, comprising a data fusion layer, a feature layer, a decision layer, and a detection device. The detection device includes: a radar height sensor, which measures the vertical distance from the unmanned aerial vehicle to the ground; a GPS/Beidou positioning sensor, which performs real-time positioning to realize fixed-point hovering and measures the unmanned aerial vehicle's height and relative speed; an AHRS module, which collects the flight attitude and navigation information of the unmanned aerial vehicle; a millimeter-wave radar sensor, for long-range measurement of the distance and relative speed between obstacle and unmanned aerial vehicle; an ultrasonic radar sensor, for short-range measurement of the obstacle distance; and a binocular vision sensor, which measures the size and shape of obstacles. By fusing the varied information obtained from multiple sensors, the application recognizes the environment and obstacle features more comprehensively and correctly.

Description

Obstacle avoidance system of post-disaster rescue rotor unmanned aerial vehicle based on OODA (observe-orient-decide-act) loop multi-sensor information fusion
Technical Field
The invention belongs to the technical field of unmanned aerial vehicle obstacle avoidance, and particularly relates to a post-disaster rescue rotor unmanned aerial vehicle obstacle avoidance system and method.
Background
In recent years, the unmanned aerial vehicle technology has rapidly become a new hotspot for research and development at home and abroad, and the unmanned aerial vehicle has the characteristics of high maneuverability, flexible operation, low cost, real-time image transmission, high resolution and the like, so that the unmanned aerial vehicle is applied to various social fields, such as disaster rescue, electric power inspection, forestry fire prevention, agricultural spraying, vegetation protection, aerial photography and the like.
In post-disaster rescue scenes, unmanned aerial vehicle technology has been adopted gradually because traditional means face many limitations. After a disaster occurs, when the environment is severe, scene conditions cannot be learned in time, and rescue is urgent, a post-disaster rescue unmanned aerial vehicle can observe and intervene in the rescue scene from the air by the fastest and most convenient means. The unmanned aerial vehicle shoots and records the post-disaster site through a high-definition camera, then transmits back the collected real-time aerial data. Applying unmanned aerial vehicles to post-disaster rescue avoids risks to the safety of flight personnel and lets rescuers view disaster-site conditions at the first opportunity and arrange post-disaster rescue work accordingly.
Because the post-disaster scene environment is complex and unknown, an unmanned aerial vehicle that collides while shooting and recording after a disaster is damaged, and the quick understanding of the scene disaster-relief situation is delayed; therefore, the flight safety of the unmanned aerial vehicle must be guaranteed during post-disaster rescue.
Disclosure of Invention
The invention provides an obstacle avoidance system for a post-disaster rescue rotor unmanned aerial vehicle based on OODA-loop multi-sensor information fusion, addressing the damage caused when an unmanned aerial vehicle in a complex post-disaster environment cannot avoid obstacles in time; the system prevents the unmanned aerial vehicle from colliding in real time, thereby ensuring its safety.
The invention provides a post-disaster rescue rotor unmanned aerial vehicle obstacle avoidance system based on OODA ring multi-sensor information fusion, which comprises: the system comprises a data fusion layer, a characteristic layer, a decision layer and a detection device;
the detection device comprises:
the radar height sensor is used for measuring the vertical distance from the unmanned aerial vehicle to the ground;
the GPS/Beidou positioning sensor is used for positioning in real time so as to realize fixed-point hovering of the unmanned aerial vehicle and realize measurement of the height of the unmanned aerial vehicle and measurement of the relative speed of the unmanned aerial vehicle;
the AHRS module is used for acquiring the flight attitude and navigation information of the unmanned aerial vehicle;
the millimeter wave radar sensor is used for realizing the remote measurement from the obstacle to the unmanned aerial vehicle and the measurement of the relative speed;
the ultrasonic radar sensor is used for realizing the close range measurement from the obstacle to the unmanned aerial vehicle;
a binocular vision sensor measuring the size and shape of the obstacle;
the main controller is used for controlling the unmanned aerial vehicle to complete obstacle avoidance action by analyzing the data obtained by each sensor;
the main controller is respectively connected with the radar height sensor, the GPS/Beidou positioning sensor, the AHRS module, the millimeter wave radar sensor, the ultrasonic radar sensor and the binocular vision sensor.
Further, there are four millimeter-wave radar sensors, four ultrasonic radar sensors, and four binocular vision sensors, installed on the front, rear, left, and right faces of the unmanned aerial vehicle respectively.
Further, a millimeter wave radar sensor, comprising:
the antenna module transmits signals to the space region in each direction and receives the scattered echo signals of the obstacles in the space region;
the radio frequency front-end module is used for realizing the transmission and receiving processing of signals;
and the baseband processing module analyzes the relative distance, the relative speed and the azimuth angle of the front target obstacle and sends the analyzed result to the main controller.
Furthermore, the ultrasonic radar sensor measures the distance within the range of 0-10 m, and the millimeter wave radar sensor measures the distance within the range of 1-50 m.
Further, the AHRS module includes a MEMS three-axis gyroscope, accelerometer, and magnetometer, and outputs three-dimensional acceleration, three-dimensional angular velocity, and three-dimensional geomagnetic field intensity data.
Furthermore, the data fusion layer processes the data collected by each sensor:
1) The millimeter-wave radar sensor outputs the relative distance R1 between the unmanned aerial vehicle and the obstacle, the relative speed V1, and the angle between the obstacle and the radar normal, comprising an azimuth angle θ1 and a pitch angle ψ1;
2) The ultrasonic radar sensor outputs the relative distance R2 between the unmanned aerial vehicle and the obstacle;
3) The binocular vision sensor outputs an object area S, an azimuth angle θ2, and a relative distance R3;
4) The radar height sensor outputs the height value R4 of the unmanned aerial vehicle above the ground;
5) The GPS/Beidou positioning sensor outputs the time T, the positioning state (A = positioned, V = not positioned), north latitude N or south latitude S, east longitude E or west longitude W, and the unmanned aerial vehicle speed V2;
6) The AHRS module outputs three-dimensional acceleration A_x, A_y, A_z, three-dimensional angular velocity w_x, w_y, w_z, and three-dimensional geomagnetic field intensity m_x, m_y, m_z, from which the current attitude data of the unmanned aerial vehicle, namely the flight azimuth angle θ3, the pitch angle ψ2, and the roll angle, are calculated.
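Gathered into one structure, the per-sensor outputs above look like the following sketch (field names and types are illustrative assumptions, not from the patent):

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class SensorFrame:
    """One synchronized set of raw observations for the data fusion layer."""
    R1: float          # mmwave radar: relative distance to obstacle (m)
    V1: float          # mmwave radar: relative speed (m/s)
    theta1: float      # mmwave radar: azimuth vs. radar normal (deg)
    psi1: float        # mmwave radar: pitch vs. radar normal (deg)
    R2: float          # ultrasonic radar: relative distance (m)
    S: float           # binocular vision: object area
    theta2: float      # binocular vision: azimuth (deg)
    R3: float          # binocular vision: relative distance (m)
    R4: float          # radar height sensor: height above ground (m)
    T: str             # GPS/Beidou: UTC time
    fix: str           # GPS/Beidou: 'A' = positioned, 'V' = not positioned
    V2: float          # GPS/Beidou: speed
    accel: Tuple[float, float, float]  # AHRS A_x, A_y, A_z
    gyro: Tuple[float, float, float]   # AHRS w_x, w_y, w_z
    mag: Tuple[float, float, float]    # AHRS m_x, m_y, m_z
```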
Furthermore, the feature layer performs data fusion of the relative distance between the unmanned aerial vehicle and the obstacle, of the relative height between the unmanned aerial vehicle and the ground, and of the relative speed between the unmanned aerial vehicle and the obstacle, together with obstacle attribute features such as size and shape:
the data fusion of the relative distance between the unmanned aerial vehicle and the obstacle is processed according to the distance range:
A. Within the 0–10 m range, the ultrasonic radar sensor, the binocular vision sensor, and the millimeter-wave radar sensor detect; α and β weight values are introduced to weight-average the three sensors, and Kalman data fusion is performed on the weighted, fused data;
B. Within the 10–20 m range, an α weight value is introduced to weight-average the two sensors, namely the binocular vision sensor and the millimeter-wave radar sensor, and Kalman data fusion is performed on the weighted, fused data;
C. Within the 20–50 m range, Kalman data fusion is performed on the millimeter-wave radar sensor data;
as shown in FIG. 3.
Furthermore, the data fusion of the relative height of the unmanned aerial vehicle and the ground is to perform data fusion on the height value of the unmanned aerial vehicle acquired by the radar height sensor and the GPS/Beidou positioning sensor; the data fusion of the height value is divided into two types according to the distance;
for the range with the height less than 100m, detecting the height of the unmanned aerial vehicle by adopting a radar height sensor and a GPS/Beidou positioning sensor, performing weighted average on the detected result, namely introducing alpha values to perform weighted average processing on the height values of the two sensors, and performing data fusion on the height values by adopting kalman after the processing; correcting the height value according to AHRS attitude data:
for the height more than 100m, a GPS/Beidou positioning sensor is adopted to directly perform kalman data fusion on the acquired height data, and then AHRS attitude data is adopted to perform height correction;
wherein H1 is the height acquired by the radar height sensor, and H2 is the height acquired by the GPS/Beidou positioning sensor; and simultaneously, the obstacle is identified according to the characteristics of the binocular vision sensor, such as the size, the shape and the like of the obstacle.
As a further step, the height correction by using the AHRS attitude and heading data specifically comprises:
ψ2 is the pitch angle and φ is the roll angle; H is the measured height, and H′ is the corrected height value.
As a further step, the data fusion of the relative speed of the unmanned aerial vehicle and the obstacle uses the millimeter-wave radar sensor and the GPS/Beidou positioning sensor; the speed data of the two sensors are combined by weighted average, that is,
V = α × V1 + (1 − α) × V2, where α is the weight ratio of the two sensors. Kalman data fusion is performed on the weighted speed data, and obstacle identification is carried out from the binocular vision sensor's measurements of obstacle size, shape, and similar features.
As a further step, the decision layer performs obstacle avoidance by the following steps:
p1, firstly, judging the relative distance between the unmanned aerial vehicle and the obstacle, and dividing the relative distance into three parts, namely three distance ranges of less than N1m, from N1m to N2m and from N2m to N3 m;
p2, after the distance division is finished, the danger grade division is carried out according to the relative speed of the unmanned aerial vehicle and the barrier:
when the distance is less than N1 m: if the speed is greater than M1 m/s and the early-warning time is less than Q s, it belongs to the danger level; if the speed is less than M1 m/s, it belongs to the alarm level;
when N1 m ≤ R < N2 m: if the speed is greater than M2 m/s, it is the danger level; if M1 m/s ≤ V < M2 m/s, it is the alarm level; if the speed is less than M1 m/s, it is the prompt level;
when N2 m ≤ R < N3 m: if the speed is greater than M3 m/s, it is the danger level; if M2 m/s ≤ V < M3 m/s, it is the alarm level; if M1 m/s ≤ V < M2 m/s, it is the prompt level; if the speed is less than M1 m/s, it is the irrelevant level;
P3: the height value of the unmanned aerial vehicle above the ground is then judged, and the height value H is divided into four grades;
when the height is less than X1 m, the obstacles include walls, trees, and people; obstacle attributes are identified by the binocular vision sensor to distinguish walls, trees, and people, and after an emergency hover is selected the vehicle climbs upward; through the climbing process people can be completely avoided, while the heights of trees and walls need further judgment;
when X1 m ≤ H < X2 m, the obstacles include walls and trees; obstacle attributes are identified by the binocular vision sensor to distinguish walls and trees, and after an emergency hover is selected the vehicle climbs upward; through the climbing process trees can be completely avoided, but the height of walls needs further judgment;
when X2 m ≤ H < X3 m, the obstacles at this height include walls and high-voltage lines; obstacle attributes are identified by the binocular vision sensor to distinguish walls and high-voltage lines, and after an emergency hover is selected the vehicle climbs upward; through the climbing process high-voltage lines can be completely avoided, but the height of walls needs continued judgment;
when H ≥ X3 m, obstacle attributes are identified by the binocular vision sensor at this height, and if the obstacle is confirmed to be a wall surface, an emergency hover is selected and the vehicle turns back to avoid it;
p4: for the danger level, the operation of step P3 needs to be performed; for the warning grade, carrying out the operation of the step P3 after the emergency deceleration is needed; and returning to detect again without judging the third step for the prompt level and the irrelevant level.
With this technical scheme, the invention obtains the following technical effects: by using the variety of information obtained from multiple sensors, the environment and obstacle features can be recognized more comprehensively and correctly, overcoming the false-alarm risk that a single sensor brings to the system; meanwhile, the multiple sensors complement one another in acquiring environment and obstacle features, making up for each sensor's shortcomings and completing the obstacle avoidance task better. The multi-sensor information fusion technology processes and analyzes the data gathered by each sensor, giving the main controller a better basis for judgment in safely avoiding obstacles with the post-disaster rescue unmanned aerial vehicle.
By reasonably controlling and fully using the data resources of multiple sensors, the post-disaster rescue rotor unmanned aerial vehicle system jointly processes redundant or complementary multi-sensor information in time and space to obtain a consistent description of the environment and obstacle target characteristics. The invention adopts the OODA-loop information fusion technology, namely four processing stages: observation, orientation, decision, and action.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a block diagram of the structure of the detection device according to the present application;
fig. 2 is a schematic structural diagram of an obstacle avoidance system of a post-disaster rescue rotor unmanned aerial vehicle based on OODA ring multi-sensor information fusion;
FIG. 3 is a schematic diagram of a data fusion structure of the relative distance between an unmanned aerial vehicle and an obstacle;
FIG. 4 is a schematic diagram of a data fusion structure of the relative heights of the unmanned aerial vehicle and the ground;
FIG. 5 is a schematic diagram of a data fusion structure of relative speeds of the UAV and the obstacle;
FIG. 6 is a schematic diagram of a decision layer structure;
FIG. 7 is a flow chart of a decision layer in an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the following describes the technical solutions of the embodiments of the present invention clearly and completely with reference to the accompanying drawings in the embodiments of the present invention:
the unmanned aerial vehicle that this application was directed to mainly is many rotors rescue rotor unmanned aerial vehicle after the calamity. The multi-rotor unmanned aerial vehicle generates flight power by utilizing the rotation of the main rotor to cut air, and compared with a fixed wing, the multi-rotor unmanned aerial vehicle has the characteristics of in-situ take-off, hovering in the air, low flight speed, large load, high flexibility and capability of flying at ultra-low altitude. The multi-rotor unmanned aerial vehicle can take off and land vertically without a runway, can hover in the air after taking off, and is suitable for application in a post-disaster complex environment. The control principle is simple, and four remote sensing operations of the controller correspond to the front-back, left-right, up-down and yawing movements of the aircraft. The invention adopts the information fusion technology of OODA ring, OODA is English abbreviation of observation (Oberve), adjustment (Orient), decision (Decide) and action (Act), namely four processing stages: observation, adjustment, decision and action. By combining the characteristics of the OODA ring, the OODA ring is introduced into the obstacle avoidance system of the post-disaster rescue rotor unmanned aerial vehicle.
Example 1
This embodiment provides a post-disaster rescue rotor unmanned aerial vehicle obstacle avoidance system, comprising: a data fusion layer, a feature layer, a decision layer, and a detection device;
the detection device comprises:
the radar height sensor is used for measuring the vertical distance from the unmanned aerial vehicle to the ground;
the GPS/Beidou positioning sensor is used for carrying out real-time positioning so as to realize tasks such as fixed-point hovering of the unmanned aerial vehicle and the like, and can realize measurement of the height of the unmanned aerial vehicle and measurement of the relative speed of the unmanned aerial vehicle;
the AHRS module is used for acquiring the flight attitude and navigation information of the unmanned aerial vehicle; the AHRS module comprises a three-axis gyroscope of an MEMS, an accelerometer and a magnetometer, and outputs data of three-dimensional acceleration, three-dimensional angular velocity and three-dimensional geomagnetic field intensity.
The millimeter wave radar sensor adopts a linear frequency modulation triangular wave system and is used for realizing the remote measurement from the barrier to the unmanned aerial vehicle; the millimeter wave radar sensor includes: an antenna module for forming transmitting and receiving beams required for radar detection; transmitting signals to the space region in each direction, and receiving the scattering echo signals of the obstacles in the space region; the radio frequency front-end module is used for realizing the transmission and receiving processing of signals according to the application scene and the functional requirements of the obstacle avoidance millimeter wave radar of the unmanned aerial vehicle; and the baseband processing module is used for controlling the emission modulation waveform, signal acquisition and signal processing work, analyzing the relative distance, the relative speed and the azimuth angle of the front target obstacle and sending the analyzed result to the main controller, so that the data acquisition and transmission work of the millimeter wave radar sensor on the target obstacle is completed. The millimeter wave radar sensor measures the distance within the range of 1-50 m.
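For a triangular LFM waveform, the baseband module can solve range and relative speed from the beat frequencies of the rising and falling sweeps. A minimal sketch of the standard FMCW relations (the sweep parameters and the sign convention for an approaching target are assumptions, not taken from the patent):

```python
C = 3e8  # speed of light, m/s

def fmcw_range_velocity(f_up, f_down, bandwidth, t_sweep, f_carrier):
    """Solve range/velocity from triangular-FMCW beat frequencies (Hz).

    f_up / f_down: beat frequency on the rising / falling sweep. For an
    approaching target the Doppler shift lowers f_up and raises f_down,
    so f_r = (f_up + f_down)/2 and f_d = (f_down - f_up)/2."""
    slope = bandwidth / t_sweep          # sweep slope, Hz per second
    f_r = (f_up + f_down) / 2.0          # range-induced beat
    f_d = (f_down - f_up) / 2.0          # Doppler shift
    rng = C * f_r / (2.0 * slope)        # R = c * f_r / (2 * S)
    vel = C * f_d / (2.0 * f_carrier)    # V = c * f_d / (2 * f0)
    return rng, vel
```

For example, with an assumed 150 MHz sweep over 1 ms at a 24 GHz carrier, beat frequencies of 29 520 Hz (up) and 30 480 Hz (down) correspond to a target at 30 m closing at 3 m/s.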
The ultrasonic radar sensor is used for realizing the short-distance measurement from the obstacle to the unmanned aerial vehicle, and the ultrasonic radar sensor is used for ranging within the range of 0-10 meters;
a binocular vision sensor measuring the size and shape of the obstacle;
the main controller is used for controlling the unmanned aerial vehicle to complete obstacle avoidance action by analyzing the data obtained by each sensor;
the main controller is respectively connected with the radar height sensor, the GPS/Beidou positioning sensor, the AHRS module, the millimeter wave radar sensor, the ultrasonic radar sensor and the binocular vision sensor.
Preferably, four millimeter-wave radar sensors, four ultrasonic radar sensors, and four binocular vision sensors are installed on the front, rear, left, and right faces of the unmanned aerial vehicle respectively. Because a multi-rotor unmanned aerial vehicle can fly forward, backward, left, and right, each face must be designed for collision avoidance: each face carries one millimeter-wave radar sensor for long-range measurement, one ultrasonic radar sensor for short-range measurement, and one binocular vision sensor for measuring target attributes such as direction, size, and shape. The post-disaster rescue rotor unmanned aerial vehicle designed by the invention therefore needs four millimeter-wave radar sensors, four ultrasonic sensors, and four binocular vision sensors.
Example 2
As a further limitation to example 1: the data fusion layer processes the data collected by each sensor:
1) The millimeter-wave radar sensor outputs the relative distance R1 between the unmanned aerial vehicle and the obstacle, the relative speed V1, and the angle between the obstacle and the radar normal, comprising an azimuth angle θ1 and a pitch angle ψ1;
2) The ultrasonic radar sensor outputs the relative distance R2 between the unmanned aerial vehicle and the obstacle;
3) The binocular vision sensor outputs an object area S, an azimuth angle theta 2 and a relative distance R3;
4) The radar height sensor outputs a height value R4 of the unmanned aerial vehicle and the ground;
5) The GPS/Beidou positioning sensor is mainly used for acquiring the altitude H2 and the horizontal navigational speed V2 of the unmanned aerial vehicle;
the GPS data follows NMEA0183 protocol, and the output information is standard and has a fixed format. Among them, there are GPGGA and GPVTG statements that are closely related to drone navigation. Their data format is specified as follows:
(1) $GPGGA,UTC time,latitude,latitude hemisphere,longitude,longitude hemisphere,GPS fix mode,satellite count,HDOP horizontal precision factor,altitude,M,height of geoid relative to ellipsoid,M,differential age,differential station ID*hh<CR><LF>
(2) $GPVTG,ground heading based on true north,T,ground heading based on magnetic north,M,ground speed (knots),N,ground speed (kilometers per hour),K,mode indication*hh<CR><LF>
Through extracting the altitude data of the specific position in the GPGGA statement, the altitude H2 of the unmanned aerial vehicle can be obtained, and through extracting the ground speed (kilometer per hour) of the specific position in the GPVTG statement, the horizontal navigational speed V2 of the unmanned aerial vehicle can be obtained.
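The field extraction described above can be sketched as follows (the sample sentences are generic NMEA 0183 examples, not from the patent; checksum validation is omitted):

```python
def parse_altitude_and_speed(gpgga: str, gpvtg: str):
    """Pull drone altitude H2 (m) from a GPGGA sentence and horizontal
    speed V2 (km/h) from a GPVTG sentence, per NMEA 0183 field positions."""
    gga = gpgga.split('*')[0].split(',')   # drop checksum, split fields
    vtg = gpvtg.split('*')[0].split(',')
    h2 = float(gga[9])    # GPGGA field 9: antenna altitude above mean sea level
    v2 = float(vtg[7])    # GPVTG field 7: speed over ground in km/h
    return h2, v2
```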
6) The AHRS module outputs three-dimensional acceleration A_x, A_y, A_z, three-dimensional angular velocity w_x, w_y, w_z, and three-dimensional geomagnetic field intensity m_x, m_y, m_z, from which the current attitude data of the unmanned aerial vehicle, namely the flight azimuth angle θ3, the pitch angle ψ2, and the roll angle, are calculated.
Example 3
As a supplement to embodiment 1 or 2, the feature layer performs data fusion of the relative distance between the unmanned aerial vehicle and the obstacle, data fusion of the relative height between the unmanned aerial vehicle and the ground, data fusion of the relative speed between the unmanned aerial vehicle and the obstacle, and attribute features such as the size and shape of the obstacle;
the data fusion of the relative distance between the unmanned aerial vehicle and the obstacle is processed according to the distance range:
A. Within the 0–10 m range, the ultrasonic radar sensor, the binocular vision sensor, and the millimeter-wave radar sensor all detect, but their relative accuracies differ; at close range the ultrasonic accuracy is highest. To improve the resolving accuracy, a weighted average is adopted: α and β weight values are introduced to weight-average the ultrasonic radar, binocular vision, and millimeter-wave radar readings, and Kalman data fusion is performed on the weighted, fused data;
B. Within the 10–20 m range, the distance exceeds the ranging range of the ultrasonic radar, but the vision sensor and the millimeter-wave radar can still detect; within this range a weighted average is adopted: an α weight value is introduced to weight-average the binocular vision sensor and the millimeter-wave radar sensor, and Kalman data fusion is performed on the weighted, fused data;
C. Within the 20–50 m range, the distance exceeds the ranges of the ultrasonic radar and the vision sensor, but the millimeter-wave radar can still detect; therefore, within this range, Kalman data fusion is performed directly on the millimeter-wave radar sensor data without a weighting algorithm;
as shown in FIG. 3.
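A minimal sketch of steps A–C followed by the Kalman stage; the default weights α, β, the noise parameters, and the choice to band on the millimeter-wave reading are illustrative assumptions:

```python
def fuse_distance(r_ultra, r_vision, r_mmwave, alpha=0.5, beta=0.3):
    """Range-banded weighted average of the three distance readings.
    Bands follow steps A-C; which sensor carries which weight is assumed."""
    if r_mmwave < 10.0:        # A: 0-10 m, all three sensors contribute
        return alpha * r_ultra + beta * r_vision + (1 - alpha - beta) * r_mmwave
    if r_mmwave < 20.0:        # B: 10-20 m, vision + mmwave only
        return alpha * r_vision + (1 - alpha) * r_mmwave
    return r_mmwave            # C: 20-50 m, mmwave only, straight to Kalman

class Kalman1D:
    """Scalar Kalman filter smoothing the fused distance over time."""
    def __init__(self, q=0.01, r=0.5):
        self.x = None          # state estimate
        self.p = 1.0           # estimate variance
        self.q, self.r = q, r  # process / measurement noise (assumed values)
    def update(self, z):
        if self.x is None:     # initialize on the first measurement
            self.x = z
            return self.x
        self.p += self.q                 # predict
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)       # correct
        self.p *= (1.0 - k)
        return self.x
```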
The data fusion of the relative height of the unmanned aerial vehicle and the ground fuses the height values acquired by the radar height sensor and the GPS/Beidou positioning sensor, and is divided into two cases according to height. For heights below 100 m, both the radar height sensor and the GPS/Beidou positioning sensor detect the height of the unmanned aerial vehicle; the detected results are weight-averaged, that is, an α value is introduced to weight-average the two sensors' height values, after which Kalman data fusion is applied. Because the height value of the unmanned aerial vehicle is closely related to its attitude, the height value must then be corrected according to the AHRS attitude data:
For heights above 100 m, only the height data obtained by GPS/Beidou positioning is accurate, so in this height range the GPS/Beidou positioning sensor is used: Kalman data fusion is performed directly on the acquired height data, and then the AHRS attitude data is used for height correction;
wherein H1 is the height acquired by the radar height sensor and H2 is the height acquired by the GPS/Beidou positioning sensor; meanwhile, the obstacle is identified from binocular vision features such as obstacle size and shape. The identification method may adopt artificial intelligence techniques such as pattern recognition and neural network algorithms.
Optionally, the height correction by using the AHRS attitude and heading data specifically includes:
where ψ2 is the pitch angle and the second attitude angle is the roll angle; H is the measured height, and H' is the corrected height value.
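The correction formula itself does not survive in this text. A standard radar-altimeter correction for a small UAV, assumed here for illustration, projects the body-axis height measurement onto the true vertical using the pitch angle ψ2 and the roll angle:

```python
# Assumed correction (not legible in the source): the radar measures along the
# airframe's body axis, so the vertical height is recovered by
#     H' = H * cos(psi2) * cos(phi)
# where psi2 is the pitch angle and phi the roll angle, both in radians.
import math

def correct_height(h_measured, pitch_rad, roll_rad):
    """Project a body-axis radar height onto the true vertical."""
    return h_measured * math.cos(pitch_rad) * math.cos(roll_rad)

h = correct_height(100.0, math.radians(10.0), math.radians(5.0))  # about 98.1 m
```

At level flight (zero pitch and roll) the correction leaves the measurement unchanged, which matches the intent of applying it only as an attitude-dependent adjustment.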
The data fusion of the relative speed between the unmanned aerial vehicle and the obstacle uses the millimeter wave radar sensor and the GPS/Beidou positioning sensor. The speed data of the two sensors are combined by weighted averaging, namely
V = α × V1 + (1 - α) × V2, where α is the weight ratio of the two sensors. Kalman data fusion is then performed on the weighted-average speed data to obtain more accurate relative speed data. The obstacle is also identified from features such as its size and shape acquired by the binocular vision sensor; the identification can use artificial intelligence methods such as pattern recognition or neural network algorithms.
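A minimal sketch of this speed fusion, assuming an arbitrary weight α = 0.6 and a simple scalar Kalman smoother (the patent specifies neither α nor its filter parameters):

```python
# Alpha-weighted mean of the millimeter wave radar speed (v1) and the
# GPS/Beidou speed (v2), followed by scalar Kalman smoothing.
# alpha, q and r are illustrative assumptions.

def fuse_speed(v1_series, v2_series, alpha=0.6, q=0.05, r=0.5):
    fused = [alpha * v1 + (1 - alpha) * v2 for v1, v2 in zip(v1_series, v2_series)]
    x, p = fused[0], 1.0             # initial estimate and covariance
    smoothed = []
    for z in fused:
        p += q                       # predict step
        k = p / (p + r)              # Kalman gain
        x += k * (z - x)             # update with the fused measurement
        p *= (1.0 - k)
        smoothed.append(x)
    return smoothed

v_radar = [5.1, 5.3, 4.9, 5.2]      # mmWave relative speeds (m/s)
v_gnss  = [5.0, 5.2, 5.1, 5.0]      # GPS/Beidou speeds (m/s)
v_rel = fuse_speed(v_radar, v_gnss)
```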
Example 4
As a supplement to Embodiment 1, 2, or 3, the decision layer performs obstacle avoidance through the following steps:
P1: first judge the relative distance between the unmanned aerial vehicle and the obstacle, dividing it into three ranges: less than N1 m, N1 m to N2 m, and N2 m to N3 m;
P2: after the distance division, the danger level is classified according to the relative speed between the unmanned aerial vehicle and the obstacle:
when the distance is less than 10 m: a speed greater than 3 m/s, with an early-warning time under 3 s, is the danger level; a speed below 3 m/s is the warning level;
when 10 m ≤ R < 20 m: a speed greater than 6 m/s is the danger level; 3 m/s ≤ V < 6 m/s is the warning level; a speed below 3 m/s is the prompt level;
when 20 m ≤ R < 50 m: a speed greater than 16 m/s is the danger level; 6 m/s ≤ V < 16 m/s is the warning level; 3 m/s ≤ V < 6 m/s is the prompt level; a speed below 3 m/s is the irrelevant level;
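The distance bands and speed thresholds above can be sketched as a simple classifier. Boundary handling at exactly 3, 6, and 16 m/s is ambiguous in the source; half-open intervals matching the style of claim 10 are assumed here.

```python
# Danger-level lookup for the 10/20/50 m bands and 3/6/16 m/s thresholds.
# Interval boundaries are assumed half-open (e.g. 3 <= v < 6 is "warning").

def danger_level(distance_m, speed_ms):
    if distance_m < 10:
        return "danger" if speed_ms > 3 else "warning"
    if distance_m < 20:
        if speed_ms > 6:
            return "danger"
        return "warning" if speed_ms >= 3 else "prompt"
    if distance_m < 50:
        if speed_ms > 16:
            return "danger"
        if speed_ms >= 6:
            return "warning"
        return "prompt" if speed_ms >= 3 else "irrelevant"
    return "irrelevant"   # beyond the 50 m sensing range (assumption)
```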
P3: then judge the height of the unmanned aerial vehicle above the ground, dividing the height value H into four grades;
when the height is less than X1 m, the main obstacles at this height include walls, trees, and people; the obstacle attributes are identified by the binocular vision sensor to distinguish walls, trees, and people, and the unmanned aerial vehicle then hovers immediately and climbs upward; the climb completely avoids people, but the heights of trees and walls need further judgment;
when X1 m ≤ H < X2 m, the main obstacles at this height include walls and trees; the obstacle attributes are identified by the binocular vision sensor to distinguish walls from trees, and the unmanned aerial vehicle then hovers immediately and climbs upward; the climb completely avoids trees, but the height of walls needs further judgment;
when X2 m ≤ H < X3 m, the main obstacles at this height include walls and high-voltage wires; the obstacle attributes are identified by the binocular vision sensor to distinguish walls from high-voltage wires, and the unmanned aerial vehicle then hovers immediately and climbs upward; the climb completely avoids high-voltage wires, but the height of walls needs to be judged further;
when the height H ≥ X3 m, the obstacle attributes are identified by the binocular vision sensor at this height; if the obstacle is confirmed to be a wall surface, the unmanned aerial vehicle hovers immediately and then turns back to avoid it;
P4: for the danger level, the operation of step P3 is performed; for the warning level, the unmanned aerial vehicle first decelerates sharply and then performs step P3; for the prompt and irrelevant levels, step P3 is skipped and the system returns to detection.
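The P3/P4 flow above can be sketched as a maneuver selector: the height band narrows the expected obstacle set and the binocular-vision classification picks the response. The band limits X1..X3 and the action names below are illustrative assumptions; the patent leaves them symbolic.

```python
# Maneuver selection by height band and vision-classified obstacle type.
# X1..X3 are assumed values in metres, not taken from the patent.

X1, X2, X3 = 5.0, 15.0, 30.0

def choose_maneuver(height_m, obstacle):
    """Map a vision-classified obstacle and flight height to an avoidance action."""
    if height_m >= X3:
        # Top band: only a confirmed wall forces an emergency hover and turn-back.
        return "hover_then_turn_back" if obstacle == "wall" else "continue"
    if obstacle == "person":
        return "hover_then_climb"        # climbing fully clears a person
    # Walls, trees and high-voltage wires: climb, then re-judge the obstacle height.
    return "hover_then_climb_and_rejudge"
```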
The above is only a preferred embodiment of the present invention, but the protection scope of the present invention is not limited thereto; any modification or substitution readily conceivable by a person skilled in the art within the technical scope disclosed herein, based on the technical solutions and inventive concept of the present invention, shall fall within that protection scope.

Claims (10)

1. A post-disaster rescue rotor unmanned aerial vehicle obstacle avoidance system based on OODA ring multi-sensor information fusion, characterized by comprising: a data fusion layer, a feature layer, a decision layer, and a detection device;
the detection device comprises:
the radar height sensor is used for measuring the vertical distance from the unmanned aerial vehicle to the ground;
the GPS/Beidou positioning sensor is used for positioning in real time so as to realize fixed-point hovering of the unmanned aerial vehicle and realize measurement of the height of the unmanned aerial vehicle and measurement of the relative speed of the unmanned aerial vehicle;
the AHRS module is used for acquiring the flight attitude and navigation information of the unmanned aerial vehicle;
the millimeter wave radar sensor is used for long-range measurement of the distance from the obstacle to the unmanned aerial vehicle and of their relative speed;
the ultrasonic radar sensor is used for short-range measurement of the distance from the obstacle to the unmanned aerial vehicle;
the binocular vision sensor is used for measuring the size and shape of the obstacle;
the main controller is used for controlling the unmanned aerial vehicle to complete obstacle avoidance action by analyzing the data obtained by each sensor;
the main controller is respectively connected with the radar height sensor, the GPS/Beidou positioning sensor, the AHRS module, the millimeter wave radar sensor, the ultrasonic radar sensor and the binocular vision sensor.
2. The post-disaster rescue rotor unmanned aerial vehicle obstacle avoidance system based on OODA ring multi-sensor information fusion according to claim 1, wherein there are four each of the millimeter wave radar sensors, the ultrasonic radar sensors, and the binocular vision sensors, installed respectively on the front, rear, left, and right faces of the unmanned aerial vehicle.
3. The post-disaster rescue rotor unmanned aerial vehicle obstacle avoidance system based on OODA ring multi-sensor information fusion of claim 1 or 2, characterized in that, millimeter wave radar sensor includes:
the antenna module transmits signals into the spatial region in each direction and receives the echo signals scattered by obstacles in that region; the radio frequency front-end module realizes the transmission and reception processing of the signals; the baseband processing module resolves the relative distance, relative speed, and azimuth angle of the target obstacle ahead and sends the resolved results to the main controller;
the AHRS module comprises a MEMS three-axis gyroscope, an accelerometer, and a magnetometer, and outputs three-dimensional acceleration, three-dimensional angular velocity, and three-dimensional geomagnetic field intensity data.
4. The post-disaster rescue rotor unmanned aerial vehicle obstacle avoidance system based on OODA ring multi-sensor information fusion of claim 1, characterized in that, the data fusion layer processes the data collected by each sensor:
1) The data output by the millimeter wave radar sensor are the relative distance R1 between the unmanned aerial vehicle and the obstacle, the relative speed V1, and the angles between the obstacle and the radar normal, including the azimuth angle θ1 and the pitch angle ψ1;
2) The ultrasonic radar sensor outputs the relative distance R2 between the unmanned aerial vehicle and the obstacle;
3) The binocular vision sensor outputs the object area S, the azimuth angle θ2, and the relative distance R3;
4) The radar height sensor outputs a height value H1 of the unmanned aerial vehicle and the ground;
5) The GPS/Beidou positioning sensor obtains the altitude H2 and the horizontal navigational speed V2 of the unmanned aerial vehicle;
6) The AHRS module outputs the three-dimensional acceleration A_x, A_y, A_z, the three-dimensional angular velocity w_x, w_y, w_z, and the three-dimensional geomagnetic field intensity m_x, m_y, m_z, from which the current attitude data of the unmanned aerial vehicle, namely the flight azimuth angle θ3, the pitch angle ψ2, and the roll angle, are calculated.
5. The post-disaster rescue rotor unmanned aerial vehicle obstacle avoidance system based on OODA ring multi-sensor information fusion according to claim 1, wherein the feature layer performs data fusion of the relative distance between the unmanned aerial vehicle and the obstacle, data fusion of the relative height between the unmanned aerial vehicle and the ground, data fusion of the relative speed between the unmanned aerial vehicle and the obstacle, and extraction of obstacle attribute features such as size and shape.
6. The obstacle avoidance system of the post-disaster rescue rotor unmanned aerial vehicle based on OODA ring multi-sensor information fusion as claimed in claim 5, wherein the data fusion of the relative distance between the unmanned aerial vehicle and the obstacle is processed according to the distance range:
A. within the 0 m to 10 m range, the ultrasonic radar sensor, the binocular vision sensor, and the millimeter wave radar sensor all detect; α and β weights are introduced to take the weighted average of the three sensors, and Kalman data fusion is performed on the weighted data;
B. within the 10 m to 20 m range, an α weight is introduced to take the weighted average of the two sensors, namely the binocular vision sensor and the millimeter wave radar sensor, and Kalman data fusion is performed on the weighted data;
C. within the 50 m range, Kalman data fusion is performed directly on the data of the millimeter wave radar sensor.
7. The post-disaster rescue rotor unmanned aerial vehicle obstacle avoidance system based on OODA ring multi-sensor information fusion according to claim 5, wherein the data fusion of the relative height between the unmanned aerial vehicle and the ground fuses the height values acquired by the radar height sensor and the GPS/Beidou positioning sensor, in two cases according to height: for heights below 100 m, both the radar height sensor and the GPS/Beidou positioning sensor detect the height of the unmanned aerial vehicle; an α weight is introduced to take the weighted average of the two sensors' height values, Kalman data fusion is applied to the result, and the height value is then corrected according to the AHRS attitude data;
for heights above 100 m, the GPS/Beidou positioning sensor alone is used; Kalman data fusion is applied directly to the acquired height data, and the AHRS attitude data is then used for height correction;
wherein H1 is the height acquired by the radar height sensor and H2 is the height acquired by the GPS/Beidou positioning sensor; at the same time, the obstacle is identified from features such as its size and shape acquired by the binocular vision sensor.
8. The obstacle avoidance system of the post-disaster rescue rotor unmanned aerial vehicle based on OODA ring multi-sensor information fusion as claimed in claim 5, wherein the height correction by using AHRS attitude data is specifically:
wherein ψ2 is the pitch angle and the second attitude angle is the roll angle; H is the measured height, and H' is the corrected height value.
9. The post-disaster rescue rotor unmanned aerial vehicle obstacle avoidance system based on OODA ring multi-sensor information fusion according to claim 5, wherein the data fusion of the relative speed between the unmanned aerial vehicle and the obstacle uses the millimeter wave radar sensor and the GPS/Beidou positioning sensor; the speed data of the two sensors are combined by weighted averaging, namely
V = α × V1 + (1 - α) × V2, where α is the weight ratio of the two sensors; Kalman data fusion is performed on the weighted-average speed data, and the obstacle is identified from features such as its size and shape acquired by the binocular vision sensor.
10. The obstacle avoidance system of the post-disaster rescue rotor unmanned aerial vehicle based on OODA ring multi-sensor information fusion as claimed in claim 1, wherein the decision layer completes obstacle avoidance by the following steps:
P1: first judge the relative distance between the unmanned aerial vehicle and the obstacle, dividing it into three ranges: less than N1 m, N1 m to N2 m, and N2 m to N3 m;
P2: after the distance division, the danger level is classified according to the relative speed between the unmanned aerial vehicle and the obstacle:
when the distance is less than N1 m: a speed greater than M1 m/s, with an early-warning time under Q s, is the danger level; a speed below M1 m/s is the warning level;
when N1 m ≤ R < N2 m: a speed greater than M2 m/s is the danger level; M1 m/s ≤ V < M2 m/s is the warning level; a speed below M1 m/s is the prompt level;
when N2 m ≤ R < N3 m: a speed greater than M3 m/s is the danger level; M2 m/s ≤ V < M3 m/s is the warning level; M1 m/s ≤ V < M2 m/s is the prompt level; a speed below M1 m/s is the irrelevant level;
P3: then judge the height of the unmanned aerial vehicle above the ground, dividing the height value H into four grades;
when the height is less than X1 m, the obstacles at this height include walls, trees, and people; the obstacle attributes are identified by the binocular vision sensor to distinguish walls, trees, and people, and the unmanned aerial vehicle then hovers immediately and climbs upward; the climb completely avoids people, but the heights of trees and walls need further judgment;
when X1 m ≤ H < X2 m, the obstacles at this height include walls and trees; the obstacle attributes are identified by the binocular vision sensor to distinguish walls from trees, and the unmanned aerial vehicle then hovers immediately and climbs upward; the climb completely avoids trees, but the height of walls needs further judgment;
when X2 m ≤ H < X3 m, the obstacles at this height include walls and high-voltage wires; the obstacle attributes are identified by the binocular vision sensor to distinguish walls from high-voltage wires, and the unmanned aerial vehicle then hovers immediately and climbs upward; the climb completely avoids high-voltage wires, but the height of walls needs to be judged further;
when the height H ≥ X3 m, the obstacle attributes are identified by the binocular vision sensor at this height; if the obstacle is confirmed to be a wall surface, the unmanned aerial vehicle hovers immediately and then turns back to avoid it;
P4: for the danger level, the operation of step P3 is performed; for the warning level, the unmanned aerial vehicle first decelerates sharply and then performs step P3; for the prompt and irrelevant levels, step P3 is skipped and the system returns to detection.
CN201610726244.7A 2016-08-25 2016-08-25 Obstacle avoidance system of post-disaster rescue rotor unmanned aerial vehicle based on OODA (observe-orient-decide-act) ring multi-sensor information fusion Active CN107783545B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610726244.7A CN107783545B (en) 2016-08-25 2016-08-25 Obstacle avoidance system of post-disaster rescue rotor unmanned aerial vehicle based on OODA (observe-orient-decide-act) ring multi-sensor information fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610726244.7A CN107783545B (en) 2016-08-25 2016-08-25 Obstacle avoidance system of post-disaster rescue rotor unmanned aerial vehicle based on OODA (observe-orient-decide-act) ring multi-sensor information fusion

Publications (2)

Publication Number Publication Date
CN107783545A true CN107783545A (en) 2018-03-09
CN107783545B CN107783545B (en) 2021-04-27

Family

ID=61438790

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610726244.7A Active CN107783545B (en) Obstacle avoidance system of post-disaster rescue rotor unmanned aerial vehicle based on OODA (observe-orient-decide-act) ring multi-sensor information fusion

Country Status (1)

Country Link
CN (1) CN107783545B (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080243383A1 (en) * 2006-12-12 2008-10-02 Ching-Fang Lin Integrated collision avoidance enhanced GN&C system for air vehicle
CN103135550A (en) * 2013-01-31 2013-06-05 南京航空航天大学 Multiple obstacle-avoidance control method of unmanned plane used for electric wire inspection
CN104567799A (en) * 2014-11-28 2015-04-29 天津大学 Multi-sensor information fusion-based method for measuring height of small unmanned gyroplane
CN104615142A (en) * 2014-12-19 2015-05-13 重庆大学 Flight controller for civil small UAV (Unmanned Aerial Vehicle)
CN105136145A (en) * 2015-08-11 2015-12-09 哈尔滨工业大学 Kalman filtering based quadrotor unmanned aerial vehicle attitude data fusion method
CN205067739U (en) * 2015-09-23 2016-03-02 北京行易道科技有限公司 Small -size millimeter wave radar sensor device
CN105892489A (en) * 2016-05-24 2016-08-24 国网山东省电力公司电力科学研究院 Multi-sensor fusion-based autonomous obstacle avoidance unmanned aerial vehicle system and control method


Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109131279A (en) * 2018-09-05 2019-01-04 深圳普思英察科技有限公司 A kind of autonomous vendors' cart and its barrier-avoiding method
CN109597410A (en) * 2018-11-28 2019-04-09 北京航天自动控制研究所 A kind of the movable detection platform and autonomous method for searching of autonomous path planning
CN109657928A (en) * 2018-11-28 2019-04-19 大连理工大学 A kind of the closed loop cooperative scheduling frame and dispatching method of onboard sensor system
CN109657928B (en) * 2018-11-28 2022-10-04 大连理工大学 Cooperative scheduling method of closed-loop cooperative scheduling framework of vehicle-mounted sensor system
CN109828274A (en) * 2019-01-07 2019-05-31 深圳市道通智能航空技术有限公司 Adjust the method, apparatus and unmanned plane of the main detection direction of airborne radar
CN110034961B (en) * 2019-04-11 2022-02-15 重庆邮电大学 Seepage rate calculation method taking OODA chain as element
CN110034961A (en) * 2019-04-11 2019-07-19 重庆邮电大学 It take OODA chain as the infiltration rate calculation method of first body
CN112105949A (en) * 2019-10-31 2020-12-18 深圳市大疆创新科技有限公司 Movable platform
WO2021217334A1 (en) * 2020-04-27 2021-11-04 深圳市大疆创新科技有限公司 Moveable platform, and obstacle avoidance method and device therefor
CN111638514A (en) * 2020-06-19 2020-09-08 四川陆垚控制技术有限公司 Unmanned aerial vehicle height measurement method and unmanned aerial vehicle navigation filter
CN111638514B (en) * 2020-06-19 2023-08-04 四川陆垚控制技术有限公司 Unmanned aerial vehicle height measurement method and unmanned aerial vehicle navigation filter
CN112131730A (en) * 2020-09-14 2020-12-25 中国人民解放军军事科学院评估论证研究中心 Freezing analysis method and device for group intelligent unmanned system
CN112131730B (en) * 2020-09-14 2024-04-30 中国人民解放军军事科学院评估论证研究中心 Fixed-grid analysis method and device for intelligent unmanned system of group
CN112214033A (en) * 2020-09-25 2021-01-12 中国直升机设计研究所 Helicopter driving aid decision support system based on OODA
CN112214033B (en) * 2020-09-25 2022-12-30 中国直升机设计研究所 Helicopter driving aid decision support system based on OODA
CN112363176A (en) * 2020-11-06 2021-02-12 日立楼宇技术(广州)有限公司 Elevator shaft inspection and modeling method and device and inspection modeling system
CN112363176B (en) * 2020-11-06 2023-08-04 日立楼宇技术(广州)有限公司 Elevator hoistway inspection and modeling method and device and inspection and modeling system
CN112965517A (en) * 2021-01-31 2021-06-15 国网江苏省电力有限公司常州供电分公司 Unmanned aerial vehicle inspection safety obstacle avoidance system and method based on binocular vision fusion laser radar and electromagnetic field detection

Also Published As

Publication number Publication date
CN107783545B (en) 2021-04-27

Similar Documents

Publication Publication Date Title
CN107783106B (en) Data fusion method between unmanned aerial vehicle and barrier
CN107783545B (en) Obstacle avoidance system of post-disaster rescue rotor unmanned aerial vehicle based on OODA (observe-orient-decide-act) ring multi-sensor information fusion
US11697497B2 (en) Aerial vehicles having countermeasures deployed from a platform for neutralizing target aerial vehicles
CN110631593B (en) Multi-sensor fusion positioning method for automatic driving scene
US20190273909A1 (en) Methods and systems for selective sensor fusion
CN109029422B (en) Method and device for building three-dimensional survey map through cooperation of multiple unmanned aerial vehicles
JP6900608B2 (en) How to fly an unmanned aerial vehicle to stationary and moving objects
CN206057974U (en) A kind of obstacle avoidance system applied on rotor wing unmanned aerial vehicle
CN107783548B (en) Data processing method based on multi-sensor information fusion technology
CN107783549B (en) Single-rotor-wing plant protection unmanned aerial vehicle obstacle avoidance system based on multi-sensor information fusion technology
CN103822635B (en) The unmanned plane during flying spatial location real-time computing technique of view-based access control model information
CN107783544B (en) Method for controlling single-rotor plant protection unmanned aerial vehicle to avoid obstacle flight
CN112130579A (en) Tunnel unmanned aerial vehicle inspection method and system
CN109911188A (en) The bridge machinery UAV system of non-satellite navigator fix environment
CN107783119A (en) Apply the Decision fusion method in obstacle avoidance system
CN103135550A (en) Multiple obstacle-avoidance control method of unmanned plane used for electric wire inspection
US20140172357A1 (en) Analysis of sensor data
CN104298248A (en) Accurate visual positioning and orienting method for rotor wing unmanned aerial vehicle
CN107783547A (en) Post disaster relief rotor wing unmanned aerial vehicle obstacle avoidance system and method
WO2018170733A1 (en) Control method for movable platform and movable platform
KR20190000439A (en) Unmanned air vehicle for birds control and operating method by using the same
EP3989034A1 (en) Automatic safe-landing-site selection for unmanned aerial systems
CN110673627A (en) Forest unmanned aerial vehicle searching method
Hosseinpoor et al. Pricise target geolocation based on integeration of thermal video imagery and rtk GPS in UAVS
CN110997488A (en) System and method for dynamically controlling parameters for processing sensor output data

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant