WO2019095453A1 - UAV fixed-point hovering system and method - Google Patents

UAV fixed-point hovering system and method

Info

Publication number
WO2019095453A1
WO2019095453A1 (PCT/CN2017/114574, CN2017114574W)
Authority
WO
WIPO (PCT)
Prior art keywords
module
control
fixed point
optical flow
Prior art date
Application number
PCT/CN2017/114574
Other languages
English (en)
French (fr)
Inventor
张文利
马英轩
冯昊
郭向
陈华敏
Original Assignee
北京工业大学
Priority date
Filing date
Publication date
Application filed by 北京工业大学 filed Critical 北京工业大学
Priority to US16/495,089 priority Critical patent/US20200097025A1/en
Publication of WO2019095453A1 publication Critical patent/WO2019095453A1/zh


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D 1/0808 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G05D 1/0816 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft to ensure stability
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/269 Analysis of motion using gradient-based methods
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C 39/00 Aircraft not otherwise provided for
    • B64C 39/02 Aircraft not otherwise provided for characterised by special use
    • B64C 39/024 Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 50/00 Propulsion; Power supply
    • B64U 50/10 Propulsion
    • B64U 50/19 Propulsion using electrically powered motors
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0088 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10 Simultaneous control of position or course in three dimensions
    • G05D 1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D 1/102 Simultaneous control of position or course in three dimensions specially adapted for aircraft specially adapted for vertical take-off of aircraft
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/40 Analysis of texture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/40 Analysis of texture
    • G06T 7/41 Analysis of texture based on statistical description of texture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2201/00 UAVs characterised by their flight controls
    • B64U 2201/10 UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; Remote sensing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20021 Dividing image into blocks, subimages or windows

Definitions

  • The invention belongs to the technical field of drones (unmanned aerial vehicles, UAVs), and particularly relates to a fixed-point hovering system and method for a drone that can be used in an indoor environment.
  • Fixed-point hovering of an Unmanned Aerial Vehicle is defined as keeping the drone at a specified position in the air for a certain period of time, either through the drone's own autonomous flight function or under the control of a remote control device.
  • At present, the most mature and widely used method for fixed-point hovering of drones is combined navigation with GPS, a barometer and a gyroscope: the barometer measures the height change, the GPS module gives the coordinates of the horizontal position, and these are combined with the gyroscope measurements to obtain the three-dimensional coordinates supplied to the drone.
  • However, the refresh rate of low-cost GPS data is too low: in a small space, or when the drone moves at high speed, the data it provides makes the attitude compensation commands of the fuselage lag severely, with serious consequences such as a crash of the aircraft.
  • Indoors, buildings shield the GPS signal, so the GPS module stays in standby and the hovering of the drone is controlled entirely by the inertial navigation device. After a long flight, however, the inertial navigation system accumulates a large systematic error; the longer the flight time, the larger the error, and the accuracy of fixed-point hovering drops sharply.
  • There is also an optical flow fixed-point method, which uses an airborne optical flow sensor to measure the optical flow field of the drone relative to the ground and thereby obtain the drone's current velocity vector (speed plus direction); from this motion vector the optical flow module computes a reverse-acceleration control variable that keeps the drone in the fixed-point hover state.
  • The method needs no assistance from an external transmitted signal, so its range of application is wide and its systematic error is small. However, the optical flow of the entire image must be computed, so the computational load is large; a load the onboard processor cannot bear leads to low computational efficiency and severely delayed control signals. During flight, any small delay can lead to serious consequences, so the practicality of the method needs to be improved.
  • (1) Patent: Indoor positioning device (application number CN201620160519.0)
  • An indoor positioning device mounted on a drone, comprising an ultrasonic transmitting PCB, an ultrasonic receiving PCB and an optical flow PCB, where the optical flow PCB carries an optical flow camera and a micro control unit (MCU). The ultrasonic transmitting board generates and emits ultrasonic waves under the control of the MCU on the optical flow PCB; the ultrasonic receiving board, on receiving ultrasonic waves, forwards them to the MCU; the optical flow camera photographs the ground and transmits the captured ground images to the MCU. The MCU controls the transmitting PCB to emit ultrasonic waves and starts timing; when the waves arrive from the receiving PCB, timing stops, the drone's height is calculated from the elapsed time, and the drone's displacement in the horizontal x direction and the vertical y direction over the ground is calculated from the ground images sent by the optical flow camera.
  • This device uses too many components and its manufacturing cost is high. The timeliness of optical flow positioning is improved by an external computer and many auxiliary parts, so the cost is excessive, and the problem of the optical flow algorithm's heavy computational load remains unsolved.
  • (2) Patent: A UAV speed monitoring method and system (application number CN201610529291.2)
  • The invention provides a UAV speed monitoring method and system. The method comprises: acquiring the current flying height, angular velocity and image; extracting the feature points of the image and calculating the optical flow of each feature point; collecting statistics of these optical flows and selecting, as the optical flow of the image, the flow with the highest repetition rate in the same direction; and calculating the current flight speed from the flying height, the angular velocity and the optical flow of the image. The invention also relates to a system corresponding to this method. The calculation is simple, the computational load is small, the influence of several factors is considered, and the result is accurate.
  • This invention effectively reduces the computational load of optical flow, but it serves only as a detection method: it is not integrated with the drone as a whole so as to control the drone's motion and attitude, and therefore cannot be applied effectively to fixed-point hovering.
  • Addressing these problems of the prior art, this patent proposes an improved optical-flow-based method for fixed-point hovering of a drone.
  • The patented method is applicable to fixed-point hovering without a GPS signal, works normally without pre-deployed auxiliary equipment, and has a small computational load.
  • Within its scope of application, its fixed-point accuracy and control-signal refresh rate are superior to other existing methods, providing a technical basis for the development of indoor UAV applications.
  • The technical problem to be solved by the invention is this: when the optical flow field of an image is computed with the optical flow method, only regions with obvious texture features yield a valid optical flow field, while regions with weak texture features do not.
  • The traditional optical flow fixed-point method does not take texture features into account and performs the optical flow calculation over the entire image, so its computational load is large, which hurts timeliness.
  • The invention achieves its stated object with the following technical solution: a drone fixed-point hovering system comprising a drone flight controller 10, a fixed-point hover control module 20 and a drone motor module 30.
  • The drone flight controller 10 controls the flight of the drone; the fixed-point hover control module 20 controls fixed-point hovering during flight; the motor module 30 changes the flight state of the drone.
  • The drone flight controller 10 includes a data receiving module 110, a control quantity calculation module 120 and an ESC control module 130.
  • The data receiving module 110 receives the acceleration control variable sent by the fixed-point hover control module 20 and passes it to the control quantity calculation module 120; the control quantity calculation module 120 receives the control acceleration signal from the data receiving module 110, computes from the received control acceleration data the waveform parameters of the PWM signal to be generated, and sends them to the ESC control module 130; the ESC control module 130 generates the PWM signal from the information received from the control quantity calculation module 120.
  • The fixed-point hover control module 20 includes an image acquisition module 210, a fixed-point hover valid-region identification module 220, an optical flow field calculation module 230, a control parameter calculation module 240, and a data encoding and transmission module 250.
  • The image acquisition module 210 contains an image sensor that collects video data and is responsible for storing the collected video data in the main storage module.
  • The fixed-point hover valid-region identification module 220 is a single-chip microcomputer or microcomputer with its driving circuitry; using the image data collected by the image acquisition module 210, it detects the textured regions in the image with the SAD algorithm and sends these regions, as the valid regions for fixed-point hovering, to the optical flow field calculation module 230.
  • The optical flow field calculation module 230 is a single-chip microcomputer with driving circuitry combined with an attitude sensor, i.e., a gyroscope; it takes the segmented video portion from the valid-region identification module 220, performs the optical flow calculation with the HS (Horn-Schunck) optical flow method, compensates the result with the motion-state data provided by the gyroscope, and finally obtains the velocity vector of the drone relative to the ground, which it sends to the control parameter calculation module 240.
  • The control parameter calculation module 240 computes, from the velocity vector supplied by the optical flow field calculation module 230, the control acceleration required to keep the drone hovering and sends it to the data encoding and transmission module 250; the data encoding and transmission module 250 encodes the result and transmits it to the data receiving module 110 in the flight controller 10.
  • A fixed-point hovering method for a drone: the specific flow of the fixed-point hover control module 20 is as follows; the flow diagram is shown in FIG. 2.
  • Step 1, corresponding to the image acquisition module 210: the image sensor in the image acquisition module 210 acquires video of the fixed-point hovering position.
  • Step 2, corresponding to the fixed-point hover valid-region identification module 220: the module detects and segments the textured regions and finally identifies the valid regions for fixed-point hovering, as follows (a code sketch of steps 2-1 to 2-3 is given after step 2-3):
  • Step 2-1, image partition:
  • The M*N grayscale image acquired by the image acquisition module 210 is divided into sub-regions of size n*n.
  • M represents the length of the grayscale image; N represents the width of the grayscale image.
  • Step 2-2, calculate the SAD value:
  • For each sub-region from step 2-1, compute the SAD value, the sum of the absolute values of the difference between the two frames:
  • C = Σ_(x,y) | f_{t+1}(x, y) − f_t(x, y) |
  • where f denotes the value of a pixel, t the current time, f_t the pixel value at the current time t, and f_{t+1} the pixel value at the next time t+1; C is the SAD value.
  • Step 2-3, determine the valid fixed-point hovering regions:
  • Compare the SAD value of each region computed in step 2-2 against the threshold T: a region whose SAD value exceeds the threshold (C − T > 0) is a textured region and is taken as a valid fixed-point hovering region, on which the optical flow is then calculated.
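As an illustration of steps 2-1 to 2-3, here is a minimal sketch in Python/NumPy; the function name, the default block size n and the threshold T are our choices for illustration, not values fixed by the patent:

```python
import numpy as np

def sad_valid_regions(f_t, f_t1, n=4, T=100.0):
    """Steps 2-1 to 2-3: split two consecutive M*N grayscale frames into
    n*n blocks, compute each block's SAD value C, and flag as valid
    fixed-point hovering regions the blocks whose C exceeds threshold T."""
    M, N = f_t.shape
    valid = np.zeros((M // n, N // n), dtype=bool)
    for i in range(M // n):
        for j in range(N // n):
            b0 = f_t[i*n:(i+1)*n, j*n:(j+1)*n].astype(np.int32)
            b1 = f_t1[i*n:(i+1)*n, j*n:(j+1)*n].astype(np.int32)
            C = np.abs(b1 - b0).sum()      # SAD value of this block
            valid[i, j] = (C - T) > 0      # textured -> valid hover region
    return valid
```

Only the blocks flagged here are handed to the optical flow stage; this is where the saving over whole-image optical flow comes from.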
  • Step 3, corresponding to the fixed-point hover optical flow field calculation module 230 (steps 3 to 5 are sketched as code after step 9 below):
  • Compute the optical flow of the valid fixed-point hovering regions segmented in step 2 using the HS optical flow method (or another optical flow algorithm), and convert the optical flow velocity into a metric speed V_t through the similar-triangle relation between the sensor plane and the actual object plane.
  • Step 4, corresponding to the control parameter calculation module 240: calculate the drone's fixed-point control parameter a, a reverse acceleration, from the actual speed V_t obtained in step 3 and the distance x by which the drone drifted from the hover point in the previous loop, and transmit it to the flight controller 10 to implement fixed-point control.
  • Step 5, corresponding to the data encoding and transmission module 250: the module sends the value a obtained in step 4 through the I2C port to the data receiving module 110 of the flight controller.
  • Step 6: the data receiving module 110 of the flight controller 10 receives the drone offset data a and passes it to the control quantity calculation module 120.
  • Step 7: the control quantity calculation module 120 receives a and computes the compensation for the drone's motion offset, i.e., a velocity vector opposite to the drone's drift velocity vector, converts it into an ESC control signal, and sends it to the ESC control module 130.
  • Step 8: the ESC control module 130 receives the ESC control signal sent by the control quantity calculation module 120 and controls the magnitude of the current output to the drone motor module 30.
  • Step 9: the drone motor module 30 receives the current from the ESC control module 130 and drives the drone opposite to its existing motion, so that the drone hovers at the fixed point.
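Steps 3 to 5 form the sensing half of the loop: measured drift in, counter-acceleration out. A hedged sketch follows; the camera constants, the I2C framing, and in particular the kinematic form a = V_t^2/(2x) for the reverse acceleration are our assumptions (the patent publishes the corresponding formula only as an embedded image, stating just that a is computed from V_t and x):

```python
import struct

def metric_velocity(flow_px_per_s, height_m, focal_px):
    """Step 3: convert an optical-flow speed in pixels/s into a metric
    ground-relative speed via the similar-triangle (pinhole) relation
    between the sensor plane and the ground plane."""
    return flow_px_per_s * height_m / focal_px

def reverse_acceleration(v_t, x):
    """Step 4: reverse acceleration opposing the drift. We assume
    a = v_t**2 / (2*x), which brings the speed v_t to zero exactly over
    the offset distance x accumulated in the previous loop; the patent
    defines a and x but shows the formula only as an image."""
    return 0.0 if x <= 0 else v_t * v_t / (2.0 * x)

def encode_for_flight_controller(a):
    """Step 5: pack a for transmission over I2C to the data receiving
    module 110 (the framing is hypothetical)."""
    return struct.pack('<f', a)

v_t = metric_velocity(flow_px_per_s=12.0, height_m=1.5, focal_px=400.0)  # 0.045 m/s
payload = encode_for_flight_controller(reverse_acceleration(v_t, x=0.05))
```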
  • The system provided by the present invention comprises the drone flight controller, the motors, and the fixed-point hover control module.
  • The fixed-point hover control module uses texture features to obtain the valid regions, which reduces the computational load compared with the traditional optical flow fixed-point method; the optical flow method is then applied to these valid regions to compute the optical flow field, from which the control acceleration information is obtained.
  • The invention reduces the computational load of optical flow fixed-pointing, improves computational timeliness, and improves the stability of fixed-point hovering.
  • Figure 1 is a representative embodiment of the present patent: an architecture diagram of an optical-flow-based drone fixed-point system.
  • Figure 2 is a flow chart of the drone's fixed-point hover control module.
  • Figure 3 is a flow chart of the HS optical flow method.
  • An embodiment of the present invention, shown in Figure 1, is a drone fixed-point hovering system including a drone flight controller 10, a fixed-point hover control module 20, and a drone motor module 30.
  • The drone flight controller 10 controls the flight of the drone; the fixed-point hover control module 20 controls fixed-point hovering during flight; the motor module 30 changes the flight state of the drone.
  • The drone flight controller 10 includes a data receiving module 110, a control quantity calculation module 120, and an ESC control module 130.
  • The data receiving module 110 receives the acceleration control variable sent by the fixed-point hover control module 20 and passes it to the control quantity calculation module 120; the control quantity calculation module 120 receives the control acceleration signal from the data receiving module 110, computes from the received control acceleration data the waveform parameters of the PWM signal to be generated, and sends them to the ESC control module 130; the ESC control module 130 generates the PWM signal from the information received from the control quantity calculation module 120.
  • The fixed-point hover control module 20 includes an image acquisition module 210, a fixed-point hover valid-region identification module 220, an optical flow field calculation module 230, a control parameter calculation module 240, and a data encoding and transmission module 250.
  • The image acquisition module 210 contains an image sensor that collects video data and is responsible for storing the collected video data in the main storage module.
  • The fixed-point hover valid-region identification module 220 is a single-chip microcomputer or microcomputer with its driving circuitry; using the image data collected by the image acquisition module 210, it detects the textured regions in the image with the SAD algorithm and sends these regions, as the valid regions for fixed-point hovering, to the optical flow field calculation module 230.
  • The optical flow field calculation module 230 is a single-chip microcomputer with driving circuitry combined with an attitude sensor, i.e., a gyroscope; it takes the segmented video portion from the valid-region identification module 220, performs the optical flow calculation with the HS optical flow method, compensates the result with the motion-state data provided by the gyroscope, and finally obtains the velocity vector of the drone relative to the ground, which it sends to the control parameter calculation module 240.
  • The control parameter calculation module 240 computes, from the velocity vector supplied by the optical flow field calculation module 230, the control acceleration required to keep the drone hovering and sends it to the data encoding and transmission module 250; the data encoding and transmission module 250 encodes the result and transmits it to the data receiving module 110 in the flight controller 10.
  • In this embodiment, the fixed-point hovering method runs in the fixed-point hover control module 20 as follows; the flow diagram is shown in FIG. 2.
  • Step 1, corresponding to the image acquisition module 210: the image sensor in the image acquisition module 210 acquires video of the fixed-point hovering position.
  • Step 2, corresponding to the fixed-point hover valid-region identification module 220: the module detects and segments the textured regions and finally identifies the valid regions for fixed-point hovering, as follows:
  • Step 2-1, image partition:
  • The M*N grayscale image collected by the image acquisition module 210 is divided into sub-regions of size n*n; in this embodiment, n = 4.
  • M represents the length of the grayscale image; N represents the width of the grayscale image.
  • Step 2-2, calculate the SAD value:
  • For each sub-region from step 2-1, compute the SAD value:
  • C = Σ_(x,y) | f_{t+1}(x, y) − f_t(x, y) |
  • where f denotes the value of a pixel, t the current time, f_t the pixel value at the current time t, and f_{t+1} the pixel value at the next time t+1; C is the SAD value, the sum of the absolute values of the two-frame difference.
  • Step 2-3, determine the valid fixed-point hovering regions:
  • Compare the SAD value of each region computed in step 2-2 against the threshold T: a region whose SAD value exceeds the threshold (C − T > 0) is a textured region and is taken as a valid fixed-point hovering region, on which the optical flow is then calculated.
  • Step 3, corresponding to the fixed-point hover optical flow field calculation module 230:
  • Compute the optical flow of the valid fixed-point hovering regions segmented in step 2 using the HS optical flow method or another optical flow algorithm, and convert the optical flow velocity into a metric speed V_t through the similar-triangle relation between the sensor plane and the actual object plane.
  • Step 4, corresponding to the control parameter calculation module 240: calculate the drone's fixed-point control parameter a, a reverse acceleration, from the actual speed V_t obtained in step 3 and the distance x by which the drone drifted from the hover point in the previous loop, and transmit it to the flight controller 10 to implement fixed-point control.
  • Step 5, corresponding to the data encoding and transmission module 250:
  • The module sends the value a obtained in step 4 through the I2C port to the data receiving module 110 of the flight controller.
  • Step 6 (alternatively, attitude-control step 1):
  • The data receiving module 110 of the flight controller 10 receives the drone offset data a and passes it to the control quantity calculation module 120.
  • Step 7: the control quantity calculation module 120 receives a and computes the compensation for the drone's motion offset, i.e., a velocity vector opposite to the drone's drift velocity vector, converts it into an ESC control signal, and sends it to the ESC control module 130.
  • Step 8: the ESC control module 130 receives the ESC control signal sent by the control quantity calculation module 120 and controls the magnitude of the current output to the drone motor module 30.
  • Step 9: the drone motor module 30 receives the current from the ESC control module 130 and drives the drone opposite to its existing motion, so that the drone hovers at the fixed point (steps 6 to 9 are sketched as code below).
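Steps 6 to 9 run on the flight-controller side. Here is a hedged sketch of how the compensating command might be mixed into ESC outputs; the X-quad mixer geometry, the gains, and the 1000-2000 microsecond pulse range are conventional quadrotor assumptions of ours, since the patent states only that modules 120/130 compute PWM waveform parameters and regulate motor current:

```python
def esc_pulse_widths_us(ax, ay, base_us=1500.0, k_us=200.0):
    """Map a compensating acceleration command (ax, ay), normalised to
    [-1, 1], onto the four ESC pulse widths of an X-quad in microseconds;
    differential thrust tilts the airframe against the measured drift.
    Geometry and gains are illustrative only."""
    mix = [(+1, +1), (-1, +1), (-1, -1), (+1, -1)]  # FL, FR, RR, RL arms
    pulses = [base_us + k_us * (sx * ax + sy * ay) for sx, sy in mix]
    return [min(2000.0, max(1000.0, p)) for p in pulses]

# A command of ax=0.1, ay=0.0 raises the two left arms slightly:
print(esc_pulse_widths_us(0.1, 0.0))   # [1520.0, 1480.0, 1480.0, 1520.0]
```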
  • The HS optical flow method used by module 230 proceeds as follows. Let I(x, y, t) be the gray value at time t of the pixel whose horizontal and vertical coordinates in the grayscale image are (x, y); t is the current time.
  • The optical flow constraint equation is
  • I_x u + I_y v + I_t = 0    (5)
  • where u and v are the horizontal and vertical components of the flow velocity, and I_x, I_y and I_t are the partial derivatives of I with respect to x, y and t. Equation (5) reflects the correspondence that the gray-level derivatives impose on the velocity components u and v.
  • Equation (5) contains two unknowns, u and v. The changes of u and v as a pixel moves are slow and a local region changes little; in particular, when the target undergoes non-deforming rigid-body motion, the spatial rate of change of the local velocity is zero. A new condition is therefore introduced: the global smoothness constraint on the optical flow, expressed through a velocity smoothing term ζ (ζ_c denoting this term on the region whose SAD value is c):
  • ζ² = (∂u/∂x)² + (∂u/∂y)² + (∂v/∂x)² + (∂v/∂y)²    (6)
  • The flow is taken as the minimizer of a functional combining (5) and (6), in which the coefficient α is the smoothing weight, i.e., the weight of the smoothness term.
  • ∇, the vector differential operator, is a recognized mathematical symbol; applying the calculus of variations to the functional yields Euler equations in which the Laplacians ∇²u and ∇²v appear, and these are approximated by the difference between a neighbourhood average and the value at the point. λ is the eigenvalue of the grayscale image pixel matrix.
  • The neighbourhood averages of u and v can be computed with a nine-point difference format, and the system is then solved iteratively; the full derivation, equations (3) to (11), is given in the detailed description below.
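A compact sketch of this computation in Python: standard Horn-Schunck, where we assume the "nine point difference format" refers to the nine-point neighbourhood-average kernel of Horn and Schunck's original scheme, and the derivative estimates are simple finite differences. The iteration matches equations (10) and (11) of the detailed description below.

```python
import numpy as np
from scipy.ndimage import convolve

# Horn & Schunck's nine-point weighting for the neighbourhood averages
AVG_KERNEL = np.array([[1/12, 1/6, 1/12],
                       [1/6,  0.0, 1/6 ],
                       [1/12, 1/6, 1/12]])

def horn_schunck(I1, I2, alpha=1.0, n_iter=100):
    """Estimate the optical flow (u, v) between two grayscale frames:
    gradients I_x, I_y, I_t, then the iterative update in which the
    Laplacian is replaced by the difference between the neighbourhood
    average and the value at the point."""
    I1 = I1.astype(np.float32)
    I2 = I2.astype(np.float32)
    Ix = np.gradient(I1, axis=1)   # partial derivative of I w.r.t. x
    Iy = np.gradient(I1, axis=0)   # partial derivative of I w.r.t. y
    It = I2 - I1                   # partial derivative of I w.r.t. t
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    for _ in range(n_iter):
        u_avg = convolve(u, AVG_KERNEL)
        v_avg = convolve(v, AVG_KERNEL)
        common = (Ix * u_avg + Iy * v_avg + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_avg - Ix * common
        v = v_avg - Iy * common
    return u, v
```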

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Game Theory and Decision Science (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Probability & Statistics with Applications (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a drone fixed-point hovering system and method, belonging to the technical field of drones, and specifically relates to a drone fixed-point hovering system and method usable in indoor environments. The system comprises a drone flight controller, a fixed-point hover control module and a drone motor module. The flight controller controls the flight of the drone; the fixed-point hover control module controls fixed-point hovering during flight; the motor module changes the flight state of the drone. In the fixed-point hover control module provided by the invention, texture features are used to obtain the valid regions, reducing the computational load compared with the traditional optical flow fixed-point method; the optical flow method is then used to compute the optical flow field over these valid regions, from which the control acceleration information is obtained. The invention reduces the computational load of optical flow fixed-pointing, improves computational timeliness, and improves the stability of fixed-point hovering.

Description

UAV fixed-point hovering system and method

Technical field

The invention belongs to the technical field of drones, and specifically relates to a fixed-point hovering system and method for a drone usable in indoor environments.

Background

With economic development and continual progress in science and technology, research on drones has become ever deeper, and drones are widely used in aerial photography, mobile inspection, security and other fields. In these fields, the stability of the drone's flight platform and its ability to hover accurately and quickly are of great importance.

Fixed-point hovering of an Unmanned Aerial Vehicle is defined as keeping the drone at a specified position in the air for a certain period of time, through the drone's own autonomous flight function or under the control of a remote control device.

At present, the most mature and widely used method for fixed-point hovering of drones is combined navigation with GPS, a barometer and a gyroscope. The barometer measures the height change, the GPS module gives the coordinates of the horizontal position, and finally the gyroscope measurements are combined to obtain the three-dimensional coordinates, which are supplied to the drone to achieve fixed-point hovering.

However, the refresh rate of low-cost GPS data is too low: in a small space, or when the drone moves at high speed, the data it provides makes the attitude compensation commands of the fuselage lag severely, with serious consequences such as a crash. Indoors, buildings shield the GPS signal, the GPS stays in standby, and the hovering of the drone is controlled entirely by the inertial navigation device. After a long flight, however, the inertial navigation system develops a large systematic error; the longer the flight time, the larger the error, and the accuracy of fixed-point hovering drops sharply.

In addition, there is a method that combines radio with laser technology to achieve fixed-point hovering of a drone: the drone is first coarsely positioned by radio, and the position is then corrected by laser to obtain the exact hover position. This method is more precise than GPS fixed-pointing, but its cost is too high and it requires many base stations and laser emitters to be deployed in advance, so the technology has not spread widely.

There is also the optical flow fixed-point method, which uses an airborne optical flow sensor to measure the optical flow field of the drone relative to the ground and thereby obtain the drone's current velocity vector (speed plus direction); from this motion vector, the optical flow module computes a reverse-acceleration control variable that keeps the drone in the fixed-point hover state. The method needs no assistance from external transmitted signals, so its range of application is wide and its systematic error is small; but since the optical flow of the entire image must be computed, the computational load is large, a load the onboard processor can hardly bear, leading to low computational efficiency and severely delayed control signals. During flight, any small delay can lead to serious consequences, so the practicality of the method needs to be improved.
There are two representative existing technologies:

(1) Patent: Indoor positioning device (application number CN201620160519.0)

An indoor positioning device mounted on a drone, characterized in that the device comprises an ultrasonic transmitting PCB, an ultrasonic receiving PCB and an optical flow PCB, where the optical flow PCB carries an optical flow camera and a micro control unit (MCU). The ultrasonic transmitting board generates and emits ultrasonic waves under the control of the MCU on the optical flow PCB; the ultrasonic receiving board, on receiving ultrasonic waves, sends them to the MCU on the optical flow PCB; the optical flow camera photographs the ground and transmits the captured ground images to the MCU; the MCU controls the ultrasonic transmitting PCB to emit ultrasonic waves and starts timing, stops timing when the waves sent by the receiving PCB arrive, calculates the drone's height from the elapsed time, and calculates the drone's displacement in the horizontal x and vertical y directions over the ground from the ground images sent by the optical flow camera.

This device has too many components and a high manufacturing cost; the timeliness of optical flow positioning is improved with an external computer and many auxiliary parts, so the cost is excessive and the problem of the optical flow algorithm's heavy computational load remains unsolved.

(2) Patent: A UAV speed monitoring method and system (application number CN201610529291.2)

The invention provides a UAV speed monitoring method and system. The method comprises: acquiring the current flying height, angular velocity and image; extracting the feature points of the image and calculating the optical flow of each feature point; collecting statistics of the feature points' optical flows and selecting, as the optical flow of the image, the flow with the highest repetition rate in the same direction; and calculating the current flight speed from the flying height, the angular velocity and the optical flow of the image. The invention also relates to a system corresponding to the method: it computes the feature points of the current image and their optical flows, selects the flow with the highest same-direction repetition rate as the image's optical flow, and then computes the current flight speed from that flow, the current flying height and the angular velocity. The calculation is simple, the computational load is small, the influence of several factors is considered, and the result is accurate.

This invention effectively reduces the computational load of optical flow, but it serves only as a detection method: it is not organically integrated with the drone as a whole so as to control the drone's motion and attitude, and therefore cannot be applied effectively to fixed-point hovering of drones.

Addressing the problems of the prior art, this patent proposes an improved optical-flow-based method for fixed-point hovering of drones. It is applicable to fixed-point hovering without a GPS signal, works normally without pre-deployed auxiliary equipment, and has a small computational load. Within its scope of application, its fixed-point accuracy and control-signal refresh rate are superior to other existing methods, providing a technical basis for the development of indoor UAV applications.
Summary of the invention

The technical problem to be solved by the invention is this: when the optical flow field of an image is computed with the optical flow method, only regions with obvious texture features yield a valid optical flow field, while regions with weak texture features do not. The traditional optical flow fixed-point method does not take texture features into account and performs the optical flow calculation over the entire image, so its computational load is large, which hurts timeliness.

The invention achieves its stated object with the following technical solution: a drone fixed-point hovering system comprising a drone flight controller 10, a fixed-point hover control module 20 and a drone motor module 30.

The drone flight controller 10 controls the flight of the drone; the fixed-point hover control module 20 controls fixed-point hovering during flight; the motor module 30 changes the flight state of the drone.

The drone flight controller 10 includes a data receiving module 110, a control quantity calculation module 120 and an ESC control module 130.

The data receiving module 110 receives the acceleration control variable sent by the fixed-point hover control module 20 and passes it to the control quantity calculation module 120; the control quantity calculation module 120 receives the control acceleration signal from the data receiving module 110, computes from the received control acceleration data the waveform parameters of the PWM signal to be generated, and sends them to the ESC control module 130; the ESC control module 130 generates the PWM signal from the information received from the control quantity calculation module 120.

The fixed-point hover control module 20 includes an image acquisition module 210, a fixed-point hover valid-region identification module 220, an optical flow field calculation module 230, a control parameter calculation module 240 and a data encoding and transmission module 250. The image acquisition module 210 contains an image sensor that collects video data and is responsible for storing the collected video data in the main storage module. The fixed-point hover valid-region identification module 220 is a single-chip microcomputer or microcomputer with its driving circuitry; using the image data collected by the image acquisition module 210, it detects the textured regions in the image with the SAD algorithm and sends these regions, as the valid regions for fixed-point hovering, to the optical flow field calculation module 230. The optical flow field calculation module 230 is a single-chip microcomputer with driving circuitry combined with an attitude sensor, i.e., a gyroscope; it takes the segmented video portion from the valid-region identification module 220, performs the optical flow calculation with the HS optical flow method, compensates the result with the motion-state data provided by the gyroscope, and finally obtains the velocity vector of the drone relative to the ground, which it sends to the control parameter calculation module 240. The control parameter calculation module 240 computes, from the velocity vector supplied by the optical flow field calculation module 230, the control acceleration required to keep the drone hovering at the fixed point and sends it to the data encoding and transmission module 250; the data encoding and transmission module 250 encodes the result and transmits it to the data receiving module 110 in the drone flight controller 10.
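For illustration, here is a hedged sketch of how modules 220 and 230 compose, reusing the `sad_valid_regions` and `horn_schunck` functions sketched earlier in this document; the frame-rate handling, camera constants and all names are our assumptions, not the patent's:

```python
import numpy as np

def ground_velocity(frame_prev, frame_next, height_m, focal_px, fps,
                    n=4, T=100.0, alpha=1.0):
    """One pass of modules 220-230: SAD valid-region mask, HS optical
    flow, then the mean flow over valid blocks only, converted from
    pixels/frame to a metric ground-relative velocity in m/s."""
    valid = sad_valid_regions(frame_prev, frame_next, n=n, T=T)   # module 220
    u, v = horn_schunck(frame_prev, frame_next, alpha=alpha)      # module 230
    # expand the per-block mask back to pixel resolution
    mask = np.kron(valid.astype(np.uint8),
                   np.ones((n, n), dtype=np.uint8)).astype(bool)
    if not mask.any():
        return 0.0, 0.0        # no textured region found: report no drift
    H, W = mask.shape          # crop the flow field to the masked area
    scale = fps * height_m / focal_px   # pixels/frame -> metres/second
    return (float(u[:H, :W][mask].mean()) * scale,
            float(v[:H, :W][mask].mean()) * scale)
```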
A drone fixed-point hovering method: the specific flow of the fixed-point hover control module 20 is as follows; the flow diagram is shown in Figure 2.

Step 1: corresponding to the image acquisition module 210, the image sensor in the image acquisition module 210 acquires video of the fixed-point hovering position.

Step 2: corresponding to the fixed-point hover valid-region identification module 220, the module detects and segments the textured regions and finally identifies the valid fixed-point hovering regions. The specific operations are as follows:

Step 2-1: image partition

The M*N grayscale image collected by the image acquisition module 210 is divided into sub-regions of size n*n.

M represents the length of the grayscale image; N represents the width of the grayscale image.

Step 2-2: calculate the SAD value

For each sub-region from step 2-1, compute the SAD value. The SAD value is calculated as:

C = Σ_(x,y) | f_{t+1}(x, y) − f_t(x, y) |    (1)

where f_t denotes the pixel value at the current time t, f_{t+1} the pixel value at the next time t+1, and C the SAD value, i.e., the sum of the absolute values of the difference between the two frames; f denotes the value of a pixel and t the current time.

Step 2-3: determine the valid fixed-point hovering regions

Compare the SAD value of each region computed in step 2-2 against the threshold T: a region whose SAD value exceeds the threshold (C − T > 0) is a textured region and is taken as a valid fixed-point hovering region, on which the optical flow is then calculated.

Step 3: corresponding to the fixed-point hover optical flow field calculation module 230.

Compute the optical flow of the valid fixed-point hovering regions segmented in step 2 using the HS optical flow method or another optical flow algorithm, and convert the optical flow velocity into a metric speed V_t through the similar-triangle relation between the sensor and the actual object plane.

Step 4: corresponding to the control parameter calculation module 240, calculate by formula (2) the drone's fixed-point control parameter a, a reverse acceleration, from the actual speed V_t obtained in step 3, and transmit it to the flight controller 10 to implement fixed-point control; here a is the reverse acceleration and x is the distance by which the drone drifted from the hover point in the previous loop.

Step 5: corresponding to the data encoding and transmission module 250, which sends the value a obtained in step 4 through the I2C port to the data receiving module 110 of the flight controller.

Step 6: the data receiving module 110 of the flight controller 10 receives the drone offset data a and passes it to the control quantity calculation module 120.

Step 7: the control quantity calculation module 120 receives the data a and computes the compensation for the drone's motion offset, i.e., a velocity vector opposite to the drone's drift velocity vector, converts it into an ESC control signal, and sends it to the ESC control module 130.

Step 8: the ESC control module 130 receives the ESC control signal sent by the control quantity calculation module 120 and controls the magnitude of the current output to the drone motor module 30.

Step 9: the drone motor module 30 receives the current from the ESC control module 130 and drives the drone opposite to its existing motion, so that the drone hovers at the fixed point.
Brief description of the drawings

Figure 1: a representative embodiment of this patent, the architecture diagram of an optical-flow-based drone fixed-point system.

Figure 2: flow chart of the drone fixed-point hover control module.

Figure 3: flow chart of the HS optical flow method.
Detailed description

An embodiment of the invention, shown in Figure 1, is a drone fixed-point hovering system comprising a drone flight controller 10, a fixed-point hover control module 20 and a drone motor module 30.

The drone flight controller 10 controls the flight of the drone; the fixed-point hover control module 20 controls fixed-point hovering during flight; the motor module 30 changes the flight state of the drone.

The drone flight controller 10 includes a data receiving module 110, a control quantity calculation module 120 and an ESC control module 130.

The data receiving module 110 receives the acceleration control variable sent by the fixed-point hover control module 20 and passes it to the control quantity calculation module 120; the control quantity calculation module 120 receives the control acceleration signal from the data receiving module 110, computes from the received control acceleration data the waveform parameters of the PWM signal to be generated, and sends them to the ESC control module 130; the ESC control module 130 generates the PWM signal from the information received from the control quantity calculation module 120.

The fixed-point hover control module 20 includes an image acquisition module 210, a fixed-point hover valid-region identification module 220, an optical flow field calculation module 230, a control parameter calculation module 240 and a data encoding and transmission module 250. The image acquisition module 210 contains an image sensor that collects video data and is responsible for storing the collected video data in the main storage module. The fixed-point hover valid-region identification module 220 is a single-chip microcomputer or microcomputer with its driving circuitry; using the image data collected by the image acquisition module 210, it detects the textured regions in the image with the SAD algorithm and sends these regions, as the valid regions for fixed-point hovering, to the optical flow field calculation module 230. The optical flow field calculation module 230 is a single-chip microcomputer with driving circuitry combined with an attitude sensor, i.e., a gyroscope; it takes the segmented video portion from the valid-region identification module 220, performs the optical flow calculation with the HS optical flow method, compensates the result with the motion-state data provided by the gyroscope, and finally obtains the velocity vector of the drone relative to the ground, which it sends to the control parameter calculation module 240. The control parameter calculation module 240 computes, from the velocity vector supplied by the optical flow field calculation module 230, the control acceleration required to keep the drone hovering at the fixed point and sends it to the data encoding and transmission module 250; the data encoding and transmission module 250 encodes the result and transmits it to the data receiving module 110 in the drone flight controller 10.

A drone fixed-point hovering method: the specific flow of the fixed-point hover control module 20 is as follows; the flow diagram is shown in Figure 2.

Step 1: corresponding to the image acquisition module 210, the image sensor in the image acquisition module 210 acquires video of the fixed-point hovering position.

Step 2: corresponding to the fixed-point hover valid-region identification module 220, the module detects and segments the textured regions and finally identifies the valid fixed-point hovering regions. The specific operations are as follows:

Step 2-1: image partition

The M*N grayscale image collected by the image acquisition module 210 is divided into sub-regions of size n*n; in this embodiment, n = 4.

M represents the length of the grayscale image; N represents the width of the grayscale image.

Step 2-2: calculate the SAD value

For each sub-region from step 2-1, compute the SAD value by formula (1):

C = Σ_(x,y) | f_{t+1}(x, y) − f_t(x, y) |    (1)

where f_t denotes the pixel value at the current time t, f_{t+1} the pixel value at the next time t+1, and C the SAD value, i.e., the sum of the absolute values of the two-frame difference; f denotes the value of a pixel and t the current time.

Step 2-3: determine the valid fixed-point hovering regions

Compare the SAD value of each region computed in step 2-2 against the threshold T: a region whose SAD value exceeds the threshold (C − T > 0) is a textured region and is taken as a valid fixed-point hovering region, on which the optical flow is then calculated.

Step 3: corresponding to the fixed-point hover optical flow field calculation module 230.

Compute the optical flow of the valid fixed-point hovering regions segmented in step 2 using the HS optical flow method or another optical flow algorithm, and convert the optical flow velocity into a metric speed V_t through the similar-triangle relation between the sensor and the actual object plane.

Step 4: corresponding to the control parameter calculation module 240, calculate by formula (2) the drone's fixed-point control parameter a, a reverse acceleration, from the actual speed V_t obtained in step 3, and transmit it to the flight controller 10 to implement fixed-point control; here a is the reverse acceleration and x is the distance by which the drone drifted from the hover point in the previous loop.

Step 5: corresponding to the data encoding and transmission module 250, which sends the value a obtained in step 4 through the I2C port to the data receiving module 110 of the flight controller.

Step 6 (alternatively, attitude-control step 1): the data receiving module 110 of the flight controller 10 receives the drone offset data a and passes it to the control quantity calculation module 120.

Step 7: the control quantity calculation module 120 receives the data a and computes the compensation for the drone's motion offset, i.e., a velocity vector opposite to the drone's drift velocity vector, converts it into an ESC control signal, and sends it to the ESC control module 130.

Step 8: the ESC control module 130 receives the ESC control signal sent by the control quantity calculation module 120 and controls the magnitude of the current output to the drone motor module 30.

Step 9: the drone motor module 30 receives the current from the ESC control module 130 and drives the drone opposite to its existing motion, so that the drone hovers at the fixed point.
The steps of the HS optical flow method are as follows.

Two assumptions are made:

1) The gray level of a moving object remains unchanged over a short time interval.

2) The velocity vector field within a given neighbourhood changes slowly.

Under these assumptions, gray-value constancy gives equation (3):

I(x, y, t) = I(x + δx, y + δy, t + δt)    (3)

where I is the gray value used for the optical flow, x is the horizontal coordinate in the grayscale image, y is the vertical coordinate, and t is the current time; (x, y, t) denotes the pixel of the grayscale image at time t whose horizontal and vertical coordinates are (x, y).

Expanding the right-hand side of equation (3) in a Taylor series at the point (x, y) and letting δt tend to 0 in the limit, the equation becomes:

I_x (dx/dt) + I_y (dy/dt) + I_t = 0    (4)

Setting u = dx/dt and v = dy/dt gives:

I_x u + I_y v + I_t = 0    (5)

Equation (5) is the optical flow constraint equation; it reflects the correspondence between the gray-level gradients and the velocity components u and v. I_x is the partial derivative of I with respect to x, I_y the partial derivative of I with respect to y, and I_t the partial derivative of I with respect to t.

Equation (5) contains two unknowns, u and v. The changes of u and v as a pixel moves are slow, and a local region changes little; in particular, when the target undergoes non-deforming rigid-body motion, the spatial rate of change of the local velocity is zero. A new condition is therefore introduced: the global smoothness constraint on the optical flow.

Accordingly, a velocity smoothing term is introduced:

ζ² = (∂u/∂x)² + (∂u/∂y)² + (∂v/∂x)² + (∂v/∂y)²    (6)

For all pixels (x, y, t), equation (6) is required to be as small as possible. ζ is the velocity smoothing term function; ζ_c is the velocity smoothing term on the region whose SAD value is c.

Combining the optical flow constraint (5) and the velocity smoothness constraint (6), the following minimization problem is set up:

E = ∬ [ (I_x u + I_y v + I_t)² + α² ζ² ] dx dy → min    (7)

where α is the smoothing weight coefficient, representing the weight of the velocity smoothness term; the larger α is, the higher the accuracy.

Applying the calculus of variations, the Euler equations give:

I_x (I_x u + I_y v + I_t) = α² ∇²u    (8)

I_y (I_x u + I_y v + I_t) = α² ∇²v    (9)

Note: ∇ is the vector differential operator, a recognized mathematical symbol; ∇² in equations (8) and (9) is the Laplacian operator.

Approximating the Laplacian at a point by the difference between the neighbourhood average velocity and the velocity at that point (∇²u ≈ ū − u, ∇²v ≈ v̄ − v) yields the iteration equations (10) and (11):

u^{k+1} = ū^k − I_x (I_x ū^k + I_y v̄^k + I_t) / (α² + I_x² + I_y²)    (10)

v^{k+1} = v̄^k − I_y (I_x ū^k + I_y v̄^k + I_t) / (α² + I_x² + I_y²)    (11)

These are solved by iteration. λ is the eigenvalue of the grayscale image pixel matrix.

The averages ū and v̄ of u and v can be computed with a nine-point difference format.

At this point all the quantities are determined: feeding in two consecutive grayscale frames and iterating yields the velocity field. The specific execution process is shown in Figure 3.

Claims (2)

  1. A drone fixed-point hovering system, characterized in that it comprises a drone flight controller (10), a fixed-point hover control module (20) and a drone motor module (30);
    the drone flight controller (10) is used to control the flight of the drone; the fixed-point hover control module (20) is used to control fixed-point hovering during the drone's flight; the drone motor module (30) is used to change the flight state of the drone;
    the drone flight controller (10) comprises a data receiving module (110), a control quantity calculation module (120) and an ESC control module (130);
    the data receiving module (110) receives the acceleration control variable sent by the fixed-point hover control module (20) and passes it to the control quantity calculation module (120); the control quantity calculation module (120) receives the control acceleration signal from the data receiving module (110), computes from the received control acceleration data the waveform parameters of the PWM signal to be generated, and sends them to the ESC control module (130); the ESC control module (130) generates the PWM signal from the information received from the control quantity calculation module (120);
    the fixed-point hover control module (20) comprises an image acquisition module (210), a fixed-point hover valid-region identification module (220), an optical flow field calculation module (230), a control parameter calculation module (240) and a data encoding and transmission module (250); the image acquisition module (210) contains an image sensor that collects video data and is responsible for storing the collected video data in the main storage module; the fixed-point hover valid-region identification module (220) is a single-chip microcomputer or microcomputer with its driving circuitry, which, from the image data collected by the image acquisition module (210), detects the textured regions in the image with the SAD algorithm and sends these regions, as the valid regions for fixed-point hovering, to the optical flow field calculation module (230); the optical flow field calculation module (230) is a single-chip microcomputer with driving circuitry combined with an attitude sensor, i.e., a gyroscope, which takes the segmented video portion from the valid-region identification module (220), performs the optical flow calculation with the HS optical flow method, compensates the result with the motion-state data provided by the gyroscope, and finally obtains the velocity vector of the drone relative to the ground and sends it to the control parameter calculation module (240); the control parameter calculation module (240) computes, from the velocity vector supplied by the optical flow field calculation module (230), the control acceleration required to keep the drone hovering at the fixed point and sends it to the data encoding and transmission module (250); the role of the data encoding and transmission module (250) is to encode the result and transmit it to the data receiving module (110) in the drone flight controller (10).
  2. A drone fixed-point hovering method using the system of claim 1, characterized in that the specific flow of the fixed-point hover control module (20) is as follows:
    Step 1: corresponding to the image acquisition module (210), the image sensor in the image acquisition module (210) acquires video of the fixed-point hovering position;
    Step 2: corresponding to the fixed-point hover valid-region identification module (220), the module detects and segments the textured regions and finally identifies the valid fixed-point hovering regions; the specific operations are as follows:
    Step 2-1: image partition
    The M*N grayscale image collected by the image acquisition module (210) is divided into sub-regions of size n*n;
    M represents the length of the grayscale image; N represents the width of the grayscale image;
    Step 2-2: calculate the SAD value
    For each sub-region from step 2-1, compute the SAD value; the SAD value is calculated as:
    C = Σ_(x,y) | f_{t+1}(x, y) − f_t(x, y) |    (1)
    where f_t denotes the pixel value at the current time t, f_{t+1} the pixel value at the next time t+1, and C the SAD value, i.e., the sum of the absolute values of the two-frame difference; f denotes the value of a pixel and t the current time;
    Step 2-3: determine the valid fixed-point hovering regions
    Compare the SAD value of each region computed in step 2-2 against the threshold T; a region whose SAD value exceeds the threshold (C − T > 0) is a textured region, which is taken as a valid fixed-point hovering region, and the optical flow is calculated on it;
    Step 3: corresponding to the fixed-point hover optical flow field calculation module (230);
    compute the optical flow of the valid fixed-point hovering regions segmented in step 2 using the HS optical flow method or another optical flow algorithm, and convert the optical flow velocity into a metric speed V_t through the similar-triangle relation between the sensor and the actual object plane;
    Step 4: corresponding to the control parameter calculation module (240), calculate by formula (2) the drone's fixed-point control parameter a, a reverse acceleration, from the actual speed V_t obtained in step 3, and transmit it to the drone flight controller (10) to implement fixed-point control; here a is the reverse acceleration and x is the distance by which the drone drifted from the hover point in the previous loop;
    Step 5: corresponding to the data encoding and transmission module (250), which sends the value a obtained in step 4 through the I2C port to the data receiving module (110) of the flight controller;
    Step 6: the data receiving module (110) of the flight controller (10) receives the drone offset data a and passes it to the control quantity calculation module (120);
    Step 7: the control quantity calculation module (120) receives the data a and computes the compensation for the drone's motion offset, i.e., a velocity vector opposite to the drone's drift velocity vector, converts it into an ESC control signal and sends it to the ESC control module (130);
    Step 8: the ESC control module (130) receives the ESC control signal sent by the control quantity calculation module (120) and controls the magnitude of the current output to the drone motor module (30);
    Step 9: the drone motor module (30) receives the current from the ESC control module (130) and drives the drone opposite to its existing motion, so that the drone hovers at the fixed point.
PCT/CN2017/114574 2017-11-15 2017-12-05 UAV fixed-point hovering system and method WO2019095453A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/495,089 US20200097025A1 (en) 2017-11-15 2017-12-05 An uav fixed point hover system and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711133073.8 2017-11-15
CN201711133073.8A CN107943064B (zh) 2017-11-15 UAV fixed-point hovering system and method

Publications (1)

Publication Number Publication Date
WO2019095453A1 (zh)

Family

ID=61932433

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/114574 WO2019095453A1 (zh) 2017-11-15 2017-12-05 一种无人机定点悬停系统和方法

Country Status (3)

Country Link
US (1) US20200097025A1 (zh)
CN (1) CN107943064B (zh)
WO (1) WO2019095453A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI747718B (zh) * 2020-12-14 2021-11-21 大陸商廣州昂寶電子有限公司 Displacement compensation method and device, and speed compensation method and device
CN114877876A (zh) * 2022-07-12 2022-08-09 南京市计量监督检测院 A method for evaluating the hovering accuracy of drones

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190009103A (ko) * 2017-07-18 2019-01-28 삼성전자주식회사 Electronic device that moves on the basis of the distance to an external object, and control method therefor
CN109050969B (zh) * 2018-06-14 2024-01-30 广东工业大学 A visual fixed-point hovering stability test platform for multi-rotor drones
CN109062238A (zh) * 2018-09-19 2018-12-21 张洋 Device for controlling the hovering of a drone
CN109533277A (zh) * 2018-12-06 2019-03-29 北京工业大学 An interactive follow-shooting aircraft based on gesture recognition
CN109634302B (zh) * 2018-12-06 2022-04-08 河池学院 A quadrotor aircraft system based on optical positioning
CN109634297A (zh) * 2018-12-18 2019-04-16 辽宁壮龙无人机科技有限公司 A multi-rotor drone using optical-flow-sensor positioning and navigation, and a control method
CN109948424A (zh) * 2019-01-22 2019-06-28 四川大学 A crowd abnormal behavior detection method based on acceleration motion feature descriptors
CN110174898A (zh) * 2019-06-18 2019-08-27 华北电力大学(保定) A multi-rotor drone control method based on image feedback
CN110907741B (zh) * 2019-12-18 2022-04-08 中国人民解放军战略支援部队信息工程大学 An equivalent-substitution test system and method for radiated interference effects on drone flight control modules in an anechoic chamber
CN112985388B (zh) * 2021-02-08 2022-08-19 福州大学 A combined navigation method and system based on the large-displacement optical flow method
CN113928558A (zh) * 2021-09-16 2022-01-14 上海合时无人机科技有限公司 A drone-based method for automatically removing and installing spacer bars
CN117969682B (zh) * 2024-03-28 2024-06-04 北京工业大学 An optical flow image processing method for ultrasonic guided wave monitoring signals

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103853156A (zh) * 2014-02-07 2014-06-11 中山大学 A small quadrotor aircraft control system and method based on airborne sensors
US9387928B1 (en) * 2014-12-18 2016-07-12 Amazon Technologies, Inc. Multi-use UAV docking station systems and methods
CN206096947U (zh) * 2016-09-29 2017-04-12 厦门大学嘉庚学院 A quadrotor aircraft suitable for autonomous indoor flight
US9678507B1 (en) * 2015-06-25 2017-06-13 Latitude Engineering, LLC Autonomous infrastructure element survey systems and methods using UAV fleet deployment
CN107077140A (zh) * 2016-03-28 2017-08-18 深圳市大疆创新科技有限公司 Hovering control method and control system for an unmanned aerial vehicle, and unmanned aerial vehicle
CN107289910A (zh) * 2017-05-22 2017-10-24 上海交通大学 An optical flow positioning system based on TOF



Also Published As

Publication number Publication date
CN107943064B (zh) 2019-12-03
CN107943064A (zh) 2018-04-20
US20200097025A1 (en) 2020-03-26

Similar Documents

Publication Publication Date Title
WO2019095453A1 (zh) UAV fixed-point hovering system and method
CN112567201B (zh) Distance measurement method and device
US10565732B2 (en) Sensor fusion using inertial and image sensors
CN105652891B (zh) An autonomous tracking device for moving targets by a rotor drone and its control method
EP3158293B1 (en) Sensor fusion using inertial and image sensors
Achtelik et al. Stereo vision and laser odometry for autonomous helicopters in GPS-denied indoor environments
CN107209514B (zh) Selective processing of sensor data
CN103853156B (zh) A small quadrotor aircraft control system and method based on airborne sensors
WO2018053861A1 (en) Methods and system for vision-based landing
WO2016187759A1 (en) Sensor fusion using inertial and image sensors
CN102190081B (zh) Vision-based robust fixed-point control method for an airship
WO2016187758A1 (en) Sensor fusion using inertial and image sensors
CN105182992A (zh) Control method and device for an unmanned aerial vehicle
CN107390704B (zh) A multi-rotor drone optical-flow hovering method based on IMU attitude compensation
CN105759829A (zh) Laser-radar-based control method and system for a micro drone
Li et al. UAV autonomous landing technology based on AprilTags vision positioning algorithm
CN105045276A (zh) UAV flight control method and device
WO2019051832A1 (zh) Movable object control method, device and system
CN107831776A (zh) Autonomous return-to-home method for a drone based on a nine-axis inertial sensor
GB2246261A (en) Tracking arrangements and systems
CN112947550A (zh) A visual-servoing-based method for striking illegal aircraft, and robot
Amidi et al. Research on an autonomous vision-guided helicopter
CN113206951B (zh) A real-time electronic image stabilization method based on a flapping-wing flight system
Zheng et al. Integrated navigation system with monocular vision and LIDAR for indoor UAVs
CN206117842U (zh) Drone-based image acquisition system and drone

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17932079

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17932079

Country of ref document: EP

Kind code of ref document: A1


32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 02/12/2020)