CN105974932A - Unmanned aerial vehicle control method - Google Patents

Unmanned aerial vehicle control method

Info

Publication number
CN105974932A
CN105974932A (application CN201610269672.1A)
Authority
CN
China
Prior art keywords
control
communication interface
unmanned plane
attitude
key frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610269672.1A
Other languages
Chinese (zh)
Other versions
CN105974932B (en)
Inventor
王国胜
吕强
郭峰
张洋
林辉灿
马建业
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Academy of Armored Forces Engineering of PLA
Original Assignee
Academy of Armored Forces Engineering of PLA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Academy of Armored Forces Engineering of PLA filed Critical Academy of Armored Forces Engineering of PLA
Priority to CN201610269672.1A priority Critical patent/CN105974932B/en
Publication of CN105974932A publication Critical patent/CN105974932A/en
Application granted granted Critical
Publication of CN105974932B publication Critical patent/CN105974932B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D 1/0808 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10 Simultaneous control of position or course in three dimensions
    • G05D 1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An embodiment of the invention discloses an unmanned aerial vehicle (UAV) control method. In the method, a vision sensor collects environment image information; an image acquisition module obtains the environment image information collected by the vision sensor and transmits it to a processing module; the processing module derives height, position and attitude estimates of the UAV from the environment image information, publishes the visual pose estimate to a topic in a specific message format, and simultaneously publishes a map topic to output map information; a flight control board sends the current position, attitude and velocity of the UAV to a communication interface module via the MAVLink protocol; a control module analyzes the current states of the UAV, obtained by subscription, and publishes control commands to a topic for a communication interface node to subscribe to; and the communication interface module integrates the visual pose estimate obtained by subscription from the image acquisition module with the control commands obtained by subscription from the control module, sends them to the flight control board via the MAVLink protocol, and the flight control board controls the UAV.

Description

Unmanned aerial vehicle (UAV) control method
Technical field
The present invention relates to the field of unmanned aerial vehicles, and in particular to a UAV control method.
Background art
Multi-rotor unmanned aerial vehicles (multi-rotor aircraft) are controlled by wireless remote control and can perform special tasks without a pilot on board. They take off and land vertically, are flexible and highly reliable, and can carry additional detection and processing equipment.
With the development of microelectronics, the miniaturization and diversification of computer hardware and microelectronic sensors, and the continuous optimization of control algorithms, the information-processing capability and speed of multi-rotor UAVs have improved, and the tasks they can perform have become increasingly varied.
Current UAVs widely use GPS and inertial navigation systems for positioning and navigation, but in environments where the GPS signal is missing, such as around buildings and in valleys, the UAV still needs to be controlled precisely. How to improve UAV control accuracy is therefore a problem that currently needs to be solved.
Summary of the invention
An embodiment of the present invention provides a UAV control method that can improve UAV control accuracy.
The embodiment of the present invention adopts the following technical solution:
A UAV control method, applied to a UAV control system, the system comprising: a vision sensor, a flight control board, a main control computer and a UAV, the main control computer comprising an image acquisition module, a processing module, a communication interface module and a control module connected in sequence; the method comprises:
The vision sensor collects environment image information;
The image acquisition module obtains the environment image information collected by the vision sensor and transmits it to the processing module;
The processing module derives height, position and attitude estimates of the UAV from the environment image information, publishes the visual pose estimate to a topic in a specific message format, and simultaneously publishes a map topic to output map information;
The flight control board sends the current position, attitude, velocity and other information of the UAV to the communication interface module via the MAVLink protocol;
The control module analyzes the current states of the UAV obtained by subscription, and publishes control commands to a topic for the communication interface node to subscribe to.
The communication interface module integrates the visual pose estimate obtained by subscription from the image acquisition module with the control commands obtained by subscription from the control module, and sends them to the flight control board via the MAVLink protocol; the flight control board controls the UAV so that the UAV, according to the control commands and in combination with the visual pose estimation result, completes the corresponding autonomous flight.
Optionally, the method further comprises: the communication interface module subscribes to a positioning result from the processing module and performs positioning according to the positioning result.
Optionally, the method further comprises:
The communication interface module subscribes to an autonomous control command topic and controls the UAV according to the autonomous control command topic.
Optionally, the method further comprises: the communication interface module obtains the current local position of the UAV from the flight control board.
Optionally, the method further comprises:
The communication interface module is provided with a first interface and a second interface; through the first interface the communication interface module obtains flight-control heartbeat packets, position data packets and attitude-angle data packets, and through the second interface the communication interface module obtains the visual pose estimation result and control commands.
Optionally, the control module obtains the current position of the UAV, generates a position control command, publishes the position control command to the communication interface module, and adjusts the position control command by monitoring the state of the UAV.
Optionally, the processing module estimates the current position and attitude of the UAV and provides the current position and attitude to the flight control board in real time;
The flight control board obtains, by subscribing to the topic, the current position estimation result (x, y, z) and the attitude estimation result in quaternion (x, y, z, w) form.
Optionally, the processing module completes local map construction for each new keyframe K_i; the steps include:
S1, keyframe insertion;
S2, culling of recent map points;
S3, creation of new map points;
S4, local bundle adjustment;
S5, local keyframe culling.
Optionally, the processing module runs a loop-closure detection thread on the last keyframe K_i processed in the local map construction thread; the processing module computes the similarity between K_i and the keyframes already in the system from the bag-of-words vector of K_i to complete candidate loop-closure detection; if there are keyframes at several places whose scene appearance is similar to K_i, there are several loop-closure candidates;
The processing module computes the similarity transformation between the current frame K_i and the loop-closure frame K_l, corrects the accumulated error in the loop, and uses the similarity transformation as the geometric verification of the loop closure;
After the processing module determines the loop closure, it corrects the pose of the current keyframe with the computed similarity transformation; this correction also affects the keyframe nodes adjacent to the current keyframe, and pose-graph optimization is carried out to reach global consistency, so that each map point is transformed according to its corrected keyframe.
Optionally, the UAV comprises a position controller, an attitude controller and drive motors; the position controller receives flight control information and, after the attitude is resolved, controls the drive motors through the attitude controller, thereby performing flight control of the UAV.
With the UAV control method of the above technical solution, after the vision sensor acquires environment image data, the data are transmitted to the main control computer mounted on the UAV; the main control computer processes the received images, completing position and attitude estimation and map construction; the relative position is then sent directly to the flight control board as a flight parameter; after the flight control board resolves the position, height and attitude data of the current aircraft, it controls the UAV so that the UAV, according to the control commands and in combination with the visual pose estimation result, completes the corresponding autonomous flight.
It should be appreciated that the above general description and the following detailed description are merely exemplary and explanatory and do not limit the disclosure.
Brief description of the drawings
The accompanying drawings herein are incorporated into and constitute a part of this specification; they illustrate embodiments consistent with the invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 is a first structural schematic diagram of a UAV control system provided by an embodiment of the present invention;
Fig. 2 is a second structural schematic diagram of a UAV control system provided by an embodiment of the present invention;
Fig. 3 is a third structural schematic diagram of a UAV control system provided by an embodiment of the present invention;
Fig. 4 is a fourth structural schematic diagram of a UAV control system provided by an embodiment of the present invention.
Detailed description of the invention
Exemplary embodiments are described in detail here, with examples shown in the accompanying drawings. When the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The embodiments described below do not represent all embodiments consistent with the present invention; rather, they are merely examples of apparatus and methods consistent with some aspects of the invention as detailed in the appended claims.
Fig. 1 shows a UAV control system provided by an embodiment of the present invention. The system includes: a vision sensor 11, a flight control board 12, a main control computer 13 and a UAV 14.
In the embodiment of the present invention, the flight control board 12 is the hub for autonomous command and flight control; the main control computer 13 obtains flight data and state through the flight control board 12 and also performs autonomous flight control through the flight control board 12.
In one embodiment of the invention, the flight control board 12 may use the following configuration: an STM32F427 Cortex-M4 168 MHz microprocessor with 256 KB of RAM and 2 MB of flash memory. In addition to the main CPU there is also an STM32F103 processor, which prevents the UAV from going out of control if the main processor fails. To extend its capabilities, the flight control board 12 can provide multiple interfaces for connecting external sensors.
In one embodiment of the invention, the flight control board 12 can offer several flight modes to meet different mission requirements, including: manual mode, altitude and position hold mode, mission mode, hover mode and offboard mode. In offboard mode, external setpoints and control commands (for example from the connected main control computer) can be obtained over the serial port, including position, attitude, velocity and acceleration, and the controller of the flight control board resolves the given target values and flies accordingly.
In the embodiment of the present invention, the vision sensor 11 collects environment images. According to the needs of the visual localization and map construction algorithm, a vision sensor can be used that provides images at 640*480 pixel resolution, acquires images at up to 60 frames per second (or 120 frames per second at 320*240 resolution), and has a 75-degree wide-angle imaging mode.
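For illustration, the following is a minimal sketch of how such a capture configuration could be requested with the OpenCV Python bindings; the device index and the exact property support depend on the camera driver and are assumptions rather than part of the embodiment.

    import cv2

    # Minimal sketch: open the monocular camera and request the resolution and
    # frame rate described above (640*480 at 60 fps, or 320*240 at 120 fps).
    # The device index 0 is an assumption; whether the camera honours these
    # properties depends on its driver.
    def open_camera(width=640, height=480, fps=60, device=0):
        cap = cv2.VideoCapture(device)
        cap.set(cv2.CAP_PROP_FRAME_WIDTH, width)
        cap.set(cv2.CAP_PROP_FRAME_HEIGHT, height)
        cap.set(cv2.CAP_PROP_FPS, fps)
        return cap

    if __name__ == "__main__":
        cap = open_camera()
        ok, frame = cap.read()          # grab one environment image
        if ok:
            print("captured frame:", frame.shape)
        cap.release()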
The main control computer 13 mainly processes the data of the vision sensor 11 and performs autonomous control of the whole platform. The hardware configuration of the main control computer 13 can be as shown in Table 1. The main control computer 13 connects a TP-Link WN823N USB Wi-Fi module and the TELEM2 port of the flight control board through a USB hub, establishing reliable communication between the onboard main control computer 13, the flight control board 12 and the ground station computer. Another USB port connects directly to the PS3 Eye monocular vision sensor. The onboard main control computer runs the Ubuntu 12.04 LTS operating system, the Hydro release of the Robot Operating System (ROS) and the OpenCV image processing library.
Table 1 Hardware configuration of the main control computer 13
An embodiment of the present invention provides a UAV control method. The method is applied to the UAV control system shown in Fig. 1, which includes: a vision sensor 11, a flight control board 12, a main control computer 13 and a UAV 14. As shown in Fig. 2, the main control computer 13 includes an image acquisition module 131, a processing module 132, a communication interface module 133 and a control module 134 connected in sequence. The method includes:
The vision sensor 11 collects environment image information;
The image acquisition module 131 obtains the environment image information collected by the vision sensor 11 and transmits it to the processing module 132;
The processing module 132 derives height, position and attitude estimates of the UAV from the environment image information, publishes the visual pose estimate to a topic in a specific message format, and simultaneously publishes a map topic to output map information;
The flight control board 12 sends the current position, attitude, velocity and other information of the UAV to the communication interface module 133 via the MAVLink protocol;
The control module 134 analyzes the current states of the UAV obtained by subscription, and publishes control commands to a topic for the communication interface node to subscribe to.
The communication interface module 133 integrates the visual pose estimate obtained by subscription from the image acquisition module with the control commands obtained by subscription from the control module 134, and sends them to the flight control board 12 via the MAVLink protocol; the flight control board 12 controls the UAV so that the UAV 14, according to the control commands and in combination with the visual pose estimation result, completes the corresponding autonomous flight.
With the UAV control method of the embodiment of the present invention, after the vision sensor acquires environment image data, the data are transmitted to the main control computer mounted on the UAV; the main control computer processes the received images, completing position and attitude estimation and map construction; the relative position is then sent directly to the flight control board as a flight parameter; after the flight control board resolves the position, height and attitude data of the current aircraft, it controls the UAV so that the UAV, according to the control commands and in combination with the visual pose estimation result, completes the corresponding autonomous flight.
In the embodiment of the present invention, the vision sensor 11 can be a monocular vision sensor; the vision sensor 11 collects environment image information and transmits it to the image acquisition module 131 of the main control computer 13, which completes image pre-processing.
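For illustration, a minimal sketch of such an image acquisition node under ROS follows, assuming the standard cv_bridge package and a camera driver publishing sensor_msgs/Image on /camera/image_raw (the topic name is an assumption); the pre-processing itself is left as a placeholder.

    import rospy
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge

    # Minimal sketch of the image acquisition module: subscribe to the camera
    # driver's image topic, convert each frame to an OpenCV matrix, and hand it
    # to the processing module (placeholder callback only).
    bridge = CvBridge()

    def on_image(msg):
        frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        # image pre-processing would happen here before the frame is passed on
        rospy.loginfo("received frame %dx%d", msg.width, msg.height)

    if __name__ == "__main__":
        rospy.init_node("image_acquisition_module")
        rospy.Subscriber("/camera/image_raw", Image, on_image, queue_size=1)
        rospy.spin()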
The processing module 132 processes the image frames according to the visual localization and map construction algorithm, completes the height, position and attitude estimation of the UAV 14, publishes the visual pose estimate to a topic in a specific message format, and simultaneously publishes a map topic to output map information.
The flight control board 12 sends the current position, attitude, velocity and other information of the UAV to the communication interface module 133 via the MAVLink protocol, and the main control computer 13 thereby obtains the current flight state of the UAV 14.
The control module 134 analyzes the current states of the UAV obtained by subscription, and publishes control commands to a topic for the communication interface node to subscribe to.
The communication interface module 133 integrates the visual pose estimate obtained by subscription from the image acquisition module 131 with the control commands obtained by subscription from the control module 134, and sends them to the flight control board 12 via the MAVLink protocol; the UAV 14 then combines the visual pose estimation result with the control commands to complete the corresponding autonomous flight.
In the embodiment of the present invention, optionally, the communication interface module 133 subscribes to a positioning result from the processing module and performs positioning according to the positioning result.
Optionally, the communication interface module 133 subscribes to an autonomous control command topic and controls the UAV according to the autonomous control command topic.
Optionally, the communication interface module 133 obtains the current local position of the UAV from the flight control board.
Optionally, the communication interface module 133 is provided with a first interface and a second interface; through the first interface (/mavlink/to) the communication interface module obtains flight-control heartbeat packets, position data packets and attitude-angle data packets, and through the second interface (/mavlink/from) it obtains the visual pose estimation result and control commands.
In the embodiment of the present invention, the communication interface module 133 can receive the heartbeat packets, attitude-angle data packets, position data packets and the like of the UAV 14 via the MAVLink protocol, and publish the data, classified by specific message formats, to different topics. It also subscribes to topics such as vision and position, and sends these topic data to the flight control board 12 in the MAVLink protocol format, so that the data of external sensors are obtained and autonomous flight control is realized.
In the embodiment of the present invention, the communication interface module 133 can subscribe to different topics and can also publish different topics, and it carries the communication protocol for information transfer between the main control computer and the flight control board 12. The communication interface module 133 is provided with a /mavlink/to interface and a /mavlink/from interface: /mavlink/to is the channel through which the communication interface module 133 obtains information such as flight-control heartbeat packets, position data packets and attitude-angle data packets, and /mavlink/from is the channel through which the flight control board 12 obtains information such as the visual pose estimation result and control commands from the communication interface module 133.
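For illustration, a minimal sketch of the ROS side of such a communication interface follows, assuming the mavros topic names cited in this description; the internal topic names /slam/pose and /control/setpoint are illustrative placeholders for the topics published by the processing module and the control module.

    import rospy
    from geometry_msgs.msg import PoseStamped

    # Minimal sketch of the communication interface module on the ROS side:
    # it relays the visual pose estimate and the position setpoints onto the
    # mavros topics named in this description; mavros then encodes them as
    # MAVLink messages for the flight control board.
    class CommInterface(object):
        def __init__(self):
            self.vision_pub = rospy.Publisher("/mavros/position/vision",
                                              PoseStamped, queue_size=10)
            self.setpoint_pub = rospy.Publisher("/mavros/setpoint/local_position",
                                                PoseStamped, queue_size=10)
            # /slam/pose and /control/setpoint are illustrative internal topics
            rospy.Subscriber("/slam/pose", PoseStamped, self.on_vision_pose)
            rospy.Subscriber("/control/setpoint", PoseStamped, self.on_setpoint)

        def on_vision_pose(self, msg):
            self.vision_pub.publish(msg)      # visual localization result

        def on_setpoint(self, msg):
            self.setpoint_pub.publish(msg)    # autonomous control command

    if __name__ == "__main__":
        rospy.init_node("communication_interface_module")
        CommInterface()
        rospy.spin()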
In the embodiment of the present invention, the communication interface module 133 can subscribe to topics such as /mavros/position/vision, /mavros/setpoint/local_position, /mavros/rc/override, /mavros/safety_area/set and /mavros/mocap/pose. Among the subscribed topics, /mavros/position/vision carries the positioning result of the visual localization and map construction method: the location estimation result of the tracking thread is published to this topic, so the flight control board 12 obtains the current position of the vision sensor and localization is achieved. /mavros/setpoint/local_position is the autonomous control command topic: publishing position control commands to this topic realizes autonomous control of the UAV.
In the embodiment of the present invention, the communication interface module 133 can publish topics such as /mavros/position/local, /mavros/imu/data, /mavros/optical_flow, /mavros/state, /mavros/rc, /mavros/battery, /mavros/camera_image, /mavros/mission/waypoints and /mavros/imu/atm_pressure. Among the published topics, /mavros/position/local is the current local position of the UAV that the flight control board 12 recognizes after obtaining the visual localization result on the /mavros/position/vision topic, and /mavros/imu/data is the inertial measurement unit data. The remainder include topics such as barometer data, remote-control data and mission information.
Optionally, the control module 134 obtains the current position of the UAV, generates a position control command, publishes the position control command to the communication interface module, and adjusts the position control command by monitoring the state of the UAV.
In the embodiment of the present invention, the control module 134 subscribes to the current-position topic of the aircraft to provide position feedback, and then publishes position control commands to the /mavros/setpoint/local_position topic of the communication interface module. Position control information is adjusted through real-time monitoring of the UAV state.
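For illustration, a minimal sketch of such a control module follows, assuming the topic names above; the target position and the 20 Hz command rate are illustrative assumptions rather than values used in the embodiment.

    import rospy
    from geometry_msgs.msg import PoseStamped

    # Minimal sketch of the control module: subscribe to the UAV's current local
    # position as feedback and keep publishing a position setpoint; the setpoint
    # could be adjusted here according to the monitored state.
    class PositionControl(object):
        def __init__(self):
            self.current = None
            self.setpoint_pub = rospy.Publisher("/mavros/setpoint/local_position",
                                                PoseStamped, queue_size=10)
            rospy.Subscriber("/mavros/position/local", PoseStamped, self.on_position)

        def on_position(self, msg):
            self.current = msg                 # position feedback from the flight control board

        def run(self, x, y, z):
            target = PoseStamped()
            target.pose.position.x, target.pose.position.y, target.pose.position.z = x, y, z
            rate = rospy.Rate(20)              # illustrative 20 Hz command rate
            while not rospy.is_shutdown():
                target.header.stamp = rospy.Time.now()
                self.setpoint_pub.publish(target)
                rate.sleep()

    if __name__ == "__main__":
        rospy.init_node("control_module")
        PositionControl().run(0.0, 0.0, 1.0)   # example: hold 1 m above the take-off point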
Optionally, the processing module 132 estimates the current position and attitude of the UAV and provides the current position and attitude to the flight control board in real time;
The flight control board 12 obtains, by subscribing to the topic, the current position estimation result (x, y, z) and the attitude estimation result in quaternion (x, y, z, w) form.
In the embodiment of the present invention, the tracking thread processes each image frame obtained from the camera and completes the pose estimation for each frame; the tracking thread framework is shown in Fig. 3.
In the embodiment of the present invention, the tracking thread mainly estimates the position and attitude of the current aircraft. While the main control computer runs the tracking thread, the flight control board needs to obtain the current position and attitude in real time, so the pose estimation result of the tracking thread must be published as a topic. The flight control board of the aircraft obtains, by subscribing to the topic, the location estimation result (x, y, z) of the current sensor and the attitude estimation result in quaternion (x, y, z, w) form, so the tracking thread publishes a topic named pose in ROS with the PoseStamped message type.
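For illustration, a minimal sketch of that publication step follows, assuming ROS; the pose values would come from the tracking thread, and the frame id "map" is an assumption.

    import rospy
    from geometry_msgs.msg import PoseStamped

    # Minimal sketch of the tracking thread's output: publish each pose estimate
    # as a PoseStamped message on a topic named "pose".
    def publish_pose(pub, x, y, z, qx, qy, qz, qw):
        msg = PoseStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = "map"                       # assumed reference frame
        msg.pose.position.x, msg.pose.position.y, msg.pose.position.z = x, y, z
        (msg.pose.orientation.x, msg.pose.orientation.y,
         msg.pose.orientation.z, msg.pose.orientation.w) = qx, qy, qz, qw
        pub.publish(msg)

    if __name__ == "__main__":
        rospy.init_node("tracking_thread")
        pose_pub = rospy.Publisher("pose", PoseStamped, queue_size=10)
        rospy.sleep(0.5)                                  # let the publisher register
        publish_pose(pose_pub, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0)  # example values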
Optionally, the processing module 132 completes local map construction for each new keyframe K_i; the steps, sketched in code after this list, include:
S1, keyframe insertion;
S2, culling of recent map points;
S3, creation of new map points;
S4, local bundle adjustment;
S5, local keyframe culling.
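For illustration, a minimal skeleton of that local mapping loop follows; every helper function is a named placeholder for the corresponding step S1 to S5 rather than a full implementation.

    # Minimal skeleton of the local map construction performed for each new
    # keyframe K_i; each helper below stands in for the corresponding step S1-S5.
    class LocalMapper(object):
        def __init__(self, local_map):
            self.map = local_map

        def process_keyframe(self, keyframe):
            self.insert_keyframe(keyframe)          # S1: insert the keyframe
            self.cull_recent_map_points()           # S2: cull recently created map points
            self.create_new_map_points(keyframe)    # S3: triangulate new map points
            self.local_bundle_adjustment(keyframe)  # S4: local bundle adjustment
            self.cull_local_keyframes(keyframe)     # S5: cull redundant local keyframes

        def insert_keyframe(self, kf):
            self.map.setdefault("keyframes", []).append(kf)

        def cull_recent_map_points(self):
            pts = self.map.setdefault("points", [])
            # keep only points observed from enough keyframes (placeholder rule)
            self.map["points"] = [p for p in pts if p.get("observations", 0) >= 3]

        def create_new_map_points(self, kf):
            pass  # triangulation against neighbouring keyframes would go here

        def local_bundle_adjustment(self, kf):
            pass  # jointly refine local keyframe poses and map points

        def cull_local_keyframes(self, kf):
            pass  # remove keyframes whose map points are largely seen elsewhere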
Optionally, the processing module 132 runs a loop-closure detection thread on the last keyframe K_i processed in the local map construction thread; the processing module computes the similarity between K_i and the keyframes already in the system from the bag-of-words vector of K_i to complete candidate loop-closure detection; if there are keyframes at several places whose scene appearance is similar to K_i, there are several loop-closure candidates;
The processing module 132 computes the similarity transformation between the current frame K_i and the loop-closure frame K_l, corrects the accumulated error in the loop, and uses the similarity transformation as the geometric verification of the loop closure;
After the processing module 132 determines the loop closure, it corrects the pose of the current keyframe with the computed similarity transformation; this correction also affects the keyframe nodes adjacent to the current keyframe, and pose-graph optimization is carried out to reach global consistency, so that each map point is transformed according to its corrected keyframe.
In the embodiment of the present invention, the loop-closure detection thread operates mainly on the last keyframe K_i processed in the local map construction thread. The similarity between K_i and the keyframes already in the system is computed from the bag-of-words vector of K_i to complete candidate loop-closure detection. If there are keyframes at several places whose scene appearance is similar to K_i, there are several loop-closure candidates.
In the visual localization and map construction of the UAV 14, the drift of the map has 7 degrees of freedom: 3 rotations, 3 translations and 1 scale. To determine whether a closed loop exists, the similarity transformation between the current frame K_i and the loop-closure frame K_l must be computed and the accumulated error in the loop corrected. The similarity transformation also serves as the geometric verification of the loop closure.
After the loop closure is determined, loop correction is required: the pose of the current keyframe is corrected by the computed similarity transformation, and this correction also affects the keyframe nodes adjacent to the current keyframe. Finally, pose-graph optimization is carried out to reach global consistency, and after optimization each map point is transformed according to its corrected keyframe.
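For illustration, a minimal sketch of the candidate-detection step follows, assuming the bag-of-words vectors are available as NumPy arrays; the cosine-similarity score and the fixed threshold are simplifications of the scoring actually used in such systems.

    import numpy as np

    # Minimal sketch of candidate loop-closure detection: score the bag-of-words
    # vector of the current keyframe K_i against every keyframe already in the
    # system and keep those whose appearance is similar enough.
    def cosine_similarity(a, b):
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(np.dot(a, b) / denom) if denom > 0 else 0.0

    def loop_candidates(bow_current, bow_keyframes, threshold=0.75):
        """bow_keyframes: dict mapping keyframe id -> bag-of-words vector."""
        scores = {kid: cosine_similarity(bow_current, v)
                  for kid, v in bow_keyframes.items()}
        # several similar-looking keyframes mean several loop-closure candidates
        return [kid for kid, s in scores.items() if s >= threshold]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        db = {k: rng.standard_normal(128) for k in range(5)}  # toy descriptors standing in for BoW vectors
        query = db[3] + 0.01 * rng.standard_normal(128)       # current keyframe resembles keyframe 3
        print(loop_candidates(query, db))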
In the embodiment of the present invention, as shown in Fig. 4, the system can also include a ground station computer 15. To build the ground station system for starting and initializing the programs and for monitoring the flight state of the UAV in real time, a wireless router is used for networking: IP addresses are assigned automatically to the main control computer and the ground station computer so that the two are associated, and the display of the main control computer is transmitted wirelessly.
The ground station computer 15 is implemented with the Virtual Network Computing (VNC) remote control tool: the onboard main control computer, running the Ubuntu operating system, is the server (vncserver), and the ground station computer, running the Windows 7 operating system, is the client (vncviewer).
VNC Viewer is selected as the VNC client tool of the ground station computer 15. After the tool is started, the IP address needs to be configured, and the Server field is the IP address of the main control computer server. After a successful connection, the screen of the onboard main control computer server can be seen, and operations such as starting and configuring the server-side main control computer can be performed in real time from the ground station client.
As shown in Fig. 4, the UAV 14 includes a position controller 141, an attitude controller 142 and drive motors 143. The position controller 141 receives flight control information and, after the attitude is resolved, drives the motors 143 through the attitude controller 142, thereby performing flight control of the UAV 14.
With the UAV control method of the embodiment of the present invention, after the monocular vision sensor acquires environment image data, the data are transmitted to the onboard main control computer mounted on the UAV; the main control computer processes the received images, completing position and attitude estimation and map construction; the relative position is then sent directly to the low-level flight control board as a flight parameter; after the flight control board resolves the position, height and attitude data of the current aircraft, it sends them back to the main control computer, and the main control computer sends control command parameters to the position controller of the aircraft, forming a control loop. After the position controller completes the attitude resolution, the attitude controller performs autonomous flight control of the aircraft.
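For illustration, a minimal sketch of such a cascaded position-to-attitude loop follows; the proportional gains and the mapping from desired acceleration to tilt angles are simplified assumptions and not the controller running on the flight control board.

    import math

    # Minimal sketch of the cascaded loop: an outer position controller turns the
    # position error into desired roll/pitch angles, and an inner attitude
    # controller turns the attitude error into rate commands for the motors.
    KP_POS = 0.8      # illustrative position-loop gain
    KP_ATT = 4.0      # illustrative attitude-loop gain
    G = 9.81

    def position_controller(pos, pos_ref, yaw):
        ex, ey = pos_ref[0] - pos[0], pos_ref[1] - pos[1]
        ax, ay = KP_POS * ex, KP_POS * ey               # desired horizontal acceleration
        # rotate the desired acceleration into the body frame, then map to tilt angles
        axb = math.cos(yaw) * ax + math.sin(yaw) * ay
        ayb = -math.sin(yaw) * ax + math.cos(yaw) * ay
        pitch_ref = math.atan2(axb, G)
        roll_ref = math.atan2(-ayb, G)
        return roll_ref, pitch_ref

    def attitude_controller(att, att_ref):
        # proportional attitude loop producing roll/pitch rate commands
        return [KP_ATT * (r - a) for a, r in zip(att, att_ref)]

    if __name__ == "__main__":
        roll_ref, pitch_ref = position_controller((0.0, 0.0), (1.0, 0.5), yaw=0.0)
        rates = attitude_controller((0.0, 0.0), (roll_ref, pitch_ref))
        print("attitude setpoint:", roll_ref, pitch_ref, "rate commands:", rates)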
Various embodiments of the present invention have been described above. The description is exemplary, not exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those skilled in the art without departing from the scope and spirit of the illustrated embodiments. The terms used herein were chosen to best explain the principles of each embodiment, the practical application or the improvement over technology found in the market, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
The above is only a specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any person familiar with the technical field can readily conceive of changes or substitutions within the technical scope disclosed by the invention, and these should all be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the scope of the claims.

Claims (10)

1. An unmanned aerial vehicle control method, characterised in that the method is applied to a UAV control system, the system comprising: a vision sensor, a flight control board, a main control computer and a UAV, the main control computer comprising an image acquisition module, a processing module, a communication interface module and a control module connected in sequence; the method comprises:
The vision sensor collects environment image information;
The image acquisition module obtains the environment image information collected by the vision sensor and transmits it to the processing module;
The processing module derives height, position and attitude estimates of the UAV from the environment image information, publishes the visual pose estimate to a topic in a specific message format, and simultaneously publishes a map topic to output map information;
The flight control board sends the current position, attitude, velocity and other information of the UAV to the communication interface module via the MAVLink protocol;
The control module analyzes the current states of the UAV obtained by subscription, and publishes control commands to a topic for the communication interface node to subscribe to.
The communication interface module integrates the visual pose estimate obtained by subscription from the image acquisition module with the control commands obtained by subscription from the control module, and sends them to the flight control board via the MAVLink protocol; the flight control board controls the UAV so that the UAV, according to the control commands and in combination with the visual pose estimation result, completes the corresponding autonomous flight.
2. The method according to claim 1, characterised in that it further comprises: the communication interface module subscribes to a positioning result from the processing module and performs positioning according to the positioning result.
3. The method according to claim 1 or 2, characterised in that it further comprises:
The communication interface module subscribes to an autonomous control command topic and controls the UAV according to the autonomous control command topic.
4. The method according to any one of claims 1 to 3, characterised in that it further comprises: the communication interface module obtains the current local position of the UAV from the flight control board.
5. The method according to any one of claims 1 to 4, characterised in that it further comprises:
The communication interface module is provided with a first interface and a second interface; through the first interface the communication interface module obtains flight-control heartbeat packets, position data packets and attitude-angle data packets, and through the second interface the communication interface module obtains the visual pose estimation result and control commands.
6. The method according to any one of claims 1 to 5, characterised in that the control module obtains the current position of the UAV, generates a position control command, publishes the position control command to the communication interface module, and adjusts the position control command by monitoring the state of the UAV.
7. The method according to any one of claims 1 to 6, characterised in that
The processing module estimates the current position and attitude of the UAV and provides the current position and attitude to the flight control board in real time;
The flight control board obtains, by subscribing to the topic, the current position estimation result (x, y, z) and the attitude estimation result in quaternion (x, y, z, w) form.
8. The method according to any one of claims 1 to 7, characterised in that the processing module completes local map construction for each new keyframe K_i; the steps include:
S1, keyframe insertion;
S2, culling of recent map points;
S3, creation of new map points;
S4, local bundle adjustment;
S5, local keyframe culling.
9. The method according to claim 8, characterised in that the processing module runs a loop-closure detection thread on the last keyframe K_i processed in the local map construction thread; the processing module computes the similarity between K_i and the keyframes already in the system from the bag-of-words vector of K_i to complete candidate loop-closure detection; if there are keyframes at several places whose scene appearance is similar to K_i, there are several loop-closure candidates;
The processing module computes the similarity transformation between the current frame K_i and the loop-closure frame K_l, corrects the accumulated error in the loop, and uses the similarity transformation as the geometric verification of the loop closure;
After the processing module determines the loop closure, it corrects the pose of the current keyframe with the computed similarity transformation; this correction also affects the keyframe nodes adjacent to the current keyframe, and pose-graph optimization is carried out to reach global consistency, so that each map point is transformed according to its corrected keyframe.
10. The method according to any one of claims 1 to 9, characterised in that the UAV comprises a position controller, an attitude controller and drive motors; the position controller receives flight control information and, after the attitude is resolved, controls the drive motors through the attitude controller, thereby performing flight control of the UAV.
CN201610269672.1A 2016-04-27 2016-04-27 Unmanned aerial vehicle (UAV) control method Expired - Fee Related CN105974932B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610269672.1A CN105974932B (en) 2016-04-27 2016-04-27 Unmanned aerial vehicle (UAV) control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610269672.1A CN105974932B (en) 2016-04-27 2016-04-27 Unmanned aerial vehicle (UAV) control method

Publications (2)

Publication Number Publication Date
CN105974932A true CN105974932A (en) 2016-09-28
CN105974932B CN105974932B (en) 2018-11-09

Family

ID=56994061

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610269672.1A Expired - Fee Related CN105974932B (en) 2016-04-27 2016-04-27 Unmanned aerial vehicle (UAV) control method

Country Status (1)

Country Link
CN (1) CN105974932B (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102967297A (en) * 2012-11-23 2013-03-13 浙江大学 Space-movable visual sensor array system and image information fusion method
CN104537709A (en) * 2014-12-15 2015-04-22 西北工业大学 Real-time three-dimensional reconstruction key frame determination method based on position and orientation changes
CN104950906A (en) * 2015-06-15 2015-09-30 中国人民解放军国防科学技术大学 Unmanned aerial vehicle remote measuring and control system and method based on mobile communication network

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
商博: "Research on SLAM for an Indoor Quadrotor Aircraft Based on ROS", China Master's Theses Full-text Database, Engineering Science and Technology II *
张臻炜 et al.: "A Real-time 3D Reconstruction Method for UAVs Based on Computer Vision", Design and Research *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106873619A (en) * 2017-01-23 2017-06-20 上海交通大学 A method for processing an unmanned aerial vehicle flight path
CN106873619B (en) * 2017-01-23 2021-02-02 上海交通大学 Processing method of flight path of unmanned aerial vehicle
CN107526681B (en) * 2017-08-11 2020-05-26 上海木木聚枞机器人科技有限公司 Robot test system and test method
CN107526681A (en) * 2017-08-11 2017-12-29 上海木爷机器人技术有限公司 A robot test system and test method
CN111247576A (en) * 2017-08-11 2020-06-05 联想(北京)有限公司 Subscription information configuration
CN107450583A (en) * 2017-08-23 2017-12-08 浙江工业大学 UAV motion tracking system based on a Qualcomm Snapdragon processor
CN107831776A (en) * 2017-09-14 2018-03-23 湖南优象科技有限公司 Autonomous return-to-home method for a UAV based on nine-axis inertial sensors
CN109816769A (en) * 2017-11-21 2019-05-28 深圳市优必选科技有限公司 Scene map generation method, device and equipment based on depth camera
CN108845587A (en) * 2018-06-08 2018-11-20 赫星科技有限公司 UAV real-time control system and UAV
CN110455285A (en) * 2019-07-22 2019-11-15 深圳联合飞机科技有限公司 UAV navigation method and navigation device for satellite navigation signal failure
CN111178342A (en) * 2020-04-10 2020-05-19 浙江欣奕华智能科技有限公司 Pose graph optimization method, device, equipment and medium
CN111796603A (en) * 2020-06-16 2020-10-20 五邑大学 Smoke inspection unmanned aerial vehicle system, inspection detection method and storage medium
CN112802104A (en) * 2021-02-04 2021-05-14 华南理工大学 Loop detection method based on RGB-D camera
CN112802104B (en) * 2021-02-04 2022-09-16 华南理工大学 Loop detection method based on RGB-D camera
CN113409485A (en) * 2021-08-03 2021-09-17 广东电网有限责任公司佛山供电局 Inspection data acquisition method and device, computer equipment and storage medium
CN113409485B (en) * 2021-08-03 2023-12-12 广东电网有限责任公司佛山供电局 Inspection data acquisition method and device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN105974932B (en) 2018-11-09

Similar Documents

Publication Publication Date Title
CN105974932A (en) Unmanned aerial vehicle control method
US12019458B2 (en) Systems and methods for coordinating device actions
Qin et al. Robust initialization of monocular visual-inertial estimation on aerial robots
Achtelik et al. Onboard IMU and monocular vision based control for MAVs in unknown in-and outdoor environments
EP3158412B1 (en) Sensor fusion using inertial and image sensors
EP3158293B1 (en) Sensor fusion using inertial and image sensors
EP3158417B1 (en) Sensor fusion using inertial and image sensors
Shen et al. Vision-based state estimation for autonomous rotorcraft MAVs in complex environments
EP3158411B1 (en) Sensor fusion using inertial and image sensors
WO2017206179A1 (en) Simple multi-sensor calibration
Loianno et al. Smartphones power flying robots
WO2021217430A1 (en) System and method for operating a movable object based on human body indications
EP3398019A1 (en) Methods and systems for movement control of flying devices
EP2895819A1 (en) Sensor fusion
US20210009270A1 (en) Methods and system for composing and capturing images
Loianno et al. A swarm of flying smartphones
WO2021081774A1 (en) Parameter optimization method and apparatus, control device, and aircraft
WO2022126397A1 (en) Data fusion method and device for sensor, and storage medium
US10375359B1 (en) Visually intelligent camera device with peripheral control outputs
Shaqura et al. Human supervised multirotor UAV system design for inspection applications
JP2019023865A (en) Method, system, and program for executing error recovery
Lachow et al. Autonomous Quadcopter for Multiple Robot Tracking and Interaction in GPS-Denied Environments
CN118225082A (en) Multi-machine co-location system and method based on visual location and UWB assistance
AZRAD et al. Localization of Small Unmanned Air Vehicle in GPS-Denied Environment Using Embedded Stereo Camera

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20181109