CN114089779B - Vision-based autonomous control method and system for an aerial robot in a GPS-denied environment - Google Patents

Vision-based autonomous control method and system for an aerial robot in a GPS-denied environment

Info

Publication number
CN114089779B
CN114089779B (application CN202111407640.0A)
Authority
CN
China
Prior art keywords
error
control
aerial robot
speed
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111407640.0A
Other languages
Chinese (zh)
Other versions
CN114089779A (en)
Inventor
钟杭
范泷文
王耀南
缪志强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan University
Original Assignee
Hunan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan University
Priority to CN202111407640.0A
Publication of CN114089779A
Application granted
Publication of CN114089779B
Legal status: Active

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08 - Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808 - Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 - Simultaneous control of position or course in three dimensions
    • G05D1/101 - Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a vision-based autonomous control method and system for an aerial robot in a GPS-denied environment, comprising the following steps: the camera module acquires a surrounding environment image in real time and sends it to the computer module; the computer module obtains a homography matrix according to the surrounding environment image and a preset reference image and sends the homography matrix to the attitude controller; the attitude controller obtains a first control error and an attitude error control variable according to the homography matrix and a preset virtual control error function, and sends the first control error to the computer module; the computer module estimates the speed of the aerial robot in real time according to the first control error and a preset speed estimation error function and sends the estimate to the attitude controller; the attitude controller obtains an attitude control quantity according to the speed estimated in real time, a preset error control mode, a preset dynamic equation of the aerial robot, the first control error and the attitude error control variable, and controls the autonomous flight of the aerial robot according to the attitude control quantity.

Description

Vision-based autonomous control method and system for an aerial robot in a GPS-denied environment
Technical Field
The invention belongs to the field of autonomous control of aerial robots, and particularly relates to a vision-based autonomous control method and system for an aerial robot in a GPS-denied environment.
Background
Aerial robot theory and technology have developed rapidly, and the functions an aerial robot can achieve are increasingly powerful; in some fields aerial robots are gradually replacing manned aircraft, and they are especially efficient in environments that people can hardly reach and in challenging projects. However, many application scenarios of aerial robots pose not only the challenge of a GPS-denied environment but also the problem of efficient attitude control, so attitude control based on an on-board visual sensor has become the key to whether an aerial robot can fly autonomously in a GPS-denied environment in the future.
Controlling an aerial robot requires its state information, after which appropriate control actions can be taken. Extracting the translational speed from on-board sensing alone remains an open problem for aerial robots, yet translational speed is key information for their control. The most common sensing means for this purpose are GPS and vision. GPS relies on external sources (satellites) to provide global position information for the vehicle, so it cannot operate in cluttered urban areas, is unreliable at low altitudes, is subject to satellite signal cut-off, and is not a passive sensing modality. A traditional aerial robot uses GPS to obtain its position information and then achieves its operational goal through the position-loop control of the controller; however, in some complex scenes (such as urban streets and indoor rooms) GPS support is unavailable and stable positioning information cannot be provided to the aerial robot, so its attitude becomes uncontrollable, and an aerial robot controlled through the position loop suffers from control lag and attitude instability.
Attitude control of an aerial robot in a GPS-denied environment demands high dynamic robustness, which traditional position control cannot provide; a new control method is therefore needed so that the aerial robot can maintain highly robust control in more complex environments.
Disclosure of Invention
To address these technical problems, the invention provides a vision-based autonomous control method and system with which an aerial robot can fly autonomously in a GPS-denied environment.
The technical solution adopted to solve the above technical problems is as follows:
The vision-based autonomous control method for an aerial robot in a GPS-denied environment comprises the following steps:
step S100: the camera module of the aerial robot acquires the surrounding environment image in real time and sends it to the computer module of the aerial robot; the computer module acquires the surrounding environment image in real time, obtains a homography matrix according to the surrounding environment image and a preset reference image, and sends the homography matrix to the attitude controller of the aerial robot;
step S200: the attitude controller obtains a first control error and an attitude error control variable according to the homography matrix and a preset virtual control error function, and sends the first control error to a computer module of the aerial robot;
step S300: the computer module of the aerial robot estimates the speed of the aerial robot in real time according to the first control error and a preset speed estimation error function and sends the estimated speed to the attitude controller;
step S400: the attitude controller obtains an attitude control quantity according to the self speed estimated in real time, a preset error control mode, a preset dynamic equation of the aerial robot, a first control error and an attitude error control variable, and controls the autonomous flight of the aerial robot according to the attitude control quantity.
Preferably, step S100 includes:
step S110: acquiring a preset picture containing a target pose as a preset reference image, calculating ORB feature points of the preset reference image and generating a descriptor corresponding to the reference image as a matching object for subsequent frames;
step S120: acquiring surrounding environment images during flight of the aerial robot in real time as current image frames, calculating ORB feature points of the current image frames and generating descriptors corresponding to the current image frames, performing feature matching between the descriptors corresponding to the current image frames and the descriptors corresponding to the reference image by adopting a random sample consensus (RANSAC) algorithm, and screening the matched points until a preset number of matching pairs with the highest confidence are selected;
step S130: solving a homography matrix between the current image frame and the preset reference image frame according to the preset number of matching pairs with the highest confidence;
step S140: sending the homography matrix to the attitude controller of the aerial robot.
Preferably, the step S200 specifically includes:
wherein e_1 represents the preset virtual control error function, the symbol ≜ means "is defined as", H is the homography matrix, H_v is the virtual homography matrix, R̄ is the product of the pitch rotation matrix and the roll rotation matrix of the aerial robot, I_3 is the 3 x 3 identity matrix, m* = [0,0,1]^T, e_z = [0,0,1]^T, e_Ω is the defined attitude error control variable, (·)^T denotes the matrix transpose, e_p1 is the defined first control error, and vex(·) converts an antisymmetric matrix into a vector.
Preferably, the speed estimation error function preset in step S300 is specifically:
wherein ê_p1 is the estimate of the true error of the defined virtual image plane, V̂_v is the estimate of the real speed of the defined virtual image plane, ẽ_p1 is the deviation of the estimate of the true error of the defined virtual image plane, Ṽ_v is the deviation of the estimate of the real speed of the defined virtual image plane, and V_v is the real speed value of the virtual image plane of the aerial robot;
the update function of the speed value is specifically:
wherein k_1, k_2 and a* are manual setting values, an intermediate variable of the speed-estimation part in the computer module and its derivative also appear in the update, Ω_z' is the true yaw rate of the aerial robot, e_z = [0,0,1]^T, F_v is the sum of forces on the virtual image plane, m is the actual weight of the aerial robot, and [·]_× denotes replacing a vector by its antisymmetric matrix.
Preferably, step S400 includes:
step S410: solving a dynamic model equation of a virtual plane according to a preset dynamic equation of the aerial robot and a first control error;
step S420: obtaining a second control error according to the self speed estimated in real time and a preset error control mode;
step S430: obtaining an attitude control rate equation according to the dynamic model equation of the virtual plane and the second control error;
step S440: and inputting the second control error into a gesture control rate equation to obtain lift force, pitch angle speed and rolling angular speed of the aerial robot, and obtaining yaw angular speed according to gesture error control variables, wherein the lift force, the pitch angle speed, the rolling angular speed and the yaw angular speed are gesture control amounts.
Preferably, the dynamic model equation of the virtual plane in step S410 is specifically:
wherein ė_p1 is the derivative of the first control error, e_p1 is the first control error, V_v is the real speed value of the virtual image plane of the aerial robot, R̄ is the product of the pitch rotation matrix and the roll rotation matrix of the aerial robot and its derivative also enters the equation, Ω_z' is the true yaw rate of the aerial robot, e_z = [0,0,1]^T, m is the actual weight of the aerial robot, a* is a manual setting value, I is the identity matrix, J is the rotational inertia of the airframe, Γ is the moment, F_v is the sum of forces on the virtual image plane, F = -T'e_z + m g R^T e_z is the resultant force, T' is the lift, g is the gravitational acceleration constant, Ω is the angular velocity, and Ω̇ is the derivative of the angular velocity.
Preferably, step S420 specifically includes:
wherein e_p1 is the first control error, e_p2 is the third control error, e_p3 is the second control error, m is the actual weight of the aerial robot, and c and k_3 are debugging parameters.
Preferably, step S430 specifically includes:
wherein Ḟ_v is the derivative of the sum of forces on the virtual image plane, F_v is the sum of forces on the virtual image plane, m is the actual weight of the aerial robot, Ω_z' is the true yaw rate of the aerial robot, e_z = [0,0,1]^T, R̄ is the product of the pitch rotation matrix and the roll rotation matrix of the unmanned aerial vehicle, e_p3 is the second control error, k_3 > 0 and k_4 > 0 are manual setting values, Ω_x is the roll angular velocity, Ω_y is the pitch angular velocity, T' is the lift, and Ṫ' is the derivative of the lift.
Preferably, in step S440, the yaw rate is obtained according to the attitude error control variable, specifically:
wherein Ω_z is the yaw rate, k_5 is a manual setting value, and e_Ω is the attitude error control variable.
The vision-based autonomous control system for an aerial robot in a GPS-denied environment comprises an aerial robot, wherein the aerial robot comprises an aerial robot body, a camera module, a computer module and an attitude controller; the computer module, the camera module and the attitude controller are arranged on the aerial robot body, the computer module is arranged directly above the middle of the aerial robot body, the camera module is arranged directly below the aerial robot body with a downward field of view, the camera module is connected with the computer module, and the computer module is connected with the attitude controller;
the camera module collects surrounding environment images in real time and sends the surrounding environment images to the computer module;
the computer module acquires an ambient image in real time, acquires a homography matrix according to the ambient image and a preset reference image, and sends the homography matrix to the gesture controller; estimating the speed of the self in real time according to the first control error and a preset speed estimation error function and sending the speed to the gesture controller
The gesture controller obtains a first control error and a gesture error control variable according to the homography matrix and a preset virtual control error function, and sends the first control error to the computer module; and obtaining an attitude control quantity according to the self speed estimated in real time, a preset error control mode, a preset dynamic equation of the aerial robot, the first control error and an attitude error control variable, and controlling the autonomous flight of the aerial robot according to the attitude control quantity.
In this method, the aerial robot does not need to know its own position information and can accomplish tasks such as hovering, landing and target tracking using visual information alone. Taking the image of a downward-facing camera as input, a new attitude controller is obtained by solving the homography matrix between a preset reference image and the current image; at the same time, a speed-estimation module is designed in the computer module to improve robustness in a GPS-denied environment. The homography matrix and the speed estimated by the speed-estimation module are input to the attitude controller to obtain the attitude control quantity, with which the target pose is reached. This achieves robust control in a GPS-denied environment, solves the troublesome problems of hovering, landing and tracking that traditional aerial robots face in a GPS-denied environment, and realizes autonomous control of the aerial robot. An aerial robot controlled by this method can operate stably under GPS denial, controlled autonomously by its own computer module alone.
Drawings
Fig. 1 is a flowchart of a vision-based autonomous control method for an aerial robot in a GPS-denied environment according to an embodiment of the present invention.
Fig. 2 is a schematic structural diagram of a vision-based autonomous control system for an aerial robot in a GPS-denied environment according to an embodiment of the present invention.
Detailed Description
In order to make the technical scheme of the present invention better understood by those skilled in the art, the present invention will be further described in detail with reference to the accompanying drawings.
In one embodiment, as shown in fig. 1, the visual servo control method for an aerial robot in a GPS-denied environment comprises the following steps:
step S100: the camera module of the aerial robot acquires the surrounding environment image in real time and sends it to the computer module of the aerial robot; the computer module acquires the surrounding environment image in real time, obtains a homography matrix according to the surrounding environment image and a preset reference image, and sends the homography matrix to the attitude controller of the aerial robot;
step S200: the attitude controller obtains a first control error and an attitude error control variable according to the homography matrix and a preset virtual control error function, and sends the first control error to a computer module of the aerial robot;
step S300: the computer module of the aerial robot estimates the speed of the aerial robot in real time according to the first control error and a preset speed estimation error function and sends the estimated speed to the attitude controller;
step S400: the attitude controller obtains an attitude control quantity according to the self speed estimated in real time, a preset error control mode, a preset dynamic equation of the aerial robot, a first control error and an attitude error control variable, and controls the autonomous flight of the aerial robot according to the attitude control quantity.
Specifically, the hardware system implementing the method comprises an aerial robot body, a camera module, a computer module and an attitude controller; the computer module, the camera module and the attitude controller are arranged on the aerial robot body, the computer module is arranged directly above the middle of the aerial robot body, the camera module is arranged directly below the aerial robot body with a downward field of view, the camera module is connected with the computer module, and the computer module is connected with the attitude controller. Further, the field of view of the camera is 135 degrees, the refresh rate is 60 Hz, and the picture size is 1440x1080 pixels; the aerial robot platform adopts a quadrotor unmanned aerial vehicle but is not limited to a quadrotor.
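For reference, the hardware parameters stated above can be collected in one place. The snippet below is only an illustrative configuration record; the field names are assumptions and are not defined by the patent.

```python
# Illustrative configuration record for the embodiment described above.
# Field names are assumptions; the values are those stated in the text.
CAMERA_CONFIG = {
    "field_of_view_deg": 135,      # downward-facing camera field of view
    "refresh_rate_hz": 60,         # image refresh frequency
    "resolution_px": (1440, 1080)  # picture pixel size
}
PLATFORM = "quadrotor"  # the embodiment uses a quadrotor, but is not limited to it
```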
In this method, the aerial robot does not need to know its own position information and can accomplish tasks such as hovering, landing and target tracking using visual information alone. Taking the image of a downward-facing camera as input, a new attitude controller is obtained by solving the homography matrix between a preset reference image and the current image; at the same time, a speed-estimation module is designed in the computer module to improve robustness in a GPS-denied environment. The homography matrix and the self-estimated speed are input to the attitude controller to obtain the attitude control quantity, with which the target pose is reached. This achieves robust control in a GPS-denied environment, solves the troublesome problems of hovering, landing and tracking that traditional aerial robots face in a GPS-denied environment, and realizes autonomous control of the aerial robot. An aerial robot controlled by this method can operate stably under GPS denial and is controlled autonomously solely by its on-board computer; compared with traditional GPS-based control methods it is better suited to autonomous flight in complex environments and provides a feasible scheme for practical application of aerial robots in GPS-denied environments.
In one embodiment, step S100 includes:
step S110: acquiring a preset picture containing a target pose as a preset reference image, calculating ORB feature points of the preset reference image and generating a descriptor corresponding to the reference image as a matching object for subsequent frames;
step S120: acquiring surrounding environment images during flight of the aerial robot in real time as current image frames, calculating ORB feature points of the current image frames and generating descriptors corresponding to the current image frames, performing feature matching between the descriptors corresponding to the current image frames and the descriptors corresponding to the reference image by adopting a random sample consensus (RANSAC) algorithm, and screening the matched points until a preset number of matching pairs with the highest confidence are selected;
step S130: solving a homography matrix between the current image frame and the preset reference image frame according to the preset number of matching pairs with the highest confidence;
step S140: sending the homography matrix to the attitude controller of the aerial robot.
Specifically, the input required by the attitude controller is the homography matrix between a preset reference picture and the current picture. For example, for tracking, a picture containing the target is obtained in advance to serve as the reference picture, and the controller drives the unmanned aerial vehicle toward the pose of the reference picture; if the reference is a video stream, the previous frame is taken as the reference picture. Further, the preset number of matching pairs is 64, though it is not limited to this number and can be adjusted to the actual situation; if the preset number of matching pairs is not reached during feature matching, the matching is considered to have failed and step S120 is performed again.
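As a non-authoritative illustration of steps S110 to S130, the sketch below uses OpenCV (an assumed toolchain; the patent does not prescribe any library) to extract ORB features, keep the preset number of best matching pairs, and fit the homography with random sample consensus (RANSAC). Function and parameter names are illustrative.

```python
import cv2
import numpy as np

def estimate_homography(reference_img, current_img, max_pairs=64):
    """Sketch of steps S110-S130: ORB features, descriptor matching and
    RANSAC-based homography estimation between the reference and current frame."""
    orb = cv2.ORB_create()
    kp_ref, des_ref = orb.detectAndCompute(reference_img, None)  # reference descriptors (S110)
    kp_cur, des_cur = orb.detectAndCompute(current_img, None)    # current-frame descriptors (S120)
    if des_ref is None or des_cur is None:
        return None  # no features found: treat as a failed match and retry with the next frame

    # Hamming-distance matching for binary ORB descriptors, best matches first.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_ref, des_cur), key=lambda m: m.distance)[:max_pairs]
    if len(matches) < max_pairs:
        return None  # fewer than the preset number of pairs: matching failed, repeat step S120

    src = np.float32([kp_ref[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_cur[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC rejects outlier correspondences before solving for the homography (S130).
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=3.0)
    return H  # sent on to the attitude controller (S140)
```

Whether the candidate matches are screened by RANSAC before or after truncating to 64 pairs is an implementation choice; the text only requires that the preset number of highest-confidence pairs be selected.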
In one embodiment, step S200 is specifically:
wherein e_1 represents the preset virtual control error function, the symbol ≜ means "is defined as", H is the homography matrix, H_v is the virtual homography matrix, R̄ is the product of the pitch rotation matrix and the roll rotation matrix of the aerial robot, I_3 is the 3 x 3 identity matrix, m* = [0,0,1]^T, e_z = [0,0,1]^T, e_Ω is the defined attitude error control variable, (·)^T denotes the matrix transpose, e_p1 is the defined first control error, and vex(·) converts an antisymmetric matrix into a vector.
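The two operators appearing in these definitions, vex(·) and the skew map [·]_× used in the speed-estimation update below, are standard; a minimal numpy sketch, included here only for illustration, is:

```python
import numpy as np

def skew(v):
    """[v]_x: replace a 3-vector by its antisymmetric (skew-symmetric) matrix."""
    return np.array([[0.0,  -v[2],  v[1]],
                     [v[2],  0.0,  -v[0]],
                     [-v[1], v[0],  0.0]])

def vex(M):
    """vex(): extract the 3-vector from an antisymmetric matrix (inverse of skew).
    For a general matrix, the antisymmetric part 0.5*(M - M.T) is taken first."""
    A = 0.5 * (np.asarray(M) - np.asarray(M).T)
    return np.array([A[2, 1], A[0, 2], A[1, 0]])
```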
In one embodiment, the speed estimation error function preset in step S300 is specifically:
wherein ê_p1 is the estimate of the true error of the defined virtual image plane, V̂_v is the estimate of the real speed of the defined virtual image plane, ẽ_p1 is the deviation of the estimate of the true error of the defined virtual image plane, Ṽ_v is the deviation of the estimate of the real speed of the defined virtual image plane, and V_v is the real speed value of the virtual image plane of the aerial robot;
the update function of the speed value is specifically:
wherein k_1, k_2 and a* are manual setting values, an intermediate variable of the speed-estimation part in the computer module and its derivative also appear in the update, Ω_z' is the true yaw rate of the aerial robot, e_z = [0,0,1]^T, F_v is the sum of forces on the virtual image plane, m is the actual weight of the aerial robot, and [·]_× denotes replacing a vector by its antisymmetric matrix.
Specifically, when k_1 > 0 and k_2 > 0 the whole estimator stabilizes quickly; solving the Lyapunov function and choosing α = 2·min(k_1, k_2·a*), the whole nonlinear estimator can be proven to be stable. In the actual test, k_1 = 1.5 and k_2 = 8.0. Through this step, an estimated speed value of the current rotary-wing unmanned aerial vehicle is obtained.
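The stability condition above is easy to check numerically. The helper below only evaluates the stated convergence-rate bound α = 2·min(k_1, k_2·a*) for the reported gains; the value of a* is not reported in the text, so the default used here is an assumption.

```python
def estimator_rate_bound(k1=1.5, k2=8.0, a_star=1.0):
    """Convergence-rate bound alpha = 2*min(k1, k2*a*) from the Lyapunov argument.
    k1 = 1.5 and k2 = 8.0 are the gains used in the actual test; a_star is assumed."""
    if k1 <= 0 or k2 <= 0:
        raise ValueError("stability requires k1 > 0 and k2 > 0")
    return 2.0 * min(k1, k2 * a_star)

# With the reported gains and a_star = 1.0, the bound is 2 * min(1.5, 8.0) = 3.0.
```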
In one embodiment, step S400 includes:
step S410: solving a dynamic model equation of a virtual plane according to a preset dynamic equation of the aerial robot and a first control error;
step S420: obtaining a second control error according to the self speed estimated in real time and a preset error control mode;
step S430: obtaining an attitude control rate equation according to the dynamic model equation of the virtual plane and the second control error;
step S440: and inputting the second control error into a gesture control rate equation to obtain lift force, pitch angle speed and rolling angular speed of the aerial robot, and obtaining yaw angular speed according to gesture error control variables, wherein the lift force, the pitch angle speed, the rolling angular speed and the yaw angular speed are gesture control amounts.
Further, the dynamic equation of the preset aerial robot is:
Considering that the rotary-wing unmanned aerial vehicle is an under-actuated system, the control law needs to be converted to a virtual plane.
In one embodiment, the dynamic model equation of the virtual plane in step S410 is specifically:
wherein ė_p1 is the derivative of the first control error, e_p1 is the first control error, V_v is the real speed value of the virtual image plane of the aerial robot, R̄ is the product of the pitch rotation matrix and the roll rotation matrix of the aerial robot and its derivative also enters the equation, Ω_z' is the true yaw rate of the aerial robot, e_z = [0,0,1]^T, m is the actual weight of the aerial robot, a* is a manual setting value, I is the identity matrix, J is the rotational inertia of the airframe, Γ is the moment, F_v is the sum of forces on the virtual image plane, F = -T'e_z + m g R^T e_z is the resultant force, T' is the lift, g is the gravitational acceleration constant, Ω is the angular velocity, and Ω̇ is the derivative of the angular velocity.
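Of the quantities defined above, the resultant force is given in closed form; a small numeric helper, with the gravitational acceleration taken as 9.81 m/s² (an assumption, since the text only calls g a constant), is:

```python
import numpy as np

E_Z = np.array([0.0, 0.0, 1.0])  # e_z = [0, 0, 1]^T

def resultant_force(T_lift, R, m=2.5, g=9.81):
    """F = -T' e_z + m g R^T e_z: thrust along the body z-axis plus gravity
    mapped through the transposed rotation matrix R. m = 2.5 kg is the weight
    reported later in the text; g = 9.81 m/s^2 is assumed."""
    R = np.asarray(R)
    return -T_lift * E_Z + m * g * R.T @ E_Z
```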
In one embodiment, step S420 is specifically:
wherein e_p1 is the first control error, e_p2 is the third control error, e_p3 is the second control error, m is the actual weight of the aerial robot, and c and k_3 are debugging parameters.
Specifically, in actual testing, m = 2.5, c = 1.2 and k_3 = 1.0 gave the best results; with these values the velocity estimate converges quickly, which satisfies the requirements of the subsequent control.
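For convenience, the tuning values reported in the text can be grouped into a single record. Fields whose values the text does not report (k_4, k_5, a*) are given placeholder defaults here and are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ControllerGains:
    """Tuning parameters mentioned in the text; unmarked values are as reported."""
    m: float = 2.5       # actual weight of the aerial robot (kg)
    c: float = 1.2       # debugging parameter of the error transform
    k1: float = 1.5      # speed-estimator gain (k1 > 0)
    k2: float = 8.0      # speed-estimator gain (k2 > 0)
    k3: float = 1.0      # control-law gain (k3 > 0)
    k4: float = 1.0      # control-law gain (k4 > 0), value assumed
    k5: float = 1.0      # yaw-rate gain, value assumed
    a_star: float = 1.0  # manual setting value a*, value assumed
```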
In one embodiment, step S430 is specifically:
wherein Ḟ_v is the derivative of the sum of forces on the virtual image plane, F_v is the sum of forces on the virtual image plane, m is the actual weight of the aerial robot, Ω_z' is the true yaw rate of the aerial robot, e_z = [0,0,1]^T, R̄ is the product of the pitch rotation matrix and the roll rotation matrix of the unmanned aerial vehicle, e_p3 is the second control error, k_3 > 0 and k_4 > 0 are manual setting values, Ω_x is the roll angular velocity, Ω_y is the pitch angular velocity, T' is the lift, and Ṫ' is the derivative of the lift.
Specifically, the value of the second control error is input to the attitude control rate equation to obtain the lift, the pitch angular velocity and the roll angular velocity.
In one embodiment, the yaw rate is obtained according to the attitude error control variable in step S440, specifically:
wherein Ω_z is the yaw rate, k_5 is a manual setting value, and e_Ω is the attitude error control variable.
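Since only the variables of the yaw-rate law are listed here, the sketch below assumes a simple proportional form Ω_z = -k_5·e_Ω; both the form and the sign convention are assumptions and are not taken from the patent.

```python
def yaw_rate_command(e_omega, k5=1.0):
    """Assumed proportional yaw feedback on the attitude error control variable e_Omega.
    The exact expression and sign are not reproduced in this text, so this is only a sketch."""
    return -k5 * e_omega
```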
Regarding the manual setting values in the above formula, there is the following relationship:
continuously calculating T and omega in the autonomous flight of the rotor unmanned plane y 、Ω x 、Ω z At the same time controlIn the quantity control input controller, the rotary unmanned aerial vehicle is controlled in a gesture, so that the autonomous flight of the rotary unmanned aerial vehicle in the GPS refusing environment is realized.
The invention provides a visual servo control method for a rotary-wing unmanned aerial vehicle in a GPS-denied environment. It provides an estimator of the vehicle's own speed that can rapidly and accurately estimate the speed without measurements from additional sensors such as GPS or an IMU. Compared with methods based on individual features, optical flow and the like, the homography-based method is more robust and more efficient and can estimate the speed of the rotary-wing unmanned aerial vehicle in real time. The speed-estimation method is suitable not only for quadrotor, hexarotor and octorotor vehicles but can also be applied to other robotic devices without compatibility problems. The actual control quantities of the rotary-wing unmanned aerial vehicle are computed by the on-board computer from local visual information, so that the vehicle can perform autonomous navigation, hovering, tracking and other operations in a GPS-denied environment without falling. The method is robust to the scene and can achieve autonomous flight even in more complex environments.
In one embodiment, as shown in fig. 2, the vision-based autonomous control system for an aerial robot in a GPS-denied environment comprises an aerial robot, wherein the aerial robot comprises an aerial robot body 1, a camera module 2, a computer module 3 and an attitude controller; the computer module 3, the camera module 2 and the attitude controller are arranged on the aerial robot body 1, the computer module 3 is arranged directly above the middle of the aerial robot body 1, the camera module 2 is arranged directly below the aerial robot body 1 with a downward field of view, the camera module 2 is connected with the computer module 3, and the computer module 3 is connected with the attitude controller;
the camera module 2 collects surrounding environment images in real time and sends the surrounding environment images to the computer module 3;
the computer module 3 acquires an ambient image in real time, obtains a homography matrix according to the ambient image and a preset reference image, and sends the homography matrix to the gesture controller; the self speed is estimated in real time according to the first control error and a preset speed estimation error function and is sent to the gesture controller;
the gesture controller obtains a first control error and a gesture error control variable according to the homography matrix and a preset virtual control error function, and sends the first control error to the computer module 3; and obtaining an attitude control quantity according to the self speed estimated in real time, a preset error control mode, a preset dynamic equation of the aerial robot, the first control error and an attitude error control variable, and controlling the autonomous flight of the aerial robot according to the attitude control quantity.
For the specific limitations of the vision-based autonomous control system for an aerial robot in a GPS-denied environment, reference may be made to the limitations of the vision-based autonomous control method for an aerial robot in a GPS-denied environment described above, which are not repeated here.
The vision-based autonomous control method and system for an aerial robot in a GPS-denied environment provided by the invention have been described in detail above. The principles and embodiments of the invention have been described herein with reference to specific examples, whose description is intended only to facilitate an understanding of the core concepts of the invention. It should be noted that various modifications and adaptations of the invention will be apparent to those skilled in the art without departing from the principles of the invention, and such modifications and adaptations are intended to fall within the scope of the invention as defined by the following claims.

Claims (9)

  1. A vision-based autonomous control method for an aerial robot in a GPS-denied environment, which is characterized by comprising the following steps:
    step S100: a camera module of an aerial robot collects surrounding environment images in real time and sends the surrounding environment images to a computer module of the aerial robot; the computer module obtains the surrounding environment images in real time, and a homography matrix is obtained according to the surrounding environment images and a preset reference image and sent to an attitude controller of the aerial robot;
    step S200: the attitude controller obtains a first control error and an attitude error control variable according to the homography matrix and a preset virtual control error function, and sends the first control error to the computer module;
    step S300: the computer module estimates the speed of the aerial robot in real time according to the first control error and a preset speed estimation error function and sends the estimated speed to the attitude controller;
    step S400: the attitude controller obtains an attitude control quantity according to the real-time estimated self speed, a preset error control mode, a preset dynamic equation of the aerial robot, the first control error and the attitude error control variable, and controls the autonomous flight of the aerial robot according to the attitude control quantity;
    step S400 includes:
    step S410: solving a dynamic model equation of a virtual plane according to a preset dynamic equation of the aerial robot and the first control error;
    step S420: obtaining a second control error according to the real-time estimated self speed and a preset error control mode;
    step S430: obtaining an attitude control rate equation according to the dynamic model equation of the virtual plane and the second control error;
    step S440: and inputting the second control error into the attitude control rate equation to obtain lift force, pitch angle speed and rolling angular speed of the aerial robot, and obtaining yaw angular speed according to the attitude error control variable, wherein the lift force, the pitch angle speed, the rolling angular speed and the yaw angular speed are attitude control amounts.
  2. The method according to claim 1, wherein step S100 comprises:
    step S110: acquiring a preset picture containing a target pose as a preset reference image, calculating ORB feature points of the preset reference image and generating a descriptor corresponding to the reference image as a matching object for subsequent frames;
    step S120: acquiring surrounding environment images during flight of the aerial robot in real time as current image frames, calculating ORB feature points of the current image frames and generating descriptors corresponding to the current image frames, performing feature matching between the descriptors corresponding to the current image frames and the descriptors corresponding to the reference image by adopting a random sample consensus (RANSAC) algorithm, and screening the matched points until a preset number of matching pairs with the highest confidence level are selected;
    step S130: solving a homography matrix between the current image frame and the preset reference image frame according to the preset number of matching pairs with the highest confidence;
    step S140: sending the homography matrix to the attitude controller of the aerial robot.
  3. The method according to claim 2, wherein step S200 is specifically:
    wherein e_1 represents the preset virtual control error function, the symbol ≜ means "is defined as", H is the homography matrix, H_v is the virtual homography matrix, R̄ is the product of the pitch rotation matrix and the roll rotation matrix of the aerial robot, I_3 is the 3 x 3 identity matrix, m* = [0,0,1]^T, e_z = [0,0,1]^T, e_Ω is the defined attitude error control variable, (·)^T denotes the matrix transpose, e_p1 is the defined first control error, and vex(·) converts an antisymmetric matrix into a vector.
  4. The method according to claim 3, wherein the speed estimation error function preset in step S300 is specifically:
    wherein ê_p1 is the estimate of the true error of the defined virtual image plane, V̂_v is the estimate of the real speed of the defined virtual image plane, ẽ_p1 is the deviation of the estimate of the true error of the defined virtual image plane, Ṽ_v is the deviation of the estimate of the real speed of the defined virtual image plane, and V_v is the real speed value of the virtual image plane of the aerial robot;
    the update function of the speed value is specifically:
    wherein k_1, k_2 and a* are manual setting values, an intermediate variable of the speed-estimation part in the computer module and its derivative also appear in the update, Ω_z' is the true yaw rate of the aerial robot, e_z = [0,0,1]^T, F_v is the sum of forces on the virtual image plane, m is the actual weight of the aerial robot, and [·]_× denotes replacing a vector by its antisymmetric matrix.
  5. The method according to claim 1, wherein the dynamic model equation of the virtual plane in step S410 is specifically:
    wherein ė_p1 is the derivative of the first control error, e_p1 is the first control error, V_v is the real speed value of the virtual image plane of the aerial robot, R̄ is the product of the pitch rotation matrix and the roll rotation matrix of the aerial robot and its derivative also enters the equation, Ω_z' is the true yaw rate of the aerial robot, e_z = [0,0,1]^T, m is the actual weight of the aerial robot, a* is a manual setting value, I is the identity matrix, J is the rotational inertia of the airframe, Γ is the moment, F_v is the sum of forces on the virtual image plane, F = -T'e_z + m g R^T e_z is the resultant force, T' is the lift, g is the gravitational acceleration constant, Ω is the angular velocity, and Ω̇ is the derivative of the angular velocity.
  6. The method according to claim 5, wherein step S420 is specifically:
    wherein e_p1 is the first control error, e_p2 is the third control error, e_p3 is the second control error, m is the actual weight of the aerial robot, and c and k_3 are debugging parameters.
  7. The method according to claim 6, wherein step S430 is specifically:
    wherein Ḟ_v is the derivative of the sum of forces on the virtual image plane, F_v is the sum of forces on the virtual image plane, m is the actual weight of the aerial robot, Ω_z' is the true yaw rate of the aerial robot, e_z = [0,0,1]^T, R̄ is the product of the pitch rotation matrix and the roll rotation matrix of the unmanned aerial vehicle, e_p3 is the second control error, k_3 > 0 and k_4 > 0 are manual setting values, Ω_x is the roll angular velocity, Ω_y is the pitch angular velocity, T' is the lift, and Ṫ' is the derivative of the lift.
  8. The method according to claim 7, wherein the yaw rate is obtained in step S440 based on the attitude error control variable, specifically:
    wherein Ω_z is the yaw rate, k_5 is a manual setting value, and e_Ω is the attitude error control variable.
  9. A vision-based autonomous control system for an aerial robot in a GPS-denied environment, which is characterized by comprising an aerial robot, wherein the aerial robot comprises an aerial robot body, a camera module, a computer module and an attitude controller; the computer module, the camera module and the attitude controller are arranged on the aerial robot body, the computer module is arranged directly above the middle of the aerial robot body, the camera module is arranged directly below the aerial robot body with a downward field of view, the camera module is connected with the computer module, and the computer module is connected with the attitude controller;
    the camera module collects surrounding environment images in real time and sends the surrounding environment images to the computer module;
    the computer module acquires the surrounding environment image in real time, obtains a homography matrix according to the surrounding environment image and a preset reference image, and sends the homography matrix to the attitude controller; it also estimates the speed of the aerial robot in real time according to the first control error and a preset speed estimation error function and sends the estimated speed to the attitude controller;
    the attitude controller obtains a first control error and an attitude error control variable according to the homography matrix and a preset virtual control error function, and sends the first control error to the computer module; it obtains an attitude control quantity according to the speed estimated in real time, a preset error control mode, a preset dynamic equation of the aerial robot, the first control error and the attitude error control variable, and controls the autonomous flight of the aerial robot according to the attitude control quantity;
    the attitude controller obtains an attitude control quantity according to the self speed estimated in real time, a preset error control mode, a preset dynamic equation of the aerial robot, the first control error and the attitude error control variable, and controls the autonomous flight of the aerial robot according to the attitude control quantity, and the method comprises the following steps:
    solving a dynamic model equation of a virtual plane according to a preset dynamic equation of the aerial robot and the first control error; obtaining a second control error according to the real-time estimated self speed and a preset error control mode; obtaining an attitude control rate equation according to the dynamic model equation of the virtual plane and the second control error; and inputting the second control error into the attitude control rate equation to obtain lift force, pitch angle speed and rolling angular speed of the aerial robot, and obtaining yaw angular speed according to the attitude error control variable, wherein the lift force, the pitch angle speed, the rolling angular speed and the yaw angular speed are attitude control amounts.
CN202111407640.0A 2021-11-24 2021-11-24 Vision-based autonomous control method and system for an aerial robot in a GPS-denied environment Active CN114089779B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111407640.0A CN114089779B (en) 2021-11-24 2021-11-24 Vision-based autonomous control method and system for an aerial robot in a GPS-denied environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111407640.0A CN114089779B (en) 2021-11-24 2021-11-24 Vision-based autonomous control method and system for an aerial robot in a GPS-denied environment

Publications (2)

Publication Number Publication Date
CN114089779A (en) 2022-02-25
CN114089779B (en) 2024-03-19

Family

ID=80304320

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111407640.0A Active CN114089779B (en) 2021-11-24 2021-11-24 Vision-based autonomous control method and system for an aerial robot in a GPS-denied environment

Country Status (1)

Country Link
CN (1) CN114089779B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116149193B (en) * 2023-04-24 2023-06-23 湖南大学 Anti-disturbance control method and system for rotor unmanned aerial vehicle based on vision

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110058602A (en) * 2019-03-27 2019-07-26 天津大学 Multi-rotor unmanned aerial vehicle autonomic positioning method based on deep vision
CN110068335A (en) * 2019-04-23 2019-07-30 中国人民解放军国防科技大学 Unmanned aerial vehicle cluster real-time positioning method and system under GPS rejection environment
CN110231828A (en) * 2019-05-31 2019-09-13 燕山大学 Quadrotor drone Visual servoing control method based on NFTSM

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110058602A (en) * 2019-03-27 2019-07-26 天津大学 Multi-rotor unmanned aerial vehicle autonomic positioning method based on deep vision
CN110068335A (en) * 2019-04-23 2019-07-30 中国人民解放军国防科技大学 Unmanned aerial vehicle cluster real-time positioning method and system under GPS rejection environment
CN110231828A (en) * 2019-05-31 2019-09-13 燕山大学 Quadrotor drone Visual servoing control method based on NFTSM

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A Homography-Based Visual Servo Control Approach for an Underactuated Unmanned Aerial Vehicle in GPS-Denied Environments; Hang Zhong et al.; IEEE; 2022-03-29; full text *
Autonomous Flight Control of a Nano Quadrotor Helicopter in a GPS-Denied Environment Using On-Board Vision; Zhang, X. et al.; IEEE; 2015-12-31; full text *
Design of a UAV autonomous landing simulation system based on visual servoing; Yang Jianye; Qi Guoqing; Sheng Andong; Electronic Design Engineering; 2019-06-05 (No. 11); full text *

Also Published As

Publication number Publication date
CN114089779A (en) 2022-02-25

Similar Documents

Publication Publication Date Title
Nekoukar et al. Robust path tracking of a quadrotor using adaptive fuzzy terminal sliding mode control
CN110347171B (en) Aircraft control method and aircraft
Mahony et al. Image-based visual servo control of aerial robotic systems using linear image features
Jasim et al. A robust controller for multi rotor UAVs
Romero et al. Stabilization and location of a four rotor helicopter applying vision
Zhao et al. Vision-based tracking control of quadrotor with backstepping sliding mode control
Mills et al. Vision based control for fixed wing UAVs inspecting locally linear infrastructure using skid-to-turn maneuvers
CN109521785B (en) Intelligent rotor craft system capable of being shot with oneself
Xie et al. Dynamic IBVS of a rotary wing UAV using line features
Tanaka et al. Wireless vision-based stabilization of indoor microhelicopter
Xie Dynamic visual servoing of rotary wing unmanned aerial vehicles
CN116360492B (en) Object tracking method and system for flapping wing flying robot
Lin et al. Development of an unmanned coaxial rotorcraft for the DARPA UAVForge challenge
CN114089779B (en) Autonomous control method and system for vision of aerial robot in GPS refusing environment
Wheeler et al. Cooperative tracking of moving targets by a team of autonomous UAVs
CN116149193B (en) Anti-disturbance control method and system for rotor unmanned aerial vehicle based on vision
CN106292297B (en) Attitude control method based on PID controller and L1 adaptive controller
CN113138608B (en) Four-rotor unmanned aerial vehicle vision servo control method using disturbance observer and nonlinear speed observer
CN111176311A (en) Sliding mode delay estimation control method for attitude of quad-rotor unmanned aerial vehicle and storage medium
Serra et al. Nonlinear IBVS controller for the flare maneuver of fixed-wing aircraft using optical flow
Andersen et al. Improving MAV pose estimation using visual information
Herisse et al. A nonlinear terrain-following controller for a vtol unmanned aerial vehicle using translational optical flow
CN108873935A (en) Control method, device, equipment and the storage medium of logistics distribution unmanned plane landing
Fusini et al. Experimental validation of a uniformly semi-globally exponentially stable non-linear observer for gnss-and camera-aided inertial navigation for fixed-wing uavs
CN116540753A (en) Unmanned aerial vehicle landing control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant