CN108733076A - Method and device for grabbing target object by unmanned aerial vehicle and electronic equipment - Google Patents
Method and device for grabbing target object by unmanned aerial vehicle and electronic equipment
- Publication number
- CN108733076A (application number CN201810507667.9A)
- Authority
- CN
- China
- Prior art keywords
- unmanned aerial vehicle (UAV)
- target object
- mechanical arm
- position information
- relative position
- Prior art date
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Abstract
The embodiment of the invention discloses a method and a device for grabbing a target object with an unmanned aerial vehicle (UAV), and an electronic device. The method comprises the following steps: acquiring relative position information between a target object and the UAV in real time; determining, according to the relative position information, a planned path along which the UAV tracks the target object; controlling the UAV to track the target object based on the planned path; and, when the relative distance between the UAV and the target object reaches a set threshold, controlling the UAV and a mechanical arm to act so as to grasp the target object. With this technical scheme, the UAV can effectively grasp the target object in an environment without a motion capture system, and in particular can effectively grasp a moving target object.
Description
Technical field
The embodiments of the present invention relate to the field of unmanned aerial vehicles (UAVs), and in particular to a method and an apparatus for grabbing a target object with a UAV, and to an electronic device.
Background technology
With the rapid development of UAV technology, represented by rotor-type aircraft, UAVs have been widely applied in fields such as aerial photography, reconnaissance, agriculture, express transportation and disaster relief.
However, current UAV technology for grasping target objects is still immature. It relies on an indoor motion capture system and is mainly aimed at grasping stationary indoor objects. When the target object is outdoors, in an environment without a motion capture system, it cannot be grasped effectively; and when the target object is moving, accurate grasping is even less possible.
Summary of the invention
The present invention provides a method, an apparatus and an electronic device for grabbing a target object with a UAV, so that the UAV can effectively grasp a target object in an environment without a motion capture system.
To achieve the above object, the embodiments of the present invention adopt the following technical solutions:
In a first aspect, an embodiment of the present invention provides a method for grabbing a target object with a UAV, the method comprising:
acquiring relative position information between the target object and the UAV in real time;
determining, according to the relative position information, a planned path along which the UAV tracks the target object;
controlling the UAV to track the target object based on the planned path;
when the relative distance between the UAV and the target object reaches a set threshold, controlling the UAV and a mechanical arm to act so as to grasp the target object.
Further, acquiring the relative position information between the target object and the UAV in real time comprises:
determining the abscissas of the imaging points of the target object on the left and right imaging planes of a binocular camera carried by the UAV;
calculating, based on the abscissas, the distance between the imaging points of the target object on the left and right imaging planes;
calculating the relative depth information between the target object and the UAV according to the geometric relationship between the left and right imaging planes, in combination with the parameters of the binocular camera and the distance between the imaging points;
wherein the parameters of the binocular camera include the camera focal length and the camera centre distance (baseline).
Further, determining, according to the relative position information, the planned path along which the UAV tracks the target object comprises:
calculating in real time, from the relative depth information and a pre-designed dynamic trajectory plan, the planned path along which the UAV tracks the target object;
wherein the pre-designed dynamic trajectory plan is:
x_d(t) = Δx(t)·exp(-w_1·t) + ∫ v_Tx dt
y_d(t) = Δy(t)·exp(-w_2·t) + ∫ v_Ty dt
z_d(t) = Δz(t)·exp(-w_3·t) + ∫ v_Tz dt + ρ;
where x_d(t), y_d(t) and z_d(t) denote the planned path of the UAV in the x, y and z directions respectively, t denotes time, Δx(t), Δy(t) and Δz(t) are the relative depth information between the target object and the UAV, w_1, w_2 and w_3 are control parameters of the system, and v_Tx, v_Ty and v_Tz are the velocities of the target object in the x, y and z directions respectively; when the target object is a stationary target object, v_Tx, v_Ty and v_Tz are 0; ρ is the height of the target object.
Further, when the target object is a moving target object, before determining, according to the relative position information, the planned path along which the UAV tracks the target object, the method further comprises:
calculating the velocities of the target object in the x, y and z directions by means of an optical flow sensor carried by the UAV.
Further, when the relative distance between the UAV and the target object reaches the set threshold, controlling the UAV and the mechanical arm to act so as to grasp the target object comprises:
when the relative distance in the x and y directions between the UAV and the target object reaches a first set threshold, controlling the mechanical arm of the UAV to adjust its state in preparation for grasping;
when the state adjustment of the mechanical arm of the UAV is completed, controlling the UAV to descend;
when the UAV has descended to a set height and the relative distance in the x and y directions between the UAV and the target object reaches the first set threshold, controlling the UAV and the mechanical arm to grasp the target object.
Further, controlling the UAV and the mechanical arm to grasp the target object comprises:
controlling the UAV and the mechanical arm to grasp the target object based on an adaptive sliding-mode control algorithm, wherein the control quantities for controlling the UAV and the mechanical arm to grasp the target object include:
q_k′ = q_d′ + λe
where F denotes the lift of the UAV; τ_x, τ_y and τ_z denote the three torques of the UAV about the x, y and z axes in the UAV body coordinate frame; τ_{n×1} is the control quantity for the rotation angles of the n motors on the UAV mechanical arm; R, Q, I are transition matrices of the system; φ and θ are the roll angle and pitch angle of the UAV, and Ψ is the yaw angle of the UAV; the control law also uses an adaptive estimate of the system matrix; A and K are positive definite gain matrices of the system; s denotes the sliding surface; q_d is the desired state matrix of the UAV-arm hybrid system; λ is a positive definite matrix; and e is the error matrix between the actual state and the desired state of the UAV-arm hybrid system.
Further, after the UAV has grasped the target object, the method further comprises:
controlling the UAV to carry the target object and fly to a preset position.
In a second aspect, an embodiment of the present invention provides a device for grabbing a target object with a UAV, the device comprising:
an acquisition module, configured to acquire relative position information between the target object and the UAV in real time;
a determination module, configured to determine, according to the relative position information, a planned path along which the UAV tracks the target object;
a tracking module, configured to control the UAV to track the target object based on the planned path;
a handling module, configured to control the UAV and a mechanical arm to act so as to grasp the target object when the relative distance between the UAV and the target object reaches a set threshold.
In a third aspect, an embodiment of the present invention provides an electronic device comprising a first memory, a first processor and a computer program stored on the memory and executable on the first processor, wherein the first processor, when executing the computer program, implements the method for grabbing a target object with a UAV described in the first aspect.
In a fourth aspect, an embodiment of the present invention provides a storage medium containing computer-executable instructions which, when executed by a computer processor, implement the method for grabbing a target object with a UAV described in the first aspect.
The method for grabbing a target object with a UAV provided in the embodiments of the present invention acquires the relative position information between the target object and the UAV in real time, determines from this information the planned path along which the UAV tracks the target object, controls the UAV to track the target object based on the planned path, and, when the relative distance between the UAV and the target object reaches the set threshold, controls the UAV and the mechanical arm to act so as to grasp the target object. With this technical means, the UAV can effectively grasp a target object in an environment without a motion capture system, and in particular can effectively grasp a moving target object.
Description of the drawings
In order to explain the technical solutions in the embodiments of the present invention more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below illustrate only some embodiments of the present invention; for a person of ordinary skill in the art, other drawings can be obtained from these drawings and from the contents of the embodiments without creative effort.
Fig. 1 is a schematic flowchart of the method for grabbing a target object with a UAV according to Embodiment 1 of the present invention;
Fig. 2 is a schematic diagram of the pixel coordinate system and the image coordinate system according to Embodiment 1 of the present invention;
Fig. 3 is a schematic diagram of camera imaging according to Embodiment 1 of the present invention;
Fig. 4 is a diagram of the binocular ranging principle according to Embodiment 1 of the present invention;
Fig. 5 is a schematic flowchart of the method for grabbing a target object with a UAV according to Embodiment 2 of the present invention;
Fig. 6 is a structural diagram of the UAV mechanical arm according to Embodiment 2 of the present invention;
Fig. 7 is a structural diagram of the UAV with a three-degree-of-freedom mechanical arm according to Embodiment 2 of the present invention;
Fig. 8 is a schematic flowchart of another form of the method for grabbing a target object with a UAV according to Embodiment 2 of the present invention;
Fig. 9 is a structural diagram of the device for grabbing a target object with a UAV according to Embodiment 3 of the present invention;
Figure 10 is a structural diagram of the electronic device according to Embodiment 4 of the present invention.
Detailed description of the embodiments
To make the technical problems solved, the technical solutions adopted and the technical effects achieved by the present invention clearer, the technical solutions of the embodiments of the present invention are described in further detail below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Embodiment 1
Fig. 1 is a schematic flowchart of the method for grabbing a target object with a UAV according to Embodiment 1 of the present invention. The method disclosed in this embodiment is applicable to outdoor environments, does not depend on a motion capture system, and can effectively grasp either a stationary object or a moving object. The method may be executed by a device for grabbing a target object with a UAV, and that device may be implemented in software and/or hardware and integrated in the UAV body or in a server dedicated to serving the UAV. As shown in Fig. 1, the method may include the following steps:
110, the relative position information between the target object and the UAV is acquired in real time.
The purpose of acquiring the relative position information between the target object and the UAV in real time is to adjust, in real time and according to this information, the path along which the UAV tracks the target object, so that the UAV keeps approaching the target object.
Specifically, the relative position information between the target object and the UAV can be acquired in real time as follows:
determining the abscissas of the imaging points of the target object on the left and right imaging planes of the binocular camera carried by the UAV;
calculating, based on the abscissas, the distance between the imaging points of the target object on the left and right imaging planes;
calculating the relative depth information between the target object and the UAV according to the geometric relationship between the left and right imaging planes, in combination with the parameters of the binocular camera and the distance between the imaging points;
wherein the parameters of the binocular camera include the camera focal length and the camera centre distance (baseline), and the relative depth information between the target object and the UAV characterizes the relative position relationship between the target object and the UAV.
Four coordinate systems are involved in the imaging process of the binocular camera: the pixel coordinate system, the image coordinate system, the camera coordinate system and the world coordinate system. Referring to the schematic diagram of the pixel coordinate system and the image coordinate system shown in Fig. 2, the pixel coordinate system takes the upper left corner of the image as its origin O_o, and its coordinates (u, v) denote the column and row of a pixel in the image respectively; xO_1y is the image coordinate system, whose origin O_1 is the intersection of the optical axis of the camera with the image plane, generally the centre of the image plane, called the principal point of the image. A point can be converted between the two coordinate systems according to the relationship between the pixel coordinate system and the image coordinate system. Referring to the camera imaging diagram shown in Fig. 3, O is the optical centre of the camera, Z_C is the optical axis of the camera, the intersection of the optical axis with the image plane is O_1, the coordinate system O-X_C Y_C Z_C is the camera coordinate system, O_W-X_W Y_W Z_W is the world coordinate system, and the distance OO_1 is the focal length f of the camera.
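As an illustration of that conversion (the text above does not spell out the relation; the following is the usual pinhole-camera convention, in which dx and dy are assumed to be the physical pixel sizes and (u_0, v_0) the pixel coordinates of the principal point O_1):
u = x/dx + u_0,  v = y/dy + v_0,  and conversely  x = (u - u_0)·dx,  y = (v - v_0)·dy.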
The three-dimensional position of the target object in the actual scene can be estimated by binocular stereo vision. Specifically, referring to the binocular ranging principle diagram shown in Fig. 4, O_L and O_R are the optical centres of the left and right cameras, and the optical axes and imaging planes of the two cameras are as shown in Fig. 4. The distance between the optical centres of the left and right cameras is B, i.e. the camera baseline is B; the two cameras lie in the same plane, and the Y coordinates of their projection centres are equal. At a given instant, the imaging points of the spatial point P(x, y, z) on the left and right image planes are X_L and X_R respectively, and these quantities are related by similar triangles.
In the corresponding relations, X_L, X_R and Y are coordinates in the image planes of the left and right cameras, i.e. in the image plane coordinate systems whose origins are the intersections of the optical axes of the left and right cameras with their image planes; f and B are constants, namely the camera focal length and the camera baseline; and x, y, z are coordinates in the left camera coordinate system with origin O_L. The disparity d is the distance between the imaging points X_L and X_R, and the geometry gives:
d = B - (X_L - X_R)
from which the relative depth information x, y and z of the spatial point P is obtained.
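For completeness, the following is the standard similar-triangles reconstruction of these relations, stated as an illustration consistent with the quantities defined above rather than as a verbatim copy of the patent's own formulas:
d / B = (z - f) / z, and substituting d = B - (X_L - X_R) gives z = f·B / (X_L - X_R); the remaining coordinates then follow from the pinhole projection as x = z·X_L / f and y = z·Y / f.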
120, the planned path along which the UAV tracks the target object is determined according to the relative position information.
Specifically, determining, according to the relative position information, the planned path along which the UAV tracks the target object includes:
calculating in real time, from the relative depth information and a pre-designed dynamic trajectory plan, the planned path along which the UAV tracks the target object;
wherein the pre-designed dynamic trajectory plan is:
x_d(t) = Δx(t)·exp(-w_1·t) + ∫ v_Tx dt
y_d(t) = Δy(t)·exp(-w_2·t) + ∫ v_Ty dt
z_d(t) = Δz(t)·exp(-w_3·t) + ∫ v_Tz dt + ρ;
where x_d(t), y_d(t) and z_d(t) denote the planned path of the UAV in the x, y and z directions respectively, t denotes time, Δx(t), Δy(t) and Δz(t) are the relative depth information between the target object and the UAV, and w_1, w_2 and w_3 are control parameters of the system, usually taken as 0.5; v_Tx, v_Ty and v_Tz are the velocities of the target object in the x, y and z directions respectively; when the target object is a stationary target object, v_Tx, v_Ty and v_Tz are 0; ρ is the height of the target object.
The velocities v_Tx, v_Ty and v_Tz of the target object in the x, y and z directions can be obtained from the optical flow sensor carried by the UAV.
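As a concrete illustration, the following minimal Python sketch implements the dynamic trajectory plan above. The class name, the Euler accumulation of the feedforward term ∫v_T dt and the default values (w_1 = w_2 = w_3 = 0.5, ρ = 0) are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

class TrajectoryPlanner:
    """Minimal sketch of the pre-designed dynamic trajectory plan
    x_d(t) = Δx(t)·exp(-w1·t) + ∫vTx dt (likewise for y and z, with z carrying the extra ρ term)."""

    def __init__(self, w=(0.5, 0.5, 0.5), rho=0.0):
        self.w = np.asarray(w, dtype=float)   # control parameters w1, w2, w3
        self.rho = rho                        # height of the target object
        self.v_integral = np.zeros(3)         # running value of the feedforward term ∫ v_T dt

    def step(self, delta, v_target, t, dt):
        """delta: relative depth information (Δx, Δy, Δz) at time t;
        v_target: target velocity (vTx, vTy, vTz), e.g. from the optical flow sensor;
        returns the planned position (x_d, y_d, z_d)."""
        delta = np.asarray(delta, dtype=float)
        self.v_integral += np.asarray(v_target, dtype=float) * dt
        planned = delta * np.exp(-self.w * t) + self.v_integral
        planned[2] += self.rho                # only z_d(t) carries the ρ offset
        return planned

# Example: a target drifting in x at 0.2 m/s, sampled every 20 ms
planner = TrajectoryPlanner()
x_d, y_d, z_d = planner.step(delta=(1.2, 0.5, 3.0), v_target=(0.2, 0.0, 0.0), t=0.1, dt=0.02)
```

The exponential term pulls the planned position towards the target as t grows, while the accumulated ∫v_T dt term feeds the target's own motion forward so that a moving target can still be caught up with.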
130, the UAV is controlled to track the target object based on the planned path.
While the UAV is controlled to track the target object based on the planned path, the real-time relative position information between the target object and the UAV is detected continuously, so that the planned path along which the UAV tracks the target object is adjusted in real time and the UAV catches up with the target object quickly and accurately.
140, when the relative distance between the UAV and the target object reaches the set threshold, the UAV and the mechanical arm are controlled to act so as to grasp the target object.
In general, the relative distance between the UAV and the target object here refers specifically to the relative distance in the z direction: the UAV first tracks the target object in the horizontal plane, and once the UAV has caught up with the target object in the horizontal plane, it approaches the target object in the vertical direction.
In the method for grabbing a target object with a UAV provided in this embodiment, the relative depth information between the target object and the UAV is acquired in real time through the binocular camera; the path along which the UAV tracks the target object is then obtained from this depth information according to the preset trajectory plan; the UAV is controlled to track the target object; and when the UAV is close to the target object, the UAV and the mechanical arm are controlled to act. In this way the UAV can effectively grasp the target object in an environment without a motion capture system, and in particular can effectively grasp a moving target object.
Embodiment 2
Fig. 5 is a schematic flowchart of the method for grabbing a target object with a UAV according to Embodiment 2 of the present invention. On the basis of the above embodiment, this embodiment optimizes step 140; the benefit of the optimization is to further improve the accuracy with which the UAV grasps the target object. As shown in Fig. 5, the method includes:
510, the relative position information between the target object and the UAV is acquired in real time.
520, the planned path along which the UAV tracks the target object is determined according to the relative position information.
530, the UAV is controlled to track the target object based on the planned path.
540, when the relative distance in the x and y directions between the UAV and the target object reaches the first set threshold, the mechanical arm of the UAV is controlled to adjust its state in preparation for grasping.
Specifically, at this stage the mechanical arm of the UAV is mainly adjusted into the grasping state. For example, referring to the structural diagram of the UAV mechanical arm shown in Fig. 6, while the UAV is tracking the target object, the upper arm 61 and the lower arm 62 of the mechanical arm are usually kept folded in order to reduce air drag and enhance the stability of the UAV, and the grippers 63 and 64 are also kept closed. When the UAV has caught up with the target object in the horizontal plane, i.e. when the relative distance in the x and y directions between the UAV and the target object reaches the first set threshold, the upper arm 61 and the lower arm 62 of the mechanical arm are controlled to unfold, and the grippers 63 and 64 are controlled to open. The control of the mechanical arm state is realized specifically by controlling the rotation angles of the motors of the mechanical arm.
Adjusting the mechanical arm state at this point, rather than waiting until the UAV is already very close to the target object in the z direction, further improves the accuracy with which the UAV grasps the target object: if the arm were adjusted only when the UAV is already very close to the target in the z direction, the best grasping opportunity could easily be missed, and this effect is all the more obvious when the target object is moving.
550, when the state adjustment of the mechanical arm of the UAV is completed, the UAV is controlled to descend.
560, when the UAV has descended to the set height and the relative distance in the x and y directions between the UAV and the target object reaches the first set threshold, the UAV and the mechanical arm are controlled to grasp the target object.
Specifically, controlling the UAV and the mechanical arm to grasp the target object includes:
controlling the UAV and the mechanical arm to grasp the target object based on an adaptive sliding-mode control algorithm, wherein the control quantities include:
q_k′ = q_d′ + λe
where F denotes the lift of the UAV; τ_x, τ_y and τ_z denote the three torques of the UAV about the x, y and z axes in the UAV body coordinate frame; τ_{n×1} is the control quantity for the rotation angles of the n motors on the UAV mechanical arm. A mechanical arm with a different number of degrees of freedom can be carried according to the requirements of the specific scene, and its control is realized simply by changing the concrete value of n, so that the system has strong scalability. R, Q, I are transition matrices of the system; φ and θ are the roll angle and pitch angle of the UAV, and Ψ is the yaw angle of the UAV; the control law also uses an adaptive estimate of the system matrix; A and K are positive definite gain matrices of the system; s denotes the sliding surface; q_d is the desired state matrix of the UAV-arm hybrid system; λ is a positive definite matrix; e is the error matrix between the actual state and the desired state of the UAV-arm hybrid system; τ^(3) denotes the third entry of the matrix τ and τ^(n+6) its (n+6)-th entry, the matrix τ being a single-column vector; and sgn(x) is the sign function: sgn(x) = 1 when x > 0, sgn(x) = -1 when x < 0, and sgn(x) = 0 when x = 0.
Fig. 7 shows the structure of the UAV with a three-degree-of-freedom mechanical arm, in which reference numeral 710 denotes the body of the UAV and 720 denotes the mechanical arm; η_1, η_2 and η_3 denote the rotation angles of the motors driving the arm; θ is the pitch angle of the UAV; f denotes the total lift of the UAV, and f_1 and f_2 denote the lift generated by the propellers at the corresponding positions; M_3 denotes the torque controlling the pitch angle θ of the UAV; L_1, L_2 and L_3 denote the lengths of the corresponding arm segments; P denotes the position of the UAV in the world inertial coordinate frame, and P_1, P_2 and P_3 denote the positions of the corresponding links of the mechanical arm in the world inertial coordinate frame.
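The complete adaptive sliding-mode control law (including the transition matrices R, Q, I and the adaptive estimate of the system matrix) is given in the formulas of the original patent and is not restated here; the following Python fragment is only a textbook-style sketch of the sliding surface and the switching term that such a controller is built around, with lam and K assumed to be positive definite gain matrices as stated above and with the adaptive estimation part omitted.

```python
import numpy as np

def sliding_mode_step(q, q_dot, q_d, q_d_dot, lam, K):
    """Sketch of one sliding-mode control step for the UAV-arm hybrid state q.

    e is the error between the actual and the desired state, s the sliding surface,
    and the returned switching term -K·sgn(s) drives the state onto s = 0.
    The adaptive estimation of the system matrix used in the patent is omitted."""
    e = np.asarray(q, dtype=float) - np.asarray(q_d, dtype=float)
    e_dot = np.asarray(q_dot, dtype=float) - np.asarray(q_d_dot, dtype=float)
    s = e_dot + lam @ e              # sliding surface
    u_switch = -K @ np.sign(s)       # element-wise sgn, with sgn(0) = 0 as defined above
    return s, u_switch
```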
Further, after the UAV has grasped the target object, the method further includes:
controlling the UAV to carry the target object and fly to a preset position, completing the grasping of the target object.
In the method for grabbing a target object with a UAV provided in this embodiment, once the UAV has caught up with the target object in the horizontal plane, the mechanical arm of the UAV starts adjusting its state; once the state adjustment is complete, the UAV is controlled to descend; and when the UAV has descended to the set height, the UAV and the mechanical arm are controlled to act so as to grasp the target object. This improves the accuracy with which the UAV grasps the target object. Controlling the UAV and mechanical arm system with the adaptive sliding-mode control algorithm improves the disturbance rejection and robustness of the whole system; moreover, the system is highly scalable, and mechanical arms and grippers with different numbers of degrees of freedom can be carried according to the requirements of the specific scene.
On the basis of the above technical solution, and taking a target object moving in the air as an example, Fig. 8 provides a schematic flowchart of another form of the method for grabbing a target object with a UAV. As shown in Fig. 8, the method includes:
810, the UAV takes off.
The UAV first takes off to the set height, where it may hover or not; the target object is detected by the camera, the relative position of the target object and the UAV is predicted, and the detected data are fed back to the controller, so that the controller controls the UAV to track the target object in the horizontal plane.
820, the camera detects the target object.
830, the relative position is predicted.
840, the UAV tracks the target object in the horizontal plane.
850, it is judged whether the UAV has caught up with the target object in the horizontal plane, i.e. whether Δx(t) ≈ 0 and Δy(t) ≈ 0; if so, step 860 is executed next, otherwise the procedure returns to step 820 and continues in sequence.
Here Δx(t) denotes the relative distance between the UAV and the target object in the x direction, and Δy(t) denotes the relative distance between the UAV and the target object in the y direction.
860, the mechanical arm state is adjusted.
When the UAV has caught up with the target object in the horizontal plane, the UAV is kept directly above (or directly below) the target object while the mechanical arm state is adjusted.
870, it is judged whether the mechanical arm state adjustment is complete; if so, step 880 is executed next, otherwise the procedure returns to step 820 and continues in sequence.
880, the UAV descends.
After the mechanical arm state adjustment is complete, the UAV begins to descend towards the target object.
890, it is judged whether the UAV has descended to the set height and whether the UAV and the target object remain aligned in the horizontal plane; if so, step 8910 is executed next, otherwise the procedure returns to step 820 and continues in sequence.
8910, the mechanical arm grasps.
When the UAV has descended to the set height, the mechanical arm is controlled to grasp the target object.
8920, the UAV flies to the target position.
Finally the UAV carries the target object and flies to the preset position, completing the grasping of the moving target by the UAV mechanical arm.
While the mechanical arm is being controlled to grasp the target object, the position error between the UAV and the target object is kept under real-time detection: in step 8930, the position error between the UAV and the target object is detected in real time by the sensors, and the controller controls the grasping action of the mechanical arm in real time according to this error.
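The flow of Fig. 8 can also be illustrated with a self-contained toy simulation. In the sketch below, the proportional tracking law with target-velocity feedforward, the gains and all numeric values are illustrative assumptions that stand in for the trajectory planning and the adaptive sliding-mode controller described in the embodiments above.

```python
import numpy as np

def grasp_mission_simulation(dt=0.05, eps=0.05, grasp_height=0.3):
    """Toy walk-through of steps 810-8920 of Fig. 8 with made-up dynamics."""
    drone = np.array([0.0, 0.0, 5.0])        # UAV position after take-off (step 810)
    target = np.array([4.0, 2.0, 0.0])       # moving target object
    v_target = np.array([0.3, 0.0, 0.0])     # target velocity (in practice from the optical flow sensor)
    arm_ready = False
    for step in range(20000):
        target = target + v_target * dt                        # the target keeps moving
        delta = target - drone                                  # steps 820-830: detection and relative position
        aligned = abs(delta[0]) < eps and abs(delta[1]) < eps   # step 850 / step 890 horizontal check
        drone[:2] += (v_target[:2] + 2.0 * delta[:2]) * dt      # step 840: horizontal-plane tracking
        if aligned and not arm_ready:
            arm_ready = True                                    # steps 860-870: unfold the arm, open the gripper
        if arm_ready:
            drone[2] -= 1.0 * dt                                # step 880: descend towards the target
        if arm_ready and aligned and drone[2] - target[2] <= grasp_height:
            return step * dt                                    # step 8910: the grasp is triggered
    return None

if __name__ == "__main__":
    print("grasp triggered after", grasp_mission_simulation(), "seconds of simulated flight")
```

In the real system the horizontal tracking and the descent are driven by the planned path and the sliding-mode controller rather than by these fixed gains, and step 8930 keeps correcting the arm with the measured position error during the grasp itself.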
By designing a vision-based method for grabbing a target object with a UAV, the embodiments of the present invention free the UAV from dependence on the surrounding environment, so that it can grasp a moving object autonomously and accurately, and the stability and robustness of the system allow the UAV to adapt autonomously to various harsh situations. At the same time, the range of tasks the UAV can perform becomes much wider: besides grasping stationary targets, it can grasp moving ground targets and even moving aerial targets. In this way, in situations where humans or other conventional UAVs cannot operate, such as flood relief or picking up objects in mid-air, the method for grabbing a target object with a UAV provided in the embodiments of the present invention can be used to solve the problem.
Embodiment 3
Fig. 9 is a structural diagram of the device for grabbing a target object with a UAV according to Embodiment 3 of the present invention. As shown in Fig. 9, the device includes an acquisition module 910, a determination module 920, a tracking module 930 and a handling module 940.
The acquisition module 910 is configured to acquire the relative position information between the target object and the UAV in real time; the determination module 920 is configured to determine, according to the relative position information, the planned path along which the UAV tracks the target object; the tracking module 930 is configured to control the UAV to track the target object based on the planned path; and the handling module 940 is configured to control the UAV and the mechanical arm to act so as to grasp the target object when the relative distance between the UAV and the target object reaches the set threshold.
The device for grabbing a target object with a UAV provided in this embodiment acquires the relative depth information between the target object and the UAV in real time through the binocular camera, obtains from this depth information, according to the preset trajectory plan, the path along which the UAV tracks the target object, controls the UAV to track the target object, and controls the UAV and the mechanical arm to act when the UAV is close to the target object. In this way the UAV can effectively grasp the target object in an environment without a motion capture system, and in particular can effectively grasp a moving target object.
Embodiment 4
Figure 10 is a structural diagram of the electronic device according to Embodiment 4 of the present invention. As shown in Figure 10, the electronic device includes a first processor 1070, a first memory 1071, and a computer program stored on the first memory 1071 and executable on the first processor 1070. There may be one or more first processors 1070; one first processor 1070 is taken as an example in Figure 10. When executing the computer program, the first processor 1070 implements the method for grabbing a target object with a UAV described in the above embodiments. As shown in Figure 10, the electronic device may further include a first input device 1072 and a first output device 1073. The first processor 1070, the first memory 1071, the first input device 1072 and the first output device 1073 may be connected by a bus or in other ways; connection by a bus is taken as an example in Figure 10.
As a computer-readable storage medium, the first memory 1071 can be used to store software programs, computer-executable programs and modules, such as the program instructions/modules corresponding to the device for grabbing a target object with a UAV in the embodiments of the present invention (for example, the acquisition module 910 and the determination module 920 in that device). By running the software programs, instructions and modules stored in the first memory 1071, the first processor 1070 executes the various functional applications and data processing of the electronic device, i.e. implements the above method for grabbing a target object with a UAV.
The first memory 1071 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required for at least one function, and the data storage area may store data created according to the use of the terminal, and the like. In addition, the first memory 1071 may include a high-speed random access memory and may also include a non-volatile memory, for example at least one magnetic disk storage device, flash memory device or other non-volatile solid-state storage device. In some examples, the first memory 1071 may further include memories located remotely from the first processor 1070, and these remote memories may be connected to the electronic device/storage medium through a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks and combinations thereof.
The first input device 1072 can be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. The first output device 1073 may include a display device such as a display screen.
Embodiment 5
Embodiment 5 of the present invention also provides a storage medium containing computer-executable instructions which, when executed by a computer processor, perform a method for grabbing a target object with a UAV, the method comprising:
acquiring relative position information between the target object and the UAV in real time;
determining, according to the relative position information, a planned path along which the UAV tracks the target object;
controlling the UAV to track the target object based on the planned path;
when the relative distance between the UAV and the target object reaches a set threshold, controlling the UAV and the mechanical arm to act so as to grasp the target object.
Of course, in the storage medium containing computer-executable instructions provided by the embodiments of the present invention, the computer-executable instructions are not limited to the method operations described above; they may also perform the relevant operations of the method for grabbing a target object with a UAV provided by any embodiment of the present invention.
From the above description of the embodiments, it is clear to those skilled in the art that the present invention can be implemented by software plus the necessary general-purpose hardware, and of course also by hardware alone, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, or the part of it that contributes over the prior art, can essentially be embodied in the form of a software product. The computer software product can be stored in a computer-readable storage medium, such as a floppy disk, read-only memory (ROM), random access memory (RAM), flash memory (FLASH), hard disk or optical disc of a computer, and includes instructions for causing a computer device (which may be a personal computer, a storage medium or a network device, etc.) to execute the methods described in the embodiments of the present invention.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will understand that the present invention is not limited to the specific embodiments described here; various obvious changes, readjustments and substitutions can be made by those skilled in the art without departing from the protection scope of the present invention. Therefore, although the present invention has been described in further detail through the above embodiments, the present invention is not limited to the above embodiments; without departing from the inventive concept, it may also include other, equivalent embodiments, and the scope of the present invention is determined by the scope of the appended claims.
Claims (10)
1. A method for grabbing a target object with an unmanned aerial vehicle (UAV), characterized by comprising:
acquiring relative position information between the target object and the UAV in real time;
determining, according to the relative position information, a planned path along which the UAV tracks the target object;
controlling the UAV to track the target object based on the planned path;
when the relative distance between the UAV and the target object reaches a set threshold, controlling the UAV and a mechanical arm to act so as to grasp the target object.
2. The method according to claim 1, characterized in that acquiring the relative position information between the target object and the UAV in real time comprises:
determining the abscissas of the imaging points of the target object on the left and right imaging planes of a binocular camera carried by the UAV;
calculating, based on the abscissas, the distance between the imaging points of the target object on the left and right imaging planes;
calculating the relative depth information between the target object and the UAV according to the geometric relationship between the left and right imaging planes, in combination with the parameters of the binocular camera and the distance between the imaging points;
wherein the parameters of the binocular camera include the camera focal length and the camera centre distance (baseline).
3. The method according to claim 2, characterized in that determining, according to the relative position information, the planned path along which the UAV tracks the target object comprises:
calculating in real time, from the relative depth information and a pre-designed dynamic trajectory plan, the planned path along which the UAV tracks the target object;
wherein the pre-designed dynamic trajectory plan is:
x_d(t) = Δx(t)·exp(-w_1·t) + ∫ v_Tx dt
y_d(t) = Δy(t)·exp(-w_2·t) + ∫ v_Ty dt
z_d(t) = Δz(t)·exp(-w_3·t) + ∫ v_Tz dt + ρ;
where x_d(t), y_d(t) and z_d(t) denote the planned path of the UAV in the x, y and z directions respectively, t denotes time, Δx(t), Δy(t) and Δz(t) are the relative depth information between the target object and the UAV, w_1, w_2 and w_3 are control parameters of the system, and v_Tx, v_Ty and v_Tz are the velocities of the target object in the x, y and z directions respectively; when the target object is a stationary target object, v_Tx, v_Ty and v_Tz are 0; ρ is the height of the target object.
4. The method according to claim 3, characterized in that, when the target object is a moving target object, before determining, according to the relative position information, the planned path along which the UAV tracks the target object, the method further comprises:
calculating the velocities of the target object in the x, y and z directions by means of an optical flow sensor carried by the UAV.
5. The method according to claim 1, characterized in that, when the relative distance between the UAV and the target object reaches the set threshold, controlling the UAV and the mechanical arm to act so as to grasp the target object comprises:
when the relative distance in the x and y directions between the UAV and the target object reaches a first set threshold, controlling the mechanical arm of the UAV to adjust its state in preparation for grasping;
when the state adjustment of the mechanical arm of the UAV is completed, controlling the UAV to descend;
when the UAV has descended to a set height and the relative distance in the x and y directions between the UAV and the target object reaches the first set threshold, controlling the UAV and the mechanical arm to grasp the target object.
6. The method according to claim 5, characterized in that controlling the UAV and the mechanical arm to grasp the target object comprises:
controlling the UAV and the mechanical arm to grasp the target object based on an adaptive sliding-mode control algorithm, wherein the control quantities for controlling the UAV and the mechanical arm to grasp the target object include:
q_k′ = q_d′ + λe
where F denotes the lift of the UAV; τ_x, τ_y and τ_z denote the three torques of the UAV about the x, y and z axes in the UAV body coordinate frame; τ_{n×1} is the control quantity for the rotation angles of the n motors on the UAV mechanical arm; R, Q, I are transition matrices of the system; φ and θ are the roll angle and pitch angle of the UAV, and Ψ is the yaw angle of the UAV; the control law uses an adaptive estimate of the system matrix; A and K are positive definite gain matrices of the system; s denotes the sliding surface; q_d is the desired state matrix of the UAV-arm hybrid system; λ is a positive definite matrix; and e is the error matrix between the actual state and the desired state of the UAV-arm hybrid system.
7. The method according to any one of claims 1-5, characterized in that, after the UAV has grasped the target object, the method further comprises:
controlling the UAV to carry the target object and fly to a preset position.
8. A device for grabbing a target object with a UAV, characterized in that the device comprises:
an acquisition module, configured to acquire relative position information between the target object and the UAV in real time;
a determination module, configured to determine, according to the relative position information, a planned path along which the UAV tracks the target object;
a tracking module, configured to control the UAV to track the target object based on the planned path;
a handling module, configured to control the UAV and a mechanical arm to act so as to grasp the target object when the relative distance between the UAV and the target object reaches a set threshold.
9. An electronic device comprising a first memory, a first processor and a computer program stored on the memory and executable on the first processor, characterized in that the first processor, when executing the computer program, implements the method for grabbing a target object with a UAV according to any one of claims 1-7.
10. A storage medium containing computer-executable instructions, characterized in that the computer-executable instructions, when executed by a computer processor, implement the method for grabbing a target object with a UAV according to any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810507667.9A CN108733076B (en) | 2018-05-24 | 2018-05-24 | Method and device for grabbing target object by unmanned aerial vehicle and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108733076A true CN108733076A (en) | 2018-11-02 |
CN108733076B CN108733076B (en) | 2021-09-07 |
Family
ID=63935168
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810507667.9A Active CN108733076B (en) | 2018-05-24 | 2018-05-24 | Method and device for grabbing target object by unmanned aerial vehicle and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108733076B (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090294584A1 (en) * | 2008-06-02 | 2009-12-03 | Gilbert Lovell | Stabilized UAV recovery system |
CN104875882A (en) * | 2015-05-21 | 2015-09-02 | 合肥学院 | Quadrotor |
CN106064378A (en) * | 2016-06-07 | 2016-11-02 | 南方科技大学 | Control method and device for unmanned aerial vehicle mechanical arm |
CN107223275A (en) * | 2016-11-14 | 2017-09-29 | 深圳市大疆创新科技有限公司 | The method and system of multichannel sensing data fusion |
CN107656545A (en) * | 2017-09-12 | 2018-02-02 | 武汉大学 | A kind of automatic obstacle avoiding searched and rescued towards unmanned plane field and air navigation aid |
Non-Patent Citations (8)
Title |
---|
ERDINÇ ALTUĞ, et al.: "Quadrotor Control Using Dual Camera Visual Feedback", Proceedings of the 2003 IEEE International Conference on Robotics & Automation *
JUSTIN THOMAS, et al.: "Toward Image Based Visual Servoing for Aerial Grasping and Perching", Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA) *
XIANG T., et al.: "Adaptive Flight Control for Quadrotor UAVs with Dynamic Inversion and Neural Networks", 2016 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems *
YU Naigong (于乃功), et al.: "Research on path planning for target grasping by a humanoid manipulator", Research & Development *
YU Zhenzhong (于振中), et al.: "Real-time local path planning for mobile robots based on Kinect", Computer Engineering *
ZHANG Hua (张华), et al.: "A 3D path planning method for UAVs with dynamic parameter updating", Process Automation Instrumentation *
XIAO Ke (肖珂), et al.: "Path planning algorithm for pesticide spraying in vineyards based on Kinect video technology", Transactions of the Chinese Society of Agricultural Engineering *
DU Yegui (都业贵): "Grasping control of an aerial manipulator based on visual servoing", China Master's Theses Full-text Database, Information Science and Technology *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108563235A (en) * | 2018-05-24 | 2018-09-21 | 南方科技大学 | Multi-rotor unmanned aerial vehicle, method, device and equipment for grabbing target object |
CN108563235B (en) * | 2018-05-24 | 2022-02-08 | 南方科技大学 | Multi-rotor unmanned aerial vehicle, method, device and equipment for grabbing target object |
CN117631691A (en) * | 2024-01-25 | 2024-03-01 | 安徽大学 | Multi-rotor unmanned aerial vehicle grabbing track design method, device, equipment and medium |
CN117631691B (en) * | 2024-01-25 | 2024-04-12 | 安徽大学 | Multi-rotor unmanned aerial vehicle grabbing track design method, device, equipment and medium |
Also Published As
Publication number | Publication date |
---|---|
CN108733076B (en) | 2021-09-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108563235A (en) | Multi-rotor unmanned aerial vehicle, method, device and equipment for grabbing target object | |
CN104184932B (en) | Ball machine control method and device | |
CN109753076A (en) | A kind of unmanned plane vision tracing implementing method | |
CN106104203A (en) | The distance detection method of a kind of mobile object, device and aircraft | |
CN109895099A (en) | A kind of flight mechanical arm visual servo grasping means based on physical feature | |
Hui et al. | A novel autonomous navigation approach for UAV power line inspection | |
CN107831776A (en) | Unmanned plane based on nine axle inertial sensors independently makes a return voyage method | |
CN109643131A (en) | Unmanned plane, its control method and recording medium | |
CN109213204A (en) | AUV sub-sea floor targets based on data-driven search navigation system and method | |
CN109669474B (en) | Priori knowledge-based multi-rotor unmanned aerial vehicle self-adaptive hovering position optimization algorithm | |
CN114912287A (en) | Robot autonomous grabbing simulation system and method based on target 6D pose estimation | |
CN109213197A (en) | A kind of autonomous method for inspecting of unmanned plane for single time tangent tower of direct current | |
CN108731681A (en) | Rotor wing unmanned aerial vehicle method of navigation, related computer program, electronic equipment and unmanned plane | |
Holz et al. | Continuous 3D sensing for navigation and SLAM in cluttered and dynamic environments | |
CN109460054A (en) | A kind of autonomous method for inspecting of unmanned plane for single time anchor support of direct current | |
CN108733076A (en) | Method and device for grabbing target object by unmanned aerial vehicle and electronic equipment | |
CN112947550A (en) | Illegal aircraft striking method based on visual servo and robot | |
CN110928311B (en) | Indoor mobile robot navigation method based on linear features under panoramic camera | |
CN106325278B (en) | A kind of robot localization air navigation aid based on Aleuroglyphus ovatus | |
WO2022193081A1 (en) | Method and apparatus for controlling unmanned aerial vehicle, and unmanned aerial vehicle | |
CN113961013A (en) | Unmanned aerial vehicle path planning method based on RGB-D SLAM | |
CN109502038A (en) | Unmanned aerial vehicle autonomous inspection method for alternating-current single-circuit strain tower | |
CN109062259A (en) | A kind of unmanned plane automatic obstacle-avoiding method and device thereof | |
WO2020225979A1 (en) | Information processing device, information processing method, program, and information processing system | |
CN114859370A (en) | Positioning method and apparatus, computer apparatus, and computer-readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||