CN112149467A - Method for an aircraft cluster to execute tasks, and lead aircraft - Google Patents


Info

Publication number
CN112149467A
CN112149467A (application CN201910572274.0A)
Authority
CN
China
Prior art keywords
coordinate system
target
matrix
camera
airplane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910572274.0A
Other languages
Chinese (zh)
Inventor
张文凯 (Zhang Wenkai)
巴航 (Ba Hang)
张永伟 (Zhang Yongwei)
陈宇楠 (Chen Yunan)
李刚强 (Li Gangqiang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd and Beijing Jingdong Shangke Information Technology Co Ltd
Priority to CN201910572274.0A
Publication of CN112149467A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/10: Terrestrial scenes
    • G06V20/13: Satellite images
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Abstract

The disclosure provides a method for an aircraft cluster to execute tasks, and a lead aircraft, and relates to the field of flight control. In the method, the lead aircraft in the cluster acquires monitoring images of the ground with its onboard camera, identifies a target from the monitoring images using a target detection model, and schedules several wingmen in the cluster to execute the corresponding tasks on the target. This reduces the burden on ground staff, reduces dependence on the performance of the communication link between the aircraft and the ground station, and improves the real-time performance and reliability of target discovery and task execution.

Description

Method for an aircraft cluster to execute tasks, and lead aircraft
Technical Field
The disclosure relates to the field of flight control, and in particular to a method for an aircraft cluster to execute tasks and to a lead aircraft.
Background
In a cluster of fixed-wing aircraft (or simply "aircraft"), the lead aircraft usually transmits its captured images back to the ground station; ground station staff find targets in the returned images and transmit the target information to the lead aircraft and the wingmen. The staff may also instruct the lead aircraft to keep monitoring a target, or instruct several wingmen to attack it, and so on.
Disclosure of Invention
The inventors observed that having ground station staff find targets in the images returned by the lead aircraft makes their workload heavy. In addition, the communication link between the aircraft and the ground station suffers from latency and link interruptions, which cause long delays and reliability problems in target discovery and task instructions.
If the lead aircraft autonomously identifies targets in the ground monitoring images and schedules the wingmen to execute tasks, the burden on the staff is reduced, dependence on the performance of the communication link between the aircraft and the ground station is reduced, and the real-time performance and reliability of target discovery and task execution are improved.
Some embodiments of the present disclosure provide a method for an aircraft cluster to perform a task, comprising:
a lead aircraft in the cluster acquiring ground monitoring images with its onboard camera;
identifying a target from the monitoring image using a target detection model, the target detection model having been trained in advance with training images and target annotation information;
and scheduling several wingmen in the cluster to execute corresponding tasks on the target.
In some embodiments, the lead aircraft sends to the scheduled wingmen the position of the target in the inertial coordinate system, determined based on the target's pixel points on the image.
In some embodiments, determining, by the lead aircraft, the position of the target in the inertial coordinate system comprises:
determining the height of the target in the camera coordinate system from the camera's focal length, a first height and a second height, the first height being the height of the camera's optical center in the inertial coordinate system and the second height being the height of the target's pixel point on the image in the inertial coordinate system;
determining, from the camera's focal length and the pixel-to-length conversion factors, an internal reference (intrinsic) matrix between a homogeneous vector formed from the target's height in the camera coordinate system and the target's position in the camera coordinate system;
and calculating the target's position in the inertial coordinate system from the homogeneous vector and the intrinsic matrix, combined with the homogeneous transformation matrices from the gimbal coordinate system to the camera coordinate system, from the body coordinate system to the gimbal coordinate system, from the aircraft (vehicle) coordinate system to the body coordinate system, and from the inertial coordinate system to the aircraft coordinate system.
In some embodiments, determining, by the lead aircraft, the height of the target in the camera coordinate system comprises:
constructing a proportional relationship in which the ratio of the difference between the first height and the second height to the first height equals the ratio of the camera's focal length to the target's height in the camera coordinate system;
and calculating the target's height in the camera coordinate system from this proportional relationship.
In some embodiments, calculating, by the lead aircraft, the position of the target in the inertial coordinate system comprises:
multiplying the intrinsic matrix with each homogeneous transformation matrix to obtain a first matrix, and taking the product of the inverse of the first matrix with the homogeneous vector as the target's position in the inertial coordinate system.
In some embodiments, the method further comprises: the lead aircraft monitoring task execution from the monitoring images and determining, based on the execution situation, whether to continue scheduling wingmen to execute corresponding tasks on the target.
In some embodiments, determining the Global Positioning System (GPS) coordinates of a wingman comprises:
calculating the NED coordinates of the lead aircraft from its GPS coordinates, using the transformation matrix between the GPS coordinate system and the ECEF coordinate system and the transformation matrix between the ECEF coordinate system and the NED coordinate system;
calculating the NED coordinates of the wingman from the NED coordinates of the lead aircraft and the wingman's coordinates in the lead aircraft's local coordinate system;
and calculating the GPS coordinates of the wingman from its NED coordinates, using the inverse of the transformation matrix between the GPS and ECEF coordinate systems and the inverse of the transformation matrix between the ECEF and NED coordinate systems.
Some embodiments of the present disclosure provide a lead aircraft in an aircraft cluster, comprising:
an image acquisition unit configured to acquire ground monitoring images with the onboard camera;
a target identification unit configured to identify a target from the monitoring image using a target detection model, the target detection model having been trained in advance with training images and target annotation information;
a wingman scheduling unit configured to schedule several wingmen in the cluster to execute corresponding tasks on the target.
In some embodiments, the lead aircraft further comprises a target position calculating unit configured to determine the position of the target in the inertial coordinate system based on the target's pixel points on the image;
the wingman scheduling unit is further configured to send the target's position in the inertial coordinate system to the scheduled wingmen.
In some embodiments, the lead aircraft further comprises a monitoring unit configured to monitor task execution from the monitoring images;
the wingman scheduling unit is further configured to determine, based on the execution situation, whether to continue scheduling wingmen to execute corresponding tasks on the target.
Some embodiments of the present disclosure provide a lead aircraft in an aircraft cluster, comprising: a memory; and a processor coupled to the memory, the processor configured to perform, based on instructions stored in the memory, the method for an aircraft cluster to perform a task of any of the preceding embodiments.
Some embodiments of the disclosure propose a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method of performing a task by a cluster of aircraft in any of the aforementioned embodiments.
Drawings
The drawings used in the description of the embodiments or the related art are briefly introduced below. The present disclosure will be more clearly understood from the following detailed description taken with reference to the accompanying drawings.
It is to be understood that the drawings described below are merely examples of the disclosure, and that other drawings may be derived from them by one of ordinary skill in the art without inventive effort.
Fig. 1 is a schematic diagram of some embodiments of an aircraft cluster system of the present disclosure.
Fig. 2 is a schematic view of some embodiments of determining the GPS coordinates of a wingman from the GPS coordinates of the lead aircraft and the wingman's coordinates in the lead aircraft's local coordinate system.
Fig. 3 is a schematic illustration of some embodiments of a method of performing tasks by a cluster of aircraft according to the present disclosure.
Fig. 4a and 4b show schematic diagrams of the positional relationships of coordinate systems (1)–(5).
Fig. 5a and 5b show modeling schematics of the camera and pixel coordinate systems (5)–(6) of the present disclosure.
Fig. 6 is a schematic structural diagram of some embodiments of a lead aircraft in an aircraft cluster according to the present disclosure.
Fig. 7 is a schematic structural diagram of further embodiments of a lead aircraft in an aircraft cluster according to the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure.
Fig. 1 is a schematic diagram of some embodiments of an aircraft cluster system of the present disclosure.
As shown in fig. 1, the aircraft cluster system 10 comprises a lead aircraft 11, several wingmen 12 and a ground station 13. The lead aircraft 11 can communicate remotely with the ground station 13; the wingmen 12 need not communicate with the ground station 13. The lead aircraft 11 and the wingmen 12 form a local area network through which they communicate. If a contingency occurs on the lead aircraft 11, one of the wingmen 12 can take over as lead aircraft.
During the takeoff phase of the cluster, the lead aircraft launches first and the wingmen then launch one after another. The wingmen receive the lead aircraft's messages in real time; such a message includes, for example, the lead aircraft's GPS (Global Positioning System) coordinates, speed, and course. Each wingman then plans its trajectory autonomously from this information.
To plan its trajectory, a wingman needs to determine its own GPS coordinates from the GPS coordinates of the lead aircraft and from its coordinates in the lead aircraft's local coordinate system.
The local coordinate system of the lead aircraft takes the lead aircraft's center of mass as origin O, with the X axis pointing to geographic north, the Y axis pointing east, the X-Y plane parallel to the horizontal plane, and the Z axis perpendicular to the X-Y plane and pointing downward. The lead aircraft's local coordinate system thus coincides with the NED (North-East-Down) coordinate system. The coordinates of wingman i in the local coordinate system of the lead aircraft are:

$$P_i^L = (x_i^L, y_i^L, z_i^L)^T$$

Let the GPS coordinates of the lead aircraft be

$$P_l^{GPS} = (\lambda_l, \varphi_l, h_l)^T$$

where $\lambda_l$, $\varphi_l$ and $h_l$ are the longitude, latitude and altitude of the lead aircraft, respectively.
As shown in fig. 2, determining the GPS coordinates of a wingman from the GPS coordinates of the lead aircraft and the wingman's coordinates in the lead aircraft's local coordinate system comprises:
Step 21: calculate the NED coordinates of the lead aircraft from its GPS coordinates, using the transformation matrix between the GPS coordinate system and the ECEF (Earth-Centered, Earth-Fixed) coordinate system and the transformation matrix between the ECEF coordinate system and the NED coordinate system.
Expressed as a formula:

$$P_l^{NED} = T_2 T_1 P_l^{GPS}$$

where $P_l^{NED} = (x_l^{NED}, y_l^{NED}, z_l^{NED})^T$ denotes the NED coordinates of the lead aircraft, i.e. its values along the x, y and z axes; $T_1$ is the transformation matrix between the GPS coordinate system and the ECEF coordinate system; $T_2$ is the transformation matrix between the ECEF coordinate system and the NED coordinate system; and $P_l^{GPS}$ denotes the GPS coordinates of the lead aircraft. For $T_1$ and $T_2$, reference is made to the prior art.
In the ECEF coordinate system, the origin is located at the Earth's center of mass; the X axis passes through the intersection of the Greenwich meridian and the equator, its positive direction pointing from the origin toward that intersection; the Z axis passes through the origin and points to the North Pole; and the Y axis forms a right-handed coordinate system with the X and Z axes.
Step 22: calculate the NED coordinates of the wingman from the NED coordinates of the lead aircraft and the wingman's coordinates in the lead aircraft's local coordinate system.
Expressed as a formula:

$$P_i^{NED} = P_l^{NED} + P_i^L$$

where $P_i^{NED} = (x_i^{NED}, y_i^{NED}, z_i^{NED})^T$ denotes the NED coordinates of wingman i, i.e. its values along the x, y and z axes. The other symbols are as defined above.
Step 23: calculate the GPS coordinates of the wingman from its NED coordinates, using the inverse of the transformation matrix between the GPS and ECEF coordinate systems and the inverse of the transformation matrix between the ECEF and NED coordinate systems.
Expressed as a formula:

$$P_i^{GPS} = T_1^{-1} T_2^{-1} P_i^{NED}$$

where $P_i^{GPS}$ denotes the GPS coordinates of wingman i. The other symbols are as defined above.
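The three steps above can be sketched numerically. The Python sketch below is an illustration, not the patent's implementation: it uses the standard WGS-84 ellipsoid for the GPS/ECEF conversion, takes the lead aircraft itself as the NED reference point, and all function and variable names are our own.

```python
import numpy as np

A = 6378137.0            # WGS-84 semi-major axis (m)
E2 = 6.69437999014e-3    # WGS-84 first eccentricity squared

def gps_to_ecef(lon, lat, h):
    """Geodetic coordinates (radians, meters) -> ECEF (meters)."""
    n = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)
    return np.array([
        (n + h) * np.cos(lat) * np.cos(lon),
        (n + h) * np.cos(lat) * np.sin(lon),
        (n * (1.0 - E2) + h) * np.sin(lat),
    ])

def ned_rotation(lon, lat):
    """Rotation matrix taking ECEF offsets to NED at the given reference point."""
    sl, cl = np.sin(lat), np.cos(lat)
    so, co = np.sin(lon), np.cos(lon)
    return np.array([
        [-sl * co, -sl * so,  cl],
        [-so,       co,       0.0],
        [-cl * co, -cl * so, -sl],
    ])

def wingman_gps(lead_gps, offset_ned):
    """Steps 21-23: lead GPS + wingman offset in the lead's NED frame -> wingman GPS."""
    lon, lat, h = lead_gps
    r = ned_rotation(lon, lat)
    ecef_lead = gps_to_ecef(lon, lat, h)
    # Step 22 applied in ECEF: r.T is the inverse of the ECEF->NED rotation.
    ecef_wing = ecef_lead + r.T @ np.asarray(offset_ned)
    # Step 23: ECEF -> geodetic by fixed-point iteration on latitude.
    x, y, z = ecef_wing
    lon_w = np.arctan2(y, x)
    p = np.hypot(x, y)
    lat_w = np.arctan2(z, p * (1.0 - E2))
    h_w = 0.0
    for _ in range(10):
        n = A / np.sqrt(1.0 - E2 * np.sin(lat_w) ** 2)
        h_w = p / np.cos(lat_w) - n
        lat_w = np.arctan2(z, p * (1.0 - E2 * n / (n + h_w)))
    return lon_w, lat_w, h_w
```

A zero offset round-trips to the lead aircraft's own GPS coordinates, which is a convenient sanity check for the conversion chain.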
After the cluster takes off, the lead aircraft and the wingmen fly in formation. The formation flight mode may be preset or adjusted as directed by the lead aircraft. It may be, for example, a center mode centered on the lead aircraft, or a master-slave mode that includes the lead aircraft, but is not limited to these examples.
During formation flight, the cluster completes its tasks through cooperation between the lead aircraft and the wingmen, as described in detail below.
Fig. 3 is a schematic illustration of some embodiments of a method of performing tasks by a cluster of aircraft according to the present disclosure.
As shown in fig. 3, the method of this embodiment includes:
and step 31, long aircrafts in the airplane cluster acquire ground monitoring images based on cameras carried by the long aircrafts.
And step 32, the long machine identifies the target from the monitoring image by using a target detection model.
The target detection model is trained by utilizing a training image and target marking information in advance. The target detection model can be a MobileNet-SSD model, for example, so as to improve the real-time performance of target identification.
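The patent does not spell out how the detector's raw output is consumed. As an illustration only: an SSD-style network typically emits a tensor of candidate detections that the lead aircraft would filter by confidence before scheduling any wingmen. The sketch below assumes the common SSD output layout of shape (1, 1, N, 7), each row being (image_id, class_id, confidence, x1, y1, x2, y2) with corners normalized to [0, 1]; this layout is an assumption, not taken from the patent.

```python
import numpy as np

def filter_ssd_detections(detections, frame_w, frame_h, conf_threshold=0.5):
    """Keep detections above the confidence threshold and scale their
    normalized corner boxes to pixel coordinates.
    Returns a list of (class_id, confidence, (x1, y1, x2, y2)) tuples."""
    targets = []
    for det in detections[0, 0]:
        conf = float(det[2])
        if conf < conf_threshold:
            continue
        x1, y1, x2, y2 = det[3:7] * np.array([frame_w, frame_h, frame_w, frame_h])
        targets.append((int(det[1]), conf, (x1, y1, x2, y2)))
    return targets
```

The surviving boxes are what the target position calculation described later would consume, one pixel point per target.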
After the lead aircraft recognizes a target, it may execute step 33 directly, or first transmit the target information to the ground station for confirmation and execute step 33 after confirmation.
Step 33: the lead aircraft schedules several wingmen in the cluster to execute the corresponding tasks on the target.
The task may be, for example, area rescue, forest fire fighting, material delivery, anti-terrorist reconnaissance, target attack, etc.
After step 33, step 34 may also optionally be performed.
Step 34: the lead aircraft monitors task execution from the monitoring images and decides, based on the execution situation, whether to continue scheduling wingmen to execute the corresponding tasks on the target.
Taking a fire-fighting task as an example: according to a preset target and the corresponding wingman scheduling strategy, the lead aircraft first commands the 1st and 2nd wingmen to drop fire-extinguishing agent on the fire location while the 3rd, 4th and 5th wingmen stand by. The lead aircraft observes the fire at the same time; if the 1st and 2nd wingmen complete their runs but the fire is not extinguished, the lead aircraft then commands the 3rd, 4th and 5th wingmen to drop fire-extinguishing agent on the fire location, continuing the fire-fighting task. Different targets may be given different wingman scheduling strategies, and the same target may be given the same or different strategies as required.
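The fire-fighting example amounts to dispatching wingmen in waves until the lead aircraft observes that the task is complete. A minimal sketch of such a wave-based scheduling strategy follows; the function names and wave structure are our own illustration, not the patent's implementation.

```python
def dispatch_in_waves(wingmen, waves, task_done):
    """Dispatch wingmen wave by wave until task_done() reports success
    or no wingmen remain; returns (dispatched waves, standby wingmen)."""
    queue = list(wingmen)
    dispatched = []
    for size in waves:
        if not queue:
            break
        wave, queue = queue[:size], queue[size:]
        dispatched.append(wave)
        if task_done():        # lead aircraft observes the target after each wave
            break
    return dispatched, queue
```

In the fire example above, the waves would be [2, 3]: wingmen 1-2 first, then wingmen 3-5 only if the fire is still burning.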
In this embodiment, the lead aircraft autonomously identifies targets in the ground monitoring images and schedules wingmen to execute tasks, which reduces the burden on ground staff, reduces dependence on the performance of the communication link between the aircraft and the ground station, and improves the real-time performance and reliability of target discovery and task execution.
When scheduling a wingman, the lead aircraft can send it the position of the target in the inertial coordinate system, determined based on the target's pixel points on the image, so that the wingman knows where the target is. The calculation of the target's position is described in detail below.
Some coordinate systems are defined below. Fig. 4a and 4b show schematic diagrams of the positional relationships of coordinate systems (1)–(5).
(1) Inertial frame, denoted $O_I\text{-}X_IY_IZ_I$. The origin $O_I$ is at the takeoff (HOME) point; $X_I$ points north, $Y_I$ points east, and $Z_I$ points toward the ground. Coordinates are written $(X_I, Y_I, Z_I)$, and $r_{CM}^I$ denotes the vector from the origin $O_I$ to the aircraft's center of mass (denoted CM).
(2) Aircraft coordinate system (vehicle frame), denoted $O_v\text{-}X_vY_vZ_v$. The origin $O_v$ is at the aircraft's center of mass; $X_v$ points north, $Y_v$ east, and $Z_v$ toward the ground. Coordinates are written $(X_v, Y_v, Z_v)$.
(3) Body coordinate system (body frame), denoted $O_b\text{-}X_bY_bZ_b$. The origin $O_b$ is at the aircraft's center of mass; $X_b$ points toward the nose, $Y_b$ toward the right wing within the fuselage plane, and $Z_b$ downward perpendicular to the fuselage plane. Coordinates are written $(X_b, Y_b, Z_b)$.
(4) Gimbal coordinate system (gimbal frame), denoted $O_g\text{-}X_gY_gZ_g$. The origin $O_g$ is at the gimbal's center of mass; $X_g$ is along the optical axis, $Y_g$ parallel to the image plane to the right, and $Z_g$ parallel to the image plane downward. Coordinates are written $(X_g, Y_g, Z_g)$.
(5) Camera coordinate system (camera frame), denoted $O_c\text{-}X_cY_cZ_c$. The origin $O_c$ is at the optical center; $X_c$ is parallel to the image plane downward, $Y_c$ parallel to the image plane to the right, and $Z_c$ along the optical axis forward. Coordinates are written $(X_c, Y_c, Z_c)$.
(6) Pixel coordinate system (pixel frame), denoted $O_p\text{-}X_pY_p$. The origin $O_p$ is at the upper right corner of the image; coordinates $(X_p, Y_p)$ are pixel values.
Some symbols are defined below.
$v^i$ denotes the expression of a vector $v$ in coordinate system $i$.
$R_i^j$ denotes the rotation matrix from coordinate system $i$ to coordinate system $j$.
$t_i^j$ denotes the translation matrix from coordinate system $i$ to coordinate system $j$.
$T_i^j$ denotes the homogeneous transformation matrix from coordinate system $i$ to coordinate system $j$.
Understandably, once coordinate systems $i$ and $j$ are defined, the rotation matrix $R_i^j$, the translation matrix $t_i^j$ and the homogeneous transformation matrix $T_i^j$ are known quantities that can be determined. For example, based on the coordinate systems (1)–(5) defined above, the homogeneous transformation matrix $T_g^c$ from the gimbal frame to the camera frame, $T_b^g$ from the body frame to the gimbal frame, $T_v^b$ from the aircraft (vehicle) frame to the body frame, and $T_I^v$ from the inertial frame to the aircraft frame are all known quantities that can be determined.
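As a small illustration of the homogeneous transformation matrices above, the sketch below builds a 4x4 transform from a rotation and a translation and composes two of them. The specific rotations and translations are arbitrary examples of our own, not values from the patent.

```python
import numpy as np

def homogeneous(r, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    m = np.eye(4)
    m[:3, :3] = r
    m[:3, 3] = t
    return m

def rot_z(angle):
    """Rotation by `angle` radians about the z axis."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Chaining: a point expressed in frame i is mapped to frame k by T_j^k @ T_i^j.
T_i_j = homogeneous(rot_z(np.pi / 2), [1.0, 0.0, 0.0])
T_j_k = homogeneous(np.eye(3), [0.0, 2.0, 0.0])
T_i_k = T_j_k @ T_i_j
p_i = np.array([1.0, 0.0, 0.0, 1.0])
p_k = T_i_k @ p_i
```

The inverse transform `np.linalg.inv(T_i_k)` maps the point back, which is the operation the position calculation below relies on.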
Fig. 5a and 5b show modeling schematics of the camera and pixel coordinate systems (5)–(6) of the present disclosure. With reference to figs. 5a and 5b, the following describes how the lead aircraft determines the position of the target in the inertial frame from the target's pixel points on the image.
The position of the target P in the camera coordinate system is

$$p^c = (p_x, p_y, p_z, 1)^T$$

and its pixel coordinate on the image is $q = (x_{ip}, y_{ip}, 1, 1)^T$. The focal length of the camera is $f$. Let $\lambda = p_z$. Here $p_x$, $p_y$, $p_z$ are in units of length, e.g. meters (m), while $x_{ip}$, $y_{ip}$ are in pixel units, e.g. pixel values. $S_x$ and $S_y$ are the pixel-to-length conversion factors in the x and y directions, and $x_{im}$, $y_{im}$ are in units of length, e.g. meters. In the camera coordinate system, pixel units are converted into length units (for example, pixel values into meters) by:

$$x_{im} = (-y_{ip} + O_y) S_y, \quad y_{im} = (x_{ip} - O_x) S_x$$

By similar triangles:

$$\frac{x_{im}}{f} = \frac{p_x}{p_z}, \quad \frac{y_{im}}{f} = \frac{p_y}{p_z}$$

The homogeneous vector formed from the height $\lambda$ of the target in the camera coordinate system is:

$$\lambda q = \lambda I q = (\lambda x_{ip}, \lambda y_{ip}, \lambda, \lambda)^T$$

where $I$ is the identity matrix, and $C$ denotes the internal reference (intrinsic) matrix between the homogeneous vector $\lambda q$ and the target's position $p^c$ in the camera coordinate system, i.e. $\lambda q = C p^c$. From the relations above, $C$ is:

$$C = \begin{pmatrix} 0 & f/S_x & O_x & 0 \\ -f/S_y & 0 & O_y & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 \end{pmatrix}$$

It can be seen that the intrinsic matrix between the homogeneous vector formed from the target's height in the camera coordinate system and the target's position in the camera coordinate system is determined by the focal length of the camera and the pixel-to-length conversion factors, with $O_x = O_y = 0$.
From the matrix transformation relationships:

$$\lambda q = C\, T_g^c T_b^g T_v^b T_I^v\, p^I$$

from which:

$$p^I = \left(C\, T_g^c T_b^g T_v^b T_I^v\right)^{-1} \lambda q$$

Therefore, the position of the target in the inertial frame is calculated from the homogeneous vector and the intrinsic matrix, combined with the homogeneous transformation matrices from the gimbal frame to the camera frame, from the body frame to the gimbal frame, from the aircraft frame to the body frame, and from the inertial frame to the aircraft frame. That is, multiplying the intrinsic matrix with each homogeneous transformation matrix yields the first matrix $M = C\, T_g^c T_b^g T_v^b T_I^v$; the product of the inverse $M^{-1}$ of the first matrix with the homogeneous vector $\lambda q$ gives the target's position $p^I$ in the inertial frame. Hence, once the value of $\lambda$ is calculated, $\lambda q$ and in turn $p^I$ can be computed.
The calculation of the value of $\lambda$ is described below.
Let $p_{cc}$ be the position of the camera's optical center. Its homogeneous coordinate in the inertial frame is:

$$p_{cc}^I = \left(T_g^c T_b^g T_v^b T_I^v\right)^{-1} p_{cc}$$

The coordinates of the pixel point $q = (x_{ip}, y_{ip}, 1, 1)^T$ in the inertial frame are:

$$q^I = \left(C\, T_g^c T_b^g T_v^b T_I^v\right)^{-1} q$$

Here $p_{cc}^I$ gives the coordinates of the camera's optical center in the inertial frame, and the height of the optical center in the inertial frame is denoted $h_1$ (the first height). $q^I$ gives the coordinates of the target's pixel point on the image in the inertial frame, and the height of this pixel point in the inertial frame is denoted $h_2$ (the second height).
Let B be the projection of the target P onto the optical axis, $d$ the distance from $q^I$ to P, $\lambda$ the distance from $p_{cc}^I$ to B, and $m$ the distance between $p_{cc}^I$ and $q^I$. The triangle formed by the optical center and the pixel point is similar to the triangle formed by the optical center and the target, so the following proportional relationship is constructed: the ratio of the difference between the first height and the second height to the first height equals the ratio of the camera's focal length to the target's height in the camera coordinate system:

$$\frac{h_1 - h_2}{h_1} = \frac{f}{\lambda}$$

Based on this proportional relationship, the height of the target in the camera coordinate system is:

$$\lambda = \frac{f\, h_1}{h_1 - h_2}$$

Thus the value of $\lambda$ is calculated, and, as described above, $p^I$ can then be computed.
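Putting the pieces together, the sketch below computes λ from the two heights and back-projects the pixel through the camera frame into the inertial frame. The combined camera-to-inertial transform stands in for the inverse of the chain of homogeneous transformation matrices in the text; all names and the nadir-looking test configuration are our own illustration, with $O_x = O_y = 0$ as in the text.

```python
import numpy as np

def lam_from_heights(f, h1, h2):
    """Similar triangles: (h1 - h2) / h1 = f / lambda  =>  lambda = f * h1 / (h1 - h2)."""
    return f * h1 / (h1 - h2)

def target_in_inertial(x_ip, y_ip, f, sx, sy, h1, h2, t_cam_to_inertial):
    """Pixel -> point at depth lambda in the camera frame -> inertial frame.
    t_cam_to_inertial is the 4x4 homogeneous transform from the camera frame
    to the inertial frame (the inverse of the transform chain in the text)."""
    x_im = -y_ip * sy                    # pixel -> metric image plane, O_x = O_y = 0
    y_im = x_ip * sx
    lam = lam_from_heights(f, h1, h2)
    # Similar triangles: p_x = x_im * lambda / f, p_y = y_im * lambda / f, p_z = lambda.
    p_c = np.array([x_im * lam / f, y_im * lam / f, lam, 1.0])
    return t_cam_to_inertial @ p_c
```

For a nadir-looking camera 100 m above the HOME point, the principal-point pixel maps back to the ground directly below the aircraft, which is a quick consistency check on the whole chain.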
Fig. 6 is a schematic structural diagram of some embodiments of long aircraft in a cluster of aircraft according to the present disclosure.
As shown in fig. 6, the lead aircraft of this embodiment includes:
an image acquisition unit 61 configured to acquire ground monitoring images with the onboard camera;
a target recognition unit 62 configured to identify a target from the monitoring image using a target detection model, the target detection model having been trained in advance with training images and target annotation information;
a wingman scheduling unit 63 configured to schedule several wingmen in the cluster to execute corresponding tasks on the target.
In some embodiments, the lead aircraft further comprises a target position calculating unit 64 configured to determine the position of the target in the inertial coordinate system based on the target's pixel points on the image; the wingman scheduling unit 63 is further configured to send the target's position in the inertial coordinate system to the scheduled wingmen.
In some embodiments, the lead aircraft further comprises a monitoring unit 65 configured to monitor task execution from the monitoring images; the wingman scheduling unit 63 is further configured to determine, based on the execution situation, whether to continue scheduling wingmen to execute the corresponding tasks on the target.
Fig. 7 is a schematic structural diagram of some embodiments of long aircraft in a cluster of aircraft according to the present disclosure.
As shown in fig. 7, the lead aircraft of this embodiment includes: a memory 71; and a processor 72 coupled to the memory, the processor configured to perform, based on instructions stored in the memory, the method of performing a task by an aircraft cluster of any of the foregoing embodiments.
The memory 71 may include, for example, system memory, fixed non-volatile storage media, and the like. The system memory stores, for example, an operating system, application programs, a boot loader, and other programs.
As will be appreciated by one skilled in the art, embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable non-transitory storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above description is only exemplary of the present disclosure and is not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of the present disclosure shall be included in its scope.

Claims (12)

1. A method of executing tasks by an airplane cluster, comprising:
a lead plane in the airplane cluster acquiring a ground monitoring image based on a camera carried by the lead plane;
identifying a target from the monitoring image by using a target detection model, wherein the target detection model is trained by using a training image and target marking information in advance;
and scheduling a plurality of wing planes in the airplane cluster to execute corresponding tasks on the target.
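As a sketch only, the three claimed steps can be read as a single lead-plane control loop; the callable names below (`capture_frame`, `detect`) and the one-wing-plane-per-target assignment policy are illustrative stand-ins, not details from the disclosure:

```python
# A minimal sketch of claim 1, not the patent's implementation: acquire a
# ground monitoring image, run a pretrained target detector on it, and
# schedule one wing plane per detected target. All interface names are
# hypothetical.
def lead_plane_step(capture_frame, detect, wing_planes):
    frame = capture_frame()                 # step 1: ground monitoring image
    targets = detect(frame)                 # step 2: pretrained target detection
    # step 3: schedule wing planes to execute tasks on the detected targets
    return [(wing, target) for wing, target in zip(wing_planes, targets)]
```

If there are fewer targets than wing planes, the surplus wing planes are simply left unassigned in this sketch; the patent leaves the assignment policy open.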
2. The method of claim 1, wherein the lead plane sends, to the scheduled wing planes, position information of the target in the inertial coordinate system determined based on pixel points of the target on the image.
3. The method of claim 2, wherein determining, by the lead plane, the position information of the target in the inertial coordinate system comprises:
determining the height of the target in the camera coordinate system according to the focal length of the camera, a first height, and a second height, wherein the first height is the height of the optical center of the camera in the inertial coordinate system, and the second height is the height of a pixel point of the target on the image in the inertial coordinate system;
determining, according to the focal length of the camera and a conversion factor between pixels and length, an internal reference (intrinsic) matrix relating a homogeneous matrix, formed based on the height of the target in the camera coordinate system, to the position point of the target in the camera coordinate system;
and calculating the position information of the target in the inertial coordinate system according to the homogeneous matrix and the internal reference matrix, in combination with the homogeneous transformation matrix from the gimbal coordinate system to the camera coordinate system, the homogeneous transformation matrix from the airplane body coordinate system to the gimbal coordinate system, the homogeneous transformation matrix from the airplane coordinate system to the airplane body coordinate system, and the homogeneous transformation matrix from the inertial coordinate system to the airplane coordinate system.
4. The method of claim 3, wherein determining, by the lead plane, the height of the target in the camera coordinate system comprises:
constructing a proportional relation, namely: the ratio of the difference between the first height and the second height to the first height is equal to the ratio of the focal length of the camera to the height of the target in the camera coordinate system;
and calculating the height of the target in the camera coordinate system based on the proportional relation.
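A direct transcription of the proportion stated in claim 4, as a sketch only (assuming all quantities are expressed in consistent length units; the function and parameter names are illustrative): (h1 − h2) / h1 = f / Zc, hence Zc = f · h1 / (h1 − h2).

```python
# Sketch of claim 4's proportional relation, not the patent's implementation.
# first_height:  height of the camera optical center in the inertial frame
# second_height: height of the target's pixel point in the inertial frame
# focal_length:  camera focal length
# returns:       the target's height Zc in the camera coordinate system
def target_height_in_camera_frame(focal_length, first_height, second_height):
    if first_height == second_height:
        raise ValueError("the first and second heights must differ")
    return focal_length * first_height / (first_height - second_height)
```

For example, with a focal length of 2, a camera optical center at height 100, and a target pixel point at height 0, the proportion yields Zc = 2 · 100 / 100 = 2.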
5. The method of claim 3, wherein calculating, by the lead plane, the position information of the target in the inertial coordinate system comprises:
multiplying the internal reference matrix by each homogeneous transformation matrix to obtain a first matrix, and determining the result of multiplying the inverse of the first matrix by the homogeneous matrix as the position information of the target in the inertial coordinate system.
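The claim-5 computation can be sketched as follows; the matrix shapes and names here are illustrative (the disclosure does not fix them), and the transforms are assumed to be supplied in chain order:

```python
import numpy as np

# Sketch of claim 5, not the patent's implementation: multiply the internal
# reference (intrinsic) matrix K by each homogeneous transformation matrix in
# turn to form the "first matrix", then left-multiply the homogeneous vector
# by its inverse to recover the target's position in the inertial frame.
def target_position_inertial(K, transforms, homogeneous_vector):
    first_matrix = K
    for T in transforms:
        first_matrix = first_matrix @ T
    return np.linalg.inv(first_matrix) @ homogeneous_vector
```

The inversion is well defined only when the intrinsic matrix and every transform in the chain are square and non-singular, which holds for the homogeneous transforms between the coordinate frames named in claim 3.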
6. The method of claim 1, further comprising:
the lead plane monitoring the task execution situation according to the monitoring image, and determining, according to the task execution situation, whether to continue scheduling the wing planes to execute the corresponding tasks on the target.
7. The method of claim 1, wherein, when the airplane cluster flies in formation, a wing plane determines its GPS coordinates from the GPS coordinates of the lead plane and the coordinates of the wing plane in the local coordinate system of the lead plane, comprising:
calculating the NED coordinates of the lead plane according to the GPS coordinates of the lead plane, in combination with the transformation matrix between the GPS coordinate system and the ECEF coordinate system and the transformation matrix between the ECEF coordinate system and the NED coordinate system;
calculating the NED coordinates of the wing plane according to the NED coordinates of the lead plane and the coordinates of the wing plane in the local coordinate system of the lead plane;
and calculating the GPS coordinates of the wing plane according to the NED coordinates of the wing plane, in combination with the inverse of the transformation matrix between the GPS coordinate system and the ECEF coordinate system and the inverse of the transformation matrix between the ECEF coordinate system and the NED coordinate system.
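The GPS → ECEF → NED chain of claim 7 can be sketched with standard WGS-84 conversions; this is an illustrative simplification (the function names, the fixed-point latitude iteration, and treating the wing plane's local coordinates directly as a NED offset are assumptions, not details from the disclosure):

```python
import math

# Sketch of the claim-7 chain, not the patent's implementation: geodetic (GPS)
# -> ECEF, add the wing plane's offset expressed in the lead plane's local NED
# frame, then invert ECEF -> geodetic to recover the wing plane's GPS fix.

A = 6378137.0              # WGS-84 semi-major axis (m)
E2 = 6.69437999014e-3      # WGS-84 first eccentricity squared

def lla_to_ecef(lat_deg, lon_deg, h):
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
    return ((n + h) * math.cos(lat) * math.cos(lon),
            (n + h) * math.cos(lat) * math.sin(lon),
            (n * (1.0 - E2) + h) * math.sin(lat))

def ecef_to_lla(x, y, z):
    lon = math.atan2(y, x)
    p = math.hypot(x, y)
    lat = math.atan2(z, p * (1.0 - E2))          # initial guess
    h = 0.0
    for _ in range(10):                          # fixed-point refinement
        n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
        h = p / math.cos(lat) - n
        lat = math.atan2(z, p * (1.0 - E2 * n / (n + h)))
    return math.degrees(lat), math.degrees(lon), h

def wing_gps_from_lead(lead_lla, ned_offset):
    """NED offset = (north, east, down) of the wing plane in the lead's frame."""
    lat, lon = math.radians(lead_lla[0]), math.radians(lead_lla[1])
    sl, cl = math.sin(lat), math.cos(lat)
    so, co = math.sin(lon), math.cos(lon)
    n, e, d = ned_offset
    x, y, z = lla_to_ecef(*lead_lla)
    # apply the NED -> ECEF rotation at the lead plane's position
    x += -sl * co * n - so * e - cl * co * d
    y += -sl * so * n + co * e - cl * so * d
    z += cl * n - sl * d
    return ecef_to_lla(x, y, z)
```

A zero offset reproduces the lead plane's own GPS coordinates, and a pure "up" offset (negative down component) raises only the recovered altitude, which is a useful sanity check on the rotation.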
8. A lead plane in an airplane cluster, comprising:
an image acquisition unit configured to acquire a monitoring image of the ground based on the onboard camera;
a target identification unit configured to identify a target from the monitored image using a target detection model, the target detection model being trained in advance using a training image and target labeling information;
a wing plane scheduling unit configured to schedule a number of wing planes in the cluster of airplanes to perform respective tasks on the target.
9. The lead plane of claim 8, further comprising:
a target position calculation unit configured to determine position information of the target in an inertial coordinate system based on pixel points of the target on the image;
the wing plane scheduling unit being further configured to send the position information of the target in the inertial coordinate system to the wing planes being scheduled.
10. The lead plane of claim 8, further comprising:
a monitoring unit configured to monitor the task execution situation according to the monitoring image;
the wing plane scheduling unit being further configured to determine, according to the task execution situation, whether to continue scheduling the wing planes to execute the corresponding tasks on the target.
11. A lead plane in an airplane cluster, comprising: a memory; and
a processor coupled to the memory, the processor configured to perform the method of any of claims 1-6 based on instructions stored in the memory.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method of any one of claims 1-7.
CN201910572274.0A 2019-06-28 2019-06-28 Method for executing tasks by airplane cluster and long airplane Pending CN112149467A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910572274.0A CN112149467A (en) 2019-06-28 2019-06-28 Method for executing tasks by airplane cluster and long airplane


Publications (1)

Publication Number Publication Date
CN112149467A true CN112149467A (en) 2020-12-29

Family

ID=73869071

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910572274.0A Pending CN112149467A (en) 2019-06-28 2019-06-28 Method for executing tasks by airplane cluster and long airplane

Country Status (1)

Country Link
CN (1) CN112149467A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113741531A (en) * 2021-09-15 2021-12-03 江苏航空职业技术学院 Unmanned aerial vehicle cluster cooperative control system and control method for sharing target task information

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009066350A (en) * 2007-09-18 2009-04-02 Namco Bandai Games Inc Program, information storage medium, and words output determination controller
CN102081404A (en) * 2011-01-27 2011-06-01 西北工业大学 Synergistic target tracking method for dual unmanned planes under communication constraint
CN103699106A (en) * 2013-12-30 2014-04-02 合肥工业大学 Multi-unmanned aerial vehicle cooperative task planning simulation system based on VR-Forces simulation platform
WO2015096806A1 (en) * 2013-12-29 2015-07-02 刘进 Attitude determination, panoramic image generation and target recognition methods for intelligent machine
CN107727079A (en) * 2017-11-30 2018-02-23 湖北航天飞行器研究所 The object localization method of camera is regarded under a kind of full strapdown of Small and micro-satellite
CN108052110A (en) * 2017-09-25 2018-05-18 南京航空航天大学 UAV Formation Flight method and system based on binocular vision
CN108521670A (en) * 2018-03-14 2018-09-11 中国人民解放军国防科技大学 UWB communication and positioning based method for multi-machine-oriented close formation flight and integrated system
CN108983816A (en) * 2018-08-07 2018-12-11 中南大学 Multi-rotor unmanned aerial vehicle mutative scale collaboration monitoring formation flight control method
CN109753076A (en) * 2017-11-03 2019-05-14 南京奇蛙智能科技有限公司 A kind of unmanned plane vision tracing implementing method


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
"Open source computer-vision based guidance system for UAVs on-board decision making", 2016 IEEE Aerospace Conference *
Li Xuesong; Li Yinghui; Zhang Shuiqing; Li Xia; Xu Haojun: "Line-of-sight-based robust adaptive formation control design for UAVs", Flight Dynamics, no. 06 *
Wen Jie: "The 'Fighting Falcon' becomes a 'Dart': the U.S. Air Force's 'Loyal Wingman' operational concept", Ordnance Knowledge, no. 07 *
Wang Chen; Li Jun; Fu Jiawei; Yao Fukuan: "Research on an aerodynamic coupling optimization model based on swarm formation", Aircraft Design, no. 02 *
Fu Xiaowei; Feng Huicheng; Gao Xiaoguang: "Target tracking algorithm for dual UAVs under communication range constraints", Systems Engineering and Electronics, no. 08 *


Similar Documents

Publication Publication Date Title
Quintero et al. Flocking with fixed-wing UAVs for distributed sensing: A stochastic optimal control approach
CN109613931A (en) Isomery unmanned plane cluster object tracking system and method based on biological social force
CN108885470B (en) Task execution method, mobile device, system and storage medium
CN109508036B (en) Relay point generation method and device and unmanned aerial vehicle
CN112789568A (en) Control and navigation system
CN111712773A (en) Control method, electronic equipment and system for cooperative work of unmanned aerial vehicle
CN111213367B (en) Load control method and device
CN203845021U (en) Panoramic aerial photographic unit system for aircrafts
CN112789672A (en) Control and navigation system, attitude optimization, mapping and positioning technology
US20230419843A1 (en) Unmanned aerial vehicle dispatching method, server, base station, system, and readable storage medium
Rilanto Trilaksono et al. Hardware‐in‐the‐loop simulation for visual target tracking of octorotor UAV
Lin et al. Development of an unmanned coaxial rotorcraft for the DARPA UAVForge challenge
Wheeler et al. Cooperative tracking of moving targets by a team of autonomous UAVs
Leichtfried et al. Autonomous flight using a smartphone as on-board processing unit in GPS-denied environments
CN112149467A (en) Method for executing tasks by airplane cluster and long airplane
Moffatt et al. Collaboration between multiple UAVs for fire detection and suppression
Valenti Approximate dynamic programming with applications in multi-agent systems
CN108225316B (en) Carrier attitude information acquisition method, device and system
Bailey Unmanned aerial vehicle path planning and image processing for orthoimagery and digital surface model generation
Mian et al. Autonomous spacecraft inspection with free-flying drones
Bethke Persistent vision-based search and track using multiple UAVs
CN109799841A (en) A kind of unmanned aerial vehicle ground control system, equipment and storage medium
Sanna et al. A novel ego-motion compensation strategy for automatic target tracking in FLIR video sequences taken from UAVs
Cummings et al. Development and testing of a quad rotor smartphone control system for novice users
Sinsley et al. An intelligent controller for collaborative unmanned air vehicles

Legal Events

Date Code Title Description
PB01 Publication
TA01 Transfer of patent application right

Effective date of registration: 20210305

Address after: 101, 1st floor, building 2, yard 20, Suzhou street, Haidian District, Beijing 100080

Applicant after: Beijing Jingbangda Trading Co.,Ltd.

Address before: 100086 8th Floor, 76 Zhichun Road, Haidian District, Beijing

Applicant before: BEIJING JINGDONG SHANGKE INFORMATION TECHNOLOGY Co.,Ltd.

Applicant before: BEIJING JINGDONG CENTURY TRADING Co.,Ltd.

Effective date of registration: 20210305

Address after: Room a1905, 19 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Beijing Jingdong Qianshi Technology Co.,Ltd.

Address before: 101, 1st floor, building 2, yard 20, Suzhou street, Haidian District, Beijing 100080

Applicant before: Beijing Jingbangda Trading Co.,Ltd.

SE01 Entry into force of request for substantive examination