Background
Fixed-wing aircraft clusters (hereinafter simply "aircraft" clusters) usually operate with the lead plane transmitting captured images back to the ground station; the ground station staff find targets in the returned images and transmit the target information to the lead plane and the wing planes. The ground station staff can also instruct the lead plane to continue monitoring a target, or instruct several wing planes to attack the target, and so on.
Disclosure of Invention
The inventors found that, because the ground station staff find targets in the images returned by the lead plane, their workload is heavy. In addition, the communication link between the aircraft and the ground station suffers from time delay, link interruption and the like, which causes long delays and reliability problems in target discovery and task instruction.
If the lead plane autonomously identifies targets in the ground monitoring images and schedules the wing planes to execute tasks, the burden on staff can be reduced, the dependence on the performance of the communication link between the aircraft and the ground station is lowered, and the real-time performance and reliability of target discovery and task execution are improved.
Some embodiments of the present disclosure provide a method for an aircraft cluster to perform a task, comprising:
acquiring, by a lead plane in the aircraft cluster, a ground monitoring image based on a camera carried by the lead plane;
identifying a target from the monitoring image by using a target detection model, wherein the target detection model is trained in advance by using a training image and target labeling information;
and scheduling a plurality of wing planes in the aircraft cluster to execute corresponding tasks on the target.
In some embodiments, the lead plane sends position information of the target in the inertial coordinate system, determined based on the pixel points of the target on the image, to the scheduled wing planes.
In some embodiments, determining, by the lead plane, the position information of the target in the inertial coordinate system comprises:
determining the height of the target in the camera coordinate system according to the focal length of the camera, a first height and a second height, wherein the first height is the height of the optical center of the camera in the inertial coordinate system, and the second height is the height of the pixel point of the target on the image in the inertial coordinate system;
determining an internal reference matrix between a homogeneous matrix formed based on the height of the target in the camera coordinate system and the position point of the target in the camera coordinate system, according to the focal length of the camera and the conversion factors between pixels and length;
and calculating the position information of the target in the inertial coordinate system according to the homogeneous matrix and the internal reference matrix, in combination with the homogeneous transformation matrix from the gimbal coordinate system to the camera coordinate system, the homogeneous transformation matrix from the body coordinate system to the gimbal coordinate system, the homogeneous transformation matrix from the aircraft coordinate system to the body coordinate system, and the homogeneous transformation matrix from the inertial coordinate system to the aircraft coordinate system.
In some embodiments, determining, by the lead plane, the height of the target in the camera coordinate system comprises:
constructing a proportional relation, wherein the proportional relation is: the ratio of the difference between the first height and the second height to the first height is equal to the ratio of the focal length of the camera to the height of the target in the camera coordinate system;
and calculating the height of the target in the camera coordinate system based on the proportional relation.
In some embodiments, calculating, by the lead plane, the position information of the target in the inertial coordinate system comprises:
multiplying the internal reference matrix with each homogeneous transformation matrix to obtain a first matrix, and determining the result of multiplying the inverse of the first matrix with the homogeneous matrix as the position information of the target in the inertial coordinate system.
In some embodiments, the method further comprises: the lead plane monitors the task execution situation according to the monitoring image, and determines, according to the task execution situation, whether to continue scheduling the wing planes to execute corresponding tasks on the target.
In some embodiments, determining the Global Positioning System (GPS) coordinates of a wing plane in the aircraft cluster comprises:
calculating the NED coordinates of the lead plane according to the GPS coordinates of the lead plane, in combination with the transformation matrix between the GPS coordinate system and the ECEF coordinate system and the transformation matrix between the ECEF coordinate system and the NED coordinate system;
calculating the NED coordinates of the wing plane according to the NED coordinates of the lead plane and the coordinates of the wing plane in the local coordinate system of the lead plane;
and calculating the GPS coordinates of the wing plane according to the NED coordinates of the wing plane, in combination with the inverse of the transformation matrix between the GPS coordinate system and the ECEF coordinate system and the inverse of the transformation matrix between the ECEF coordinate system and the NED coordinate system.
Some embodiments of the present disclosure provide a lead plane in an aircraft cluster, including:
an image acquisition unit configured to acquire a monitoring image of the ground based on the onboard camera;
a target identification unit configured to identify a target from the monitoring image using a target detection model, the target detection model being trained in advance using a training image and target labeling information;
a wing plane scheduling unit configured to schedule a number of wing planes in the aircraft cluster to perform corresponding tasks on the target.
In some embodiments, the lead plane further comprises: a target position calculation unit configured to determine position information of the target in the inertial coordinate system based on pixel points of the target on the image;
the wing plane scheduling unit is further configured to send the position information of the target in the inertial coordinate system to the scheduled wing planes.
In some embodiments, the lead plane further comprises: a monitoring unit configured to monitor the task execution situation according to the monitoring image;
the wing plane scheduling unit is further configured to determine, according to the task execution situation, whether to continue scheduling the wing planes to execute corresponding tasks on the target.
Some embodiments of the present disclosure provide a lead plane in an aircraft cluster, including: a memory; and a processor coupled to the memory, the processor configured to perform the method for an aircraft cluster to perform a task in any of the preceding embodiments based on instructions stored in the memory.
Some embodiments of the present disclosure provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method for an aircraft cluster to perform a task in any of the aforementioned embodiments.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure.
Fig. 1 is a schematic diagram of some embodiments of an aircraft cluster system of the present disclosure.
As shown in fig. 1, the aircraft cluster system 10 comprises a lead plane 11, several wing planes 12 and a ground station 13. The lead plane 11 can communicate remotely with the ground station 13. The wing planes 12 may not communicate with the ground station 13. The lead plane 11 and the several wing planes 12 form a local area network, through which the lead plane 11 and the wing planes 12 can communicate. If a contingency occurs to the lead plane 11, a certain wing plane 12 can become the lead plane.
During the takeoff phase of the aircraft cluster, the lead plane takes off first and then the respective wing planes take off one after another. Each wing plane receives the lead plane's messages in real time; for example, a message of the lead plane includes information such as its GPS (Global Positioning System) coordinates, speed, and course. The wing planes plan their trajectories autonomously according to the information of the lead plane.
In order to plan its trajectory, a wing plane needs to determine its GPS coordinates from the GPS coordinates of the lead plane and from its own coordinates in the local coordinate system of the lead plane.
The local coordinate system of the lead plane takes the lead plane's center of mass as the origin O; the X axis points to geographic north, the Y axis points to geographic east, the X-Y plane is parallel to the horizontal plane, and the Z axis is perpendicular to the X-Y plane and points downward. The lead plane's local coordinate system thus coincides with the NED (North-East-Down) coordinate system. The coordinates of wing plane i in the local coordinate system of the lead plane are denoted (x_i, y_i, z_i)^T.
Let the GPS coordinates of the lead plane be P_GPS^L = (lon_L, lat_L, h_L), which are respectively the longitude, latitude and altitude of the lead plane.
As shown in fig. 2, determining the GPS coordinates of a wing plane from the GPS coordinates of the lead plane and the coordinates of the wing plane in the local coordinate system of the lead plane comprises:
Step 21, calculating the NED coordinates of the lead plane according to its GPS coordinates, in combination with the transformation matrix between the GPS coordinate system and the ECEF (Earth Centered Earth Fixed) coordinate system and the transformation matrix between the ECEF coordinate system and the NED coordinate system.
The formula is expressed as:

P_NED^L = T_2 · T_1 · P_GPS^L

wherein P_NED^L = (x_L, y_L, z_L)^T denotes the NED coordinates of the lead plane, x_L, y_L and z_L being their values in the x-axis, y-axis and z-axis directions respectively; T_1 denotes the transformation matrix between the GPS coordinate system and the ECEF coordinate system; T_2 denotes the transformation matrix between the ECEF coordinate system and the NED coordinate system; and P_GPS^L denotes the GPS coordinates of the lead plane. For T_1 and T_2, reference is made to the prior art.
In the ECEF coordinate system, the origin is located at the Earth's center of mass; the X axis passes through the intersection of the Greenwich meridian and the equator, with its positive direction pointing from the origin to that intersection; the Z axis passes through the origin and points to the North Pole; and the Y axis forms a right-handed coordinate system with the X and Z axes.
Step 22, calculating the NED coordinates of the wing plane from the NED coordinates of the lead plane and the coordinates of the wing plane in the local coordinate system of the lead plane.
The formula is expressed as:

P_NED^i = P_NED^L + (x_i, y_i, z_i)^T

wherein P_NED^i denotes the NED coordinates of wing plane i, its components being the values of the NED coordinates of wing plane i in the x-axis, y-axis and z-axis directions respectively, and (x_i, y_i, z_i)^T are the coordinates of wing plane i in the local coordinate system of the lead plane. The meanings of the other symbols are as above.
Step 23, calculating the GPS coordinates of the wing plane according to its NED coordinates, in combination with the inverse of the transformation matrix between the GPS coordinate system and the ECEF coordinate system and the inverse of the transformation matrix between the ECEF coordinate system and the NED coordinate system.
The formula is expressed as:

P_GPS^i = T_1^{-1} · T_2^{-1} · P_NED^i

wherein P_GPS^i denotes the GPS coordinates of wing plane i. The meanings of the other symbols are as above.
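Steps 21-23 can be sketched in code. The patent leaves the transformation matrices T_1 and T_2 to the prior art, so the sketch below assumes the standard WGS-84 geodetic-to-ECEF formulas and an ECEF-to-NED rotation at the lead plane's position; all function names are illustrative.

```python
import math

# WGS-84 ellipsoid constants (assumed; the disclosure defers T1/T2 to the prior art)
A_WGS84 = 6378137.0               # semi-major axis, m
E2 = 6.69437999014e-3             # first eccentricity squared

def gps_to_ecef(lon_deg, lat_deg, h):
    """Geodetic (longitude, latitude, altitude) -> ECEF (X, Y, Z)."""
    lon, lat = math.radians(lon_deg), math.radians(lat_deg)
    n = A_WGS84 / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + h) * math.sin(lat)
    return (x, y, z)

def ned_rotation(lon_deg, lat_deg):
    """Rows of the ECEF->NED rotation matrix at the reference point."""
    lon, lat = math.radians(lon_deg), math.radians(lat_deg)
    sl, cl = math.sin(lat), math.cos(lat)
    so, co = math.sin(lon), math.cos(lon)
    return ((-sl * co, -sl * so,  cl),   # North row
            (-so,       co,      0.0),   # East row
            (-cl * co, -cl * so, -sl))   # Down row

def ned_to_ecef(ned, ref_gps):
    """NED offset relative to the reference point -> ECEF point (R is orthonormal)."""
    ref = gps_to_ecef(*ref_gps)
    r = ned_rotation(ref_gps[0], ref_gps[1])
    return tuple(ref[i] + sum(r[j][i] * ned[j] for j in range(3)) for i in range(3))

def ecef_to_gps(p, iters=10):
    """ECEF -> geodetic by fixed-point iteration on the latitude."""
    x, y, z = p
    lon = math.atan2(y, x)
    r = math.hypot(x, y)
    lat = math.atan2(z, r * (1.0 - E2))       # initial guess
    for _ in range(iters):
        n = A_WGS84 / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
        h = r / math.cos(lat) - n
        lat = math.atan2(z, r * (1.0 - E2 * n / (n + h)))
    n = A_WGS84 / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
    return (math.degrees(lon), math.degrees(lat), r / math.cos(lat) - n)

def wingman_gps(lead_gps, local_ned_offset):
    """Steps 21-23: lead plane GPS + wing plane offset in the lead's NED frame -> wing GPS."""
    return ecef_to_gps(ned_to_ecef(local_ned_offset, lead_gps))

lead = (116.39, 39.90, 500.0)                 # lon, lat, alt of the lead plane (illustrative)
wing = wingman_gps(lead, (100.0, 50.0, 0.0))  # wing plane 100 m north, 50 m east
```

A zero offset round-trips to the lead plane's own GPS coordinates, which is a convenient sanity check on the two inverse transformations of step 23.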
After the aircraft cluster takes off, the lead plane and the wing planes fly in formation. The formation flight mode may be preset or adjusted as directed by the lead plane. The formation flight mode may be, for example, a center mode centered on the lead plane, a master-slave mode led by the lead plane, or the like, but is not limited to the illustrated examples.
During formation flight, the aircraft cluster completes corresponding tasks through cooperation between the lead plane and the wing planes, as described in detail below.
Fig. 3 is a schematic illustration of some embodiments of a method of performing tasks by a cluster of aircraft according to the present disclosure.
As shown in fig. 3, the method of this embodiment includes:
and step 31, long aircrafts in the airplane cluster acquire ground monitoring images based on cameras carried by the long aircrafts.
And step 32, the long machine identifies the target from the monitoring image by using a target detection model.
The target detection model is trained by utilizing a training image and target marking information in advance. The target detection model can be a MobileNet-SSD model, for example, so as to improve the real-time performance of target identification.
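The disclosure does not fix how the detector's output is post-processed. As a minimal illustration, the sketch below filters SSD-style (class, confidence, box) detections by class and confidence threshold; the class names, tuple layout and threshold are assumptions, not part of the patent.

```python
# Hypothetical SSD-style raw detections: (class_name, confidence, (x, y, w, h))
def pick_targets(detections, wanted_class, conf_thresh=0.5):
    """Keep detections of the wanted class above a confidence threshold,
    strongest first - a placeholder for the model's post-processing stage."""
    hits = [d for d in detections
            if d[0] == wanted_class and d[1] >= conf_thresh]
    return sorted(hits, key=lambda d: d[1], reverse=True)

raw = [("fire", 0.91, (120, 80, 40, 30)),
       ("tree", 0.88, (10, 10, 50, 90)),
       ("fire", 0.42, (300, 200, 20, 15))]
targets = pick_targets(raw, "fire")   # keeps only the 0.91-confidence fire detection
```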
After the lead plane recognizes the target, step 33 may be executed directly, or the target information may be transmitted to the ground station for confirmation, with step 33 executed after the confirmation.
Step 33, the lead plane schedules a plurality of wing planes in the aircraft cluster to execute corresponding tasks on the target.
The task may be, for example, area rescue, forest fire fighting, material delivery, anti-terrorist reconnaissance, target attack, etc.
After step 33, step 34 may also optionally be performed.
Step 34, the lead plane monitors the task execution situation according to the monitoring image and determines, according to the task execution situation, whether to continue scheduling the wing planes to execute corresponding tasks on the target.
Take a fire-fighting task as an example. According to a preset target and a corresponding wing plane scheduling strategy, the lead plane first sends a command to the 1st and 2nd wing planes to drop fire extinguishing agent on the fire location, while the 3rd, 4th and 5th wing planes stand by; the lead plane observes the fire situation at the same time. When the 1st and 2nd wing planes have completed their task but the fire is not extinguished, the lead plane sends the command to drop fire extinguishing agent on the fire location to the 3rd, 4th and 5th wing planes, and the fire-fighting task continues. Different targets can be assigned different wing plane scheduling strategies, and the same target can be assigned the same or different wing plane scheduling strategies as required.
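The wave-by-wave scheduling strategy described above can be sketched as follows. The wing plane identifiers, the wave grouping and the fire_out check are hypothetical stand-ins for the lead plane's actual monitoring of the task.

```python
def schedule_waves(wing_planes, waves, task_done):
    """Dispatch wing planes wave by wave; stop once task_done() reports success.
    `waves` is a list of index lists, e.g. [[1, 2], [3, 4, 5]]."""
    dispatched = []
    for wave in waves:
        dispatched.append([wing_planes[i] for i in wave])  # send the command to this wave
        if task_done():                                    # lead plane checks the fire
            break
    return dispatched

planes = {i: "wing-%d" % i for i in range(1, 6)}
state = {"checks": 0}
def fire_out():
    state["checks"] += 1
    return state["checks"] >= 2   # fire still burns after the first wave (illustrative)

waves = schedule_waves(planes, [[1, 2], [3, 4, 5]], fire_out)
```

With the stand-in check above, both waves are dispatched, mirroring the example in the text where wing planes 3-5 are sent only after the first drop fails to extinguish the fire.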
In this embodiment, the lead plane autonomously identifies the target from the ground monitoring image and schedules the wing planes to execute tasks, which reduces the burden on staff, lowers the dependence on the performance of the communication link between the aircraft and the ground station, and improves the real-time performance and reliability of target discovery and task execution.
When the lead plane schedules a wing plane, it can send the position information of the target in the inertial coordinate system, determined based on the pixel points of the target on the image, to the scheduled wing plane so that the wing plane knows the position of the target. The calculation of the target's position information is described in detail below.
Some coordinate systems are defined below. Figs. 4a and 4b show schematic diagrams of the positional relationships of coordinate systems (1)-(5).
(1) Inertial coordinate system (inertial frame), denoted O_I-X_I Y_I Z_I. The origin O_I is at the takeoff point (HOME); X_I points north, Y_I points east, Z_I points toward the ground. The coordinates (X_I, Y_I, Z_I) give the position, relative to the origin O_I, of, for example, the aircraft's center of mass (denoted CM).
(2) Aircraft coordinate system (vehicle frame), denoted O_v-X_v Y_v Z_v. The origin O_v is at the aircraft's center of mass CM; X_v points north, Y_v points east, Z_v points toward the ground; the coordinates are (X_v, Y_v, Z_v).
(3) Body coordinate system (body frame), denoted O_b-X_b Y_b Z_b. The origin O_b is at the aircraft's center of mass; X_b points toward the nose, Y_b points toward the right wing in the fuselage plane, Z_b is perpendicular to the body plane and points downward; the coordinates are (X_b, Y_b, Z_b).
(4) Gimbal coordinate system (gimbal frame), denoted O_g-X_g Y_g Z_g. The origin O_g is at the gimbal's center of mass (denoted Gimbal); X_g points along the optical axis, Y_g is parallel to the image plane and points right, Z_g is parallel to the image plane and points down; the coordinates are (X_g, Y_g, Z_g).
(5) Camera coordinate system (camera frame), denoted O_c-X_c Y_c Z_c. The origin O_c is at the optical center; X_c is parallel to the image plane and points down, Y_c is parallel to the image plane and points right, Z_c points forward along the optical axis; the coordinates are (X_c, Y_c, Z_c).
(6) Pixel coordinate system (pixel frame), denoted O_p-X_p Y_p. The origin O_p is at the upper right corner of the image; the coordinates (X_p, Y_p) are pixel values.
Some symbols are defined below.
v^i denotes the expression of a vector v in coordinate system i.
R_i^j denotes the rotation matrix from coordinate system i to coordinate system j.
t_i^j denotes the translation from coordinate system i to coordinate system j.
T_i^j denotes the homogeneous transformation matrix from coordinate system i to coordinate system j.
Understandably, once coordinate systems i and j are defined, the rotation matrix R_i^j, the translation t_i^j and the homogeneous transformation matrix T_i^j are known quantities that can be determined. For example, based on the coordinate systems (1)-(5) defined above, the homogeneous transformation matrix T_g^c from the gimbal coordinate system to the camera coordinate system, the homogeneous transformation matrix T_b^g from the body coordinate system to the gimbal coordinate system, the homogeneous transformation matrix T_v^b from the aircraft coordinate system to the body coordinate system, and the homogeneous transformation matrix T_I^v from the inertial coordinate system to the aircraft coordinate system are all known quantities that can be determined.
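As an illustration of how homogeneous transformation matrices of this kind are formed and composed, the sketch below builds a 4x4 transform from a rotation and a translation and chains two of them; the rotation and translation values are arbitrary examples, not the aircraft's actual extrinsics.

```python
import math

def homogeneous(r, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation r and translation t."""
    return [list(r[i]) + [t[i]] for i in range(3)] + [[0.0, 0.0, 0.0, 1.0]]

def matmul(a, b):
    """Plain nested-loop matrix product (sizes taken from the operands)."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b))) for j in range(len(b[0]))]
            for i in range(len(a))]

# Example: a 90-degree yaw plus a translation, chained with a pure translation
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
r_yaw = ((c, -s, 0.0), (s, c, 0.0), (0.0, 0.0, 1.0))
t1 = homogeneous(r_yaw, (1.0, 0.0, 0.0))
t2 = homogeneous(((1, 0, 0), (0, 1, 0), (0, 0, 1)), (0.0, 2.0, 0.0))
chain = matmul(t1, t2)              # composed like T_g^c * T_b^g * ... in the text
p = [[0.0], [0.0], [0.0], [1.0]]    # a point in homogeneous coordinates
out = matmul(chain, p)              # the point expressed in the outer frame
```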
Figs. 5a and 5b show modeling schematics of the camera coordinate system and the pixel coordinate system ((5) and (6)) of the present disclosure. With reference to figs. 5a and 5b, the following describes how the lead plane determines the position information of the target in the inertial coordinate system based on the pixel points of the target on the image.
Let the position point of the target P in the camera coordinate system be p^c = (p_x, p_y, p_z, 1)^T, and let its pixel coordinate on the image be q = (x_ip, y_ip, 1, 1)^T. The focal length of the camera is f. Let λ = p_z. Here p_x, p_y and p_z are in units of length, such as meters (m); x_ip and y_ip are in pixel units, e.g. pixel values; S_x and S_y are the conversion factors between pixels and length in the x and y directions; x_im and y_im are in units of length, e.g. meters. In the camera coordinate system, pixel units are converted into length units (for example, pixel values into meters) by:

x_im = (−y_ip + o_y)·S_y,  y_im = (x_ip − o_x)·S_x

By similar triangles:

x_im = f·p_x / λ,  y_im = f·p_y / λ

The homogeneous matrix formed based on the height λ of the target in the camera coordinate system is λq = λI·q, wherein I is an identity matrix. C denotes the internal reference matrix between the homogeneous matrix λq and the position point p^c of the target in the camera coordinate system, i.e. λq = C·p^c. With o_x = o_y = 0, C is:

C = [    0     f/S_x   0   0
      −f/S_y     0     0   0
         0       0     1   0
         0       0     0   1 ]

It can be seen that the internal reference matrix between the homogeneous matrix formed based on the height of the target in the camera coordinate system and the position point of the target in the camera coordinate system is determined according to the focal length of the camera and the conversion factors between pixels and length.
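A quick numerical check of the internal reference matrix: the translation elides the matrix itself, so the form used below is a reconstruction, and the focal length and pixel pitch are illustrative values. Applying C to a camera-frame point should reproduce λ·(x_ip, y_ip, 1) in the first three components.

```python
def intrinsic(f, sx, sy):
    """Internal reference matrix C for o_x = o_y = 0 (reconstructed, see caveat above)."""
    return [[0.0,     f / sx, 0.0, 0.0],
            [-f / sy, 0.0,    0.0, 0.0],
            [0.0,     0.0,    1.0, 0.0],
            [0.0,     0.0,    0.0, 1.0]]

def apply(m, v):
    """Multiply a 4x4 matrix by a 4-vector."""
    return [sum(m[i][j] * v[j] for j in range(4)) for i in range(4)]

f, sx, sy = 0.05, 1e-5, 1e-5      # 5 cm focal length, 10 um pixel pitch (illustrative)
px, py, lam = 1.0, 2.0, 10.0      # target at depth lam in the camera frame
cp = apply(intrinsic(f, sx, sy), [px, py, lam, 1.0])

x_ip = cp[0] / lam                # pixel x-coordinate recovered from lam*q
y_ip = cp[1] / lam                # pixel y-coordinate (negative: x_im points down)
```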
From the matrix transformation relationship:

λq = C · T_g^c · T_b^g · T_v^b · T_I^v · p^I

wherein p^I denotes the homogeneous position of the target in the inertial coordinate system. The following is obtained from the above equation:

p^I = (C · T_g^c · T_b^g · T_v^b · T_I^v)^{-1} · λq

Therefore, the position information of the target in the inertial coordinate system is calculated from the homogeneous matrix and the internal reference matrix, in combination with the homogeneous transformation matrices from the gimbal coordinate system to the camera coordinate system, from the body coordinate system to the gimbal coordinate system, from the aircraft coordinate system to the body coordinate system, and from the inertial coordinate system to the aircraft coordinate system. That is, the internal reference matrix is multiplied with each homogeneous transformation matrix to obtain a first matrix A = C·T_g^c·T_b^g·T_v^b·T_I^v, and the result of multiplying the inverse A^{-1} of the first matrix with the homogeneous matrix λq is determined as the position information p^I of the target in the inertial coordinate system. Once the value of λ is calculated, λq and then p^I can be obtained.
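As a minimal check of the inversion step, the sketch below takes all homogeneous transformation matrices as identity, so the first matrix reduces to the internal reference matrix, and recovers the camera-frame point from λq. All numerical values are illustrative, and the matrix form of C is the reconstruction discussed above.

```python
def invert4(m):
    """Invert a 4x4 matrix by Gauss-Jordan elimination with partial pivoting."""
    n = 4
    a = [row[:] + [1.0 if i == j else 0.0 for j in range(n)] for i, row in enumerate(m)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        d = a[col][col]
        a[col] = [x / d for x in a[col]]
        for r in range(n):
            if r != col and a[r][col]:
                fac = a[r][col]
                a[r] = [x - fac * y for x, y in zip(a[r], a[col])]
    return [row[n:] for row in a]

C = [[0.0,     5000.0, 0.0, 0.0],
     [-5000.0, 0.0,    0.0, 0.0],
     [0.0,     0.0,    1.0, 0.0],
     [0.0,     0.0,    0.0, 1.0]]     # f/S_x = f/S_y = 0.05/1e-5 = 5000 (illustrative)

# With all T matrices taken as identity, the "first matrix" A is just C,
# and p = A^{-1} * (lam * q) recovers the camera-frame point.
lam, x_ip, y_ip = 10.0, 1000.0, -500.0
lam_q = [lam * x_ip, lam * y_ip, lam, 1.0]   # homogeneous component kept at 1
Ainv = invert4(C)
p = [sum(Ainv[i][j] * lam_q[j] for j in range(4)) for i in range(4)]
```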
The calculation process of the λ value is described below.
Let p_cc = (0, 0, 0, 1)^T be the position of the camera's optical center in the camera coordinate system. The homogeneous coordinate of the optical center in the inertial coordinate system is:

p_cc^I = (T_g^c · T_b^g · T_v^b · T_I^v)^{-1} · p_cc

The coordinates of the pixel point q = (x_ip, y_ip, 1, 1)^T in the inertial coordinate system are:

q^I = (C · T_g^c · T_b^g · T_v^b · T_I^v)^{-1} · q

p_cc^I gives the coordinates of the camera's optical center in the inertial coordinate system; the height of the optical center in the inertial coordinate system (the first height) is denoted h_cc. q^I gives the coordinates of the target's pixel point on the image in the inertial coordinate system; the height of that pixel point in the inertial coordinate system (the second height) is denoted h_q. The projection of the target P on the optical axis is B; d is the distance from q^I to P, λ is the distance from the optical center to B, and m is the distance between the optical center and q^I. The triangle formed by the optical center and q^I is similar to the triangle formed by the optical center and P, so the following proportional relation is constructed: the ratio of the difference between the first height and the second height to the first height equals the ratio of the camera focal length to the height of the target in the camera coordinate system:

(h_cc − h_q) / h_cc = f / λ

The height of the target in the camera coordinate system, calculated based on this proportional relation, is:

λ = f · h_cc / (h_cc − h_q)

Thus the value of λ is calculated, and p^I can then be calculated as described above.
Fig. 6 is a schematic structural diagram of some embodiments of a lead plane in an aircraft cluster according to the present disclosure.
As shown in fig. 6, the lead plane of this embodiment includes:
an image acquisition unit 61 configured to acquire a monitoring image of the ground based on the onboard camera;
a target identification unit 62 configured to identify a target from the monitoring image using a target detection model, the target detection model being trained in advance using a training image and target labeling information;
a wing plane scheduling unit 63 configured to schedule a number of wing planes in the aircraft cluster to perform corresponding tasks on the target.
In some embodiments, the lead plane further comprises: a target position calculation unit 64 configured to determine position information of the target in the inertial coordinate system based on pixel points of the target on the image; the wing plane scheduling unit 63 is further configured to send the position information of the target in the inertial coordinate system to the scheduled wing planes.
In some embodiments, the lead plane further comprises: a monitoring unit 65 configured to monitor the task execution situation according to the monitoring image; the wing plane scheduling unit 63 is further configured to determine, according to the task execution situation, whether to continue scheduling the wing planes to execute corresponding tasks on the target.
Fig. 7 is a schematic structural diagram of some embodiments of a lead plane in an aircraft cluster according to the present disclosure.
As shown in fig. 7, the lead plane of this embodiment includes: a memory 71; and a processor 72 coupled to the memory 71, the processor 72 configured to perform the method for an aircraft cluster to perform a task in any of the foregoing embodiments based on instructions stored in the memory 71.
The memory 71 may include, for example, a system memory, a fixed non-volatile storage medium, and the like. The system memory stores, for example, an operating system, application programs, a Boot Loader, and other programs.
As will be appreciated by one skilled in the art, embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable non-transitory storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above description is only exemplary of the present disclosure and is not intended to limit the present disclosure, so that any modification, equivalent replacement, or improvement made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.