IL179304A - Method and system for debriefing flying scenes - Google Patents

Method and system for debriefing flying scenes

Info

Publication number
IL179304A
IL179304A IL179304A IL17930406A IL179304A IL 179304 A IL179304 A IL 179304A IL 179304 A IL179304 A IL 179304A IL 17930406 A IL17930406 A IL 17930406A IL 179304 A IL179304 A IL 179304A
Authority
IL
Israel
Prior art keywords
data
scene
flight
debriefing
aircraft
Prior art date
Application number
IL179304A
Other versions
IL179304A0 (en)
Inventor
Oded Efrati
Original Assignee
Israel Aerospace Ind Ltd
Oded Efrati
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Israel Aerospace Ind Ltd, Oded Efrati filed Critical Israel Aerospace Ind Ltd
Priority to IL179304A priority Critical patent/IL179304A/en
Publication of IL179304A0 publication Critical patent/IL179304A0/en
Publication of IL179304A publication Critical patent/IL179304A/en

Links

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 - Teaching not covered by other main groups of this subclass
    • G09B19/16 - Control of vehicles or other craft
    • G09B19/165 - Control of aircraft
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 - Simulators for teaching or training purposes
    • G09B9/003 - Simulators for teaching or training purposes for military purposes and tactics
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 - Simulators for teaching or training purposes
    • G09B9/02 - Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/08 - Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Traffic Control Systems (AREA)
  • Debugging And Monitoring (AREA)

Description

A Method and System for Debriefing Flying Scenes
Israel Aircraft Industries Ltd.
Inventor: Oded EFRATI

FIELD OF THE INVENTION
This invention relates to a method and system for flight debriefing, such as debriefing the performance of pilots in training flights and the like.
BACKGROUND OF THE INVENTION
The performance of pilots plays a significant role in aerial combat scenes. The more sophisticated and fast the aircraft, and the more aircraft that participate in the mission, the more significant is the role of the pilot in successfully accomplishing the tasks. This naturally calls for longer training sessions so that pilots can better control the aircraft and better cooperate with mating aircraft when the mission requires the cooperation of two or more aircraft. The high costs of operating aircraft for training sessions have led to the development of sophisticated ground simulators (such as the F-16 flight simulator) which simulate the behavior of the aircraft and allow the pilot to train and improve his skills in a wide range of selected flight scenes, obviating the need to actually operate the aircraft and saving considerable costs. Other types of simulators, e.g. of the kind disclosed in US Pat. No. 5,807,109 (Tzidon et al.), utilize low-cost civilian aircraft fitted with advanced simulation of the avionics, and possibly other features, of high-performance aircraft (such as the F-15 Eagle or F-16 Falcon fighters) in order to simulate the functionality of the avionics of high-performance aircraft while using low-cost aircraft. The low-cost aircraft are naturally considerably cheaper to purchase and operate than the high-performance aircraft.
Other important tools for improving pilots' skills are debriefing tools. Very common debriefing tools are ground debriefing tools. For instance, there exists a system fitted in the F-16 fighter which records flight data during flight. After the flight, the data is downloaded to a ground station (from a pod of the aircraft) and the performance of the aircraft, as well as of the pilot, can be played back off-line, thereby identifying faults, mistakes and weak points. The pilot (and possibly other members of a debriefing team) can observe the played-back scenario and the analyzed data and learn what needs to be improved in future flights/simulations. Debriefing tools can obviously also be used when two or more aircraft participate in a mission.
Debriefing functions are also incorporated in flight simulators, including the one disclosed in the '109 patent.
Ground debriefing, whilst providing an important training enhancement means, does not assist with real-time intra-scene events, since it inherently operates post factum, after termination of the flight. There is a need to analyze the pilot/aircraft behavior in real time, e.g. for alerting on a possible likelihood of collision between aircraft in missions that require two or more participants. US 5,325,302 (Tzidon et al.) provides a warning system for predicting collision between two or more relatively moving objects. In accordance with certain embodiments of the '302 patent, the absolute position in space, relative to a fixed frame of reference, of each participating aircraft is determined; this information is constantly updated and transmitted to all the other participating aircraft. This information permits the trajectory of each aircraft to be predicted, the prediction being constantly refined in accordance with the positional and acceleration data of the aircraft. The trajectories are predicted in respect of all the participating aircraft and are then checked to determine whether two or more trajectories intersect, thereby indicating an ensuing collision. In the event that such a collision is predicted, the pilots of the respective aircraft are warned in real time, thereby permitting them to take appropriate remedial action. The latter scenario exemplifies so-called intra-scene debriefing, where the data is analyzed in real time (in this example the predicted flight trajectories of the participating aircraft) and, if necessary (i.e. a possible collision state is encountered), a proper alert is initiated. Another example of intra-scene debriefing is an air-to-air missile trajectory calculation, where a virtual missile launch result is calculated in order to evaluate whether a specific missile has hit the target aircraft. Despite the fact that the indication of hit/miss is deferred (due to the calculated relative flight times of the missile and the target aircraft), this scenario is still regarded as real-time analysis, i.e. it falls in the category of intra-scene debriefing.
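The trajectory-intersection warning described for the '302 patent can be illustrated, in broad strokes, by extrapolating each participant's track from its reported position, velocity and acceleration and checking the predicted tracks for a near miss. The following sketch is only an illustration of that idea under assumed units, thresholds and function names; it is not the patented algorithm.

```python
import numpy as np

def predict_trajectory(pos, vel, acc, horizon_s=10.0, dt_s=0.5):
    """Extrapolate a constant-acceleration trajectory over the look-ahead horizon."""
    t = np.arange(0.0, horizon_s, dt_s)[:, None]   # column of look-ahead times
    return pos + vel * t + 0.5 * acc * t ** 2      # p(t) = p0 + v0*t + a*t^2/2

def collision_alert(own, other, min_separation_m=150.0):
    """True if the two predicted tracks come closer than min_separation_m at any time."""
    d = np.linalg.norm(predict_trajectory(*own) - predict_trajectory(*other), axis=1)
    return bool(np.any(d < min_separation_m))

# Two aircraft on converging tracks (positions in metres, velocities in m/s)
own   = (np.array([0.0, 0.0, 3000.0]),      np.array([200.0, 0.0, 0.0]),    np.zeros(3))
other = (np.array([2000.0, 100.0, 3000.0]), np.array([-180.0, -10.0, 0.0]), np.zeros(3))
print(collision_alert(own, other))   # True -> warn both pilots in real time
```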
As specified above, training flights in many air forces around the world are based on the idea of repeating a specific exercise several times during a single flight session, such as repeatedly landing the aircraft, performing a designated interception task, an A/G exercise, etc. The flight session is broken down into distinct scenes, during each of which the pilot performs the designated exercise and tries to improve for the next scene within the same flight session. To this end, the pilots try to identify the mistakes made during the performed exercise in order to improve in the next scene. The means that may help the pilot are intra-scene indications, such as the aforementioned potential collision alert; however, pilots depend predominantly on their own ability to monitor and memorize the different stages of the scene in order to identify mistakes and attempt to improve on them in the subsequent scene. Obviously, the ability to monitor and memorize events varies from one individual to another, which is an inherent drawback and is error prone, as the pilot is typically very focused on the designated exercise and may overlook important details that could assist him in improving performance during the next scene.
As described above, there are advanced ground debriefing tools which provide accurate and fine-tuned debriefing means for identifying faults, mistakes and the like, assisting the pilot (and possibly other members of a debriefing team) to learn what needs to be improved in future flights. These tools, however, can only be used post factum, i.e. after the flight session has been terminated, and accordingly any conclusions can only be implemented in the next flight session, not in subsequent flight scenes within the same flight session. Considering the very high price tag of a flight session (e.g. about $15,000 for one flight hour of an F-15), the fact that flight performance analysis can only be used after the flight session is a significant disadvantage.
Prior art references considered to be relevant as background to the invention are listed below. Acknowledgement of the references herein is not to be inferred as meaning that these are in any way relevant to the patentability of the invention disclosed herein.
US 4,442,491 discloses an air training evaluation process which corrects the position and velocity sensed by an aircraft's inertial navigation system (INS) to obtain more accurate data on the aircraft's position during an engagement.
In carrying out the process, the aircraft's on-board INS is employed to obtain pre-flight data on position and velocity, data on position and velocity during the engagement, and post-flight data on position and velocity. The pre-flight data on position and velocity and the post-flight data on position and velocity are compared with independently determined data on the pre-flight position and velocity of the aircraft and with independently determined data on the post-flight position and velocity of the aircraft, respectively, to obtain pre-flight error functions and post-flight error functions on position and velocity. These functions and the known time-variant drift characteristics of position and velocity of the INS are employed to derive position and velocity correction functions during the time of the engagement. The position and velocity correction functions are employed to correct the INS data on position and velocity during the engagement. The corrected data are then employed to display, post-flight, a more accurate position of the aircraft during the engagement relative to background portrayals of features or terrain on the earth.
During an engagement of two or more aircraft, aircraft-to-aircraft or aircraft-to-ground corrections may be made to the corrected INS data post-flight for further enhancement of accuracy of the INS data.
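The correction step described above can be pictured as interpolating the INS error between the pre-flight and post-flight fixes and subtracting it from each in-flight sample. The sketch below assumes a simple linear drift between the two fixes for clarity; the actual process relies on the known time-variant drift characteristics of the particular INS, and all names and numbers here are illustrative.

```python
def ins_error(t, t_pre, err_pre, t_post, err_post):
    """Interpolate the INS position/velocity error at engagement time t from the
    pre-flight and post-flight error fixes (assuming linear drift between them)."""
    frac = (t - t_pre) / (t_post - t_pre)
    return err_pre + frac * (err_post - err_pre)

def correct_sample(ins_value, t, t_pre, err_pre, t_post, err_post):
    # Subtract the estimated error from the raw INS reading taken at time t
    return ins_value - ins_error(t, t_pre, err_pre, t_post, err_post)

# Example: 12 m pre-flight error, 48 m post-flight error, sample halfway through
print(correct_sample(10_050.0, t=1800.0, t_pre=0.0, err_pre=12.0,
                     t_post=3600.0, err_post=48.0))   # -> 10020.0 (30 m error removed)
```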
US 5,308,022 discloses that kinematic data is generated onboard an aircraft for describing its state of motion as it flies over a tracking range. The kinematic data is transmitted to a plurality of remote stations on the tracking range and tracking data is generated which represents the actual state vector of the flying aircraft. Also generated is pseudo tracking data, identical to the tracking data but delayed in time, for representing the motion and trajectory of a pseudo aircraft. Utilizing the pseudo tracking data, an image of the flying aircraft is generated from the viewpoint of the cockpit of the pseudo aircraft. A person may then view the image to observe the dynamics of the flying aircraft as if the person were chasing the actual aircraft in the pseudo aircraft. The method permits sophisticated maneuvers in the remote piloting of aircraft as well as flight performance analysis.
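In essence, the pseudo aircraft is the real aircraft's own track replayed with a fixed time lag, so its delayed state vector can serve as the chase-plane viewpoint. A minimal sketch of that buffering idea, with an assumed sample rate and delay, follows.

```python
from collections import deque

class PseudoChase:
    """Replays the real aircraft's state vector with a fixed delay; the delayed
    sample serves as the viewpoint of a 'pseudo' chase aircraft."""
    def __init__(self, delay_samples=40):        # e.g. a 2 s lag at an assumed 20 Hz rate
        self.buffer = deque(maxlen=delay_samples)

    def update(self, state):
        self.buffer.append(state)
        # Until the buffer fills, the pseudo aircraft holds the oldest known state
        return self.buffer[0]
```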
There is a need in the art for an airborne debriefing system/method for post-scene intra-flight debriefing. Such a system/method would provide important information to the pilot(s) between subsequent flight scenes during the same flight session, thereby assisting the pilot(s) to identify mistakes and improve the performance during the flight, and improve the efficiency of training flights.
SUMMARY OF THE INVENTION
By one embodiment, the invention provides a method for performing debriefing during flight of at least one flying platform, comprising: a. performing a flight training session that includes at least two flight scenes; b. monitoring selected data of at least one flying platform during at least one of said flight scenes; c. analyzing the monitored data; d. invoking post scene debriefing of selected parameters that are related to the monitored data in respect of at least one of said scenes.
By a further embodiment, the invention provides a method wherein said flying platform being an aircraft.
By a still further embodiment, the invention provides a method wherein said aircraft being a combat or training aircraft. By another embodiment, the invention provides a method wherein said platform being manned.
By yet another embodiment, the invention provides a method wherein said platform being unmanned.
Still by a further embodiment, the invention provides a method wherein said training session includes landing a flying platform.
By a further embodiment, the invention provides a method wherein said monitored data include access angle, approach velocity, angle of attack during approach; and wherein said selected parameters include at least one of average access angle, minimum velocity and angle of attack before touchdown.
Still by a further embodiment, the invention provides a method wherein said training session involves a formation of at least two flying platforms which can communicate through a data link.
By yet another embodiment, the invention provides a method wherein said training session includes intercepting at least one enemy flying platform by at least one flying platform.
Still by a further embodiment, the invention provides a method wherein said data include distance between the at least two flying platforms, velocity and altitude of each flying platform, susceptibility of each flying platform to hit by A/A missile of the enemy flying platform; and wherein said selected parameters include at least one of maximal distance between the at least two flying platforms, lowest velocity of each flying platform and in what altitude, a number of times that each flying platform was susceptible to hit by an A/A missile of the enemy flying platform.
By still yet another embodiment, the invention provides a method comprising: i. identifying mission data including session, data for monitoring and at least one parameter; ii. uploading to the at least one platform said identified mission data; iii. identifying start of scene and monitoring data during said scene or portion thereof and identifying end of said monitoring; iv. analyzing the monitored data for debriefing, post scene, on said at least one parameter.
By a further embodiment, the invention provides a method wherein said debriefed data is provided to an MFD display configurable to present at least one event that pertains to said at least one parameter.
Thus by a still further embodiment, the invention provides a method wherein said uploading is performed using transfer cartridge or other removable storage.
By still a further embodiment, the invention thus provides a method wherein said uploading is performed using wireless data link. Yet by a still further embodiment, the invention provides a method wherein said data for monitoring and parameters are identified using mission tool.
Thus by a further embodiment, the invention provides a method wherein said monitoring is performed using AACMI system modules.
By a final embodiment, the invention provides a method wherein said AACMI system being EHUD system.
Still by a further embodiment, the invention provides a system for performing debriefing during flight of a flying platform; the system includes a processor and associated database configured to perform the following, comprising: a. monitoring selected data of at least one flying platform during at least one flight scene; the at least one flight scene forms part of at least two flight scenes that constitute a flight training session; b. analyzing the monitored data; c. invoking post scene debriefing of selected parameters that are related to the monitored data in respect of at least one of said scenes.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to understand the invention and to see how it may be carried out in practice, a preferred embodiment will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which: Fig. 1A illustrates an operational scenario using an airborne debriefing method, in accordance with an embodiment of the invention; Fig. 1B illustrates an operational scenario using an airborne debriefing method, in accordance with another embodiment of the invention; Fig. 1C illustrates a typical sequence of operation for determining start scene and end scene events, in accordance with a specific embodiment of the invention; Fig. 2 illustrates a screen layout exemplifying inter-scene debriefing results, in accordance with an embodiment of the invention;
Fig. 3 illustrates a flow diagram of a sequence of operation, in accordance with an embodiment of the invention; and Fig. 4 illustrates a generalized block diagram of system components, in accordance with an embodiment of the invention.
DETAILED DESCRIPTION OF THE INVENTION
In the context of the invention, the term "flight" or "flight session" includes two or more flight scenes, where in each scene the pilot(s) performs identical or similar actions. A non-limiting example is a training flight where the pilot trains how to land the aircraft and each landing attempt, or a selected portion thereof, is regarded as a scene. Another non-limiting example is a couple of aircraft which train how to intercept an "enemy aircraft" which intrudes into friendly territory. Each interception attempt, or a selected portion thereof, may be regarded as a flight scene. Note that whereas the description refers mainly to military applications, it is by no means bound by this specific application. Note also that whereas the description refers mainly to aircraft, the invention is likewise applicable to other manned or unmanned vehicles. Note also that whereas the description refers mainly to pilots, it is likewise applicable to other aircrew members, depending upon the particular application.
Bearing this in mind, attention is drawn to Fig. 1A, illustrating an operational scenario using an airborne debriefing method, in accordance with an embodiment of the invention. By this particular example, the pilot is trained in landings and, to this end, he repeatedly attempts to land the aircraft 1A on a landing course 2A along flight trajectory 3A. Each attempt to land the aircraft 1A is regarded by this example as a distinct flight scene, and the airborne debriefing system in accordance with an embodiment of the invention aims at recording and analyzing data during each landing attempt and applying post scene debriefing in order to provide the pilot with relevant data on the previous landing attempt and to indicate possible faults.
In accordance with one embodiment, flight data can be recorded during the landing process as the aircraft approaches the ground, such as access angle, approach velocity, angle of attack during approach and possibly others. As the aircraft takes off (see 4A), the pilot may activate the debriefing mode before approaching the next scene (in this particular example another landing attempt). The pilot may extract information such as the average access angle, minimum velocity and angle of attack before touchdown (being examples of sought parameters), which are important for achieving a successful landing. Note that in the post scene state (i.e. after a landing attempt and before the next one), the pilot is not focused on performing the (landing) task and can concentrate on the extracted data, analyze it, identify mistakes (e.g. too low an approach velocity and/or too high an angle of attack) and pay more attention to these particular landing parameters during the next scene, i.e. the next landing attempt. The post scene debriefing has the advantage that the pilot can "process" the data without the mental pressure that is inherent when actually focusing on the task, e.g. the landing task. Thus, it would be considerably more difficult for the pilot to monitor and process all these data whilst in the intra scene phase (i.e. when actually landing the aircraft, 5A), since he must be focused on performing the actual landing procedure.
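A minimal sketch of such a landing-scene monitor is shown below: samples are accumulated during the approach and reduced, post scene, to the parameters mentioned above. The class and field names are assumptions for illustration only and do not reflect any particular avionics interface.

```python
class LandingSceneMonitor:
    """Accumulates approach samples during a landing scene and produces the
    post-scene summary the pilot can call up before the next attempt."""
    def __init__(self):
        self.samples = []          # (access_angle_deg, velocity_kts, aoa_deg)

    def record(self, access_angle_deg, velocity_kts, aoa_deg):
        self.samples.append((access_angle_deg, velocity_kts, aoa_deg))

    def debrief(self):
        if not self.samples:
            return None
        angles, velocities, aoas = zip(*self.samples)
        return {
            "average_access_angle_deg": sum(angles) / len(angles),
            "minimum_velocity_kts": min(velocities),
            "aoa_before_touchdown_deg": aoas[-1],   # last sample before touchdown
        }
```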
Those versed in the art will readily appreciate that the invention is not bound by debriefing the specified parameters for the training landing session, and accordingly other parameters may be monitored/debriefed, all depending upon the particular application. Note also that the invention is not bound to any specific timing for performing the post-scene debriefing and, accordingly, any timing following the scene and before starting the next scene is applicable. By way of non-limiting example, the post scene debriefing could be triggered at 6A, 7A or 8A. Note that the scene is dependent on the particular task. For instance, in the example of Fig. 1A, the monitored parameters are access angle, approach velocity and angle of attack during approach (until touchdown). By this specific example, the scene is thus defined by the landing trajectory/duration section through which the specified parameters are monitored. Note generally that the scene has a scene start point and a scene end point, either or both determined manually (namely in response to a command by the pilot) or automatically, namely when certain conditions are met. For example, in accordance with the example of Fig. 1A, a start scene is encountered when certain conditions are met, e.g. when the aircraft is at a certain altitude and at a certain rate of descent. Automatic end scene conditions may be encountered, e.g. when touchdown occurs. These are of course only examples and other conditions may apply, depending upon the particular application. As will be explained in greater detail below, during the scene (or in certain embodiments in portions thereof) certain data of interest may be monitored. The data is analyzed (wholly or partially during the scene) and after the scene the data is debriefed.
Reverting now to Fig. 1A, during the scene the pilot is focused on the actual landing. Accordingly, the data is monitored and partially or fully analyzed. After the scene has terminated (in this particular example at touchdown), a post scene phase commences during which the pilot prepares himself for the next landing attempt. This would involve taking off and approaching another landing attempt. Obviously, at this stage the pilot is not focused on performing the task and can therefore invoke the debriefing action and be informed of his rank in respect of the tested parameters (by this example the access angle, approach velocity and angle of attack during approach, until touchdown). In certain embodiments, the pilot can invoke the debriefing mode at any desired location in the post scene phase. Examples are locations 6A, 7A or 8A, which are disposed after the touchdown of the previous scene and before entering the next landing attempt. Note that the data is debriefed after the end of the scene, when the pilot is no longer concentrating on accomplishing the mission of the scene, e.g. in the latter example, landing.
Note also that in certain embodiments, the analysis (or a portion thereof) of the monitored data (whenever applicable) can be performed during the scene or thereafter. The post scene analysis (whenever applicable) can be performed before or during the debriefing phase.
Note also that subsequent scenes do not necessarily have to be identical. For instance, the pilot may perform a landing scene at another airfield (possibly with a higher rate of descent compared to the previous landing scene) and still assess the scene according to the requested parameters (e.g. the minimal angle of attack during touchdown).
Attention is now drawn to Fig. 1B, illustrating an operational scenario using an airborne debriefing method, in accordance with another embodiment of the invention. This embodiment illustrates yet another non-limiting example of monitoring, analyzing and debriefing parameters for a training landing session. In accordance with this embodiment, the pilot's performance is tested and scored when approaching landing. The purpose of the training is to test whether the pilot performs an appropriate approach for landing. As shown in Fig. 1B, the aircraft should land on the runway 1B (having runway edges 2B1 and 2B2).
More specifically, in order to achieve optimal landing, the aircraft should pass near one of a set of so-called imaginary entry points (of which four are shown in Fig. 1B and designated 3B1 to 3B4), and accordingly what is tested is how closely the aircraft has traversed the entry point.
The relevant entry point depends on the landing direction of the aircraft.
For instance, in the example of Fig. 1B, the landing trajectory of the aircraft 4B should coincide with, or pass near, entry point 3B1. The respective entry point has a given X,Y displacement relative to the runway (in the runway plane) and is placed at a given altitude above the runway. For achieving optimal landing, the aircraft should pass near an entry point within given altitude and distance errors and at a given velocity.
Thus, in accordance with this embodiment, when the landing procedure evaluation is selected during mission planning, two sets of data are loaded from a data-base: geometry and evaluation parameters.
The geometry of the runway is defined by the following data:
• Coordinates of the runway edge points (see 1B for the runway and 2B1, 2B2 for the edges).
• Runway altitude (ZM), namely the altitude above or below sea level.
• Position of the entry points (3B1 to 3B4) relative to the runway.
• Altitude of the entry points (HA), namely the altitude above sea level.
Evaluation parameters are defined per aircraft type. By this particular example, they include:
• dR - allowable distance (error) from an entry point.
• dH - allowable deviation (error) from the entry altitude.
• VA - required velocity at the entry point.
• dV - allowable deviation from the required velocity.
• A1 - an angle that defines a maximum direction range for the approach.
• HM - ceiling of the approach region (above runway altitude).
The HM is used as a barrier entry check. Thus, an aircraft traversing the entry point at an altitude higher than HM is not considered as approaching the runway for landing, and therefore the specified monitoring, analysis and debriefing procedures are not triggered.
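By way of illustration only, the two data sets loaded from the database during mission planning might be represented as follows; the field names, units and values are assumptions and not part of the invention.

```python
RUNWAY_GEOMETRY = {
    "edge_points": [(32.001, 34.885), (32.015, 34.897)],  # runway edge coordinates (illustrative)
    "runway_altitude_m": 40.0,                             # ZM, altitude above/below sea level
    "entry_points": [                                      # 3B1..3B4, positions relative to the runway
        {"dx_m": -1500.0, "dy_m": 300.0, "altitude_m": 340.0},  # HA, altitude above sea level
        # ... remaining entry points
    ],
}

EVALUATION_PARAMS = {          # defined per aircraft type
    "dR_m": 150.0,             # allowable distance error from an entry point
    "dH_m": 30.0,              # allowable deviation from the entry altitude
    "VA_kts": 180.0,           # required velocity at the entry point
    "dV_kts": 10.0,            # allowable velocity deviation
    "A1_deg": 25.0,            # maximum approach direction range
    "HM_m": 900.0,             # ceiling of the approach region (above runway altitude)
}
```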
Similarly, it is assumed that an aircraft that approaches the runway for landing should traverse the entry point at an angle within an angle range (bound by angle A1, see 5B in Fig. 1B). Thus, if an aircraft traverses the entry point at a traversing angle different from A1, the aircraft is not considered as approaching the runway for landing and therefore the specified monitoring, analysis and debriefing procedure is not triggered.
In accordance with this embodiment, the monitoring and analyzing procedure for accomplishing the desired debriefing is invoked periodically while the aircraft is in the air, say for example at a rate of 1 Hz.
The algorithm uses position and velocity data to test if the aircraft (AC) has started a landing procedure. To this end, the following conditions must be met:
• The AC is within a test region (marked by the shaded rectangle 6B in Figure 1B).
• The altitude is below the ceiling HM.
• The approach angle is smaller than A1.
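A possible rendering of this periodic test, reusing the illustrative parameter names introduced in the sketch above, follows; the region representation is an assumption.

```python
def landing_procedure_started(position, altitude_above_runway_m, approach_angle_deg,
                              test_region, params):
    """Returns True when the start-of-landing (S1) conditions described above are met."""
    x, y = position
    (x_min, y_min), (x_max, y_max) = test_region              # shaded rectangle 6B
    in_region = x_min <= x <= x_max and y_min <= y <= y_max
    below_ceiling = altitude_above_runway_m <= params["HM_m"]  # barrier entry check
    within_angle = abs(approach_angle_deg) <= params["A1_deg"] # approach direction check
    return in_region and below_ceiling and within_angle
```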
As will be explained in greater detail below, by this specific embodiment, the scene starts when the S1 conditions are encountered and it ends when the end conditions are encountered (e.g. touchdown). By this particular embodiment, both scene start and scene end conditions are determined automatically.
The position, direction and speed at the cross point are recorded and used for evaluation. The procedure is successful if all the following conditions are met after analyzing the monitored data:
1. The distance D of the aircraft from the nearest entry point is small: D ≤ dR.
2. The altitude H of the aircraft is close to the required value: |H - ZM - (HA - ZM)| ≤ dH.
3. The speed V is close to the required value: |V - VA| ≤ dV.
Having monitored and analyzed the data during the scene, the post scene debriefing can be activated. At any stage following the scene and before the entry to the next landing scene, the pilot can invoke the debriefing and be informed about the analysis result. Note that whilst the debriefing is activated following the scene, it refers to data monitored within the scene.
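The three success conditions can be checked directly on the data recorded at the cross point, for example as in the following sketch (field names assumed as in the earlier sketches).

```python
def evaluate_entry(cross_point, entry_point, params):
    """Checks the three success conditions on the data recorded at the cross point."""
    distance_ok = cross_point["distance_to_entry_m"] <= params["dR_m"]          # D <= dR
    altitude_ok = abs(cross_point["altitude_m"] - entry_point["altitude_m"]) <= params["dH_m"]
    speed_ok = abs(cross_point["speed_kts"] - params["VA_kts"]) <= params["dV_kts"]
    return distance_ok and altitude_ok and speed_ok
```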
For instance, in the example of Fig. 1B, it appears that the distance error of the trajectory 4B relative to the entry point 3B1 does not exceed the specified allowed error dR. By this example, the pilot will hear the debriefing results whenever he decides. The aim is to be as close as possible to the entry point in order to complete a successful landing pattern.
Note that whereas in the latter example the analysis occurs during the scene, this is not necessarily always the case, and accordingly some or all of the analysis stages (e.g. whether |V - VA| ≤ dV) may be performed during the post scene stage. Note also that the post scene debriefing can be performed at any stage following a landing and before entry to the next landing scene. By the latter example, a debriefing can be invoked before approaching the entry point.
Bearing the example of Fig. 1B in mind, attention is drawn also to Fig. 1C, illustrating a typical sequence of operation for determining start scene and end scene events. Thus, at the onset, an enquiry is made as to which S state is encountered (1C); the default is S0. If S0 applies, this means that the start scene condition has not been encountered. Next (2C), it is enquired whether the start scene conditions are encountered (e.g. when certain altitude, velocity and position conditions are met). If in the affirmative, the start scene conditions are met and the state is changed to S1 (3C).
Control is now returned to inquiry 1C, and it is then inquired whether the baseline is crossed (4C). When the baseline is crossed, the state is modified to S2 (5C), as shown in Fig. 1B. Note that in this specific example, as long as the baseline is not crossed, monitoring does not commence. Note also that in accordance with certain other non-limiting embodiments, the S2 state is obviated and recording commences already at S1. Reverting now to Fig. 1C, control is returned to inquiry 1C, and an attempt is made to check whether the aircraft performs a landing maneuver (in which case it must turn and cross the baseline again within a short period, as shown in Fig. 1B), or does not plan to land, in which case it will proceed along a flight trajectory away from the runway. In the latter case, the aircraft will obviously not cross the baseline again for approaching the runway. Bearing all this in mind, an enquiry is made as to whether the aircraft crosses the baseline again. This is achieved through the following series of inquiries. Has the timeout Tout been reached (6C)? If the timeout has been encountered and the aircraft did not cross the same baseline again, this means that the aircraft does not perform a landing maneuver and accordingly the state is reverted to the default S0 (7C). If, on the other hand, the timeout has not been encountered and S2 is of concern (8C), it is enquired whether the same baseline has been crossed (9C). If in the affirmative, the state is changed to S3 (10C), whereas if it is not the same baseline, this means that the aircraft has approached the baseline from the other direction without re-crossing the same baseline, and obviously without attempting to land. This also leads to switching again to the default state S0 (11C). Now, reverting to the basic inquiry 1C and checking that the timeout Tout has not been encountered (6C): if it is still not met and the state is S3 (12C), it is checked whether the end conditions are met (13C), by this particular example whether touchdown has occurred, in which case end scene is declared (14C).
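The sequence of Fig. 1C amounts to a small state machine. The sketch below is one possible rendering of it; the timeout value and the form of the inputs are assumptions, and per the embodiment noted above the S2 state could equally be omitted.

```python
S0, S1, S2, S3 = "S0", "S1", "S2", "S3"

class SceneStateMachine:
    """Minimal sketch of the S0-S3 sequence of Fig. 1C (names and timeout assumed)."""
    def __init__(self, timeout_s=120.0):
        self.state = S0
        self.timeout_s = timeout_s
        self.t_crossed = None

    def step(self, t, start_conditions, crossed_baseline, same_baseline, touchdown):
        if self.state == S0 and start_conditions:
            self.state = S1                          # start-scene conditions met (3C)
        elif self.state == S1 and crossed_baseline:
            self.state, self.t_crossed = S2, t       # baseline crossed outbound (5C)
        elif self.state in (S2, S3) and t - self.t_crossed > self.timeout_s:
            self.state = S0                          # timeout Tout: no landing manoeuvre (7C)
        elif self.state == S2 and crossed_baseline:
            # Same baseline re-crossed -> landing manoeuvre (10C); otherwise abort (11C)
            self.state = S3 if same_baseline else S0
        elif self.state == S3 and touchdown:
            self.state = S0
            return "END_SCENE"                       # end scene declared (14C)
        return self.state
```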
The debriefing will be performed after the end scene and, in this case, will refer to data monitored during S1 in order to see whether the approach conditions to the entry point 3B1 were successfully met.
Those versed in the art will readily appreciate that the invention is of course not bound by the use of the specified states S0-S3, and obviously not by the specific start and end scene conditions described above.
Having described a post scene debriefing sequence with reference to the specific scenarios illustrated in Figs. 1A and 1B, it should be noted that post scene debriefing can apply to other training sessions, such as a formation of aircraft in a training session for intercepting at least one enemy aircraft where, for example, the following data are monitored: distance between the formation members, velocity and altitude of each aircraft, and susceptibility of each aircraft to a hit by an enemy A/A missile. In the post interception scene, the monitored data is analyzed and the following parameters may be debriefed: maximal distance between the formation members, lowest velocity of each aircraft and the altitude at which it occurred, and the number of times that each aircraft was susceptible to a hit by an enemy A/A missile. The pilots can then discuss the so-extracted parameters, identify weak points and perform better in the next formation scene (e.g. ensuring that the maximal distance does not exceed a predetermined value in order to maintain the ability to protect each other). The start and end scene events for the formation training can be determined automatically (say, when a certain distance between aircraft is encountered) or manually, e.g. when the pilot provides an appropriate command for starting to monitor the pertinent data for later performing the formation flight debriefing stage.
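For the formation interception example, the post-scene reduction of the monitored data to the debriefed parameters could look roughly as follows; the sample structure, member names and units are assumed for illustration.

```python
def formation_debrief(samples):
    """Post-scene summary for an interception scene flown by a formation.

    `samples` is a list of per-timestep dicts, e.g.
    {"separation_nm": 1.8,
     "velocities_kts": {"blue1": 420, "blue2": 435},
     "altitudes_ft":   {"blue1": 15000, "blue2": 17000},
     "threatened":     {"blue1": False, "blue2": True}}
    """
    members = samples[0]["velocities_kts"].keys()
    return {
        "max_separation_nm": max(s["separation_nm"] for s in samples),
        "lowest_velocity": {
            # (lowest velocity, altitude at which it occurred) per formation member
            m: min((s["velocities_kts"][m], s["altitudes_ft"][m]) for s in samples)
            for m in members
        },
        "missile_threat_count": {
            # number of monitored samples in which the member was susceptible to a hit
            m: sum(s["threatened"][m] for s in samples) for m in members
        },
    }
```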
Those versed in the art will readily appreciate that the invention is not bound by monitoring and debriefing the specified data and parameters for the interception/landing session, and accordingly other data/parameters may be monitored/debriefed, all depending upon the particular application.
Note that the more parameters there are to monitor, the more difficult it is to monitor them during the intra-scene phase, and the more useful the post-scene debriefing becomes for accomplishing better performance in the next scene.
Another non-limiting training session is an air-to-ground flight, where the following data are monitored: distance from the target and the aircraft's altitude; these data are analyzed and later debriefed. Here also, other data/parameters may be monitored/debriefed, all as required and appropriate.
There follows a description of another non-limiting example concerning an airborne debriefing preparation and execution scenario. The session described is a 2 vs. 2 air-to-air combat, wherein two aircraft are considered Blue aircraft and two aircraft are considered Red.
In accordance with this embodiment, the first stage before the flight will be to identify the parameters which will be required for airborne debriefing. These parameters may be identified using a Mission Planning tool, e.g. mission planning module of the EHUD AACMI system, commercially available from IAI limited, Israel. The training members (i.e. by this example the pilots) identify the desired session for debriefing (e.g. 2 vs. 2 air-to-air combat) and parameters (e.g. threat by enemy missile and maximal distance between participants) in order to be able to perform the analysis and debriefing of the required data in a short time during the flight. As described, there are many parameters pilots can debrief during the flight, and for faster retrieval, it is important to identify these parameters before the flight.
The second stage will be to upload the Mission Planning data into the airborne debriefing system. This may be done by using e.g. an existing aircraft data transfer cartridge (DTC), any other type of removable data storage module (RDS), or by way of another non limiting example, wireless communication.
The third stage is to identify the beginning and the end of the training scene that will later be debriefed. The start/end scene events can be preprogrammed or performed manually by the pilot at a selected stage during the flight. In accordance with certain embodiments, the pilot will identify the beginning and the end of the scene using an interface to the airborne debriefing system. The system will start monitoring (e.g. by recording) data of the self aircraft and may record data of other participating aircraft via a datalink system. If desired, the system may record data of interest using existing devices, such as data gathered from the aircraft's existing AACMI (Autonomous Air Combat Maneuvering Instrumentation) system.
During the fourth stage, after the pilot has selected the end of the session, the system starts to analyze the monitored data in order to debrief the sought parameters. As will be explained in greater detail below, in order to provide an output which is relevant to the pilot, the system runs algorithms which analyze the data according to the parameters requested in advance in the mission planning. Note that the invention is not bound by the specified split into distinct stages or by the contents of each stage.
Bearing in mind the specified stages, there is shown in Fig. 2 an exemplary debriefing Multi-Function Display (MFD) (10) containing a plurality of operational selectable buttons (OSB) 10A-10T. The operational selectable buttons 10A-10T serve a variety of functions on the display 10 and can be programmed to execute particular elements of airborne debriefing in accordance with various embodiments of the invention. The manner of programming the buttons 10A-10T is generally known, per se, and therefore will not be further expounded herein.
By this specific example, OSB 10G allows the pilot to identify the starting phase (start scene) of a training scene in order to start the recording of the relevant data. OSB 10H allows the pilot to identify the end phase of a training scene in order to end the recording of the data (end scene). Start and end points are entered, e.g., in order to prevent irrelevant data recording and analysis. For instance, when a formation is flying to the training zone, it is not relevant if one of the aircraft is vulnerable to an "enemy" missile launch since, at this stage, there is no training session running.
By this specific example, OSB 10I allows the pilot to select between different debriefing parameters (during the post scene debriefing) as set in the mission planning application. For example, a pilot can enter the "threat by missile" parameter and then toggle between the different events 10A and 10B. Each event illustrates a snapshot of a scenario under consideration, e.g. a 2D picture of the scenario will be presented on the MFD to help the pilot understand the location of each participant during the analyzed "threat by missile" parameter. More specifically, in the post scene debriefing the pilot is presented with an intra scene event (i.e. an event that occurred during the scene and pertains to the sought parameter(s)) where a first aircraft (11 in Fig. 2, indicated at an altitude of 11,000 feet) is shown together with the threatening aircraft (at 15,000 feet), where the latter aircraft threatens the first aircraft with a missile. The event includes the relevant event data, such as the relative distance between the aircraft (2.2 NM), the closing velocity (120 kts) and the aspect angle (11R) when the threatening event occurred. The other couple of aircraft, at respective altitudes of 15,000 feet and 23,000 feet, are also shown (at the lower right part of the screen).
The invention is, of course, not bound by the use of the MFD as an output device, nor by the use of the specified particular buttons for accomplishing the recording/analysis operations. Note that by the specific example illustrated in Fig. 2, the analysis phase took into account and recorded the following data: real-time distance between opponent aircraft, aspect angle, closing velocity between aircraft, flight direction of the aircraft, seeker type, and weapon (missile envelope, including range and flight characteristics of the missile), and used known per se kill assessment algorithms in order to debrief the "threat by missile" parameter and present event(s) where the sought parameter occurred.
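For illustration, a single debriefed "threat by missile" event, with the data values taken from the example of Fig. 2, might be held and rendered as sketched below; the field names and the formatting function are assumptions rather than the actual MFD interface.

```python
# One debriefed "threat by missile" event, as it might be passed to the MFD page
threat_event = {
    "parameter": "threat_by_missile",
    "own_altitude_ft": 11_000,
    "threat_altitude_ft": 15_000,
    "relative_distance_nm": 2.2,
    "closing_velocity_kts": 120,
    "aspect_angle": "11R",
    "assessment": "inside missile envelope",   # output of a kill-assessment algorithm
}

def format_event_for_display(event):
    """Renders the event as a short text summary shown alongside the 2D snapshot."""
    return (f"{event['parameter']}: {event['relative_distance_nm']} NM, "
            f"{event['closing_velocity_kts']} kts closing, aspect {event['aspect_angle']}")

print(format_event_for_display(threat_event))
```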
By one embodiment, the various modules of the Main Airborne Debriefing System software (MADS) may include existing and/or new interfaces and controllers in order to provide hardware inputs. By this specific example, the MADS receives inputs from the signal and data collection subsystems and the data link module, and performs all debriefing calculations based on preprogrammed logic and mission planning inputs. Output from the main airborne debriefing system is transmitted to the audio and/or visual user interface displays. The invention is, of course, not bound by the use of MADS.
Turning now to Fig. 3, there is shown a flow diagram 30, in accordance with an embodiment of the invention. As shown, at first, mission plan data, including session type (such as 2 vs. 2 air combat) and parameter(s), are loaded (31). Thereafter, upon receipt of a start scene command (32), the data relevant for later analyzing the sought parameters are recorded (33) during the scene. The recording continues until an end scene command is received (34). Note that by this embodiment, data is monitored throughout the entire scene. The recorded data are then analyzed to arrive at the results according to the sought session(s) and parameter(s) (35). Upon receipt of a debrief command, the analyzed data is output to the output device (36). Note that by another embodiment, the analysis, or certain parts thereof, can be performed in response to the issuance of the debrief command, all depending upon the particular application.
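The sequence of Fig. 3 can be summarized by the following sketch, which simply routes events to the recording, analysis and output steps; the event structure and the placeholder analyze() function are assumptions for illustration only.

```python
def analyze(samples, parameters):
    """Placeholder analysis: compute whatever summary the mission plan asked for."""
    return {p: len(samples) for p in parameters}        # illustrative only

def airborne_debrief_cycle(mission_plan, data_source, output_device, events):
    """Sketch of the Fig. 3 sequence: load plan (31), record between start- and
    end-scene commands (32-34), analyze (35), output on the debrief command (36)."""
    recorded, results = [], None
    for event in events:                                 # stream of pilot/automatic events
        if event["type"] == "start_scene":
            recorded = []
        elif event["type"] == "sample":
            recorded.append(data_source(event["t"]))     # monitor data during the scene
        elif event["type"] == "end_scene":
            results = analyze(recorded, mission_plan["parameters"])
        elif event["type"] == "debrief" and results is not None:
            output_device(results)                       # e.g. MFD page or voice message
```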
Turning now to Fig. 4, there is shown a generalized block diagram of the known EHUD airborne POD system components (40), in accordance with an embodiment of the invention. The logic of the invention, as described with reference to certain embodiments above, is executed on an integrated main module (41) fitted in the POD, which is mounted on an aircraft. The module includes a CPU (42) for realizing the various processing tasks. It also accommodates a data link module (43) for receiving network members' data. A voice module 44 facilitates storage and activation of voice output, for instance voice messages activated during the airborne debriefing phase. A GPS antenna 45 provides location data that are useful e.g. for navigation commands. A power supply 46 provides power and an RFI unit 47 filters out radiation interference, all as generally known per se.
A BMF filter 48 aims at filtering S-band frequency transmission interference over the GPS receiver. A data link module 49 serves for facilitating communication with mating aircraft under the control of module 43, e.g. in a formation training session for receiving mating aircraft location data. The voice transmitter module 401 is coupled to the voice module 44 and allows transmission of voice data to the pilot. The entire system 40 is fitted in a POD (402) coupled, for example, to an AIM-9 interface 403.
Those versed in the art will readily appreciate that the system architecture of the invention is not bound by the generalized example illustrated in Fig. 4.
The system and method, according to certain embodiments of the invention, would provide the pilot(s), at his request, with feedback after the training scene has been completed. This tool would provide the pilots with means to improve during the flight, obviating the need to wait for the post-flight debriefing.
It is very typical that, when a trainee builds up his flying skills, an instructor flies with the trainee, serving, inter alia, as "manned debriefing means" by observing the trainee's behavior, identifying mistakes and providing helpful comments for improving performance during subsequent scenes within the same flight. A manned instructor obviously poses an overhead and, in addition, requires a two-seat aircraft. Using an airborne debriefing system in accordance with the invention would, in certain scenarios, obviate the need for a manned instructor and provide the trainee with the necessary information in order to improve in the next flight scene.
The present invention has been described with a certain degree of particularity, but those versed in the art will readily appreciate that other modifications and alterations can be carried out without departing from the scope of the following Claims:

Claims (28)

CLAIMS:
1. A computerized method for performing debriefing during flight of at least one flying platform, comprising: a. performing a flight training session that includes at least two flight scenes; the training session includes at least one post scene that separates at least two of said flight scenes; b. automatically monitoring selected data of at least one flying platform during at least one of said flight scenes; c. automatically analyzing the monitored data; and d. providing during at least one of said post scenes a post scene debriefing of selected parameters that are related to the monitored data in respect of at least one of said scenes.
2. The method according to Claim 1, wherein said flying platform being an aircraft.
3. The method according to Claim 2, wherein said aircraft being a combat or training aircraft.
4. The method according to any one of the preceding claims, wherein said platform being manned.
5. The method according to any one claims 1 to 3, wherein said platform being unmanned.
6. The method according to any one of the preceding claims, wherein said training session includes landing a flying platform.
7. The method according to Claim 6, wherein said monitored data include access angle, approach velocity, angle of attack during approach; and wherein said selected parameters include at least one of average access angle, minimum velocity and angle of attack before touchdown.
8. The method according to any one of the preceding claims, wherein said training session involves a formation of at least two flying platforms which can communicate through a data link.
9. The method according to any one of the preceding claims, wherein said training session includes intercepting at least one enemy flying platform by at least one flying platform.
10. The method according to Claim 9, wherein said data include distance between the at least two flying platforms, velocity and altitude of each flying platform, susceptibility of each flying platform to hit by A/A missile of the enemy flying platform; and wherein said selected parameters include at least one of maximal distance between the at least two flying platforms, lowest velocity of each flying platform and in what altitude, a number of times that each flying platform was susceptible to hit by an A/A missile of the enemy flying platform.
11. The method according to any one of the preceding Claims, comprising i) identifying mission data including session data for monitoring and at least one parameter; ii) uploading to the at least one platform said identified mission data; iii) identifying start of scene and monitoring data during said scene or portion thereof and identifying end of said scene; iv) analyzing the monitored data for debriefing, post scene, on said at least one parameter.
12. The method according to any one of the preceding claims, wherein said debriefed data is provided to MFD display configurable to present at least one event that pertains to said at least one parameter.
13. The method according to Claim 11, wherein said uploading is performed using transfer cartridge or other removable storage.
14. The method according to Claim 11, wherein said uploading is performed using wireless data link.
15. The method according to any one of the preceding claims, wherein said data for monitoring and parameters are identified using mission tool.
16. The method according to any one of the preceding claims wherein said monitoring is performed using AACMI system modules, and wherein said AACMI system modules are used for debriefing.
17. The method according to Claim 16, wherein said AACMI system being EHUD system.
18. A system for performing debriefing during flight of a flying platform; the flight includes a training session that includes at least two flight scenes; the training session includes at least one post scene that separates at least two of said flight scenes; the system including a processor and associated database, a data link associated to the processor and configured to communicate data to and from at least one other flying platform; the processor and associated database are configured to perform the following, comprising: a. monitoring selected data of at least one flying platform during at least one flight scene; the at least one flight scene forms part of at least two flight scenes that constitute a flight training session; b. analyzing the monitored data; and c. providing during at least one of said post scenes a post scene debriefing of selected parameters that are related to the monitored data in respect of at least one of said scenes.
19. The system according to Claim 18, wherein said flying platform being an aircraft.
20. The system according to Claims 18 or 19, wherein said platform being manned.
21. The system according to Claims 18 or 19, wherein said platform being unmanned.
22. The method according to any one of claims 1 to 17, further comprising at least one additional flying platform and wherein said (b) further comprising communicating data from said additional flying platform through a data link.
23. The method according to any one of claims 1 to 17, wherein said flight scene is either started or ended manually.
24. The method according to any one of claims 1 to 17, wherein said flight scene is either started or ended automatically.
25. The system according to Claim 18, wherein said flight scene is either started or ended manually.
26. The system according to Claim 18, wherein said flight scene is either started or ended automatically.
27. A method according to any one of claims 1 to 17 or 22 to 24, substantially as described herein with reference to the accompanying drawings.
28. A system according to any one of claims 18 to 21, 25 or 26, substantially as described herein with reference to the accompanying drawings. For the Applicants, REINHOLD COHN & PARTNERS
IL179304A 2004-06-01 2006-11-15 Method and system for debriefing flying scenes IL179304A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
IL179304A IL179304A (en) 2004-06-01 2006-11-15 Method and system for debriefing flying scenes

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
IL16226504A IL162265A0 (en) 2004-06-01 2004-06-01 Flight debriefing system
PCT/IL2005/000567 WO2005119629A2 (en) 2004-06-01 2005-06-01 A method and system for debriefing flying scenes
IL179304A IL179304A (en) 2004-06-01 2006-11-15 Method and system for debriefing flying scenes

Publications (2)

Publication Number Publication Date
IL179304A0 IL179304A0 (en) 2007-03-08
IL179304A true IL179304A (en) 2011-03-31

Family

ID=35463599

Family Applications (2)

Application Number Title Priority Date Filing Date
IL16226504A IL162265A0 (en) 2004-06-01 2004-06-01 Flight debriefing system
IL179304A IL179304A (en) 2004-06-01 2006-11-15 Method and system for debriefing flying scenes

Family Applications Before (1)

Application Number Title Priority Date Filing Date
IL16226504A IL162265A0 (en) 2004-06-01 2004-06-01 Flight debriefing system

Country Status (2)

Country Link
IL (2) IL162265A0 (en)
WO (1) WO2005119629A2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2482740A1 (en) * 2012-12-18 2014-08-04 Universitat Politècnica De Catalunya Improvement system for pilots of aircraft performing aerial photography and method relating to said system
CN112379992B (en) * 2020-12-04 2024-01-30 中国科学院自动化研究所 Role-based multi-agent task cooperative message transmission and exception handling method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5308022A (en) * 1982-04-30 1994-05-03 Cubic Corporation Method of generating a dynamic display of an aircraft from the viewpoint of a pseudo chase aircraft
US5826206A (en) * 1996-03-12 1998-10-20 Training Inovations Group, Llc Debriefing systems and methods for retrieving and presenting multiple datastreams with time indication marks in time synchronism

Also Published As

Publication number Publication date
WO2005119629A2 (en) 2005-12-15
IL162265A0 (en) 2005-11-20
IL179304A0 (en) 2007-03-08
WO2005119629A3 (en) 2006-05-04

Similar Documents

Publication Publication Date Title
US9099009B2 (en) Performance-based simulation system for an aircraft
US7599765B2 (en) Dynamic guidance for close-in maneuvering air combat
US5787333A (en) Aircraft survivability equipment training method and apparatus for low flyers
US9840328B2 (en) UAS platforms flying capabilities by capturing top human pilot skills and tactics
EP1355286A2 (en) Autonomous on-board and in-flight generated weapon simulation system for representation of virtual scenarios
Rochlin Iran air flight 655 and the USS Vincennes: Complex, large-scale military systems and the failure of control
García et al. Autonomous drone with ability to track and capture an aerial target
Laird et al. Coordinated behavior of computer generated forces in TacAir-Soar
EP0654776B1 (en) Pilot training device
IL179304A (en) Method and system for debriefing flying scenes
KR102140291B1 (en) Ground control station for controlling of suicide type unmanned plane
Dawkins et al. Deployment and flight operations of a large scale UAS combat swarm: Results from DARPA service academies swarm challenge
Tirpak The robotic air force
Giles et al. Expanding domains for multi-vehicle unmanned systems
Geister et al. Operational integration of UAS into the ATM system
Stevenson Assessment of the equivalent level of safety requirements for small unmanned aerial vehicles
Grant Eyes Wide Open
Conwell et al. Evolution of Human Systems Integration for Remotely Piloted Aircraft Systems
Sunil et al. Male RPAS Integration into European Airspace: Part 1: Real-Time Simulation Analysis of Contingencies in TMAs
Geuther et al. Safeguard with autonomous navigation demonstration technical closeout report
Ren et al. Cooperative Surveillance with Multiple UAVs
EP0969439A2 (en) System for training pilots
Swihart et al. A sensor integration technique for preventing collisions between air vehicles
How et al. Multi-vehicle experimental platform for distributed coordination and control
Ferry et al. Towards true UAV autonomy

Legal Events

Date Code Title Description
FF Patent granted
KB Patent renewed
KB Patent renewed
KB Patent renewed