WO2003096303A1 - Display of Elements - Google Patents


Info

Publication number
WO2003096303A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
virtual
feature
aircraft
Prior art date
Application number
PCT/AU2003/000571
Other languages
English (en)
Inventor
David Grant Roberts
Original Assignee
David Grant Roberts
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by David Grant Roberts filed Critical David Grant Roberts
Priority to AU2003221641A1
Publication of WO2003096303A1

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 9/00 - Simulators for teaching or training purposes
    • G09B 9/02 - Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B 9/08 - Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
    • G09B 9/30 - Simulation of view from aircraft
    • G09B 9/36 - Simulation of night or reduced visibility flight
    • G09B 9/38 - Simulation of runway outlining or approach lights
    • G09B 9/301 - Simulation of view from aircraft by computer-processed or -generated image
    • G09B 9/302 - Simulation of view from aircraft by computer-processed or -generated image, the image being transformed by computer processing, e.g. updating the image to correspond to the changing point of view

Definitions

  • the present invention relates to a method and apparatus for displaying one or more virtual features to an operator of a vehicle, and in particular to a system for displaying virtual features to pilots of aircraft.
  • Flight simulators are ground-based devices used in the training of pilots and crew members. Typically such devices include structures which mimic the interior of an aircraft cockpit and the sounds and fields of vision encountered during flight. They may also be mounted on hydraulics to mimic aircraft motion. Such ground-based simulators have been used to train pilots in basic manoeuvres and instrument navigation procedures. These devices are, however, limited in the complexity of manoeuvres that may be taught and also in the realism of the effect produced. This is particularly the case due to the limited range of movement, which makes it difficult to simulate the g-forces encountered during manoeuvres. The total experience does not equate to manipulating the controls of an airborne aircraft, particularly in relation to take-off and landing procedures.
  • a particular problem in training arises because an engine and propeller idling produces different drag behaviour, and hence aircraft performance, to that which occurs with an engine and propeller stopped.
  • the former situation arises in a training exercise, while the latter occurs in a real emergency.
  • Training for an effective response has therefore traditionally involved an instructing pilot depowering or feathering one engine during a training flight while paying close attention to the student's response in case it is necessary to override that student in order to avert a disastrous, and often fatal, collision with the ground. This aspect of training has led to a significant number of accidents and mortalities during the training period.
  • Training in landing procedures is also known to produce risk particularly in adverse weather conditions such as high wind and also due to incorrect aircraft attitude, speed and angle of descent.
  • landing strips in mountainous regions may be sloped, to minimise the required length.
  • the aircraft will land in an uphill direction, such that the slope helps the aircraft slow down faster, whilst taking off is in a downhill direction, to help the aircraft gain the necessary speed for take off.
  • at some landing strips, in places such as Papua New Guinea, the landing strip has a large hill behind it. This prevents the aircraft flying over the strip below a certain altitude. Thus landing and take-off can only proceed from one end of the landing strip. This form of situation is difficult to train for practically, due to the inherent risks should a problem occur in the approach to the landing strip.
  • the present invention provides a method of allowing a user of a moving vehicle to interact with a virtual feature, the method including causing a processor to: a) Determine a position for the virtual feature; b) Determine the current position of the vehicle in accordance with signals from a sensor, the sensor being adapted to generate a signal representative of the position of the vehicle; c) Determine the relative position of the virtual feature with respect to the vehicle; d) Generate an image of the virtual feature in accordance with: i) The determined relative position; and, ii) Feature data stored in a store, the feature data representing one or more virtual features; e) Present the image to the user on a display; and, f) Repeat steps (b) to (e) as required.
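By way of illustration only, the following Python sketch shows one possible realisation of steps (b) to (f) as a simple display loop. The `Position` type, the flat-earth offset calculation and the `read_sensors`/`render` callables are assumptions made for the sketch and are not part of the disclosed method.

```python
import math
import time
from dataclasses import dataclass

@dataclass
class Position:
    lat: float   # degrees
    lon: float   # degrees
    alt: float   # metres

def relative_position(vehicle: Position, feature: Position):
    """Approximate east/north/up offsets of the feature from the vehicle (flat-earth)."""
    r_earth = 6_371_000.0
    north = math.radians(feature.lat - vehicle.lat) * r_earth
    east = math.radians(feature.lon - vehicle.lon) * r_earth * math.cos(math.radians(vehicle.lat))
    up = feature.alt - vehicle.alt
    return east, north, up

def display_loop(read_sensors, render, feature: Position, period=0.05):
    """Steps (b)-(e), repeated as required (step (f))."""
    while True:
        vehicle = read_sensors()                        # (b) current vehicle position
        offset = relative_position(vehicle, feature)    # (c) relative position of the feature
        render(offset)                                  # (d)-(e) generate and present the image
        time.sleep(period)
```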
  • the position of the vehicle typically includes at least one of: a) A vehicle location; and, b) A vehicle orientation.
  • the vehicle can be an aircraft, in which case the location typically includes the latitude, longitude and altitude of the aircraft.
  • the orientation typically includes at least one of: a) The pitch; b) The roll; and, c) The heading.
  • the method can include determining the location of the vehicle using at least one of: a) An absolute sensor for detecting the absolute location of the vehicle; and, b) A relative sensor for detecting movement of the vehicle.
  • the processor being coupled to the absolute and relative sensors, the method including causing the processor to: a) Periodically determine an absolute vehicle location in accordance with signals received from the absolute sensors; b) Determine movement of the vehicle from the absolute position in accordance with signals received from the relative sensors; and, c) Determine the current location of the vehicle in accordance with the most recently determined absolute location and any movement of the vehicle therefrom.
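As a minimal sketch of combining periodic absolute fixes with relative-sensor movement in this way, assuming a local metric frame and accelerometer-style relative readings (the class and method names are illustrative only, not part of the claimed method):

```python
class DeadReckoner:
    """Tracks the current location between absolute (e.g. GPS) fixes by
    integrating relative-sensor readings away from the last fix."""

    def __init__(self, initial_position, initial_velocity=(0.0, 0.0, 0.0)):
        self.position = list(initial_position)   # metres, local frame
        self.velocity = list(initial_velocity)   # metres/second

    def absolute_fix(self, position):
        # (a) reset to the most recently determined absolute location
        self.position = list(position)

    def relative_update(self, acceleration, dt):
        # (b) accumulate movement of the vehicle away from the absolute location
        for i in range(3):
            self.velocity[i] += acceleration[i] * dt
            self.position[i] += self.velocity[i] * dt

    def current_location(self):
        # (c) last absolute location plus any movement therefrom
        return tuple(self.position)
```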
  • the method may include causing the processor to determine the current vehicle position in accordance with a PID algorithm.
  • the method may include causing the processor to determine the orientation of the vehicle in accordance with a vector analysis of the vehicle location.
  • the method may involve causing the processor to: a) Determine a number of successive vehicle locations at predetermined time intervals; b) Determine a direction vector connecting each pair of the successive vehicle locations; and, c) Determine the curvilinear acceleration in accordance with the direction vectors.
  • the processor can also be coupled to a flow sensor for determining the direction of air flow over the aircraft, the method including causing the processor to determine the heading or pitch of the aircraft in accordance with signals from the flow sensor.
  • where the vehicle is an aircraft it can include a number of orientation sensors, the method including causing the processor to determine the orientation of the aircraft in accordance with signals from the orientation sensors.
  • the processor may be coupled to an input, the method including causing the processor to respond to commands received via the input to perform at least one of: a) Selecting the one or more virtual features to be displayed; and, b) Determining the position of the one or more virtual features.
  • the commands can be received from at least one of: a) A user; and, b) A remote processing system.
  • the display preferably includes a headset, the method including causing the processor to: a) Determine the position of the headset in accordance with signals received from a headset position sensor; and, b) Generate the images in accordance with the determined headset position.
  • the headset position sensor can be adapted to generate signals representing at least one of: a) The location of the headset within the vehicle; and, b) The orientation of the headset with respect to the vehicle.
  • the method may include causing the processor to generate images by: a) Determining a transformation matrix in accordance with the relative position of the aircraft with respect to the virtual feature; b) Applying the transformation matrix to the feature data to generate modified feature data; and, c) Generating the image in accordance with the modified feature data.
  • the method may include causing the processor to: a) Determine a second transformation matrix in accordance with the position of the headset; b) Apply the second transformation matrix to the modified feature data to generate further modified feature data; and, c) Generate the image in accordance with the further modified feature data.
  • the method may include causing the processor to generate the image by: a) Generating a three dimensional model of the feature in accordance with the feature data; and, b) Applying one or more textures to the model.
  • the method can include causing the processor to control the position of the virtual feature.
  • the virtual feature may represent another physical object.
  • the method may include determining the position of the vehicle using at least one of: a) GPS sensors; and, b) Accelerometers.
  • the method may include causing the processor to generate an image including multiple virtual features.
  • the relative position of the virtual features can be defined in accordance with data stored in the store.
  • the method can also include causing the processor to store the vehicle position in the store, thereby allowing the motion of the vehicle to be subsequently determined.
  • the present invention provides a method of training a user to control a vehicle, the method including: a) Selecting a virtual feature; b) Determining a position for the virtual feature; c) Generating an image of the virtual feature, in accordance with the relative position of the vehicle and the virtual feature; and, d) Presenting the image to the user, thereby allowing the user to interact with the virtual feature to thereby gain experience in controlling the vehicle.
  • the method may be performed in accordance with the method of the first broad form of the invention.
  • the method can also include determining the position of the virtual feature to be physically separated from any physical objects, thereby reducing the risk of the user hitting an object with the vehicle.
  • the vehicle may be an aircraft, the method including determining the position of the virtual feature to be at a predetermined altitude above the ground.
  • the method can be adapted to train the user in take off and landing procedures, the virtual feature representing at least a landing strip.
  • the present invention provides apparatus for allowing a user of a moving vehicle to interact with a virtual feature, the apparatus including: a) At least one sensor for generating a signal representative of the position of the vehicle; b) A store for storing feature data representing one or more virtual features; c) A display; and, d) A processor adapted to: i) Determine a position for the virtual feature; ii) Determine the current position of the vehicle in accordance with signals from the sensor; iii) Determine the relative position of the virtual feature with respect to the vehicle; iv) Generate an image of the virtual feature in accordance with the determined relative position; v) Present the image to the user on the display; and, vi) Repeat steps (d-ii) to (d-v) as required.
  • the apparatus can be adapted to perform the method of the first or second broad forms of the invention.
  • the apparatus can include at least one of: a) An absolute sensor for detecting the absolute location of the vehicle; and, b) A relative sensor for detecting movement of the vehicle.
  • the absolute sensor can include a GPS sensor, with the relative sensor being one or more accelerometers.
  • the apparatus can include a flow sensor, the processor being adapted to determine an orientation in accordance with signals from the flow sensor.
  • the processor can be coupled to an input, the processor being adapted to respond to commands received via the input to perform at least one of: a) Selecting the one or more virtual features to be displayed; and, b) Determining the position of the one or more virtual features.
  • the processor can be coupled to a remote processing system, the input commands being received from the remote processing system.
  • the display preferably includes a headset, the processor being adapted to: a) Determine the position of the headset in accordance with signals received from a headset position sensor; and, b) Generate the images in accordance with the determined headset position.
  • the headset position sensor can be adapted to generate signals representing at least one of: a) The location of the headset within the vehicle; and, b) The orientation of the headset with respect to the vehicle.
  • the present invention provides a computer program product for allowing a user of a moving vehicle to interact with a virtual feature, the computer program product including computer executable code which, when executed on a suitable processing system, causes the processing system to perform the method of the first or second broad form of the invention.
  • the present invention provides apparatus for training a user to control a vehicle, the apparatus including: a) At least one sensor for generating a signal representative of the position of the vehicle; b) A store for storing feature data representing one or more virtual features; c) A display; and, d) A processor adapted to: i) Select a virtual feature; ii) Determine a position for the virtual feature; iii) Generate images of the virtual feature, in accordance with the relative position of the vehicle and the virtual feature; and, iv) Present the images to the user, thereby allowing the user to interact with the virtual feature to thereby gain experience in controlling the vehicle.
  • the apparatus can be apparatus according to the third broad form of the invention.
  • the present invention provides a method of allowing a number of vehicles to interact with at least one virtual feature, each vehicle including a display system for presenting an image of a virtual feature, the method including: a) Determining a position for the virtual feature; b) Transferring an indication of the position of each virtual feature to each vehicle; c) Causing the display system provided in each vehicle to: i) Determine the relative position of the vehicle with respect to the virtual feature; ii) Generate an image of the virtual feature in accordance with: (1) The determined relative position; and,
  • the method can include causing a processing system to determine the position of the virtual feature in accordance with at least one of: a) Feature data stored in a store; and, b) Input commands provided by an operator.
  • the processing system can be the display system of one of the vehicles, or a separate base station.
  • the position of the virtual feature can include an indication of at least one of: a) A location including at least one of: i) A latitude; ii) A longitude; and, iii) An altitude; and, b) An orientation.
  • the method may include using a display system having: a) At least one sensor for generating a signal representative of the position of the vehicle; b) A store for storing feature data representing one or more virtual features; c) A display; and, d) A processor adapted to: i) Determine the position of the virtual feature from the processing system; ii) Determine the current position of the vehicle in accordance with signals from the sensor; iii) Determine the relative position of the virtual feature with respect to the vehicle; iv) Generate an image of the virtual feature in accordance with the determined relative position; v) Present the image to the user on the display; and, vi) Repeat steps (d-ii) to (d-v) as required.
  • the display system being adapted to perform the method of any one of the first or second broad forms of the invention.
  • the method can further include: a) Causing the display system to periodically store position data indicating the position of the vehicle; b) Transferring the position data to the processing system; and, c) Storing the position data in a database for subsequent retrieval.
  • the processing system can be adapted to provide an indication of the same one or more virtual features to a number of vehicles.
  • the one or more virtual features may represent a landing strip.
  • the present invention provides apparatus for allowing a number of vehicles to interact with at least one virtual feature, each vehicle including a display system for presenting an image of a virtual feature, the apparatus including a processing system adapted to: a) Determine a position for the virtual feature; and, b) Transfer an indication of the position of each virtual feature to each vehicle, the display system being adapted to: i) Determine the relative position of the vehicle with respect to the virtual feature; ii) Generate an image of the virtual feature in accordance with: (1) The determined relative position; and,
  • the processing system can be adapted to determine the position of the virtual feature in accordance with at least one of: a) Feature data stored in a store; and, b) Input commands provided by an operator.
  • the position of the virtual feature including an indication of at least one of: a) A location including at least one of: i) A latitude; ii) A longitude; and, iii) An altitude; and, b) An orientation.
  • Each display system may include: a) At least one sensor for generating a signal representative of the position of the vehicle; b) A store for storing feature data representing one or more virtual features; c) A display; and, d) A processor adapted to: i) Determine the position of the virtual feature from the processing system; ii) Determine the current position of the vehicle in accordance with signals from the sensor; iii) Determine the relative position of the virtual feature with respect to the vehicle; iv) Generate an image of the virtual feature in accordance with the determined relative position; v) Present the image to the user on the display; and, vi) Repeat steps (d-ii) to (d-v) as required.
  • the display system being adapted to perform the method of any one of the first or second broad forms of the invention.
  • the processing system can be adapted to provide an indication of the same one or more virtual features to a number of vehicles.
  • the one or more virtual features may represent a landing strip.
  • Figure 1 is a schematic diagram of an example of a system for implementing the present invention;
  • Figures 2A and 2B are a flow chart of the operation of the system of Figure 1;
  • Figures 3A and 3B are schematic diagrams of the relative position of an aircraft with respect to a virtual landing strip;
  • Figures 4A and 4B are schematic diagrams of the images presented for the relative positions of Figures 3A and 3B respectively;
  • Figure 5 is a schematic diagram representing the presentation of an image on the display of Figure 1 to account for the location of a user's head;
  • Figure 6 is a schematic diagram of an example of a system for integrating the processing system of Figure 1 with a base station;
  • Figure 7 is a schematic diagram of a second example of a system for implementing the present invention.
  • Figure 8 is a flow chart of the operation of the system of Figure 7.

Detailed Description of the Preferred Embodiments
  • the apparatus includes a processing system 10 coupled to a sensing system 11, and an external display 12.
  • the system is mounted within an aircraft in flight with the processing system 10 being adapted to analyse signals from the sensing system 11 to determine the position of the aircraft.
  • the processing system 10 uses this information to generate an image representing at least one virtual feature which is presented to a user on the display 12.
  • the virtual feature is intended to represent a feature having a predetermined position, thereby allowing the pilot to control the aircraft in accordance with the position of the virtual feature and to interact with it as though it were a real feature.
  • the virtual feature may be at a position fixed with respect to the ground, or may be moving.
  • the processing system 10 uses signals from the sensing system 11 to detect these aircraft movements and modify the image presented on the display 12, to thereby mimic the change in appearance of the virtual feature as would be seen from the aircraft if the virtual feature were a real feature.
  • the processing system 10 may be formed from any suitable processing system programmed in accordance with appropriate instructions.
  • the processing system includes a processor 20 coupled to a memory 21, a user interface 22, which may be in the form of a keyboard and associated monitor, or the like, and an external interface 23, coupled together via a bus 24.
  • the external interface 23 is utilised to couple the processing system to the sensing system 11 and the display 12.
  • the processor 20 executes application software stored in the memory 21, causing the processor 20 to analyse signals received from the sensing system 11 to thereby determine movement of the aircraft.
  • the processor 20 can then determine the position of the aircraft with respect to the virtual feature, and generate an image of the virtual feature as it would appear from an aircraft.
  • the image is presented on the display 12 thereby allowing the system to present the virtual feature to the pilot as though it were a real feature.
  • the processing system 10 may be any suitable form of processing system such as a personal computer, lap-top, palm-top or may be alternatively a suitably programmed flight computer, specialised hardware or the like.
  • the processing system 10 is activated.
  • the activation will be performed by a user, depending on the use of the system.
  • the system may be activated by an instructor.
  • the pilot may activate the system themselves, or in the event that the system is being used during a race, then the system may be activated by race officials. Activation may occur prior to take off or at any time during flight.
  • the user selects a feature or set of features, and a respective position using the user interface 22.
  • the memory 21 may include a feature list defining the types of features that may be presented.
  • the user can simply select one or more features from the feature list, and then provide an indication of the desired location at which the or each feature is to be presented.
  • a set of features may therefore include for example a landing strip and surrounding features, such as airport buildings, trees and a landscape. In this case, it may only be necessary to define the position of one feature, with the location of the remaining features being defined relative to this feature.
  • the location of a landing strip may be defined by the user, with the position of the remaining features being specified in the memory 21 relative to the defined position of the landing strip.
  • the position of the feature may be defined relatively, with respect to the current position of the aircraft, or absolutely, with respect to the ground.
  • the user of the system will input a position at which the virtual feature is to be presented, as a specified distance, and/or direction from the aircraft.
  • this is usually achieved by specifying the coordinates at which the virtual feature is to be presented. These coordinates may define the latitude, longitude, altitude and orientation at which the virtual feature is to be presented.
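One hedged way of representing such a feature definition is sketched below; the field names and example values are illustrative assumptions only, not the stored format used by the described system.

```python
from dataclasses import dataclass

@dataclass
class VirtualFeature:
    name: str         # e.g. "landing strip"
    latitude: float   # degrees
    longitude: float  # degrees
    altitude: float   # metres above sea level
    heading: float    # orientation, degrees from north
    length: float     # dimensions in metres (used later for intersection tests)
    width: float

# e.g. a virtual landing strip defined well above the ground
strip = VirtualFeature("landing strip", latitude=-27.40, longitude=153.10,
                       altitude=1500.0, heading=90.0, length=1000.0, width=30.0)
```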
  • the position of the virtual feature may be determined relative to some other set feature which may be geographic or man made.
  • the man made feature may comprise one or more beacon emitting towers.
  • the virtual feature can be selected by selecting a scenario from a scenario list stored in the memory 21.
  • the scenario list would be associated with respective scenario data, which would indicate for each scenario the one or more virtual features that need to be presented.
  • the position of the virtual features may be defined manually, as described above, or may be defined in accordance with information stored in the scenario data.
  • an instructor may select a landing simulation scenario in which case the processing system 10 will access the scenario data and determine those features which are required in order to implement the scenario.
  • This may include for example a virtual landing strip, and surrounding associated environment or landscape, or the like.
  • Other visual environmental effects such as reduced visibility caused by fog, or night conditions, or the like, may also be simulated by the presentation of appropriate images.
  • Figure 3A shows an aircraft 30, flying at an altitude 31 above the ground 32.
  • a virtual landing strip 33 is defined at an altitude 34, and at a horizontal distance 35 from the aircraft 30.
  • the virtual features can be presented above ground level, as shown in Figure 3A.
  • the landing strip may be presented at a high altitude 34, such as several thousand feet, allowing the pilot to practice landings under a number of different conditions.
  • should a landing attempt go wrong, the aircraft will merely pass through the virtual landing strip unharmed, with the pilot having several thousand feet of altitude in which to recover control of the aircraft.
  • the processing system 10 operates to determine the position of the aircraft in accordance with signals from the sensing system 11.
  • the signals will normally allow the processing system 10 to determine the location and orientation of the aircraft, as will be described in more detail below.
  • the processor 20 uses the position of the aircraft and the position of the virtual feature to thereby determine the relative position of the virtual feature with respect to the aircraft.
  • in some cases, the determinations at steps 120 and 130 are not required.
  • the user may define the virtual feature to be at a distance and direction relative to the aircraft, but an absolute altitude, in which case, the processor 20 will not initially need to calculate the distance and direction of the virtual feature relative to the aircraft.
  • the processing system 10 will need to determine the altitude of the aircraft and, from this and the defined feature altitude, calculate the relative altitude of the aircraft and the feature.
  • the processing system 10 therefore determines the relative altitude difference between the aircraft 30 and the landing strip 33 (i.e. 31-34), and uses this to calculate the angle θ, representing the angle between the pilot's line of sight and the direction of the virtual feature, and the distance 36.
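Treating θ as the depression angle below the horizontal, a minimal Python sketch of this calculation might look as follows (the function and variable names are assumptions for illustration):

```python
import math

def sight_angle_and_range(aircraft_alt, feature_alt, horizontal_distance):
    """Angle below the horizontal at which the feature appears, and the
    straight-line distance to it (corresponding to θ and item 36)."""
    drop = aircraft_alt - feature_alt                      # altitude difference (31 - 34)
    theta = math.degrees(math.atan2(drop, horizontal_distance))
    slant_range = math.hypot(drop, horizontal_distance)
    return theta, slant_range

# e.g. aircraft at 2000 m, virtual strip at 1500 m, 5000 m away horizontally
print(sight_angle_and_range(2000.0, 1500.0, 5000.0))
```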
  • the processing system 10 operates to generate a visual representation of the virtual feature in accordance with image data stored in the memory 21.
  • image data may define how the virtual feature will look from every possible relative position and orientation of the virtual feature with respect to the aircraft.
  • the processor 20 will determine the relative position and orientation of the aircraft and the virtual feature, and simply look-up the image of the virtual feature that is to be presented.
  • the image data typically defines the virtual feature in terms of a three dimensional model.
  • the processor will determine the relative appearance of the model in accordance with the relative position of the aircraft and the feature, and then generate the image in real time as required.
  • the generated image will be presented to the user on the display 12, such that the appearance of the image is equivalent to the appearance of an actual object provided at the location specified for the position of the virtual feature.
  • the images may be generated in accordance with the appearance of real objects, such as real airports or the like, with images of the real objects being obtained using suitable techniques such as photographs, satellite images, or the like.
  • These images can be used to derive textures which can then be applied to the three dimensional representation of the virtual feature, such that the virtual feature appears indistinguishable from a real object, as will be appreciated by persons skilled in the art.
  • the image may additionally include further information regarding the virtual feature, such as the position of the virtual feature presented as text information. This may include the altitude, distance and bearing of the feature, which can be used to aid the pilot in navigating toward the feature. This is particularly useful as an aid in training pilots.
  • the processing system 10 will operate to detect movement of the aircraft in accordance with signals from the sensing system 11. The manner in which this is achieved will depend on the respective implementation, as will be described in more detail below.
  • the processing system 10 will determine the new relative position of the aircraft with respect to the position of the virtual feature at step 160.
  • the aircraft has moved from the position shown in dotted lines to the new position, and the processing system 10 must therefore determine the new angle θ₁, representing the new angle between the pilot's line of sight and the direction of the virtual feature, and the new distance 37, as shown.
  • the processing system 10 will determine if any output of an event needs to occur.
  • the processing system 10 will be adapted to determine if any part of the aircraft intersects the runway, which would correspond to a crash. In this instance, an indication of this will be provided to the user via the display or the like.
  • the position of the feature may be defined in terms of a point, but the feature will typically occupy a volume of space. Accordingly, the definition of the feature provided in the memory 21 will include an indication of the dimensions of the feature. Dimensions of the aircraft will also typically be provided, with the processor 20 using this information to determine if any part of the aircraft intersects any part of the feature.
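A simplified intersection test of this kind, here reducing both the aircraft and the feature to axis-aligned bounding volumes (an assumption made purely for illustration), might be sketched as:

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned volume: centre (cx, cy, cz) and half-extents, in metres."""
    cx: float
    cy: float
    cz: float
    hx: float
    hy: float
    hz: float

def intersects(a: Box, b: Box) -> bool:
    """True if any part of volume a overlaps any part of volume b."""
    return (abs(a.cx - b.cx) <= a.hx + b.hx and
            abs(a.cy - b.cy) <= a.hy + b.hy and
            abs(a.cz - b.cz) <= a.hz + b.hz)

# e.g. aircraft volume against the virtual runway volume; an overlap is reported as a crash
aircraft = Box(0.0, 0.0, 450.5, 5.0, 15.0, 2.0)
runway = Box(0.0, 0.0, 450.0, 500.0, 15.0, 1.0)
if intersects(aircraft, runway):
    print("virtual feature intersected - report crash to the user")
```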
  • the processing system will update the representation presented to the user on the display.
  • the processing system will redraw the virtual feature based on the new position of the aircraft.
  • An example of this is shown in Figure 4B.
  • the landing strip 33 appears larger, as it is closer to the aircraft 30, and is positioned more towards the centre of the display 12, due to the reduced size of the angle θ₁ compared to the angle θ.
  • the processing system 10 determines if any further action is required and in particular if the process is over. If not, the processing system 10 returns to step 150 to detect further movement of the aircraft. Otherwise the processing system ends the process at step 200.
  • the system operates to generate an image of one or more virtual features, and then modify this as the position of the aircraft changes relative to the virtual feature. Constant monitoring of the aircraft position by the processing system 10 in accordance with signals from the sensing system 11 results in constant alteration of the image of the virtual feature to reflect the changing position of the aircraft relative to the feature.
Sensing System

  • It will be appreciated by a person skilled in the art that a wide range of sensing systems may be used. In particular, any sensing system 11 that allows the processing system 10 to determine the location and orientation of the aircraft may be used.
  • the sensing system 11 may be formed from one or more sensors adapted to determine the location of the aircraft, and one or more sensors adapted to determine the aircraft orientation.
  • the sensing system 11 may be adapted to determine the location of the aircraft only, with the orientation of the aircraft being determined mathematically by the processing system 10, in accordance with changes in the position of the aircraft and additional information such as the angle of attack and side slip of the aircraft.
  • the location of the aircraft is typically defined relative to the earth, such as in terms of the aircraft's latitude, longitude and altitude. Accordingly, the location of the aircraft may be determined using signals from any device or devices well known in the aviation industry for producing an accurate indication of the location of the aircraft.
  • this may be achieved using GPS readings or the like, as will be appreciated by persons skilled in the art.
  • the location of the aircraft may be fixed relative to some other feature such as one or more beacon emitting towers.
  • the aircraft will include a sensor adapted to detect signals from the one or more towers.
  • the sensor must be adapted to detect the direction of the signal source and other signal attributes, in order to determine the distance and direction of the tower. Additional information regarding the altitude of the aircraft may also be required to determine the position of the aircraft.
  • a triangulation procedure can be performed by comparing signals detected from three or more towers, allowing the position of the aircraft to be determined absolutely.
  • a further variation is for the system to utilise a common variation on GPS using a correction signal obtained from a beacon.
  • the beacon which is provided at a fixed geographical location uses a GPS signal to determine its location.
  • a processing system at the beacon compares the position as determined from the GPS signals, to its known location, to determine an error in the location as determined from the GPS signals.
  • An error signal representing this error is then transmitted to the processing system 10.
  • the processing system will modify the location determined from an on board GPS system in accordance with the error signal, to thereby improve the accuracy of the aircraft position as determined by the on board GPS system.
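Assuming the beacon and the aircraft receiver experience substantially the same GPS error, the correction amounts to a simple subtraction, as in the following hedged sketch (coordinate format and values are illustrative only):

```python
def beacon_error(beacon_gps_fix, beacon_surveyed_location):
    """At the fixed beacon: error between the GPS-derived and known locations."""
    return tuple(g - k for g, k in zip(beacon_gps_fix, beacon_surveyed_location))

def corrected_position(onboard_gps_fix, error_signal):
    """On the aircraft: subtract the transmitted error from the on-board GPS fix."""
    return tuple(p - e for p, e in zip(onboard_gps_fix, error_signal))

# (latitude, longitude, altitude) tuples; values are illustrative only
error = beacon_error((10.00021, 20.00017, 48.0), (10.00000, 20.00000, 45.0))
print(corrected_position((10.10530, 20.20260, 1205.0), error))
```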
  • Movement of the aircraft may be determined by measuring changes in the aircraft location in accordance with changes in the signals from suitable location sensors which determine the absolute location of the aircraft, such as the GPS systems described above.
  • the absolute location sensors will only be updated periodically, with a relatively low frequency such as once every second (1 Hz). As the aircraft can move a significant distance during this time, and it is necessary to be able to determine the location of the aircraft at intermediate times, the absolute location sensors alone cannot be used to determine the current aircraft location. For example, an aircraft flying at 115 knots will move up to 60 m in one second, with significantly greater distances being travelled at higher speeds.
  • the absolute location sensors 11A periodically determine the absolute position of the aircraft, with the relative location sensors 11B determining motion of the aircraft away from the absolute location, thereby allowing the current aircraft position to be updated.
  • the relative location sensors 11B typically only provide a limited degree of accuracy, and accordingly each time a signal is next received from the absolute location sensors 11A, the current aircraft position is reset to that of the current absolute position, with the above process being repeated.
  • the GPS or other location sensors may also provide only a limited degree of accuracy, again leading to different errors in the position measured by the location sensors.
  • This, coupled with inaccuracies in the relative sensors, may lead to significant discrete changes in the aircraft position each time a signal is received from the location sensors 11, and the processing system 10 therefore operates to smooth out the overall motion of the aircraft using a PID (proportional, integral and derivative) algorithm, as will be appreciated by persons skilled in the art.
  • the algorithm includes a proportional term, an integral term and a derivative term.
  • the proportional term is used to calculate a new current aircraft position in direct proportion to signals from the relative location sensors 11B.
  • the integral term, often referred to as a reset term, is used to correct the calculated current aircraft position in accordance with the signals from the absolute location sensors 11A.
  • the derivative term is used to anticipate changes in the aircraft position, in accordance with the previous rate of change of the aircraft position, thereby ensuring a smooth change in the calculated aircraft position.
  • in the algorithm, P(t) denotes the current smoothed aircraft position at time t.
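The exact formula is not reproduced here; a minimal single-axis sketch consistent with the description of the three terms, with assumed gains, might be:

```python
class PIDPositionFilter:
    """Smooths the calculated aircraft position: the proportional term follows the
    relative sensors, the integral (reset) term pulls towards the absolute fixes,
    and the derivative term anticipates motion from the previous rate of change."""

    def __init__(self, initial, kp=1.0, ki=0.1, kd=0.2):
        self.p = initial      # current smoothed position along one axis
        self.rate = 0.0       # previous rate of change of position
        self.kp, self.ki, self.kd = kp, ki, kd

    def update(self, relative_delta, absolute_fix, dt):
        previous = self.p
        self.p += (self.kp * relative_delta             # proportional: relative sensors 11B
                   + self.ki * (absolute_fix - self.p)  # integral/reset: absolute sensors 11A
                   + self.kd * self.rate * dt)          # derivative: anticipate the next change
        self.rate = (self.p - previous) / dt
        return self.p
```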
  • the orientation of the aircraft may be determined using a number of techniques.
  • this is achieved by using a vector analysis based on the aircraft location at successive instants.
  • the location of the aircraft is monitored periodically, with the change in the aircraft location being used to calculate a direction vector representing the direction of the motion of the aircraft.
  • the aircraft can be assumed to be orientated along the line of the vector to determine the pitch of the aircraft.
  • measuring the rate of change of the vector can be used to calculate the curvilinear acceleration of the aircraft, and this information, together with an indication of the speed of the aircraft, can be used to determine the roll of the aircraft.
  • this is achieved by resolving the force generated by the curvilinear acceleration, and the force due to gravity, and assuming that these forces are balanced by a lift force generated in a direction perpendicular to the aircraft wings.
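A rough sketch of this vector analysis, assuming local east/north/up coordinates and a coordinated turn (so that tan(roll) = speed x turn rate / g), is given below; the function names and example values are illustrative assumptions only.

```python
import math

def direction_vector(p0, p1):
    """Vector joining two successive aircraft locations (east/north/up, metres)."""
    return tuple(b - a for a, b in zip(p0, p1))

def pitch_from_vector(v):
    """Assume the aircraft is aligned along its direction of motion."""
    horizontal = math.hypot(v[0], v[1])
    return math.degrees(math.atan2(v[2], horizontal))

def roll_from_turn(speed, turn_rate, g=9.81):
    """Assume lift (perpendicular to the wings) balances gravity and the
    centripetal force of the turn: tan(roll) = speed * turn_rate / g."""
    return math.degrees(math.atan2(speed * turn_rate, g))

# e.g. two fixes one second apart, and a 3 degree/second turn at 60 m/s
v = direction_vector((0.0, 0.0, 1000.0), (60.0, 2.0, 1003.0))
print(pitch_from_vector(v), roll_from_turn(60.0, math.radians(3.0)))
```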
  • the aircraft location need only be determined periodically, and accordingly, this can be achieved in accordance with signals from the absolute location sensors outlined above, alone.
  • additional analysis can be performed in accordance with the location of the aircraft as determined using signals from the relative location sensors, or the result of the aircraft position as determined using the PID algorithm, as will be appreciated by persons skilled in the art.
  • it will be appreciated that the aircraft will typically not be aligned directly along the direction vector, and that the force due to lift may not be in a direction perpendicular to the wings, due to factors such as side slip and the angle of attack of the aircraft.
  • side wind gusts or unbalanced control inputs may cause side slip, with other conditions, such as head or tail winds, turbulence, or a landing approach at low speed, altering the angle of attack of the aircraft.
  • the apparatus may therefore include external sensors, shown at 11C in Figure 1, to detect the angle of attack and/or side slip of the aircraft.
  • This may be achieved for example using sensors, such as flow sensors or a weather vane, moveable in at least two dimensions to measure the direction of the flow of air over the fuselage.
  • the processing system 10 analyses signals from the external sensors 11C to determine the angle of attack and side slip, using these to modify the calculations outlined above, thereby allowing the aircraft orientation, and in particular the aircraft heading and pitch, to be determined more accurately, as will be appreciated by persons skilled in the art.
  • aircraft orientation may be determined using additional sensors, such as devices and systems well known in the aviation industry for determining aircraft orientation and pitch. This can include readings of directional bearings, degree of roll or rotation, aircraft fore and aft pitch.
  • it is also possible to determine the orientation of the aircraft absolutely using GPS systems.
  • This may be achieved by having a number of GPS systems positioned through the aircraft, such as in the wing tips and the fore and aft sections of the aircraft, to allow the absolute orientation of the aircraft, including the pitch, yaw or roll of the aircraft, to be determined. It will be appreciated that whilst differences in the position of the GPS sensors will be relatively small, and may be less than the error with which the GPS system can pin-point a location, as each sensor will typically encounter the same errors, this may not be a problem.
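A hedged sketch of deriving pitch and roll from differences between such receivers is given below; the receiver positions and sign conventions are assumptions for illustration, and common GPS errors largely cancel in the differences.

```python
import math

def pitch_and_roll_from_gps(nose, tail, left_tip, right_tip):
    """Each argument is an (east, north, up) position in metres for a GPS
    receiver mounted at that point on the airframe."""
    # pitch from the fore and aft receivers
    dx, dy, dz = (n - t for n, t in zip(nose, tail))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    # roll from the wing-tip receivers
    dx, dy, dz = (r - l for l, r in zip(left_tip, right_tip))
    roll = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pitch, roll

# illustrative receiver positions only
print(pitch_and_roll_from_gps((10.0, 0.0, 1.0), (0.0, 0.0, 0.0),
                              (5.0, -6.0, 0.3), (5.0, 6.0, 0.7)))
```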
  • the orientation can be measured using gyroscopes, or accelerometers, to determine the current or changes in the pitch and roll of the aircraft.
  • the yaw of the aircraft may be determined in a similar way.
  • Determination of the orientation may also be obtained from on board sensors, such as the artificial horizon or the like, as will be appreciated by persons skilled in the art.
  • additional signals from external sensors may be used to determine side slip or angle of attack, which may also be used when calculating the orientation of the aircraft, as described above.
Additional Sensing

  • It is also possible to provide additional sensors (not shown) connected to the aircraft controls. This allows the control applied by the pilot to be compared to the actual movement of the aircraft. This allows a subsequent analysis to be provided to the pilot, allowing the pilot to identify where actions taken were insufficient to counteract external environmental effects, such as cross winds, or the like.
  • sensors for determining information external to the aircraft can be included.
  • laser or radar based systems can be utilised to monitor motion of the aircraft with respect to the ground, with externally positioned flow sensors being used to monitor the flow of air over the aircraft.
  • In use, the pilot must be able to view the display 12 and utilise this to view the virtual feature.
  • a number of different forms of display can therefore be used.
  • the display could be in the form of a HUD (Heads Up Display), or the like, which operates to project an image onto the cockpit, or windows, a separate monitor, or a headset which is worn by the user to provide a respective image in front of each eye.
  • This latter technique is particularly beneficial as this allows a three dimensional stereoscopic image of the virtual feature to be generated, as will be appreciated by persons skilled in the art.
  • it may also be important that the user is able to view the external surroundings, depending on the implementation and use.
  • the pilot may be presented only with images of the virtual features, with it being left to the instructor to view the external environment to ensure that aircraft safety is maintained.
  • it may be preferred to allow the pilot to view both the virtual images and the external environment. This allows the virtual images to be generated in such a way that the pilot interacts both with the real world and the virtual images simultaneously.
  • if the display 12 is a HUD, this can project an image onto the cockpit, or windows, allowing the pilot to view both the generated images and the external surroundings, as will be appreciated by a person skilled in the art.
  • if the display 12 is a monitor or the like, this can be positioned at a suitable position within the cockpit to allow the user to view the virtual feature on the monitor and the external environment through the cockpit windows.
  • the processing system may be adapted to obtain an image of the external environment from an imaging system, such as a number of external cameras 13.
  • the processing system 10 determines a representation of the external surroundings from the cameras 13, and then superimposes a virtual feature, presenting the superimposed image to the pilot.
  • the display 12 could be provided in the form of a headset which is formed from one or more screens positioned in front of the pilot's eyes. In this instance it is not generally possible to view the external environment through the headset screens and it may therefore be necessary to obtain pictures of the external environment using the cameras 13 and show these superimposed with the virtual feature on the display 12, or alternatively, present only images of the virtual feature, with the external environment being monitored by an instructor, or the like.
  • alternatively, the headset may include a single screen.
  • it may also be necessary to determine the attitude and orientation of the pilot's head to ensure that the image presented is correct.
  • movement of the pilot's head with respect to the display may cause the resulting image to be displayed in an apparently incorrect position due to parallax effects.
  • the system may include an initial calibration, requiring the pilot to indicate whether the presented image corresponds to a respective external location.
  • An example of this is shown in Figure 5.
  • in this example, the image is projected onto the display 12.
  • the desired position of the virtual image is shown at 40.
  • a first pilot viewing the image as shown at 41 will require that the image is presented on the display at the point 42
  • a second pilot viewing the image as shown at 43 will require that the image is presented on the display at the point 44.
  • the processing system 10 may be adapted to present an image on the display, with the pilot being required to indicate whether this appears to be positioned over an external feature having a predetermined position with respect to the display 12, such as the nose of the aircraft. If not, the pilot can enter corrections via the user interface 22, causing the position of the presented image to be moved. It will be appreciated that this action can be used to allow different sized pilots to view images on the screen.
  • the processing system 10 can be coupled to one or more sensors for detecting the position of the pilot's head.
  • An example of this is shown in Figure 1, in which two cameras 14 are provided for imaging the pilot's head, allowing the processing system 10 to determine the position of the pilot's head and to modify the images displayed to counteract movement of the pilot's head during flight. This can account not only for different sizes of pilot, but also for movement of the pilot's head, such as lateral movement, during flight.
  • the sensors 14 may be adapted to monitor the position of a sensor attached to the pilot's head, such as a headband, or the like.
  • where the display 12 is a headset, a similar procedure will be required to monitor the orientation of the headset with respect to the aircraft, thereby ensuring that the appearance of the images presented to the user reflects the pilot's current head orientation, as well as the orientation of the aircraft, as required.
  • This may be achieved by a tracker system fitted to the headset, to allow the orientation, attitude and/or position of the headset with respect to the aircraft to be determined.
  • the tracker may use magnetic sensors or inertial sensors, such as rate gyros and/or accelerometers, or the like, as will be appreciated by persons skilled in the art.
  • this may be achieved with one or more sensors 14 mounted in the aircraft, which can image a guide mounted to the headset to detect the headset location and orientation.
  • the cameras 13 can be mounted on the headset 12, with the cameras being used to determine images of the environment as would be viewed by the pilot in use. This allows the images to be analysed to determine the orientation of the pilot's head and may therefore be used to obviate the requirement for separate position sensors 14, as well as to determine images of the surrounding environment for superposition with the image of the virtual feature, as will be appreciated by persons skilled in the art.
  • the processing system 10 may be adapted to maintain a record of the flight.
  • the processor 20 may be adapted to store the determined position of the aircraft. This will typically be performed periodically, such that a sequence of aircraft positions can be used to reconstruct the flight path of the aircraft during the flight.
  • any flight can be replayed at a subsequent time, by generating an appropriate image sequence, to thereby aid the pilot in tuition, reviewing a race or the like.
  • the flight can be viewed from any viewpoint, including a viewpoint of the pilot, or the aircraft, or an external viewpoint, thereby further aiding the pilot.
  • this allows the flight path of the aircraft to be replayed to the user so they can determine any mistakes made.
  • This action may be performed by the processing system 10, for example once the aircraft has landed, or may be achieved by exporting the data to a remote processing system, for subsequent review.
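A minimal sketch of such recording and replay follows; the method names and the JSON export format are assumptions chosen purely for illustration, not the format used by the described system.

```python
import json
import time

class FlightRecorder:
    """Periodically stores timestamped aircraft positions so that a flight can be
    replayed later from any viewpoint, or exported for subsequent review."""

    def __init__(self):
        self.samples = []

    def record(self, position, orientation):
        self.samples.append({"t": time.time(),
                             "position": position,         # e.g. (lat, lon, alt)
                             "orientation": orientation})  # e.g. (pitch, roll, heading)

    def export(self, path):
        with open(path, "w") as f:
            json.dump(self.samples, f)

    def replay(self, render):
        """Re-generate the image sequence for each stored sample."""
        for sample in self.samples:
            render(sample["position"], sample["orientation"])
```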
  • a number of processing systems 10 are coupled to a central base station 1 via wireless connections, for example, through the use of a radio connection, or via wired connection once the aircraft has landed.
  • the base station 1 includes a processing system 2 coupled to a database 3 as shown.
  • data collected by the processing systems 10 can be downloaded to the base station 1, and stored in the database 3 by the processing system 2.
  • the processing system 2 may be any form of processing system such as a server, personal computer, lap-top, palm-top, or the like, adapted to operate appropriate applications software to allow the desired functionality to be achieved.
  • the base station 1 can subsequently export the collected data to other processing systems, for example via the Internet, or display images based on the data as required.
  • this allows the base station to present images representing movement of the aircraft during a procedure, such as an attempted landing.
  • the presented images may be identical to those presented on the display 12, allowing the procedure to be replayed to the pilot for subsequent review.
  • alternative views from any perspective such as external views of the aircraft, views from the virtual feature, or landing strip, or the like, as well as additional information such as control readouts, or the like, can be displayed. This further aids in training as it allows the pilot to review the action taken during the flight and the corresponding effect.
  • the pilot can review the controls activated during the flight and the corresponding motion of the aircraft, as well as the external environmental conditions, allowing the pilot to gain a better understanding of the effect of the controls on the aircraft in flight, which can further aid in training or the like.
  • this form of architecture can also be used to control the operation of the processing systems 10, by providing input commands via the base station 1.
  • this allows the presentation of virtual images to be controlled for a number of aircraft simultaneously. This is particularly useful in allowing the base station 1 to generate images representing a virtual landing strip or airfield. This allows a number of aircraft to practice take off or landing procedures on a virtual landing strip, instead of a real landing strip. This leads to a number of benefits. First, in some locations availability of landing strips is limited and use for practicing purposes is therefore restricted to allow for required take-offs and landings. In this case, however, any number of virtual landing strips can be generated, allowing real airfields to be used for required landings.
  • the position of this virtual feature may be permanently defined by the base station 1, such that when pilots wish to practice take off or landing procedures, they can simply fly to a predetermined location and activate the processing system 10.
  • the processing system 10 can be adapted to automatically determine the position of the virtual landing strip, and cause this to be presented as required.
  • the base station 1 may also be adapted to monitor the location of different aircraft in accordance with positional data received from a respective processing system 10 associated with each aircraft. This allows the base station 1 to cause the processing system 10 on each aircraft to generate a respective virtual feature for each other aircraft in the vicinity.
  • the image generated by the processing system 10 may include an indication of the position of each aircraft, such as an altitude, distance and bearing, to aid pilots in understanding the relative position of the other aircraft, thereby further enhancing safety.
  • the base station 1 can provide flight coordination, as would be required around a normal airfield. This may be provided manually by an individual, or automatically, in accordance with predetermined algorithms.
  • the system can also be used in race scenarios, or the like, where a number of aircraft are racing against each other around a set course.
  • the processing system 10 provided in each aircraft can be configured to generate virtual images in identical locations, so that each of a number of aircraft fly over the same course.
  • the virtual images could be presented in different locations to allow the aircraft to fly the same course but in different spatial locations, thereby reducing the possibility of a collision, or the like.
  • the absolute and relative location sensors are formed from GPS sensors 50, for determining the aircraft position, an altimeter 51 for determining the aircraft altitude, and one or more accelerometers 52 for determining the aircraft orientation.
  • One or more external air flow sensors 53 are provided for determining the angle of the aircraft motion with respect to the surrounding air, and hence the side slip and angle of attack.
  • the display is in the form of a headset worn by the pilot in use, with headset position sensors 14 being provided for determining the location and orientation of the headset with respect to the aircraft.
  • the GPS sensors 50, the altimeter 51 and the accelerometers 52, together with the processing system 10, can be incorporated into a common housing.
  • the unit can be placed in a plane before flight, with the pilot fitting the headset after take-off, to allow training with respect to the virtual features to be performed.
  • a flow sensor can be temporarily attached to the aircraft fuselage, and coupled to the unit wirelessly, for example through the use of a bluetooth connection or the like.
  • Figure 8 shows a flow chart for the operation of the system of Figure 7.
  • the system is set-up, with the position and orientation of the virtual feature being provided at step 300.
  • the virtual feature position is defined in accordance with a global coordinate system, and stored in the memory 21.
  • the system is calibrated to accurately relate the virtual world to the real world.
  • the aircraft position will in actual fact be that of the portable system.
  • the memory 21 is typically adapted to include aircraft specifications for different types of aircraft, including information regarding the aircraft dimensions. The position of the housing within the aircraft can then be provided via the user interface, allowing the processing system 10 to determine the spatial volume within the coordinate system filled by the aircraft.
  • the calibration process may also be performed to align the view on the display 12 to the user's actual view.
  • the aim of the calibration process is to overcome variations between the user's physical position within the aircraft, and the position of the aircraft with respect to the virtual feature. In the case in which a headset is used as the display, this process is performed to allow the position of the headset to be confirmed.
  • a non-limiting example of a calibration process may include the following steps (a sketch of the underlying alignment calculation follows this list):
  • the user aligns the aircraft along a known vector at a known point; the user then aims at three points widely spaced in his view with known geographical positions (e.g. at a distance of 100 metres), which may be located using GPS equipment;
  • the user inputs his satisfaction with the indicated position; the orientation of the user's head when looking at the known points is then recorded by the software using the means of determining the orientation of the user's head relative to the aircraft.
  • This data may be related mathematically together with data from the head position sensors to define the position of the headset in an aircraft coordinate system, with an indication of this being stored in the memory 21. Accordingly, this generally corresponds to the steps 100 to 120 outlined above.
  • the processing system 10 obtains signals from the GPS sensors 50, the altimeter 51, the accelerometers 52, and the flow sensors 53.
  • the processing system 10 uses the signals from selected ones of the sensors 50, 51, 52, 53, to determine the current aircraft location. This will typically be achieved using the GPS sensors 50 and the altimeter 51 to periodically determine an absolute position, with movement from the absolute position being determined in accordance with signals from the accelerometer 52.
  • the processing system 10 uses successive aircraft locations to perform vector analysis of the aircraft movement which, coupled with information from the flow sensors 53 regarding the flow of air over the aircraft, is used to determine the orientation of the aircraft, taking into account the side-slip, angle of attack and heading, at step 380 (a simplified sketch of this position and orientation estimation follows this list).
  • the position of the aircraft will be determined in accordance with the global coordinate system, and hence define the relative position of the aircraft with respect to the virtual features. This position, including at least one of the orientation, location and velocity of the aircraft, may be recorded at step 390, to allow subsequent analysis to be performed. It will be appreciated that the processing system 10 may alternatively store the location data prior to determining the orientation, using the stored locations to subsequently determine an approximate orientation as described above.
  • the position of the headset with respect to the aircraft is determined at step 400.
  • the processing system 10 uses the relative position of the aircraft with respect to the virtual feature, together with the position of the headset to determine an image of the virtual feature at step 410, which is presented to the user on the display 12, in accordance with any calibration.
  • this is achieved by determining a first transformation matrix in accordance with the position of the aircraft in the global coordinate system. This may be determined at step 370, 380, or 410 depending on the implementation. The position of the headset in the aircraft coordinate system is then used to determine a second transformation matrix, at step 400 or 410 (a worked sketch of these two transformations follows this list).
  • the processing system 10 then accesses the memory 21 and obtains image data representing the one or more virtual features to be presented.
  • the processing system 10 then applies the first transformation matrix to the image data, to determine modified image data representing the appearance of the virtual feature from the aircraft. The manner in which this is achieved will be appreciated by persons skilled in the art.
  • the processing system 10 applies the second transformation matrix to the modified image data to determine further modified image data representing the appearance of the virtual features from the headset.
  • each individual will be provided with a headset, with the headset displaying an image of the virtual feature in accordance with that individual's point of view.
  • a screen such as a monitor or the like can display the virtual feature in accordance with the modified image data, thereby representing the aircraft view, allowing an instructor to view the pilot's interaction with the virtual feature, whilst still viewing the external environment separately. This is important for safety considerations in order to ensure that all external factors can be accounted for, even if not correctly displayed by the processing system 10. In any event, should a fault develop with the system, the pilot can simply remove the headset, and return to flying as normal.
  • This process of generating the image will then generally be repeated as required to provide ongoing modification of the image compatible with changes in the aircraft position and/or position ofthe user's head.
  • a pilot will be presented with a view of a scene including one or more virtual features, the appearance of which will change as the aircraft approaches, circles or departs from the scene.
  • Onboard sensors may determine and record the difference between the coordinates of the aircraft and those of the virtual scene, to provide an accurate record of proximity, details of approach and departure, and relative attitudes, thereby providing the basis for in-depth analysis of a pilot's performance.
  • the presented image accurately indicates the view of the virtual feature as if it were in fact a real feature located at that point.
  • the image of the virtual feature may also be overlaid on the pilot's view of the real world. It is also possible that the image may be simply displayed on a screen in the pilot's view but separate and spaced from the pilot. The realism of such an arrangement would be less than that of a HUD arrangement, but may still be beneficial.
  • the screen may exclude the external view or may be adapted to overlay the image on the outlook of the real world.
  • the relative position of the pilot's head, aircraft and virtual image is constantly monitored and fed back into the system to establish the relative position and thereby produce an appropriate image.
  • Other occupants of the aircraft, especially an instructor, may also be presented with the same view as the pilot.
  • the view may be presented on one or more screens in the aircraft which are independent of viewers. Alternatively each viewing person may have a HUD. It is of course possible to have a screen displaying an image relative to the aircraft position and attitude alone, although this effect is not as realistic as including allowance for the position of the pilot's head.
  • the systems described above therefore allow a pilot of an airborne aircraft to be accurately presented with an image of a virtual feature as it would be viewed if notionally positioned at a set location.
  • for a virtual feature notionally located at the specific position, changes in the position of the aircraft relative to the set position are reflected in changes in the appearance of the virtual feature, which are accurately related to the position and orientation of the aircraft and, preferably, the orientation of the pilot's head.
  • the present system also allows real practice of a glide approach.
  • the general consensus is that "deadstick" landings, with the engine or engines killed, are to be discouraged, and as a result many pilots have little or no knowledge of the proper method for a manoeuvre in which the margin for error is slim.
  • the present invention provides the opportunity to learn an appropriate response in a real situation in particular because the virtual runway can be located at a height sufficient to allow safe recovery after an attempt at a landing, thereby reducing the risk to the pilot and instructor. This also reduces costs associated with landings at real airfields, as will be appreciated by persons skilled in the art.
  • the processing may be performed by an onboard computer, or even by a remotely located computer in signal contact with the sensors.
  • the virtual feature may be defined by a number of reference points which can be used to record aspects of the aircraft's interaction with the virtual feature.
  • the virtual feature may act as a focal point for a number of aircraft which may share a block of airspace within or around it to simulate circuit training and traffic without the expense of using a real aerodrome.
  • the virtual feature being geographically referenced will be in the same set position relative to all aircraft. This can advantageously be achieved by having the position of the one or more virtual features set centrally, for example by using the architecture shown in Figure 6, to allow the base station 1 to set the virtual feature for all of the aircraft.
  • a computer may be programmed with appropriate software to report selected velocities and clearances at selected times or continuously. The actual performance of the aircraft and crew can then be recorded and may be used as a diagnostic and feedback device during training.
  • the image may be of a virtual airfield which represents a real and existing airfield somewhere in the world. As described above this can be easily achieved by obtaining physical images of real locations, through the use of satellite images, or the like, and then using these to construct representations of the virtual features in the form of image data and associated textures. Pilots may then be taught the appropriate approach to a specific airport or airports anywhere in the world. This may have particular advantage in relation to approaches which can be dangerous or challenging.
  • the system can be used in training pilots for combat purposes, for example by imaging the region surrounding a target, and then allowing the pilot to practice attacking the target as represented by virtual images.
  • This is important because traditional flight simulators can only mimic the performance of real aircraft to a limited extent, which may be critical in combat situations. Accordingly, this technique allows pilots to practice attacking virtual targets in a real aircraft in flight.
  • the virtual target can be located at a sufficient altitude to allow the pilot to recover should any problems occur during the virtual attack.
  • the virtual feature represented may be one or more towers which set out a track for a pylon racing competition.
  • natural geographical features may be set up to define a pathway challenging both the manoeuvring skill of the pilot and the capability of the aircraft.
  • the image may represent a landscape including structures representing sky writing, including construction lines, allowing the accuracy, style and speed of the sky writing to be better than would otherwise be possible.
  • the virtual image may also be used to recreate events surrounding a known accident or incident to better assess the prevailing conditions and likely causation of the accident or incident.
  • the invention may be used in conditions involving manoeuvring a real aircraft around synthetic structures, representations or marks that are located with a known reference to a real world feature.
  • an aircraft may be flown through a landscape with realistic views and accurate physical behaviour by the aircraft except that interaction between the aircraft and landscape is virtual not physical.
  • a landscape may contain an aerodrome that is shown in sufficient detail, together with its surroundings, to provide a realistic view including depth cues.
  • the aircraft may then be manoeuvred around it as if it were a real feature.
  • the position of the virtual feature may also be adapted to alter over time.
  • the processing system 10 can be adapted to modify the position of the virtual feature in accordance with predetermined rules stored in the memory 21, such that the user of the system will set an initial starting point for the virtual feature, with subsequent movement from this point being controlled by the processing system 10, or the base station 1, depending on the respective implementation.
  • the virtual feature may represent other aircraft, allowing pilots to practice collision avoidance techniques, as well as air based combat, or search scenarios.
  • the virtual feature could represent an aircraft carrier, or the like, allowing the pilot to practice locating and approaching the carrier for landing purposes. This will allow military training to be performed without undue risk to the pilot or aircraft.
  • a moving landing strip may be utilised to simulate different weather conditions, such as moving the landing strip laterally to represent a cross wind or the like. This allows pilots to practice landing in adverse weather conditions, with realism being further enhanced by modifying the images presented to represent associated visibility conditions (a minimal sketch of such a drifting virtual feature follows this list).
  • the technique may also be applied to movement of other vehicles or vessels in 2 dimensions.
  • the system can be applied to ships, to simulate navigation of narrow channels, or interaction with other vessels, such as submarines, ships, or the like. Again, this allows training in ship operation to be performed.
  • each individual on the ship may be provided with a respective headset, such that each individual can view the respective virtual features in accordance with their position on the vessel.
  • the virtual images can correspond to each of the vehicles in use.
  • the relative positions of the vehicles can be separated from their real positions, such that the vessels may be separated by large distances in real spatial terms, but by only small separations in the virtual feature environment (a sketch of this real-to-virtual position mapping follows this list).
  • This allows real vessels to participate in exercises and interact with each other, whilst being physically separated to thereby prevent collisions and other damage to the vehicles.
  • vehicles in different countries, or the like, could perform exercises with each other in a chosen real or virtual location.
  • movement of the other vehicles can be controlled by the processing system 10, or a central base station 1, for example by using appropriate artificial intelligence algorithms.
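
The sketch below illustrates the base-station exchange of positional data described in the list above, in which each aircraft's processing system 10 is informed of the positions of the other aircraft so that it can generate a respective virtual feature for each of them. It is a minimal illustration only; the class and function names, and the use of a shared local coordinate frame, are assumptions made for the example and are not part of the original disclosure.

```python
from dataclasses import dataclass

@dataclass
class AircraftState:
    aircraft_id: str
    east: float     # metres, in an assumed shared local coordinate frame
    north: float
    up: float

class BaseStation:
    """Collects position reports and tells each aircraft where the others are,
    so that its processing system can render a virtual feature for each of them."""

    def __init__(self):
        self.latest = {}

    def report(self, state: AircraftState):
        # Store (or overwrite) the most recent report from this aircraft.
        self.latest[state.aircraft_id] = state

    def traffic_for(self, aircraft_id: str):
        # Everything except the requesting aircraft itself.
        return [s for key, s in self.latest.items() if key != aircraft_id]

base = BaseStation()
base.report(AircraftState("VH-ABC", 0.0, 0.0, 1200.0))
base.report(AircraftState("VH-XYZ", 3000.0, 500.0, 1500.0))
for other in base.traffic_for("VH-ABC"):
    print(f"render virtual traffic for {other.aircraft_id} at "
          f"({other.east}, {other.north}, {other.up})")
```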
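
The calibration procedure outlined above, in which the user sights widely spaced points of known geographical position, can be viewed as solving for the fixed rotation between the headset frame and the aircraft frame. The sketch below uses the standard Kabsch/SVD construction to recover that rotation from the sighting directions; the function and variable names are illustrative assumptions, and the patent does not prescribe this particular algorithm.

```python
import numpy as np

def estimate_headset_to_aircraft_rotation(directions_headset, directions_aircraft):
    """Estimate the fixed rotation between headset and aircraft frames.

    directions_headset: (N, 3) unit vectors to sighted points, measured by the
        headset orientation sensors while the user looks at each known point.
    directions_aircraft: (N, 3) unit vectors to the same points, computed from
        their known geographical positions and the aircraft's known position.
    Returns a 3x3 rotation matrix R such that R @ v_headset is approximately v_aircraft.
    """
    A = np.asarray(directions_headset, dtype=float)
    B = np.asarray(directions_aircraft, dtype=float)
    # Kabsch: maximise trace(R @ (A^T B)) over proper rotations via SVD.
    H = A.T @ B
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    D = np.diag([1.0, 1.0, d])
    return Vt.T @ D @ U.T

# Example with three widely spaced sightings (synthetic, noise-free data):
true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)  # 90 degree yaw offset
aircraft_dirs = np.array([[1, 0, 0], [0, 1, 0], [0.6, 0, 0.8]])
headset_dirs = (true_R.T @ aircraft_dirs.T).T   # what the headset would measure
R = estimate_headset_to_aircraft_rotation(headset_dirs, aircraft_dirs)
assert np.allclose(R, true_R, atol=1e-9)
```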
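
The position and orientation estimation described above, with periodic absolute fixes from the GPS sensors 50 and altimeter 51, accelerometer-based dead reckoning between fixes, and a track vector derived from successive positions, might be sketched roughly as follows. This is a deliberately simplified illustration under an assumed flat-earth local frame; the names and the constant-velocity integration are assumptions, not the specified implementation.

```python
import numpy as np

class PositionEstimator:
    """Toy dead-reckoning estimator: absolute GPS/altimeter fixes, with
    accelerometer integration between fixes (local east-north-up metres)."""

    def __init__(self, initial_position, initial_velocity):
        self.p = np.asarray(initial_position, dtype=float)   # east, north, up (m)
        self.v = np.asarray(initial_velocity, dtype=float)   # m/s
        self.history = [self.p.copy()]

    def absolute_fix(self, gps_east_north, altimeter_up):
        # Periodic absolute position from GPS (horizontal) and altimeter (vertical).
        self.p = np.array([gps_east_north[0], gps_east_north[1], altimeter_up], float)
        self.history.append(self.p.copy())

    def dead_reckon(self, accel_enu, dt):
        # Between fixes, integrate acceleration (already resolved into the local frame here).
        self.v = self.v + np.asarray(accel_enu, float) * dt
        self.p = self.p + self.v * dt
        self.history.append(self.p.copy())

    def track_vector(self):
        # Direction of travel from successive positions; combined with the
        # airflow sensors this yields heading, side-slip and angle of attack.
        if len(self.history) < 2:
            return None
        d = self.history[-1] - self.history[-2]
        n = np.linalg.norm(d)
        return d / n if n > 0 else None

est = PositionEstimator(initial_position=[0, 0, 1000], initial_velocity=[60, 0, 0])
est.dead_reckon(accel_enu=[0, 0.5, 0], dt=0.1)           # slight turn towards the north
est.absolute_fix(gps_east_north=[6.1, 0.05], altimeter_up=1000.2)
print(est.track_vector())
```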
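
The two-stage transformation described above, from the global coordinate system into the aircraft frame and then into the headset frame, can be illustrated with homogeneous 4x4 matrices: the first matrix is built from the aircraft's position and orientation in the global coordinate system, the second from the headset's calibrated position and orientation within the aircraft, and their composition is applied to each vertex of the virtual feature's image data. The sketch below is an assumed minimal implementation; the helper functions and vertex format are illustrative only.

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(rotation, dtype=float)
    T[:3, 3] = np.asarray(translation, dtype=float)
    return T

def invert_transform(T):
    """Invert a rigid transform without a general matrix inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def feature_vertices_in_headset_frame(vertices_global, aircraft_pose, headset_pose):
    """vertices_global: (N, 3) virtual-feature vertices in the global frame.
    aircraft_pose: (R, t) of the aircraft in the global frame (first matrix).
    headset_pose: (R, t) of the headset in the aircraft frame (second matrix).
    Returns the vertices expressed in the headset frame, ready for projection."""
    world_from_aircraft = make_transform(*aircraft_pose)
    aircraft_from_headset = make_transform(*headset_pose)
    # Compose and invert: headset_from_world = inverse(world_from_headset).
    headset_from_world = invert_transform(world_from_aircraft @ aircraft_from_headset)
    V = np.hstack([vertices_global, np.ones((len(vertices_global), 1))])
    return (headset_from_world @ V.T).T[:, :3]

# Virtual runway corner 500 m ahead of an aircraft sitting at the global origin,
# with the headset 2 m forward of the aircraft reference point and no rotation.
runway = np.array([[500.0, 0.0, 0.0]])
aircraft_pose = (np.eye(3), [0.0, 0.0, 0.0])
headset_pose = (np.eye(3), [2.0, 0.0, 0.0])
print(feature_vertices_in_headset_frame(runway, aircraft_pose, headset_pose))
# -> [[498. 0. 0.]]  (the corner appears 498 m ahead of the headset)
```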
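
Moving a virtual feature over time according to predetermined rules, for example drifting a virtual landing strip laterally to represent a crosswind, amounts to updating the feature's stored global position on each cycle before the transformations above are applied. The following is a minimal sketch under assumed names and units:

```python
import numpy as np

class DriftingFeature:
    """Virtual feature whose global position is moved on each update cycle by a
    predetermined rule, e.g. a lateral drift representing a crosswind."""

    def __init__(self, initial_position, drift_velocity):
        self.position = np.asarray(initial_position, dtype=float)   # global metres
        self.drift = np.asarray(drift_velocity, dtype=float)        # m/s

    def update(self, dt):
        # Predetermined rule: constant lateral drift. More elaborate rules
        # (gusts, circuits flown by virtual traffic) would replace this line.
        self.position = self.position + self.drift * dt
        return self.position

# Virtual landing strip drifting sideways at 3 m/s to represent a crosswind:
strip = DriftingFeature(initial_position=[0.0, 0.0, 500.0], drift_velocity=[0.0, 3.0, 0.0])
for _ in range(5):
    strip.update(dt=1.0)
print(strip.position)   # -> [0. 15. 500.]
```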
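
Separating the virtual separation of vehicles from their real spatial separation, as described above, can be achieved by assigning each vehicle a fixed offset that is removed when mapping its real position into the shared virtual exercise space. The sketch below shows this under an assumed simple Cartesian frame; the names and offsets are illustrative only.

```python
import numpy as np

def virtual_position(real_position, vehicle_offset):
    """Map a vehicle's real global position into the shared virtual exercise
    space by removing its assigned offset. Vehicles kilometres apart in the
    real world can then appear close together in the virtual scenario."""
    return np.asarray(real_position, float) - np.asarray(vehicle_offset, float)

# Two ships 10 km apart in the real world, each assigned an offset so that
# they appear only 200 m apart in the virtual exercise:
ship_a_real = np.array([0.0, 0.0, 0.0])
ship_b_real = np.array([10_000.0, 0.0, 0.0])
offsets = {"A": np.array([0.0, 0.0, 0.0]), "B": np.array([9_800.0, 0.0, 0.0])}

print(virtual_position(ship_a_real, offsets["A"]))   # -> [0. 0. 0.]
print(virtual_position(ship_b_real, offsets["B"]))   # -> [200. 0. 0.]
```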


Abstract

The present invention relates to a method of allowing a user of a moving vehicle to interact with a virtual feature. The method includes causing a processing system to determine a position for the virtual feature (33), and a current position of the vehicle (30) based on signals obtained from a sensor. The processing system determines the position of the virtual feature relative to the vehicle and uses this to generate an image of the virtual feature. The image is presented to the user on a display, the process being repeated to allow the image to be modified to take account of the movement of the aircraft.
PCT/AU2003/000571 2002-05-14 2003-05-14 Affichage d'elements WO2003096303A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2003221641A AU2003221641A1 (en) 2002-05-14 2003-05-14 Feature display

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AUPS2308 2002-05-14
AUPS2308A AUPS230802A0 (en) 2002-05-14 2002-05-14 Aircraft based visual aid

Publications (1)

Publication Number Publication Date
WO2003096303A1 true WO2003096303A1 (fr) 2003-11-20

Family

ID=3835878

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2003/000571 WO2003096303A1 (fr) 2002-05-14 2003-05-14 Affichage d'elements

Country Status (2)

Country Link
AU (1) AUPS230802A0 (fr)
WO (1) WO2003096303A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4308015A (en) * 1979-12-20 1981-12-29 General Electric Company System and method for aircraft gunnery training and accuracy evaluation
EP0654776B1 (fr) * 1993-11-20 1999-03-03 Bodenseewerk Gerätetechnik GmbH Dispositif d'entraînement pour pilotes
US6111526A (en) * 1996-08-02 2000-08-29 Sextant Avionique Vehicle course steering aid device
US6208933B1 (en) * 1998-12-04 2001-03-27 Northrop Grumman Corporation Cartographic overlay on sensor video
EP0399418B2 (fr) * 1989-05-20 2002-12-04 EADS Deutschland Gmbh Méthode et dispositif pour l'entrainement des missions pour aéronefs

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013164813A1 (fr) * 2012-05-03 2013-11-07 Pinchas Dahan Procédé et système pour afficher en temps réel diverses combinaisons de multiples positions d'aéronefs sélectionnées et de leurs vues de cockpit
GB2570470A (en) * 2018-01-26 2019-07-31 Bae Systems Plc Flight simulation
WO2019145675A1 (fr) * 2018-01-26 2019-08-01 Bae Systems Plc Simulation de vol
JP2021511257A (ja) * 2018-01-26 2021-05-06 ビ−エイイ− システムズ パブリック リミテッド カンパニ−BAE SYSTEMS plc フライトシミュレーション
JP7035222B2 (ja) 2018-01-26 2022-03-14 ビ-エイイ- システムズ パブリック リミテッド カンパニ- フライトシミュレーション
US11302211B2 (en) 2018-01-26 2022-04-12 Bae Systems Plc Flight simulation
GB2570470B (en) * 2018-01-26 2022-08-31 Bae Systems Plc Flight simulation
WO2020026235A1 (fr) 2018-08-02 2020-02-06 Elbit Systems Ltd. Simulation d'entraînement en vol affichant un environnement virtuel
US11189189B2 (en) 2018-08-02 2021-11-30 Elbit Systems Ltd. In-flight training simulation displaying a virtual environment
EP3830810A4 (fr) * 2018-08-02 2022-04-06 Elbit Systems Ltd. Simulation d'entraînement en vol affichant un environnement virtuel
CN114519935A (zh) * 2020-11-20 2022-05-20 华为技术有限公司 道路识别方法以及装置
CN114519935B (zh) * 2020-11-20 2023-06-06 华为技术有限公司 道路识别方法以及装置

Also Published As

Publication number Publication date
AUPS230802A0 (en) 2002-06-13

Similar Documents

Publication Publication Date Title
US11568756B2 (en) Augmented reality for vehicle operations
US11189189B2 (en) In-flight training simulation displaying a virtual environment
CN100583185C (zh) 飞行模拟器
US8754786B2 (en) Method of operating a synthetic vision system in an aircraft
US11869388B2 (en) Augmented reality for vehicle operations
EP2048640A2 (fr) Procédé et appareil de contrôle d'un objet mobile simulé
CN113409648A (zh) 一种飞行俯仰错觉模拟方法、装置及飞行错觉模拟器
WO2003096303A1 (fr) Affichage d'elements
CN113409649B (zh) 一种前庭性倾斜错觉模拟方法、装置及飞行错觉模拟器
EP4238081A1 (fr) Réalité augmentée pour opérations de véhicule
Archdeacon et al. Aerospace Cognitive Engineering Laboratory (ACELAB) Simulator for Electric Vertical Takeoff and Landing (eVOTL) Research and Development
RU2397549C1 (ru) Способ предупреждения угрозы столкновения вертолета с наземными препятствиями
Haber et al. Perception and attention during low-altitude high-speed flight
Archdeacon et al. Aerospace cognitive engineering laboratory (acelab) simulator for urban air mobility (uam) research and development
US20230215287A1 (en) Methods, systems, apparatuses, and devices for facilitating provisioning of a virtual experience
US20240071249A1 (en) System, Apparatus and Method for Advance View Limiting Device
CN117198116A (zh) 一种科里奥利飞行错觉模拟方法
WO2024035720A2 (fr) Procédés, systèmes, appareils et dispositifs pour faciliter la fourniture d'une expérience virtuelle
Dearing et al. Computer Aiding for Low-Altitude Flight Simulation to Flight: A Case Study
Szoboszlay et al. Synthetic vision for rotorcraft: low level flight
Bennett The display of spatial information and visually guided behavior

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP