WO2006059151A1 - Indirect control of vehicles - Google Patents

Indirect control of vehicles

Info

Publication number
WO2006059151A1
WO2006059151A1 (PCT/GB2005/050217)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
airframe
environment
image
real
Prior art date
Application number
PCT/GB2005/050217
Other languages
French (fr)
Inventor
Graham Patrick Wallis
Original Assignee
Mbda Uk Limited
Priority date
Filing date
Publication date
Priority claimed from GB0426483A external-priority patent/GB0426483D0/en
Application filed by Mbda Uk Limited filed Critical Mbda Uk Limited
Publication of WO2006059151A1 publication Critical patent/WO2006059151A1/en

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41G: WEAPON SIGHTS; AIMING
    • F41G9/00: Systems for controlling missiles or projectiles, not provided for elsewhere
    • F41G9/002: Systems for controlling missiles or projectiles, not provided for elsewhere, for guiding a craft to a correct firing position
    • F41G9/004: Training or teaching apparatus therefor
    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41G: WEAPON SIGHTS; AIMING
    • F41G7/00: Direction control systems for self-propelled missiles
    • F41G7/006: Guided missiles training or simulation devices

Definitions

  • The pilot 13 can assess further information about the attitude of the airframe 10.
  • The remote synthesiser 20 can be programmed to show a virtual shadow of the synthesised airframe image A on the display screen 19. This virtual shadow does not need to take any account of the level or direction of lighting, but provides a reference to the height of the airframe 10 above the runway 11, thereby facilitating flare and landing.
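As a minimal sketch of the shadow idea above: because the virtual shadow ignores lighting, it can simply be the vertical projection of the airframe position onto a flat runway plane, so the on-screen gap between airframe and shadow cues height. The function names and the flat-ground assumption (`ground_z`) are illustrative, not taken from the patent.

```python
def shadow_point(airframe_pos, ground_z=0.0):
    """Project an airframe position straight down onto the ground plane.

    The virtual shadow takes no account of real lighting: it sits
    directly below the airframe, so the vertical separation between
    airframe image A and its shadow grows with height above the runway.
    """
    x, y, z = airframe_pos
    return (x, y, ground_z)


def height_above_ground(airframe_pos, ground_z=0.0):
    # The cue the pilot reads off the display: separation of image A
    # from its shadow equals height above the runway surface.
    return airframe_pos[2] - ground_z
```

For example, an airframe at 12 m altitude casts its virtual shadow directly below it, and the 12 m separation is what the pilot uses to judge the flare.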
  • The synthesising of virtual shadows is well known in the art of video games, which also teach diverse techniques and software for synthesising images.
  • The pilot 13 can alter the position of the virtual viewpoint 21 relative to the airframe 10. It will be noted that the virtual viewpoint 21 trails the airframe 10 and is directed along the trajectory T; however, it could instead be arranged to have other geometric relationships with a frame of reference of the airframe 10. By using the operator control link 23, the pilot 13 can alter this geometrical relationship by displacing the virtual viewpoint 21 towards, or away from, the airframe 10 along the trajectory T, or by raising or lowering the virtual viewpoint 21 relative to the trajectory T, or by displacing the virtual viewpoint 21 laterally to the left or right of the trajectory T.
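The operator adjustments described above can be sketched as a small state object holding the three displacements of the virtual viewpoint 21 in a trajectory-aligned frame. The numeric defaults and clamping limits are assumptions for illustration; the patent does not specify them.

```python
from dataclasses import dataclass


@dataclass
class ViewpointOffset:
    """Operator-adjustable displacement of the virtual viewpoint
    relative to the airframe, in a trajectory-aligned frame.
    Default values and limits are illustrative only."""
    trail: float = 200.0    # metres behind the airframe along trajectory T
    lateral: float = 0.0    # metres to the right (+) or left (-) of T
    vertical: float = 0.0   # metres above (+) or below (-) T

    def nudge(self, d_trail=0.0, d_lateral=0.0, d_vertical=0.0,
              min_trail=20.0, max_trail=1000.0):
        # Keep the viewpoint trailing the airframe within sane bounds
        # so the synthesised image A never degenerates.
        self.trail = min(max(self.trail + d_trail, min_trail), max_trail)
        self.lateral += d_lateral
        self.vertical += d_vertical
```

Each control input from the operator control link would then translate into one `nudge` call, with the synthesiser re-rendering images S and A from the updated viewpoint.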
  • Raising the virtual viewpoint 21 will increase the apparent perspective of the airframe 10, whilst lowering the virtual viewpoint 21 enables the relationship between the landing wheels and the runway 11 to be closely observed together with the attitude of the airframe 10 relative to the runway 11.
  • Such lowering of the virtual viewpoint can be of particular use for short-field landings such as on aircraft carriers and temporary airstrips.
  • Lateral displacement of the virtual viewpoint 21 can be useful for docking an asymmetrical in-flight refuelling device and, particularly on large aircraft, for assessing the clearance of wing tips from obstacles whilst manoeuvring on the ground.
  • The camera 17 can be replaced by any convenient form of airborne sensor that can produce a real-time forward view of the airframe 10.
  • The camera 17 and the telemetric link 18 can be replaced, or backed up, by information from a ground-based, or air-based, tracking means that is tracking the airframe 10.
  • Although the invention has been described with reference to the landing of an unmanned airframe 10, it is also of use for improving control of the airframe 10 during various aerial manoeuvres, such as positioning relative to aerial or ground-based objects, and in-flight refuelling.
  • The invention facilitates the safe landing of a UAV by an operator who does not have the skills of a pilot.
  • By appropriately offsetting the virtual viewpoint 21, it is possible for the pilot of a manned aircraft to make visual observation of the clearance between his wing tips and the wing tips and empennage of other aircraft whilst manoeuvring at an airport.
  • The invention is applicable to a range of vehicles whose driver has inadequate visual reference to the vehicle's surroundings.
  • The invention is particularly useful in controlling the landing of a UAV on an aircraft carrier, where a very detailed representation of the real-time environment can be generated.
  • The invention is useful with armoured vehicles engaged in combat where there is no direct visual reference of the vehicle's surroundings.
  • The invention provides an additional perception of the position and progress of the vehicle over the surrounding terrain. Also, by offsetting the virtual viewpoint 21 laterally, it is possible for the driver to gauge the actual clearance between the sides of his vehicle and any significant obstruction.
  • The invention is also useful for controlling unmanned vehicles, such as those used for inspecting and detonating suspected bombs or booby traps.

Abstract

A vehicle, comprising a remotely piloted airframe (10), transmits its estimated position and attitude via a telemetric link (18) to an image synthesiser (20) at a command station (14). The image synthesiser (20) creates an image S of the real-time vehicle environment having a virtual viewpoint (21) trailing the position of the airframe (10), onto which is superimposed a synthesised image A of the airframe with its estimated relative position and attitude. The composite image A-S is displayed on a screen (19) in front of the airframe operator (13), who controls the airframe (10) through a control panel (15) and a command link (16). The airframe operator (13) and the command station (14) may be onboard the airframe (10) or located remotely on the ground or in another airframe or vehicle.

Description

INDIRECT CONTROL OF VEHICLES
This invention relates to the indirect control of vehicles and provides both a control system for an actual vehicle, and a method of operating an actual vehicle, which enable the vehicle to be controlled by an operator who has no, or inadequate, direct visual reference to the proximate environment of the vehicle.
It is well known for model boats, cars and aeroplanes to be controlled remotely using a radio control link. In these examples the operator has remote visual reference to part of the environment of the model boat, car or aeroplane. This remote visual reference is usually from a direction oblique to the vehicle trajectory, and/or from a different level. The operator has to transpose these remote visual references when attempting to visualise the real-time environment of the model and the trajectory and attitude of the model within the real-time environment.
It is also well known for a remotely piloted airframe, such as an unmanned air vehicle (UAV), to carry a video camera arranged, through a telemetry down-link, to provide a remotely based operator (on the ground or in another vehicle) with a pilot's view of the terrain in front of the remotely piloted airframe. Whilst this system is useful, many operators experience significant difficulties in achieving the safe landing of UAVs, probably due to the absence of any of the peripheral visual inputs which pilots of piloted airframes use to gauge side-slip, roll, attitude and height immediately before and during landing.
Some airframe designs have closed cockpits which allow the pilot no, or limited, visual reference to the ground. The operators of other vehicles, such as military tanks, have very restricted visual reference to the ground and usually no visual reference for gauging lateral separation from obstructions. The captains of large ships have inadequate visual reference to quays or obstructions whilst controlling manoeuvres such as docking, or picking up buoys. In addition, the use of simulators for training military and aircraft personnel is described in several documents including US 5,224,860, WO83/01832 and US 4,232,456. Simulators are non-destructive and relatively inexpensive to operate and are designed primarily to improve the skills of an operator. In recent years, full real-time digital synthesis of an environment and/or vehicle has been achieved, as for example in Microsoft's™ Flight Simulator. In these systems, synthetic images of a vehicle and flight environment are generated and manipulated so as to simulate the dynamic response of the view of the environment to the vehicle control actions of the operator. These systems can allow the use of a "trailing viewpoint" of the environment located a few hundred metres behind the air vehicle so as to provide simultaneous situational awareness of velocity, track, attitude and position. However, this effect is currently only exploited in the virtual world of simulations.
The present invention is concerned with providing a control system for an actual vehicle, and a method of operating such a vehicle, to improve the control of such a vehicle by an operator who has no, or inadequate, direct visual reference to the environment of the vehicle, while requiring minimal skill on the part of the operator. By "actual vehicle" is meant a vehicle existing in reality as opposed to some virtual vehicle existing in a synthetic environment. The invention is particularly, but not exclusively, concerned with improving the control of an airframe, such as a UAV, by a remote operator who has no direct visual reference to the ground in front of the airframe.
According to one aspect of the invention a control system for an actual vehicle includes a display screen for a vehicle operator, means for transmitting geometrical data relating to the vehicle to a synthesiser arranged to control the display of images on the display screen, and the synthesiser is arranged to:-
(i) generate on the display screen a synthesised image of the real-time environment having a virtual viewpoint displaced in a predetermined way relative to the vehicle,
(ii) generate a synthesised image of the vehicle with the same virtual viewpoint, and (iii) superimpose the synthesised image of the vehicle on the synthesised image of the real-time environment.
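Steps (i) to (iii) can be sketched very simply once both images are rendered from the same virtual viewpoint: superimposition is then a per-pixel overlay of the vehicle image A onto the environment image S. The row-of-pixels data model and the `transparent` marker below are assumptions for illustration, not the patent's implementation.

```python
def superimpose(env_image, vehicle_image, transparent=None):
    """Overlay the synthesised vehicle image A on the synthesised
    environment image S, both rendered from the same virtual viewpoint.

    Images are row-major lists of pixel values; pixels of the vehicle
    image equal to `transparent` let the environment show through.
    """
    out = []
    for env_row, veh_row in zip(env_image, vehicle_image):
        out.append([e if v == transparent else v
                    for e, v in zip(env_row, veh_row)])
    return out
```

For example, overlaying a vehicle sprite that covers one pixel of a uniform environment leaves every other environment pixel untouched.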
The system allows an operator to observe a visualisation of the vehicle within its environment enhancing situational awareness of the velocity, track, attitude and position of the vehicle by the operator and facilitating remote control of the vehicle by conventional manual means. This enhanced awareness greatly reduces the skill required to operate the vehicle resulting in minimal training requirements for operator personnel. Vehicle control can also be automatic but an operator has the benefit of sufficient situational awareness to be able to supervise the landing and intervene if necessary. This is of value in onboard flight path visualisation, in closed cockpits, or for operations in zero visibility and is advantageous in that conventional large displays are not required.
The virtual viewpoint is preferably displaced along the trajectory of the vehicle. Preferably, the virtual viewpoint is a predetermined point trailing the vehicle whereby the display screen will display the synthesised image of the vehicle as if viewed from behind.
The virtual viewpoint may alternatively, or additionally, be displaced laterally from the trajectory of the vehicle. In this manner the display screen will display the synthesised image of the vehicle as if viewed from a direction parallel with, but on the same level as, the trajectory.
The predetermined virtual viewpoint may alternatively, or additionally, be displaced vertically relative to the trajectory of the vehicle. In this manner the display screen will display the synthesised image of the vehicle as if viewed from a direction parallel with, but at higher or lower level than, the trajectory.
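The trailing, lateral and vertical displacements described above can be combined into a single viewpoint computation in a trajectory-aligned frame. A flat-earth coordinate frame with +z up, a non-vertical trajectory, and the parameter names are assumptions made for this sketch only.

```python
def virtual_viewpoint(pos, traj_dir, trail=200.0, lateral=0.0, vertical=0.0):
    """Place the virtual viewpoint relative to the vehicle: `trail`
    metres back along the unit trajectory direction, then offset
    laterally (to the right for positive values) and vertically.

    Assumes a flat-earth frame with +z up and a trajectory that is
    not purely vertical.
    """
    tx, ty, tz = traj_dir
    # Horizontal "right" vector: trajectory direction crossed with
    # world-up (0, 0, 1), then normalised.
    rx, ry = ty, -tx
    norm = (rx * rx + ry * ry) ** 0.5
    rx, ry = rx / norm, ry / norm
    x, y, z = pos
    return (x - trail * tx + lateral * rx,
            y - trail * ty + lateral * ry,
            z - trail * tz + vertical)
```

A vehicle flying along +x with a 200 m trail, 50 m right offset and 30 m raise would be viewed from 200 m behind, 50 m to its right and 30 m above.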
The system preferably includes an operator control for adjusting the virtual viewpoint of the synthesised vehicle image relative to the vehicle.
A telemetric link may be arranged to supply the synthesiser with real-time data from the vehicle regarding the position, attitude and/or trajectory of the vehicle in its environment and the synthesiser be arranged to show the vehicle position, attitude and/or trajectory in the synthesised vehicle image. Alternatively, a remote tracking means may be arranged to track the vehicle and to supply the synthesiser with real-time data regarding the position, attitude and/or trajectory of the vehicle in its environment, and the synthesiser is arranged to display the vehicle position, attitude and/or vehicle trajectory in the synthesised vehicle image.
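One way such a telemetric link might carry the real-time position and attitude is as a fixed-size binary pose record. The wire format below (six little-endian doubles) is purely hypothetical; the patent does not specify an encoding.

```python
import struct

# Hypothetical wire format for the telemetric link: position (x, y, z)
# and attitude (roll, pitch, yaw) packed as six little-endian doubles.
POSE_FMT = "<6d"


def encode_pose(x, y, z, roll, pitch, yaw):
    """Pack one pose sample for transmission over the link."""
    return struct.pack(POSE_FMT, x, y, z, roll, pitch, yaw)


def decode_pose(payload):
    """Unpack a received pose sample for the synthesiser."""
    return struct.unpack(POSE_FMT, payload)
```

The synthesiser would decode each received record and update the position and attitude of the synthesised vehicle image accordingly.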
The means for transmitting data relating to the real-time vehicle environment is preferably a reference database of the true environment of the vehicle, but a sensor carried by the vehicle producing a real-time image from the vehicle and a telemetric link arranged to transmit the real-time image to the synthesiser may be used.
In the case where the vehicle is an airframe, the real-time vehicle environment is the real-time forward environment of the airframe, and the virtual viewpoint is preferably a predetermined point trailing the airframe whereby the synthesised image of the airframe will appear to be viewed from behind.
In the case where the airframe is remotely piloted, an operator control may be ground-based, and a command link is provided between the operator control and the airframe to enable the airframe to be piloted remotely. Alternatively, the display screen may be carried by a second vehicle and a command link be provided between the second vehicle and the airframe to enable the airframe to be piloted from the second vehicle. The second vehicle may be ground-based, waterborne or airborne.
The synthesiser preferably includes means to generate a virtual shadow of the airframe and to superimpose the virtual shadow on the synthesised image of the real-time vehicle environment. In the case where the vehicle is an airframe, the virtual viewpoint may be a point displaced laterally from the airframe trajectory whereby the synthesised image of the airframe will appear to be viewed partially from one side.
In the case where the vehicle is an airframe, the virtual viewpoint may be a point displaced above or below the airframe trajectory whereby the synthesised image of the airframe will appear to be viewed partially from above or from below.
In the case where the vehicle is a ship, the real-time vehicle environment is the real-time environment of the ship, and the virtual viewpoint may be displaced such that the display of the synthesised image of the ship on the synthesised image of the real-time ship environment will reveal the location of at least part of the ship relative to any structure within the real-time environment of the ship.
In the case where the vehicle is a ship, the real-time environment is the real-time environment of the ship, and the virtual viewpoint may be displaced above the ship whereby the display of the synthesised image of the ship on the synthesised image of the real-time ship environment will reveal the location of at least part of an extremity of the ship relative to the real-time ship environment.
According to another aspect of the invention, a method of operating an actual vehicle includes generating a synthesised image of the real-time vehicle environment having a virtual viewpoint displaced in a predetermined way relative to the vehicle, generating a synthesised image of the vehicle with the same virtual viewpoint, and superimposing the synthesised image of the vehicle on the synthesised image of the real-time vehicle environment.
According to a further aspect of the invention a method of piloting an actual airframe from a command station having no, or inadequate, direct visual reference to the environment of the airframe, includes generating a synthesised image of the real-time forward environment ahead of the airframe but with a virtual viewpoint trailing the airframe, synthesising an image of the airframe with the same virtual viewpoint, and superimposing the synthesised image of the airframe on the synthesised image of the real-time forward environment for use as a reference for piloting the airframe from the command station. Thus a pilot, located at a ground station or in another vehicle, is enabled to visualise the movement of the airframe relative to a runway, or other objective, by the creation of the compound image which indicates the attitude of the airframe relative to the real-time environment ahead of the flight path. This image preferably includes other flight parameters, such as airspeed.
The method may include generating the synthesised image of the real-time forward environment of the airframe from a reference database of the true environment of the airframe.
The method may alternatively include transmitting the image of the real-time forward environment from a sensor carried by the airframe to the synthesiser at the command station. Such a method can be useful when the command station is carried by the airframe, but is particularly useful in the case where the command station is not carried by the airframe.
The accompanying diagrammatic drawing illustrates one embodiment of the invention, by way of example only, as applied to an airframe.
With reference to the drawing, an actual vehicle in the form of a remotely piloted airframe 10 is shown travelling obliquely to a runway 11 having a centre-line 12. The vehicle operator (pilot) 13 is positioned at a command station 14, which is remote from the airframe 10, for instance on the ground, in another aircraft, or in a ship.
The task of the pilot 13 is to adjust the trajectory T of the remotely piloted airframe 10 so that it is aligned with the centre-line 12 (that is, on "finals") at an appropriate height above the runway 11, at an appropriate airspeed, in an appropriate attitude, and with the airframe 10 in an appropriate configuration. To achieve this task, the command station 14 is provided with an operator (pilot) control comprising a control panel 15 which includes the appropriate flight controls and is operatively connected to the airframe 10 through a command link 16.
Prior to the present invention, the pilot 13 would receive an image of the scene ahead of the airframe flight path from a camera or other sensor 17 carried by the airframe 10, the image being transmitted by a telemetric link 18 to a display screen 19 positioned adjacent to the control panel 15. This display screen would also display data about the position and attitude of the airframe. In comparison with the pilot of a manned airframe, the pilot 13 of the remotely piloted airframe 10 has no feedback of how the airframe 10 feels or appears relative to the scene ahead of it. This is recognised as a major causative factor in the high rate of landing accidents with remotely piloted airframes 10.
The present invention teaches that a representation of the airframe's real-time forward environment is generated by or fed to a synthesiser 20 which controls the display of images on the display screen 19 and is conveniently positioned at the command station 14. The synthesiser 20 is provided with software which manipulates the real-time image of the airframe's forward environment to generate a synthesised image S of the real-time environment ahead of the airframe but with a virtual viewpoint 21 displaced so that it trails the airframe 10. The signal of this synthesised real-time forward image is transmitted by an image link 22 to the display screen 19 which conveniently may be a VDU that also displays flight parameters. In this manner the virtual viewpoint 21 of the synthesised real-time image is transposed from the position of the camera 17 to the virtual viewpoint 21 which, as shown, is displaced behind airframe 10 along the trajectory T and therefore trails the airframe 10.
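The geometry of the trailing virtual viewpoint 21 can be sketched briefly. The following is an illustrative sketch only, not part of the patent disclosure: the function names, coordinate convention (metres, z-up) and the fixed trail distance are assumptions for illustration.

```python
import math

def unit(v):
    """Normalise a 3-vector (x, y, z)."""
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def trailing_viewpoint(position, velocity, trail_distance):
    """Place a virtual viewpoint behind the airframe along its
    trajectory T: the viewpoint sits trail_distance metres back
    along the reversed velocity vector and looks along T, so the
    airframe is seen from behind."""
    direction = unit(velocity)  # unit vector along trajectory T
    viewpoint = tuple(p - trail_distance * d
                      for p, d in zip(position, direction))
    return viewpoint, direction  # camera position and look direction
```

For example, an airframe at (0, 0, 100) flying along +x with a 30 m trail distance gives a viewpoint at (-30, 0, 100) looking along +x, i.e. trailing the airframe along T.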
The synthesiser 20 also generates a synthesised image A of the airframe with the same virtual viewpoint 21 and superimposes the synthesised images S and A. The actual position and attitude of the airframe are transmitted via the telemetric link 18 from onboard sensors to the synthesiser, so that the synthesised airframe image A shows the real-time position and attitude of the airframe 10 relative to the real-time synthesised image S. On the display screen 19, the pilot 13 therefore sees the synthesised airframe image A travelling in front of him along trajectory T towards the real-time synthesised image S, which includes the position and orientation of the runway 11. From the particular image shown on the display screen 19 in the drawing, the pilot 13 will note that the airframe is banked to port and therefore turning left, and will very quickly understand that the airframe needs to continue tracking to the left and then be levelled until it nearly intercepts the runway centre-line 12, whereupon it will need to be banked to starboard prior to establishing "finals".
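One way the synthesiser could orient image A from the telemetered attitude is sketched below. This is a minimal illustration, not the patent's method: the Z-Y-X (yaw, pitch, roll) Euler convention and all function names are assumptions.

```python
import math

def attitude_matrix(roll, pitch, yaw):
    """3x3 rotation matrix (body frame -> world frame) built from
    telemetered roll/pitch/yaw in radians, using the common
    aerospace Z-Y-X (yaw, then pitch, then roll) convention."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def orient_model(points, roll, pitch, yaw):
    """Rotate the airframe model's body-frame points into the world
    frame, so that image A is drawn at the real-time attitude."""
    m = attitude_matrix(roll, pitch, yaw)
    return [tuple(sum(m[i][j] * p[j] for j in range(3)) for i in range(3))
            for p in points]
```

With zero roll, pitch and yaw the model is unchanged; a 90° yaw rotates a nose-forward point from +x to +y, which is what the superimposed image A would show as a left turn in a z-up frame.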
Depending on the profile of the synthesised airframe image A, the pilot 13 can assess further information about the attitude of the airframe 10. The remote synthesiser 20 can be programmed to show a virtual shadow of the synthesised airframe image A on the display screen 19. This virtual shadow need not take any account of the level or direction of lighting, but provides a reference to the height of the airframe 10 above the runway 11, thereby facilitating flare and landing. The synthesising of virtual shadows is well known in the art of video games, which also teaches diverse techniques and software for synthesising images.
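Because the virtual shadow ignores lighting, it reduces to dropping the airframe outline vertically onto the runway plane. A sketch under that assumption (names and the ground-plane height are illustrative, not from the patent):

```python
def virtual_shadow(points, ground_z=0.0):
    """Project airframe model points straight down onto the runway
    plane z = ground_z. No light direction is used: the shadow is a
    pure vertical projection, so the on-screen gap between airframe
    and shadow shrinks as the airframe descends, giving the pilot a
    direct height cue for flare and landing."""
    return [(x, y, ground_z) for (x, y, _z) in points]
```

A wingtip at (5, 2, 12) casts a shadow point at (5, 2, 0); the 12 m vertical gap reads directly as height above the runway.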
Through an operator control link 23, the pilot 13 can alter the position of the virtual viewpoint 21 relative to the airframe 10. It will be noted that the virtual viewpoint 21 trails the airframe 10 and is directed along the trajectory T; however, it could instead be arranged to have other geometric relationships with a frame of reference of the airframe 10. By using the operator control link 23, the pilot 13 can alter this geometric relationship by displacing the virtual viewpoint 21 towards, or away from, the airframe 10 along the trajectory T, by raising or lowering the virtual viewpoint 21 relative to the trajectory T, or by displacing the virtual viewpoint 21 laterally to the left or right of the trajectory T. Raising the virtual viewpoint 21 will increase the apparent perspective of the airframe 10, whilst lowering the virtual viewpoint 21 enables the relationship between the landing wheels and the runway 11 to be closely observed, together with the attitude of the airframe 10 relative to the runway 11. Such lowering of the virtual viewpoint can be of particular use for short-field landings, such as on aircraft carriers and temporary airstrips.
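The along-track, vertical and lateral adjustments described above amount to expressing the viewpoint as an offset in a trajectory-aligned frame. The sketch below assumes a roughly level trajectory and a z-up world; all names and conventions are illustrative rather than taken from the patent.

```python
import math

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def offset_viewpoint(position, direction, back, up, right):
    """Compute the virtual viewpoint from operator offsets:
    'back' metres behind the airframe along trajectory T,
    'up' metres above it, and 'right' metres to starboard.
    direction must be a unit vector along T; world up is taken
    as +z, so the trajectory is assumed not to be vertical."""
    world_up = (0.0, 0.0, 1.0)
    right_axis = cross(direction, world_up)  # starboard of the trajectory
    m = math.sqrt(sum(c * c for c in right_axis))
    right_axis = tuple(c / m for c in right_axis)
    up_axis = cross(right_axis, direction)   # camera "up", perpendicular to T
    return tuple(p - back * d + up * u + right * r
                 for p, d, u, r in zip(position, direction, up_axis, right_axis))
```

For an airframe at (0, 0, 100) flying along +x, offsets of 30 m back and 10 m up place the viewpoint at (-30, 0, 110): trailing and slightly raised, increasing the apparent perspective as described.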
Lateral displacement of the virtual viewpoint 21 can be useful for docking an asymmetrical in-flight refuelling device and, particularly on large aircraft, for assessing the clearance of wing tips from obstacles whilst manoeuvring on the ground.
The camera 17 can be replaced by any convenient form of airborne sensor that can produce a real-time forward view of the airframe 10.
The camera 17 and the telemetric link 18 can be replaced, or backed up, by information from a ground-based, or air-based, tracking means that is tracking the airframe 10.
Although the invention has been described with reference to the landing of an unmanned airframe 10, it is also of use for improving control of the airframe 10 during various aerial manoeuvres, such as positioning relative to aerial or ground-based objects, and in-flight refuelling. In particular, the invention facilitates the safe landing of a UAV by an operator who does not have the skills of a pilot. By appropriately offsetting the virtual viewpoint 21, it is possible for the pilot of a manned aircraft to make visual observation of the clearance between his wing tips and the wing tips and empennage of other aircraft whilst manoeuvring at an airport.
The invention is applicable to a range of vehicles whose driver has inadequate visual reference to the vehicle's surroundings. For example, the invention is particularly useful in controlling the landing of a UAV on an aircraft carrier, where a very detailed representation of the real-time environment can be generated. The invention is also useful with armoured vehicles engaged in combat, where there is no direct visual reference to the vehicle's surroundings. In this case, the invention provides an additional perception of the position and progress of the vehicle over the surrounding terrain. Also, by offsetting the virtual viewpoint 21 laterally, it is possible for the driver to gauge the actual clearance between the sides of his vehicle and any significant obstruction.
The invention is also useful for controlling unmanned vehicles such as those used for inspecting and detonating suspected bombs or booby traps.
Large ships are conned from the bridge, either by the captain or by a pilot. When manoeuvring close to other vessels, manoeuvring in a harbour, or coming alongside a buoy, the captain or pilot has to rely on information provided by crew members who may be several hundred metres away. This invention enables the captain or pilot to assess manoeuvring clearances directly from the bridge.

Claims

1. A control system for an actual vehicle including a display screen for a vehicle operator, means for transmitting data relating to the real-time vehicle environment to a synthesiser arranged to control the display of images on the display screen, and the synthesiser is arranged to:- (i) generate on the display screen a synthesised image of the real-time environment having a virtual viewpoint displaced in a predetermined way relative to the vehicle, (ii) generate a synthesised image of the vehicle with the same virtual viewpoint, and (iii) superimpose the synthesised image of the vehicle on the synthesised image of the real-time environment.
2. A control system for an actual vehicle according to Claim 1, in which the virtual viewpoint is displaced along the trajectory of the vehicle.
3. A control system for an actual vehicle according to Claim 2, in which the virtual viewpoint is a predetermined point trailing the vehicle whereby the display screen will display the synthesised image of the vehicle as if viewed from behind.
4. A control system for an actual vehicle according to any preceding claim, in which the virtual viewpoint is displaced laterally from the trajectory of the vehicle.
5. A control system for an actual vehicle according to any preceding claim, in which the virtual viewpoint is displaced vertically relative to the trajectory of the vehicle.
6. A control system for an actual vehicle according to Claim 1, including an operator control for adjusting the virtual viewpoint of the synthesised vehicle image relative to the vehicle.
7. A control system for an actual vehicle according to any preceding claim, including a telemetric link arranged to supply the synthesiser with real-time data from the vehicle regarding the attitude of the vehicle in its environment, and the synthesiser is arranged to show the vehicle attitude in the synthesised vehicle image.
8. A control system for an actual vehicle according to any preceding claim, including a telemetric link arranged to supply the synthesiser with real-time data from the vehicle regarding the trajectory of the vehicle in its environment, and the synthesiser is arranged to show the vehicle trajectory in the synthesised vehicle image.
9. A control system for an actual vehicle according to any of Claims 1 to 6, including a remote tracking means arranged to track the vehicle and to supply the synthesiser with the real-time data regarding the attitude of the vehicle in its environment, and the synthesiser is arranged to display the vehicle attitude in the synthesised vehicle image.
10. A control system for an actual vehicle according to any of Claims 1 to 6 or 9, including a remote tracking means arranged to track the vehicle and to supply the synthesiser with the real-time data regarding the trajectory of the vehicle in its environment, and the synthesiser is arranged to show the vehicle trajectory in the synthesised vehicle image.
11. A control system for an actual vehicle, according to any preceding claim, in which the means for transmitting the data relating to the real-time vehicle environment includes a sensor carried by the vehicle.
12. A control system for an actual vehicle according to any of Claims 1 to 10, in which the means for transmitting the data relating to the real-time vehicle environment includes a reference database of the true environment of the vehicle.
13. A control system for an actual vehicle according to any preceding claim, wherein the vehicle is an airframe.
14. A control system for an actual vehicle according to Claim 13, wherein the real-time vehicle environment is the real-time forward environment of the airframe, and the virtual viewpoint is a point trailing the airframe whereby the synthesised image of the airframe will appear to be viewed from behind.
15. A control system for an actual vehicle according to Claim 13 wherein the airframe is remotely piloted and an operator control is ground-based, a command link being provided between the operator control and the airframe to enable the airframe to be piloted remotely.
16. A control system for an actual vehicle according to Claim 13, wherein the airframe is remotely piloted and the display screen is carried by a second vehicle, a command link being provided between the second vehicle and the airframe to enable the airframe to be piloted from the second vehicle.
17. A control system for an actual vehicle according to any of Claims 13 to 16, in which the synthesiser includes means to generate a virtual shadow of the airframe and to superimpose the virtual shadow on the synthesised image of the real-time forward environment.
18. A control system for an actual vehicle according to Claim 5, wherein the vehicle is an airframe and the synthesised image of the airframe will appear to be viewed partially from one side.
19. A control system for an actual vehicle according to Claim 13 wherein the virtual viewpoint is a point displaced above or below the airframe trajectory whereby the synthesised image of the airframe will appear to be viewed partially from above or below.
20. A control system for an actual vehicle according to any of Claims 1 to 12 wherein the vehicle is a ship and the virtual viewpoint is displaced such that the display of the synthesised image of the ship on the synthesised image of the real-time ship environment will reveal the location of at least part of the ship relative to any structure within the real-time environment of the ship.
21. A control system for an actual vehicle according to Claim 20 wherein the virtual viewpoint is displaced above the ship such that the display of the synthesised image of the ship on the synthesised image of the real-time ship environment will reveal the location of at least part of an extremity of the ship relative to the real-time ship environment.
22. A method of operating an actual vehicle, including generating a synthesised image of the real-time vehicle environment having a virtual viewpoint displaced in a predetermined way relative to the vehicle, generating a synthesised image of the vehicle with the same virtual viewpoint, and superimposing the synthesised image of the vehicle on the synthesised image of the real-time vehicle environment.
23. A method of piloting an actual airframe from a command station having no, or inadequate, direct visual reference to the environment of the airframe, including generating a synthesised image of the real-time forward environment ahead of the airframe but with a virtual viewpoint trailing the airframe, synthesising an image of the airframe with the same virtual viewpoint, and superimposing the synthesised image of the airframe on the synthesised image of the real-time forward environment for use as a reference for piloting the airframe from the command station.
PCT/GB2005/050217 2004-12-02 2005-11-30 Indirect control of vehicles WO2006059151A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB0426483A GB0426483D0 (en) 2004-12-02 2004-12-02 Indirect control of vehicles
GB0426483.4 2004-12-02
EP04257508 2004-12-02
EP04257508.4 2004-12-02

Publications (1)

Publication Number Publication Date
WO2006059151A1 true WO2006059151A1 (en) 2006-06-08

Family

ID=36165387

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2005/050217 WO2006059151A1 (en) 2004-12-02 2005-11-30 Indirect control of vehicles

Country Status (1)

Country Link
WO (1) WO2006059151A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2041177A (en) * 1979-01-25 1980-09-03 Wigren P Sighting and target tracking instruction apparatus
US4232456A (en) * 1977-06-30 1980-11-11 Martin Marietta Corporation Weapons system simulator and method including ranging system
WO1983001832A1 (en) * 1981-11-14 1983-05-26 Walmsley, Dennis, Arthur Guided missile fire control simulators
EP0090323A1 (en) * 1982-03-30 1983-10-05 Günter Löwe Training device for the firing of guided missiles, particularly of surface-to-surface missiles
US5224860A (en) * 1991-03-01 1993-07-06 Electronics & Space Corp. Hardware-in-the-loop tow missile system simulator

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010025559A1 (en) * 2008-09-05 2010-03-11 Cast Group Of Companies Inc. System and method for real-time environment tracking and coordination
US8639666B2 (en) 2008-09-05 2014-01-28 Cast Group Of Companies Inc. System and method for real-time environment tracking and coordination
US8938431B2 (en) 2008-09-05 2015-01-20 Cast Group Of Companies Inc. System and method for real-time environment tracking and coordination

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KN KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 05813609

Country of ref document: EP

Kind code of ref document: A1