INDIRECT CONTROL OF VEHICLES
This invention relates to the indirect control of vehicles and provides both a control system for an actual vehicle, and a method of operating an actual vehicle, which enable the vehicle to be controlled by an operator who has no, or inadequate, direct visual reference to the proximate environment of the vehicle.
It is well known for model boats, cars and aeroplanes to be controlled remotely using a radio control link. In these examples the operator has remote visual reference to part of the environment of the model boat, car or aeroplane. This remote visual reference is usually from a direction oblique to the vehicle trajectory, and/or from a different level. The operator has to transpose these remote visual references when attempting to visualise the real-time environment of the model and the trajectory and attitude of the model within the real-time environment.
It is also well-known for a remotely piloted airframe, such as an unmanned air vehicle (UAV), to carry a video camera arranged, through a telemetry down-link, to provide a remotely based operator (on the ground or in another vehicle) with a pilot's view of the terrain in front of the remotely piloted airframe. Whilst this system is useful, many operators experience significant difficulties in achieving the safe landing of UAVs, probably due to the absence of any of the peripheral visual inputs which pilots of piloted airframes use to gauge side-slip, roll, attitude and height immediately before and during landing.
Some airframe designs have closed cockpits which allow the pilot no, or limited, visual reference to the ground. The operators of other vehicles, such as military tanks, have very restricted visual reference to the ground and usually no visual reference for gauging lateral separation from obstructions. The captains of large ships have inadequate visual reference to quays or obstructions whilst controlling manoeuvres such as docking, or picking up buoys.
In addition, the use of simulators for training military and aircraft personnel is described in several documents including US 5,224,860, WO83/01832 and US 4,232,456. Simulators are non-destructive and relatively inexpensive to operate and are designed primarily to improve the skills of an operator. In recent years, full real-time digital synthesis of an environment and/or vehicle has been achieved, such as for example Microsoft's™ Flight Simulator. In these systems, synthetic images of a vehicle and flight environment are generated and manipulated so as to simulate the dynamic response of the view of the environment to the vehicle control actions of the operator. These systems can allow the use of a "trailing viewpoint" of the environment located a few hundred metres behind the air vehicle so as to provide simultaneous situational awareness of velocity, track, attitude and position. However, this effect is currently only exploited in the virtual world of simulations.
The present invention is concerned with providing a control system for an actual vehicle, and a method of operating such a vehicle, to improve the control of such a vehicle by an operator who has no, or inadequate, direct visual reference to the environment of the vehicle and requires minimal skill on the part of the operator. By "actual vehicle" is meant a vehicle existing in reality as opposed to some virtual vehicle existing in a synthetic environment. The invention is particularly, but not exclusively, concerned with improving the control of an airframe, such as a UAV, by a remote operator who has no direct visual reference to the ground in front of the airframe.
According to one aspect of the invention a control system for an actual vehicle includes a display screen for a vehicle operator, means for transmitting geometrical data relating to the vehicle to a synthesiser arranged to control the display of images on the display screen, and the synthesiser is arranged to:-
(i) generate on the display screen a synthesised image of the real- time environment having a virtual viewpoint displaced in a predetermined way relative to the vehicle,
(ii) generate a synthesised image of the vehicle with the same virtual viewpoint, and
(iii) superimpose the synthesised image of the vehicle on the synthesised image of the real-time environment.
The system allows an operator to observe a visualisation of the vehicle within its environment enhancing situational awareness of the velocity, track, attitude and position of the vehicle by the operator and facilitating remote control of the vehicle by conventional manual means. This enhanced awareness greatly reduces the skill required to operate the vehicle resulting in minimal training requirements for operator personnel. Vehicle control can also be automatic but an operator has the benefit of sufficient situational awareness to be able to supervise the landing and intervene if necessary. This is of value in onboard flight path visualisation, in closed cockpits, or for operations in zero visibility and is advantageous in that conventional large displays are not required.
The virtual viewpoint is preferably displaced along the trajectory of the vehicle. Preferably, the virtual viewpoint is a predetermined point trailing the vehicle whereby the display screen will display the synthesised image of the vehicle as if viewed from behind.
The virtual viewpoint may alternatively, or additionally, be displaced laterally from the trajectory of the vehicle. In this manner the display screen will display the synthesised image of the vehicle as if viewed from a direction parallel with, but to one side of and on the same level as, the trajectory.
The predetermined virtual viewpoint may alternatively, or additionally, be displaced vertically relative to the trajectory of the vehicle. In this manner the display screen will display the synthesised image of the vehicle as if viewed from a direction parallel with, but at higher or lower level than, the trajectory.
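The trailing, lateral and vertical displacements described above can be expressed as a simple geometric transformation. The following is a minimal illustrative sketch only; the function and parameter names (trail, lateral, vertical) are assumptions for illustration and are not terms defined in this specification:

```python
import math

def virtual_viewpoint(position, heading_deg, trail=200.0, lateral=0.0, vertical=0.0):
    """Return a viewpoint displaced relative to a vehicle.

    position    -- (x, y, z) of the vehicle in metres (x east, y north)
    heading_deg -- trajectory direction over the ground, degrees from north
    trail       -- displacement backwards along the trajectory
    lateral     -- displacement to the right of the trajectory
    vertical    -- displacement above the trajectory
    """
    x, y, z = position
    h = math.radians(heading_deg)
    # Unit vector along the trajectory (north = +y, east = +x).
    fx, fy = math.sin(h), math.cos(h)
    # Right-hand perpendicular to the trajectory, in the horizontal plane.
    rx, ry = fy, -fx
    return (x - trail * fx + lateral * rx,
            y - trail * fy + lateral * ry,
            z + vertical)
```

With zero lateral and vertical offsets this reduces to the preferred trailing viewpoint, displaced behind the vehicle along its trajectory.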
The system preferably includes an operator control for adjusting the virtual viewpoint of the synthesised vehicle image relative to the vehicle.
A telemetric link may be arranged to supply the synthesiser with real-time data from the vehicle regarding the position, attitude and/or trajectory of the vehicle in its environment and the synthesiser be arranged to show the vehicle position, attitude and/or trajectory in the synthesised vehicle image. Alternatively, a remote tracking means may be arranged to track the vehicle and to supply the synthesiser with real-time data regarding the position, attitude and/or trajectory of the vehicle in its environment, and the synthesiser is arranged to display the vehicle position, attitude and/or vehicle trajectory in the synthesised vehicle image.
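The real-time geometrical data carried by such a telemetric link might be organised as below. This is a hedged sketch only; the field names and the use of JSON serialisation are assumptions for illustration, not part of the specification:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class TelemetryFrame:
    """One sample of vehicle position, attitude and trajectory."""
    time_s: float            # timestamp of the sample
    lat_deg: float           # position
    lon_deg: float
    alt_m: float             # height above datum, metres
    roll_deg: float          # attitude
    pitch_deg: float
    yaw_deg: float
    track_deg: float         # trajectory direction over the ground
    groundspeed_mps: float

def encode(frame: TelemetryFrame) -> bytes:
    """Serialise one frame for transmission over the telemetric link."""
    return json.dumps(asdict(frame)).encode("utf-8")

def decode(payload: bytes) -> TelemetryFrame:
    """Reconstruct the frame at the synthesiser end of the link."""
    return TelemetryFrame(**json.loads(payload.decode("utf-8")))
```

The same record could equally be filled by a remote tracking means rather than by onboard sensors.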
The means for transmitting data relating to the real-time vehicle environment is preferably a reference database of the true environment of the vehicle, but a sensor carried by the vehicle producing a real-time image from the vehicle and a telemetric link arranged to transmit the real-time image to the synthesiser may be used.
In the case where the vehicle is an airframe, the real-time vehicle environment is the real-time forward environment of the airframe, and the virtual viewpoint is preferably a predetermined point trailing the airframe whereby the synthesised image of the airframe will appear to be viewed from behind.
In the case where the airframe is remotely piloted, an operator control may be ground-based, and a command link is provided between the operator control and the airframe to enable the airframe to be piloted remotely. Alternatively, the display screen may be carried by a second vehicle and a command link be provided between the second vehicle and the airframe to enable the airframe to be piloted from the second vehicle. The second vehicle may be ground-based, waterborne or airborne.
The synthesiser preferably includes means to generate a virtual shadow of the airframe and to superimpose the virtual shadow on the synthesised image of the real-time vehicle environment.
In the case where the vehicle is an airframe, the virtual viewpoint may be a point displaced laterally from the airframe trajectory whereby the synthesised image of the airframe will appear to be viewed partially from one side.
In the case where the vehicle is an airframe, the virtual viewpoint may be a point displaced above or below the airframe trajectory whereby the synthesised image of the airframe will appear to be viewed partially from above or from below.
In the case where the vehicle is a ship, the real-time vehicle environment is the real-time environment of the ship, and the virtual viewpoint may be displaced such that the display of the synthesised image of the ship on the synthesised image of the real-time ship environment will reveal the location of at least part of the ship relative to any structure within the real-time environment of the ship.
In the case where the vehicle is a ship, the real-time environment is the real-time environment of the ship, and the virtual viewpoint may be displaced above the ship whereby the display of the synthesised image of the ship on the synthesised image of the real-time ship environment will reveal the location of at least part of the extremity of the ship relative to the real-time ship environment.
According to another aspect of the invention, a method of operating an actual vehicle includes generating a synthesised image of the real-time vehicle environment having a virtual viewpoint displaced in a predetermined way relative to the vehicle, generating a synthesised image of the vehicle with the same virtual viewpoint, and superimposing the synthesised image of the vehicle on the synthesised image of the real-time vehicle environment.
According to a further aspect of the invention a method of piloting an actual airframe from a command station having no, or inadequate, direct visual reference to the environment of the airframe, includes generating a synthesised image of the real-time forward environment ahead of the airframe but with a
virtual viewpoint trailing the airframe, synthesising an image of the airframe with the same virtual viewpoint, and superimposing the synthesised image of the airframe on the synthesised image of the real-time forward environment for use as a reference for piloting the airframe from the command station. Thus a pilot, located at a ground station or in another vehicle, is enabled to visualise the movement of the airframe relative to a runway, or other objective, by the creation of the compound image which indicates the attitude of the airframe relative to the real-time environment ahead of the flight path. This image preferably includes other flight parameters, such as airspeed.
The method may include generating the synthesised image of the real-time forward environment of the airframe from a reference database of the true environment of the airframe.
Alternatively, the method may include transmitting the image of the real-time forward environment from a sensor carried by the airframe to the synthesiser at the command station. Such a method can be useful when the command station is carried by the airframe, but is particularly useful in the case where the command station is not carried by the airframe.
The accompanying diagrammatic drawing illustrates one embodiment of the invention, by way of example only, as applied to an airframe.
With reference to the drawing, an actual vehicle in the form of a remotely piloted airframe 10, is shown travelling obliquely to a runway 11 having a centre-line 12. The vehicle operator (pilot) 13 is positioned at a command station 14, which is remote from the airframe 10, for instance on the ground, in another aircraft, or in a ship.
The task of the pilot 13 is to adjust the trajectory T of the remotely piloted airframe 10 so that it is aligned with the centre-line 12 (that is on "finals") at an appropriate height above the runway 11, at an appropriate airspeed, in an appropriate attitude, and with the airframe 10 in an appropriate configuration.
To achieve this task, the command station 14 is provided with an operator (pilot) control comprising a control panel 15 which includes the appropriate flight controls and is operatively connected to the airframe 10 through a command link 16.
Prior to the present invention, the pilot 13 would receive an image of the scene ahead of the airframe flight path from a camera or other sensor 17 carried by the airframe 10, the image being transmitted by a telemetric link 18 to a display screen 19 positioned adjacent to the control panel 15. This display screen would also display data about the position and attitude of the airframe. In comparison with a pilot of a manned airframe, the pilot 13 of the remotely piloted airframe 10 has no feedback of how the airframe 10 feels or appears relative to the scene ahead of the airframe. This is recognised as a major causative factor in the high rate of landing accidents with remotely piloted airframes 10.
The present invention teaches that a representation of the airframe's real-time forward environment is generated by or fed to a synthesiser 20 which controls the display of images on the display screen 19 and is conveniently positioned at the command station 14. The synthesiser 20 is provided with software which manipulates the real-time image of the airframe's forward environment to generate a synthesised image S of the real-time environment ahead of the airframe but with a virtual viewpoint 21 displaced so that it trails the airframe 10. The signal of this synthesised real-time forward image is transmitted by an image link 22 to the display screen 19 which conveniently may be a VDU that also displays flight parameters. In this manner the viewpoint of the synthesised real-time image is transposed from the position of the camera 17 to the virtual viewpoint 21 which, as shown, is displaced behind the airframe 10 along the trajectory T and therefore trails the airframe 10.
The synthesiser 20 also generates a synthesised image A of the airframe with the same virtual viewpoint 21 and superimposes the synthesised images S and A. The actual position and attitude of the airframe are transmitted via the
telemetric link 18 from onboard sensors to the synthesiser so that the synthesised airframe image A shows the real-time position and attitude of the airframe 10 relative to the real-time synthesised image S. From the display screen 19, the pilot 13 therefore sees the synthesised airframe image A travelling in front of him along trajectory T towards the real-time synthesised image S which includes the position and orientation of the runway 11. From the particular image shown on the display screen 19 in the drawing, the pilot 13 will note that the airframe is banked to port and therefore turning left, and will very quickly understand that the airframe needs to continue tracking to the left and then be levelled until it nearly intercepts the runway centre-line 12 whereupon it will need to be banked to starboard prior to establishing "finals".
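The superimposition of the airframe image A on the environment image S amounts to simple painter's-order compositing. The following minimal sketch assumes images as nested lists of pixel values with 0 marking a transparent pixel; this representation is an illustrative assumption, not part of the specification:

```python
def superimpose(env, vehicle):
    """Draw the vehicle overlay on top of the environment image.

    env     -- synthesised environment image S (rows of pixel values)
    vehicle -- synthesised vehicle image A, same size, 0 = transparent
    """
    out = [row[:] for row in env]          # work on a copy of S
    for y, row in enumerate(vehicle):
        for x, px in enumerate(row):
            if px != 0:                    # transparent pixels keep S visible
                out[y][x] = px
    return out
```

Because both images share the same virtual viewpoint 21, no further registration between them is needed before compositing.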
Depending on the profile of the synthesised airframe image A, the pilot 13 can assess further information about the attitude of the airframe 10. The remote synthesiser 20 can be programmed to show a virtual shadow of the synthesised airframe image A on the display screen 19. This virtual shadow does not need to take any account of the level or direction of lighting, but provides a reference to the height of the airframe 10 above the runway 11 thereby facilitating flare and landing. The synthesising of virtual shadows is well-known in the art of video games which also teach diverse techniques and software for synthesising images.
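Since the virtual shadow ignores lighting, it can simply be the airframe's position projected vertically onto the runway plane; the on-screen separation between airframe and shadow then grows with height, which is the cue that facilitates flare. A minimal sketch of this idea, with all names being illustrative assumptions:

```python
def shadow_point(aircraft, runway_elevation=0.0):
    """Drop the aircraft position (x, y, z) straight down onto the runway plane.

    No account is taken of the level or direction of lighting; the shadow
    sits directly beneath the airframe regardless of sun position.
    """
    x, y, _z = aircraft
    return (x, y, runway_elevation)

def height_cue(aircraft, runway_elevation=0.0):
    """Vertical separation between the airframe and its virtual shadow."""
    return aircraft[2] - runway_elevation
```

As the airframe descends, height_cue shrinks towards zero and the shadow merges with the airframe image at touchdown.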
Through an operator control link 23, the pilot 13 can alter the position of the virtual viewpoint 21 relative to the airframe 10. It will be noted that the virtual viewpoint 21 trails the airframe 10 and is directed along the trajectory T - however, it could instead be arranged to have other geometric relationships with a frame of reference of the airframe 10. By using the operator control link 23, the pilot 13 can alter this geometrical relationship by displacing the virtual viewpoint 21 towards, or away from, the airframe 10 along the trajectory T, or by raising or lowering the virtual viewpoint 21 relative to the trajectory T, or by displacing the virtual viewpoint 21 laterally to the left or right of the trajectory T.
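The operator control link could be modelled as incremental adjustments to the three viewpoint offsets. This is a hypothetical sketch only; the class and its constraint that the viewpoint never moves ahead of the airframe are illustrative assumptions:

```python
class ViewpointControl:
    """Pilot-adjustable offsets of the virtual viewpoint from the vehicle."""

    def __init__(self, trail=200.0, lateral=0.0, vertical=0.0):
        self.trail = trail          # metres behind the vehicle, along T
        self.lateral = lateral      # metres to the right of T
        self.vertical = vertical    # metres above T

    def nudge(self, d_trail=0.0, d_lateral=0.0, d_vertical=0.0):
        # Keep the viewpoint trailing: the along-track offset never goes
        # negative, so the viewpoint cannot cross ahead of the vehicle.
        self.trail = max(0.0, self.trail + d_trail)
        self.lateral += d_lateral
        self.vertical += d_vertical
```

Each nudge corresponds to one of the displacements the specification describes: towards or away from the airframe along T, laterally left or right of T, or above or below T.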
Raising the virtual viewpoint 21 will increase the apparent perspective of the airframe 10, whilst lowering the virtual viewpoint 21 enables the relationship between the landing wheels and the runway 11 to be closely observed together with the attitude of the airframe 10 relative to the runway 11. Such lowering of the virtual viewpoint can be of particular use for short-field landings such as on aircraft carriers and temporary airstrips.
Lateral displacement of the virtual viewpoint 21 can be useful for docking an asymmetrical in-flight refuelling device and, particularly on large aircraft, for assessing the clearance of wing tips from obstacles whilst manoeuvring on the ground.
The camera 17 can be replaced by any convenient form of airborne sensor that can produce a real-time forward view of the airframe 10.
The camera 17 and the telemetric link 18 can be replaced, or backed-up, by information from a ground based, or air-based, tracking means that is tracking the airframe 10.
Although the invention has been described with reference to the landing of an unmanned airframe 10, it is also of use for improving control of the airframe 10 during various aerial manoeuvres, such as positioning relative to aerial or ground-based objects, and in-flight refuelling. In particular, the invention facilitates the safe landing of a UAV by an operator who does not have the skills of a pilot. By appropriately offsetting the virtual viewpoint 21, it is possible for the pilot of a manned aircraft to make visual observation of the clearance between his wing tips and the wing tips and empennage of other aircraft whilst manoeuvring at an airport.
The invention is applicable to a range of vehicles of which the driver has inadequate visual reference to the vehicle's surroundings. For example, the invention is particularly useful in controlling the landing of a UAV on an aircraft carrier where a very detailed representation of the real-time environment can be
generated. In particular, the invention is useful with armoured vehicles engaged in combat where there is no direct visual reference of the vehicle's surroundings. In this case, the invention provides an additional perception of the position and progress of the vehicle over the surrounding terrain. Also, by offsetting the virtual viewpoint 21 laterally, it is possible for the driver to gauge the actual clearance between the sides of his vehicle and any significant obstruction.
The invention is also useful for controlling unmanned vehicles such as those used for inspecting and detonating suspected bombs or booby traps.
Large ships are conned from the bridge either by the captain or by a pilot. When manoeuvring close to other vessels, or manoeuvring in a harbour, or coming alongside a buoy, the captain or pilot has to rely on information provided by crew members who may be several hundred metres away. This invention enables the captain or pilot to assess manoeuvring clearances directly on the bridge.