EP0383902A1 - Remote operated vehicle control - Google Patents

Remote operated vehicle control

Info

Publication number
EP0383902A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
video
control system
information
receiver
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP89910132A
Other languages
German (de)
French (fr)
Inventor
Rodney John Blissett
Christopher George Harris
Debra Charnley
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Plessey Overseas Ltd
Original Assignee
Plessey Overseas Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Plessey Overseas Ltd filed Critical Plessey Overseas Ltd
Publication of EP0383902A1 publication Critical patent/EP0383902A1/en
Withdrawn legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
    • G05D1/0038Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/002Special television systems not provided for by H04N7/007 - H04N7/18
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/20Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding

Definitions

  • each image is broken down by an information processing stage into 'image tokens'.
  • These tokens are representative of the positions of corner and edge features present in a given video picture frame, and a description of how these features are identified is given in our copending patent application No. 881123 entitled 'Digital Data Processing'.

Abstract

The remote operated vehicle control system described comprises a remotely controlled vehicle (1) carrying a television camera (4), the camera (4) and vehicle (1) being connected by a radio link (6) to a separate base station (2) equipped with a radio receiver (3), in which the video picture for transmission is divided into separate data sets, one part of which exhibits short timescale variations and another part of which exhibits long timescale variations, the two data sets being transmitted at differing video frame rates, and means at the receiver for combining the two transmissions to give a composite picture on a video display of said receiver (3). Such a system provides good control of the vehicle without the need for a high-bandwidth radio link.

Description

REMOTE OPERATED VEHICLE CONTROL
This invention relates to remote operated vehicle control. It relates particularly to means for controlling a remote operated vehicle where the connecting link carries information for a picture display of the ground surface and obstacles ahead of the vehicle.
One type of remote operated vehicle makes use of a cable connection between the vehicle and the remotely located operator. Such a cable can carry a wide bandwidth of information for the operator, including video signals, for example from a television camera mounted on the vehicle. However, the necessarily limited length of the cable restricts the range of the vehicle; it also restricts manoeuvrability and is susceptible to damage or breakage. An alternative approach would be to use a high bandwidth radio link, but this too can restrict the range of operations and the type of country that can be driven through. In addition, it may not be suitable for an application where more than one remotely controlled vehicle is to be operated together. Video data compression techniques exist which might be helpful, but they do not achieve compression ratios adequate for the necessary sensory feedback information to be transmitted over a low bandwidth radio link.
The present invention was devised to provide a method for remote vehicle control where the need for a dedicated high bandwidth radio link can be avoided.
According to the invention, there is provided a remote operated vehicle control system comprising a remotely controlled vehicle carrying a television camera, the camera and vehicle being connected by a radio link to a separate base station having a radio receiver, in which the video picture for transmission is divided into separate data sets one part of which exhibits short timescale variations and another part of which exhibits long timescale variations, the two data sets being transmitted at differing video frame rates, and means at the receiver for combining the two transmissions to give a composite picture on a video display of said receiver.
The said video frame rates may include a rate of thirty hertz with a second slower rate.
Preferably, the picture information for transmission is processed to select generally static features which are used to generate a synthesized scene in the video display. The selected picture information may be processed such that only edge and corner information is used for said slower frame rate transmission. The received slower frame rate information may be converted to video frame rate for projection into the image plane of the video display. By way of example, a particular embodiment of the invention will now be described with reference to the accompanying drawings, in which:
Figure 1 shows the remote operated vehicle in a typical scene with a base station which accommodates a control console for a vehicle operator,
Figure 2 is a block diagram of the main system control units, and,
Figure 3 shows the different data transformations that are necessary to provide the required picture display. As depicted in Figure 1, a remote operated vehicle 1 is shown travelling along a country road and the vehicle is under the control of an operator located at a base station 2. The operator will be positioned at an operator console 3 and this makes use of a video display showing the scene in front of the vehicle. A television camera 4 located on the vehicle roof transmits a picture of the ground surface ahead of the vehicle 1 to a receiver at the console 3. The radio link 6 is a two-way one so that control signals from the console 3 can be used to guide the vehicle. In addition to the picture information, signals from sensors on the vehicle 1 can be transmitted to the console 3 so that information on the braking effect, steering movements etc. can be applied to the video display.
As already mentioned, to transmit a full video picture to the base station would take up a large video bandwidth. In some applications this might be acceptable, but in the event that more than one vehicle might be required to operate simultaneously, it would be preferable to reduce this bandwidth. In the present invention, this is effected by transmitting different parts of the video information at different data rates. The underlying vehicle motion information, which changes comparatively rapidly, is transmitted at the usual video frame rate. The underlying 3D scene structure changes slowly and is transmitted at a slow frame rate. This means that it might take several video frames for newly inferred structural information to be transmitted.
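Purely as an illustration of the two-rate scheme just described, the sketch below sends the fast-changing motion data every video frame and the slowly changing scene data only on every fifteenth frame; the 30 Hz and 2 Hz figures come from the embodiment described later, while the function and field names are hypothetical.

```python
# Hypothetical sketch of the two-rate transmission scheme described above.
# Fast-changing vehicle motion data is sent every video frame (30 Hz); the
# slowly changing 3D scene tokens are sent only on every 15th frame (2 Hz).

VIDEO_FRAME_RATE_HZ = 30
SCENE_TOKEN_RATE_HZ = 2
FRAMES_PER_TOKEN_UPDATE = VIDEO_FRAME_RATE_HZ // SCENE_TOKEN_RATE_HZ  # 15

def transmit_frame(frame_index, motion_params, scene_tokens, radio_link):
    """Send one frame's worth of data over the low bandwidth radio link."""
    # Short-timescale data: vehicle motion parameters, sent at full video rate.
    radio_link.send({"type": "motion", "frame": frame_index, "data": motion_params})
    # Long-timescale data: 3D scene tokens, sampled down to 2 Hz.
    if frame_index % FRAMES_PER_TOKEN_UPDATE == 0:
        radio_link.send({"type": "scene_tokens", "frame": frame_index, "data": scene_tokens})
```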
To infer the structural information of the observed scene, each image is broken down by an information processing stage into 'image tokens'. These tokens are representative of the positions of corner and edge features present in a given video picture frame, and a description of how these features are identified is given in our copending patent application No. 881123 entitled 'Digital Data Processing'.
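The details of token extraction are deferred to the copending application; purely as an illustration, the following sketch locates candidate corner tokens in a digitised frame using a Harris-style corner response. The function name, smoothing scale and threshold are assumptions for illustration, not taken from the application.

```python
import numpy as np
from scipy.ndimage import sobel, gaussian_filter

def corner_tokens(image, k=0.04, threshold=1e6):
    """Illustrative Harris-style corner response; the actual token
    extraction is defined in the copending application and may differ."""
    ix = sobel(image.astype(float), axis=1)
    iy = sobel(image.astype(float), axis=0)
    # Smoothed products of gradients form the local structure matrix.
    ixx = gaussian_filter(ix * ix, sigma=1.5)
    iyy = gaussian_filter(iy * iy, sigma=1.5)
    ixy = gaussian_filter(ix * iy, sigma=1.5)
    response = (ixx * iyy - ixy ** 2) - k * (ixx + iyy) ** 2
    ys, xs = np.nonzero(response > threshold)
    return list(zip(xs.tolist(), ys.tolist()))  # image-plane token positions
```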
An appropriate application of a 'Structure-from-Motion' algorithm as discussed in the paper 'Towards robot mobility through passive monocular vision' Blissett, R.J., Charnley, D. and Harris, C.G., Proceedings International Symposium on Teleoperation and Control, pp. 123-132, July 1988, is then made. The equivalent 3D locations of the corresponding scene features can then be estimated relative to the current position and attitude of the vehicle. The viewed scene is next decomposed into 3D tokens. This information can then be transmitted to the base station via a low bandwidth radio link.
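For illustration, a standard linear (DLT) triangulation of a matched token from two known camera poses is sketched below; the cited paper may use a different estimator, so this is only an assumed stand-in for the 'Structure-from-Motion' step.

```python
import numpy as np

def triangulate(p1, p2, P1, P2):
    """Estimate a 3D point from its image-plane positions p1, p2 (pixels)
    in two frames with known 3x4 camera projection matrices P1, P2.
    Standard linear (DLT) triangulation, shown for illustration only."""
    A = np.vstack([
        p1[0] * P1[2] - P1[0],
        p1[1] * P1[2] - P1[1],
        p2[0] * P2[2] - P2[0],
        p2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean 3D scene token
```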
At the base station, the processing circuitry will be able to synthesise a skeletal representation of the scene information based on the received token data. A moving (video-rate) representation of the viewed scene can be generated based on a lower frequency update (for example, at two hertz), provided that the synthesized scene is reprojected onto the image plane at the normal video frame rate (thirty hertz). In order to accomplish this it is required that the vehicle motion parameters are transmitted to the base station at video rate. This arrangement ensures that a real time video display is available at the base station, faithfully reproducing the short timescale variations due to the vehicle dynamics and recording at a lower rate the general terrain and evolution of objects in the field of view of the television camera. The dynamically updated scene may then be viewed by the operator and used to provide the sensory feedback information to enable the vehicle to be driven. Whilst the need to compress the data results in a partial loss of information, in that the actual gray-levels recorded by the television camera are not reconstituted, the reconstruction of 3D information provides valuable additional quantitative data regarding the nature of the terrain ahead of the vehicle and the presence of obstacles. This method offers the advantages of eliminating the need for cumbersome and restrictive umbilicals and does not require dedicated high bandwidth radio links. Hence, the operability of the teleoperated vehicle is enhanced accordingly.
Some benefits of the system of the application will be apparent from the following comparison between the data rate requirements of different video transmission formats:
Video Format                                               Data Rate (kbits/s)
Full resolution digital                                    60000
Full resolution compressed (10:1)                          6000
Low resolution compressed (10:1)                           1500
Full resolution edge image                                 about 1500
Two hertz update of 240 scene tokens plus thirty
hertz update of vehicle parameters                         about 15 plus about 1
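The full resolution digital figure in the table follows directly from the digitisation described below (512 by 512 eight-bit pixels at thirty frames per second); the short calculation below reproduces it and is included only as a worked check.

```python
# Rough reproduction of the full-resolution figure in the table above,
# using the 512 x 512 eight-bit digitisation at thirty frames per second
# described in the embodiment. Values are approximate.
pixels_per_frame = 512 * 512
bits_per_pixel = 8
frames_per_second = 30
full_resolution_kbits = pixels_per_frame * bits_per_pixel * frames_per_second / 1000
print(full_resolution_kbits)  # ~62915 kbit/s, i.e. roughly 60000 kbit/s
```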
The main system control units required to carry out this process are depicted in Figure 2. This Figure shows the television camera 4 (which uses, for preference, a non-interlaced line arrangement) which delivers analogue video data by cable to a front-end video processor 7. The processor 7 digitises each captured frame to 512 by 512 eight-bit pixels and this data is held in a frame store. The front-end processor is designed to decompose each digitised frame into a list of extracted image tokens and associated attributes. These tokens may be based on localised corners and/or edges. The design of the processor 7 is based on the VME-bus architecture and it follows the construction disclosed in the aforementioned patent application.
The data emanating from the front-end processor 7 is at a much reduced data rate (about 1.8 megabits per second). This data stream is passed into a 3D geometry module 8 which may additionally accept data from auxiliary motion and attitude sensors on board the vehicle 1. The 3D geometry module 8 matches image tokens from frame to frame, computes or utilises the vehicle motion parameters and, through triangulation, estimates the 3D locations of the corresponding scene tokens. These estimates are continually refined as more data is accepted from subsequent frames. The outputs from the module 8 are the refined motion parameter estimates together with a list of the relevant 3D locations of scene tokens that are currently visible to the television camera 4. These outputs are transmitted to the base station 2 by means of a transmitter 9 and the low bandwidth radio link 6. The motion data is sent every video frame (at thirty hertz frame rate) and the 3D data is sent at a lower rate (for example, two hertz). A processor for the 3D geometry module 8 employs a parallel processing architecture. The rate of data transmission through the link 6 will be about sixteen kilobits per second.
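The frame-to-frame matching performed by the 3D geometry module is not detailed here; as a simple stand-in, the sketch below associates tokens by nearest-neighbour distance in the image plane. This is a deliberate simplification, and the module described above may use a more elaborate scheme (for example attribute comparison or motion prediction).

```python
import numpy as np

def match_tokens(prev_tokens, curr_tokens, max_distance=8.0):
    """Hypothetical nearest-neighbour matching of image tokens between
    consecutive frames; illustrative only, not the module's actual method."""
    matches = []
    for i, p in enumerate(prev_tokens):
        dists = [np.hypot(p[0] - c[0], p[1] - c[1]) for c in curr_tokens]
        j = int(np.argmin(dists))
        if dists[j] <= max_distance:
            matches.append((i, j))  # token i in previous frame <-> token j now
    return matches
```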
Once the data is received by means of a receiver 11 at the base station 2, the 3D scene tokens are used to synthesise a 3D surface representation of the viewed scene. This is done by means of a 3D surface generator 12. The method for obtaining a points-only set of scene tokens is described in the aforementioned published paper. First, the 3D tokens are reprojected back onto the current image plane, utilising the most recent vehicle motion and attitude data. A Delaunay triangulation is then performed on the image plane through the projected tokens. This triangulation ensures that a corresponding 3D surface is reconstructed that is single valued in depth. The surface may then be visualised by means of an orthogonal grid of contours (formed at the intersections of the surface with an equispaced set of orthogonal vertical planes). Furthermore, potential obstacles and regions unsuitable for driving may be inferred from the 3D surface and highlighted, if required.
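As a concrete illustration of the surface-generation step, the sketch below triangulates the reprojected tokens in the image plane with a Delaunay triangulation; because the triangulation is performed in the image plane, each image position maps to a single depth, which is what makes the reconstructed surface single valued in depth. The use of scipy here is an implementation convenience assumed for the sketch, not part of the described system.

```python
import numpy as np
from scipy.spatial import Delaunay

def build_surface(projected_xy, depths):
    """Triangulate the reprojected scene tokens in the image plane.
    projected_xy: (N, 2) image-plane positions of the 3D tokens.
    depths:       (N,)  corresponding depths from the camera.
    Returns triangle vertex-index triples plus the per-vertex depths; each
    image-plane triangle carries one depth per vertex, so the reconstructed
    surface is single valued in depth, as described above."""
    tri = Delaunay(np.asarray(projected_xy))
    return tri.simplices, np.asarray(depths)
```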
This information is then presented as a dynamic display on the operator console 3. The operator is then able to drive the vehicle by using slave controls and observing the display console. The required vehicle steering and braking demands are transmitted back to the vehicle 1. The provision of 3D information to the operator will enable the nature of the terrain ahead of the vehicle to be assessed and potential obstacles can be located and avoided.
Figure 3 shows the different data transformations that are necessary to obtain the required picture display. The signals from the television camera produce a series of video frames 14 at a rate of thirty per second. The frames are depicted as they are spaced in sequence along the time coordinate 16. From each spaced video frame, the front-end processor acts to produce sets of image tokens 17. The sets of image tokens are then delivered to the 3D geometry module to contribute towards the production of 3D scene tokens 18.
The vehicle 1 is also equipped with auxiliary sensors which provide information on, for example, vehicle speed and attitude. The information 19 from these auxiliary sensors is processed to form a vehicle motion vector 21 which will have characteristics that will similarly vary along the time coordinate. The information in the vehicle motion vector is transmitted by means of the radio link 6 at a rate of thirty frames per second to form a received vehicle motion vector 22 at the base station.
The information 19 from the auxiliary sensors is also delivered to the 3D geometry module to contribute towards the production of the 3D scene tokens 18. These tokens 18 are varied at a rate of thirty frames per second, they are then sampled at a rate of two frames per second and the resulting data is transmitted by means of the radio link 6 to the base station.
The information received at the base station, consisting of the sampled 3D scene tokens 23 at a rate of two frames per second and the received vehicle motion vector 22 at thirty frames per second, is then combined to provide a series of projected image token frames 24. This is done by reprojecting the 3D tokens back onto the current image plane utilising the most recent vehicle motion and attitude data. A Delaunay triangulation 26 is performed on the image plane through the projected tokens. This triangulation ensures that a corresponding 3D surface is reconstructed that is single valued in depth. The surface is then visualised by means of an orthogonal grid of contours to create the 3D surface contours and navigable regions at a rate of thirty frames per second. This contour information 27 can be displayed on the video screen to give a somewhat simplified view of the terrain but one that will still allow the operator to drive the vehicle.
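A minimal pinhole-camera sketch of this reprojection step is given below: the 2 Hz scene tokens are projected into the current image plane using the latest 30 Hz vehicle pose. The rotation/translation representation and the focal length are illustrative assumptions rather than details taken from the embodiment.

```python
import numpy as np

def reproject_tokens(scene_tokens_world, R, t, focal_length=500.0):
    """Project 3D scene tokens (world frame) into the current image plane
    using the latest received vehicle pose (rotation R, translation t).
    Called at video rate (30 Hz) even though the tokens update at 2 Hz."""
    projected = []
    for X in scene_tokens_world:
        Xc = R @ (np.asarray(X) - t)      # transform into camera frame
        if Xc[2] <= 0:                    # behind the camera: skip
            continue
        u = focal_length * Xc[0] / Xc[2]  # simple pinhole projection
        v = focal_length * Xc[1] / Xc[2]
        projected.append((u, v, Xc[2]))   # keep depth for surface contours
    return projected
```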
The foregoing description of an embodiment of the invention has been given by way of example only, and a number of modifications may be made without departing from the scope of the invention as defined in the appended claims. For instance, the rate of change of picture frame data could readily differ from the rates described in this example, namely thirty and two frames per second. Instead of using auxiliary sensors on the vehicle to give data pertaining to the vehicle position and attitude, this information could be obtained in another way, such as by use of the Structure-from-Motion algorithm already mentioned.
The invention is not necessarily restricted to control of a road vehicle, and it could be used for other purposes such as for transmitting sensory feedback information for the control of a remote manipulator or for remote landing of an unmanned aircraft.

Claims

1. A remote operated vehicle control system comprising a remotely controlled vehicle carrying a television camera, the camera and vehicle being connected by a radio link to a separate base station having a radio receiver, in which the video picture for transmission is divided into separate data sets, one part of which exhibits short timescale variations and another part of which exhibits long timescale variations, the two data sets being transmitted at differing video frame rates, and means at the receiver for combining the two transmissions to give a composite picture on a video display of said receiver.
2. A control system as claimed in Claim 1, in which the said video frame rates include a rate of thirty hertz with a second slower rate.
3. A control system as claimed in Claim 1 or 2, in which the picture part having generally static features serves to generate a synthesised scene in the video display.
4. A control system as claimed in Claim 3, in which the selected picture information is processed such that only edge and corner information is used for said slower frame rate transmission.
5. A control system as claimed in any one of Claims 1 to 4, in which the received slower frame rate information is converted to video frame rate for projection into the image plane of the video display.
6. A remote operated vehicle control system substantially as hereinbefore described with reference to any one of the accompanying drawings.
EP89910132A 1988-08-27 1989-08-25 Remote operated vehicle control Withdrawn EP0383902A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB8820417 1988-08-27
GB8820417A GB2222338B (en) 1988-08-27 1988-08-27 Remote operated vehicle control

Publications (1)

Publication Number Publication Date
EP0383902A1 true EP0383902A1 (en) 1990-08-29

Family

ID=10642854

Family Applications (1)

Application Number Title Priority Date Filing Date
EP89910132A Withdrawn EP0383902A1 (en) 1988-08-27 1989-08-25 Remote operated vehicle control

Country Status (4)

Country Link
EP (1) EP0383902A1 (en)
JP (1) JPH03501903A (en)
GB (1) GB2222338B (en)
WO (1) WO1990002370A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IT1249907B (en) * 1991-06-11 1995-03-30 SLOW SCAN REMOTE SURVEILLANCE SYSTEM USING THE MOBILE MOBILE COMMUNICATION SYSTEM.
GB2258114B (en) * 1991-07-26 1995-05-17 Rachel Mary Turner A remote baby monitoring system
EP0648400A1 (en) * 1992-06-29 1995-04-19 BRITISH TELECOMMUNICATIONS public limited company Coding and decoding video signals
FR2700213B1 (en) * 1993-01-05 1995-03-24 Sfim Guide assembly.
FR2725102B1 (en) * 1994-09-27 1996-12-13 M5 Soc REMOTE VIDEO-CONTROL PROCESS OF EQUIPMENT, IN PARTICULAR VEHICLES, AND IMPLEMENTATION DEVICE
FR2743162B1 (en) * 1995-12-27 1998-05-07 Dassault Electronique CONTROL DEVICE FOR SECURING A FAST VEHICLE, IN PARTICULAR GUIDED BY AN OPERATOR ON OR OFF BY THE VEHICLE
SE512171C2 (en) 1997-07-02 2000-02-07 Forskarpatent I Linkoeping Ab video Transmission
GB2382708B (en) 2001-11-21 2006-03-15 Roke Manor Research Detection of foreign objects on surfaces
US9282144B2 (en) 2011-01-14 2016-03-08 Bae Systems Plc Unmanned vehicle selective data transfer system and method thereof

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2519241A1 (en) * 1975-04-30 1976-11-18 Ver Flugtechnische Werke ARRANGEMENT FOR PROCESSING IMAGE INFORMATION
FR2461405A1 (en) * 1979-07-09 1981-01-30 Temime Jean Pierre SYSTEM FOR CODING AND DECODING A DIGITAL VISIOPHONE SIGNAL
DE3018329C2 (en) * 1980-05-10 1982-09-09 Deutsche Forschungs- und Versuchsanstalt für Luft- und Raumfahrt e.V., 5000 Köln Method for the transmission and reproduction of video scenes, in particular aerial photo scenes, with a reduced frame rate
US4405943A (en) * 1981-08-19 1983-09-20 Harris Corporation Low bandwidth closed loop imagery control and communication system for remotely piloted vehicle
US4513317A (en) * 1982-09-28 1985-04-23 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Retinally stabilized differential resolution television display
EP0123616B1 (en) * 1983-04-20 1987-03-04 Nippon Telegraph And Telephone Corporation Interframe coding method and apparatus therefor
CA1277416C (en) * 1984-08-13 1990-12-04 Akihiro Furukawa Inter-frame predictive coding apparatus for video signal
JPS61267476A (en) * 1985-05-22 1986-11-27 Nec Corp Monitor system
JPS6335094A (en) * 1986-07-30 1988-02-15 Nec Corp Moving image signal coding system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO9002370A1 *

Also Published As

Publication number Publication date
GB8820417D0 (en) 1989-03-30
GB2222338B (en) 1992-11-04
GB2222338A (en) 1990-02-28
WO1990002370A1 (en) 1990-03-08
JPH03501903A (en) 1991-04-25

Similar Documents

Publication Publication Date Title
US4855822A (en) Human engineered remote driving system
EP0793392B1 (en) Method and apparatus for the transmission and the reception of three-dimensional television signals of stereoscopic images
US4791478A (en) Position indicating apparatus
US5155683A (en) Vehicle remote guidance with path control
US5621429A (en) Video data display controlling method and video data display processing system
US5495576A (en) Panoramic image based virtual reality/telepresence audio-visual system and method
CA2471569C (en) Method and system for guiding a remote vehicle via lagged communication channel
JP5156571B2 (en) Image processing apparatus and image processing method
EP0735784B1 (en) Three-dimensional image display device
US6498618B2 (en) Dual reality system
JPH05143709A (en) Video effect device
AU6669386A (en) Transmitting system and method using data compression
CA2177051A1 (en) Synthesized stereoscopic imaging system and method
EP0383902A1 (en) Remote operated vehicle control
CN1036491A (en) The processing of sub-sampled signals
WO1995028806B1 (en) Diagnostic method and apparatus
CN108811766A (en) A kind of man-machine interactive fruits and vegetables of greenhouse harvesting robot system and its collecting method
JPH08237765A (en) Method and apparatus for remotely controlling machine, especially vehicle,through image
JPH07271434A (en) Environment map preparing method by plural mobile robots
CN115729351A (en) Interactive processing system based on meta universe
CN114897935A (en) Unmanned aerial vehicle tracking method and system for air target object based on virtual camera
CN110100557A (en) A kind of Cold region apple fruit tree based on AR determines cave-applied fertilizer teleoperation method
US7333156B2 (en) Sequential colour visual telepresence system
CN1049925A (en) The control of Remote Control Vehicle
JP2000152216A (en) Video output system

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 19900503

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH DE FR GB IT LI LU NL SE

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 19920303