GB2096867A - Information relay system

Information relay system

Info

Publication number
GB2096867A
Authority
GB
United Kingdom
Prior art keywords
visual representation
terminal
touch
information
producing means
Prior art date
Legal status
Withdrawn
Application number
GB8111982A
Current Assignee
TOUCH TECHNOLOGY Ltd
Original Assignee
TOUCH TECHNOLOGY Ltd
Priority date
Filing date
Publication date
Application filed by TOUCH TECHNOLOGY Ltd
Priority to GB8111982A
Publication of GB2096867A
Legal status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

An information relay system which comprises a touch video display terminal (12) controlled by a C.P.U. (10), and an electronic camera (18) or like visual representation producing device which supplies a real time, real life picture to the terminal. Thus, by feedback to the C.P.U. from the touch sensor (14) of the terminal, the C.P.U. can provide an output signal containing positional information as to the point of the real world representation which has been touched. This output signal may, for example, control a robotic device acting in the real world being viewed by the camera.

Description

SPECIFICATION
Information relay system
This invention relates generally to an information relay system which may be used to transmit a purely informative signal or, more commonly, a control signal, and more specifically to a system and method of relaying information based on use of a touch video display terminal.
Touch terminals comprise a C.P.U. driven video display wherein, by provision of a suitable contact sensitive matrix at the screen, the position of an applied touch can be sensed and fed to the C.P.U., which in turn generates a signal for display at the terminal which is dependent on the position of touch. One known form of contact sensitive matrix is a matrix of infra-red rays generated along two coordinate directions in a plane parallel to and just in front of the screen; a touch on the screen intercepts at least two intersecting rays, whereby information is fed back to the C.P.U. indicative of the coordinates of the point of touch.
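By way of illustration only, the beam-interruption principle described above might be decoded in software along the following lines (a minimal Python sketch); the beam counts, screen resolution and function name are assumptions made for the example and do not appear in the specification.

# Minimal sketch: decode a touch from an infra-red beam lattice.
def decode_touch(broken_x, broken_y, beams_x=64, beams_y=48, width=640, height=480):
    """Return approximate screen pixel coordinates of a touch, or None if no rays are intercepted.
    broken_x / broken_y are the indices of the interrupted rays along the two coordinate directions."""
    if not broken_x or not broken_y:
        return None  # nothing has intercepted the lattice
    # A fingertip normally interrupts several adjacent rays; take the centre of each group.
    cx = sum(broken_x) / len(broken_x)
    cy = sum(broken_y) / len(broken_y)
    # Scale ray indices to pixel coordinates before feeding them back to the C.P.U.
    return (round(cx * width / beams_x), round(cy * height / beams_y))

# e.g. decode_touch([11, 12], [30]) -> (115, 300)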
The C.P.U. can be programmed to respond to the touch in various ways. Simply, it may feed the positional information back to the terminal to generate a mark or character at the point of touch. Since response can be virtually instantaneous, it is thus possible to "draw" on the screen. Alternatively, for example, the C.P.U. may be called upon to generate a background on the screen, such as a keyboard, and respond to touches on such visual keyboard to perform calculations or the like.
In the case of the first example, the ability to draw on the screen may be useful in various applications, e.g. medicine, to facilitate identification of a point or area of an X-ray photograph which is separately supplied to the terminal.
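A minimal sketch of the visual-keyboard response mentioned above follows (Python; the keypad layout, cell sizes and function name are assumptions made for the example). The drawing response amounts simply to plotting a mark at the coordinates returned by the touch matrix.

# Map a touch at pixel (x, y) onto a 4 x 4 visual keypad drawn from the screen origin.
def key_at(x, y, cols=4, rows=4, cell_w=80, cell_h=60, labels="123+456-789*C0=/"):
    col, row = x // cell_w, y // cell_h
    if 0 <= col < cols and 0 <= row < rows:
        return labels[row * cols + col]  # the visual key under the finger
    return None  # the touch fell outside the displayed keyboard

# e.g. key_at(100, 70) -> '5'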
It is also known to program the C.P.U. to generate a signal which is output to other equipment in response to a touch on the terminal.
Such a procedure has been applied, for example, to control of traffic movement within a conveyance system, wherein an electronic map of the system is displayed on the terminal.
It is an object of this invention to provide a further development in the relaying of information by means of a touch terminal.
In accordance with one aspect of the present invention, there is provided an information relay system which comprises a C.P.U., a touch video display terminal, means for producing a real time, real life visual representation, means coupling said visual representation producing means and said touch terminal for causing said real time, real life visual representation to be displayed, and output means from the C.P.U. for delivering an information output signal having a content jointly related to said visual representation and a touch applied to the terminal at which said visual representation is displayed.
The present invention thus differs from the prior art in making possible a real time coupling between an operator and real life at that time as seen by an electronic camera, e.g. T.V. camera, or other visual representation producing means such as a radar scanner or sonic detector or electron microscope. In the prior art, only "dead" or static information or character information from the C.P.U. has been applied as a touch sensitive representation on the screen.
Practical application of the invention will usually involve a device driven by the output signal responsively to the information content thereof.
Said driven device may simply be a remote display or information store. For example, a terminal receiving a real time picture of a congested area of traffic may be touched to relay an appropriately marked picture to a remote display visible to a traffic controller.
More commonly, however, the driven device will be a robotic device which is actuated in response to a touch on the screen. For example, if the camera continuously scans a range of identifiable control switches, any one of these switches may be actuated as required by touching the real life visual representation of the switch when it appears on the screen.
Again, the said robotic device may be coupled to the camera or other visual representation producing means, thereby to form a closed loop control system via the operator. The coupling may be indirect. For example, if a camera pointing out of an airborne aircraft provides a continuous real time, real life picture to a touch terminal on the ground, say with the camera axis aligned with the picture centre, an off-centre touch may be processed by the C.P.U. to provide signals transmitted to the aircraft to steer the latter so that the camera is re-centred on the touched off-axis point, which again becomes the centre of the real time, real life picture on the screen once the steering manoeuvre has been completed. Clearly, this control method can be a continuous one, enabling the ground controller to steer the aircraft dependent on the real time, real life picture on the screen, which will in general correspond with the view as seen from the pilot's cabin.
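A sketch of how the ground C.P.U. might turn such an off-centre touch into steering corrections is given below; the gain, sign conventions and function name are assumptions made for the example, not values from the specification.

# Convert an off-centre touch into heading and pitch corrections (degrees),
# assuming the camera axis is aligned with the picture centre.
def steering_correction(touch_x, touch_y, width=640, height=480, gain_deg=0.05):
    dx = touch_x - width / 2    # positive: touched point lies to the right of centre
    dy = touch_y - height / 2   # positive: touched point lies below centre
    return (gain_deg * dx, -gain_deg * dy)  # (turn right, pitch up)

# Repeating this for each new touch closes the loop through the operator: once the
# manoeuvre is complete, the touched point becomes the new centre of the picture.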
In general therefore, in addition to its use for substantially instantaneous information conveyance, the present invention enables a real time coupling between an operator and robotics which may be relatively adjacent or remote, i.e. a substantially instantaneous coaction between the operator and real life robotics.
Thus, according to another aspect of the invention, there is provided a method of relaying information wherein a touch video display terminal is coupled to a visual representation producing means to produce on the display a real time, real life visual representation, and a touch is applied to said terminal to cause a controlling C.P.U. to produce an information output signal having a content dependent on the position of contact of said touch with said visual representation.
The invention will now be exemplified with reference to the accompanying drawings, in which:
Figure 1 is a block diagram of the basic components of a system in accordance with the invention;
Figure 2 is a circuit diagram of a C.P.U.-camera interface; and
Figure 3 is a circuit diagram of a C.P.U. and camera-touch terminal interface.
In Figure 1, the reference 10 denotes a C.P.U. which has an output to a touch video display terminal 12. The touch terminal 12 is conventional, having a touch sensitive matrix, diagrammatically indicated at 14, formed by a lattice of infra-red rays in two coordinate directions. A touch, as by the finger, to the screen 16 of the terminal intercepts the lattice of infra-red rays to feed back to the C.P.U. 10 a signal which contains the coordinates of the point of touch.
An electronic camera 18, such as a T.V. camera, continuously supplies picture information to the terminal 12 to cause a real time, real life picture to be displayed on the screen 16.
As the terminal 12 also receives drive and character producing information from the C.P.U. 10, the inputs to the terminal are buffered through a video mixer 20 shown in Figure 3. This circuit is a substantially conventional circuit for mixing two video frequency signals while isolating the two circuits from which the respective signals are supplied, namely the camera output circuit and the C.P.U. output circuit. In Figure 3, reference 22 denotes the camera signal input, reference 24 denotes the C.P.U. character signal input and reference 26 designates the output to the video board of the terminal 12.
The camera 18 is driven by the C.P.U. 10, and receives its drive through an interface 28 shown in Figure 2. This circuit is a substantially conventional circuit acting to synchronise the frame and line scan of the camera with the frame and line scan drive from the C.P.U. 10 to the terminal 12. The interface circuit 28 is based on a microchip 30, known by the designation 74LS221, which in effect constitutes a pulse cleaner. The input from the C.P.U. 10 on lines HS, VS may take the form of pulses of various shapes, amplitudes and durations. The pulse cleaner 30 ensures that the frame and line drive pulses to the camera are drive pulses of the correct shape, amplitude and duration. In Figure 2, the outputs to the green and yellow input drive circuits of the camera are designated G and Y.
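The pulse-cleaning action can be modelled in software as the re-timing of each trigger edge to a fixed width, as sketched below (the 74LS221 is a dual monostable multivibrator, i.e. a fixed-width one-shot); the nominal width used here is an illustrative assumption, not a value given in the specification.

# Software model of the pulse cleaner: every trigger edge, whatever the shape or
# duration of the incoming pulse, is re-issued as a pulse of fixed width.
def clean_pulses(trigger_times_us, pulse_width_us=5):
    return [(t, t + pulse_width_us) for t in sorted(trigger_times_us)]

# e.g. clean_pulses([0, 64, 128]) -> [(0, 5), (64, 69), (128, 133)]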
The operation of the system shown in Figure 1 will now be described. The screen 16 of the terminal 12 carries a real time, real life picture of the view seen by the camera 18. Thus, when the screen 16 is touched, the signal fed to the C.P.U. 10 contains information indicative of the position of the touched point of the screen, which corresponds to a touched point of the real time, real life picture. Assuming the camera axis is centred on the screen, this information is directly representative of a real time, real life point in the view currently seen by the camera. The action of the operator in touching the terminal thus provides a substantially instantaneous contact between the operator and the real world. This instantaneous contact can be developed to provide an instantaneous coaction between the operator and the real world. For this purpose, the C.P.U. 10 provides an output on line 32 containing information relating to the point of the real time, real life picture which has been touched.
This output 32 may simply feed a remote display, but more usually will be used to control a robotic device (the word "robotic" being used in its broadest sense to indicate any self-powered device which is capable of being controlled or actuated from a remote point). This robotic device can then be actuated to perform a function in relation to the point of the real life scene which has been designated by touching that point on the real time, real life visual representation on the screen of the terminal. This robotic actuation may be repetitive or continuous, as instanced by the previously described example of aircraft control. It will also be appreciated, again as instanced by the said example, that the coupling between the camera and the C.P.U., and likewise between the C.P.U. and the robotic device, may be a radio link or the like. Continuous control may involve a coupling, direct or indirect, between the camera and the robotic device, so that a function, such as a steering manoeuvre, performed by the robotic device results in a repositioning of the camera to restore a zero or base condition at the touch terminal, enabling performance of the function to be continued.
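Tying these elements together, the flow from touch feedback to output line 32 could be sketched as follows; the message format and function names are assumptions made for the example and are not prescribed by the specification.

# For each touched picture point fed back by the matrix 14, deliver an output signal
# on line 32 whose content is the position of that point in the real time picture.
def relay(touch_events, send):
    for x, y in touch_events:                   # pixel coordinates of touched points
        send({"picture_x": x, "picture_y": y})  # to a remote display, store or robotic device

# e.g. relay([(320, 120), (402, 255)], print)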
It will be appreciated that the system and method described with reference to the drawings may be modified in various ways within the scope of the invention as set forth in the appended claims. In particular, the electronic camera may be replaced by another visual representation producing device, such as a radar scanner, sonic detector or electron microscope, depending on the application to which the system is put. Also, various interfaces may be employed instead of the circuits exemplified by Figures 2 and 3.

Claims (13)

Claims
1. An information relay system which comprises a C.P.U., a touch video display terminal, means for producing a real time, real life visual representation, means coupling said visual representation producing means and said touch terminal for causing said real time, real life visual representation to be displayed, and output means from the C.P.U. for delivering an information output signal having a content jointly related to said visual representation and a touch applied to the terminal at which said visual representation is displayed.
2. A system according to claim 1, including a device driven by the output signal responsively to the information content thereof.
3. A system according to claim 2, wherein said driven device is coupled to said visual representation producing means, thereby to form a closed loop control system via the operator.
4. A system according to any of claims 1 to 3, wherein said visual representation producing means is an electronic camera.
5. A system according to claim 4 when appendant to claim 3, wherein said driven device acts to adjust the direction in which the camera is pointing, whereby the system constitutes a continuously operable position controller.
6. A system according to claim 2, wherein said driven device is a remote information store or information display.
7. A system according to any one of claims 1 to 6, wherein said coupling means is a radio link.
8. A system according to claim 2 or claims appendant thereto, wherein said driven device receives said output signal over a radio link.
9. A system according to any of claims 1 to 8, including an interface between the C.P.U. and the visual representation producing means which comprises a frame and line synchronisation drive circuit.
10. A system according to any of claims 1 to 9, including an interface between the C.P.U. and the visual representation producing means and the touch terminal, which interface includes a signal mixer for mixing the input from the visual representation producing means with the input from the C.P.U. to said terminal while buffering said visual representation producing means from said C.P.U.
11. A method of relaying information wherein a touch video display terminal is coupled to a visual representation producing means to produce on the display a real time, real life visual representation, and a touch is applied to said terminal to cause a controlling C.P.U. to produce an information output signal having a content dependent on the position of contact of said touch with said visual representation.
12. An information relay system substantially as hereinbefore described with reference to the accompanying drawings.
13. A method of relaying information substantially as hereinbefore described.
GB8111982A 1981-04-15 1981-04-15 Information relay system Withdrawn GB2096867A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB8111982A GB2096867A (en) 1981-04-15 1981-04-15 Information relay system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB8111982A GB2096867A (en) 1981-04-15 1981-04-15 Information relay system

Publications (1)

Publication Number Publication Date
GB2096867A (en) 1982-10-20

Family

ID=10521195

Family Applications (1)

Application Number Title Priority Date Filing Date
GB8111982A Withdrawn GB2096867A (en) 1981-04-15 1981-04-15 Information relay system

Country Status (1)

Country Link
GB (1) GB2096867A (en)


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2537748A1 (en) * 1982-12-14 1984-06-15 Video Prestations Sa GENERATOR OF ELECTRICAL ORDERS BY MANUAL INTERRUPTION OF LIGHT RADIATION
EP0112253A1 (en) * 1982-12-14 1984-06-27 Video Prestations S.A. Electrical instructions generators by manual interruption of a light beam
GB2143975A (en) * 1983-06-18 1985-02-20 Brown David F Computerised interactive advertising and information display system
EP0139941A1 (en) * 1983-08-24 1985-05-08 Siemens Aktiengesellschaft X-ray diagnosis device having a patient's couch and a primary radiation diaphragm
GB2154109A (en) * 1984-01-27 1985-08-29 Hitachi Shipbuilding Eng Co Ship collision preventive aid apparatus
FR2623997A1 (en) * 1987-12-04 1989-06-09 Cgr Ultrasonic Medical imaging device comprising means for displaying digitised images
EP0700203A1 (en) * 1994-09-05 1996-03-06 Sony Corporation Video signal recording apparatus using a touchscreen
CN1058593C (en) * 1994-09-05 2000-11-15 索尼公司 Video apparatus


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)