EP0917699A1 - Sensory communication apparatus - Google Patents

Sensory communication apparatus

Info

Publication number
EP0917699A1
Authority
EP
European Patent Office
Prior art keywords
output
pins
processor
data
central processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP98900916A
Other languages
German (de)
French (fr)
Inventor
John Christian Doughty Nissen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Publication of EP0917699A1
Status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00 Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001 Teaching or communicating with blind persons
    • G09B21/003 Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
    • G09B21/008 Teaching or communicating with blind persons using visual presentation of the information for the partially sighted

Abstract

Tactile communication apparatus comprising a central processing unit (10) having a data store (11), an output processor (12), means for controlling the output of data from the data store (11) to the output processor (12), and output means (16) connected to receive data from the processor, the output means having tactile sensor units responsive to the output data whereby the output of the computer can be determined by touch.

Description

SENSORY COMMUNICATION APPARATUS
This invention is concerned with sensory communication apparatus, and in particular with such apparatus for the dynamic display to a blind person of textual and graphical information.
Maps are used for three main purposes, namely for education, such as the study of geography, for planning a journey, and for navigation during a journey.
The tasks of exploration are similar in the three cases and are essentially to discover the locations, names and characteristics of, and relationships between, real-world objects or features, both natural and man-made. The goal of exploration may be to establish the layout of a place, or more specifically to find a suitable route from one place to another.
There have been a number of developments in making maps accessible for blind people. Firstly there are the maps which are purely tactile and in which writing is in Braille, as raised dots, and texture is used as a substitute for colour. Such maps are "crude" in that there can be little detail, and resolution is low. Thus only a limited amount of information can be presented for a given size of map - several orders of magnitude less than a conventional map used by sighted people. There are problems in producing tactile maps, though there have been technical developments to improve the situation. The cost of reproduction (i.e. the per copy cost) is still high.
Similar problems arise in the presentation of other multidimensional information including graphs, charts, block diagrams, tables and matrices. Even with a simple table, the single-line display of a conventional dynamic Braille apparatus can lead to confusion because the line crosses the columns. There have been various attempts to get away from a "hard-copy" approach and substitute a dynamic tactile display, using a physical surface which can be altered under control of a computer. These displays are expensive because the surface is composed of a large array of movable elements, but the cost of map reproduction (i.e. producing copies of a map to distribute) is negligible since the maps are in electronic form.
It is an object of the present invention to obviate or mitigate these difficulties.
The present invention is sensory communication apparatus comprising a central processing unit having a data store, an output processor, means for controlling the output of data from the data store to the output processor, and output means connected to receive data from the processor, the output means being responsive to the output data whereby the output of the computer can be determined by one or more of the user's senses of touch, sound or sight.
The means for controlling the output of the data store may comprise a pointing device.
Preferably the output means includes tactile elements in the form of pins, each having associated with it an electromechanical transducer.
Preferably the pins are also associated with respective switches which are connected through an input processor to the central processing unit.
The pins may be arranged in pairs each for simultaneous contact by a respective finger and each pin of each pair can be vibrated at different frequencies or with different pulse lengths. The output means may be operable in either a character mode or a surface mode.
Preferably the patterns in which the pins are actuated are related to text characters.
The apparatus may include a speech synthesiser responsive to data output from the central processing unit.
Preferably a keyboard or keypad is connected through an input processor to the central processing unit.
The keyboard or keypad may be operable in a character mode or a control mode.
Graphical information may be held in the data store of the central processing unit as a virtual surface, over which moves a notional cursor. The data output reflects the graphical information under the cursor, and the relative position of graphical objects on the surface. A selected graphical object can be described to the user in text, which is output using the same tactile device, or using audio (speech) or visual display.
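By way of illustration only, the virtual surface and notional cursor might be modelled in software along the following lines (a minimal sketch in Python; the class and method names are illustrative and do not appear in the patent):

```python
from dataclasses import dataclass

@dataclass
class MapObject:
    name: str            # e.g. "bus stop"
    kind: str            # "point", "line" or "area"
    points: list         # (x, y) vertices defining the object
    description: str     # text for tactile, speech or visual output

class VirtualSurface:
    """Graphical information held in the data store as a virtual
    surface, over which a notional cursor moves."""
    def __init__(self, objects):
        self.objects = objects
        self.cursor = (0.0, 0.0)

    def move_cursor(self, dx, dy):
        x, y = self.cursor
        self.cursor = (x + dx, y + dy)

    def object_under_cursor(self, radius=1.0):
        """Return the first object within `radius` of the cursor; its
        description can then be output as text, speech or display."""
        cx, cy = self.cursor
        for obj in self.objects:
            if any((px - cx) ** 2 + (py - cy) ** 2 <= radius ** 2
                   for px, py in obj.points):
                return obj
        return None
```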
Embodiments of the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:-
Fig.1 is a block circuit diagram of an embodiment of the present invention;
Fig.2 is a more detailed circuit diagram of part of Fig.1;
Fig.3 is a more detailed circuit diagram of part of Fig.2;
Fig.4 is a side elevation of an input/output sensor;
Fig.5 is a plan view of a tactile input/output unit; and
Fig.6 is a side view of the tactile input/output unit of Fig.5.
Referring now to Fig.1, tactile communication apparatus according to an embodiment of the present invention comprises a central processing unit 10 having a data store 11 and being connected through an output processor 12 and an input processor 14 to a number of tactile sensors 16 mounted on an input/output unit 18 (Fig.5). The input and output processors need not, of course, be separate units but may be provided in a single integrated circuit.
The sensors 16 of this embodiment are shown in Fig.4 and it should be noted that each sensor 16 is an input/ output device that both generates a mechanical movement in response to output signals received from the output processor 12 and generates signals in response to a mechanical input, the signals being passed to the central processing unit 10 through the input processor 14.
Referring now to Fig.4, each sensor 16 comprises a pair of pins 20, a pair of electro-mechanical transducers, in this embodiment piezo strips 22, and a pair of membrane switches 24 mounted on a printed circuit board 26 supported from the top plate 28 of the unit 18. The strips 22 are clamped at one end and mount the pins 20 at their other end, the pins 20 projecting upwardly through the top plate 28. The switches 24 are each mounted beneath a respective pin 20 to be closed by the piezo strip when the pin is depressed.
In Fig.2 the output processor 12 is shown to comprise, connected in cascade, a level translator 34, a pulse width modulation generator 36, a high voltage digital amplifier 38 and a number of low pass filters 40 each connected to a respective piezo strip 22 in a sensor 16. In this embodiment twelve low pass filters 40 are provided. The output processor is powered from a low voltage source, preferably a battery, connected to a step-up voltage convertor 42, the high voltage output of which powers the processor more efficiently than would a low voltage.
The details of the pulse width modulation generator 36 are shown in Fig.3. The generator 36 consists of four sources that may be mapped onto any combination of sixteen outputs. The sources are controlled by a control unit 50 which interprets the commands and data incoming from the computer 10 via the level translator 34 and controls the various aspects of pulse generation.
Each of the four sources generates an arbitrary waveform at a number of different amplitudes and frequencies and comprises a phase accumulator 52a, 52b, 52c or 52d, which is set to step through a respective look-up table 54a, 54b, 54c or 54d, at a configurable rate, whereby the frequency can be varied by altering the step and the waveform can be altered by using a number of different look-up tables. The output from the look-up table is then scaled by a respective scaler 56a, 56b, 56c or 56d, according to the desired amplitude.
The four current samples to be output are passed to their respective pulse width generators 58a-58d, which create pulses whose widths are proportional to the desired value.
Any of the generators can be fed to any of the outputs. This is achieved by preloading a mask register 60a, 60b, 60c or 60d with the output pattern for the respective pulse width generator.
When the generator is generating a pulse, this output word is fed through. When the generator is not generating a pulse, the output mask is gated off and therefore does not play a part in the output.
The sixteen-bit outputs of the four generators are mixed together using an exclusive-OR gate 66. This results in a true mixing of the four sources once the resultant waveform has passed through the low pass filtering.
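The signal path of Figs.2 and 3 might be modelled in software roughly as follows (a sketch only; the table size, the sine look-up and the details of the timing loop are assumptions, as the patent does not specify them):

```python
import math

TABLE_SIZE = 256
SINE_TABLE = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

class Source:
    """One of the four sources: a phase accumulator (52a-52d) stepping
    through a look-up table (54a-54d), scaled (56a-56d) to the desired
    amplitude. The step size sets the frequency; swapping the table
    changes the waveform."""
    def __init__(self, step, amplitude, mask, table=SINE_TABLE):
        self.phase = 0                # phase accumulator
        self.step = step              # integer step through the table
        self.amplitude = amplitude    # 0.0 .. 1.0
        self.mask = mask              # 16-bit output pattern (60a-60d)
        self.table = table

    def sample(self):
        self.phase = (self.phase + self.step) % TABLE_SIZE
        return self.amplitude * self.table[self.phase]

def output_word(sources, t, period=1.0):
    """Pulse width generation (58a-58d) and mixing: each source's pulse
    width is proportional to its current sample; while a source is in
    its pulse its mask is fed through, and the masks are combined with
    exclusive-OR (gate 66) into one 16-bit word for the pin drivers."""
    word = 0
    for source in sources:
        width = abs(source.sample()) * period
        if (t % period) < width:
            word ^= source.mask
    return word
```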
In embodiments allowing access to maps or other graphical information, the array of transducers is mounted on, or is otherwise associated with, a pointing device, such as a computer mouse or a touch sensitive tablet. As the mouse moves, or a depression moves across the tablet, the window is correspondingly moved over the virtual surface representing the map or graphical image.
To allow a map to be scaleable over a wide range of scales, the information can be stored at various levels of detail, with the greater detail suppressed for smaller-scale presentation. Thus for example, on a large scale the map may show the exact shapes of buildings, on a smaller scale buildings may be represented as simple rectangles, and on a smaller scale still the buildings may be merged into a single object representing a built-up area. In the current invention, the user can zoom in and out at will, which compensates for the small size of the window.
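A level-of-detail scheme of this kind might be sketched as follows (the three levels and the zoom thresholds are illustrative assumptions, not values from the patent):

```python
# Levels of detail, from most zoomed-in to most zoomed-out; each entry
# is (minimum zoom factor, representation stored at that level).
DETAIL_LEVELS = [
    (8.0, "exact building outlines"),
    (2.0, "buildings as simple rectangles"),
    (0.0, "buildings merged into a built-up area"),
]

def representation_for(zoom):
    """Select the stored level of detail for the current zoom factor,
    so that greater detail is suppressed at smaller scales."""
    for minimum_zoom, representation in DETAIL_LEVELS:
        if zoom >= minimum_zoom:
            return representation
    return DETAIL_LEVELS[-1][1]
```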
In Fig.5 is shown the plan view of a tactile input/output unit 18 which is similar to a computer mouse in that it has a mouse ball 88 (Fig.6) on its underside and can therefore serve as a pointing device. In addition the unit 18 is provided with four tactile sensors (70, 72, 74 and 76) each having a pair of pins 20 projecting through its upper surface, and with two further sensors each having two pairs of pins 84, 86, as seen in Fig.6, projecting through its sides 78 and 80. The sixteen pins 20 have respective associated transducers in the form of piezo strips 22, each connected to a respective one of the low pass filters 40 of Fig.2.
The unit is, in use, held in the hand with the four fingers engaging the four pairs of pins at its upper surface and the thumb engaging one of the two pairs of pins at the sides depending upon which hand is holding the unit.
In embodiments of the invention there are two modes of operation, namely character and surface.
In character mode, output is by characters, each a pattern on the pins formed as follows. Each pair of pins has three states determined by the frequency of vibration of the piezo strips; in the first state one of the strips vibrates at a low frequency (20 Hz), in the second state both piezo strips vibrate at the low frequency and in the third state both piezos vibrate at a higher frequency (200 Hz). With one of the four finger pairs on there are three times four states, i.e. twelve states. With one of the two thumb pairs on there are six states. The alphabet is coded by patterns comprising either a single finger state, or a single finger state combined, simultaneously or sequentially, with a single thumb state.
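The state arithmetic just described can be tabulated as in the sketch below; the patent does not give the actual assignment of letters to patterns, so the mapping in `code_for` is purely illustrative:

```python
LOW_HZ, HIGH_HZ = 20, 200

# The three states of a pin pair.
STATES = (
    ("one pin", LOW_HZ),      # one strip vibrates at 20 Hz
    ("both pins", LOW_HZ),    # both strips vibrate at 20 Hz
    ("both pins", HIGH_HZ),   # both strips vibrate at 200 Hz
)

# 4 finger pairs x 3 states = 12 finger states;
# 2 thumb pairs x 3 states = 6 thumb states.
FINGER_STATES = [(f, s) for f in range(4) for s in STATES]
THUMB_STATES = [(t, s) for t in range(2) for s in STATES]

def code_for(letter):
    """Illustrative coding: 'a'..'l' map to the twelve single finger
    states; the remaining letters combine a finger state with a thumb
    state, as the description allows."""
    i = ord(letter.lower()) - ord("a")
    if i < 12:
        return (FINGER_STATES[i],)
    i -= 12
    return (FINGER_STATES[i % 12], THUMB_STATES[i // 12])
```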
In surface mode, the output corresponds to the virtual surface under the fingers. The piezo strips vibrate according to an algorithm based on frequency and distance from vectors forming virtual objects on the surface.
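One plausible form of such an algorithm is sketched below: the vibration frequency is highest on a vector and falls off with distance from it. The particular distance-to-frequency law and its constants are assumptions:

```python
import math

def distance_to_segment(p, a, b):
    """Shortest distance from point p to the line segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    length_sq = dx * dx + dy * dy
    if length_sq == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / length_sq))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def vibration_frequency(finger_pos, vectors, f_max=200.0, falloff=10.0):
    """Frequency for the piezo strip under a finger: full frequency on
    the nearest vector, decreasing linearly to zero at `falloff`."""
    d = min(distance_to_segment(finger_pos, a, b) for a, b in vectors)
    return f_max * max(0.0, 1.0 - d / falloff)
```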
In a modified embodiment the states of the pins are distinguished by the length of the vibration pulses as well as or instead of by the frequency of vibration. Furthermore, embodiments of the invention have two input modes, namely character and control.
In character mode the input is via keys, i.e. the pins 20, acting on the input switches 24 mounted under the piezo strips. Patterns of input can be produced to correspond to patterns of output. Three input stimuli are possible per finger, corresponding to the three output states per finger: one pin/piezo depressed, the other depressed, and both depressed.
In control mode, input is again via the keys and switches 24. A single keystroke is used for simple commands such as Next, Previous, Up, Down, Enter and Leave. These commands are used for navigating in information space, typically for exploring a document hierarchy, and for editing. The same commands are used at all levels in the structure of information space, which is basically organised as a tree with hyperlinks.
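A sketch of this navigation scheme follows; the nested-tuple tree representation and the exact effect of each keystroke are assumptions for illustration (hyperlinks are omitted):

```python
class Navigator:
    """Single-keystroke navigation over information space organised as
    a tree. A node is (title, children)."""
    def __init__(self, root):
        self.path = [([root], 0)]          # stack of (sibling list, index)

    def current(self):
        siblings, i = self.path[-1]
        return siblings[i]

    def key(self, command):
        siblings, i = self.path[-1]
        _, children = self.current()
        if command == "Next" and i + 1 < len(siblings):
            self.path[-1] = (siblings, i + 1)
        elif command == "Previous" and i > 0:
            self.path[-1] = (siblings, i - 1)
        elif command in ("Down", "Enter") and children:
            self.path.append((children, 0))
        elif command in ("Up", "Leave") and len(self.path) > 1:
            self.path.pop()
        return self.current()[0]           # title for speech or tactile output

# Example: exploring a small document hierarchy.
doc = ("Report", [("Chapter 1", []), ("Chapter 2", [("Section 2.1", [])])])
nav = Navigator(doc)
nav.key("Enter")      # -> "Chapter 1"
nav.key("Next")       # -> "Chapter 2"
nav.key("Enter")      # -> "Section 2.1"
nav.key("Leave")      # -> "Chapter 2"
```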
The patterns of activation of the pins can be used to give a direction, e.g. a compass bearing or the direction of an object from the cursor position. The patterns can also be used to indicate what is under the cursor, or in the immediate vicinity. By selecting objects, feeling the patterns and moving the pointer, the user is able to explore a map, or other graphical image, represented on the virtual surface.
One form of the input/output unit has six pins with the associated transducers arranged as a hexagon about a seventh central pin. The pins are used to guide the hand of the user holding the tactile input/output unit in the direction corresponding to the direction of the graphical object from the cursor on the virtual surface. This allows the user to explore the surface for objects which the user has selected. Means are provided for selecting a single object or a group of objects with shared characteristics. With this particular input/output unit, the method of finding a particular object and its shape is as follows.
Consider first a point object such as a bus stop. While the window is not over the object, the pin or pair of pins closest to the object is activated periodically, with a period proportional to distance. The user can then move the window towards the object, and the frequency of activation increases as the object is approached. When the window is directly over the point, the central pin is activated.
Next consider a line object such as the centre line of a pavement along a street. While the window is not over the object, again the pin or pair of pins closest to the object is activated periodically. When the line is reached the pins over the line are activated. The user can then follow the line. While exactly over the line, the central pin is operated.
Now consider objects which have an area (i.e. are not points or lines). The same procedure is followed to find the edge of the object. However if the window is moved inside the area, the central pin is continuously activated, and the pins nearest to the nearest edge are periodically activated.
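For a point object this guidance rule might be sketched as follows (the pin numbering and the distance-to-period constant are assumptions; line and area objects follow the same scheme with the additional rules above):

```python
import math

def nearest_pin(window, target):
    """Six pins arranged as a hexagon about a seventh central pin:
    return the index (0-5) of the pin whose direction best matches the
    bearing from the window position to the target."""
    dx, dy = target[0] - window[0], target[1] - window[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360   # 0 = "north"
    return round(bearing / 60.0) % 6

def point_guidance(window, obj, over_radius=1.0, d_per_second=50.0):
    """While not over the object, pulse the nearest pin with a period
    proportional to distance, so pulses quicken as the object is
    approached; directly over the point, activate the central pin."""
    d = math.hypot(obj[0] - window[0], obj[1] - window[1])
    if d <= over_radius:
        return "centre", 0.0
    return nearest_pin(window, obj), d / d_per_second  # (pin, period in s)
```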
A particularly useful aspect of the invention is the ability of the user to feel the input as it is being drawn. Thus a person can input a line onto a map by moving the pointing device and then immediately feel the line using the tactile means.
For examining the relationship and shapes of different areas of the map, different areas are allocated different "colours" on the virtual surface, and each pin reacts to a different colour as the cursor is moved over the surface. The user can thus scan the surface and detect the position of the different coloured areas. This is particularly efficient as, by the four colour theorem, no map requires more than four colours to distinguish adjacent areas.
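This colour scanning might be realised as in the sketch below, where the surface is assumed to be a grid of cells each carrying one of four colour codes:

```python
# Each of four pins is tied to one "colour" on the virtual surface;
# the pin names here are illustrative.
PIN_FOR_COLOUR = {0: "index", 1: "middle", 2: "ring", 3: "little"}

def pin_active(surface, cursor):
    """Return the pin to vibrate for the coloured area under the
    cursor, or None over uncoloured ground. `surface` maps (x, y)
    cells to colour codes 0-3."""
    colour = surface.get(cursor)
    return PIN_FOR_COLOUR.get(colour)

# Example: a tiny surface with two coloured regions.
surface = {(0, 0): 0, (1, 0): 0, (2, 0): 1}
print(pin_active(surface, (2, 0)))   # -> "middle"
```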
Sounds may be produced corresponding to the pin activations described above. Embodiments of the invention may have an audio output in which the sounds corresponding to pin activation are combined with speech output of text information.
While the main outputs are tactile or audio, a visual output may also be provided as, for example, a word-by-word visual display of text with large characters, where the display or highlighting of each word is synchronised or partially synchronised with the speech output of the word such that the word is spoken immediately before, immediately after, or simultaneously with the display of that word. The word-by-word facility is useful for partially sighted people as the visual display can handle words in large characters, up to the width of the screen. The word display or highlighting can be centred, such that the person can retain focus on one point on the screen while the words are displayed sequentially. This permits rapid reading, since the eyes' saccadic movements are eliminated, and the time spent in backtracking and re-reading is avoided. The synchronism with speech reinforces the association between the written and spoken word, which is helpful for language learners and dyslexic readers.
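The synchronisation described might be sketched as follows; `speak` and `display` stand for the speech synthesiser and the centred large-character display, and the offset and dwell values are illustrative assumptions:

```python
import time

def present(words, speak, display, speech_offset=0.0, dwell=0.4):
    """Word-by-word presentation: each word is displayed centred in
    large characters and spoken immediately before (offset < 0), after
    (offset > 0) or together with (offset == 0) its display."""
    for word in words:
        if speech_offset < 0:
            speak(word)
            time.sleep(-speech_offset)
        display(word)
        if speech_offset > 0:
            time.sleep(speech_offset)
            speak(word)
        elif speech_offset == 0:
            speak(word)
        time.sleep(dwell)   # reading dwell before the next word
```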
It should be understood that the output unit of the present invention may rest on a surface, or may be clipped to a part of a wearer's apparel, such as a belt, or may be strapped to and operated by one hand.
Moreover the central processing unit may be remote from the input/output unit and may even be accessed by telephone and modem.
In a further modification of the input/output units previously described, the units are provided with a transmitter, for example an infra-red or a radio transmitter, or a transmitter and receiver, providing a remote control link with an appliance or system such as a television, a kiosk or an automated teller machine, thus enabling the user to control the appliance or system using a dedicated pattern of pin or key operation and to receive, from an audio or tactile output of the central processing unit, confirmation of the control command.

Claims

1. Sensory communication apparatus comprising a central processing unit having a data store, an output processor, means for controlling the output of data from the data store to the output processor, and output means connected to receive data from the processor, the output means being responsive to the output data whereby the output of the computer can be determined by one or more of the user's senses of touch, sound or sight.
2. Apparatus as claimed in claim 1, in which the means for controlling the output of the data store comprises a pointing device.
3. Apparatus as claimed in claim 1 or claim 2, in which the output means includes tactile elements in the form of pins, each having associated with it an electro-mechanical transducer.
4. Apparatus as claimed in claim 3, in which the pins are also associated with respective switches which are connected through an input processor to the central processing unit.
5. Apparatus as claimed in claim 3 or claim 4, in which the pins are arranged in pairs each for simultaneous contact by a respective finger and each pin of each pair can be vibrated at different frequencies or with different pulse lengths.
6. Apparatus as claimed in any of claims 3 to 5, in which the output means is operable in either a character mode or a surface mode.
7. Apparatus as claimed in any of claims 3 to 6, in which the patterns in which the pins are actuated are related to text characters.
8. Apparatus as claimed in any preceding claim, including a speech synthesiser responsive to data output from the central processing unit.
9. Apparatus as claimed in any of claims 1 to 3, or any of claims 5 to 8 when independent of claim 4, in which a keyboard or keypad is connected through an input processor to the central processing unit.
10. Apparatus as claimed in claim 9, in which the keyboard or keypad is operable in a character mode or a control mode.
EP98900916A 1997-01-20 1998-01-19 Sensory communication apparatus Withdrawn EP0917699A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
GBGB9701102.7A GB9701102D0 (en) 1997-01-20 1997-01-20 Tactile system for dynamic display of textual and graphical information with audio option
GB9701102 1997-01-20
PCT/GB1998/000162 WO1998032112A1 (en) 1997-01-20 1998-01-19 Sensory communication apparatus

Publications (1)

Publication Number Publication Date
EP0917699A1 (en) 1999-05-26

Family

ID=10806274

Family Applications (1)

Application Number Title Priority Date Filing Date
EP98900916A Withdrawn EP0917699A1 (en) 1997-01-20 1998-01-19 Sensory communication apparatus

Country Status (5)

Country Link
EP (1) EP0917699A1 (en)
AU (1) AU5672398A (en)
CA (1) CA2249415A1 (en)
GB (1) GB9701102D0 (en)
WO (1) WO1998032112A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2790578A1 (en) * 1999-03-02 2000-09-08 Philippe Soulie Sign language for communicating with another person or electronic calculator has series of main keys on side not facing user and reading keys transmitting tactile sensations to fingers of user
FR2790567B1 (en) * 1999-03-02 2001-05-25 Philippe Soulie KEYBOARD FOR TOUCH READING INFORMATION FROM AN ELECTRONIC COMPUTER
CA2271416A1 (en) * 1999-05-10 2000-11-10 Vincent Hayward Electro-mechanical transducer suitable for tactile display and article conveyance
US6693622B1 (en) 1999-07-01 2004-02-17 Immersion Corporation Vibrotactile haptic feedback devices
DE20080209U1 (en) 1999-09-28 2001-08-09 Immersion Corp Control of haptic sensations for interface devices with vibrotactile feedback
GB2358514A (en) * 2000-01-21 2001-07-25 Peter Nigel Bellamy Electronic braille reader
US6445284B1 (en) 2000-05-10 2002-09-03 Juan Manuel Cruz-Hernandez Electro-mechanical transducer suitable for tactile display and article conveyance
JP2002055600A (en) 2000-08-09 2002-02-20 Laurel Seiki Kk Information input and output device for sight handicapped person
US9625905B2 (en) 2001-03-30 2017-04-18 Immersion Corporation Haptic remote control for toys
US9245428B2 (en) 2012-08-02 2016-01-26 Immersion Corporation Systems and methods for haptic remote control gaming

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8801309D0 (en) * 1988-01-21 1988-02-17 British Telecomm Electronic vibrational display
JP3086069B2 (en) * 1992-06-16 2000-09-11 キヤノン株式会社 Information processing device for the disabled
JPH0777944A (en) * 1993-06-14 1995-03-20 Yasushi Ikei Vibrating type tactile display
JP3225477B2 (en) * 1994-06-23 2001-11-05 日本電信電話株式会社 Tactile stimulus expression method and apparatus and tactile stimulus display
US5719561A (en) * 1995-10-25 1998-02-17 Gilbert R. Gonzales Tactile communication device and method
JPH09166958A (en) * 1995-12-18 1997-06-24 Japan Radio Co Ltd Navigation device
GB2311888B (en) * 1996-04-01 2000-10-25 John Christian Doughty Nissen Tactile system for computer dynamic display and communication

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO9832112A1 *

Also Published As

Publication number Publication date
CA2249415A1 (en) 1998-07-23
GB9701102D0 (en) 1997-03-12
AU5672398A (en) 1998-08-07
WO1998032112A1 (en) 1998-07-23


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 19981013

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FI FR GB IE IT LI NL SE

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20030801