EP0917699A1 - Appareil de communication sensoriel - Google Patents

Appareil de communication sensoriel

Info

Publication number
EP0917699A1
Authority
EP
European Patent Office
Prior art keywords
output
pins
processor
data
central processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP98900916A
Other languages
German (de)
English (en)
Inventor
John Christian Doughty Nissen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Publication of EP0917699A1
Legal status: Withdrawn

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001Teaching or communicating with blind persons
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001Teaching or communicating with blind persons
    • G09B21/003Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001Teaching or communicating with blind persons
    • G09B21/008Teaching or communicating with blind persons using visual presentation of the information for the partially sighted

Definitions

  • This invention is concerned with sensory communication apparatus, and in particular with such apparatus for the dynamic display of textual and graphical information to a blind person.
  • Maps are used for three main purposes, namely for education, such as the study of geography, for planning a journey, and for navigation during a journey.
  • the tasks of exploration are similar in the three cases and are essentially to discover the locations, names and characteristics of, and relationships between, real-world objects or features , both natural and man-made .
  • the goal of exploration may be to establish the layout of a place, or more specifically to find a suitable route from one place to another.
  • the present invention is sensory communication apparatus comprising a central processing unit having a data store, an output processor, means for controlling the output of data from the data store to the output processor, and output means connected to receive data from the processor, the output means being responsive to the output data whereby the output of the computer can be determined by one or more of the user's senses of touch, sound or sight.
  • the means for controlling the output of the data store may comprise a pointing device.
  • the output means includes tactile elements in the form of pins, each having associated with it an electromechanical transducer.
  • the pins are also associated with respective switches which are connected through an input processor to the central processing unit.
  • the pins may be arranged in pairs each for simultaneous contact by a respective finger and each pin of each pair can be vibrated at different frequencies or with different pulse lengths.
  • the output means may be operable in either a character mode or a surface mode.
  • the patterns in which the pins are actuated are related to text characters.
  • the apparatus may include a speech synthesiser responsive to data output from the central processing unit.
  • a keyboard or keypad is connected through an input processor to the central processing unit.
  • the keyboard or keypad may be operable in a character mode or a control mode.
  • Graphical information may be held in the data store of the central processing unit as a virtual surface, over which moves a notional cursor.
  • the data output reflects the graphical information under the cursor, and the relative position of graphical objects on the surface.
  • a selected graphical object can be described to the user in text, which is output using the same tactile device, or using audio (speech) or visual display.
  • Fig.1 is a block circuit diagram of an embodiment of the present invention
  • Fig.2 is a more detailed circuit diagram of part of Fig.1
  • Fig.3 is a more detailed circuit diagram of part of Fig.2
  • tactile communication apparatus comprises a central processing unit 10 having a data store 11 and being connected through an output processor 12 and an input processor 14 to a number of tactile sensors 16 mounted on an input/output unit 18 (Fig.5).
  • the input and output processors need not, of course, be separate units but may be provided in a single integrated circuit.
  • each sensor 16 is an input/ output device that both generates a mechanical movement in response to output signals received from the output processor 12 and generates signals in response to a mechanical input, the signals being passed to the central processing unit 10 through the input processor 14.
  • each sensor 16 comprises a pair of pins 20, a pair of electro-mechanical transducers, in this embodiment piezo strips 22, and a pair of membrane switches 24 mounted on a printed circuit board 26 supported from the top plate 28 of the unit 18.
  • the strips 22 are clamped at one end and mount the pins 20 at their other end, the pins 20 projecting upwardly through the top plate 28.
  • the switches 24 are each mounted beneath a respective pin 20 to be closed by the piezo strip when the pin is depressed.
  • the output processor 12 is shown to comprise, connected in cascade, a level translator 34, a pulse width modulation generator 36, a high voltage digital amplifier 38 and a number of low pass filters 40 each connected to a respective piezo strip 22 in a sensor 16. In this embodiment twelve low pass filters 40 are provided.
  • the output processor is powered from a low voltage source, preferably a battery, connected to a step-up voltage convertor 42 the high voltage output of which powers the processor more efficiently than would a low voltage.
  • the details of the pulse width modulation generator 36 are shown in Fig.3.
  • the generator 36 consists of four sources that may be mapped onto any combination of sixteen outputs.
  • the sources are controlled by a control unit 50 which interprets the commands and data incoming from the computer 10 via the level translator 34 and controls the various aspects of pulse generation.
  • Each of the four sources generates an arbitrary wave of a number of different amplitudes and frequencies. It comprises a phase accumulator 52a, 52b, 52c or 52d, which is set to step through a respective look-up table 54a, 54b, 54c or 54d at a configurable rate, whereby the frequency can be varied by altering the step and the waveform can be altered by using a number of different look-up tables.
  • the output from the look-up table is then scaled by a respective scaler 56a, 56b, 56c or 56d, according to the desired amplitude.
  • the four current samples to be output are passed to their respective pulse width generators 58a-58d which create pulses which have widths proportional to the desired value.
  • Any of the generators can be fed to any of the outputs. This is achieved by preloading a mask register 60a, 60b, 60c or 60d with the output pattern for the respective pulse width generator.
  • While the pulse output of the respective generator is high, this output word is fed through; while it is low, the output mask is gated off and therefore does not play a part in the output.
  • the sixteen outputs from each of the four generators are mixed together using an exclusive-OR gate 66. This results in a true mixing of the four sources once the resultant waveform has passed through its filtering process.
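The signal chain just described (phase accumulator stepping a look-up table, amplitude scaler, pulse-width generator, output mask register, exclusive-OR mixer) can be modelled in a few lines of Python. This is an illustrative software sketch of the hardware the patent describes, not the patent's own implementation; all names, the table size and the 8-bit pulse resolution are assumptions.

```python
import math

TABLE_SIZE = 256
# one possible look-up table: a single sine cycle
SINE_TABLE = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

class Source:
    """One of the four wave sources: phase accumulator + look-up table."""
    def __init__(self, step, amplitude, mask, table=SINE_TABLE):
        self.phase = 0.0
        self.step = step            # step size sets the output frequency
        self.amplitude = amplitude  # scaler setting
        self.mask = mask            # 16-bit mask routing this source to outputs
        self.table = table

    def next_sample(self):
        sample = self.table[int(self.phase) % TABLE_SIZE] * self.amplitude
        self.phase += self.step
        return sample

def pulse_width(sample, max_width=255):
    """Pulse width proportional to the (scaled) sample value."""
    return int((sample + 1.0) / 2.0 * max_width)

def mix_outputs(sources, tick):
    """Gate each source through its mask and exclusive-OR the 16 output lines."""
    word = 0
    for src in sources:
        width = pulse_width(src.next_sample())
        # a line is high while the tick falls inside the current pulse
        level = 0xFFFF if (tick % 256) < width else 0
        word ^= level & src.mask   # masked routing, then XOR mixing
    return word
```

After low-pass filtering (the filters 40), the exclusive-OR of several pulse-width streams approximates the sum of the underlying waveforms, which is the "true mixing" the text refers to.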
  • the array of transducers are mounted on, or are otherwise associated with, a pointing device, such as a computer mouse or a touch sensitive tablet. As the mouse moves, or a depression moves across the tablet, the window is correspondingly moved over the virtual surface representing the map or graphical image .
  • the information can be stored at various levels of detail, with the greater detail suppressed for smaller-scale presentation.
  • At the largest scale the map may show the exact shapes of buildings; on a smaller scale the buildings may be represented as simple rectangles; and on a smaller scale still the buildings may be merged into a single object representing a built-up area.
  • the user can zoom in and out at will, which compensates for the small size of window.
  • In Fig.5 is shown the plan view of a tactile input/output unit 18 which is similar to a computer mouse in that it has a mouse ball 88 (Fig.6) on its underside and can therefore serve as a pointing device.
  • the unit 18 is provided with four tactile sensors (70, 72, 74, and 76) each having a pair of pins 20 projecting through its upper surface and two sensors having two pairs of pins 84, 86, as seen in Fig.6, projecting through each of its sides 78 and 80.
  • the sixteen pins 20 have associated respective transducers in the form of piezo strips 22 each connected to a respective one of the low pass filters 40 of Fig.2.
  • the unit is, in use, held in the hand with the four fingers engaging the four pairs of pins at its upper surface and the thumb engaging one of the two pairs of pins at the sides depending upon which hand is holding the unit.
  • output is by characters, each a pattern on the pins formed as follows.
  • Each pair of pins has three states determined by the frequency of vibration of the piezo strips; in the first state one of the strips vibrates at a low frequency (20 Hz), in the second state both piezo strips vibrate at the low frequency and in the third state both piezos vibrate at a higher frequency (200 Hz).
  • the alphabet is coded by patterns comprising either a single finger state, or a single finger state combined, simultaneously or sequentially, with a single thumb state.
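The size of the resulting code space can be checked with a short enumeration. The patent does not state the count; the reading below (four finger pairs from the Fig.5 description, one thumb pair, three states per pair, and two ways of combining a finger state with a thumb state) is an assumption for illustration.

```python
# Enumerate the character patterns available under one reading of the
# coding scheme described above. State names are illustrative.
from itertools import product

FINGERS = range(4)  # four pairs of pins on the upper surface
STATES = ("one-low", "both-low", "both-high")  # 20 Hz one / 20 Hz both / 200 Hz both

finger_states = [(f, s) for f, s in product(FINGERS, STATES)]

patterns = []
# a single finger state on its own
patterns += [("finger", fs) for fs in finger_states]
# a finger state combined with a thumb state, simultaneously or sequentially
for fs, ts, mode in product(finger_states, STATES, ("simultaneous", "sequential")):
    patterns.append((mode, fs, ts))

print(len(patterns))  # 12 single + 12*3*2 combined = 84 patterns
```

Eighty-four distinct patterns would be ample for an alphabet plus digits and punctuation.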
  • the output corresponds to the virtual surface under the fingers.
  • the piezo strips vibrate according to an algorithm based on frequency and distance from vectors forming virtual objects on the surface.
  • the states of the pins are distinguished by the length of the vibration pulses as well as or instead of by the frequency of vibration.
  • embodiments of the invention have two input modes, namely character and control.
  • the input is via keys, i.e. the pins 20, acting on the input switches 24 mounted under the piezo strips. Patterns of input can be produced to correspond to patterns of output. Three input stimuli are possible per finger, corresponding to the three output states per finger: one pin/piezo depressed, the other depressed, and both depressed.
  • control mode input is again via the keys and switches 24.
  • a single keystroke is used for simple commands such as Next, Previous, Up, Down, Enter and Leave. These commands are used for navigating in information space, typically for exploring a document hierarchy, and for editing. The same commands are used at all levels in the structure of information space which is basically organised as a tree with hyperlinks.
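The tree navigation implied by these commands can be sketched in Python. This is an illustrative model, not the patent's implementation; Enter and Leave, which would follow and return from a hyperlink, are omitted for brevity.

```python
class Node:
    """A node in the information tree; children may also carry hyperlinks."""
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self.parent = None
        for child in self.children:
            child.parent = self

class Cursor:
    """Moves through the tree with single-keystroke commands."""
    def __init__(self, root):
        self.node = root

    def _siblings(self):
        return self.node.parent.children if self.node.parent else [self.node]

    def next(self):           # Next: move to the following sibling, if any
        sibs = self._siblings()
        i = sibs.index(self.node)
        if i + 1 < len(sibs):
            self.node = sibs[i + 1]

    def previous(self):       # Previous: move to the preceding sibling, if any
        sibs = self._siblings()
        i = sibs.index(self.node)
        if i > 0:
            self.node = sibs[i - 1]

    def down(self):           # Down: descend to the first child
        if self.node.children:
            self.node = self.node.children[0]

    def up(self):             # Up: return to the parent
        if self.node.parent:
            self.node = self.node.parent
```

Because the same six commands apply at every level, the user needs to learn only one small vocabulary to explore an arbitrarily deep document hierarchy.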
  • the patterns of activation of the pins can be used to give a direction, e.g. a compass bearing or the direction of an object from the cursor position.
  • the patterns can also be used to indicate what is under the cursor, or in the immediate vicinity.
  • One form of the input/output unit has six pins with the associated transducers arranged as a hexagon about a seventh central pin.
  • the pins are used to guide the hand of the user holding the tactile input/output unit in the direction corresponding to the direction of the graphical object from the cursor on the virtual surface. This allows the user to explore the surface for objects which the user has selected. There is means of selecting a single object or a group of objects with shared characteristics.
  • the method of finding a particular object and its shape is as follows.
  • Consider first a point object, such as a bus stop. While the window is not over the object, the pin or pair of pins closest to the object are activated periodically, with a period proportional to distance. The user can then move the window towards the object, and the frequency of activation increases as the object is approached. When the window is directly over the point, the central pin is activated.
  • Consider next a line object, such as the centre line of a pavement along a street. While the window is not over the object, again the pin or pair of pins closest to the object are activated periodically. When the line is reached the pins over the line are activated. The user can then follow the line. While exactly over the line, the central pin is operated. Now consider objects which have an area (i.e. are not points or lines). The same procedure is followed to find the edge of the object. However, if the window is moved inside the area, the central pin is continuously activated, and the pins nearest to the nearest edge are periodically activated.
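The guidance rule for the hexagonal unit can be sketched as follows: activate the pin whose direction best matches the bearing of the target, with an activation period proportional to distance. The constants and function names are illustrative assumptions; the patent specifies the behaviour, not this code.

```python
import math

HEX_PIN_ANGLES = [i * 60.0 for i in range(6)]  # six outer pins, in degrees
PERIOD_PER_UNIT = 0.1  # assumed scaling: seconds of period per unit of distance

def guidance(window_xy, target_xy):
    """Return (pin_to_activate, activation_period) for the current window."""
    dx = target_xy[0] - window_xy[0]
    dy = target_xy[1] - window_xy[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return ("centre", None)  # directly over the point: central pin
    bearing = math.degrees(math.atan2(dy, dx)) % 360.0
    # pick the pin with the smallest circular angular difference to the bearing
    pin = min(range(6),
              key=lambda i: min(abs(bearing - HEX_PIN_ANGLES[i]),
                                360.0 - abs(bearing - HEX_PIN_ANGLES[i])))
    return (pin, dist * PERIOD_PER_UNIT)
```

As the window approaches the object the returned period shrinks, so the activation frequency rises, matching the behaviour described in the text.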
  • a particularly useful aspect of the invention is the ability of the user to feel the input as it is being drawn.
  • a person can input a line onto a map by moving the pointing device and then immediately feel the line using the tactile means.
  • Sounds may be produced corresponding to the pin activation described above.
  • Embodiments of the invention may have an audio output where the sounds corresponding to pin activation are combined with speech output of text information.
  • a visual output may also be provided as, for example, a word-by-word visual display of text with large characters, where the display or highlighting of each word is synchronised or partially synchronised with the speech output of the word such that the word is spoken immediately before, immediately after, or simultaneously with the display of that word.
  • the word-by-word facility is useful for partially sighted people as the visual display can handle words in large characters, up to the width of the screen.
  • the word display or highlighting can be centred, such that the person can retain focus on one point on the screen while the words are displayed sequentially. This permits rapid reading, since the eyes' saccade movements are eliminated, and the time spent in backtracking and re-reading is avoided.
  • the synchronism with speech reinforces the association between the written and spoken word, helpful for language learners and dyslexics.
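The centred word-by-word presentation can be sketched in a few lines. `display` and `speak` stand in for a real large-character screen and speech synthesiser; both names, and presenting the word simultaneously with speech (one of the three orderings the text allows), are assumptions for illustration.

```python
def rsvp(text, width=40, display=print, speak=print):
    """Present text one centred word at a time, with speech per word."""
    for word in text.split():
        display(word.center(width))  # centred: the reader's eyes stay on one point
        speak(word)                  # spoken simultaneously with the display
```

A real implementation would also pace the words at a configurable reading rate and could instead speak each word immediately before or after its display.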
  • the output unit of the present invention may rest on a surface, or may be clipped to a part of a wearer's apparel, such as a belt, or may be strapped to and operated by one hand.
  • the central processing unit may be remote from the input/output unit and may even be accessed by telephone and modem.
  • the units are provided with a transmitter, for example an infra-red or a radio transmitter, or a transmitter and receiver, providing a remote control link with an appliance or system such as a television, a kiosk or an automatic teller machine, thus enabling the user to control the appliance or system using a dedicated pattern of pin or key operation, and to receive from the central processing unit audio or tactile confirmation of the control command.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Tactile communication apparatus comprising a central processing unit (10) having a data store (11), an output processor (12), means for controlling the output of data from the data store (11) to the output processor (12), and output means (16) connected to receive data from the processor. Said output means are provided with tactile sensors responsive to the output data, whereby the output of the computer can be determined by touch.
EP98900916A 1997-01-20 1998-01-19 Appareil de communication sensoriel Withdrawn EP0917699A1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
GB9701102 1997-01-20
GBGB9701102.7A GB9701102D0 (en) 1997-01-20 1997-01-20 Tactile system for dynamic display of textual and graphical information with audio option
PCT/GB1998/000162 WO1998032112A1 (fr) 1997-01-20 1998-01-19 Appareil de communication sensoriel

Publications (1)

Publication Number Publication Date
EP0917699A1 (fr) 1999-05-26

Family

ID=10806274

Family Applications (1)

Application Number Title Priority Date Filing Date
EP98900916A Withdrawn EP0917699A1 (fr) 1997-01-20 1998-01-19 Appareil de communication sensoriel

Country Status (5)

Country Link
EP (1) EP0917699A1 (fr)
AU (1) AU5672398A (fr)
CA (1) CA2249415A1 (fr)
GB (1) GB9701102D0 (fr)
WO (1) WO1998032112A1 (fr)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2790578A1 (fr) * 1999-03-02 2000-09-08 Philippe Soulie Combinatorial gestural language and associated graphics, the units of which are classified according to a logical progression, enabling communication with an electronic computer
FR2790567B1 (fr) * 1999-03-02 2001-05-25 Philippe Soulie Keypad permitting tactile reading of information originating from an electronic computer
CA2271416A1 (fr) * 1999-05-10 2000-11-10 Vincent Hayward Electro-mechanical transducer suitable for tactile display and article conveyance
US6693622B1 (en) 1999-07-01 2004-02-17 Immersion Corporation Vibrotactile haptic feedback devices
DE20080209U1 (de) 1999-09-28 2001-08-09 Immersion Corp Steuerung von haptischen Empfindungen für Schnittstellenvorrichtungen mit Vibrotaktiler Rückkopplung
GB2358514A (en) * 2000-01-21 2001-07-25 Peter Nigel Bellamy Electronic braille reader
US6445284B1 (en) 2000-05-10 2002-09-03 Juan Manuel Cruz-Hernandez Electro-mechanical transducer suitable for tactile display and article conveyance
JP2002055600A (ja) * 2000-08-09 2002-02-20 Laurel Seiki Kk Information input/output device for visually impaired persons
US9625905B2 (en) 2001-03-30 2017-04-18 Immersion Corporation Haptic remote control for toys
US9245428B2 (en) 2012-08-02 2016-01-26 Immersion Corporation Systems and methods for haptic remote control gaming

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
GB8801309D0 (en) * 1988-01-21 1988-02-17 British Telecomm Electronic vibrational display
JP3086069B2 (ja) * 1992-06-16 2000-09-11 Canon Inc Information processing device for disabled persons
JPH0777944A (ja) * 1993-06-14 1995-03-20 Yasushi Ikei Vibration-type tactile display
JP3225477B2 (ja) * 1994-06-23 2001-11-05 Nippon Telegraph And Telephone Corp Tactile stimulus presentation method and apparatus, and tactile stimulus display
US5719561A (en) * 1995-10-25 1998-02-17 Gilbert R. Gonzales Tactile communication device and method
JPH09166958A (ja) * 1995-12-18 1997-06-24 Japan Radio Co Ltd Navigation device
GB2311888B (en) * 1996-04-01 2000-10-25 John Christian Doughty Nissen Tactile system for computer dynamic display and communication

Non-Patent Citations (1)

Title
See references of WO9832112A1 *

Also Published As

Publication number Publication date
WO1998032112A1 (fr) 1998-07-23
CA2249415A1 (fr) 1998-07-23
AU5672398A (en) 1998-08-07
GB9701102D0 (en) 1997-03-12

Similar Documents

Publication Publication Date Title
Ducasse et al. Accessible interactive maps for visually impaired users
US5736978A (en) Tactile graphics display
JP4567817B2 (ja) Information processing apparatus and control method therefor
US4464118A (en) Didactic device to improve penmanship and drawing skills
Bolt “Put-that-there” Voice and gesture at the graphics interface
US5287102A (en) Method and system for enabling a blind computer user to locate icons in a graphical user interface
US6802717B2 (en) Teaching method and device
US8228298B2 (en) Method and devices of transmitting tactile information description
US20060024647A1 (en) Method and apparatus for communicating graphical information to a visually impaired person using haptic feedback
Brock Interactive maps for visually impaired people: design, usability and spatial cognition
Rigas et al. The rising pitch metaphor: an empirical study
EP0917699A1 (fr) Appareil de communication sensoriel
GB2311888A (en) Tactile communication system
JP4736605B2 (ja) Display device, information processing device, and control method therefor
US4594683A (en) Apparatus for fixing a coordinate point within a flat data representation
KR100312750B1 (ko) 센서를 이용한 가상 연주장치 및 그 방법
US3740446A (en) Perception apparatus for the blind
Semwal et al. Virtual environments for visually impaired
Bustoni et al. Multidimensional Earcon Interaction Design for The Blind: a Proposal and Evaluation
Golledge et al. Multimodal interfaces for representing and accessing geospatial information
JPH11501740A (ja) Man/machine interface for a computing device
JPH05174074A (ja) Page-turning device
Karshmer et al. Equal access to information for all: making the world of electronic information more accessible to the handicapped in our society
Parker Assessment of Access Methods for Mobile Maps for Individuals Who are Blind or Visually Impaired
Ávila Soto Interactive tactile representations to support document accessibility for people with visual impairments

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 19981013

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FI FR GB IE IT LI NL SE

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20030801