GB2311888A - Tactile communication system - Google Patents

Tactile communication system

Info

Publication number
GB2311888A
GB2311888A (application GB9606892A)
Authority
GB
United Kingdom
Prior art keywords
transducers
input
output
speech
output transducers
Prior art date
Legal status
Granted
Application number
GB9606892A
Other versions
GB2311888B (en)
GB9606892D0 (en)
Inventor
John Christian Doughty Nissen
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to GB9606892A
Publication of GB9606892D0
Publication of GB2311888A
Application granted
Publication of GB2311888B
Anticipated expiration
Status: Expired - Fee Related

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00 Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001 Teaching or communicating with blind persons
    • G09B21/003 Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
    • G09B21/007 Teaching or communicating with blind persons using both tactile and audible presentation of the information

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The communication device comprises a keypad which includes an array of input transducers 20-26 and tactile buttons which include output transducers 10-16. The device makes text, graphics and speech, held on a computer or communicated via a computer, accessible through the sense of touch. For text, the device displays one character at a time using the array of buttons 10-16 to tap or vibrate against the skin and produce a pattern of sensation on the skin. For graphics such as maps, the device allows the user to explore in two dimensions and use the sense of touch to locate objects on the map. For speech, the device displays phonetic patterns based on output from an automatic speech recognition system. The system combines the keypad 20-26 with the tactile output 10-16 into a single hand-held unit.

Description

TACTILE SYSTEM FOR COMPUTER DYNAMIC DISPLAY AND COMMUNICATION

This invention relates to a tactile system to make text, graphics and speech, held on a computer or communicated via a computer, accessible through the sense of touch. A text-based version of the system is intended for use by blind and deafblind people, but could also be useful for deaf people, or for people with processing impairments such as dyslexia. A speech-based version of the system is intended for people with communication problems because of deafness, speech impediment or both.
Most blind people cannot read Braille. Learning to read Braille is difficult for most people, especially elderly people, partly because of the fine sensitivity of feeling required in the fingertips, and only a small proportion of blind people learn to read Braille fluently. Alternatives to Braille have been tried, for example the systems known as Bliss and Moon, but with limited success. Special printers and paper are needed to produce hard copies of such notations, which gives rise to high costs for small production runs.
Braille books are expensive to produce, bulky to store, and heavy to carry around. But with the advent of light-weight portable computers, information in electronic form can be carried around easily. Unfortunately, dynamic Braille displays are bulky, expensive and consume considerable power. A known dynamic Braille display comprises an array of cells, each displaying a single character. The cost per cell is considerable, and it is multiplied by the number of cells in the array, typically 40 or 80. A few systems have been developed with a single Braille cell, but though the cost is reduced, the problem of finger sensitivity is aggravated, and blind people complain of numbness after using such a device for a few minutes.
Although voice synthesis can often be used in place of tactile output such as Braille, there are many situations in which sound is obviously undesirable and tactile output by itself preferable, for example in the classroom. Blind people prefer Braille for reading documents containing technical jargon or numerical data. Many blind people prefer Braille for a quiet read of a book, when they can leave the sounds to their own imagination. Tactile output is therefore an extremely desirable supplement to sound.
However, the advent of good-quality, affordable speech synthesisers is making it increasingly difficult for manufacturers to sell and maintain expensive tactile devices; for example, production of the system known as Optacon (see below) will be discontinued at the end of 1996.
Blind people have problems accessing any kind of information in graphical form. Tactile graphic "hard-copy" images can be produced, e.g. using swell paper or moulded plastic, but there are a number of disadvantages: the images are static; they are low resolution, with limited information content; they are not scaleable; they require special, expensive equipment to produce; each image has a production cost per copy; the image is bulky and unwieldy to carry around; and an image once produced cannot easily be marked or annotated.
The main advantage is that there is no cost associated with actually reading the image; the user merely scans with a finger over the surface.
The information content can be boosted by having audio output of text annotations, and mounting the image on a touch-sensitive tablet. This is the approach of the system known as Nomad. However other disadvantages remain, and there is now a cost associated with the reading system.
A dynamic display should have none of the disadvantages of a purely hard-copy image, but hitherto the cost of dynamic display systems has been a problem, especially for a system with reasonable resolution. Three approaches have been tried. In the first approach, that of the Optacon, there is an array of a large number of small pins, each pin corresponding to a pixel. The pins act over the surface of a finger-tip. The device is mounted on a mouse to allow a pointer (cursor) to be moved over the graphic image. Fine finger sensitivity is needed, and the solution is expensive because of the number of pins involved, each operated by a separate piezo-electric device. A second approach has been to use a virtual reality display with a tactile glove incorporating position sensors and actuators. This has also proved very expensive. In a third and recent approach, that of the device known as FeelMouse, the button on a mouse has been modified with tactile feedback, such that the user can feel a force related to what is "under" the cursor on the screen.
This is a much cheaper solution, but provides only limited sensory input to the user; that is, the channel for information flow to the user is very narrow, and the user would require a long time to build up a mental image of a large or complex graphic image presented to him or her.
Blind people have problems of recording and retrieval. Blind people commonly use two methods to record information for subsequent reading: the cassette recorder and the Braille punch. The laptop or notebook computer offers a portable means to record information, input via the keyboard, but the keyboard is often smaller than usual and difficult for a touch typist. Also, such a computer does not offer a ready means to output the information unless it supports speech synthesis and a sound card. The weight of a conventional dynamic Braille display, needed when tactile output is required, reduces portability considerably.
Deafblind people have problems communicating with one another. A small proportion of the population is both blind and profoundly deaf, and this proportion is growing as more people reach old age and as more extremely premature babies survive despite functional defects. Deafblind people have to rely on their sense of touch, typically using finger spelling for inter-personal communication, and using a keyboard and dynamic Braille display for computer input and output respectively. They suffer the same problems with Braille as mentioned above, and several alternative systems have been tried, as follows.
A recent development comprises a virtual reality "data glove" which can identify characters being spelt out by the user to send text (computer input), coupled with a "hand-tapper" which can tap a pattern related to finger spelling on the hand receiving the text (computer output). The glove is expensive to produce, and there would appear to be only a very small market for this product.
A device known as Dexter is a robot hand. It has been developed to simulate a hand signing with a deafblind manual alphabet, allowing the receipt of text. However deafblind people use subtle feeling of movements on the surface of the hand to help them to "read" the characters, and the robot hand has proved unsatisfactory.
Deafblind people have problems communicating with others, outside their own circle or community. Very few people know tactile sign language. There are several ways for people without knowledge of tactile sign language to communicate with a deafblind person: they may talk to an intervenor, who types the text for the deafblind person to read on a dynamic Braille display; or they might type directly themselves; or they might talk to an intervenor who signs in tactile language to the deafblind person. In the absence of an intervenor or the above equipment, they may draw letters on the hand of the deafblind person. If the deafblind person is unable to respond orally, then there needs to be equipment or an intervenor for this direction of communication. In any case, conversation is very slow, and suitable conditions may be needed for conversation to take place at all.
For remote communication, conversation is via textual message interchange. The system known as Hasicom is an example of a system to support such communication over the telephone.
Deafblind people have problems in obtaining adequate education. Current methods of teaching deafblind people are extremely labour intensive, and many deafblind people have poor literacy skills. Computer-aided teaching systems lack a means of immediate feedback to the pupil in the tactile domain. The majority of deafblind people acquired their second sensory impairment through gradual deterioration. Even so, there is a reluctance to learn Braille before it is absolutely necessary, by which time it is more difficult because of a lack of feedback between teacher and pupil.
Deaf people have problems relying on sight only. There are various situations in which a deaf person may be trying to lip-read and watch the speaker's expression and body language, while at the same time following the text of what is being said. The text might be provided as subtitles, or by a stenographer (or palantypist). For example, at meetings a deaf person may have a stenographer who types in phonetic code, which is translated into English and presented as text on a laptop screen in front of the deaf person.
Alternatively there may be an interpreter providing a sign-language translation of the speaker's words. In either case the deaf person has a split visual attention.
Dyslexic people have problems using a single sensory mode, namely their sight. It has been estimated that as much as 10% of the population have some form of dyslexia and consequently have difficulty reading printed text. They are often helped by multi-modal display.
Visually impaired people have problems with computer and information access. The trend towards the electronic office and the information society offers the potential of equal ease of access to information, and equal ease of manipulation of that information.
However, people with visual impairment find it increasingly difficult to access computers in the home and in the workplace, as the fashion for visual graphics grows, and software tools become more diverse, more feature-laden, and more visually complex, employing overlapping windows, icons and a mouse. A means of finding one's way around a screen is becoming increasingly urgent for people with visual impairment.
The solution generally adopted by people with severe visual impairment is based on "screen-readers", which can be set to "look" at certain areas of the screen and read out any new text that appears, using speech synthesis or a dynamic Braille display. Screen-readers have had to become more complex as the visual interface grows more elaborate with successive versions of commercial windowing systems. This in turn has made them more difficult to use.
A problem with speech output and Braille display concerns the lack of distinction between upper and lower case letters. It is easy for a blind person, typing on the computer, to slip into upper case without noticing, or forget to revert to lower case at the appropriate time.
Many computer users risk developing repetitive strain injury (RSI). People who need to spend lengthy periods typing into a computer are particularly at risk. Special keyboards have been designed, allowing one to hold one's arms in a more "natural" position.
People with dexterity in only one hand would value a means of rapid single-handed input. One solution is to use a conventional keyboard with software to map the set of keys of one hand onto the other hand, and use depression of the space bar to indicate a character from the mapped set. Another solution is to use chordal input.
Deaf people have a problem of relying on sight for speech access. There appears to be no prior art in the area of speech representation for the tactile domain, except the SPIRAL system, developed at Cambridge University. Most of the prior art in tactile devices for the deaf concerns devices which present speech vibration to the skin, and are used to supplement lip-reading, so that, for example, a deaf person can be made aware of which consonants and vowels are voiced. The SPIRAL system uses advanced speech recognition technology, and presents the user with phonemes using an array of about 40 solenoids to tap or vibrate onto the hand.
People who are mute or have a speech impediment lack a means of generating fluent speech. There appears to be no prior art in the area of direct manual generation of speech using chordal input. Generating speech indirectly via text has the disadvantage that the system becomes language specific. Also, expressive content cannot be transmitted, as the timing, pitch and stress are lost.
Blind people lack suitable devices for pointing (analogue input) and pointers (analogue output). For many computer applications it is necessary to have a pointing mechanism, which can be used to move the position of a point, the "cursor", on a screen or in virtual space. Existing pointing mechanisms include: a mouse; a tracker ball; a joystick; a touch tablet; a motion tracking system.
A motion tracking system can track the motion of the hand in two or more degrees of movement. One such system determines the position of the hand relative to the person's body, by measuring the change in distance, between a device on the hand and two (or three) points on the person's body, typically using acoustic or electromagnetic Doppler devices. Another such system measures the induced current resulting from motion of a conducting wire attached to the hand either in the earth's magnetic field, or in the field of a nearby magnet.
A disadvantage of these mechanisms is that, without a cursor and screen, there is limited feedback to the user to indicate to which point the cursor has been moved.
According to the present invention there is provided a tactile system including a module adapted to be brought into contact with the skin of a user of the system, and having an array of pressure sensitive input transducers and an array of pressure generating output transducers, a central processing unit and an interface adapted to allow signals representative of a character generated by the input transducers to be entered into a data store in the central processing unit and operating signals from the data store representative of a character to be applied to the output transducers.
The present invention can include a communication system, to allow communication of signals associated with input and output between different locations. Thus characters generated as input at one location can be presented as output at another location, and vice versa.
The present invention will normally include a sound output system, adapted to operate in conjunction with the input and output transducers, but it can be used alone.
Alternatively, or in addition, the invention can include a visual display system, likewise adapted to operate in conjunction with the input and output transducers. The sound output system can include a speech synthesiser for indicating orally characters or words being entered into or retrieved from the data store. The visual display system can display characters or words being entered into or retrieved from the data store. The sound output and/or visual display can be synchronised with the tactile input/output.
The tactile system allows both input to a computer and output from a computer.
Typically the invention includes keys, similar to those on a conventional keyboard, for input, and "buttons" (round-ended pins) for output, see for example Figure 2. Each key has an associated transducer which converts pressure applied by the user on the key into an electric signal. Each button has an associated transducer which converts an electric signal into a force transmitted via the button onto the skin of the user.
Input and output can be discrete or analogue. Discrete means that a device, such as a key or a button with its associated transducer, can be in one of two states, off or on, whereas analogue allows a variation (continuous or in steps) between these states. In the analogue form of the present invention, the input and output transducers have a range of operating states, the degree of movement of the tactile surface of the transducers being related to a feature of the characters being entered into or retrieved from the data store.
Typically the arrangement of keys and/or buttons is symmetrical so that the mechanism can be used by either left or right hand, with each pattern for one hand being a mirror image of the pattern for the other hand. The line of symmetry 33 is shown for the example in Figure 2. For this ambidextrous arrangement there can be a switch, for setting by the user according to left or right hand use.
The buttons for output can be activated by transducers, typically solenoids, either in sequence or simultaneously to produce patterns of stimulation on the skin. The pattern can be based on a text-based or a phonetically-based code. The text-based code can be derived from Braille, or it can be designed for a particular embodiment of the invention, as where the code for consonants is related to the shape of lower case characters.
For text output, one character pattern is displayed at a time, stimulating neural receptors (typically in the palm of the hand of the user) over an area of skin with sufficient force to ensure detection. The speed at which the pattern is displayed can be controlled by the user, using the tactile input mechanism in control mode. Each button can be vibrated against the skin, for example by applying a sinusoidal voltage to the associated transducer. The frequency of vibration can be varied, for example to distinguish different subsets of characters. Upper case can be distinguished from lower case by this means. The energy level in the transducers can be varied, so that the degree of generated pressure or the amplitude of vibration can be varied. The interface between the data store and the output transducers is adapted to produce operating signals having a plurality of distinct frequencies and/or amplitudes, the different frequencies and/or amplitudes of the operating signals being indicative of features of the characters to be communicated to a user of the system.
Typically there is an array of buttons in a hexagonal arrangement, see Figure 1. The array is used to simulate the drawing of characters on the skin, by varying the locus of vibration over an area within the polygon formed by outer buttons, such that the locus traces out the shape of a character. The locus of vibration, to be felt on the user's skin, is a point between three nearest buttons, determined by the balance of energy of vibration of the buttons. This is analogous to a colour triangle where the colour at any point is determined by the balance of the primary colours to be seen at the vertices.
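As an illustrative aside (not part of the patent text), this energy-balance idea can be sketched in code: placing the perceived locus inside a triangle of three buttons corresponds to barycentric weights on the three vibration energies. The button coordinates below are assumed purely for illustration.

```python
def locus_energies(p, a, b, c):
    """Energy weights (wa, wb, wc) for buttons at positions a, b, c that
    place the perceived locus at point p (barycentric coordinates,
    clamped to the triangle and renormalised)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    wa = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    wb = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    wc = 1.0 - wa - wb
    # Clamp and renormalise so the weights stay physical (non-negative).
    wa, wb, wc = (max(w, 0.0) for w in (wa, wb, wc))
    s = wa + wb + wc
    return wa / s, wb / s, wc / s

# Example: a locus between the centre button 10 and outer buttons 11 and 16
# (assumed unit-circle coordinates for the hexagon of figure 1).
centre, b11, b16 = (0.0, 0.0), (0.866, 0.5), (0.866, -0.5)
print(locus_energies((0.4, 0.0), centre, b11, b16))   # roughly (0.54, 0.23, 0.23)
```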
The invention can include a particular input mechanism. This input mechanism produces signals to the computer, analogous to those produced by a conventional keyboard and mouse, which allow: (i) commands to the computer operating system or application; (ii) control of parameters such as cursor position and the speed of output; (iii) input of characters of a text-based or phonetically-based code.
This mechanism can be described as a chordal keypad. The layout is arranged so that keys can be depressed to produce patterns, either as chords or sequences, corresponding to the output patterns. Typically the arrangement is symmetrical, so that the mechanism can be used by either left or right hand, as for the output mechanism. A switch serves to indicate left or right hand use. If input and output mechanisms are incorporated into a hand-set, as for example see Figure 2, a single switch suffices to cover both input and output patterns.
In an analogue form of the input mechanism, used as a pointing means, the degree of depression of each key is measured by a transducer, converting displacement into an electrical signal indicating the amount of the displacement. The locus of depression can be determined from the degree of depression of adjacent keys. Typically the keys are grouped around a central point. A user digit (finger or thumb) can slide over the keys, depressing one or more keys at once and to varying degrees. The locus of depression corresponds to the position of the user's digit at any moment. This locus affects the component of the position, or movement of position, of a cursor in the direction of the locus from the central point.
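The analogue pointing idea lends itself to a short sketch: take the displacement-weighted centroid of the depressed keys as the locus, measured from the central key, and use that offset to drive the cursor. The key numbering follows figure 2 as described later in the text; the exact key positions are an assumption.

```python
KEY_POSITIONS = {          # assumed unit-circle positions around central key 20
    20: (0.0, 0.0),
    25: (0.0, 1.0),        # top
    24: (-0.866, 0.5), 26: (0.866, 0.5),
    23: (-0.866, -0.5), 21: (0.866, -0.5),
    22: (0.0, -1.0),       # bottom
}

def depression_locus(depressions):
    """depressions: {key: displacement in [0, 1]} -> (x, y) locus."""
    total = sum(depressions.values())
    if total == 0:
        return (0.0, 0.0)
    x = sum(KEY_POSITIONS[k][0] * d for k, d in depressions.items()) / total
    y = sum(KEY_POSITIONS[k][1] * d for k, d in depressions.items()) / total
    return (x, y)

# A digit sliding between keys 25 and 26, pressing both part-way:
print(depression_locus({25: 0.6, 26: 0.3}))   # locus leans towards key 25
```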
The present invention can include mechanisms where the output transducers are also capable of acting as input transducers. Thus the same button can act both for key input and tactile output. A depression of the button can either operate a switch or force a movement of a pin through a solenoid such as to induce a current which can be detected as a key depression. The same solenoid can press and/or vibrate the button against the skin for tactile stimulation. Thus an array of buttons can be used for both input and output.
In a similar way, the mechanism for pointing (input) can be used as a pointer (output).
In this case the output buttons are used as keys for pointing. The degree of depression, or speed of depression, of a button is detected by a transducer for input. The degree of vibration of the button can be varied using a transducer, which can be the same transducer that detects input.
The present invention can include another form of pointer (output) mechanism, where transducers can tilt a surface in such a way that the tilt, and its degree and direction, can be detected by the user. Typically there is a platform with a pattern of holes which can be moved by transducers relative to a plate with a corresponding pattern of protuberances, such that some of these protuberances can be made to protrude through the holes. The plate is curved relative to the platform, such that only a few of the protuberances can protrude at any one time. The tilt of the platform relative to the plate, or vice versa, can be judged by the user who feels which protuberances are proud of the platform surface, and by how much.
The same mechanism can be used as a pointing (input) mechanism, where the user can depress and tilt the platform, and the degree of depression and tilting is detected by transducers. The platform mechanism can serve for both pointing (input) and pointer (output) at the same time. The platform is moved by the user for pointing (input), and by the transducers as a pointer (output).
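The platform mechanism can likewise be sketched geometrically. The sketch below assumes a square platform driven at its four corners by pins, as in figure 4, and converts a commanded tilt into four pin extensions under a small-angle approximation; none of the dimensions come from the patent.

```python
H = 10.0      # half-width of the platform, millimetres (assumed)
REST = 2.0    # rest height of the platform above the plate, millimetres (assumed)

def pin_extensions(tilt_x, tilt_y):
    """Return the four corner-pin extensions (mm) producing a small tilt of
    tilt_x about the y axis and tilt_y about the x axis (both in radians).
    Corner order: (+x,+y), (-x,+y), (-x,-y), (+x,-y)."""
    return [REST + H * (sx * tilt_x + sy * tilt_y)
            for sx, sy in ((1, 1), (-1, 1), (-1, -1), (1, -1))]

print(pin_extensions(0.05, 0.0))   # tilt towards +x: the two +x pins rise
```

Run in reverse, the same geometry serves the pointing (input) role: the induced currents report the pin movements, from which the point and degree of depression can be recovered.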
There is an option for a pointing mechanism, including, but not limited to, a mouse, a touch tablet, a joystick, a position tracking device, or a special mechanism, based on input keys or a platform, forming part of this invention. Figure 3 shows the incorporation of a mouse mechanism, 37.
The present invention can include a speech encoder adapted to encode speech into data signals representative of characters to be reproduced by the output transducers as patterns of stimuli on the skin.
The present invention can include a speech encoder operating in response to patterns of pressure applied by a user on input transducers, and producing signals which can be reproduced as speech-like sound or reproduced as characters on output transducers.
The input and output mechanisms can be incorporated into a handset. There can be a strap over the back of the hand to hold the handset in position, or the mechanisms can be incorporated in a glove. For computer operation there is a link to the computer to convey signals in each direction. This link can be a cable connector.
The input, output and pointing mechanisms of the system can be physically separated; for example, the output mechanism can be worn on the upper arm while the input mechanism is mounted on a mouse or joystick used as the pointing mechanism.
The invention will now be described and explained with reference to the accompanying drawings, in which:
Figure 1 shows in plan an arrangement of pressure generating transducers included in embodiments of the invention;
Figure 2 shows in plan an arrangement of pressure sensitive and pressure generating transducers utilised in embodiments of the invention;
Figure 3 is a side view of a hand-held communication module incorporating the invention, including a mouse-based pointing means;
Figure 4 shows a diagrammatic cross-section of another communication module embodying the invention.
Referring to figure 1, one effect of near simultaneous tapping or vibration at two neighbouring points on the skin is to produce a sensation of movement of the stimulation across the skin between the points. This effect is exploited by placing a number of transducers (e.g. solenoids with pins to tap or vibrate the skin surface) in an array, and activating successive transducers one by one to trace out a pattern in time and space akin to a hand-written letter being drawn on the hand. This pattern is a three-dimensional code, one dimension being time (like Morse code), and two dimensions being space (as with printed characters).
The patterns for each character can be chosen to be distinctive. With patterns involving any number of successive stimulations, there is an unlimited number of patterns to choose from, and a set of distinctive patterns can easily be chosen. Some restrictions in the form of rules can limit the choice of patterns, but help the user in recognition of successive patterns at speed.
In this particular embodiment, the output is performed by seven solenoids packed in a hexagonal arrangement, to stimulate the skin, e.g. on the palm of the hand. This is shown approximately to scale (but the buttons should be rounded) in figure 1. The central button, 10, is surrounded by six buttons, 11 to 16, forming a hexagon.
Vowels are abbreviated to a single button operation. Other characters of the alphabet have a sequential pattern, simulating the drawing of a lower case character in a stylised manner. The English alphabet, as written onto the right hand with palm down covering the buttons of figure 1, is output as follows:
a: 13                   n: 13, 14, 11
b: 15, 10, 11, 16       o: 16
c: 16, 14, 10, 11       p: 12, 10, 16
d: 13, 10, 15           q: 14, 10, 12, 11
e: 14                   r: 14, 10, 16
f: 16, 15, 10, 12       s: 16, 10, 11
g: 14, 10, 12           t: 15, 10, 16
h: 15, 13, 11           u: 11
i: 15                   v: 14, 13, 10, 16
j: 10, 12, 13, 15       w: 14, 13, 10, 11, 16
k: 15, 13, 10, 16, 11   x: 10, 13, 14, 12
l: 15, 10, 11           y: 14, 10, 12, 16
m: 13, 14, 16, 11       z: 10, 16, 12, 11

The patterns are reflected about the line of symmetry, see 33 in figure 2, for writing onto the left hand.
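For concreteness, the table above translates directly into data. In the sketch below, the mirror map for left-hand use (11 with 13, 16 with 14, and 10, 12, 15 on the line of symmetry) is an assumed reading of figure 2's line 33, not something the text states explicitly.

```python
LETTER_PATTERNS = {
    'a': [13],                 'n': [13, 14, 11],
    'b': [15, 10, 11, 16],     'o': [16],
    'c': [16, 14, 10, 11],     'p': [12, 10, 16],
    'd': [13, 10, 15],         'q': [14, 10, 12, 11],
    'e': [14],                 'r': [14, 10, 16],
    'f': [16, 15, 10, 12],     's': [16, 10, 11],
    'g': [14, 10, 12],         't': [15, 10, 16],
    'h': [15, 13, 11],         'u': [11],
    'i': [15],                 'v': [14, 13, 10, 16],
    'j': [10, 12, 13, 15],     'w': [14, 13, 10, 11, 16],
    'k': [15, 13, 10, 16, 11], 'x': [10, 13, 14, 12],
    'l': [15, 10, 11],         'y': [14, 10, 12, 16],
    'm': [13, 14, 16, 11],     'z': [10, 16, 12, 11],
}

# Assumed reflection about a vertical line of symmetry through buttons 15,
# 10 and 12: the pairs 11/13 and 16/14 swap.
MIRROR = {10: 10, 11: 13, 13: 11, 16: 14, 14: 16, 15: 15, 12: 12}

def left_hand(pattern):
    """Reflect a right-hand pattern for writing onto the left hand."""
    return [MIRROR[b] for b in pattern]

print(left_hand(LETTER_PATTERNS['b']))   # [15, 10, 13, 14]
```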
There are four character subsets: lower case letters, capital letters, numbers and special characters. These subsets are distinguished by each having a characteristic frequency of vibration. Changes between these four subsets are indicated by shift characters: "shift to lower case", "shift to caps", "shift to numbers", and "shift to special". A fifth subset, which comprises space character, punctuation characters and the shift characters, is not distinguished by frequency:
space: 12               shift to lower case: 10, 13
period: 13, 12, 11      shift to caps: 10, 14
comma: 11, 12           shift to numbers: 10, 11
semicolon: 10, 11, 12   shift to special: 10, 15
colon: 10, 13, 12

Capital letters have the same patterns as lower case letters. The number subset has number patterns as follows:
[The table of digit patterns is illegible in the source.] These patterns are chosen to correspond as far as possible to the use of a telephone keypad arrangement for input. Mathematical symbols are included in this subset.
Timing is based on a unit which can be varied by the user, in the range from 10 milliseconds to 200 milliseconds, in approximately logarithmic steps over that range.
There is one unit between the operation of successive buttons of a letter pattern. There is a two to five unit pause between the letter patterns of a word. Keys are used for control of output, including adjustment of speed, pausing, repeating of characters, and repeating of words.
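The timing rules above amount to a small playback loop: one unit between the buttons of a pattern, a longer pause between letters, and a subset-dependent vibration frequency. A sketch follows; the frequency values and the vibrate() stub are illustrative assumptions, since the text only says each subset has a characteristic frequency.

```python
import time

UNIT = 0.05          # seconds; user-adjustable between 0.01 and 0.2
SUBSET_FREQ = {'lower': 250, 'caps': 150, 'numbers': 100, 'special': 60}  # Hz, assumed

def play_word(patterns, subset='lower', gap_units=3):
    """patterns: list of button sequences, one per letter of the word."""
    freq = SUBSET_FREQ[subset]
    for pattern in patterns:
        for button in pattern:
            vibrate(button, freq)          # drive the solenoid (stub below)
            time.sleep(UNIT)               # one unit between successive buttons
        time.sleep(gap_units * UNIT)       # two to five units between letters

def vibrate(button, freq):
    print(f"button {button}: tap/vibrate at {freq} Hz")

play_word([[13], [15, 10, 11, 16]])        # the word "ab" on the right hand
```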
In one form of the input mechanism, the keys are arranged as on a telephone keypad.
For numerical input, the keys can be used singly as they would be on a normal telephone. Key 25 is number 2 in the middle of the top row; keys 24, 20 and 26 are in the next row as numbers 4, 5 and 6; keys 23 and 21 are in the third row as numbers 7 and 9; and key 22 is in the bottom row as number 0. The pattern for output is made to correspond where possible, see table above.
In an embodiment of the tactile system, for use with graphics and spatial information, the input and output mechanisms are mounted on a "mouse", that is, a handset with a ball whose motion is tracked, see 37 in figure 3, so that it can be used as a pointing mechanism. Digital representations of graphic images or virtual surfaces are stored in computer memory. The mouse pointing mechanism is used to move a cursor to indicate a position on the graphic image or virtual surface, just as a mouse is used to move a cursor position on a screen. An array of tactile output buttons is activated according to the characteristics of the image in the vicinity of the cursor. The tactile system allows the user to find the position of point objects on the image or surface, to follow lines (e.g. indicating roads, boundaries or contours on a map), and to add markings to the image or surface (e.g. a route on the map). The type and identity of an object can be displayed to the user visually (in large print for the partially sighted), in audio (with a speech synthesiser or with digitised speech) or using the tactile system's character display.

This embodiment has the same 7-button arrangement as shown in figure 1, which can be used for text output, e.g. of place names on a map, as well as for exploring the graphic image or virtual surface. It also has the same key arrangement for input as described previously. Buttons and keys are used as described previously. However there is some additional capability for both output and input, as now to be described.
There are two output modes: one for characters and one for navigating the image or surface. While in command mode, one of the keys is used like a mouse button, e.g. for drawing onto the graphic image or virtual surface. While in navigation mode, the buttons are activated as the cursor is moved. When the cursor is near an object in the image or surface, the outer buttons of the array nearest the object vibrate at a rate and/or amplitude dependent on the distance to the object or its boundary. The centre button vibrates according to what is at the point on the map under the cursor.
Suppose the user moves the cursor towards a straight line on the map. On approaching the straight line, the user senses at first the proximity and direction of the nearest part of the line. As the line is reached, the centre button is activated, and the approximate direction of the line is indicated by vibrations of the outer buttons most closely aligned with the direction. This allows the user to stop the cursor on the line, and then move it along the line to follow a boundary or the shape of an object.
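Navigation mode can be sketched as a function from the cursor-to-object vector to a set of button amplitudes. The button angles and the fall-off law below are assumptions; the text specifies only that rate and/or amplitude depend on distance and that the best-aligned outer buttons vibrate.

```python
import math

# Assumed bearings (degrees) of the outer buttons of figure 1.
OUTER = {11: 30, 15: 90, 13: 150, 14: 210, 12: 270, 16: 330}

def navigation_output(dx, dy, max_range=50.0):
    """(dx, dy): vector from cursor to nearest object, e.g. in pixels.
    Returns {button: vibration amplitude in [0, 1]}."""
    dist = math.hypot(dx, dy)
    if dist < 1.0:                      # cursor is on the object itself
        return {10: 1.0}                # centre button fires
    if dist > max_range:
        return {}                       # nothing within sensing range
    bearing = math.degrees(math.atan2(dy, dx)) % 360
    strength = 1.0 - dist / max_range   # nearer object -> stronger vibration
    out = {}
    for button, angle in OUTER.items():
        delta = abs((bearing - angle + 180) % 360 - 180)
        if delta <= 60:                 # the one or two best-aligned buttons
            out[button] = strength * (1 - delta / 60)
    return out

print(navigation_output(10, 0))   # object to the right: buttons 11 and 16 share it
```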
One embodiment, which can be used for graphics and spatial information but can also be used in virtual reality applications, is the same as the previously described embodiment, but has the keys and buttons mounted on a handset without mouse ball, and uses a different mechanism for pointing. The movement of the handset, or of the hand holding the handset, is tracked in two or three dimensions. The buttons are used to sense the proximity and direction of the nearest objects in the virtual space. With two dimensions and 7 buttons, the sensing of directions is as in the previous embodiment. With three dimensions and 7 buttons, the outer pairs of opposite buttons represent the x, y and z axes. Buttons 11, 16 and 15 represent x, y and z positive; buttons 14, 13 and 12 represent x, y and z negative. Directions cannot be judged as accurately as was possible in two dimensions, since these primary directions are 90° apart, whereas the primary directions for the hexagonal arrangement in two dimensions are 60° apart.
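The three-dimensional variant reduces to mapping a signed direction vector onto the three opposite button pairs named above. A minimal sketch, with the normalisation chosen only for illustration:

```python
def axis_output(dx, dy, dz):
    """Map a 3-D direction to the six outer buttons: pairs 11/14, 16/13 and
    15/12 indicate positive/negative x, y and z, with amplitude proportional
    to the size of that component."""
    norm = max(abs(dx), abs(dy), abs(dz), 1e-9)
    out = {}
    for pos, neg, v in ((11, 14, dx), (16, 13, dy), (15, 12, dz)):
        if v > 0:
            out[pos] = v / norm
        elif v < 0:
            out[neg] = -v / norm
    return out

print(axis_output(3, -4, 0))   # {11: 0.75, 13: 1.0}: object ahead-right, no z offset
```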
The tactile output mechanism of the current invention can be incorporated into any system which needs to display information. Such systems include appliances in the home, lifts to tell you which floor you are on, and kiosks in public places. The buttons are set into the display panel, as in figure 1. In such a system, the output might be combined with the invention's tactile input mechanism or with a conventional input mechanism. A pointing device could be included for access to maps in public places.
However for this application the mouse and virtual reality approaches of the previously described embodiments are probably inappropriate.
Referring to figure 4, a tilting tactile pointer mechanism is shown. In this example a platform, 40, moves, and a plate, 43, is fixed. (In another form the platform is fixed and the plate moves.) The corners of the platform rest on four pins, 41, which can be moved by solenoids, 42, such that the platform, 40, can be moved and tilted an amount determined by the relative activation of each pin, 41, corresponding to the direct current in each solenoid, 42. Each pin, 41, is attached at one end to the platform, 40, in such a way as to allow the platform, 40, to tilt, and at the other to a soft spring (not shown in the figure), such that at rest the platform, 40, is suspended just above the plate. In order that the user's finger or thumb, resting on the platform, 40, can feel the extent of tilting, there are small holes, 45, in the platform, in a hexagonal pattern (as in figure 1 but with the holes close together), through which protuberances, 44, protrude, according to the movement and tilting of the platform, 40, relative to the plate, 43.
The same mechanism is used for pointing, using induction in the solenoids, 42, from movement of the pins, 41, when the platform, 40, is depressed, to determine the point and degree of depression. It is necessary that the platform, 40, at rest is a little above the fixed plate, 43, and that the curvature of the plate, 43, is such that when the platform, 40, is gradually depressed at any particular point, the platform, 40, eventually touches the plate at that same point.

A speech-based form of the current invention allows direct translation of the spoken word between the sound domain and the tactile domain. For speech to touch, the invention uses transducers to tap or vibrate and produce patterns of stimulation, typically on the palm side of the hand, and the patterns correspond to phonetic characteristics of the speech, recognised using the front-end of a real-time speech recognition system.
In an embodiment of the speech-based form of the current invention, an array of buttons, stimulating the tips of the fingers, is used for consonants and each consonant has a discrete pattern. A second array of buttons, stimulating the palm of the hand, is used for vowels, and any vowel sound will have a locus of vibration (according to the formants) which moves continuously if the vowel sound changes during enunciation, as for a diphthong.
Vowel sounds are mapped onto points and curves within a diagram whose co-ordinates represent different formants. The extremes of vowels conventionally form the corners of a quadrilateral in this "vowel space". The invention maps vowels onto a hexagonal space using the front end of a speech recognition system. The invention utilises an interpolation effect whereby, if nearby points on the skin are stimulated simultaneously, the stimulus may be perceived to be between those points. The locus of perceived stimulus can be moved continuously in two dimensions within a triangle of three buttons, by changing the energy distribution between the buttons. The hexagonal arrangement is effectively a grouping of six such triangles, and thus allows the locus to be moved anywhere within the hexagon formed by the outer buttons. Vowel sounds are mapped to points and curves within this hexagon. Variations in pitch and stress can be captured by varying the frequency and amplitude of vibration of buttons.
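The vowel mapping can be sketched as two steps: normalise the first two formants into the hexagonal display space, then split the vibration energy among the nearest buttons to place the locus there. The formant ranges, axis assignments and button coordinates below are illustrative assumptions, not values from the patent.

```python
import math

F1_RANGE, F2_RANGE = (250, 850), (600, 2500)   # rough adult vowel formant ranges, Hz
BUTTONS = {10: (0, 0), 11: (0.866, 0.5), 15: (0, 1), 13: (-0.866, 0.5),
           14: (-0.866, -0.5), 12: (0, -1), 16: (0.866, -0.5)}

def vowel_locus(f1, f2):
    """Map (F1, F2) in Hz to a point inside the unit hexagon (assumed axes:
    F2 horizontal, F1 vertical)."""
    x = 2 * (f2 - F2_RANGE[0]) / (F2_RANGE[1] - F2_RANGE[0]) - 1
    y = 2 * (f1 - F1_RANGE[0]) / (F1_RANGE[1] - F1_RANGE[0]) - 1
    return max(-1, min(1, x)) * 0.85, max(-1, min(1, y)) * 0.85

def button_energies(point, k=3):
    """Split unit vibration energy among the k nearest buttons
    (inverse-distance weighting, one simple choice of interpolation)."""
    dists = sorted((math.dist(point, p) + 1e-9, b) for b, p in BUTTONS.items())
    weights = [(1 / d, b) for d, b in dists[:k]]
    total = sum(w for w, _ in weights)
    return {b: w / total for w, b in weights}

print(button_energies(vowel_locus(700, 1200)))   # an open central vowel
```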
For touch to speech, the invention allows patterns to be generated using combinations of finger movements, in the fashion of chords. These chords are translated into a phonetic stream. This stream comprises a sequence of character codes, each representing the phonetic characteristics of successive vowels and consonants. Timing, stress, and pitch information is included in the stream. The stream can be transmitted to a recipient person (the "listener") where it can be processed by the back-end of a speech synthesiser to produce speech in real-time. With the options for timing, stress, and pitch, it is possible to convey an expressive content of speech.
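The touch-to-speech direction can be sketched as reading finger chords frame by frame and emitting a timed phonetic stream for a synthesiser back-end to consume. The chord table below is invented purely for illustration; the patent does not publish one.

```python
# Hypothetical chord table: sets of pressed finger keys -> consonant codes.
CHORD_TO_PHONE = {
    frozenset({1}): 'p',
    frozenset({2}): 't',
    frozenset({1, 2}): 'k',
    frozenset({3}): 's',
    frozenset({2, 3}): 'm',
}

def chords_to_stream(frames, unit=0.05):
    """frames: list of sets of pressed keys, one per time unit.
    Returns a list of (phoneme, duration_seconds), holding repeats together
    so that timing information survives into the stream."""
    stream = []
    for keys in frames:
        phone = CHORD_TO_PHONE.get(frozenset(keys))
        if phone is None:
            continue                         # unrecognised or empty chord
        if stream and stream[-1][0] == phone:
            stream[-1] = (phone, stream[-1][1] + unit)   # sustain the phone
        else:
            stream.append((phone, unit))
    return stream

print(chords_to_stream([{1}, {1}, {2, 3}, set(), {3}]))
# [('p', 0.1), ('m', 0.05), ('s', 0.05)]
```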
For the generation of vowel sounds, a pointing mechanism based on a platform is operated by the thumb, as in figure 4. The thumb can point to a position, or trace out a curve, in vowel space, to generate a pure vowel or a diphthong respectively.
This embodiment of present invention uses the same speech recognition technology as SPIRAL, but presents the phoneme stream in a different way, using patterns to present phonetic speech content rather than individual pins to represent phonemes. The presentation of vowels using a quasi-analogue means differs from SPIRAL's strictly discrete means.
This embodiment differs from those which are text-based in that: the coding is phonetic rather than based on textual characters; the timing is synchronised with speech; there is a greater number of buttons and keys; there are buttons to be felt by the fingers; and an analogue mechanism is needed for vowels.
For remote communication between a speaking person and a deaf person, the first stage of translation, involving speech recognition, has to take place at the speaker's location, and the intermediate phonetic representation of the speaker's utterances has to be encoded and transmitted over the telecommunication network for decoding and tactile display to the deaf person. In the reverse direction, the tactile speech has to be encoded and transmitted, for decoding and speech synthesis for the hearing person.
The speech processing and coding at the speaker's location could be performed by a software package, without the need for an instance of the invention itself. Note that the encoded speech could be sent in packets over the Internet, as in an "Internet Phone" facility.
In this embodiment an array of buttons is used for input and output according to whether the system is in input or output mode, so that the fingers of one hand can both key in and be stimulated by the tactile output. A platform is used for pointing by the thumb, as in figure 4, but the corresponding pointer output is provided by buttons stimulating the palm of the hand, as in figure 1.
There are many benefits of the invention.
Features for reading text have the benefit that the invention provides inexpensive tactile output of text using a single cell with typically seven buttons (or pins) which can be made to stimulate the skin using cheap off-the-shelf solenoids. The buttons can be several times the size and spacing of embossed Braille dots, or the pins used in a conventional dynamic Braille display, so the sensitivity problem is avoided.
Furthermore the buttons can be vibrated, and adding vibration (optimally at around 200 Hz) improves the detection of the stimulus by particularly sensitive receptor neurons present in the glabrous skin on the palm side of the hand.
For ease of learning, especially for elderly people, the buttons can be operated in such a way as to simulate the writing of lower-case characters on the palm of the hand. The invention can employ seven buttons as being the minimum for efficient simulation of handwriting. (Six buttons are arranged in a hexagon with the seventh button at the centre. An effect of near simultaneous tapping or vibration at two neighbouring points on the skin is to produce a sensation of movement of the stimulation across the skin between the points. For handwriting simulation, the locus is moved from button to button of the pattern.) As the user gains familiarity, the drawing time can be reduced until the pattern is presented as simultaneous stimuli to the hand, allowing rapid reading, with speeds comparable to those achieved by fluent readers of Braille. By having different vibration frequencies, a user can distinguish between upper and lower case characters, and does not have to rely on remembering the last change between upper and lower, as one has to with 6-dot Braille.
The invention provides a virtual display of the graphic surface, over which a pointer can be moved and characteristics of its location sensed using the same array of pins or buttons as for character output. A hexagonal arrangement of seven pins is a minimum sufficient to allow the user to examine the surface efficiently. (The central pin indicates to the user whether there is an object under the cursor. Six outer pins suffice to indicate the direction of a line object under the cursor. When there is no object under the cursor, they can indicate the direction of a nearby object, so the cursor can be moved towards it. Thus the entire image can be explored efficiently.)

For portability, a form of the invention combines input and tactile output in a single, light-weight, hand-held unit. This unit, used in conjunction with a laptop or palmtop computer, provides a highly portable system for recording and retrieval, with many advantages over a tape recorder.
A feature of particular value to deafblind people is that the invention can be used to communicate text in the purely tactile domain. An overlapping two-way conversation is possible, as it is possible to send at the same time as receiving. The invention can be used for local or remote communication with a deafblind person. The invention can be used by both parties in communication, thus obviating the need for a keyboard or a dynamic Braille display. Each party can be standing or walking while communicating, using a hand-held unit.
There are features for feedback and reinforcement. The invention allows tactile input with immediate feedback, in visual, audio or tactile domain, to reinforce learning and provide correction when necessary. Communication between teacher and pupil is improved, in the situations of both special and mainstream education.
The invention can be realised with extra output buttons to make the simulation of writing on the hand more realistic, and allow the beginner to gradually move towards the stylised code. For people with at least partial vision, the tactile output can be displayed on a screen at the same time as being output, so the locus of vibration can be followed visually.
There are features to complement sight for deaf people. The invention allows text to be presented in tactile modality, to complement sight. The invention allows speech to be presented in tactile modality, to complement sight. The speech-based form of the system allows direct translation of spoken word into the tactile domain.
The invention can be used by dyslexics. The tactile display can reinforce the visual display of text, and therefore help with reading. This is especially valuable for deaf dyslexics for whom audio reinforcement cannot be used.
The invention can be incorporated into a consumer product or public facility to provide tactile output in a standard form for those who need it. The input mechanism of the invention can be incorporated into a consumer product or public facility, to replace or supplement an existing form of input, and provide the user with tactile input in a standard form.
As regards access to the computer screen for people with visual impairment, the invention can be used to explore the layout on a screen, treating the areas as graphic elements, see above. The content of an area can be read out using a speech synthesiser, as for a normal screen-reader, or it can be read out using the invention's tactile output.
The invention includes chordal input designed such that strain is reduced because the hand is in a natural shape, and the handset can be operated in any orientation.
The invention includes chordal input designed such that the system can be operated entirely by one hand. The input and output arrangement is typically symmetrical, such that the system can be operated by either hand alone.
A form of the invention includes mechanisms, based on the phonetics of speech, which translate the spoken word directly between the sound domain and the tactile domain, without an intermediate natural language textual form. A real-time speech recognition system recognises the phonetic characteristics of speech sounds within a fraction of a second; and this data is presented in the tactile domain, fast enough to be effectively synchronous with the lip movements of the speaker. Thus the deaf person can feel the spoken word while watching the speaker. Because this version of the invention is based on universal phonetic characteristics of vowels and consonants, it can be used for communication in any spoken language in the world. Distinctive dialects can be recognised by the shift of the vowels in vowel space.
The invention allows for speech generation, of particular benefit to people who are mute or have a speech impediment. The invention allows patterns to be generated using combinations of finger and thumb movements, in the fashion of chords, these chords being translated into a phonetic stream, which can be processed by the back-end of a speech synthesiser to produce speech in real-time. The user can produce a distinctive dialect "accent" by shifting the vowels in vowel space. Hand movement, in two of the six degrees of movement, can represent stress and pitch, to add expressive content to the generated speech. The pitch of vowels for tonal languages can be handled in this way, for example by relating pitch to sideways movement of the hand or to twist of the hand.
The invention includes features for pointing and for pointers, of particular use to blind people for access to graphic information. The features can also be employed in the speech-based form of the invention, as for example in pointing to a vowel in vowel space. (For pointing, the invention's chordal input mechanism is used in analogue mode, or a tilting platform is used. For pointers, the invention includes mechanisms corresponding to the above pointing mechanisms.) A pointer and its corresponding pointing mechanism can be combined into a single composite mechanism, which reduces the size and weight of the system.

Claims (20)

1. A tactile communication system including a module adapted to be brought into contact with the skin of a user of the system and having an array of pressure sensitive input transducers and an array of pressure generating output transducers, a central processing unit and an interface adapted to allow signals representative of a character generated by the pressure sensitive input transducers to be entered into a data store in the central processing unit and operating signals from the data store representative of a character to be applied to the pressure generating output transducers.
2. A system as claimed in claim 1, including a speech synthesiser adapted to operate in conjunction with the output transducers.
3. A system as claimed in claim 1 or claim 2, in which a speech synthesiser is adapted to operate in conjunction with the input transducers.
4. A system as claimed in any preceding claim, including means for indicating visually characters being entered into or retrieved from the data store.
5. A system as claimed in any preceding claim, in which the input or output transducers can be operated sequentially.
6. A system as claimed in any preceding claim in which the output transducers are adapted to produce vibratory output signals.
7. A system as claimed in claim 6, in which the frequency and/or amplitude of the vibrations of the output transducers can be varied.
8. A system as claimed in claim 7, in which the interface between the data store and the output transducers is adapted to produce operating signals having a plurality of distinct frequencies and/or amplitudes, the different frequencies and/or amplitudes of the operating signals being indicative of features of the characters to be communicated to a user of the system.
9. A system as claimed in any preceding claim, in which the input and output transducers have only two operating states.
10. A system as claimed in any of claims 1 to 8, in which the input and output transducers have a range of operating states, the degree of movement of the tactile surfaces of the transducers being related to a feature of the characters being entered into or retrieved from the data store.
11. A system as claimed in any preceding claim, in which the characters reproduced by the output transducers approximate to alpha-numeric characters.
12. A system as claimed in any of claims 1 to 10, in which the characters reproduced by the output transducers approximate to Braille characters.
13. A system as claimed in any preceding claim, wherein the module includes two sets of arrays of input and output transducers, one set being the mirror image of the other so that the module can be used with either hand.
14. A system as claimed in any of claims 6 to 8 or 10 to 13, wherein a position on the surface on which the tactile surfaces of the output transducers lie can be determined by varying the energy of the vibrations of the three output transducers nearest to that position to give weighted mean vibrations at the said position.
15. A system as claimed in claim 10, in which a position on the surface on which the tactile surface of the input transducers lie is determined from the degree of depression of three input transducers surrounding that position.
16. A system as claimed in any preceding claim, in which the output transducers are also capable of acting as input transducers.
17. A system as claimed in any preceding claim, including a speech encoder adapted to encode speech into data signals representative of characters to be reproduced by the output transducers.
18. A system as claimed in claim 17, in which there is included a transmission system for the data signals to be reproduced by the output transducers.
19. A system as claimed in claim 17 or claim 18, wherein the speech encoder operates in response to patterns input to the speech encoder via the input transducers.
20. A tactile communication system substantially as hereinbefore described with reference to and as shown in the accompanying drawings.
GB9606892A 1996-04-01 1996-04-01 Tactile system for computer dynamic display and communication Expired - Fee Related GB2311888B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB9606892A GB2311888B (en) 1996-04-01 1996-04-01 Tactile system for computer dynamic display and communication

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB9606892A GB2311888B (en) 1996-04-01 1996-04-01 Tactile system for computer dynamic display and communication

Publications (3)

Publication Number Publication Date
GB9606892D0 GB9606892D0 (en) 1996-06-05
GB2311888A (en) 1997-10-08
GB2311888B GB2311888B (en) 2000-10-25

Family

ID=10791441

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9606892A Expired - Fee Related GB2311888B (en) 1996-04-01 1996-04-01 Tactile system for computer dynamic display and communication

Country Status (1)

Country Link
GB (1) GB2311888B (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2302583B (en) * 1995-06-23 2000-01-12 Marconi Gec Ltd Tactile transducer

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3932869A (en) * 1973-12-03 1976-01-13 Gabriel Kane Tactile numeric display device
US4488146A (en) * 1981-03-27 1984-12-11 Nixdorf Computer Ag Information input and output unit for data processing equipment
US4445871A (en) * 1981-11-12 1984-05-01 Becker John V Tactile communication
GB2181591A (en) * 1985-09-26 1987-04-23 British Telecomm Electronic vibrational display, e.g. for braille
US4985692A (en) * 1987-01-23 1991-01-15 Vennootschap Onder Firma: Alva Word processor work station with a braille reading line

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2338539A (en) * 1995-06-23 1999-12-22 Marconi Electronic Syst Ltd A hand tapper for communicating with the deaf-blind
GB2338539B (en) * 1995-06-23 2000-03-01 Marconi Electronic Syst Ltd Sign language hand tapper
WO1998032112A1 (en) * 1997-01-20 1998-07-23 John Christian Doughty Nissen Sensory communication apparatus
US6422869B1 (en) 1997-11-14 2002-07-23 The Regents Of The University Of California Methods and apparatus for assessing and improving processing of temporal information in human
WO1999026178A1 (en) * 1997-11-14 1999-05-27 Scientific Learning Corporation Methods and apparatus for assessing and improving processing of temporal information in human
US6639510B1 (en) 1999-03-02 2003-10-28 Philippe Soulie Tactile reading system for data coming from a computer and associated communication device
WO2000052665A1 (en) * 1999-03-02 2000-09-08 Philippe Soulie Tactile reading system for data coming from a computer and associated communication device
FR2790567A1 (en) * 1999-03-02 2000-09-08 Philippe Soulie KEYBOARD FOR TOUCH-BASED READING OF INFORMATION FROM AN ELECTRONIC COMPUTER
FR2790578A1 (en) * 1999-03-02 2000-09-08 Philippe Soulie Tactile communication device, for communicating with another person or a computer, with a series of main keys on the side facing away from the user and reading keys transmitting tactile sensations to the user's fingers
EP1924059A3 (en) * 2000-12-21 2008-06-11 Nokia Corporation Communication unit provided with intra-changeable elements
EP1217808A3 (en) * 2000-12-21 2004-07-28 Nokia Corporation Communication unit provided with intra-changeable elements
EP1924059A2 (en) 2000-12-21 2008-05-21 Nokia Corporation Communication unit provided with intra-changeable elements
EP1217808A2 (en) * 2000-12-21 2002-06-26 Nokia Corporation Communication unit provided with intra-changeable elements
US7397413B2 (en) 2000-12-21 2008-07-08 Nokia Corporation Communication unit provided with intra-changeable elements
WO2002054388A1 (en) * 2000-12-29 2002-07-11 John Christian Doughty Nissen Tactile communication system
EP1640939A1 (en) * 2004-09-22 2006-03-29 Jöelle Beuret-Devanthery Communication apparatus
US8275602B2 (en) 2006-04-21 2012-09-25 Scomm, Inc. Interactive conversational speech communicator method and system
EP3352055A1 (en) * 2009-10-29 2018-07-25 Immersion Corporation Systems and methods for haptic augmentation of voice-to-text conversion
US8280954B2 (en) 2010-03-25 2012-10-02 Scomm, Inc. Method and system for providing live real-time communication via text between mobile user devices
US9565262B2 (en) 2010-03-25 2017-02-07 Scomm, Inc. Method and system for providing live real-time communication via text between mobile user devices
US10257130B2 (en) 2010-03-25 2019-04-09 Scomm, Inc. Method and system for providing live real-time communication via text between mobile user devices
EP3893227A1 (en) 2020-02-12 2021-10-13 Politechnika Slaska Device for creating spatial visualisations of pictures and graphs for the purpose of education and visually impaired people

Also Published As

Publication number Publication date
GB2311888B (en) 2000-10-25
GB9606892D0 (en) 1996-06-05

Similar Documents

Publication Publication Date Title
US5736978A (en) Tactile graphics display
US10095327B1 (en) System, method, and computer-readable medium for facilitating adaptive technologies
US6230135B1 (en) Tactile communication apparatus and method
O’Modhrain et al. Designing media for visually-impaired users of refreshable touch displays: Possibilities and pitfalls
Edwards Soundtrack: An auditory interface for blind users
US9286884B2 (en) Sequenced multi-meaning tactile symbols useable to produce synthetic plural word messages including words, phrases and sentences
Brodwin et al. Computer assistive technology for people who have disabilities: Computer adaptations and modifications.
Kamel et al. A study of blind drawing practice: creating graphical information without the visual channel
GB2311888A (en) Tactile communication system
KR101087640B1 (en) System for interactive Braille education using a tactile presentation device, and method therefor
WO1998032112A1 (en) Sensory communication apparatus
Golledge et al. Multimodal interfaces for representing and accessing geospatial information
Reed et al. Haptic Communication of Language
Edwards Multimodal interaction and people with disabilities
Sturm et al. Communicating through gestures without visual feedback
Murray Instructional eLearning technologies for the vision impaired
Ávila Soto Interactive tactile representations to support document accessibility for people with visual impairments
Drake Non-visual user interfaces
Evreinova Alternative visualization of textual information for people with sensory impairment
Heller et al. Visual impairment: Ergonomic considerations in blind and low-vision rehabilitation
EP1665222A2 (en) User created interactive interface
Iman et al. Development of Be-Braille Learning Tool: Enhancing Braille Education and Interaction for Visually Impaired Students
Granström ADN EDWARDS
Parker Assessment of Access Methods for Mobile Maps for Individuals Who are Blind or Visually Impaired
JP2866931B2 (en) Braille learning device

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20030401