US6240392B1 - Communication device and method for deaf and mute persons - Google Patents


Info

Publication number
US6240392B1
Authority
US
United States
Prior art keywords
communication
message
data
speech
data processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US08/920,820
Inventor
Hanan Butnaru
Wesley O. Krueger
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BCMC LLC
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US08/920,820
Application granted
Publication of US6240392B1
Assigned to BCMC, LLC (Assignors: KRUEGER, WESLEY W.O., MD, DR.)
Assigned to BCMC, LLC (Assignors: BUTNARU, DORA)
Anticipated expiration
Legal status: Expired - Lifetime

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00: Teaching, or communicating with, the blind, deaf or mute
    • G09B21/009: Teaching or communicating with deaf persons

Definitions

  • the message area 100 contains the collection of alphanumeric characters which communicate speech to the user, after it has been recognized from audio information in the surrounding environment, or submitted to the processor 20 by way of a remotely transmitted message from a similar device. It is preferred that the message area 100 be located on the periphery of the display 140 so that the viewing area directly in front of the user 200 remains clear. However, it is most preferred that projector 50 presents a display 140 to the user 200 which can be seen through (a “see-through” display), so that any message text or indicators shown thereon will not interfere with the user's normal line of sight. While shown as a single message line display, those skilled in the art will realize that several message lines can also be projected onto display 140 . Most preferably, message area 100 will comprise two lines of material presented in teleprompter fashion.
  • a first indicator 80 and a second indicator 90 represent optional elements of the display 140 .
  • the first indicator 80 may consist of an alarm icon, flashing light, or some type of analog display symbol which indicates the amount of noise in the environment.
  • the second indicator 90 may represent a geographical position location, generated by an attached Global Positioning Satellite (GPS) system, the distance of the user from a selected location, or the sound level of the surrounding environment in decibels.
  • any information which can be processed by the wearable processor 20 can be displayed to the user 200 at an appropriate time.
  • the time of day or of a specific event can also be included in the content of the second indicator 90 .
  • a power source 30 which may consist of a rechargeable battery, solar cell, or other electric current power generator, provides power to the wearable processor 20 .
  • the environmental message reception means comprising the microphone 70 , and optionally, the filter 150 , is used to acquire audio information.
  • a filter 150 is most preferably implemented as a band-pass filter which passes frequencies from about 500 Hz to about 2,999 Hz.
  • any of several noise cancellation techniques can be implemented by the wearable processor 20 or the speech recognizer 55 .
  • Information conversion means, such as a speech recognizer 55 , preferably operates in a manner that will accept continuous speech input and other audio information (such as a police siren tone pattern or an abnormally loud sound), using templates and a vocabulary lookup table, or other methods well known in the art, and convert it into speech data.
  • the data processing means processor 20 will then organize the speech data into a meaningful symbolic representation for transmission to the projector 50 .
  • the message entry means, such as the keypad-tablet 35 , is used to acquire user message information, which is passed on to a message transmission means, comprising a speech synthesizer 57 and a speaker 60 , producing an audible signal in the user's immediate vicinity.
  • Synthesized speech, produced by any of several methods well known in the art, is most preferably formed by the synthesizer 57 so as to mimic the gender of the user. In this manner, a mute person can communicate with those who can hear.
  • the message transmission means may alternatively comprise the synthesizer 57 and a remote data transceiver 72 , wherein the keypad-tablet 35 user input message information can alternatively be sent to a remote location via the remote data transceiver 72 and an energy transceiver 77 .
  • the remote data transceiver 72 comprises an electromagnetic energy transceiver capable of receiving and transmitting electromagnetic energy using an appropriate energy transceiver 77 . If the electromagnetic energy consists of infrared or other light, then a laser diode transmitter and receiver, or similar devices, would be most appropriate to use as the energy transceiver 77 . If the electromagnetic energy consists of radio waves, then an antenna designed for the desired frequency and bandwidth should be used as the energy transceiver 77 .
  • any person can communicate at a distance with another person who is hearing-impaired or deaf.
  • Such communications can also exist between persons at each end of the communication link who are not disabled.
  • those skilled in the art will recognize that such an arrangement of remote radio communication devices also lends itself to multi-channel communication, which allows groups of disabled users to communicate amongst themselves, either out loud, or silently.
  • Another example of system use might enable all English speakers to communicate on one channel, and all Spanish speakers on another.
  • Such multi-channel operation is desired to enhance user privacy and the practicality of the system. It is anticipated that a single “international” frequency, along with several “national” frequencies will be allocated to facilitate communications between people of differing countries, and the same country, as desired.
  • the processor 20 can be programmed to recognize a dangerous or a cautionary situation, due to the sheer magnitude of the electrical signal received, or based on a recognized pattern, and will thereafter display an alarm signal by way of the first indicator 80 to the user 200 via the projector 50 and the display 140 .
  • Operation of the communication system 10 by using audio input and output (via the microphone 70 and the speaker 60 ) obviates the need for ASL, and allows deaf, hearing impaired, or mute persons to communicate with others that are similarly disabled, and also with individuals having unimpaired hearing and speech capability.
  • Environmental information can also be received by way of the energy transceiver 77 and the remote data transceiver 72 .
  • Such remote information will normally originate from another communication system 10 at a remote location (i.e. outside of the line-of-sight, or out of normal hearing range) and consist of audio information, or speech data.
  • the acquired signals can then be processed by the speech recognizer 55 or, if already processed by the remote communication system 10 , the received speech data can be sent directly to the processor 20 for organization into an appropriate symbolic representation of the data, and on to the projector 50 for viewing by the user 200 and, optionally, transmitted to the external environment via the speech synthesizer 57 and the speaker 60 .
  • An optional element of the communication system 10 is a storage device 62 , which can store several canned messages for recall by the user 200 . This provides a convenient means of rapidly communicating standard words and/or phrases to the external environment.
  • the storage device 62 which comprises battery-powered RAM memory, EPROMs, disk drives, or tape drives, can be used to save messages that have been received from the external environment for receipt and processing at a later time by the user 200 .
  • a cellular telephone (not shown) can be used as a supplement to the microphone 70 and the speaker 60 .
  • the first indicator 80 within the display 140 can be activated, along with a message to the user 200 that the “Phone is ringing.”
  • the user 200 can then choose to answer the phone, and any input via the keypad-tablet 35 will be converted to synthesized speech, which is sent to the telephone receiver, instead of the speaker 60 .
  • any voice emanating from the telephone speaker output can be directed to the filter 150 and the speech recognizer 55 for conversion to text and display via the projector 50 .
  • This embodiment provides mobile telephone operation capability to deaf, hearing impaired, or mute persons, obviating the need for teletype (TTY) devices on one or both ends of a telephone conversation.
  • the processor 20 is understood to be equipped with the necessary hardware and software to translate speech it receives from the recognizer into another language if desired, and then to display the resulting message via the projector 50 onto the display 140 .
  • the recognizer 55 may also be implemented as part of the software programming for the processor 20 , obviating the need for a separate hardware element.
  • the invention also anticipates a combination of some or all of the separate elements, such as the microphone 70 , the filter 150 , the recognizer 55 , the processor 20 , the speaker 60 , the synthesizer 57 , and the storage device 62 into a single, integrated, hybrid unit.
  • the separate elemental units are illustrated as such for clarity, and not by necessity.
  • the processor 20 can be remotely located to reduce local weight/power requirements.
  • a separate radio data communications path (not shown) can be used to transmit and receive processed audio and video data to and from the user 200 , if desired.
  • the projector 50 may project the display 140 out into space in front of the user, as occurs with a holographic projection system, or may consist of any of several head-mounted optical projection systems, such as the HOPROS™ unit produced by Optronic Instruments & Products, the PROVIEW™ system produced by KEO, or the Mark II GP Series of wearable displays available from Seattle Sight Systems, Inc.
  • the projector 50 may also comprise a see-through liquid crystal display with the display 140 indicated directly thereon, or a retinal scanner, such as the Virtual Retinal Display being developed by the Human Interface Technology Laboratory at the University of Washington.
  • the display 140 as perceived by the user 200 can be adjusted to accommodate vision correction or deficiency so that any image perceived by a particular user 200 will appear in sharp focus.
  • the adjustment can be programmed into the processor 20 by sensing a user-controlled input (not shown), or mechanically adjusted by moving the display 140 closer to, or farther away from, the eye.
  • the recognition display 42 and the stylus 45 input devices are similar to those readily available in the market, and can be identical to those used on electronic appointment calendars or pen-top computers.
  • the recognition display 42 can be separated from the keyboard 40 and used alone, without the use of the keyboard 40 .
  • the recognition display 42 may contain the intelligence necessary for independent handwriting or printing recognition, or may pass on electrical signals so that the wearable processor 20 can recognize the content of the user message entry by way of writing recognition algorithms well known in the art.
  • the present invention anticipates that the user 200 will receive feedback at the message area 100 in character text form for each message entered on the recognition display 42 . In this way, the user 200 can be assured that what is written on the recognition display 42 , using the stylus 45 , is properly translated into text for transmission to others. In an alternative embodiment of the present invention, transmission of the characters will not occur until the user has authorized such transmission.
  • the communication system 10 anticipates use as a real-time language translator for disabled persons, as well as for those who can hear and speak.
  • Language translation software modules are readily available for use with text received by the wearable processor after voice recognition occurs, or after messages have been input via the keypad-tablet 35 .
  • Several such language translation program modules may be simultaneously resident in the processor 20 memory.
  • the user 200 can, for example, write or type a message in English, and after textual translation, it will be “spoken” in a selected alternate language via the speech synthesizer 57 and the speaker 60 .
  • a foreign language speaker may verbally address the user 200 , who, upon selecting the proper translation software module for execution by the processor 20 , may then (after recognition and translation occurs) be enabled to read the translated message at the message area 100 of the display 140 (or hear the synthesized version thereof) in the user's 200 own language. Deaf/mute users may also communicate between themselves, each in their own different language, by a similar process.
  • the present invention will also facilitate round-table discussions among those conversing in two or more languages, by way of speech and/or remote transceiver broadcast and transmission. It is anticipated that appropriate translation codes, either oral or electromagnetic, will be sent with the text so that the communication system 10 can automatically engage the proper software translation module for use by the processor 20 . Such operational methods will obviate the need for manual selection of translation software modules.
  • the user 200 will be able to enjoy television or theater using the communication system 10 by employing the microphone 70 and the speech recognizer 55 to produce text via the processor 20 for presentation to the user 200 via the display 140 .
  • with the communication system 10 , it does not matter whether the speaker is directly facing the listener or has turned away from the front of the stage.
  • the communication system 10 will enable the listener to receive the appropriate message in an environment in which audio communication is possible.
  • the user may employ the remote data transceiver 72 and the energy transceiver 77 to enable such communications.
  • Such use of the system 10 will obviate the need for expensive subtitle apparatus for use with televisions, and dependence on the proper coding which must be supplied by the television network therefor.
  • the method of communication for deaf, hearing impaired, or mute persons presented herein comprises the steps of: receiving audio information; converting the audio information, whether speech or other information, such as a siren or explosion, into speech data; then converting the speech data into a symbolic representation which is meaningful to the user (e.g. German language text characters for a German language speaker); and displaying the symbolic representation to the user, either in the space in front of the eye, or directly onto the eye, as occurs with retinal scanning.
  • the method further comprises acquiring user messages which are intended for transmission to others in the surrounding environment and transmitting those messages to the surrounding environment in the form of audio information.
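For illustration only, the method steps recited in the two items above can be sketched as a short Python pipeline. Every function name and the dictionary "recognizer" below are hypothetical stand-ins for the hardware elements (recognizer 55, processor 20, projector 50); the patent does not specify any such API.

```python
# Hypothetical sketch of the claimed method: receive audio, convert it to
# speech data, organize it into a symbolic representation, and display it.
# The recognizer is a stand-in dictionary, not a real speech recognizer.

def convert_to_speech_data(audio, recognizer):
    """Convert raw audio information into speech data (recognizer 55)."""
    return recognizer.get(audio, "<unrecognized sound>")

def to_symbolic_representation(speech_data, language="en"):
    """Organize speech data into symbols meaningful to the user (processor 20)."""
    # A real system would translate and format per the user's language.
    return speech_data.upper() if language == "en" else speech_data

def display(symbols):
    """Present the symbolic representation to the user (projector 50)."""
    return f"[message area] {symbols}"

fake_recognizer = {"audio:hello": "hello", "audio:siren": "<siren>"}
shown = display(to_symbolic_representation(
    convert_to_speech_data("audio:hello", fake_recognizer)))
print(shown)  # [message area] HELLO
```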

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A communication device for deaf, hearing impaired, or mute persons comprises a processor control system which makes use of a microphone and a speech recognizer to receive and process audio data (speech or non-speech) to determine whether or not a dangerous situation exists within the environment surrounding the user. The system comprises a keypad and/or stylus-and-tablet information input system to accommodate communication from the user to the persons in the surrounding environment, and a visual display capability to transmit the information so acquired by way of a projection apparatus in the form of characters and in the language to which the user is accustomed. Indicator signals which correspond to dangerous or cautionary situations relating to abnormally loud noises, or readily recognized sound patterns, such as a siren, may also be displayed to the user, as may be information related to geographic location, distance to a preset destination, or other personally useful information.

Description

This application claims the benefit of U.S. Provisional Application Nos. 60/026,400, and 60/025,480, filed Aug. 29, 1996.
BACKGROUND OF THE INVENTION
1. Field of the Invention
This invention relates generally to a device for communication between persons who are deaf, hearing impaired, or mute. More particularly, the invention is directed toward a communication system for deaf, hearing impaired or mute persons which allows the user to visualize speech and other sounds directed at him through various audio sources.
2. Background of the Invention
Historically, deaf, hearing impaired, and mute persons have had a difficult time communicating, both among themselves and with others not similarly handicapped. In fact, it is often more difficult to conduct a simple conversation between a non-handicapped person and one that is deaf or mute, because the non-handicapped person usually is not familiar with American Sign Language (ASL).
Even when the persons communicating are familiar with ASL, sign language is not a very convenient method of communication when a large group is involved. Expensive television cameras and video screens must be employed to transmit messages to large audiences. That is, unless the receiver of the message is so close to the message transmitter that he or she can distinguish ASL gestures and the expressions on the message transmitter's face, communication is simply not possible. As mentioned previously, if the message receiver has not been trained to understand ASL, then communication is also not possible.
Further, deaf persons often cannot enjoy television or the theater because the actors are not visible, or have turned their backs to the front of the stage so that lip-reading is impossible. Of course, this difficulty is also encountered during day-to-day communication scenarios whenever the speaker or communicator is not facing the deaf person, or is not located in the visual line-of-sight range of the deaf person. In addition, there are occasions when an unusually loud noise is used to alert persons in the vicinity that a dangerous or cautionary event is about to occur. This can be the siren of an ambulance or fire engine, an explosion, the horn of an automobile or truck, etc. While a person able to hear such sounds can immediately identify these circumstances as requiring caution or flight, the deaf person will be unaware that he is in danger.
Several elements of the instant invention have only recently become generally available due to the general trend of technology miniaturization and a reduction in the price of sophisticated microprocessor control systems. More specifically, this includes the technologies of speech recognition, short-range infrared and radio data communication, personal video displays, and handwriting recognition. The present invention is directed toward overcoming the communication difficulties set forth above.
It is desirable to have an apparatus and method enabling communications between deaf, hearing impaired, or mute persons and others, whether similarly handicapped or not. It is also desirable to have an apparatus and method for use by deaf or hearing impaired persons to enable them to enjoy television or movie theaters without subtitles, ASL interpreters, etc. In addition, it is desirable to have an apparatus and method which enables deaf and/or mute persons to communicate with others who may not be in close proximity. Furthermore, it is desirable to have an apparatus and method which enables communication between a single individual and a group, whether or not all individuals participating in the communication have the ability to speak or hear without impairment.
While ASL has been developed for enhancing the communication abilities of deaf and mute people, most non-handicapped persons are not trained in its use. Even those that are trained are unable to use it for communication in circumstances where line-of-sight communication is impossible or impractical. Thus, there exists a long-felt and widespread need to provide alternative communication apparatus and methods for deaf, hearing impaired, and mute persons which can be used at a distance, in situations where line-of-sight communication is impossible or impractical, where communication with a group of mixed non-handicapped persons and deaf and/or mute persons is desired, or in everyday situations which render common means of understanding by hearing impaired persons (e.g., lip-reading) ineffective. This is especially the case for dangerous situations in which a loud sound emanates from an undisclosed location and the deaf person remains unaware of its existence. The apparatus and method of the present invention, discussed in greater detail below, clearly satisfy this need.
SUMMARY OF THE INVENTION
The present invention provides a communication system for deaf, hearing impaired, and mute persons, or those communicating with hearing impaired and/or mute persons. The communication system of the present invention comprises a message reception means to receive audio information, such as a microphone and an optional filter; an information conversion means, such as a speech recognizer, to convert the received information into speech data; a data processing means, such as a microprocessor, to organize the speech data into an appropriate and meaningful symbolic representation; and a display means, such as a see-through liquid crystal display to display the symbolic representation of the speech data (usually English language text symbols) to the user. The processing means also has the ability to process other audio data (e.g. non-speech) to determine whether or not a dangerous situation exists within the environment surrounding the user.
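The optional filter mentioned above can be illustrated with a minimal sketch. The preferred embodiment elsewhere describes a band-pass stage covering roughly the 500 Hz to 3,000 Hz speech band; the cascade of one-pole filters below, and all names and coefficients in it, are an editorial assumption for illustration, not the patent's implementation.

```python
import math

def bandpass(samples, fs, lo=500.0, hi=3000.0):
    """Crude band-pass: a one-pole low-pass at `hi` cascaded with a
    one-pole high-pass at `lo` (hypothetical stand-in for filter 150)."""
    dt = 1.0 / fs
    a_lp = dt / (dt + 1.0 / (2 * math.pi * hi))                     # low-pass coefficient
    a_hp = (1.0 / (2 * math.pi * lo)) / (dt + 1.0 / (2 * math.pi * lo))  # high-pass coefficient
    out, lp_prev, hp_prev_in, hp_prev_out = [], 0.0, 0.0, 0.0
    for x in samples:
        lp_prev = lp_prev + a_lp * (x - lp_prev)          # low-pass stage
        y = a_hp * (hp_prev_out + lp_prev - hp_prev_in)   # high-pass stage
        hp_prev_in, hp_prev_out = lp_prev, y
        out.append(y)
    return out

fs = 16000
tone = lambda f: [math.sin(2 * math.pi * f * n / fs) for n in range(fs // 4)]
rms = lambda s: math.sqrt(sum(v * v for v in s) / len(s))
# A 1000 Hz tone (in-band) survives; a 100 Hz tone (below band) is attenuated.
print(rms(bandpass(tone(1000), fs)) > rms(bandpass(tone(100), fs)))  # True
```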
Further, the apparatus and method of the present invention comprises a message entry means, such as a keypad and/or stylus-and-tablet to acquire messages from the user. The apparatus and method of the present invention also comprises a message transmission means, such as a voice synthesizer and speaker, to convert the user messages into message data, and convey the message data as audio output, preferably speech, to the user's immediate physical environment.
In another aspect of the present invention, a method and apparatus is provided for communication between hearing impaired, deaf and/or mute persons and others, including a group of people, which involves use of an optional element of the message transmission means, comprising a remote data transceiver electrically connected to the processor of the present invention so as to transmit and receive message information from other communication systems that are not in the immediate vicinity.
Other features of the apparatus and method include the ability to produce indicator signals for display to the user which correspond to dangerous or cautionary situations relating to abnormally loud noises, or readily recognized sound patterns, such as a siren. The apparatus may also comprise a cellular phone which operates as an adjunct to the microphone and speaker elements, providing mobile telephone communications for the handicapped. In addition, indications can be transmitted to the user representing information related to geographic location, distance to a preset destination, or other personally useful information. The above and other advantages of this invention will become apparent from the following more detailed description, in conjunction with the accompanying drawings and illustrative embodiments.
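The loud-noise indicator described above can be sketched as a simple level check. The decibel computation is standard; the threshold value, function names, and frame data are editorial assumptions for illustration, not figures taken from the patent.

```python
import math

def level_db(samples, ref=1.0):
    """RMS level of an audio frame, in dB relative to `ref` full scale."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return -float("inf") if rms == 0 else 20 * math.log10(rms / ref)

def check_alarm(samples, threshold_db=-10.0):
    """Return an indicator message when a frame is abnormally loud
    (a stand-in for the first indicator 80; threshold is illustrative)."""
    return "ALARM: loud sound" if level_db(samples) > threshold_db else None

quiet = [0.01] * 160   # low-level frame, about -40 dB
loud = [0.9] * 160     # near-full-scale frame, about -1 dB
print(check_alarm(quiet), check_alarm(loud))  # None ALARM: loud sound
```

A real system would also match recognized sound patterns (e.g. a siren's tone alternation), as the text notes, rather than relying on level alone.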
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a perspective view of the present invention indicating a user's perception of a projected message image.
FIG. 2 is a stylized representation of the display for the user, including messages and various information indicators.
FIG. 3 is a simplified block diagram of the present invention.
DESCRIPTION OF THE PREFERRED EMBODIMENT
FIG. 1 illustrates a perspective view of an exemplary embodiment of the present invention, as it is worn by user 200. A communication system 10 for deaf, hearing impaired, or mute persons optionally comprises headband 120, or other suspension means, such as eyeglasses, goggles, or a helmet, for supporting the various component elements which comprise the present invention.
The user 200 receives messages, including speech and other audio information, by way of a message reception means, such as a microphone 70, most preferably a self-powered and unobtrusive clip-on type. The communication system 10 also comprises a data processing means, such as a wearable processor 20, which may be co-located with the user 200 and electrically connected to a display means, such as a projector 50, so as to control the content of the projector display 140.
A keypad-tablet 35, or other user message entry means, comprising a keyboard 40 and a handwriting recognition display 42, is manipulated by the user 200 to present user message information to the wearable processor 20 for transmission to the environment surrounding the user 200 by way of a speaker 60 and/or a remote data transceiver 72. A stylus 45 can be used to write on the recognition display 42 so that the printed or handwritten message entry by the user 200 can be recognized by the wearable processor 20 and communicated to the environment surrounding the user 200. The keypad-tablet 35 may be connected directly to the processor 20, or communicate with the processor 20 via radio or other electromagnetic means, such as by infra-red signals (commonly found on television remote control devices). The wearable processor 20 is capable of transmitting messages into the environment by way of a message transmission means comprising a synthesizer 57, which converts user messages into message speech data, and a speaker 60, which presents audio information to the environment surrounding the user 200.
Turning now to FIG. 2, a representative sample of the display of the communication system 10 can be seen. Essential to the practice of the invention is the message area 100, which contains the collection of alphanumeric characters which communicate speech to the user, after it has been recognized from audio information in the surrounding environment, or submitted to the processor 20 by way of a remotely transmitted message from a similar device. It is preferred that the message area 100 be located on the periphery of the display 140 so that the viewing area directly in front of the user 200 remains clear. However, it is most preferred that projector 50 presents a display 140 to the user 200 which can be seen through (a “see-through” display), so that any message text or indicators shown thereon will not interfere with the user's normal line of sight. While shown as a single message line display, those skilled in the art will realize that several message lines can also be projected onto display 140. Most preferably, message area 100 will comprise two lines of material presented in teleprompter fashion.
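The two-line, teleprompter-style presentation preferred for message area 100 can be sketched as a small scrolling text buffer. The sketch below is illustrative only; the class name and the per-line character width are assumptions, not elements of the disclosed system.

```python
class MessageArea:
    """Two-line scrolling text buffer, teleprompter fashion:
    new words fill the lower line; when a word no longer fits,
    the lower line scrolls up and the oldest line scrolls off."""

    def __init__(self, width=32):
        self.width = width        # characters per display line (assumed)
        self.lines = ["", ""]     # [upper line, lower line]

    def append_word(self, word):
        # Try to fit the word on the lower line; otherwise scroll up.
        candidate = (self.lines[1] + " " + word).strip()
        if len(candidate) <= self.width:
            self.lines[1] = candidate
        else:
            self.lines[0] = self.lines[1]   # lower line becomes upper
            self.lines[1] = word            # word starts a fresh lower line

    def render(self):
        return self.lines[0], self.lines[1]
```

Recognized words would be appended as the speech recognizer emits them, so the user always sees the most recent two lines of the conversation.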
A first indicator 80 and a second indicator 90 represent optional elements of the display 140. The first indicator 80 may consist of an alarm icon, flashing light, or some type of analog display symbol which indicates the amount of noise in the environment. Of course, these possibilities are listed by way of example, and not by way of limitation. Similarly, the second indicator 90 may represent a geographical position location, generated by an attached Global Positioning Satellite (GPS) system, the distance of the user from a selected location, or the sound level of the surrounding environment in decibels. Of course, these specific examples of the content of second indicator 90 are not meant to limit the possibility for display of information to the user. Any information which can be processed by the wearable processor 20, either acquired from the surrounding environment, or pre-programmed into the unit, can be displayed to the user 200 at an appropriate time. Similarly, the time of day or of a specific event can also be included in the content of the second indicator 90.
Turning now to FIG. 3, a simplified block diagram of the communication system 10 can be seen. A power source 30, which may consist of a rechargeable battery, solar cell, or other electric current power generator, provides power to the wearable processor 20. To process surrounding environmental audio data, the environmental message reception means comprising the microphone 70, and optionally, the filter 150, is used to acquire audio information. The filter 150 is most preferably implemented as a band-pass filter which passes frequencies from about 500 Hz to about 2,999 Hz. As a further aid to recognizing speech, any of several noise cancellation techniques can be implemented by the wearable processor 20 or the speech recognizer 55. Information conversion means, such as a speech recognizer 55, preferably operates in a manner that will accept continuous speech input and other audio information (such as a police siren tone pattern or abnormally loud sound), using templates and a vocabulary lookup table, or other methods well known in the art, and convert it into speech data. The data processing means, such as the processor 20, will then organize the speech data into a meaningful symbolic representation for transmission to the projector 50.
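The speech-band filtering attributed to filter 150 can be approximated in software with two cascaded one-pole sections, one limiting high frequencies and one removing low frequencies. This is a minimal sketch under an assumed fixed sample rate, not the particular filter design disclosed above.

```python
import math

def one_pole_coeff(fc, fs):
    # Smoothing coefficient for a one-pole section with cutoff fc at rate fs.
    return math.exp(-2.0 * math.pi * fc / fs)

def bandpass(samples, fs=8000, f_lo=500.0, f_hi=2999.0):
    """Crude speech-band filter: a one-pole low-pass at f_hi followed by
    a one-pole high-pass at f_lo, passing roughly 500-2,999 Hz."""
    a_lo = one_pole_coeff(f_hi, fs)   # low-pass coefficient
    a_hi = one_pole_coeff(f_lo, fs)   # high-pass coefficient
    lp = hp_state = 0.0
    out = []
    for x in samples:
        lp = (1.0 - a_lo) * x + a_lo * lp            # low-pass at f_hi
        hp_state = (1.0 - a_hi) * lp + a_hi * hp_state
        out.append(lp - hp_state)                    # high-pass at f_lo
    return out
```

A practical implementation would more likely use a higher-order digital filter or dedicated analog circuitry, but the pass-band behavior is the same in principle: out-of-band energy (hum, rumble, very high frequencies) is attenuated before recognition.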
To process user initiated messages, the message entry means such as the keypad-tablet 35 is used to acquire user message information, which is passed on to a message transmission means, comprising a speech synthesizer 57 and the speaker 60, producing an audible signal in the user's immediate vicinity. Synthesized speech, produced by any of several methods well known in the art, is most preferably formed by the synthesizer 57 so as to mimic the gender of the user. In this manner, a mute person can communicate with those who can hear. The message transmission means may alternatively comprise the synthesizer 57 and a remote data transceiver 72, wherein the user input message information from the keypad-tablet 35 can instead be sent to a remote location via the remote data transceiver 72 and an energy transceiver 77. The remote data transceiver 72 comprises an electromagnetic energy transceiver capable of receiving and transmitting electromagnetic energy using an appropriate energy transceiver 77. If the electromagnetic energy consists of infrared or other light, then a laser diode transmitter and receiver, or similar devices, would be most appropriate to use as the energy transceiver 77. If the electromagnetic energy consists of radio waves, then an antenna designed for the desired frequency and bandwidth should be used as the energy transceiver 77.
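The choice between local audible output and remote transmission amounts to simple routing logic in the processor. In the sketch below, the callables are hypothetical stand-ins for the synthesizer 57, the speaker 60, and the remote data transceiver 72; none of the names come from the specification.

```python
def transmit_user_message(text, synthesize, play, send_remote=None):
    """Route an entered user message: synthesize and play it in the
    local environment, or, when a remote link is selected, send the
    message data out through the data transceiver instead."""
    if send_remote is not None:
        send_remote(text)          # message data to a distant device
        return "remote"
    play(synthesize(text))         # audible speech in the local environment
    return "local"
```

When the remote path is taken, the text can be sent as data and synthesized (or displayed) at the receiving unit, which is why the same message entry means serves both modes.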
By means of the above-described apparatus and method of communication, any person, whether mute or not, can communicate at a distance with another person who is hearing-impaired or deaf. Of course, such communications can also exist between persons at each end of the communication link who are not disabled. In any event, those skilled in the art will recognize that such an arrangement of remote radio communication devices also lends itself to multi-channel communication, which allows groups of disabled users to communicate amongst themselves, either out loud, or silently. Another example of system use might enable all English speakers to communicate on one channel, and all Spanish speakers on another. Such multi-channel operation is desired to enhance user privacy and the practicality of the system. It is anticipated that a single “international” frequency, along with several “national” frequencies will be allocated to facilitate communications between people of differing countries, and the same country, as desired.
The processor 20 can be programmed to recognize a dangerous or a cautionary situation, due to the sheer magnitude of the electrical signal received, or based on a recognized pattern, and will thereafter display an alarm signal by way of the first indicator 80 to the user 200 via the projector 50 and the display 140. Operation of the communication system 10 by using audio input and output (via the microphone 70 and the speaker 60) obviates the need for ASL, and allows deaf, hearing impaired, or mute persons to communicate with others that are similarly disabled, and also with individuals having unimpaired hearing and speech capability.
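Magnitude-based recognition of a dangerous situation can be sketched as a per-frame level check. The decibel threshold below is an illustrative assumption (the specification names no figure), and pattern-based detection of, say, a siren would require template matching layered on top of this.

```python
import math

def level_dbfs(frame):
    """RMS level of one audio frame, in dB relative to full scale."""
    rms = math.sqrt(sum(s * s for s in frame) / len(frame))
    return 20.0 * math.log10(rms) if rms > 0.0 else float("-inf")

def check_alarm(frame, loud_dbfs=-6.0):
    """Return 'LOUD' when the frame exceeds the assumed danger
    threshold; the processor would then activate first indicator 80."""
    return "LOUD" if level_dbfs(frame) >= loud_dbfs else None
```

The same level computation could drive the optional decibel readout of second indicator 90, so one analysis pass serves both the alarm and the informational display.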
Environmental information, normally existing as local audio information, can also be received by way of the energy transceiver 77 and the remote data transceiver 72. Such remote information will normally originate from another communication system 10 at a remote location (i.e. outside of the line-of-sight, or out of normal hearing range) and consist of audio information, or speech data. The acquired signals can then be processed by the speech recognizer 55 or, if already processed by the remote communication system 10, the received speech data can be sent directly to the processor 20 for organization into an appropriate symbolic representation of the data, and on to the projector 50 for viewing by the user 200 and, optionally, transmitted to the external environment via the speech synthesizer 57 and the speaker 60.
An optional element of the communication system 10 is a storage device 62, which can store several canned messages for recall by the user 200. This provides a convenient means of rapidly communicating standard words and/or phrases to the external environment. In addition, the storage device 62, which comprises battery-powered RAM memory, EPROMs, disk drives, or tape drives, can be used to save messages that have been received from the external environment for receipt and processing at a later time by the user 200.
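The behavior of storage device 62 (canned phrases for rapid recall, plus received messages held for later reading) can be modeled as a small store; the class and method names are illustrative assumptions.

```python
class MessageStore:
    """Models storage device 62: canned phrases keyed by slot for
    quick recall, and a queue of received messages saved for later."""

    def __init__(self):
        self.canned = {}    # slot number -> frequently used phrase
        self.saved = []     # messages held for later processing

    def store_phrase(self, slot, phrase):
        self.canned[slot] = phrase

    def recall(self, slot):
        return self.canned.get(slot, "")

    def save_incoming(self, message):
        self.saved.append(message)

    def read_saved(self):
        messages, self.saved = self.saved, []   # drain the queue
        return messages
```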
In another embodiment of the communication system 10, a cellular telephone (not shown) can be used as a supplement to the microphone 70 and the speaker 60. When the telephone receives an incoming call, the first indicator 80 within the display 140 can be activated, along with a message to the user 200 that the “Phone is ringing.” The user 200 can then choose to answer the phone, and any input via the keypad-tablet 35 will be converted to synthesized speech, which is sent to the telephone receiver, instead of the speaker 60. Similarly, any voice emanating from the telephone speaker output can be directed to the filter 150 and the speech recognizer 55 for conversion to text and display via the projector 50. This embodiment provides mobile telephone operation capability to deaf, hearing impaired, or mute persons, obviating the need for teletype (TTY) devices on one or both ends of a telephone conversation.
The processor 20 is understood to be equipped with the necessary hardware and software to translate speech it receives from the recognizer into another language if desired, and then to display the resulting message via the projector 50 onto the display 140. The recognizer 55 may also be implemented as part of the software programming for the processor 20, obviating the need for a separate hardware element. The invention also anticipates a combination of some or all of the separate elements, such as the microphone 70, the filter 150, the recognizer 55, the processor 20, the speaker 60, the synthesizer 57, and the storage device 62 into a single, integrated, hybrid unit. The separate elemental units are illustrated as such for clarity, and not by necessity. Likewise, the processor 20 can be remotely located to reduce local weight/power requirements. A separate radio data communications path (not shown) can be used to transmit and receive processed audio and video data to and from the user 200, if desired.
The projector 50 may project the display 140 out into space in front of the user, as occurs with a holographic projection system, or may consist of any of several head-mounted optical projection systems, such as the HOPROS™ unit produced by Optronic Instruments & Products, the PROVIEW™ system produced by KEO, or the Mark II GP Series of wearable displays available from Seattle Sight Systems, Inc. The projector 50 may also comprise a see-through liquid crystal display with the display 140 indicated directly thereon, or a retinal scanner, such as the Virtual Retinal Display being developed by the Human Interface Technology Laboratory at the University of Washington. In more sophisticated embodiments of this invention, the display 140 as perceived by the user 200 can be adjusted to accommodate vision correction or deficiency so that any image perceived by a particular user 200 will appear in sharp focus. The adjustment can be programmed into the processor 20 by sensing a user-controlled input (not shown), or mechanically adjusted by moving the display 140 closer to, or farther away from, the eye. The recognition display 42 and the stylus 45 input devices are similar to those readily available in the market, and can be identical to those used on electronic appointment calendars or pen-top computers. The recognition display 42 can be separated from the keyboard 40 and used alone, without the use of the keyboard 40. In addition, the recognition display 42 may contain the intelligence necessary for independent handwriting or printing recognition, or may pass on electrical signals so that the wearable processor 20 can recognize the content of the user message entry by way of writing recognition algorithms well known in the art. The present invention anticipates that the user 200 will receive feedback at the message area 100 in character text form for each message entered on the recognition display 42. 
In this way, the user 200 can be assured that what is written on the recognition display 42, using the stylus 45, is properly translated into text for transmission to others. In an alternative embodiment of the present invention, transmission of the characters will not occur until the user has authorized such transmission.
The communication system 10 anticipates use as a real-time language translator for disabled persons, as well as for those who can hear and speak. Language translation software modules are readily available for use with text received by the wearable processor after voice recognition occurs, or after messages have been input via the keypad-tablet 35. Several such language translation program modules may be simultaneously resident in the processor 20 memory. The user 200 can, for example, write or type a message in English, and after textual translation, it will be “spoken” in a selected alternate language via the speech synthesizer 57 and the speaker 60. Additionally, a foreign language speaker may verbally address the user 200, who, upon selecting the proper translation software module for execution by the processor 20, may then (after recognition and translation occurs) be enabled to read the translated message at the message area 100 of the display 140 (or hear the synthesized version thereof) in the user's 200 own language. Deaf/mute users may also communicate between themselves, each in their own different language, by a similar process.
The present invention will also facilitate round-table discussions among those conversing in two or more languages, by way of speech and/or remote transceiver broadcast and transmission. It is anticipated that appropriate translation codes, either oral or electromagnetic, will be sent with the text so that the communication system 10 can automatically engage the proper software translation module for use by the processor 20. Such operational methods will obviate the need for manual selection of translation software modules.
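Automatic engagement of the proper translation module from a transmitted code might look like a simple dispatch table. The code format ("en>es") and the placeholder translators below are assumptions; a real system would wrap the full translation software modules resident in processor 20 memory.

```python
# Placeholder translators keyed by (source, destination) language;
# real entries would invoke complete translation software modules.
TRANSLATORS = {
    ("en", "es"): lambda text: "[es] " + text,
    ("es", "en"): lambda text: "[en] " + text,
}

def translate(text, code):
    """Select a translation module from a received language code such
    as 'en>es' and apply it; unknown codes pass the text through."""
    src, _, dst = code.partition(">")
    module = TRANSLATORS.get((src, dst))
    return module(text) if module else text
```

Sending the code alongside the text is what lets each receiving unit pick its own module without manual selection, as the paragraph above describes.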
It is also anticipated that the user 200 will be able to enjoy television or theater using the communication system 10 by employing the microphone 70 and the speech recognizer 55 to produce text via the processor 20 for presentation to the user 200 via the display 140. By using communication system 10, it does not matter whether the speaker is directly facing the listener or has turned away from the front of the stage. In addition, even if the speaker has left the immediate area of the listener, the communication system 10 will enable the listener to receive the appropriate message in an environment in which audio communication is possible. At more remote locations, the user may employ the remote data transceiver 72 and the energy transceiver 77 to enable such communications. Such use of the system 10 will obviate the need for expensive subtitle apparatus for use with televisions, and dependence on the proper coding which must be supplied by the television network therefor.
The method of communication for deaf, hearing impaired, or mute persons presented herein comprises the steps of: receiving audio information; converting the audio information, whether speech or other information, such as a siren or explosion, into speech data; then converting the speech data into a symbolic representation which is meaningful to the user (e.g. German language text characters for a German language speaker); and displaying the symbolic representation to the user, either in the space in front of the eye, or directly onto the eye, as occurs with retinal scanning. The method further comprises acquiring user messages which are intended for transmission to others in the surrounding environment and transmitting those messages to the surrounding environment in the form of audio information.
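The method steps above read naturally as a per-frame processing pipeline. The three callables in this sketch are hypothetical stand-ins for the recognizer 55, the processor 20, and the projector 50; they are not interfaces defined in the specification.

```python
def communication_pipeline(audio_frames, recognize, symbolize, display):
    """Run the method steps per frame: receive audio, convert it to
    speech data, organize the data into symbols, display the symbols."""
    shown = []
    for frame in audio_frames:
        speech_data = recognize(frame)     # audio -> speech data (or None)
        if speech_data is None:
            continue                       # unrecognized / non-speech frame
        symbols = symbolize(speech_data)   # speech data -> text symbols
        display(symbols)                   # present on display 140
        shown.append(symbols)
    return shown
```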
Although the invention has been described with reference to a specific embodiment, this description is not meant to be construed in a limiting sense. On the contrary, even though only specific devices have been shown to be mounted to the headband 120, all elements of the instant invention can be mounted thereon, given sufficient miniaturization. Also, various alternative stylized displays can be used, other than that shown in FIG. 2. As long as the user is given visual indication of the message he receives from the environment, the spirit of this invention is effected. Other various modifications of the enclosed embodiments will become apparent to those skilled in the art upon reference to the description of the invention. It is, therefore, contemplated that the following claims will cover such modifications, alternatives, and equivalents that fall within the true spirit of the scope of the invention.

Claims (27)

We claim:
1. A communication system for deaf, hearing impaired, or mute persons, comprising:
a message reception means for receiving audio information;
an information conversion means to convert said audio information into speech data, said information conversion means being in communication with said message reception means;
a data processing means for organizing said speech data into a symbolic representation of said speech data, said data processing means being in communication with said information conversion means;
a visual display means for displaying said symbolic representation, said display means being in communication with said data processing means;
a message entry means for acquisition of user messages, said message entry means being in communication with said data processing means; and
a message transmission means for conversion of said user messages into message data and transmission of said message data into the surrounding environment, said message transmission means being in communication with said data processing means.
2. The system of claim 1 wherein said message reception means further comprises a microphone.
3. The system of claim 1 wherein said message reception means further comprises a microphone and a filter.
4. The system of claim 1 wherein said information conversion means comprises a speech recognizer.
5. The system of claim 1 wherein said information conversion means comprises a continuous speech recognizer.
6. The system of claim 1 wherein said data processing means comprises a personal computer.
7. The system of claim 1 wherein said data processing means comprises a personal digital assistant.
8. The system of claim 1 wherein said display means is affixed to a pair of eye glasses.
9. The system of claim 1 wherein said display means is affixed to a headband.
10. The system of claim 1 wherein said display means comprises a liquid crystal display.
11. The system of claim 1 wherein said display means comprises a see-through liquid crystal display.
12. The system of claim 1 wherein said display means comprises a retinal scanner.
13. The system of claim 1 wherein said display means comprises a holographic projection system.
14. The system of claim 1 further comprising a storage device, said storage device being in communication with said data processing means.
15. The system of claim 1 further comprising a cellular telephone, said cellular telephone being in communication with said information conversion means and said message transmission means.
16. The system of claim 1 further comprising a remote data transceiver, said transceiver being in communication with said information conversion means and said message transmission means.
17. A method of communication for deaf, hearing impaired, or mute persons, comprising the steps of:
receiving audio information;
first converting said audio information into speech data;
second converting said speech data into a symbolic representation of said data;
displaying said symbolic representation in a visual format;
acquiring user messages; and
transmitting said messages to the surrounding environment in the form of audio information.
18. The method of claim 17 wherein said receiving step is accomplished with a microphone.
19. The method of claim 17 wherein said receiving step is accomplished with a microphone and a filter.
20. The method of claim 17 wherein said first converting step is accomplished with a speech recognizer.
21. The method of claim 17 wherein said second converting step is accomplished with a personal computer.
22. The method of claim 17 wherein said displaying step is accomplished with a liquid crystal display.
23. The method of claim 17 wherein said displaying step is accomplished with a see-through liquid crystal display.
24. The method of claim 17 wherein said displaying step is accomplished with a retinal scanner.
25. The method of claim 17 wherein said displaying step is accomplished with a holographic projector.
26. A communication system for deaf, hearing impaired, or mute persons, comprising:
a message reception means for receiving audio information;
a data processing means to convert said audio information into speech data and organize said speech data into a symbolic representation of said speech data, said data processing means being in communication with said message reception means;
a display means for displaying said symbolic representation, said display means being in communication with said data processing means;
a message entry means for acquisition of user messages, said message entry means being in communication with said data processing means; and
a message transmission means for conversion of said user messages into message data and transmission of said message data into the surrounding environment, said message transmission means being in communication with said data processing means.
27. The system of claim 26 further comprising a remote data transceiver, said transceiver being in communication with said data processing means and said message transmission means.
US08/920,820 1996-08-29 1997-08-29 Communication device and method for deaf and mute persons Expired - Lifetime US6240392B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US08/920,820 US6240392B1 (en) 1996-08-29 1997-08-29 Communication device and method for deaf and mute persons

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US2640096P 1996-08-29 1996-08-29
US2548096P 1996-08-29 1996-08-29
US08/920,820 US6240392B1 (en) 1996-08-29 1997-08-29 Communication device and method for deaf and mute persons

Publications (1)

Publication Number Publication Date
US6240392B1 true US6240392B1 (en) 2001-05-29

Family

ID=27362557

Family Applications (1)

Application Number Title Priority Date Filing Date
US08/920,820 Expired - Lifetime US6240392B1 (en) 1996-08-29 1997-08-29 Communication device and method for deaf and mute persons

Country Status (1)

Country Link
US (1) US6240392B1 (en)

Cited By (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020103649A1 (en) * 2001-01-31 2002-08-01 International Business Machines Corporation Wearable display system with indicators of speakers
WO2002073957A1 (en) * 2001-03-14 2002-09-19 Koninklijke Philips Electronics N.V. A television receiver having a notepad/message recorder functionality
US20020147589A1 (en) * 2001-04-04 2002-10-10 Nec Viewtechnology, Ltd. Graphic display device with built-in speech recognition function
US20020158816A1 (en) * 2001-04-30 2002-10-31 Snider Gregory S. Translating eyeglasses
US20020193993A1 (en) * 1998-09-30 2002-12-19 Leviton Dan?Apos;L Voice communication with simulated speech data
US20030053603A1 (en) * 2001-09-20 2003-03-20 Vejlgaard Benny Niels System for and method of detecting a connection of a text telephone (TTY) device to a mobile phone
US6542200B1 (en) 2001-08-14 2003-04-01 Cheldan Technologies, Inc. Television/radio speech-to-text translating processor
US20030172282A1 (en) * 2002-03-07 2003-09-11 International Business Machines Corporation Log-on processing
ES2192143A1 (en) * 2002-02-22 2003-09-16 Univ Granada Communication device for the verbally handicapped.
US6701162B1 (en) * 2000-08-31 2004-03-02 Motorola, Inc. Portable electronic telecommunication device having capabilities for the hearing-impaired
US20040155770A1 (en) * 2002-08-22 2004-08-12 Nelson Carl V. Audible alarm relay system
US6828918B2 (en) * 2000-11-29 2004-12-07 International Business Machines Corporation Personalized accessibility identification receiver/transmitter and method for providing assistance
US20050004978A1 (en) * 1996-02-29 2005-01-06 Reed Drummond Shattuck Object-based on-line transaction infrastructure
US6850166B2 (en) * 2001-06-28 2005-02-01 Nokia Mobile Phones Limited Ancillary wireless detector
US20050038663A1 (en) * 2002-01-31 2005-02-17 Brotz Gregory R. Holographic speech translation system and method
US20050066004A1 (en) * 2003-09-18 2005-03-24 Gan Kenneth A. Interactive real time visual conversation system for face-to-face communication
US20050106536A1 (en) * 2003-11-19 2005-05-19 Raanan Liebermann Touch language
US20050195106A1 (en) * 2004-03-03 2005-09-08 Davis Alan C. Hand held wireless occupant communicator
ES2245203A1 (en) * 2003-12-02 2005-12-16 Universidad De La Laguna Electronic optical acoustic transducer for visualization of sounds for deaf people, presents origin of sonorous source, potency of sound, and frequency information comprising sound in graphical form which is received by user in actual time
US6993474B2 (en) 2001-05-17 2006-01-31 Curry David G Interactive conversational speech communicator method and system
US20060234193A1 (en) * 2002-09-17 2006-10-19 Nozomu Sahashi Sign language interpretation system and a sign language interpretation method
US20070003025A1 (en) * 2005-06-24 2007-01-04 Insitituto Centro De Pesquisa E Desenvolvimento Em Rybena: an asl-based communication method and system for deaf, mute and hearing impaired persons
US20070140471A1 (en) * 2004-01-20 2007-06-21 Koninklijke Philips Electronics N.V. Enhanced usage of telephone in noisy surroundings
US20070167162A1 (en) * 2005-12-30 2007-07-19 Kim Young B Multi-functional communication terminal device and communication relay device for use in noise environment
US7277858B1 (en) * 2002-12-20 2007-10-02 Sprint Spectrum L.P. Client/server rendering of network transcoded sign language content
US20080064326A1 (en) * 2006-08-24 2008-03-13 Stephen Joseph Foster Systems and Methods for Casting Captions Associated With A Media Stream To A User
US20080097759A1 (en) * 2006-10-24 2008-04-24 Samsung Electronics Co., Ltd. Method and apparatus for remote control in portable terminal
US20080109208A1 (en) * 2006-04-21 2008-05-08 Scomm, Inc. Interactive conversational speech communicator method and system
US20080301543A1 (en) * 2001-03-19 2008-12-04 Stephane Herman Maes Intelligent Document Filtering
US7676372B1 (en) * 1999-02-16 2010-03-09 Yugen Kaisha Gm&M Prosthetic hearing device that transforms a detected speech into a speech of a speech form assistive in understanding the semantic meaning in the detected speech
US20100088096A1 (en) * 2008-10-02 2010-04-08 Stephen John Parsons Hand held speech recognition device
US20100198582A1 (en) * 2009-02-02 2010-08-05 Gregory Walker Johnson Verbal command laptop computer and software
US20100198580A1 (en) * 2000-10-25 2010-08-05 Robert Glen Klinefelter System, method, and apparatus for providing interpretive communication on a network
WO2011000113A1 (en) * 2009-06-30 2011-01-06 Harmonya Technologies Multiple sound and voice detector for hearing- impaired or deaf person
US20110116608A1 (en) * 2009-11-18 2011-05-19 Gwendolyn Simmons Method of providing two-way communication between a deaf person and a hearing person
CN102074150A (en) * 2011-01-12 2011-05-25 无锡工艺职业技术学院 Sentence and speech conversion device for the deaf to communicate with outside world
US8280954B2 (en) 2010-03-25 2012-10-02 Scomm, Inc. Method and system for providing live real-time communication via text between mobile user devices
WO2011145117A3 (en) * 2010-05-17 2013-01-24 Tata Consultancy Services Limited Hand-held communication aid for individuals with auditory, speech and visual impairments
Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3936605A (en) * 1972-02-14 1976-02-03 Textron, Inc. Eyeglass mounted visual display
US3746793A (en) * 1972-08-09 1973-07-17 Phonics Corp Telephone communication system for the hearing impaired
US4268721A (en) * 1977-05-02 1981-05-19 Sri International Portable telephone communication device for the hearing impaired
US4414431A (en) * 1980-10-17 1983-11-08 Research Triangle Institute Method and apparatus for displaying speech information
US4694494A (en) * 1983-06-07 1987-09-15 Pathway Communications Limited Electronic memory devices for the blind
US5526407A (en) * 1991-09-30 1996-06-11 Riverrun Technology Method and apparatus for managing information
US5710806A (en) * 1994-09-22 1998-01-20 Ameri Phone, Inc. Telecommunications device for the hearing impaired with telephone, text communication and answering, and automated voice carryover
US6049328A (en) * 1995-10-20 2000-04-11 Wisconsin Alumni Research Foundation Flexible access system for touch screen devices
US5774857A (en) * 1996-11-15 1998-06-30 Motorola, Inc. Conversion of communicated speech to text for transmission as RF modulated base band video

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Bazzani et al., "PC-Based Communication System for Deaf-Blind People," IEEE GLOBECOM 1988 (Global Telecommunications Conference), Dec. 1988, pp. 2.3.1 to 2.3.5. *

Cited By (86)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050004978A1 (en) * 1996-02-29 2005-01-06 Reed Drummond Shattuck Object-based on-line transaction infrastructure
US6501751B1 (en) * 1998-09-30 2002-12-31 Symantec Corporation Voice communication with simulated speech data
US20020193993A1 (en) * 1998-09-30 2002-12-19 Leviton Dan'l Voice communication with simulated speech data
US7593387B2 (en) 1998-09-30 2009-09-22 Symantec Corporation Voice communication with simulated speech data
US7676372B1 (en) * 1999-02-16 2010-03-09 Yugen Kaisha Gm&M Prosthetic hearing device that transforms a detected speech into a speech of a speech form assistive in understanding the semantic meaning in the detected speech
US6701162B1 (en) * 2000-08-31 2004-03-02 Motorola, Inc. Portable electronic telecommunication device having capabilities for the hearing-impaired
US10499168B2 (en) * 2000-10-25 2019-12-03 Kp Innovations, Llc System, method, and apparatus for providing interpretive communication on a network
US20100198580A1 (en) * 2000-10-25 2010-08-05 Robert Glen Klinefelter System, method, and apparatus for providing interpretive communication on a network
US6828918B2 (en) * 2000-11-29 2004-12-07 International Business Machines Corporation Personalized accessibility identification receiver/transmitter and method for providing assistance
US20020103649A1 (en) * 2001-01-31 2002-08-01 International Business Machines Corporation Wearable display system with indicators of speakers
US6975991B2 (en) * 2001-01-31 2005-12-13 International Business Machines Corporation Wearable display system with indicators of speakers
WO2002073957A1 (en) * 2001-03-14 2002-09-19 Koninklijke Philips Electronics N.V. A television receiver having a notepad/message recorder functionality
US8239756B2 (en) * 2001-03-19 2012-08-07 International Business Machines Corporation Intelligent document filtering
US20080301543A1 (en) * 2001-03-19 2008-12-04 Stephane Herman Maes Intelligent Document Filtering
US20020147589A1 (en) * 2001-04-04 2002-10-10 Nec Viewtechnology, Ltd. Graphic display device with built-in speech recognition function
US20020158816A1 (en) * 2001-04-30 2002-10-31 Snider Gregory S. Translating eyeglasses
US6993474B2 (en) 2001-05-17 2006-01-31 Curry David G Interactive conversational speech communicator method and system
US20060206309A1 (en) * 2001-05-17 2006-09-14 Curry David G Interactive conversational speech communicator method and system
US6850166B2 (en) * 2001-06-28 2005-02-01 Nokia Mobile Phones Limited Ancillary wireless detector
US6542200B1 (en) 2001-08-14 2003-04-01 Cheldan Technologies, Inc. Television/radio speech-to-text translating processor
US7151820B2 (en) 2001-09-20 2006-12-19 Siemens Communications Inc. System for and method of detecting a connection of a text telephony (TTY) device to a mobile phone
US20030053603A1 (en) * 2001-09-20 2003-03-20 Vejlgaard Benny Niels System for and method of detecting a connection of a text telephone (TTY) device to a mobile phone
US20050038663A1 (en) * 2002-01-31 2005-02-17 Brotz Gregory R. Holographic speech translation system and method
ES2192143A1 (en) * 2002-02-22 2003-09-16 Univ Granada Communication device for the verbally handicapped.
US20030172282A1 (en) * 2002-03-07 2003-09-11 International Business Machines Corporation Log-on processing
US20040155770A1 (en) * 2002-08-22 2004-08-12 Nelson Carl V. Audible alarm relay system
US20060234193A1 (en) * 2002-09-17 2006-10-19 Nozomu Sahashi Sign language interpretation system and a sign language interpretation method
US7277858B1 (en) * 2002-12-20 2007-10-02 Sprint Spectrum L.P. Client/server rendering of network transcoded sign language content
US20050066004A1 (en) * 2003-09-18 2005-03-24 Gan Kenneth A. Interactive real time visual conversation system for face-to-face communication
US20050106536A1 (en) * 2003-11-19 2005-05-19 Raanan Liebermann Touch language
US20130289970A1 (en) * 2003-11-19 2013-10-31 Raanan Liebermann Global Touch Language as Cross Translation Between Languages
US8523572B2 (en) * 2003-11-19 2013-09-03 Raanan Liebermann Touch language
ES2245203A1 (en) * 2003-12-02 2005-12-16 Universidad De La Laguna Electronic optical-acoustic transducer for visualizing sounds for deaf people, graphically presenting the origin of the sound source, its intensity, and its frequency content to the user in real time
US20070140471A1 (en) * 2004-01-20 2007-06-21 Koninklijke Philips Electronics N.V. Enhanced usage of telephone in noisy surroundings
US20050195106A1 (en) * 2004-03-03 2005-09-08 Davis Alan C. Hand held wireless occupant communicator
US20070003025A1 (en) * 2005-06-24 2007-01-04 Instituto Centro De Pesquisa E Desenvolvimento Em Rybena: an ASL-based communication method and system for deaf, mute and hearing impaired persons
US20070167162A1 (en) * 2005-12-30 2007-07-19 Kim Young B Multi-functional communication terminal device and communication relay device for use in noise environment
US8275602B2 (en) 2006-04-21 2012-09-25 Scomm, Inc. Interactive conversational speech communicator method and system
US20080109208A1 (en) * 2006-04-21 2008-05-08 Scomm, Inc. Interactive conversational speech communicator method and system
US9547981B1 (en) 2006-08-18 2017-01-17 Sockeye Licensing Tx Llc System, method and apparatus for using a wireless device to control other devices
US20080064326A1 (en) * 2006-08-24 2008-03-13 Stephen Joseph Foster Systems and Methods for Casting Captions Associated With A Media Stream To A User
US20080097759A1 (en) * 2006-10-24 2008-04-24 Samsung Electronics Co., Ltd. Method and apparatus for remote control in portable terminal
US8461986B2 (en) 2007-12-14 2013-06-11 Wayne Harvey Snyder Audible event detector and analyzer for annunciating to the hearing impaired
US20100088096A1 (en) * 2008-10-02 2010-04-08 Stephen John Parsons Hand held speech recognition device
US20100198582A1 (en) * 2009-02-02 2010-08-05 Gregory Walker Johnson Verbal command laptop computer and software
WO2011000113A1 (en) * 2009-06-30 2011-01-06 Harmonya Technologies Multiple sound and voice detector for hearing-impaired or deaf person
US20110116608A1 (en) * 2009-11-18 2011-05-19 Gwendolyn Simmons Method of providing two-way communication between a deaf person and a hearing person
US8280954B2 (en) 2010-03-25 2012-10-02 Scomm, Inc. Method and system for providing live real-time communication via text between mobile user devices
US10257130B2 (en) 2010-03-25 2019-04-09 Scomm, Inc. Method and system for providing live real-time communication via text between mobile user devices
US9565262B2 (en) 2010-03-25 2017-02-07 Scomm, Inc. Method and system for providing live real-time communication via text between mobile user devices
US20130079061A1 (en) * 2010-05-17 2013-03-28 Tata Consultancy Services Limited Hand-held communication aid for individuals with auditory, speech and visual impairments
CN102939791A (en) * 2010-05-17 2013-02-20 塔塔咨询服务有限公司 Hand-held communication aid for individuals with auditory, speech and visual impairments
WO2011145117A3 (en) * 2010-05-17 2013-01-24 Tata Consultancy Services Limited Hand-held communication aid for individuals with auditory, speech and visual impairments
US9111545B2 (en) * 2010-05-17 2015-08-18 Tata Consultancy Services Limited Hand-held communication aid for individuals with auditory, speech and visual impairments
CN102939791B (en) * 2010-05-17 2015-09-23 塔塔咨询服务有限公司 Hand-held communication aid for individuals with auditory, speech and visual impairments
CN102074150A (en) * 2011-01-12 2011-05-25 无锡工艺职业技术学院 Sentence and speech conversion device for the deaf to communicate with the outside world
US9191762B1 (en) 2012-02-23 2015-11-17 Joseph M. Matesa Alarm detection device and method
US20140307879A1 (en) * 2013-04-11 2014-10-16 National Central University Vision-aided hearing assisting device
US9280914B2 (en) * 2013-04-11 2016-03-08 National Central University Vision-aided hearing assisting device
US11195510B2 (en) * 2013-09-10 2021-12-07 At&T Intellectual Property I, L.P. System and method for intelligent language switching in automated text-to-speech systems
US9640173B2 (en) * 2013-09-10 2017-05-02 At&T Intellectual Property I, L.P. System and method for intelligent language switching in automated text-to-speech systems
US20150073770A1 (en) * 2013-09-10 2015-03-12 At&T Intellectual Property I, L.P. System and method for intelligent language switching in automated text-to-speech systems
US20170236509A1 (en) * 2013-09-10 2017-08-17 At&T Intellectual Property I, L.P. System and method for intelligent language switching in automated text-to-speech systems
US10388269B2 (en) * 2013-09-10 2019-08-20 At&T Intellectual Property I, L.P. System and method for intelligent language switching in automated text-to-speech systems
US9870357B2 (en) * 2013-10-28 2018-01-16 Microsoft Technology Licensing, Llc Techniques for translating text via wearable computing device
US20150379896A1 (en) * 2013-12-05 2015-12-31 Boe Technology Group Co., Ltd. Intelligent eyewear and control method thereof
WO2015120184A1 (en) * 2014-02-06 2015-08-13 Otosense Inc. Instant real time neuro-compatible imaging of signals
US9466316B2 (en) 2014-02-06 2016-10-11 Otosense Inc. Device, method and system for instant real time neuro-compatible imaging of a signal
US9812152B2 (en) 2014-02-06 2017-11-07 OtoSense, Inc. Systems and methods for identifying a sound event
US11095985B2 (en) 2014-12-27 2021-08-17 Intel Corporation Binaural recording for processing audio signals to enable alerts
US10848872B2 (en) * 2014-12-27 2020-11-24 Intel Corporation Binaural recording for processing audio signals to enable alerts
US20150319546A1 (en) * 2015-04-14 2015-11-05 Okappi, Inc. Hearing Assistance System
USD788223S1 (en) 2015-08-11 2017-05-30 Barbara J. Grady Sign language displaying communicator
CN105708059B (en) * 2016-04-21 2019-02-22 张胜国 Intelligent wake-up bracelet for the deaf and mute
CN105708059A (en) * 2016-04-21 2016-06-29 张胜国 Intelligent wake-up bracelet for the deaf and mute
US10878819B1 (en) * 2017-04-25 2020-12-29 United Services Automobile Association (Usaa) System and method for enabling real-time captioning for the hearing impaired via augmented reality
CN111742363B (en) * 2018-02-22 2024-03-29 松下知识产权经营株式会社 Voice control information output system, voice control information output method, and recording medium
CN111742363A (en) * 2018-02-22 2020-10-02 松下知识产权经营株式会社 Voice control information output system, voice control information output method, and program
RU2691864C1 (en) * 2018-06-13 2019-06-18 Общество с ограниченной ответственностью "РостРесурс-Инклюзия" Telecommunication complex
IT201800011175A1 (en) * 2018-12-17 2020-06-17 Andrea Previato Aid system and method for users with hearing impairments
CN111844055A (en) * 2019-04-26 2020-10-30 美澳视界(厦门)智能科技有限公司 Multi-mode man-machine interaction robot with auditory, visual, tactile and emotional feedback functions
CN110351631A (en) * 2019-07-11 2019-10-18 京东方科技集团股份有限公司 Communication device for deaf-mute persons and method of using the same
IT202100015347A1 (en) * 2021-06-11 2022-12-11 Haga2 S R L COMMUNICATION DEVICE
WO2022259065A1 (en) * 2021-06-11 2022-12-15 Haga2 S.R.L. Communication device for facilitating verbal communication for deaf or hearing impaired people
CN116847023A (en) * 2023-08-31 2023-10-03 深圳市广和通无线通信软件有限公司 Auxiliary call method and device based on man-machine interaction
CN118605736A (en) * 2024-08-09 2024-09-06 厦门理工学院 Human-machine interaction head-mounted device for hearing-impaired takeaway couriers and control method thereof

Similar Documents

Publication Publication Date Title
US6240392B1 (en) Communication device and method for deaf and mute persons
US8082152B2 (en) Device for communication for persons with speech and/or hearing handicap
US6629076B1 (en) Method and device for aiding speech
US6975991B2 (en) Wearable display system with indicators of speakers
US20140236594A1 (en) Assistive device for converting an audio signal into a visual representation
US20060074624A1 (en) Sign language video presentation device , sign language video i/o device , and sign language interpretation system
US10453459B2 (en) Interpreting assistant system
US20020158816A1 (en) Translating eyeglasses
WO2023150327A1 (en) Smart glass interface for impaired users or users with disabilities
KR101017421B1 (en) communication system for deaf person
US7830271B2 (en) Audio coordinated visual indicator
KR101846218B1 (en) Language interpreter, speech synthesis server, speech recognition server, alarm device, lecture local server, and voice call support application for deaf auxiliaries based on the local area wireless communication network
KR20150026645A (en) Voice Recognition Application Program By Pattern Recognition Technology
KR100748432B1 (en) Wearable terminal device for aurally impaired persons
KR20130133932A (en) A wearable type head mounted display device for hearing impaired person
CA2214243C (en) Communication device and method for deaf and mute persons
US10936830B2 (en) Interpreting assistant system
US20050129250A1 (en) Virtual assistant and method for providing audible information to a user
KR102000282B1 (en) Conversation support device for performing auditory function assistance
JPH0879199A (en) Character information offering system
Olaosun et al. Assistive technology for hearing and speech disorders
KR20070071854A (en) Communication aiding method between dumb person and normal person in mobile telecommunication terminal
JP2006139138A (en) Information terminal and base station
JPH10145852A (en) Portable information transmitter
Heckendorf Assistive technology for individuals who are deaf or hard of hearing

Legal Events

Date Code Title Description
STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

REMI Maintenance fee reminder mailed
FPAY Fee payment

Year of fee payment: 12

SULP Surcharge for late payment

Year of fee payment: 11

AS Assignment

Owner name: BCMC, LLC, TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KRUEGER, WESLEY W.O., MD, DR.;REEL/FRAME:030760/0095

Effective date: 20130709

AS Assignment

Owner name: BCMC, LLC, TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BUTNARU, DORA;REEL/FRAME:031396/0131

Effective date: 20131013