DESCRIPTION
Human-Assistive Wearable Audio-Visual Inter-Communication Apparatus.
Technical Field
The present invention is directed generally to a self-contained wearable human-assistive communication apparatus. The invention relates to recognizing Sign Language and converting it into vocal speech, and to converting vocal speech into readable text. The apparatus is not merely a data glove but a complete glove with built-in processing and communication devices.
Background Art
The increasing concern of the United Nations' World Federation of the Deaf and of thousands of institutions for the deaf and dumb all over the world is how to bring deaf communities closer to everyday life, a goal that has yet to be achieved. Deaf and dumb disabled persons rely on
Sign Language for their inter-communications. For inter-communication with normal people, they are helpless unless the normal person knows Sign Language.
Unfortunately, the converse is not possible, as it is impossible for a deaf and dumb disabled person to speak naturally. Sign language and spoken languages are two completely different languages; it is as if one were English and the other Chinese. This means that only a person who knows both can inter-communicate, and no one else. A wearable human-assistive communication device is a data glove, generally a glove that fits over at least a part of a user's hand and detects flexion of hand joints, touch and pressure of various muscles, and measurements of specific locations of the hand. Data gloves, or instrumented gloves, have been implemented using several different approaches, including fiber optics, resistive sensors and accelerometers attached to the glove's joints to detect movement thereof. Conventional data gloves or instrumented gloves can be awkward for the user to operate because most of them require intensive data processing and a powerful attached computing device. In general, these data gloves have not been widely adopted because of the limited scope of their application. Practically, it is not possible for deaf disabled persons to carry heavy equipment for communication. More importantly, deaf disabled persons also require warnings, and there are many other conditions in which normal people, or a hazardous situation, must communicate directly with them. Each spoken language has its own alphabet, tone of sound and rules of grammar; similarly, Sign Language differs from country to country. Currently available data gloves are not well suited to all Sign Languages.
There is therefore a need to invent and produce a self-contained wearable communication device, which could solve true inter-communication needs.
Furthermore, in U.S. Patent Application Publication No. 2002-0075232, Daum, Wolfgang et al. discloses a data glove. In the glove, a sensor material for fabricating instrumented clothing includes a conductive rubber layer. In addition, two electrodes are disposed within the rubber layer, are connectable to an external circuit and are separated by a separation distance to form an electrical path from one electrode to the other through an intermediate portion of the conducting rubber layer. The electrical resistance measured between the electrodes is indicative of strain in the intermediate portion of the conducting rubber layer, thus permitting measurements of movement of the fabric to be made. The fabric may be used to form articles that a user can wear, including a data glove, so that movements of the user may be detected and measured.
In U.S. Patent No. 5,097,252, Harvill et al. discloses a motion sensor which produces an asymmetrical signal in response to symmetrical movement. In a first embodiment, a plurality of motion sensors are placed over the joints of a hand, with each sensor comprising an optical fiber disposed between a light source and a light sensor. An upper portion of the fiber is treated so that transmission loss of light being communicated through the optical fiber is increased only when the fiber bends in one direction. A light source and light sensor on opposite ends of the tube continuously indicate the extent that the tube is bent.
U.S. Patent No. 6,452,584 to Walker et al. is directed to a data glove for sensing hand gestures. In this patent, a system is provided for manipulating computer-generated animation in real time, such as a virtual reality program running on a computer. The system includes a data glove for managing data based on an operator's hand gestures. This data glove comprises an elastic material that closely matches the shape of a wearer's hand, enabling the wearer to move their hand freely. A movement sensing unit is provided for sensing any hand gestures of the wearer. The movement sensing unit comprises a flexible circuit board
that extends along the dorsal region of the wearer's fingers and hand. The circuit board includes a base with a signal processor for processing received signals generated by a plurality of movement sensors. The sensors transmit signals to the processor for determining any movement of the wearer's hand. The sensors have a resistive material disposed on each side thereof, so that any flexure of the sensor causes the resistance values to diverge, preferably linearly. The resistance values on each side of the sensor diverge to a value corresponding to the degree of flexure of the sensor. A reference voltage is applied to each side of the sensor for establishing a voltage differential between its two sides. Any flexure of the sensor causes the resistance value of each side to change, for changing the reference voltage level between the two sides to indicate that the sensor has been flexed and the degree of flexure.
Firstly, said gloves and other such data gloves have common limitations. These limitations concern the measurement of flexion of the fingers only; the systems in said patents and applications do not measure other joints, muscles and locations of the hands. More importantly, in Sign Language both hands sign together and make many combined movements. These combined movements, in which the hands touch and press against each other, cannot be sensed, because there are no sensors for them in the said patented gloves. Secondly, one way or the other, these patented gloves require a very high-performance external processing system and much other communication apparatus before they can be used. In all prior-art inventions, communication is initiated by the data glove wearer; he or she can communicate only when the data gloves are connected with all necessary processing and communication equipment and power sources. Other people cannot initiate communication unless the user is actively operating the said patented glove apparatus.
Some systems which use a video camera and a television (TV) or other display monitor for Sign Language recognition are not very practical. First, they are very expensive; secondly, they are not possible to carry; thirdly, with varying light and image backgrounds, and with different skin tones, body shapes and faces, they are bound to give inaccurate results.
A further disadvantage of these data gloves is that the movement monitoring devices have poor longevity and are prone to reliability problems. Another disadvantage of these movement monitoring devices is that they may not sufficiently track the hand gestures of the wearer. The sensors may generate signals that are not an accurate representation of the wearer's hand gestures causing erroneous data to be generated.
The above are only a few of the limitations; that is why, despite many efforts, these techniques have not solved the communication problems of deaf and speechless (dumb) disabled persons in practical use.
The present invention is a wearable human-assistive audio-visual inter-communication apparatus in the form of a glove system, a long-needed invention that fulfills the much-cherished goal of thousands of institutions for the deaf and dumb all over the world.
The aim of the invention is to provide an extremely useful device, particularly for those disabled persons who do not even know Sign Language.
Furthermore, it can provide a complete solution for inter-communication between two similar or multiple languages without learning the other. It allows seamless inter-communication between normal and disabled communities.
Furthermore, by using handwriting-to-text, speech and even sign animation, it can provide every possible means of communication for the deaf and dumb disabled.
Disclosure of Invention
The invention provides communication assistance to deaf and speechless (dumb) disabled persons to comfortably inter-communicate with normal people and/or other disabled persons.
In addition, when equipped with communication means such as a Bluetooth wireless device, a built-in cellular and GPRS (General Packet Radio Service) device, or another device connected directly through industry-standard interfaces such as Universal Serial Bus (USB) and/or Infrared (IrDA), or the like, it can provide communication assistance to deaf and speechless (dumb) disabled persons to comfortably inter-communicate over remote distances.
Furthermore, when storing as a database plural data sets for detecting the different sign languages of plural countries, it can also provide cross-Sign-Language conversion to assist people across the globe to inter-communicate completely, disabled-to-disabled and disabled-to-normal, face-to-face and over remote distances, without the pre-conditioned need for a similar device at the other end. The invention also relates to a self-contained communication device to be worn on the hands. It has a pair of hand gloves with built-in wearable wrist processing devices, designed to determine the gestures of Sign Language of one or both hands.
It can also convert sign language into data in different formats such as digital sign data, speech, text, video animation or the like, and convert back from speech to text, sign data and graphical video animation to provide inter-communication. The invention can also recognize handwriting and convert it into text.
The invention can include a built-in cell phone and camera which enable remote-distance voice and data communication worldwide. The user can initiate phone calls and can also send live or pre-stored video images. More importantly, the invention does not necessarily require a similar device at the other end to inter-communicate; it may also inter-communicate with other devices through a software plug-in and/or a software utility program for a specific function.
Furthermore, if a solar cell is attached on the dorsal side of the glove, the electronic devices within the glove can be supplied with electric power from the solar cell.
The invention also provides a data glove. This data glove has (a) a flexible printed circuit board settled on the dorsal side of a hand and extended towards the palm side, which has parts corresponding to the five fingers, an ulnar part, and extension parts extending to the distal area of the finger pulps on the palm side; (b) a first group of sensors in the flexible printed circuit board, for sensing touch force at the distal area of the finger pulps; (c) a second group of sensors in the flexible printed circuit board, for sensing touch force at the finger nails; (d) a touch force sensor in the flexible printed circuit board, for sensing touch force at the ulnar part; and (e) a touch force sensor in the flexible printed circuit board, for sensing touch force at the mid palmar space.
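The sensor groups (b) through (e) above can be summarized as a simple data model. The sketch below is illustrative only; the group names and identifiers are assumptions for demonstration and are not part of the claimed apparatus.

```python
# Illustrative model of the glove's force-sensor layout described above.
# Group names and sensor identifiers are invented for demonstration.
FINGERS = ["thumb", "index", "middle", "ring", "little"]

SENSOR_GROUPS = {
    "pulp_force": [f"{f}_pulp" for f in FINGERS],   # (b) distal finger pulps
    "nail_force": [f"{f}_nail" for f in FINGERS],   # (c) finger nails
    "ulnar_force": ["medial_ulnar"],                # (d) ulnar part
    "palm_force": ["mid_palmar"],                   # (e) mid palmar space
}

def frame_size():
    """Total number of force readings one scan of the FPCB produces."""
    return sum(len(v) for v in SENSOR_GROUPS.values())
```

Under these assumptions, one scan of the force sensors yields twelve readings per glove.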
By using this glove, gestures in sign language are sensed exactly and precisely. Furthermore, if rubber layers are formed on both surfaces of the flexible printed circuit board, they can protect it from breakage and water. When the solar cell and the devices are mounted detachably on the glove, the fabric part of the glove can easily be washed after detaching them.
The invention also provides wrist-mounted devices. Each of the wrist-mounted devices has a wrist band and a device mounted on the wrist band. This device includes a text database having text data corresponding to gesture data; a gesture-to-text conversion engine which reads gesture data sensed by the sensors and finds the equivalent word of text in the text database; a sentence composer engine which takes the individual words of text from the gesture-to-text conversion engine and re-arranges the words into a formal sentence; a speech database having audio data corresponding to text data; a text-to-speech engine which produces audible speech from a text sentence by using the speech database; a speech-to-text engine which converts speech data into text data by using the speech database; a graphical animation engine which converts text data to gesture data and produces graphical animation data of the gesture from the gesture data; a display for displaying text from the text data and a graphical animation from the graphical animation data; a speaker for outputting speech from the speech data, etc.
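The gesture-to-text and sentence-composer chain described above can be sketched as a two-stage lookup. The gesture codes, the tiny word database and the composition rule below are invented placeholders; a real device would use a full Sign Language lexicon and grammar database.

```python
# Minimal sketch of the gesture-to-text conversion engine and sentence
# composer described above. TEXT_DB maps hypothetical gesture codes to
# words; both codes and vocabulary are invented for illustration.
TEXT_DB = {
    (1, 0, 0): "I",
    (0, 1, 0): "go",
    (0, 0, 1): "home",
}

def gesture_to_text(gesture_codes):
    """Look up each sensed gesture code and return the matching words."""
    return [TEXT_DB[g] for g in gesture_codes if g in TEXT_DB]

def compose_sentence(words):
    """Re-arrange raw words into a formal sentence (simplified rule:
    capitalize the first word and add a full stop)."""
    if not words:
        return ""
    text = " ".join(words)
    return text[0].upper() + text[1:] + "."
```

For example, the code sequence for "I", "go", "home" would compose to the sentence "I go home." before being sent to the text-to-speech engine.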
Alternatively, the device includes a touch panel sensing handwriting; converting means for converting the handwriting sensed by the touch panel into video animation data; and sending means for sending the video animation data to a remote device by wireless communication, etc.
Alternatively, the device includes one or more microphones; a speech database having audio data corresponding to text data; a speech-to-text engine which converts audio data from the microphone(s) into text data by using the speech database; and a display for displaying text from the text data, etc.
Brief Description of Drawings
Fig. 1 illustrates a human-assistive wearable wireless glove system as an embodiment of the invention.
Fig. 2 illustrates internal components of the human-assistive wearable wireless glove system.
Fig. 3 illustrates a block diagram of system and components functions of the human-assistive wearable wireless glove system for processing device A for one of a left or a right hand.
Fig. 4 illustrates a block diagram and component functions of the human-assistive wearable wireless glove system for processing device B for the other one of the left or the right hand.
Fig. 5 illustrates a block diagram of software engines and databases for the processing device A.
Fig. 6 illustrates a block diagram of software engines and databases for processing device B.
Fig. 7 illustrates the joints and locations of the dorsal hand that are important to measure.
Fig. 8 illustrates the muscles and specific locations of the palmar hand that are important to measure.
Fig. 9 illustrates the position and type of sensors on the dorsal and palmar hands.
Fig. 10 illustrates force and touch resistor sensors over a Polyimide Flexible Printed Circuit Board (FPCB) sheet.
Fig. 11 illustrates bend resistor sensors over the Polyimide Flexible Printed Circuit Board (FPCB) sheet.
Fig. 12 illustrates RTV (Room Temperature Vulcanizing) silicone rubber layers sprayed over the Flexible Printed Circuit Board (FPCB) sheet.
Best Mode for Carrying Out the Invention
Figure 1 shows an embodiment of the invention.
In Fig. 1, both gloves 101 and 102 have built-in miniature complex wireless, analog and digital data processing devices 117 and 107 within the gloves, mounted at the wrist side of the gloves 101 and 102. Each device 117, 107 is attached to a wrist band. In Fig. 1, flexible solar cells 103 and 112 are mounted on the outer layers of the glove system to provide an alternative power source for the devices 117 and 107. Control key switches 113 and 109 control and operate the processing devices 117 and 107. A touch screen panel grid and a display 110 and 114 provide data input and output functions. A speaker 116 is built within the processing device 117. A microphone 104 is built in the processing device 107. The processing device 107 has a built-in camera 105. Antennas 115 and 108 are set in the processing devices 117 and 107 for transmitting and/or receiving data. Wrist straps 118 and 106 tie the processing devices 117 and 107 over the wrists and the gloves 101 and 102. The processing devices 117 and 107 are an integral part of the glove system 102 and 101, forming one integrated self-contained unit. Self-contained means that it does not require any external device or equipment to perform its functions.
Figure 2 illustrates internal components of the glove system as an embodiment of the invention. In Fig. 2, a Flexible Printed Circuit Board (FPCB) sheet 120 has both bend-resistive and force-resistive sensors on its surface. Accelerometer sensor groups 126 and 125 are installed over the FPCB sheet 120 on the dorsal side of the hand to measure roll-over and direction of dorsal hand movements. A dual-port Analog Multiplexer Switch device 127 is installed directly on the FPCB sheet 120 on the dorsal side. One port of the device 127, "Port A", is for the bend flex resistor sensors, whereas the other port, "Port B", is for the force resistor sensors. In Fig. 2, a flexible cable bank 121 is a connector which connects the FPCB sheet 120 with a Printed Circuit Board (PCB) 122 of the processing device 107 or 117. The PCB 122 is installed within the glove 101 or 102 at the wrist side of the hand, in a position similar to a wristwatch. Accelerometer sensors 123 and 124 measure the location of hand movements.
Figure 3 illustrates a block diagram of the processing device A (107) of Fig. 1 as an embodiment, which demonstrates the various components and their flow in the device operation. Signals from all bend resistor sensors 133 pass through Port A of the analog Multiplexer Switch (MUX) 132 (corresponding to 127 in Fig. 2) within the FPCB sheet 120. The MUX 132 is controlled by the analog signal processor (ASP) 139. Similarly, signals from all force resistor sensors 134 pass through Port B of the MUX 132, where it is likewise controlled by the ASP 139. Wheatstone Bridges 137 and 138 provide voltages to the bend and force sensors 133 and 134. When the values of the sensors 133 and 134 change, the Wheatstone Bridges 137 and 138 output the respective change in current flow. The ASP 139 measures the value of the current change after converting it into digital form.
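The bridge-and-multiplexer scan described above can be sketched numerically: the ASP selects one MUX channel at a time and reads the bridge output for that sensor. The 3.3 V excitation and 10 kΩ bridge-arm resistance below are assumed figures, not values from the disclosure.

```python
# Hedged sketch of the bend/force sensor scan: a quarter Wheatstone
# bridge with one resistive sensor arm, scanned channel by channel.
# Supply voltage and fixed-arm resistance are assumed values.
V_SUPPLY = 3.3     # bridge excitation voltage, volts (assumed)
R_FIXED = 10_000   # three fixed bridge arms, ohms (assumed)

def bridge_output(r_sensor):
    """Differential output voltage of a quarter bridge whose fourth
    arm is the resistive bend or force sensor."""
    return V_SUPPLY * (r_sensor / (r_sensor + R_FIXED) - 0.5)

def scan_channels(sensor_resistances):
    """Emulate the ASP stepping the MUX through each sensor channel
    and digitizing the bridge output (rounded to 4 decimal places)."""
    return [round(bridge_output(r), 4) for r in sensor_resistances]
```

A sensor at the nominal 10 kΩ leaves the bridge balanced (zero output); bending or pressing shifts its resistance and produces a signed voltage the ASP can digitize.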
In Fig. 3, dorsal accelerometer sensors 135 and 136 correspond to the sensors 125 and 126 in Fig. 2, and wrist accelerometer sensors 146 and 147 correspond to the sensors 123 and 124 in Fig. 2.
The outputs of these sensors 135, 136, 146 and 147 are measured and controlled by the ASP 139.
Control key switches 131 trigger and provide input functions for the ASP 139 and the main Central Processing Unit (CPU) 140. A display and touch grid device 130 serves as an input and output device.
The CPU 140 sends text and graphics to the display of the device 130 to be displayed. Using a stylus pen, hand-written characters, touch characters and clicks are placed on the touch grid of the device 130, which sends the change in grid value to the CPU 140. The CPU 140 measures the input changes of the grid value. Microphones 148, 149 and 150 (corresponding to 104 in Fig. 1) combine to form an array of microphones which receives live audio and outputs an audio signal; the signal passes to a Voice Processor (VP) 154 which treats the audio signal to remove unwanted noise and echo. The VP 154 delivers the filtered audio signal to a digital signal processor (DSP) 145, which not only converts the analog audio signal into digital format but also performs intensive audio analysis for voice recognition (speech-to-text) translation. The ASP 139, the CPU 140 and the DSP 145 are inter-connected and perform program execution under the master command of the CPU 140.
A miniature camera 155 captures video and sends a video signal to an image processor (IMP) 156. The IMP 156 processes the captured video signal and forwards it to the DSP 145, where it is processed back and forth in conjunction with a temporary memory 144 and a non-volatile memory storage 157. The DSP 145 sends the final video image to the CPU 140. A controller device 153 controls charging of the battery and selects the power source of the entire device from the battery and a solar cell 152 (corresponding to 103 or 112 in Fig. 1). The controller device 153 works in conjunction with the processors 139 and 140 for various power-saving and sleep-mode operations.
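The controller's power-source selection can be sketched as a simple policy: prefer the solar cell when it supplies enough voltage, otherwise fall back to the battery, and sleep when neither is usable. The 4.0 V threshold and the return labels are assumptions for illustration; the disclosure does not specify the selection logic.

```python
# Illustrative power-source selection in the spirit of the controller
# device described above. The voltage threshold is an assumed figure.
SOLAR_MIN_VOLTS = 4.0  # minimum usable solar-cell voltage (assumed)

def select_power_source(solar_volts, battery_volts):
    """Choose the supply for the glove electronics."""
    if solar_volts >= SOLAR_MIN_VOLTS:
        return "solar"    # run from solar; battery may charge meanwhile
    if battery_volts > 0:
        return "battery"  # fall back to the rechargeable battery
    return "sleep"        # no usable source: enter sleep mode
```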
A Universal Serial Bus (USB) port 159 and an Infrared (IrDA) port 158 are hardware interfaces which connect external input/output devices to the processing device 107 of the glove 101. A Bluetooth Transceiver 142 and a Base-band 143 provide wireless communication with the processing device B (117) of the glove 102 and also with other external devices through wireless data exchange. An antenna 141 is for the Bluetooth wireless link.
In this embodiment, the processing devices A and B (107 and 117) are built into the gloves 101 and 102, and the processing device A (107) is different from the processing device B (117), as shown in Fig. 1.
Figure 4 illustrates a block diagram of the processing device B (117) of Fig. 1 as an embodiment, which demonstrates the various components and their flow in the device operation. Signals from all bend resistor sensors 184 pass through Port A of the analog Multiplexer Switch (MUX) 183 within an FPCB sheet for the glove 102. The MUX 183 is controlled by the analog signal processor (ASP) 179. Similarly, signals from all force resistor sensors 185 pass through Port B of the MUX 183, where it is likewise controlled by the ASP 179. Wheatstone Bridges 181 and 182 provide voltages to the bend and force sensors 184 and 185. When the values of the sensors 184 and 185 change, the Wheatstone Bridges 181 and 182 output the respective change in current flow. The ASP 179 measures the value of the current change after converting it into digital form.
In Fig. 4, dorsal accelerometer sensors 186 and 187 and wrist accelerometer sensors 188 and 190 are also installed on the FPCB sheet for the glove 102. The outputs of these sensors 186, 187, 188 and 190 are measured and controlled by the ASP 179. Control key switches 171 trigger and provide input functions for the ASP 179 and the main CPU 178. A display and touch grid device 170 serves as an input and output device. The CPU 178 sends text and graphics to the display of the device 170 to be displayed. Using a stylus pen, hand-written characters, touch characters and clicks are placed on the touch grid of the device 170, which sends the change in grid value to the CPU 178. The CPU 178 measures the input changes of the touch grid value.
A vibrator motor 191 is a very important component of the embodiment. The vibrator motor 191 is controlled by a vibrator motor controller 194, which takes signals from the ASP 179. The ASP 179, the CPU 178 and the DSP 180 are inter-connected and perform program execution under the master command of the CPU 178.
A miniature speaker 174 provides audio output. The speaker 174 is driven by an audio amplifier 175. The CPU 178 sends the final audio output to the audio amplifier 175, which, after signal amplification, sends the audio signal to the speaker 174 to be output. A controller device 195 controls battery charging of a battery 193 and selects the power source of the entire device from the battery 193 and a solar cell 192 (corresponding to 112 in Fig. 1). The device 195 works in conjunction with the processors 179 and 178 for various power-saving and sleep-mode operations.
A Universal Serial Bus (USB) 173 and an Infrared (IrDA) 172 are hardware interfaces which connect external input/output devices with the processing device 117 of the glove 102.
A Bluetooth Transceiver 177 and a Base-band 200 provide wireless communication with the processing device A (107) of the glove 101 and also with other external devices through wireless data exchange. An antenna 176 is for the Bluetooth wireless link.
A Cellular & GPRS Transceiver 199, a Cellular Base-band 198 and a Subscriber Identification Module (SIM) 197 are the components of the built-in cell phone GPRS device which provide voice and data communication at remote distances. The General Packet Radio Service (GPRS) allows information to be sent and received. The Cellular & GPRS Transceiver 199, the Cellular Base-band 198 and the SIM 197 are used for GPRS applications such as chat, textual and visual information, still images, moving images, web browsing, document sharing/collaborative working, audio, job dispatch, corporate email, Internet email, device-user positioning, remote LAN access, file transfer, home automation, etc.
A wide range of content can also be delivered to the device 117 through GPRS services, ranging from share prices, sports scores, weather, flight information, news headlines, prayer reminders, lottery results, jokes, horoscopes and traffic to location-sensitive services and so on. This information need not be textual; it may be maps, graphs or other types of visual information.
An antenna 202 is for the built-in Cell phone. These components are controlled by CPU 178.
Figure 5 illustrates the various software data conversion engines and databases of the processing device A (107) of Fig. 1. In Fig. 5, the processing device A means a main processing part such as the ASP 139, the CPU 140, the DSP 145 and the VP 154. In Fig. 5, raw sign data 223, that is, sensor data from the sensors, comes from its own glove 101. The raw sign data 223 is simplified by the processing device 107, and the device 107 sends it out through the Bluetooth antenna 141 to the processing device B (117) for sign-to-text interpretation (conversion). A Speech-to-Text (STT) engine 226 and an STT database 231 convert audio speech data to text data. When audio speech is received through the microphone array 148, 149 and 150, through the USB or IrDA devices 159 and 158, or from the device B (117) using the cell phone, the processing device A (107) runs the STT engine 226, which converts the speech into text.
An Alert Checker (AC) engine 229 and an AC database 234 cross-check the output text of the STT engine 226 for various warning, information and control conditions. If the output of the STT engine 226 matches one of these conditions, the processing device A (107) sends a signal to the processing device B (117) through the Bluetooth devices 141, 142 and 143 to activate an alert signal on the vibrator motor 191 in the device B (117). Regular text converted by the STT engine 226 is passed to a Sentence Composer (SC) engine 228 and an SC database 233 for sentence composition based on specific grammar rules. The SC engine 228 and the SC database 233 convert the regular text into a formal sentence. The final sentence can be displayed on the LCD display 220 of the device 130, and/or broadcast out to external input/output interfaces, and/or sent back through the wireless device to the processing device B for further broadcast to the built-in cell phone.
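The Alert Checker's cross-check of recognized text against warning conditions can be sketched as a keyword match that decides whether to signal the vibrator motor. The alert vocabulary below is an invented example; the actual AC database contents are not specified in the disclosure.

```python
# Sketch of the Alert Checker (AC) engine: scan speech-to-text output
# for warning conditions and decide whether to trigger the vibrator
# motor on the other glove. The alert word list is illustrative only.
AC_DATABASE = {"fire", "danger", "alarm", "help"}

def check_alerts(recognized_text):
    """Return the set of alert words found in the recognized text."""
    words = {w.strip(".,!?").lower() for w in recognized_text.split()}
    return words & AC_DATABASE

def should_vibrate(recognized_text):
    """True when any warning condition matched, i.e. the vibrator
    motor on device B should be activated."""
    return bool(check_alerts(recognized_text))
```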
A touch panel grid device 230 in the device 130 is for handwriting and command input. A Short-Handwriting-to-Text (SHTT) engine 227 and an SHTT database 232 convert short handwriting detected by the touch grid device 230 into text.
A video camera input 224 is obtained from the camera 155. Other data input 225 is obtained from the USB 159, the IrDA 158, etc.
Figure 6 is a block diagram of various software Engines and Databases in processing device B (117). In Fig. 6, the processing device B means a main processing part such as the ASP 179, the CPU 178, and the DSP 180.
In Fig. 6, a raw sign data 252 is from its own glove whereas a sign data 251 is received from the processing device A (107).
A Sign-to-Sign-Codes (STSC) conversion engine 268 and an STSC database 271 take a set of sign data detected on both gloves 101 and 102, and convert the set of sign data into a series of sign codes which represent sign language words.
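The STSC stage described above can be sketched as quantization: raw analog readings from both gloves are reduced to a small set of discrete levels, and the resulting tuple serves as a sign code for dictionary lookup. The 10-bit range and four-level quantization are assumed parameters, not values from the disclosure.

```python
# Hedged sketch of the Sign-to-Sign-Codes (STSC) conversion stage:
# raw sensor readings are quantized into discrete sign codes before
# lookup in the sign-code database. Ranges and levels are assumed.
def quantize(reading, levels=4, full_scale=1023):
    """Map a raw 10-bit sensor reading to one of a few discrete levels."""
    step = (full_scale + 1) / levels
    return min(int(reading // step), levels - 1)

def signs_to_codes(raw_readings):
    """Convert one frame of raw readings from both gloves into a
    discrete sign code (a tuple of quantized levels)."""
    return tuple(quantize(r) for r in raw_readings)
```

Two slightly different performances of the same sign then quantize to the same code, which is what makes a database lookup of sign-language words feasible.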
A Sign-Codes-to-Text (SCTT) conversion engine 267 and an SCTT database 270 take the series of sign codes from the STSC engine 268 and convert the sign codes into raw text.
The raw text is then passed to a Sentence Composer (SC) engine 269 and an SC database 272, which correct the format of the raw text based on specific grammar rules. The final text data from the SC engine 269 is broadcast to external devices; to the Bluetooth wireless components 176, 177 and 200; and/or to the built-in cell phone components 202, 199 and 198; and/or directly to the speaker 174.
A Sign-Code-to-Video-Animation (STVA) engine 262 and an STVA database 263 take digital sign-coded data and convert it into video animation data of the sign language corresponding to the sign-coded data. Sign-coded data of each of the hand joints, muscles and locations, together with the output of the accelerometer sensors, is applied as graphical input to present an equivalent sign video animation.
A Text-to-Speech (TTS) synthesizer engine 260 and a TTS database 261 take text data and convert it into speech data.
A Text-to-Sign-Code (TTSC) engine 258 is the reverse conversion engine of the SCTT engine 267, and a TTSC database 259 is the reverse conversion database of the SCTT database 270.
A touch panel grid 257 is for command input into the processing device B (117).
A LCD display 256 displays data in the processing device B (117). Other data input 264 is obtained from USB 173, IrDA 172, etc.
Figure 7(A) illustrates the dorsal hand and wrist joints. A drawing 290 is an anatomy of the dorsal hand's flex joints. Joints 283 are the Metacarpophalangeal flex joints of an Index Finger 281, a Middle Finger 285, a Ring Finger 286 and a Little Finger 287. Joints 282 are the Proximal Interphalangeal flex joints of a Thumb 288 and the fingers 281, 285, 286 and 287. Joints 280 are the Distal Interphalangeal flex joints of the Thumb 288 and the fingers 281, 285, 286 and 287. A Flexor Retinaculum wrist joint 284 is at the connection between the hand and the arm.
Figure 7(B) illustrates specific flexion muscles and locations of the dorsal hand. A drawing 298 is an anatomy of the dorsal hand. A thumb nail is at a location 297, an index finger nail at a location 293, a middle finger nail at a location 292, a ring finger nail at a location 294, and a little finger nail at a location 295.
Inter-digital spacer flexions 296 are located at the roots of the thumb and fingers.
Figure 8 illustrates specific muscles and locations of a palmar hand. A palmar hand 303 has Distal Pulps 300 of the thumb, index, middle, ring and little fingers, a Medial Ulnar Muscle 302, and a Mid Palmar space 301.
Figure 9(A) illustrates the locations of Bend Flex Resistor sensors 310 and 311 on a dorsal hand 312. Figure 9(B) illustrates the locations of flexible Force Resistor sensors 313 on the finger nails of a dorsal hand 314. Figure 9(C) illustrates flexible Force Resistive sensors 315 on a palmar hand 316.
Figure 10 illustrates the locations of flexible Force Resistor sensors 325, 326, 327, 328 and 329 as dorsal hand nail sensors, and the locations of Distal Pulp flexible Force Resistor sensors 320, 321, 322, 323 and 324 for the distal finger pulps of a palmar hand, on the Flexible Printed Circuit Board (FPCB) sheet 120. Each finger part of the FPCB sheet 120 is extended in a direction perpendicular to the finger part; the flexible Force Resistor sensors 325, 326, 327, 328 and 329 are positioned at the tips of the finger parts, and the Distal Pulp flexible Force Resistor sensors 320, 321, 322, 323 and 324 are positioned on the extended parts of the FPCB sheet 120. On the FPCB sheet 120, palmar hand sensors 330 and 331 are installed for the Medial Ulnar spacer and the Mid Palmar space, respectively.
Figure 11 illustrates the locations of flexible Bend Resistor sensors 340 to 354 on the FPCB sheet 120. Flexible Bend Resistor sensors 340, 341, 342 and 343 are Metacarpophalangeal joint sensors. Flexible Bend Resistor sensors 345, 346, 347, 348 and 349 are Proximal and Distal Interphalangeal joint sensors which measure the flexion of both the Proximal and Distal Interphalangeal joints. Flexible Bend Resistor sensors 350, 351, 352 and 353 are inter-digital spacer flexion sensors. Flexible Bend Resistor sensor 354 is a Flexor Retinaculum wrist joint sensor.
Figure 12 illustrates RTV silicone rubber spray layers as an upper rubber layer 372 and a bottom rubber layer 370 on both surfaces of the FPCB sheet 120. In this embodiment, the upper rubber layer 372 and the bottom rubber layer 370 are formed on the surfaces of the FPCB sheet 120 without seams.
As mentioned above, the wearable human-assistive audio-visual inter-communication apparatus has a pair of self-contained hand gloves, designed to determine the gestures of both hands in Sign Language, to produce speech, text and graphical video animation, and to convert graphical video animation, text and speech back into Sign Language and hand gestures. It also recognizes Short Handwriting written on the Touch Panel Grid and converts it into text.
Both the gloves 101 and 102 have built-in miniature complex wireless, analog and digital data Processing Devices within the gloves at the wrist side of the gloves, which include a Central Processor Unit (CPU), DSP (Digital Signal Processor), Analog Signal Processor (ASP), Voice Processor (VP), Image Processor (IMP), Memory, Memory Storage, Bluetooth Transceiver, Bluetooth Base-band, Cellular & GPRS Transceiver, Cellular Base-band, LCD display, Interface Controller, Accelerometer Sensors, Touch Panel Grid, Array of Microphones, Speaker, Camera, Vibrator Motor, Control Keys, Re-chargeable Battery and other similar controllers and components.
The wearable human-assistive multi-lingual audio-visual inter-communication device, comprising a self-contained wireless Gloves system, also has Flexible Solar Cells attached at the dorsal area of the said Gloves to provide an alternative power source to the internal electronics and processing device components of the Gloves.
The FPCB sheet for each of the gloves 101 and 102 is a double-sided copper layered Polyimide Flexible Printed Circuit sheet, which is sketched over entire hand bones for each of left and right hands.
An Electronic Printed Circuit Board layout design is applied to both layers of the FPCB sheet. 100 kOhm Flexible Bend resistors (sensors) 340 to 354, as shown in Fig. 11, are screen printed at various sizes (length and diameter) for each finger's joints, which include: Metacarpal Phalangeal Flex joints for the index, middle, ring and little fingers; Proximal Interphalangeal Flex joints for the thumb, index, middle, ring and little fingers; Distal Interphalangeal Flex joints for the thumb, index, middle, ring and little fingers; the Flexor Retinaculum Wrist joint for the wrist and dorsal hand flexor; and Inter Digital Spacer Flex between the thumb and index finger, between the index and middle fingers, between the middle and ring fingers, and between the ring and little fingers. Bend Resistive Sensors 350 to 353 for the Inter Digital Spacer flexes are placed in such a way that each extends from the Proximal Interphalangeal of a finger, drops down back towards the Metacarpal Phalangeal, and makes a U-turn back to the Interphalangeal joints, sensing the flex (opening and closing) linearly.
As each of the sensors 340 to 354 bends, its resistance value increases. As shown in the following data, the degree of joint motion (Extension, Flexion, Hyperextension, Abduction, Pronation, Supination, etc.) varies from joint to joint. The change of value in the resistors is measured through Wheatstone Bridges 137, 138, 181, and 182 and read by the ASPs 139 and 179. Before reaching the Wheatstone Bridges 137, 138, 181, and 182, signals from the bend resistive sensors 340 to 354 pass through Port A of the MUXs 132 and 182 mounted on the FPCB sheets at the dorsal area of the hand. The MUXs 132 and 182 are controlled and addressed by the ASPs 139 and 179.
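The bridge measurement described above can be sketched as follows. This is a minimal illustrative model only; the bridge topology (quarter bridge), excitation voltage, and function names are assumptions for illustration and are not taken from the disclosure.

```python
# Illustrative sketch: recovering a bend sensor's resistance from the
# differential output of a Wheatstone bridge. Values are assumptions.
V_EXC = 3.3        # assumed bridge excitation voltage, in volts
R_FIXED = 100_000  # fixed bridge arms, matching the 100 kOhm flat sensors

def bridge_output(r_sensor, v_exc=V_EXC, r_fixed=R_FIXED):
    """Differential output of a quarter bridge with the sensor in one arm."""
    return v_exc * (r_sensor / (r_sensor + r_fixed) - 0.5)

def sensor_resistance(v_out, v_exc=V_EXC, r_fixed=R_FIXED):
    """Invert the bridge equation to recover the sensor resistance."""
    ratio = v_out / v_exc + 0.5
    return r_fixed * ratio / (1.0 - ratio)
```

At the sensor's flat (100 kOhm) value the bridge is balanced and the output is zero; bending raises the resistance and drives the output positive, which the ASP can then convert back to a resistance value.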
Based on orthopedic data of human bone anatomy, the following are the typical ranges of joint movements of hands and arms:

Bone Joint                  Movement                       Range (degrees)
Elbow                       Extension/Flexion              0 to 145
Forearm                     Pronation/Supination           70 to 85
Wrist                       Extension/Flexion              70 to 75
Wrist                       Radial/Ulnar deviation         20 to 35
Thumb basal joint           Palmar Adduction/Abduction     Contact to 45
Thumb basal joint           Radial Adduction/Abduction     Contact to 60
Thumb Interphalangeal       Hyperextension/Flexion         15H to 80
Thumb Metacarpophalangeal   Hyperextension/Flexion         10 to 55
Finger DIP joints           Extension/Flexion              0 to 80
Finger PIP joints           Extension/Flexion              0 to 100
Finger MCP joints           Hyperextension/Flexion         (0-45H) to 90
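One way the processor could use the table above is to map a normalized sensor reading onto each joint's valid orthopedic range and clamp out-of-range estimates. This is a hypothetical sketch; the dictionary keys, the linear mapping, and the sign convention for hyperextension are illustrative assumptions.

```python
# Illustrative sketch: clamping estimated joint angles to the orthopedic
# ranges tabulated above. Keys and conventions are assumptions.
JOINT_RANGE_DEG = {
    "finger_DIP": (0, 80),
    "finger_PIP": (0, 100),
    "wrist_flexion": (0, 75),
    "thumb_IP": (-15, 80),  # 15 degrees hyperextension as a negative angle
}

def angle_from_sensor(norm_value, joint):
    """Map a 0..1 normalized bend reading onto the joint's valid range."""
    lo, hi = JOINT_RANGE_DEG[joint]
    angle = lo + norm_value * (hi - lo)
    return max(lo, min(hi, angle))  # clamp to the anatomical range
```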
In order to capture the touches and forces/pressures of various parts of the hand and fingers of the left and right hands when making Signs in sign language, the apparatus of the embodiment uses Flexible Force Resistive sensors, shown as the sensors 320 to 330 in Fig. 10. Unlike a Bend Resistive Sensor, when force, touch or pressure is applied to a Force Resistive Sensor, the value of the resistive sensor drops linearly from several hundred Megaohms to Kiloohms or even a few Ohms, depending on the magnitude of the force, touch and/or pressure. Therefore, all flexible force resistive sensors are measured and read differently than Flexible Bend Resistive Sensors. The force resistive sensor consists of two thin, flexible polyester sheets which have electrically conductive electrodes. The inside surface of one sheet forms a row pattern while the inner surface of the other employs a column pattern. A thin semi-conductive coating (ink) is applied as an intermediate layer between the electrical contacts. The ink provides the electrical resistance change at each of the intersecting points.
When the two polyester sheets are placed on top of each other, a grid pattern is formed, creating a sensing location at each intersection. These flexible coatings are applied at the Dorsal Hand Nails of both hands for the thumb, index, middle, ring and little fingers; the Distal Pulp of the palmar hand for the thumb, index, middle, ring and little fingers; the Medial Ulnar for the medial ulnar side of the hand space; and the Mid-Palmar space for the middle space of the palm, as shown in Fig. 10.
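The row/column grid described above is typically read by scanning one row at a time and sampling every column. The sketch below illustrates that idea only; `read_adc` is a hypothetical stand-in for the real analog front end and is not part of the disclosure.

```python
# Illustrative sketch: scanning a force-sensing grid row by row.
# read_adc(row, col) is a hypothetical callback returning one reading.
def scan_force_grid(read_adc, n_rows, n_cols):
    """Return a 2-D list of readings, one per row/column intersection."""
    frame = []
    for row in range(n_rows):
        # drive this row, then sample each column's intersection
        frame.append([read_adc(row, col) for col in range(n_cols)])
    return frame
```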
Similar to the Bend resistive sensors 340 to 354, each of the Flexible force resistive sensors 320 to 331 also passes through Port B of the MUX 132 and reaches the Wheatstone Bridges 137 and 138, where the changes in sensor value are read by the ASP 139.
Portions of the FPCB sheet 120 for all Distal Pulp sensors of the palmar hand are extended from the portions of the sensors 325 to 329 corresponding to the nails on the dorsal hand, as a continuation of the FPCB sheet. The Distal Pulp sensors 320 to 324 are bent in reverse from the Dorsal Nail sensors 325 to 329 in the glove. Similarly, the Polyimide FPCB sheet 120 is extended from the dorsal area of the hand and turned in reverse towards the medial ulnar of the hand for the Flexible Force Resistive sensor 330, and towards the middle space of the palm for the Flexible Resistive sensor 331. Thus, it keeps one continuous Polyimide FPCB sheet 120 for each hand glove. The FPCB sheet 120 is for the glove 101 of the right hand, and the FPCB sheet for the glove 102 of the left hand has a shape symmetrical to the FPCB sheet 120.
A group of X-Y and X-Z Accelerometer sensors 125 and 126 is directly soldered over the Polyimide FPCB sheet 120 to measure the position (i.e. direction) of the dorsal hand for up-down, front-back, and left-right. The output of these sensors is read by the ASP 139. The X-Y accelerometer sensor 125 is soldered evenly at the surface of the FPCB sheet 120, whereas the X-Z accelerometer sensor 126 is soldered vertically over the FPCB sheet 120, hence making the Z axis. Similarly, the processing devices A and B (107 and 117) also have X-Y and X-Z accelerometer sensors 123 and 124, which are directly soldered over the PCB 122 of the processing devices A and B (107 and 117). The X-Y accelerometer sensor 123 is soldered evenly at the surface of the PCB 122, whereas the X-Z accelerometer sensor 124 is soldered vertically over the PCB 122, hence making the Z axis. The Dorsal X-Y and X-Z Accelerometer sensors are sensed directly in conjunction with the Flexor Retinaculum Wrist joint Bend Resistor Sensor.
The output (3-D location) of the Wrist X-Y and X-Z Accelerometer sensors is dynamically adjusted in conjunction with two parameters: (a) the output of all Force Resistor Sensors combined in a Boolean OR (digital gate), and (b) the determination of the final computed gesture word. For example, signing the word "LAZY" involves tapping the palmar side of the right index finger at the left shoulder several times. This dynamically calibrates and adjusts the location of the wrist (X-Y and X-Z Accelerometer sensors). In another example, signing the word "MOUSE" involves brushing the right index finger to the left across the nose tip a few times. This dynamically calibrates and adjusts the location of the wrist X-Y and X-Z Accelerometer Sensors.
Therefore, the dorsal and wrist accelerometer sensors do not work individually; instead, the output of all accelerometer sensors is read in conjunction with the linear values of the other bend and force resistor sensors.
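The fusion rule stated above can be sketched as a simple gate: accelerometer output is only accepted as a gesture feature when the bend or force sensors corroborate that a hand shape is being made. The threshold, field names, and return convention here are illustrative assumptions.

```python
# Illustrative sketch: accelerometer readings gated by bend/force sensors,
# per the rule that accelerometers never work individually. Values assumed.
def fused_orientation(accel_xyz, any_force_active, bend_values,
                      bend_threshold=0.1):
    """Return the accelerometer vector only when other sensors confirm it."""
    hand_is_shaped = any_force_active or any(
        b > bend_threshold for b in bend_values
    )
    return accel_xyz if hand_is_shaped else None  # None = ignore reading
```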
Dual Port Analog Multiplexer switch (MUX) 127 (corresponding to 132 in Fig.
3) is also directly soldered over the FPCB sheet 120. The MUX 127 is addressed and controlled by the ASP 139.
Polyimide FPCB sheet 120 is directly connected with loosely coupled flexible wire (Cable Bank) 121 to bridge the connections between Polyimide FPCB and Processing devices A and B.
A wrist can make 70 to 75 degrees of movement for Extension and Flexion and 20 to 25 degrees for Radial and Ulnar twist, which causes changes in the values of the bend resistive sensors. The flexible cable bank bridge 121 keeps the bend resistive sensors in their joint positions. After placing all sensors, chips and components on the FPCB sheet 120, the FPCB sheet 120 is cut like a stencil, similar to the formation of hand joints and bones, while keeping one piece for both the Dorsal and Palmar sides.
The finished Polyimide FPCB sheet 120, after edge cutting, is sprayed with silicone rubber using Room-Temperature Vulcanizing (RTV) on both sides (upper and bottom). With the above stencil cutting, the FPCB sheet 120 becomes one fabric, similar to an upper layer of glove fabric. The silicone rubber layer is flexible and stretchable.
The Polyimide silicone-rubber FPCB sheet 120 is placed in between two layers of stretchable fabric and sewn or glued to make the hand glove. The sensor 330 for the medial ulnar side of the hand and the sensor 331 for the middle space of the palm are placed in a small jacket on the palmar side of the glove fabric. The outer edge of the sensor 331 is tied with an elastic thread whose other end is sewn or glued at the thumb fabric, allowing these two sensors to adjust and move freely. Especially when wearing the gloves, this protects the FPCB sheet 120 from breaking and also lets the user use the hands freely for other work.
The sensors for both the distal pulps and the inter digital fingers are placed in a fabric jacket to avoid damage or breaking of the FPCB sheet 120 during wearing or use. A group of flexible and thin solar cells is placed at the outer layer of the dorsal fabric. The solar cells 103 are connected to the flexible cable bank 121 and, through the flexible cable bank 121, connected to the processing device 107. The positive and negative charge of the solar cells is directly supplied to the processing device 107 via the flexible cable bank 121. The solar cells 103 are controlled by the battery controller 153 built within the processing device 107.
The FPCB sheet 120, the flexible cable bank 121 and the processing device 107 are all combined to make a simple and a single piece of hand glove, the glove 101 (fabric and components connected together).
The processing device 107 and a group of solar cells 103 can both be detached for hand wash of the glove 101.
The glove 101 is discussed here, and the glove 102 is also configured in the same manner as the glove 101.
To protect the gloves 101 and 102 from any breakage or damage during wear- on and wear-off, the user wears other thin fabric gloves first and then wears the gloves, so that the gloves 101 and 102 are worn smoothly with minimal friction resistance.
Two processing devices A and B (107 and 117), which are attached to each apparatus glove system, perform specific data processing and program execution functions. The two processing devices intercommunicate and exchange data wirelessly over Bluetooth.
Processing device A (107) continuously measures and reads all glove sensors and converts analog data into digital codes. It also treats and simplifies incoming data, removes unwanted signals and codes, and broadcasts digitally coded gesture data to processing device B using its built-in Bluetooth wireless device.
Processing device B (117) also continuously measures and reads all sensors and converts analog data into digital codes. It also treats and simplifies incoming data and removes unwanted signals and codes. On receiving digitally coded gesture data from processing device A, processing device B (117) takes the digitally coded sign data and executes the Sign-Codes-to-Text engine, which takes the sign data and translates/finds an equivalent match of text alphabet or words from a pre-stored database in the memory storage of processing device B. The raw text is applied to the Sentence Composer engine, which re-arranges individual words into a full sentence. Based on the user setting, the sentence composer engine can be bypassed.
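The Sign-Codes-to-Text lookup described above can be sketched as a database match. The code tuples and database contents below are invented placeholders for illustration; the actual coding scheme is not specified here.

```python
# Illustrative sketch of the Sign-Codes-to-Text engine: digitally coded
# gesture data is matched against a pre-stored word database.
# The keys and entries below are invented placeholders.
SIGN_DB = {
    (1, 0, 0, 0, 0): "A",
    (1, 1, 0, 0, 0): "HELLO",
}

def sign_codes_to_text(code, db=SIGN_DB):
    """Return the matching word, or None when no database entry exists."""
    return db.get(tuple(code))
```

Unmatched codes return `None`, which a real engine might handle by falling back to finger-spelled letters before handing the raw text to the Sentence Composer.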
Depending on the type of communication (face-to-face, face-to-remote, same device-to-device, or said apparatus-to-other device using a software plug-in or utility), the text from above is broadcast out. The text is broadcast to processing device A (107) through built-in Bluetooth wireless. Processing device A (107) receives the text from processing device B (117) and displays it over the LCD located with the glove of processing device A.

Processing device B forwards the text to the Text-to-Speech synthesizer engine, which produces human-voiced audio speech through the built-in Speaker in processing device B. Depending on the type of communication, the Text-to-Speech synthesizer can also send out audio speech to a remote device using an industry-standard interface (USB or IrDA) or over a remote distance using the built-in Cellular & GPRS transceiver (in processing device B), very much like a normal person speaking over a cell phone's microphone to the other party at a remote distance.

Live audio speech is received through the Array of Microphones of processing device A. Depending on the type of communication, the audio speech is applied through the Voice Processor (VP) on processing device A. The Voice Processor takes the differential inputs of the Silicon Array Microphones to minimize RF interference and white noise. The 3 microphones create AMBIN (Array Microphone Beam-forming Integrated with Noise suppression) for advanced noise suppression and echo cancellation, giving the clearest communication and better voice recognition even in a highly noisy environment. The noise suppression can reach 15 dB and more, and the acoustic Echo Cancellation 45 dB and more.

After the audio speech is treated by the Voice Processor (VP) on processing device A, it passes through the voice recognition (Speech-to-Text) engine, which converts the audio speech into text.
Once the text is ready in processing device A, it is displayed over the LCD located on the glove of processing device A. The text is also broadcast to processing device B over Bluetooth wireless. If activated, processing device B takes the text and applies it to the Text-to-Video Sign Animation engine. The graphical video sign animation engine takes each word of the text and finds the equivalent digital sign-coded sequence. This text-equivalent sign-coded data is applied to the graphics engine, which renders a 3-D animated human making the sign of Sign Language on processing device B's LCD display.
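The word-by-word step described above can be sketched as follows. The per-word sign-code sequences in `ANIMATION_DB` are placeholder values for illustration only.

```python
# Illustrative sketch of the Text-to-Video Sign Animation step: each word
# of the text is replaced by its sign-coded sequence for the graphics
# engine. The per-word codes here are invented placeholders.
ANIMATION_DB = {"hello": [101, 102], "world": [201]}

def text_to_sign_sequence(text, db=ANIMATION_DB):
    """Concatenate the sign-code sequences for every known word, in order."""
    sequence = []
    for word in text.lower().split():
        # unknown words could instead be finger-spelled letter by letter
        sequence.extend(db.get(word, []))
    return sequence
```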
Processing device B has built-in Cellular & GPRS Transceivers, which provide both voice and data communication. Gestures made using both hands and converted into final audio speech are broadcast out through the Cellular & GPRS devices, and the device receives audio speech or data from the remote side. The graphical video Sign animation or Sign Language coded data is broadcast out to another remote device, which may be using a similar Glove apparatus or another device with a software plug-in or software utility.
Processing device A has a Touch Grid panel over the LCD. Instead of making Signs or typing each letter, processing device A has a Short Handwriting recognition engine, which reads writing made over the Touch Grid Panel using a Stylus pen and converts the graphical input into text. Once the text is extracted, it is passed through the various engines described above for producing audible speech, text or graphical video Sign Language animation.
Processing device B has a built-in Vibrator motor, which provides many useful interfaces and communications between the user and the Gloves system apparatus. Processing device B has an Alert Checker database engine (Alert-Checker-Database Engine): on receiving audio, text or animation data input, the Alert Checker database engine verifies conditions and generates alerts to the user through the Vibrator Motor located within the glove. These Alert-Checker conditions may include the person's name, words like Mr, Miss, Excuse me, Hey, Hello, or Attention, and/or can be set for a Phone Ring, Doorbell Signaler, Smoke/Fire Alarm, Burglar Alarm, Siren, Automobile Horn Alert, or even a Baby Cry Signaler. This means a person wearing the Glove System apparatus can be called or alerted for various abnormalities or normal communication wherever he/she may happen to be (e.g. walking on the street or at the Airport). All calls received through the built-in Cell phone are signaled through the Vibrator motor. If at home, the user can be alerted, alarmed, informed, or called in various conditions. The user can set 32 or more conditions. A sound, audio speech or tone/tune can be set for personalized conditions.
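The Alert-Checker flow described above can be sketched as a scan of incoming text for configured trigger phrases, pulsing the vibrator on each match. The trigger list and the `vibrate` callback are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch of the Alert-Checker engine: configured phrases
# found in incoming text trigger the vibrator. Triggers are assumptions.
DEFAULT_TRIGGERS = {"hello", "excuse me", "fire alarm", "doorbell"}

def check_alerts(incoming_text, vibrate, triggers=DEFAULT_TRIGGERS):
    """Call vibrate(phrase) for every configured phrase found in the text."""
    matched = [t for t in triggers if t in incoming_text.lower()]
    for phrase in matched:
        vibrate(phrase)  # pulse the vibrator motor for this condition
    return matched
```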
In addition, the vibrator motor also provides a mechanism whereby others can initiate communication, for example, by saying "Hello" to the person wearing the gloves 101 and 102.
Calibration methods for the sensors are described hereinafter. Dynamic calibration can be performed in the devices 107 and 117 for an individual user, for approximate positioning and location identification of the individual user's body parts, as stated below. All corresponding values read by the sensors are stored in a non-volatile memory. Calibration starts and ends with the Vibrator's vibration, indicating to the user when to start and when to end, i.e. that the device has read the calibration values.
Firstly, the calibration method for the case that one or two hands of a user is/are located near the body of the user is described:
In step-1, the user stands straight, leaves the arms to gravity and flattens the hands down towards the ground (resting position) to read the arm and hand positions for both hands, one by one. This tells the device that the user is in the reset position, from where it will proceed for signing or for further calibration.
In step-2, the user stands straight, lifts the arms from the resting position towards the shoulder, sets the hand shape similar to the alphabet "A" facing the opposite person, and sets the finger-spelling position as shown, for both hands, one by one. This tells the device the position of the wrist next to the shoulder.
In step-3, the user stands straight and lifts the arm from the resting position until it makes an angle with the palm facing the opposite person, for both hands, one by one. This tells the device the position of the palm.
In step-4, the user stands straight, lifts the right arm towards the right shoulder, with a flat palm and fingers pointing up (towards the head), then lifts the left arm from the resting position, with a flat palm, and positions the fingers underneath the right-arm elbow to define the signing area. This tells the device the location of the shoulders precisely.
In step-5, the user stands straight, lifts the right arm from the resting position, with a flat palm, and places the hand over the heart. This tells the device the location of the heart within the space of step-4.
In step-6, the user stands straight, lifts the arm from the resting position, with a flat palm, and places it over the stomach, for both (left and right) arms and hands, one by one. This tells the device the location of the stomach.
In step-7, the user stands straight, lifts the arm from the resting position, with a flat palm, and places it over the chest, for both (left and right) arms and hands, one by one. This tells the device the location of the chest of the user's body.
In step-8, the user stands straight, lifts the arm and places it in the finger-spelling position, with a flat palm facing the opposite person, for both hands, one by one. This tells the device the position of the palm.
In step-9, the user stands straight, lifts the arm and places it in the finger-spelling position, with a flat palm facing towards the user, for both hands, one by one. This tells the device the position of the palm.
In step-10, the user stands straight, lifts the arm and places it in the finger-spelling position, with a flat palm facing towards the opposite shoulder, for both hands, one by one. This tells the device the location of the opposite shoulder from the opposite wrist when it touches the opposite shoulder.
In step-11, the user stands straight, lifts the arm and places it at the chest position, with a flat palm facing up, for both hands, one by one. This tells the device the position of the palm.
In step-12, the user stands straight, lifts the arm and places it at the chest position, with a flat palm facing the user, for both hands, one by one. This tells the device the position of the palm.
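The calibration procedure above can be sketched as a loop that records one sensor snapshot per named step into non-volatile storage (modeled here as a plain dict), with vibration marking the start and end of each capture. Step names and callbacks are illustrative assumptions.

```python
# Illustrative sketch of the calibration loop: one stored sensor snapshot
# per calibration step, with vibration signalling start and end.
# read_sensors() and vibrate() are hypothetical callbacks.
def run_calibration(steps, read_sensors, vibrate):
    """Record one sensor snapshot per calibration step."""
    store = {}  # stands in for the non-volatile memory
    for name in steps:
        vibrate("start")          # tell the user to hold the posture
        store[name] = read_sensors()
        vibrate("end")            # posture captured, move to the next step
    return store
```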
Secondly, the calibration method for the case that one or two hands of a user is/are located near the neck, the face, or the head of the user is described:
In step-1, the user's head is divided into four positions: left side of the front head, right side of the front head, top of the head, and back side of the head. The user lifts the right and left arms (one by one) and places the index finger at each location of the head. This tells the device the locations of the head.
In step-2, the user's forehead is a single position. The user lifts the right and left arms (one by one) and places the index finger over the forehead. This tells the device the location of the user's forehead.
In step-3, the user's eyes are divided into two positions: left eye and right eye. The user lifts the right and left arms (one by one) and places the index finger over the closed left and right eyes. This tells the device the location of the user's eyes.
In step-4, the user's nose is a single position of the face. The user lifts the right and left arms (one by one) and places the index finger over the nose. This tells the device the location of the user's nose on the face.
In step-5, the user's ears are divided into two positions: left ear and right ear. The user lifts the right and left arms (one by one) and places the index finger over the left and right ears, one by one. This tells the device the location of the ears on the user's face.
In step-6, the user's cheeks are divided into two positions: left cheek and right cheek. The user lifts the right and left arms (one by one) and places the index finger over the left and right cheeks, one by one. This tells the device the location of the cheeks.
In step-7, the user's mustache is a single position of the face. The user lifts the right and left arms (one by one) and places the index finger over the mustache. This tells the device the location of the mustache on the user's face.
In step-8, the user's lips and teeth have one position. The user lifts the right and left arms (one by one) and places the index finger over the lips. This tells the device the location of the user's lips.
In step-9, the user's chin is also a single position of the face. The user lifts the right and left arms (one by one) and places the index finger over the chin. This tells the device the location of the user's chin.
In step-10, the user's neck has one position. The user lifts the right and left arms (one by one) and places the index finger over the neck. This tells the device the location of the user's neck.
In short, the wearable human-assistive audio-visual inter-communication device comprising the self-contained wireless Gloves system provides a complete and self-contained solution in many alternative communication situations, with or without Sign Language.
Another embodiment of the invention is described hereinafter.
In the embodiment, a wrist mounted device has a wrist band and a processing device mounted on the wrist band. The processing device is configured by adding a video processor (VP) 154, a microphone array 148, 149 and 150, an image processor (IMP) 156, a camera 155, a short-handwriting-to- text (SHTT) engine 227, a SHTT database 232, an Alert Checker (AC) engine 229, an AC database 234, a speech-to-text (STT) engine 226, and a STT database 231 as installed in the processing device A (107) to the processing device B (117).
The wrist mounted device is for disabled persons and/or patients and is mounted on the wrist of the disabled person and/or patient without gloves.
This processing device can convert handwriting sensed by the touch panel grid 257 to video animation data and can send the video animation data to a remote device by wireless communication such as Bluetooth, IrDA, cellular phone, and GPRS (General Packet Radio Service).
In addition, this processing device can convert audio data from the microphones 148, 149 and 150 to text data. This processing device can send the audio data and/or the text data by wireless communication such as Bluetooth, IrDA, cellular phone, and GPRS.
In addition, this processing device can convert audio data from the microphones 148, 149, and 150 into text data or video animation data and can display text from the text data, or a video animation from the video animation data, on the LCD display 256.
In addition, this processing device can convert audio data from the microphones 148, 149, and 150 into text data, and a vibrator 191 in the processing device starts vibrating when the converted text data matches one of the predefined text data.
Industrial Applicability
The above apparatus is applicable to various technical fields as follows:
(a) To provide a state-of-the-art communication system for deaf and hard-of-hearing disabled persons, which assists them to inter-communicate with normal persons as well as with other disabled persons.
(b) To provide a complete end-to-end communication system, which allows deaf persons to communicate not only face-to-face but also remotely, for voice and/or data communication, with both deaf and normal persons.
(c) To bring the deaf disabled community closer to everyday life and work.
(d) To provide an extremely lightweight and easy-to-use system, which does not require wiring around the body of the user.
(e) To provide a self-contained complete solution without the need of external data processing.
(f) To provide a communication system which provides conversation in multiple communication forms, like audio, text, graphical video animation and digital coded sign data.
(g) To provide Handwritten Short-Handwriting recognition to text, voice, and graphical Sign video animation communication.
(h) To provide a communication system which communicates in multiple languages and can be used world-wide.
(i) To provide a communication solution without the pre-requisite of a similar device at the other end.
(j) To provide built-in two-way communication over a Cell phone with a live video Camera.
(k) To provide an industry-standard communication system, which can be interfaced with IT, Telecom and devices for the deaf disabled, for broader applications.
(l) To provide a glove apparatus for Virtual Reality applications.
(m) To provide a wrist-based Processing device for various data processing and control applications. That is, wrist processing devices may also be used in various data processing, computing and control applications and can be used without gloves.