WO2004114107A1 - Human-assistive wearable audio-visual inter-communication apparatus. - Google Patents


Info

Publication number
WO2004114107A1
WO2004114107A1 · PCT/JP2003/007863
Authority
WO
WIPO (PCT)
Prior art keywords
data
text
wrist
glove
audio
Prior art date
Application number
PCT/JP2003/007863
Other languages
French (fr)
Inventor
Nadeem Mohammad Qadir
Original Assignee
Nadeem Mohammad Qadir
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nadeem Mohammad Qadir filed Critical Nadeem Mohammad Qadir
Priority to AU2003243003A priority Critical patent/AU2003243003A1/en
Priority to PCT/JP2003/007863 priority patent/WO2004114107A1/en
Publication of WO2004114107A1 publication Critical patent/WO2004114107A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00Teaching, or communicating with, the blind, deaf or mute
    • G09B21/009Teaching or communicating with deaf persons
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00Teaching, or communicating with, the blind, deaf or mute
    • G09B21/04Devices for conversing with the deaf-blind

Definitions

  • the present invention is directed generally to an apparatus that provides a self-contained wearable human-assistive communication device.
  • the invention relates to recognizing Sign Language and converting it into vocal speech, and to converting vocal speech into readable text.
  • the apparatus is not just a data glove but a complete glove with a built-in processing and communication device.
  • a wearable human-assistive communication device is a data glove, generally a glove that fits over at least a part of a user's hand and detects flexion of hand joints, touch and pressure of various muscles, and measurements at specific locations of the hand.
  • Data gloves or instrumented gloves have been implemented using several different approaches, including fiber optics, resistive sensors and accelerometers attached to the glove's joints to detect movement thereof.
  • Conventional data gloves or instrumented gloves can be awkward for the user to operate because most of these gloves require intensive data processing and a powerful attached computing device. In general, these data gloves could not achieve wide adoption given the limited scope of their applications.
  • it is not possible for deaf disabled persons to carry heavy equipment for communication. More importantly, deaf disabled persons also require warnings, and there are many other conditions in which other people, or hazard alerts, must be able to reach them directly.
  • Each spoken language has its own alphabet and tones of sound and different rules of grammar; similarly, Sign Language also differs from country to country.
  • Currently available data gloves are not well suitable for all Sign Languages.
  • a sensor material for fabricating instrumented clothing includes a conductive rubber layer.
  • two electrodes are disposed within the rubber layer, are connectable to an external circuit and are separated by a separation distance to form an electrical path from one electrode to the other through an intermediate portion of the conducting rubber layer.
  • the electrical resistance measured between the electrodes is indicative of strain in the intermediate portion of the conducting rubber layer, thus permitting measurements of movement of the fabric to be made.
  • the fabric may be used to form articles that a user can wear, including a data glove, so that movements of the user may be detected and measured.
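A minimal sketch of the readout described above: the resistance measured between the electrodes indicates strain in the conductive rubber, so an assumed linear gauge model can recover strain from resistance. The nominal resistance `r0` and gauge factor `gf` are illustrative values, not taken from the patent.

```python
# Hedged sketch of a conductive-rubber strain readout: resistance between
# the two electrodes rises with strain, here modelled linearly.
# r0 (unstrained resistance) and gf (gauge factor) are assumed values.

def strain_from_resistance(r_measured: float, r0: float = 1000.0, gf: float = 2.0) -> float:
    """Estimate strain from measured resistance via delta-R / R0 = GF * strain."""
    return (r_measured - r0) / (r0 * gf)
```

With these assumed values, a measured 1100 ohms corresponds to a strain of 0.05.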
  • Harvill et al. discloses a motion sensor which produces an asymmetrical signal in response to symmetrical movement.
  • a plurality of motion sensors are placed over the joints of a hand, with each sensor comprising an optical fiber disposed between a light source and a light sensor.
  • An upper portion of the fiber is treated so that transmission loss of light being communicated through the optical fiber is increased only when the fiber bends in one direction.
  • a light source and light sensor on opposite ends of the tube continuously indicate the extent that the tube is bent.
  • U.S. Patent No. 6,452,584 to Walker et al. is directed to data glove sensing hand gestures.
  • a system for manipulating computer generated animation in real time, such as a virtual reality program running on a computer.
  • the system includes a data glove for managing data based on an operator's hand gestures.
  • This data glove comprises an elastic material that closely matches the shape of a wearer's hand, enabling the wearer to move their hand freely.
  • a movement sensing unit is provided for sensing any hand gestures of the wearer.
  • the movement sensing unit comprises a flexible circuit board that extends along the dorsal region of the wearer's fingers and hand.
  • the circuit board includes a base with a signal processor for processing received signals generated by a plurality of movement sensors.
  • the sensors transmit signals to the processor for determining any movement of the wearer's hand.
  • the sensors have a resistive material disposed on each side thereof, so that any flexure of the sensor causes the resistance values to diverge, preferably linearly.
  • the resistance values on each side of the sensor diverge to a value corresponding to the degree of flexure of the sensor.
  • a reference voltage is applied to each side of the sensor for establishing a voltage differential between its two sides. Any flexure of the sensor causes the resistance value of each side to change, for changing the reference voltage level between the two sides to indicate that the sensor has been flexed and the degree of flexure.
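The two-sided flex sensor above can be sketched numerically: the resistive layers on each side diverge (here, assumed linearly) with the bend angle, and the voltage between the sides indicates the degree of flexure. The nominal resistance, divergence coefficient and reference voltage are illustrative assumptions.

```python
# Sketch of the Walker-style two-sided flex sensor: side resistances diverge
# linearly with bend angle, and the differential voltage in a simple series
# divider indicates flexure. All component values are assumptions.

def side_resistances(angle_deg: float, r_nominal: float = 10_000.0, k: float = 50.0):
    """Return (upper, lower) resistances diverging linearly with bend angle."""
    return r_nominal + k * angle_deg, r_nominal - k * angle_deg

def differential_voltage(angle_deg: float, v_ref: float = 3.3) -> float:
    """Voltage difference across the two sides in a series divider."""
    r_up, r_down = side_resistances(angle_deg)
    i = v_ref / (r_up + r_down)          # series current from the reference
    return i * (r_up - r_down)           # differential drop indicates flexure
```

At zero flexure the two sides balance and the differential voltage is zero, as the patent's description implies.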
  • said gloves and other such data gloves have common limitations. They measure the flexion of the fingers only; the systems in said patents/applications do not measure other joints, muscles and locations of the hands. More importantly, in Sign Language both hands sign together and make many combined movements. These combined movements, in which the hands touch and press against each other, cannot be sensed, because there are no such sensors in the said patented gloves. Secondly, one way or the other, these patented gloves require a very high-performance external processing system and much other communication apparatus before the gloves can be used. In all prior-art inventions, communication is initiated by the data glove wearer; he or she can communicate only when the data gloves are connected to all the necessary processing and communication equipment and power sources. Other people cannot initiate communication unless the user is sitting live at said patented glove apparatus.
  • a further disadvantage of these data gloves is that the movement monitoring devices have poor longevity and are prone to reliability problems.
  • Another disadvantage of these movement monitoring devices is that they may not sufficiently track the hand gestures of the wearer.
  • the sensors may generate signals that are not an accurate representation of the wearer's hand gestures causing erroneous data to be generated.
  • these techniques have not solved the communication problems of deaf and speechless (dumb) disabled persons in practical use.
  • the present invention is a wearable human-assistive audio-visual inter-communication apparatus as a glove system, which is a long-needed valuable invention that fulfills the much-cherished goal of thousands of institutions for the deaf and dumb all over the world.
  • the aim of the invention is to provide an extremely useful device, particularly for those disabled persons who do not know even Sign Language.
  • with communication means such as a Blue-tooth wireless device, a built-in Cellular and GPRS (General Packet Radio Service) device, another device connected directly through industry-standard interfaces like Universal Serial Bus (USB) and/or Infrared (IrDA), or the like, it can provide communication assistance to deaf and speechless (dumb) disabled persons to comfortably inter-communicate over remote distances.
  • when the invention stores as a database plural data sets for detecting the different kinds of sign language of plural countries, it can also provide cross-Sign-Language conversion to assist people across the globe to inter-communicate completely, deaf or speechless disabled-to-disabled and disabled-to-normal persons, face-to-face and face-to-remote-distance, without the pre-conditioned need of a similar device at the other end.
  • the invention also relates to a self-contained communication device to be worn on the hands. It has a pair of hand gloves with built-in wearable wrist processing devices, designed to determine the gestures of Sign Language of one or both hands. It can also convert sign language into data in a different format such as digital sign data, speech, text, video animation or the like. It can also convert back from speech to text, sign data and graphical video animation to provide intercommunication.
  • the invention can also recognize handwriting and convert it into text.
  • the invention can include a built-in cell phone and camera which enable remote-distance voice and data communication world-wide. The user can initiate phone calls and can also send live or pre-stored video images. More importantly, the invention does not necessarily require a similar device at the other end to intercommunicate; it may also intercommunicate with other devices through a software plug-in and/or a software utility program for a specific function.
  • a solar cell is attached on the dorsal side of the glove, so that electronic devices within the glove can be supplied electric power from the solar cell.
  • the invention also provides a data glove.
  • This data glove has (a) a flexible printed circuit board settled on a dorsal side of a hand and extended towards a palm side of a hand, which has parts corresponding to five fingers, an ulner part, and extension parts extending to distal area of finger pulps on a palm side, (b) a first group of sensors in the flexible printed circuit board, for sensing touch force to the distal area of finger pulps, (c) a second group of sensors in the flexible printed circuit board, for sensing touch force to finger nails, and (d) a touch force sensor in the flexible printed circuit board, for sensing touch force to the ulner part, (e) a touch force sensor in the flexible printed circuit board, for sensing touch force to the mid palmer space.
  • the invention also provides wrist mounted devices.
  • Each of the wrist mounted devices has a wrist band and a device mounted on the wrist band.
  • This device includes a text database having text data corresponding to gesture data; a gesture-to-text conversion engine which reads gesture data sensed by sensors and finds the equivalent text word in the text database; a sentence composer engine which takes the individual words of text from the gesture-to-text conversion engine and re-arranges the words into a formal sentence; a speech database having audio data corresponding to text data; a text-to-speech engine which produces audible speech from a text sentence by using the speech database; a speech-to-text engine which converts speech data into text data by using the speech database; a graphical animation engine which converts text data to gesture data and produces graphical animation data of the gesture from the gesture data; a display for displaying text from the text data and a graphical animation from the graphical animation data; a speaker for outputting speech from the speech data; etc.
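The conversion chain in the wrist device can be sketched as follows: gesture data is looked up in a text database, the words are re-arranged into a formal sentence, and the sentence is mapped to speech data. The tiny in-memory databases, gesture codes and the single grammar rule below are invented for illustration only.

```python
# Hedged sketch of the wrist-device pipeline: gesture-to-text lookup,
# sentence composing, then text-to-speech lookup. The database contents
# and gesture codes are illustrative assumptions, not from the patent.

TEXT_DB = {"G_ME": "me", "G_GO": "go", "G_HOME": "home"}   # gesture code -> word
SPEECH_DB = {"I go home.": b"<pcm audio bytes>"}           # sentence -> audio data

def gesture_to_text(gestures):
    """Gesture-to-text conversion engine: look up each gesture code."""
    return [TEXT_DB[g] for g in gestures]

def compose_sentence(words):
    """Stand-in sentence composer: one grammar rule plus capitalisation."""
    if words and words[0] == "me":
        words = ["I"] + words[1:]      # assumed rule: sentence-initial "me" -> "I"
    return " ".join(words).capitalize() + "."

def text_to_speech(sentence):
    """Text-to-speech engine: fetch pre-stored audio for the sentence."""
    return SPEECH_DB.get(sentence, b"")
```

Running the chain on the gesture sequence `G_ME, G_GO, G_HOME` yields the formal sentence "I go home." and its stored audio.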
  • the device includes a touch panel sensing hand writing; a converting means for converting the hand writing sensed by the touch panel to a video animation data; and a sending means for sending the video animation data to a remote device by wireless communication, etc.
  • Fig. 2 illustrates internal components of the human-assistive wearable wireless glove system.
  • Fig. 3 illustrates a block diagram of system and components functions of the human-assistive wearable wireless glove system for processing device A for one of a left or a right hand.
  • Fig. 4 illustrates a block diagram and components functions of human assistive wearable wireless glove system for processing device B for the other one of the left or the right hand.
  • Fig. 5 illustrates a block diagram of software engines and databases for the processing device A.
  • Fig. 6 illustrates a block diagram of software engines and databases for processing device B.
  • Fig. 7 illustrates joints and locations of dorsal hand important to be measured.
  • Fig. 8 illustrates muscles and specific locations of the palmer hand important to be measured.
  • Fig. 9 illustrates the position and type of sensors in dorsal and palmer hands.
  • Fig. 10 illustrates force and touch resistor sensors over a Polyimide Flexible Printed Circuit Board (FPCB) sheet.
  • Fig. 11 illustrates bend resistor sensors over the Polyimide Flexible Printed Circuit Board (FPCB) sheet.
  • Fig. 12 illustrates RTV (Room Temperature Vulcanizing) silicon rubber layers sprayed over the Flexible Printed Circuit Board (FPCB) sheet.
  • FIG. 1 shows an embodiment of the invention.
  • both gloves 101 and 102 have built-in miniature complex wireless, analog and digital data processing devices 117 and 107 within the gloves; the devices 117 and 107 are mounted at the wrist side of the gloves 101 and 102, and each device 117, 107 is attached to a wrist band.
  • flexible solar cells 103 and 112 are mounted at outer layers of the glove system to provide alternative power source of the devices 117 and 107.
  • Control key switches 113 and 109 are switches to control and operate the processing devices 117 and 107.
  • touch screen panel grids and displays 110 and 114 provide data input and output functions.
  • a speaker 116 is built within the processing device 117.
  • a microphone 104 is built in the processing device 107.
  • the processing device 107 has a built-in camera 105. Antennas 115 and 108 are set in the processing devices 117 and 107 for transmitting and/or receiving data. Wrist-straps 118 and 106 tie the processing devices 117 and 107 over the wrists and the gloves 101 and 102.
  • the processing devices 117 and 107 are an integral part of the glove system 101 and 102.
  • Self-contained unit means that it does not require any external device or equipment to perform functions.
  • FIG. 2 illustrates internal components of the glove system as embodiment of the invention.
  • a Flexible Printed Circuit board (FPCB) sheet 120 has both bend resistive and force resistive sensors on the surface.
  • Accelerometer sensor groups 126 and 125 are installed over the FPCB sheet 120.
  • a dual-port Analog Multiplexer Switch device 127 is installed directly on the FPCB sheet 120 at dorsal side.
  • One port in the dual-port Analog Multiplexer Switch device 127, "Port A", is for bend flex resistor sensors, whereas the other port, "Port B", is for force resistor sensors.
  • a flexible cable bank 121 is a connector, which connects the FPCB sheet 120 with a Printed Circuit Board (PCB) 122 of the processing device 107 or 117.
  • the PCB 122 is installed within the glove 101 or 102 at the wrist side of the hand, in a position similar to a wrist watch. Accelerometer sensors 123 and 124 measure the location of hand movements.
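One way the processor might sweep the sensors through the dual-port multiplexer can be sketched as below: Port A selects one bend resistor at a time, Port B one force resistor. The channel counts follow the sensor numbering later in the text (fifteen bend sensors 340 to 354; twelve force sensors 320 to 331); the `read_adc` callback is a hypothetical hardware-access function.

```python
# Hedged sketch of scanning sensors via the dual-port analog multiplexer:
# Port A carries bend flex resistors, Port B carries force resistors.
# read_adc(port, channel) is an assumed callback that samples one sensor.

def scan_sensors(read_adc, n_bend=15, n_force=12):
    """Return ({bend channel: value}, {force channel: value}) after one sweep."""
    bend = {ch: read_adc("A", ch) for ch in range(n_bend)}    # Port A sweep
    force = {ch: read_adc("B", ch) for ch in range(n_force)}  # Port B sweep
    return bend, force
```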
  • FIG. 3 illustrates a block diagram of the processing device A (107) of Fig. 1 as embodiment which demonstrates various components and their flow in the device operation.
  • Signals from all bend resistor sensors 133 pass through Port A of the analog Multiplexer Switch (MUX) 132 (corresponding to 127 in Fig. 2) within the FPCB sheet 120.
  • MUX 132 is controlled by the analog signal processor (ASP) 139.
  • signals from all force resistor sensors 134 also pass through Port B of MUX 132 where it is controlled by ASP 139.
  • Wheatstone Bridges 137 and 138 provide voltages to the bend and force sensors 133 and 134. When the values of the sensors 133 and 134 change, the Wheatstone Bridges 137 and 138 output the respective change in current flow due to the change in the sensors 133 and 134.
  • the ASP 139 measures the value of the current change after converting the analog current change into digital form.
  • dorsal accelerometer sensors 135 and 136 are corresponding to the sensors 125 and 126 in Fig. 2
  • wrist accelerometer sensors 146 and 147 are corresponding to the sensors 123 and 124 in Fig. 2.
  • the outputs of these sensors 135, 136, 146 and 147 are measured and controlled by the ASP 139.
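The bridge measurement above can be sketched numerically: the bridge converts a change in sensor resistance into an output signal, which the ASP digitises and maps back to a resistance value. A quarter-bridge with three equal fixed arms is assumed here; the patent does not specify the bridge topology, and the component values are illustrative.

```python
# Hedged sketch of the Wheatstone-bridge readout. An assumed quarter-bridge:
# the sensor forms one arm, the other three arms are fixed resistors r_fixed.

def bridge_output(r_sensor: float, r_fixed: float = 10_000.0, v_ex: float = 3.3) -> float:
    """Bridge output voltage for a given sensor resistance."""
    return v_ex * (r_sensor / (r_sensor + r_fixed) - 0.5)

def sensor_resistance(v_out: float, r_fixed: float = 10_000.0, v_ex: float = 3.3) -> float:
    """Invert bridge_output: recover the sensor resistance from the voltage."""
    ratio = v_out / v_ex + 0.5            # = r_sensor / (r_sensor + r_fixed)
    return r_fixed * ratio / (1.0 - ratio)
```

At the nominal resistance the bridge is balanced and outputs zero; the inverse function recovers the sensor value from any measured output.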
  • Control key switches 131 trigger and provide input function for the ASP 139 and main Central Processing Unit (CPU) 140.
  • a display and touch grid device 130 serves as an input and output device.
  • the CPU 140 sends text and graphics to the display of the device 130 to be displayed. Using a Stylus Pen, hand-written characters, touch characters and clicks are placed on top of the touch grid of the device 130, which sends the change in grid value to the CPU 140. The CPU 140 measures the input changes of the grid value.
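The grid-change measurement can be sketched as a comparison of successive scans: the CPU flags the cells whose values changed, which trace out the stylus strokes. The grid representation and the change threshold are illustrative assumptions.

```python
# Hedged sketch of touch-grid input detection: compare the current grid scan
# against the previous one and report cells that changed beyond a threshold.
# Grid layout (list of rows) and threshold value are assumptions.

def changed_cells(prev, curr, threshold=10):
    """Return (row, col) cells whose value changed by more than threshold."""
    return [
        (r, c)
        for r, row in enumerate(curr)
        for c, v in enumerate(row)
        if abs(v - prev[r][c]) > threshold
    ]
```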
  • Microphones 148, 149 and 150 (corresponding to 104 in Fig. 1) combine to form an array of microphones that receives live audio and outputs an audio signal; the signal passes to a Voice Processor (VP) 154 which treats the audio signal to remove unwanted noise and echo.
  • the VP 154 delivers the filtered audio signal to a digital signal processor (DSP) 145 which not only converts the analog audio signal into digital format, but also performs intensive audio analysis for voice-recognition (speech-to-text) translation.
  • the ASP 139, the CPU 140, and the DSP 145 are inter-connected and perform program execution under the master command of the CPU 140.
  • a miniature Camera 155 captures video and sends a video signal to an image processor (IMP) 156.
  • IMP 156 processes the captured video signal and forwards it to the DSP 145, where it is processed back and forth in conjunction with a temporary memory 144 and a non-volatile memory storage 157.
  • the DSP 145 sends the final video image to the CPU 140.
  • a controller device 153 controls battery charging for the battery and selects the power source of the entire device between the battery and a solar cell 152 (corresponding to 103 or 112 in Fig. 1).
  • the device 153 works in conjunction with the processors 139 and 140 for various power saving and sleep-mode operations.
  • a Universal Serial Bus (USB) 159 and an Infrared (IrDA) 158 are hardware interfaces which connect external input/output devices with the processing device 107 of the glove 101.
  • a Blue-tooth Transceiver 142 and a Base-band 143 provide wireless communication with the processing device B (117) of the glove 102 and also other external devices through wireless data exchange.
  • An antenna 141 is for Blue-tooth wireless.
  • the ASP 179 measures the value of the current change after converting the analog current change into digital form.
  • dorsal accelerometer sensors 186 and 187 and wrist accelerometer sensors 188 and 190 are also installed on the FPCB sheet for the glove 102. The outputs of these sensors are measured and controlled by the ASP 179.
  • Control key switches 171 trigger and provide input function for ASP 179 and main CPU 178.
  • a display and touch grid device 170 serves as an input and output device.
  • the CPU 178 sends text and graphics to the display of the device 170 to be displayed. Using a Stylus Pen, hand-written characters, touch characters and clicks are placed on top of the touch grid of the device 170, which sends the change in grid value to the CPU 178.
  • the CPU 178 measures the input changes of the touch grid value.
  • a vibrator motor 191 is a very important component of the embodiment.
  • the vibrator motor 191 is controlled by a vibrator motor controller 194 which takes signals from the ASP 179.
  • the ASP179, the CPU 178 , and the DSP 180 are inter-connected and perform program execution under the master command of the CPU 178.
  • a miniature speaker 174 provides audio output.
  • the speaker 174 is driven by an audio amplifier 175.
  • the CPU 178 sends the final audio output to the audio amplifier 175 which, after signal amplification, sends the audio signal to the speaker 174 to be output.
  • a controller device 195 controls battery charging of a battery 193 and selects the power source of the entire device from the battery 193 and a solar cell 192.
  • the device 195 works in conjunction with the processors 179 and 178 for various power saving and sleep-mode operations.
  • a Universal Serial Bus (USB) 173 and an Infrared (IrDA) 172 are hardware interfaces which connect external input/output devices with the processing device 117 of the glove 102.
  • a Blue-tooth Transceiver 177 and a Base-band 200 provide wireless communication with the processing device A (107) of the glove 101 and also other external devices through wireless data exchange.
  • An antenna 176 is for Blue-tooth wireless.
  • a Cellular & GPRS Transceiver 199, a Cellular Base-band 198 and a Subscriber Identification Module (SIM) 197 are the components of the built-in cell-phone/GPRS device which provide voice and data communication at remote distance.
  • the General Packet Radio Service (GPRS) allows information to be sent and received.
  • the Cellular & GPRS Transceiver 199, the Cellular Base-band 198 and the Subscriber Identification Module (SIM) 197 are used for GPRS applications such as Chat, Textual and Visual Information, still Images, Moving Images, Web Browsing, Document Sharing/Collaborative Working, Audio, Job Dispatch, Corporate Email, Internet Email, Device user's Positioning, Remote LAN Access, File Transfer, and Home Automation, etc.
  • a wide range of content can also be delivered to the device 117 through GPRS services ranging from share prices, sports scores, weather, flight information, news headlines, prayer reminders, lottery results, jokes, horoscopes, traffic, location sensitive services and so on.
  • This information need not necessarily be textual; it may be maps or graphs or other types of visual information.
  • An antenna 202 is for the built-in Cell phone. These components are controlled by CPU 178.
  • FIG. 5 illustrates various software data conversion engines and databases of the processing device A (107) of Fig. 1.
  • the processing device A means a main processing part such as the ASP 139, the CPU 140, the DSP 145, and the VP 154.
  • raw sign data 223, that is, sensor data from the sensors, is from the device's own glove 101.
  • the raw sign data 223 is simplified by the processing device 107, and the device 107 sends it out through the Blue-tooth antenna 141 to the processing device B (117) for sign-to-text interpretation (conversion).
  • a Speech-to-Text (STT) engine 226 and a STT database 231 convert audio speech data to text data.
  • the processing device A (107) runs the STT engine 226 which converts speech into text.
  • An Alert Checker (AC) engine 229 and an AC database 234 cross-check the output text of the STT engine 226 for various warning, information and control conditions. If the output text of the STT engine 226 matches an alert condition, the processing device A (107) sends a signal to the processing device B (117) through the Blue-tooth device 141, 142 and 143 to activate an alert signal on the vibrator motor 191 in the device B (117).
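The alert-checking step can be sketched as a scan of the recognised text against an alert database, with a match triggering the vibrator on device B. The alert phrases and the `notify_vibrator` callback are invented for illustration.

```python
# Hedged sketch of the Alert Checker engine: match recognised speech text
# against an alert database and trigger device B's vibrator on a hit.
# The phrase set and the notify callback are illustrative assumptions.

AC_DB = {"fire", "danger", "stop", "watch out"}   # assumed alert phrases

def check_alert(text: str, notify_vibrator) -> bool:
    """Trigger the vibrator (via Bluetooth, abstracted as a callback) on a match."""
    lowered = text.lower()
    if any(phrase in lowered for phrase in AC_DB):
        notify_vibrator()               # signal sent on to device B
        return True
    return False
```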
  • regular text converted by the STT engine 226 is passed to a Sentence Composer (SC) engine 228 and a SC database 233 for sentence composing based on specific grammar rules.
  • the SC engine 228 and the SC database 233 convert the regular text to a formal sentence.
  • the final sentence can be displayed on the LCD display 220 of the device 130, and/or broadcast out to external input/output interfaces, and/or sent back through the wireless device to the processing device B for further broadcast to the built-in cell phone.
  • a touch panel grid device 230 is, in the device 130, for handwriting and command input.
  • a Short Handwriting-to-Text (SHTT) engine 227 and a SHTT database 232 convert a short handwriting detected by the touch grid device 230 into a text.
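One plausible sketch of short-handwriting recognition is to reduce a stylus stroke to a sequence of dominant directions and look that sequence up in the SHTT database. The stroke encoding and the tiny database below are illustrative assumptions, not the patent's actual method.

```python
# Hedged sketch of a short-handwriting-to-text step: encode each stroke as
# its dominant direction sequence, then look it up. The encoding scheme and
# database contents are invented for illustration.

SHTT_DB = {"DU": "v", "R": "-", "D": "i"}   # direction sequence -> character

def encode_stroke(points):
    """Encode a stroke (list of (x, y) points) as its dominant directions."""
    dirs = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        d = ("R" if dx > 0 else "L") if abs(dx) >= abs(dy) else ("D" if dy > 0 else "U")
        if not dirs or dirs[-1] != d:    # collapse repeated directions
            dirs.append(d)
    return "".join(dirs)

def stroke_to_text(points):
    """Map a stroke to a character, or '?' if it is not in the database."""
    return SHTT_DB.get(encode_stroke(points), "?")
```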
  • a video camera input 224 is obtained from the camera 155.
  • Other data input 225 is obtained from USB 159, IrDA 158, etc.
  • FIG. 6 is a block diagram of various software Engines and Databases in processing device B (117).
  • the processing device B means a main processing part such as the ASP 179, the CPU 178, and the DSP 180.
  • a raw sign data 252 is from its own glove whereas a sign data 251 is received from the processing device A (107).
  • a Sign-to-Sign Codes (STSC) conversion engine 268 and a STSC database 271 take a set of sign data detected on the both gloves 101 and 102, and convert the set of sign data to a series of sign codes which represent sign language words.
  • a Sign-Codes-to-Text (SCTT) conversion engine 267 and a SCTT database 270 take the series of sign codes from the STSC engine 268 and convert sign codes into a raw text.
  • the raw text is then passed to a Sentence Composer (SC) engine 269 and a SC database 272, which correct the format of the raw text based on specific grammar rules.
  • the final text data by the SC engine 269 is broadcast to external devices; to blue-tooth wireless 176, 177, and 200; and/or to built-in cell phone device 202, 199, and 198; and/or to directly at the speaker 174.
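The device-B recognition chain above can be sketched end to end: raw sensor frames from the gloves are matched to sign codes (STSC), the codes are mapped to words (SCTT), and the sentence composer formats the result. All frames, codes and database entries below are invented for illustration.

```python
# Hedged sketch of the STSC -> SCTT -> sentence-composer chain in device B.
# Sensor patterns, sign codes and database contents are assumptions.

STSC_DB = {(0, 1, 1): "SC_HELLO", (1, 0, 0): "SC_YOU"}   # sensor pattern -> sign code
SCTT_DB = {"SC_HELLO": "hello", "SC_YOU": "you"}          # sign code -> word

def signs_to_text(frames):
    """Convert a sequence of sensor frames into a composed sentence."""
    codes = [STSC_DB[tuple(f)] for f in frames if tuple(f) in STSC_DB]  # STSC step
    words = [SCTT_DB[c] for c in codes]                                 # SCTT step
    return (" ".join(words).capitalize() + ".") if words else ""        # SC step
```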
  • a Sign-code-to-Video-Animation (STVA) engine 262 and a STVA database 263 take digital sign-coded data and convert it into video animation data of the sign language corresponding to the sign-coded data. The sign-coded data of each hand's joints, muscles and locations, together with the output of the accelerometer sensors, is applied as graphical input to present an equivalent sign video animation.
  • a Text-to-Speech (TTS) synthesizer engine 260 and a TTS database 261 take text data and convert it into speech data.
  • a Text-to-Sign-Code (TTSC) engine 258 is a reverse conversion engine of the SCTT engine 267 and a TTSC database 259 is a reverse conversion database of the SCTT database 270.
  • a touch panel grid 257 is for command input into the processing device B (117).
  • a LCD display 256 displays data in the processing device B (117).
  • Other data input 264 is obtained from USB 173, IrDA 172, etc.
  • Figure 7(A) illustrates dorsal hand and wrist joints.
  • a drawing 290 is an anatomy of dorsal hand's flex joints.
  • Joints 283 are Metacarpol Phalangeal flex joints of an Index Finger 281, a Middle Finger 285, a Ring Finger 286, and a Little Finger 287.
  • Joints 282 are Proximal Interphalangeal Flex joints of a Thumb 288, and the fingers 281, 285, 286, and 287.
  • Joints 280 are Distal Interphalangeal Flex joints of the thumb 288, and the fingers 281, 285, 286, and 287.
  • a Flexor Retinaculum Wrist joint 284 is at the connection between a hand and an arm.
  • Figure 7(B) illustrates specific flexion muscles and location of dorsal hand.
  • a drawing 298 is an anatomy of dorsal hand.
  • a thumb nail is in a location 297, an index finger nail is in a location 293, a middle finger nail is in a location 292, a ring finger nail is in a location 294, and a little finger nail is in a location 295.
  • Inter Digital Spacer flexions 296 are located at roots of a thumb and fingers.
  • Figure 8 illustrates specific muscles and location of a palmer hand.
  • a palmer hand 303 has Distal Pulps 300 of thumb, index, middle, ring and little fingers, a Medial Ulner Muscle 302, and a Mid Palmer space 301.
  • Figure 9(A) illustrates locations of Bend Flex Resistor sensors 310 and 311 on a dorsal hand 312.
  • Figure 9(B) illustrates locations of flexible Force Resistor sensors 313 on hand nails of a dorsal hand 314.
  • Figure 9(C) illustrates Flexible Force Resistive sensors 315 on a palmer hand 316.
  • Figure 10 illustrates locations of flexible Force Resistor Sensors 325, 326, 327, 328, and 329 as dorsal hand nail sensors, and locations of Distal Pulp flexible Force Resistor sensors 320, 321, 322, 323, and 324 for distal finger pulps of a palmer hand on the Flexible Printed Circuit Board (FPCB) sheet 120.
  • FPCB Flexible Printed Circuit Board
  • a finger part of the FPCB sheet 120 is extended in the direction perpendicular to the finger part; the flexible Force Resistor Sensors 325, 326, 327, 328, and 329 are positioned at the tips of the finger parts, and the Distal Pulp flexible Force Resistor sensors 320, 321, 322, 323, and 324 are positioned on the extended part of the FPCB sheet 120.
  • palmer hand sensors 330 and 331 are installed for Medial Ulner spacer and Mid Palmer space respectively.
  • FIG 11 illustrates locations of flexible Bend Resistors sensors 340 to 354 on the FPCB sheet 120.
  • Flexible Bend Resistors sensors 340, 341, 342, and 343 are Metacarpol joints sensors.
  • Flexible Bend Resistors sensors 345, 346, 347, 348, and 349 are Proximal and Distal Interphalangeal joint sensors which measure the flexion of both the Proximal and Distal Interphalangeal joints.
  • Flexible Bend Resistors sensors 350, 351, 352, and 353 are Inter Digital spacer flexion sensors.
  • Flexible Bend Resistors sensor 354 is a Flexor Retinaculum Wrist joint sensor.
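The joint assignments above can be collected as a lookup table (sensor number to measured joint), as one way firmware might label the bend channels. This is purely a restatement of the listing; note that sensor 344 is not assigned in the text above, so it is omitted here.

```python
# Bend-sensor-to-joint lookup table, restating the assignments in the text.
# Sensor 344 is not assigned in the source listing and is left out.

BEND_SENSOR_JOINTS = {
    **{n: "Metacarpol Phalangeal joint" for n in (340, 341, 342, 343)},
    **{n: "Proximal/Distal Interphalangeal joints" for n in (345, 346, 347, 348, 349)},
    **{n: "Inter Digital spacer flexion" for n in (350, 351, 352, 353)},
    354: "Flexor Retinaculum Wrist joint",
}
```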
  • Figure 12 illustrates RTV Silicon Rubber spray layers as upper rubber layer 372 and bottom rubber layer 370 on both surfaces of the FPCB sheet 120.
  • the upper rubber layer 372 and the bottom rubber layer 370 are formed on the surface of the FPCB sheet 120 without seams.
  • the wearable human-assistive audio-visual inter-communication apparatus has a pair of self-contained hand gloves, designed to determine the Sign Language gestures of both hands and produce speech, text and graphical video animation, and to convert graphical video animation, text and speech back into Sign Language and hand gestures. It also recognizes Short Handwriting written on the Touch Panel Grid and converts it into text.
  • Both gloves 101 and 102 have built-in miniature complex wireless, analog, and digital data Processing Devices at the wrist side of the gloves, which include a Central Processor Unit (CPU), DSP (Digital Signal Processor), Analog Signal Processor (ASP), Voice Processor (VP), Image Processor (IMP), Memory, Memory Storage, Bluetooth Transceiver, Bluetooth Base-band, Cellular & GPRS Transceiver, Cellular Base-band, LCD display, Interface Controller, Accelerometer Sensors, Touch Panel Grid, Array of Microphones, Speaker, Camera, Vibrator Motor, Control Keys, Re-chargeable Battery, and other similar controllers and components.
  • The wearable human-assistive multi-lingual audio-visual inter-communication device comprising the self-contained wireless Gloves system also has Flexible Solar Cells attached at the dorsal area of the said Gloves to provide an alternative power source to the internal electronics and processing device components of the Gloves.
  • The FPCB sheet for each of the gloves 101 and 102 is a double-sided copper-layered Polyimide Flexible Printed Circuit sheet, which is sketched over the entire hand bones for each of the left and right hands.
  • 100 kOhm Flexible Bend resistors (sensors) 340 to 354, as shown in Fig. 11, are screen printed at various sizes (length and diameter) for each finger's joints, which include: Metacarpal Phalangeal Flex joints for the index, middle, ring, and little fingers; Proximal Interphalangeal Flex joints for the thumb, index, middle, ring, and little fingers; Distal Interphalangeal Flex joints for the thumb, index, middle, ring, and little fingers; the Flexor Retinaculum Wrist joint for the wrist and dorsal hand flexor; and Inter Digital Spacer Flex between the thumb and index finger, between the index and middle fingers, between the middle and ring fingers, and between the ring and little fingers.
  • Bend Resistive Sensors 350 to 353 for the Inter Digital Spacer flexes are placed such that they extend from the Proximal Interphalangeals of the fingers, drop back down towards the Metacarpal Phalangeal, and make a U-turn back to the Interphalangeal joints, sensing the flex (opening and closing) linearly.
  • When any of the sensors 340 to 354 bends, its resistance value increases.
  • The degree of joint motion varies from joint to joint.
  • The change of value in the resistors is measured through Wheatstone Bridges 137, 138, 181, and 182 and read by the ASPs 139 and 179.
  • Signals from the bend resistive sensors 340 to 354 pass through Port A of the MUXs 132 (182) mounted on the FPCB sheets at the dorsal area of the hand.
  • the MUXs 132 and 182 are controlled and addressed by the ASPs 139 and 179.
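The bridge readout described above can be sketched numerically. This is an illustrative model only: the quarter-bridge topology (three fixed 100 kOhm arms plus the bend sensor), the excitation voltage, and all function names are assumptions for illustration, not details from the patent.

```python
def bridge_output_voltage(r_sensor_ohm, r_fixed_ohm=100_000.0, v_excite=3.3):
    """Differential output of a quarter Wheatstone bridge whose fourth
    arm is the flexible bend resistor (nominal 100 kOhm at rest)."""
    # Midpoint of the sensor half-bridge minus the midpoint of the fixed half-bridge.
    v_sensor_leg = v_excite * r_sensor_ohm / (r_fixed_ohm + r_sensor_ohm)
    v_fixed_leg = v_excite * r_fixed_ohm / (r_fixed_ohm + r_fixed_ohm)
    return v_sensor_leg - v_fixed_leg

def sensor_resistance_from_voltage(v_out, r_fixed_ohm=100_000.0, v_excite=3.3):
    """Invert the bridge equation to recover the bend resistor's value,
    as the ASP would after digitizing the bridge output."""
    v_leg = v_out + v_excite / 2.0  # absolute voltage at the sensor leg midpoint
    return r_fixed_ohm * v_leg / (v_excite - v_leg)
```

At the nominal 100 kOhm the bridge is balanced (zero output); as a joint bends and the sensor's resistance rises, the output voltage rises with it, which is the linear change the ASPs read through the MUX ports.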
  • The apparatus of the embodiment uses Flexible Force Resistive sensors, as the sensors 320 to 330 in Fig. 10.
  • In a Flexible Force Resistive Sensor, when force, touch, or pressure is applied, the value of the resistive sensor drops linearly from several hundred Megaohms to Kiloohms or even a few ohms, depending on the magnitude of the force, touch, and/or pressure. Therefore all flexible force resistive sensors are measured and read differently from the Flexible Bend Resistive Sensors.
  • The force resistive sensor consists of two thin, flexible polyester sheets which have electrically conductive electrodes. The inside surface of one sheet forms a row pattern while the inner surface of the other employs a column pattern. A thin semi-conductive coating (ink) is applied as an intermediate layer between the electrical contacts. The ink provides the electrical resistance change at each of the intersecting points.
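The row/column construction above can be modeled as a scanned matrix. The sketch below assumes hypothetical resistance values and a simple conductance-proportional force estimate; only the drop from several hundred megaohms toward kiloohms under load comes from the text.

```python
# Unloaded intersection: several hundred megaohms (per the description above).
HIGH_OHM = 500e6

def scan_matrix(read_resistance, n_rows, n_cols):
    """Scan a force-sensor matrix: drive rows one at a time, read columns,
    and convert each intersection's resistance to a unitless force estimate.
    read_resistance(row, col) stands in for the actual hardware read."""
    force = [[0.0] * n_cols for _ in range(n_rows)]
    for r in range(n_rows):
        for c in range(n_cols):
            ohms = read_resistance(r, c)
            # Conductance rises roughly linearly with applied force, so
            # use the conductance change from the unloaded state directly.
            force[r][c] = (1.0 / ohms) - (1.0 / HIGH_OHM)
    return force
```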
  • Each of the Flexible force resistive sensors 320 to 331 also passes through Port B of the MUX 132 and reaches the Wheatstone Bridges 137 and 138, where the changes in sensor value are read by the ASP 139.
  • Portions of the FPCB sheet 120 for all Distal Pulp palmar hand sensors are extended from the portions of the sensors 325 to 329 corresponding to the nails on the dorsal hand, as a continuation of the FPCB sheet.
  • The Distal Pulp sensors 320 to 324 are bent in reverse from the Dorsal Nail sensors 325 to 329 in the glove.
  • The Polyimide FPCB sheet 120 is extended from the dorsal area of the hand and turned in reverse towards the medial ulnar of the hand for the Flexible Force Resistive sensor 330, and towards the middle space of the palm for the Flexible Resistive sensor 331.
  • The FPCB sheet 120 is for the glove 101 of a right hand, and the FPCB sheet for the glove 102 of a left hand has a shape symmetrical to the FPCB sheet 120.
  • A group of X-Y and X-Z Accelerometer sensors 125 and 126 is directly soldered onto the Polyimide FPCB sheet 120 to measure the position (i.e. direction) of the dorsal hand: up-down, front-back, and left-right.
  • The outputs of these sensors are read by the ASP 139.
  • The X-Y accelerometer sensor 125 is soldered flat on the surface of the FPCB sheet 120, whereas the X-Z accelerometer sensor 126 is soldered vertically over the FPCB sheet 120, thus providing the Z axis.
  • the processing devices A and B also have X-Y and X-Z accelerometer sensors 123 and 124 which are directly soldered over the PCB 122 of the processing devices A and B (107 and 117).
  • The X-Y accelerometer sensor 123 is soldered flat on the surface of the PCB 122 of the processing devices A and B (107 and 117), whereas the X-Z accelerometer sensor 124 is soldered vertically over the PCB 122, thus providing the Z axis.
  • The Dorsal X-Y and X-Z Accelerometer sensors work directly in conjunction with the Flexor Retinaculum Wrist joint Bend Resistor Sensor.
  • The Wrist X-Y and X-Z Accelerometer sensor outputs (3-D location) are dynamically adjusted in conjunction with two parameters: (a) the outputs of all Force Resistor Sensors combined in a Boolean OR (digital gate), and (b) the determination of the final computed gesture word. For example, signing the word "LAZY" involves tapping the palmar index finger of the right hand at the left shoulder several times; this dynamically calibrates and adjusts the location of the wrist (X-Y and X-Z Accelerometer sensors). In another example, signing the word "MOUSE" involves brushing the right-hand index finger to the left across the nose tip a few times; this likewise dynamically calibrates and adjusts the location of the wrist X-Y and X-Z Accelerometer Sensors.
  • Both the dorsal and wrist accelerometer sensors do not work individually; instead, the outputs of all accelerometer sensors are read in conjunction with the linear values of the other bend and force resistor sensors.
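The rule that accelerometer outputs are never interpreted alone can be sketched as bundling each sample with the bend values and a Boolean OR over the force sensors, as stated above. The data shapes, thresholds, and names here are hypothetical.

```python
def any_contact(force_readings, threshold=0.5):
    """Boolean OR over all force resistor sensors: True if the hand
    is touching or pressing anything, anywhere."""
    return any(f > threshold for f in force_readings)

def gesture_frame(accel_xy, accel_xz, bends, forces):
    """Bundle one sample for the gesture recognizer: the accelerometers
    are never used alone, only together with the linear bend values
    and the combined force-contact flag."""
    return {
        "orientation": (accel_xy, accel_xz),
        "joint_flexion": list(bends),
        "contact": any_contact(forces),
    }
```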
  • Dual Port Analog Multiplexer switch (MUX) 127 (corresponding to 132 in Fig.
  • the MUX 127 is addressed and controlled by the ASP 139.
  • The Polyimide FPCB sheet 120 is directly connected with a loosely coupled flexible wire (Cable Bank) 121 to bridge the connections between the Polyimide FPCB and Processing devices A and B.
  • A wrist can make 70 to 75 degrees of movement in Extension and Flexion and 20 to 25 degrees in Radial and Ulnar twist, which causes the changes of value from the bend resistive sensors.
  • This flexible cable bank bridge 121 keeps the bend resistive sensors in their joint positions. After placing all sensors, chips, and components on the FPCB sheet 120, the FPCB sheet 120 is cut like a stencil following the formation of the hand joints and bones, while keeping one piece for both the Dorsal and Palmar sides.
  • The finished Polyimide FPCB sheet 120, after edge cutting, is sprayed with silicone rubber on both sides (upper and bottom) using Room-Temperature Vulcanizing (RTV). After the above stencil cutting, the FPCB sheet 120 becomes one fabric similar to an upper layer of glove fabric. The silicone rubber layer is flexible and stretchable.
  • The Polyimide silicone-rubber FPCB sheet 120 is placed between two layers of stretchable fabric and sewn or glued to make the hand glove.
  • The sensor 330 for the medial ulnar side of the hand and the sensor 331 for the middle space of the palm are placed in a small jacket on the palmar side of the glove fabric.
  • The outer edge of the sensor 331 is tied with an elastic thread, the other end of which is sewn or glued to the thumb fabric.
  • the sensors for both distal pulps and inter digital fingers are placed in a fabric jacket to avoid damage or breaking of the FPCB sheet 120 during wearing or use.
  • a group of flexible and thin solar cells is placed at the outer layer of dorsal fabric.
  • the solar cells 103 are connected to the flexible cable bank 121 and, through the flexible cable bank 121, connected to the processing device 107.
  • the positive and negative charge of the solar cells is directly supplied to the processing device 107 via the flexible cable bank 121.
  • The solar cells 103 are controlled by the battery controller 153 built into the processing device 107.
  • The FPCB sheet 120, the flexible cable bank 121, and the processing device 107 are all combined to make a simple, single-piece hand glove, the glove 101 (fabric and components connected together).
  • the processing device 107 and a group of solar cells 103 can both be detached for hand wash of the glove 101.
  • the glove 101 is discussed here, and the glove 102 is also configured in the same manner as the glove 101.
  • The user wears other thin fabric gloves first and then wears the gloves 101 and 102, so that the gloves are worn smoothly with minimal friction resistance.
  • Two processing devices A and B (107 and 117), which are attached to each apparatus glove system, perform specific data processing and program execution functions. Both processing devices intercommunicate and exchange data wirelessly over Bluetooth.
  • Processing device A (107) continuously measures and reads all glove sensors and converts analog data into digital codes. It also treats and simplifies incoming data, removes unwanted signals and codes, and broadcasts the digitally coded gesture data to processing device B using the built-in Bluetooth wireless device.
  • Processing device B (117) also continuously measures and reads all sensors and converts analog data into digital codes. It also treats and simplifies incoming data and removes unwanted signals and codes.
  • Processing device B (117) takes the digitally coded sign data and executes the Sign-Codes-to-Text engine, which takes the sign data and translates/finds an equivalent match of text alphabet or words from a pre-stored database in the memory storage of processing device B.
  • The raw text is applied to the Sentence Composer engine, which re-arranges individual words into a full sentence. Based on user settings, the sentence composer engine can be bypassed.
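The Sign-Codes-to-Text lookup and the bypassable Sentence Composer described above can be sketched as follows; the sign codes, example words, and database contents are invented for illustration, not taken from the patent's stored database.

```python
# Hypothetical pre-stored database mapping digital sign codes to words.
SIGN_DB = {0x01: "I", 0x2A: "go", 0x3F: "home"}

def sign_codes_to_text(codes, db=SIGN_DB):
    """Translate each digitally coded sign into its matching word,
    skipping codes with no equivalent in the database."""
    return [db[c] for c in codes if c in db]

def compose_sentence(words, bypass=False):
    """Sentence Composer: arrange the words into a full sentence.
    When bypassed (per user setting), return the raw word stream."""
    if bypass:
        return " ".join(words)
    return " ".join(words).capitalize() + "."
```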
  • Processing device A receives the text from processing device B (117) and displays it on the LCD located on the glove of processing device A.
  • Processing device B forwards the text to the Text-to-Speech synthesizer engine, which produces human-voiced audio speech through the built-in Speaker in processing device B.
  • The Text-to-Speech synthesizer can also send out audio speech to a remote device using an industry-standard interface (USB or IrDA), or over a remote distance using the built-in Cellular & GPRS transceiver (in processing device B), very much like a normal person speaking over a cell phone's microphone to the other party at a remote distance. Live audio speech is received through the Array of Microphones of processing device A.
  • The audio speech is applied through the Voice Processor (VP) on processing device A.
  • Voice Processor takes the differential inputs of Silicon Array Microphones to minimize the RF interference and white noise.
  • The three microphones create AMBIN (Array Microphone Beam-forming Integrated with Noise suppression) for advanced noise suppression and echo cancellation, providing the clearest communication and better voice recognition even in a highly noisy environment.
  • Noise suppression of up to 15 dB or more can be achieved, and acoustic Echo Cancellation of up to 45 dB or more.
  • Once the text is ready in processing device A, it is displayed on the LCD located on the glove of processing device A.
  • The text is also broadcast to processing device B over Bluetooth wireless. If activated, processing device B takes the text and applies it to the Text-to-Video Sign Animation engine.
  • The graphical video sign animation engine takes each word of the text and finds the equivalent digital sign-coded sequence. This text-equivalent sign-coded data is applied to the graphics engine, which renders a 3-D animated human making the signs of Sign Language on processing device B's LCD display.
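The per-word lookup performed by the animation engine can be sketched as below. The database contents are invented, and the fallback of finger-spelling unknown words letter by letter is an assumption added for completeness, not a detail stated in the patent.

```python
# Hypothetical database: each word maps to a sequence of sign codes
# that drives the 3-D animation of the signing figure.
ANIM_DB = {"hello": [0x10, 0x11], "world": [0x20]}

def text_to_sign_sequence(text, db=ANIM_DB):
    """Per-word lookup of sign-code sequences; unknown words fall back
    to finger-spelling, modeled here as one code per character."""
    frames = []
    for word in text.lower().split():
        frames.extend(db.get(word, [ord(ch) for ch in word]))
    return frames
```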
  • Processing device B has built-in Cellular & GPRS Transceivers which provide both voice and data communication. Gestures made using both hands and converted into final audio speech are broadcast out through the Cellular & GPRS devices, and it receives audio speech or data from the remote side. The graphical video Sign animation or Sign Language coded data is broadcast out to another remote device, which may be using a similar Glove apparatus or other devices using a software plug-in or software utility.
  • Processing device A has a Touch Grid panel over its LCD. Instead of making Signs or typing each letter, processing device A has a Short Handwriting recognition engine, which reads writing made on the Touch Grid Panel with a Stylus pen and converts the graphical input into Text. Once the text is extracted, it is passed through the various engines described above for producing audible speech, text, or graphical video Sign Language animation.
  • Processing device B has a built-in Vibrator motor, which provides many useful interfaces and communications between the user and the Gloves system apparatus.
  • Processing Device B has an Alert Checker database engine (Alert-Checker-Database Engine): on receiving audio, text, or animation data input, the Alert Checker database engine verifies conditions and generates alerts to the user through the Vibrator Motor located within the glove.
  • Alert-Checker conditions may include a person's name, or words like Mr, Miss, Excuse me, Hey, Hello, or Attention, and/or can be set for a Phone Ring, Doorbell Signaler, Smoke/Fire Alarm, Burglar Alarm, Siren, Automobile Horn Alert, or even a Baby Cry Signaler.
  • A person wearing the Glove System apparatus can be called or alerted for various abnormalities or normal communication wherever he or she may happen to be (e.g. walking on the street or at the airport). All calls received through the built-in cell phone are signaled through the Vibrator motor. At home, the user can be alerted, alarmed, informed, or called under various conditions. The user can set 32 or more conditions, and a sound, audio speech, or tone/tune can be set for personalized conditions.
  • The vibrator motor also provides a mechanism by which others can initiate communication, for example by saying "Hello" to the person wearing the gloves 101 and 102.
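The Alert-Checker matching can be sketched as a scan of user-set conditions over incoming text, returning a vibration pattern for the motor. The condition phrases and pulse patterns below are hypothetical examples of the "32 or more" user-set conditions, not values from the patent.

```python
# Hypothetical user-configured alert conditions: each maps a trigger
# phrase to a vibration pattern (pulse durations in milliseconds).
ALERT_CONDITIONS = {
    "hello": [200],                 # one short pulse
    "fire alarm": [500, 500, 500],  # three long pulses
}

def check_alert(incoming_text, conditions=ALERT_CONDITIONS):
    """Return the vibration pattern for the first matching condition,
    or None if the incoming text triggers nothing."""
    lowered = incoming_text.lower()
    for phrase, pattern in conditions.items():
        if phrase in lowered:
            return pattern
    return None
```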
  • Dynamic calibration can be performed in the devices 107 and 117 for an individual user, for approximate positioning and location identification of the user's body parts, as stated below. All corresponding values read by the sensors are stored in a non-volatile memory. Calibration starts and ends with the Vibrator's vibration, indicating to the user when to start and, once the calibration values have been read, when to end.
  • step-1: The user stands straight, lets the arms hang down with flat hands towards the ground (resting position), so the device reads the arm and hand positions for both hands, one by one. This tells the device that the user is in the reset position, from which it will proceed to signing or to further calibration.
  • step-2: The user stands straight, lifts the arms from the resting position towards the shoulders, sets the hand shape of the alphabet "A" facing the opposite person, and sets the finger-spelling position for both hands, one by one. This tells the device the position of the wrist next to the shoulder.
  • step-3: The user stands straight and lifts each arm from the resting position until it makes an angle with the palm facing the opposite person, for both hands, one by one. This tells the device the position of the palm.
  • step-4: The user stands straight, lifts the right arm towards the right shoulder with a flat palm and fingers pointing up (towards the head), then lifts the left arm from the resting position with a flat palm and positions its fingers underneath the right-arm elbow to define the signing area. This tells the device the location of the shoulders precisely.
  • step-5: The user stands straight, lifts the right arm from the resting position with a flat palm, and places the hand over the heart. This tells the device the location of the heart within the space of step-4.
  • step-6: The user stands straight, lifts each arm from the resting position with a flat palm, and places it over the stomach, for both (left and right) arms and hands, one by one. This tells the device the location of the stomach.
  • step-7: The user stands straight, lifts each arm from the resting position with a flat palm, and places it over the chest, for both (left and right) arms and hands, one by one. This tells the device the location of the chest of the user's body.
  • step-8: The user stands straight, lifts each arm and places it in the finger-spelling position with a flat palm facing the opposite person, for both hands, one by one. This tells the device the position of the palm.
  • step-9: The user stands straight, lifts each arm and places it in the finger-spelling position with a flat palm facing towards the user, for both hands, one by one. This tells the device the position of the palm.
  • step-10: The user stands straight, lifts each arm and places it in the finger-spelling position with a flat palm facing towards the opposite shoulder, for both hands, one by one. This tells the device the location of the opposite shoulder from the opposite wrist when it touches that shoulder.
  • step-11: The user stands straight, lifts each arm and places it at the chest position with a flat palm facing up, for both hands, one by one. This tells the device the position of the palm.
  • step-12: The user stands straight, lifts each arm and places it at the chest position with a flat palm facing the user, for both hands, one by one. This tells the device the position of the palm.
  • step-1: The user's head is divided into four positions: the left side of the front head, the right side of the front head, the top of the head, and the back of the head. The user lifts the right and left arms (one by one) and places the index finger at each location of the head. This tells the device the locations of the head.
  • step-2: The user's forehead is a single position. The user lifts the right and left arms (one by one) and places the index finger over the forehead. This tells the device the location of the user's forehead.
  • step-3: The user's eyes are divided into two positions: left eye and right eye. The user lifts the right and left arms (one by one) and places the index finger over the closed left and right eyes. This tells the device the location of the user's eyes.
  • step-4: The user's nose is a single position of the face. The user lifts the right and left arms (one by one) and places the index finger over the nose. This tells the device the location of the user's nose.
  • step-5: The user's ears are divided into two positions: left ear and right ear. The user lifts the right and left arms (one by one) and places the index finger over the left and right ears, one by one. This tells the device the location of the user's ears.
  • step-6: The user's cheeks are divided into two positions: left cheek and right cheek. The user lifts the right and left arms (one by one) and places the index finger over the left and right cheeks, one by one. This tells the device the location of the cheeks.
  • step-7: The user's mustache is a single position of the face. The user lifts the right and left arms (one by one) and places the index finger over the mustache. This tells the device the location of the mustache on the user's face.
  • step-8: The user's lips and teeth have one position. The user lifts the right and left arms (one by one) and places the index finger over the lips. This tells the device the location of the user's lips.
  • step-9: The user's chin is also a single position of the face. The user lifts the right and left arms (one by one) and places the index finger over the chin. This tells the device the location of the user's chin.
  • step-10: The user's neck has one position. The user lifts the right and left arms (one by one) and places the index finger over the neck. This tells the device the location of the user's neck.
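The calibration procedure above (vibrate, read the pose, store to non-volatile memory, vibrate again) can be sketched generically; the callback structure and step names are assumptions for illustration, not the patent's implementation.

```python
def run_calibration(steps, read_sensors, store, vibrate):
    """Walk through the calibration steps one by one. vibrate() marks
    the start and end of each reading, as the user holds each pose, and
    the sensor snapshot goes into the (non-volatile) store."""
    for name in steps:
        vibrate()                   # tell the user to hold the pose
        store[name] = read_sensors()
        vibrate()                   # tell the user the pose was read
    return store
```

A usage sketch with stand-in callbacks: `run_calibration(["step-1", "step-2"], read_sensors=sample_all_sensors, store=nvram, vibrate=pulse_motor)` would leave one sensor snapshot per step in `nvram`.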
  • The wearable human-assistive audio-visual inter-communication device comprising the self-contained wireless Gloves system provides a complete and self-contained total solution in many alternative communication situations, with or without Sign Language.
  • a wrist mounted device has a wrist band and a processing device mounted on the wrist band.
  • The processing device is configured by adding, to the processing device B (117), a video processor (VP) 154, a microphone array 148, 149 and 150, an image processor (IMP) 156, a camera 155, a short-handwriting-to-text (SHTT) engine 227, a SHTT database 232, an Alert Checker (AC) engine 229, an AC database 234, a speech-to-text (STT) engine 226, and a STT database 231, as installed in the processing device A (107).
  • The wrist mounted device is for disabled persons and/or patients and is mounted on the wrist of the disabled person and/or patient without gloves.
  • This processing device can convert handwriting sensed by the touch panel grid 257 into video animation data and can send the video animation data to a remote device by wireless communication such as Bluetooth, IrDA, Cellular phone, and GPRS (General Packet Radio Service).
  • This processing device can convert audio data from the microphones 148, 149, and 150 into text data.
  • This processing device can send the audio data and/or the text data by wireless communication such as Bluetooth, IrDA, Cellular phone, and GPRS.
  • This processing device can convert audio data from the microphones 148, 149, and 150 into text data or video animation data and can display the text or the video animation on the LCD display 256.
  • This processing device can convert audio data from the microphones 148, 149, and 150 into text data, and the vibrator 191 in the processing device starts vibrating when the converted text matches one of the predefined text data.
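The wrist device's audio path (speech-to-text, LCD display, vibration on a predefined phrase) can be sketched as a single handler; the callback interfaces and the predefined phrases are hypothetical.

```python
def handle_audio(audio, speech_to_text, display, vibrate,
                 predefined=frozenset({"hello", "excuse me"})):
    """Convert incoming audio to text, show it on the LCD, and trigger
    the vibrator if the text is one of the predefined alert phrases."""
    text = speech_to_text(audio)
    display(text)
    if text.lower() in predefined:
        vibrate()
    return text
```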


Abstract

In this wearable human-assistive audio-visual inter-communication apparatus, bend sensors, force sensors, and one or more accelerometers are installed in two gloves, for a right hand and a left hand, and two wrist devices are mounted within the gloves. Each device receives sensor data from the sensors installed in its glove, and the two wrist devices communicate with each other. One of them gathers the sensor data detected by both wrist devices, converts the data into digitally coded sign language information, and outputs speech, text, graphical animation, etc. corresponding to the sign language information, both face-to-face and over a remote distance, without the need for external processing and communication devices.

Description

DESCRIPTION
Human-Assistive Wearable Audio-Visual Inter-Communication Apparatus.
Technical Field
The present invention is directed generally to an apparatus that provides a self-contained wearable human-assistive communication device. The invention relates to recognizing Sign Language and converting it into vocal speech, and to converting vocal speech into readable text. The apparatus is not just a data glove but a complete glove with a built-in processing and communication device.
Background Art
The increasing concern of the United Nations' World Federation of the Deaf and of thousands of institutions for the deaf and dumb all over the world is how to bring deaf communities closer to everyday life, a goal that has yet to be achieved. Deaf and dumb disabled persons rely on Sign Language for their inter-communications. For inter-communication with normal people, they are helpless unless the normal person knows Sign Language, and the converse is not possible, as it is impossible for a deaf and dumb disabled person to speak naturally. Sign language and spoken languages are two completely different languages; it is as if one were English and the other Chinese. This means that only someone who knows both can intercommunicate, and no one else.
A wearable human-assistive communication device is a data glove, generally a glove that fits over at least a part of a user's hand, detects flexion of the hand joints, senses touch and pressure of various muscles, and measures specific locations of the hands. Data gloves or instrumented gloves have been implemented using several different approaches, including fiber-optics, resistive sensors, and accelerometers attached to the glove's joints to detect movement thereof. Conventional data gloves or instrumented gloves can be awkward for the user to operate because most of these gloves require intensive data processing and need a powerful computing device to be attached. In general, these data gloves could not be widely adopted given their limited scope of application. Practically, it is not possible for deaf disabled persons to carry heavy equipment for communication. More importantly, deaf disabled persons also require warnings and many other conditions under which normal people or hazardous conditions can communicate directly with them. Each spoken language has its own alphabet and tone of sound and also has different rules of grammar; similarly, Sign Language also differs from country to country. Currently available data gloves are not well suited to all Sign Languages.
There is therefore a need to invent and produce a self-contained wearable communication device, which could solve true inter-communication needs.
Furthermore, in U.S. Patent Application Publication No. 2002-0075232, Daum, Wolfgang et al. discloses a data glove. In the glove, a sensor material for fabricating instrumented clothing includes a conductive rubber layer. In addition, two electrodes are disposed within the rubber layer, are connectable to an external circuit, and are separated by a separation distance to form an electrical path from one electrode to the other through an intermediate portion of the conducting rubber layer. The electrical resistance measured between the electrodes is indicative of strain in the intermediate portion of the conducting rubber layer, thus permitting measurements of movement of the fabric to be made. The fabric may be used to form articles that a user can wear, including a data glove, so that movements of the user may be detected and measured.
In U.S. Patent No. 5,097,252, Harvill et al. discloses a motion sensor which produces an asymmetrical signal in response to symmetrical movement. In a first embodiment, a plurality of motion sensors are placed over the joints of a hand, with each sensor comprising an optical fiber disposed between a light source and a light sensor. An upper portion of the fiber is treated so that transmission loss of light being communicated through the optical fiber is increased only when the fiber bends in one direction. A light source and light sensor on opposite ends of the tube continuously indicate the extent that the tube is bent.
U.S. Patent No. 6,452,584 to Walker et al. is directed to data glove sensing hand gestures. In this patent, a system is provided for manipulating computer generated animation in real time, such as a virtual reality program running on a computer. The system includes a data glove for managing data based on an operator's hand gestures. This data glove comprises an elastic material that closely matches the shape of a wearer's hand, enabling the wearer to move their hand freely. A movement sensing unit is provided for sensing any hand gestures of the wearer. The movement sensing unit comprises a flexible circuit board that extends along the dorsal region of the wearer's fingers and hand. The circuit board includes a base with a signal processor for processing received signals generated by a plurality of movement sensors. The sensors transmit signals to the processor for determining any movement of the wearer's hand. The sensors have a resistive material disposed on each side thereof, so that any flexure of the sensor causes the resistance values to diverge, preferably linearly. The resistance values on each side of the sensor diverge to a value corresponding to the degree of flexure of the sensor. A reference voltage is applied to each side of the sensor for establishing a voltage differential between its two sides. Any flexure of the sensor causes the resistance value of each side to change, for changing the reference voltage level between the two sides to indicate that the sensor has been flexed and the degree of flexure.
Firstly, said gloves and other such data gloves have common limitations. These limitations relate to the measurement of flexion of the hand fingers only. The systems in the said patents/applications do not measure other joints, muscles, and locations of the hands. More importantly, in Sign Language both hands sign together and make many combined movements. These combined movements, in which both hands touch and press against each other, cannot be sensed, because there are no sensors for them in the said patented gloves. Secondly, one way or another, these patented gloves require a very high performance external processing system and much other necessary communication apparatus before the gloves can be used. In all prior art inventions, communication is initiated by the data glove wearer; he or she can only communicate when the data gloves are connected to all necessary processing and communication equipment and power sources. Other people cannot initiate communication unless the user is sitting live at said patented glove apparatus.
Some systems, which use a video camera and a television (TV) or other display monitor for Sign Language recognition, are not very practical to use. Firstly, they are very expensive; secondly, they cannot be carried around. Thirdly, with varying lighting and image backgrounds and with different skin tones, body shapes, and faces, they are bound to give inaccurate results.
A further disadvantage of these data gloves is that the movement monitoring devices have poor longevity and are prone to reliability problems. Another disadvantage of these movement monitoring devices is that they may not sufficiently track the hand gestures of the wearer. The sensors may generate signals that are not an accurate representation of the wearer's hand gestures, causing erroneous data to be generated. The above are quite a few limitations to mention; that is why, despite many efforts, these techniques have not solved the communication problems of deaf and speechless (dumb) disabled persons in practical use.
The present invention is a wearable human-assistive audio-visual inter-communication apparatus as a glove system, a long-needed valuable invention that fulfills the much cherished goal of thousands of institutions for the deaf and dumb all over the world.
The aim of the invention is to provide an extremely useful device, particularly for those disabled persons who do not even know Sign Language.
Furthermore, it can provide a complete solution for intercommunication between two similar or multiple languages without learning the other. It allows seamless intercommunication between normal and disabled communities.
Furthermore, by using handwriting-to-text, speech and even sign animation, it can provide every possible means of communication for the deaf and dumb disabled.
Disclosure of Invention
The invention provides communication assistance to deaf and speechless (dumb) disabled people to comfortably inter-communicate with normal and/or disabled persons.
In addition, when communication means are installed, such as a Bluetooth wireless device, a built-in Cellular and GPRS (General Packet Radio Service) device, or another device connected directly through industry standard interfaces like Universal Serial Bus (USB) and/or Infrared (IrDA), or the like, it can provide communication assistance to deaf and speechless (dumb) disabled people to comfortably inter-communicate over remote distances.
Furthermore, when plural data sets for detecting the different sign languages of plural countries are stored as a database, it can also provide cross-Sign-Language conversion to assist people across the globe to intercommunicate, disabled-to-disabled and disabled-to-normal, both face-to-face and over remote distances, without the precondition of a similar device at the other end. The invention also relates to a self-contained communication device to be worn on the hands. It has a pair of hand gloves with built-in wearable wrist processing devices, designed to determine the gestures of Sign Language of one or both hands. It can convert sign language into data in different formats such as digital sign data, speech, text, video animation or the like, and can also convert back from speech to text, sign data and graphical video animation to provide intercommunication. The invention can also recognize handwriting and convert it into text languages.
The invention can include a built-in cell phone and camera which enable remote voice and data communication world-wide. The user can initiate phone calls and can also send live or pre-stored video images. More importantly, the invention does not necessarily require a similar device at the other end to intercommunicate; it may also intercommunicate with other devices through a software plug-in and/or a software utility program for a specific function.
Furthermore, if a solar cell is attached on the dorsal side of the glove, the electronic devices within the glove can be supplied with electric power from the solar cell.
The invention also provides a data glove. This data glove has (a) a flexible printed circuit board settled on the dorsal side of the hand and extended towards the palmar side, which has parts corresponding to the five fingers, an ulnar part, and extension parts extending to the distal area of the finger pulps on the palmar side; (b) a first group of sensors in the flexible printed circuit board, for sensing touch force at the distal area of the finger pulps; (c) a second group of sensors in the flexible printed circuit board, for sensing touch force at the finger nails; (d) a touch force sensor in the flexible printed circuit board, for sensing touch force at the ulnar part; and (e) a touch force sensor in the flexible printed circuit board, for sensing touch force at the mid palmar space.
By using this glove, gestures in sign language are sensed exactly and precisely. Furthermore, if rubber layers are formed on both surfaces of the flexible printed circuit board, they protect it from breakage and water. When the solar cell and the devices are mounted detachably on the glove, the fabric part of the glove can be washed easily after detaching the solar cell and the devices.
The invention also provides wrist mounted devices. Each of the wrist mounted devices has a wrist band and a device mounted on the wrist band. This device includes a text database having text data corresponding to gesture data; a gesture-to-text conversion engine which reads gesture data sensed by the sensors and finds the equivalent text word in the text database; a sentence composer engine which takes the individual text words from the gesture-to-text conversion engine and re-arranges them into a formal sentence; a speech database having audio data corresponding to text data; a text-to-speech engine which produces audible speech from a text sentence by using the speech database; a speech-to-text engine which converts speech data into text data by using the speech database; a graphical animation engine which converts text data to gesture data and produces graphical animation data of the gesture; a display for displaying text from the text data and a graphical animation from the graphical animation data; a speaker for outputting speech from the speech data; etc.
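The engine chain described above, gesture data looked up in a text database and then re-arranged by the sentence composer, can be sketched roughly as follows. The gesture codes, the toy databases and all function names are illustrative assumptions, not taken from the specification.

```python
# Minimal sketch of the wrist device's gesture-to-text/sentence pipeline.
# The gesture codes and databases below are toy stand-ins.

GESTURE_TO_TEXT_DB = {            # text database: gesture code -> word
    "G_ME": "me", "G_GO": "go", "G_STORE": "store",
}

def gesture_to_text(gesture_codes):
    """Gesture-to-text conversion engine: look up each sensed gesture code."""
    return [GESTURE_TO_TEXT_DB[g] for g in gesture_codes if g in GESTURE_TO_TEXT_DB]

def compose_sentence(words):
    """Sentence composer engine: re-arrange raw words into a formal sentence.
    Real grammar rules would be far richer; this only capitalizes and punctuates."""
    if not words:
        return ""
    sentence = " ".join(words)
    return sentence[0].upper() + sentence[1:] + "."

words = gesture_to_text(["G_ME", "G_GO", "G_STORE"])
print(compose_sentence(words))    # -> "Me go store."
```

The same two-stage shape (symbol lookup, then grammatical re-arrangement) recurs in the device B engines described later.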
Alternatively, the device includes a touch panel sensing handwriting; converting means for converting the handwriting sensed by the touch panel into video animation data; and sending means for sending the video animation data to a remote device by wireless communication; etc.
Alternatively, the device includes one or more microphones; a speech database having audio data corresponding to text data; a speech-to-text engine which converts audio data from the microphone(s) into text data by using the speech database; and a display for displaying text from the text data; etc.
Brief Description of Drawings
Fig. 1 illustrates a human-assistive wearable wireless glove system as an embodiment of the invention.
Fig. 2 illustrates internal components of the human-assistive wearable wireless glove system.
Fig. 3 illustrates a block diagram of system and component functions of the human-assistive wearable wireless glove system for processing device A, for one of the left or right hands.
Fig. 4 illustrates a block diagram and component functions of the human-assistive wearable wireless glove system for processing device B, for the other of the left or right hands.
Fig. 5 illustrates a block diagram of software engines and databases for the processing device A.
Fig. 6 illustrates a block diagram of software engines and databases for the processing device B.
Fig. 7 illustrates the joints and locations of the dorsal hand that are important to measure.
Fig. 8 illustrates the muscles and specific locations of the palmar hand that are important to measure.
Fig. 9 illustrates the position and type of sensors on the dorsal and palmar hands.
Fig. 10 illustrates force and touch resistor sensors over a Polyimide Flexible Printed Circuit Board (FPCB) sheet.
Fig. 11 illustrates bend resistor sensors over the Polyimide Flexible Printed Circuit Board (FPCB) sheet.
Fig. 12 illustrates RTV (Room Temperature Vulcanizing) silicone rubber layers sprayed over the Flexible Printed Circuit Board (FPCB) sheet.
Best Mode for Carrying Out the Invention
Figure 1 shows an embodiment of the invention. In Fig. 1, both gloves 101 and 102 have built-in miniature complex wireless, analog and digital data processing devices 117 and 107 within the gloves; the devices 117 and 107 are mounted at the wrist side of the gloves 101 and 102, each attached to a wrist band. Flexible solar cells 103 and 112 are mounted on the outer layers of the glove system to provide an alternative power source for the devices 117 and 107. Control key switches 113 and 109 control and operate the processing devices 117 and 107. Touch screen panel grids and displays 110 and 114 provide data input and output functions. A speaker 116 is built within the processing device 117. A microphone 104 is built into the processing device 107, which also has a built-in camera 105. Antennas 115 and 108 are set in the processing devices 117 and 107 for transmitting and/or receiving data. Wrist straps 118 and 106 tie the processing devices 117 and 107 over the wrists and the gloves 101 and 102. The processing devices 117 and 107 are an integral part of the glove system 102 and 101, forming one integrated self-contained unit. Self-contained means that the unit does not require any external device or equipment to perform its functions.
Figure 2 illustrates internal components of the glove system as an embodiment of the invention. In Fig. 2, a Flexible Printed Circuit Board (FPCB) sheet 120 has both bend resistive and force resistive sensors on its surface. Accelerometer sensor groups 126 and 125 are installed over the FPCB sheet 120 at the dorsal side of the hand to measure the roll-over and direction of dorsal hand movements. A dual-port Analog Multiplexer Switch device 127 is installed directly on the FPCB sheet 120 at the dorsal side; one port of the device 127, "Port A", is for the bend flex resistor sensors, whereas the other port, "Port B", is for the force resistor sensors. In Fig. 2, a flexible cable bank 121 is a connector which connects the FPCB sheet 120 with a Printed Circuit Board (PCB) 122 of the processing device 107 or 117. The PCB 122 is installed within the glove 101 or 102 at the wrist side of the hand, in a position similar to a wrist watch. Accelerometer sensors 123 and 124 measure the location of hand movements.
Figure 3 illustrates a block diagram of the processing device A (107) of Fig. 1 as an embodiment, which demonstrates the various components and their flow in the device operation. Signals from all bend resistor sensors 133 pass through Port A of the analog Multiplexer Switch (MUX) 132 (corresponding to 127 in Fig. 2) within the FPCB sheet 120. The MUX 132 is controlled by the analog signal processor (ASP) 139. Similarly, signals from all force resistor sensors 134 pass through Port B of the MUX 132, also controlled by the ASP 139. Wheatstone Bridges 137 and 138 provide voltages to the bend and force sensors 133 and 134. When the values of the sensors 133 and 134 change, the Wheatstone Bridges 137 and 138 output the respective change in current flow. The ASP 139 converts the analog change into digital form and measures its value.
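The measurement path just described, the MUX selecting one resistive sensor at a time, the Wheatstone bridge converting its resistance change into a voltage, and the ASP digitizing that voltage, can be sketched numerically. The supply voltage, fixed bridge resistances and ADC resolution below are assumptions for illustration only.

```python
def bridge_output(v_supply, r_sensor, r_fixed=100_000.0):
    """Wheatstone bridge with the sensor in one arm and three 100 kOhm
    fixed resistors: the output voltage shifts as the sensor value changes."""
    return v_supply * (r_sensor / (r_sensor + r_fixed) - 0.5)

def read_channels(sensor_resistances, v_supply=3.3, adc_bits=10):
    """ASP-style scan: select each MUX channel in turn and digitize the
    bridge output with an ADC of the given resolution (assumed 10-bit)."""
    full_scale = v_supply / 2             # bridge swings at most +/- Vs/2
    levels = 2 ** adc_bits
    readings = []
    for r in sensor_resistances:          # the MUX steps through the channels
        v = bridge_output(v_supply, r)
        code = int((v + full_scale) / (2 * full_scale) * (levels - 1))
        readings.append(code)
    return readings
```

A balanced bridge (sensor at its 100 kOhm rest value) reads mid-scale; as a bend sensor's resistance rises, the digitized code rises with it.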
In Fig. 3, dorsal accelerometer sensors 135 and 136 correspond to the sensors 125 and 126 in Fig. 2, and wrist accelerometer sensors 146 and 147 correspond to the sensors 123 and 124 in Fig. 2. The outputs of these sensors 135, 136, 146 and 147 are measured and controlled by the ASP 139.
Control key switches 131 trigger and provide input functions for the ASP 139 and the main Central Processing Unit (CPU) 140. A display and touch grid device 130 serves as an input and output device.
The CPU 140 sends text and graphics to the display of the device 130 to be displayed. Using a stylus pen, hand-written and touch characters and clicks are placed on top of the touch grid of the device 130, which sends the change in grid value to the CPU 140; the CPU 140 measures the input changes of the grid value. Microphones 148, 149 and 150 (corresponding to 104 in Fig. 1) combine to form an array of microphones that receives live audio and outputs an audio signal; the signal passes to a Voice Processor (VP) 154, which treats the audio signal to remove unwanted noise and echo. The VP 154 delivers the filtered audio signal to a digital signal processor (DSP) 145, which not only converts the analog audio signal into digital format, but also performs intensive audio analysis for voice recognition (speech-to-text) translation. The ASP 139, the CPU 140 and the DSP 145 are inter-connected and perform program execution under the master command of the CPU 140.
A miniature camera 155 captures video and sends a video signal to an image processor (IMP) 156. The IMP 156 processes the captured video signal and forwards it to the DSP 145, where it is processed back and forth in conjunction with a temporary memory 144 and a non-volatile memory storage 157. The DSP 145 sends the final video image to the CPU 140. A controller device 153 controls battery charging and selects the power source of the entire device between the battery and a solar cell 152 (corresponding to 103 or 112 in Fig. 1). The controller device works in conjunction with the processors 139 and 140 for various power saving and sleep-mode operations.
A Universal Serial Bus (USB) interface 159 and an Infrared (IrDA) interface 158 are hardware interfaces which connect external input/output devices to the processing device 107 of the glove 101. A Bluetooth Transceiver 142 and a Base-band 143 provide wireless communication with the processing device B (117) of the glove 102 and with other external devices through wireless data exchange. An antenna 141 is for the Bluetooth wireless.
In the embodiment of the invention, the built-in processing devices A and B (107 and 117) within the gloves 101 and 102 differ from each other, as shown in Fig. 1.
Figure 4 illustrates a block diagram of the processing device B (117) of Fig. 1 as an embodiment, which demonstrates the various components and their flow in the device operation. Signals from all bend resistor sensors 184 pass through Port A of the analog Multiplexer Switch (MUX) 183 within the FPCB sheet for the glove 102. The MUX 183 is controlled by the analog signal processor (ASP) 179. Similarly, signals from all force resistor sensors 185 pass through Port B of the MUX 183, also controlled by the ASP 179. Wheatstone Bridges 181 and 182 provide voltages to the bend and force sensors 184 and 185. When the values of the sensors 184 and 185 change, the Wheatstone Bridges 181 and 182 output the respective change in current flow. The ASP 179 converts the analog change into digital form and measures its value. In Fig. 4, dorsal accelerometer sensors 186 and 187 and wrist accelerometer sensors 188 and 190 are also installed on the FPCB sheet for the glove 102. The outputs of these sensors are measured and controlled by the ASP 179. Control key switches 171 trigger and provide input functions for the ASP 179 and the main CPU 178. A display and touch grid device 170 serves as an input and output device. The CPU 178 sends text and graphics to the display of the device 170 to be displayed. Using a stylus pen, hand-written and touch characters and clicks are placed on top of the touch grid of the device 170, which sends the change in grid value to the CPU 178; the CPU 178 measures the input changes of the touch grid value.
A vibrator motor 191 is a very important component of the embodiment. The vibrator motor 191 is controlled by a vibrator motor controller 194 which takes signals from the ASP 179. The ASP 179, the CPU 178 and the DSP 180 are inter-connected and perform program execution under the master command of the CPU 178.
A miniature speaker 174 provides audio output and is driven by an audio amplifier 175. The CPU 178 sends the final audio output to the audio amplifier 175, which after signal amplification sends the audio signal to the speaker 174 to be output. A controller device 195 controls battery charging of a battery 193 and selects the power source of the entire device between the battery 193 and a solar cell 192 (corresponding to 112 in Fig. 1). The device 195 works in conjunction with the processors 179 and 178 for various power saving and sleep-mode operations.
A Universal Serial Bus (USB) interface 173 and an Infrared (IrDA) interface 172 are hardware interfaces which connect external input/output devices to the processing device 117 of the glove 102.
A Bluetooth Transceiver 177 and a Base-band 200 provide wireless communication with the processing device A (107) of the glove 101 and with other external devices through wireless data exchange. An antenna 176 is for the Bluetooth wireless.
A Cellular & GPRS Transceiver 199, a Cellular Base-band 198 and a Subscriber Identification Module (SIM) 197 are the components of the built-in cell phone and GPRS device, which provides voice and data communication at remote distance. The General Packet Radio Service (GPRS) allows information to be sent and received. The Cellular & GPRS Transceiver 199, the Cellular Base-band 198 and the SIM 197 are used for GPRS applications such as Chat, Textual and Visual Information, still Images, Moving Images, Web Browsing, Document Sharing/Collaborative Working, Audio, Job Dispatch, Corporate Email, Internet Email, Device user's Positioning, Remote LAN Access, File Transfer, Home Automation, etc.
A wide range of content can also be delivered to the device 117 through GPRS services, ranging from share prices, sports scores, weather, flight information, news headlines, prayer reminders, lottery results, jokes, horoscopes, traffic and location sensitive services, and so on. This information need not necessarily be textual; it may be maps, graphs or other types of visual information.
An antenna 202 is for the built-in cell phone. These components are controlled by the CPU 178.
Figure 5 illustrates the various software data conversion engines and databases of the processing device A (107) of Fig. 1. In Fig. 5, the processing device A means the main processing part, such as the ASP 139, the CPU 140, the DSP 145 and the VP 154. In Fig. 5, raw sign data 223, that is, sensor data from the sensors, comes from its own glove 101. The raw sign data 223 is simplified by the processing device 107, and the device 107 sends it out through the Bluetooth antenna 141 to the processing device B (117) for sign-to-text interpretation (conversion). A Speech-to-Text (STT) engine 226 and an STT database 231 convert audio speech data to text data. When audio speech is received through the microphone array 148, 149 and 150, through the USB or IrDA interfaces 159 and 158, or from the device B (117) using the cell phone, the processing device A (107) runs the STT engine 226, which converts the speech into text.
An Alert Checker (AC) engine 229 and an AC database 234 cross-check the output text of the STT engine 226 for various warning, information and control conditions. If the output text matches an entry in the AC database 234, the processing device A (107) sends a signal to the processing device B (117) through the Bluetooth components 141, 142 and 143 to activate an alert signal at the vibrator motor 191 in the device B (117). Regular text converted by the STT engine 226 is passed to a Sentence Composer (SC) engine 228 and an SC database 233 for sentence composing based on specific grammar rules; the SC engine 228 and the SC database 233 convert the regular text into a formal sentence. The final sentence can be displayed on the LCD display 220 of the device 130, and/or broadcast out to the external input/output interfaces, and/or sent back through the wireless device to the processing device B for further broadcast through the built-in cell phone.
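The Alert Checker's role, scanning recognized speech for conditions that should trigger the vibrator motor on device B, can be sketched as a simple phrase match. The alert phrases, the pulse counts and the message format are purely hypothetical.

```python
# Hypothetical sketch of the Alert Checker engine: scan recognized text
# against an alert database and, on a match, signal device B to vibrate.
ALERT_DB = {"fire": 3, "watch out": 2, "phone ringing": 1}   # phrase -> pulses

def check_alerts(recognized_text, send_to_device_b):
    """Return True and dispatch a vibrate command if any alert phrase
    from the database appears in the recognized text."""
    text = recognized_text.lower()
    for phrase, pulses in ALERT_DB.items():
        if phrase in text:
            send_to_device_b({"cmd": "vibrate", "pulses": pulses})
            return True
    return False

sent = []
check_alerts("Fire in the kitchen!", sent.append)
print(sent)   # -> [{'cmd': 'vibrate', 'pulses': 3}]
```

In the apparatus itself the dispatch would travel over the Bluetooth link to the vibrator motor controller rather than into a list.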
A touch panel grid device 230, in the device 130, is for handwriting and command input. A Short-Handwriting-to-Text (SHTT) engine 227 and an SHTT database 232 convert short handwriting detected by the touch grid device 230 into text.
A video camera input 224 is obtained from the camera 155. Other data input 225 is obtained from the USB 159, IrDA 158, etc.
Figure 6 is a block diagram of the various software engines and databases in the processing device B (117). In Fig. 6, the processing device B means the main processing part, such as the ASP 179, the CPU 178 and the DSP 180.
In Fig. 6, raw sign data 252 comes from its own glove, whereas sign data 251 is received from the processing device A (107).
A Sign-to-Sign-Codes (STSC) conversion engine 268 and an STSC database 271 take a set of sign data detected on both gloves 101 and 102 and convert it to a series of sign codes which represent sign language words.
A Sign-Codes-to-Text (SCTT) conversion engine 267 and an SCTT database 270 take the series of sign codes from the STSC engine 268 and convert the sign codes into raw text.
The raw text is then passed to a Sentence Composer (SC) engine 269 and an SC database 272, which correct the format of the raw text based on specific grammar rules. The final text data from the SC engine 269 is broadcast to external devices; to the Bluetooth wireless components 176, 177 and 200; and/or to the built-in cell phone components 202, 199 and 198; and/or directly to the speaker 174.
A Sign-Code-to-Video-Animation (STVA) engine 262 and an STVA database 263 take digital sign coded data and convert it into video animation data of the sign language corresponding to the sign coded data. The sign coded data of each hand's joints, muscles and locations, together with the output of the accelerometer sensors, are applied as graphical input to present an equivalent sign video animation.
A Text-to-Speech (TTS) synthesizer engine 260 and a TTS database 261 take text data and convert it into speech data.
A Text-to-Sign-Code (TTSC) engine 258 is a reverse conversion engine of the SCTT engine 267 and a TTSC database 259 is a reverse conversion database of the SCTT database 270.
A touch panel grid 257 is for command input into the processing device B (117).
An LCD display 256 displays data in the processing device B (117). Other data input 264 is obtained from the USB 173, IrDA 172, etc.
Figure 7(A) illustrates the dorsal hand and wrist joints. A drawing 290 is an anatomy of the dorsal hand's flex joints. Joints 283 are the Metacarpophalangeal flex joints of an Index Finger 281, a Middle Finger 285, a Ring Finger 286 and a Little Finger 287. Joints 282 are the Proximal Interphalangeal flex joints of a Thumb 288 and the fingers 281, 285, 286 and 287. Joints 280 are the Distal Interphalangeal flex joints of the Thumb 288 and the fingers 281, 285, 286 and 287. A Flexor Retinaculum wrist joint 284 is at the connection between the hand and the arm.
Figure 7(B) illustrates specific flexion muscles and locations of the dorsal hand. A drawing 298 is an anatomy of the dorsal hand. A thumb nail is at a location 297, an index finger nail at a location 293, a middle finger nail at a location 292, a ring finger nail at a location 294, and a little finger nail at a location 295. Inter Digital Spacer flexions 296 are located at the roots of the thumb and fingers.
Figure 8 illustrates specific muscles and locations of the palmar hand. A palmar hand 303 has Distal Pulps 300 of the thumb, index, middle, ring and little fingers, a Medial Ulnar Muscle 302, and a Mid Palmar space 301.
Figure 9(A) illustrates the locations of Bend Flex Resistor sensors 310 and 311 on a dorsal hand 312. Figure 9(B) illustrates the locations of flexible Force Resistor sensors 313 on the finger nails of a dorsal hand 314. Figure 9(C) illustrates flexible Force Resistive sensors 315 on a palmar hand 316.
Figure 10 illustrates the locations of flexible Force Resistor sensors 325, 326, 327, 328 and 329 as dorsal hand nail sensors, and the locations of Distal Pulp flexible Force Resistor sensors 320, 321, 322, 323 and 324 for the distal finger pulps of a palmar hand, on the Flexible Printed Circuit Board (FPCB) sheet 120. Each finger part of the FPCB sheet 120 is extended beyond the fingertip; the flexible Force Resistor sensors 325, 326, 327, 328 and 329 are positioned at the tips of the finger parts, and the Distal Pulp flexible Force Resistor sensors 320, 321, 322, 323 and 324 are positioned on the extended parts of the FPCB sheet 120. On the FPCB sheet 120, palmar hand sensors 330 and 331 are installed for the Medial Ulnar spacer and the Mid Palmar space, respectively.
Figure 11 illustrates the locations of flexible Bend Resistor sensors 340 to 354 on the FPCB sheet 120. Flexible Bend Resistor sensors 340, 341, 342 and 343 are Metacarpophalangeal joint sensors. Flexible Bend Resistor sensors 345, 346, 347, 348 and 349 are Proximal and Distal Interphalangeal joint sensors, which measure the flexion of both the Proximal and Distal Interphalangeal joints. Flexible Bend Resistor sensors 350, 351, 352 and 353 are Inter Digital Spacer flexion sensors. Flexible Bend Resistor sensor 354 is a Flexor Retinaculum wrist joint sensor.
Figure 12 illustrates RTV silicone rubber spray layers, as an upper rubber layer 372 and a bottom rubber layer 370, on both surfaces of the FPCB sheet 120. In this embodiment, the upper rubber layer 372 and the bottom rubber layer 370 are formed on the surfaces of the FPCB sheet 120 without seams.
As mentioned above, the wearable human-assistive audio-visual inter-communication apparatus has a pair of self-contained hand gloves designed to determine the gestures of both hands in Sign Language, to produce speech, text and graphical video animation, and to convert graphical video animation, text and speech back into Sign Language and hand gestures. It also recognizes short handwriting written on the touch panel grid and converts it into text languages.
Both gloves 101 and 102 have built-in miniature complex wireless, analog and digital data processing devices within the gloves at the wrist side, which include a Central Processing Unit (CPU), a Digital Signal Processor (DSP), an Analog Signal Processor (ASP), a Voice Processor (VP), an Image Processor (IMP), Memory, Memory Storage, a Bluetooth Transceiver, a Bluetooth Base-band, a Cellular & GPRS Transceiver, a Cellular Base-band, an LCD display, an Interface Controller, Accelerometer Sensors, a Touch Panel Grid, an Array of Microphones, a Speaker, a Camera, a Vibrator Motor, Control Keys, a Re-chargeable Battery and other similar controllers and components.
The wearable human-assistive multi-lingual audio-visual inter-communication device, comprising the self-contained wireless glove system, also has flexible solar cells attached at the dorsal area of the said gloves to provide an alternative power source to the internal electronics and processing device components of the gloves.
The FPCB sheet for each of the gloves 101 and 102 is a double-sided copper layered Polyimide Flexible Printed Circuit sheet, which is shaped over the entire hand bones for each of the left and right hands.
An electronic printed circuit board layout design is applied to both layers of the FPCB sheet. 100 kOhm flexible bend resistors (sensors) 340 to 354, as shown in Fig. 11, are screen printed at various sizes (length and diameter) for each finger's joints, which include: Metacarpophalangeal flex joints for the index, middle, ring and little fingers; Proximal Interphalangeal flex joints for the thumb, index, middle, ring and little fingers; Distal Interphalangeal flex joints for the thumb, index, middle, ring and little fingers; the Flexor Retinaculum wrist joint for the wrist and dorsal hand flexor; and Inter Digital Spacer flexes between the thumb and index finger, between the index and middle fingers, between the middle and ring fingers, and between the ring and little fingers. The Bend Resistive sensors 350 to 353 for the Inter Digital Spacer flexes are placed in such a way that each extends from the Proximal Interphalangeals of the fingers, drops back down towards the Metacarpophalangeal, and makes a U-turn back to the Interphalangeal joints, sensing the flex (opening and closing) linearly.
As a sensor 340, ..., 354 bends, its resistance increases. As shown in the following data, the degree of joint motion (Extension, Flexion, Hyperextension, Abduction, Pronation, Supination, etc.) varies from joint to joint. The change of value in the resistors is measured through the Wheatstone Bridges 137, 138, 181 and 182 and read by the ASPs 139 and 179. Before reaching the Wheatstone Bridges 137, 138, 181 and 182, the signals from the bend resistive sensors 340 to 354 pass through Port A of the MUXs 132 and 183 mounted on the FPCB sheets at the dorsal area of the hand. The MUXs 132 and 183 are controlled and addressed by the ASPs 139 and 179.
Based on orthopedic data of human bone anatomy, the following are the typical ranges of joint movements of the hands and arms (H denotes hyperextension):

Elbow: Extension/Flexion, 0 to 145 degrees
Forearm: Pronation/Supination, 70 to 85 degrees
Wrist: Extension/Flexion, 70 to 75 degrees; Radial/Ulnar, 20 to 35 degrees
Thumb basal joint: Palmar Adduction/Abduction, contact to 45 degrees; Radial Adduction/Abduction, contact to 60 degrees
Thumb Interphalangeal: Hyperextension/Flexion, 15H to 80 degrees
Thumb Metacarpophalangeal: Hyperextension/Flexion, 10 to 55 degrees
Finger DIP joints: Extension/Flexion, 0 to 80 degrees
Finger PIP joints: Extension/Flexion, 0 to 100 degrees
Finger MCP joints: Hyperextension/Flexion, (0-45H) to 90 degrees
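Because each joint has a different range, raw bend-sensor angles would typically be normalized per joint before gesture matching. A minimal sketch, using a subset of the ranges listed above (joint names and the 0..1 scaling are illustrative assumptions):

```python
# Per-joint ranges (degrees) taken from the orthopedic table above.
JOINT_RANGE_DEG = {
    "elbow_flexion": (0, 145),
    "wrist_flexion": (0, 75),
    "finger_dip_flexion": (0, 80),
    "finger_pip_flexion": (0, 100),
}

def normalize_angle(joint, degrees):
    """Clamp an angle to the joint's anatomical range, then scale to 0..1
    so downstream gesture matching is independent of which joint it came from."""
    lo, hi = JOINT_RANGE_DEG[joint]
    degrees = max(lo, min(hi, degrees))
    return (degrees - lo) / (hi - lo)

print(normalize_angle("finger_pip_flexion", 50))   # -> 0.5
```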
In order to capture the touches and forces/pressures of various parts of the hands and fingers of the left and right hands when making signs in sign language, the apparatus of the embodiment uses flexible Force Resistive sensors, shown as the sensors 320 to 331 in Fig. 10. Unlike a Bend Resistive sensor, when force, touch or pressure is applied to a Force Resistive sensor, its resistance drops from several hundred MegaOhms down to KiloOhms, or even a few Ohms, depending on the magnitude of the force, touch and/or pressure. Therefore all flexible force resistive sensors are measured and read differently than the flexible Bend Resistive sensors. A force resistive sensor consists of two thin, flexible polyester sheets which have electrically conductive electrodes. The inside surface of one sheet forms a row pattern, while the inner surface of the other employs a column pattern. A thin semi-conductive coating (ink) is applied as an intermediate layer between the electrical contacts; the ink provides the electrical resistance change at each of the intersecting points. When the two polyester sheets are placed on top of each other, a grid pattern is formed, creating a sensing location at each intersection. These flexible coatings are applied at the dorsal hand nails of both hands for the thumb, index, middle, ring and little fingers; the Distal Pulps of the palmar hand for the thumb, index, middle, ring and little fingers; the Medial Ulnar side of the hand; and the Mid Palmar space in the middle of the palm, as shown in Fig. 10.
Similar to the bend resistive sensors 340 to 354, the signal from each of the flexible force resistive sensors 320 to 331 also passes through Port B of the MUX 132 and reaches the Wheatstone Bridges 137 and 138, where the changes in sensor value are read by the ASP 139.
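Because the force sensor's resistance collapses with pressure rather than rising with bend, its reading is interpreted the opposite way. A hedged sketch, where the rest resistance, the touch threshold and the full-press resistance are assumptions drawn from the ranges stated above:

```python
UNLOADED_OHMS = 500e6      # "several hundred MegaOhms" at rest (assumed)

def touch_detected(r_ohms, threshold_ohms=1e6):
    """Simple touch test: resistance collapsing below 1 MOhm (assumed
    threshold) means the finger pulp, nail pad or palm patch is pressed."""
    return r_ohms < threshold_ohms

def relative_force(r_ohms):
    """Map resistance to a unitless 0..1 force estimate via conductance,
    taking ~1 kOhm (assumed) as a full press."""
    g = 1.0 / r_ohms
    g_max = 1.0 / 1_000.0
    return min(g / g_max, 1.0)

print(touch_detected(UNLOADED_OHMS), touch_detected(10e3))   # -> False True
```

Note the inversion relative to the bend sensors: there, more flexion means more resistance; here, more force means less.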
Portions of the FPCB sheet 120 for all Distal Pulp sensors of the palmar hand are extended from the portions of the sensors 325 to 329 corresponding to the nails on the dorsal hand, as a continuation of the FPCB sheet. The Distal Pulp sensors 320 to 324 are bent back in reverse from the dorsal nail sensors 325 to 329 within the glove. Similarly, the Polyimide FPCB sheet 120 is extended from the dorsal area of the hand and turned in reverse towards the medial ulnar side of the hand for the flexible Force Resistive sensor 330, and towards the middle space of the palm for the flexible Force Resistive sensor 331. Thus one continuous Polyimide FPCB sheet 120 is kept for each hand glove. The FPCB sheet 120 is for the glove 101 of one hand, and the FPCB sheet for the glove 102 of the other hand has a symmetrical shape to the FPCB sheet 120.
A group of X-Y and X-Z accelerometer sensors 125 and 126 is directly soldered onto the Polyimide FPCB sheet 120 to measure the position (i.e. direction) of the dorsal hand: up-down, front-back, and left-right. The output of these sensors is read by the ASP 139. The X-Y accelerometer sensor 125 is soldered flat on the surface of the FPCB sheet 120, whereas the X-Z accelerometer sensor 126 is soldered vertically over the FPCB sheet 120, hence providing the Z axis. Similarly, the processing devices A and B (107 and 117) also have X-Y and X-Z accelerometer sensors 123 and 124 which are directly soldered onto the PCB 122: the X-Y accelerometer sensor 123 is soldered flat on the surface of the PCB 122, whereas the X-Z accelerometer sensor 124 is soldered vertically over the PCB 122, hence providing the Z axis. The dorsal X-Y and X-Z accelerometer sensors are sensed directly in conjunction with the Flexor Retinaculum wrist joint bend resistor sensor.
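A flat-mounted X-Y part and a vertically mounted X-Z part together yield three acceleration axes; one conventional way to derive static hand orientation from them is to compute tilt against gravity. This is a standard computation offered as an illustration, not a method stated in the patent:

```python
import math

def tilt_angles(ax, ay, az):
    """Roll and pitch in degrees from a static 3-axis gravity reading
    (ax, ay from the flat X-Y sensor; az from the vertical X-Z sensor)."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

# Hand flat, gravity along Z: no roll, no pitch.
print(tilt_angles(0.0, 0.0, 1.0))
```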
The wrist X-Y and X-Z accelerometer sensor output (3-D location) is dynamically adjusted in conjunction with two parameters: (a) the output of all Force Resistor sensors combined in a Boolean OR (digital gate), and (b) the determination of the final computed gesture word. For example, the sign for the word "LAZY" taps the palm-side index finger of the right hand at the left shoulder several times; this dynamically calibrates and adjusts the location of the wrist (X-Y and X-Z accelerometer sensors). In another example, the sign for the word "MOUSE" brushes the right hand index finger to the left across the nose tip a few times; this too dynamically calibrates and adjusts the location of the wrist X-Y and X-Z accelerometer sensors.
Therefore, the dorsal and wrist accelerometer sensors do not work individually; instead, the outputs of all accelerometer sensors are read in conjunction with the linear values of the other bend and force resistor sensors.
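As a hedged illustration, the gating of accelerometer readings by the Boolean OR of the force sensors described above might be sketched as follows; the function names, threshold, and data layout are illustrative assumptions, not the disclosed implementation:

```python
# Illustrative sketch: the wrist position reference is only re-anchored
# when at least one force sensor fires (Boolean OR over all readings),
# e.g. when the index finger taps the shoulder while signing "LAZY".
# FORCE_THRESHOLD and all names below are assumptions.

FORCE_THRESHOLD = 0.5  # normalized force above which a contact is registered

def any_force_active(force_readings):
    """Boolean OR over all force resistive sensors."""
    return any(f > FORCE_THRESHOLD for f in force_readings)

def update_wrist_reference(accel_xy, accel_xz, force_readings, reference):
    """Adjust the stored wrist reference only during a contact event."""
    if any_force_active(force_readings):
        # Snap the reference to the known body contact point.
        reference = (accel_xy, accel_xz)
    return reference

ref = update_wrist_reference((0.1, 0.2), (0.1, 0.9), [0.0, 0.7, 0.0], None)
```

Without a contact event the previous reference is retained, so the accelerometers never act in isolation, as stated above.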
A dual-port analog multiplexer switch (MUX) 127 (corresponding to 132 in Fig. 3) is also directly soldered onto the FPCB sheet 120. The MUX 127 is addressed and controlled by the ASP 139.
The polyimide FPCB sheet 120 is directly connected with a loosely coupled flexible wire (cable bank) 121 that bridges the connections between the polyimide FPCB and the processing devices A and B.
A wrist can move 70 to 75 degrees in extension and flexion and 20 to 25 degrees in radial and ulnar deviation, which changes the values of the bend resistive sensors. The flexible cable bank bridge 121 keeps the bend resistive sensors in their joint positions. After all sensors, chips, and components are placed on the FPCB sheet 120, the FPCB sheet 120 is cut like a stencil following the formation of the hand joints and bones, while keeping one piece for both the dorsal and palmar sides.
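The wrist ranges of motion stated above imply a mapping from a bend-resistor reading to a joint angle. A minimal linear sketch follows, assuming illustrative resistance endpoints; the actual apparatus calibrates such values per user:

```python
def bend_angle(resistance, r_flat, r_max, angle_max):
    """Linearly map a bend-resistor reading to a joint angle in degrees.

    r_flat: resistance with the joint straight (assumed endpoint)
    r_max:  resistance at full flexion (assumed endpoint)
    """
    fraction = (resistance - r_flat) / (r_max - r_flat)
    # Clamp to the physical range so noise cannot report impossible angles.
    return max(0.0, min(1.0, fraction)) * angle_max

# Wrist flexion spans roughly 70 to 75 degrees per the description above;
# the resistance values here are invented for illustration.
angle = bend_angle(resistance=15_000, r_flat=10_000, r_max=20_000, angle_max=75.0)
```

Halfway between the two resistance endpoints thus reads as half the 75-degree range.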
The finished polyimide FPCB sheet 120, after edge cutting, is sprayed with room-temperature-vulcanizing (RTV) silicone rubber on both sides (top and bottom). After the stencil cutting described above, the FPCB sheet 120 becomes a single fabric similar to an upper layer of glove fabric. The silicone rubber layer is flexible and stretchable.
The polyimide silicone-rubber FPCB sheet 120 is placed between two layers of stretchable fabric and sewn or glued to make the hand glove. The sensor 330 for the medial ulnar side of the hand and the sensor 331 for the middle space of the palm are placed in a small pocket on the palmar side of the glove fabric. The outer edge of the sensor 331 is tied with an elastic thread whose other end is sewn or glued to the thumb fabric, allowing these two sensors to adjust and move freely. Especially when wearing the gloves, this protects the FPCB sheet 120 from breaking and also lets the user use the hands freely for other work.
The sensors for both the distal pulps and the interdigital fingers are placed in a fabric pocket to avoid damaging or breaking the FPCB sheet 120 during wearing or use. A group of flexible and thin solar cells is placed on the outer layer of the dorsal fabric. The solar cells 103 are connected to the flexible cable bank 121 and, through it, to the processing device 107. The positive and negative charge of the solar cells is supplied directly to the processing device 107 via the flexible cable bank 121. The solar cells 103 are controlled by the battery controller 153 built into the processing device 107.
The FPCB sheet 120, the flexible cable bank 121, and the processing device 107 are all combined to make a simple, single-piece hand glove, the glove 101 (fabric and components connected together).
The processing device 107 and the group of solar cells 103 can both be detached so that the glove 101 can be hand-washed.
The glove 101 is discussed here; the glove 102 is also configured in the same manner as the glove 101.
To protect the gloves 101 and 102 from breakage or damage during wear-on and wear-off, the user first puts on other thin fabric gloves and then the gloves 101 and 102, so that they are worn smoothly with minimal friction resistance.
Two processing devices A and B (107 and 117), one attached to each glove of the apparatus, perform specific data processing and program execution functions. The two processing devices intercommunicate and exchange data wirelessly over Bluetooth.
Processing device A (107) continuously measures and reads all glove sensors and converts analog data into digital codes. It also conditions and simplifies incoming data, removes unwanted signals and codes, and broadcasts digitally coded gesture data to processing device B using a built-in Bluetooth wireless device.
Processing device B (117) likewise continuously measures and reads all sensors, converts analog data into digital codes, conditions and simplifies incoming data, and removes unwanted signals and codes. On receiving digitally coded gesture data from processing device A, processing device B (117) takes the digitally coded sign data and executes the Sign-Codes-to-Text engine, which translates the sign data by finding an equivalent match of text alphabet or words from a database pre-stored in the memory storage of processing device B. The raw text is applied to the Sentence Composer engine, which re-arranges individual words into a full sentence. Depending on the user setting, the Sentence Composer engine can be bypassed.
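The two stages above, sign-code lookup followed by sentence composition, might be sketched as below; the code table and the composer rule are invented placeholders, not the stored database:

```python
# Hypothetical Sign-Codes-to-Text lookup followed by a minimal stand-in
# for the Sentence Composer stage. Code values and words are assumptions.

SIGN_CODE_DB = {        # digitally coded gesture -> English word
    0x01: "I",
    0x17: "go",
    0x2A: "store",
}

def sign_codes_to_text(codes):
    """Translate each gesture code to its stored word; unknown codes are skipped."""
    return [SIGN_CODE_DB[c] for c in codes if c in SIGN_CODE_DB]

def compose_sentence(words):
    """Minimal composer: join the words, capitalize, and punctuate."""
    if not words:
        return ""
    sentence = " ".join(words)
    return sentence[0].upper() + sentence[1:] + "."

raw = sign_codes_to_text([0x01, 0x17, 0x2A])
print(compose_sentence(raw))  # -> I go store.
```

A real composer would also reorder and inflect words into grammatical form; per the description, this stage can be bypassed by a user setting.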
Depending on the type of communication (face-to-face, face-to-remote, same device-to-device, or said apparatus to another device using a software plug-in or utility), the resulting text is broadcast out. The text is broadcast to processing device A (107) through built-in Bluetooth wireless. Processing device A (107) receives the text from processing device B (117) and displays it on the LCD located with the glove of processing device A. Processing device B forwards the text to the Text-to-Speech synthesizer engine, which produces human-voiced audio speech through the built-in speaker in processing device B. Depending on the type of communication, the Text-to-Speech synthesizer can also send the audio speech to a remote device using an industry-standard interface (USB or IrDA), or over a remote distance using the built-in Cellular & GPRS transceiver (in processing device B), very much like a normal person speaking over a cell phone's microphone to another party at a remote distance. Live audio speech is received through the array of microphones of processing device A. Depending on the type of communication, the audio speech is applied through the voice processor (VP) on processing device A. The voice processor takes the differential inputs of the silicon array microphones to minimize RF interference and white noise. The three microphones create AMBIN (Array Microphone Beam-forming Integrated with Noise suppression) for advanced noise suppression and echo cancellation, giving the clearest communication and better voice recognition even in highly noisy environments. The noise suppression achieved can be 15 dB or more, and the acoustic echo cancellation 45 dB or more.
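The beam-forming core of such an array-microphone front end can be illustrated with a minimal delay-and-sum sketch; real AMBIN-style processing adds adaptive noise suppression and echo cancellation well beyond this, and the sample data here is invented:

```python
# Minimal delay-and-sum beamformer over three microphone channels:
# each channel is shifted by its (integer) sample delay toward the
# speech source and the aligned samples are averaged, which reinforces
# the speech while uncorrelated noise partially cancels.

def delay_and_sum(channels, delays):
    """Align each channel by its sample delay, then average sample-wise."""
    n = min(len(ch) - d for ch, d in zip(channels, delays))
    return [
        sum(ch[d + i] for ch, d in zip(channels, delays)) / len(channels)
        for i in range(n)
    ]

# Three toy channels carrying the same ramp signal with different lags.
mics = [[0, 1, 2, 3, 4], [9, 0, 1, 2, 3], [8, 7, 0, 1, 2]]
aligned = delay_and_sum(mics, delays=[0, 1, 2])  # -> [0.0, 1.0, 2.0]
```

In practice the delays come from the microphone geometry and the estimated direction of the speaker.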
After the audio speech from processing device A is treated by the voice processor (VP), it passes through the voice recognition (Speech-to-Text) engine, which converts the audio speech into text.
Once the text is ready in processing device A, it is displayed on the LCD located on the glove of processing device A. The text is also broadcast to processing device B over Bluetooth wireless. If activated, processing device B takes the text and applies it to the Text-to-Video Sign Animation engine. The graphical video sign animation engine takes each word of the text and finds the equivalent digitally coded sign sequence. This text-equivalent sign-coded data is applied to the graphics engine, which renders a 3-D animated human making the sign-language sign on processing device B's LCD display.
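The word-to-sign-sequence lookup performed by the Text-to-Video Sign Animation engine might be sketched as follows; the table entries and frame codes are invented placeholders, not the stored sign database:

```python
# Hypothetical word -> digitally coded sign sequence table; each code would
# select a joint-pose frame for the 3-D animated human. Values are invented.

ANIMATION_DB = {
    "hello": [12, 7, 3],
    "thanks": [4, 4, 9],
}

def text_to_sign_frames(text):
    """Return the concatenated frame-code sequence for all known words."""
    frames = []
    for word in text.lower().split():
        frames.extend(ANIMATION_DB.get(word, []))  # unknown words are skipped
    return frames

seq = text_to_sign_frames("Hello thanks")  # -> [12, 7, 3, 4, 4, 9]
```

The graphics engine would then play these frame codes as a continuous animation on the LCD.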
Processing device B has built-in Cellular & GPRS transceivers which provide both voice and data communication. Gestures made using both hands and converted into final audio speech are broadcast out through the Cellular & GPRS devices, which also receive audio speech or data from the remote side. The graphical video sign animation or sign-language coded data is broadcast out to another remote device, which may be using a similar glove apparatus or other devices with a software plug-in or software utility.
Processing device A has a touch grid panel over the LCD. Instead of making signs or typing each letter, processing device A has a short-handwriting recognition engine, which reads writing made on the touch grid panel with a stylus pen and converts the graphical input into text. Once the text is extracted, it is passed through the various engines described above to produce audible speech, text, or graphical video sign-language animation.
Processing device B has a built-in vibrator motor, which provides many useful interfaces and communication between the user and the glove system apparatus. Processing device B has an Alert Checker database engine (Alert-Checker-Database Engine): on receiving audio, text, or animation data input, the Alert Checker database engine verifies conditions and generates alerts to the user through the vibrator motor located within the glove. These Alert-Checker conditions may include the person's name, words like Mr., Miss, Excuse me, Hey, Hello, or Attention, and/or can be set for a phone ring, doorbell signaler, smoke/fire alarm, burglar alarm, siren, automobile horn alert, or even a baby-cry signaler. This means a person wearing the glove system apparatus can be called or alerted for various abnormalities or normal communication wherever he or she may happen to be (e.g. walking on the street or at the airport). All calls received through the built-in cell phone are announced through the vibrator motor. If at home, the user can be alerted, alarmed, informed, or called in various conditions. The user can set 32 or more conditions, and a sound, audio speech, or tone/tune can be set for personalized conditions.
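The condition matching of the Alert-Checker-Database engine might be sketched as follows; the trigger list and the vibrate() stub are assumptions for illustration, not the stored condition database or motor driver:

```python
# Hypothetical Alert-Checker sketch: recognized text is scanned for
# configured trigger conditions; on a match the vibrator motor fires.

ALERT_CONDITIONS = {"hello", "excuse me", "fire alarm", "doorbell"}

def vibrate(pattern="short"):
    """Stand-in for the vibrator motor driver."""
    return f"vibrating:{pattern}"

def check_alerts(incoming_text):
    """Return a vibrator action if any configured condition appears in the text."""
    lowered = incoming_text.lower()
    for condition in ALERT_CONDITIONS:
        if condition in lowered:
            return vibrate()
    return None

print(check_alerts("Excuse me, sir"))  # -> vibrating:short
```

The description allows 32 or more such conditions, each of which could map to a distinct vibration pattern.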
In addition, the vibrator motor provides a mechanism by which others can initiate communication, for example by saying "Hello" to the person wearing the gloves 101 and 102.
Calibration methods for the sensors are described hereinafter. Dynamic calibration can be performed in the devices 107 and 117 for an individual user, for approximate positioning and location identification of the user's body parts, as stated below. All corresponding values read by the sensors are stored in a non-volatile memory. Calibration starts and ends with the vibrator's vibration, indicating to the user when to start and when to end, i.e. that the calibration values have been read.
First, the calibration method for the case where one or both hands of the user are located near the body is described. In step 1, the user stands straight, lets the arms hang down, and holds the hands flat toward the ground (resting position), so that the arm and hand positions are read for both hands, one by one. This tells the device that the user is in the reset position, from which signing or further calibration proceeds. In step 2, the user stands straight, lifts the arms from the resting position toward the shoulders, sets the hand shape similar to the alphabet "A" facing the opposite person, and sets the finger-spelling position as shown, for both hands, one by one. This tells the device the position of the wrist next to the shoulder.
In step 3, the user stands straight and lifts the arm from the resting position until it makes an angle with the palm facing the opposite person, for both hands, one by one. This tells the device the position of the palm.
In step 4, the user stands straight, lifts the right arm toward the right shoulder, holds the palm flat with the fingers pointing up (toward the head), lifts the left arm from the resting position with the palm flat, and positions the left fingers underneath the right-arm elbow to define the signing area. This tells the device the location of the shoulders precisely. In step 5, the user stands straight, lifts the right arm from the resting position, and places the flat palm over the heart. This tells the device the location of the heart within the space defined in step 4.
In step 6, the user stands straight, lifts the arm from the resting position, and places the flat palm over the stomach, for both (left and right) arms and hands, one by one. This tells the device the location of the stomach.
In step 7, the user stands straight, lifts the arm from the resting position, and places the flat palm over the chest, for both (left and right) arms and hands, one by one. This tells the device the location of the chest of the user's body. In step 8, the user stands straight, lifts the arm, and places it in the finger-spelling position with the flat palm facing the opposite person, for both hands, one by one. This tells the device the position of the palm.
In step 9, the user stands straight, lifts the arm, and places it in the finger-spelling position with the flat palm facing toward the user, for both hands, one by one. This tells the device the position of the palm.
In step 10, the user stands straight, lifts the arm, and places it in the finger-spelling position with the flat palm facing the opposite shoulder, for both hands, one by one. This tells the device the location of the opposite shoulder when the opposite wrist touches it.
In step 11, the user stands straight, lifts the arm, and places it at the chest position with the flat palm facing up, for both hands, one by one. This tells the device the position of the palm.
In step 12, the user stands straight, lifts the arm, and places it at the chest position with the flat palm facing the user, for both hands, one by one. This tells the device the position of the palm.
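Each of the guided poses above yields a sensor snapshot stored in non-volatile memory. A minimal sketch follows, with a dictionary standing in for the flash storage and invented step names and sensor values:

```python
# Hypothetical per-user calibration store: each guided pose is saved as a
# snapshot of sensor readings keyed by a step name. A dict stands in for
# the non-volatile memory; all names and values are illustrative.

calibration_store = {}

def record_step(step_name, sensor_snapshot):
    """Save the sensor values read at the end of a calibration pose."""
    calibration_store[step_name] = dict(sensor_snapshot)  # copy before storing

record_step("resting_position", {"wrist_xy": (0.0, -1.0), "wrist_xz": (0.0, -1.0)})
record_step("shoulder_touch",   {"wrist_xy": (0.4, 0.6),  "wrist_xz": (0.2, 0.7)})

def is_calibrated(required_steps):
    """Calibration completes only after every guided step has been stored."""
    return all(step in calibration_store for step in required_steps)
```

The vibrator cue that brackets each pose would correspond to the moment record_step is invoked for that step.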
Second, the calibration method for the case where one or both hands of the user are located near the neck, the face, or the head is described:
In step 1, the user's head is divided into four positions: left side of the front head, right side of the front head, top of the head, and back of the head. The user lifts the right and left arms (one by one) and places the index finger at each location on the head. This tells the device the locations of the head.
In step 2, the user's forehead is a single position; the user lifts the right and left arms (one by one) and places the index finger on the forehead. This tells the device the location of the user's forehead. In step 3, the user's eyes are divided into two positions, left eye and right eye; the user lifts the right and left arms (one by one) and places the index finger over the closed left and right eyes. This tells the device the location of the user's eyes.
In step 4, the user's nose is a single position on the face; the user lifts the right and left arms (one by one) and places the index finger on the nose. This tells the device the location of the user's nose on the face.
In step 5, the user's ears are divided into two positions, left ear and right ear; the user lifts the right and left arms (one by one) and places the index finger on the left and right ears, one by one. This tells the device the location of the ears on the user's face.
In step 6, the user's cheeks are divided into two positions, left cheek and right cheek; the user lifts the right and left arms (one by one) and places the index finger on the left and right cheeks, one by one. This tells the device the location of the cheeks.
In step 7, the user's mustache area is a single position on the face; the user lifts the right and left arms (one by one) and places the index finger on the mustache area. This tells the device the location of the mustache area of the user's face.
In step 8, the user's lips and teeth have one position; the user lifts the right and left arms (one by one) and places the index finger on the lips. This tells the device the location of the user's lips. In step 9, the user's chin is also a single position on the face; the user lifts the right and left arms (one by one) and places the index finger on the chin. This tells the device the location of the user's chin.
In step 10, the user's neck has one position; the user lifts the right and left arms (one by one) and places the index finger on the neck. This tells the device the location of the user's neck.
In short, the wearable human-assistive audio-visual inter-communication device comprising the self-contained wireless glove system provides a complete and self-contained total solution in many alternative communication situations, with or without sign language.
Another embodiment of the invention is described hereinafter.
In this embodiment, a wrist-mounted device has a wrist band and a processing device mounted on the wrist band. The processing device is configured by adding to the processing device B (117) the components installed in the processing device A (107): a voice processor (VP) 154, a microphone array 148, 149 and 150, an image processor (IMP) 156, a camera 155, a short-handwriting-to-text (SHTT) engine 227, a SHTT database 232, an Alert Checker (AC) engine 229, an AC database 234, a speech-to-text (STT) engine 226, and a STT database 231.
The wrist-mounted device is for disabled persons and/or patients and is mounted on the wrist of the disabled person and/or patient without gloves.
This processing device can convert handwriting sensed by the touch panel grid 257 into video animation data and can send the video animation data to a remote device by wireless communication such as Bluetooth, IrDA, cellular phone, and GPRS (General Packet Radio Service).
In addition, this processing device can convert audio data from the microphones 148, 149 and 150 into text data, and can send the audio data and/or the text data by wireless communication such as Bluetooth, IrDA, cellular phone, and GPRS.
In addition, this processing device can convert audio data from the microphones 148, 149 and 150 into text data or video animation data and can display the text or the video animation on the LCD display 256.
In addition, this processing device can convert audio data from the microphones 148, 149 and 150 into text data, and a vibrator 191 in the processing device starts vibrating when the converted text data matches one of the predefined text data.
Industrial Applicability The above apparatus is applicable to various technical fields as follows: (a) To provide a state-of-the-art communication system for deaf and hard-of-hearing persons, which assists them to inter-communicate with non-disabled persons as well as with other disabled persons.
(b) To provide a complete end-to-end communication system, which allows deaf persons to communicate not only face-to-face but also remotely, for voice and/or data communication, with both deaf and non-disabled persons.
(c) To bring the deaf community closer to everyday life and work.
(d) To provide an extremely lightweight and easy-to-use system, which does not require wiring around the body of the user.
(e) To provide a self-contained complete solution without the need of external data processing.
(f) To provide a communication system which provides conversation in multiple communication forms, such as audio, text, graphical video animation, and digitally coded sign data.
(g) To provide short-handwriting recognition into text, voice, and graphical sign video animation communication.
(h) To provide a communication system which communicates in multiple languages and can be used worldwide.
(i) To provide a communication solution without the prerequisite of a similar device at the other end. (j) To provide built-in two-way communication over cell phone with a live video
camera.
(k) To provide an industry-standard communication system which can be interfaced with IT, telecom, and deaf-assistive devices for broader applications.
(l) To provide a glove apparatus for virtual reality applications. (m) To provide a wrist-based processing device for various data processing and control applications. That is, the wrist processing devices may also be used in various data processing, computing, and control applications, and can be used without gloves.

Claims

1. A wearable human-assistive audio-visual inter-communication apparatus comprising:
(a) a first glove for any one of a right hand and a left hand;
(b) a second glove for the other one of the right hand and the left hand;
(c) bend sensors installed within each of the first glove and the second glove;
(d) force sensors installed within each of the first glove and the second glove; (e) one or more accelerometer sensors installed in each of the first glove and the second glove; and
(f) a first wrist device mounted within the first glove, the first wrist device includes: a means for gathering sensor data of the bend sensors, the force sensors, and the accelerometer sensor(s) within the first glove; and a means for sending the sensor data to the second glove by wireless communication; and
(g) a second wrist device mounted within the second glove, the second wrist device includes: a database having digitally coded sign language information and typical sensor data for each of the sign language words; a means for gathering sensor data of the bend sensors, the force sensors, and the accelerometer sensor(s) from the second glove; a means for receiving the sensor data from the first glove by wireless communication; and a means for converting the sensor data from the first glove and the second glove to digitally coded sign language information by using the database.
2. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 1, wherein the second wrist device further includes: a means for transforming the sign language information to a speech.
3. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 1, wherein the second wrist device further includes: a means for transforming the sign language information to text data; and a means for sending the text data to the first wrist device; and the first wrist device further includes: a means for receiving the text data from the second wrist device; and a display which displays a sentence by the received text data.
4. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 3, wherein the second wrist device further includes: a means for transforming the text data to a speech.
5. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 1, wherein one of the first wrist device and the second wrist device further includes: a means for receiving an audio signal; a means for transforming the audio signal to audio data; a database having a set of specific audio data; and a vibrator which vibrates when the received audio signal is identical to any one of the audio data in the database.
6. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 5, wherein one of the first wrist device and the second wrist device further includes: a filter which reduces background noise and echo in the audio signal.
7. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 5, wherein one of the first wrist device and the second wrist device further includes: a means for transforming the audio signal to text data; and a means for displaying a sentence by the text data.
8. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 1, wherein one of the first wrist device and the second wrist device further includes: a means for receiving text data; a means for transforming the text data to graphical animation data of sign language; and a display for displaying a graphical animation by the graphical animation data.
9. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 1, wherein the first wrist device is detachable from the first glove.
10. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 1, wherein the second wrist device is detachable from the second glove.
11. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 1 further comprising:
(h) a solar cell mounted on a dorsal side of the first glove, for supplying electric power to the first wrist device.
12. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 11, wherein the solar cell is detachable from the first glove.
13. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 1 further comprising:
(i) a solar cell mounted on a dorsal side of the second glove, for supplying electric power to the second wrist device.
14. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 13, wherein the solar cell is detachable from the second glove.
15. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 1, wherein the second wrist device further includes: a wireless data communication means for sending and receiving data, the wireless data communication means sending text data, voice data, and video animation data corresponding to the sign language information.
16. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 15, wherein the wireless data communication means includes a cellular phone and GPRS transceiver.
17. A wearable human-assistive audio-visual inter-communication apparatus as claimed in claim 1, wherein the first wrist device and the second wrist device perform all data processing without any external devices other than the first wrist device and the second wrist device.
18. A data glove comprising: (a) a flexible printed circuit board settled on a dorsal side of a hand, which has parts corresponding to five fingers, an ulnar part, and extension parts extending to a distal area of finger pulps and a mid-palmar space on a palm side;
(b) a first group of sensors in the flexible printed circuit board, for sensing touch force to the distal area of finger pulps;
(c) a second group of sensors in the flexible printed circuit board, for sensing touch force to finger nails;
(d) a touch force sensor in the flexible printed circuit board, for sensing touch force to the ulnar part; and (e) a touch force sensor in the flexible printed circuit board, for sensing touch force to the mid-palmar space.
19. A data glove as claimed in claim 18 further comprising:
(f) bend sensors in the flexible printed circuit board, for sensing a bending angle of a metacarpophalangeal flex joint in respective fingers; and
(g) bend sensors in the flexible printed circuit board, for sensing a total bending angle of a distal interphalangeal flex joint and a proximal interphalangeal flex joint in respective fingers.
20. A data glove as claimed in claim 19 further comprising: (h) a bend sensor for sensing a bending angle of a wrist; and
(i) accelerometer sensors on the flexible printed circuit board, for sensing position and rotation of a hand wearing the data glove in conjunction with the first group of sensors, the second group of sensors, the touch force sensor for sensing touch force to the ulnar part, the touch force sensor for sensing touch force to the mid-palmar space, and the bend sensor for sensing the bending angle of the wrist.
21. A data glove as claimed in claim 18 further comprising: (j) rubber layers on both sides of the flexible printed circuit board; and
(k) two fabric gloves between which the flexible printed circuit board is settled with the rubber layers.
22. A data glove as claimed in claim 21, wherein the rubber layers are made by spraying RTV (Room Temperature Vulcanizing) silicone rubber on the top and bottom surfaces of the flexible printed circuit board.
23. A wrist mounted device comprising: (a) a wrist band; and
(b) a device mounted on the wrist band, the device includes: a text database having text data corresponding to gesture data; a gesture-to-text conversion engine which reads gesture data sensed by sensors and finds an equivalent word of text in the text database; a sentence composer engine which takes individual words of the text from the gesture-to-text conversion engine and re-arranges the words into a formal sentence; a speech database having audio data corresponding to text data; a text-to-speech engine which produces audible speech from a text sentence by using the speech database; a speech-to-text engine which converts speech data into text data by using the speech database; a graphical animation engine which converts text data to gesture data and produces graphical animation data of a gesture from the gesture data; a display for displaying a text by the text data and a graphical animation by the graphical animation data; and a speaker for outputting a speech by the speech data.
24. A wrist mounted device claimed as claim 23, wherein the text database is a sign language's gesture database having sign language words corresponding to gestures on sign language.
25. A wrist mounted device claimed as claim 23, wherein the device further includes a vibrator which vibrates when a predefined input to the wrist mounted device occurs.
26. A wrist mounted device claimed as claim 23 further comprising:
(c) a touch panel for detecting handwriting, installed on the display; and (d) a handwriting-to-text conversion engine which converts the handwriting to text data.
27. A wrist mounted device for disabled persons comprising: (a) a wrist band; and (b) a device mounted on the wrist band, the device includes: a touch panel sensing handwriting; a converting means for converting the handwriting sensed by the touch panel to video animation data; and a sending means for sending the video animation data to a remote device by wireless communication.
28. A wrist mounted device for disabled persons claimed as claim 27, wherein the device further includes one or more microphones; and the sending means sends audio data captured by the microphone(s) to a remote device by wireless communication.
29. A wrist mounted device for disabled persons claimed as claim 27, wherein the device further includes: one or more microphones; a speech database having audio data corresponding to text data; and a speech-to-text database engine which converts audio data from the microphone(s) into text data by using the speech database; and the sending means sends the text data to a remote device by wireless communication.
30. A wrist mounted device for disabled persons claimed as claim 27, wherein the device further includes: one or more microphones; a speech database having audio data corresponding to text data; a speech-to-text database engine which converts audio data from the microphone(s) into text data by using the speech database; and a display for displaying a text by the text data.
31. A wrist mounted device for disabled persons claimed as claim 27, wherein the device further includes: one or more microphones; a speech database having audio data corresponding to text data; a speech-to-text database engine which converts audio data from the microphone(s) into text data by using the speech database; and a vibrator which vibrates when the text data converted by the speech-to-text database engine is one of predefined text data.
32. A wrist mounted device for disabled persons comprising:
(a) a wrist band; and
(b) a device mounted on the wrist band, the device includes: one or more microphones; a speech database having audio data corresponding to text data; a speech-to-text database engine which converts audio data from the microphone(s) into text data by using the speech database; and a display for displaying a text by the text data.
33. A wrist mounted device for disabled persons claimed as claim 32, wherein the device further includes: a vibrator which vibrates when the text data converted by the speech-to-text database engine is one of predefined text data.
34. A wrist mounted device for disabled persons claimed as claim 32, wherein the device further includes: a converting means for converting the audio data to video animation data of sign language; and a display for displaying a video animation by the video animation data.
PCT/JP2003/007863 2003-06-20 2003-06-20 Human-assistive wearable audio-visual inter-communication apparatus. WO2004114107A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2003243003A AU2003243003A1 (en) 2003-06-20 2003-06-20 Human-assistive wearable audio-visual inter-communication apparatus.
PCT/JP2003/007863 WO2004114107A1 (en) 2003-06-20 2003-06-20 Human-assistive wearable audio-visual inter-communication apparatus.

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2003/007863 WO2004114107A1 (en) 2003-06-20 2003-06-20 Human-assistive wearable audio-visual inter-communication apparatus.

Publications (1)

Publication Number Publication Date
WO2004114107A1 true WO2004114107A1 (en) 2004-12-29

Family

ID=33524173

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2003/007863 WO2004114107A1 (en) 2003-06-20 2003-06-20 Human-assistive wearable audio-visual inter-communication apparatus.

Country Status (2)

Country Link
AU (1) AU2003243003A1 (en)
WO (1) WO2004114107A1 (en)

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006103358A1 (en) * 2005-03-31 2006-10-05 Erocca Device for communication for persons with speech and/or hearing handicap
EP1838099A1 (en) * 2006-03-23 2007-09-26 Fujitsu Limited Image-based communication methods and apparatus
US7565295B1 (en) * 2003-08-28 2009-07-21 The George Washington University Method and apparatus for translating hand gestures
GB2458583A (en) * 2005-01-18 2009-09-30 Rallypoint Inc Wearable article sensing a force and a direction associated with the force
WO2010084348A3 (en) * 2009-01-21 2010-10-28 Birmingham City University A motion capture apparatus
CN102132227A (en) * 2008-03-26 2011-07-20 艾登特技术股份公司 System and method for the multidimensional evaluation of gestures
WO2011103095A1 (en) * 2010-02-18 2011-08-25 Dilluvah Corp. Dual wrist user input system
ES2386992A1 (en) * 2011-02-14 2012-09-10 Juan Álvarez Álvarez System and procedure for interpretation of the language of signs. (Machine-translation by Google Translate, not legally binding)
CN102663197A (en) * 2012-04-18 2012-09-12 天津大学 Virtual hand grasp simulating method based on motion capture
US8461468B2 (en) 2009-10-30 2013-06-11 Mattel, Inc. Multidirectional switch and toy including a multidirectional switch
WO2014053041A1 (en) * 2012-10-05 2014-04-10 Brunian Ltda Me Equipment worn on the upper limbs for sensing, processing and storing quantitative data regarding the classic triad of symptoms of parkinson's disease
US8801488B2 (en) 2012-10-15 2014-08-12 Disney Enterprises, Inc. Chin strap sensor for triggering control of walk-around characters
CN104049753A (en) * 2014-06-09 2014-09-17 百度在线网络技术(北京)有限公司 Method and device for realizing mutual conversion between sign language information and text information
WO2015116008A1 (en) * 2013-11-07 2015-08-06 Bavunoglu Harun System of converting hand and finger movements into text and audio
WO2016029183A1 (en) * 2014-08-22 2016-02-25 Sony Computer Entertainment Inc. Glove interface object
CN105976675A (en) * 2016-05-17 2016-09-28 福建万亿店中店电子商务有限责任公司 Intelligent information exchange device and method for deaf-mute and average person
CN106020468A (en) * 2016-05-18 2016-10-12 翁明辉 Glove controlled augmented reality system
GR1009085B (en) * 2016-06-21 2017-08-11 Αλεξανδρος Τηλεμαχου Τζαλλας Method and glove-like device for the determination and improved assessment of disease-associated kinetic symptoms
CN107749213A (en) * 2017-11-24 2018-03-02 闽南师范大学 A kind of wearable sign language tutoring system based on six axle attitude transducer modules
US9999280B2 (en) 2014-06-27 2018-06-19 David Gareth Zebley Interactive bracelet for practicing an activity between user devices
EP3234742A4 (en) * 2014-12-16 2018-08-08 Quan Xiao Methods and apparatus for high intuitive human-computer interface
IT201700014209A1 (en) * 2017-03-14 2018-09-14 Nicholas Caporusso Useful device for communication and interaction based on gestures and touch.
US10139858B2 (en) 2010-09-27 2018-11-27 Nokia Technologies Oy Apparatus with elastically transformable body
EP3518075A1 (en) * 2018-01-24 2019-07-31 C.R.F. Società Consortile per Azioni Sensorized glove and corresponding method for ergonomic analysis of the hand, in particular a worker's hand
CN113434042A (en) * 2021-06-29 2021-09-24 深圳市阿尓法智慧科技有限公司 Deaf-mute interactive AI intelligent navigation device
US11163522B2 (en) 2019-09-25 2021-11-02 International Business Machines Corporation Fine grain haptic wearable device
US11449143B2 (en) 2018-06-11 2022-09-20 Koninklijke Philips N.V. Haptic input text generation
WO2022264165A1 (en) * 2021-06-13 2022-12-22 Karnataki Aishwarya A portable assistive device for challenged individuals
US11604512B1 (en) * 2022-01-05 2023-03-14 City University Of Hong Kong Fingertip-motion sensing device and handwriting recognition system using the same

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6244827A (en) * 1985-08-20 1987-02-26 ブィ・ピィ・エル・リサーチ・インコーポレイテッド (VPL Research Inc.) Apparatus and method for generating control signal according to action and position of hand
US5047952A (en) * 1988-10-14 1991-09-10 The Board Of Trustees Of The Leland Stanford Junior University Communication system for deaf, deaf-blind, or non-vocal individuals using instrumented glove
JPH09319297A (en) * 1996-05-29 1997-12-12 Hitachi Ltd Communicating device by speech and writing
US5953693A (en) * 1993-02-25 1999-09-14 Hitachi, Ltd. Sign language generation apparatus and sign language translation apparatus
US6141643A (en) * 1998-11-25 2000-10-31 Harmon; Steve Data input glove having conductive finger pads and thumb pad, and uses therefor
JP2001111708A (en) * 1999-10-14 2001-04-20 Matsushita Electric Ind Co Ltd Mobile information communication device
WO2001059741A1 (en) * 2000-02-10 2001-08-16 Koninklijke Philips Electronics N.V. Sign language to speech converting method and apparatus
US20010050883A1 (en) * 2000-06-07 2001-12-13 Pierre-Andre Farine Portable object with a wristband including a keyboard
JP2002040927A (en) * 2000-07-25 2002-02-08 Towa Erekkusu:Kk Tactile part of auditory sense auxiliary equipment utilizing tactile sense
JP2003015810A (en) * 2001-06-29 2003-01-17 Tadatoshi Goto Glove-shaped input device


Cited By (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7565295B1 (en) * 2003-08-28 2009-07-21 The George Washington University Method and apparatus for translating hand gestures
US8140339B2 (en) 2003-08-28 2012-03-20 The George Washington University Method and apparatus for translating hand gestures
GB2458583A (en) * 2005-01-18 2009-09-30 Rallypoint Inc Wearable article sensing a force and a direction associated with the force
GB2458583B (en) * 2005-01-18 2009-12-09 Rallypoint Inc Sensing input actions
US8082152B2 (en) 2005-03-31 2011-12-20 Erocca Device for communication for persons with speech and/or hearing handicap
FR2884023A1 (en) * 2005-03-31 2006-10-06 Erocca Sarl DEVICE FOR COMMUNICATION BY PERSONS WITH DISABILITIES OF SPEECH AND / OR HEARING
WO2006103358A1 (en) * 2005-03-31 2006-10-05 Erocca Device for communication for persons with speech and/or hearing handicap
EP1838099A1 (en) * 2006-03-23 2007-09-26 Fujitsu Limited Image-based communication methods and apparatus
US7664531B2 (en) 2006-03-23 2010-02-16 Fujitsu Limited Communication method
CN102132227A (en) * 2008-03-26 2011-07-20 艾登特技术股份公司 System and method for the multidimensional evaluation of gestures
WO2010084348A3 (en) * 2009-01-21 2010-10-28 Birmingham City University A motion capture apparatus
US8461468B2 (en) 2009-10-30 2013-06-11 Mattel, Inc. Multidirectional switch and toy including a multidirectional switch
WO2011103095A1 (en) * 2010-02-18 2011-08-25 Dilluvah Corp. Dual wrist user input system
US10139858B2 (en) 2010-09-27 2018-11-27 Nokia Technologies Oy Apparatus with elastically transformable body
ES2386992A1 (en) * 2011-02-14 2012-09-10 Juan Álvarez Álvarez System and procedure for interpretation of the language of signs. (Machine-translation by Google Translate, not legally binding)
CN102663197A (en) * 2012-04-18 2012-09-12 天津大学 Virtual hand grasp simulating method based on motion capture
WO2014053041A1 (en) * 2012-10-05 2014-04-10 Brunian Ltda Me Equipment worn on the upper limbs for sensing, processing and storing quantitative data regarding the classic triad of symptoms of parkinson's disease
US8801488B2 (en) 2012-10-15 2014-08-12 Disney Enterprises, Inc. Chin strap sensor for triggering control of walk-around characters
US10319257B2 (en) 2013-11-07 2019-06-11 Harun Bavunoglu System of converting hand and finger movements into text and audio
DE212014000212U1 (en) 2013-11-07 2016-06-13 Elif Saygi Bavunoglu System of conversion of hand and finger movements into text and sound
WO2015116008A1 (en) * 2013-11-07 2015-08-06 Bavunoglu Harun System of converting hand and finger movements into text and audio
CN104049753B (en) * 2014-06-09 2017-06-20 百度在线网络技术(北京)有限公司 Realize the method and apparatus that sign language information and text message are mutually changed
CN104049753A (en) * 2014-06-09 2014-09-17 百度在线网络技术(北京)有限公司 Method and device for realizing mutual conversion between sign language information and text information
US9999280B2 (en) 2014-06-27 2018-06-19 David Gareth Zebley Interactive bracelet for practicing an activity between user devices
US11659903B2 (en) 2014-06-27 2023-05-30 David Gareth Zebley Band for performing an interactive activity
US11395531B2 (en) 2014-06-27 2022-07-26 David Gareth Zebley Band for performing an activity
US11039669B2 (en) 2014-06-27 2021-06-22 David Gareth Zebley Band for performing an activity
JP2017530452A (en) * 2014-08-22 2017-10-12 株式会社ソニー・インタラクティブエンタテインメント Glove interface object
EP3183633B1 (en) * 2014-08-22 2022-05-25 Sony Interactive Entertainment Inc. Glove controller
US9971404B2 (en) 2014-08-22 2018-05-15 Sony Interactive Entertainment Inc. Head-mounted display and glove interface object with pressure sensing for interactivity in a virtual environment
WO2016029183A1 (en) * 2014-08-22 2016-02-25 Sony Computer Entertainment Inc. Glove interface object
US10019059B2 (en) 2014-08-22 2018-07-10 Sony Interactive Entertainment Inc. Glove interface object
US10055018B2 (en) 2014-08-22 2018-08-21 Sony Interactive Entertainment Inc. Glove interface object with thumb-index controller
CN106575159A (en) * 2014-08-22 2017-04-19 索尼互动娱乐股份有限公司 Glove interface object
CN106575159B (en) * 2014-08-22 2019-08-20 索尼互动娱乐股份有限公司 Glove ports object
EP3183633A1 (en) * 2014-08-22 2017-06-28 Sony Interactive Entertainment Inc. Thumb controller
EP3234742A4 (en) * 2014-12-16 2018-08-08 Quan Xiao Methods and apparatus for high intuitive human-computer interface
CN105976675A (en) * 2016-05-17 2016-09-28 福建万亿店中店电子商务有限责任公司 Intelligent information exchange device and method for deaf-mute and average person
CN106020468A (en) * 2016-05-18 2016-10-12 翁明辉 Glove controlled augmented reality system
GR1009085B (en) * 2016-06-21 2017-08-11 Αλεξανδρος Τηλεμαχου Τζαλλας Method and glove-like device for the determination and improved assessment of disease-associated kinetic symptoms
IT201700014209A1 (en) * 2017-03-14 2018-09-14 Nicholas Caporusso Useful device for communication and interaction based on gestures and touch.
CN107749213A (en) * 2017-11-24 2018-03-02 闽南师范大学 A kind of wearable sign language tutoring system based on six axle attitude transducer modules
JP2019127677A (en) * 2018-01-24 2019-08-01 シー.アール.エフ. ソシエタ コンソルティレ ペル アツィオニ Sensor mounted glove relevant to ergonomic analysis of hand, especially for ergonomic analysis of hand of operator and corresponding method
US11006861B2 (en) 2018-01-24 2021-05-18 C.R.F. Societa Consortile Per Azioni Sensorized glove and corresponding method for ergonomic analysis of the hand, in particular a worker's hand
EP3518075A1 (en) * 2018-01-24 2019-07-31 C.R.F. Società Consortile per Azioni Sensorized glove and corresponding method for ergonomic analysis of the hand, in particular a worker's hand
JP7346791B2 (en) 2018-01-24 2023-09-20 シー.アール.エフ. ソシエタ コンソルティレ ペル アツィオニ Regarding ergonomic analysis of hands, in particular gloves equipped with sensors for ergonomic analysis of workers' hands and corresponding methods
US11449143B2 (en) 2018-06-11 2022-09-20 Koninklijke Philips N.V. Haptic input text generation
US11163522B2 (en) 2019-09-25 2021-11-02 International Business Machines Corporation Fine grain haptic wearable device
WO2022264165A1 (en) * 2021-06-13 2022-12-22 Karnataki Aishwarya A portable assistive device for challenged individuals
CN113434042A (en) * 2021-06-29 2021-09-24 深圳市阿尓法智慧科技有限公司 Deaf-mute interactive AI intelligent navigation device
US11604512B1 (en) * 2022-01-05 2023-03-14 City University Of Hong Kong Fingertip-motion sensing device and handwriting recognition system using the same

Also Published As

Publication number Publication date
AU2003243003A1 (en) 2005-01-04

Similar Documents

Publication Publication Date Title
WO2004114107A1 (en) Human-assistive wearable audio-visual inter-communication apparatus.
CN112789577B (en) Neuromuscular text input, writing and drawing in augmented reality systems
US8519950B2 (en) Input device
CN112739254A (en) Neuromuscular control of augmented reality systems
CN112822992A (en) Providing enhanced interaction with physical objects using neuromuscular signals in augmented reality environments
CN107205879B (en) Hand rehabilitation exercise system and method
US20160195928A1 (en) Closed loop feedback interface for wearable devices
US20100023314A1 (en) ASL Glove with 3-Axis Accelerometers
US20080036737A1 (en) Arm Skeleton for Capturing Arm Position and Movement
Das et al. Smart glove for sign language communications
CN107765850A (en) A kind of sign Language Recognition based on electronic skin and multi-sensor fusion
US20050148870A1 (en) Apparatus for generating command signals to an electronic device
Chakoma et al. Converting South African sign language to verbal
KR20210051277A (en) Wearable Device for Motion Detecting and Method for Manufacturing Thereof
KR20010110615A (en) Information input device operated by detecting movement of skin, mobile information processing device, computer and mobile communication device using the same
CN111610857A (en) Gloves with interactive installation is felt to VR body
KR100510876B1 (en) Device for producing voice from manual language
CN117389408A (en) Intelligent glove system
WO2013011336A2 (en) Data input device
JP2004013209A (en) Wrist-mounted finger operation information processor
Kumari et al. Gesture Recognizing Smart System
Blackmon et al. Target acquisition by a hands-free wireless tilt mouse
CN115686220A (en) Glove type Bluetooth keyboard
Rajput et al. INTERACTIVE ACCELEROMETRIC GLOVE FOR HEARING IMPAIRED
KR20220041705A (en) Sign language glove translation system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: COMMUNICATION UNDER RULE 69 EPC ( EPO FORM 1205A DATED 17/05/06 )

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP