CN106200988A - Wearable sign language recognition device and sign language translation method - Google Patents

Wearable sign language recognition device and sign language translation method

Info

Publication number
CN106200988A
Authority
CN
China
Prior art keywords
sign language
motion sensor
classifier
multi-axis motion
host computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610777204.5A
Other languages
Chinese (zh)
Inventor
朱向阳
吕博
赵彬
郭伟超
盛鑫军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN201610777204.5A priority Critical patent/CN106200988A/en
Publication of CN106200988A publication Critical patent/CN106200988A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 Handling natural language data
    • G06F 40/40 Processing or translation of natural language
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01 Indexing scheme relating to G06F3/01
    • G06F 2203/011 Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Abstract

The invention discloses a wearable sign language recognition device and a sign language translation method. The wearable sign language recognition device comprises a wearable device end and a host computer end, the wearable device end comprising a finger ring and an armband. Based on multi-axis motion sensors and muscle electrical signal (EMG) sensors, the device performs motion attitude calculation and pattern recognition on sign language gestures, forming a real-time human-machine interaction system that translates sign language into speech or text. By fusing the muscle electrical signals with the multi-axis motion signals and applying pattern recognition technology, the invention achieves accurate recognition of sign language gestures without being limited by the size of the sign language vocabulary. Signal acquisition, data processing, and classification are all integrated into the wearable device, which is easy to wear and does not interfere with the user's daily life.

Description

Wearable sign language recognition device and sign language translation method
Technical field
The present invention relates to the field of sign language translation systems, and in particular to a wearable sign language recognition device and a sign language translation method, namely a real-time human-machine interaction system that, based on multi-axis motion sensors and muscle electrical signals, performs motion attitude calculation and pattern recognition to translate sign language into speech or text.
Background technology
Sign language is the main tool with which people with speech or hearing impairments communicate, yet the proportion of hearing people who can use sign language fluently is relatively low, so communication between people with speech or hearing impairments and the hearing population is inconvenient. Sign language vocabularies are large (Chinese Sign Language, for example, contains more than 5,000 gesture words) and sign language gestures change dynamically, so sign language recognition technology has not yet been widely adopted.
By translating sign language gestures into speech or text that hearing people can understand, sign language recognition technology can bridge the communication barrier between people with speech or hearing impairments and the hearing population and improve their quality of life.
The current mainstream technical approaches to gesture recognition include collecting joint-angle information with data gloves, but wearing data gloves restricts the daily life of people with speech or hearing impairments; gesture recognition based on image processing is constrained by the environment and offers no advantage in portability. In addition, Chinese patent application No. 201410376205.X, "Sign language interpreting, displaying and sound producing system based on electromyographic signals and motion sensors", proposes a device worn on the arm that collects arm electromyographic signals and multi-axis motion sensor signals; by its principle, however, it can only detect the posture of the fingers and the forearm and cannot detect the relative posture of the palm and the arm, which is of significant importance for distinguishing sign language gestures. According to its described principle it also has no feedback function, so a hearing-impaired user cannot tell whether the translation system is working normally. This approach therefore has technical deficiencies.
Summary of the invention
In view of the deficiencies of the prior art, the technical problem to be solved by the present invention is to provide a wearable sign language recognition device that is both simple and accurate.
To achieve the above object, the present invention provides a wearable sign language recognition device that acquires the arm muscle electrical signals in real time through the muscle electrical signal sensors of an armband to detect the motion posture of the fingers, acquires in real time the relative attitude of the palm and the arm as well as the three-dimensional attitude of the arm through the multi-axis motion sensors of a finger ring and the armband, performs sign language recognition by means of pattern recognition, and outputs the translation result in the form of speech or text through a host computer. Specifically, the technical solution provided by the present invention is as follows:
A wearable sign language recognition device comprises a wearable device end and a host computer end. The wearable device end comprises a wearable finger ring and an armband; the finger ring carries a palm multi-axis motion sensor. The armband carries at least one first module, one or more second modules, and an arm strap that fixes the first module and the second modules. The first module comprises a muscle electrical signal sensor, a forearm multi-axis motion sensor, a microprocessor, a feedback element, and a first communication module; each second module comprises a muscle electrical signal sensor. The muscle electrical signal sensor, the forearm multi-axis motion sensor, the feedback element, the first communication module, and the palm multi-axis motion sensor are all connected to the microprocessor, which reads and processes the information collected by the muscle electrical signal sensor, the forearm multi-axis motion sensor, and the palm multi-axis motion sensor; the feedback element provides prompt information to the user. The host computer end comprises a host computer and a second communication module; the microprocessor exchanges information with the host computer through the first communication module and the second communication module.
Preferably, the host computer end further comprises an interactive device connected to the host computer, the interactive device being configured to perform error correction and grammar correction on the sign language recognized by the host computer.
Preferably, the multi-channel muscle electrical signal sensor comprises multi-channel muscle electrical signal electrodes, a filter circuit, and an amplifier circuit.
Preferably, the host computer is a mobile terminal device such as a smartphone or a tablet computer.
Preferably, the arm strap is made of a stretchable material.
Preferably, the palm multi-axis motion sensor and the forearm multi-axis motion sensor each comprise a three-axis accelerometer, a three-axis gyroscope, and a three-axis magnetometer.
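For illustration only, the following sketch shows one way to obtain a static attitude estimate (roll, pitch, yaw) from a single accelerometer and magnetometer sample; the patent does not prescribe a particular attitude calculation, and a practical implementation would additionally fuse the gyroscope data, for example with a complementary or Kalman filter, which this sketch omits.

```python
import numpy as np

def static_attitude(acc, mag):
    """Roll, pitch, yaw (radians) from one accelerometer and one magnetometer sample.

    acc, mag: length-3 arrays in the sensor frame (aerospace x-forward, z-down
    convention assumed). Gyroscope fusion is omitted, so this only holds for
    slow or static motion; it is an illustrative sketch, not the patented method.
    """
    ax, ay, az = acc
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.hypot(ay, az))

    # Tilt-compensate the magnetometer before taking the heading
    mx, my, mz = mag
    mxh = (mx * np.cos(pitch)
           + my * np.sin(pitch) * np.sin(roll)
           + mz * np.sin(pitch) * np.cos(roll))
    myh = my * np.cos(roll) - mz * np.sin(roll)
    yaw = np.arctan2(-myh, mxh)
    return roll, pitch, yaw
```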
Preferably, the prompt information includes feedback prompt modes such as vibration, sound, a display screen, or indicator lights.
A sign language translation method using the above wearable sign language recognition device comprises the following steps:
1) selecting the training mode on the host computer, training classifiers on sign language gestures to obtain a sign language classifier set, and storing it in the host computer;
2) selecting the recognition mode on the host computer, obtaining a classifier set from the signals of a sign language gesture, matching it against the sign language classifier set stored in the host computer, and outputting the recognition result;
3) performing error correction and grammar correction on the recognition result through the interactive device of the host computer, and outputting it in the form of speech or text; after the user completes a sentence of sign language gestures, the feedback element prompts the user as to whether the speech has been translated.
Preferably, the training process of step 1) comprises the following steps (an illustrative sketch follows the list):
a) performing attitude calculation on the sign language gesture with the palm multi-axis motion sensor and the forearm multi-axis motion sensor, extracting features, and training a first training classifier;
b) windowing the sign language gesture signal from the multi-channel muscle electrical signal sensor, extracting features, and training a second training classifier;
c) fusing the information of the first training classifier and the second training classifier to obtain the sign language classifier set, and storing the resulting sign language classifiers in the host computer.
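The patent does not fix a particular feature set, classifier family, or fusion rule, so the following is only a minimal sketch of steps a) to c) under assumptions of my own: simple statistical features, scikit-learn LDA and SVM classifiers, and probability-averaging fusion. None of these specific choices are part of the disclosure.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC

def attitude_features(att_windows):
    # att_windows: (n_gestures, window_len, n_channels) attitude samples,
    # e.g. palm-relative-to-forearm angles; mean and spread per channel.
    return np.hstack([att_windows.mean(axis=1), att_windows.std(axis=1)])

def emg_features(emg_windows):
    # Classic surface-EMG window features: mean absolute value, waveform length.
    mav = np.abs(emg_windows).mean(axis=1)
    wl = np.abs(np.diff(emg_windows, axis=1)).sum(axis=1)
    return np.hstack([mav, wl])

def train_sign_classifier_set(att_windows, emg_windows, labels):
    """Steps a) and b): train an attitude-based and an EMG-based classifier."""
    clf_att = LinearDiscriminantAnalysis().fit(attitude_features(att_windows), labels)
    clf_emg = SVC(probability=True).fit(emg_features(emg_windows), labels)
    return clf_att, clf_emg

def fused_predict(clf_att, clf_emg, att_windows, emg_windows):
    """Step c): fuse the two classifiers by averaging their class probabilities."""
    proba = (clf_att.predict_proba(attitude_features(att_windows)) +
             clf_emg.predict_proba(emg_features(emg_windows))) / 2.0
    return clf_att.classes_[proba.argmax(axis=1)]
```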
Preferably, the recognition process of step 2) comprises the following steps:
a) performing attitude calculation on the sign language gesture with the palm multi-axis motion sensor and determining whether it is a one-handed or a two-handed gesture;
b) performing endpoint detection on the sign language gesture with the multi-channel muscle electrical signal sensor, so that continuous sign language gestures are segmented into individual gestures;
c) performing attitude calculation on the sign language gesture with the palm multi-axis motion sensor and the forearm multi-axis motion sensor, and windowing the sign language gesture signal from the multi-channel muscle electrical signal sensor, to obtain a classifier set;
d) matching the obtained classifier set against the sign language classifier set stored in the host computer and outputting the recognition result.
Compared with the prior art, the present invention has the following advantages:
1. The present invention fuses the muscle electrical signals with the multi-axis motion signals and uses pattern recognition technology to achieve accurate recognition of sign language gestures, without being limited by the size of the sign language vocabulary. Signal acquisition, data processing, and classification are all integrated into the wearable device, which is easy to wear and does not interfere with the user's daily life.
2. In applying pattern recognition, the present invention first divides sign language gestures into one-handed and two-handed gestures, which avoids keeping the wearable devices on both hands working at all times and reduces power consumption. The multi-axis motion sensors at the finger ring and the arm can accurately detect the attitude of the palm relative to the arm, so the accuracy and efficiency of sign language recognition are higher.
The method of the present invention and the technical effects it produces are further described below with reference to the accompanying drawings, so that the purpose, features, and effects of the present invention can be fully understood.
Brief description of the drawings
Fig. 1 is a schematic diagram of the wearable sign language recognition device of the present invention;
Fig. 2 is a hardware system structure diagram of the wearable sign language recognition device of the present invention;
Fig. 3 is a one-handed sign language training flow chart of the sign language recognition system of the present invention;
Fig. 4 is a two-handed sign language training flow chart of the sign language recognition system of the present invention;
Fig. 5 is a recognition flow chart of the sign language recognition system of the present invention.
Detailed description of the embodiments
As shown in Fig. 1 and Fig. 2, the wearable sign language recognition device and its hardware system provided by the present invention comprise a finger ring 401 worn on a finger of the user and an armband 404 worn on the forearm of each arm; the arm strap 402 of the armband 404 is stretchable so as to fit users with different arm circumferences.
The armband 404 comprises a multi-channel muscle electrical signal sensor 113, a multi-axis motion sensor 107, a microprocessor 106, a feedback element 114, and a communication module 108, wherein the multi-channel muscle electrical signal sensor 113 comprises multi-channel muscle electrical signal electrodes 102, a filter circuit 103, and an amplifier circuit 104. The finger ring 401 comprises a multi-axis motion sensor 405. Within the armband 404, module 403 comprises the muscle electrical signal sensor 113, the multi-axis motion sensor 107, the microprocessor 106, the feedback element 114, and the communication module 108; the remaining modules comprise only a muscle electrical signal sensor 113.
The sign language recognition hardware system acquires the multi-channel muscle electrical signals at the forearm through the electrode pads 406 of the multi-channel muscle electrical signal electrodes 102; after the filter circuit 103, the amplifier circuit 104, and A/D conversion, they are read by the microprocessor 106. The palm multi-axis motion sensor 105 provides the three-axis accelerometer, three-axis gyroscope, and three-axis magnetometer data at the palm, which are read by the microprocessor 106 through, but not limited to, a wired connection; the forearm multi-axis motion sensor 107 provides the three-axis accelerometer, three-axis gyroscope, and three-axis magnetometer data at the arm, which are likewise read by the microprocessor 106. After the microprocessor 106 has performed the classification operation, it sends an instruction to the feedback element 114, which provides feedback such as, but not limited to, vibration. The microprocessor 106 exchanges information bidirectionally with the host computer 111 through the communication module 108.
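Because the attitude of the palm relative to the forearm is the quantity emphasized above, the following sketch shows one generic way to obtain it once each IMU's orientation has been fused into a unit quaternion by whatever filter the implementation uses; the quaternion helpers are standard formulas and are not taken from the patent.

```python
import numpy as np

def quat_conj(q):
    # Conjugate of a unit quaternion [w, x, y, z]
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_mul(a, b):
    # Hamilton product of two quaternions [w, x, y, z]
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def palm_relative_to_forearm(q_palm, q_forearm):
    """Attitude of the palm IMU expressed in the forearm IMU frame."""
    return quat_mul(quat_conj(q_forearm), q_palm)
```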
In use, the wearable device end 101 of the present invention first establishes communication with the host computer end 109 through the communication module 110; the communication module 110 uses, but is not limited to, Bluetooth or other wireless transmission. The host computer 111 may be a mobile terminal device such as a smartphone or a tablet computer. When the wearable device end 101 is used for the first time, training is required; when the wearable device end 101 is used again, the training step can be skipped and online recognition entered directly.
Fig. 3 and Fig. 4 show the training flow of the sign language recognition system of the present invention. On first use, once the host computer 111 and the wearable device end 101 have successfully established communication, the feedback element 114 indicates a normal connection. The host computer 111 sends a host computer mode selection instruction 201, 216 through the communication module 110 to the communication module 108 of the wearable device end, entering training 202, 217. In training mode, the selected sign language gesture 203, 218 is divided into one-handed sign language gestures 204 and two-handed sign language gestures 219.
When a one-handed gesture 204 is trained in training mode, the microprocessor 106 reads the multi-axis motion sensor 405 of the finger ring 401 to obtain the one-hand palm multi-axis motion sensor raw data 205, and reads the multi-axis motion sensor in module 403 of the armband 404 to obtain the one-hand forearm multi-axis motion sensor raw data 206. Palm and forearm attitude calculation 207 is performed on the two kinds of raw data and features 208 are extracted, yielding the relative position of the palm and the forearm, and the first classifier 210 is trained. The microprocessor 106 reads the multi-channel muscle electrical signal 113 after the filter circuit 103 and the amplifier circuit 104; it is windowed 211, features 213 are extracted, and the second classifier 212 is trained. The trained first classifier 210 and second classifier 212 are fused, the one-handed sign language classifier set 214 is obtained, and the resulting sign language classifiers 215 are stored in the host computer 111.
When a two-handed gesture 219 is trained in training mode, the microprocessor 106 reads the multi-axis motion sensors 405 of the finger rings 401 on both hands to obtain the left-hand palm multi-axis motion sensor raw data 220 and the right-hand palm multi-axis motion sensor raw data 222, and reads the multi-axis motion sensors in modules 403 of the armbands 404 on both arms to obtain the left-hand forearm multi-axis motion sensor raw data 221 and the right-hand forearm multi-axis motion sensor raw data 222. Left-hand palm and forearm attitude calculation 224 and right-hand palm and forearm attitude calculation 226 are performed on the respective raw data and features 225, 227 are extracted, yielding the relative position of each palm and forearm, and the first classifier 229 and the third classifier 231 are trained. The microprocessor 106 reads the left-hand forearm multi-channel muscle electrical signal 228 and the right-hand forearm multi-channel muscle electrical signal 241 after the filter circuit 103 and the amplifier circuit 104; they are windowed 233, 235, features 234, 236 are extracted, and the second classifier 230 and the fourth classifier 232 are trained. The trained first classifier 229 and second classifier 230 are fused to obtain the left-hand sign language classifier set 237, the trained third classifier 231 and fourth classifier 232 are fused to obtain the right-hand sign language classifier set 238, the left-hand and right-hand classifier sets are fused to obtain the combined left-and-right-hand classifier 239, and the resulting sign language classifiers 240 are stored in the host computer 111.
After training ends, the user can send a host computer mode selection instruction 301 from the host computer 111 through the communication module 110 to the communication module 108 of the wearable device end 101 and enter online recognition 302. The user performs sign language gestures according to the meaning to be expressed 303; the microprocessor 106 reads the multi-axis motion sensors 405 of the finger rings 401 on both hands to obtain the palm multi-axis motion sensor raw data 304 of both hands, and attitude calculation 305 is used to judge whether the performed gesture is a one-handed gesture 306 or a two-handed gesture 307.
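The patent states only that the one-handed versus two-handed decision is made from the palm motion data; as one plausible illustration (not the disclosed rule), the sketch below compares the angular-rate energy of each hand's palm gyroscope against a threshold, whose value is an arbitrary assumption.

```python
import numpy as np

def detect_hands_used(gyro_left, gyro_right, energy_thresh=0.5):
    """Guess whether a gesture is one- or two-handed from palm gyroscope energy.

    gyro_left, gyro_right: (n_samples, 3) angular-rate arrays in rad/s.
    The energy threshold is an illustrative assumption.
    """
    e_left = np.mean(np.sum(gyro_left ** 2, axis=1))
    e_right = np.mean(np.sum(gyro_right ** 2, axis=1))
    left_active = e_left > energy_thresh
    right_active = e_right > energy_thresh
    if left_active and right_active:
        return "both"
    return "left" if left_active else "right"
```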
If the performed gesture is a one-handed gesture 306, the microprocessor 106 reads the multi-channel muscle electrical signal 308 after the filter circuit 103 and the amplifier circuit 104 and segments the gesture 314 using endpoint detection 311. Specifically, the endpoint detection 311 computes the sum of squares of the data points in each window after the multi-channel muscle electrical signal 308 has been windowed and compares it with a preset muscle-activation threshold, thereby judging whether a gesture is in progress, so that continuous sign language gestures are segmented into individual gestures. The one-handed gesture raw data are then processed 332.
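The following sketch instantiates exactly that rule (windowed sum of squares compared with an activation threshold) to cut a continuous EMG recording into gesture segments; the window length, step, and threshold values are illustrative assumptions rather than values given in the patent.

```python
import numpy as np

def detect_active_segments(emg, win_len=200, step=100, threshold=1e-3):
    """Segment continuous multi-channel EMG into individual gestures.

    emg: (n_samples, n_channels) array. Each analysis window is marked active
    when the sum of squares of its samples exceeds the activation threshold.
    """
    active = []
    for start in range(0, len(emg) - win_len + 1, step):
        window = emg[start:start + win_len]
        active.append(np.sum(window ** 2) > threshold)  # sum of squares vs. threshold

    # Merge consecutive active windows into (start, stop) sample ranges
    segments, seg_start = [], None
    for i, is_active in enumerate(active):
        if is_active and seg_start is None:
            seg_start = i * step
        elif not is_active and seg_start is not None:
            segments.append((seg_start, (i - 1) * step + win_len))
            seg_start = None
    if seg_start is not None:
        segments.append((seg_start, len(emg)))
    return segments
```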
The processing 332 of the one-handed gesture raw data is as follows. The microprocessor 106 reads the multi-axis motion sensor 405 of the finger ring 401 to obtain the one-hand palm multi-axis motion sensor raw data 317, and reads the multi-axis motion sensor in module 403 of the armband 404 to obtain the one-hand forearm multi-axis motion sensor raw data 318. Palm and forearm attitude calculation 321 is performed on the two kinds of raw data, and features 322 are extracted and fed into the first classifier 325 for a preliminary judgment of the class of the performed gesture. The microprocessor 106 reads the multi-channel muscle electrical signal 327 after the filter circuit 103 and the amplifier circuit 104; it is windowed 328, and features 329 are extracted and fed into the second classifier 326 for a preliminary judgment of the class of the performed gesture.
The preliminary class judgments made by the first classifier 325 and the second classifier 326 are fed into the one-handed sign language classifier set 333, which finally yields the result of the one-handed gesture recognition 334.
If the performed gesture is a two-handed gesture 307, the microprocessor 106 reads the left-hand forearm multi-channel muscle electrical signal 309 and the right-hand forearm multi-channel muscle electrical signal 310 after the filter circuit 103 and the amplifier circuit 104, and segments the left-hand and right-hand gestures 315, 316 using endpoint detection 312, 313, so that continuous sign language gestures are segmented into individual gestures. The left-hand and right-hand gesture raw data are processed 319, 320 respectively; after preliminary classification by the left-hand sign language classifier set 323 and the right-hand sign language classifier set 324, the results are fed into the combined left-and-right-hand classifier 330, which finally yields the result of the two-handed gesture recognition 331.
The sign language recognition result 335 is sent from the communication module 108 of the wearable device end 101 through the communication module 110 of the host computer end 109 to the host computer 111; error correction and grammar correction are performed with the aid of the interactive device 112 of the host computer 111, and the result is output in the form of speech or text. After the user completes a sentence of sign language gestures, the feedback element 114 prompts the user as to whether the speech has been translated, which avoids confusion in the exchange and enables people with speech or hearing impairments to communicate normally with hearing people.
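Any off-the-shelf text-to-speech engine could produce the speech output on the host computer; the snippet below uses the cross-platform pyttsx3 package purely as an example, and assumes the corrected sentence has already been delivered to the host by the communication modules described above.

```python
import pyttsx3

def speak_translation(sentence: str) -> None:
    """Render the corrected sign language translation as speech."""
    engine = pyttsx3.init()
    engine.say(sentence)
    engine.runAndWait()

speak_translation("Hello, nice to meet you.")
```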
After the training mode has been completed, the user can directly select the online recognition mode when the wearable device end is used again, without going through the training process a second time.
The preferred embodiments of the present invention have been described in detail above. It should be understood that a person of ordinary skill in the art can make many modifications and variations according to the concept of the present invention without inventive effort. Therefore, any technical solution that a person skilled in the art can obtain on the basis of the prior art through logical analysis, reasoning, or limited experimentation according to the concept of the present invention shall fall within the scope of protection defined by the claims.

Claims (10)

1. A wearable sign language recognition device, characterized in that it comprises a wearable device end and a host computer end;
the wearable device end comprises a wearable finger ring and an armband, the finger ring carrying a palm multi-axis motion sensor; the armband carries at least one first module, one or more second modules, and an arm strap that fixes the first module and the second modules; the first module comprises a muscle electrical signal sensor, a forearm multi-axis motion sensor, a microprocessor, a feedback element, and a first communication module, and the second module comprises a muscle electrical signal sensor; the muscle electrical signal sensor, the forearm multi-axis motion sensor, the feedback element, the first communication module, and the palm multi-axis motion sensor are all connected to the microprocessor; the microprocessor is configured to read and process the information collected by the muscle electrical signal sensor, the forearm multi-axis motion sensor, and the palm multi-axis motion sensor; and the feedback element is configured to provide prompt information to the user;
the host computer end comprises a host computer and a second communication module, and the microprocessor exchanges information with the host computer through the first communication module and the second communication module.
2. The wearable sign language recognition device according to claim 1, wherein the host computer end further comprises an interactive device connected to the host computer, the interactive device being configured to perform error correction and grammar correction on the sign language recognized by the host computer.
3. The wearable sign language recognition device according to claim 1, wherein the multi-channel muscle electrical signal sensor comprises multi-channel muscle electrical signal electrodes, a filter circuit, and an amplifier circuit.
4. The wearable sign language recognition device according to claim 1, wherein the host computer is a mobile terminal device such as a smartphone or a tablet computer.
5. The wearable sign language recognition device according to claim 1, wherein the arm strap is made of a stretchable material.
6. The wearable sign language recognition device according to claim 1, wherein the palm multi-axis motion sensor and the forearm multi-axis motion sensor each comprise a three-axis accelerometer, a three-axis gyroscope, and a three-axis magnetometer.
7. The wearable sign language recognition device according to claim 1, wherein the prompt information includes feedback prompt modes such as vibration, sound, a display screen, or indicator lights.
8. A sign language translation method using the wearable sign language recognition device according to any one of claims 1 to 7, characterized in that it comprises the following steps:
1) selecting the training mode on the host computer, training classifiers on sign language gestures to obtain a sign language classifier set, and storing it in the host computer;
2) selecting the recognition mode on the host computer, obtaining a classifier set from the signals of a sign language gesture, matching it against the sign language classifier set stored in the host computer, and outputting the recognition result;
3) performing error correction and grammar correction on the recognition result through the interactive device of the host computer, and outputting it in the form of speech or text; after the user completes a sentence of sign language gestures, the feedback element prompts the user as to whether the speech has been translated.
9. The sign language translation method according to claim 8, wherein the training process of step 1) comprises the following steps:
a) performing attitude calculation on the sign language gesture with the palm multi-axis motion sensor and the forearm multi-axis motion sensor, extracting features, and training a first training classifier;
b) windowing the sign language gesture signal from the multi-channel muscle electrical signal sensor, extracting features, and training a second training classifier;
c) fusing the information of the first training classifier and the second training classifier to obtain the sign language classifier set, and storing the resulting sign language classifiers in the host computer.
10. The sign language translation method according to claim 8, wherein the recognition process of step 2) comprises the following steps:
a) performing attitude calculation on the sign language gesture with the palm multi-axis motion sensor and determining whether it is a one-handed or a two-handed gesture;
b) performing endpoint detection on the sign language gesture with the multi-channel muscle electrical signal sensor, so that continuous sign language gestures are segmented into individual gestures;
c) performing attitude calculation on the sign language gesture with the palm multi-axis motion sensor and the forearm multi-axis motion sensor, and windowing the sign language gesture signal from the multi-channel muscle electrical signal sensor, to obtain a classifier set;
d) matching the obtained classifier set against the sign language classifier set stored in the host computer and outputting the recognition result.
CN201610777204.5A 2016-08-30 2016-08-30 Wearable sign language recognition device and sign language translation method Pending CN106200988A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610777204.5A CN106200988A (en) 2016-08-30 2016-08-30 Wearable sign language recognition device and sign language translation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610777204.5A CN106200988A (en) 2016-08-30 2016-08-30 Wearable sign language recognition device and sign language translation method

Publications (1)

Publication Number Publication Date
CN106200988A true CN106200988A (en) 2016-12-07

Family

ID=58089374

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610777204.5A Pending CN106200988A (en) 2016-08-30 2016-08-30 Wearable sign language recognition device and sign language translation method

Country Status (1)

Country Link
CN (1) CN106200988A (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107240337A (en) * 2017-07-27 2017-10-10 金勇 A kind of intelligent sign language ring
CN107391497A (en) * 2017-07-28 2017-11-24 深圳市锐曼智能装备有限公司 Bluetooth finger tip translater and its interpretation method
CN107728788A (en) * 2017-10-31 2018-02-23 吉林大学 One kind is based on infrared ultrasonic three-dimensional localization body feeling interaction device
CN107749213A (en) * 2017-11-24 2018-03-02 闽南师范大学 A kind of wearable sign language tutoring system based on six axle attitude transducer modules
CN108037821A (en) * 2017-10-30 2018-05-15 上海念通智能科技有限公司 A kind of wearable palm band for gesture identification
CN108766434A (en) * 2018-05-11 2018-11-06 东北大学 A kind of Sign Language Recognition translation system and method
CN109032333A (en) * 2018-06-24 2018-12-18 佛山凯舒易智能科技有限公司 A kind of sign Language Recognition and method based on optical fiber perception
CN110840652A (en) * 2019-11-11 2020-02-28 北京海益同展信息科技有限公司 Wearable device, information processing method and device
CN111881697A (en) * 2020-08-17 2020-11-03 华东理工大学 Real-time sign language translation method and system
CN114442798A (en) * 2020-11-06 2022-05-06 复旦大学附属妇产科医院 Portable control system and control method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101819635A (en) * 2010-04-02 2010-09-01 北京大学软件与微电子学院无锡产学研合作教育基地 Micro-inertial navigation signal and mode recognition-based sign language interpretation method
CN102810154A (en) * 2011-06-02 2012-12-05 国民技术股份有限公司 Method and system for biological characteristic acquisition and fusion based on trusted module
CN104134060A (en) * 2014-08-03 2014-11-05 上海威璞电子科技有限公司 Sign language interpreting, displaying and sound producing system based on electromyographic signals and motion sensors

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101819635A (en) * 2010-04-02 2010-09-01 北京大学软件与微电子学院无锡产学研合作教育基地 Micro-inertial navigation signal and mode recognition-based sign language interpretation method
CN102810154A (en) * 2011-06-02 2012-12-05 国民技术股份有限公司 Method and system for biological characteristic acquisition and fusion based on trusted module
CN104134060A (en) * 2014-08-03 2014-11-05 上海威璞电子科技有限公司 Sign language interpreting, displaying and sound producing system based on electromyographic signals and motion sensors

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
曹翔, 陈香, 苏瑞良: "Multi-sensor sign language gesture recognition method based on fHMM classification optimization", Space Medicine & Medical Engineering (《航天医学与医学工程》) *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107240337A (en) * 2017-07-27 2017-10-10 金勇 A kind of intelligent sign language ring
CN107391497A (en) * 2017-07-28 2017-11-24 深圳市锐曼智能装备有限公司 Bluetooth finger tip translater and its interpretation method
CN107391497B (en) * 2017-07-28 2024-02-06 深圳市锐曼智能装备有限公司 Bluetooth fingertip translator and translation method thereof
CN108037821A (en) * 2017-10-30 2018-05-15 上海念通智能科技有限公司 A kind of wearable palm band for gesture identification
CN107728788A (en) * 2017-10-31 2018-02-23 吉林大学 One kind is based on infrared ultrasonic three-dimensional localization body feeling interaction device
CN107749213A (en) * 2017-11-24 2018-03-02 闽南师范大学 A kind of wearable sign language tutoring system based on six axle attitude transducer modules
CN108766434A (en) * 2018-05-11 2018-11-06 东北大学 A kind of Sign Language Recognition translation system and method
CN108766434B (en) * 2018-05-11 2022-01-04 东北大学 Sign language recognition and translation system and method
CN109032333A (en) * 2018-06-24 2018-12-18 佛山凯舒易智能科技有限公司 A kind of sign Language Recognition and method based on optical fiber perception
CN110840652A (en) * 2019-11-11 2020-02-28 北京海益同展信息科技有限公司 Wearable device, information processing method and device
CN111881697A (en) * 2020-08-17 2020-11-03 华东理工大学 Real-time sign language translation method and system
CN114442798A (en) * 2020-11-06 2022-05-06 复旦大学附属妇产科医院 Portable control system and control method

Similar Documents

Publication Publication Date Title
CN106200988A (en) Wearable sign language recognition device and sign language translation method
CN104134060B (en) Sign language interpreter and display sonification system based on electromyographic signal and motion sensor
CN107886061B (en) Human body behavior recognition method and system based on multi-mode deep Boltzmann machine
WO2018120964A1 (en) Posture correction method based on depth information and skeleton information
CN107397649A (en) A kind of upper limbs exoskeleton rehabilitation robot control method based on radial base neural net
CN108983636B (en) Man-machine intelligent symbiotic platform system
CN101947152A (en) Electroencephalogram-voice control system and working method of humanoid artificial limb
CN107423730A (en) A kind of body gait behavior active detecting identifying system and method folded based on semanteme
CN107754225A (en) A kind of intelligent body-building coaching system
CN105739688A (en) Man-machine interaction method and device based on emotion system, and man-machine interaction system
CN210402266U (en) Sign language translation system and sign language translation gloves
CN104463100A (en) Intelligent wheelchair man-machine interaction system and method based on facial expression recognition mode
CN104665962A (en) Wearable function-enhanced manipulator system as well as assisting fingers and control method thereof
CN103279734A (en) Novel intelligent sign language translation and man-machine interaction system and use method thereof
CN106997243A (en) Speech scene monitoring method and device based on intelligent robot
CN103336581A (en) Human eye movement characteristic design-based human-computer interaction method and system
CN105912142A (en) Step recording and behavior identification method based on acceleration sensor
CN106123911A (en) A kind of based on acceleration sensor with the step recording method of angular-rate sensor
CN204537060U (en) A kind of human-computer interaction device based on myoelectricity stream and multi-sensor cooperation effect
JP2022507635A (en) Intelligent vehicle motion control methods and devices, equipment and storage media
CN110442233A (en) A kind of augmented reality key mouse system based on gesture interaction
CN107132915A (en) A kind of brain-machine interface method based on dynamic brain function network connection
CN113849068A (en) Gesture multi-mode information fusion understanding and interacting method and system
CN114255508A (en) OpenPose-based student posture detection analysis and efficiency evaluation method
CN113423341A (en) Method and apparatus for automatic calibration of wearable electrode sensor system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20161207