CN103699227A - Novel human-computer interaction system

Novel human-computer interaction system

Info

Publication number: CN103699227A
Application number: CN201310725456.XA
Authority: CN (China)
Prior art keywords: action, human-computer interaction, oral cavity, interaction device
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventor: 邵剑锋
Current Assignee: Individual (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Individual
Application filed by: Individual
Priority date: 2013-12-25 (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Filing date: 2013-12-25
Publication date: 2014-04-02
Priority to: CN201310725456.XA
Publication of: CN103699227A

Landscapes

  • Dental Tools And Instruments Or Auxiliary Dental Instruments (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The invention relates to the technical field of operating a control terminal through oral actions, by means of a human-computer interaction device capable of recognizing tongue movement actions, tooth occlusion actions, and oral breathing actions. The human-computer interaction device disclosed by the invention is compact in structure, high in control accuracy and sensitivity, low in error rate, and convenient to carry and use. The device overcomes the limitations and shortcomings of limb-based control and intelligent voice recognition control technologies, and substitutes for and supplements existing techniques for operating a control terminal. The device increases the participation of persons with disabilities in social activities and allows them to control and use terminals more and better, so that they can enjoy the conveniences brought by human development; it can also improve their employment opportunities. The device further improves efficiency: while it is in use, both hands are freed for other tasks or for rest, so that hand resources are better allocated and utilized.

Description

Novel human-computer interaction system
Technical field
The present invention relates to the technical field of controlling a terminal through oral cavity movements, and more particularly to the technical field of operating and controlling a terminal by means of a human-computer interaction device capable of recognizing tongue movement actions, tooth occlusion actions, and oral breathing actions. The present invention substitutes for and supplements existing techniques for operating a control terminal.
Background art
With the development of science and technology, there are more and more electronic terminal devices, such as desktop computers, notebook computers, tablet computers, smartphones, smart watches, navigators, smart glasses, cameras, numerically controlled machine tools, game consoles, robots, and similar terminal devices, which have brought many conveniences to work and life. The current means of operating and controlling such terminal devices are limb-based control technologies (such as keyboards, mice, trackballs, touch screens, touch pads, and joysticks) and intelligent voice control technologies.
The above technical means have, to a greater or lesser extent, the following limitations and shortcomings:
Persons with limb disabilities cannot use limb-based control technologies.
Persons with aphasia cannot use voice-based control.
Persons with both limb disabilities and aphasia cannot use such devices at all for work or daily life.
Voice recognition control has a narrow scope of control: it can only operate discrete functions on a terminal, which is a significant limitation.
Voice recognition control suffers from a high error rate.
When a terminal is operated with limb-based control, both hands cannot be freed to take part in other activities or work, which is inefficient.
Summary of the invention
To overcome the limitations and shortcomings of the above operation and control technologies, the present invention discloses the structure and the method of use of a human-computer interaction device that can be placed in the mouth; by using this device, oral actions are used to achieve human-computer interaction and control with a terminal.
The technical scheme adopted in the present invention is as follows: the human-computer interaction device is shaped so that it can be held in the mouth. The part held in the mouth is provided with a tongue action recognition module (1), a tooth action recognition module (2), and a breathing action recognition module (3). These three recognition modules respectively capture the tongue movement track, the tooth occlusion action, and the oral breathing action, and convert the action signals into electrical signals. A central processing module (4) then processes and converts these signals and communicates with the terminal, so that the terminal is operated and controlled through oral actions.
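For clarity, the following minimal C sketch shows the kinds of signals the three recognition modules and the central processing module (4) exchange, as implied by this scheme. All type and field names are illustrative assumptions; the disclosure does not specify any particular data format.

#include <stdbool.h>
#include <stdint.h>

/* Output of the tongue action recognition module (1): a displacement, much like a mouse delta. */
typedef struct {
    int16_t dx;   /* horizontal movement of the tongue */
    int16_t dy;   /* vertical movement of the tongue */
} tongue_delta_t;

/* Output of the tooth action recognition module (2): occluded or not (state of switch 7). */
typedef bool bite_state_t;

/* Output of the breathing action recognition module (3): one of three states. */
typedef enum {
    BREATH_NONE,     /* no airflow */
    BREATH_INHALE,   /* inhaling through the airway (8) */
    BREATH_EXHALE    /* exhaling through the airway (8) */
} breath_state_t;

/* Report assembled by the central processing module (4) and communicated to the terminal. */
typedef struct {
    tongue_delta_t tongue;
    bite_state_t   bite;
    breath_state_t breath;
} oral_report_t;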
In the present invention, the function of the tongue action recognition module (1) is to detect and recognize the moving direction and moving speed of the tongue. Several technical schemes are available for this; two of them are described here:
The first uses a photoelectric reflection scheme to determine tongue movement (similar to optical mouse technology). Its working principle is that the tongue action recognition module contains a light-emitting diode, a sensing chip, and a positioning chip. When the light-emitting diode illuminates the tongue, the uneven tongue surface produces different reflected light signals; the sensing chip processes these reflected light signals into corresponding digital signals, and the positioning chip in the module analyzes the digital signals to obtain displacement data.
The second uses a video recognition scheme to determine tongue movement. Its working principle is that the tongue action recognition module contains, as core components, a light-emitting diode, a miniature camera, an optical engine, and a control chip. During operation, the light-emitting diode illuminates the surface of the tongue tip while the miniature camera continuously captures images of the tongue tip surface at fixed time intervals. The different images produced as the tongue moves are sent to the optical engine for digitization, and a positioning DSP chip in the optical engine then analyzes the resulting image matrices. Because two adjacent images always share common features, comparing the position changes of these feature points yields the moving direction and distance of the tongue; the result is finally converted into coordinate offset data representing the displacement.
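As an illustration of the frame-comparison principle described above, the following minimal C sketch estimates the shift between two successive tongue-surface images by minimizing the sum of absolute differences over a small search window. The image size, search range, and function names are assumptions for illustration; the actual firmware of the positioning DSP chip is not disclosed.

#include <stdint.h>
#include <stdlib.h>

#define IMG_W  32
#define IMG_H  32
#define SEARCH  4   /* maximum shift considered in each direction, in pixels */

/* Sum of absolute differences between the previous frame and the current frame shifted by (dx, dy). */
static uint32_t sad_shift(const uint8_t prev[IMG_H][IMG_W],
                          const uint8_t cur[IMG_H][IMG_W],
                          int dx, int dy)
{
    uint32_t sum = 0;
    for (int y = SEARCH; y < IMG_H - SEARCH; ++y)
        for (int x = SEARCH; x < IMG_W - SEARCH; ++x)
            sum += (uint32_t)abs((int)prev[y][x] - (int)cur[y + dy][x + dx]);
    return sum;
}

/* The shift with the smallest difference is taken as the tongue displacement (dx, dy). */
void estimate_displacement(const uint8_t prev[IMG_H][IMG_W],
                           const uint8_t cur[IMG_H][IMG_W],
                           int *best_dx, int *best_dy)
{
    uint32_t best = UINT32_MAX;
    *best_dx = 0;
    *best_dy = 0;
    for (int dy = -SEARCH; dy <= SEARCH; ++dy) {
        for (int dx = -SEARCH; dx <= SEARCH; ++dx) {
            uint32_t s = sad_shift(prev, cur, dx, dy);
            if (s < best) {
                best = s;
                *best_dx = dx;
                *best_dy = dy;
            }
        }
    }
}

Accumulating these per-frame offsets over time yields the displacement data mentioned above, in the same way an optical mouse accumulates coordinate offsets.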
In addition, several other displacement-detection technologies are available, such as joysticks, trackballs, touch pads, touch screens, infrared reflection recognition, laser recognition, and video recognition; any one of these detection technologies can realize the function of this module. This description covers only the photoelectric reflection scheme and the video recognition scheme, but using other similar displacement-detection technologies to detect tongue movement should also fall within the protection scope of the present invention.
The function of the tooth occlusion recognition module (2) of the present invention is to recognize whether or not the teeth are occluded by detecting the state of the tooth occlusion switch (7). This function is realized by providing a tooth occlusion switch (7) in the part of the device held in the mouth, together with the circuit connecting the switch (7); a mechanical push-button switch, a photoelectric switch, or any other established switching technology may be used.
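As an illustration, the following minimal C sketch reads and debounces the occlusion switch (7) as a digital input. The hardware read function, the polling interval, and the debounce count are assumptions; a mechanical push-button switch or a photoelectric switch wired as a digital input would be handled in the same way.

#include <stdbool.h>
#include <stdint.h>

#define DEBOUNCE_SAMPLES 5   /* consecutive identical reads required to accept a new state */

/* Assumed hardware read of switch (7): true when the teeth are occluded and the contacts are closed. */
extern bool read_bite_switch_raw(void);

/* Returns the debounced occlusion state; intended to be called at a fixed polling interval. */
bool read_bite_state(void)
{
    static bool    stable   = false;
    static bool    last_raw = false;
    static uint8_t count    = 0;

    bool raw = read_bite_switch_raw();
    if (raw == last_raw) {
        if (count < DEBOUNCE_SAMPLES)
            ++count;
        if (count >= DEBOUNCE_SAMPLES)
            stable = raw;            /* the reading has been steady long enough: accept it */
    } else {
        count    = 0;                /* the reading changed: restart the count */
        last_raw = raw;
    }
    return stable;
}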
In the present invention, the function of the oral breathing action recognition module (3) is to recognize three oral states, exhaling, inhaling, and no airflow, by detecting the state of a valve switch (10) or a pressure sensor (11) in the module. The technology adopted is to provide a breathing airway (8) in the part of the device held in the mouth and to arrange a valve switch (10) and its circuit, or a pressure sensor (11) and its circuit, in the airway (8). Under different airflow directions the state of the valve switch or pressure sensor changes, and detecting that state identifies the oral breathing state.
For the breathing detection module of the device, several combinations of the airway, valve switches, air chamber, and pressure sensor can realize the module's function; three of these schemes are described below, followed by a sketch of the decoding logic.
Scheme one, as shown in Fig. 4: a combination of two valve switches is used;
When inhaling, contact AC conducts and contact AB is open;
When exhaling, contact AB conducts and contact AC is open;
When there is no airflow, both contact AB and contact AC conduct.
Scheme two, as shown in Fig. 4: a single valve switch is used;
When inhaling, contact AC conducts and contact AB is open;
When exhaling, contact AB conducts and contact AC is open;
When there is no airflow, both contact AB and contact AC are open.
Scheme three, as shown in Fig. 4: a pressure sensor is used;
When inhaling, the pressure sensor is in a negative-pressure state;
When exhaling, the pressure sensor is in a positive-pressure state;
When there is no airflow, the pressure sensor is at atmospheric pressure.
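As an illustration of the schemes above, the following minimal C sketch decodes the breathing state from the single valve switch of scheme two and from the pressure sensor of scheme three. The contact-reading functions, the pressure units, and the dead band are assumptions, not part of the disclosure.

#include <stdbool.h>

typedef enum { BREATH_NONE, BREATH_INHALE, BREATH_EXHALE } breath_state_t;

extern bool  contact_ab_closed(void);   /* assumed read of contact AB in the valve switch */
extern bool  contact_ac_closed(void);   /* assumed read of contact AC in the valve switch */
extern float read_pressure_kpa(void);   /* assumed gauge pressure in kPa, 0 = atmospheric */

/* Scheme two: AC conducts on inhalation, AB conducts on exhalation, both are open at rest. */
breath_state_t breath_from_valve(void)
{
    if (contact_ac_closed() && !contact_ab_closed()) return BREATH_INHALE;
    if (contact_ab_closed() && !contact_ac_closed()) return BREATH_EXHALE;
    return BREATH_NONE;
}

/* Scheme three: negative pressure on inhalation, positive pressure on exhalation,
   roughly atmospheric when there is no airflow; the 0.2 kPa dead band is an assumed threshold. */
breath_state_t breath_from_pressure(void)
{
    float p = read_pressure_kpa();
    if (p < -0.2f) return BREATH_INHALE;
    if (p >  0.2f) return BREATH_EXHALE;
    return BREATH_NONE;
}

Scheme one differs only in the no-airflow case, where both contacts conduct, so its decoder would treat "both closed" rather than "both open" as BREATH_NONE.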
  
The schemes described above are not the only possible combinations; other combinations composed of an airway, valve switches, or a pressure sensor should also fall within the protection scope of the present invention.
The human-computer interaction device of the present invention is compact in structure, has high control accuracy and sensitivity and a low error rate, and is convenient to carry and easy to use. The present invention overcomes the aforementioned limitations and shortcomings of limb-based control and intelligent voice recognition control, and substitutes for and supplements existing techniques for operating a control terminal. The present invention improves the participation of persons with disabilities in social activities and makes society fairer and more harmonious. It allows persons with physical disabilities to control and use terminals more and better, so that they can better enjoy the conveniences brought by human development; it can also improve their employment opportunities. The present invention further improves the efficiency of life and work: while operating and controlling a terminal with the present invention, both hands are freed for other operations or for rest, so that hand resources are better allocated and utilized, achieving an efficiency-improving effect.
The features and operating mechanism of the present invention will be further set forth with reference to the accompanying drawings, in which identical reference numerals denote identical or similar elements.
Brief description of the drawings
The accompanying drawings and their description are included in the specification, form a part of it, and show embodiments of the invention; together with the description, they serve to explain the operating principle of the invention.
Fig. 1 shows the external structure of the human-computer interaction device of embodiment one.
Fig. 2 shows a sectional view of the internal structure of the human-computer interaction device of embodiment one.
Fig. 3 shows the operating flow of the human-computer interaction device.
Fig. 4 shows the structures and operating states of the three embodiments of the oral breathing recognition module in the human-computer interaction device.
Fig. 5 shows the method of use of the human-computer interaction device of embodiment one.
Fig. 6 shows a sectional view of the internal structure of the human-computer interaction device of embodiment two.
Detailed description of the embodiments
1. As shown in Fig. 2, in the working state the user places the device in the mouth and begins to use it.
2. The three recognition modules begin to capture oral actions: the tongue action recognition module (1) captures tongue movement and converts it into displacement data signals (similar to the movement data of a mouse); the tooth occlusion recognition module (2) recognizes the tooth occlusion state and converts the occlusion action into a corresponding signal; the oral breathing recognition module (3) recognizes the oral breathing state and converts it into a corresponding signal.
3. The central processing module (4) synchronously receives the signals output by the three recognition modules, processes and converts them (adapting the received signals to the signal standards required by different terminals), and transmits them to the corresponding terminal, forming the signal communication between the human-computer interaction device and the terminal (a minimal sketch of this loop follows the steps below).
4. Through this interaction between the oral actions, the system of the present invention, and the terminal, the terminal is operated and controlled by oral movements.
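As an illustration of steps 2 to 4, the following minimal C sketch shows a main loop for the central processing module (4): it polls the three recognition modules, packs their outputs into a small report, and sends the report to the terminal. All function names and the report layout are assumptions; the disclosure fixes no concrete protocol, only that the signals are converted to suit the target terminal.

#include <stdbool.h>
#include <stdint.h>

typedef enum { BREATH_NONE, BREATH_INHALE, BREATH_EXHALE } breath_state_t;

/* Assumed interfaces to the three recognition modules. */
extern void get_tongue_delta(int16_t *dx, int16_t *dy);   /* tongue action recognition module (1) */
extern bool read_bite_state(void);                        /* tooth occlusion recognition module (2) */
extern breath_state_t read_breath_state(void);            /* oral breathing recognition module (3) */

/* Assumed link to the terminal, e.g. a wired or wireless channel. */
extern void send_to_terminal(const uint8_t *buf, unsigned len);

void central_processing_loop(void)
{
    for (;;) {
        int16_t dx, dy;
        get_tongue_delta(&dx, &dy);

        /* Pack the three channels into one report: tongue movement acts as a pointer
           delta, occlusion as a button, and the breathing state as a third channel. */
        uint16_t udx = (uint16_t)dx;
        uint16_t udy = (uint16_t)dy;
        uint8_t report[6];
        report[0] = (uint8_t)(udx & 0xFF);
        report[1] = (uint8_t)(udx >> 8);
        report[2] = (uint8_t)(udy & 0xFF);
        report[3] = (uint8_t)(udy >> 8);
        report[4] = read_bite_state() ? 1 : 0;
        report[5] = (uint8_t)read_breath_state();

        send_to_terminal(report, sizeof report);
        /* Real firmware would pace this loop with a timer or sensor interrupts. */
    }
}

In practice the conversion step would map such a report onto whatever signal standard the target terminal expects, for example a standard mouse or keyboard report carried over the wired or wireless link of claim 11.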
The above embodiments describe one preferred embodiment of a human-computer interaction device capable of recognizing oral actions; they are not intended to limit the design and scope of the invention. Without departing from the scheme of the present invention, all modifications of appearance and improvements made to the technical scheme of the present invention by those skilled in the art should fall within the protection scope of the human-computer interaction device capable of recognizing oral actions of the present invention.

Claims (11)

1. A novel human-computer interaction system, characterized in that a device which can be placed in the mouth for use, which can recognize tongue movement actions, tooth occlusion actions, and oral breathing actions, which converts these action signals into corresponding electrical signals, and which communicates with a terminal, is used to operate and control the terminal through tongue movement, tooth occlusion, and oral breathing behaviour in the oral cavity.
2. The novel human-computer interaction system according to claim 1, characterized in that oral breathing, tooth occlusion, and tongue movement actions are used as the key control means of the system.
3. The novel human-computer interaction system according to claim 1, characterized by comprising a tongue action recognition module (1), located in the part of the human-computer interaction device held in the mouth, for capturing tongue movement actions and converting them into electrical signals.
4. The novel human-computer interaction system according to claim 1, characterized by comprising a tooth action recognition module (2), located in the part of the human-computer interaction device held in the mouth, for capturing tooth occlusion actions and converting them into electrical signals.
5. The novel human-computer interaction system according to claim 1, characterized by comprising an oral breathing action recognition module (3), located in the part of the human-computer interaction device held in the mouth, for capturing oral breathing actions and converting them into electrical signals.
6. The novel human-computer interaction system according to claim 1, characterized by comprising a central processing module (4) for signal processing, signal communication, and power management.
7. The novel human-computer interaction system according to claim 3, characterized in that the tongue action recognition module (1) can recognize the displacement action of the tongue and convert it into corresponding electrical signals, using any one of the available displacement recognition technologies, such as joystick, trackball, touch pad, touch screen, photoelectric reflection recognition, laser reflection recognition, or video recognition.
8. The novel human-computer interaction system according to claim 4, characterized in that the tooth occlusion action recognition module (2) is composed of one or more switches (7) and the circuits connecting them, and recognizes tooth occlusion actions by obtaining the state of the tooth occlusion recognition switch (7) in the module; the technology adopted in the tooth occlusion action recognition module is a conventional push-button switch, a photoelectric switch, or any other established switching circuit technology.
9. The novel human-computer interaction system according to claim 5, characterized in that the oral breathing action recognition module (3) is composed of an airway (8) combined with a valve switch (10) or a pressure sensor (11) and their circuits, and recognizes oral breathing actions by obtaining the state of the valve switch (10) or the pressure sensor (11) in the module.
10. The novel human-computer interaction system according to claim 6, characterized in that the central processing module (4) processes and converts the signals of the tongue action recognition module (1), the tooth occlusion action recognition module (2), and the oral breathing action recognition module (3), and communicates them to the terminal.
11. The novel human-computer interaction system according to claim 6, characterized in that the modes in which the central processing module (4) communicates signals with the terminal include wired communication and wireless communication.
CN201310725456.XA 2013-12-25 2013-12-25 Novel human-computer interaction system Pending CN103699227A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310725456.XA CN103699227A (en) 2013-12-25 2013-12-25 Novel human-computer interaction system

Publications (1)

Publication Number Publication Date
CN103699227A true CN103699227A (en) 2014-04-02

Family

ID=50360780

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310725456.XA Pending CN103699227A (en) 2013-12-25 2013-12-25 Novel human-computer interaction system

Country Status (1)

Country Link
CN (1) CN103699227A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1352766A (en) * 1999-02-12 2002-06-05 皮埃尔·博纳 Method and device for monitoring an electronic or computer system by means of a fluid flow
CN101236470A (en) * 2008-01-28 2008-08-06 范洪文 Tongue and tooth operated input device
CN102099922A (en) * 2008-03-26 2011-06-15 皮埃尔·邦纳特 Method and system for processing signals for a mems detector that enables control of a device using human breath
CN102460347A (en) * 2009-04-23 2012-05-16 耶达研究及发展有限公司 Nasal flow device controller
CN103135747A (en) * 2011-11-28 2013-06-05 宗鹏 Tongue-controlled mouse and finger keyboard
CN202394177U (en) * 2011-12-21 2012-08-22 林怡君 Oral type operation controller for electronic device

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105321519A (en) * 2014-07-28 2016-02-10 刘璟锋 Speech recognition system and unit
CN105321519B (en) * 2014-07-28 2019-05-14 刘璟锋 Speech recognition system and unit
CN104317388A (en) * 2014-09-15 2015-01-28 联想(北京)有限公司 Interaction method and wearable electronic equipment
CN104317388B (en) * 2014-09-15 2018-12-14 联想(北京)有限公司 A kind of exchange method and wearable electronic equipment
CN107894838A (en) * 2017-11-30 2018-04-10 佛山市蓝瑞欧特信息服务有限公司 Between cog controller
CN107948369A (en) * 2017-11-30 2018-04-20 佛山市蓝瑞欧特信息服务有限公司 Mobile phone oral area answers device
CN110007767A (en) * 2019-04-15 2019-07-12 上海交通大学医学院附属第九人民医院 Man-machine interaction method and tongue training system
CN115648258A (en) * 2022-11-01 2023-01-31 哈尔滨工业大学 Three-dimensional space human-computer interaction device controlled through tongue


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 2014-04-02