CN114063777A - Tongue and tooth driving interaction system and interaction method - Google Patents


Info

Publication number
CN114063777A
Authority
CN
China
Prior art keywords
tongue
tooth
command
tooth occlusion
matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111324177.3A
Other languages
Chinese (zh)
Inventor
郑凡
凤一鸣
杨大鹏
杨泓渊
陈雨阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jilin University
Original Assignee
Jilin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jilin University filed Critical Jilin University
Priority to CN202111324177.3A priority Critical patent/CN114063777A/en
Publication of CN114063777A publication Critical patent/CN114063777A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60: ICT specially adapted for the management or operation of medical equipment or devices, for the operation of medical equipment or devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The invention belongs to the field of medical assistance and relates to a tongue and tooth driven interaction system and interaction method. The method comprises the following steps: establishing an instruction library; triggering signals on a nipple-like sensor carrier placed in the oral cavity with the tongue and/or teeth; receiving the signals generated after triggering and forming a trigger type set comprising a plurality of trigger types from the received signals; matching the trigger type set against the corresponding instructions in the instruction library; and, if the matching is successful, transmitting the command to an actuator, which executes it on the docked operating device. The invention avoids the complex treatment process and the pain caused by puncturing the patient's body or adhering special equipment to it, and is simple to operate and low in cost.

Description

Tongue and tooth driving interaction system and interaction method
Technical Field
The invention belongs to the field of medical assistance, and particularly relates to a tongue and tooth driving interaction system and an interaction method.
Background
Many people worldwide are disabled by disease or accident, and conventional medical apparatus struggles to meet their needs. Methods are therefore needed that use the capabilities of a computer to assist patients and sustain their daily activities; the two main existing approaches are the brain-computer interface (BCI) and the magnetic wireless tongue-computer interface known as the Tongue Drive System (TDS). A BCI communicates control commands output by the brain directly, without relying on peripheral nerves. BCI systems usually require complex devices placed on the scalp or implanted in the brain to acquire electroencephalogram signals; algorithms then extract and convert signal features to determine the user's intention and translate it into specific computer control commands, which are finally input to the computer so that it operates according to the user's intention. TDS implants, punctures, or adheres a small permanent magnet to the patient's tongue as a tracking marker, collects magnetic signals with sensors to track the tongue's motion trajectory, and outputs the signals to a computer for processing. Neither method is widely used in daily life, and their high system price is unacceptable to many patients.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a tongue and tooth driven interaction system and interaction method that avoid the complex treatment process and the pain caused by puncturing the patient's body or adhering special equipment to it.
The invention is realized in such a way that a tongue and tooth drive interaction method comprises the following steps:
establishing an instruction library;
triggering signals on a nipple-like sensor carrier placed in the oral cavity with the tongue and/or teeth, wherein the trigger types comprise tongue touch action, tongue touch position, tongue touch count, tongue touch force, tongue touch duration, tooth occlusion action, tooth occlusion force, tooth occlusion count, tooth occlusion duration and tooth occlusion position;
receiving a signal generated after triggering, and forming a trigger type set comprising a plurality of trigger types according to the received signal;
matching the trigger type set with corresponding instructions in an instruction library;
if the matching is successful, transmitting the command to an actuator, and executing the command on the operating device docked with the actuator.
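The steps above can be sketched in Python. This is a minimal illustration, not the patent's implementation: the `Trigger` fields, command names, and library entries are all assumptions chosen for the example; the patent only specifies the pipeline of library, trigger set, match, and execute.

```python
from dataclasses import dataclass

# Hypothetical trigger record; the patent lists tongue-touch and
# tooth-occlusion action, position, count, force and duration as
# trigger types, of which three representative fields appear here.
@dataclass(frozen=True)
class Trigger:
    kind: str       # "tongue_touch" or "tooth_occlusion"
    position: str   # functional area on the sensor carrier
    count: int = 1  # repetitions within the time window

def build_instruction_library():
    # Illustrative instruction library: a set of triggers maps to a command.
    return {
        frozenset({Trigger("tooth_occlusion", "center", 1)}): "ALARM_LEVEL_1",
        frozenset({Trigger("tongue_touch", "left", 2)}): "REQUEST_WATER",
    }

def match_and_execute(triggers, library, actuator):
    """Match the received trigger type set against the library;
    on success, hand the command to the actuator callable."""
    command = library.get(frozenset(triggers))
    if command is not None:  # "if the matching is successful"
        actuator(command)
        return command
    return None              # no match: the trigger set is ignored
```

A frozen dataclass is hashable, so a `frozenset` of triggers can serve directly as the lookup key; an unmatched set simply returns `None` rather than reaching the actuator.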
Further, a single command corresponds to a combination of one or more trigger types within a set time period.
Further, before the trigger type set is matched against the instruction library, any signal whose tongue touch force or tooth occlusion force exceeds a set threshold is judged a valid signal and passed to the next step; otherwise it is treated as an invalid signal.
Further, the nipple-like sensor carrier comprises a tongue contact area and a tooth occlusion area, each formed from a plurality of functional areas; the tooth occlusion area is contained within the tongue contact area, and an event is judged to be a tongue touch or a tooth occlusion from the combination of position and force.
Furthermore, each functional area is provided with at least one sensor; the sensors are coded according to the functional area in which they are located, and the tongue touch position is determined from the code of the sensor that reported the signal.
Further, the docked operating device is an intelligent auxiliary device such as a computer, smartphone, tablet or intelligent wheelchair.
Further, establishing the instruction library comprises generating a secondary instruction library from the master instruction library according to the operating device docked with the actuator; the trigger type set is then matched against the secondary instruction library.
A tongue and tooth drive interaction system, the system comprising:
an instruction unit having an instruction library for performing interaction;
the acquisition unit, used for acquiring the signals generated after the tongue and/or teeth trigger signals on a nipple-like sensor carrier placed in the oral cavity, the carrier comprising a tongue contact area and a tooth occlusion area formed from a plurality of functional areas, the tooth occlusion area being contained within the tongue contact area, with tongue touch or tooth occlusion judged from the combination of position and force, and the trigger types comprising tongue touch action, tongue touch position, tongue touch count, tongue touch force, tongue touch duration, tooth occlusion action, tooth occlusion force, tooth occlusion count, tooth occlusion duration and tooth occlusion position;
a processing unit forming a trigger type set including a plurality of trigger types from the received signal;
the matching unit is used for matching the trigger type set with the corresponding instruction in the instruction library;
the execution unit, used when the matching unit reports a successful match, for transmitting the command to an actuator, the command being executed on the operating device docked with the actuator.
Further, a single instruction corresponds to a combination of one or more trigger types within a set time period.
Further, the system also comprises a judging unit which, before the trigger type set is matched against the instruction library, judges any signal whose tongue touch force or tooth occlusion force exceeds a set threshold to be a valid signal and passes it to the next step; otherwise the signal is treated as invalid.
Compared with the prior art, the invention has the beneficial effects that:
the invention is used for helping the disabled or the sick patient with consciousness and certain movement ability of the mouth and tongue to express the self requirement through the sensor device and the computer system. And the patient can perform some simple operations on the computer while meeting the basic functions, and the operations can help the patient to express and record, even to perform routine office work. Unlike current BCI and TDS techniques, the system does not require the implantation, puncture, and bonding of special devices into the patient, causing pain. Moreover, the wearing equipment is simple and convenient, and only the sensing equipment needs to be arranged in the inlet when the sensing equipment is used every time. The mode fully considers the self feeling of the patient, and maintains the dignity of the patient without wearing large special equipment. Compared with other devices, the system has lower cost and can be used by patients without paying huge money. The operation action of the user is simple and does not need training.
Drawings
FIG. 1 is a flow chart of a method provided by an embodiment of the present invention;
FIG. 2 is a block diagram of a system provided by an embodiment of the invention;
FIG. 3 is a schematic structural diagram of the nipple-like sensor carrier according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, a tongue and tooth drive interaction method includes:
establishing an instruction library;
triggering signals on a nipple-like sensor carrier placed in the oral cavity with the tongue and/or teeth, wherein the trigger types comprise tongue touch action, tongue touch position, tongue touch count, tongue touch force, tongue touch duration, tooth occlusion action, tooth occlusion force, tooth occlusion count, tooth occlusion duration and tooth occlusion position;
receiving a signal generated after triggering, and forming a trigger type set comprising a plurality of trigger types according to the received signal;
matching the trigger type set with corresponding instructions in an instruction library;
if the matching is successful, transmitting the command to an actuator, and executing the command on the operating device docked with the actuator.
A single command corresponds to a combination of one or more trigger types within a set time period, chosen according to the meaning the combination is intended to express. For example:
the method can be divided into three levels of instructions according to the degree required by a user, wherein the first level operation is an emergency function: if rescue and assistance are needed, a primary alarm is issued to alert medical and nursing staff once the command is triggered. Secondary operations are requirements in daily life such as: the upper level of the equal-level operation can send out a second-level alarm according to different contents to inform nursing staff, such as 'eating', 'drinking', 'turning over', 'toileting', 'sputum excretion' and the like. The three-level operation is that the user performs some non-urgent but more complex operations, such as playing music, assisting communication, operating a personal computer or a smart phone, operating an intelligent wheelchair and the like, and the three-level operation requires the user to complete the operations together with the intelligent equipment by means of an upper computer program, and at the moment, the upper computer does not give an alarm. The actions of different levels can be completed by corresponding to different tongue and tooth matching modes, and generally, the actions are from difficult to easy to correspond to the operation levels from low to high (from three levels to one level). The operation of the three levels can be adjusted and divided according to the default recommendation of the system and the self-based condition and will of the user. Tongue and tooth fit can be operated in several ways: 1. the tongue touches a certain position of the bottom of the nipple alone, and the bottom of the nipple can be divided into different functional areas according to different positions and force according to the requirements of users. The action triggers three-stage operation, and when the tongue touches different areas of the bottom of the nipple, different commands are triggered. 
These commands are transmitted through the signal transmission part to the upper computer, which performs the final processing. The number and assignment of functions in the bottom functional areas are customized according to the user's choices. 2. The tongue alone flicks or pushes gently inward in several directions (up, down, left and right); because this is easy to perform, the system by default maps it to second-level operations. 3. The teeth bite the nipple; this action can be divided into several functions according to the number of bites, the bite force and the bite duration. For example, a short bite can simulate a single click in computer operation and two consecutive short bites a double click. By default, biting the nipple head for more than 1 s triggers a first-level operation, since this is the action most easily performed when the patient is in danger and is hard to trigger by mistake in everyday use. 4. Biting the nipple while shaking the head; this compound action requires a conscious user and is mapped to third-level operations, completing non-essential but more advanced functions such as cursor movement or intelligent wheelchair motion control. 5. The tongue cooperates with the teeth; this and other actions can be customized so the user can accomplish the tasks they want. The actions, operation levels and specific operations listed above are only one possible configuration; the user can select or customize actions, levels and operations for new functions.
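The default bite mapping described above can be sketched as a small classifier. This is a hedged illustration only: the 1 s hold threshold comes from the text, but the function name, the command labels, and the treatment of bite counts are assumptions made for the example.

```python
def classify_bite(duration_s: float, count: int) -> str:
    """Map a bite event to a command label under the default scheme:
    a bite held longer than 1 s triggers the first-level alarm; short
    bites map to click-style third-level actions by repetition count."""
    if duration_s > 1.0:
        return "LEVEL_1_ALARM"   # emergency: easiest to do, hardest to miss
    if count == 2:
        return "DOUBLE_CLICK"    # two consecutive short bites
    return "SINGLE_CLICK"        # one short bite
```

Checking duration before count mirrors the text's priority: a long bite is always the emergency signal, regardless of how many short bites preceded it in the window.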
To prevent misjudgment caused by unintentional signals, before the trigger type set is matched against the instruction library, any signal whose tongue touch force or tooth occlusion force exceeds a set threshold is judged valid and passed to the next step; otherwise it is treated as invalid.
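A minimal sketch of this validity filter, assuming force readings arrive as numeric values; the dictionary shape and threshold value are illustrative, not specified by the patent.

```python
def is_valid_signal(force: float, threshold: float) -> bool:
    """A reading counts as deliberate only if its tongue-touch or
    tooth-occlusion force exceeds the configured threshold."""
    return force > threshold

def filter_signals(readings: list, threshold: float) -> list:
    # Discard sub-threshold readings as noise before matching;
    # each reading is assumed to carry a "force" field.
    return [r for r in readings if is_valid_signal(r["force"], threshold)]
```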
Referring to fig. 3, the nipple-like sensor carrier includes a tongue contact region 2 composed of a plurality of functional regions and a tooth occlusion region 1 contained within the tongue contact region; events are judged to be tongue touches or tooth occlusions from the combination of position and force. Each functional region is provided with at least one sensor; the sensors are coded according to the functional region in which they are located, and the tongue touch position is determined from the code of the sensor that reported the signal.
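The sensor coding and the position-plus-force discrimination can be sketched as follows. The code values, area names, and threshold are all hypothetical; the patent only states that codes identify functional areas and that the occlusion region sits inside the tongue-contact region.

```python
# Assumed code-to-area table: each sensor's code identifies the
# functional area it sits in, so position is recoverable from the code.
SENSOR_AREA = {
    0x01: "tongue_front",
    0x02: "tongue_left",
    0x03: "tongue_right",
    0x10: "tooth_occlusion",  # occlusion region, inside the tongue-contact region
}

def decode_position(sensor_code: int) -> str:
    return SENSOR_AREA.get(sensor_code, "unknown")

def classify_event(sensor_code: int, force: float, bite_threshold: float) -> str:
    """Because the occlusion region lies inside the tongue-contact
    region, position alone is ambiguous there; force breaks the tie."""
    if decode_position(sensor_code) == "tooth_occlusion" and force >= bite_threshold:
        return "tooth_occlusion"
    return "tongue_touch"
```

A light tongue touch on the occlusion region thus still registers as a tongue touch; only a reading at bite-level force is classified as an occlusion.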
The docked operating device is an intelligent auxiliary device such as a computer, smartphone, tablet or intelligent wheelchair.
Establishing the instruction library includes generating a secondary instruction library from the master instruction library according to the operating device docked with the actuator; the trigger type set is then matched against the secondary instruction library, allowing the system to adapt to different operating devices.
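One way to realize this per-device narrowing, sketched under assumptions: the master library tags each command with the devices that can execute it, and the secondary library keeps only the entries matching the docked device. The command names and device tags are illustrative.

```python
# Hypothetical master library: command -> devices that support it.
MASTER_LIBRARY = {
    "ALARM_LEVEL_1": {"computer", "smartphone", "wheelchair"},
    "MOVE_CURSOR":   {"computer", "smartphone"},
    "DRIVE_FORWARD": {"wheelchair"},
}

def secondary_library(docked_device: str) -> set:
    """Derive the secondary instruction library for the docked device,
    so matching only ever considers commands the device can execute."""
    return {cmd for cmd, devices in MASTER_LIBRARY.items()
            if docked_device in devices}
```

Matching against the smaller secondary library also means an action bound to, say, wheelchair motion can be safely reused for a different command when a computer is docked.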
Referring to fig. 2, the present invention provides a tongue and tooth drive interaction system comprising:
an instruction unit having an instruction library for performing interaction;
the acquisition unit, used for acquiring the signals generated after the tongue and/or teeth trigger signals on a nipple-like sensor carrier placed in the oral cavity, the carrier comprising a tongue contact area and a tooth occlusion area formed from a plurality of functional areas, the tooth occlusion area being contained within the tongue contact area, with tongue touch or tooth occlusion judged from the combination of position and force, and the trigger types comprising tongue touch action, tongue touch position, tongue touch count, tongue touch force, tongue touch duration, tooth occlusion action, tooth occlusion force, tooth occlusion count, tooth occlusion duration and tooth occlusion position;
a processing unit forming a trigger type set including a plurality of trigger types from the received signal;
the matching unit is used for matching the trigger type set with the corresponding instruction in the instruction library;
the execution unit, used when the matching unit reports a successful match, for transmitting the command to an actuator, the command being executed on the operating device docked with the actuator.
The judging unit, before the trigger type set is matched against the instruction library, judges any signal whose tongue touch force or tooth occlusion force exceeds a set threshold to be a valid signal and passes it to the next step; otherwise the signal is treated as invalid.
The above units are implemented by a server or computer acting as the upper computer. The signal transmission part can be implemented wirelessly, and because the sensing device has ample internal space, the control module, transmission module and battery can all be housed inside it.
The upper-computer program implements the various user-operated functions. It can first present the user with a list of function modules, so that the user selects the ones needed and adds them to the system; the modules can also be adjusted as the user's needs change over time. The program's function modules are divided into basic and advanced modules. The basic function module, corresponding mainly to first- and second-level operations, serves users whose condition is relatively critical and is used for disease monitoring or expressing daily needs; the user triggers it, and the system then notifies nursing or medical staff to help the patient. The advanced function module serves users whose condition is under control and who wish to perform other activities, and corresponds to the third-level operations described above. The system can dock with different types of upper computer according to the user's needs, not only an ordinary computer but also devices such as a smartphone or intelligent wheelchair.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (10)

1. A tongue and tooth drive interaction method, the method comprising:
establishing an instruction library;
triggering signals on a nipple-like sensor carrier placed in the oral cavity with the tongue and/or teeth, wherein the trigger types comprise tongue touch action, tongue touch position, tongue touch count, tongue touch force, tongue touch duration, tooth occlusion action, tooth occlusion force, tooth occlusion count, tooth occlusion duration and tooth occlusion position;
receiving a signal generated after triggering, and forming a trigger type set comprising a plurality of trigger types according to the received signal;
matching the trigger type set with corresponding instructions in an instruction library;
if the matching is successful, transmitting the command to an actuator, and executing the command on the operating device docked with the actuator.
2. The method of claim 1, wherein a single command corresponds to a combination of one or more trigger types within a set time period.
3. The method of claim 1, wherein, before the trigger type set is matched with the corresponding instruction in the instruction library, a signal whose tongue contact force or tooth occlusion force exceeds a set threshold is judged a valid signal and passed to the next step, and is otherwise treated as an invalid signal.
4. The method of claim 3, wherein the nipple-like sensor carrier includes a tongue contact region and a tooth occlusion region each having a plurality of functional regions, the tooth occlusion region being contained within the tongue contact region, tongue contact or tooth occlusion being determined from the combination of position and force.
5. The method of claim 4, wherein each functional region is provided with at least one sensor, the sensors are coded according to the functional region in which they are located, and the tongue contact position is determined from the code of the sensor that reported the signal.
6. The method of claim 1, wherein the docked operational device is a smart auxiliary device including a computer, a smartphone, a tablet, and a smart wheelchair.
7. The method of claim 1, wherein establishing the instruction library comprises generating a secondary instruction library from the master instruction library according to the operating device docked with the actuator, the trigger type set being matched against instructions in the secondary instruction library.
8. A tongue and tooth drive interaction system, comprising:
an instruction unit having an instruction library for performing interaction;
the acquisition unit, used for acquiring the signals generated after the tongue and/or teeth trigger signals on a nipple-like sensor carrier placed in the oral cavity, the carrier comprising a tongue contact area and a tooth occlusion area formed from a plurality of functional areas, the tooth occlusion area being contained within the tongue contact area, with tongue touch or tooth occlusion judged from the combination of position and force, and the trigger types comprising tongue touch action, tongue touch position, tongue touch count, tongue touch force, tongue touch duration, tooth occlusion action, tooth occlusion force, tooth occlusion count, tooth occlusion duration and tooth occlusion position;
a processing unit forming a trigger type set including a plurality of trigger types from the received signal;
the matching unit is used for matching the trigger type set with the corresponding instruction in the instruction library;
the execution unit, used when the matching unit reports a successful match, for transmitting the command to an actuator, the command being executed on the operating device docked with the actuator.
9. The system of claim 8, wherein a single command corresponds to a combination of one or more trigger types within a set time period.
10. The system of claim 8, further comprising a judging unit which, before the trigger type set is matched with the corresponding command in the command library, judges a signal whose tongue contact force or tooth occlusion force exceeds a set threshold to be a valid signal for the next step, and otherwise treats it as an invalid signal.
CN202111324177.3A 2021-11-10 2021-11-10 Tongue and tooth driving interaction system and interaction method Pending CN114063777A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111324177.3A CN114063777A (en) 2021-11-10 2021-11-10 Tongue and tooth driving interaction system and interaction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111324177.3A CN114063777A (en) 2021-11-10 2021-11-10 Tongue and tooth driving interaction system and interaction method

Publications (1)

Publication Number Publication Date
CN114063777A true CN114063777A (en) 2022-02-18

Family

ID=80274535

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111324177.3A Pending CN114063777A (en) 2021-11-10 2021-11-10 Tongue and tooth driving interaction system and interaction method

Country Status (1)

Country Link
CN (1) CN114063777A (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005215818A (en) * 2004-01-28 2005-08-11 Takuya Shinkawa Input device and input processing apparatus
KR20080110043A (en) * 2007-06-14 2008-12-18 영 준 장 Signal input apparatus using mouth
CN103702611A (en) * 2011-05-27 2014-04-02 Ccb研究集团有限责任公司 Tongue strength evaluation system and method
US20150290454A1 (en) * 2003-11-26 2015-10-15 Wicab, Inc. Systems and methods for altering brain and body functions and for treating conditions and diseases of the same
US20170188936A1 (en) * 2012-07-18 2017-07-06 Chantal Lau Systems for Monitoring Infant Oral Motor Kinetics During Nutritive and Non-Nutritive Feeding
CN111752394A (en) * 2020-08-18 2020-10-09 长春大学 Non-implanted wearable double-layer electrode tongue touch force feedback control system and device
CN113055778A (en) * 2021-03-23 2021-06-29 深圳市沃特沃德信息有限公司 Earphone interaction method and device based on dental motion state, terminal equipment and medium


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106648114A (en) * 2017-01-12 2017-05-10 长春大学 Interactive model of tongue machine and device
CN106648114B (en) * 2017-01-12 2023-11-14 长春大学 Tongue machine interaction model and device

Similar Documents

Publication Publication Date Title
CN205179107U (en) Intelligence endowment monitor system based on internet +
Stefanov et al. The smart house for older persons and persons with physical disabilities: structure, technology arrangements, and perspectives
US20150320359A1 (en) Wearable mini-size intelligent healthcare system
Das et al. Using smart phones for context-aware prompting in smart environments
Alekya et al. IoT based smart healthcare monitoring systems: A literature review
JP2022501697A (en) Myoelectric potential (EMG) assisted communication device with context-sensitive user interface
Huang et al. An EOG-based wheelchair robotic arm system for assisting patients with severe spinal cord injuries
CN107802253A (en) A kind of wearable device and its method of work for health detection
CN211674200U (en) Intelligent health monitoring system for daily care of old people
AU2022268375B2 (en) Multiple switching electromyography (emg) assistive communications device
CN114063777A (en) Tongue and tooth driving interaction system and interaction method
CN115501481A (en) Emergency program control equipment, medical system and computer readable storage medium
CN112883707B (en) Emergency aid assisting method, system, equipment and storage medium based on man-machine conversation
CN112120859A (en) Intelligent nursing system
Jafar et al. Literature review on assistive devices available for quadriplegic people: Indian context
CN214318355U (en) Intelligent nursing system
Viancy et al. Paralyzed Patient Monitoring Equipment–IoT
Hii et al. Smart phone based patient-centered remote health monitoring application in wireless sensor network
Kataria et al. A Novel IOT-Based Smart Wheelchair Design for Cerebral Palsy Patients
CN108784674A (en) A kind of humanoid robot control system of accompanying and attending to based on sensor monitoring
Janney et al. Design of an automated healthcare system using hand gestures for paraplegic people
Ibrahim et al. Smart Homes for Disabled People: A Review Study
CN105769145A (en) End-of-life monitoring system in non-medical mode
Yeo et al. Evaluation of a low-cost alternative communication device with brain control
Rotariu et al. Assistive technologies to support communication with neuro-motor disabled patients

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination