CN110956701A - Life support system and life support method - Google Patents

Life support system and life support method

Info

Publication number
CN110956701A
CN110956701A
Authority
CN
China
Prior art keywords
control
module
signal
brain
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911019749.XA
Other languages
Chinese (zh)
Inventor
邓宝松
陈鹏飞
王怡静
谢良
闫野
印二威
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center
National Defense Technology Innovation Institute PLA Academy of Military Science
Original Assignee
Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center
National Defense Technology Innovation Institute PLA Academy of Military Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center, National Defense Technology Innovation Institute PLA Academy of Military Science filed Critical Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center
Priority to CN201911019749.XA priority Critical patent/CN110956701A/en
Publication of CN110956701A publication Critical patent/CN110956701A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Dermatology (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a life support system and a life support method. The life support system includes: an augmented reality module, a brain-computer interface module, an intelligent Internet of things platform and a plurality of controlled objects. The augmented reality module displays a multi-level menu in front of the user; the brain-computer interface module acquires the user's bioelectric signals while the user gazes at the multi-level menu, confirms a control instruction according to the bioelectric signals, and sends the control instruction to the intelligent Internet of things platform; the intelligent Internet of things platform receives the control instruction, classifies it, and sends it to the corresponding controlled object, so as to control the controlled object to perform the corresponding operation. In this scheme, on the basis of the intelligent Internet of things platform, technologies such as the brain-computer interface are used to address the needs of disabled and semi-disabled elderly people, giving them corresponding control over daily-living equipment such as wheelchairs, robotic arms and smart home devices, thereby meeting their basic living needs and improving their quality of life.

Description

Life support system and life support method
Technical Field
The application relates to the technical field of the Internet of things, and in particular to a life support system and a life support method.
Background
With the deepening aging of society, the elderly population keeps growing, and disabled and semi-disabled elderly people account for a large proportion of it. This group is large in number and has lost the ability to move freely, and it urgently needs assistive equipment to improve its daily life.
With the development of science and technology, such as the maturing of brain-computer interface technology, robotics and multi-modal perception technology, it has become possible to assist the lives of people with mobility impairments through technology. However, the elderly-care assistive devices and systems currently on the market are simple in structure and single in function, and cannot meet the corresponding needs.
Disclosure of Invention
In view of the above, the present application provides a life support system and a life support method to help people with mobility impairments improve their quality of life.
In a first aspect, an embodiment of the present application provides a life support system, including: an augmented reality module, a brain-computer interface module, an intelligent Internet of things platform and a plurality of controlled objects; the augmented reality module, the brain-computer interface module and the controlled objects are respectively in signal connection with the intelligent Internet of things platform; wherein:
the augmented reality module stores a control model of each controlled object and is used for displaying a multi-level menu in front of the eyes of a user; the multi-level menu comprises a control model selection menu and a command menu corresponding to each control model, and each command menu consists of a plurality of control instructions;
the brain-computer interface module is used for acquiring a bioelectricity signal of a user when the user watches the multilevel menu, confirming a control instruction according to the bioelectricity signal and sending the control instruction to the intelligent Internet of things platform;
and the intelligent Internet of things platform is used for receiving the control command, classifying the control command and then sending the control command to the corresponding controlled object so as to control the controlled object to perform corresponding operation.
In a possible implementation manner, in the life support system provided in an embodiment of the present application, the brain-computer interface module includes: an acquisition unit, a processing unit and an analysis unit which are in signal connection in sequence; wherein:
the acquisition unit is used for acquiring the bioelectricity signals of the user when the user watches the multilevel menu and sending the bioelectricity signals to the processing unit;
the processing unit is used for segmenting the bioelectricity signal by a preset time length, carrying out band-pass filtering after segmentation, and carrying out down-sampling after filtering to obtain a processed signal, and sending the processed signal to the analysis unit;
and the analysis unit is used for analyzing the processed signal by adopting a brain-computer interface paradigm and a preset algorithm to confirm a control instruction, and sending the control instruction to the intelligent Internet of things platform.
In a possible implementation manner, in the life support system provided by an embodiment of the present application, the bioelectric signals include an electroencephalogram (EEG) signal and an electromyogram (EMG) signal; the brain-computer interface paradigm is the steady-state visual evoked potential (SSVEP) paradigm, and the preset algorithm is canonical correlation analysis (CCA).
In a possible implementation manner, in the life support system provided in an embodiment of the present application, the controlled object is a smart home module, a robotic arm module, or a wheelchair module.
In a possible implementation manner, in the life support system provided in an embodiment of the present application, the system further includes: an infrared signal sending module; wherein:
the infrared signal sending module is in signal connection with the intelligent Internet of things platform and used for sending a control instruction to a controlled object in an infrared remote control mode.
In a second aspect, an embodiment of the present application provides a life support method, where the method is based on the system of any one of the above, and the method includes:
the augmented reality module displays a multi-level menu in front of a user; the augmented reality module stores a control model of each controlled object, the multi-level menu comprises a control model selection menu and a command menu corresponding to each control model, and each command menu consists of a plurality of control instructions;
the brain-computer interface module acquires a bioelectricity signal of a user when the user watches the multilevel menu, confirms a control instruction according to the bioelectricity signal and sends the control instruction to the intelligent Internet of things platform;
and the intelligent Internet of things platform receives the control command, classifies the control command and then sends the control command to the corresponding controlled object so as to control the controlled object to perform corresponding operation.
In a possible implementation manner, in the above life support method provided in an embodiment of the present application, the step in which the brain-computer interface module acquires a bioelectrical signal of the user when the user gazes at the multi-level menu, confirms a control instruction according to the bioelectrical signal, and sends the control instruction to the intelligent internet of things platform includes:
collecting a bioelectrical signal of a user while the user gazes at the multi-level menu;
segmenting the bioelectricity signal by a preset time length, carrying out band-pass filtering after segmentation, and carrying out down-sampling after filtering to obtain a processed signal;
and analyzing the processed signal by adopting a brain-computer interface paradigm and a preset algorithm to confirm a control instruction, and sending the control instruction to the intelligent Internet of things platform.
In a possible implementation manner, in the life support method provided by an embodiment of the present application, the bioelectric signals include an electroencephalogram (EEG) signal and an electromyogram (EMG) signal; the brain-computer interface paradigm is the steady-state visual evoked potential (SSVEP) paradigm, and the preset algorithm is canonical correlation analysis (CCA).
In a possible implementation manner, in the life support method provided in an embodiment of the present application, the controlled object is a smart home module, a robotic arm module, or a wheelchair module.
In a possible implementation manner, the life support method provided in an embodiment of the present application further includes:
and the infrared signal sending module sends a control instruction to the controlled object in an infrared remote control mode.
Compared with the prior art, in the life support system and the life support method provided by the application, the life support system comprises an augmented reality module, a brain-computer interface module, an intelligent Internet of things platform and a plurality of controlled objects. The augmented reality module displays a multi-level menu in front of the user; the brain-computer interface module acquires the user's bioelectric signals while the user gazes at the multi-level menu, confirms a control instruction according to the bioelectric signals, and sends the control instruction to the intelligent Internet of things platform; the intelligent Internet of things platform receives the control instruction, classifies it, and sends it to the corresponding controlled object, so as to control the controlled object to perform the corresponding operation. In this scheme, on the basis of the intelligent Internet of things platform, technologies such as the brain-computer interface are used to address the needs of disabled and semi-disabled elderly people, giving them corresponding control over daily-living equipment such as wheelchairs, robotic arms and smart home devices, thereby meeting their basic living needs and improving their quality of life.
Drawings
Fig. 1 is a block diagram illustrating a life support system according to an embodiment of the present disclosure;
fig. 1a illustrates a block diagram of a brain-computer interface module according to an embodiment of the present application;
FIG. 2 shows a schematic workflow diagram of a robotic arm module;
fig. 3 shows a workflow diagram of the smart home module;
FIG. 4 illustrates a workflow diagram of the wheelchair module;
fig. 5 is a flowchart illustrating a life support method provided by an embodiment of the present application;
FIG. 6 illustrates a module selection user interface provided by an embodiment of the present application;
FIG. 7 illustrates a smart home user selection interface provided by an embodiment of the present application;
FIG. 8 illustrates a robotic arm user control interface provided by an embodiment of the present application;
FIG. 9 illustrates a wheelchair self-control user selection interface provided by embodiments of the present application;
fig. 10 illustrates a user confirmation interface provided by an embodiment of the present application.
Detailed Description
The following detailed description of embodiments of the present application is provided in conjunction with the accompanying drawings, but it should be understood that the scope of the present application is not limited to the specific embodiments.
Throughout the specification and claims, unless explicitly stated otherwise, the word "comprise", or variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated element or component but not the exclusion of any other element or component.
In order to better understand the present application, some terms referred to in the present application will be first introduced below.
A Brain-Computer Interface (BCI), sometimes also referred to as a "brain port" or "brain-computer fusion perception", is a direct communication and control channel established between the human brain and a computer or other electronic device. Through it, a person can directly express ideas or operate a device with the brain, without language or movement, which can effectively enhance the ability of severely physically disabled patients to communicate with the outside world or control their external environment, thereby improving their quality of life. Brain-computer interface technology is a cross-disciplinary technology involving neuroscience, signal detection, signal processing, pattern recognition and other fields. As a human-computer interaction means different from the keyboard and the touch screen, it enables a user to control equipment without traditional control modes such as manual operation, and is therefore very suitable for people with mobility impairments.
A Robot Operating System (ROS) can connect multiple devices together, simplifying the user's operation of household appliances. ROS provides operating-system-like functionality for heterogeneous computer clusters, including hardware abstraction, low-level device control, implementation of commonly used functionality, inter-process message passing, and package management. It also provides the tools and library functions needed to obtain, compile, write, and run code across computers. The primary objective of ROS is to provide a unified open-source programming framework for controlling robots in diverse real-world and simulated environments.
Electromyography (EMG) is the superposition, in time and space, of the Motor Unit Action Potentials (MUAPs) of a multitude of muscle fibers. It is used in a manner similar to electroencephalography (EEG).
Steady-State Visual Evoked Potentials (SSVEPs) are the continuous responses of the visual cortex of the human brain to a visual stimulus flickering at a fixed frequency; the response is related to the stimulus frequency (at its fundamental or a multiple thereof).
The robotic arm is an industrial device that has emerged in recent years: a complex system with high precision, multiple inputs and outputs, high nonlinearity and strong coupling, applied mainly on industrial production lines. With the development of the technology, its flexibility and safety keep improving and the "distance" between it and humans keeps shrinking, and it is beginning to show promise in the field of life assistance.
A smart home differs from an ordinary home, in which devices exist in isolation: all devices in the home are connected together through Internet of things technology. Its operation mode also differs from traditional manual input, since remote control can be realized using brain-computer signal input. Moreover, through its various sensors, the home can judge conditions by itself and react accordingly.
Please refer to fig. 1, which shows a structural diagram of a life support system according to an embodiment of the present application. The life support system 10 includes: an augmented reality module 110, a brain-computer interface module 120, an intelligent internet of things platform 130 and a plurality of controlled objects 140; the augmented reality module 110, the brain-computer interface module 120 and the controlled objects 140 are respectively in signal connection with the intelligent internet of things platform 130. The controlled object 140 may be a smart home module, a robotic arm module, or a wheelchair module.
The augmented reality module 110 stores a control model of each controlled object 140, and the augmented reality module 110 is used for displaying a multi-level menu in front of the eyes of the user; the multi-level menu comprises a control model selection menu and a command menu corresponding to each control model, and each command menu is composed of a plurality of control instructions.
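As an illustrative sketch of this menu structure, the two levels can be represented as a nested mapping from control models to command menus; all module names, command names and flicker frequencies below are hypothetical placeholders, not values from the patent:

```python
# Illustrative two-level menu: control models at the first level, command menus
# (command -> SSVEP flicker frequency in Hz) at the second level.
# All names and frequencies are hypothetical placeholders.
MENU = {
    "smart_home":  {"light_on": 9.0, "light_off": 9.2, "tv_on": 9.4},
    "robotic_arm": {"extend": 10.0, "retract": 10.2, "grip": 10.4, "release": 10.6},
    "wheelchair":  {"forward": 11.0, "turn_left": 11.2, "turn_right": 11.4, "stop": 11.6},
}

def commands_for(model: str) -> dict:
    """Return the command menu (command -> stimulus frequency) for a control model."""
    return MENU[model]

print(commands_for("wheelchair"))
```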
The brain-computer interface module 120 is configured to obtain a bioelectrical signal of the user when the user gazes at the multi-level menu, confirm a control instruction according to the bioelectrical signal, and send the control instruction to the intelligent internet of things platform 130. Specifically, in some embodiments of the present application, the bioelectric signals may include an electroencephalogram signal EEG and an electromyogram signal EMG.
In some embodiments of the present application, as shown in fig. 1a, the brain-computer interface module 120 may specifically include: an acquisition unit 121, a processing unit 122 and an analysis unit 123 which are in signal connection in sequence. The acquisition unit 121 is configured to acquire the user's bioelectrical signal while the user gazes at the multi-level menu and send it to the processing unit 122. The processing unit 122 is configured to segment the bioelectrical signal by a preset time length (for example, 1000 ms), perform band-pass filtering after segmentation, and perform down-sampling after filtering, completing the data preprocessing to obtain a processed signal, which is sent to the analysis unit 123. The analysis unit 123 is configured to analyze the processed signal using a brain-computer interface paradigm and a preset algorithm to confirm a control instruction, and to send the control instruction to the intelligent internet of things platform. Specifically, in some embodiments of the present application, the brain-computer interface paradigm may be the steady-state visual evoked potential (SSVEP) paradigm, and the preset algorithm may be canonical correlation analysis (CCA).
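A minimal sketch of this preprocessing chain (segment into 1000 ms epochs, band-pass filter, down-sample) might look as follows; the sampling rate, pass band and decimation factor are assumptions, since the patent does not specify them:

```python
import numpy as np
from scipy.signal import butter, filtfilt, decimate

FS = 1000           # assumed acquisition sampling rate (Hz); not given in the patent
EPOCH_MS = 1000     # segment length from the description
BAND = (7.0, 70.0)  # assumed pass band covering SSVEP fundamentals and harmonics

def preprocess(raw: np.ndarray, fs: int = FS) -> np.ndarray:
    """raw: (n_channels, n_samples) EEG. Returns one filtered, down-sampled epoch."""
    n = int(fs * EPOCH_MS / 1000)
    epoch = raw[:, :n]                                   # 1) segment by preset length
    b, a = butter(4, [f / (fs / 2) for f in BAND], "bandpass")
    filtered = filtfilt(b, a, epoch, axis=1)             # 2) band-pass filter
    return decimate(filtered, 4, axis=1)                 # 3) down-sample (factor 4 assumed)
```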
The intelligent internet of things platform 130 is configured to receive the control instruction, classify it, and send it to the corresponding controlled object, so as to control that controlled object to perform the corresponding operation. On the intelligent internet of things platform, control instructions are classified, processed and accurately delivered to the controlled object, realizing point-to-point control. In practical applications, the intelligent internet of things platform 130 can adopt the ROS platform, which is a framework and interface capable of integrating resources so that they can be shared and reused, increasing the reuse rate of various functions and software.
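On the ROS side, this point-to-point dispatch could be sketched as a node that subscribes to a BCI command topic and republishes each instruction on the topic of its target device; the topic names and the "target:action" message convention are assumptions made for illustration:

```python
#!/usr/bin/env python
# Hypothetical ROS dispatch node; topic and message conventions are illustrative only.
import rospy
from std_msgs.msg import String

TARGETS = ("smart_home", "robotic_arm", "wheelchair")  # assumed controlled objects

def main():
    rospy.init_node("iot_platform_dispatch")
    pubs = {t: rospy.Publisher("/%s/cmd" % t, String, queue_size=10) for t in TARGETS}

    def dispatch(msg):
        # Commands are assumed to arrive as "target:action", e.g. "wheelchair:forward".
        target, action = msg.data.split(":", 1)
        if target in pubs:
            pubs[target].publish(String(data=action))  # point-to-point delivery

    rospy.Subscriber("/bci/command", String, dispatch)
    rospy.spin()

if __name__ == "__main__":
    main()
```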
In this embodiment, based on the intelligent internet of things platform, technologies such as the brain-computer interface and sensors are used to address the needs of disabled and semi-disabled elderly people, and daily-living equipment such as wheelchairs, robotic arms and smart home devices is controlled correspondingly, so that their basic living needs are met and their quality of life is improved.
In some embodiments of the present application, the life support system 10 may further include an infrared signal sending module 150; the infrared signal sending module 150 is in signal connection with the intelligent internet of things platform 130 and is configured to send control instructions to a controlled object that uses an infrared remote control mode. For example, if the smart home module is remotely controlled by infrared, control instructions can be sent to it through the infrared signal sending module. This embodiment helps extend the functions of the life support system and thereby better meet users' needs.
For a better understanding of the present application, the controlled objects are described below taking the smart home module, the robotic arm module and the wheelchair module as examples.
In the embodiment of the present application, the life support system includes five parts: a brain-computer interface module, a smart home module, a robotic arm module, a wheelchair module and an augmented reality module, all of which use the ROS intelligent internet of things platform as the information exchange center.
The brain-computer interface module simultaneously acquires the user's EEG and EMG signals; the collected EMG signals are mainly generated by occlusion of the teeth. The brain-computer interface module is also responsible for amplifying, filtering and down-sampling the acquired signals. Using the multi-level menu, the brain-computer interface module converts the characteristic electroencephalogram signals evoked when the user gazes at the visual stimuli superimposed on the control model selection menu and on the command menu of each controlled object into computer control instructions. The brain-computer interface paradigm adopts the SSVEP data paradigm. The control signal converted from the EMG signal serves, first, as a confirmation signal, i.e., to confirm the selection made by the user; the confirmation is presented as a pop-up window in the control model selection window. Biting the teeth once means "confirm" and biting the teeth twice means "cancel". Second, occluding the teeth three times in succession serves as a forced exit signal, i.e., the current instruction is stopped and the upper menu is returned to. FIG. 6 illustrates a module selection user interface provided by an embodiment of the present application; FIG. 7 illustrates a smart home user selection interface provided by an embodiment of the present application; FIG. 8 illustrates a robotic arm user control interface provided by an embodiment of the present application; FIG. 9 illustrates a wheelchair self-control user selection interface provided by an embodiment of the present application; and fig. 10 illustrates a user confirmation interface provided by an embodiment of the present application.
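The tooth-occlusion logic can be sketched as counting detected bites within a short grouping window; the window length is an assumption, since the patent does not specify timing:

```python
import time

BITE_WINDOW_S = 1.5  # assumed grouping window; the patent does not specify timing

def classify_bites(bite_times):
    """Map a burst of detected tooth occlusions to a UI action.
    bite_times: timestamps (seconds) of EMG bite detections, oldest first."""
    burst = [t for t in bite_times if bite_times[-1] - t <= BITE_WINDOW_S]
    return {1: "confirm", 2: "cancel", 3: "force_exit"}.get(len(burst), "ignore")

# Example: three bites in quick succession force a return to the upper menu.
now = time.time()
print(classify_bites([now - 0.8, now - 0.4, now]))  # -> "force_exit"
```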
Specifically, the brightness of the visual stimuli is modulated by a rectangular wave, and the emitted color is white. At the first level of the menu, the stimulation paradigm includes 3 targets: the smart home module, the robotic arm module and the wheelchair module. 3D models of the three targets are displayed in front of the user's eyes by the Augmented Reality (AR) module. The models of the three targets flicker at frequencies of 10-11 Hz, with a frequency interval of 0.5 Hz.
Two levels of menus are arranged under the smart home module; these two menu levels can provide at most thirty options, with model flicker frequencies of 9-13.5 Hz at a frequency interval of 0.2 Hz. Electroencephalogram acquisition mainly uses 14 channels: P1, PZ, P2, PO7, PO5, PO3, POZ, PO4, PO6, PO8, O1, OZ, O2 and M2. The collected signals are processed and analyzed: they are first segmented into 1000 ms windows, then band-pass filtered, and finally down-sampled to complete the data preprocessing. The signals are analyzed with canonical correlation analysis (CCA), an algorithm for studying the correlation between two sets of variables; acquiring signals with 14 channels gives satisfactory results with this method. Its basic idea is to find weights that maximize the overall correlation coefficient ρ of two sets of variables X and Y. The multichannel SSVEP signal is defined as X_n, and X_n is decomposed into n signals X_nfm, where n is the channel number and X_nfm is the signal at frequency point f_m reconstructed from the original SSVEP electroencephalogram signal by wavelet packet decomposition. A corresponding reference signal matrix is established, and the maximum correlation coefficient ρ_m is then obtained. Wavelet packet decomposition is performed on the SSVEP signal to obtain the reconstructed signal; the maximum correlation coefficient ρ_m between X_n of each channel and Y is calculated with the CCA algorithm, and the frequency point f_m of the SSVEP stimulus signal is then determined by comparison. The control command selected by the user is then found from the one-to-one correspondence between the frequency point f_m and the icon.
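A minimal sketch of the CCA step follows. It uses the standard SSVEP-CCA formulation with sinusoidal reference signals (fundamental plus one harmonic, an assumption) and omits the wavelet packet reconstruction for brevity; the sampling rate in the usage comment is likewise assumed:

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def ssvep_cca(eeg, fs, freqs, n_harmonics=2):
    """eeg: (n_channels, n_samples) epoch; returns the flicker frequency whose
    sine/cosine reference signals have the largest canonical correlation with it."""
    t = np.arange(eeg.shape[1]) / fs
    scores = []
    for f in freqs:
        # Reference matrix Y: sin/cos at the fundamental and harmonics of f.
        Y = np.vstack([fn(2 * np.pi * f * h * t)
                       for h in range(1, n_harmonics + 1)
                       for fn in (np.sin, np.cos)]).T
        x_scores, y_scores = CCA(n_components=1).fit_transform(eeg.T, Y)
        scores.append(np.corrcoef(x_scores[:, 0], y_scores[:, 0])[0, 1])  # rho_m
    return freqs[int(np.argmax(scores))]

# Usage (assumed 250 Hz post-decimation rate): pick among the second-level menu
# frequencies, 9-13.5 Hz at 0.2 Hz intervals, for one preprocessed 14-channel epoch.
# detected = ssvep_cca(epoch, fs=250, freqs=np.arange(9.0, 13.6, 0.2))
```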
In the augmented reality module, the icons are designed with the Unity development tool and displayed on a virtual screen on the augmented reality glasses.
The robotic arm module mainly comprises two parts: a control part and a movable part. The control part includes: a signal receiving device, a control computer, joint motor drives, joint limit switches and pressure sensors. The movable part mainly comprises: a base, a shoulder, an upper arm, a forearm, a wrist, gripping fingers and seven joints connecting these parts, each part being driven by a corresponding built-in direct-current motor. The signal receiving device mainly receives control signals from the ROS intelligent internet of things platform. The control computer is responsible for processing the control signals and outputting the control signals for the corresponding motors. The pressure sensors are responsible for feeding back the position of the robotic arm, so that the computer can adjust it in time. The motion of the robotic arm is simplified into several basic actions, including: extend forward, retract, lift up, lower down, grip and release. The command is selected by acquiring the user's choice using the SSVEP paradigm described above. This embodiment also appropriately reduces the speed of the robotic arm to meet users' needs. In addition, the pressure sensors on the robotic arm module judge the gripping force of the fingers and feed back whether the arm's movement has reached its limit.
Referring to fig. 2, a schematic workflow diagram of the robotic arm module is shown; a code sketch of this loop is given after the list. As shown, the workflow includes:
initializing the system and judging whether a control command is received;
if so, after acquiring the control command sequence, executing the corresponding commands;
judging whether the command sequence has terminated or a joint limit has been reached;
if so, returning to the step of judging whether a control command is received.
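A minimal code sketch of this loop follows; receive_command, execute_step, at_joint_limit and terminated are hypothetical stubs standing in for the signal receiving device, the motor drives, the command state and the joint limit switches:

```python
# Hypothetical control loop mirroring the Fig. 2 workflow; all callables are stubs.
def arm_control_loop(receive_command, execute_step, at_joint_limit, terminated):
    while True:
        cmd_sequence = receive_command()      # block until the platform sends a command
        if not cmd_sequence:
            continue                          # nothing received; keep waiting
        for step in cmd_sequence:             # execute the acquired command sequence
            execute_step(step)
            if terminated() or at_joint_limit():
                break                         # stop; go back to waiting for commands
```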
The smart home module mainly comprises: a signal receiving part, a control computer part, an indoor detection sensor part and an infrared remote control part. The signal receiving part mainly detects whether there is a control signal from the ROS intelligent internet of things platform. The control computer part mainly recognizes the control command and reacts accordingly; when the smart home module receives no related control command, the control computer automatically adjusts the related equipment according to a preset scheme, based on the data transmitted back by the sensors in real time. In the infrared remote control part, the control sequences of common indoor electric appliances (such as lamps, air conditioners and televisions) are arranged in a certain order, ensuring the consistency of the interface specifications; the functions of a traditional infrared household-appliance remote control are built into the module. The indoor detection sensor part includes: a humidity sensor, a temperature sensor, a digital light intensity sensor and a gas sensor;
the humidity sensor detects the indoor humidity in real time; the temperature sensor detects the indoor temperature in real time; the digital light intensity sensor detects the indoor illumination intensity in real time; and the gas sensor detects the concentration of indoor polluted gas in real time.
Please refer to fig. 3, which shows a work flow diagram of the smart home module. As shown, it includes:
initializing the system and judging whether control information is received;
if so, receiving the electroencephalogram signal, which the control computer analyzes and processes; if not, receiving the sensor signals, which the control computer analyzes and processes;
determining that device n is to be controlled, and starting device n;
continuing to judge whether to exit the current process; if so, returning to the step of judging whether control information is received.
The wheelchair module consists of a control computer, a forward-drive motor, a steering motor, motor drives, ultrasonic ranging sensors and an RGBD depth camera. The wheelchair module has two control modes:
first, the corresponding reaction is made entirely at the command of the user. On a wheelchair, a user can sense the surrounding environment and make corresponding reactions according to the needs of the user. The control command is transmitted to the wheelchair through the ROS intelligent Internet of things platform to control the specific movement of the wheelchair; secondly, firstly, a user frequently-used regional map is stored in the control computer, and the driving track of the wheelchair is improved by utilizing a path planning technology. When the user selects the destination, the wheelchair automatically walks according to the automatic map traveling mode. In the driving process, the ultrasonic sensor and the RGBD camera are used for exploring obstacles and realizing obstacle avoidance. During the driving process, if the user needs, the automatic driving can be stopped at any time, and the automatic driving is converted into the first control mode. The self-control of the wheelchair module is the same as that of the wheelchair module, the SSVEP paradigm is utilized to obtain the instruction of the user, and the system drives the wheelchair to move in a continuous signal acquisition mode. And, like the robot arm, its speed must be controlled within a certain range. The automatic control of the wheelchair module relies primarily on path planning techniques. Before use, the map of the use environment is loaded, so the method belongs to the problem of off-line path planning, and the embodiment mainly uses an artificial potential field method. In the aspect of positioning, the embodiment adopts a relative positioning mode, and the positioning is updated under the condition of a given map.
Referring to FIG. 4, a schematic workflow diagram of the wheelchair module is shown. As shown, it includes:
initializing the system and judging whether a control command is received;
if so, judging whether self-control is selected;
if so, receiving control commands and moving according to them until the destination is reached; if not, planning the path automatically until the destination is reached.
In this embodiment, on the basis of the ROS intelligent internet of things platform, technologies such as the brain-computer interface and sensors are used to address the needs of disabled and semi-disabled elderly people, and daily-living equipment such as wheelchairs, robotic arms and smart home devices is controlled correspondingly, so that their basic living needs are met and their quality of life is improved.
As shown in fig. 5, an embodiment of the present application further provides a life support method. The method is based on the system of any of the above embodiments and includes the following steps:
step S101: the augmented reality module displays a multi-level menu in front of a user; the augmented reality module stores a control model of each controlled object, the multi-level menu comprises a control model selection menu and a command menu corresponding to each control model, and each command menu consists of a plurality of control instructions;
step S102: the brain-computer interface module acquires a bioelectricity signal of a user when the user watches the multilevel menu, confirms a control instruction according to the bioelectricity signal and sends the control instruction to the intelligent Internet of things platform;
step S103: and the intelligent Internet of things platform receives the control command, classifies the control command and then sends the control command to the corresponding controlled object so as to control the controlled object to perform corresponding operation.
In a possible implementation manner, in the life support method provided in an embodiment of the present application, the step S102 includes:
collecting a bioelectrical signal of a user while the user gazes at the multi-level menu;
segmenting the bioelectricity signal by a preset time length, carrying out band-pass filtering after segmentation, and carrying out down-sampling after filtering to obtain a processed signal;
and analyzing the processed signal by adopting a brain-computer interface paradigm and a preset algorithm to confirm a control instruction, and sending the control instruction to the intelligent Internet of things platform.
In a possible implementation manner, in the life support method provided by an embodiment of the present application, the bioelectric signals include an electroencephalogram (EEG) signal and an electromyogram (EMG) signal; the brain-computer interface paradigm is the steady-state visual evoked potential (SSVEP) paradigm, and the preset algorithm is canonical correlation analysis (CCA).
In a possible implementation manner, in the life support method provided in an embodiment of the present application, the controlled object is a smart home module, a robotic arm module, or a wheelchair module.
In a possible implementation manner, the life support method provided in an embodiment of the present application further includes:
and the infrared signal sending module sends a control instruction to the controlled object in an infrared remote control mode.
The related descriptions in the system-side embodiments are all applicable to this method embodiment and are not repeated here.
In this embodiment, based on the intelligent internet of things platform, technologies such as the brain-computer interface and sensors are used to address the needs of disabled and semi-disabled elderly people, and daily-living equipment such as wheelchairs, robotic arms and smart home devices is controlled correspondingly, so that their basic living needs are met and their quality of life is improved.
The foregoing descriptions of specific exemplary embodiments of the present application have been presented for purposes of illustration and description. It is not intended to limit the application to the precise form disclosed, and obviously many modifications and variations are possible in light of the above teaching. The exemplary embodiments were chosen and described in order to explain certain principles of the present application and its practical application to enable one skilled in the art to make and use various exemplary embodiments of the present application and various alternatives and modifications thereof. It is intended that the scope of the application be defined by the claims and their equivalents.

Claims (10)

1. A life support system, comprising: an augmented reality module, a brain-computer interface module, an intelligent Internet of things platform and a plurality of controlled objects; the augmented reality module, the brain-computer interface module and the controlled objects are respectively in signal connection with the intelligent Internet of things platform; wherein:
the augmented reality module stores a control model of each controlled object and is used for displaying a multi-level menu in front of the eyes of a user; the multi-level menu comprises a control model selection menu and a command menu corresponding to each control model, wherein each command menu consists of a plurality of control instructions;
the brain-computer interface module is used for acquiring a bioelectricity signal of a user when the user watches the multilevel menu, confirming a control instruction according to the bioelectricity signal and sending the control instruction to the intelligent Internet of things platform;
and the intelligent Internet of things platform is used for receiving the control command, classifying the control command and then sending the control command to the corresponding controlled object so as to control the controlled object to perform corresponding operation.
2. The life support system of claim 1, wherein said brain-computer interface module comprises: an acquisition unit, a processing unit and an analysis unit which are in signal connection in sequence; wherein:
the acquisition unit is used for acquiring the bioelectricity signals of the user when the user watches the multilevel menu and sending the bioelectricity signals to the processing unit;
the processing unit is used for segmenting the bioelectricity signal by a preset time length, carrying out band-pass filtering after segmentation, and carrying out down-sampling after filtering to obtain a processed signal, and sending the processed signal to the analysis unit;
and the analysis unit is used for analyzing the processed signal by adopting a brain-computer interface paradigm and a preset algorithm to confirm a control instruction, and sending the control instruction to the intelligent Internet of things platform.
3. The life support system of claim 2, wherein said bioelectric signals comprise electroencephalogram (EEG) signals and electromyogram (EMG) signals; the brain-computer interface paradigm is the steady-state visual evoked potential (SSVEP) paradigm, and the preset algorithm is canonical correlation analysis (CCA).
4. The life support system of claim 1, wherein the controlled object is a smart home module, a robotic arm module, or a wheelchair module.
5. The life support system of claim 1, wherein said system further comprises: an infrared signal sending module; wherein:
the infrared signal sending module is in signal connection with the intelligent Internet of things platform and used for sending a control instruction to a controlled object in an infrared remote control mode.
6. A life support method, characterized in that the method is based on the system of any one of claims 1 to 5, the method comprising:
the augmented reality module displays a multi-level menu in front of a user; the augmented reality module is internally stored with a control model of each controlled object, the multilevel menu comprises a control model selection menu and a command menu corresponding to each control model, and each command menu consists of a plurality of control instructions;
the brain-computer interface module acquires a bioelectricity signal of a user when the user watches the multilevel menu, confirms a control instruction according to the bioelectricity signal and sends the control instruction to the intelligent Internet of things platform;
and the intelligent Internet of things platform receives the control command, classifies the control command and then sends the control command to the corresponding controlled object so as to control the controlled object to perform corresponding operation.
7. The life support method of claim 6, wherein the step in which the brain-computer interface module acquires a bioelectrical signal of a user when the user watches the multi-level menu, confirms a control command according to the bioelectrical signal, and sends the control command to the intelligent internet of things platform comprises the following steps:
collecting a bioelectrical signal of a user while the user gazes at the multi-level menu;
segmenting the bioelectricity signal by a preset time length, carrying out band-pass filtering after segmentation, and carrying out down-sampling after filtering to obtain a processed signal;
and analyzing the processed signal by adopting a brain-computer interface paradigm and a preset algorithm to confirm a control instruction, and sending the control instruction to the intelligent Internet of things platform.
8. The life support method of claim 7, wherein the bioelectric signals comprise electroencephalogram (EEG) signals and electromyogram (EMG) signals; the brain-computer interface paradigm is the steady-state visual evoked potential (SSVEP) paradigm, and the preset algorithm is canonical correlation analysis (CCA).
9. The life support method of claim 6, wherein the controlled object is a smart home module, a robotic arm module, or a wheelchair module.
10. The life support method of claim 6, further comprising:
and the infrared signal sending module sends a control instruction to the controlled object in an infrared remote control mode.
CN201911019749.XA 2019-10-24 2019-10-24 Life support system and life support method Pending CN110956701A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911019749.XA CN110956701A (en) 2019-10-24 2019-10-24 Life support system and life support method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911019749.XA CN110956701A (en) 2019-10-24 2019-10-24 Life support system and life support method

Publications (1)

Publication Number Publication Date
CN110956701A true CN110956701A (en) 2020-04-03

Family

ID=69976374

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911019749.XA Pending CN110956701A (en) 2019-10-24 2019-10-24 Life support system and life support method

Country Status (1)

Country Link
CN (1) CN110956701A (en)


Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102188311A (en) * 2010-12-09 2011-09-21 南昌大学 Embedded visual navigation control system and method of intelligent wheelchair
CN102631265A (en) * 2012-05-11 2012-08-15 重庆大学 Embedded control system of intelligent wheelchair
CN103607519A (en) * 2013-10-17 2014-02-26 南昌大学 Brain-computer interface-based telephone system for double-upper limb disabled people
CN105022281A (en) * 2015-07-29 2015-11-04 中国电子科技集团公司第十五研究所 Intelligent household control system based on virtual reality
CN105578239A (en) * 2015-12-16 2016-05-11 西安科技大学 Television control system based on brain-computer interface technology
CN106339091A (en) * 2016-08-31 2017-01-18 博睿康科技(常州)股份有限公司 Augmented reality interaction method based on brain-computer interface wearing system
CN106726209A (en) * 2016-11-24 2017-05-31 中国医学科学院生物医学工程研究所 A kind of method for controlling intelligent wheelchair based on brain-computer interface Yu artificial intelligence
CN107479696A (en) * 2017-07-25 2017-12-15 天津大学 Based on P300 normal form virtual reality brain machine interface systems and implementation method
CN108062101A (en) * 2017-12-20 2018-05-22 驭势科技(北京)有限公司 The intelligent personal vehicles and its control method, dispatching method and storage medium
CN107957783A (en) * 2017-12-21 2018-04-24 北京航天测控技术有限公司 A kind of Multimode Intelligent control system and method based on brain electricity with myoelectric information
CN108304068A (en) * 2018-01-30 2018-07-20 山东建筑大学 A kind of upper-limbs rehabilitation training robot control system and method based on brain-computer interface
CN108897418A (en) * 2018-05-15 2018-11-27 天津大学 A kind of wearable brain-machine interface arrangement, man-machine interactive system and method
CN109284004A (en) * 2018-10-29 2019-01-29 中国矿业大学 A kind of intelligent nursing system based on brain-computer interface
CN110134243A (en) * 2019-05-20 2019-08-16 中国医学科学院生物医学工程研究所 A kind of brain control mechanical arm shared control system and its method based on augmented reality
CN110198465A (en) * 2019-05-30 2019-09-03 重庆邮电大学 A kind of television set intelligently remote control system based on brain-computer interface

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111522445A (en) * 2020-04-27 2020-08-11 兰州交通大学 Intelligent control method
CN111571587A (en) * 2020-05-13 2020-08-25 南京邮电大学 Brain-controlled mechanical arm dining assisting system and method
CN111571587B (en) * 2020-05-13 2023-02-24 南京邮电大学 Brain-controlled mechanical arm dining assisting system and method
CN112140113A (en) * 2020-10-12 2020-12-29 北京邮电大学 Robot control system and control method based on brain-computer interface
CN112859628A (en) * 2021-01-19 2021-05-28 华南理工大学 Intelligent home control method based on multi-mode brain-computer interface and augmented reality
US11684301B1 (en) 2022-01-14 2023-06-27 Toyota Motor Engineering & Manufacturing North America, Inc. Methods, systems, and non-transitory computer-readable mediums for SSVEP detection

Similar Documents

Publication Publication Date Title
CN110956701A (en) Life support system and life support method
Tang et al. Towards BCI-actuated smart wheelchair system
Carlson et al. Brain-controlled wheelchairs: a robotic architecture
Chae et al. Toward brain-actuated humanoid robots: asynchronous direct control using an EEG-based BCI
Escolano et al. A telepresence mobile robot controlled with a noninvasive brain–computer interface
Luth et al. Low level control in a semi-autonomous rehabilitation robotic system via a brain-computer interface
Tonin et al. Noninvasive brain–machine interfaces for robotic devices
Valbuena et al. Brain-computer interface for high-level control of rehabilitation robotic systems
CN108983636B (en) Man-machine intelligent symbiotic platform system
CN112017516B (en) Remote vascular intervention operation training system
Kanemura et al. A waypoint-based framework in brain-controlled smart home environments: Brain interfaces, domotics, and robotics integration
Argall Autonomy in rehabilitation robotics: An intersection
Rani et al. Electroencephalogram-based brain controlled robotic wheelchair
Zhao et al. Behavior-based SSVEP hierarchical architecture for telepresence control of humanoid robot to achieve full-body movement
Bien et al. Intention reading is essential in human-friendly interfaces for the elderly and the handicapped
Gillini et al. An assistive shared control architecture for a robotic arm using eeg-based bci with motor imagery
CN111399652A (en) Multi-robot hybrid system based on layered SSVEP and visual assistance
Wang et al. Brain-controlled wheelchair review: From wet electrode to dry electrode, from single modal to hybrid modal, from synchronous to asynchronous
Li et al. CVT-based asynchronous BCI for brain-controlled robot navigation
Ashok High-level hands-free control of wheelchair–a review
CN113156861A (en) Intelligent wheelchair control system
Liu et al. A novel brain-controlled wheelchair combined with computer vision and augmented reality
Laniel et al. Adding navigation, artificial audition and vital sign monitoring capabilities to a telepresence mobile robot for remote home care applications
Ababneh et al. Gesture controlled mobile robotic arm for elderly and wheelchair people assistance using kinect sensor
US11687074B2 (en) Method for controlling moving body based on collaboration between the moving body and human, and apparatus for controlling the moving body thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200403