WO2018113392A1 - Brain-computer interface based robotic arm autonomous assistance system and method - Google Patents

Brain-computer interface based robotic arm autonomous assistance system and method

Info

Publication number
WO2018113392A1
WO2018113392A1 · PCT/CN2017/105622
Authority
WO
WIPO (PCT)
Prior art keywords
user
robot arm
cup
brain
module
Prior art date
Application number
PCT/CN2017/105622
Other languages
English (en)
French (fr)
Inventor
张智军
黄永前
李远清
Original Assignee
华南理工大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华南理工大学 (South China University of Technology)
Priority to US16/471,505 (published as US11602300B2)
Publication of WO2018113392A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/369Electroencephalography [EEG]
    • A61B5/375Electroencephalography [EEG] using biofeedback
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/087Controls for manipulators by means of sensing devices, e.g. viewing or touching devices for sensing other physical parameters, e.g. electrical or chemical properties
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J3/00Manipulators of master-slave type, i.e. both controlling unit and controlled unit perform corresponding spatial movements
    • B25J3/04Manipulators of master-slave type, i.e. both controlling unit and controlled unit perform corresponding spatial movements involving servo mechanisms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1682Dual arm manipulator; Coordination of several manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1689Teleoperation
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/4097Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by using design data to control NC machines, e.g. CAD/CAM
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W80/00Wireless network protocols or protocol adaptations to wireless operation
    • H04W80/06Transport layer protocols, e.g. TCP [Transport Control Protocol] over wireless
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/35Nc in input of data, input till input file format
    • G05B2219/35444Gesture interface, controlled machine observes operator, executes commands
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39451Augmented reality for robot programming
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40413Robot has multisensors surrounding operator, to understand intention of operator
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/011Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Definitions

  • The invention relates to the field of brain-computer interface applications, and in particular to a brain-computer interface based robotic arm autonomous assistance system and method.
  • BCI: Brain-Computer Interface
  • BCI systems can be divided into invasive and non-invasive types: invasive systems implant electrodes inside the skull, whereas non-invasive systems only collect scalp EEG signals. Because a non-invasive brain-computer interface requires no surgery, it is safer and simpler than an invasive one, and with continuing advances in signal-processing methods the processing of the scalp electroencephalogram (EEG) has reached a level that makes practical applications of brain-computer interfaces feasible. The invention adopts non-invasive brain-computer interface technology.
  • The above-mentioned patents only perform simple, or even preset, robotic arm motion control from the EEG signal and do not fully exploit the characteristics and advantages of combining a brain-computer interface with autonomous robotic arm control. A brain-computer interface based robotic arm autonomous assistance system and method can combine the advantages of both, making better use of the brain-computer interface to improve the quality of life of paralyzed patients and their ability to live independently.
  • In view of the above deficiencies of the prior art, the object of the present invention is to provide a brain-computer interface based robotic arm autonomous assistance system.
  • Another object of the present invention is to provide a brain-computer interface based robotic arm autonomous assistance method.
  • A brain-computer interface based robotic arm autonomous assistance system, built as a three-layer structure consisting of a sensing layer, a decision layer and an execution layer. The sensing layer comprises an EEG acquisition and detection module and a visual recognition and positioning module: the EEG acquisition and detection module collects EEG signals and analyzes them to identify the user's intention, and the visual recognition and positioning module identifies and locates the corresponding cup and the position of the user's mouth according to that intention.
  • The execution layer comprises a robotic arm control module, the carrier that physically assists the user, which performs trajectory planning and control of the robotic arm according to the execution instructions received from the decision module.
  • The decision layer comprises a decision module that connects the EEG acquisition and detection module, the visual recognition and positioning module and the robotic arm control module, implementing the acquisition and transmission of the EEG signals, the located positions and the arm state, as well as the sending of robotic arm execution instructions.
  • The EEG acquisition and detection module comprises an electrode cap for EEG acquisition, an EEG acquisition device and a first computer. Ten channels of the electrode cap are used, "A1", "T5", "P3", "PZ", "P4", "T6", "O1", "Oz", "O2" and "A2", placed according to the international standard 10-20 system. The first computer detects the P300 signal and presents the flashing visual stimuli of the function keys on the screen; the flashing function keys are regularly arranged on the screen in a 2*2 grid, comprising the "cup1", "cup2", "cup3" and "back" keys, which flash between black and green in random order at intervals of 200 ms.
  • The visual recognition and positioning module comprises two Microsoft Kinect vision sensors and a second computer. The two Kinect sensors are placed in front of the cups to be grasped and in front of the user, respectively, to identify and locate the cup to be grasped and the user's mouth; the second computer implements the cup contour detection algorithm, the cup positioning algorithm, the template matching recognition algorithm and the mouth recognition and positioning algorithm.
  • The decision module is based on the TCP communication protocol; it defines unified transmission data variables, including the user's EEG intention and the position information of the cup and mouth, and sets up a client/server service code framework, implementing the acquisition and transmission of the EEG intention, the located positions and the arm state data, as well as the sending of robotic arm execution instructions.
  • The robotic arm control module uses a multi-degree-of-freedom robotic arm as the actuator.
  • A brain-computer interface based robotic arm autonomous assistance method, comprising the following steps:
  • The screen of the first computer enters the flashing-stimulus function key interface, which includes the four function keys "cup1", "cup2", "cup3" and "back";
  • The visual recognition and positioning module identifies and locates the position of the corresponding cup and of the user's mouth according to the EEG intention from step 4), and sends the position information of the user-selected cup and of the user's mouth to the decision module using the TCP communication protocol;
  • The decision module generates the corresponding robotic arm execution instruction from the position information of the cup and the user's mouth obtained in step 5) and the EEG intention obtained in step 4), and sends it to the robotic arm control module;
  • The robotic arm control module plans a trajectory according to the execution instruction and, following the planned trajectory, controls the robotic arm to grasp the user-selected cup and deliver it to the user's mouth;
  • The decision module generates the corresponding robotic arm execution instruction according to the cup-return EEG intention obtained in step 8) and sends it to the robotic arm control module;
  • The robotic arm control module plans a trajectory according to the execution instruction and, following the planned trajectory, controls the robotic arm to put the user-selected cup back in its original position and return to its initial pose, thereby realizing the function of autonomously assisting the user to drink water.
  • The selection of a function key in steps 4) and 8) is implemented by the following process: the user gazes at a function key in the function key interface of the first computer; the EEG signal is acquired, amplified, filtered and analog-to-digital converted via the electrode cap and the EEG acquisition device, and the data are then transmitted to the first computer for P300 detection, after which the function key is selected.
  • The P300 detection is implemented by the following steps:
  • (1) The EEG signal is band-pass filtered between 0.1 and 20 Hz for denoising;
  • In step 5), identifying and locating the position of the corresponding cup is implemented by the following steps:
  • (2) The horizontal plane extracted in step (1) is removed, and objects are extracted and segmented from the remaining three-dimensional point cloud;
  • (3) A template matching algorithm is used to match the color image corresponding to each object point-cloud set obtained in step (2) against the preset images in the library, identifying the point-cloud set corresponding to the cup selected by the user;
  • (4) The mean of the point-cloud set corresponding to the selected cup obtained in step (3) is computed, locating the cup in the coordinate system of the Microsoft Kinect vision sensor, and the result is converted into the robotic arm coordinate system.
  • Identifying and locating the position of the user's mouth is implemented as follows: the software development kit provided with the Microsoft Kinect vision sensor is used for human-body detection, the coordinates of the user's mouth in the Kinect coordinate system are acquired, and they are converted into the robotic arm coordinate system.
  • The trajectory planning and control of the robotic arm are implemented as follows: the preset key trajectory points are combined with the coordinates of the user's mouth and of the selected cup in the robotic arm coordinate system to plan the running trajectory of the arm; by calling the corresponding API of the robotic arm, the arm is controlled to run along the planned trajectory, so that it autonomously assists the user to drink water.
  • Compared with the prior art, the present invention has the following advantages and beneficial effects:
  • The invention combines P300-based brain-computer interface technology with robotic arm assistance technology having autonomous control and decision functions; the user only needs to provide the EEG intention, and the remaining robotic arm motion control is planned and executed automatically by the system, which places little burden on the user and is easy to apply.
  • The invention combines visual recognition and positioning technology, the brain-computer interface and the robotic arm, so that the beverage selected by the user may be placed anywhere within a certain range.
  • The invention combines visual recognition and positioning technology, the brain-computer interface and the robotic arm, bringing the advantages of all three into play; through the brain-computer interface of the system the user can independently select the beverage to drink.
  • The robotic arm and the visual recognition and positioning technology of the system then locate, grasp and deliver the user-selected beverage to the user's mouth, providing paralyzed patients with the convenience of drinking independently, improving their quality of life and enhancing their ability to live on their own.
  • FIG. 1 is a structural block diagram of a robotic arm autonomous auxiliary system based on a brain-computer interface of the present invention.
  • FIG. 2 is a flow chart of a robotic arm autonomous auxiliary system based on a brain-computer interface of the present invention.
  • FIG. 3 is a flow chart of identifying and locating the position of the corresponding cup according to the present invention.
  • FIG. 4 is a schematic view of the trajectory planning and control of the robot arm of the present invention.
  • This embodiment provides a brain-computer interface based robotic arm autonomous assistance system.
  • The system is built as a three-layer structure consisting of a sensing layer, a decision layer and an execution layer; the sensing layer includes an EEG acquisition and detection module and a visual recognition and positioning module.
  • The EEG acquisition and detection module collects EEG signals and analyzes them to identify the user's intention; the visual recognition and positioning module identifies and locates the corresponding cup and the position of the user's mouth according to that intention.
  • The execution layer includes a robotic arm control module, the carrier that physically assists the user, which performs trajectory planning and control of the robotic arm according to the execution instructions received from the decision module.
  • The decision layer includes a decision module that connects the EEG acquisition and detection module, the visual recognition and positioning module and the robotic arm control module, implementing the acquisition and transmission of data such as the EEG signals, the located positions and the arm state, as well as the sending of robotic arm execution instructions.
  • The EEG acquisition and detection module comprises an electrode cap for EEG acquisition, an EEG acquisition device and a first computer. Ten channels of the electrode cap are used, "A1", "T5", "P3", "PZ", "P4", "T6", "O1", "Oz", "O2" and "A2", placed according to the international standard 10-20 system. The first computer detects the P300 signal and presents the flashing visual stimuli of the function keys on the screen; the flashing function keys are regularly arranged on the computer screen in a 2*2 grid, comprising the "cup1", "cup2", "cup3" and "back" keys, which flash between black and green in random order at intervals of 200 ms.
  • The visual recognition and positioning module comprises two Microsoft Kinect vision sensors and a second computer. The two Kinect sensors are placed in front of the cups to be grasped and in front of the user, respectively, to identify and locate the cup to be grasped and the user's mouth; the second computer implements the cup contour detection algorithm, the cup positioning algorithm, the template matching recognition algorithm and the mouth recognition and positioning algorithm.
  • The decision module is based on the TCP communication protocol; it defines unified transmission data variables, including the user's EEG intention and the position information of the cup and mouth, and sets up a client/server service code framework, implementing the acquisition and transmission of the EEG intention, the located positions and the arm state data, as well as the sending of robotic arm execution instructions.
  • the robot arm control module adopts a multi-degree-of-freedom robot arm as an actuator.
  • This embodiment provides a brain-computer interface based robotic arm autonomous assistance method. As shown in FIG. 2, the method includes the following steps:
  • The screen of the first computer enters the flashing-stimulus function key interface, which includes the four function keys "cup1", "cup2", "cup3" and "back";
  • The visual recognition and positioning module identifies and locates the position of the corresponding cup and of the user's mouth according to the EEG intention from step 4), and sends the position information of the user-selected cup and of the user's mouth to the decision module using the TCP communication protocol;
  • The decision module generates the corresponding robotic arm execution instruction from the position information of the cup and the user's mouth obtained in step 5) and the EEG intention obtained in step 4), and sends it to the robotic arm control module;
  • The robotic arm control module plans a trajectory according to the execution instruction and, following the planned trajectory, controls the robotic arm to grasp the user-selected cup and deliver it to the user's mouth;
  • The decision module generates the corresponding robotic arm execution instruction according to the cup-return EEG intention obtained in step 8) and sends it to the robotic arm control module;
  • The robotic arm control module plans a trajectory according to the execution instruction and, following the planned trajectory, controls the robotic arm to put the user-selected cup back in its original position and return to its initial pose, thereby realizing the function of autonomously assisting the user to drink water.
  • The selection of a function key in steps 4) and 8) is implemented by the following process: the user gazes at a function key in the function key interface of the first computer; the EEG signal is acquired, amplified, filtered and analog-to-digital converted via the electrode cap and the EEG acquisition device, and the data are then transmitted to the first computer for P300 detection, after which the function key is selected.
  • The P300 detection is implemented by the following steps:
  • (1) The EEG signal is band-pass filtered between 0.1 and 20 Hz for denoising;
  • In step 5), identifying and locating the position of the corresponding cup is implemented by the following steps:
  • (2) The horizontal plane extracted in step (1) is removed, and objects are extracted and segmented from the remaining three-dimensional point cloud;
  • (3) A template matching algorithm is used to match the color image corresponding to each object point-cloud set obtained in step (2) against the preset images in the library, identifying the point-cloud set corresponding to the cup selected by the user;
  • (4) The mean of the point-cloud set corresponding to the selected cup obtained in step (3) is computed, locating the cup in the coordinate system of the Microsoft Kinect vision sensor, and the result is converted into the robotic arm coordinate system.
  • In step 5), identifying and locating the position of the user's mouth is implemented as follows: the software development kit provided with the Microsoft Kinect vision sensor is used for human-body detection, and the coordinates of the user's mouth in the Kinect coordinate system are acquired and converted into the robotic arm coordinate system.
  • In steps 7) and 10), as shown in FIG. 4, the trajectory planning and control of the robotic arm are implemented by the following process: the preset key trajectory points are combined with the coordinate points of the user's mouth and of the selected cup in the robotic arm coordinate system to plan the running trajectory of the arm; by calling the corresponding API of the robotic arm, the arm is controlled to run along the planned trajectory, so that it autonomously assists the user to drink water.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Neurosurgery (AREA)
  • Neurology (AREA)
  • Dermatology (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
  • Rehabilitation Tools (AREA)
  • User Interface Of Digital Computer (AREA)
  • Manipulator (AREA)

Abstract

A brain-computer interface based robotic arm autonomous assistance system and method. The system comprises a sensing layer, a decision layer and an execution layer. The sensing layer comprises an EEG acquisition and detection module and a visual recognition and positioning module, used to analyze and identify the user's intention and to identify and locate the corresponding cup and the position of the user's mouth according to that intention. The execution layer comprises a robotic arm control module that performs trajectory planning and control of the robotic arm according to execution instructions received from the decision module. The decision layer comprises a decision module that connects the EEG acquisition and detection module, the visual recognition and positioning module and the robotic arm control module, implementing the acquisition and transmission of the EEG signals, the located positions and the robotic arm state data, as well as the sending of robotic arm execution instructions. By combining visual recognition and positioning technology, a brain-computer interface and a robotic arm, the system provides paralyzed patients with the convenience of drinking water independently and improves their quality of life.

Description

Brain-computer interface based robotic arm autonomous assistance system and method
TECHNICAL FIELD
The present invention relates to the field of brain-computer interface application research, and in particular to a brain-computer interface based robotic arm autonomous assistance system and method.
BACKGROUND
There are many severely paralyzed patients in the world who can only complete activities essential to daily life, such as drinking water, with the help of others. With the continuous development of artificial intelligence and robotics, more and more research results are being applied to assist such people in order to improve their quality of life. Among these fields, the brain-computer interface (Brain Computer Interface, BCI), as a branch of neural engineering, has developed rapidly and has broad prospects, and has stirred a wave of research interest in brain-computer interfaces.
A brain-computer interface (BCI) is a new human-computer interaction technology that enables direct communication between the human brain and a computer without using the brain's normal output pathways (peripheral nerves and muscle tissue), providing paralyzed patients with a new way of exchanging information with, and exerting control over, the outside world. BCI systems can be divided into invasive and non-invasive types: invasive systems implant electrodes inside the skull, whereas non-invasive systems only collect scalp EEG signals. Since a non-invasive brain-computer interface requires no surgery, it is safer and simpler than an invasive one, and with the continuous progress of signal-processing methods and techniques the processing of the scalp electroencephalogram (EEG) has reached a level that makes it possible to bring brain-computer interfaces into practical everyday use. The present invention adopts non-invasive brain-computer interface technology.
At present, some existing studies have attempted to combine brain-computer interface technology with robotics. The Chinese patent application with publication number CN102198660A, entitled "Robotic arm control system based on a brain-machine interface and motion command control scheme", uses a motor-imagery based brain-computer interface to implement eight commands for a robotic arm: up, down, left, right, forward, backward, and finger grasp and release. The Chinese patent application with publication number CN102309365A, entitled "A wearable brain-controlled intelligent prosthesis", realizes wearable detection and computation for EEG detection and recognition and, combined with intelligent sensing technology, achieves precise and adaptive intelligent control of a prosthesis, improving the efficiency and precision of prosthetic movements and approximating the function of the human hand. The Chinese invention patent with publication number CN105425963A, entitled "A system for controlling a robotic arm with brain waves", uses brain-wave signals to obtain attention and relaxation parameters in order to complete preset robotic arm movements.
The above patents only accomplish simple, or even preset, robotic arm motion control from the EEG signal, and do not fully exploit the characteristics and advantages of combining the brain-computer interface with autonomous robotic arm control. A brain-computer interface based robotic arm autonomous assistance system and method can combine the advantages of both the brain-computer interface and the robotic arm, making better use of the brain-computer interface to improve the quality of life of paralyzed patients and enhance their ability to live independently.
SUMMARY OF THE INVENTION
In view of the above deficiencies of the prior art, the object of the present invention is to provide a brain-computer interface based robotic arm autonomous assistance system.
Another object of the present invention is to provide a brain-computer interface based robotic arm autonomous assistance method.
The object of the present invention can be achieved by the following technical solution:
A brain-computer interface based robotic arm autonomous assistance system, built as a three-layer structure consisting of a sensing layer, a decision layer and an execution layer. The sensing layer includes an EEG acquisition and detection module and a visual recognition and positioning module; the EEG acquisition and detection module is used to collect EEG signals and analyze them to identify the user's intention, and the visual recognition and positioning module is used to identify and locate the corresponding cup and the position of the user's mouth according to the user's intention. The execution layer includes a robotic arm control module, which is the carrier that physically assists the user and which performs trajectory planning and control of the robotic arm according to the execution instructions received from the decision module. The decision layer includes a decision module, which connects the EEG acquisition and detection module, the visual recognition and positioning module and the robotic arm control module, and implements the acquisition and transmission of data such as the EEG signals, the located positions and the robotic arm state, as well as the sending of robotic arm execution instructions.
Preferably, the EEG acquisition and detection module includes an electrode cap for EEG acquisition, an EEG acquisition device and a first computer. Ten channels of the electrode cap are used, "A1", "T5", "P3", "PZ", "P4", "T6", "O1", "Oz", "O2" and "A2", placed according to the international standard 10-20 system. The first computer is used to implement P300 detection and the flashing visual stimulation of the function keys on the screen; the flashing function keys are regularly arranged on the computer screen in a 2*2 grid, comprising the "cup1", "cup2", "cup3" and "back" function keys, and flash between black and green in random order at intervals of 200 ms.
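As a concrete illustration of the stimulus timing described above, the following Python sketch schedules one round of flashes for the four keys in random order at 200 ms intervals; the on_flash callback and the plain print-based demonstration are illustrative stand-ins for the actual GUI drawing and EEG trigger logging performed by the first computer.

```python
import random
import time

KEYS = ["cup1", "cup2", "cup3", "back"]
FLASH_INTERVAL_S = 0.2   # 200 ms between flash onsets

def run_stimulus_round(on_flash):
    """Flash every key once, in random order; on_flash(key, t) is called at each
    onset so the EEG recorder can timestamp the event for P300 epoching."""
    order = random.sample(KEYS, len(KEYS))
    for key in order:
        on_flash(key, time.time())    # e.g. draw the key in green, log a trigger
        time.sleep(FLASH_INTERVAL_S)  # the key reverts to black before the next flash
    return order

if __name__ == "__main__":
    # Print the schedule of one round instead of drawing anything.
    run_stimulus_round(lambda key, t: print(f"{t:.3f}  flash {key}"))
```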
Preferably, the visual recognition and positioning module includes two Microsoft Kinect vision sensors and a second computer. The two Microsoft Kinect vision sensors are placed in front of the cups to be grasped and in front of the user, respectively, and are used to identify and locate the cup to be grasped and the user's mouth. The second computer is used to implement the cup contour detection algorithm, the cup positioning algorithm, the template matching recognition algorithm and the mouth recognition and positioning algorithm.
Preferably, the decision module is based on the TCP communication protocol; by defining unified transmission data variables, including the user's EEG intention and the position information of the cup and the mouth, and by setting up a client/server service code framework, it implements the acquisition and transmission of the EEG intention, the located positions and the robotic arm state data, as well as the sending of robotic arm execution instructions.
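The patent does not specify a wire format for these unified transmission variables, so the sketch below is only one plausible realization: a length-prefixed JSON message carried over TCP, with field names (eeg_intent, cup_xyz, mouth_xyz, arm_state) chosen for illustration.

```python
import json
import socket
import struct

def pack_message(eeg_intent, cup_xyz=None, mouth_xyz=None, arm_state=None):
    """Serialize one unified message: intent, cup/mouth positions, arm state."""
    payload = json.dumps({
        "eeg_intent": eeg_intent,   # "cup1", "cup2", "cup3" or "back"
        "cup_xyz": cup_xyz,         # cup position in the arm frame (metres)
        "mouth_xyz": mouth_xyz,     # mouth position in the arm frame (metres)
        "arm_state": arm_state,     # e.g. "idle", "moving", "holding_cup"
    }).encode("utf-8")
    return struct.pack("!I", len(payload)) + payload  # 4-byte length prefix

def recv_exact(conn, n):
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed")
        buf += chunk
    return buf

def recv_message(conn):
    (length,) = struct.unpack("!I", recv_exact(conn, 4))
    return json.loads(recv_exact(conn, length).decode("utf-8"))

def send_to_decision_module(host, port, **fields):
    """Client side: e.g. the vision module reporting positions for the chosen cup."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall(pack_message(**fields))
```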
Preferably, the robotic arm control module uses a multi-degree-of-freedom robotic arm as the actuator.
The other object of the present invention can be achieved by the following technical solution:
A brain-computer interface based robotic arm autonomous assistance method, comprising the following steps:
1) The user sits in front of the screen of the first computer, adjusts his or her position, puts on the electrode cap for EEG acquisition, switches on the EEG acquisition device and the first computer, and confirms that the signal acquisition is in good condition;
2) The brain-computer interface based robotic arm autonomous assistance system is started; it is confirmed that the Microsoft Kinect vision sensor that identifies and locates the user's mouth can correctly capture the user's mouth, and that the three preset cups to be grasped are correctly placed within the field of view of the Microsoft Kinect vision sensor used to identify and locate the cups to be grasped;
3) The screen of the first computer enters the flashing-stimulus function key interface, which includes the four function keys "cup1", "cup2", "cup3" and "back";
4) The user gazes at one of the three function keys "cup1", "cup2" or "cup3", thereby selecting one of the three preset cups; once the function key is selected, the user's EEG intention regarding the cup choice is obtained and sent to the visual recognition and positioning module and the decision module;
5) The visual recognition and positioning module identifies and locates the position of the corresponding cup and of the user's mouth according to the EEG intention from step 4), and sends the position information of the user-selected cup and of the user's mouth to the decision module using the TCP communication protocol;
6) The decision module generates the corresponding robotic arm execution instruction from the position information of the cup and the user's mouth obtained in step 5) and the EEG intention obtained in step 4), and sends it to the robotic arm control module;
7) The robotic arm control module performs trajectory planning according to the robotic arm execution instruction and, following the planned trajectory, controls the robotic arm to grasp the user-selected cup and deliver it to the user's mouth;
8) After drinking, the user gazes at the "back" function key; once the key is selected, the user's EEG intention to return the cup is obtained and sent to the decision module;
9) The decision module generates the corresponding robotic arm execution instruction according to the cup-return EEG intention obtained in step 8) and sends it to the robotic arm control module;
10) The robotic arm control module performs trajectory planning according to the robotic arm execution instruction and, following the planned trajectory, controls the robotic arm to put the user-selected cup back in its original position and return to its initial pose, thereby realizing the function of the robotic arm autonomously assisting the user to drink water.
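The ten steps above form one fetch-and-return cycle. As a minimal sketch of how the modules could be chained together in software, the following Python fragment walks through steps 3) to 10); every object and method name in it (bci, vision, decision, arm, detect_p300_choice and so on) is a hypothetical stand-in for a module described in the text, not an interface defined by the patent.

```python
def drinking_assist_cycle(bci, vision, decision, arm):
    """One cycle of steps 3)-10): select a cup, fetch it, wait for 'back', return it."""
    choice = bci.detect_p300_choice()            # step 4): "cup1" / "cup2" / "cup3"
    cup_xyz = vision.locate_cup(choice)          # step 5): cup position (arm frame)
    mouth_xyz = vision.locate_mouth()            # step 5): mouth position (arm frame)
    command = decision.make_fetch_command(choice, cup_xyz, mouth_xyz)  # step 6)
    arm.plan_and_execute(command)                # step 7): grasp cup, bring to mouth

    while bci.detect_p300_choice() != "back":    # step 8): wait for the "back" key
        pass
    command = decision.make_return_command(choice)                     # step 9)
    arm.plan_and_execute(command)                # step 10): return cup, go home
```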
Preferably, the selection of a function key in steps 4) and 8) is implemented by the following process: the user gazes at a function key in the function key interface of the first computer; the EEG signal is acquired, amplified, filtered and analog-to-digital converted via the electrode cap and the EEG acquisition device, and the data are then transmitted to the first computer for P300 detection, after which the function key is selected. The P300 detection is implemented by the following steps:
(1) The EEG signal is band-pass filtered between 0.1 and 20 Hz for denoising;
(2) Using the EEG signal amplitudes as features, the data in the 600 ms time window following each flash of a P300 function key are extracted, and a Bayesian model is used for state classification, thereby achieving P300 detection.
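A minimal Python sketch of these two steps is given below. The patent only specifies the 0.1-20 Hz band, the 600 ms post-flash window and the use of a Bayesian model; the sampling rate, the filter order and the choice of scikit-learn's Gaussian naive Bayes classifier as the Bayesian model are assumptions made for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.naive_bayes import GaussianNB

FS = 250          # assumed EEG sampling rate in Hz (not stated in the patent)
WINDOW_S = 0.6    # 600 ms window after each key flash

def bandpass_0p1_20(eeg):
    """Step (1): 0.1-20 Hz band-pass denoising; eeg has shape (channels, samples)."""
    b, a = butter(4, [0.1, 20.0], btype="bandpass", fs=FS)
    return filtfilt(b, a, eeg, axis=1)

def epoch_after_flash(eeg, flash_sample):
    """Step (2), first half: cut the 600 ms window following one flash onset."""
    n = int(WINDOW_S * FS)
    return eeg[:, flash_sample:flash_sample + n]

def make_feature(epoch):
    """Use the EEG amplitudes of the window directly as the feature vector."""
    return epoch.reshape(-1)

def train_p300_model(epochs, labels):
    """Step (2), second half: fit a (naive) Bayesian classifier on labelled epochs,
    where label 1 means the flashed key was the gazed-at target (P300 present)."""
    X = np.array([make_feature(e) for e in epochs])
    return GaussianNB().fit(X, labels)

def pick_key(model, key_epochs):
    """key_epochs maps each key to its filtered 600 ms epoch; return the key whose
    epoch the model considers most likely to contain a P300."""
    scores = {k: model.predict_proba(make_feature(e)[None, :])[0, 1]
              for k, e in key_epochs.items()}
    return max(scores, key=scores.get)
```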
Preferably, in step 5), identifying and locating the position of the corresponding cup is implemented by the following steps:
(1) Using a region-growing algorithm, the horizontal plane on which the cups are placed is extracted from the three-dimensional point cloud of the Microsoft Kinect vision sensor;
(2) The horizontal plane extracted in step (1) is removed, and objects are extracted and segmented from the remaining three-dimensional point cloud;
(3) A template matching algorithm is used to match the color image corresponding to each object point-cloud set obtained in step (2) against the preset images in the library, identifying the point-cloud set corresponding to the cup selected by the user;
(4) The mean of the point-cloud set corresponding to the selected cup obtained in step (3) is computed, locating the cup in the coordinate system of the Microsoft Kinect vision sensor, and the result is converted into the robotic arm coordinate system.
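The sketch below illustrates this pipeline with Open3D, under several substitutions that should be noted: RANSAC plane segmentation stands in for the region-growing plane extraction, DBSCAN clustering stands in for the unspecified object segmentation, the template-matching step (3) is delegated to a caller-supplied pick_cluster callback, and T_CAM_TO_ARM is a placeholder for a real hand-eye calibration.

```python
import numpy as np
import open3d as o3d

# Placeholder 4x4 homogeneous transform from the Kinect camera frame to the
# robot arm base frame; a real system would obtain it from extrinsic calibration.
T_CAM_TO_ARM = np.eye(4)

def locate_cup(pcd: o3d.geometry.PointCloud, pick_cluster) -> np.ndarray:
    """Steps (1)-(4): remove the table plane, segment objects, let pick_cluster
    (the template-matching stage) choose the cluster of the selected cup, and
    return the cup centroid expressed in the robot arm coordinate system."""
    # (1) extract the horizontal plane the cups stand on (RANSAC here, not region growing)
    _, plane_idx = pcd.segment_plane(distance_threshold=0.01,
                                     ransac_n=3, num_iterations=1000)
    # (2) remove the plane and segment the remaining points into object clusters
    objects = pcd.select_by_index(plane_idx, invert=True)
    labels = np.asarray(objects.cluster_dbscan(eps=0.02, min_points=50))
    points = np.asarray(objects.points)
    clusters = [points[labels == i] for i in range(labels.max() + 1)]
    # (3) the caller matches each cluster's colour image against the template library
    cup_points = clusters[pick_cluster(clusters)]
    # (4) mean of the cup's points = position in the Kinect frame; convert to arm frame
    centroid_cam = np.append(cup_points.mean(axis=0), 1.0)
    return (T_CAM_TO_ARM @ centroid_cam)[:3]
```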
Preferably, in step 5), identifying and locating the position of the user's mouth is implemented by the following steps: the software development kit provided with the Microsoft Kinect vision sensor is used for human-body detection, the coordinates of the user's mouth in the coordinate system of the Microsoft Kinect vision sensor are acquired, and they are converted into the robotic arm coordinate system.
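The conversion into the robotic arm coordinate system is an ordinary rigid-body transform. The short sketch below assumes the mouth point has already been read from the Kinect body-tracking SDK and that an extrinsic calibration (rotation R and translation t of the camera in the arm frame) is available; the calibration values in the usage comment are placeholders.

```python
import numpy as np

def camera_to_arm_transform(R_cam_in_arm, t_cam_in_arm):
    """Build the 4x4 homogeneous transform from calibrated rotation and translation."""
    T = np.eye(4)
    T[:3, :3] = R_cam_in_arm
    T[:3, 3] = t_cam_in_arm
    return T

def mouth_in_arm_frame(mouth_xyz_cam, T_cam_to_arm):
    """Convert the mouth point reported by the Kinect SDK (camera frame, metres)
    into the robot arm base frame."""
    return (T_cam_to_arm @ np.append(mouth_xyz_cam, 1.0))[:3]

# Example with placeholder calibration values:
# T = camera_to_arm_transform(np.eye(3), [0.30, 0.00, 0.45])
# mouth_arm = mouth_in_arm_frame([0.02, -0.15, 0.80], T)
```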
Preferably, in steps 7) and 10), the trajectory planning and control of the robotic arm are implemented by the following process: the preset key trajectory points are combined with the coordinate points of the user's mouth and of the selected cup in the robotic arm coordinate system to plan the running trajectory of the robotic arm; by calling the corresponding API of the robotic arm, the arm is controlled to run along the planned trajectory, realizing the function of the robotic arm autonomously assisting the user to drink water.
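The following sketch shows one way the preset key waypoints could be combined with the measured cup and mouth positions for the fetch phase of step 7); the waypoint values are placeholders, and arm.move_to / arm.close_gripper are hypothetical names standing in for whatever motion API the particular robotic arm provides.

```python
HOME = [0.20, 0.00, 0.40]             # preset key waypoint in the arm frame (metres)
APPROACH_OFFSET = [0.00, 0.00, 0.10]  # approach the cup from 10 cm above

def fetch_trajectory(cup_xyz, mouth_xyz):
    """Combine the preset waypoints with the located cup and mouth positions."""
    above_cup = [c + o for c, o in zip(cup_xyz, APPROACH_OFFSET)]
    return [HOME, above_cup, list(cup_xyz),   # reach down and grasp the cup
            above_cup, list(mouth_xyz)]       # lift it and bring it to the mouth

def execute(arm, waypoints, grasp_index=2):
    """Drive the arm through the waypoints, closing the gripper at the cup pose."""
    for i, point in enumerate(waypoints):
        arm.move_to(point)                    # call the arm vendor's motion API
        if i == grasp_index:
            arm.close_gripper()
```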
Compared with the prior art, the present invention has the following advantages and beneficial effects:
1. The present invention combines P300-based brain-computer interface technology with robotic arm assistance technology having autonomous control and decision functions; the user only needs to provide the EEG intention, and the remaining robotic arm motion control is planned and executed automatically by the system, which places little burden on the user and is easy to apply.
2. The present invention combines visual recognition and positioning technology, the brain-computer interface and the robotic arm, so that the beverage selected by the user may be placed anywhere within a certain range.
3. The present invention combines visual recognition and positioning technology, the brain-computer interface and the robotic arm, bringing the advantages of all three into play: through the brain-computer interface of the system the user can independently select the beverage to drink; the robotic arm together with the visual recognition and positioning technology then locates, grasps and delivers the selected beverage to the user's mouth, providing paralyzed patients with the convenience of drinking independently, improving their quality of life and enhancing their ability to live on their own.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a structural block diagram of the brain-computer interface based robotic arm autonomous assistance system of the present invention.
FIG. 2 is a flow chart of the brain-computer interface based robotic arm autonomous assistance system of the present invention.
FIG. 3 is a flow chart of identifying and locating the position of the corresponding cup according to the present invention.
FIG. 4 is a schematic diagram of the trajectory planning and control of the robotic arm of the present invention.
DETAILED DESCRIPTION OF THE EMBODIMENTS
The present invention is described in further detail below with reference to the embodiments and the accompanying drawings, but the embodiments of the present invention are not limited thereto.
Embodiment 1:
As shown in FIG. 1, this embodiment provides a brain-computer interface based robotic arm autonomous assistance system. The system is built as a three-layer structure consisting of a sensing layer, a decision layer and an execution layer. The sensing layer includes an EEG acquisition and detection module and a visual recognition and positioning module; the EEG acquisition and detection module is used to collect EEG signals and analyze them to identify the user's intention, and the visual recognition and positioning module is used to identify and locate the corresponding cup and the position of the user's mouth according to the user's intention. The execution layer includes a robotic arm control module, which is the carrier that physically assists the user and which performs trajectory planning and control of the robotic arm according to the execution instructions received from the decision module. The decision layer includes a decision module, which connects the EEG acquisition and detection module, the visual recognition and positioning module and the robotic arm control module, and implements the acquisition and transmission of data such as the EEG signals, the located positions and the robotic arm state, as well as the sending of robotic arm execution instructions.
Specifically, the EEG acquisition and detection module includes an electrode cap for EEG acquisition, an EEG acquisition device and a first computer. Ten channels of the electrode cap are used, "A1", "T5", "P3", "PZ", "P4", "T6", "O1", "Oz", "O2" and "A2", placed according to the international standard 10-20 system. The first computer is used to implement P300 detection and the flashing visual stimulation of the function keys on the screen; the flashing function keys are regularly arranged on the computer screen in a 2*2 grid, comprising the "cup1", "cup2", "cup3" and "back" function keys, and flash between black and green in random order at intervals of 200 ms.
Specifically, the visual recognition and positioning module includes two Microsoft Kinect vision sensors and a second computer. The two Microsoft Kinect vision sensors are placed in front of the cups to be grasped and in front of the user, respectively, and are used to identify and locate the cup to be grasped and the user's mouth. The second computer is used to implement the cup contour detection algorithm, the cup positioning algorithm, the template matching recognition algorithm and the mouth recognition and positioning algorithm.
Specifically, the decision module is based on the TCP communication protocol; by defining unified transmission data variables, including the user's EEG intention and the position information of the cup and the mouth, and by setting up a client/server service code framework, it implements the acquisition and transmission of the EEG intention, the located positions and the robotic arm state data, as well as the sending of robotic arm execution instructions.
Specifically, the robotic arm control module uses a multi-degree-of-freedom robotic arm as the actuator.
Embodiment 2:
This embodiment provides a brain-computer interface based robotic arm autonomous assistance method. As shown in FIG. 2, the method includes the following steps:
1) The user sits in front of the screen of the first computer, adjusts his or her position, puts on the electrode cap for EEG acquisition, switches on the EEG acquisition device and the first computer, and confirms that the signal acquisition is in good condition;
2) The brain-computer interface based robotic arm autonomous assistance system is started; it is confirmed that the Microsoft Kinect vision sensor that identifies and locates the user's mouth can correctly capture the user's mouth, and that the three preset cups to be grasped are correctly placed within the field of view of the Microsoft Kinect vision sensor used to identify and locate the cups to be grasped;
3) The screen of the first computer enters the flashing-stimulus function key interface, which includes the four function keys "cup1", "cup2", "cup3" and "back";
4) The user gazes at one of the three function keys "cup1", "cup2" or "cup3", thereby selecting one of the three preset cups; once the function key is selected, the user's EEG intention regarding the cup choice is obtained and sent to the visual recognition and positioning module and the decision module;
5) The visual recognition and positioning module identifies and locates the position of the corresponding cup and of the user's mouth according to the EEG intention from step 4), and sends the position information of the user-selected cup and of the user's mouth to the decision module using the TCP communication protocol;
6) The decision module generates the corresponding robotic arm execution instruction from the position information of the cup and the user's mouth obtained in step 5) and the EEG intention obtained in step 4), and sends it to the robotic arm control module;
7) The robotic arm control module performs trajectory planning according to the robotic arm execution instruction and, following the planned trajectory, controls the robotic arm to grasp the user-selected cup and deliver it to the user's mouth;
8) After drinking, the user gazes at the "back" function key; once the key is selected, the user's EEG intention to return the cup is obtained and sent to the decision module;
9) The decision module generates the corresponding robotic arm execution instruction according to the cup-return EEG intention obtained in step 8) and sends it to the robotic arm control module;
10) The robotic arm control module performs trajectory planning according to the robotic arm execution instruction and, following the planned trajectory, controls the robotic arm to put the user-selected cup back in its original position and return to its initial pose, thereby realizing the function of the robotic arm autonomously assisting the user to drink water.
Here, the selection of a function key in steps 4) and 8) is implemented by the following process: the user gazes at a function key in the function key interface of the first computer; the EEG signal is acquired, amplified, filtered and analog-to-digital converted via the electrode cap and the EEG acquisition device, and the data are then transmitted to the first computer for P300 detection, after which the function key is selected. The P300 detection is implemented by the following steps:
(1) The EEG signal is band-pass filtered between 0.1 and 20 Hz for denoising;
(2) Using the EEG signal amplitudes as features, the data in the 600 ms time window following each flash of a P300 function key are extracted, and a Bayesian model is used for state classification, thereby achieving P300 detection.
Here, in step 5), as shown in FIG. 3, identifying and locating the position of the corresponding cup is implemented by the following steps:
(1) Using a region-growing algorithm, the horizontal plane on which the cups are placed is extracted from the three-dimensional point cloud of the Microsoft Kinect vision sensor;
(2) The horizontal plane extracted in step (1) is removed, and objects are extracted and segmented from the remaining three-dimensional point cloud;
(3) A template matching algorithm is used to match the color image corresponding to each object point-cloud set obtained in step (2) against the preset images in the library, identifying the point-cloud set corresponding to the cup selected by the user;
(4) The mean of the point-cloud set corresponding to the selected cup obtained in step (3) is computed, locating the cup in the coordinate system of the Microsoft Kinect vision sensor, and the result is converted into the robotic arm coordinate system.
Here, in step 5), identifying and locating the position of the user's mouth is implemented by the following steps: the software development kit provided with the Microsoft Kinect vision sensor is used for human-body detection, the coordinates of the user's mouth in the coordinate system of the Microsoft Kinect vision sensor are acquired, and they are converted into the robotic arm coordinate system.
Here, in steps 7) and 10), as shown in FIG. 4, the trajectory planning and control of the robotic arm are implemented by the following process: the preset key trajectory points are combined with the coordinate points of the user's mouth and of the selected cup in the robotic arm coordinate system to plan the running trajectory of the robotic arm; by calling the corresponding API of the robotic arm, the arm is controlled to run along the planned trajectory, realizing the function of the robotic arm autonomously assisting the user to drink water.
The above is merely a preferred embodiment of the present invention, but the scope of protection of the present invention is not limited thereto. Any equivalent substitution or modification made, within the scope disclosed by the present invention, by a person skilled in the art according to the technical solution and the inventive concept of the present invention falls within the scope of protection of the present invention.

Claims (10)

  1. A brain-computer interface based robotic arm autonomous assistance system, characterized in that: the system is built as a three-layer structure consisting of a sensing layer, a decision layer and an execution layer; the sensing layer includes an EEG acquisition and detection module and a visual recognition and positioning module, the EEG acquisition and detection module being used to collect EEG signals and analyze them to identify the user's intention, and the visual recognition and positioning module being used to identify and locate the corresponding cup and the position of the user's mouth according to the user's intention; the execution layer includes a robotic arm control module, which is the carrier that physically assists the user and which performs trajectory planning and control of the robotic arm according to the execution instructions received from the decision module; the decision layer includes a decision module, which connects the EEG acquisition and detection module, the visual recognition and positioning module and the robotic arm control module, and implements the acquisition and transmission of the EEG signal, located-position and robotic arm state data, as well as the sending of robotic arm execution instructions.
  2. The brain-computer interface based robotic arm autonomous assistance system according to claim 1, characterized in that: the EEG acquisition and detection module includes an electrode cap for EEG acquisition, an EEG acquisition device and a first computer; ten channels of the electrode cap are used, "A1", "T5", "P3", "PZ", "P4", "T6", "O1", "Oz", "O2" and "A2", placed according to the international standard 10-20 system; the first computer is used to implement P300 detection and the flashing visual stimulation of the function keys on the screen; the flashing function keys are regularly arranged on the computer screen in a 2*2 grid, comprising the "cup1", "cup2", "cup3" and "back" function keys, and flash between black and green in random order at intervals of 200 ms.
  3. The brain-computer interface based robotic arm autonomous assistance system according to claim 1, characterized in that: the visual recognition and positioning module includes two Microsoft Kinect vision sensors and a second computer; the two Microsoft Kinect vision sensors are placed in front of the cups to be grasped and in front of the user, respectively, and are used to identify and locate the cup to be grasped and the user's mouth; the second computer is used to implement the cup contour detection algorithm, the cup positioning algorithm, the template matching recognition algorithm and the mouth recognition and positioning algorithm.
  4. The brain-computer interface based robotic arm autonomous assistance system according to claim 1, characterized in that: the decision module is based on the TCP communication protocol; by defining unified transmission data variables, including the user's EEG intention and the position information of the cup and the mouth, and by setting up a client/server service code framework, it implements the acquisition and transmission of the EEG intention, located-position and robotic arm state data, as well as the sending of robotic arm execution instructions.
  5. The brain-computer interface based robotic arm autonomous assistance system according to claim 1, characterized in that: the robotic arm control module uses a multi-degree-of-freedom robotic arm as the actuator.
  6. A brain-computer interface based robotic arm autonomous assistance method, characterized in that the method comprises the following steps:
    1) the user sits in front of the screen of the first computer, adjusts his or her position, puts on the electrode cap for EEG acquisition, switches on the EEG acquisition device and the first computer, and confirms that the signal acquisition is in good condition;
    2) the brain-computer interface based robotic arm autonomous assistance system is started; it is confirmed that the Microsoft Kinect vision sensor that identifies and locates the user's mouth can correctly capture the user's mouth, and that the three preset cups to be grasped are correctly placed within the field of view of the Microsoft Kinect vision sensor used to identify and locate the cups to be grasped;
    3) the screen of the first computer enters the flashing-stimulus function key interface, which includes the four function keys "cup1", "cup2", "cup3" and "back";
    4) the user gazes at one of the three function keys "cup1", "cup2" or "cup3", thereby selecting one of the three preset cups; once the function key is selected, the user's EEG intention regarding the cup choice is obtained and sent to the visual recognition and positioning module and the decision module;
    5) the visual recognition and positioning module identifies and locates the position of the corresponding cup and of the user's mouth according to the EEG intention from step 4), and sends the position information of the user-selected cup and of the user's mouth to the decision module using the TCP communication protocol;
    6) the decision module generates the corresponding robotic arm execution instruction from the position information of the cup and the user's mouth obtained in step 5) and the EEG intention obtained in step 4), and sends it to the robotic arm control module;
    7) the robotic arm control module performs trajectory planning according to the robotic arm execution instruction and, following the planned trajectory, controls the robotic arm to grasp the user-selected cup and deliver it to the user's mouth;
    8) after drinking, the user gazes at the "back" function key; once the key is selected, the user's EEG intention to return the cup is obtained and sent to the decision module;
    9) the decision module generates the corresponding robotic arm execution instruction according to the cup-return EEG intention obtained in step 8) and sends it to the robotic arm control module;
    10) the robotic arm control module performs trajectory planning according to the robotic arm execution instruction and, following the planned trajectory, controls the robotic arm to put the user-selected cup back in its original position and return to its initial pose, thereby realizing the function of the robotic arm autonomously assisting the user to drink water.
  7. The brain-computer interface based robotic arm autonomous assistance method according to claim 6, characterized in that: the selection of a function key in steps 4) and 8) is implemented by the following process: the user gazes at a function key in the function key interface of the first computer; the EEG signal is acquired, amplified, filtered and analog-to-digital converted via the electrode cap and the EEG acquisition device, and the data are then transmitted to the first computer for P300 detection, after which the function key is selected; the P300 detection is implemented by the following steps:
    (1) the EEG signal is band-pass filtered between 0.1 and 20 Hz for denoising;
    (2) using the EEG signal amplitudes as features, the data in the 600 ms time window following each flash of a P300 function key are extracted, and a Bayesian model is used for state classification, thereby achieving P300 detection.
  8. The brain-computer interface based robotic arm autonomous assistance method according to claim 6, characterized in that in step 5) identifying and locating the position of the corresponding cup is implemented by the following steps:
    (1) using a region-growing algorithm, the horizontal plane on which the cups are placed is extracted from the three-dimensional point cloud of the Microsoft Kinect vision sensor;
    (2) the horizontal plane extracted in step (1) is removed, and objects are extracted and segmented from the remaining three-dimensional point cloud;
    (3) a template matching algorithm is used to match the color image corresponding to each object point-cloud set obtained in step (2) against the preset images in the library, identifying the point-cloud set corresponding to the cup selected by the user;
    (4) the mean of the point-cloud set corresponding to the selected cup obtained in step (3) is computed, locating the cup in the coordinate system of the Microsoft Kinect vision sensor, and the result is converted into the robotic arm coordinate system.
  9. The brain-computer interface based robotic arm autonomous assistance method according to claim 6, characterized in that in step 5) identifying and locating the position of the user's mouth is implemented by the following steps: the software development kit provided with the Microsoft Kinect vision sensor is used for human-body detection, the coordinates of the user's mouth in the coordinate system of the Microsoft Kinect vision sensor are acquired, and they are converted into the robotic arm coordinate system.
  10. The brain-computer interface based robotic arm autonomous assistance method according to claim 6, characterized in that in steps 7) and 10) the trajectory planning and control of the robotic arm are implemented by the following process: the preset key trajectory points are combined with the coordinate points of the user's mouth and of the selected cup in the robotic arm coordinate system to plan the running trajectory of the robotic arm; by calling the corresponding API of the robotic arm, the arm is controlled to run along the planned trajectory, realizing the function of the robotic arm autonomously assisting the user to drink water.
PCT/CN2017/105622 2016-12-20 2017-10-11 Brain-computer interface based robotic arm autonomous assistance system and method WO2018113392A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/471,505 US11602300B2 (en) 2016-12-20 2017-10-11 Brain-computer interface based robotic arm self-assisting system and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611180477.8 2016-12-20
CN201611180477.8A CN106671084B (zh) Brain-computer interface based robotic arm autonomous assistance method

Publications (1)

Publication Number Publication Date
WO2018113392A1 true WO2018113392A1 (zh) 2018-06-28

Family

ID=58869723

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/105622 WO2018113392A1 (zh) 2016-12-20 2017-10-11 一种基于脑机接口的机械臂自主辅助系统及方法

Country Status (3)

Country Link
US (1) US11602300B2 (zh)
CN (1) CN106671084B (zh)
WO (1) WO2018113392A1 (zh)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111571619A (zh) * 2020-04-17 2020-08-25 上海大学 一种基于ssvep脑控机械臂抓取的生活辅助系统与方法
CN112631173A (zh) * 2020-12-11 2021-04-09 中国人民解放军国防科技大学 脑控无人平台协同控制系统
CN113545783A (zh) * 2021-07-02 2021-10-26 徐州市健康研究院有限公司 一种脑电助餐装置
CN113778113A (zh) * 2021-08-20 2021-12-10 北京科技大学 一种基于多模态生理信号的飞行员辅助驾驶方法及系统
CN114161414A (zh) * 2021-12-03 2022-03-11 中国科学院沈阳自动化研究所 一种基于脑电和视觉的水下机械手控制系统及方法
CN114833825A (zh) * 2022-04-19 2022-08-02 深圳市大族机器人有限公司 协作机器人控制方法、装置、计算机设备和存储介质

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106671084B (zh) * 2016-12-20 2019-11-15 华南理工大学 一种基于脑机接口的机械臂自主辅助方法
CN107553491A (zh) * 2017-09-15 2018-01-09 华南理工大学 一种脑控轮椅机械臂
CN107595505A (zh) * 2017-09-15 2018-01-19 华南理工大学 一种电动轮椅机械臂装置
TWI672207B (zh) * 2017-11-03 2019-09-21 財團法人工業技術研究院 機械設備之定位系統及其方法
CN107885124B (zh) * 2017-11-21 2020-03-24 中国运载火箭技术研究院 一种增强现实环境中的脑眼协同控制方法及系统
CN108814894A (zh) * 2018-04-12 2018-11-16 山东大学 基于视觉人体位姿检测的上肢康复机器人系统及使用方法
CN108646915B (zh) * 2018-05-03 2020-12-15 东南大学 结合三维视线跟踪和脑机接口控制机械臂抓取物体的方法和系统
CN108942938B (zh) * 2018-07-27 2020-10-16 齐鲁工业大学 一种基于脑电波信号的四轴机器臂运动控制方法及系统
CN109223441A (zh) * 2018-09-13 2019-01-18 华南理工大学 一种基于Kinect传感器的人体上肢康复训练及动作辅助系统
CN109366508A (zh) * 2018-09-25 2019-02-22 中国医学科学院生物医学工程研究所 一种基于bci的高级机械臂控制系统及其实现方法
CN109034322B (zh) * 2018-10-18 2024-04-16 昆明昆船逻根机场物流系统有限公司 一种行李远程视觉补码系统及方法
CN109605385B (zh) * 2018-11-28 2020-12-11 东南大学 一种混合脑机接口驱动的康复辅助机器人
CN110134243A (zh) * 2019-05-20 2019-08-16 中国医学科学院生物医学工程研究所 一种基于增强现实的脑控机械臂共享控制系统及其方法
CN110480637B (zh) * 2019-08-12 2020-10-20 浙江大学 一种基于Kinect传感器的机械臂零件图像识别抓取方法
CN110575332B (zh) * 2019-08-29 2024-09-03 江苏大学 基于近红外主动立体视觉和脑电波技术的护理床及方法
CN110673721B (zh) * 2019-08-29 2023-07-21 江苏大学 基于视觉与意念信号协同控制的机器人看护系统
CN111309453A (zh) * 2020-02-13 2020-06-19 佛山智能装备技术研究院 分布式部署的智能机器人系统
CN111571587B (zh) * 2020-05-13 2023-02-24 南京邮电大学 一种脑控机械臂助餐系统及方法
CN112545533B (zh) * 2020-12-03 2023-03-21 中国船舶工业系统工程研究院 基于多传感器与脑电磁波复合信号的人体域网通信方法
CN112936259B (zh) * 2021-01-26 2023-06-20 中国科学院沈阳自动化研究所 一种适用于水下机器人的人机协作方法
CN113146618B (zh) * 2021-03-16 2022-07-01 深兰科技(上海)有限公司 机械臂的控制方法、系统、电子设备及存储介质
CN113274032A (zh) * 2021-04-29 2021-08-20 上海大学 一种基于ssvep+mi脑机接口的脑卒中康复训练系统及方法
CN113332101B (zh) * 2021-06-11 2023-08-01 上海羿生医疗科技有限公司 基于脑机接口的康复训练装置的控制方法和装置
CN113253750B (zh) * 2021-06-28 2021-11-05 北京科技大学 一种面向扑翼飞行器的多模态控制系统
CN113500611A (zh) * 2021-07-22 2021-10-15 常州大学 一种基于脑电和视觉导引的喂饭机器人系统
CN113805694A (zh) * 2021-08-26 2021-12-17 上海大学 一种基于脑机接口与计算机视觉的辅助抓取系统及方法
CN114037738A (zh) * 2021-11-18 2022-02-11 湖北三闾智能科技有限公司 一种人类视觉驱动的上肢辅助机器人控制方法
CN114131635B (zh) * 2021-12-08 2024-07-12 山东大学 融合视触觉主动感知的多自由度辅助抓握外肢体机器人系统
CN117718962A (zh) * 2023-12-21 2024-03-19 太原理工大学 一种面向多任务的脑控复合机器人控制系统与方法
CN117873330B (zh) * 2024-03-11 2024-05-17 河海大学 一种脑电-眼动混合遥操作机器人控制方法、系统及装置

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0861475A1 (en) * 1995-11-14 1998-09-02 Coard Technology doing business as Westrex Virtual motion programming and control
CN102198660A (zh) * 2011-05-04 2011-09-28 上海海事大学 基于脑-机接口的机械手臂控制系统及动作命令控制方案
CN105078450A (zh) * 2015-08-24 2015-11-25 华南理工大学 一种可实现脑电检测的健康服务机器人
CN204971277U (zh) * 2015-08-24 2016-01-20 华南理工大学 一种可实现脑电检测的健康服务机器人
CN105425963A (zh) * 2015-11-30 2016-03-23 北京航空航天大学 一种脑电波控制机械臂的系统
US9327396B2 (en) * 2012-02-15 2016-05-03 Samsung Electronics Co., Ltd. Tele-operation system and control method thereof
CN106671084A (zh) * 2016-12-20 2017-05-17 华南理工大学 一种基于脑机接口的机械臂自主辅助系统及方法

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5343871A (en) * 1992-03-13 1994-09-06 Mindscope Incorporated Method and apparatus for biofeedback
US6129748A (en) * 1996-03-22 2000-10-10 Kamei; Tsutomu Apparatus for applying pulsed light to the forehead of a user
US20030181961A1 (en) * 1995-03-23 2003-09-25 Tsutomu Kamei Method of noninvasively enhancing immunosurveillance capacity and apparatus for applying pulsed light to at least a portion of a user's temporal region
US6682182B2 (en) * 2002-04-10 2004-01-27 Eastman Kodak Company Continuous ink jet printing with improved drop formation
US10589087B2 (en) * 2003-11-26 2020-03-17 Wicab, Inc. Systems and methods for altering brain and body functions and for treating conditions and diseases of the same
JP5113520B2 (ja) * 2004-08-25 2013-01-09 モトリカ リミテッド 脳の可塑性による運動訓練
US20070239063A1 (en) * 2006-02-01 2007-10-11 Kristina Narfstrom Portable electroretinograph with automated, flexible software
US9511877B2 (en) * 2006-08-09 2016-12-06 Angela Masson Electronic kit bag
CN1927551A (zh) * 2006-09-30 2007-03-14 电子科技大学 一种视导脑声控的残障辅助机器人
WO2008131553A1 (en) * 2007-04-30 2008-11-06 Alain Anadi Martel Light modulation device and system
US8784293B2 (en) * 2008-10-07 2014-07-22 Advanced Brain Monitoring, Inc. Systems and methods for optimization of sleep and post-sleep performance
US20100094156A1 (en) * 2008-10-13 2010-04-15 Collura Thomas F System and Method for Biofeedback Administration
US8483816B1 (en) * 2010-02-03 2013-07-09 Hrl Laboratories, Llc Systems, methods, and apparatus for neuro-robotic tracking point selection
US9445739B1 (en) * 2010-02-03 2016-09-20 Hrl Laboratories, Llc Systems, methods, and apparatus for neuro-robotic goal selection
WO2011140303A1 (en) * 2010-05-05 2011-11-10 University Of Maryland, College Park Time domain-based methods for noninvasive brain-machine interfaces
WO2014186739A1 (en) * 2013-05-17 2014-11-20 Macri Vincent J System and method for pre-movement and action training and control
US10195058B2 (en) * 2013-05-13 2019-02-05 The Johns Hopkins University Hybrid augmented reality multimodal operation neural integration environment
US9389685B1 (en) * 2013-07-08 2016-07-12 University Of South Florida Vision based brain-computer interface systems for performing activities of daily living
CN105578954B (zh) * 2013-09-25 2019-03-29 迈恩德玛泽控股股份有限公司 生理参数测量和反馈系统
US9405366B2 (en) * 2013-10-02 2016-08-02 David Lee SEGAL Systems and methods for using imagined directions to define an action, function or execution for non-tactile devices
FR3026638B1 (fr) * 2014-10-07 2016-12-16 Yannick Vaillant Ensemble environnement et interface de stimulation tactile de guidage sur une trajectoire dans l'environnement
EP3064130A1 (en) * 2015-03-02 2016-09-07 MindMaze SA Brain activity measurement and feedback system
US10299692B2 (en) * 2015-05-13 2019-05-28 Ep Solutions, S.A. Systems, components, devices and methods for cardiac mapping using numerical reconstruction of cardiac action potentials
CN105710885B (zh) * 2016-04-06 2017-08-11 济南大学 服务型移动机械手系统
CN106020470B (zh) * 2016-05-18 2019-05-14 华南理工大学 基于脑机接口的自适应家居环境控制装置及其控制方法
CN106074021B (zh) * 2016-06-08 2018-02-02 山东建筑大学 基于脑机接口的智能轮椅系统及其动作方法
US11478603B2 (en) * 2017-12-31 2022-10-25 Neuroenhancement Lab, LLC Method and apparatus for neuroenhancement to enhance emotional response
CN109366508A (zh) * 2018-09-25 2019-02-22 中国医学科学院生物医学工程研究所 一种基于bci的高级机械臂控制系统及其实现方法
CN109605385B (zh) * 2018-11-28 2020-12-11 东南大学 一种混合脑机接口驱动的康复辅助机器人
CN109992113B (zh) * 2019-04-09 2020-05-15 燕山大学 一种基于多场景诱发的mi-bci系统及其控制方法

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0861475A1 (en) * 1995-11-14 1998-09-02 Coard Technology doing business as Westrex Virtual motion programming and control
CN102198660A (zh) * 2011-05-04 2011-09-28 上海海事大学 基于脑-机接口的机械手臂控制系统及动作命令控制方案
US9327396B2 (en) * 2012-02-15 2016-05-03 Samsung Electronics Co., Ltd. Tele-operation system and control method thereof
CN105078450A (zh) * 2015-08-24 2015-11-25 华南理工大学 一种可实现脑电检测的健康服务机器人
CN204971277U (zh) * 2015-08-24 2016-01-20 华南理工大学 一种可实现脑电检测的健康服务机器人
CN105425963A (zh) * 2015-11-30 2016-03-23 北京航空航天大学 一种脑电波控制机械臂的系统
CN106671084A (zh) * 2016-12-20 2017-05-17 华南理工大学 一种基于脑机接口的机械臂自主辅助系统及方法

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111571619A (zh) * 2020-04-17 2020-08-25 上海大学 一种基于ssvep脑控机械臂抓取的生活辅助系统与方法
CN112631173A (zh) * 2020-12-11 2021-04-09 中国人民解放军国防科技大学 脑控无人平台协同控制系统
CN113545783A (zh) * 2021-07-02 2021-10-26 徐州市健康研究院有限公司 一种脑电助餐装置
CN113778113A (zh) * 2021-08-20 2021-12-10 北京科技大学 一种基于多模态生理信号的飞行员辅助驾驶方法及系统
CN113778113B (zh) * 2021-08-20 2024-03-26 北京科技大学 一种基于多模态生理信号的飞行员辅助驾驶方法及系统
CN114161414A (zh) * 2021-12-03 2022-03-11 中国科学院沈阳自动化研究所 一种基于脑电和视觉的水下机械手控制系统及方法
CN114161414B (zh) * 2021-12-03 2023-09-19 中国科学院沈阳自动化研究所 一种基于脑电和视觉的水下机械手控制系统及方法
CN114833825A (zh) * 2022-04-19 2022-08-02 深圳市大族机器人有限公司 协作机器人控制方法、装置、计算机设备和存储介质

Also Published As

Publication number Publication date
US11602300B2 (en) 2023-03-14
US20190387995A1 (en) 2019-12-26
CN106671084B (zh) 2019-11-15
CN106671084A (zh) 2017-05-17

Similar Documents

Publication Publication Date Title
WO2018113392A1 (zh) 一种基于脑机接口的机械臂自主辅助系统及方法
Fang et al. A multichannel surface EMG system for hand motion recognition
Chen et al. Combination of augmented reality based brain-computer interface and computer vision for high-level control of a robotic arm
CN109366508A (zh) 一种基于bci的高级机械臂控制系统及其实现方法
CN106491251B (zh) 一种基于非侵入式脑机接口机器臂控制系统及其控制方法
CN109605385B (zh) 一种混合脑机接口驱动的康复辅助机器人
CN104398325B (zh) 基于场景稳态视觉诱发的脑-肌电控制假肢的装置及方法
CN111110982A (zh) 基于运动想象的手部康复训练方法
CN102309366B (zh) 用眼动信号控制上假肢运动的控制系统和控制方法
CN111571587B (zh) 一种脑控机械臂助餐系统及方法
CN110840666A (zh) 一种基于眼电和机器视觉的轮椅机械臂集成系统及其控制方法
CN108646915B (zh) 结合三维视线跟踪和脑机接口控制机械臂抓取物体的方法和系统
CN104997581B (zh) 基于面部表情驱动脑电信号的假手控制方法及装置
CN113274032A (zh) 一种基于ssvep+mi脑机接口的脑卒中康复训练系统及方法
CN113805694A (zh) 一种基于脑机接口与计算机视觉的辅助抓取系统及方法
CN111096796B (zh) 全自动静脉穿刺机器人多层控制系统
CN105012057A (zh) 基于双臂肌电、姿态信息采集的智能假肢及运动分类方法
Zhang et al. Study on robot grasping system of SSVEP-BCI based on augmented reality stimulus
CN111571619A (zh) 一种基于ssvep脑控机械臂抓取的生活辅助系统与方法
CN113143676B (zh) 一种基于脑肌电协同的外肢体手指的控制方法
CN105137830A (zh) 一种基于视觉诱发脑机接口的国画机器手及其绘图方法
Tang et al. A shared-control based BCI system: For a robotic arm control
Ianez et al. Multimodal human-machine interface based on a brain-computer interface and an electrooculography interface
CN115145387A (zh) 一种基于机器视觉的脑控移动抓取机器人系统及控制方法
Wang et al. Humanoid robot control system based on AR-SSVEP

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17884765

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 23.09.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17884765

Country of ref document: EP

Kind code of ref document: A1