CN114489097B - Unmanned aerial vehicle flight attitude brain control method based on precise motion gesture
- Publication number
- CN114489097B (application CN202111586421.3A)
- Authority
- CN
- China
- Prior art keywords
- EEG signal
- neural network
- convolutional neural network model
- processing module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/08—Control of attitude, i.e. control of roll, pitch, or yaw
- G05D1/0808—Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- User Interface Of Digital Computer (AREA)
- Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
Abstract
The invention discloses a brain-control method for the flight attitude of an unmanned aerial vehicle (UAV) based on precise motion gestures. Specifically, an EEG signal acquisition module collects EEG signals from the subject's somatic motor area while different precise motion gestures are performed. An EEG signal processing module preprocesses the EEG signal samples, constructs a deep convolutional neural network model, optimizes the hyperparameters of the constructed model, and then calls the trained deep convolutional neural network model to recognize the EEG signals produced during the execution of the different precise motion gestures. The results recognized by the EEG signal processing module are converted into flight attitude control commands and sent to the UAV control module, which controls the flight attitude of the UAV. The invention solves the problems of few control commands and low EEG decoding accuracy in existing brain-controlled UAVs.
Description
Technical Field
The invention belongs to the technical field of UAV flight attitude control methods, and relates to a brain-control method for UAV flight attitude based on precise motion gestures.
Background Art
With the development and wide application of artificial intelligence technology, unmanned aerial vehicles (UAVs) have found broad application scenarios in many industries owing to their portability, flexibility, safety and controllability. At present, UAVs on the market are generally controlled by joystick teleoperation, so the operator's hands are not yet fully freed. In 2017, as the artificial intelligence industry saw explosive development, China became one of the first countries to publish a white paper on the artificial intelligence industry. As one of the consumer-grade terminals of artificial intelligence products, UAVs have been widely used in agriculture, the military, fire fighting, entertainment and other fields. However, most existing UAVs execute flight attitudes under a traditional control mode and lack the ability to perceive, reason and make decisions autonomously in dynamic, complex working environments.
Accurate perception of the operator's brain-control intention has therefore become one of the key technologies in research on intelligent UAVs. Brain-control technology, an artificial intelligence technology that has emerged in recent years, establishes a direct bridge between the brain and peripheral devices through EEG signal decoding, and has been widely studied by scholars at home and abroad. Among such systems, brain-computer interfaces based on motor imagery and motor execution, which are spontaneous brain-computer interfaces that do not rely on external stimulators, have attracted extensive attention. However, traditional motor-imagery brain-computer interface systems suffer from insufficient brain-control commands, long training times and low EEG decoding accuracy. A new brain-control paradigm that can accurately perceive the operator's motion intention is therefore urgently needed.
Summary of the Invention
The purpose of the present invention is to provide a brain-control method for UAV flight attitude based on precise motion gestures, which solves the problems of few control commands and low EEG decoding accuracy in the motor-imagery brain-control methods of existing UAV control technology.
The technical solution adopted by the present invention is a brain-control method for UAV flight attitude based on precise motion gestures. An EEG signal acquisition module is worn on the subject's head; the EEG signal acquisition module is connected to an EEG signal processing module through WiFi transmission module a, and the EEG signal processing module is connected to a UAV control module through WiFi transmission module b. The method is specifically implemented according to the following steps:
Step 1: the subject performs six preset precise motion gestures;
Step 2: the EEG signal acquisition module collects the EEG signals of the subject's somatic motor area while the different precise motion gestures are performed, obtaining EEG signal samples for each gesture;
Step 3: the EEG signal processing module preprocesses the EEG signal samples;
Step 4: the EEG signal processing module constructs a deep convolutional neural network model and uses it for decoding the EEG signals;
Step 5: the EEG signal processing module performs hyperparameter optimization on the constructed deep convolutional neural network model; during the optimization, the EEG signal samples preprocessed in Step 3 are used as training data to train the model, the model constructed in Step 4 is optimized according to the optimal hyperparameters found, and the optimized model is saved;
Step 6: the EEG signal processing module calls the trained deep convolutional neural network model to recognize the EEG signals produced when the different precise motion gestures are performed;
Step 7: the results recognized by the EEG signal processing module are converted into flight attitude control commands for the UAV and sent to the UAV control module, which controls the flight attitude of the UAV.
The present invention is further characterized in that:
The EEG signal acquisition module specifically collects EEG signals at the FC3, FCz, FC4, C3, Cz, C4, CP3 and CP4 electrode positions of the international 10/20 system.
The six precise motion gestures are: closing all five fingers, extending the thumb alone, opening all five fingers, extending the index finger alone, extending the thumb and index finger together, and extending the index and middle fingers together. The six gestures correspond respectively to UAV takeoff, landing, forward, backward, left turn and right turn.
In Step 3, the EEG signal processing module preprocesses the EEG signal samples by applying Butterworth band-pass filtering and then detrending to the collected EEG signal samples.
In Step 5, the EEG signal processing module performs hyperparameter optimization on the constructed deep convolutional neural network model using a genetic algorithm; the hyperparameters to be optimized are the number of convolution kernels in the convolutional layers and the number of neurons in the fully connected layer.
The beneficial effects of the present invention are:
The present invention controls UAV flight attitude through a brain-control method based on precise motion gestures and can realize brain-control commands for six motion gestures. Further, after the EEG signal acquisition module collects the subject's EEG signals, a genetic algorithm is used to optimize the hyperparameters of the deep convolutional neural network model with little time consumption; the optimized deep convolutional neural network model is then used to classify the EEG signals, achieving high EEG decoding accuracy.
Brief Description of the Drawings
Fig. 1 is a flow chart of the brain-control method for UAV flight attitude based on precise motion gestures of the present invention;
Fig. 2 is a connection diagram of the modules in the brain-control method for UAV flight attitude based on precise motion gestures of the present invention;
Fig. 3 is a schematic diagram of the layout of the EEG signal acquisition module of the present invention;
Fig. 4 is a schematic diagram of the six precise motion gestures in the brain-control method for UAV flight attitude based on precise motion gestures of the present invention.
In the figures: 310. EEG signal acquisition module; 320. WiFi transmission module a; 330. EEG signal processing module; 340. WiFi transmission module b; 350. UAV control module.
Detailed Description of Embodiments
The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
The brain-control method for UAV flight attitude based on precise motion gestures of the present invention, as shown in Figs. 1-2, includes wearing an EEG signal acquisition module 310 on the subject's head. The EEG signal acquisition module 310 is connected to an EEG signal processing module 330 through WiFi transmission module a320, and the EEG signal processing module 330 is connected to a UAV control module 350 through WiFi transmission module b340. As shown in Fig. 3, the EEG signal acquisition module 310 specifically collects EEG signals at the FC3, FCz, FC4, C3, Cz, C4, CP3 and CP4 electrode positions of the international 10/20 system. The method is specifically implemented according to the following steps:
Step 1: the subject performs six preset precise motion gestures. As shown in Fig. 4, the six gestures are: closing all five fingers, extending the thumb alone, opening all five fingers, extending the index finger alone, extending the thumb and index finger together, and extending the index and middle fingers together; they correspond respectively to UAV takeoff, landing, forward, backward, left turn and right turn;
Step 2: the EEG signal acquisition module 310 collects the EEG signals of the subject's somatic motor area while the different precise motion gestures are performed, obtaining EEG signal samples for each gesture;
Step 3: the EEG signal processing module 330 applies Butterworth band-pass filtering and then detrending preprocessing to the EEG signal samples;
Step 4: the EEG signal processing module 330 constructs a deep convolutional neural network model and uses it for decoding the EEG signals;
Step 5: the EEG signal processing module 330 performs hyperparameter optimization on the constructed deep convolutional neural network model using a genetic algorithm. During the optimization, the EEG signal samples preprocessed in Step 3 are used as training data to train the model; the hyperparameters to be optimized are the number of convolution kernels in the convolutional layers and the number of neurons in the fully connected layer. The model constructed in Step 4 is optimized according to the optimal hyperparameters found and is then saved;
Step 6: the EEG signal processing module 330 calls the trained deep convolutional neural network model to recognize the EEG signals produced when the different precise motion gestures are performed;
Step 7: the results recognized by the EEG signal processing module 330 are converted into flight attitude control commands for the UAV and sent to the UAV control module 350, which controls the flight attitude of the UAV.
Example
In this example, a portable NeuSen W 64-channel EEG acquisition device is used, and the FC3, FCz, FC4, C3, Cz, C4, CP3 and CP4 electrode positions of the international 10/20 system are selected for EEG signal acquisition. After collecting the subject's EEG signals, the EEG signal acquisition module 310 amplifies them and transmits them to the EEG signal processing module 330 through WiFi transmission module a320. The EEG signal processing module 330 is responsible for preprocessing, feature extraction and pattern recognition of the EEG signals; the recognition results are converted into UAV control commands and sent to the UAV control module 350 through WiFi transmission module b340, thereby controlling the UAV to execute different flight attitudes.
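As an illustration of how the acquired multichannel recording might be reduced to the eight motor-area channels and cut into gesture-locked trials, a minimal NumPy sketch is given below. The sampling rate, epoch length, onset markers and variable names are assumptions made for illustration; the patent does not specify them.

```python
import numpy as np

# Assumed acquisition parameters (not specified in the patent).
FS = 1000                      # sampling rate in Hz, an assumption
EPOCH_SECONDS = 4              # length of one gesture trial in seconds, an assumption
MOTOR_CHANNELS = ["FC3", "FCz", "FC4", "C3", "Cz", "C4", "CP3", "CP4"]

def extract_trials(recording, channel_names, onset_samples):
    """Select the eight motor-area channels and cut fixed-length epochs.

    recording      : ndarray (n_channels, n_samples), full 64-channel record
    channel_names  : list of all channel labels in recording order
    onset_samples  : sample indices at which each gesture cue starts
    returns        : ndarray (n_trials, 8, FS * EPOCH_SECONDS)
    """
    idx = [channel_names.index(ch) for ch in MOTOR_CHANNELS]
    picked = recording[idx, :]                     # keep only the motor-area channels
    length = FS * EPOCH_SECONDS
    trials = [picked[:, s:s + length] for s in onset_samples
              if s + length <= picked.shape[1]]    # drop incomplete trials at the end
    return np.stack(trials)
```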
In the brain-control method for UAV flight attitude based on precise motion gestures of the present invention, when the subject starts to perform a precise gesture movement, the EEG signal acquisition module 310 collects EEG signals from the subject's cerebral cortex, amplifies them and transmits them to the EEG signal processing module 330 for preprocessing, feature extraction and pattern recognition. Finally, the recognition results are transmitted to the UAV control module 350 to control the different flight attitudes of the UAV. The specific implementation includes the following steps:
Step 1: the subject performs the six corresponding precise gestures, and the EEG signals at the subject's FC3, FCz, FC4, C3, Cz, C4, CP3 and CP4 electrode positions are collected, yielding a sample set of EEG signals for the different precise motion gestures. The six precise motion gestures used in this example are shown in Fig. 4.
Step 2: the collected EEG signals are preprocessed. In this example, the collected EEG signals are subjected to 0.5-45 Hz Butterworth band-pass filtering and detrending.
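A minimal sketch of this 0.5-45 Hz Butterworth band-pass filtering and detrending step, using SciPy, is given below; the filter order and sampling rate are assumptions, as the patent does not state them.

```python
import numpy as np
from scipy.signal import butter, filtfilt, detrend

FS = 1000          # sampling rate in Hz, an assumption
ORDER = 4          # Butterworth filter order, an assumption

def preprocess_trial(trial):
    """Band-pass filter (0.5-45 Hz) and detrend one trial.

    trial : ndarray (n_channels, n_samples)
    """
    b, a = butter(ORDER, [0.5, 45.0], btype="bandpass", fs=FS)
    filtered = filtfilt(b, a, trial, axis=-1)      # zero-phase band-pass filtering
    return detrend(filtered, axis=-1)              # remove the linear trend per channel
```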
Step 3: a deep convolutional neural network model is built. In this example, two-dimensional convolutional layers are used to extract EEG features, and a fully connected layer is used to classify the extracted features.
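One possible realization of such a network, sketched in PyTorch, is given below. The layer count, kernel sizes and the default values of the two hyperparameters that the genetic algorithm later tunes (number of convolution kernels, number of fully connected neurons) are placeholders; the patent does not disclose the concrete architecture.

```python
import torch
import torch.nn as nn

class GestureEEGNet(nn.Module):
    """Sketch of a deep CNN decoding 8-channel EEG trials into 6 gesture classes."""

    def __init__(self, n_channels=8, n_samples=4000, n_kernels=16, n_fc=64, n_classes=6):
        super().__init__()
        # 2D convolutions over the (channels x time) representation of a trial.
        self.features = nn.Sequential(
            nn.Conv2d(1, n_kernels, kernel_size=(1, 25), padding=(0, 12)),  # temporal filtering
            nn.Conv2d(n_kernels, n_kernels, kernel_size=(n_channels, 1)),   # spatial filtering
            nn.BatchNorm2d(n_kernels),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 50)),
            nn.Dropout(0.5),
        )
        with torch.no_grad():                       # infer the flattened feature size
            n_feat = self.features(torch.zeros(1, 1, n_channels, n_samples)).numel()
        # Fully connected layers classify the extracted features into 6 gestures.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_feat, n_fc),
            nn.ELU(),
            nn.Linear(n_fc, n_classes),
        )

    def forward(self, x):                           # x: (batch, 1, n_channels, n_samples)
        return self.classifier(self.features(x))
```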
Step 4: the hyperparameters of the deep convolutional neural network are optimized automatically. In this example, a genetic algorithm is used for the hyperparameter search. During the search, the EEG signal samples preprocessed in Step 2 are used as training data to train the deep convolutional neural network model. The network model is defined as the function to be optimized, the number of convolution kernels in the convolutional layers and the number of neurons in the fully connected layer are defined as the variables to be optimized, and the classification accuracy of the EEG signals is defined as the function value used as the criterion for evaluating network performance.
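A simplified sketch of how a genetic algorithm could search over these two integer hyperparameters, using classification accuracy as the fitness value, is shown below. The population size, search ranges, mutation and crossover schemes are illustrative assumptions, not the procedure disclosed in the patent.

```python
import random

# Assumed search ranges for the two hyperparameters (illustrative only).
KERNEL_RANGE = (8, 64)      # number of convolution kernels in the conv layers
FC_RANGE = (32, 256)        # number of neurons in the fully connected layer

def _clip(v, lo, hi):
    return max(lo, min(hi, v))

def mutate(ind):
    """Perturb one of the two integer genes."""
    n_kernels, n_fc = ind
    if random.random() < 0.5:
        n_kernels = _clip(n_kernels + random.randint(-8, 8), *KERNEL_RANGE)
    else:
        n_fc = _clip(n_fc + random.randint(-32, 32), *FC_RANGE)
    return (n_kernels, n_fc)

def crossover(a, b):
    """Exchange genes between two parents."""
    return (a[0], b[1]) if random.random() < 0.5 else (b[0], a[1])

def genetic_search(fitness, pop_size=10, generations=20):
    """fitness: callable (n_kernels, n_fc) -> classification accuracy on held-out data."""
    population = [(random.randint(*KERNEL_RANGE), random.randint(*FC_RANGE))
                  for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=lambda ind: fitness(*ind), reverse=True)
        parents = ranked[: pop_size // 2]                   # keep the fitter half
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=lambda ind: fitness(*ind))
```

In practice the `fitness` callable would wrap training the CNN sketched above on the preprocessed trials and return its validation classification accuracy, matching the evaluation criterion described in this step.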
Step 5: the genetic algorithm iterates automatically to find the optimal hyperparameters, yielding the deep convolutional neural network model with the best performance; the trained deep convolutional neural network model is then saved.
Step 6: the deep convolutional neural network model trained and saved in Step 5 is called to recognize, in real time, the EEG signals produced when the different precise motion gestures are performed.
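A minimal sketch of this online recognition step, assuming the PyTorch model above and a saved state dict, could look as follows; the file name and window handling are assumptions.

```python
import torch

def load_model(path="gesture_eegnet.pt", n_kernels=16, n_fc=64):
    """Restore the saved, GA-optimized network for online use (file name assumed)."""
    model = GestureEEGNet(n_kernels=n_kernels, n_fc=n_fc)
    model.load_state_dict(torch.load(path, map_location="cpu"))
    model.eval()
    return model

def recognize(model, trial):
    """trial: ndarray (8, n_samples), a preprocessed EEG window; returns class index 0-5."""
    x = torch.as_tensor(trial, dtype=torch.float32).unsqueeze(0).unsqueeze(0)
    with torch.no_grad():
        return int(model(x).argmax(dim=1).item())
```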
Step 7: the EEG recognition results are converted into control commands and transmitted to the UAV. In this example, the correspondence between the EEG recognition results and the control commands is shown in Table 1.
Table 1

Recognized precise motion gesture | UAV flight attitude control command
---|---
Five-finger closing | Takeoff
Thumb single-finger extension | Landing
Five-finger opening | Forward
Index-finger single-finger extension | Backward
Thumb and index finger two-finger extension | Left turn
Index and middle finger two-finger extension | Right turn
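A sketch of how the recognized class could be mapped to a flight-attitude command and forwarded to the UAV control module is given below. The class-index ordering, the command strings, the UDP transport and the address are all assumptions; the patent only states that the result is sent over WiFi transmission module b340.

```python
import socket

# Gesture class index -> UAV flight attitude command (mapping from Table 1;
# the index order follows the listing order of the gestures and is an assumption).
COMMANDS = {
    0: "takeoff",     # five-finger closing
    1: "land",        # thumb single-finger extension
    2: "forward",     # five-finger opening
    3: "backward",    # index-finger single-finger extension
    4: "turn_left",   # thumb and index finger two-finger extension
    5: "turn_right",  # index and middle finger two-finger extension
}

# Address of the UAV control module on the WiFi link (hypothetical).
UAV_ADDR = ("192.168.1.50", 9000)

def send_command(class_index, sock=None):
    """Translate a recognized gesture class into a command and send it over UDP."""
    sock = sock or socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    command = COMMANDS[class_index]
    sock.sendto(command.encode("utf-8"), UAV_ADDR)
    return command
```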
Claims (5)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111586421.3A CN114489097B (en) | 2021-12-20 | 2021-12-20 | Unmanned aerial vehicle flight attitude brain control method based on precise motion gesture |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111586421.3A CN114489097B (en) | 2021-12-20 | 2021-12-20 | Unmanned aerial vehicle flight attitude brain control method based on precise motion gesture |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114489097A CN114489097A (en) | 2022-05-13 |
CN114489097B true CN114489097B (en) | 2023-06-30 |
Family
ID=81494365
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111586421.3A Active CN114489097B (en) | 2021-12-20 | 2021-12-20 | Unmanned aerial vehicle flight attitude brain control method based on precise motion gesture |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114489097B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111240350A (en) * | 2020-02-13 | 2020-06-05 | 西安爱生无人机技术有限公司 | Unmanned aerial vehicle pilot dynamic behavior evaluation system |
WO2020186651A1 (en) * | 2019-03-15 | 2020-09-24 | 南京邮电大学 | Smart sports earphones based on eeg thoughts and implementation method and system thereof |
- 2021-12-20: Application CN202111586421.3A filed in China; granted as patent CN114489097B (status: Active)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020186651A1 (en) * | 2019-03-15 | 2020-09-24 | 南京邮电大学 | Smart sports earphones based on eeg thoughts and implementation method and system thereof |
CN111240350A (en) * | 2020-02-13 | 2020-06-05 | 西安爱生无人机技术有限公司 | Unmanned aerial vehicle pilot dynamic behavior evaluation system |
Non-Patent Citations (1)
Title |
---|
Brain-control method for a prosthetic hand based on facial-expression assistance; Lu Zhufeng; Zhang Xiaodong; Li Rui; Guo Jin; China Mechanical Engineering, No. 12; full text *
Also Published As
Publication number | Publication date |
---|---|
CN114489097A (en) | 2022-05-13 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |