CN114489097B - Unmanned aerial vehicle flight attitude brain control method based on precise motion gesture - Google Patents
Unmanned aerial vehicle flight attitude brain control method based on precise motion gesture
- Publication number
- CN114489097B (application CN202111586421.3A)
- Authority
- CN
- China
- Prior art keywords
- unmanned aerial
- aerial vehicle
- electroencephalogram signal
- neural network
- convolutional neural
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/08—Control of attitude, i.e. control of roll, pitch, or yaw
- G05D1/0808—Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Abstract
The invention discloses a brain control method for the flight attitude of an unmanned aerial vehicle based on precise motion gestures, which specifically comprises the following steps: an electroencephalogram signal acquisition module acquires electroencephalogram signals over the motor region of a subject while different precise motion gestures are executed; an electroencephalogram signal processing module preprocesses the electroencephalogram signal samples, constructs a deep convolutional neural network model, performs hyperparameter optimization on the constructed model, and then calls the trained deep convolutional neural network model to recognize the electroencephalogram signals recorded during the different precise motion gestures; the recognition result is converted into a flight attitude control instruction for the unmanned aerial vehicle and sent to the unmanned aerial vehicle control module, which controls the flight attitude of the unmanned aerial vehicle. The invention solves the problems of the limited number of control instructions and the low electroencephalogram decoding accuracy of brain-controlled unmanned aerial vehicles in the prior art.
Description
Technical Field
The invention belongs to the technical field of unmanned aerial vehicle flight attitude control methods, and relates to an unmanned aerial vehicle flight attitude brain control method based on precise motion gestures.
Background
With the development and wide application of artificial intelligence technology, unmanned aerial vehicles have found broad application scenarios across many industries thanks to their portability, flexibility, safety and controllability. At present, the unmanned aerial vehicles on the market are generally controlled by joystick teleoperation, so the operator's hands are not yet fully freed. In 2017, amid the explosive development of the artificial intelligence industry, China became one of the earliest countries to publish an artificial intelligence industry white paper. As one of the consumer terminals of artificial intelligence products, the intelligent unmanned aerial vehicle is widely used in agriculture, the military, fire fighting, entertainment and other fields. However, most existing unmanned aerial vehicles execute flight attitudes through traditional control modes and lack autonomous perception and reasoning or decision-making capability in dynamic, complex working environments.
Therefore, accurate perception of the operator's brain control intention is one of the key technologies for unmanned aerial vehicle intelligence research. Brain control technology, an artificial intelligence technology that has emerged in recent years, establishes a direct bridge between the brain and peripheral devices through electroencephalogram signal decoding, and has therefore received wide attention from scholars at home and abroad. Brain-computer interface systems based on motor imagery and motor execution, as spontaneous brain-computer interfaces that do not depend on an external stimulator, have likewise attracted broad interest. However, traditional motor imagery brain-computer interface systems suffer from an insufficient number of brain control instructions, long training time and low electroencephalogram decoding accuracy. Therefore, there is a need for a new brain control paradigm that can accurately perceive the operator's motion intention.
Disclosure of Invention
The invention aims to provide an unmanned aerial vehicle flight attitude brain control method based on precise motion gestures, which solves the problems of the limited number of control instructions and the low electroencephalogram decoding accuracy of motor imagery brain control methods in existing unmanned aerial vehicle control technology.
In the unmanned aerial vehicle flight attitude brain control method based on precise motion gestures, an electroencephalogram signal acquisition module is worn on the head of a subject; the electroencephalogram signal acquisition module is connected to an electroencephalogram signal processing module through WiFi transmission module a, and the electroencephalogram signal processing module is connected to an unmanned aerial vehicle control module through WiFi transmission module b. The method is implemented specifically according to the following steps:
step 1, a subject makes six preset precise motion gestures;
step 2, the electroencephalogram signal acquisition module acquires electroencephalogram signals over the motor region of the subject while the different precise motion gestures are executed, obtaining electroencephalogram signal samples for each precise motion gesture;
step 3, preprocessing an electroencephalogram signal sample by an electroencephalogram signal processing module;
step 4, an electroencephalogram signal processing module constructs a deep convolutional neural network model, and the constructed deep convolutional neural network model is used for decoding electroencephalogram signals;
step 5, the electroencephalogram signal processing module performs hyperparameter optimization on the constructed deep convolutional neural network model; the electroencephalogram signal samples preprocessed in step 3 are used as training data during the optimization, the deep convolutional neural network model is trained, the model constructed in step 4 is updated with the optimal hyperparameters of the deep convolutional neural network found by the search, and the deep convolutional neural network model is saved;
step 6, the electroencephalogram signal processing module calls a trained deep convolutional neural network model to identify electroencephalogram signals when different precise motion gestures are executed;
step 7, the result recognized by the electroencephalogram signal processing module is converted into a flight attitude control instruction of the unmanned aerial vehicle and sent to the unmanned aerial vehicle control module to control the flight attitude of the unmanned aerial vehicle.
The present invention is also characterized in that,
the electroencephalogram signal acquisition module specifically acquires electroencephalogram signals of FC3, FCz, FC4, C3, cz, C4, CP3 and CP4 electrode positions under the international standard of 10/20.
The six precise motion gestures are respectively: a five-finger closing movement, a thumb single-finger stretching movement, a five-finger opening movement, an index finger single-finger stretching movement, a thumb and index finger double-finger stretching movement, and an index finger and middle finger double-finger stretching movement; the six precise motion gestures correspond respectively to take-off, landing, forward, backward, left turn and right turn of the unmanned aerial vehicle.
In step 3, the preprocessing of the electroencephalogram signal samples by the electroencephalogram signal processing module is specifically: applying Butterworth band-pass filtering and trend-term removal to the acquired electroencephalogram signal samples in sequence.
In step 5, the electroencephalogram signal processing module performs hyperparameter optimization on the constructed deep convolutional neural network model using a genetic algorithm; the hyperparameters to be optimized are the number of convolution kernels in the convolutional layer and the number of neurons in the fully connected layer.
The beneficial effects of the invention are as follows:
according to the invention, the flight gesture of the unmanned aerial vehicle is controlled by adopting a precise motion gesture brain control method, brain control instructions of 6 motion gestures can be realized, further, an electroencephalogram signal acquisition module acquires electroencephalogram signals of a subject, super-parameter optimization is performed on a deep convolutional neural network model by adopting a genetic algorithm, time consumption is small, and then the optimized deep convolutional neural network model is adopted to classify and extract the electroencephalogram signals, so that the electroencephalogram signal decoding precision is high.
Drawings
FIG. 1 is a flow chart of the unmanned aerial vehicle flight attitude brain control method based on precise motion gestures of the invention;
FIG. 2 is a connection diagram of each module in the unmanned aerial vehicle flight attitude brain control method based on precise motion gestures;
FIG. 3 is a schematic diagram of the arrangement of the electroencephalogram signal acquisition module of the present invention;
fig. 4 is a schematic diagram of six precise motion gestures in the unmanned aerial vehicle flight gesture brain control method based on the precise motion gestures.
In the figures: 310, electroencephalogram signal acquisition module; 320, WiFi transmission module a; 330, electroencephalogram signal processing module; 340, WiFi transmission module b; 350, unmanned aerial vehicle control module.
Detailed Description
The invention will be described in detail below with reference to the drawings and the detailed description.
As shown in fig. 1-2, in the unmanned aerial vehicle flight attitude brain control method based on precise motion gestures, an electroencephalogram signal acquisition module 310 is worn on the head of a subject; the electroencephalogram signal acquisition module 310 is connected to an electroencephalogram signal processing module 330 through WiFi transmission module a 320, and the electroencephalogram signal processing module 330 is connected to an unmanned aerial vehicle control module 350 through WiFi transmission module b 340. As shown in fig. 3, the electroencephalogram signal acquisition module 310 acquires electroencephalogram signals at the FC3, FCz, FC4, C3, Cz, C4, CP3 and CP4 electrode positions of the international 10/20 standard. The method is implemented specifically according to the following steps:
step 1, the subject makes six preset precise motion gestures; as shown in fig. 4, the six precise motion gestures are respectively: a five-finger closing movement, a thumb single-finger stretching movement, a five-finger opening movement, an index finger single-finger stretching movement, a thumb and index finger double-finger stretching movement, and an index finger and middle finger double-finger stretching movement, corresponding respectively to take-off, landing, forward, backward, left turn and right turn of the unmanned aerial vehicle;
step 2, an electroencephalogram signal acquisition module 310 acquires electroencephalogram signals of a body movement region of a subject when different precise movement gestures are executed, and an electroencephalogram signal sample when the different precise movement gestures are executed is obtained;
step 3, the electroencephalogram signal processing module 330 sequentially performs Butterworth band-pass filtering and trend item removal preprocessing on the electroencephalogram signal sample;
step 4, the electroencephalogram signal processing module 330 constructs a deep convolutional neural network model, and uses the constructed deep convolutional neural network model for decoding of electroencephalogram signals;
step 5, the electroencephalogram signal processing module 330 performs hyperparameter optimization on the constructed deep convolutional neural network model using a genetic algorithm; the electroencephalogram signal samples preprocessed in step 3 are used as training data during the optimization and the deep convolutional neural network model is trained, the hyperparameters to be optimized being the number of convolution kernels in the convolutional layer and the number of neurons in the fully connected layer; the model constructed in step 4 is updated with the optimal hyperparameters found by the search, and the deep convolutional neural network model is saved;
step 6, the electroencephalogram signal processing module 330 invokes the trained deep convolutional neural network model to recognize electroencephalogram signals when different precise motion gestures are executed;
step 7, the result recognized by the electroencephalogram signal processing module 330 is converted into a flight attitude control instruction of the unmanned aerial vehicle and sent to the unmanned aerial vehicle control module 350 to control the flight attitude of the unmanned aerial vehicle.
Examples
In this embodiment, a portable NeuSen W 64-channel electroencephalogram acquisition device is used, and the FC3, FCz, FC4, C3, Cz, C4, CP3 and CP4 electrode positions of the international 10/20 standard are selected for electroencephalogram acquisition. The electroencephalogram signal acquisition module 310 acquires the subject's electroencephalogram signals, amplifies them, and transmits them to the electroencephalogram signal processing module 330 through WiFi transmission module a 320. The electroencephalogram signal processing module 330 is responsible for preprocessing, feature extraction and pattern recognition of the electroencephalogram signals; the recognition result is converted into a control instruction for the unmanned aerial vehicle and sent to the unmanned aerial vehicle control module 350 through WiFi transmission module b 340, thereby controlling the unmanned aerial vehicle to execute different flight attitudes.
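As a minimal illustrative sketch (not part of the patent), the eight motor-area electrodes can be picked out of the 64-channel recording by name before further processing. The channel-name list and the (channels, samples) array layout below are assumptions of this example.

```python
import numpy as np

# The eight 10/20 motor-area electrodes named in the embodiment.
MOTOR_CHANNELS = ["FC3", "FCz", "FC4", "C3", "Cz", "C4", "CP3", "CP4"]

def pick_motor_channels(raw: np.ndarray, channel_names: list[str]) -> np.ndarray:
    """Select the motor-area rows from a (channels, samples) EEG array.

    `raw` is the full 64-channel recording and `channel_names` lists its rows
    in amplifier order -- both are assumptions of this sketch, not the patent.
    """
    idx = [channel_names.index(ch) for ch in MOTOR_CHANNELS]
    return raw[idx, :]
```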
In the unmanned aerial vehicle flight attitude brain control method based on precise motion gestures, when the subject begins to execute a precise gesture movement, the electroencephalogram signal acquisition module 310 acquires the subject's cortical electroencephalogram signals, amplifies them, and transmits them to the electroencephalogram signal processing module 330 for preprocessing, feature extraction and pattern recognition; the recognition result is finally transmitted to the unmanned aerial vehicle control module 350 to control the different flight attitudes of the unmanned aerial vehicle. The method specifically comprises the following steps:
Step 1, the subject makes the six corresponding precise gesture movements, and electroencephalogram signals at the subject's FC3, FCz, FC4, C3, Cz, C4, CP3 and CP4 electrode positions are collected to obtain electroencephalogram signal sample sets for the different precise motion gestures; the six precise motion gestures of this embodiment are shown in fig. 4.
Step 2, the acquired electroencephalogram signals are preprocessed. In this embodiment, the acquired electroencephalogram signals are subjected to 0.5-45 Hz Butterworth band-pass filtering and trend-term removal.
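A minimal preprocessing sketch consistent with this step is shown below, using SciPy's Butterworth design, zero-phase filtering and linear detrending; the sampling rate, filter order and zero-phase choice are assumptions, since the patent only specifies the 0.5-45 Hz pass band and trend-term removal.

```python
import numpy as np
from scipy.signal import butter, filtfilt, detrend

def preprocess_eeg(sample: np.ndarray, fs: float = 1000.0,
                   band: tuple[float, float] = (0.5, 45.0),
                   order: int = 4) -> np.ndarray:
    """Band-pass filter then remove the linear trend of each channel.

    `sample` has shape (channels, samples); fs (Hz) and the filter order are
    assumed values for this sketch, not specified by the patent.
    """
    nyq = fs / 2.0
    b, a = butter(order, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, sample, axis=-1)   # zero-phase Butterworth band-pass
    return detrend(filtered, axis=-1)            # trend-term removal
```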
Step 3, a deep convolutional neural network structure model is built. In this embodiment, a two-dimensional convolutional layer is used to extract features from the electroencephalogram signals, and a fully connected layer is used to classify the extracted features.
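The patent does not disclose the exact layer sizes, so the following PyTorch sketch only illustrates the stated structure: one two-dimensional convolutional stage for feature extraction followed by fully connected layers for six-class classification. The kernel size, pooling scheme and default kernel/neuron counts are assumptions; the last two are the quantities the genetic algorithm tunes in step 4.

```python
import torch
import torch.nn as nn

class GestureEEGNet(nn.Module):
    """Minimal 2-D convolution + fully connected classifier for 6 gesture classes.

    Input shape: (batch, 1, 8, T) -- 8 motor-area channels by T time samples.
    n_kernels and n_neurons are the hyperparameters optimized in step 4.
    """
    def __init__(self, n_kernels: int = 16, n_neurons: int = 64, n_classes: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, n_kernels, kernel_size=(8, 25)),  # spatial x temporal kernel (assumed size)
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, 32)),                 # fixed-length summary of the time axis
            nn.Flatten(),
        )
        self.classifier = nn.Sequential(
            nn.Linear(n_kernels * 32, n_neurons),
            nn.ReLU(),
            nn.Linear(n_neurons, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))
```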
Step 4, the hyperparameters of the deep convolutional neural network are automatically optimized. In this embodiment, a genetic algorithm is used for the hyperparameter search: the electroencephalogram signal samples preprocessed in step 2 are used as training data to train the deep convolutional neural network model, the deep convolutional neural network structure model is defined as the function to be optimized, the number of convolution kernels in the convolutional layer and the number of neurons in the fully connected layer are defined as the variables to be optimized, and the electroencephalogram classification accuracy is defined as the function value serving as the evaluation criterion of network performance.
Step 5, the genetic algorithm iterates automatically to find the optimal hyperparameters, yielding the deep convolutional neural network structure model with the best performance; the trained deep convolutional neural network structure model is then saved.
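A hand-rolled sketch of such a search is given below; it assumes a caller-supplied `fitness(n_kernels, n_neurons)` function that trains the network on the preprocessed samples and returns validation accuracy. The population size, value ranges, generation count and crossover/mutation scheme are illustrative assumptions, not values from the patent.

```python
import random
from typing import Callable

def genetic_search(fitness: Callable[[int, int], float],
                   kernel_range: tuple[int, int] = (8, 64),
                   neuron_range: tuple[int, int] = (32, 256),
                   pop_size: int = 10, generations: int = 15,
                   mutation_rate: float = 0.2) -> tuple[int, int]:
    """Evolve (n_kernels, n_neurons) to maximise the accuracy reported by `fitness`."""
    def random_individual() -> tuple[int, int]:
        return (random.randint(*kernel_range), random.randint(*neuron_range))

    def mutate(ind: tuple[int, int]) -> tuple[int, int]:
        k, n = ind
        if random.random() < mutation_rate:
            k = random.randint(*kernel_range)
        if random.random() < mutation_rate:
            n = random.randint(*neuron_range)
        return (k, n)

    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=lambda ind: fitness(*ind), reverse=True)
        parents = ranked[: pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            children.append(mutate((a[0], b[1])))         # one-point crossover + mutation
        population = parents + children
    return max(population, key=lambda ind: fitness(*ind))
```

In practice each fitness evaluation involves training the network, so the returned values would normally be cached rather than recomputed as in this sketch.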
Step 6, the deep convolutional neural network structure model trained and saved in step 5 is called to recognize the electroencephalogram signals of the different precise motion gestures in real time.
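A possible real-time recognition routine is sketched below, reusing the GestureEEGNet class from the sketch above; the checkpoint file name is hypothetical.

```python
import torch

model = GestureEEGNet()                                    # same architecture as the sketch above
model.load_state_dict(torch.load("gesture_eegnet.pt"))     # hypothetical checkpoint path
model.eval()

def recognize(window) -> int:
    """Return the predicted gesture class (0-5) for one preprocessed EEG window of shape (8, T)."""
    x = torch.as_tensor(window, dtype=torch.float32).unsqueeze(0).unsqueeze(0)  # -> (1, 1, 8, T)
    with torch.no_grad():
        logits = model(x)
    return int(logits.argmax(dim=1).item())
```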
Step 7, the electroencephalogram recognition result is converted into a control instruction and transmitted to the unmanned aerial vehicle. In this embodiment, the correspondence between the electroencephalogram recognition result and the control instruction is shown in Table 1.
TABLE 1

Precise motion gesture | Flight attitude control instruction
---|---
Five-finger closing movement | Take-off
Thumb single-finger stretching movement | Landing
Five-finger opening movement | Forward
Index finger single-finger stretching movement | Backward
Thumb and index finger double-finger stretching movement | Left turn
Index finger and middle finger double-finger stretching movement | Right turn
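A minimal sketch of this conversion step follows; the class indices mirror the gesture order of Table 1, while the command strings, UAV address and UDP transport are assumptions of the example rather than the patent's actual protocol.

```python
import socket

# Predicted class index -> flight-attitude command (order as in Table 1).
GESTURE_TO_COMMAND = {
    0: "TAKEOFF",     # five-finger closing movement
    1: "LAND",        # thumb single-finger stretching movement
    2: "FORWARD",     # five-finger opening movement
    3: "BACKWARD",    # index finger single-finger stretching movement
    4: "TURN_LEFT",   # thumb and index finger double-finger stretching movement
    5: "TURN_RIGHT",  # index and middle finger double-finger stretching movement
}

def send_command(predicted_class: int, uav_addr: tuple[str, int] = ("192.168.1.10", 8889)) -> None:
    """Translate a recognition result into a command string and send it over WiFi (UDP)."""
    command = GESTURE_TO_COMMAND[predicted_class]
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(command.encode("ascii"), uav_addr)
```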
Claims (5)
1. The unmanned aerial vehicle flight attitude brain control method based on the precise motion gesture is characterized by comprising the steps of wearing an electroencephalogram signal acquisition module (310) on the head of a subject, wherein the electroencephalogram signal acquisition module (310) is connected with an electroencephalogram signal processing module (330) through a WiFi transmission module a (320), and the electroencephalogram signal processing module (330) is connected with an unmanned aerial vehicle control module (350) through a WiFi transmission module b (340), and is implemented specifically according to the following steps:
step 1, a subject makes six preset precise motion gestures;
step 2, the electroencephalogram signal acquisition module (310) acquires electroencephalogram signals over the motor region of the subject while the different precise motion gestures are executed, obtaining electroencephalogram signal samples for each precise motion gesture;
step 3, an electroencephalogram signal processing module (330) preprocesses an electroencephalogram signal sample;
step 4, an electroencephalogram signal processing module (330) constructs a deep convolutional neural network model, and the constructed deep convolutional neural network model is used for decoding electroencephalogram signals;
step 5, the electroencephalogram signal processing module (330) performs hyperparameter optimization on the constructed deep convolutional neural network model; the electroencephalogram signal samples preprocessed in step 3 are used as training data during the optimization, the deep convolutional neural network model is trained, the model constructed in step 4 is updated with the optimal hyperparameters of the deep convolutional neural network found by the search, and the deep convolutional neural network model is saved;
step 6, an electroencephalogram signal processing module (330) invokes a trained deep convolutional neural network model to recognize electroencephalogram signals when different precise motion gestures are executed;
step 7, the result recognized by the electroencephalogram signal processing module (330) is converted into a flight attitude control instruction of the unmanned aerial vehicle and sent to the unmanned aerial vehicle control module (350) to control the flight attitude of the unmanned aerial vehicle.
2. The unmanned aerial vehicle flight attitude brain control method based on precise motion gestures according to claim 1, wherein the electroencephalogram signal acquisition module (310) specifically acquires electroencephalogram signals at the FC3, FCz, FC4, C3, Cz, C4, CP3 and CP4 electrode positions of the international 10/20 standard.
3. The unmanned aerial vehicle flight attitude brain control method based on precise motion gestures according to claim 2, wherein the six precise motion gestures are respectively: a five-finger closing movement, a thumb single-finger stretching movement, a five-finger opening movement, an index finger single-finger stretching movement, a thumb and index finger double-finger stretching movement, and an index finger and middle finger double-finger stretching movement; the six precise motion gestures correspond respectively to take-off, landing, forward, backward, left turn and right turn of the unmanned aerial vehicle.
4. The unmanned aerial vehicle flight attitude brain control method based on precise motion gestures according to claim 3, wherein the preprocessing of the electroencephalogram signal samples by the electroencephalogram signal processing module (330) in step 3 is specifically: applying Butterworth band-pass filtering and trend-term removal to the acquired electroencephalogram signal samples in sequence.
5. The unmanned aerial vehicle flight attitude brain control method based on precise motion gestures according to claim 4, wherein in step 5 the electroencephalogram signal processing module (330) performs hyperparameter optimization on the constructed deep convolutional neural network model using a genetic algorithm, the hyperparameters to be optimized being the number of convolution kernels in the convolutional layer and the number of neurons in the fully connected layer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111586421.3A CN114489097B (en) | 2021-12-20 | 2021-12-20 | Unmanned aerial vehicle flight attitude brain control method based on precise motion gesture |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111586421.3A CN114489097B (en) | 2021-12-20 | 2021-12-20 | Unmanned aerial vehicle flight attitude brain control method based on precise motion gesture |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114489097A CN114489097A (en) | 2022-05-13 |
CN114489097B true CN114489097B (en) | 2023-06-30 |
Family
ID=81494365
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111586421.3A Active CN114489097B (en) | 2021-12-20 | 2021-12-20 | Unmanned aerial vehicle flight attitude brain control method based on precise motion gesture |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114489097B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111240350A (en) * | 2020-02-13 | 2020-06-05 | 西安爱生无人机技术有限公司 | Unmanned aerial vehicle pilot dynamic behavior evaluation system |
WO2020186651A1 (en) * | 2019-03-15 | 2020-09-24 | 南京邮电大学 | Smart sports earphones based on eeg thoughts and implementation method and system thereof |
- 2021-12-20: CN application CN202111586421.3A filed (granted as patent CN114489097B, status: Active)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020186651A1 (en) * | 2019-03-15 | 2020-09-24 | 南京邮电大学 | Smart sports earphones based on eeg thoughts and implementation method and system thereof |
CN111240350A (en) * | 2020-02-13 | 2020-06-05 | 西安爱生无人机技术有限公司 | Unmanned aerial vehicle pilot dynamic behavior evaluation system |
Non-Patent Citations (1)
Title |
---|
Brain control method for a prosthetic hand assisted by facial expressions (基于表情辅助的假手脑控方法); Lu Zhufeng, Zhang Xiaodong, Li Rui, Guo Jin; China Mechanical Engineering (No. 12); full text *
Also Published As
Publication number | Publication date |
---|---|
CN114489097A (en) | 2022-05-13 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |