CN110412996A - Gesture- and eye-movement-based UAV control method, device and system - Google Patents

Gesture- and eye-movement-based UAV control method, device and system

Info

Publication number
CN110412996A
CN110412996A (application CN201910524901.3A)
Authority
CN
China
Prior art keywords
unmanned aerial vehicle (UAV)
eye movement
gesture
head-mounted device
signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910524901.3A
Other languages
Chinese (zh)
Inventor
闫野
刘璇恒
秦伟
谢良
邓宝松
印二威
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center
National Defense Technology Innovation Institute PLA Academy of Military Science
Original Assignee
Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center
National Defense Technology Innovation Institute PLA Academy of Military Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center and National Defense Technology Innovation Institute PLA Academy of Military Science
Priority to CN201910524901.3A
Publication of CN110412996A
Legal status: Pending

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application discloses a gesture- and eye-movement-based UAV control method, device and system, comprising: a hand-worn device acquires gesture signals and sends them to a head-mounted device; the head-mounted device processes the gesture signals to obtain control instructions; the head-mounted device acquires and processes eye-movement signals to obtain mode instructions; the head-mounted device sends the control instructions and mode instructions to the UAV to control its flight. Combining gestures and eye movement simplifies the UAV control method and realizes multimodal UAV operation. Through augmented reality, gesture-UAV interaction, eye-movement-UAV interaction and real-time display of UAV imagery are realized within the AR system, presenting for UAV control an interactive interface whose scene is more three-dimensional, whose information is richer and whose environment feels more natural and welcoming.

Description

Gesture- and eye-movement-based UAV control method, device and system
Technical field
This application relates to the technical field of aerial vehicles, and in particular to a gesture- and eye-movement-based UAV control method, device and system.
Background
An unmanned aerial vehicle (UAV) is an unmanned aircraft operated by radio remote control and its own onboard program-control devices, or operated fully or intermittently and autonomously by an onboard computer. With the development of science and technology, UAVs came into being to face tasks that are too difficult, too risky or too demanding for humans, substituting for manned aircraft in carrying them out. A UAV is a piece of radio-controlled equipment, also called a remotely piloted aircraft. It can exploit cutting-edge technologies such as artificial intelligence, signal processing and autonomous piloting to near perfection, and because of advantages such as small size, absence of a crew and long range it is widely used, with applications in natural-environment survey, scientific research, agriculture, defense of national sovereignty, public-health security and many other fields. This trend toward diversified applications has led every country to accelerate its pace of exploration and development.
At present, consumer UAV products on the market come in many kinds and sizes, and their modes of operation also differ; the most common form of UAV control requires a two-stick remote controller operated with both hands. As UAV technology matures, UAVs are becoming intelligent at a rapid pace: improved functions such as automatic obstacle avoidance, path planning, automatic cruising, intelligent following and one-key return emerge one after another. The more functions a UAV has, the more complete it becomes, but this also makes UAV control increasingly complex and inevitably multiplies the UAV's components; while the functional experience is enriched, operation becomes inconvenient, and incidents of UAVs crashing are commonplace. Especially for beginners unfamiliar with UAVs, a thick manual and a complicated control scheme are unfriendly, and once an accident occurs and people are injured, the loss outweighs the gain. Therefore, under the premise that UAV functions are so complex and diversified, simplifying user operation and realizing a multimodal UAV control mode is vitally important.
In summary, it is desirable to provide a multimodal UAV control method, device and system that are simple to operate.
Summary of the invention
To solve the above problems, this application proposes a gesture- and eye-movement-based UAV control method, device and system.
In one aspect, this application proposes a gesture- and eye-movement-based UAV control method, comprising:
a hand-worn device acquires gesture signals and sends them to a head-mounted device;
the head-mounted device processes the gesture signals to obtain control instructions;
the head-mounted device acquires and processes eye-movement signals to obtain mode instructions;
the head-mounted device sends the control instructions and mode instructions to the UAV to control the UAV's flight.
Preferably, processing the gesture signals to obtain control instructions comprises:
preprocessing the gesture signals, extracting features and processing the features to obtain a gesture recognition result;
obtaining the corresponding control instruction according to the gesture recognition result.
Preferably, acquiring and processing the eye-movement signals to obtain mode instructions comprises:
acquiring the eye-movement signals, preprocessing them, extracting features and processing the features to obtain an eye-movement recognition result;
obtaining the corresponding mode instruction according to the eye-movement recognition result.
Preferably, the method further comprises:
the head-mounted device receives the video stream data and depth data of the UAV's surroundings collected by the UAV, and processes the video stream data and depth data to generate and display a hologram of the surroundings.
Preferably, the method further comprises:
the head-mounted device receives the pose data acquired by the UAV about itself, resolves the pose data, and obtains and displays the attitude pattern.
In a second aspect, this application proposes a gesture- and eye-movement-based UAV control device, comprising a hand-worn device and a head-mounted device; the hand-worn device is configured to acquire gesture signals and send them to the head-mounted device;
the head-mounted device is configured to process the gesture signals to obtain control instructions; to acquire and process eye-movement signals to obtain mode instructions; and to send the control instructions and mode instructions to the UAV to control the UAV's flight.
Preferably, the head-mounted device is further configured to process environment and depth data to generate and display a hologram of the surroundings, and to resolve pose data to obtain and display the attitude pattern.
Preferably, the head-mounted device comprises:
an eye-movement acquisition module, configured to acquire eye-movement signals and transmit them to the human physiological signal processing module;
a first communication module, configured to receive the gesture signals and transmit them to the human physiological signal processing module;
a human physiological signal processing module, configured to process the gesture signals and the eye-movement signals to obtain control instructions and mode instructions;
a control module, configured to send the control instructions and mode instructions to the first communication module and to control display by the display module.
Preferably, the head-mounted device further comprises:
an image processing and spatial mapping module, configured to process video stream data and depth data to generate a hologram of the surroundings;
a UAV pose resolving module, configured to resolve pose data to obtain the attitude pattern;
a display module, configured to display the hologram, the attitude pattern and the UAV information;
the first communication module being further configured to receive video stream data and depth data and transmit them to the image processing module, to receive pose data and transmit them to the pose resolving module, to receive UAV information, and to send the control instructions and mode instructions to the UAV.
In a third aspect, this application proposes a gesture- and eye-movement-based UAV control system that uses the above device and further comprises:
a UAV, configured to collect data on its surroundings together with depth data and send them to the head-mounted device; to collect its own pose data and send them to the head-mounted device; and to fly according to the control instructions and mode instructions.
The advantage of this application is that, by combining gestures and eye movement, the UAV control method is simplified and multimodal UAV operation is realized.
Brief description of the drawings
By reading the following detailed description of the preferred embodiments, various other advantages and benefits will become clear to those of ordinary skill in the art. The accompanying drawings serve only to show the preferred embodiments and are not to be considered a limitation of this application. Throughout the drawings, the same components are denoted by the same reference symbols. In the drawings:
Fig. 1 is a schematic diagram of the steps of a gesture- and eye-movement-based UAV control method provided by this application;
Fig. 2 is a schematic diagram of a gesture- and eye-movement-based UAV control device provided by this application;
Fig. 3 is a schematic diagram of the hand-worn device of a gesture- and eye-movement-based UAV control device provided by this application;
Fig. 4 is a schematic diagram of the control gestures of a gesture- and eye-movement-based UAV control device provided by this application;
Fig. 5 is a schematic diagram of the eye-movement acquisition interface of a gesture- and eye-movement-based UAV control device provided by this application;
Fig. 6 is a flow diagram of a gesture- and eye-movement-based UAV control device provided by this application.
Detailed description of the embodiments
Illustrative embodiments of the present disclosure are described below in more detail with reference to the accompanying drawings. Although the drawings show illustrative embodiments of the disclosure, it should be understood that the disclosure may be realized in various forms and should not be limited by the embodiments set forth here; on the contrary, these embodiments are provided so that the disclosure will be understood thoroughly and its scope fully conveyed to those skilled in the art.
According to an embodiment of this application, a gesture- and eye-movement-based UAV control method is proposed, as shown in Fig. 1, comprising:
a hand-worn device acquires gesture signals and sends them to a head-mounted device;
the head-mounted device processes the gesture signals to obtain control instructions;
the head-mounted device acquires and processes eye-movement signals to obtain mode instructions;
the head-mounted device sends the control instructions and mode instructions to the UAV to control the UAV's flight.
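Read as pseudocode, these four steps form a simple acquire-process-send loop. The sketch below is a minimal Python illustration of that loop; the object and method names (hand_device.acquire, headset.process_gesture and so on) are hypothetical placeholders, not identifiers from this application.

```python
def control_loop(hand_device, headset, uav_link):
    """One continuous pass of the gesture/eye-movement control method."""
    while True:
        gesture_signal = hand_device.acquire()                 # hand-worn device acquires the gesture signal
        control_cmd = headset.process_gesture(gesture_signal)  # head-mounted device derives the control instruction
        eye_signal = headset.acquire_eye_movement()            # head-mounted device acquires the eye-movement signal
        mode_cmd = headset.process_eye_movement(eye_signal)    # head-mounted device derives the mode instruction
        uav_link.send(control_cmd, mode_cmd)                   # both instructions are sent to the UAV
```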
Processing the gesture signals to obtain control instructions comprises:
preprocessing the gesture signals, extracting features and processing the features to obtain a gesture recognition result;
obtaining the corresponding control instruction according to the gesture recognition result.
The hand-worn device acquires the user's gesture information (gesture signals) and transmits the gesture signals to the head-mounted device for analysis and processing; the analysis result is mapped to the corresponding control instruction and sent to the UAV, making the UAV change its attitude and direction accordingly.
The transmission mode includes wireless transmission.
The head-mounted device acquires the user's eye-movement information and processes and analyzes it; the analysis result is mapped to the corresponding mode instruction and sent to the UAV, making the UAV perform the corresponding mode switch.
Methods for recognizing gesture signals include data-glove-based methods, electromyography-based methods, computer-vision-based methods and wearable-sensor-based methods.
Algorithms for recognizing gesture signals include deep-learning algorithms, back-propagation (BP) neural network algorithms, dynamic time warping (DTW) algorithms and hidden Markov models (HMM).
Taking the hidden Markov model as an example of a gesture recognition algorithm: an HMM can be described by λ_i = (S, O, A, B, π), abbreviated λ_i = (π, A, B), where S is the set of hidden states, O is the set of observation states, A is the transition probability matrix of the hidden states, B is the observation state probability distribution, and π is the initial state probability distribution vector.
The steps are as follows: considering the temporal nature of the acceleration data, a Bakis-type HMM (a left-to-right HMM) is selected to model each gesture action separately, and the model parameters λ_i = (A, B, π) are initialized; the signal data of each gesture action are acquired repeatedly, and each gesture model λ_i is trained with the Baum-Welch algorithm so that the model parameters converge as far as possible, yielding the optimal λ_i for the corresponding gesture; the Viterbi algorithm is selected as the HMM recognition method for each gesture, i.e. the acceleration feature sequence of an input gesture is evaluated against each trained λ_i, and the λ_i with the maximum probability output gives the recognition result for the corresponding gesture action.
Acquiring and processing the eye-movement signals to obtain mode instructions comprises:
acquiring the eye-movement signals, preprocessing them, extracting features and processing the features to obtain an eye-movement recognition result;
obtaining the corresponding mode instruction according to the eye-movement recognition result.
Algorithms for the eye-movement signals include the back-propagation (BP) algorithm, the support vector machine (SVM) algorithm and the dynamic time warping (DTW) algorithm.
Taking the support vector machine algorithm as an example of an eye-movement recognition algorithm, feature extraction from the eye-movement signal is realized by the following process:
Let the separable data set be D = {(x_i, y_i) | i = 1, 2, …, n}, where the input vectors x_i ∈ R^d, R^d being the d-dimensional real space, and the target values y_i ∈ {−1, +1}: if x_i ∈ R^d belongs to class 1, y_i is labeled positive, i.e. y_i = 1; if it belongs to class 2, y_i is labeled negative, i.e. y_i = −1.
The separable data set D is the eye-movement data, i.e. the data obtained after eye-movement acquisition and processing.
The inner-product function (kernel function) K(x_i, x) can take one of the following three forms:

Polynomial kernel:

K(x_i, x) = [1 + (x_i · x)]^d

where d is the degree of the polynomial and x_i · x denotes the inner product of x_i and x;

Multilayer neural network (sigmoid) kernel:

K(x_i, x) = tanh(v (x_i · x) + c)

where v is a scale factor and c is an offset parameter;

Radial basis kernel:

K(x_i, x) = exp(−‖x − x_i‖² / (2σ²))

where σ is the hyperparameter of the radial basis function (RBF) kernel.

After solving, the optimal decision function is obtained as:

f(x) = sgn( Σ_{i∈J} a_i* y_i K(x_i, x) + b* )

where sgn is the sign function, the asterisk denotes the optimal parameters of the determined decision function, the a_i are the Lagrange multipliers, and the bias parameter b is obtained during the solution from:

b* = (1 / N_NSV) Σ_{j∈JN} ( y_j − Σ_{i∈J} a_i* y_i K(x_i, x_j) )

where N_NSV is the number of standard support vectors, JN is the set of standard support vectors and J is the set of support vectors.
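For illustration, the three inner-product functions above can be written directly in NumPy, as in the minimal sketch below; the default values of d, v, c and sigma are illustrative assumptions.

```python
import numpy as np

def poly_kernel(xi, x, d=3):
    """Polynomial kernel K(x_i, x) = [1 + (x_i . x)]^d."""
    return (1.0 + np.dot(xi, x)) ** d

def mlp_kernel(xi, x, v=0.5, c=-1.0):
    """Multilayer neural network (sigmoid) kernel K(x_i, x) = tanh(v (x_i . x) + c)."""
    return np.tanh(v * np.dot(xi, x) + c)

def rbf_kernel(xi, x, sigma=1.0):
    """Radial basis kernel K(x_i, x) = exp(-||x - x_i||^2 / (2 sigma^2))."""
    diff = np.asarray(x, dtype=float) - np.asarray(xi, dtype=float)
    return np.exp(-np.dot(diff, diff) / (2.0 * sigma ** 2))
```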
The method further comprises:
the head-mounted device receives the video stream data and depth data of the UAV's surroundings collected by the UAV, and processes the video stream data and depth data to generate and display a hologram of the surroundings.
The method further comprises:
the head-mounted device receives the pose data acquired by the UAV about itself, resolves the pose data, and obtains and displays the attitude pattern.
According to an embodiment of this application, a gesture- and eye-movement-based UAV control device is also proposed, comprising a hand-worn device and a head-mounted device;
the hand-worn device is configured to acquire gesture signals and send them to the head-mounted device;
the head-mounted device is configured to process the gesture signals to obtain control instructions; to acquire and process eye-movement signals to obtain mode instructions; and to send the control instructions and mode instructions to the UAV to control the UAV's flight.
The hand-worn device comprises a finger ring with a micro-electro-mechanical system (MEMS) inertial sensor.
The head-mounted device is a head-worn intelligent augmented reality (AR) system device comprising an eye tracker and a camera.
The head-mounted device is further configured to process environment and depth data to generate and display a hologram of the surroundings, and to resolve pose data to obtain and display the attitude pattern.
As shown in Fig. 2, the head-mounted device comprises:
an eye-movement acquisition module, configured to acquire eye-movement signals and transmit them to the human physiological signal processing module;
a first communication module, configured to receive the gesture signals and transmit them to the human physiological signal processing module;
a human physiological signal processing module, configured to process the gesture signals and the eye-movement signals to obtain control instructions and mode instructions;
a control module, configured to send the control instructions and mode instructions to the first communication module and to control display by the display module.
Processing the gesture signals and the eye-movement signals comprises digital signal preprocessing, feature signal extraction and feature signal processing.
The collected gesture signals and eye-movement signals are processed by the human physiological signal processing module: the digital signals (gesture signals and eye-movement signals) are preprocessed, applying baseline drift removal, notch filtering and similar operations to the digitized electrophysiological signals to obtain cleaner digital signals; afterwards, feature signal extraction and feature processing are performed on the preprocessed digital signals.
As shown in Fig. 2, the hand-worn device comprises a gesture acquisition module and a second communication module.
The hand-worn device further comprises a power module.
The gesture acquisition module comprises an accelerometer, a gyroscope, an ARM processor and a data acquisition unit.
The data acquisition unit comprises an analog-to-digital converter (ADC).
Gesture control is used to adjust the UAV's flight speed and direction.
Eye-movement control is used for fine flight attitude adjustment and mode switching.
When controlling the UAV to adjust its pitch, yaw and roll angles or to accelerate and decelerate, the hand motion trajectory captured by the hand-worn device is transmitted through the second communication module to the human physiological signal processing module in the head-mounted device, and instructions based on the analyzed recognition result are sent to the UAV system, making the UAV adjust its pose according to the hand motion trajectory.
As shown in Fig. 3, take as an example a hand-worn device that acquires gesture signals at the second knuckles of the index finger, middle finger, ring finger and little finger.
As shown in Fig. 4, take as an example five gestures corresponding to five instructions: the five gestures are five fingers open, fist, palm pronation and supination, palm flexion and extension, and palm abduction and adduction; the corresponding instructions are deceleration, acceleration, roll, pitch and yaw. A minimal mapping of these gestures to commands is sketched below.
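This mapping reduces to a lookup table from recognition result to control instruction; the key and value strings below are hypothetical names for the gestures and commands of Fig. 4, not identifiers from this application.

```python
# Hypothetical names for the five gestures of Fig. 4 and their instructions.
GESTURE_TO_COMMAND = {
    "five_fingers_open":         "decelerate",
    "fist":                      "accelerate",
    "palm_pronation_supination": "roll",
    "palm_flexion_extension":    "pitch",
    "palm_abduction_adduction":  "yaw",
}

def map_gesture(recognized_gesture):
    """Translate a gesture recognition result into a UAV control instruction."""
    return GESTURE_TO_COMMAND[recognized_gesture]
```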
Taking the hidden Markov model as an example of the gesture recognition algorithm, one HMM is built for each gesture. Each gesture has its own start and end states. Depending on the complexity of the gesture, different gestures have different numbers of states. Recognition accuracy improves as the number of states increases, but the complexity of the model and of the computation time increases with it. The number of states is set to 5 to 10, chosen per gesture according to its complexity. For gesture models over sensor data with temporal characteristics, a left-to-right model (Bakis-type HMM) is usually used.
In a left-to-right model, a state can only be reached from a state with a lower index; a later state cannot jump back to an earlier one. Therefore, in this model, the probability of a backward transition is set to 0, and transitions with a step length greater than 2 are also assumed to have probability 0. To guarantee that every state of the left-to-right model is reachable, the initial state of the model is state 1, i.e. the probability of starting in state 1 is 1 and that of all other states is 0.
In the left-to-right model λ_i = (π, A, B), A is the transition probability matrix of the hidden states, B is the observation state probability distribution, and π is the initial state probability distribution vector, a 1 × N matrix whose element π_i denotes the probability that the model is in state i when the gesture starts. It must satisfy the conditions:
π_1 = 1
π_i = 0 for i ≠ 1
i.e. the initial state probability distribution is π = (1, 0, …, 0).
The second parameter is A, the state transition matrix. In it, each state transitions to itself, to the next state and to the next-but-one state with identical probability, while transitions from a later state to an earlier state and transitions with a step length greater than 2 have probability 0, matching the temporal characteristics of gesture data. For the 8-state left-to-right model, with the trailing rows renormalized, this gives:

A =
[ 1/3 1/3 1/3  0   0   0   0   0
   0  1/3 1/3 1/3  0   0   0   0
   0   0  1/3 1/3 1/3  0   0   0
   0   0   0  1/3 1/3 1/3  0   0
   0   0   0   0  1/3 1/3 1/3  0
   0   0   0   0   0  1/3 1/3 1/3
   0   0   0   0   0   0  1/2 1/2
   0   0   0   0   0   0   0   1  ]

The third parameter is B, whose element B_ij is the probability that state i emits observation value j. Taking a state count of 5 to 10 as an example, if the current state count is 10, B is a 10 × M observation probability matrix (M being the number of observation values), each of whose rows sums to 1.
The gesture model λ_i is trained with the Baum-Welch algorithm so that the model parameters converge as far as possible, yielding the optimal λ_i for the corresponding gesture.
The Viterbi algorithm is selected as the HMM recognition method for each gesture: the acceleration feature sequence of an input gesture is evaluated against each trained λ_i, and the λ_i with the maximum probability output gives the recognition result for the corresponding gesture action; see the sketch below.
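A minimal sketch of this training-and-recognition scheme, assuming the hmmlearn library (an assumption; any HMM implementation offering Baum-Welch fitting and Viterbi decoding would serve). The state count, feature layout and iteration count are illustrative; the left-to-right structure is imposed through the start and transition priors described above.

```python
import numpy as np
from hmmlearn import hmm  # assumed dependency

def left_to_right_priors(n_states):
    """Bakis priors: start in state 1; only self/next/next-but-one transitions."""
    start = np.zeros(n_states)
    start[0] = 1.0                                      # pi = (1, 0, ..., 0)
    trans = np.zeros((n_states, n_states))
    for i in range(n_states):
        reachable = [j for j in (i, i + 1, i + 2) if j < n_states]
        trans[i, reachable] = 1.0 / len(reachable)      # identical forward probabilities
    return start, trans

def train_gesture_model(sequences, n_states=8):
    """Baum-Welch training on the acceleration feature sequences of one gesture."""
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                            init_params="mc", params="tmc", n_iter=50)
    model.startprob_, model.transmat_ = left_to_right_priors(n_states)
    model.fit(np.vstack(sequences), lengths=[len(s) for s in sequences])
    return model

def recognize(models, sequence):
    """Evaluate the input sequence against every trained model (Viterbi) and
    return the gesture whose model yields the maximum log-probability."""
    return max(models, key=lambda g: models[g].decode(sequence, algorithm="viterbi")[0])
```

Because the backward and long-jump entries of the transition matrix start at zero, Baum-Welch re-estimation keeps them at zero, so the Bakis structure survives training.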
As shown in Fig. 5, the region presented on the eye-movement acquisition interface screen is a three-dimensional visual interface: the upper-left area represents positioning mode, the upper-right area gesture mode, the lower-left area sport mode and the lower-right area one-key return mode. Positioning mode automatically performs standard operations such as stabilization, hovering and return via satellite positioning. Sport mode is an accelerated mode built on positioning mode, with automatic obstacle avoidance off and speed increased. Gesture mode does not use positioning; speed is increased and flight is smoother but more dangerous. In one-key return mode the UAV automatically determines the altitude and horizontal distance of the return point and returns automatically.
Taking the support vector machine algorithm as the eye-movement recognition algorithm: once eye-movement capture (eye-movement acquisition) begins, SVM-based feature recognition computes, from the relative motion trace of the eyeball, which mode the user is entering. Before that mode is entered, a 10-second countdown to entering it is shown on the screen, and the current mode is locked after the UAV enters it, avoiding misoperation; this confirm-and-lock step is sketched below.
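A minimal sketch of the confirm-and-lock behavior, with hypothetical region names and the 10-second countdown as a parameter:

```python
import time

# Hypothetical region-to-mode mapping for the interface of Fig. 5.
REGION_TO_MODE = {
    "upper_left": "positioning",
    "upper_right": "gesture",
    "lower_left": "sport",
    "lower_right": "one_key_return",
}

def confirm_and_lock_mode(gaze_region, countdown_s=10):
    """Show a countdown before entering the mode, then lock it to avoid misoperation."""
    mode = REGION_TO_MODE[gaze_region]
    for remaining in range(countdown_s, 0, -1):
        print(f"Entering {mode} mode in {remaining} s ...")  # rendered on screen in the real system
        time.sleep(1)
    return mode  # the current mode remains locked until explicitly changed
```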
Take as an example labels generated by the user's eye movements in the order upper-left, lower-left, upper-right, lower-right: the eye-movement signals so labeled pass through an A/D conversion circuit and are used for training, and the trained initialization parameters are fed to the support vector machine classifier. SVM rests on statistical learning theory and the structural risk minimization principle; its basic idea is to map the samples of the input space to a high-dimensional feature space via a nonlinear transformation and then seek, in that feature space, the optimal classification surface that linearly separates the samples. The classification produced by the support vector machine algorithm is used for UAV mode switching.
Taking SVM-based eye-movement feature extraction as an example: through a nonlinear mapping, the SVM maps the input vector x to a high-dimensional feature space, realizes the nonlinear transformation by defining an appropriate inner-product (kernel) function, and finally constructs the optimal separating hyperplane Z in this high-dimensional space.
Let the separable data set be D = {(x_i, y_i) | i = 1, 2, …, n}, where the input vectors x_i ∈ R^d, R^d being the d-dimensional real space, and the target values y_i ∈ {−1, +1}: if x_i ∈ R^d belongs to class 1, y_i is labeled positive, i.e. y_i = 1; if it belongs to class 2, y_i is labeled negative, i.e. y_i = −1.
These input vectors are mapped into a high-dimensional reproducing kernel Hilbert space (RKHS), in which a linear machine is constructed by minimizing a regularized functional. For the case where the training sample set is nonlinear, the equation of the separating hyperplane is:

⟨w, φ(x)⟩ + b = 0

where w denotes the normal vector of the hyperplane, w ∈ R^d, taken as normalized; φ(·) is the nonlinear function in the low-dimensional space, namely the function that maps the training set data x to a high-dimensional linear feature space.

Through this function, the optimal separating hyperplane is constructed in a linear space of possibly infinite dimension and the decision function of the classifier is solved; b ∈ R is the offset parameter.
The support vectors (SV) are exactly those points on H1 and H2 that are closest to the optimal hyperplane; H1 and H2 are the lines through the points of the two classes closest to the hyperplane H.
The decision function is:

f(x) = sgn(⟨w, φ(x)⟩ + b)

where sgn is the sign function. The SVM is solved through the following quadratic programming problem:
If the data are still not separable in the high-dimensional space, a penalty coefficient can be introduced to obtain the objective function:

min over w, b, ξ of ½‖w‖² + C Σ_{i=1..l} ξ_i

subject to y_i(⟨w, φ(x_i)⟩ + b) ≥ 1 − ξ_i, ξ_i ≥ 0, i = 1, …, l,

where φ(x_i) is the nonlinear function in the high-dimensional space, C is the penalty parameter (C > 0; the larger C, the heavier the punishment of misclassification) and the ξ_i ≥ 0 are slack variables. The dual optimization problem of C-support vector classification is:

max over a of Σ_{i=1..l} a_i − ½ Σ_{i=1..l} Σ_{j=1..l} a_i a_j y_i y_j K(x_i, x_j)

subject to Σ_{i=1..l} a_i y_i = 0 and 0 ≤ a_i ≤ C,

where the scalar product φ(x_i) · φ(x_j) (in the high-dimensional space) is again the kernel function K(x_i, x_j) of the RKHS. With different inner-product functions, the support vector machine algorithm takes different forms; there are mainly three inner-product function forms:
Polynomial kernel:

K(x_i, x) = [1 + (x_i · x)]^d

where d is the degree of the polynomial and x_i · x denotes the inner product of x_i and x;

Multilayer neural network (sigmoid) kernel:

K(x_i, x) = tanh(v (x_i · x) + c)

where v is a scale factor and c is an offset parameter;

Radial basis kernel:

K(x_i, x) = exp(−‖x − x_i‖² / (2σ²))

where σ is the hyperparameter of the radial basis function (RBF) kernel.

After solving, the optimal decision function is obtained as:

f(x) = sgn( Σ_{i∈J} a_i* y_i K(x_i, x) + b* )

where sgn is the sign function, the asterisk denotes the optimal parameters of the determined decision function, the a_i are the Lagrange multipliers, and the bias parameter b is obtained during the solution from:

b* = (1 / N_NSV) Σ_{j∈JN} ( y_j − Σ_{i∈J} a_i* y_i K(x_i, x_j) )

where N_NSV is the number of standard support vectors, JN is the set of standard support vectors and J is the set of support vectors.
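For illustration, a four-class gaze-direction classifier of this kind can be assembled with scikit-learn's SVC, as in the sketch below; the feature extraction is a placeholder, the C and gamma values are illustrative assumptions, and SVC extends the binary derivation above to four classes internally (one-versus-one).

```python
import numpy as np
from sklearn.svm import SVC  # assumed dependency

LABELS = ("upper_left", "lower_left", "upper_right", "lower_right")

def extract_features(segment):
    """Placeholder features from the relative motion trace of the eyeball."""
    seg = np.asarray(segment, dtype=float)
    return np.concatenate([seg.mean(axis=0), seg.std(axis=0)])  # simple summary statistics

def train_eye_movement_svm(segments, labels, C=10.0, gamma=0.1):
    """Train an RBF-kernel SVM on labeled eye-movement segments."""
    X = np.array([extract_features(s) for s in segments])
    clf = SVC(kernel="rbf", C=C, gamma=gamma)
    clf.fit(X, labels)
    return clf

def predict_mode(clf, segment):
    """Classify a new segment; the gaze direction selects the flight mode."""
    return clf.predict([extract_features(segment)])[0]
```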
The human physiological signal processing module preprocesses the raw collected gesture signals and eye-movement signals. The gesture and eye-movement signals are first filtered with an adaptive high-pass filter and an adaptive 50 Hz notch filter, and then band-pass filtered with a finite impulse response (FIR) filter; according to the effective frequency ranges of the signals, the cutoff frequencies chosen for the gesture signal are 2 Hz and 200 Hz, and the passband chosen for the eye-movement signal is 8 Hz to 90 Hz. One possible realization is sketched below.
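Sketched below with SciPy (an assumed dependency) is one way to realize this chain: a 50 Hz IIR notch followed by an FIR band-pass with the passbands named above. The sampling rate and the filter order are illustrative assumptions.

```python
import numpy as np
from scipy.signal import firwin, filtfilt, iirnotch  # assumed dependency

FS = 1000  # sampling rate in Hz, an illustrative assumption

def preprocess(signal, low_hz, high_hz, fs=FS, numtaps=129):
    """50 Hz notch plus FIR band-pass, applied as zero-phase filtering."""
    b_notch, a_notch = iirnotch(w0=50.0, Q=30.0, fs=fs)   # suppress mains interference
    x = filtfilt(b_notch, a_notch, signal)
    b_bp = firwin(numtaps, [low_hz, high_hz], pass_zero=False, fs=fs)  # FIR band-pass
    return filtfilt(b_bp, [1.0], x)

if __name__ == "__main__":
    t = np.arange(0, 1.0, 1.0 / FS)
    raw = np.sin(2 * np.pi * 20 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)  # content + 50 Hz hum
    gesture_clean = preprocess(raw, 2.0, 200.0)   # gesture signal passband: 2-200 Hz
    eye_clean = preprocess(raw, 8.0, 90.0)        # eye-movement signal passband: 8-90 Hz
```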
The gesture signals and eye-movement signals are acquired synchronously: the gesture acquisition module and the eye-movement acquisition module are connected via Bluetooth to realize synchronous acquisition.
The gesture signals output control instructions using an asynchronous control strategy.
The eye-movement signals output mode instructions using an asynchronous control strategy.
As shown in Fig. 6, a step length and a threshold are preset; the system intercepts data according to the step length and classifies the extracted features. When the correlation coefficient P of the resulting prediction reaches the threshold PTrd, the data segment is recorded as one valid datum; when P does not reach PTrd, the system returns and reacquires the signal. When the same eye movement or gesture has accumulated 3 identical valid prediction results, i.e. NTrd > 3, the control instruction and/or mode instruction is output; this logic is sketched below.
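A minimal sketch of this asynchronous output strategy follows; the stream and classifier interfaces, the step length and both threshold values are illustrative assumptions.

```python
P_TRD = 0.8   # correlation-coefficient threshold for a valid prediction (illustrative)
N_TRD = 3     # identical valid predictions required before an instruction is output

def asynchronous_output(stream, classifier, step=64):
    """Intercept data by step length, keep only valid predictions, and output an
    instruction once the same result has accumulated N_TRD times."""
    history = []
    while True:
        segment = stream.read(step)               # intercept a data slice per step length
        label, p = classifier.predict(segment)    # prediction plus correlation coefficient P
        if p < P_TRD:
            history.clear()                       # invalid result: reacquire the signal
            continue
        history.append(label)
        if history[-N_TRD:] == [label] * N_TRD:   # same valid result N_TRD times running
            return label                          # output the control/mode instruction
```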
The head-mounted device further comprises:
an image processing and spatial mapping module, configured to process video stream data and depth data to generate a hologram of the surroundings;
a UAV pose resolving module, configured to resolve pose data to obtain the attitude pattern;
a display module, configured to display the hologram, the attitude pattern and the UAV information;
the first communication module being further configured to receive video stream data and depth data and transmit them to the image processing module, to receive pose data and transmit them to the pose resolving module, to receive UAV information, and to send the control instructions and mode instructions to the UAV.
The UAV information includes multiple important parameters such as power, battery level, altitude and speed.
The head-mounted device can display in real time the hologram of the UAV's surroundings, the UAV's attitude pattern and the UAV information, enabling immersive operation of the UAV and awareness of its current flight environment.
According to an embodiment of this application, a gesture- and eye-movement-based UAV control system is also proposed, which uses the above device and further comprises:
a UAV, configured to collect data on its surroundings together with depth data and send them to the head-mounted device; to collect its own pose data and send them to the head-mounted device; and to fly according to the control instructions and mode instructions.
The surrounding-environment data include the video stream data of the surroundings.
In the method of this application, combining gestures and eye movement simplifies the UAV control method and realizes multimodal UAV operation. Through augmented reality, gesture-UAV interaction, eye-movement-UAV interaction and real-time display of UAV imagery are realized in the AR system, presenting for UAV control an interactive interface whose scene is more three-dimensional, whose information is richer and whose environment feels more natural and welcoming. The UAV is operated by acquiring gesture signals and eye-movement signals; operation is easy, learning is simple and the risk of crashing is reduced. Compared with a single-stick control mode, the bimodal gesture-and-eye-movement UAV control method carries a lower classification risk, offers diversified classification modes, adapts strongly to the environment and is simple to operate; in the course of controlling the UAV's flight, hand, eye and machine cooperate more naturally, fully realizing a comprehensive, dynamic and advantageous combination in the system. The MEMS-sensor-based gesture recognition mode has very high reliability and recognition accuracy. Compared with image-based gesture control, MEMS-sensor-based gesture recognition is unaffected by ambient light and background color, acquires stable data and keeps signal processing simple. When facing complex environments, MEMS-sensor-based gesture control is not easily affected by sudden weather such as haze, rain or thunderstorms, nor is it affected when an object accidentally blocks the line between the hand and a camera device. The head-mounted device can display multiple important parameters of the UAV, such as power, battery level, altitude and speed, on the virtual screen in front of the glasses interface, making it convenient to grasp the UAV's flight state in real time. It can also switch to the UAV's aerial camera view, i.e. the UAV can be operated immersively with awareness of the current flight environment, giving an experience of being on the scene in person.
The above are only preferred specific embodiments of this application, but the protection scope of this application is not limited thereto. Any changes or substitutions that can easily be thought of by anyone skilled in the art within the technical scope disclosed by this application shall be covered within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (10)

1. A gesture- and eye-movement-based UAV control method, characterized by comprising:
a hand-worn device acquiring gesture signals and sending them to a head-mounted device;
the head-mounted device processing the gesture signals to obtain control instructions;
the head-mounted device acquiring and processing eye-movement signals to obtain mode instructions;
the head-mounted device sending the control instructions and mode instructions to the UAV to control the UAV's flight.
2. The gesture- and eye-movement-based UAV control method of claim 1, characterized in that processing the gesture signals to obtain control instructions comprises:
preprocessing the gesture signals, extracting features and processing the features to obtain a gesture recognition result;
obtaining the corresponding control instruction according to the gesture recognition result.
3. The gesture- and eye-movement-based UAV control method of claim 1, characterized in that acquiring and processing the eye-movement signals to obtain mode instructions comprises:
acquiring the eye-movement signals, preprocessing them, extracting features and processing the features to obtain an eye-movement recognition result;
obtaining the corresponding mode instruction according to the eye-movement recognition result.
4. The gesture- and eye-movement-based UAV control method of claim 1, characterized by further comprising:
the head-mounted device receiving the video stream data and depth data of the UAV's surroundings collected by the UAV, and processing the video stream data and depth data to generate and display a hologram of the surroundings.
5. The gesture- and eye-movement-based UAV control method of claim 1, characterized by further comprising:
the head-mounted device receiving the pose data acquired by the UAV about itself, resolving the pose data, and obtaining and displaying the attitude pattern.
6. A gesture- and eye-movement-based UAV control device, characterized by comprising a hand-worn device and a head-mounted device;
the hand-worn device being configured to acquire gesture signals and send them to the head-mounted device;
the head-mounted device being configured to process the gesture signals to obtain control instructions, to acquire and process eye-movement signals to obtain mode instructions, and to send the control instructions and mode instructions to the UAV to control the UAV's flight.
7. The gesture- and eye-movement-based UAV control device of claim 6, characterized in that the head-mounted device is further configured to process environment and depth data to generate and display a hologram of the surroundings, and to resolve pose data to obtain and display the attitude pattern.
8. The gesture- and eye-movement-based UAV control device of claim 6, characterized in that the head-mounted device comprises:
an eye-movement acquisition module, configured to acquire eye-movement signals and transmit them to the human physiological signal processing module;
a first communication module, configured to receive the gesture signals and transmit them to the human physiological signal processing module;
a human physiological signal processing module, configured to process the gesture signals and the eye-movement signals to obtain control instructions and mode instructions;
a control module, configured to send the control instructions and mode instructions to the first communication module and to control display by the display module.
9. The gesture- and eye-movement-based UAV control device of claim 6, characterized in that the head-mounted device further comprises:
an image processing and spatial mapping module, configured to process video stream data and depth data to generate a hologram of the surroundings;
a UAV pose resolving module, configured to resolve pose data to obtain the attitude pattern;
a display module, configured to display the hologram, the attitude pattern and the UAV information;
the first communication module being further configured to receive video stream data and depth data and transmit them to the image processing module, to receive pose data and transmit them to the pose resolving module, to receive UAV information, and to send the control instructions and mode instructions to the UAV.
10. A gesture- and eye-movement-based UAV control system, characterized by comprising the device of any one of claims 6 to 9, and further comprising:
a UAV, configured to collect data on its surroundings together with depth data and send them to the head-mounted device; to collect its own pose data and send them to the head-mounted device; and to fly according to the control instructions and mode instructions.
CN201910524901.3A 2019-06-18 2019-06-18 Gesture- and eye-movement-based UAV control method, device and system Pending CN110412996A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910524901.3A CN110412996A (en) 2019-06-18 2019-06-18 Gesture- and eye-movement-based UAV control method, device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910524901.3A CN110412996A (en) 2019-06-18 2019-06-18 Gesture- and eye-movement-based UAV control method, device and system

Publications (1)

Publication Number Publication Date
CN110412996A true CN110412996A (en) 2019-11-05

Family

ID=68359180

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910524901.3A Pending CN110412996A (en) Gesture- and eye-movement-based UAV control method, device and system

Country Status (1)

Country Link
CN (1) CN110412996A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111093220A (en) * 2019-11-14 2020-05-01 中国人民解放军军事科学院国防科技创新研究院 Autonomous unmanned cluster dynamic management method and management platform
CN111966217A (en) * 2020-07-20 2020-11-20 中国人民解放军军事科学院国防科技创新研究院 Unmanned aerial vehicle control method and system based on gestures and eye movements
CN112650393A (en) * 2020-12-23 2021-04-13 航天时代电子技术股份有限公司 Head-mounted teleoperation control device
CN112860054A (en) * 2019-11-28 2021-05-28 北京宝沃汽车股份有限公司 Method and vehicle for controlling unmanned aerial vehicle
CN113534835A (en) * 2021-07-01 2021-10-22 湘南学院 Tourism virtual remote experience system and method
CN118151759A (en) * 2024-04-09 2024-06-07 广州壹联信息科技有限公司 Man-machine interaction method, system and device
CN118377312A (en) * 2024-06-24 2024-07-23 南京信息工程大学 Unmanned aerial vehicle cluster control method based on virtual reality

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105676860A (en) * 2016-03-17 2016-06-15 歌尔声学股份有限公司 Wearable equipment, unmanned plane control device and control realization method
CN106527466A (en) * 2016-12-15 2017-03-22 鹰眼电子科技有限公司 Wearing type unmanned aerial vehicle control system
CN106569508A (en) * 2016-10-28 2017-04-19 深圳市元征软件开发有限公司 Unmanned aerial vehicle control method and device
CN107454947A (en) * 2016-09-26 2017-12-08 深圳市大疆创新科技有限公司 Unmanned aerial vehicle (UAV) control method, wear-type show glasses and system
CN107885124A (en) * 2017-11-21 2018-04-06 中国运载火箭技术研究院 Brain eye cooperative control method and system in a kind of augmented reality environment
CN108319289A (en) * 2017-01-16 2018-07-24 翔升(上海)电子技术有限公司 Head-wearing display device, unmanned plane, flight system and unmanned aerial vehicle (UAV) control method
CN108700890A (en) * 2017-06-12 2018-10-23 深圳市大疆创新科技有限公司 Unmanned plane makes a return voyage control method, unmanned plane and machine readable storage medium
CN109062398A (en) * 2018-06-07 2018-12-21 中国航天员科研训练中心 A kind of Spacecraft Rendezvous interconnection method based on virtual reality Yu multi-modal man-machine interface
CN109669477A (en) * 2019-01-29 2019-04-23 华南理工大学 A kind of cooperative control system and control method towards unmanned plane cluster
CN111966217A (en) * 2020-07-20 2020-11-20 中国人民解放军军事科学院国防科技创新研究院 Unmanned aerial vehicle control method and system based on gestures and eye movements

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105676860A (en) * 2016-03-17 2016-06-15 歌尔声学股份有限公司 Wearable equipment, unmanned plane control device and control realization method
CN107454947A (en) * 2016-09-26 2017-12-08 深圳市大疆创新科技有限公司 Unmanned aerial vehicle (UAV) control method, wear-type show glasses and system
CN106569508A (en) * 2016-10-28 2017-04-19 深圳市元征软件开发有限公司 Unmanned aerial vehicle control method and device
CN106527466A (en) * 2016-12-15 2017-03-22 鹰眼电子科技有限公司 Wearing type unmanned aerial vehicle control system
CN108319289A (en) * 2017-01-16 2018-07-24 翔升(上海)电子技术有限公司 Head-wearing display device, unmanned plane, flight system and unmanned aerial vehicle (UAV) control method
CN108700890A (en) * 2017-06-12 2018-10-23 深圳市大疆创新科技有限公司 Unmanned plane makes a return voyage control method, unmanned plane and machine readable storage medium
CN107885124A (en) * 2017-11-21 2018-04-06 中国运载火箭技术研究院 Brain eye cooperative control method and system in a kind of augmented reality environment
CN109062398A (en) * 2018-06-07 2018-12-21 中国航天员科研训练中心 A kind of Spacecraft Rendezvous interconnection method based on virtual reality Yu multi-modal man-machine interface
CN109669477A (en) * 2019-01-29 2019-04-23 华南理工大学 A kind of cooperative control system and control method towards unmanned plane cluster
CN111966217A (en) * 2020-07-20 2020-11-20 中国人民解放军军事科学院国防科技创新研究院 Unmanned aerial vehicle control method and system based on gestures and eye movements

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111093220A (en) * 2019-11-14 2020-05-01 中国人民解放军军事科学院国防科技创新研究院 Autonomous unmanned cluster dynamic management method and management platform
CN112860054A (en) * 2019-11-28 2021-05-28 北京宝沃汽车股份有限公司 Method and vehicle for controlling unmanned aerial vehicle
CN111966217A (en) * 2020-07-20 2020-11-20 中国人民解放军军事科学院国防科技创新研究院 Unmanned aerial vehicle control method and system based on gestures and eye movements
CN111966217B (en) * 2020-07-20 2023-08-18 中国人民解放军军事科学院国防科技创新研究院 Unmanned aerial vehicle control method and system based on gestures and eye movements
CN112650393A (en) * 2020-12-23 2021-04-13 航天时代电子技术股份有限公司 Head-mounted teleoperation control device
CN113534835A (en) * 2021-07-01 2021-10-22 湘南学院 Tourism virtual remote experience system and method
CN113534835B (en) * 2021-07-01 2022-05-31 湘南学院 Tourism virtual remote experience system and method
CN118151759A (en) * 2024-04-09 2024-06-07 广州壹联信息科技有限公司 Man-machine interaction method, system and device
CN118377312A (en) * 2024-06-24 2024-07-23 南京信息工程大学 Unmanned aerial vehicle cluster control method based on virtual reality

Similar Documents

Publication Publication Date Title
CN110412996A (en) Gesture- and eye-movement-based UAV control method, device and system
Mahmud et al. Interface for human machine interaction for assistant devices: A review
US20180186452A1 (en) Unmanned Aerial Vehicle Interactive Apparatus and Method Based on Deep Learning Posture Estimation
CN110083202B (en) Multimode interaction with near-eye display
Qi et al. Computer vision-based hand gesture recognition for human-robot interaction: a review
CN109062398B (en) Spacecraft rendezvous and docking method based on virtual reality and multi-mode human-computer interface
CN111055279A (en) Multi-mode object grabbing method and system based on combination of touch sense and vision
CN111966217B (en) Unmanned aerial vehicle control method and system based on gestures and eye movements
CN105159452B (en) A kind of control method and system based on human face modeling
CN106648068A (en) Method for recognizing three-dimensional dynamic gesture by two hands
Kassab et al. Real-time human-UAV interaction: New dataset and two novel gesture-based interacting systems
JPH0830327A (en) Active environment recognition system
CN111461059A (en) Multi-zone multi-classification extensible gesture recognition control device and control method
Krishnaraj et al. A Glove based approach to recognize Indian Sign Languages
Abualola et al. Flexible gesture recognition using wearable inertial sensors
CN108052901A (en) A kind of gesture identification Intelligent unattended machine remote control method based on binocular
CN111695408A (en) Intelligent gesture information recognition system and method and information data processing terminal
Haratiannejadi et al. Smart glove and hand gesture-based control interface for multi-rotor aerial vehicles in a multi-subject environment
CN211979681U (en) Multi-zone multi-classification extensible gesture recognition control device
Haratiannejadi et al. Smart glove and hand gesture-based control interface for multi-rotor aerial vehicles
CN111134974B (en) Wheelchair robot system based on augmented reality and multi-mode biological signals
CN116257130A (en) XR (X-ray diffraction) equipment-based eye movement gesture unmanned aerial vehicle control method
Srisuphab et al. Artificial neural networks for gesture classification with inertial motion sensing armbands
Sung et al. Motion quaternion-based motion estimation method of MYO using K-means algorithm and Bayesian probability
Dhamanskar et al. Human computer interaction using hand gestures and voice

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20191105