CN106293103B - Gesture control device and gesture control method for four-axis aircraft based on inertial sensor - Google Patents

Publication number
CN106293103B
CN106293103B
Authority
CN
China
Prior art keywords
finger, gesture, hand, fingers, axis
Prior art date
Legal status
Active
Application number
CN201610920077.XA
Other languages
Chinese (zh)
Other versions
CN106293103A (en)
Inventor
余乐
李洋洋
陈岩
王瑶
吴超
董文菲
李阳光
Current Assignee
Beijing Technology and Business University
Original Assignee
Beijing Technology and Business University
Application filed by Beijing Technology and Business University filed Critical Beijing Technology and Business University
Priority to CN201610920077.XA
Publication of CN106293103A
Application granted
Publication of CN106293103B
Legal status: Active

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 — Gesture-based interaction, e.g. based on a set of recognized hand gestures

Abstract

The invention relates to a motion-sensing gesture control device and gesture control method for a four-axis aircraft. The gesture control device comprises a controller and inertial sensing nodes. Each inertial sensing node must include a three-axis gyroscope and a three-axis accelerometer, or a six-axis inertial sensor integrating the two; a three-axis magnetometer is optional. The inertial sensing node is fixed on the back of the second knuckle of a finger with the positive Y axis pointing toward the fingertip. Motion attitude fusion adopts a strapdown navigation algorithm. The controller integrates a six-axis inertial sensor used as a reference point, and the actually measured angle of a finger is the relative angle between the inertial sensor on the finger joint and this reference point. The control method is divided, in order of acquisition and processing, into three steps: sensor configuration, error acquisition, and finger instruction calculation. The method uses only the roll angle to judge the finger gesture. The left-hand gesture instructions determine the throttle gear from the number of extended fingers; the right-hand gesture instructions determine the forward, backward, left, and right flight directions from the extension and curling of the thumb and the other four fingers.

Description

Gesture control device and gesture control method for four-axis aircraft based on inertial sensor
Technical Field
The invention relates to a gesture control device and a gesture control method, in particular to a gesture control device and a gesture control method for a four-axis aircraft based on an inertial sensor.
Background
In recent years, the four-axis aircraft has become increasingly popular in the consumer market as a kind of aerial "selfie stick". However, commercial four-axis aircraft are still mainly operated with handheld controllers. Recently, some researchers have attempted to control four-axis aircraft with gestures, and the core of gesture control is gesture acquisition and gesture recognition.
Currently there are generally two approaches to gesture recognition. One is based on machine vision: depth information of the captured three-dimensional space is extracted with a binocular camera and the gesture is reconstructed in three dimensions, as typified by the Kinect and Leap Motion. The greatest advantage of this approach is bare-handed operation, which is the most desirable manner of manipulation. Its defects are that it places severe requirements on ambient light, and illumination intensity, uniformity, and the like strongly affect the recognition rate. In addition, image-based gesture recognition is dominated by large matrix operations that require a dedicated graphics processor for acceleration; the delay and power consumption of the whole processing chain are large, so this approach is usually confined to host devices.
The other approach is based on sensor technology: finger motion is detected with various types of sensors attached to the finger joints, among which inertial sensors are the most prominent. Such inertial sensors typically comprise a three-axis gyroscope and a three-axis accelerometer (some products also include a three-axis magnetometer). The greatest advantages of this mode of gesture recognition are that the measured data are direct, acquisition is fast, power consumption is low, and environmental influence is small, making it suitable for scenarios with high real-time requirements, and in particular for controlling a four-axis aircraft outdoors.
Existing inertial-sensor-based gesture control methods have three problems. (1) Recognition precision and speed are not balanced: some schemes use 9-axis inertial sensors and correspondingly heavy attitude-solving algorithms, one-sidedly pursuing attitude precision, and their post-fusion delay reaches tens or hundreds of milliseconds; other schemes use only a 3-axis inertial sensor without attitude fusion and judge directly from raw data, so sensor drift greatly degrades subsequent recognition accuracy. (2) Robustness is insufficient: the recognition rate is high only when the palm is held in a certain posture, such as lifted flat, and drops drastically when the hand is tilted. (3) The gesture instructions are not distinct enough: for example, rotating the wrist and swinging the arm overlap, since a swing produces a slight incidental rotation, and misoperation easily occurs.
Disclosure of Invention
Problems to be solved by the invention
The invention aims to solve the technical problems of balancing gesture recognition accuracy and speed, improving the robustness of gesture recognition, and providing a set of clearly distinguishable gesture instructions that are convenient to recognize stably.
Solution for solving the problem
In view of the above, the present invention provides a gesture control device and a gesture control method for a four-axis aircraft based on inertial sensors, addressing the problems described above.
In one aspect, a gesture control apparatus is provided, comprising a controller and inertial sensing nodes. The controller is fixed on the back of the hand, and an inertial sensing node is fixed at the second knuckle of each finger. The inertial sensing node collects the motion attitude angle information of the finger and outputs it to the controller; the controller collects the sensor output data and sends control instructions to the four-axis aircraft.
The inertial sensing node must include a three-axis gyroscope and a three-axis accelerometer, or a six-axis inertial sensor integrating the two; a three-axis magnetometer is optional.
The inertial sensing node is fixed on the back of the second knuckle of the finger, with the positive Y axis pointing toward the fingertip.
Motion attitude fusion adopts a strapdown navigation algorithm. After fusion, the pitch angle measurement range is -90° to +90° and the roll angle measurement range is -180° to +180°. Geomagnetic interference causes the yaw angle to drift continuously, so no accurate yaw measurement is available.
In the control method, the finger gesture is judged only by using the roll angle.
The controller integrates a six-axis inertial sensor used as a reference point; the actually measured angle of a finger is the relative angle between the finger-joint inertial sensor and this reference point.
On the other hand, a control method is provided, and the specific implementation steps are as follows:
step 1: the sensor is configured and the error value is acquired, namely the sensor is configured according to certain requirements, and the error values of the angular speed and the acceleration are acquired.
Step 2: the method comprises the steps of collecting finger information and calculating the bending degree of fingers, namely collecting the acceleration and the angular velocity of each finger, and calculating the bending degree of each finger relative to the back of the hand.
Step 3: the method comprises the steps of calculating the current gesture and sending a corresponding control command, namely, calculating the current gesture according to the bending degree of fingers of two hands, and sending a corresponding gesture control command to the four-axis aircraft.
The left hand control instruction specifically comprises:
throttle first gear: the palm of the hand is facing the ground, the index finger is flattened, and the rest of the fingers are curled. The gesture is exemplified by an index finger, and may be a stretching action of any finger.
Throttle second gear: the palm of the hand is facing the ground, the index and middle fingers are flattened, and the remaining fingers are curled. The gesture takes an index finger and a middle finger as examples, and can also be the stretching action of any two other fingers.
Three gears of accelerator: the palm of the hand is facing the ground, the index finger, the middle finger and the ring finger are flattened, and the rest of the fingers are curled. The gesture takes an index finger, a middle finger and a ring finger as examples, and can also be the stretching action of any three fingers.
Four gears of the accelerator: the palm of the hand is directed toward the ground, and the index finger, middle finger, ring finger and little finger are flattened, and the thumb is curled. The gesture takes an index finger, a middle finger, a ring finger and a little finger as an example, and can also be the stretching action of any four fingers.
Five gears of the accelerator: indicating that the palm of the hand is facing the ground, five fingers are flattened.
The right hand gesture instruction specifically comprises:
forward flight: the palm of the hand is facing the sky, the thumb is curled, and the other four fingers are flattened.
Flying backwards: the palm of the hand is facing the sky, the thumb is curled, and the remaining four fingers are curled and pressed against the thumb.
Flying to the left: with the palm facing toward the ground, the thumb Ping Shenxiang is left and the remaining four fingers are curled.
Flying to the right: the palm of the hand is facing the sky, the thumb is flat to the right, and the other four fingers are curled.
Drawings
Fig. 1 is a schematic diagram of the left-hand gesture commands of the control device;
Fig. 2 is a schematic diagram of the right-hand gesture commands of the control device;
Fig. 3 is a schematic structural diagram of the control device of the present invention;
Fig. 4 is a control flow chart of the present invention.
Detailed Description
Various exemplary embodiments, features and aspects of the invention will be described in detail below with reference to the drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Although various aspects of the embodiments are illustrated in the accompanying drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
In addition, numerous specific details are set forth in the following description in order to provide a better illustration of the invention. It will be understood by those skilled in the art that the present invention may be practiced without some of these specific details. In some instances, well known methods, procedures, components, and circuits have not been described in detail so as not to obscure the present invention.
Fig. 1 shows a schematic diagram of the left-hand gesture commands of a control device according to an embodiment of the present invention, in which finger numbers 101 to 105 represent the thumb, index finger, middle finger, ring finger, and little finger of the left hand, respectively. The left hand controls the throttle of the four-axis aircraft; gestures 106 to 110 indicate the five throttle gears from low to high in sequence.
Left-hand gesture 106: the palm faces the ground, the index finger is extended flat, and the remaining fingers are curled; this gesture sets the throttle to first gear. The gesture takes the index finger as an example and may also be the extension of any other single finger.
Left-hand gesture 107: the palm faces the ground, the index and middle fingers are extended flat, and the remaining fingers are curled; this gesture sets the throttle to second gear. The gesture takes the index and middle fingers as an example and may also be the extension of any other two fingers.
Left-hand gesture 108: the palm faces the ground, the index, middle, and ring fingers are extended flat, and the remaining fingers are curled; this gesture sets the throttle to third gear. The gesture takes these three fingers as an example and may also be the extension of any three fingers.
Left-hand gesture 109: the palm faces the ground, the index, middle, ring, and little fingers are extended flat, and the thumb is curled; this gesture sets the throttle to fourth gear. The gesture takes these four fingers as an example and may also be the extension of any four fingers.
Left-hand gesture 110: the palm faces the ground and all five fingers are extended flat; this gesture sets the throttle to fifth gear.
Fig. 2 shows a schematic diagram of the right-hand gesture commands of a control device according to an embodiment of the present invention, in which finger numbers 201 to 205 represent the thumb, index finger, middle finger, ring finger, and little finger of the right hand, respectively. Right-hand gestures 206 to 209 respectively control the four-axis aircraft to fly forward, backward, leftward, and rightward, where the four directions are defined relative to the initial orientation of the four-axis aircraft.
Right gesture 206: indicating that the palm of the hand is facing the sky, the thumb is curled, the remaining four fingers are flattened, and the gesture controls the aircraft to fly forward.
Right gesture 207: indicating that the palm of the hand is facing the sky, the thumb is curled, the remaining four fingers are curled and pressed against the thumb, and the gesture controls the aircraft to fly backwards.
Right gesture 208: indicating that the palm faces the ground, the thumb extends flat to the left, and the remaining four fingers are curled; this gesture controls the aircraft to fly to the left.
Right gesture 209: indicating that the palm of the hand is facing the sky, the thumb is horizontally stretched to the right, the remaining four fingers are curled, and the gesture controls the aircraft to fly to the right.
Fig. 3 shows a schematic diagram of a control device 300, a connection 309, and a finger-joint detection device 310 according to an embodiment of the present invention. The control device includes a microcontroller 301, an inertial sensor 302, a wireless transmission module (1) 303, a wireless transmission module (2) 304, a power module 305, and an LED and other peripheral modules 306; the detection device includes a connection port 307 and an inertial sensor 308. The microcontroller 301 communicates over SPI with the inertial sensor 302, the wireless transmission module (1) 303, the wireless transmission module (2) 304, and the connection port 307. The connection port 307 is directly wired to the inertial sensor 308. The control device 300 is fixed on the back of the hand with the positive Y axis of the inertial sensor 302 pointing toward the four fingers. The detection device 310 is fixed on the back of the second knuckle of a finger with the positive Y axis of the inertial sensor 308 pointing toward the fingernail.
Fig. 4 shows a control method of a four-axis aircraft according to the invention. The specific implementation steps are as follows:
step 1: configuring a sensor and collecting error values
In one possible implementation, the controller configures the inertial sensors as required, controls them to sample the angular velocity Gyro and acceleration Acc of the fingers at a fixed rate, and reads both into the controller to calculate the angular velocity error E_Gyro and the acceleration error E_Acc. The angular velocity and acceleration are three-dimensional vectors.
In one embodiment:
Place the fingers flat with the backs of the hands facing upward and keep them still; collect the acceleration and angular velocity values and record the number of samples num.
Collect the angular velocity: Gyro(i) = Gyro_Correct()
Collect the acceleration: Acc(i) = Acc_Correct()
Calculate the angular velocity error: E_Gyro = (1/num)·Σ_{i=1..num} Gyro(i)
Calculate the acceleration error: E_Acc = (1/num)·Σ_{i=1..num} Acc(i) − [0, 0, g]^T, where g is the gravitational acceleration in the sensor's output units.
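Step 1 amounts to averaging a batch of static samples. A minimal sketch in Python, assuming (3,)-per-sample arrays and a Z-up gravity reference, neither of which is fixed by the patent:

```python
import numpy as np

def collect_errors(gyro_samples, acc_samples, g=9.81):
    """Estimate sensor error values from num static samples.

    gyro_samples, acc_samples: (num, 3) arrays captured with the hands
    flat, backs up, and motionless. The gyro error is the mean reading
    (the true rate is zero at rest); the accelerometer error is the
    mean reading minus the expected gravity vector [0, 0, g].
    """
    e_gyro = np.mean(gyro_samples, axis=0)
    e_acc = np.mean(acc_samples, axis=0) - np.array([0.0, 0.0, g])
    return e_gyro, e_acc
```

These error vectors are then subtracted from every subsequent sample during calibration in step 2.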
step 2: collecting finger information and calculating the bending degree of the finger
In one possible implementation, the collected angular velocity Gyro(i) and acceleration Acc(i) are fed to a strapdown inertial navigation attitude algorithm to solve the finger attitude and calculate the bending degree of the finger.
In one embodiment:
(1) First, calibrate the collected Gyro(i) and Acc(i) by subtracting the error values:
Final.Gyro(i).X = Gyro(i).X − E_Gyro.X, Final.Gyro(i).Y = Gyro(i).Y − E_Gyro.Y, Final.Gyro(i).Z = Gyro(i).Z − E_Gyro.Z (1)
Final.Acc(i).X = Acc(i).X − E_Acc.X, Final.Acc(i).Y = Acc(i).Y − E_Acc.Y, Final.Acc(i).Z = Acc(i).Z − E_Acc.Z (2)
where Final.Gyro(i).X/Y/Z and Final.Acc(i).X/Y/Z denote the calibrated angular velocity and acceleration components on each axis.
(2) Unitize the calibrated acceleration and angular velocity, obtaining FN.Gyro(i).X, FN.Gyro(i).Y, FN.Gyro(i).Z, FN.Acc(i).X, FN.Acc(i).Y, and FN.Acc(i).Z.
(3) Convert the gravitational acceleration vector in the geographic coordinate system, [0, 0, 1]^T, through the attitude transformation matrix C_n^b into the vector V in the carrier coordinate system:
V = C_n^b · [0, 0, 1]^T = [2(q_1·q_3 − q_0·q_2), 2(q_0·q_1 + q_2·q_3), q_0² − q_1² − q_2² + q_3²]^T (5)
where C_n^b is the matrix formed from the quaternion (q_0, q_1, q_2, q_3), whose initial value is (1, 0, 0, 0).
(4) In the carrier coordinate system, calculate the error between the acceleration FN.Acc(i) measured by the inertial sensor and the acceleration V obtained from the attitude matrix, denoted e:
e = FN.Acc(i) × V (6)
(5) Correct the angular velocity by proportional-plus-integral correction; the correction term δ and the corrected angular velocity w are:
δ = K_p·e + K_i·∫e dt (7)
w = FN.Gyro(i) + δ (8)
(6) Update the quaternion with the corrected angular velocity w = (w_x, w_y, w_z). Discretizing the quaternion differential equation over a sampling period T gives:
q_0 = q_0 + ½(−q_1·w_x − q_2·w_y − q_3·w_z)·T
q_1 = q_1 + ½( q_0·w_x + q_2·w_z − q_3·w_y)·T
q_2 = q_2 + ½( q_0·w_y − q_1·w_z + q_3·w_x)·T
q_3 = q_3 + ½( q_0·w_z + q_1·w_y − q_2·w_x)·T
after which the quaternion is re-normalized; q_0, q_1, q_2, q_3 are the components of the updated quaternion.
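Steps (4) to (6) together form one pass of a Mahony-style complementary filter. A self-contained sketch under assumed gains and sampling period (K_p, K_i, and dt here are illustrative values, not taken from the patent):

```python
import numpy as np

def fuse_step(q, gyro, acc, integral, kp=2.0, ki=0.01, dt=0.01):
    """One attitude-fusion step: accelerometer error (eq. 6),
    PI correction (eqs. 7-8), then quaternion integration."""
    q0, q1, q2, q3 = q
    # gravity direction predicted by the current attitude (eq. 5)
    v = np.array([2 * (q1 * q3 - q0 * q2),
                  2 * (q0 * q1 + q2 * q3),
                  q0 * q0 - q1 * q1 - q2 * q2 + q3 * q3])
    a = acc / np.linalg.norm(acc)          # unitized measured acceleration
    e = np.cross(a, v)                     # error vector, eq. (6)
    integral = integral + ki * e * dt      # integral term of eq. (7)
    w = gyro + kp * e + integral           # corrected angular velocity, eq. (8)
    wx, wy, wz = w
    # first-order integration of the quaternion differential equation
    dq = 0.5 * np.array([-q1 * wx - q2 * wy - q3 * wz,
                          q0 * wx + q2 * wz - q3 * wy,
                          q0 * wy - q1 * wz + q3 * wx,
                          q0 * wz + q1 * wy - q2 * wx])
    q = q + dq * dt
    return q / np.linalg.norm(q), integral
```

At rest with gravity along +Z, the measured and predicted gravity directions coincide, the error is zero, and the quaternion stays at identity, which is how the accelerometer bounds gyro drift.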
(7) Determine the Euler angles from the updated quaternion:
Heading angle: Yaw = arctan2(2(q_1·q_2 + q_0·q_3), q_0² + q_1² − q_2² − q_3²)
Pitch angle: Pitch = arcsin(2(q_0·q_2 − q_1·q_3))
Roll angle: Roll = arctan2(2(q_2·q_3 + q_0·q_1), q_0² − q_1² − q_2² + q_3²)
The roll angle is the rotation angle of an inertial sensor about its X axis. The rotation angle of the back-of-hand inertial sensor 302 about the X axis is denoted D_0, and the rotation angle of the inertial sensor 308 on each finger (except the thumb) about the X axis is denoted D_i; the angle of each finger (except the thumb) relative to the back of the hand is then DT_i = D_i − D_0.
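The roll extraction and the relative angle DT_i can be sketched as follows, using the same quaternion convention as the formulas above (the function names are illustrative):

```python
import math

def roll_deg(q):
    """Roll angle (rotation about X) of a quaternion (q0, q1, q2, q3),
    in degrees, per the roll formula above."""
    q0, q1, q2, q3 = q
    return math.degrees(math.atan2(2 * (q2 * q3 + q0 * q1),
                                   q0 * q0 - q1 * q1 - q2 * q2 + q3 * q3))

def finger_bend(q_finger, q_hand):
    """DT_i = D_i - D_0: finger roll relative to the back-of-hand sensor."""
    return roll_deg(q_finger) - roll_deg(q_hand)
```

Because only the difference of roll angles is used, a common tilt of the whole hand cancels out, which is what gives the method its robustness to hand inclination.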
Step 3: calculating current gesture and sending corresponding control command
In one possible implementation, keep the five fingers together and the back of the hand facing upward and stationary. D_0 then stays between C_1 and C_2; for example, repeated experiments found it stable between −3° and +3°. The rotation angles D_2, D_3, D_4, and D_5 of the inertial sensors on the index, middle, ring, and little fingers about the X axis stay between C_3 and C_4; for example, repeated experiments found them stable between −3° and +5° when the four fingers are held together and straightened. The rotation angle D_1 of the inertial sensor on the thumb about the X axis stays between C_5 and C_6; for example, repeated experiments found it stable between 30° and 40°.
When the hand is gripped with the second knuckles of the four fingers facing horizontally upward and the thumb opened perpendicular to the four fingers, D_2, D_3, D_4, and D_5 all lie between C_7 and C_8; for example, the measured data are stable between 75° and 93°. D_1 lies between C_9 and C_10; for example, the data are stable between 70° and 87°.
In summary, when DT_i (i = 2, 3, 4, 5) is between 0° and (C_4 − C_1 + ε), for example between 0° and +10°, the finger corresponding to DT_i is considered straightened relative to the back of the hand; when D_1 is between 0° and (C_6 + ε), for example between 0° and +40°, the thumb is considered closed. Here ε denotes a small angular margin.
When DT_i (i = 2, 3, 4, 5) is greater than (C_7 + ε), for example greater than 65°, the finger corresponding to DT_i is considered gripped relative to the back of the hand; when D_1 is greater than (C_9 + ε), for example greater than 70°, the thumb is considered open.
Therefore, the current gesture can be determined from the number of opened and gripped fingers, and finally the command corresponding to that gesture is sent to the four-axis aircraft.
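The threshold tests above can be collected into a small classifier. The numeric defaults are the example values from the text; in practice the C-constants and ε would be re-measured per device:

```python
def finger_state(dt_i, straight_max=10.0, grip_min=65.0):
    """Classify a non-thumb finger from DT_i (degrees):
    0..straight_max -> straightened, above grip_min -> gripped."""
    if 0.0 <= dt_i <= straight_max:
        return "straight"
    if dt_i > grip_min:
        return "gripped"
    return "indeterminate"

def thumb_state(d_1, closed_max=40.0, open_min=70.0):
    """Classify the thumb from its roll angle D_1 (degrees):
    0..closed_max -> closed, above open_min -> open."""
    if 0.0 <= d_1 <= closed_max:
        return "closed"
    if d_1 > open_min:
        return "open"
    return "indeterminate"
```

Angles falling between the two bands are deliberately reported as indeterminate rather than forced into a state, which suppresses misoperation during transitions.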

Claims (9)

1. A gesture control device, characterized by comprising: a controller and an inertial sensing node, wherein the controller is fixed on the back of the hand, and the inertial sensing node is fixed at the second knuckle of a finger; the inertial sensing node is used for collecting the motion attitude angle information of the finger and outputting the information to the controller, and the controller is used for collecting the output data of the sensor and sending a control instruction to the four-axis aircraft;
the motion attitude angle information of the finger is the bending degree of the finger relative to the back of the hand;
the process of calculating the bending degree of each finger relative to the back of hand comprises the following steps:
calibrating the collected angular velocity and acceleration;
unitizing the calibrated angular velocity and acceleration to obtain the unitized angular velocity and unitized acceleration;
converting the gravity acceleration vector in the geographic coordinate system into a vector in the carrier coordinate system through an attitude transformation matrix, wherein the attitude transformation matrix is a matrix formed from a quaternion;
under the carrier coordinate system, calculating the error between the acceleration measured by the inertial sensor and the acceleration converted through the attitude matrix;
correcting the unitized angular velocity by proportional and integral correction based on the error to obtain a corrected angular velocity;
updating the quaternion with the corrected angular velocity to obtain an updated quaternion;
calculating the Euler angles, comprising the heading angle, the pitch angle, and the roll angle, from the updated quaternion, and calculating the bending degree of each finger relative to the back of the hand based on the roll angle.
2. The gesture control device of claim 1, wherein the inertial sensing node must include a three-axis gyroscope and a three-axis accelerometer, or a six-axis inertial sensor integrating the two, and a three-axis magnetometer is optional.
3. The gesture control device of claim 1 wherein the inertial sensing node is fixed to the back of the second knuckle of the finger and the Y-axis positive direction is directed toward the fingertip.
4. The gesture control device according to claim 1, wherein a strapdown navigation algorithm is adopted for motion attitude fusion; after fusion, the pitch angle measurement range is -90° to +90° and the roll angle measurement range is -180° to +180°; geomagnetic interference causes the yaw angle to drift continuously, so no accurate yaw measurement is available.
5. The gesture control device according to claim 1, wherein the controller is integrated with a six-axis inertial sensor as a reference point, and the actual measured angle of the finger is a relative angle of the inertial sensor of the finger joint to the reference point.
6. A gesture control method, characterized in that the control method is divided into three steps in the order of acquisition and processing:
step 1: configuring a sensor and collecting error values, namely configuring the sensor according to certain requirements and collecting error values of angular speed and acceleration;
step 2: collecting finger information and calculating the bending degree of fingers, namely collecting the acceleration and the angular velocity of each finger, and calculating the bending degree of each finger relative to the back of the hand;
the process of calculating the bending degree of each finger relative to the back of hand comprises the following steps:
calibrating the collected angular velocity and acceleration;
unitizing the calibrated angular velocity and acceleration to obtain unitized angular velocity and unitized acceleration;
converting the gravity acceleration vector in the geographic coordinate system into a vector in the carrier coordinate system through an attitude transformation matrix, wherein the attitude transformation matrix is a matrix formed from a quaternion;
under the carrier coordinate system, calculating the error between the acceleration measured by the inertial sensor and the acceleration converted through the attitude matrix;
correcting the unitized angular velocity by proportional and integral correction based on the error to obtain a corrected angular velocity;
updating the quaternion with the corrected angular velocity to obtain an updated quaternion;
calculating the Euler angles, comprising the heading angle, the pitch angle, and the roll angle, from the updated quaternion, and calculating the bending degree of each finger relative to the back of the hand based on the roll angle;
step 3: the method comprises the steps of calculating the current gesture and sending a corresponding control command, namely, calculating the current gesture according to the bending degree of fingers of two hands, and sending a corresponding gesture control command to the four-axis aircraft.
7. The gesture control method according to claim 6, wherein in the control method, the finger gesture is determined using only the roll angle.
8. The gesture control method according to claim 6, wherein the left-hand control command is specifically:
throttle first gear: the palm of the hand faces the ground, the index finger is flattened, and the rest fingers are curled; the gesture takes an index finger as an example, and can also be the stretching action of any other finger;
throttle second gear: the palm of the hand faces the ground, the index finger and the middle finger are flattened, and the rest fingers are curled; the gesture takes an index finger and a middle finger as examples, and can also be the stretching action of any two other fingers;
three gears of accelerator: the palm of the hand faces the ground, the index finger, the middle finger and the ring finger are flattened, and the rest fingers are curled; the gesture takes an index finger, a middle finger and a ring finger as examples, and can also be the stretching action of any three fingers;
four gears of the accelerator: the palm of the hand faces the ground, and the index finger, the middle finger, the ring finger and the little finger are stretched, and the thumb is curled; the gesture takes an index finger, a middle finger, a ring finger and a little finger as an example, and can also be the stretching action of any four fingers;
five gears of the accelerator: indicating that the palm of the hand is facing the ground, five fingers are flattened.
9. The gesture control method according to claim 6, wherein the right-hand gesture instructions are specifically:
forward flight: the palm of the hand is turned to the sky, the thumb is curled, and the other four fingers are stretched flatly;
flying backwards: the palm of the hand is turned to the sky, the thumb is curled, and the other four fingers are curled and pressed on the thumb;
flying to the left: the palm of the hand is facing the ground, the thumb Ping Shenxiang is left, and the other four fingers are curled;
flying to the right: the palm of the hand is facing the sky, the thumb is flat to the right, and the other four fingers are curled.
CN201610920077.XA 2016-10-21 2016-10-21 Gesture control device and gesture control method for four-axis aircraft based on inertial sensor Active CN106293103B (en)

Publications (2)

Publication Number Publication Date
CN106293103A CN106293103A (en) 2017-01-04
CN106293103B true CN106293103B (en) 2023-09-26


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107024939A (en) * 2017-06-14 2017-08-08 南昌航空大学 Intelligent expansion device for a quadrotor and control method thereof
CN107831791B (en) * 2017-11-17 2020-12-15 深圳意动航空科技有限公司 Unmanned aerial vehicle control method and device, control equipment and storage medium
CN108710443B (en) * 2018-05-21 2021-09-07 云谷(固安)科技有限公司 Displacement data generation method and control system
CN109032160A (en) * 2018-07-27 2018-12-18 北京臻迪科技股份有限公司 Attitude control system, method and UAV system
CN112655194B (en) * 2018-09-11 2022-07-19 三星电子株式会社 Electronic device and method for capturing views
CN110779553A (en) * 2019-12-03 2020-02-11 中国科学院电子学研究所 Calibration method for magnetometer data
CN111158478B (en) * 2019-12-26 2023-02-03 维沃移动通信有限公司 Response method and electronic equipment
CN111461059A (en) * 2020-04-21 2020-07-28 哈尔滨拓博科技有限公司 Multi-zone multi-classification extensible gesture recognition control device and control method

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6515669B1 (en) * 1998-10-23 2003-02-04 Olympus Optical Co., Ltd. Operation input device applied to three-dimensional input device
CN101033973A (en) * 2007-04-10 2007-09-12 南京航空航天大学 Attitude determination method of mini-aircraft inertial integrated navigation system
JP2008065860A (en) * 2007-11-26 2008-03-21 Olympus Corp Operation input device
JP2008102951A (en) * 2007-11-26 2008-05-01 Olympus Corp Operation input device
JP2008112459A (en) * 2007-11-26 2008-05-15 Olympus Corp Operation input device
JP2008135033A (en) * 2007-11-26 2008-06-12 Olympus Corp Hand posture operation detector
CN106342284B (en) * 2008-08-18 2011-11-23 西北工业大学 Method for determining the attitude of a flight vehicle
CN103112007A (en) * 2013-02-06 2013-05-22 华南理工大学 Human-machine interaction method based on mixing sensor
CN103175502A (en) * 2013-02-07 2013-06-26 广州畅途软件有限公司 Attitude angle detecting method based on low-speed movement of data glove
CN203759869U (en) * 2014-03-20 2014-08-06 西南科技大学 Gesture sensing type aircraft remote controller
CN104345904A (en) * 2013-07-23 2015-02-11 西安艾尔特仪器有限公司 Finger-type air mouse

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6744420B2 (en) * 2000-06-01 2004-06-01 Olympus Optical Co., Ltd. Operation input apparatus using sensor attachable to operator's hand

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wang Weidong; Fei Jie; Yang Yingdong; Qian Feng. Research on MEMS-based data glove sensing technology. Electronic Design Engineering. 2014, (No. 21), full text. *
Chen Pengzhan; Li Jie; Luo Man. Design of a networked gesture motion tracking system. Transducer and Microsystem Technologies. 2016, (No. 02), full text. *

Also Published As

Publication number Publication date
CN106293103A (en) 2017-01-04

Similar Documents

Publication Publication Date Title
CN106293103B (en) Gesture control device and gesture control method for four-axis aircraft based on inertial sensor
CN106708066B Vision/inertial-navigation-based autonomous landing method for an unmanned aerial vehicle
JP6429450B2 (en) Information processing apparatus and information processing method
JP4860697B2 (en) Acceleration sensor correction apparatus and acceleration sensor output value correction method
CN110986939B (en) Visual inertia odometer method based on IMU (inertial measurement Unit) pre-integration
JP5987247B2 (en) Motion capture pointer by data fusion
US10705113B2 (en) Calibration of inertial measurement units attached to arms of a user to generate inputs for computer systems
EP2939402B1 (en) Method and device for sensing orientation of an object in space in a fixed frame of reference
US10191544B2 (en) Hand gesture recognition system for controlling electronically controlled devices
CN110580844A (en) self-balancing control method and device for two-wheeled robot, computer equipment and storage medium
US20150286279A1 (en) Systems and methods for guiding a user during calibration of a sensor
CN105045293B Gimbal control method, external carrier control method, and gimbal
CN109682377A Attitude estimation method based on dynamic-step-size gradient descent
CN109724602A Attitude calculation system based on a hardware FPU and calculation method thereof
WO2020124678A1 (en) Method and system employing functional iterative integration to solve inertial navigation
Zhang et al. Micro-IMU-based motion tracking system for virtual training
EP3771968A1 (en) Low-power tilt-compensated pointing method and corresponding pointing electronic device
CN113602462A (en) Underwater robot and attitude and motion control method thereof under high-visibility condition in water
TW201118662A (en) Trace-generating systems and methods thereof
CN108507567A Attitude quaternion determination method and apparatus, and user orientation determination method and apparatus
US20080074385A1 (en) Stand-Alone Device, System And Method For Navigating In A Space Having At Least Three Dimensions
CN110779554A (en) Mechanical arm, and calibration system and method based on initial pose of IMU
CN116125789A Automatic parameter matching system and method for a quaternion-based attitude algorithm
CN110209270A Data glove, data glove system, calibration method, and storage medium
JP2002023919A (en) Posture detection device and operation input device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant