CN117031924A - Multi-rotor unmanned aerial vehicle PID control parameter gesture adjusting system - Google Patents

Publication number: CN117031924A
Application number: CN202310957998.3A
Authority: CN (China)
Prior art keywords: gesture, unmanned aerial vehicle, finger, data
Legal status: Pending (the listed status is not a legal conclusion)
Original language: Chinese (zh)
Inventors: 王晓波, 严旭飞
Assignee (original and current): Zhejiang Lab
Application filed by Zhejiang Lab; priority claimed to CN202310957998.3A

Classifications

    • G05B 11/42 Automatic controllers, electric, with provision for obtaining a characteristic which is both proportional and time-dependent, e.g. P.I., P.I.D.
    • G06F 3/014 Hand-worn input/output arrangements, e.g. data gloves

Abstract

The invention discloses a multi-rotor unmanned aerial vehicle PID control parameter gesture adjusting system comprising a glove, 5 acceleration sensors, a main control board, a data transmission module, a battery, and an OLED display screen. The acceleration sensors are mounted on the five fingers of the glove, the main control board is mounted on the back of the glove, and the OLED display screen is mounted on the main control board. The main control board collects the acceleration sensor data, processes it, and recognizes gesture actions with a gesture recognition algorithm; it then sends the corresponding gesture command to the unmanned aerial vehicle's flight control computer through the data transmission module, and on receipt the flight control computer executes the corresponding PID control parameter adjustment task. Because an operator wearing the data glove can adjust the unmanned aerial vehicle's PID control parameters online through gestures, the invention greatly simplifies the PID parameter tuning steps and process and improves the parameter tuning efficiency of flight experiments.

Description

Multi-rotor unmanned aerial vehicle PID control parameter gesture adjusting system
Technical Field
The invention relates to the field of unmanned aerial vehicles, in particular to a multi-rotor unmanned aerial vehicle PID control parameter gesture adjusting system.
Background
At present, unmanned aerial vehicles see ever wider application scenarios, such as urban air traffic, electric power inspection, aerial photography and surveying, logistics and express delivery, and fire rescue. The PID control parameters of an unmanned aerial vehicle are critical to its stable flight, so adjusting these parameters is an important part of unmanned aerial vehicle development. Most unmanned aerial vehicles are currently tuned offline: the vehicle first flies, the flight log is then extracted and analyzed after the flight, the parameters in the flight control program are modified according to the analysis, the program is downloaded to the flight control computer, and a new trial flight begins, with this cycle repeating. This approach involves a heavy workload, is complex and tedious, and seriously reduces flight-experiment efficiency. Some unmanned aerial vehicles are tuned online with computer equipment, but this method is inflexible to operate, inconvenient for debugging, and not portable.
Therefore, a new parameter adjustment system is needed that performs online PID parameter tuning of the unmanned aerial vehicle and improves the parameter tuning efficiency of flight experiments.
Disclosure of Invention
To address the deficiencies of the prior art, the invention provides a multi-rotor unmanned aerial vehicle PID control parameter gesture adjusting system.
The aim of the invention is realized by adopting the following technical scheme:
the multi-rotor unmanned aerial vehicle PID control parameter gesture adjusting system comprises a glove, 5 acceleration sensors, a main control panel, a data transmission, a battery and an OLED display screen; the 5 acceleration sensors are respectively arranged on five fingers of the glove; the main control panel is arranged at the back of the glove; the OLED display screen is arranged on the main control board; the data transmission is arranged on the main control board; the battery is arranged on the main control board; the acceleration sensor, the data transmission and the OLED screen are connected with the main control board, and the battery provides energy for the main control board; the main control board collects data of the acceleration sensor, recognizes gesture actions through a gesture recognition algorithm after data processing is carried out on the acceleration data, then sends gesture action instructions to the unmanned aerial vehicle flight control computer through the data transmission module, and the flight control computer executes corresponding PID control parameter adjustment tasks according to the gesture instructions after receiving the gesture action instructions; the main control board receives expected value and estimated value data of the attitude angle and the attitude angular speed sent by the flight control computer, and displays the data on the OLED screen in a graph; the main control board also displays the collected five acceleration sensor data on the OLED display screen.
Specifically, the X axis of the acceleration sensor is parallel to the glove fingers.
Further, the data processing applied to the acceleration data specifically consists of removing errors and noise from the raw acceleration data with a sliding-window filtering method.
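The patent does not spell out the filter itself; a minimal sketch, assuming a plain moving average over a fixed-size window:

```python
from collections import deque

def sliding_window_filter(samples, window=5):
    """Average each sample with up to `window - 1` of its predecessors."""
    buf = deque(maxlen=window)  # old samples fall off automatically
    out = []
    for s in samples:
        buf.append(s)
        out.append(sum(buf) / len(buf))
    return out
```

Once the window is full, a single outlier of height h is attenuated to roughly h/window, which is what makes the later min/max threshold search robust.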
Specifically, the gesture actions include thumb bending, index finger bending, middle finger bending, ring finger bending, little finger bending, and gestures "0" through "9", where thumb bending and gesture "4" are the same gesture action.
Further, the gesture recognition algorithm recognizes gesture actions as follows:
(5.1) The 5 acceleration sensors are fixed on the glove fingers, with the X axis of each sensor parallel to its finger. The thumb and index finger are then repeatedly bent and straightened while the main control board collects the change in the acceleration data on the three axes of each sensor; the collected thumb and index-finger data are sent to a computer over a serial port, stored in a file, and plotted. The analysis shows that during thumb bending the difference between the maximum and minimum of the sensor's Z-axis data is the largest, so the thumb's acceleration data use the Z axis as the reference axis; during index-finger bending the difference between the maximum and minimum of the X-axis data is the largest, so the index finger's acceleration data use the X axis as the reference axis;
(5.2) The acceleration thresholds are selected with an autonomous threshold-selection method, as follows. After the main control board is powered on, the thumb, index finger, middle finger, ring finger, and little finger each perform several reciprocating straightening and bending actions to train that finger. An array of size N is defined in the main controller for each finger; during training, the collected acceleration data are stored in the array after sliding-window filtering, and the N values are then sorted to find the maximum and minimum acceleration. For the thumb, middle finger, ring finger, and little finger, the threshold is the mean of the maximum and minimum acceleration found:

thread = (a_max + a_min) / 2

where thread is the selected threshold, a_min is the minimum acceleration in the filtered data set, and a_max is the maximum acceleration in the filtered data set. For the index finger, two thresholds thread1 and thread2 (with thread1 > thread2) are set between a_min and a_max so as to separate the straightened state, bent state 1, and bent state 2 described below. Through these steps, the acceleration thresholds for all five fingers are obtained;
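A sketch of the threshold selection: the mean formula follows the text, while the index finger's two thresholds are assumed here to trisect the observed range, since the original formulas are not reproduced in this translation:

```python
def select_threshold(samples):
    """Single-threshold fingers (thumb, middle, ring, little):
    mean of the extremes of the filtered training data."""
    a_min, a_max = min(samples), max(samples)
    return (a_min + a_max) / 2

def select_index_thresholds(samples):
    """Index finger: two thresholds with thread1 > thread2.
    Trisecting the range is an assumption, not the patent's formula."""
    a_min, a_max = min(samples), max(samples)
    span = a_max - a_min
    return a_min + 2 * span / 3, a_min + span / 3
```

Both functions expect the sliding-window-filtered training array of size N described in step (5.2).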
(5.3) Gestures are performed with the palm plane perpendicular to the horizontal plane. For the thumb, middle finger, ring finger, and little finger, when the acceleration sensor data settles at a value greater than that finger's threshold, the finger is considered straightened; otherwise it is considered bent. If the thumb is straightened, FLAG bit FLAG1 is set to 1; if it is bent, FLAG1 is set to 0. Likewise, FLAG3 is set to 1 or 0 for the middle finger, FLAG4 for the ring finger, and FLAG5 for the little finger. For the index finger, when the acceleration sensor data settles at a value greater than threshold 1, the finger is considered straightened and FLAG bit FLAG2 is set to 1; if the value is less than threshold 1 and greater than threshold 2, the finger is considered to be in bent state 1 and FLAG2 is set to 0; if the value is less than threshold 2, the finger is considered to be in bent state 2 and FLAG2 is set to 2;
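The per-finger flag logic follows directly from the rules above (function names are illustrative):

```python
def finger_flag(value, thread):
    """Thumb, middle, ring, little: 1 = straightened, 0 = bent."""
    return 1 if value > thread else 0

def index_finger_flag(value, thread1, thread2):
    """Index finger: three states distinguished by two thresholds."""
    if value > thread1:
        return 1   # straightened
    if value > thread2:
        return 0   # bent state 1
    return 2       # bent state 2
```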
(5.4) designing a gesture recognition rule base, wherein the specific rules are as follows:
(5.4.1) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, FLAG5 is 1, then the gesture is a thumb bend;
(5.4.2) if FLAG1 is 1, FLAG2 is not 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 1, then the gesture is index finger bending;
(5.4.3) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 0, FLAG4 is 1, FLAG5 is 1, then the gesture is a middle finger bend;
(5.4.4) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 1, FLAG4 is 0, and FLAG5 is 1, then the gesture is ring finger bending;
(5.4.5) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 0, then the gesture is little finger bending;
(5.4.6) if FLAG1 is 0, FLAG2 is 2, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 0, then the gesture is gesture "0";
(5.4.7) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 0, then the gesture is gesture "1";
(5.4.8) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 1, FLAG4 is 0, and FLAG5 is 0, then the gesture is gesture "2";
(5.4.9) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 0, then the gesture is gesture "3";
(5.4.10) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 1, then the gesture is gesture "4";
(5.4.11) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 1, then the gesture is gesture "5";
(5.4.12) if FLAG1 is 1, FLAG2 is 2, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 1, then the gesture is gesture "6";
(5.4.13) if FLAG1 is 1, FLAG2 is 0, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 0, then the gesture is gesture "7";
(5.4.14) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 0, then the gesture is gesture "8";
(5.4.15) if FLAG1 is 0, FLAG2 is 0, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 0, then the gesture is gesture "9".
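The rule base above amounts to a lookup on the flag tuple (FLAG1…FLAG5). In the sketch below, thumb bending and gesture "4" share one entry because the text defines them as the same action, and the "FLAG2 is not 1" condition for index bending is covered by two entries (FLAG2 = 0 and FLAG2 = 2):

```python
RULES = {
    (0, 1, 1, 1, 1): "thumb bending / gesture 4",
    (1, 0, 1, 1, 1): "index finger bending",   # FLAG2 = 0
    (1, 2, 1, 1, 1): "index finger bending",   # FLAG2 = 2
    (1, 1, 0, 1, 1): "middle finger bending",
    (1, 1, 1, 0, 1): "ring finger bending",
    (1, 1, 1, 1, 0): "little finger bending",
    (0, 2, 0, 0, 0): "gesture 0",
    (0, 1, 0, 0, 0): "gesture 1",
    (0, 1, 1, 0, 0): "gesture 2",
    (0, 1, 1, 1, 0): "gesture 3",
    (1, 1, 1, 1, 1): "gesture 5",
    (1, 2, 0, 0, 1): "gesture 6",
    (1, 0, 0, 0, 0): "gesture 7",
    (1, 1, 0, 0, 0): "gesture 8",
    (0, 0, 0, 0, 0): "gesture 9",
}

def recognize(flags):
    """Return the gesture name for a FLAG1..FLAG5 sequence, or None."""
    return RULES.get(tuple(flags))
```

A dictionary lookup makes every rule mutually exclusive by construction, which is harder to guarantee with a chain of if-statements.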
Further, the corresponding PID control parameter adjustment tasks executed according to the gesture commands are specifically:
(6.1) when the gesture command received by the unmanned aerial vehicle's flight control computer is thumb bending or gesture "4", increase the proportional term P of the attitude-control angle-loop yaw (course) angle controller by 0.01;
(6.2) when the received gesture command is index finger bending, decrease the proportional term P of the angle-loop yaw angle controller by 0.01;
(6.3) when the received gesture command is middle finger bending, increase the proportional term P of the attitude-control angular-rate-loop yaw angle controller by 0.01;
(6.4) when the received gesture command is ring finger bending, decrease the proportional term P of the angular-rate-loop yaw angle controller by 0.01;
(6.5) when the received gesture command is little finger bending, increase the proportional term P of the angle-loop pitch-angle and roll-angle controllers by 0.01;
(6.6) when the received gesture command is gesture "0", decrease the proportional term P of the angle-loop pitch-angle and roll-angle controllers by 0.01;
(6.7) when the received gesture command is gesture "1", increase the proportional term P of the angular-rate-loop pitch-angle and roll-angle controllers by 0.01;
(6.8) when the received gesture command is gesture "2", decrease the proportional term P of the angular-rate-loop pitch-angle and roll-angle controllers by 0.01;
(6.9) when the received gesture command is gesture "3", increase the integral term I of the angular-rate-loop pitch-angle and roll-angle controllers by 0.01;
(6.10) when the received gesture command is gesture "4", decrease the integral term I of the angular-rate-loop pitch-angle and roll-angle controllers by 0.01;
(6.11) when the received gesture command is gesture "5", perform no task;
(6.12) when the received gesture command is gesture "6", increase the differential term D of the angular-rate-loop pitch-angle and roll-angle controllers by 0.01;
(6.13) when the received gesture command is gesture "7", decrease the differential term D of the angular-rate-loop pitch-angle and roll-angle controllers by 0.01.
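The command set can be implemented on the flight control computer as a dispatch table keyed by gesture; the (loop, axis, term) key names are illustrative, not the patent's identifiers:

```python
GAIN_STEP = 0.01  # fixed step size stated in the text

COMMANDS = {
    "thumb_bend":  ("angle", "yaw",        "P", +1),
    "index_bend":  ("angle", "yaw",        "P", -1),
    "middle_bend": ("rate",  "yaw",        "P", +1),
    "ring_bend":   ("rate",  "yaw",        "P", -1),
    "little_bend": ("angle", "pitch_roll", "P", +1),
    "gesture_0":   ("angle", "pitch_roll", "P", -1),
    "gesture_1":   ("rate",  "pitch_roll", "P", +1),
    "gesture_2":   ("rate",  "pitch_roll", "P", -1),
    "gesture_3":   ("rate",  "pitch_roll", "I", +1),
    "gesture_4":   ("rate",  "pitch_roll", "I", -1),
    "gesture_6":   ("rate",  "pitch_roll", "D", +1),
    "gesture_7":   ("rate",  "pitch_roll", "D", -1),
}

def apply_gesture(gains, gesture):
    """Apply one gesture command to a dict of PID gains.
    Gesture "5" (and any unknown command) performs no task, per (6.11)."""
    cmd = COMMANDS.get(gesture)
    if cmd:
        loop, axis, term, sign = cmd
        key = (loop, axis, term)
        gains[key] = gains.get(key, 0.0) + sign * GAIN_STEP
    return gains
```

Note that gesture "4" appears here only with the decrease-I meaning of item (6.10), even though the text also equates it with thumb bending in item (6.1); resolving that overlap is left to the flight-control implementation.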
Specifically, the data are displayed on the OLED screen as graphs so that an experimenter can observe the deviation between the expected and estimated values of the unmanned aerial vehicle's state and adjust its PID control parameters accordingly through gesture actions.
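Plotting such curves on a small OLED amounts to mapping each sample to a pixel row and drawing one column per sample; a minimal sketch (the 64-pixel panel height is an assumption about the display, not stated in the patent):

```python
def scale_to_rows(values, v_min, v_max, height=64):
    """Map each sample into a pixel row in [0, height - 1]."""
    span = (v_max - v_min) or 1.0  # guard against a degenerate range
    return [int((v - v_min) / span * (height - 1)) for v in values]
```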
The beneficial effects of the invention are as follows:
according to the unmanned aerial vehicle PID control parameter adjustment method, the personnel wear the data glove to adjust the unmanned aerial vehicle PID control parameter in an online real-time gesture manner, so that the unmanned aerial vehicle PID parameter adjustment steps and processes are greatly simplified and facilitated, the flight experiment parameter adjustment efficiency is improved, and the unmanned aerial vehicle parameter adjustment is more flexible and simpler.
The gesture recognition technology is based on the acceleration sensor, and the acceleration sensor is low in price, small in size and free of limitation of environment and light, so that the gesture recognition application range based on the acceleration sensor is wider.
The gesture recognition system can be applied to one hand and can be expanded into two hands. When the gesture recognition system is applied to both hands, the variety of gesture recognition by the system is increased, so that more parameters of the unmanned aerial vehicle can be controlled.
Drawings
FIG. 1 is a schematic illustration of the apparatus of the present invention;
FIG. 2 is a system block diagram of the present invention;
fig. 3 is a schematic diagram of gesture operation of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and specific examples.
The invention discloses a multi-rotor unmanned aerial vehicle PID control parameter gesture adjusting system which, in the embodiment shown in FIG. 1 and FIG. 2, comprises a glove, 5 acceleration sensors, a main control board, a data transmission module, a battery, and an OLED display screen. The acceleration sensors are respectively mounted on the five fingers of the glove; the main control board is mounted on the back of the glove; the OLED display screen, the data transmission module, and the battery are mounted on the main control board. The acceleration sensors, the data transmission module, and the OLED screen are connected to the main control board, and the battery powers the main control board. The main control board collects the acceleration sensor data, processes it, and recognizes gesture actions with a gesture recognition algorithm; it then sends the gesture command to the unmanned aerial vehicle's flight control computer through the data transmission module, and on receipt the flight control computer executes the corresponding PID control parameter adjustment task. The main control board receives the expected and estimated values of the attitude angles and attitude angular rates sent by the flight control computer and plots them as graphs on the OLED screen, together with the five collected acceleration sensor signals.
The X axis of the acceleration sensor is parallel to the glove fingers.
The specific method for processing the acceleration data is to remove errors and noise in the original acceleration data by adopting a sliding window filtering method.
The gesture actions comprise thumb bending, index finger bending, middle finger bending, ring finger bending, little finger bending, and gestures "0" through "9", where thumb bending and gesture "4" are the same gesture action.
The specific steps of identifying the gesture action through the gesture identification algorithm are as follows:
(1) The 5 acceleration sensors are fixed on the glove fingers, with the X axis of each sensor aligned with its finger. The thumb and index finger are then repeatedly bent and straightened while the main control board collects the change in the acceleration data on the three axes of each sensor; the collected thumb and index-finger data are sent to a computer over a serial port, stored in a file, and plotted. The analysis shows that the Z-axis data of the thumb's sensor change with the largest amplitude during thumb bending, so the thumb's acceleration data use the Z axis as the reference axis. When the index, middle, ring, and little fingers are bent, the X-axis data change with the largest amplitude, and because the middle, ring, and little fingers are spatially similar to the index finger, all four use the X axis as the reference axis;
(2) The invention provides a method of autonomously selecting the thresholds used for gesture recognition. The method is as follows: after the main control board is powered on, the thumb, index finger, middle finger, ring finger, and little finger each perform repeated straightening and bending actions, here called training the finger. An array of size N is defined in the main controller for each finger; during training, the collected acceleration data are stored in the array after sliding-window filtering, and the N values are then sorted to find the maximum and minimum acceleration. For the thumb, middle finger, ring finger, and little finger, the acceleration threshold is the mean of the maximum and minimum values found:

thread = (a_max + a_min) / 2

where thread is the selected threshold, a_min is the minimum acceleration in the filtered data set, and a_max is the maximum acceleration in the filtered data set. For the index finger, two thresholds thread1 and thread2 (with thread1 > thread2) are set between a_min and a_max to separate the straightened state, bent state 1, and bent state 2. Through these steps the acceleration thresholds for all five fingers are obtained;
(3) Gestures are performed with the palm plane perpendicular to the horizontal plane. For the thumb, middle finger, ring finger, and little finger, when the acceleration sensor data settles at a value greater than that finger's threshold, the finger is considered straightened; otherwise it is considered bent. If the thumb is straightened, FLAG bit FLAG1 is set to 1; if it is bent, FLAG1 is set to 0. Likewise, FLAG3 is set to 1 or 0 for the middle finger, FLAG4 for the ring finger, and FLAG5 for the little finger. For the index finger, when the acceleration sensor data settles at a value greater than threshold 1, the finger is considered straightened and FLAG bit FLAG2 is set to 1. If the value is less than threshold 1 and greater than threshold 2, the finger is considered to be in bent state 1 and FLAG2 is set to 0. If the value is less than threshold 2, the finger is considered to be in bent state 2 and FLAG2 is set to 2;
(4) A gesture recognition rule base is designed, and specific rules are as follows:
(4.1) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, FLAG5 is 1, then the gesture is thumb bending.
(4.2) if FLAG1 is 1, FLAG2 is not 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 1, the gesture is index finger bending.
(4.3) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 0, FLAG4 is 1, FLAG5 is 1, the gesture is a middle finger bending.
(4.4) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 1, FLAG4 is 0, FLAG5 is 1, the gesture is ring finger bending.
(4.5) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 0, the gesture is little finger bending.
(4.6) if FLAG1 is 0, FLAG2 is 2, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 0, the gesture is "0".
(4.7) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 0, the gesture is gesture "1".
(4.8) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 1, FLAG4 is 0, FLAG5 is 0, the gesture is gesture "2".
(4.9) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 0, then the gesture is gesture "3".
(4.10) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 1, the gesture is gesture "4".
(4.11) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 1, the gesture is gesture "5".
(4.12) if FLAG1 is 1, FLAG2 is 2, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 1, the gesture is gesture "6".
(4.13) if FLAG1 is 1, FLAG2 is 0, FLAG3 is 0, FLAG4 is 0, FLAG5 is 0, the gesture is "7".
(4.14) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 0, the gesture is "8".
(4.15) if FLAG1 is 0, FLAG2 is 0, FLAG3 is 0, FLAG4 is 0, FLAG5 is 0, the gesture is gesture "9".
The gesture recognition rule base is summarized in Table 1 (reconstructed from rules (4.1)-(4.15); the original table is an image not reproduced in this text).
TABLE 1
Gesture                        FLAG1  FLAG2  FLAG3  FLAG4  FLAG5
Thumb bending / gesture "4"      0      1      1      1      1
Index finger bending             1    not 1    1      1      1
Middle finger bending            1      1      0      1      1
Ring finger bending              1      1      1      0      1
Little finger bending            1      1      1      1      0
Gesture "0"                      0      2      0      0      0
Gesture "1"                      0      1      0      0      0
Gesture "2"                      0      1      1      0      0
Gesture "3"                      0      1      1      1      0
Gesture "5"                      1      1      1      1      1
Gesture "6"                      1      2      0      0      1
Gesture "7"                      1      0      0      0      0
Gesture "8"                      1      1      0      0      0
Gesture "9"                      0      0      0      0      0
The corresponding PID control parameter adjusting task is executed according to the gesture instruction, and specifically comprises the following steps:
(1) When the gesture command received by the unmanned aerial vehicle's flight control computer is thumb bending or gesture "4", the proportional term P of the attitude-control angle-loop yaw (course) angle controller is increased by 0.01.
(2) When the received gesture command is index finger bending, the proportional term P of the angle-loop yaw angle controller is decreased by 0.01.
(3) When the received gesture command is middle finger bending, the proportional term P of the attitude-control angular-rate-loop yaw angle controller is increased by 0.01.
(4) When the received gesture command is ring finger bending, the proportional term P of the angular-rate-loop yaw angle controller is decreased by 0.01.
(5) When the received gesture command is little finger bending, the proportional term P of the angle-loop pitch-angle and roll-angle controllers is increased by 0.01.
(6) When the received gesture command is gesture "0", the proportional term P of the angle-loop pitch-angle and roll-angle controllers is decreased by 0.01.
(7) When the received gesture command is gesture "1", the proportional term P of the angular-rate-loop pitch-angle and roll-angle controllers is increased by 0.01.
(8) When the received gesture command is gesture "2", the proportional term P of the angular-rate-loop pitch-angle and roll-angle controllers is decreased by 0.01.
(9) When the received gesture command is gesture "3", the integral term I of the angular-rate-loop pitch-angle and roll-angle controllers is increased by 0.01.
(10) When the received gesture command is gesture "4", the integral term I of the angular-rate-loop pitch-angle and roll-angle controllers is decreased by 0.01.
(11) When the received gesture command is gesture "5", no task is performed.
(12) When the received gesture command is gesture "6", the differential term D of the angular-rate-loop pitch-angle and roll-angle controllers is increased by 0.01.
(13) When the received gesture command is gesture "7", the differential term D of the angular-rate-loop pitch-angle and roll-angle controllers is decreased by 0.01.
The gesture command design is summarized in Table 2 (reconstructed from items (1)-(13); the original table is an image not reproduced in this text).
TABLE 2
Gesture                        Adjustment
Thumb bending / gesture "4"    angle-loop yaw P         +0.01
Index finger bending           angle-loop yaw P         -0.01
Middle finger bending          rate-loop yaw P          +0.01
Ring finger bending            rate-loop yaw P          -0.01
Little finger bending          angle-loop pitch/roll P  +0.01
Gesture "0"                    angle-loop pitch/roll P  -0.01
Gesture "1"                    rate-loop pitch/roll P   +0.01
Gesture "2"                    rate-loop pitch/roll P   -0.01
Gesture "3"                    rate-loop pitch/roll I   +0.01
Gesture "4"                    rate-loop pitch/roll I   -0.01
Gesture "5"                    no task
Gesture "6"                    rate-loop pitch/roll D   +0.01
Gesture "7"                    rate-loop pitch/roll D   -0.01
The data are displayed on the OLED screen as graphs so that experimenters can observe the deviation between the expected and estimated values of the unmanned aerial vehicle's state and thus adjust its PID control parameters more accurately through gesture actions.
According to the unmanned aerial vehicle PID control parameter adjustment method, the personnel wear the data glove to adjust the unmanned aerial vehicle PID control parameter in an online real-time gesture manner, so that the unmanned aerial vehicle PID parameter adjustment steps and processes are greatly simplified and facilitated, the flight experiment parameter adjustment efficiency is improved, and the unmanned aerial vehicle parameter adjustment is more flexible and simpler.
The gesture recognition technique is based on acceleration sensors, which are inexpensive, small, and unaffected by environment and lighting, so acceleration-based gesture recognition has a wide range of applications. The gesture recognition system of the invention can be applied not only to one hand but also to both hands; when applied to both hands, the variety of gestures the system can recognize increases, so more parameters of the unmanned aerial vehicle can be controlled.
The foregoing is merely a preferred embodiment of the invention and is not intended to limit it. As will be apparent to those skilled in the art, modifications may be made to the above aspects, or equivalents may be substituted for elements thereof, without departing from the scope of the invention. All modifications, equivalent substitutions and the like that do not depart from the technical content of the invention fall within its protection scope.

Claims (7)

1. A multi-rotor unmanned aerial vehicle PID control parameter gesture adjustment system, characterized by comprising a glove, 5 acceleration sensors, a main control board, a data transmission module, a battery and an OLED display screen; the 5 acceleration sensors are respectively arranged on the five fingers of the glove; the main control board is arranged on the back of the glove; the OLED display screen, the data transmission module and the battery are arranged on the main control board; the acceleration sensors, the data transmission module and the OLED screen are connected to the main control board, and the battery supplies power to the main control board; the main control board collects data from the acceleration sensors, processes the acceleration data and recognizes gesture motions through a gesture recognition algorithm, and then sends gesture motion instructions to the unmanned aerial vehicle flight control computer through the data transmission module; after receiving a gesture motion instruction, the flight control computer executes the corresponding PID control parameter adjustment task according to the gesture instruction; the main control board receives the expected and estimated values of the attitude angles and attitude angular velocities sent by the flight control computer and displays them as graphs on the OLED screen; the main control board also displays the collected data of the five acceleration sensors on the OLED display screen.
2. The multi-rotor unmanned aerial vehicle PID control parameter gesture adjustment system according to claim 1, wherein the X-axis of each acceleration sensor is parallel to the corresponding glove finger.
3. The multi-rotor unmanned aerial vehicle PID control parameter gesture adjustment system according to claim 1, wherein the data processing of the acceleration data specifically comprises: removing errors and noise from the raw acceleration data using a sliding-window filtering method.
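The sliding-window filtering of claim 3 can be sketched as a simple moving average. This is a minimal illustration under assumptions: the patent does not specify the window size or whether the average is the exact filter used, so `window=5` and the function name are illustrative.

```python
from collections import deque

def sliding_window_filter(samples, window=5):
    """Smooth raw acceleration samples with a moving average over a
    sliding window, suppressing noise as described in claim 3.
    The window size is an illustrative assumption."""
    buf = deque(maxlen=window)  # holds the most recent `window` samples
    out = []
    for s in samples:
        buf.append(s)
        out.append(sum(buf) / len(buf))  # mean of the current window
    return out
```

Each output sample is the mean of the last few raw readings, so a single noisy spike is diluted rather than passed through to the threshold comparison.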
4. The multi-rotor unmanned aerial vehicle PID control parameter gesture adjustment system according to claim 1, wherein the gesture motions comprise a thumb bend, an index finger bend, a middle finger bend, a ring finger bend, a little finger bend, gesture "0", gesture "1", gesture "2", gesture "3", gesture "4", gesture "5", gesture "6", gesture "7", gesture "8" and gesture "9", wherein the thumb bend and gesture "4" are the same gesture motion.
5. The multi-rotor unmanned aerial vehicle PID control parameter gesture adjustment system according to claim 1, wherein the specific steps of recognizing gesture motions by the gesture recognition algorithm are as follows:
(5.1) The 5 acceleration sensors are fixed on the glove fingers, with the X-axis of each acceleration sensor parallel to the finger. The thumb and index finger are then bent and straightened respectively, and the main control board collects the change in acceleration on the three axes of each sensor; the collected thumb and index finger acceleration data are sent to a computer through a serial port, saved to a file, and analyzed by plotting. The analysis shows that during thumb bending the difference between the maximum and minimum of the collected Z-axis data is the largest, so the Z-axis is taken as the reference axis for the thumb acceleration data; during index finger bending the difference between the maximum and minimum of the collected X-axis data is the largest, so the X-axis is taken as the reference axis for the index finger acceleration data;
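The reference-axis selection in step (5.1) amounts to picking, for each finger, the axis with the largest max-min range over a bend/straighten cycle. A minimal Python sketch of that analysis (function and argument names are illustrative, not from the patent):

```python
def pick_reference_axis(samples_xyz):
    """Choose the reference axis for a finger as the axis whose
    max-min range during a bend/straighten cycle is largest,
    mirroring the plotting analysis of step (5.1).

    samples_xyz: dict mapping axis name -> list of acceleration samples.
    """
    ranges = {axis: max(vals) - min(vals) for axis, vals in samples_xyz.items()}
    return max(ranges, key=ranges.get)  # axis with the widest swing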
(5.2) The acceleration thresholds are selected by an autonomous threshold-selection method, as follows: after the main control board is powered on, the thumb, index finger, middle finger, ring finger and little finger are each straightened and bent repeatedly several times for training. An array of size N is defined in the main controller for each finger; during finger training, the collected acceleration data are stored in the array after sliding-window filtering, and the N values in the array are then sorted to find the maximum and minimum acceleration. For the thumb, middle finger, ring finger and little finger, the acceleration threshold is the average of the maximum and minimum acceleration found:
thresh = (a_min + a_max) / 2

wherein thresh is the selected threshold, a_min is the minimum acceleration in the group of filtered data, and a_max is the maximum acceleration in the group of filtered data. For the index finger, two thresholds, thresh1 and thresh2, are set from the same a_min and a_max. Through the above steps, five acceleration thresholds are obtained;
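The single-threshold case of step (5.2) can be sketched as follows. This is a minimal Python illustration; the index-finger formulas for thresh1 and thresh2 are not reproduced in the text (they appear only as figures in the original), so only the average-of-extremes threshold used for the other four fingers is shown, and the function name is an assumption.

```python
def train_threshold(filtered_samples):
    """Autonomous threshold selection, step (5.2): the finger is
    straightened and bent repeatedly, N sliding-window-filtered samples
    are collected, and the threshold for the thumb, middle, ring and
    little fingers is the mean of the observed extremes."""
    a_min, a_max = min(filtered_samples), max(filtered_samples)
    return (a_min + a_max) / 2.0
```

Because the threshold sits midway between the straightened and bent extremes observed for that specific wearer, the per-power-on training step adapts the recognizer to different hand sizes without any fixed calibration constants.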
(5.3) When a gesture is performed, the palm plane is perpendicular to the horizontal plane. When an acceleration sensor reading settles at a value, the finger is considered straightened if the value is greater than the threshold, and bent otherwise. If the thumb is straightened, FLAG bit FLAG1 is set to 1; if the thumb is bent, FLAG1 is set to 0. Similarly, if the middle finger is straightened, FLAG3 is set to 1, and if bent, FLAG3 is set to 0; if the ring finger is straightened, FLAG4 is set to 1, and if bent, FLAG4 is set to 0; if the little finger is straightened, FLAG5 is set to 1, and if bent, FLAG5 is set to 0. For the index finger, when the acceleration sensor reading settles at a value: if the value is greater than threshold 1, the index finger is considered straightened and FLAG2 is set to 1; if the value is less than threshold 1 and greater than threshold 2, the finger is considered to be in bending state 1 and FLAG2 is set to 0; if the value is less than threshold 2, the finger is considered to be in bending state 2 and FLAG2 is set to 2;
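The flag assignment of step (5.3) can be sketched directly. A minimal Python illustration (names and the dict layout are assumptions; the thresholds come from the training of step (5.2), with the index finger carrying a pair `(thresh1, thresh2)` where thresh1 > thresh2):

```python
def finger_flags(acc, thresholds):
    """Derive the five FLAG bits of step (5.3).

    acc: finger name -> current reference-axis acceleration reading.
    thresholds: one threshold per finger, except "index", which holds
    the pair (thresh1, thresh2) with thresh1 > thresh2.
    """
    flags = {}
    # Thumb, middle, ring, little: 1 = straightened, 0 = bent.
    for finger in ("thumb", "middle", "ring", "little"):
        flags[finger] = 1 if acc[finger] > thresholds[finger] else 0
    # Index finger: three states via two thresholds.
    t1, t2 = thresholds["index"]
    a = acc["index"]
    if a > t1:
        flags["index"] = 1   # straightened
    elif a > t2:
        flags["index"] = 0   # bending state 1
    else:
        flags["index"] = 2   # bending state 2
    return flags
```

The extra index-finger state is what lets a single hand distinguish gestures such as "0" and "6", which differ only in how far the index finger is curled.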
(5.4) designing a gesture recognition rule base, wherein the specific rules are as follows:
(5.4.1) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, FLAG5 is 1, then the gesture is a thumb bend;
(5.4.2) if FLAG1 is 1, FLAG2 is not 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 1, then the gesture is an index finger bend;
(5.4.3) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 0, FLAG4 is 1, FLAG5 is 1, then the gesture is a middle finger bend;
(5.4.4) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 1, FLAG4 is 0, and FLAG5 is 1, then the gesture is ring finger bending;
(5.4.5) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 0, then the gesture is a little finger bend;
(5.4.6) if FLAG1 is 0, FLAG2 is 2, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 0, then the gesture is gesture "0";
(5.4.7) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 0, then the gesture is gesture "1";
(5.4.8) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 1, FLAG4 is 0, and FLAG5 is 0, then the gesture is gesture "2";
(5.4.9) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 0, then the gesture is gesture "3";
(5.4.10) if FLAG1 is 0, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 1, then the gesture is gesture "4";
(5.4.11) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 1, FLAG4 is 1, and FLAG5 is 1, then the gesture is gesture "5";
(5.4.12) if FLAG1 is 1, FLAG2 is 2, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 1, then the gesture is gesture "6";
(5.4.13) if FLAG1 is 1, FLAG2 is 0, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 0, then the gesture is gesture "7";
(5.4.14) if FLAG1 is 1, FLAG2 is 1, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 0, then the gesture is gesture "8";
(5.4.15) if FLAG1 is 0, FLAG2 is 0, FLAG3 is 0, FLAG4 is 0, and FLAG5 is 0, then the gesture is gesture "9".
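The rule base (5.4.1) to (5.4.15) is a pure lookup on the tuple (FLAG1, ..., FLAG5). A minimal Python sketch (the table below transcribes the rules above; note that rules (5.4.1) and (5.4.10) share one flag pattern, consistent with claim 4's statement that a thumb bend and gesture "4" are the same motion):

```python
# (FLAG1, FLAG2, FLAG3, FLAG4, FLAG5) -> gesture, per rules (5.4.1)-(5.4.15).
RULES = {
    (0, 1, 1, 1, 1): 'thumb bend / gesture "4"',  # (5.4.1) and (5.4.10)
    (1, 0, 1, 1, 1): "index finger bend",  # (5.4.2): FLAG2 != 1, state 0
    (1, 2, 1, 1, 1): "index finger bend",  # (5.4.2): FLAG2 != 1, state 2
    (1, 1, 0, 1, 1): "middle finger bend",
    (1, 1, 1, 0, 1): "ring finger bend",
    (1, 1, 1, 1, 0): "little finger bend",
    (0, 2, 0, 0, 0): 'gesture "0"',
    (0, 1, 0, 0, 0): 'gesture "1"',
    (0, 1, 1, 0, 0): 'gesture "2"',
    (0, 1, 1, 1, 0): 'gesture "3"',
    (1, 1, 1, 1, 1): 'gesture "5"',
    (1, 2, 0, 0, 1): 'gesture "6"',
    (1, 0, 0, 0, 0): 'gesture "7"',
    (1, 1, 0, 0, 0): 'gesture "8"',
    (0, 0, 0, 0, 0): 'gesture "9"',
}

def recognize(flags):
    """Map a flag tuple to a gesture name; unlisted patterns are rejected."""
    return RULES.get(tuple(flags), "unknown")
```

Because every rule is an exact flag pattern, recognition is O(1) and trivially fits on the glove's microcontroller.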
6. The multi-rotor unmanned aerial vehicle PID control parameter gesture adjustment system according to claim 1, wherein the corresponding PID control parameter adjustment task is executed according to the gesture instruction, specifically:
(6.1) when the gesture motion instruction received by the unmanned aerial vehicle flight control computer is a thumb bend or gesture "4", the proportional term P of the attitude control angle loop yaw angle control parameter is increased by 0.01;
(6.2) when the gesture motion instruction received by the unmanned aerial vehicle flight control computer is an index finger bend, the proportional term P of the attitude control angle loop yaw angle control parameter is decreased by 0.01;
(6.3) when the gesture motion instruction received by the unmanned aerial vehicle flight control computer is a middle finger bend, the proportional term P of the attitude control angular velocity loop yaw angle control parameter is increased by 0.01;
(6.4) when the gesture motion instruction received by the unmanned aerial vehicle flight control computer is a ring finger bend, the proportional term P of the attitude control angular velocity loop yaw angle control parameter is decreased by 0.01;
(6.5) when the gesture motion instruction received by the unmanned aerial vehicle flight control computer is a little finger bend, the proportional term P of the attitude control angle loop pitch angle and roll angle control parameters is increased by 0.01;
(6.6) when the gesture motion instruction received by the unmanned aerial vehicle flight control computer is gesture "0", the proportional term P of the attitude control angle loop pitch angle and roll angle control parameters is decreased by 0.01;
(6.7) when the gesture motion instruction received by the unmanned aerial vehicle flight control computer is gesture "1", the proportional term P of the attitude control angular velocity loop pitch angle and roll angle control parameters is increased by 0.01;
(6.8) when the gesture motion instruction received by the unmanned aerial vehicle flight control computer is gesture "2", the proportional term P of the attitude control angular velocity loop pitch angle and roll angle control parameters is decreased by 0.01;
(6.9) when the gesture motion instruction received by the unmanned aerial vehicle flight control computer is gesture "3", the integral term I of the attitude control angular velocity loop pitch angle and roll angle control parameters is increased by 0.01;
(6.10) when the gesture motion instruction received by the unmanned aerial vehicle flight control computer is gesture "4", the integral term I of the attitude control angular velocity loop pitch angle and roll angle control parameters is decreased by 0.01;
(6.11) when the gesture motion instruction received by the unmanned aerial vehicle flight control computer is gesture "5", no task is performed;
(6.12) when the gesture motion instruction received by the unmanned aerial vehicle flight control computer is gesture "6", the differential term D of the attitude control angular velocity loop pitch angle and roll angle control parameters is increased by 0.01;
and (6.13) when the gesture motion instruction received by the unmanned aerial vehicle flight control computer is gesture "7", the differential term D of the attitude control angular velocity loop pitch angle and roll angle control parameters is decreased by 0.01.
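The full gesture-to-task mapping of claim 6 is again a dispatch table. A minimal Python sketch under assumptions: "angle"/"rate" stand for the attitude angle loop and angular-velocity loop, all dictionary and function names are illustrative, and the thumb-bend and gesture-"4" entries are kept separate exactly as (6.1) and (6.10) state them, even though claim 4 describes them as the same motion.

```python
# gesture -> (loop, axis group, PID term, delta), per (6.1)-(6.13).
# The patent fixes every step size at 0.01.
ADJUSTMENTS = {
    "thumb bend":         ("angle", "yaw",        "P", +0.01),  # (6.1)
    "index finger bend":  ("angle", "yaw",        "P", -0.01),  # (6.2)
    "middle finger bend": ("rate",  "yaw",        "P", +0.01),  # (6.3)
    "ring finger bend":   ("rate",  "yaw",        "P", -0.01),  # (6.4)
    "little finger bend": ("angle", "pitch_roll", "P", +0.01),  # (6.5)
    "0":                  ("angle", "pitch_roll", "P", -0.01),  # (6.6)
    "1":                  ("rate",  "pitch_roll", "P", +0.01),  # (6.7)
    "2":                  ("rate",  "pitch_roll", "P", -0.01),  # (6.8)
    "3":                  ("rate",  "pitch_roll", "I", +0.01),  # (6.9)
    "4":                  ("rate",  "pitch_roll", "I", -0.01),  # (6.10)
    "5":                  None,                                 # (6.11) no-op
    "6":                  ("rate",  "pitch_roll", "D", +0.01),  # (6.12)
    "7":                  ("rate",  "pitch_roll", "D", -0.01),  # (6.13)
}

def apply_gesture(gains, gesture):
    """Mutate a nested gain dict gains[loop][axis][term] by one gesture."""
    adj = ADJUSTMENTS.get(gesture)
    if adj is None:
        return gains  # unknown gesture or gesture "5": leave gains unchanged
    loop, axis, term, delta = adj
    gains[loop][axis][term] += delta
    return gains
```

On the real system this table would live in the flight control computer, which applies the looked-up delta each time a gesture instruction arrives over the data link.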
7. The multi-rotor unmanned aerial vehicle PID control parameter gesture adjustment system according to claim 1, wherein the data are displayed on the OLED screen in graph form so that an experimenter can observe the deviation between the expected and estimated values of the unmanned aerial vehicle state and thereby adjust the PID control parameters of the unmanned aerial vehicle through gesture motions.
CN202310957998.3A 2023-08-01 2023-08-01 Multi-rotor unmanned aerial vehicle PID control parameter gesture adjusting system Pending CN117031924A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310957998.3A CN117031924A (en) 2023-08-01 2023-08-01 Multi-rotor unmanned aerial vehicle PID control parameter gesture adjusting system


Publications (1)

Publication Number Publication Date
CN117031924A true CN117031924A (en) 2023-11-10

Family

ID=88636488



Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160162022A1 (en) * 2014-12-08 2016-06-09 Rohit Seth Wearable wireless hmi device
CN207457837U (en) * 2017-09-27 2018-06-05 歌尔科技有限公司 A kind of Intelligent glove and unmanned plane for being used to control unmanned plane
CN109144272A (en) * 2018-09-10 2019-01-04 哈尔滨工业大学 A kind of quadrotor drone control method based on data glove gesture identification
CN111124126A (en) * 2019-12-25 2020-05-08 北京航空航天大学 Unmanned aerial vehicle gesture control method
CN116185176A (en) * 2022-12-15 2023-05-30 华东理工大学 Four rotor unmanned aerial vehicle man-machine interaction system based on meticulous gesture recognition
CN116501168A (en) * 2023-04-13 2023-07-28 南京航空航天大学 Unmanned aerial vehicle gesture control method and control system based on chaotic sparrow search and fuzzy PID parameter optimization


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SHAO YAN: "Multi-agent target detection in unknown environments based on digital pheromones and a piloting algorithm", Frontiers of Information Technology & Electronic Engineering, vol. 21, no. 5, 31 December 2020 (2020-12-31) *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination