CN113721764A - IMU-based human-computer interaction system and control and evaluation method - Google Patents


Info

Publication number
CN113721764A
CN113721764A (application CN202110985483.5A)
Authority
CN
China
Prior art keywords
cursor
target
interaction system
user
action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110985483.5A
Other languages
Chinese (zh)
Inventor
赵玉良
任宪收
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northeastern University Qinhuangdao Branch
Original Assignee
Northeastern University Qinhuangdao Branch
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northeastern University Qinhuangdao Branch
Priority to CN202110985483.5A
Publication of CN113721764A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention designs an IMU-based human-computer interaction system and a control and evaluation method. The IMU-based human-computer interaction system comprises a sensor module, a power supply module, a wireless transmission module and a micro-control processor. Changes in the tilt angle of the user's hand are captured by the IMU to control cursor movement, and rapid turns of the user's hand to the left and right are captured to control cursor clicks. Control of the IMU-based human-computer interaction system is realized through a multi-level decision action recognition algorithm. Three indexes, target selection accuracy, target selection time and path efficiency, are used to evaluate the human-computer interaction system. The user needs no two-dimensional desktop support and can realize interaction in multiple scenarios simply by wearing the ring-type human-computer interaction system on the hand.

Description

IMU-based human-computer interaction system and control and evaluation method
Technical Field
The invention relates to the technical field of sensor detection and artificial intelligence, and provides an IMU (Inertial Measurement Unit)-based human-computer interaction system and a control and evaluation method.
Background
Since 1964, the mouse has been the most widely used device in the field of human-computer interaction. A traditional two-dimensional wired optical mouse, for example, detects infrared scattering through an optical sensor to realize cursor positioning and control on a two-dimensional desktop. Such a scheme looks natural and convenient and remains a popular choice. However, in scenarios such as business trips and academic conferences, the volume and portability of the mouse are limiting, and in use the traditional mouse is often constrained by the two-dimensional plane and by distance, giving a poor user experience. A wireless mouse transmits data to the host using nRF technology; although this removes the wired mouse's distance limitation, the wireless mouse is still confined to a two-dimensional plane and cannot provide free interaction in more scenarios. Current mainstream multi-scenario interaction technologies, such as those based on computer vision, face technical challenges of portability and reliability; enabling users to interact freely and reliably across multiple scenarios is an urgent problem to be solved.
Disclosure of Invention
Aiming at the defects of existing solutions, the invention designs a human-computer interaction system and a control and evaluation method based on an Inertial Measurement Unit (IMU); a user can realize interaction in multiple scenarios simply by wearing the ring-type human-computer interaction system on the hand, without the support of a two-dimensional desktop;
the invention designs a human-computer interaction system based on a ring-type carrier platform: changes in the tilt angle of the user's hand are captured by the IMU to control cursor movement, rapid turns of the user's hand to the left and right are captured to control cursor clicks, and the action recognition function of the system is realized by a multi-level decision action recognition algorithm;
the invention provides a man-machine interaction system based on an IMU (inertial measurement Unit), which comprises a sensor module, a power supply module, a wireless transmission module and a micro-control processor, wherein the sensor module is connected with the power supply module;
the sensor module adopts a nine-axis attitude sensor, and comprises a 3-axis accelerometer, a 3-axis gyroscope and a 3-axis magnetometer, wherein the 3-axis accelerometer, the 3-axis gyroscope and the 3-axis magnetometer are used for acquiring motion data of a human-computer interaction system worn by a user, and the motion data comprises acceleration characteristics, angular velocity characteristics and magnetic field data;
the power supply module: the hardware circuit is powered by the button cell, and the switch is used for controlling the power supply;
a wireless transmission module: the Bluetooth chip is connected with the sensor module and used for sending the motion data to a display terminal in a wireless transmission mode;
the micro-control processor: the gesture action recognition of the user is realized through the judgment of the attitude angle characteristic threshold and the judgment of the acceleration characteristic threshold, and the command is sent to the display terminal through the wireless transmission module;
the attitude angle feature threshold value judging process comprises the following steps: carrying out attitude calculation on the motion data by adopting an angle complementary filtering algorithm to obtain attitude angle characteristics, and judging an attitude angle characteristic threshold value by the micro-control processor;
the display terminal: is a display.
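The preprocessing and attitude-solution steps above (moving-average smoothing followed by angle complementary filtering) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the window width w, the filter coefficient k = 0.98 and the function names are assumptions.

```python
def moving_average(samples, w):
    """Moving-average filter used to preprocess raw IMU samples.
    Each output value is the mean of up to the w most recent samples."""
    out = []
    for i in range(len(samples)):
        window = samples[max(0, i - w + 1): i + 1]
        out.append(sum(window) / len(window))
    return out

def complementary_filter(angle, gyro_rate, accel_angle, dt, k=0.98):
    """One step of an angle complementary filter: trust the integrated
    gyroscope rate short-term and the accelerometer-derived angle long-term."""
    return k * (angle + gyro_rate * dt) + (1 - k) * accel_angle
```

In use, each new gyroscope/accelerometer sample pair updates the attitude angle, which the micro-control processor then compares against its thresholds.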
A control method of a man-machine interaction system based on an IMU specifically comprises the following steps:
step 1: capturing user hand motion mapping system operation;
the operation of the human-computer interaction system is simplified into six actions: (1) cursor moves left, (2) cursor moves right, (3) cursor moves up, (4) cursor moves down, (5) left-button click, (6) right-button click. When the user starts to use the human-computer interaction system, the hand is parallel to the floor, and the initial area where the hand stays is called the central area. The user controls cursor movement and click actions through changes in hand motion: the sensor of the human-computer interaction system captures motion data of the acceleration and angular velocity features of the user's hand, changes in the tilt angle of the user's hand are mapped to cursor movement operations, and rapid turning actions to the left and right are mapped to left-button and right-button click operations of the cursor;
step 2: cursor movement and clicking are realized through a multi-stage decision algorithm;
the sensor captures motion data of acceleration characteristics and angular velocity characteristics of motion of a user hand, the motion data is preprocessed through a moving average filtering algorithm, the motion data is fused into attitude angle characteristics through a complementary filtering algorithm, cursor movement and clicking are achieved through threshold judgment, and the method specifically comprises the following steps:
step 2.1: checking triaxial and acceleration characteristics;
click actions are distinguished from cursor-idle and cursor-movement actions: by taking the mean of the y-axis acceleration features within a sliding window, it can be judged whether the user's hand has turned left or right, which maps to the controlled left-button and right-button click actions; AccSqrt is used to recognize the execution of a roll action, given by the formula
AccSqrt(n) = √(ax(n)² + ay(n)² + az(n)²)   (1)
where AccSqrt(n) is the summed acceleration feature of the three axes at time n, and ax, ay, az represent the preprocessed x-axis, y-axis and z-axis acceleration features; if the three-axis summed acceleration feature is greater than the threshold α, go to step 2.2, otherwise go to step 2.3;
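The summed-acceleration check of step 2.1 can be sketched as follows; the function names and the example threshold are illustrative assumptions:

```python
import math

def acc_sqrt(ax, ay, az):
    """Summed three-axis acceleration AccSqrt(n) for one preprocessed sample."""
    return math.sqrt(ax ** 2 + ay ** 2 + az ** 2)

def is_flip_candidate(ax, ay, az, alpha):
    """Step 2.1: AccSqrt above the threshold alpha indicates a rapid wrist
    flip may be underway, so click judgment (step 2.2) should run next;
    otherwise cursor-movement judgment (step 2.3) runs instead."""
    return acc_sqrt(ax, ay, az) > alpha
```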
step 2.2, judging cursor clicking;
if the accumulated y-axis acceleration feature sum AYW is greater than the threshold β1, a left-button click action is executed; if it is less than the threshold β2, a right-button click action is executed; otherwise, go to step 2.3;
step 2.3: judging the movement of a cursor;
check the pitch angle pitch(θ): if pitch(θ) is greater than the threshold γ1 or less than the threshold γ2, an activity of moving the cursor left or right is performed. Check the yaw angle yaw(ψ): if yaw(ψ) is greater than the threshold δ1 or less than the threshold δ2, an activity of moving the cursor up or down is performed; otherwise, no action is performed.
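Steps 2.1-2.3 together form the multi-level decision; a minimal sketch is below. The mapping of pitch above γ1 to a left move (rather than right), and likewise the other sign conventions, are assumptions, since the text does not fix which direction each threshold corresponds to:

```python
def classify_action(acc_sum, ayw, pitch, yaw,
                    alpha, beta1, beta2, gamma1, gamma2, delta1, delta2):
    """Multi-level decision: step 2.1 gates on the summed acceleration,
    step 2.2 resolves clicks from AYW, and step 2.3 resolves cursor
    movement from the pitch and yaw attitude angles."""
    if acc_sum > alpha:          # step 2.1: rapid-flip candidate
        if ayw > beta1:          # step 2.2: left-button click
            return "left_click"
        if ayw < beta2:          # step 2.2: right-button click
            return "right_click"
    if pitch > gamma1:           # step 2.3: tilt-driven cursor movement
        return "move_left"       # direction-to-sign mapping assumed
    if pitch < gamma2:
        return "move_right"
    if yaw > delta1:
        return "move_up"
    if yaw < delta2:
        return "move_down"
    return "idle"
```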
In step 1: when the hand turns left from the central area, the cursor moves left; when the hand returns to the central area, the cursor stops moving; a rapid turn of the user's hand to the left followed by a rapid return to the central area is the left-button click operation.
In step 2.2, executing a click action means operating the cursor's left or right button to click once and then pause, rather than clicking continuously; when using the human-computer interaction system, the user monitors the real-time action of the cursor on the screen, and if a recognition error occurs, the user can immediately correct it.
An evaluation method of the IMU-based human-computer interaction system comprises the following steps: an upper computer is designed with QT software, and equidistant circles radially distributed along k directions are evenly placed on the data acquisition interface of the upper computer; circular targets are used to keep the target width constant and unaffected by the approach angle. In each round, one target is randomly displayed as the active target in one color, and inactive targets are displayed in another color. The user must use the human-computer interaction system to move the cursor from the origin of the upper computer to the position of the active target and execute a click action. The computer automatically records the number and position of the randomly assigned target, the target selection time, whether the target was hit, the position of the clicked target, and the movement distance. After a hit, the cursor is automatically re-centered and a new target is randomly selected and indicated as the active target; k targets executed in sequence constitute one task round. The indexes are obtained by the formulas
Accuracy(k) = TP / (TP + FP) × 100%   (2)
Accuracy(k) is the target selection accuracy, k ∈ [1, K] denotes the serial number of the target circle, TP denotes the number of correctly hit target circles, and FP denotes the number of missed target circles;
T = T2 - T1   (3)
T denotes the target selection time, T1 denotes the time when the cursor starts to move from the center of the screen, and T2 denotes the time when the user completes the cursor movement and selects the target circle;
PE = (1 / (n·m)) · Σj=1..n Σi=1..m [ D / √((pxi - px0)² + (pyi - py0)²) ] × 100%   (4)
PE denotes path efficiency, n denotes the number of data-collection rounds performed, m denotes the number of target circles clicked in one round, j runs from 1 to n, i runs from 1 to m, D is the straight-line distance from the center of the screen to the target circle, (px0, py0) is the position of the center of the screen, and (pxi, pyi) is the position where the user clicked the target circle;
accuracy (k) is target selection accuracy, T is target selection time, and PE is path efficiency, where the three indicators of target selection accuracy, target selection time, and path efficiency are used for performance evaluation of the human-computer interaction system.
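The three indexes can be computed as sketched below; reading the path-efficiency formula as an average of ideal-over-actual distance ratios is an assumption based on the variable definitions above, and the function names are illustrative:

```python
import math

def accuracy(tp, fp):
    """Target selection accuracy: correct hits over all attempts, in percent."""
    return tp / (tp + fp) * 100.0

def selection_time(t1, t2):
    """Target selection time T = T2 - T1."""
    return t2 - t1

def path_efficiency(clicks, center, d):
    """Path efficiency: the ideal straight-line distance D divided by the
    center-to-click distance, averaged over all recorded clicks, in percent."""
    px0, py0 = center
    ratios = [d / math.hypot(px - px0, py - py0) for (px, py) in clicks]
    return sum(ratios) / len(ratios) * 100.0
```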
The technical scheme has the following beneficial effects:
1. The invention designs a ring-type human-computer interaction system and provides a portable, reliable multi-scenario interaction scheme: the interaction functions of a traditional mouse can be realized in the air simply by wearing a smart ring on the middle finger of the right hand, greatly improving portability;
2. Changes in the hand's tilt angle captured by the IMU control the cursor position, and rapid turns of the user's hand to the left and right control clicks; the user needs no two-dimensional desktop support, which overcomes the shortcomings of the traditional mouse in certain scenarios and greatly reduces environmental requirements;
3. A multi-level decision action recognition algorithm improves the accuracy of the human-computer interaction system; under different experimental conditions, experimental test indexes of the human-computer interaction system and a traditional mouse were compared, and the results show that the target selection accuracy of the ring-type human-computer interaction system exceeds 96%, while its target selection time and path efficiency are almost the same as those of a traditional mouse;
4. The method uses an inexpensive IMU inertial sensor and a simple, stable multi-level decision action recognition algorithm, and has the advantages of being noise-free and easily portable.
Drawings
FIG. 1 is a block diagram of a human-computer interaction system of an IMU
FIG. 2 is a circuit diagram of a sensor module
FIG. 3 is a circuit diagram of a power module
FIG. 4 is a circuit diagram of a microprocessor
FIG. 5 is a flowchart of a control method of a human-computer interaction system based on IMU
FIG. 6 is a schematic diagram of hand motions and corresponding system operations
FIG. 7 is a waveform variation diagram of AccSqrt
FIG. 8 is a sliding window algorithm diagram
FIG. 9 is a waveform diagram of a single click action and corresponding AYW
FIG. 10 is a graph showing changes in attitude angle and pitch angle (θ) when the hand is tilted
FIG. 11 is a diagram of the operation of the system when the cursor moves left and right
FIG. 12 is a graph of waveform variation of tilt angle data for human-computer interaction system
FIG. 13 is a diagram of an upper computer data acquisition interface
Detailed Description
The invention is described in further detail below with reference to the following figures and detailed description:
An embodiment of the invention provides an IMU (Inertial Measurement Unit)-based human-computer interaction system which, as shown in FIG. 1, comprises a sensor module, a power supply module, a wireless transmission module and a micro-control processor;
the sensor module adopts a nine-axis MPU9250 attitude sensor, as shown in FIG. 2, which comprises a 3-axis accelerometer, a 3-axis gyroscope and a 3-axis magnetometer, used for acquiring motion data of the human-computer interaction system worn by the user; the motion data comprises acceleration features, angular velocity features and magnetic field data;
the power supply module: as shown in fig. 3, the hardware circuit is powered by the button cell, and the switch is used for power control;
a wireless transmission module: the Bluetooth chip is connected with the sensor module and used for sending the motion data to a display terminal in a wireless transmission mode;
the micro-control processor: FIG. 4 shows the circuit schematic of the micro-control processor. The micro-control processor realizes recognition of the user's gesture actions through attitude angle feature threshold judgment and acceleration feature threshold judgment, and sends commands to the display terminal through the wireless transmission module, specifically including:
the attitude angle feature threshold judgment process: an angle complementary filtering algorithm performs attitude calculation on the motion data to obtain attitude angle features, which the micro-control processor uses for threshold judgment;
the acceleration characteristic threshold value judging process comprises the following steps: the micro control processor is used for carrying out threshold judgment on the acceleration characteristic;
the display terminal: is a display;
a control method of a man-machine interaction system based on an IMU is disclosed, as shown in FIG. 5, which specifically comprises the following steps:
step 1: capturing user hand motion mapping system operation;
the operation of the human-computer interaction system is simplified into six actions: (1) cursor moves left, (2) cursor moves right, (3) cursor moves up, (4) cursor moves down, (5) left-button click, (6) right-button click. When the user starts to use the human-computer interaction system, the hand is parallel to the floor, and the initial area where the hand stays is called the central area. The user controls cursor movement and click actions through changes in hand motion: the sensor of the human-computer interaction system captures motion data of the acceleration and angular velocity features of the user's hand, changes in the tilt angle of the user's hand are mapped to cursor movement operations, and rapid turning actions to the left and right are mapped to left-button and right-button click operations of the cursor;
in step 1: when the hand turns left from the central area, the cursor moves left; when the hand returns to the central area, the cursor stops moving; a rapid turn of the user's hand to the left followed by a rapid return to the central area is the left-button click operation.
Step 2, cursor movement and clicking are realized through a multi-stage decision algorithm;
the MPU9250 sensor captures motion data of the acceleration and angular velocity features of the user's hand. The motion data is preprocessed with a moving average filtering algorithm to eliminate the influence of noise and fused into attitude angle features with a complementary filtering algorithm; the user's different hand actions are mapped to corresponding changes in the attitude angle features, and control of cursor movement is realized through threshold judgment, specifically including:
step 2.1: checking triaxial and acceleration characteristics;
click actions are distinguished from cursor-idle and cursor-movement actions: by taking the mean of the y-axis acceleration features within a sliding window, it can be judged whether the user's hand has turned left or right, which maps to the controlled left-button and right-button click actions. The waveform variation of AccSqrt is shown in FIG. 6; AccSqrt is used to recognize the execution of a roll action and is determined by the formula
AccSqrt(n) = √(ax(n)² + ay(n)² + az(n)²)   (1)
where AccSqrt(n) is the summed acceleration feature of the three axes at time n, and ax, ay, az represent the preprocessed x-axis, y-axis and z-axis acceleration features. When the user's hand stays still or performs a tilting action, AccSqrt changes little; when the user's wrist turns over rapidly, AccSqrt changes sharply, so the rapid-turn state can be distinguished from other states.
The sliding window algorithm shown in FIG. 7 performs threshold judgment on the y-axis acceleration feature sum AYW to realize the left-click and right-click functions; the y-axis acceleration feature sum AYW is obtained by the formula
AYW(n) = (1/w) · Σi=n-w+1..n ay(i)   (2)
AYW(n) is the mean of the y-axis acceleration features within the sliding window at time n, and w is the width of the sliding window. The sliding window algorithm collects w y-axis acceleration features and stores them in w memory cells; after the initial sampling of w acceleration features is complete, at every sampling instant the w memory cells shift out the oldest datum and shift in the newly sampled datum. The waveform variation of AYW is shown in FIG. 8. Left or right turning of the user's hand is judged by thresholding the AYW data: when AYW > β1, a left-click action is considered to be performed, and when AYW < β2, a right-click action is considered to be performed; in this way, action recognition of left clicks and right clicks is realized;
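The shifting memory cells described above behave like a fixed-size buffer whose oldest sample drops out as each new sample arrives; a minimal sketch (the class name is an assumption):

```python
from collections import deque

class SlidingAYW:
    """Sliding-window mean of the y-axis acceleration feature (AYW):
    w memory cells, with the oldest sample shifted out as each new
    sample is shifted in."""
    def __init__(self, w):
        self.buf = deque(maxlen=w)  # deque drops the oldest item at capacity

    def update(self, ay):
        """Store a new y-axis sample and return the current window mean."""
        self.buf.append(ay)
        return sum(self.buf) / len(self.buf)
```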
If the three-axis summed acceleration feature AccSqrt is greater than the threshold α, step 2.2 is carried out; otherwise, step 2.3 is carried out;
step 2.2: judging cursor clicks;
if the accumulated y-axis acceleration feature sum AYW within the sliding window is greater than the threshold β1, a left-click action is executed; if it is less than the threshold β2, a right-click action is executed; otherwise, go to step 2.3;
in step 2.2, executing a click action means operating the cursor's left or right button to click once and then pause, rather than clicking continuously; when using the human-computer interaction system, the user monitors the real-time action of the cursor on the screen, and once an erroneous movement occurs, the user can immediately correct it;
step 2.3: determining cursor movement
As shown in FIG. 9, the IMU is integrally disposed on the ring carrier; the rotation angle around the x-axis is the roll angle roll(γ), the rotation angle around the y-axis is the pitch angle pitch(θ), and the rotation angle around the z-axis is the yaw angle yaw(ψ). The attitude angle features include the roll, pitch and yaw angles and reflect the tilt-angle changes of the human-computer interaction system. As shown in FIG. 10, changes in the attitude angle features are linked to tilting actions of the user's hand; FIG. 11 shows the waveform variation curve of the tilt angle data of the human-computer interaction system. The thresholds αth and βth are used to detect the start and end points of the four tilting actions. If the roll angle roll(γ) satisfies the formulas
Ang(k)<γ1 (5)
Ang(k+1)>γ1 (6)
then it is regarded as the starting point of a right-tilt action; if the roll angle roll(γ) satisfies the formulas
Ang(k)>γ2 (7)
Ang(k+1)<γ2 (8)
then it is regarded as the end point of the right-tilt action. The execution time of the right-tilt action can thus be obtained, which in turn controls the distance the cursor moves to the right. To avoid false recognition caused by hand shake, the formula
End - Start > γth   (9)
is applied, where γth in formula (9) defines the holding time of the gesture action and the length of the data sampling window, the starting point being denoted "Start" and the end point "End". Check the pitch angle pitch(θ): if pitch(θ) is greater than the threshold γ1 or less than the threshold γ2, an activity of moving the cursor left or right is performed. Check the yaw angle yaw(ψ): if yaw(ψ) is greater than the threshold δ1 or less than the threshold δ2, an activity of moving the cursor up or down is performed; otherwise, no action is performed;
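The start/end detection of formulas (5)-(9) can be sketched as follows; treating γth as a sample count and scanning the roll-angle sequence in a single pass are assumptions of this illustration:

```python
def detect_tilt_segments(angles, gamma1, gamma2, gamma_th):
    """Return (Start, End) index pairs for tilt actions in a roll-angle
    sequence: a segment starts when the angle rises through gamma1
    (formulas (5)-(6)) and ends when it falls through gamma2 ((7)-(8));
    segments shorter than gamma_th samples are rejected as hand shake ((9))."""
    segments, start = [], None
    for k in range(len(angles) - 1):
        if start is None and angles[k] < gamma1 < angles[k + 1]:
            start = k + 1                    # rising crossing: action start
        elif start is not None and angles[k] > gamma2 > angles[k + 1]:
            end = k + 1                      # falling crossing: action end
            if end - start > gamma_th:       # End - Start > gamma_th
                segments.append((start, end))
            start = None
    return segments
```

The length of each returned segment gives the execution time of the tilt, which can then scale the cursor-movement distance.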
an IMU-based human-computer interaction system evaluation method comprises the following steps: as shown in FIG. 12, an upper computer is designed with QT software, and equidistant circles radially distributed along 8 directions are evenly placed on the data acquisition interface of the upper computer; circular targets are used to keep the target width constant and unaffected by the approach angle. In each round, one target randomly becomes the active target and turns red, while inactive targets are displayed in green, as shown in FIG. 13. The user must use the human-computer interaction system to move the cursor from the origin to the position of the active target and perform a click action. The computer automatically records the number and position of the randomly assigned target, the target selection time, whether the target was hit, the position of the clicked target, and the movement distance. After a hit, the cursor is automatically re-centered and a new target is randomly selected and indicated as the active target; 8 targets executed in sequence constitute one task round. The indexes are obtained by the formulas
Accuracy(k) = TP / (TP + FP) × 100%   (10)
Accuracy(k) is the target selection accuracy, k ∈ [1, 8] denotes the serial number of the target circle, TP denotes the number of correctly hit target circles, and FP denotes the number of missed target circles;
T = T2 - T1   (11)
T denotes the target selection time, T1 denotes the time when the cursor starts to move from the center of the screen, and T2 denotes the time when the user completes the cursor movement and selects the target circle;
PE = (1 / (n·m)) · Σj=1..n Σi=1..m [ D / √((pxi - px0)² + (pyi - py0)²) ] × 100%   (12)
PE denotes path efficiency, n denotes the number of data-collection rounds performed, m denotes the number of target circles clicked in one round, j runs from 1 to n, i runs from 1 to m, D is the straight-line distance from the center of the screen to the target circle, (px0, py0) is the position of the center of the screen, and (pxi, pyi) is the position where the user clicked the target circle;
accuracy (k) is target selection accuracy, T is target selection time, and PE is path efficiency, where the three indicators of target selection accuracy, target selection time, and path efficiency are used for performance evaluation of the human-computer interaction system.

Claims (6)

1. A man-machine interaction system based on IMU, its characterized in that: the system comprises a sensor module, a power supply module, a wireless transmission module and a micro-control processor;
the sensor module adopts a nine-axis attitude sensor, and comprises a 3-axis accelerometer, a 3-axis gyroscope and a 3-axis magnetometer, wherein the 3-axis accelerometer, the 3-axis gyroscope and the 3-axis magnetometer are used for acquiring motion data of a human-computer interaction system worn by a user;
the motion data comprises acceleration data, angular velocity data and magnetic field data;
the power supply module: the hardware circuit is powered by the button cell, and the switch is used for controlling the power supply;
a wireless transmission module: the Bluetooth chip is connected with the sensor module and used for sending the motion data to a display terminal in a wireless transmission mode;
the micro-control processor: realizes recognition of the user's gesture actions through attitude angle feature threshold judgment and acceleration feature threshold judgment, and sends commands to the display terminal through the wireless transmission module, specifically including:
judging the attitude angle characteristic threshold: carrying out attitude calculation on the motion data by adopting an angle complementary filtering algorithm to obtain attitude angle characteristics, and judging an attitude angle characteristic threshold value by the micro-control processor;
the display terminal: is a display.
2. A control method of an IMU-based human-computer interaction system, which employs the IMU-based human-computer interaction system of claim 1, characterized in that: the method comprises the following steps:
step 1: capturing user hand motion mapping system operation;
the operation of the human-computer interaction system is simplified into six actions: (1) cursor moves left, (2) cursor moves right, (3) cursor moves up, (4) cursor moves down, (5) left-button click, (6) right-button click. When the user starts to use the human-computer interaction system, the hand is parallel to the floor, and the initial area where the hand stays is called the central area. The user controls cursor movement and click actions through changes in hand motion: the sensor of the human-computer interaction system captures motion data of the acceleration and angular velocity features of the user's hand, changes in the tilt angle of the user's hand are mapped to cursor movement operations, and rapid turning actions to the left and right are mapped to left-button and right-button click operations of the cursor;
step 2: cursor movement and clicking are realized through a multi-stage decision algorithm;
The sensor captures motion data of the acceleration and angular-velocity features of the user's hand motion; the motion data are preprocessed by a moving-average filtering algorithm and fused into attitude-angle features by a complementary filtering algorithm, and cursor movement and clicking are realized through threshold judgment, specifically comprising:
step 2.1: check the three-axis resultant (sum) acceleration feature;
The click action is distinguished from cursor-idle and cursor-movement actions by taking the mean of the y-axis acceleration feature within a sliding window, which determines whether the user's hand has rotated to the left or to the right and maps to the controlled left-button or right-button click; AccSqrt is used to recognize that a roll action is being executed, using the formula

AccSqrt(n) = √(ax² + ay² + az²)

where AccSqrt(n) is the three-axis resultant acceleration feature at time n, ax is the preprocessed x-axis acceleration feature, ay is the preprocessed y-axis acceleration feature, and az is the preprocessed z-axis acceleration feature; if the three-axis resultant acceleration feature is greater than the threshold α, go to step 2.2, otherwise go to step 2.3;
step 2.2: judge the cursor click;
If the accumulated sum AYW of the y-axis acceleration feature is greater than the threshold β1 or less than the threshold β2, a left-button or right-button click action is executed; otherwise go to step 2.3;
step 2.3: judge cursor movement;
Check the pitch angle pitch(θ): if pitch(θ) is greater than the threshold γ1 or less than the threshold γ2, an activity of moving the cursor left or right is being executed. Check the yaw angle yaw(ψ): if yaw(ψ) is greater than the threshold δ1 or less than the threshold δ2, an activity of moving the cursor up or down is being executed and is handed over to action execution; otherwise no action is performed.
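The multi-stage decision of steps 2.1–2.3 above can be sketched as a single classification function. All threshold values (α, β1, β2, γ1, γ2, δ1, δ2) and names below are illustrative assumptions:

```python
import math

def classify(acc, ayw, pitch, yaw,
             alpha=1.5, beta1=3.0, beta2=-3.0,
             gamma1=20.0, gamma2=-20.0, delta1=20.0, delta2=-20.0):
    """Multi-stage threshold decision (sketch; all thresholds assumed).
    acc   -- (ax, ay, az) preprocessed acceleration at time n
    ayw   -- accumulated y-axis acceleration AYW over the sliding window
    pitch -- pitch angle theta (deg); yaw -- yaw angle psi (deg)
    """
    ax, ay, az = acc
    acc_sqrt = math.sqrt(ax * ax + ay * ay + az * az)   # step 2.1: AccSqrt(n)
    if acc_sqrt > alpha:                                # roll gesture detected
        if ayw > beta1:                                 # step 2.2: click check
            return "left_click"
        if ayw < beta2:
            return "right_click"
    if pitch > gamma1:                                  # step 2.3: movement check
        return "move_left"
    if pitch < gamma2:
        return "move_right"
    if yaw > delta1:
        return "move_up"
    if yaw < delta2:
        return "move_down"
    return "idle"
```

Note how a large resultant acceleration that fails both click thresholds falls through to the movement check, mirroring the "otherwise go to step 2.3" branches.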
3. The IMU-based human-computer interaction system control method of claim 2, wherein: in step 1: when the hand rotates left from the central area, the cursor moves left; when the hand returns to the central area, the cursor stops moving; a quick flip of the user's hand to the left side followed by a quick return to the central area is the left-button click operation.
4. The IMU-based human-computer interaction system control method of claim 2, wherein: in step 2.2, executing a click action clicks the cursor's left or right button once and then pauses, rather than clicking continuously; while using the human-computer interaction system, the user monitors the real-time action of the cursor on the screen, and if misrecognition occurs the user can immediately correct it through control.
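The once-then-pause behaviour described in this claim can be sketched as a one-shot latch that fires a single click per gesture and re-arms only after the gesture is released; the class and method names are assumptions:

```python
class ClickLatch:
    """One-shot latch: fires a click exactly once per continuous gesture
    activation and suppresses repeats until the gesture ends (sketch)."""

    def __init__(self):
        self._armed = True

    def update(self, gesture_active):
        """Return True exactly once per continuous activation."""
        if gesture_active and self._armed:
            self._armed = False   # fire once, then pause
            return True
        if not gesture_active:
            self._armed = True    # re-arm when the hand returns
        return False
```

Feeding it the per-frame gesture flag yields one click event per flip-and-return motion instead of a click on every frame the threshold is exceeded.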
5. An evaluation method of an IMU-based human-computer interaction system, for evaluating the IMU-based human-computer interaction system of claim 1, characterized in that: an upper computer is designed with QT software, and equidistant circles radially distributed along k directions are placed uniformly on the data-acquisition interface of the upper computer; circular targets are used so that the target width remains constant and is unaffected by the approach angle. In each round, one target is randomly displayed as the active target in one color, and inactive targets are displayed in another color. The user must use the human-computer interaction system to move the control cursor from the origin position of the upper computer to the position of the active target and execute a click action. The computer automatically records the number and position of the randomly assigned target, the target selection time, whether the target was hit, the position where the target was clicked, and the movement distance; after these data are recorded the cursor is automatically re-centered, a new target is randomly selected and prompted as the active target, and k targets executed in sequence constitute one task round,
Accuracy(k) = TP / (TP + FP)
Accuracy(k) is the target selection accuracy, k ∈ (1, k) denotes the serial number of the target circle, TP denotes the number of target circles hit correctly, and FP denotes the number of target circles not hit correctly;
T = T2 − T1
T denotes the target selection time, T1 denotes the time at which the user starts to move the cursor from the center of the screen, and T2 denotes the time at which the user completes the cursor movement and selects the target circle;
PE = (1 / (n·m)) · Σ_{j=1..n} Σ_{i=1..m} D / √((px_i − px_0)² + (py_j − py_0)²)
PE denotes the path efficiency, n denotes the number of data-collection rounds performed, m denotes the number of target circles clicked in one round, j runs from 1 to n, i runs from 1 to m, D is the straight-line distance from the center of the screen to the target circle, (px_0, py_0) is the position of the center of the screen, and (px_i, py_j) is the position where the user clicked the target circle;
Accuracy(k) is the target selection accuracy, T is the target selection time, and PE is the path efficiency; these three indicators are used for the performance evaluation of the human-computer interaction system.
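The three evaluation indicators can be sketched directly from the definitions above. The path-efficiency form used here (straight-line distance D divided by the distance actually covered from the screen center to each clicked position) is an assumed reading of the formula, and all function names are illustrative:

```python
import math

def accuracy(tp, fp):
    """Target selection accuracy: correct hits over all attempts."""
    return tp / (tp + fp)

def selection_time(t1, t2):
    """Target selection time T = T2 - T1."""
    return t2 - t1

def path_efficiency(clicks, center, d):
    """Mean ratio of the straight-line distance d to the distance from the
    screen center to each clicked position (sketch; assumed reading).
    clicks -- list of (px, py) clicked positions
    center -- (px0, py0) screen-center position
    d      -- straight-line distance from center to the target circle
    """
    px0, py0 = center
    ratios = [d / math.hypot(px - px0, py - py0) for px, py in clicks]
    return sum(ratios) / len(ratios)
```

A perfectly straight pointing motion that lands on the target circle yields a path efficiency of 1.0; overshoot or curved paths push it below 1.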
6. The IMU-based human-computer interaction system of claim 1, wherein: the human-computer interaction system is based on a ring-type carrier platform.
CN202110985483.5A 2021-08-26 2021-08-26 IMU-based human-computer interaction system and control and evaluation method Pending CN113721764A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110985483.5A CN113721764A (en) 2021-08-26 2021-08-26 IMU-based human-computer interaction system and control and evaluation method

Publications (1)

Publication Number Publication Date
CN113721764A true CN113721764A (en) 2021-11-30

Family

ID=78677991

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110985483.5A Pending CN113721764A (en) 2021-08-26 2021-08-26 IMU-based human-computer interaction system and control and evaluation method

Country Status (1)

Country Link
CN (1) CN113721764A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040004601A1 (en) * 2002-07-02 2004-01-08 Luke Wu Virtual position movement capturing apparatus
CN103933722A (en) * 2014-02-28 2014-07-23 永康市坤卓科技有限公司 Bodybuilding dumbbell exercise detection device and method
CN104345904A (en) * 2013-07-23 2015-02-11 西安艾尔特仪器有限公司 Finger-type air mouse
CN105302021A (en) * 2015-10-23 2016-02-03 哈尔滨工业大学 Wearable gesture control device for controlling movement of robot in human-computer collaborative remanufacturing
KR101598455B1 (en) * 2015-08-21 2016-02-29 모테가 이노베이티브 인크 Electrical Dart
CN205721628U (en) * 2016-04-13 2016-11-23 哈尔滨工业大学深圳研究生院 A kind of quick three-dimensional dynamic hand gesture recognition system and gesture data collecting device
US20160361626A1 (en) * 2012-01-18 2016-12-15 Larry E. Moore Laser activated moving target
CN109085885A (en) * 2018-08-14 2018-12-25 李兴伟 Intelligent ring
US20190113966A1 (en) * 2017-10-17 2019-04-18 Logitech Europe S.A. Input device for ar/vr applications
CN109814707A (en) * 2018-12-19 2019-05-28 东北大学秦皇岛分校 A kind of virtual input method and system based on intelligent finger ring

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114579008A (en) * 2022-05-06 2022-06-03 湖北工业大学 Science popularization experience interaction system
CN114579008B (en) * 2022-05-06 2022-07-08 湖北工业大学 Science popularization experience interaction system
CN116736188A (en) * 2023-06-13 2023-09-12 深圳市费思泰克科技有限公司 Multichannel electronic load for testing wire harness of new energy automobile
CN116736188B (en) * 2023-06-13 2024-04-12 深圳市费思泰克科技有限公司 Multichannel electronic load for testing wire harness of new energy automobile

Similar Documents

Publication Publication Date Title
US11983326B2 (en) Hand gesture input for wearable system
CN113721764A (en) IMU-based human-computer interaction system and control and evaluation method
US9323340B2 (en) Method for gesture control
EP2817694B1 (en) Navigation for multi-dimensional input
CN104769522A (en) Remote control with 3D pointing and gesture recognition capabilities
US20130246955A1 (en) Visual feedback for highlight-driven gesture user interfaces
US20170003747A1 (en) Touchless user interface navigation using gestures
JP6171615B2 (en) Information processing apparatus and program
KR20210010437A (en) Power management for optical positioning devices
CN102810015B (en) Input method based on space motion and terminal
CN103034343B (en) The control method and device of a kind of sensitive mouse
CN102890558A (en) Method for detecting handheld motion state of mobile handheld device based on sensor
US20140351699A1 (en) Method, device, and mobile terminal for performing a short cut browser operation
CN109829368A (en) Recognition methods, device, computer equipment and the storage medium of palm feature
KR20150091365A (en) Multi-touch symbol recognition
KR20160101605A (en) Gesture input processing method and electronic device supporting the same
JP2015170257A (en) Input method, program and input device
TWI721317B (en) Control instruction input method and input device
Atia et al. Interaction with tilting gestures in ubiquitous environments
KR101335394B1 (en) Screen touch apparatus at long range using 3D position of eyes and pointy object
CN105308540A (en) Method for processing touch event and apparatus for same
CN107277257A (en) Mobile terminal and its screen lighting method, the device with store function
CN112426709B (en) Forearm movement posture recognition method, interface interaction control method and device
CN111813280B (en) Display interface control method and device, electronic equipment and readable storage medium
CN110570930B (en) Medical image sketching method and system based on tablet personal computer

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination