CN112426709B - Forearm movement posture recognition method, interface interaction control method and device - Google Patents

Forearm movement posture recognition method, interface interaction control method and device

Info

Publication number
CN112426709B
Authority
CN
China
Prior art keywords
forearm
detector
ratio
interface interaction
axis acceleration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011331040.6A
Other languages
Chinese (zh)
Other versions
CN112426709A
Inventor
蒋超 (Jiang Chao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Jinling Technology Co ltd
Original Assignee
Shenzhen Jinling Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Jinling Technology Co., Ltd.
Priority to CN202011331040.6A
Publication of CN112426709A
Application granted
Publication of CN112426709B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212: Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals involving biosensors worn by the player, e.g. for measuring heart beat, limb activity

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Cardiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a forearm movement posture recognition method, an interface interaction control method, and an interface interaction control device. The recognition method uses detectors worn on the left and right wrists to determine the forearm motion posture automatically from the ratio of the X-axis acceleration to the Z-axis acceleration, the acceleration magnitude, and the angular velocity, and can recognize several forearm postures, such as raising both forearms in parallel, swinging left, and swinging right. The interface interaction method and device use the posture parameters obtained from posture recognition together with interface interaction control rules: when a posture parameter satisfies the condition of a rule, the corresponding interface interaction operation is executed, which improves the sense of immersion in game or software control.

Description

Forearm movement posture recognition method, interface interaction control method and device
Technical Field
The invention relates to the technical field of wearable devices, and in particular to a forearm movement posture recognition method and device, a control method and device for electronic device interface interaction, and an electronic device adopting the control method.
Background
In traditional software and games, interface interaction generally relies on conventional input devices such as a mouse, a keyboard, or a touch screen. However, when a user wearing a motion-sensing (somatosensory) input device plays a motion-sensing game, falling back on conventional input splits the experience between motion sensing and conventional input: the sense of immersion is weakened and the advantage of the motion-sensing input device is not exploited.
Disclosure of Invention
The invention aims to provide a forearm movement posture recognition method, an interface interaction control method, and corresponding devices that at least partly overcome the shortcomings of the related art.
To achieve this aim, the invention adopts the following technical solution:
a forearm motion gesture recognition method, the forearm motion gesture comprising a first motion gesture in which two forearms lift upward intersecting the chest at 45 degrees, the recognition method comprising the steps of: acquiring X-axis acceleration and Z-axis acceleration detected by a detector of the left wrist; acquiring X-axis acceleration and Z-axis acceleration detected by a detector of the right wrist; performing division operation on the X-axis acceleration and the Z-axis acceleration detected by the detector of the left wrist to obtain a first ratio; performing division operation on the X-axis acceleration and the Z-axis acceleration detected by the detector of the right wrist to obtain a second ratio; judging whether the first ratio and the second ratio fall into a first preset interval with 1 as a center; if yes, outputting that the forearm is in the first motion posture; wherein, the direction of the X axis is the same as the length direction of the front arm, and the direction of the Z axis is vertical to the top surface of the detector.
Preferably, the forearm motion postures further include a second motion posture in which the two forearms are raised in parallel, and the recognition method further comprises the steps of: judging whether the X-axis acceleration detected by the left-wrist detector and the X-axis acceleration detected by the right-wrist detector both fall within a second preset interval centered on 9.8, and if so, outputting that the forearms are in the second motion posture.
Preferably, the forearm motion postures further include a third motion posture of a single-arm swing to the right and a fourth motion posture of a single-arm swing to the left, and the recognition method further comprises the steps of: acquiring the Y-axis angular velocity detected by the detector on the left or right wrist; saving the Y-axis angular velocities acquired within a set time period in chronological order to form a Y-axis angular velocity list; judging whether the value at the middle position of the Y-axis angular velocity list is the maximum or the minimum of the list; if it is the maximum, judging whether it is greater than a preset right-swing threshold, and if so, outputting that the forearm is in the third motion posture; if it is the minimum, judging whether it is less than a preset left-swing threshold, and if so, outputting that the forearm is in the fourth motion posture.
A forearm movement posture recognition device, comprising: an acceleration ratio acquisition module, used to divide the X-axis acceleration by the Z-axis acceleration detected by the detector on the left wrist to obtain a first ratio, and to divide the X-axis acceleration by the Z-axis acceleration detected by the detector on the right wrist to obtain a second ratio; a ratio judging module, used to judge whether the first ratio and the second ratio both fall within a first preset interval centered on 1; and a result output module, used to output that the forearms are in the first motion posture when both ratios fall within the first preset interval centered on 1.
Preferably, the forearm movement posture recognition device further includes: an acceleration acquisition module, used to acquire the X-axis acceleration detected by the left-wrist detector and the X-axis acceleration detected by the right-wrist detector; and an acceleration judging module, used to judge whether these two X-axis accelerations both fall within a second preset interval centered on 9.8. The result output module is also used to output that the forearms are in the second motion posture when both X-axis accelerations fall within the second preset interval centered on 9.8.
Preferably, the forearm movement posture recognition device further includes: an angular velocity acquisition module, used to acquire the Y-axis angular velocity detected by the detector on the left or right wrist; and an angular velocity judging module, used to save the Y-axis angular velocities acquired within a set time period in chronological order to form a Y-axis angular velocity list, to judge whether the value at the middle position of the list is the maximum or the minimum of the list, and, if it is the maximum, to judge whether it is greater than a preset right-swing threshold or, if it is the minimum, whether it is less than a preset left-swing threshold. The result output module is also used to output that the forearm is in the third motion posture when the middle value is greater than the preset right-swing threshold, and that the forearm is in the fourth motion posture when the middle value is less than the preset left-swing threshold.
Preferably, the detector is a smart watch or wristband with a built-in IMU chip.
A control method for electronic device interface interaction comprises the following steps: acquiring the posture parameter output by a forearm movement posture recognition device; acquiring the interface interaction control rules; judging whether the posture parameter satisfies the condition of an interface interaction control rule; and if so, executing the interface interaction operation corresponding to that rule. The forearm movement posture recognition device is any of the forearm movement posture recognition devices described above.
A control device for electronic device interface interaction comprises: a posture parameter acquisition module, used to acquire the posture parameter output by a forearm movement posture recognition device; an interface interaction control rule acquisition module, used to acquire the interface interaction control rules of the electronic device; a judgment module, used to judge whether the posture parameter satisfies the condition of an interface interaction control rule; and a control execution module, used to execute the corresponding interface interaction operation when the posture parameter satisfies the condition of a rule. The forearm movement posture recognition device is any of the forearm movement posture recognition devices described above.
An electronic device comprising a processor, a memory, and a computer program stored in the memory and executable by the processor, wherein the steps of the above control method for electronic device interface interaction are implemented when the computer program is executed by the processor.
Compared with the prior art, the invention has at least the following beneficial effects:
it can automatically recognize several forearm motion postures, including raising both forearms crossed in front of the chest, raising both forearms in parallel, swinging left, and swinging right, and it can use these postures to trigger interface operations automatically, thereby improving the sense of immersion in game or software control.
Drawings
FIG. 1 is a schematic illustration of the four forearm motion postures;
FIG. 2 is a flow chart of the recognition method for the first motion posture;
FIG. 3 shows the definitions of the X, Y, and Z axes;
FIG. 4 is a flow chart of the recognition method for the second motion posture;
FIG. 5 is a flow chart of the recognition method for the third and fourth motion postures;
FIG. 6 is a block diagram of the forearm movement posture recognition device;
FIG. 7 is a flow chart of the control method for electronic device interface interaction;
FIG. 8 is a block diagram of the control device for electronic device interface interaction.
Detailed Description
The invention is further illustrated with reference to the following figures and examples.
Fig. 1 shows the four motion postures defined in some embodiments. They are: a first motion posture in which the two forearms are raised and crossed at 45 degrees in front of the chest, a second motion posture in which the two forearms are raised in parallel, a third motion posture of a single-arm swing to the right, and a fourth motion posture of a single-arm swing to the left.
In the invention, the electronic device automatically recognizes these motion postures through a specific posture recognition method, and thereby controls interface interaction operations automatically. The details are as follows.
Fig. 2 shows the flow of the recognition method for the first motion posture. As shown in fig. 2, the method includes:
S1-1, acquiring the X-axis acceleration Lx and Z-axis acceleration Lz detected by the detector on the left wrist, and the X-axis acceleration Rx and Z-axis acceleration Rz detected by the detector on the right wrist.
An inertial measurement unit (IMU) chip is built into the detector; it measures the X-, Y-, and Z-axis accelerations and the X-, Y-, and Z-axis angular velocities. The detector is preferably, but not limited to, a smart wristband or smart watch with a built-in IMU chip. The X-axis and Z-axis accelerations can be acquired in real time through a wired or wireless connection with the detector, preferably via a Bluetooth module.
The X, Y, and Z axes are defined as shown in fig. 3: the X axis points along the length of the forearm, and the Z axis is perpendicular to the top surface of the detector.
Then the X-axis acceleration Lx is divided by the Z-axis acceleration Lz from the left-wrist detector to obtain a first ratio Lx/Lz, and the X-axis acceleration Rx is divided by the Z-axis acceleration Rz from the right-wrist detector to obtain a second ratio Rx/Rz.
S1-2, judging whether the first ratio Lx/Lz and the second ratio Rx/Rz both fall within a first preset interval centered on 1.
In this embodiment the first preset interval is [0.95, 1.05]. Other endpoint values may be used, following this principle: the wider the interval, the lower the probability of missed recognition but the higher the probability of false recognition; conversely, the narrower the interval, the lower the probability of false recognition but the higher the probability of missed recognition.
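A brief note on why the interval is centered on 1 (this reasoning follows from the axis definitions above rather than being stated explicitly): when a forearm is held still, the accelerometer mainly measures gravity, g ≈ 9.8 m/s². With the X axis along the forearm and the forearm inclined at about 45 degrees, gravity projects onto the X and Z axes as roughly g·sin 45° ≈ 6.9 m/s² and g·cos 45° ≈ 6.9 m/s² respectively (assuming the back of the wrist faces upward so that gravity lies roughly in the X-Z plane), so the ratio on each wrist is approximately tan 45° = 1. Postures far from 45 degrees give ratios well away from 1 and are rejected.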
Step S1-3: when both the first ratio Lx/Lz and the second ratio Rx/Rz fall within the first preset interval centered on 1, it is further determined whether this state has been maintained for longer than a set duration, which is 1 second in this embodiment.
Specifically, after both ratios fall within the first preset interval centered on 1, it is judged whether the method is already in the second-reading state. If not, the second-reading state is entered, the timestamp at which it was entered is recorded, and the method returns to step S1-1. If it is, it is judged whether the difference between the current timestamp and the timestamp recorded on entering the second-reading state is greater than 1 second; if not, the method returns to step S1-1, and if so, step S1-4 is executed.
S1-4, outputting that the forearms are in the first motion posture. In this embodiment, the first motion posture corresponds to a cancel operation.
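To make the flow of steps S1-1 to S1-4 concrete, the following is a minimal Python sketch of the first-posture check. It is an illustration, not the patented implementation: the class and function names and the way samples are obtained are assumptions, and only the [0.95, 1.05] interval and the 1-second hold time come from this embodiment.

```python
import time

FIRST_INTERVAL = (0.95, 1.05)  # first preset interval centered on 1 (this embodiment)
HOLD_SECONDS = 1.0             # state must be held longer than this before reporting


def ratios_near_one(left, right):
    """left and right are (ax, az) pairs from the left- and right-wrist detectors."""
    lx, lz = left
    rx, rz = right
    if lz == 0 or rz == 0:
        return False
    lo, hi = FIRST_INTERVAL
    return lo <= lx / lz <= hi and lo <= rx / rz <= hi


class FirstPostureDetector:
    """Mirrors the 'second-reading state' bookkeeping of steps S1-2/S1-3."""

    def __init__(self):
        self.entered_at = None  # timestamp recorded on entering the second-reading state

    def update(self, left, right, now=None):
        now = time.time() if now is None else now
        if not ratios_near_one(left, right):
            self.entered_at = None      # condition lost: leave the second-reading state
            return None
        if self.entered_at is None:
            self.entered_at = now       # enter the state and record the timestamp (S1-3)
            return None
        if now - self.entered_at > HOLD_SECONDS:
            self.entered_at = None      # reset so the posture is reported once per hold
            return "first_posture"      # S1-4: forearms crossed at about 45 degrees
        return None
```

Feeding update() repeatedly with samples such as ((6.9, 6.9), (7.0, 6.8)) keeps the sketch in the second-reading state and returns "first_posture" once the condition has persisted for more than one second.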
Fig. 4 shows the flow of the recognition method for the second motion posture. As shown in fig. 4, the method includes:
S2-1, acquiring the X-axis acceleration La detected by the left-wrist detector and the X-axis acceleration Ra detected by the right-wrist detector.
As described in the foregoing step S1-1, the detector incorporates an IMU chip, by which the X-axis acceleration can be detected.
S2-2, judging whether the X-axis acceleration La detected by the left-wrist detector and the X-axis acceleration Ra detected by the right-wrist detector both fall within a second preset interval centered on 9.8 (m/s²).
In this embodiment the second preset interval is [9.5, 10]. Other endpoint values may be used, following the same principle: the wider the interval, the lower the probability of missed recognition but the higher the probability of false recognition; the narrower the interval, the lower the probability of false recognition but the higher the probability of missed recognition.
Step S2-3: when La and Ra both fall within the second preset interval, it is further determined whether this state has been maintained for longer than a set duration, which is 1 second in this embodiment.
Specifically, after La and Ra both fall within the second preset interval, it is judged whether the method is already in the second-reading state. If not, the second-reading state is entered, the timestamp at which it was entered is recorded, and the method returns to step S2-1. If it is, it is judged whether the difference between the current timestamp and the timestamp recorded on entering the second-reading state is greater than 1 second; if not, the method returns to step S2-1, and if so, step S2-4 is executed.
S2-4, outputting that the forearms are in the second motion posture. In this embodiment, the second motion posture corresponds to a confirm operation.
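The second-posture check differs from the first only in its predicate: instead of the two X/Z ratios, the two X-axis accelerations must both sit near gravity. A minimal sketch of that predicate follows (the 1-second second-reading logic would be identical to the FirstPostureDetector sketch above); the interval [9.5, 10] is the one given for this embodiment, while the function name and example values are illustrative.

```python
SECOND_INTERVAL = (9.5, 10.0)  # second preset interval centered on 9.8 m/s^2 (this embodiment)


def both_forearms_raised(left_ax, right_ax):
    """True when both X-axis accelerations (m/s^2) fall within the second preset interval."""
    lo, hi = SECOND_INTERVAL
    return lo <= left_ax <= hi and lo <= right_ax <= hi


# Both forearms raised in parallel: the X axis is roughly aligned with gravity on each wrist.
print(both_forearms_raised(9.7, 9.9))  # True  -> candidate for the second posture
print(both_forearms_raised(9.7, 6.9))  # False -> one forearm is not raised
```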
Fig. 5 shows the flow of the recognition method for the third and fourth motion postures. As shown in fig. 5, the method includes:
S3-1, acquiring the Y-axis angular velocity detected by the detector on the left or right wrist.
As described in step S1-1 above, the detector incorporates an IMU chip, by which the Y-axis angular velocity can be detected.
S3-2, saving the Y-axis angular velocities acquired within a set time period in chronological order to form a Y-axis angular velocity list.
In this embodiment the set time period is 1 second. The resulting Y-axis angular velocity list is a temporary list.
S3-3, judging whether the value at the middle position of the Y-axis angular velocity list (the middle value) is the maximum of the list; if not, go to step S3-4; if so, go to step S3-5.
S3-4, judging whether the middle value is the minimum of the list; if not, go to step S3-1; if so, go to step S3-6.
S3-5, judging whether the middle value is greater than the preset right-swing threshold; if not, go to step S3-1; if so, go to step S3-7.
S3-6, judging whether the middle value is less than the preset left-swing threshold; if not, go to step S3-1; if so, go to step S3-8.
S3-7, outputting that the forearm is in the third motion posture.
S3-8, outputting that the forearm is in the fourth motion posture.
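Steps S3-1 to S3-8 amount to a sliding-window test on the Y-axis angular velocity: a swing is reported when the sample at the middle of the one-second window is the extreme value of the window and exceeds the corresponding threshold. The sketch below illustrates the idea; the window length, sampling rate, and threshold values are placeholders and not taken from the patent.

```python
from collections import deque

WINDOW_SIZE = 50              # samples per set time period, e.g. 1 s at 50 Hz (illustrative)
RIGHT_SWING_THRESHOLD = 3.0   # preset right-swing threshold in rad/s (placeholder value)
LEFT_SWING_THRESHOLD = -3.0   # preset left-swing threshold in rad/s (placeholder value)

gy_window = deque(maxlen=WINDOW_SIZE)  # temporary Y-axis angular velocity list


def classify_swing(gy_sample):
    """Feed one Y-axis angular velocity; return 'third_posture', 'fourth_posture', or None."""
    gy_window.append(gy_sample)
    if len(gy_window) < WINDOW_SIZE:
        return None                       # the list for the set time period is not full yet
    samples = list(gy_window)
    middle = samples[len(samples) // 2]   # value at the middle position of the list
    if middle == max(samples) and middle > RIGHT_SWING_THRESHOLD:
        return "third_posture"            # S3-7: single-arm swing to the right
    if middle == min(samples) and middle < LEFT_SWING_THRESHOLD:
        return "fourth_posture"           # S3-8: single-arm swing to the left
    return None
```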
As can be seen from the above, with this technical solution the processor can automatically recognize whether the forearms enter the first, second, third, or fourth motion posture.
Fig. 6 shows a block diagram of an embodiment of the forearm movement posture recognition device. As shown in fig. 6, the device includes the following modules:
the acceleration ratio acquisition module 1 is used for performing division operation on the X-axis acceleration and the Z-axis acceleration detected by the detector of the left wrist to acquire a first ratio; performing division operation on the X-axis acceleration and the Z-axis acceleration detected by the detector of the right wrist to obtain a second ratio;
the ratio judging module 2 is used for judging whether the first ratio and the second ratio fall into a first preset interval with 1 as the center;
and the result output module 7 is used for outputting that the forearm is in the first motion posture when the first ratio and the second ratio both fall into a first preset interval taking 1 as the center.
The forearm movement posture recognition device further includes the following modules:
The acceleration acquisition module 3 is used to acquire the X-axis acceleration detected by the left-wrist detector and the X-axis acceleration detected by the right-wrist detector.
The acceleration judging module 4 is used to judge whether these two X-axis accelerations both fall within the second preset interval centered on 9.8.
The result output module 7 is also used to output that the forearms are in the second motion posture when both X-axis accelerations fall within the second preset interval centered on 9.8.
The forearm movement posture recognition device further includes the following modules:
The angular velocity acquisition module 5 is used to acquire the Y-axis angular velocity detected by the detector on the left or right wrist.
The angular velocity judging module 6 is used to save the Y-axis angular velocities acquired within a set time period in chronological order to form a Y-axis angular velocity list; to judge whether the middle value of the list is the maximum or the minimum; if it is the maximum, to judge whether it is greater than the preset right-swing threshold; and if it is the minimum, to judge whether it is less than the preset left-swing threshold.
The result output module 7 is also used to output that the forearm is in the third motion posture when the middle value is greater than the preset right-swing threshold, and that the forearm is in the fourth motion posture when the middle value is less than the preset left-swing threshold.
The detector is preferably a smart watch or wristband with a built-in IMU chip, and it preferably also includes a Bluetooth module for wireless transmission of the detected data.
Fig. 7 shows the flow of the control method for electronic device interface interaction. As shown in fig. 7, the method includes:
S4-1, acquiring the posture parameter output by the forearm movement posture recognition device.
Here the forearm movement posture recognition device is the device described above. In a specific implementation, the posture parameter is preferably represented by a digital code, for example 00 for the first motion posture, 01 for the second, 10 for the third, and 11 for the fourth.
S4-2, acquiring the interface interaction control rules.
Specifically, the interface interaction operations include a cancel operation, a confirm operation, a page-left operation, and a page-right operation, and the interface interaction control rules are: the first motion posture corresponds to the cancel operation, the second motion posture to the confirm operation, the third motion posture to the page-right operation, and the fourth motion posture to the page-left operation.
The rules are pre-stored in the memory.
S4-3, judging whether the posture parameter satisfies the condition of an interface interaction control rule.
That is, it is judged whether the posture parameter output in step S4-1 matches a posture parameter corresponding to an interface interaction operation.
S4-4, if the posture parameter satisfies the condition of an interface interaction control rule, executing the interface interaction operation corresponding to that rule.
For example, if the posture parameter output in step S4-1 is 00, the cancel operation is executed; if it is 01, the confirm operation is executed; if it is 10, the page is turned to the right; and if it is 11, the page is turned to the left.
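The control rules of steps S4-2 to S4-4 reduce to a lookup from the digital posture code to an interface operation. The following minimal sketch assumes the 00/01/10/11 codes given above; the rule-table layout and the `ui` handler object are hypothetical and not part of the patent.

```python
# Interface interaction control rules of this embodiment: posture code -> operation name.
CONTROL_RULES = {
    "00": "cancel",      # first posture  -> cancel operation
    "01": "confirm",     # second posture -> confirm operation
    "10": "page_right",  # third posture  -> turn the page to the right
    "11": "page_left",   # fourth posture -> turn the page to the left
}


def handle_posture(posture_code, ui):
    """Steps S4-3/S4-4: execute the operation mapped to the posture code, if any.

    `ui` is a hypothetical object exposing cancel(), confirm(), page_right(), and page_left().
    """
    operation = CONTROL_RULES.get(posture_code)
    if operation is None:
        return False          # the posture parameter does not satisfy any control rule
    getattr(ui, operation)()  # e.g. code "10" calls ui.page_right()
    return True
```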
In this way, the control method automates electronic device interface interaction based on forearm motion postures. Compared with existing interaction modes such as a mouse, a keyboard, or a touch screen, it improves the sense of immersion in game or software control.
Compared with finger movements, forearm movements have a larger amplitude, which makes them easier to recognize and reduces the precision required of the sensor; and compared with precision sensors such as optical sensors, an IMU chip is inexpensive. Combining an IMU chip with forearm movement therefore automates electronic device interface interaction with reliable control and low cost.
Fig. 8 shows a block diagram of the control device for electronic device interface interaction. As shown in fig. 8, the control device includes the following modules:
The posture parameter acquisition module 8 is used to acquire the posture parameter output by the forearm movement posture recognition device, which is the recognition device described above.
The interface interaction control rule acquisition module 9 is used to acquire the interface interaction control rules of the electronic device.
The judgment module 10 is used to judge whether the posture parameter satisfies the condition of an interface interaction control rule.
The control execution module 11 is used to execute the corresponding interface interaction operation when the posture parameter satisfies the condition of an interface interaction control rule.
An embodiment of the present invention further provides an electronic device, which includes a processor, a memory, and a computer program stored in the memory and capable of being executed by the processor, wherein when the computer program is executed by the processor, the electronic device implements the steps of the control method for interface interaction of the electronic device as described above.
The present invention has been described in detail with reference to specific embodiments. The detailed description is intended only to help those skilled in the art understand the invention and is not to be construed as limiting its scope. Various modifications, equivalent substitutions, and the like made by those skilled in the art within the conception of the invention shall fall within its protection scope.

Claims (8)

1. A forearm movement posture recognition method, wherein the forearm motion postures include a first motion posture in which the two forearms are raised and crossed at 45 degrees in front of the chest, the recognition method comprising the steps of:
S1-1, acquiring the X-axis acceleration and Z-axis acceleration detected by a detector on the left wrist;
acquiring the X-axis acceleration and Z-axis acceleration detected by a detector on the right wrist;
dividing the X-axis acceleration by the Z-axis acceleration detected by the left-wrist detector to obtain a first ratio;
dividing the X-axis acceleration by the Z-axis acceleration detected by the right-wrist detector to obtain a second ratio;
S1-2, judging whether the first ratio and the second ratio both fall within a first preset interval centered on 1;
S1-3, when the judgment result is that the first ratio and the second ratio both fall within the first preset interval centered on 1, judging whether a second-reading state has been entered; if not, entering the second-reading state, recording the timestamp on entering the second-reading state, and returning to step S1-1; if so, judging whether the difference between the current timestamp and the timestamp recorded on entering the second-reading state is greater than 1, and if not, returning to step S1-1, or if so, executing step S1-4;
S1-4, outputting that the forearms are in the first motion posture;
wherein the X axis points along the length of the forearm and the Z axis is perpendicular to the top surface of the detector;
the forearm motion postures further include a second motion posture in which the two forearms are raised in parallel, and the recognition method further comprises the steps of: judging whether the X-axis acceleration detected by the left-wrist detector and the X-axis acceleration detected by the right-wrist detector both fall within a second preset interval centered on 9.8, and if so, outputting that the forearms are in the second motion posture.
2. The forearm movement posture recognition method of claim 1, wherein the forearm motion postures further include a third motion posture of a single-arm swing to the right and a fourth motion posture of a single-arm swing to the left, the recognition method further comprising the steps of:
acquiring the Y-axis angular velocity detected by the detector on the left or right wrist;
saving the Y-axis angular velocities acquired within a set time period in chronological order to form a Y-axis angular velocity list;
judging whether the value at the middle position of the Y-axis angular velocity list (the middle value) is the maximum or the minimum of the list;
if the middle value is the maximum, judging whether it is greater than a preset right-swing threshold;
if the middle value is greater than the preset right-swing threshold, outputting that the forearm is in the third motion posture;
if the middle value is the minimum, judging whether it is less than a preset left-swing threshold;
and if the middle value is less than the preset left-swing threshold, outputting that the forearm is in the fourth motion posture.
3. A forearm movement posture recognition device, comprising:
an acceleration ratio acquisition module, used to divide the X-axis acceleration by the Z-axis acceleration detected by a detector on the left wrist to obtain a first ratio, and to divide the X-axis acceleration by the Z-axis acceleration detected by a detector on the right wrist to obtain a second ratio;
a ratio judging module, used to judge whether the first ratio and the second ratio both fall within a first preset interval centered on 1;
a second-reading module, used, when the ratio judging module finds that the first ratio and the second ratio both fall within the first preset interval centered on 1, to judge whether a second-reading state has been entered; if not, to enter the second-reading state, record the timestamp on entering the second-reading state, and return to the acceleration ratio acquisition module; if so, to judge whether the difference between the current timestamp and the timestamp recorded on entering the second-reading state is greater than 1, and if not, to return to the acceleration ratio acquisition module, or if so, to invoke the result output module;
an acceleration acquisition module, used to acquire the X-axis acceleration detected by the left-wrist detector and the X-axis acceleration detected by the right-wrist detector;
an acceleration judging module, used to judge whether the X-axis acceleration detected by the left-wrist detector and the X-axis acceleration detected by the right-wrist detector both fall within a second preset interval centered on 9.8;
and a result output module, used to output that the forearms are in the first motion posture when the first ratio and the second ratio both fall within the first preset interval centered on 1 and the hold time meets the preset value, and to output that the forearms are in the second motion posture when the X-axis accelerations detected by the left-wrist detector and the right-wrist detector both fall within the second preset interval centered on 9.8.
4. The forearm movement posture recognition device of claim 3, further comprising:
an angular velocity acquisition module, used to acquire the Y-axis angular velocity detected by the detector on the left or right wrist;
an angular velocity judging module, used to save the Y-axis angular velocities acquired within a set time period in chronological order to form a Y-axis angular velocity list, to judge whether the value at the middle position of the list (the middle value) is the maximum or the minimum of the list, and, if it is the maximum, to judge whether it is greater than a preset right-swing threshold or, if it is the minimum, whether it is less than a preset left-swing threshold;
wherein the result output module is also used to output that the forearm is in a third motion posture when the middle value is greater than the preset right-swing threshold, and that the forearm is in a fourth motion posture when the middle value is less than the preset left-swing threshold.
5. The forearm movement posture recognition device of claim 3, wherein the detector is a smart watch or wristband with a built-in IMU chip.
6. A control method for electronic device interface interaction, comprising the steps of:
acquiring the posture parameter output by a forearm movement posture recognition device, wherein the forearm movement posture recognition device is the forearm movement posture recognition device of any one of claims 3 to 5;
acquiring the interface interaction control rules;
judging whether the posture parameter satisfies the condition of an interface interaction control rule;
and if so, executing the interface interaction operation corresponding to that rule.
7. A control device for electronic device interface interaction, comprising:
a posture parameter acquisition module, used to acquire the posture parameter output by a forearm movement posture recognition device, wherein the forearm movement posture recognition device is the forearm movement posture recognition device of any one of claims 3 to 5;
an interface interaction control rule acquisition module, used to acquire the interface interaction control rules of the electronic device;
a judgment module, used to judge whether the posture parameter satisfies the condition of an interface interaction control rule;
and a control execution module, used to execute the interface interaction operation corresponding to the interface interaction control rule when the posture parameter satisfies its condition.
8. An electronic device, comprising a processor, a memory, and a computer program stored in the memory and executable by the processor, wherein the steps of the control method of claim 6 are implemented when the computer program is executed by the processor.
CN202011331040.6A, filed 2020-11-24, priority 2020-11-24: Forearm movement posture recognition method, interface interaction control method and device. Granted as CN112426709B; status: Active.

Priority Applications (1)

Application Number: CN202011331040.6A; Priority Date: 2020-11-24; Filing Date: 2020-11-24; Title: Forearm movement posture recognition method, interface interaction control method and device

Applications Claiming Priority (1)

Application Number: CN202011331040.6A; Priority Date: 2020-11-24; Filing Date: 2020-11-24; Title: Forearm movement posture recognition method, interface interaction control method and device

Publications (2)

Publication Number: CN112426709A, published 2021-03-02
Publication Number: CN112426709B, published 2022-11-18

Family

ID=74694540

Family Applications (1)

Application Number: CN202011331040.6A (Active, granted as CN112426709B); Priority Date: 2020-11-24; Filing Date: 2020-11-24; Title: Forearm movement posture recognition method, interface interaction control method and device

Country Status (1)

CN: CN112426709B

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number: CN117311490A *; Priority date: 2022-06-29; Publication date: 2023-12-29; Assignee: Huawei Technologies Co., Ltd. (华为技术有限公司); Title: Wrist-worn device control method, related system and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number: US20160078289A1 *; Priority date: 2014-09-16; Publication date: 2016-03-17; Assignee: Foundation for Research and Technology - Hellas (FORTH); Title: Gesture Recognition Apparatuses, Methods and Systems for Human-Machine Interaction
Publication number: CN106468951A *; Priority date: 2016-08-29; Publication date: 2017-03-01; Assignee: East China Normal University (华东师范大学); Title: Intelligent remote control system based on two-hand ring sensor fusion and method thereof
Publication number: CN108245880A *; Priority date: 2018-01-05; Publication date: 2018-07-06; Assignee: East China Normal University (华东师范大学); Title: Somatosensory detection visualization method and system based on multi-wearing-ring sensor fusion

Also Published As

Publication number: CN112426709A; Publication date: 2021-03-02


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
TA01: Transfer of patent application right
Effective date of registration: 20221025
Address after: 518000 2801, block a, building 2, Shenzhen Bay innovation and technology center, No. 3156, Keyuan South Road, community, high tech Zone, Yuehai street, Nanshan District, Shenzhen, Guangdong
Applicant after: Shenzhen Jinling Technology Co., Ltd.
Address before: 811-1, building 10, Shenzhen Bay science and technology ecological park, No.10, Gaoxin South 9th Road, high tech Zone community, Yuehai street, Nanshan District, Shenzhen, Guangdong 518000
Applicant before: Shenzhen Jinling mutual Entertainment Technology Co., Ltd.
GR01: Patent grant